
Diagnosis with Fault Modes



Presentation Transcript


  1. Diagnosis with Fault Modes Philippe Dague and Yuhong Yan NRC-IIT Philippe.dague@lipn.univ-paris13.fr Yuhong.yan@nrc.gc.ca

  2. Diagnosis: Using fault modes • The MBD approach provides a framework for diagnosing (detecting and locating faults in) a device from its correct behavioural model only. • This is a big advantage w.r.t. techniques based on a priori knowledge of all failure modes. • Nevertheless, knowledge of the likely fault modes increases the discrimination capacity and may allow fault identification. • Idea: extend the consistency-based diagnostic framework by using fault modes, but without requiring them to be exhaustive. • Sherlock (de Kleer – Williams) and GDE+ (Struss – Dressler), both in 1989.

  3. Behavioural modes • In addition to the good behaviour G(c), we consider known faulty behaviours Fi(c) and an unknown mode U(c), all distinct. • U(c) is used to model the non-exhaustiveness of the Fi's, keeping logical soundness. • Example of an inverter:
Inverter(C) ∧ ¬AB(C) → G(C)
Inverter(C) ∧ AB(C) → S0(C) ∨ S1(C) ∨ U(C)
¬Inverter(C) → ¬G(C) ∧ ¬S0(C) ∧ ¬S1(C) ∧ ¬U(C)
G(C) → (In(C)=0 → Out(C)=1) ∧ (In(C)=1 → Out(C)=0)
S0(C) → Out(C)=0
S1(C) → Out(C)=1
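
To make the behavioural modes concrete, here is a small Python sketch (not part of the original slides; the function name inverter_output is illustrative): each mode is read as a prediction about the component's output, with None standing for "nothing predicted".

```python
# Illustrative sketch, not from the slides.
def inverter_output(mode, inp):
    """Predicted output of one inverter under a behavioural mode (None = not constrained)."""
    if mode == "G":                      # good behaviour: Out = not In
        return None if inp is None else 1 - inp
    if mode == "S0":                     # stuck-at-0 fault
        return 0
    if mode == "S1":                     # stuck-at-1 fault
        return 1
    return None                          # unknown mode U: predicts nothing

# A stuck-at-1 inverter ignores its input; the unknown mode constrains nothing.
assert inverter_output("G", 0) == 1
assert inverter_output("S1", 0) == 1
assert inverter_output("U", 0) is None
```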

  4. Definition of Diagnosis and Conflict • A diagnosis is an assignment of a behavioural mode to each component that is consistent with the system description and the observations: SD ∪ OBS ∪ {mi(c) | c ∈ COMPONENTS} ⊭ ⊥ • Diagnoses can still be computed from minimal conflicts • A conflict is a set of component behavioural modes that is inconsistent with the system description and the observations: SD ∪ OBS ∪ {mi(ck)} |= ⊥ • Minimal conflicts = minimal for set inclusion • They can be computed as nogoods by an ATMS (assumptions = component behavioural modes)

  5. Computation of Diagnoses • An assignment Δ of a behavioural mode to each component is a diagnosis iff it does not contain any minimal conflict Ci. • That is, the complement of Δ in ∪{modes(c) | c ∈ COMPONENTS} is a hitting set of the collection of minimal conflicts Ci. • Example: the 3-inverter

  6. The 3-inverter [Figure: three inverters in series, I → A → X → B → Y → C → O] • Space of diagnoses U: 4^3 = 64 • ATMS assumptions: 3 × 4 = 12: {G(A)}, {S0(A)}, {S1(A)}, {U(A)}, {G(B)}, {S0(B)}, … • Comparison: for n components, the diagnosis space U has size 2^n with only good modes, and (k+2)^n with k faulty modes, 1 good mode and 1 unknown mode.

  7. The 3-inverter: ATMS label computation (1) [Figure: inverter chain I → A → X → B → Y → C → O] Observation: I=0
I=0, {{}}
X=1, {{G(A)}, {S1(A)}}
X=0, {{S0(A)}}
Y=0, {{G(A),G(B)}, {S1(A),G(B)}, {S0(B)}}
Y=1, {{S0(A),G(B)}, {S1(B)}}
O=1, {{G(A),G(B),G(C)}, {S1(A),G(B),G(C)}, {S0(B),G(C)}, {S1(C)}}
O=0, {{S0(A),G(B),G(C)}, {S1(B),G(C)}, {S0(C)}}
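
The labels above can be checked with a brute-force sketch in Python (illustrative only, not the ATMS algorithm itself, which builds labels incrementally): enumerate the partial mode assignments, propagate I=0 forward through the chain, and keep only the minimal environments that determine each signal value. The U mode is omitted because it predicts nothing.

```python
# Illustrative sketch, not from the slides: forward-propagation label computation.
from itertools import product

def step(mode, inp):
    """Output of one inverter under a mode; None = value not determined."""
    if mode == "G":  return None if inp is None else 1 - inp
    if mode == "S0": return 0
    if mode == "S1": return 1
    return None                                    # no assumption about this component

def labels(i_value=0):
    derived = {}                                   # (signal, value) -> minimal environments
    for ma, mb, mc in product(["G", "S0", "S1", None], repeat=3):
        env = frozenset(f"{m}({c})" for m, c in zip((ma, mb, mc), "ABC") if m)
        x = step(ma, i_value); y = step(mb, x); o = step(mc, y)
        for sig, val in (("X", x), ("Y", y), ("O", o)):
            if val is None:
                continue
            envs = derived.setdefault((sig, val), [])
            if any(e <= env for e in envs):        # a smaller environment already derives it
                continue
            envs[:] = [e for e in envs if not env <= e] + [env]
    return derived

for (sig, val), envs in sorted(labels().items()):
    print(f"{sig}={val}:", [sorted(e) for e in envs])
# e.g. X=1: [['G(A)'], ['S1(A)']], and O=1 gets the four environments listed above.
```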

  8. The 3-inverter: ATMS label computation (2) [Figure: inverter chain I → A → X → B → Y → C → O] Observations: I=0, O=0
I=0, {{}}
X=1, {{G(A)}, {S1(A)}}
X=0, {{S0(A)}, {G(C),G(B)}}
Y=0, {{G(A),G(B)}, {S1(A),G(B)}, {S0(B)}}
Y=1, {{S0(A),G(B)}, {S1(B)}, {G(C)}}
O=1, {{G(A),G(B),G(C)}, {S1(A),G(B),G(C)}, {S0(B),G(C)}, {S1(C)}}
O=0, {{S0(A),G(B),G(C)}, {S1(B),G(C)}, {S0(C)}}
O=0, {{}} (once O=0 is observed it holds in every environment; the environments of O=1 become nogoods)

  9. The 3-inverter: 4 minimal conflicts • The 4 minimal conflicts Ci: {G(A),G(B),G(C)}, {S1(A),G(B),G(C)}, {S0(B),G(C)}, {S1(C)} • Diagnoses: assign a mode to each component, Δ = {mi(A), mj(B), mk(C)} • A diagnosis contains none of the 4 minimal conflicts: ∀Ci ¬(Ci ⊆ Δ) ⟺ ∀Ci Ci ∩ (U\Δ) ≠ ∅ ⟺ (U\Δ) hits every Ci ⟺ (U\Δ) is a hitting set of the conflicts (where U is the set of all mode assumptions)

  10. The 3-inverter: hitting sets • The minimal hitting sets of the 4 minimal conflicts Ci {G(A),G(B),G(C)}, {S1(A),G(B),G(C)}, {S0(B),G(C)}, {S1(C)} are • {G(C), S1(C)} • {G(B), S0(B), S1(C)} • {G(A), S1(A), S0(B), S1(C)}
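
These can be recomputed with a brute-force sketch (illustrative Python, not the HS-tree algorithm used in practice): enumerate subsets of the assumptions occurring in the conflicts and keep the subset-minimal ones that intersect every conflict.

```python
# Illustrative sketch, not from the slides: minimal hitting sets by brute force.
from itertools import combinations

conflicts = [{"G(A)", "G(B)", "G(C)"}, {"S1(A)", "G(B)", "G(C)"},
             {"S0(B)", "G(C)"}, {"S1(C)"}]
# A minimal hitting set only uses elements that appear in some conflict.
universe = sorted(set().union(*conflicts))

hitting = [set(s) for r in range(1, len(universe) + 1)
           for s in combinations(universe, r)
           if all(set(s) & c for c in conflicts)]
minimal = [h for h in hitting if not any(g < h for g in hitting)]
print([sorted(h) for h in minimal])
# -> the three sets listed above: {G(C),S1(C)}, {G(B),S0(B),S1(C)}, {G(A),S1(A),S0(B),S1(C)}
```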

  11. The 3-inverter: diagnoses • From hitting set {G(C),S1(C)}: {mi(A), mj(B), S0(C)} or {mi(A), mj(B), U(C)} (any modes for A and B: 4·4·2 = 32 diagnoses) • From hitting set {G(B),S0(B),S1(C)}: {mi(A), S1(B), G(C)} or {mi(A), U(B), G(C)} (4·2 = 8 diagnoses) • From hitting set {G(A),S1(A),S0(B),S1(C)}: {S0(A), G(B), G(C)} or {U(A), G(B), G(C)} (2 diagnoses) • Total: 42 diagnoses out of the 64 in the diagnosis space
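
The count of 42 can be verified directly (illustrative Python sketch, using the same four conflicts): a mode assignment is a diagnosis exactly when it contains none of the minimal conflicts.

```python
# Illustrative sketch, not from the slides: counting the diagnoses of the 3-inverter.
from itertools import product

conflicts = [{"G(A)", "G(B)", "G(C)"}, {"S1(A)", "G(B)", "G(C)"},
             {"S0(B)", "G(C)"}, {"S1(C)"}]

assignments = product(*[[f"{m}({c})" for m in ("G", "S0", "S1", "U")] for c in "ABC"])
diagnoses = [a for a in assignments
             if not any(conf <= set(a) for conf in conflicts)]
print(len(diagnoses))     # 42 of the 4**3 = 64 candidate assignments
```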

  12. The 3-inverter: comparison with using only the good mode • Correspondence: G(*) ↔ ¬AB(*), U(*) ↔ AB(*); diagnoses that use a specific fault mode (S0, S1) have no direct counterpart in the AB-only framework (marked ✗). • For the diagnoses:
{mi(A), mj(B), S0(C)}: ✗
{mi(A), mj(B), U(C)}:
  G(A), G(B), U(C) → ¬AB(A), ¬AB(B), AB(C) → {C}
  G(A), U(B), U(C) → ¬AB(A), AB(B), AB(C) → {B, C}
  U(A), G(B), U(C) → AB(A), ¬AB(B), AB(C) → {A, C}
  U(A), U(B), U(C) → AB(A), AB(B), AB(C) → {A, B, C}
{mi(A), S1(B), G(C)}: ✗
{mi(A), U(B), G(C)}:
  G(A), U(B), G(C) → ¬AB(A), AB(B), ¬AB(C) → {B}
  U(A), U(B), G(C) → AB(A), AB(B), ¬AB(C) → {A, B}
{S0(A), G(B), G(C)}: ✗
{U(A), G(B), G(C)} → AB(A), ¬AB(B), ¬AB(C) → {A}
• The minimal diagnoses ({A}, {B}, {C}) are the ones underlined on the original slide.

  13. The Probability of Diagnoses • The diagnosis space is large even for a small problem when faulty modes are considered • Computing the complete set of diagnoses is possible but not necessary for practical work • Only the most probable diagnoses need to be found: the leading diagnoses (a subset of all the diagnoses)

  14. The criteria for the leading diagnoses • All leading diagnoses have a higher probability than all non-leading diagnoses • Select no more than k1 (= 5) leading diagnoses • Each selected diagnosis must have probability > max(pi)/k2 • Σ pi > k3 (k3 = 0.75): stop selecting diagnoses once their cumulative probability exceeds k3
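
A sketch of these cut-offs in Python (illustrative only; k1 and k3 take the slide's values, while the value of k2 is not given on the slide and is a placeholder here):

```python
# Illustrative sketch, not from the slides.
def leading_diagnoses(diags, k1=5, k2=100, k3=0.75):   # k2 = 100 is a placeholder value
    """diags: list of (diagnosis, probability) pairs; returns the leading subset."""
    ranked = sorted(diags, key=lambda dp: dp[1], reverse=True)   # best first
    best = ranked[0][1]
    leading, mass = [], 0.0
    for d, p in ranked:
        if len(leading) >= k1 or p <= best / k2 or mass > k3:
            break            # too many, too improbable, or enough probability mass covered
        leading.append((d, p))
        mass += p
    return leading

print(leading_diagnoses([("d1", 0.5), ("d2", 0.3), ("d3", 0.1), ("d4", 0.004)]))
# -> [('d1', 0.5), ('d2', 0.3)]: after d2 the cumulative mass 0.8 already exceeds k3 = 0.75
```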

  15. Focus in ATMS • The focuses are the leading diagnoses • Only consider environments that are included in (⊆) one of the focuses • Best-first search to obtain the focus: select the most probable diagnoses

  16. Probabilities of Diagnoses • Given the prior probability p(m(c)) of each component mode • The prior probability of a diagnosis Δi is p(Δi) = Π p(m(c)) over the m(c) ∈ Δi • Update the posterior probability after a new measurement xi = vik: p(Δ | xi=vik) = p(xi=vik | Δ) · p(Δ) / p(xi=vik)
p(xi=vik | Δ) = 0 if Δ ∈ Rik, 1 if Δ ∈ Sik, 1/m if Δ ∈ Ui
p(xi=vik) = p(Sik) + p(Ui)/m
where Sik = diagnoses predicting xi = vik (Δ ∪ SD ∪ OBS |= xi=vik), Ui = diagnoses predicting no value for xi, Rik = the other diagnoses (they predict xi = v with v ≠ vik), and m = the number of possible values of xi
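
A minimal sketch of this update (assumed Python, not from the slides; the Sik/Rik/Ui partition is represented by what each diagnosis predicts, with None meaning "no prediction"):

```python
# Illustrative sketch, not from the slides: Bayesian update for one measurement.
def update(priors, predicts, observed, m):
    """
    priors:   {diagnosis: prior probability}
    predicts: {diagnosis: predicted value of x_i, or None if nothing is predicted}
    observed: the measured value v_ik;  m: number of possible values of x_i
    """
    def likelihood(d):
        v = predicts[d]
        if v is None:      return 1.0 / m   # d in Ui: no prediction
        if v == observed:  return 1.0       # d in Sik: predicted the observed value
        return 0.0                           # d in Rik: predicted another value
    p_obs = sum(likelihood(d) * p for d, p in priors.items())   # p(x_i = v_ik)
    return {d: likelihood(d) * p / p_obs for d, p in priors.items()}

post = update({"d1": 0.6, "d2": 0.3, "d3": 0.1},
              {"d1": 1, "d2": 0, "d3": None}, observed=1, m=2)
print(post)   # d1 is boosted, d2 is ruled out, d3 is rescaled by (1/m)/p(x_i = v_ik)
```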

  17. The cost of measurements • Shannon entropy: H = −Σ pi log pi • The expected entropy He(xi) after measuring quantity xi is He(xi) = Σk p(xi=vik) · H(xi=vik), where H(xi=vik) = −Σl pl′ log pl′ is the entropy of the posterior probabilities pl′ = p(Δl | xi=vik) • This works out to He(xi) = H + Σk p(xi=vik) log p(xi=vik) + p(Ui) log m • Hence the cost to minimize: $(xi) = Σk p(xi=vik) log p(xi=vik) + p(Ui) log m + 1
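
A sketch of the resulting scoring function (illustrative Python; natural logarithm assumed, and p_vals is assumed to already contain the p(Ui)/m share for each value):

```python
# Illustrative sketch, not from the slides.
import math

def cost(p_vals, p_u, m):
    """Score of measuring x_i: sum_k p(x_i=v_ik) log p(x_i=v_ik) + p(U_i) log m + 1."""
    return sum(p * math.log(p) for p in p_vals if p > 0) + p_u * math.log(m) + 1

# A measurement whose outcome splits the diagnoses evenly is the most informative (lowest cost):
print(cost([0.5, 0.5], p_u=0.0, m=2))   # 1 - log 2 ≈ 0.31
print(cost([1.0, 0.0], p_u=0.0, m=2))   # 1.0: the outcome is already known, nothing is learned
```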

  18. Simplified Idea • Assume components fail independently, each with the same very small probability ε, so p(Δi) = ε^|Δi| · (1−ε)^(n−|Δi|), where |Δi| = the number of faulted components in Δi • After a sequence of probes E, p(Δi|E) = ε^q / (N · m^fl), where fl = the number of times Δi failed to predict a measurement outcome in the sequence, q = the size of Δi, and N = a normalization constant • If ε << 1/m^fl, we can keep only the minimal-cardinality diagnoses; then N = Σ ε^q · (1/m^fl) and p(Δi|E) = (1/m^fli) / Σj (1/m^flj), which is independent of ε
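
A small check of the ε-independence claim (illustrative sketch; fls holds the failure counts fl of equal-cardinality diagnoses):

```python
# Illustrative sketch, not from the slides: eps**q cancels out in the normalization.
def posterior(fls, m, q, eps):
    """Normalized eps**q * m**(-fl) over diagnoses of the same cardinality q."""
    w = [eps ** q * m ** (-fl) for fl in fls]
    return [wi / sum(w) for wi in w]

fls = [0, 1, 2]                              # how often each diagnosis failed to predict an outcome
print(posterior(fls, m=2, q=1, eps=1e-3))    # [0.571..., 0.285..., 0.142...]
print(posterior(fls, m=2, q=1, eps=1e-6))    # identical: the result depends only on the fl counts
```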

  19. Cost of Measurements • $(xi) is the quantity to minimize: $(xi) = Σk p(xi=vik) log p(xi=vik) + p(Ui) log m + 1 = Σk [p(Sik) + p(Ui)/m] · log[p(Sik) + p(Ui)/m] + p(Ui) log m + 1. This function is independent of ε and depends only on the fl counts • Special case: the (minimal-cardinality) diagnoses always predict outcomes => p(Ui) = 0 and fl = 0 => p(Δi|E) = 1 / (number of minimal-cardinality diagnoses) = 1/N′ • Minimizing the cost Σk p(Sik) · log p(Sik) = Σk (Cik/N′) · log(Cik/N′) amounts to minimizing Σk Cik · log Cik, where Cik = the number of diagnoses (among the N′) that predict xi = vik
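
In this special case the score reduces to counting how many diagnoses predict each value (illustrative sketch; the function name is mine, not the slides'):

```python
# Illustrative sketch, not from the slides: sum of C_ik * log(C_ik) over the possible outcomes.
import math
from collections import Counter

def score(predicted_values):
    """predicted_values: the value of x_i predicted by each candidate diagnosis."""
    return sum(c * math.log(c) for c in Counter(predicted_values).values())

print(score(["v1", "v2"]))   # 0.0: the two diagnoses disagree, so measuring x_i separates them
print(score(["v1", "v1"]))   # 2 ln 2 ≈ 1.39: all diagnoses agree, so the measurement is useless
```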

  20. Example: polybox • 2 single-fault diagnoses, {M1} and {A1}: {M1} ∪ SD ∪ OBS |= {X=4, Y=6, Z=6} and {A1} ∪ SD ∪ OBS |= {X=6, Y=6, Z=6} • $(X) = 1·ln 1 + 1·ln 1 = 0, the best! • $(Y) = 2·ln 2 ≈ 1.39 • $(Z) = 2·ln 2 ≈ 1.39
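
The slide's numbers can be reproduced with the counting score above; a self-contained check (illustrative Python, with the predictions of {M1} and {A1} as given on the slide):

```python
# Illustrative sketch, not from the slides: scoring the three candidate probes of the polybox.
import math
from collections import Counter

predictions = {"X": [4, 6], "Y": [6, 6], "Z": [6, 6]}   # values predicted by {M1} and {A1}
for var, vals in predictions.items():
    c = sum(n * math.log(n) for n in Counter(vals).values())
    print(var, round(c, 2))
# X 0.0   (the two diagnoses disagree on X, so measuring X discriminates them: the best probe)
# Y 1.39  (= 2 ln 2: both predict Y = 6, so measuring Y gives no information)
# Z 1.39
```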
