
Risk Theme




Presentation Transcript


  1. Risk Theme DIRC Research Conference 16 March 2005

  2. Summary • Scope of the theme • Issues for risk in computer-based systems • Exemplars of the Risk Theme approach • Outputs from the Risk Theme • Plans for the Risk Theme

  3. Scope • The risk literature is huge and spans many disciplines – need to focus. • Focus on computer-based systems, with a bias towards “everyday risk”: • How to improve the way we deal with risk (Ho, Mackie & Martin). • Develop approaches to new classes of risk (Baxter et al, Wherton). • Consciously take a cross-disciplinary approach – engaging with relevant disciplines.

  4. Issues • Risk Management: • Assessment of current techniques using social, psychological, economic, statistical viewpoints. • Strengthen current approaches to address a wider range of risks. • Ways to use the approaches with real systems. • New Kinds of Computer-Based Risk: • Trust failures, • Emergent Risk, • Adversarial Risk, … (e.g. LTCM studies) • Dissemination: guidance, standards

  5. Exemplars • Risk perception: strong sociological literature – important in the policy arena. • Risk in organisational structure: how does the development of new organisational structures affect risk? • Trust and Risk: increasingly, human trust relationships are computer mediated – what effect does this have on risk?

  6. Exemplar: Risk and Trust • Human trust is increasingly machine mediated – a source of potential failures (e.g. bed management system). • Trust is a natural link between Risk and Responsibility: A is responsible to B for W often means that B trusts A to do W in the current environment. • B usually can’t justify all the trust in A, so there is risk associated with trust.
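As an aside, the relation on this slide can be made concrete with a minimal Python sketch of how “A is responsible to B for W” induces a trust relation, with the unjustified residue marked as risk. The record fields and the evidence model are invented for illustration; they are not part of any DIRC formalism.

```python
from dataclasses import dataclass, field

@dataclass
class Responsibility:
    """'A is responsible to B for W' in the current environment."""
    holder: str                                  # A
    principal: str                               # B
    duty: str                                    # W
    evidence: set = field(default_factory=set)   # what B can actually verify
    required: set = field(default_factory=set)   # what full justification needs

    def implied_trust(self) -> str:
        # The slide's reading: the responsibility means B trusts A to do W.
        return f"{self.principal} trusts {self.holder} to do {self.duty!r}"

    def trust_gap(self) -> set:
        # B usually can't justify all the trust in A; the unjustified
        # residue is where the risk associated with trust attaches.
        return self.required - self.evidence

r = Responsibility(holder="A", principal="B", duty="W",
                   evidence={"track record"},
                   required={"track record", "current workload"})
print(r.implied_trust())   # B trusts A to do 'W'
print(r.trust_gap())       # {'current workload'} -> residual risk
```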

  7. Risk and Trust: Bed Management • Two main actors, M: the hospital bed manager, W: ward-level staff. • Different perceptions of danger: • M mostly worries about closing admissions (an administrative concern) • W mostly worries about disrupting the “normal” work (a professional concern). • Likelihood is dynamic and depends on current state. • Things are fine if risk is low (prompt, accurate reporting). • If close to capacity – should W report a bed coming free? • Decreases likelihood of M’s worries (reduces risk) • Increases likelihood of W’s worries (increases risk)
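The opposing movement of the two perceived risks can be shown with a toy model. The formulas and numbers below are invented for illustration only, not drawn from the actual bed management study; the point is just that near capacity, reporting a free bed lowers M’s risk while raising W’s.

```python
def manager_risk(occupancy: float, reported_free: int) -> float:
    """Likelihood M must close admissions: grows with occupancy,
    eased by each bed reported free (illustrative formula)."""
    return max(0.0, occupancy - 0.05 * reported_free)

def ward_risk(occupancy: float, reported_free: int) -> float:
    """Likelihood of disruption to 'normal' ward work: reporting a
    free bed near capacity invites an immediate admission."""
    return min(1.0, 0.1 + (0.3 * reported_free if occupancy > 0.9 else 0.0))

for occ in (0.6, 0.95):    # low load vs. close to capacity
    for free in (0, 1):    # W stays quiet vs. reports a free bed
        print(f"occupancy={occ:.2f} reported={free} "
              f"M-risk={manager_risk(occ, free):.2f} "
              f"W-risk={ward_risk(occ, free):.2f}")
```

At low occupancy, reporting a bed is nearly cost-free for W; near capacity, the same report cuts M’s risk but sharply raises W’s, which is exactly the tension the slide describes.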

  8. Risk and Trust: Dimensions • Taxonomy (meanings of trust) – affective, competence, honesty, … dealing with polymorphic aspect of trust. • Compositional accounts of trust – provenance, authentication, … • Diversity/Form of information sources (issues about market mechanisms, fault tolerance) (Ryan et al). • Bounded rationality (Gigerenzer, Tversky): Fast and Frugal Heuristics (Aagaard) • Formal models of trust (Peacock)

  9. Risk and Trust: Issues • Can we reduce “big” Trust to smaller, more acceptable trust in the presence of relevant evidence? • E.g. could we show that in an automated system we can reduce seemingly unacceptable levels of trust to those that are acceptable in non-automated systems? • Does the automated system support fast and frugal heuristics? • How do formal models of trust failure help with system design?

  10. Risk theme outputs • A “Risk Reader” – DIRC outputs with narrative commentary. • Case studies: • Risk in PiMS, Bed Management • SUN, QinetiQ, NATS? • Augmentation of existing Risk standards – NIST SP 800-30, EN 14971. • Short guide to applying DIRC techniques to risk analysis of computer-based systems. • Possibly a web-based body of knowledge.

  11. Plans • Risk Reader complete mid-2005. • Complete work on analysing risk aspects of case studies. • Develop stronger linkages to other themes – via risk perception, to Timing and Structure. • Trust and Risk seems to be a good point of contact with Responsibility. • Articulate linkage to Diversity from all the areas of work – particularly market/regulated diversity.

  12. The End

  13. Responsibility and dependability • Most of us can think of failures that occurred because of problems with responsibility. • For example, files were lost because A believed that B had responsibility for all backups, whereas B believed that all users had responsibility to back up their own files. • Failure of omission • Misunderstood responsibilities meant that something wasn’t done. • Failure of commission • Misunderstood responsibilities can mean that something is done in an ‘incorrect’ or unintended way.

  14. The notion of responsibility • Responsibility is a slippery concept because it can mean so many different things: • Responsibility for enacting a ‘workflow’ • X is responsible for the daily backup cycle • Responsibility for a ‘state of affairs’ • Y is responsible for system management • Responsibility for an ‘enterprise’ • Z is responsible for IT in the organisation • Responsibility for a ‘goal’ • The Home Office is responsible for national security
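One way to keep these senses from being conflated in a model is to give each its own type. A hedged sketch, not the theme’s own notation:

```python
from dataclasses import dataclass

@dataclass
class WorkflowResp:      # responsibility for enacting a 'workflow'
    holder: str
    workflow: str        # e.g. "daily backup cycle"

@dataclass
class StateResp:         # responsibility for a 'state of affairs'
    holder: str
    state: str           # e.g. "system management"

@dataclass
class EnterpriseResp:    # responsibility for an 'enterprise'
    holder: str
    enterprise: str      # e.g. "IT in the organisation"

@dataclass
class GoalResp:          # responsibility for a 'goal'
    holder: str
    goal: str            # e.g. "national security"

# Distinct types make the sense of each assignment explicit, so a
# 'goal' duty can't silently be treated as a 'workflow' duty.
duties = [WorkflowResp("X", "daily backup cycle"),
          GoalResp("Home Office", "national security")]
```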

  15. Problems with responsibility • Responsibility and authority • A is given the responsibility for B but not the authority to do things that allow them to fulfil that responsibility. • Responsibility and blame • If A is responsible for B and there is a failure of B then A may be blamed for that failure. A may therefore disclaim the responsibility. • Collective responsibility • The ‘responsible’ may not be a single, clearly identified role or person.

  16. Responsibility ‘failures’ • The ‘responsible’ who is assigned a responsibility is inappropriate. • The ‘responsible’ does not have adequate resources to discharge a responsibility. • The ‘success criteria’ for the responsibility are undefined or poorly defined. • The ‘responsible’ and/or the assigner of the responsibility misunderstand the nature of the responsibility. • ‘Responsibles’ have conflicting responsibilities.
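Several of these failure modes are mechanical enough to check automatically once responsibilities are modelled explicitly. A sketch under assumed record fields (again, not the theme’s notation) that flags three of them:

```python
from dataclasses import dataclass, field

@dataclass
class Assignment:
    responsible: str
    duty: str
    resources: set = field(default_factory=set)   # what the responsible has
    needed: set = field(default_factory=set)      # what discharge requires
    success_criteria: str = ""

def check(assignments, conflicts):
    """Flag three failure modes from the slide: inadequate resources,
    undefined success criteria, and conflicting responsibilities."""
    issues = []
    for a in assignments:
        missing = a.needed - a.resources
        if missing:
            issues.append(f"{a.responsible} lacks {sorted(missing)} for {a.duty!r}")
        if not a.success_criteria.strip():
            issues.append(f"{a.duty!r} has no defined success criteria")
    held = {}
    for a in assignments:
        held.setdefault(a.responsible, set()).add(a.duty)
    for holder, duties in held.items():
        for d1, d2 in conflicts:                  # pairs known to conflict
            if {d1, d2} <= duties:
                issues.append(f"{holder} holds conflicting duties {d1!r}, {d2!r}")
    return issues

print(check([Assignment("A", "run backups", set(), {"admin rights"}),
             Assignment("A", "audit backups", {"log access"}, {"log access"},
                        "quarterly audit report")],
            conflicts=[("run backups", "audit backups")]))
```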

  17. Objectives of the theme • To explore the nature of responsibility and its relationship with system dependability. • To propose notations and techniques that will help system procurers, designers and owners make responsibilities explicit and reason about these responsibilities. • To investigate how knowledge of responsibilities may be used to reduce vulnerabilities in operational socio-technical systems.

  18. Theme outcomes • A body of work documenting our understanding of the nature of responsibility. • A number of case studies illustrating the significance of responsibilities in different systems. • A notation or a number of notations for modelling responsibility. • Preliminary guidance on how knowledge of responsibilities can be used to inform systems design and operation. • These MAY be delivered as a book, a web site, a collection of papers, etc.

  19. Progress and plans • Progress • On the nature of responsibility. Initial paper discussing responsibility included in PA2 book. • Case studies. Analysis of the Ladbroke Grove rail accident from a responsibility perspective completed. • Responsibility modelling. Experiments with a notation for relating responsibilities to roles (documented in PA2 book). Initial proposals for a notation that allows reasoning about responsibility assignment and delegation. • Plans • Focus on development of responsibility models and case studies. Further case study and proposals for modelling notation and modelling experiments by mid-2005.

  20. Issues • Is responsibility purely a human notion, or can automated systems be considered ‘responsible’ for something? • Can we have a unified notation that covers both causal responsibility (who or what did something) and consequential responsibility (who takes the blame)? • How do we integrate this work with work on the Risk theme?
