
11:45 – 12:30      From IHE Profiles to conformance testing, closing the implementation gap






Presentation Transcript


  1. 11:45 – 12:30      • From IHE Profiles to conformance testing, closing the implementation gap • Helping the implementers, testing tools, connectathons • 12:30 – 13:30 Lunch Break • 13:30 - 15:00 • How to use IHE resources: hands-on experience • Technical Frameworks: navigating, Q&A • Test tools: finding, using, configuring • Participating in the testing process

  2. IHE Resources Eric Poiseau, INRIA, IHE Europe technical manager Charles Parisot, GE, IHE Europe

  3. Connectathon

  4. History

  5. Connectathon • Started in 1998 in Chicago within the RSNA HQ • Europe started in 2001 • Japan in 2003 • China and Australia are now also in the process

  6. Charenton le pont 2001 • 11 companies • 18 systems • 40 m2 • 30 participants Formation IHE France

  7. Paris 2002 • 33 companies • 57 systems • 130 m2 • 100 participants

  8. Aachen 2003 • 43 companies • 74 systems • 350 m2 • 135 participants

  9. Padova 2004 • 46 companies • 78 systems • 600 m2 • 180 participants

  10. Noordwijkerhout 2005 • 75 companies • 99 systems • 800 m2 • 250 participants

  11. Barcelona 2006 • 67 companies • 117 systems • 1500 m2 • +250 participants

  12. Berlin 2007 • Companies • systems • 1500 m2 • +300 participants

  13. Oxford 2008 • 83 companies • 112 systems • 1500 m2 • 300 participants

  14. C.A.T Participation in Europe Berlin Oxford Barcelona Noordwijkerhout Padova Aachen Paris Paris

  15. Purpose • Test the implementation of the integration profiles within products • Verify that the vendors did a good job • Verify that what the committees invented makes sense ! • Verify that the text is clear enough • Verify that the committee did not miss anything • Build a community of …

  16. Computer geeks…

  17. …who like to enjoy locally brewed beers

  18. From the vendor perspective • Unique Opportunity for vendors to test their implementations of the IHE integration profiles • Controlled environment • Customer is not present ! • Not in a clinical production environment • Specialists available • From SDO • From the peer companies • Bugs are identified and most of the time fixed !!!! • Connectathon Result Matrix • http://sumo.irisa.fr/con_result

  19. But… • Testing is sub-optimal • Only a part of all the possible tests are performed • A system successful at the connectathon is not guaranteed to be error free !!!! • We do not do certification !

  20. From the IHE perspective • Feedback from the vendor community • Did the committee do a good job ? • Did the developed integration profile respond to a demand of the vendors ?

  21. European C.A.T • We have now reached our cruising speed • NA and EU C.A.T are very alike • C.A.T used as an IHE promoting tool • Workshops in parallel to the C.A.T • Berlin : ITEG • Oxford • Vienna

  22. C.A.T. Model

  23. The IHE testing process (diagram): IHE Technical Framework (profile specifications) → vendors implement profile actors → in-house testing with testing tools developed by the sponsors' project management team → approved test logs → Connectathon → product + Integration Statement → sponsored exhibits and demonstrations → deployed systems → users' testing results Projet IHE-Dev Inria Rennes

  24. Pre-connectathon

  25. Pre-connectathon • Registration • See what can be tested • Exchange of configuration parameters • IP addresses • AE Title • Assigning authorities • OID • Certificates • Affinity domain specification
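The configuration parameters listed above have well-known shapes: IPv4 addresses are four octets, DICOM AE Titles are limited to 16 characters, and OIDs are dot-separated digit sequences. A minimal sketch of sanity-checking such an exchange, with invented field names (the real registration tooling differs):

```python
import re

def check_config(cfg):
    """Sanity-check a hypothetical pre-connectathon configuration record."""
    errors = []
    # IPv4 address: four dot-separated octets, each in 0-255
    octets = cfg.get("ip", "").split(".")
    if len(octets) != 4 or not all(o.isdigit() and 0 <= int(o) <= 255 for o in octets):
        errors.append("invalid IP address")
    # DICOM AE Titles are limited to 16 characters
    if not (0 < len(cfg.get("ae_title", "")) <= 16):
        errors.append("AE Title must be 1-16 characters")
    # OIDs are dot-separated sequences of digits, e.g. 1.2.840.x
    if not re.fullmatch(r"\d+(\.\d+)+", cfg.get("oid", "")):
        errors.append("malformed OID")
    return errors

# Example record (values invented for illustration)
print(check_config({"ip": "192.168.1.10", "ae_title": "PACS_STORE", "oid": "1.2.840.99999.1"}))  # []
```

Catching a malformed AE Title or OID at registration time is far cheaper than discovering it on the connectathon floor.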

  26. Pre-connectathon • MESA testing • In-house testing for vendors to get ready • Vendors return logs • Participation in the C.A.T is accepted upon log return

  27. At connectathon

  28. Connectathon Testing • 3 types of tests to be performed • No peer tests • Peer to peer tests • Workflow tests Participant Workshop

  29. No Peer Tests • Calibration Tests - CPI : • screen calibration • printer calibration • Scrutiny Tests • Verify that the objects created are « valid » • Provide peers with samples

  30. Peer To Peer Tests (P2P) • Test subsections of a workflow between 2 vendors • Preparation for the workflow tests • Vendors choose when to run them • Vendors select their peers • Not to be run with other systems from the same company

  31. Workflow Tests • Test an entire workflow that may combine more than one integration profile • We have a schedule; vendors need to be ready at the time of the test. • We have a list of difficulties to check. • Some tests can run in 15 minutes • Some will require more than an hour • No second-chance test
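A workflow test differs from a peer-to-peer test in that it chains several exchanges in order, and a single failed step fails the run. A minimal sketch of that "all steps must pass, no second chance" structure, with step names invented for illustration:

```python
def run_workflow(steps):
    """Run ordered workflow steps; stop at the first failure (no retry)."""
    for name, step in steps:
        if not step():
            return f"FAILED at {name}"  # the whole workflow run fails
    return "PASSED"

# Hypothetical scheduled workflow spanning more than one integration profile
steps = [
    ("order placed", lambda: True),
    ("study acquired", lambda: True),
    ("report stored", lambda: True),
]
print(run_workflow(steps))  # PASSED
```

Real workflow tests involve several vendors' systems at once, which is why they are scheduled and why every participant must be ready at the agreed time.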

  32. 5 days • Monday morning till 11 am • Set up time • Till Friday noon : • Free peer to peer and no peer testing • From Wednesday till Friday noon : • Directed workflow testing

  33. Monitors • Volunteers • Independent from vendors • Standards specialists • Verify tests • Act as moderators between vendors

  34. Results • Failures are not reported • To be successful • Each peer to peer test needs to be verified with at least 3 peers • There are some exceptions • A vendor may fail for one actor but pass for the others

  35. Connectathon Results • IHE does not report failures • Public results only at the company level • IHE will never tell you what systems participated in the connectathon • Vendors have access to their own test results.

  36. Connect-a-thon Results Browser

  37. Connectathon Results Browser

  38. Connectathon Results Browser

  39. What does it mean ? • The company was successful at the connectathon for the actor/integration profile combination • Results do not guarantee product conformity • This is the role of the « IHE Integration Statements »

  40. IHE Integration Statement

  41. Participation Fees • First system € 2750 • Other systems € 2850 • Per domain € 750 • Covers : • Infrastructure : room, power, monitors, internet… • Lunch and coffee breaks for 2 engineers during 5 days

  42. Next Connectathon • Where : Remise, Vienna, Austria • http://www.koop-kundenweb.at/remise/ • When : Monday 20th April to Friday 24th April 2009 • Registration : November 1st – January 7th 2009 • Announcement to be released soon

  43. CAT : conclusion

  44. C.A.T : Conclusion • It’s not a certification process • Unique opportunity for vendors to test and discuss • Seems to be useful, as proved by increased participation over the years • Sure, it needs improvement… • … but we are working on it

  45. Testing

  46. Before we start • Impossible to test everything • What we do not test • Design • Performance (Load) • What we are looking for • Interoperability • Conformity

  47. Conformance / Interoperability (diagram): conformance testing checks Implementation A and Implementation B each against the specifications/standards; interoperability testing checks Vendor A's and Vendor B's implementations against each other

  48. Conformance Testing (1/2) • Is unit testing • Tests a single ‘part’ of a device • Tests against well-specified requirements • For conformance to the requirements of the specification and the referenced standards • Usually limited to one requirement per test • Tests at a 'low' level • At the protocol (message/behaviour) level • Requires a test system (and executable test cases) • Can be expensive; tests performed under ideal conditions

  49. Conformance Testing (2/2) • High control and observability • Means we can explicitly test error behaviour • Can provoke and test non-normal (but legitimate) scenarios • Can be extended to include robustness tests • Can be automated and tests are repeatable • Conformance Testing is DEEP and NARROW • Thorough and accurate but limited in scope • Gives a high level of confidence that key components of a device or system are working as they were specified and designed to do

  50. Limitations of Conformance Testing • Does not prove end-to-end functionality (interoperability) between communicating systems • Conformance tested implementations may still not interoperate • This is often a specification problem rather than a testing problem! Need minimum requirements or profiles • Does not test a complete system • Tests individual system components, not the whole • A system is often greater than the sum of its parts! • Does not test functionality • Does not test the user’s ‘perception’ of the system • Standardised conformance tests do not include proprietary ‘aspects’ • Though this may well be done by a manufacturer with its own conformance tests for proprietary requirements
