
Presentation Transcript


  1. SDCT: Software Diagnostics and Conformance Testing, Program Review. Mark W. Skall, Division Chief; Candy Leatherman, Secretary. Email: skall@nist.gov; Tel. No.: 301-975-3262; Div. Web Site: http://www.itl.nist.gov/div897/

  2. Division 897 Funding - FY00 • Projected Balance: $257,062 (+ $637,570 transferred to 2001 OA Reserve) • OA - $1,615K • ITS - $585K • STRS - $5,207K • Employees: 35 Full Time, 3 Part Time, 10 Intermittent, 5 Guest Researchers

  3. SDCT’s Goal • To improve the quality of software in industry through the development of: • Technology • Measurements • Standards • Philosophy: • Concentrate on key areas at the forefront of technology • Get involved early and partner with industry • Fill industry void • Transfer technology and move on • Specific Strategy: • Conformance Tests • Reference Implementations • Diagnostic Tests • Research to reduce costs of testing • Reference Data • Standards developed jointly with industry

  4. Excerpts from Assessment Panel Report – 6/26/00, Division Review: “The planning and documentation methods used by SDCT could serve as a model for other ITL divisions. The process has used clearly-defined criteria to select state-of-the-art programs with clearly defined priorities and goals, well-identified roles, and measurable contributions to national and international standards organizations. Examples include the work on conformance tests for XML, Distributed Interactive Learning Systems, and for DASE . . . Directions can change quickly within the software industry. The division carefully monitors these changes and acts appropriately. . . . Industry has provided many testimonials concerning the great value of the division’s standards work, including: ‘The OASIS-NIST XML Conformance test suite is critical for our industry’, Norbert Mikula, CTO, OASIS; ‘NIST made strong contributions (X3D) and resolved our knottiest problems’, Don Brutzman, Board of Directors, Web3D consortium; and ‘The high quality test suite and certification program is an invaluable resource for ATA’, Robert Peel, Director of Airworthiness and Standards.”

  5. Excerpts from Assessment Panel Report – 6/26/00, Laboratory Level Review: “New projects are started based on the importance of the work to U.S. industry, and the work viewed by the panel generally had specific, focused goals. An example of this approach is the work on XML done in SDCT. This project is notable because it addresses a major standard on which industry was making little progress, and it also effectively leveraged skills and approaches developed in a now terminated project on VRML. . . Another example of the improved planning process is the existence of clear termination criteria, which have been useful in sunsetting projects. For example, the Real-Time Java project and the work on RBAC satisfied their completion criteria and therefore ended this year.”

  6. SDCT Software Diagnostics and Conformance Testing • Standards and Conformance Testing Group • Software Quality Group • Interoperability Group

  7. SCTG Standards and Conformance Testing Group • Development of conformance tests • Development of reference implementations • Research into better ways to do conformance testing • Development of standards jointly with industry Group Leader: Lynne Rosenthal Supv. Proj. Leader: Lisa Carnahan Clare Lucey, Secretary

  8. SCTG Standards and Conformance Testing Group Mary Brady Lisa Carnahan Laurent Ciarletta (GR) Anthony Cincotta (S) Chris Dabrowski Alden Dima Leonard Gallagher Leonard Gebase Neil Gima (S) Alan Goldfine Martha Gray Michael Kass Clare Lucey Thomas Logue Carmelo Montanez-Rivera Thomas Rhodes Richard Rivello Jacqueline Schneider John Tebbutt Marie-Noelle Terasse (GR) Mark Zimmerman

  9. Division 897 Standards and Conformance Testing Group

  10. SQG Software Quality Group • Develop methods to automate software testing • Develop software diagnostic tools • Develop reference data • Formal methods Group Leader: John Barkley Debbie Blackstone, Secretary

  11. SQG Software Quality Group Tamer Ahmed (GR) Paul Ammann (F) Debbie Blackstone David Brinkley (F) Paul Black Neva Carlson John Cherniavsky Anthony Cincotta Keith Gallagher (F) Roger Gima (S) Michael Koo Mary Laamanen James Lyle William Majurski Douglas White

  12. Division 897 Software Quality Group

  13. IG Interoperability Group • Ensure Federal agency requirements are provided as input to voluntary standards committees - work through the Federal CIO Council • Help Federal agencies and industry achieve interoperability through application of Division products • Provide technical support to voluntary standards committees • Support the NIST paperless office effort using digital signatures Acting Group Leader: Lisa Carnahan Vacant, Secretary

  14. IG Interoperability Group Daniel Allen Bruce Bargmeyer (RA) Frederick Boland Gary Fisher Larry Fitzwater (GR) Elizabeth Fong Kathryn Harvill Roy Morgan Judith Newton Gertrude Sherwood

  15. Division 897 Interoperability Group

  16. Strategy • Conformance Tests • Reference Implementations • Standards developed jointly with industry • Diagnostic Tests • Research to reduce costs of testing • Reference Data

  17. Development of Conformance Tests • Emphasis on newer technologies • Tests developed in parallel with standards • Tests developed before implementations • Need buy-in from industry • Need high impact • Need technology transfer

  18. Development of Conformance Tests (cont’d) • Legacy activities • CGM • SQL • POSIX • Ada and other programming languages • VRML • Current activities • XML and companion standards (DOM, XSL, etc.) • Digital TV • X3D • Learning Technologies (IMS)

  19. XML Conformance Testing Project Team: Mary Brady Carmelo Montanez Ricky Rivello Mark Zimmerman

  20. [Diagram: XML Technologies sit between the Internet / World Wide Web and Electronic Commerce applications: B2C (business-to-consumer), e.g., Interactive Television (DASE, SMPTE) and Distributed Learning (IMS); B2B (business-to-business), e.g., Manufacturing (ICM).]

  21. Why NIST? • Electronic Commerce Growth • Growth is exponential, already outpacing last year’s predictions • Changing the way businesses do business • Revolutionizing the way we consume and play • Advances are necessary in: • Tools for describing / sharing information • XML technologies • Applying IT to solve vertical market demands • NIST provides: • XML Technologies • Conformance Testing • NIST developed tests, available on-line, for use in testing applications • NIST technical leadership, neutral third-party • Unbiased Feedback of Specification Errors • Vertical Markets • Applied IT research

  22. What is XML? • Domain-specific languages • Data separate from display • Self-describing data • Schemas. HTML or XML?

      Figure 1: HTML Purchase Order
        <H1>Purchase Order</H1>
        <UL>
          <LI><b>Mary Brady</b>
          <LI>NN Rm 572, Gaithersburg, MD, 20899
          <LI>1234
          <LI><I>Trees</I>
          <LI>50.00
          <LI>20
          <LI>1000.00
          <LI>Be sure to water the root ball day of delivery
          <LI>Susan Carscadden
          <LI>NIST, Bldg 301, Gaithersburg, MD, 20899
        </UL>

      Figure 2: XML Purchase Order
        <Purchaser>
          <Name>Mary Brady</Name>
          <Address Street="NN Rm 572" City="Gaithersburg" State="MD" Zip="20899"/>
        </Purchaser>
        <Item>
          <Part_number>1234</Part_number>
          <ItemDescription>Trees</ItemDescription>
          <Cost>50.00</Cost>
          <Quantity>20</Quantity>
          <TotalCost>1000.00</TotalCost>
          <Instructions>Be sure to water the root ball day of delivery</Instructions>
        </Item>
        <ShipTo>
          <Name>Susan Carscadden</Name>
          <Address Street="NIST, Bldg 301" City="Gaithersburg" State="MD" Zip="20899"/>
        </ShipTo>
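For illustration only (not part of the original slides), a purchase-order vocabulary like the one in Figure 2 could be constrained by a small DTD, which is what makes the data self-describing and machine-validatable. The hypothetical PurchaseOrder root element and the content-model choices below are assumptions; the element and attribute names simply mirror the figure:

        <!-- Hypothetical DTD for the Figure 2 vocabulary -->
        <!ELEMENT PurchaseOrder (Purchaser, Item+, ShipTo)>
        <!ELEMENT Purchaser (Name, Address)>
        <!ELEMENT Item (Part_number, ItemDescription, Cost, Quantity, TotalCost, Instructions)>
        <!ELEMENT ShipTo (Name, Address)>
        <!ELEMENT Name (#PCDATA)>
        <!ELEMENT Address EMPTY>
        <!ATTLIST Address Street CDATA #REQUIRED
                          City   CDATA #REQUIRED
                          State  CDATA #REQUIRED
                          Zip    CDATA #REQUIRED>
        <!ELEMENT Part_number (#PCDATA)>
        <!ELEMENT ItemDescription (#PCDATA)>
        <!ELEMENT Cost (#PCDATA)>
        <!ELEMENT Quantity (#PCDATA)>
        <!ELEMENT TotalCost (#PCDATA)>
        <!ELEMENT Instructions (#PCDATA)>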

  23. [Diagram: the XML technology stack. Applications at the top: Electronic Commerce, Distance Learning, Health Care, Manufacturing, Interactive Television. Below them, domain-specific vocabularies and registries/repositories: E-Business, Learning Objects, XML/EDI, XML.ORG, BizTalk, Procurement, Education, Medical Boards, Auto Parts, Ratings, Libraries, Control. Supporting technologies: Programming Languages, Transformations, Formatting Objects, Signatures, Stylesheets, Query Language, DOM, Namespaces. At the base, information description: XML Syntax, Information Set, Schema, Linking, Fragment.]

  24. XML Conformance Testing • Partners: OASIS XML Conformance Committee (Sun, IBM, Fuji Xerox, DataChannel, MicroStar, W3C members) • XML Test Suite (an illustrative test entry follows): 1000 XML tests; DTD + 4000 lines of XML; 400 lines of XSL stylesheet • Why NIST? Technical leadership; coalesced industry partners (18 months) • First deliverable (6 months): XML Test Suite, XML files & XSL stylesheet • Future work (continued support): XML stylesheets, Schemas; XML NS, XLink, XPtr
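As a hypothetical illustration of what a single entry in an XML conformance test suite can look like (the element and attribute names below are invented for this sketch, not taken from the OASIS/NIST suite), each test typically pairs a small input document with the outcome a conforming processor must produce:

        <!-- Catalog entry: a non-well-formed test; a conforming XML
             processor must report a fatal error for this input. -->
        <TestCase id="nwf-attr-001" type="not-wf" input="nwf-attr-001.xml">
          <Description>Attribute values must be quoted.</Description>
        </TestCase>

        <!-- Contents of nwf-attr-001.xml: the unquoted attribute value
             violates the XML 1.0 well-formedness rules. -->
        <Address Street=NN Rm 572></Address>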

  25. DOM Conformance Testing • Partners: W3C, OASIS; Sun, IBM, Microsoft, Netscape, Oracle • DOM Test Suite: 300 ECMAScript tests; 14,500 lines of code • XML ’98: met with W3C WG Chair; NIST asked to develop tests • June ’99: NIST released Fundamental and Extended (ECMAScript) tests and an interactive test harness • Future work (continued support): Fundamental and Extended (Java); HTML (ECMAScript & Java); DOM Level 2

  26. FY00 Progress - Tests • Completed XML Test Suite (release 1) • Completed XML Test Suite (release 2) • Incorporated new XML tests and fixes • Completed DOM-ECMAScript Test Suite (release 1) • Completed DOM-Java Test Suite (release 1) • Developing XSLT Test Suite • in partnership with Lotus Corp. and OASIS (see the sketch below)
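A minimal sketch of the kind of case an XSLT test suite exercises (the file names and the expected-output conventions here are assumptions for illustration, not details of the NIST/Lotus/OASIS suite): each test pairs a source document and a stylesheet with the output a conforming processor must generate.

        <!-- source.xml -->
        <order><item>Trees</item></order>

        <!-- test.xsl: copy the item name into an HTML list -->
        <xsl:stylesheet version="1.0"
                        xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
          <xsl:output omit-xml-declaration="yes"/>
          <xsl:template match="/order">
            <ul><li><xsl:value-of select="item"/></li></ul>
          </xsl:template>
        </xsl:stylesheet>

        <!-- expected.out -->
        <ul><li>Trees</li></ul>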

  27. FY00 Progress - Committee Leadership • Chair OASIS Conformance and XML Testing Committees • Provide leadership for XSLT and Schema Testing committees • Provide leadership for UN/CEFACT ebXML initiative (conformance and reg/rep) • Provide conformance guidance • Develop conformance clause • Harmonize OASIS and ebXML efforts

  28. FY01 Plans • Complete XSLT test suite • Develop test suites for XML Schema and other XML Technologies • Develop XML-based automated test tools to improve test development • Continue to chair OASIS testing committees

  29. FY00 Resources • 3.5 FTE • $700K (STRS) FY01 Resources • 5 FTE • Travel required to collaborate in standards and test development efforts • Equipment: $10K

  30. XML Conformance Testing Customers: • IT industry: e.g., IBM, Sun, Microsoft, Oracle, DataChannel, Documentum • Vertical industries: e.g., Education (IMS), Financial (e.g., Dun and Bradstreet), Travel (e.g., Sabre) Impact: • Conformance test suites have been used to improve the quality of XML/DOM processors (many of which are embedded in other XML software solutions). These are used in many vertical markets by millions of customers. • Feedback to standards developers improves the specifications

  31. Interactive TV Project Team: Alan Goldfine John Barkley Doug White Len Gebase

  32. Interactive TV • TV and the Internet are each recognized as “a technology …that seems to change everything” (BW, Oct 4 ‘99) • Interactive TV is the convergence of TV and the Internet • Profound impact on Electronic Commerce expected • WebTV: 1 million subscribers, 350 hrs/wk of programming • AOLTV: projected 2 million subscribers in 2 years • ITL partnering to develop the standards and tests required for the success of Interactive TV

  33. Interactive TV Traditionally separate media are converging: • Wireless phones do email • Internet does broadcast radio and TV • Broadcast TV references links to Web pages

  34. Interactive TV [Diagram] Currently, for the viewer, TV (non-interactive) and the Web (interactive) are accessed separately. With DASE/DDE, TV and the Web are converging into a single medium.

  35. Interactive TV Demonstration [Diagram] Current production: golf matches on TV and a Web site with associated products; with DASE/DDE, a single medium.

  36. Interactive TV Demonstration

  37. Interactive TV Advanced Television Systems Committee (ATSC) • Members include: ABC, CBS, IBM, Intel, Lucent, Microsoft, NBC, Warner Brothers • DASE (DTV Application Software Environment): Standard for a platform-independent, high-level abstraction for integrating DTV and the Internet • DASE specifies the programming environment for DTV receivers • Downloadable DASE applications portable across different TV set tops

  38. Interactive TV NIST’s Role in ATSC DASE: • Joint project with the High Performance Systems and Services Division (895) • Chair DASE Conformance Working Group where key players include: ABC, Gateway, Microsoft, Sun • Partner with Unisoft to develop conformance tests • Reference implementation for programming environment

  39. Interactive TV Society of Motion Picture and Television Engineers (SMPTE): • Members include: ABC, CBS, IBM, Intel, Microsoft, NBC, Warner Brothers • DDE (Declarative Data Essence): Standard for Internet content and bindings to analog and digital streams (formerly ATVEF) • Declarative Content: HTML, CSS, ECMAScript, DOM, Triggers (for synchronizing the two media; an illustrative trigger follows)
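To make the trigger concept concrete: a trigger is a short text message carried with the broadcast stream that points the receiver at related Web content and can fire a script when the content is already loaded. The line below only illustrates the general shape (a URL plus bracketed attributes and a checksum); the URL, attribute values, and checksum are invented here, and the exact syntax is defined by the ATVEF/DDE trigger specification:

        <http://www.example.com/golf/leaderboard.html>[name:Leaderboard][expires:20001015T235959][script:showPanel()][8F2A]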

  40. Interactive TV NIST’s Role in SMPTE DDE: • Contribute text to specification • Partner with Unisoft to develop conformance tests • Develop conformance tests for triggers and stream binding to integrate with existing Unisoft DDE element tests • Develop prototype receiver testbed • Participate in Applications Data Essence (ADE) Study Group

  41. FY00 Progress • Chair DASE conformance task group • Conformance section of the DASE Standard completed • Review of DASE conformance test assertions from Unisoft completed • DDE Standard completed, adoption underway • DDE Bindings Standard underway • Test assertions for DDE triggers underway • Testbed for DDE receivers underway • ADE Study draft completed

  42. FY01 Plans • Continue to chair DASE conformance group • DASE Standard adopted • Continue DASE conformance test development with Unisoft • DDE and DDE bindings Standards adopted • Develop DDE conformance tests for triggers, integrate with Unisoft tests • Develop test assertions for DDE bindings • Complete DDE Receiver testbed • Begin DDE-2 Standard development

  43. FY00 Resources • 2 FTE • $400K (STRS, ATP-$100K) FY01 Resources • 4.5 FTE • Equipment: $20K • Travel required for standards meetings and testing collaborations

  44. Interactive TV • Customers • IT, e.g., IBM, Microsoft • broadcast TV, e.g., NBC, DirecTV • entertainment, e.g., Disney, Universal • consumer electronics, e.g., RCA, Intel • Impact • New Medium: TV/Internet convergence • Interactive applications portable across receivers from different manufacturers • ATSC and SMPTE invited NIST participation • NIST chairs DASE Conformance Group

  45. X3D Conformance Testing Project Leader: Mike Kass

  46. Objectives • Continue the NIST/Web3D partnership begun in 1996 • ITL created the VRML Test Suite (VTS) and Viper parser • Create an X3D Test Suite (XTS) • Modify VRML test requirements and test descriptions • Translate existing VRML tests to X3D format (an illustrative fragment follows) • Chair the Interoperability/Conformance WG • Interface with developers and specification writers to resolve ambiguities in the specification • Provide feedback to developers on browser conformance
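X3D is, in essence, an XML encoding of VRML scene content, so translating the VTS to the XTS amounts to re-expressing each test scene in XML. The fragment below is a hypothetical example of that encoding (a single red box), not a test taken from the suite:

        <Scene>
          <Shape>
            <Appearance>
              <Material diffuseColor="1 0 0"/>
            </Appearance>
            <Box size="2 2 2"/>
          </Shape>
        </Scene>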

  47. FY00 Progress • Completed translation of VTS to X3D Test Suite (XTS) • Developed test harness to enable developers to contribute tests • Interest from DraW and Sun in using the test suite and making the results public

  48. FY01 Plans • Low-level support: assisting the community in using the VRML and X3D Test Suites • No new development planned; maintenance as needed

  49. FY00 Resources • 0.75 FTE • $150K (STRS) FY01 Resources • 0.2 FTE (minimal) • No Travel is expected

  50. X3D Conformance Testing Customers: • X3D Consortium including Sun, Sony, DraW Computing, Shout, Blaxxun Impact: • Conformance test suites are being used to improve the quality of X3D browsers • DraW Computing has said that it would make its test results public
