
WS Test Forum



Presentation Transcript


  1. WS Test Forum Doug Davis 01/19/2010

  2. Life without WSTF
  • Existing TCs/WGs mainly focus on their own specs
  • Very few interoperability issues are actually found during spec-wise interoperability testing - scenarios are very tightly scripted/scoped
  • Scenarios are not based on customer usages
  • Real-world problems aren't visible at the spec level, so they aren't identified
  • WS-I is limited in the issues/scenarios it may investigate
    • Limits its work to specs that have exited SDOs
    • WS-I has set a rather high bar to begin work on new profiles (Board approval)
  • No forum for ongoing or forward-looking testing
    • Nor is there a forum for testing new specs under development
    • Where do customers/vendors go if they have questions or issues?
  • Cost of interop testing is too high
    • Vendor Product Development/SQA teams require expertise in installing, deploying, and developing against competitive products in order to test interoperability

  3. WSTF Goals
  • New forum focused on developing and testing non-trivial scenarios
    • Mainly customer driven, but allows for speculative scenarios as well
    • Provides a means of continuous, long-term interoperability/regression testing
  • Strengthen the WS foundation we've provided to the community
    • Help expose interoperability issues before our customers see them
    • Make it easy for new specs to be added as they are developed
  • Provide a "shared" test bed available for the entire WS community to use
    • MxN testing - similar to the SOAP encoding tests used for SOAP many years ago
    • Save time/effort - each company doesn't need to set up everyone else's endpoints
    • Can test before products are shipped!
    • Intention is for product teams to host/use these endpoints
  • Pipeline for new specs/profiles/fixes to existing specs
    • Identify potential gaps in our overall architecture
    • Which should lead to fixing those gaps by working with the appropriate spec/profile owners
    • Requirements for new profiles, v.Next work for existing specs, or even new specs

  4. Main Deliverable: Scenarios
  • Scenarios can be just about anything
    • Unit test cases for WS specs
    • Profile conformance tests
    • Customer-focused usage scenarios
  • Each scenario consists of:
    • Description of the problem
    • Set of "testcases" (message flows) with expected behavior from all parties
    • WSDL, XSD, sample messages
    • Findings
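Since a scenario bundles WSDL, XSD, and sample messages, here is a minimal sketch of what one sample testcase message might look like and how a participant could sanity-check it; the "Echo" operation, its namespace, and the element names are hypothetical, not taken from any actual WSTF scenario:

```python
import xml.etree.ElementTree as ET

# Hypothetical sample message for an "Echo" testcase; a real WSTF scenario
# would ship the corresponding WSDL and XSD alongside messages like this.
SAMPLE_MESSAGE = """\
<soap:Envelope xmlns:soap="http://www.w3.org/2003/05/soap-envelope">
  <soap:Body>
    <Echo xmlns="http://example.org/wstf/echo">
      <Text>hello</Text>
    </Echo>
  </soap:Body>
</soap:Envelope>"""

SOAP_NS = "{http://www.w3.org/2003/05/soap-envelope}"
ECHO_NS = "{http://example.org/wstf/echo}"

# Parse the envelope and pull out the request payload.
envelope = ET.fromstring(SAMPLE_MESSAGE)
body = envelope.find(SOAP_NS + "Body")
echo = body.find(ECHO_NS + "Echo")
print(echo.findtext(ECHO_NS + "Text"))  # -> hello
```

Having the expected request and response messages spelled out this concretely is what lets every participant check their stack's wire format against the same reference.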

  5. WSTF Testing
  • Each scenario also includes:
    • List of participants (list of endpoints)
    • Test result matrix - run each night (testing client → service)
  • Each testing participant can host:
    • Client - typically with a nice Web interface
    • Service
    • Testing client interface (HTTP)
  • Ancillary info to aid users: config docs, code
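The nightly run described above is MxN testing: every participant's client is exercised against every participant's service. A minimal sketch of such a runner, with made-up participant names and a stubbed-out test call standing in for the real HTTP testing-client interface:

```python
from itertools import product

# Hypothetical participants; WSTF pairs every client with every service.
clients = ["VendorA", "VendorB", "VendorC"]
services = ["VendorA", "VendorB", "VendorC"]

def run_testcase(client, service):
    """Stub for driving one participant's HTTP testing-client interface
    against another participant's service endpoint."""
    # A real runner would POST to the client's HTTP interface and
    # collect the outcome; here we just report success.
    return "pass"

# Build the MxN test-result matrix produced by one nightly run.
matrix = {(c, s): run_testcase(c, s) for c, s in product(clients, services)}

for (c, s), result in sorted(matrix.items()):
    print(f"{c} -> {s}: {result}")
```

The matrix view is what makes the shared test bed pay off: one hosted endpoint per participant yields MxN results without each company installing everyone else's products.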

  6. WSTF Details
  • Most work is private (mailing list, scenario development, votes...)
    • Want people to feel free to discuss issues without looking bad
    • However, very low bar for entry - sign a Participation Agreement
  • Scenarios can be made public only after a vote by the implementers
    • Votes: requires at least 5 implementations + a 2/3 'yes' vote
    • Provides simple filtering to prevent diluting the value of the group
    • Shows broad industry support for published scenarios
  • Unpublished scenarios can still be tested, but nothing is made public
    • They can be "announced", though, with a 2/3 'yes' vote and 3 implementations
    • An "announced" scenario just has its abstract made public
  • Testing
    • Each scenario will have a list of endpoints (private and public lists)
    • Endpoints are expected to be "long lived"
    • New implementations (non-members) can test at will using 'published' scenarios and endpoints
  • Interop issues are brought to the appropriate forum(s) by individuals, not the group itself
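The two voting rules above (publish: 5 implementations + 2/3 'yes'; announce: 3 implementations + 2/3 'yes') can be expressed as one check; the function name and signature are my own, only the thresholds come from the slide:

```python
def scenario_vote_passes(yes_votes, total_votes, implementations,
                         min_impls=5, threshold=2 / 3):
    """Return True if a scenario vote meets WSTF's bar.

    Defaults model the 'publish' rule (>= 5 implementations and a 2/3
    'yes' vote); pass min_impls=3 for the 'announce' rule.
    """
    return implementations >= min_impls and yes_votes >= threshold * total_votes

# Publish: 5 implementations, 5 of 6 voting 'yes' -> passes.
print(scenario_vote_passes(5, 6, 5))
# Publish: only 4 implementations -> fails regardless of the vote.
print(scenario_vote_passes(6, 6, 4))
# Announce: 3 implementations, 4 of 6 voting 'yes' -> passes.
print(scenario_vote_passes(4, 6, 3, min_impls=3))
```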

  7. Deliverable: Open Community
  • Provide a community where most WS vendors can be queried at once
    • Provides the best chance for unbiased answers
  • Provide formal and informal guidance on "WS Best Practices"
    • Scenario findings
    • Mailing lists - e.g. review of profiles, specs...

  8. Things you should know...
  • Zero cost - no dues or fees
    • IBM pays for the web site (but no IBM logos, etc...)
  • Degree of participation is voluntary
    • Members participate in any scenario they wish
    • Varying degrees of involvement: Monitor → Scenario Advocate → Scenario Implementer
    • Anyone can simply monitor the mailing lists, however...
    • Goal: WS vendors become "implementers"
    • Goal: WS customers keep vendors grounded in reality - through monitoring and advocating/suggesting scenarios
  • Pseudo open-source model
    • Community driven - similar to soapbuilders
    • Code talks - limited politics (we hope)
  • Not limited to just SOAP-style Web Services
    • Looking at testing REST Web Services as well
    • Nor are we precluded from doing domain-specific testing, e.g. schema/parsing validation testing

  9. WSTF Status
  • Went "live" on Dec 8th, 2008
    • 21 organizations / 103 members so far
    • Vendors, customers, industry consortiums, standards orgs - and individuals
  • Web site: http://www.wstf.org
    • Charter and Participant's Agreement are on the web site
  • Initial scenario development - three scenarios have been published:
    • Two "spec-ish" - testing SOAP and WS-Addressing
      • Basic ones to test the infrastructure of the group and the SOAP stacks
      • Provide a baseline for more advanced scenarios
    • One business oriented - Purchase Order
    • More under development
  • We're already seeing external impact (fast turn-around):
    • Issues in WS-I and OASIS (next slide)
    • Customers identified holes in the architecture (e.g. JAX-WS)

  10. External impact so far...
  • Guidance on faults for one-ways
    • BP issue BP20116 - resolved per our recommendation
  • Ambiguity around a rejected RM CS/Offer
    • RSP issue i150 - resolved per our recommendation
  • WS-Trust WSDL was not BP compliant
    • WSSX issue 169 - resolved per our recommendation
  • WSA fault processing rules for bad EPRs
    • BP issue BP20121 - resolved per our recommendation
    • BP issue BP20126 (2.0) / BP12050 (1.2) - resolved per our recommendation
  • Sync and async operations within the same Port
    • BP issue BP20133 (2.0) / BP12057 (BP1.2) - under discussion
  • RM piggy-backed ACKs should not be mU=1
    • RSP issue i171 - under discussion
  • New spec under consideration...

  11. Summary - Why WSTF?
  • Leverage community and testing
    • Allow WS vendors to test key scenarios before you do
    • Allow WS vendors to enhance their regression tests with the scenarios that matter to you
  • Use the WSTF community to validate your WS architecture and plans

  12. THANK YOU!
  • Contact: Doug Davis (dug@us.ibm.com) for more info
    • Or admin@wstf.org
    • Or publicrelations@wstf.org
  • Questions?
