
Testing Web Services in the Cloud


Presentation Transcript


  1. ASQF WSTEST Testing Web Services in the Cloud. Agenda:
  • Composite Application Integration
  • Web Service Hierarchy
  • Web Services in the Cloud
  • Web Service Testing as a Service
  • Web Service Test Process
  • Web Service Test Approaches
  • Specification-Based Testing
  • Tools for Web Service Testing
  • Validation of the Services against an SLA
  • Specification of the SLA Criteria
  • Sneed's Web Service Test Procedure
  • Static Analysis of the WS Interface
  • WSDL Code Metrics Report
  • Audit Rules for Web Service Interfaces
  • WSDL Code Deficiency Report
  • Example of a WS Interface
  • Web Service Schema
  • Web Service Test Request
  • Web Service Test Data Generation
  • Web Service Test
  • Web Service Test Script
  • Web Service Result Validation
  • Validation of the Functional Requirements
  • Checking the Quality Requirements
  • Summary

  2. ASQF WSTEST-1 Composite Application Integration (architecture diagram): XML-based integration of eBusiness/XML users, application portals, mobile and print channels with SCM, CRM and ERP applications and data on AS/400, UNIX/Windows and mainframe platforms, via the Composite Application Integrator, Enterprise Business Process Manager, Enterprise Service Integrator, CentraSite SOA Repository, Enterprise Information Integrator and Enterprise Legacy Integrator.

  3. ASQF WSTEST-2 Web Service Hierarchy (diagram): user business processes are built on Application Services (e.g. OrderEntryProcessing), Functional Services (e.g. Billing), Object Services (e.g. CreateInvoice) and Elementary Services (e.g. CalculatePrice).

  4. ASQF WSTEST-3 Web Services in the Cloud (diagram).

  5. ASQF WSTEST-4 Web Service Testing as a Service

  6. ASQF WSTEST-5 Web Service Test Process
1. The test provider registers the test artifacts with the test broker, and the test information becomes available to testers.
2. The tester initially looks for test cases rather than for the web service itself.
3. The tester obtains the test case description, the test scripts or the link to the test service that the service provider selected.
4. This collaboration is needed when the tester is not able to carry out the testing himself.
5. The mapping between the test cases and the associated web services is established by the test broker.
6. Information about the web services associated with the selected test cases is delivered to the tester.
7. The web services at the service provider's location are tested with the test cases.
8. The test results are submitted to the test broker for further evaluation.
9. The collaboration between the test broker and the service broker can act like the check-in/check-out mechanism in Tsai et al.'s enhanced UDDI broker proposal.
10. The service user is presented with the test results of the selected services to help with service selection; in some cases service selection can be based solely on the test result.
Bai et al.'s proposed test broker provides a repository for test cases and test scripts, collects test results, and maintains defect reports and web service evaluations. Some aspects of testing are provided by the test broker as well: it can also generate and execute test cases. Bai et al. suggest that, using a DCV&V architecture, multiple test brokers can be involved in the testing process. The outcome of this decentralized architecture is a more flexible and scalable collaboration among the participants of the testing process.
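The bookkeeping behind steps 1, 5, 6 and 8 can be pictured as a small registry that maps test cases to service endpoints and stores submitted results. The following Python sketch is purely illustrative; the class and method names are invented here and are not taken from Bai et al.'s or Tsai et al.'s broker designs.

from dataclasses import dataclass, field

@dataclass
class TestCase:
    case_id: str        # e.g. "StockQuotePortType01"
    description: str
    script: str         # test script or link supplied by the test provider

@dataclass
class TestBroker:
    """Illustrative bookkeeping of a test broker (hypothetical API)."""
    cases: dict = field(default_factory=dict)     # case_id -> TestCase
    mapping: dict = field(default_factory=dict)   # case_id -> service endpoint
    results: dict = field(default_factory=dict)   # case_id -> verdict

    def register(self, case: TestCase, service_endpoint: str) -> None:
        # Steps 1 and 5: the provider registers artifacts, the broker maps them.
        self.cases[case.case_id] = case
        self.mapping[case.case_id] = service_endpoint

    def lookup(self, keyword: str) -> list:
        # Steps 2, 3 and 6: the tester searches test cases and gets the service.
        return [(c, self.mapping[c.case_id])
                for c in self.cases.values() if keyword in c.description]

    def submit_result(self, case_id: str, verdict: str) -> None:
        # Step 8: the tester submits results for further evaluation.
        self.results[case_id] = verdict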

  7. ASQF WSTEST-6 Web Service Test Approaches
Bendetto [165] defined the difference between integration testing of traditional systems and of service-oriented systems. Canfora and Di Penta [29] point out the challenges in integration testing of SOA. According to Bendetto and to Canfora and Di Penta, the challenges of integration testing in service-oriented environments are:
1. Integration testing must include the testing of web services at binding time, of workflows and of business process connectivity. Business process testing must also cover all possible bindings.
2. Low visibility, limited control and the stateless nature of the SOA environment make integration testing harder.
3. The availability of services during testing can also be a problem.
4. Dynamic binding makes testing expensive because of the number of required service calls.
One of the earliest works on integration testing of web services is Tsai et al.'s [166] Coyote framework. Coyote is an XML-based, object-oriented testing framework that can perform integration testing. It consists of two main components: a test master that maps WSDL specifications into test scenarios, generates test cases for these scenarios and performs dependency analysis as well as completeness and consistency checking, and a test engine that executes the tests and logs their results. Coyote performs WSDL-based test data generation.

  8. ASQF WSTEST-7 Web Service Testing is Specification-Based
Specification-Based Test Data Generation. Specification-based testing is the verification of the SUT against a reference document such as a user interface description, a design specification, a requirements list, a published model or a user manual. Naturally, in specification-based testing the test cases are generated from the available system specifications.
In a web service environment, all the information a user receives about a web service is its specification. In this situation specification-based testing becomes the natural choice, and test case generation for web services is therefore based on the web service specifications. For traditional web services the provided specification contains abstract information on the available operations and their parameters. This information allows the generation of test cases for boundary-value analysis, equivalence class testing or random testing using the XML Schema datatype information. Most of the proposed approaches [35, 36, 37, 38, 39, 40] for WSDL-based data generation are based on the XML Schema datatype information. The datatype information, with the various constraints for each datatype, allows test data to be generated for each simple type. In XML, complex datatypes can also be defined. Test data generation for a complex datatype simply requires decomposing the complex type into simple types; test data is then generated for each of these simple types, and the combined data is used as complex test data.
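As an illustration of this idea, the sketch below derives boundary, representative and random values from XML Schema facets such as minInclusive, maxInclusive and enumeration; the facet dictionary and the function names are assumptions made for this example and are not taken from any of the cited tools.

import random

def simple_type_values(facets: dict) -> list:
    """Derive test values from XML Schema simple-type facets
    (illustrative subset, not a full XSD engine)."""
    values = []
    if "enumeration" in facets:                 # equivalence classes
        values.extend(facets["enumeration"])
    if "minInclusive" in facets and "maxInclusive" in facets:
        lo, hi = facets["minInclusive"], facets["maxInclusive"]
        values += [lo, hi, lo - 1, hi + 1]      # boundary-value analysis
        values.append(random.randint(lo, hi))   # random testing
    return values

def complex_type_values(members: dict) -> dict:
    """Decompose a complex type into its simple-type members and
    combine one value per member into a complex test datum."""
    return {name: simple_type_values(facets)[0]
            for name, facets in members.items()}

# Example: a decimal element restricted to the range 1..9999
print(simple_type_values({"minInclusive": 1, "maxInclusive": 9999}))
print(complex_type_values({"document_id": {"minInclusive": 1, "maxInclusive": 9999}}))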

  9. ASQF WSTEST-8 Tools for Web Service Testing
Unit testing of web services is performed by the tester sending and receiving SOAP messages. The tester generates the SOAP messages for the operation under test using the information from the WSDL file. In this way unit testing can verify both the correctness of the WSDL and the correct functioning of the SUT. The abstract operation information is provided by the WSDL, and the details of the SOAP protocol can also be accessed. According to Canfora and Di Penta [29], one of the main problems of functional testing of web services is its high cost. The best way of reducing this cost is automation, and for unit testing there are existing tools that provide automated testing, such as Parasoft SOAtest [64], SOAPSonar [65], HP Service Test [66] and the Oracle Application Testing Suite [67]. Even though these tools help reduce the manual labor required for test case generation and reporting, they do not fully automate the testing process. In all of these tools the test cases are generated by the tester, and the tool generates the SOAP requests for each test case. In some tools, such as SOAtest, even the verification of test results has to be performed manually. Judging from the functionality these tools provide, automation is not yet at the desired level. Fortunately, the need for tools that can automate unit testing has been addressed by the research community. For example, Sneed and Huang [40] introduce a tool called WSDLTest for automated unit testing. WSDLTest is capable of generating random requests from WSDL schemas and of verifying the results of test cases. The verification is achieved by inserting pre-conditions and assertions into test scripts that are written manually by the tester; this method requires the tester to be familiar with the SUT in order to write the necessary assertions.
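The step these tools automate, turning each test case into a SOAP request, can be pictured as a small templating routine like the one below; the namespace, the request element name and the parameter values are placeholders invented for this sketch.

import xml.etree.ElementTree as ET

SOAP_ENV = "http://schemas.xmlsoap.org/soap/envelope/"

def build_soap_request(request_element: str, params: dict, ns: str) -> bytes:
    """Wrap one test case's parameters into a SOAP 1.1 request envelope."""
    ET.register_namespace("soapenv", SOAP_ENV)
    envelope = ET.Element(f"{{{SOAP_ENV}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_ENV}}}Body")
    request = ET.SubElement(body, f"{{{ns}}}{request_element}")
    for name, value in params.items():
        ET.SubElement(request, name).text = str(value)
    return ET.tostring(envelope, encoding="utf-8", xml_declaration=True)

# One generated test case for the getDoc operation of the example interface:
# the getDocRequest element carries a document_id value.
print(build_soap_request("getDocRequest", {"document_id": 4711},
                         ns="http://example.com/docservice").decode())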

  10. ASQF WSTest-9 Validation of the Web Services against a Service Level Agreement (diagram): the actual behavior of the provided web services is compared with the behavior agreed in the Service Level Agreement (SLA).

  11. ASQF WSTest-10 Specification of the SLA Criteria

  12. ASQF WSTest-11 Sneed's WS Test Procedure (process diagram): the requirements are specified and an SLA is agreed; the WSDL schema is subjected to a static analysis against the audit rules; the dynamic analysis runs the service with actual inputs and expected outputs, followed by a correctness check, a performance check and a compliance check; a quality check compares the results with the SLA and its minimum values; the procedure is repeated after every new release.

  13. ASQF WSTest-12 Static Analysis of the WSDL Interfaces (tool architecture diagram): via the user interface the tester supplies name templates, the rule selection and the metric weights; the WSDL Audit Control Module drives a WSDL Parser, a Rule Checker and a Metric Calculator over the WSDL sources and produces a deficiency report and a metrics report.

  14. ASQF WSTest-13 WSDL Code Metrics Report

QUANTITY METRICS

CODE QUANTITY METRICS
Number of Source Members analyzed           =======>    1
Number of Source Lines in all               =======>  205
Number of Genuine Code Lines                =======>  205

STRUCTURAL QUANTITY METRICS
Number of Modules                           =======>    1
Number of Includes                          =======>    0
Number of Classes declared                  =======>    6
Number of Classes inherited                 =======>    5
Number of Interfaces declared               =======>    1
Number of Object-Points                     =======>  107

DATA QUANTITY METRICS
Number of Data Structures                   =======>    6
Number of Defined Definitions               =======>    1
Number of Data Variables declared           =======>   81
Number of different Data Types used         =======>   10
Number of Data References                   =======>  181
Number of Arguments / Input Variables       =======>   19
Number of Results / Output Variables        =======>   19
Number of Predicates / Conditional Data     =======>    6
Number of Parameters / Function Arguments   =======>  219
Number of Data-Points                       =======>  117

PROCEDURAL QUANTITY METRICS
Number of Statements                        =======>  201
Number of Input Operations                  =======>   19
Number of Output Operations                 =======>   19
Number of File & Database Accesses          =======>    0
Number of Function References               =======>    0
Number of Foreign Functions referenced      =======>  100
Number of Nesting Levels (Maximum)          =======>    3
Number of Test Cases (Minimum)              =======>   32
Number of different Statement Types         =======>  101
Number of Assertions made                   =======>   12
Number of Function-Points                   =======>   89
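A few of these counts can be derived directly from a WSDL/XSD source by walking the XML tree, as the sketch below illustrates; the selection of counts and the function names are chosen for this example and do not reproduce the actual SoftAudit metric definitions.

import xml.etree.ElementTree as ET

XSD = "{http://www.w3.org/2001/XMLSchema}"
WSDL = "{http://schemas.xmlsoap.org/wsdl/}"

def max_depth(node, depth: int = 1) -> int:
    """Maximum element nesting level below the document root."""
    return max([depth] + [max_depth(child, depth + 1) for child in node])

def wsdl_quantity_metrics(path: str) -> dict:
    """Count a few structural quantities of a WSDL/XSD source."""
    root = ET.parse(path).getroot()
    with open(path, encoding="utf-8") as src:
        source_lines = sum(1 for _ in src)
    return {
        "source_lines": source_lines,
        "complex_types": len(root.findall(f".//{XSD}complexType")),
        "elements_declared": len(root.findall(f".//{XSD}element")),
        "messages": len(root.findall(f".//{WSDL}message")),
        "operations": len(root.findall(f".//{WSDL}operation")),
        "max_nesting": max_depth(root),
    }

# Usage (the file name is a placeholder):
# print(wsdl_quantity_metrics("RSFWBMServiceTypes.wsdl"))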

  15. ASQF WSTest-14 Audit Rules for Web Service Interfaces
• 12 WSDL coding rules that have to be observed:
• Adherence to the current standard
• Use of prescribed namespaces
• Use of in-source documentation
• Avoidance of the "any" data type
• Adherence to the naming convention
• Hiding of elementary data types
• Restricting the size of data groups
• Limiting the interface width
• Enforcing the first normal form
• Minimizing the number of requests
• Not exceeding a given size limit
• Limiting the use of links
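Two of these rules, avoiding the "any" data type and limiting the interface width, can be checked mechanically as in the sketch below; the width limit of 20 message parts and the report wording are assumptions made for this illustration, not the actual SoftAudit rule definitions.

import xml.etree.ElementTree as ET

XSD = "{http://www.w3.org/2001/XMLSchema}"
WSDL = "{http://schemas.xmlsoap.org/wsdl/}"
MAX_MESSAGE_PARTS = 20   # assumed limit for the interface width

def check_wsdl_rules(path: str) -> list:
    """Report violations of two sample WSDL coding rules."""
    root = ET.parse(path).getroot()
    violations = []
    # Rule: avoidance of the "any" data type
    if root.findall(f".//{XSD}any") or root.findall(f".//{XSD}anyAttribute"):
        violations.append("Document uses the 'any' data type")
    # Rule: limiting the interface width (parts per message)
    for message in root.findall(f".//{WSDL}message"):
        parts = message.findall(f"{WSDL}part")
        if len(parts) > MAX_MESSAGE_PARTS:
            violations.append(f"Message {message.get('name')} has "
                              f"{len(parts)} parts (limit {MAX_MESSAGE_PARTS})")
    return violations

# Usage (the file name is a placeholder):
# print(check_wsdl_rules("RSFWBMServiceTypes.wsdl"))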

  16. ASQF WSTest-15 WSDL Code Deficiency Report

SOFTANAL AUDITOR - PROGRAM CONFORMANCE REPORT
LANGUAGE: XSD                 DATE: 16.10.07
SOURCE: RSFWBMServiceTypes    PAGE: 4

LINE   PROGRAM RULE VIOLATION / CODE DEFICIENCY
PROB:  Data Structure should have at least one ID Key
712    </GetTopicResponse>

DEFI:  There should be at least one Entity per Document
DEFI:  There should be at least one Link per Document
795    RSFWBMService

DEFI:  Document has more than maximum allowed Elements
DEFI:  Document has more than maximum allowed Parameters
795    RSFWBMService

Number of major Rule Violations  =   1
Number of medium Rule Violations =  52
Number of minor Rule Violations  =   0
Total Number of Rule Violations  =  53
Number of Program Statements     = 736

Rate of Rule Conformity = 0.927

  17. ASQF WSTest-16 Example of a Web Service Interface

<operation name="getDocument">
  <input message="tns:getDocumentRequest" />
  <output message="tns:getDocumentResponse" />
</operation>
<operation name="deleteDocument">
  <input message="tns:deleteDocumentRequest" />
  <output message="tns:deleteDocumentResponse" />
</operation>
<operation name="getDoc">
  <input message="tns:getDocRequest" />
  <output message="tns:getDocResponse" />
</operation>

<message name="getDocRequest">
  <part name="getDocRequest" type="types:getDocRequestType" />
</message>

<xsd:element name="getDocRequest" type="getDocRequestType"/>
<xsd:complexType name="getDocRequestType">
  <xsd:sequence>
    <xsd:element name="document_id" type="xsd:decimal"
                 minOccurs="1" maxOccurs="1"/>
  </xsd:sequence>
</xsd:complexType>

  18. ASQF WSTest-17 Web Service Schema

    targetNamespace="http://example.com/stockquote.wsdl"
    xmlns:tns="http://example.com/stockquote.wsdl"
    xmlns:xsd1="http://example.com/stockquote.xsd"
    xmlns:soap="http://schemas.xmlsoap.org/wsdl/soap/"
    xmlns="http://schemas.xmlsoap.org/wsdl/">
<types>
  <schema targetNamespace="http://example.com/stockquote.xsd"
          xmlns="http://www.w3.org/2000/10/XMLSchema">
    <element name="TradePriceRequest">
      <complexType>
        <all>
          <element name="tickerSymbol" type="string"/>
        </all>
      </complexType>
    </element>
    <element name="TradePrice">
      <complexType>
        <all>
          <element name="price" type="float"/>
        </all>
      </complexType>
    </element>
  </schema>
</types>
<message name="GetLastTradePriceInput">
  <part name="body" element="xsd1:TradePriceRequest"/>
</message>
<message name="GetLastTradePriceOutput">
  <part name="body" element="xsd1:TradePrice"/>
</message>
<portType name="StockQuotePortType">
  <operation name="GetLastTradePrice">

  19. ASQF WSTest-18 Web Service Test Request

service: StockQuoteService
if (testcase = "StockQuotePortType01");
  if ( operation = "GetLastTradePrice");
    if ( request = "GetLastTradePriceInput");
      assert in.TradePriceRequest.tickerSymbol = "???";
    endRequest;
    if ( response = "GetLastTradePriceOutput");
      assert out.TradePrice.price = "???";
    endResponse;
  endOperation;
endCase;
end; /* service StockQuoteService */
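Executing this test request against the StockQuote service amounts to posting a SOAP envelope that carries the asserted tickerSymbol and reading the price out of the response, roughly as sketched below; the endpoint URL, the SOAPAction value, the response structure and the concrete symbol are placeholders for this illustration.

import urllib.request
import xml.etree.ElementTree as ET

ENDPOINT = "http://example.com/stockquote"   # placeholder service endpoint
XSD1 = "http://example.com/stockquote.xsd"
SOAP_ENV = "http://schemas.xmlsoap.org/soap/envelope/"

def get_last_trade_price(ticker: str) -> float:
    """Send GetLastTradePriceInput and extract the price from the output."""
    request_body = f"""<?xml version="1.0"?>
<soapenv:Envelope xmlns:soapenv="{SOAP_ENV}" xmlns:xsd1="{XSD1}">
  <soapenv:Body>
    <xsd1:TradePriceRequest>
      <tickerSymbol>{ticker}</tickerSymbol>
    </xsd1:TradePriceRequest>
  </soapenv:Body>
</soapenv:Envelope>"""
    req = urllib.request.Request(
        ENDPOINT, data=request_body.encode("utf-8"),
        headers={"Content-Type": "text/xml; charset=utf-8",
                 "SOAPAction": ""})           # placeholder SOAPAction
    with urllib.request.urlopen(req) as response:
        tree = ET.fromstring(response.read())
    price = tree.find(f".//{{{XSD1}}}TradePrice/price")
    return float(price.text)

# Test driver for test case StockQuotePortType01 (symbol is a placeholder):
# assert get_last_trade_price("XYZ") > 0.0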

  20. ASQF WSTest-19 Web Service Test Data Generation (diagram): from the WSDL interface definition

<Request> <Input>Params</Input> </Request>

the test script generator derives a test data script (DataTest) with assertions such as

Assert in P1 = Range (10:20);
Assert in P2 = Set ("X", "Y", "Z");
Assert in P3 = P1 + 7;

which the test script compiler, drawing on a table of test values (representative values, boundary values, random values), turns into a concrete test web service request, e.g.

<Params> <P1>10</P1> <P2>X</P2> </Params>
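How such "Assert in" clauses could be resolved into concrete request values is sketched below; the mini-parser, the choice of the lower bound as representative value and the function names are assumptions of this illustration, not the actual WSDLTest implementation.

import re

def resolve_assertion(expr: str, resolved: dict):
    """Turn the right-hand side of one 'Assert in' clause into a test value."""
    range_match = re.match(r"Range \((\d+):(\d+)\)", expr)
    if range_match:                        # range: pick the lower boundary value
        return int(range_match.group(1))
    set_match = re.match(r"Set \((.+)\)", expr)
    if set_match:                          # set: pick a representative member
        return set_match.group(1).split(",")[0].strip().strip('"')
    # otherwise treat it as an arithmetic expression over earlier parameters
    return eval(expr, {}, resolved)        # e.g. "P1 + 7" evaluates to 17

def generate_request(assertions: dict) -> dict:
    values = {}
    for name, expr in assertions.items():
        values[name] = resolve_assertion(expr, values)
    return values

print(generate_request({
    "P1": "Range (10:20)",
    "P2": 'Set ("X", "Y", "Z")',
    "P3": "P1 + 7",
}))   # -> {'P1': 10, 'P2': 'X', 'P3': 17}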

  21. ASQF WSTest-20 Web Service Test

  22. ASQF WSTest-21 Web Service Test Script

  23. ASQF WSTest-22 Web Service Test Result Validation (diagram): the test result validator compares the test web service response, e.g.

<Params> <P4>10</P4> <P5>X</P5> </Params>

against the output assertions of the test data script derived from the WSDL interface definition (<Request> <Input>Params</Input> </Request>), e.g.

Assert out P4 = Range (10:20);
Assert out P5 = Set ("X", "Y", "Z");
Assert out P6 = P1 + 7;

and writes a deviation report listing the exceptions, e.g. P4 expected = Range (10:20), P4 actual = 21.
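A validator for such "Assert out" clauses can be sketched as follows; the deviation-report wording and the helper names are invented for this illustration and mirror the generation sketch above rather than the actual tool.

import re

def check_out_assertion(expr: str, actual):
    """Return None if the actual value satisfies the assertion,
    otherwise an (expected, actual) deviation pair."""
    range_match = re.match(r"Range \((\d+):(\d+)\)", expr)
    if range_match:
        lo, hi = int(range_match.group(1)), int(range_match.group(2))
        return None if lo <= int(actual) <= hi else (expr, actual)
    set_match = re.match(r"Set \((.+)\)", expr)
    if set_match:
        allowed = [v.strip().strip('"') for v in set_match.group(1).split(",")]
        return None if str(actual) in allowed else (expr, actual)
    return None   # other assertion forms are not handled in this sketch

def validate_response(assertions: dict, response: dict) -> list:
    """Produce a deviation report with one entry per violated assertion."""
    report = []
    for name, expr in assertions.items():
        deviation = check_out_assertion(expr, response.get(name))
        if deviation:
            report.append(f"{name} expected = {deviation[0]}, actual = {deviation[1]}")
    return report

print(validate_response({"P4": "Range (10:20)", "P5": 'Set ("X", "Y", "Z")'},
                        {"P4": 21, "P5": "X"}))
# -> ['P4 expected = Range (10:20), actual = 21']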

  24. ASQF WSTest-23 Validation of the Functional Requirements

WSDL Response Validation Report
Object: Kalender     Date: 19.06.04
Type  : XML          System: TEST
Key Fields of Response (Ist = actual, Soll = expected)

MsgKey: DayofWeek = 12101977
Ist : DayofWeek     Mercolodi
Soll: DayofWeek     Mittwoch

MsgKey: DayofWeek = 12101977
Ist : Language      2
Soll: Language      1

  25. ASQF WSTest-24 Checking the Non-Functional Requirements
• Measuring the response time
• Response time = time the response is received - time the request is sent
• If the measured response time lies above the agreed time, the SLA is not fulfilled.
• Measuring the availability
• Availability = 1 - (waiting time / total usage time)
• The waiting times are accumulated and compared with the maximum permissible waiting time in the SLA. If they lie above it, the SLA counts as not fulfilled.
• Measuring the security
• Security = 1 - (weighted security violations / invalid requests)
• The actually measured security level is compared with the minimum security level in the SLA to determine whether or not the SLA is fulfilled with respect to security.
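A minimal sketch of these three checks, assuming the raw measurements (timestamps, accumulated waiting time, violation counts) and the SLA thresholds are already available as plain numbers:

def response_time_ok(request_sent: float, response_received: float,
                     agreed_seconds: float) -> bool:
    """Response time = receive time - send time; must not exceed the SLA."""
    return (response_received - request_sent) <= agreed_seconds

def availability(waiting_time: float, total_usage_time: float) -> float:
    """Availability = 1 - (waiting time / total usage time)."""
    return 1.0 - waiting_time / total_usage_time

def security_level(weighted_violations: float, invalid_requests: int) -> float:
    """Security = 1 - (weighted security violations / invalid requests)."""
    return 1.0 - weighted_violations / invalid_requests

# Example SLA check with assumed numbers
sla = {"max_response_seconds": 2.0, "min_availability": 0.98, "min_security": 0.95}
print(response_time_ok(100.0, 101.4, sla["max_response_seconds"]))   # True
print(availability(36.0, 3600.0) >= sla["min_availability"])         # True
print(security_level(3.0, 40) >= sla["min_security"])                # False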

  26. ASQF WSTest-25 Summary
• Prefabricated web services are the future of IT
• Users must finally stop developing software themselves
• Users should confine themselves to composing services and building them into their business processes
• For that, the services must fulfill the users' functional and quality requirements
• To confirm this, the services have to be tested and certified
• The testing of web services should be offered as a service
• Users must be willing to have their services tested by an independent test organization.
