
Fine Tuning WebFOCUS for the IBM Mainframe (zSeries, System z9)


Presentation Transcript


  1. Fine Tuning WebFOCUS for the IBM Mainframe (zSeries, System z9) Mark Nesson June, 2008

  2. Why WebFOCUS for z • Runs natively on MVS and Linux • IBM has brand-new specialty engines (such as the zIIP) that you can take advantage of • Ability to create partitions on z to centralize business intelligence on a single server – where the databases and applications reside

  3. Information Builders products used in benchmark • WebFOCUS • iWay Software • iWay Service Manager, a unique and powerful Enterprise Service Bus (ESB) that is invoked as Web services to provide event-driven integration and B2B interaction management

  4. There’s an easier way! [Architecture diagram: HTTP clients reach the Web Server and the App Server/Servlet Container (ibi_apps, ibi_html, ibi_bid, rcaster, worp, basedir) over HTTP/HTTPS; the app tier talks to the Reporting Server (adapters, focexecs, synonyms, data, approot, reports) over HTTP/HTTPS or a proprietary TCP protocol; the Reporting Server and the ReportCaster (RC) Distribution Server reach the DB servers (RDBMS, RC Repository) over TCP/JDBC or TCP via a client driver.]

  5. Benchmark Objectives • Test WebFOCUS on the proven strengths of System z hardware, running the open-source Linux OS with UDB and z/OS with DB2 • Evaluate the scalability and performance of WebFOCUS and iWay Service Manager in each operating system environment on the IBM z server, and the benefit of using the various specialty engines on the IBM z server • All test configurations accurately and faithfully replicated prior benchmarks run on other UNIX and Windows platforms • Test results therefore represent the true performance of the WebFOCUS workload on this hardware • Testing was done at IBM Gaithersburg (Washington Systems Center) in November of 2006 by a combined team from IBM and Information Builders

  6. UDB and DB2 database size • Linux system IBILIN03 (under z/VM) and z/OS system IBI1 host the test databases in the various benchmark configurations • Two databases are defined on IBILIN03 and IBI1, with multiple tables defined in each database • 2 million rows of data in one, 7 million rows of data in the other • Each row is 256 bytes long, so the raw data comes to roughly 0.5 GB and 1.8 GB respectively

  7. Benchmark Test Workload used • Workload-small: query retrieving 61 rows of data • Workload-large: query retrieving 3,000 rows of data • Workload-complex: CPU-intensive query involving a 4-table join, retrieving 5,118 rows
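  The deck does not publish the SQL behind these three workloads, so the sketch below only illustrates their shapes in JDBC terms; every table and column name (BENCH.T1–T4, ACCT_ID, REGION, ORDER_NO, ITEM_NO, DESCR) is an invented placeholder, not part of the benchmark.

      import java.sql.*;

      // Hypothetical shapes of the three benchmark workloads; all schema
      // names below are invented placeholders.
      public class WorkloadQueries {

          // Workload-small: a selective query returning ~61 rows
          static final String SMALL =
              "SELECT * FROM BENCH.T1 WHERE REGION = ?";

          // Workload-large: a broader range scan returning ~3,000 rows
          static final String LARGE =
              "SELECT * FROM BENCH.T1 WHERE ACCT_ID BETWEEN ? AND ?";

          // Workload-complex: CPU-intensive 4-table join returning ~5,118 rows
          static final String COMPLEX =
              "SELECT a.ACCT_ID, b.ORDER_NO, c.ITEM_NO, d.DESCR " +
              "FROM BENCH.T1 a " +
              "JOIN BENCH.T2 b ON a.ACCT_ID = b.ACCT_ID " +
              "JOIN BENCH.T3 c ON b.ORDER_NO = c.ORDER_NO " +
              "JOIN BENCH.T4 d ON c.ITEM_NO = d.ITEM_NO " +
              "WHERE a.REGION = ?";

          // Run one workload query and fetch every row, as a report would.
          static int run(Connection con, String sql, Object... args) throws SQLException {
              try (PreparedStatement ps = con.prepareStatement(sql)) {
                  for (int i = 0; i < args.length; i++) ps.setObject(i + 1, args[i]);
                  int rows = 0;
                  try (ResultSet rs = ps.executeQuery()) {
                      while (rs.next()) rows++;
                  }
                  return rows;
              }
          }
      }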

  8. How IBI tests were measured • For each given test configuration, Information Builders used the same parameter settings (e.g., interval time, keep-alive time) to run the small, large, and complex workloads, varying the number of concurrent active users and measuring the end-to-end user response time
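  A minimal sketch of that method, assuming the report is invoked over HTTP (this is not the actual IBI harness, and the host, port, and report URL are placeholders): one thread per simulated user, each timing its own request end to end.

      import java.io.InputStream;
      import java.net.HttpURLConnection;
      import java.net.URL;
      import java.util.concurrent.CountDownLatch;
      import java.util.concurrent.ExecutorService;
      import java.util.concurrent.Executors;
      import java.util.concurrent.atomic.LongAdder;

      // Minimal load driver: N concurrent users each issue one request and
      // the end-to-end response times are averaged. The URL is a placeholder.
      public class LoadDriver {
          public static void main(String[] args) throws Exception {
              int users = Integer.parseInt(args[0]);   // e.g. 50, 100, 200, 500
              String url = "http://ibilin01:8080/ibi_apps/run?report=workload_small";
              ExecutorService pool = Executors.newFixedThreadPool(users);
              LongAdder totalMs = new LongAdder();
              CountDownLatch done = new CountDownLatch(users);
              for (int i = 0; i < users; i++) {
                  pool.submit(() -> {
                      try {
                          long t0 = System.nanoTime();
                          HttpURLConnection c =
                              (HttpURLConnection) new URL(url).openConnection();
                          try (InputStream in = c.getInputStream()) {
                              in.readAllBytes();       // drain the whole report
                          }
                          totalMs.add((System.nanoTime() - t0) / 1_000_000);
                      } catch (Exception e) {
                          e.printStackTrace();
                      } finally {
                          done.countDown();
                      }
                  });
              }
              done.await();
              pool.shutdown();
              System.out.printf("avg end-to-end response: %.3f s%n",
                                totalMs.doubleValue() / users / 1000.0);
          }
      }

  A real harness would also ramp users up, apply think time between requests, and discard warm-up samples.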

  9. Test Environment – 1 (all on Linux)

  10. Test Environment – 1 (Linux) Scenarios
      • Scenario 1:
        • IBILIN02: 2/4/8 CP, 2 GB, WebFOCUS 7.6 Reporting Server, DB2 Connect
        • IBILIN03: 2/4 CP, 2 GB, UDB 8.2
      • Scenario 2:
        • IBILIN01: 2/4 CP, 2/4 GB, WAS 6.0.2.17, WebFOCUS Client 7.6
        • IBILIN02: 2/4/8 CP, 2 GB, WebFOCUS 7.6 Reporting Server, DB2 Connect
        • IBILIN03: 2/4 CP, 2 GB, UDB 8.2
      • Scenario 3:
        • IBILIN02: 2/4/8 CP, 2 GB, iWay Service Manager, DB2 JDBC Type 4
        • IBILIN03: 2/4 CP, 2 GB, UDB 8.2
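  Scenarios 1 and 2 reach UDB through DB2 Connect, while Scenario 3 (iWay Service Manager) uses the pure-Java DB2 JDBC Type 4 driver. Expressed in JDBC terms, the difference between the two access paths looks roughly like the sketch below; note that the WebFOCUS adapter itself uses the DB2 CLI rather than JDBC, and the port, database name, and credentials here are placeholders.

      import java.sql.Connection;
      import java.sql.DriverManager;

      // The two DB2 connectivity styles used across the scenarios.
      public class Db2Connections {

          // Type 2 (via DB2 Connect): the URL names a locally catalogued
          // database alias, and DB2 Connect handles the hop to UDB/DB2.
          static Connection viaDb2Connect() throws Exception {
              return DriverManager.getConnection("jdbc:db2:BENCHDB", "user", "pw");
          }

          // Type 4 (pure Java, used by iWay Service Manager): the driver
          // speaks DRDA directly to the database server over TCP/IP.
          static Connection viaType4() throws Exception {
              return DriverManager.getConnection(
                  "jdbc:db2://ibilin03:50000/BENCHDB", "user", "pw");
          }
      }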

  11. Test Env. – 1 (Linux), Scenario – 1 (WebFOCUS Reporting Server, DB2 Connect, Workload-small)
      Response time in seconds:
      Workload Type   # users   2 CP     4 CP     8 CP
      Small           50        0.616    0.407    -----
      Small           100       1.205    0.435    0.223
      Small           200       2.388    0.793    -----
      Small           500       3.726    1.946    1.321
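  One way to read the table above: at 500 users, going from 2 CPs to 4 CPs cuts the average response time from 3.726 s to 1.946 s, which is close to linear scaling. The arithmetic, as a tiny worked example:

      // Speedup and scaling efficiency for the 500-user row above
      // (2 CP: 3.726 s, 4 CP: 1.946 s).
      public class Scaling {
          public static void main(String[] args) {
              double t2cp = 3.726, t4cp = 1.946;
              double speedup = t2cp / t4cp;               // ~1.91x from 2 -> 4 CPs
              double efficiency = speedup / (4.0 / 2.0);  // ~96% of linear
              System.out.printf("speedup %.2fx, efficiency %.0f%%%n",
                                speedup, efficiency * 100);
          }
      }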

  12. WebFOCUS 7.6 Performance Statistics for z-Linux: Average Request Processing Time (in seconds) Across Number of CPUs by Concurrent Users

  13. Test Env. – 1 (all on Linux), Scenario – 1 (WebFOCUS Reporting Server, DB2 Connect, Workload-small)

  14. Test Env. – 1 (all on Linux), Scenario – 1 (WebFOCUS Reporting Server, DB2 Connect, WL-large)
      Response time in seconds:
      Workload Type   # users   2 CP      4 CP     8 CP
      Large           50        1.829     0.941    -----
      Large           100       3.529     1.867    1.013
      Large           200       6.095     3.323    -----
      Large           500       17.032    5.487    6.883**
      ** Linux system swap occurred

  15. WebFOCUS 7.6 Performance Statistics for z-Linux: Average Request Processing Time (in seconds) Across Number of CPUs by Concurrent Users

  16. Test Env. – 1 (all on Linux), Scenario – 1 (WebFOCUS Reporting Server, DB2 Connect, Workload-Large)

  17. Test Env. – 1 (all on Linux), Scenario – 1 (WebFOCUS Reporting Server, DB2 Connect, WL-Complex)
      Response time in seconds:
      Workload Type   # users   2 CP      4 CP      8 CP
      Complex         10        4.676     3.133     -----
      Complex         25        11.295    5.813     -----
      Complex         50        24.443    11.333    8.963
      Complex         100       40.467    25.175    14.92
      Complex         200       80.928    60.784    -----

  18. WebFOCUS 7.6 Performance Statistics for z-Linux: Average Request Processing Time (in seconds) Across Number of CPUs by Concurrent Users

  19. Test Env. – 1 (all on Linux), Scenario – 1 (WebFOCUS Reporting Server, DB2 Connect, WL-Complex)

  20. Test Env. – 1 (all on Linux), Scenario – 2 (WAS, WebFOCUS Reporting Server, DB2 Connect, WL-small)
      Response time in seconds:
      Workload Type   # users   2 CP     4 CP       8 CP
      Small           50        0.799    -----      -----
      Small           100       1.378    0.798      0.624
      Small           200       2.594    -----      -----
      Small           500       5.417    3.659**    2.065
      ** Switched to use JLINK

  21. Average Request Processing Time (in seconds) Across Number of CPUs by Concurrent Users

  22. Test Env. – 1 (all on Linux), Scenario – 2 (WAS, WebFOCUS Reporting Server, DB2 Connect, WL-small)

  23. Test Env. – 1 (all on Linux), Scenario – 2 (WAS, WebFOCUS Reporting Server, DB2 Connect, WL-large)
      Response time in seconds:
      Workload Type   # users   2 CP      4 CP    8 CP
      Large           50        1.913     -----   -----
      Large           100       3.702     2.45    -----
      Large           200       7.279     -----   -----
      Large           500       14.966    ----    11.857**
      ** Switched to use JLINK

  24. Average Request Processing Time (in seconds) Across Number of CPUs by Concurrent Users

  25. Test Env. – 1 (all on Linux), Scenario – 2 (WAS, WebFOCUS Reporting Server, DB2 Connect, WL-large)

  26. Test Env. – 1 (all on Linux), Scenario – 2 (WAS, WebFOCUS Reporting Server, DB2 Connect, WL-Complex)
      Response time in seconds:
      Workload Type   # users   2 CP      4 CP      8 CP
      Complex         10        11.578    2.151     -----
      Complex         25        28.438    5.047     -----
      Complex         50        -----     10.826    7.121
      Complex         75        -----     15.312    -----
      Complex         100       -----     19.791    14.056

  27. Average Request Processing Time (in seconds)

  28. Test Env. – 1 (all on Linux), Scenario – 2 (WAS, WebFOCUS Reporting Server, DB2 Connect, WL-Complex)

  29. Test Env. – 1 (all on Linux), Scenario – 3 (iWay Service Manager, DB2 JDBC Type 4, WL-Small)
      Response time in seconds:
      Workload Type   # users   2 CP      4 CP      8 CP
      Small           25        1.731     0.95*     0.734*
      Small           50        3.707     1.796*    1.221*
      Small           100       7.59      3.451*    2.62*
      Small           200       11.799    6.255*    4.71*
      Small           500       18.333    9.807*    7.34*
      * JVM size 1024 MB

  30. Average Request Processing Time (in seconds) Across Number of CPUs by Concurrent Users

  31. Test Env. – 1 (all on Linux), Scenario – 3 (iWay Service Manager, DB2 JDBC Type 4, WL-Small)

  32. Test Env. – 1 (all on Linux), Scenario – 3 (iWay Service Manager, DB2 JDBC Type 4, WL-Large)
      Response time in seconds:
      Workload Type   # users   2 CP        4 CP        8 CP
      Large           25        81.522      44.842*     32.34*
      Large           50        188.558     90.319*     74.42*
      Large           100       369.617*    235.208*    182.265*
      Large           200       -----       -----       -----
      Large           500       -----       -----       -----
      * JVM size 1024 MB

  33. Average Request Processing Time (in seconds) Across Number of CPUs by Concurrent Users

  34. Test Env. – 1 (all on Linux), Scenario – 3 (iWay Service Manager, DB2 JDBC Type 4, WL-Large)

  35. Benchmark Test Environment – 2 (App and driver on Linux, DB on z/OS)

  36. Benchmark Test Environment – 2 Scenarios
      • Scenario 1:
        • IBILIN02: 2/4/8 CP, 2 GB, WebFOCUS Reporting Server 7.6, DB2 Connect, Native Data Driver (CLI)
        • (z/OS) IBI1: 8 CP, 8 GB, DB2, 2 tables

  37. Test Env. – 2, Scenario – 1 (App on Linux, DB on z/OS) (WebFOCUS Reporting Server, DB2, WL=Small)
      Response time in seconds:
      Workload Type   # users   2 CP     4 CP     8 CP
      Small           50        0.896    0.444    0.214
      Small           100       1.78     0.875    0.435
      Small           200       3.276    1.885    0.721
      Small           500       5.166    2.807    1.50

  38. Average Request Processing Time (in seconds)

  39. Test Env. – 2, Scenario – 1 (App on Linux, DB on z/OS) (WebFOCUS Reporting Server, DB2, WL=Small)

  40. Test Env. – 2, Scenario – 1 (App on Linux, DB on z/OS) (WebFOCUS Reporting Server, DB2, WL=Large)
      Response time in seconds:
      Workload Type   # users   2 CP      4 CP      8 CP
      Large           50        18.417    9.812     4.998
      Large           100       33.183    17.145    10.278
      Large           200       49.23     34.74     18.369
      Large           500       ----      ----      ----

  41. Average Request Processing Time (in seconds) Across Number of CPUs by Concurrent Users

  42. Test Env. – 2, Scenario – 1 (App on Linux, DB on z/OS) (WebFOCUS Reporting Server, DB2, WL=Large)

  43. Benchmark Test Environment – 3 (WAS, WebFOCUS Reporting Server, iWay Service Manager – IBI2, DB on z/OS – IBI1)

  44. Benchmark Test Environment – 3 Scenarios
      • Scenario 1:
        • IBI2: 2/4/8 CP, 8 GB, ISM 5.5, JDBC Type 4 Driver
        • IBI1: 2/4/8 CP, 8 GB, DB2, 2 tables
      • Scenario 2:
        • IBI2: 2/4/8 CP, 8 GB, WAS 6.1, WebFOCUS Reporting Server 7.6, WF Client, CLI
        • IBI1: 2/4/8 CP, 8 GB, DB2, 6 tables
        • IBI2 communicates with IBI1 via HiperSockets
        • 1 zIIP engine
      • Scenario 3:
        • IBI2: 2/4/8 CP, 8 GB, WebFOCUS Reporting Server 7.6, CLI
        • IBI1: 2/4/8 CP, 8 GB, DB2, 6 tables
        • IBI2 communicates with IBI1 via HiperSockets
        • 1 zIIP engine

  45. Test Env – 3 (2 separate z/OS), Scenario – 1 (ISM, JDBC Type 4, WL=Small, 1 zIIP)
      Response time in seconds:
      Workload Type   # users   2 CP     4 CP     8 CP
      Small           25        2.032    0.716    0.56
      Small           50        3.22     1.329    1.11
      Small           100       6.782    2.715    2.4
      Small           200       15.34    5.864    5.2
      Small           500       22.54    7.54     6.391

  46. Average Request Processing Time (in seconds) Across Number of CPUs by Concurrent Users

  47. Test Env – 3 (2 separate z/OS), Scenario – 1 (ISM, JDBC Type 4, WL=Small, 1 zIIP)

  48. Test Env – 3 (2 separate z/OS), Scenario – 1 (ISM, JDBC Type 4, WL=Large, 1 zIIP)
      Response time in seconds:
      Workload Type   # users   2 CP      4 CP       8 CP
      Large           25        58.95     39.179     35.0
      Large           50        121.98    80.675     74.17
      Large           100       292.22    210.811    181.3
      Large           200       -----     -----      -----
      Large           500       -----     -----      -----

  49. Average Request Processing Time (in seconds) Across Number of CPUs by Concurrent Users

  50. Test Env – 3 (2 separate z/OS), Scenario – 1 (ISM, JDBC Type 4, WL=Large, 1 zIIP)
