IBM WebSphere Compute Grid for z/OS Integrated with an Enterprise Scheduler: An Illustration

Presentation Transcript


  1. IBM WebSphere Compute Grid for z/OS, integrated with an enterprise scheduler such as IBM Tivoli Workload Scheduler. An illustration. See a narrated video of this on YouTube … search on ATSDemos.

  2. Preview of Technical Message
  • IBM WebSphere Compute Grid z/OS has an MDB interface intended to interface with enterprise schedulers.
  • The WSGRID utility program is what connects the enterprise scheduler to Compute Grid.
  • WSGRID forms up a job submission message and places it on a queue; the MDB picks it up and the job is submitted inside Compute Grid.
  • WSGRID stays active while the job executes in Compute Grid, feeds output to the JES spool, and alerts the enterprise scheduler of the Java batch job’s status.
  • This design allows Compute Grid Java batch to be integrated with traditional batch in a broader batch process.

  3. Please Note …
  • IBM WebSphere Compute Grid is supported on all platforms supported by WebSphere Application Server. The focus of this presentation will be Compute Grid for z/OS.
  • Our focus will be on integration with Tivoli Workload Scheduler, but this integration design works with any scheduler capable of submitting JCL to JES.
  • Our focus will also be on using WebSphere MQ as the JMS provider, but there is also a solution involving the internal messaging provider of WebSphere Application Server.

  4. The WebSphere Compute Grid scheduler function has a browser interface: the Compute Grid Job Console. In addition to the browser interface, Compute Grid also provides a command line interface, a Web Services interface, an RMI client interface, and an MDB interface. The MDB interface is the one of particular interest for integration with enterprise schedulers. The question is this: what ties TWS and JES to Compute Grid? [Diagram: Tivoli Workload Scheduler and JES beside WebSphere Application Server z/OS, where the Compute Grid scheduler handles job submission and dispatching to Compute Grid endpoints hosting batch applications; the applications access data systems such as DB2, CICS, IMS, MQ, and VSAM; everything runs on System z and z/OS facilities and functions (WLM, RRS, SAF, RMF, Parallel Sysplex, etc.).]

  5. The answer: WSGRID, a utility program supplied with Compute Grid. Two versions of WSGRID are provided: a C/C++ native implementation on z/OS, and one implemented in Java. The native WSGRID utility interacts with Compute Grid using MQ in BINDINGS mode; a native-code utility plus MQ in BINDINGS mode means this is very fast. [Diagram: TWS submits a JOB with PGM=WSGRID to JES; WSGRID exchanges input and output messages over MQ with the MDB of the Compute Grid scheduler running in WebSphere Application Server z/OS.]

  6. Let’s take a high-level look at how this works, then we’ll dig into some of the details:
  • TWS submits the WSGRID JCL to JES (details on the JCL coming).
  • The JCL names PGM=WSGRID, which results in the program being launched.
  • WSGRID forms up a message (details coming) and places it on the input queue.
  • The MDB in the Compute Grid scheduler fires and pulls the message off the input queue.
  • The job is dispatched, executes, and completes.
  • The scheduler feeds output back to MQ in a series of messages.
  • WSGRID pulls the messages off the output queue and writes them to the JES spool.
  • WSGRID ends, and JES alerts TWS of the job return code (for example, Job RC = 0).
  • If desired, the normal JES spool archive process may take place.

  7. Let’s see what the JCL for WSGRID looks like, and start to demystify how this works. The key elements (illustrated in the sketch below) are:
  • A standard JOB card.
  • EXEC PGM=WSGRID.
  • A STEPLIB pointing to the WSGRID module and the MQ SCSQLOAD and SCSQAUTH libraries.
  • A SYSPRINT DD directed to the JES spool.
  • Properties naming the queue manager and the input / output queues.
  • The path and file name of the input xJCL.
  • Any substitution properties you wish to pass into the xJCL.
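To make that concrete, here is a minimal sketch of such a WSGRID job. The dataset names, queue manager and queue names, the xJCL path, and the property keys are illustrative assumptions rather than values taken from this presentation; check the Compute Grid documentation for the exact control properties your release supports.

//WSGRID1  JOB (ACCT),'RUN WSGRID',CLASS=A,MSGCLASS=H
//* Standard JOB card, then an EXEC of the native WSGRID utility
//RUNGRID  EXEC PGM=WSGRID,REGION=0M
//* STEPLIB: the WSGRID load module plus the MQ SCSQLOAD/SCSQAUTH libraries
//STEPLIB  DD DISP=SHR,DSN=WSGRID.LOADLIB
//         DD DISP=SHR,DSN=MQ.SCSQLOAD
//         DD DISP=SHR,DSN=MQ.SCSQAUTH
//* Utility messages and the returned Compute Grid job output go to JES
//SYSPRINT DD SYSOUT=*
//* Control properties: queue manager, input/output queues, the xJCL to
//* submit, and any substitution properties to pass into that xJCL
//SYSIN    DD *
queue-manager-name=MQW1
scheduler-input-queue=WASIQ
scheduler-output-queue=WASOQ
job=/u/wcg/xjcl/MyBatchJob.xml
substitution-prop.checkpoint.interval=15
/*

Because the native utility connects in BINDINGS mode, it talks to the local queue manager cross-memory rather than over a network channel, which is part of why this path is so fast.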

  8. The output ends up in the JES spool, and is viewable like any other JES spool output. The first part of the job output shows the xJCL and the substitution variables; the second part shows the return code for each step as well as the overall job return code. [Diagram: the WSGRID job’s output on the JES spool, with the xJCL echoed at the top and the step and job return codes at the bottom.]

  9. Are jobs submitted through WSGRID controllable from the Job Management Console? Yes! Jobs submitted through WSGRID are controllable through the Job Management Console (JMC), and actions taken in the JMC, such as Cancel Job, are fed back to JES and TWS through WSGRID. [Diagram: a Cancel Job action in the JMC flowing back through the MQ queues and WSGRID to JES and TWS.]

  10. What if you don’t have MQ as a part of your enterprise messaging infrastructure? Then use the Java client with the built-in WAS messaging (the SIBus). Run the supplied WSADMIN script, wsgridConfig.py, to create the messaging components inside the Compute Grid scheduler. TWS integration uses JCL the same as before; the difference is that the job now launches a Java client, WSGrid.sh, via BPXBATCH or JZOS, rather than the native MQ client. The WSGrid Java client forms a message and places it on a JMS destination; the MDB fires, pulls the message, and submits the job, which is dispatched and executed as before. If using JZOS, the output can be directed back to the JES spool; if using BPXBATCH, the output goes to the file system. A sketch of such a launcher job follows below.
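As a rough sketch, launching the Java client through BPXBATCH might look like the following. The installation path, properties file, and output file locations are assumptions for illustration only.

//WSGRIDJ  JOB (ACCT),'WSGRID JAVA',CLASS=A,MSGCLASS=H
//* Launch the WSGrid.sh Java client under the UNIX System Services shell
//RUNJAVA  EXEC PGM=BPXBATCH,REGION=0M
//STDPARM  DD *
SH /WebSphere/wcgcell/bin/WSGrid.sh /u/wcg/props/myjob.props
/*
//* With BPXBATCH the job output lands in the file system, per the
//* presentation; the JZOS launcher would allow it to go to the JES spool
//STDOUT   DD PATH='/u/wcg/logs/myjob.out',
//            PATHOPTS=(OWRONLY,OCREAT,OTRUNC),PATHMODE=SIRWXU
//STDERR   DD PATH='/u/wcg/logs/myjob.err',
//            PATHOPTS=(OWRONLY,OCREAT,OTRUNC),PATHMODE=SIRWXU

The trade-off named on the summary slide applies here: the Java client removes the MQ dependency, but it is not as fast as the native BINDINGS-mode client.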

  11. Integration with Traditional Batch. We know that Tivoli Workload Scheduler (TWS) is a powerful enterprise scheduler, and we’ve seen how it integrates with WebSphere Compute Grid. Now let’s see how we can use the power of TWS to integrate Compute Grid and traditional batch into a larger batch process. Finally, we’ll simplify the pictures a bit to reduce clutter and focus on the key points.

  12. Imagine you have a mixed-batch environment, with Compute Grid and traditional batch:
  • You have Tivoli Workload Scheduler and other z/OS functions (JES).
  • You have a series of traditional batch jobs, written in COBOL, Assembler, and C/C++.
  • You have WebSphere Compute Grid in place with several batch applications deployed to the batch endpoints.
  • You plan to integrate TWS with WebSphere Compute Grid, so you have the WSGRID program ready with MQ input / output queues defined (a sketch of such queue definitions follows below).
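For illustration only, the WSGRID input and output queues might be defined with a CSQUTIL job like this one. The queue manager name (MQW1) and the queue names (WASIQ, WASOQ) are the same assumed values used in the earlier WSGRID JCL sketch, not values given in this presentation.

//DEFQS    JOB (ACCT),'DEFINE QUEUES',CLASS=A,MSGCLASS=H
//* Run the MQ utility program against the assumed queue manager MQW1
//DEFQ     EXEC PGM=CSQUTIL,PARM='MQW1'
//STEPLIB  DD DISP=SHR,DSN=MQ.SCSQANLE
//         DD DISP=SHR,DSN=MQ.SCSQAUTH
//         DD DISP=SHR,DSN=MQ.SCSQLOAD
//SYSPRINT DD SYSOUT=*
//* Tell CSQUTIL to read MQSC commands from the CMDINP DD
//SYSIN    DD *
COMMAND DDNAME(CMDINP)
/*
//* MQSC definitions for the WSGRID input and output queues (assumed names)
//CMDINP   DD *
DEFINE QLOCAL('WASIQ') DESCR('WSGRID INPUT QUEUE')
DEFINE QLOCAL('WASOQ') DESCR('WSGRID OUTPUT QUEUE')
/*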

  13. Imagine further that you have a TWS batch workflow defined with mixed Java and native batch:
  • You assemble the JCL for your traditional native batch jobs (COBOL batch 1, Assembler batch 2, C/C++ batch 3) in the JCL job library so TWS has access to submit them to JES.
  • You assemble the JCL to invoke an instance of WSGRID for each Java batch job in WCG (Java jobs A, B, and C).
  [Diagram: your TWS batch workflow, interleaving the native jobs 1, 2, and 3 with the WSGRID JCL for Java jobs A, B, and C.]

  14. Let’s now walk through an illustration of how TWS would integrate traditional and Java batch:
  • The TWS process is initiated, and the WSGRID job for the first Java job is initiated.
  • A message is formed based on properties inline with the JCL or in a named properties file.
  • The job is dispatched to the endpoint where the application is deployed; it executes and completes.
  • The job output goes to the spool, and the WSGRID job spins down.
  • Tivoli Workload Scheduler readies itself to proceed in the workflow … COBOL job 1 is next.

  15. TWS moves on to the next job in its process – a traditional COBOL batch job:
  • JES initiates the batch job.
  • The job executes and completes.
  • The job output goes to the spool.
  • Tivoli Workload Scheduler readies itself to proceed in the workflow … next comes a simultaneous submission.

  16. A TWS process may consist of multiple jobs run simultaneously, and it makes no difference to TWS if the jobs are mixed Java and native. (Note: we’re going to speed this up quite a bit.) Here the next native job and the next WSGRID Java job are submitted together, and each job’s output goes to the spool as before. [Diagram: an Assembler batch job and a WSGRID Java job dispatched in parallel through JES and MQ.]

  17. The processing of the final two jobs in this batch flow unfolds just like the first two did … and the Tivoli process completes with all jobs ending RC=0. As each job finishes, its output goes to the spool. [Diagram: the completed workflow, with the output of all native and Java jobs on the JES spool.]

  18. Summary of this Show …
  • Integration with enterprise schedulers is provided by the WSGRID function.
  • WSGRID is a module that’s easily submitted with batch JCL.
  • One option is a thin MQ client that puts a message on an MQ queue; the Compute Grid MDB picks it up and submits the job.
  • There is also a Java-based client that does not require MQ; it’s not as fast as the native MQ client, however.
  • WSGRID feeds Compute Grid job output back to the JES spool, and informs the enterprise scheduler of the return code.
  • Because of this model, Compute Grid may be integrated with traditional batch using enterprise scheduler process flows.
