
High Volume Processing Workshop


Presentation Transcript


  1. High Volume Processing Workshop
  Kaki Wynn, Dave Tanner, Andre Curione, Kim Weber, Tammy Huff

  2. Are you a High Volume Client?
  • Transactional Volume (see the sizing sketch below)
    • Average – up to 400K Transactions per Day (80M per Year)
    • High – up to 3-4 Million Transactions per Day (750M per Year)
  • Account Volume
    • Average – lower number of accounts (500 or fewer), with a focus on cash reconciliation and/or other transactionally oriented recons. These are the customers that require matching.
    • High – higher number of accounts (in the thousands and above), focused on Balance Sheet reconciliation, where matching may not be required.
  • Number of Users
    • Typically, a higher number of accounts translates into a larger user base.
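The thresholds above can be read as a simple sizing check. A minimal sketch using only the per-day figures from this slide; the function name and tier labels are illustrative, not product terminology.

```python
# Quick sizing check based on the per-day thresholds on this slide.
# The function name and tier labels are illustrative only.
def transaction_tier(transactions_per_day: int) -> str:
    if transactions_per_day <= 400_000:
        return "Average volume (roughly up to 80M transactions per year)"
    if transactions_per_day <= 4_000_000:
        return "High volume (roughly up to 750M transactions per year)"
    return "Above the documented high-volume range; discuss sizing with the vendor"

print(transaction_tier(250_000))    # Average volume ...
print(transaction_tier(3_000_000))  # High volume ...
```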

  3. Hardware Recommendations – Application Server
  • Average Volume
    • 4 Processing Cores
    • 8 - 12 GB RAM
      • 2 - 6 GB available for Operating System and external processes
      • 6 GB maximum allocation for T-Recs Application Server
    • 30 GB of available disk space
    • Windows 2008R2 Server x64 Standard Edition with SP1
  • High Volume
    • 4 Processing Cores
    • 48 GB RAM
      • 16 GB RAM available for Operating System and external processes
      • 32 GB RAM maximum allocation for T-Recs Application Server
    • Active/Standby T-Recs Application Server cluster

  4. Hardware Recommendations – Web Server
  • Average Volume
    • 4 Processing Cores
    • 6 - 10 GB RAM
      • 2 - 6 GB available for Operating System and external processes
      • 4 GB maximum allocation for T-Recs Web Server
    • 10 GB of available space
    • 30 GB for Application Server and Data Files
    • Windows 2008R2 Server x64 Standard Edition with SP1
  • High Volume
    • 4 Processing Cores
    • 32 GB RAM
      • 16 GB RAM available for Operating System and external processes
      • 24 GB RAM maximum allocation for T-Recs Web Server
    • Active/Standby T-Recs Web Server cluster

  5. Hardware Recommendations – Database Server
  • Average Volume
    • 4 Processing Cores
    • 12 GB RAM
      • 4 GB available for Operating System and external processes
      • 8 GB memory limit for SQL Server
    • 60 GB of available space
    • Windows 2008R2 Server x64 Standard Edition with SP1
    • Microsoft SQL Server 2008R2
  • High Volume
    • ORACLE
    • Dual Intel Xeon Processors (12 cores each)
    • 96 GB RAM
    • Windows 2008R2 Server x64 Enterprise Edition with SP1
    • Oracle 11.2 (11g Release 2) running in an Active/Active two (2) node RAC
    • Oracle home installed on local disk; all other drives are iSCSI-mounted raw volumes (ASM managed)
    • Recommended 1200 open cursors (see the verification sketch below)
    • DB block size 16K
    • All other DB parameters are Oracle defaults
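For the high-volume Oracle configuration, the two explicitly tuned parameters (open cursors and block size) can be checked against a running instance. A minimal sketch, assuming the cx_Oracle driver and a user permitted to read V$PARAMETER; the DSN and credentials are placeholders, not part of the workshop material.

```python
# Verify the two Oracle parameters called out on this slide.
# Assumes cx_Oracle is installed and the user can query V$PARAMETER;
# the DSN and credentials below are placeholders.
import cx_Oracle

EXPECTED = {"open_cursors": "1200", "db_block_size": "16384"}  # 16K block size

def check_parameters(user: str, password: str, dsn: str) -> None:
    conn = cx_Oracle.connect(user, password, dsn)
    try:
        cur = conn.cursor()
        cur.execute(
            "SELECT name, value FROM v$parameter "
            "WHERE name IN ('open_cursors', 'db_block_size')"
        )
        actual = dict(cur.fetchall())
    finally:
        conn.close()
    for name, expected in EXPECTED.items():
        status = "OK" if actual.get(name) == expected else "REVIEW"
        print(f"{name}: expected {expected}, found {actual.get(name)} [{status}]")

check_parameters("system", "change_me", "dbhost/trecs")  # placeholder connection details
```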

  6. Database Configuration Recommendations
  • Use indexes for searches and high volume processes such as Importing, Matching, Reconciliation Generation, GL Export, etc. (see the indexing sketch below)
  • Index choices depend on the data distribution particular to each customer's business requirements and should be based on actual queries
  • Speeding up searches reduces database locks
  • Drawbacks of having too many indexes
    • Space considerations, the time it takes to update indexes during data manipulations, DB maintenance time, etc.
    • Remove ineffective indexes
  • Perform database maintenance regularly
    • If the data changes rapidly, do not rely on automatic stats collection
  • Database partitioning
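As a concrete illustration of the indexing advice, the sketch below adds a supporting index and refreshes statistics on a SQL Server back end via pyodbc. The table and column names (TRANSACTION_DETAIL, ACCOUNT_ID, REFERENCE1) are hypothetical placeholders, not the actual T-Recs schema; base real indexes on the queries your import, matching, and reconciliation jobs actually run.

```python
# Hypothetical indexing sketch for a SQL Server back end.
# TRANSACTION_DETAIL, ACCOUNT_ID and REFERENCE1 are placeholder names,
# not the real T-Recs schema.
import pyodbc

CONN_STR = "DRIVER={SQL Server};SERVER=dbhost;DATABASE=TRECS;Trusted_Connection=yes"

def tune_matching_index() -> None:
    conn = pyodbc.connect(CONN_STR, autocommit=True)
    try:
        cur = conn.cursor()
        # Support match/search queries that filter on account plus Reference 1.
        cur.execute(
            "CREATE NONCLUSTERED INDEX IX_TXN_ACCT_REF1 "
            "ON TRANSACTION_DETAIL (ACCOUNT_ID, REFERENCE1)"
        )
        # Refresh statistics explicitly rather than relying on auto-stats
        # when the data changes rapidly, as recommended above.
        cur.execute("UPDATE STATISTICS TRANSACTION_DETAIL")
    finally:
        conn.close()

tune_matching_index()
```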

  7. Code Optimization
  • Current and ongoing process to improve application performance:
    • Hot spots / bottlenecks
    • Execution time
    • Memory usage
    • Bandwidth / network usage
  • Java upgrades to 1.7.x and 64-bit platforms
  • Re-design of current code to improve performance and system response time
  • Functionality migration from legacy desktop applications to the T-Recs Enterprise Web Client
  • Security standards and updates

  8. Code Optimization
  • Long term plans:
    • Updates to the latest technologies in the market
    • Platform updates: Java, utilities, libraries, database engines
    • Security standards
    • Reporting engines (more appealing and flexible reports)
    • Support for additional platforms (mobile)
    • Automated system testing under stress / high volume scenarios

  9. Code Optimization
  • T-Recs is designed to be versatile and perform well in a wide range of scenarios. It can be configured to satisfy the requirements of most businesses.
  • Contact Development to customize the process according to your specific needs:
    • Custom procedures to calculate and update complex data
    • Scripts to re-format data (a simple sketch follows below)
    • Reports to pull customized data
    • Personalized file exports to interface with other systems
  • All these custom processes will be integrated within your T-Recs installation and can be executed on demand or on a scheduled basis.
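As an example of what a "script to re-format data" can look like, here is a minimal sketch that converts a hypothetical pipe-delimited bank extract into a comma-delimited file with normalized dates and amounts. The input and output layouts are illustrative only, not an actual T-Recs interface specification.

```python
# Illustrative pre-import reformatting script. The four-column,
# pipe-delimited input layout is hypothetical.
import csv
from datetime import datetime

def reformat(src_path: str, dst_path: str) -> None:
    with open(src_path, newline="") as src, open(dst_path, "w", newline="") as dst:
        reader = csv.reader(src, delimiter="|")
        writer = csv.writer(dst)
        for account, post_date, amount, reference in reader:
            # Normalize MM/DD/YYYY dates to YYYY-MM-DD and strip commas from amounts.
            iso_date = datetime.strptime(post_date, "%m/%d/%Y").strftime("%Y-%m-%d")
            writer.writerow([account.strip(), iso_date, amount.replace(",", ""), reference.strip()])

if __name__ == "__main__":
    reformat("bank_extract.txt", "trecs_import.csv")  # placeholder file names
```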

  10. Code Optimization – Reference Fields
  • Data in T-Recs can be stored in different Reference field types:
    • Standard Transaction Fields (limited number of fields)
    • Custom Fields (unlimited fields, but performance can be impacted)
  • Both types are alphanumeric, and data validations are unsupported (prone to errors)
  • What is alphanumeric? Why can performance be impacted? (see the sorting example below)
  • Sort by:
    • 01234
    • 56789
    • ABC
    • XYZ
    • 2014-02-30
    • +-*/$
    • 2014AB123-$100.5
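The sort list above is the crux: in an alphanumeric field everything is compared character by character, so numbers, dates, and amounts neither order nor validate the way typed data would. A short illustration using the slide's own values:

```python
# Why alphanumeric (text) storage behaves unexpectedly: values are compared
# character by character, and nothing is validated. Sample values echo the
# slide's "Sort by" list.
values = ["56789", "ABC", "2014-02-30", "01234", "XYZ", "+-*/$", "2014AB123-$100.5"]
print(sorted(values))
# ['+-*/$', '01234', '2014-02-30', '2014AB123-$100.5', '56789', 'ABC', 'XYZ']

# As text, "100.5" sorts before "99" because '1' < '9' character-wise.
print(sorted(["100.5", "99"]))   # ['100.5', '99']

# And nothing stops an impossible date from being stored in a text field:
bad_date = "2014-02-30"          # February 30th is accepted as-is
print(bad_date in values)        # True
```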

  11. Code Optimization – Supplemental Fields
  • Starting with T-Recs 7.1, new Supplemental Fields have been introduced.
  • Different data types
    • Amount fields (multi-currency)
    • Date fields
    • Numeric (integer / decimals)
    • Alphanumeric
  • Customizable labels
  • Data can be validated (see the validation sketch below)
  • Faster operations involving supplemental fields
  • Fields can be secured individually
  • Supported in SmartMatch, SmartResolve, Reconciliation…
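A minimal sketch of why typed supplemental fields help with data quality: typed values can be rejected on the way in, while an alphanumeric reference field accepts anything. The parsing rules here are only illustrative, not the product's actual validation logic.

```python
# Illustrative only: typed fields can reject bad values at entry,
# while alphanumeric fields accept anything.
from datetime import datetime
from decimal import Decimal, InvalidOperation

def parse_date(text: str) -> None:
    try:
        datetime.strptime(text, "%Y-%m-%d")
        print(f"date {text!r}: accepted")
    except ValueError:
        print(f"date {text!r}: rejected")    # e.g. 2014-02-30 from the previous slide

def parse_amount(text: str) -> None:
    try:
        Decimal(text)
        print(f"amount {text!r}: accepted")
    except InvalidOperation:
        print(f"amount {text!r}: rejected")  # e.g. "$100.5" with a stray symbol

parse_date("2014-02-28")   # accepted
parse_date("2014-02-30")   # rejected by a Date field, accepted by an alphanumeric one
parse_amount("100.50")     # accepted
parse_amount("$100.5")     # rejected by an Amount field, accepted by an alphanumeric one
```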

  12. High Account Volume – Configuration Considerations and Recommendations
  • Structured Importing
    • Load smaller groups of accounts (see the batching sketch below).
  • Operational Structures
    • Group accounts together (e.g. by type of account, responsibility for the account, department, or process).
    • Don't put 1 account per node.
  • Users
    • As of 7.0 there is upload functionality to load users, as well as operational structure assignments.
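A minimal sketch of the "load smaller groups of accounts" advice: split one large account list into fixed-size batches so each structured import stays small. The batch size and file naming are illustrative choices, not product requirements.

```python
# Split a large account list into smaller import batches.
# Batch size and output naming are illustrative only.
from itertools import islice
from typing import Iterable, Iterator, List

def batched(accounts: Iterable[str], size: int = 500) -> Iterator[List[str]]:
    it = iter(accounts)
    while chunk := list(islice(it, size)):
        yield chunk

def write_batches(account_file: str) -> None:
    with open(account_file) as fh:
        accounts = [line.strip() for line in fh if line.strip()]
    for i, chunk in enumerate(batched(accounts), start=1):
        with open(f"account_batch_{i:03}.txt", "w") as out:
            out.write("\n".join(chunk) + "\n")

if __name__ == "__main__":
    write_batches("all_accounts.txt")  # placeholder file name
```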

  13. High Account Volume – Configuration Considerations and Recommendations
  • Reconciliation Workflow
    • Set Up
      • Load smaller groups of accounts if using the structured importing format.
      • Make sure a different person is assigned to each adjacent role.
    • Adjusted Balances
      • Loading balances in rows is recommended if the file contains more than 1 month.
    • Workflow Process
      • Auto-Certify where you are able to.
      • Mass Verify
        • Accounts must share the same workflow.
      • Grouping accounts into 1 T-Recs account vs. many T-Recs accounts in 1 grouped recon.

  14. High Transactional Volume – Configuration Considerations and Recommendations
  • Importing
    • Data files should ALWAYS be sorted by Import Account Number (see the sorting sketch below)
    • If the Import Account Number is determined by concatenating fields, the sort order of the file should be based on those fields
    • Consider using the new Supplemental fields – they require fewer table joins than Custom Fields
  • Match Rules
    • The primary matching field should be placed in Reference 1
    • If additional Reference fields are required for matching, contact CSSI for indexing recommendations
  • Match Rule Sets
    • Order match rules with 1-to-1 matching first, then 1-to-Many, then Many-to-Many
    • Create an Operational Structure for matching purposes
      • Only include Accounts in which matching will actually occur
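Since data files should always arrive sorted by Import Account Number, a pre-import sort step is easy to script. A minimal sketch, assuming a CSV layout in which the account number is concatenated from the first two columns; substitute whatever fields your own import layout actually uses.

```python
# Sort an import file by the fields that make up the Import Account Number.
# The column positions (0 and 1) are hypothetical.
import csv

def sort_import_file(src_path: str, dst_path: str) -> None:
    with open(src_path, newline="") as fh:
        rows = list(csv.reader(fh))
    # Account number assumed to be built from columns 0 and 1
    # (e.g. bank code + account), so sort on exactly those fields, in order.
    rows.sort(key=lambda row: (row[0], row[1]))
    with open(dst_path, "w", newline="") as fh:
        csv.writer(fh).writerows(rows)

if __name__ == "__main__":
    sort_import_file("daily_transactions.csv", "daily_transactions_sorted.csv")
```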

  15. High Transactional Volume – Configuration Considerations and Recommendations
  • SmartResolve Procedures
    • Narrow down the main Transaction Criteria Set as much as possible
    • If Reference fields 2-10 are necessary, contact CSSI for indexing recommendations
    • If Custom Fields (Reference 11+) are required, consider using the new Supplemental fields instead
    • For the Step Action “Send to Exception List”, consider the volume of data that will qualify to be sent to the Exception List. This tool is designed to assist users in the research/resolution of exceptions – what is a manageable amount for users to research?
      • If the volume is unmanageable for the user, the tool is ineffective
      • For high volume exceptions, consider the Custom Export feature
  • Reporting
    • Use an Output Format of TXT, CSV or XLS and export the file to a directory (filesystem), or use the export functionality

  16. Thank you!
