Charon-VAX in Rdb Engineering

Norman Lastovica

Oracle Rdb Engineering

Oracle New England Development Center

[email protected]


Problem Overview

  • Needed to reduce computer lab footprint

    • Floor space, power & cooling

  • Very old hardware maintenance headache

    • Unreliable & difficult to repair

    • VAX computers over 15 years old

      • Some Star Couplers, HSJs & disks even older


Problem Overview

  • Performance of CPU, memory, Ethernet, disks, controllers & buses lags behind Alpha & I64

  • Need multiple VAX environments for building, testing, debugging & supporting the Rdb product family


Charon-VAX Solution

  • Replace approximately 12 large VAX systems (6000 & 7000 class) in several clusters with Charon-VAX emulators

    • Consolidate/simplify existing systems & clusters

  • Migrate primarily to SAN-based storage

  • Sub-goal of improved performance & reliability for users (at least no reduction)


Remove VAX Hardware?

“Oracle Corporation supports VAX versions of Oracle Rdb and Oracle CODASYL DBMS and their related products running on CHARON-VAX provided that any problems reported can be reproduced by Oracle Support on an actual VAX.”

“HP Services supports HP OpenVMS software on the CHARON-VAX and CHARON-AXP emulators running on HP systems only. Existing software service contracts are valid on supported OpenVMS VAX and OpenVMS Alpha AXP applications running on the appropriate emulator. HP fixes software problems if they are also seen in the comparable VAX or Alpha AXP environment.”


Extensive Testing

  • Before announcing support for Rdb on Charon-VAX

    • Extensive Rdb & DBMS Regression tests

    • Various Performance tests

    • Consultations with HP & SRI



Prime Number Generation

  • C program from Internet

  • Single-user

  • CPU intensive

  • Dell laptop host system

    • Single 2GHz Intel CPU

    • …at 35,000 feet
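The deck does not include the program itself; the following is a minimal C sketch of the kind of CPU-bound prime-counting loop described above. The 1,000,000 limit is an arbitrary assumption.

    /* Minimal sketch of a CPU-bound prime-counting benchmark of the kind
       described above ("C program from Internet").  The actual program is
       not shown in the deck; the 1,000,000 limit is an assumption. */
    #include <stdio.h>

    #define LIMIT 1000000

    static int is_prime(unsigned long n)
    {
        unsigned long d;
        if (n < 2)
            return 0;
        for (d = 2; d * d <= n; d++)
            if (n % d == 0)
                return 0;
        return 1;
    }

    int main(void)
    {
        unsigned long n, count = 0;
        for (n = 2; n <= LIMIT; n++)
            count += is_prime(n);
        printf("%lu primes up to %lu\n", count, (unsigned long)LIMIT);
        return 0;
    }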


More Prime Number Generation

  • C program from Internet

  • Single-user

  • CPU intensive

  • HP BL25p host

    • Dual 2.6GHz dual-core AMD


Random Floating Point Additions

  • Update random floating point numbers in 1MB global section

  • Single-user

  • HP BL25p host

    • Dual 2.6GHz dual-core AMD
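Again, only the results are shown in the deck; below is a rough C sketch of the test described above. The actual test updated a 1MB VMS global section, while this sketch uses a plain heap buffer, and the iteration count is an assumption.

    /* Rough sketch: add random values to random slots in a 1MB region.
       The actual test used a VMS global section; a heap buffer is used
       here to keep the sketch simple, and ITERATIONS is an assumption. */
    #include <stdio.h>
    #include <stdlib.h>

    #define REGION_BYTES (1024 * 1024)
    #define ITERATIONS   10000000L

    int main(void)
    {
        size_t nslots = REGION_BYTES / sizeof(float);
        float *region = (float *)calloc(nslots, sizeof(float));
        long i;

        if (region == NULL)
            return EXIT_FAILURE;

        for (i = 0; i < ITERATIONS; i++) {
            size_t slot = (size_t)(rand() % (int)nslots);
            region[slot] += (float)rand() / (float)RAND_MAX;
        }

        printf("performed %ld random single-precision additions\n", ITERATIONS);
        free(region);
        return 0;
    }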


Lock Request Latencies: Local & Remote

  • HP BL25p host

    • Dual 2.6GHz dual-core AMD
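The slide reports only the measured latencies. Below is a minimal sketch, assuming OpenVMS DEC C, that times repeated exclusive-mode $ENQW/$DEQ pairs on a named resource; the resource name, iteration count and locally defined lock status block are assumptions, and whether the resource master is local or on another cluster node determines the local vs. remote case.

    /* Sketch: time ITERATIONS exclusive-mode lock request/release pairs
       using $ENQW/$DEQ.  Assumes OpenVMS DEC C.  The lksb struct follows
       the documented lock status block layout; the resource name and
       iteration count are arbitrary. */
    #include <stdio.h>
    #include <descrip.h>
    #include <lckdef.h>
    #include <starlet.h>

    #define ITERATIONS 100000

    struct lksb {
        unsigned short status;      /* completion condition value */
        unsigned short reserved;
        unsigned int   lkid;        /* lock ID filled in by $ENQW */
        char           valblk[16];  /* lock value block           */
    };

    int main(void)
    {
        $DESCRIPTOR(resnam, "RDB_LOCK_LATENCY_TEST");  /* assumed name */
        struct lksb lksb;
        unsigned int t0[2], t1[2];  /* system time quadwords, 100ns units */
        double start, end;
        int i, status;

        sys$gettim((void *)t0);
        for (i = 0; i < ITERATIONS; i++) {
            status = sys$enqw(0, LCK$K_EXMODE, (void *)&lksb, 0, &resnam,
                              0, 0, 0, 0, 0, 0);
            if (!(status & 1) || !(lksb.status & 1))
                break;
            sys$deq(lksb.lkid, 0, 0, 0);
        }
        sys$gettim((void *)t1);

        start = (double)t0[1] * 4294967296.0 + (double)t0[0];
        end   = (double)t1[1] * 4294967296.0 + (double)t1[0];
        printf("%.1f microseconds per request/release pair\n",
               (end - start) / 10.0 / ITERATIONS);
        return 0;
    }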


DBMS Regression Test

  • Sun V65 host

    • Dual 3.06GHz Intel with HT

    • MSCP served disks via Ethernet


Rdb Database Populate

  • VAX 6650

    • HSJ storage

  • HP DL 585 host

    • Quad 2.4GHz AMD

    • Single IDE disk

  • Single user storing data into the database

  • Average for 100 txn


Single User OLTP

  • Single user

  • Random DB update

  • Average for 1,000 txn

  • VAX 6650

    • HSJ storage

  • HP DL 585 host

    • Quad 2.4GHz AMD

    • Single IDE disk
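The populate and OLTP slides above both report an average time per transaction. A minimal timing-harness sketch follows; store_one_txn() is a hypothetical stand-in for the actual Rdb work (attach, insert/update, commit), which the deck does not show.

    /* Minimal harness for the "average per transaction" figures above.
       store_one_txn() is a hypothetical placeholder for the real Rdb
       transaction logic, which is not shown in the deck. */
    #include <stdio.h>
    #include <time.h>

    #define TXN_COUNT 100

    static void store_one_txn(int i)
    {
        (void)i;   /* placeholder: real test stored rows and committed */
    }

    int main(void)
    {
        time_t start = time(NULL);
        int i;

        for (i = 0; i < TXN_COUNT; i++)
            store_one_txn(i);

        printf("average %.3f seconds per txn\n",
               (double)(time(NULL) - start) / TXN_COUNT);
        return 0;
    }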


Synchronous Random 5-block IO

  • $IOT /COUNT=2000 /QUE=1 /SIZE=5 SYS$SYSDEVICE

  • CI HSJ40 on VAX 6650

  • Fibre EVA3000 on Charon 6630
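The IOT command above drives the actual test; the portable C sketch below merely approximates the same access pattern (2000 synchronous random reads of 5 blocks each, queue depth 1) against a placeholder file. For the QUE=8 runs on the following slides, the same pattern would be issued with eight IOs outstanding (for example via asynchronous $QIO on VMS).

    /* Approximation of the IOT run above: 2000 synchronous random reads
       of 5 blocks (5 x 512 bytes), one outstanding IO at a time.  The
       file name is a placeholder; the real test ran against SYS$SYSDEVICE. */
    #include <sys/types.h>
    #include <fcntl.h>
    #include <stdio.h>
    #include <stdlib.h>
    #include <unistd.h>

    #define BLOCK     512
    #define IO_BLOCKS 5
    #define IO_COUNT  2000

    int main(void)
    {
        char buf[BLOCK * IO_BLOCKS];
        int fd = open("iotest.dat", O_RDONLY);      /* placeholder file */
        off_t size, nchunks;
        int i;

        if (fd < 0) {
            perror("open");
            return EXIT_FAILURE;
        }
        size = lseek(fd, 0, SEEK_END);
        nchunks = size / (BLOCK * IO_BLOCKS);
        if (nchunks <= 0) {
            close(fd);
            return EXIT_FAILURE;
        }

        for (i = 0; i < IO_COUNT; i++) {
            off_t chunk = (off_t)(rand() % (long)nchunks);
            lseek(fd, chunk * BLOCK * IO_BLOCKS, SEEK_SET);
            if (read(fd, buf, sizeof buf) != (ssize_t)sizeof buf) {
                perror("read");
                break;
            }
        }
        close(fd);
        return 0;
    }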


Queue of Random 5-block IO

  • $IOT /COUNT=2000 /QUE=8 /SIZE=5 SYS$SYSDEVICE

  • CI HSJ40 on VAX 6650

  • Fibre EVA3000 on Charon 6630


Queue of Random 5-block IO

  • $IOT /COUNT=7500 /QUE=8 /SIZE=5 RDB$TEST_SYS1:

  • Fibre EVA3000 on Charon 6630

  • Software RAID set of 10 disks on CI HSJ40s on VAX 6650


Create and Sort File of Random Records

1,000,000 records / 256,167 blocks
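The slide gives only the file size; a rough C sketch follows. The 128-byte fixed-length records with a leading integer key are an assumption, and the actual test most likely sorted the on-disk file (for example with the OpenVMS SORT utility) rather than in memory as done here for brevity.

    /* Rough sketch: create a file of 1,000,000 fixed-length records with
       random keys, then sort it.  Record size, key layout and file names
       are assumed; the test's actual record format is not shown. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    #define NREC    1000000L
    #define RECSIZE 128

    static int cmp_key(const void *a, const void *b)
    {
        long ka, kb;
        memcpy(&ka, a, sizeof ka);
        memcpy(&kb, b, sizeof kb);
        return (ka > kb) - (ka < kb);
    }

    int main(void)
    {
        char *recs = (char *)malloc((size_t)NREC * RECSIZE);
        FILE *fp;
        long i;

        if (recs == NULL)
            return EXIT_FAILURE;

        /* create: random key at the front of each fixed-length record */
        for (i = 0; i < NREC; i++) {
            long key = rand();
            memset(recs + i * RECSIZE, 0, RECSIZE);
            memcpy(recs + i * RECSIZE, &key, sizeof key);
        }
        fp = fopen("random.dat", "wb");             /* placeholder names */
        if (fp) { fwrite(recs, RECSIZE, (size_t)NREC, fp); fclose(fp); }

        /* sort in memory and write the sorted copy */
        qsort(recs, (size_t)NREC, RECSIZE, cmp_key);
        fp = fopen("sorted.dat", "wb");
        if (fp) { fwrite(recs, RECSIZE, (size_t)NREC, fp); fclose(fp); }

        free(recs);
        return 0;
    }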



HP ProLiant BL25p Server Blade

  • 1.7in (4.3cm) x 10.3in (26.2cm) x 28in (71cm)

  • 21 lb (9.5 kg)

  • Two Dual-Core AMD Opteron™ (2.6 GHz)

  • 6GB PC3200 DDR SDRAM at 400 MHz

  • 4 Gigabit NIC ports

  • Dual Port 2-Gb Fibre Channel Adapter

  • Internal HP 36GB U320 15K disk


Why BL25p?

  • Two dual-core processors = 4 effective CPUs

    • Run CHARON-VAX/6630 Plus for Windows

    • Windows Server 2003 Standard Edition

      • Only 4GB of our 6GB memory is usable due to a limit in Standard Edition (higher editions have larger limits) – whoops

  • More cost- & space-effective than a 4-processor DL585

    • Very nearly the same peak performance (2.6GHz dual-core vs. 2.8GHz single-core)


Why BL25p?

  • Up to 8 BL25p servers in a single 10.5-inch-tall enclosure

    • Using existing rack space = no additional floor space

  • Remote management capable

    • Software KVM switch console / iLO

  • Alternative: BL35p

    • 2 NIC ports, two 2.4GHz dual-core CPUs, 5K or 10K internal disks

    • Same price per server

    • Up to 16 per enclosure


Blade Enclosure

  • 10.5 inches high

  • 19-inch rack mount

  • Holds 8 BL25p or 16 BL35p blades



Phased Implementation Plan

  • Replace 3 test clusters (total of 8 nodes) with a single 2-node cluster

    • Install, test & experiment with the new hardware, then migrate workloads & shut down the old systems

    • Work out installation & configuration issues without impacting the development cluster or test environments

  • Replace 3 VAX nodes in development cluster


Best of Intentions

  • Multiple UPS failures in a single day

  • 2 VAX systems in the development cluster suffered serious hardware damage – multiple power supplies failed

  • Led to an accelerated Charon-VAX deployment


Original Development Cluster Configuration

(Cluster diagram)

  • 2 I64 rx4640 (V8.2-1) & 2 Alpha (V8.2)

  • 2 VAX 6650 (V7.3)

  • Ethernets carrying DECnet, TCP/IP & SCS

  • 2 CI rails, 2 Star Couplers, HSJ40s

  • 2Gb SAN, EVA5000


New Development Cluster Configuration

(Cluster diagram)

  • 2 I64 rx4640 & 2 Alpha

  • 3 BL25p / Charon-6630

  • Ethernets carrying DECnet, TCP/IP & SCS

  • 2Gb SAN, EVA5000


Test Cluster Configuration

(Cluster diagram)

  • 2 BL25p / Charon-6630

  • Ethernets carrying DECnet, TCP/IP & SCS

  • 2Gb SAN, EVA5000


Host Detail Configuration

(Host detail diagram)

  • 6 BL25p hosts running Charon-6630

  • Local Windows system disk & a VMS page/swap container file per host

  • Shared "DUA" disks per cluster: VMS system & data disks

  • "Raw" LUNs on the EVA5000 SAN presented to Windows


VAX Disks on SAN

  • Charon presents a raw SAN LUN as an MSCP "DUA" device

  • VAX/VMS sees it as "DUAx:", just as it would from an HSJ

  • If needed, it must be MSCP served from the VAX to other Alpha/I64 nodes – they cannot access the LUN directly because it appears to them as a "DGA" device

  • Multiple Charon-VAX systems in the cluster access the same SAN LUN under the same DUA name


Memory Configuration

  • Various configurations of 128MB, 256MB & 512MB on 76x0, 66x0, 65x0 & 64x0 test and development systems

  • 1GB on our Charon-6630

    • Can be increased to 2GB with enough host system memory

  • May allow VMS processes larger caches and/or working sets, reducing paging & IO


Disk Configuration

  • Local host Windows system disk

    • Could alternatively have been on the SAN

  • VAX system disk is an EVA5000 disk unit shared among multiple Charon hosts in the cluster

  • VAX system page/swap disk is a container file on the local host disk


Disk Performance

  • Access to the VAX system disk on the SAN is roughly 2 to 50 times faster than CI-based HSJ storage

  • Disks MSCP served from Alpha perform about the same on Charon-6630 (via NI) as on the hardware VAX (via CI)

  • Once the new configuration proves reliable, the CI-related hardware will be retired


System Performance

  • A single Charon-6630 is roughly 3 times faster than a hardware 6650

    • On our host, a Charon-66x0 CPU runs around 3 to 6 times faster than a hardware 66x0 CPU

  • Fewer CPUs should result in less inter-processor contention (i.e., MPSYNCH time)


Application Performance

  • First Rdb regression test suite run time dropped from about 12 hours to about 8 hours – a roughly 33% reduction


Relatively Simple Deployment

  • Install & configure Windows Server 2003 Standard Edition

    • Install Windows updates/patches, anti-virus software & Charon

    • Disable the documented Windows services

    • Configure prototype Charon-VAX emulator template files

  • Replicate Windows disk to other hosts


Relatively Simple Deployment

  • Create VAX disk LUNs on SAN

    • Make visible to existing Alpha or I64 system in cluster

    • BACKUP/IMAGE existing VAX disks to new LUNs

    • Make LUNs visible to PC host servers

    • Shut down the VAX systems cluster-wide & dismount the CI disks on the remaining nodes

    • Start Charon-VAX and boot from LUN on multiple hosts

    • Mount new disks (served via MSCP to cluster from Charon-VAX nodes) on remaining nodes


Watch Out For…

  • Read all documentation before you embark

    • Charon-VAX, Windows Server 2003

  • Working with raw SAN LUNs presented to Windows as VMS cluster disks can be difficult

  • Disk unit numbers presented to cluster & disk allocation classes


Watch Out For…

  • Boot options & saving in ROM file

  • Windows Server 2003 Standard Edition limits: 4 processors & 4GB memory

    • Other “editions” offer higher limits

  • Users concerned that things may be broken because they run so fast!


