
Welcome to TechEdge

Presentation Transcript


  1. Welcome to TechEdge

  2. Why Use Twitter @ TechEdge? • Back channel for real-time conversations • Broadcast key takeaways • Ask questions • Event feedback

  3. How to Use Twitter During TechEdge • Twitter will appear on projector screen during: • Breaks • Q&A • Wireless access code: CTXS_Synergy_TE • Join the Conversation • Contribute: Include #TechEdgeC as part of each Tweet • Follow: Visit http://search.twitter.com. Enter #TechEdgeC

  4. Follow Citrix Tech Support on Twitter • Join the Conversation and follow Citrix Tech Support: @citrixsupport • Owner: Mike Stringer - Sr. Director, Americas/India Support

  5. TechEdge 2009 Citrix Delivery Center

  6. Presenters • Kapildev Ramlal • Sr. Escalation Engineer (XenDesktop, XenApp) • Keith McLaughlin • Escalation Engineer (Provisioning Server) • Jacob Maynard • Sr. Escalation Engineer (Access Gateway Enterprise Edition) • Don Williams • Escalation Manager (NetScaler)

  7. Agenda • Citrix Delivery Center Intro • XenDesktop and XenApp • XenServer • Provisioning Server • Access Gateway Enterprise Edition • NetScaler

  8. Citrix Delivery Center

  9. XenApp

  10. Introducing Citrix XenApp • Citrix XenApp was formerly known as Citrix Presentation Server • Prior to Citrix Presentation Server, it was known as Citrix MetaFrame, and prior to that, Citrix WinFrame • It is the heart of Application Virtualization • It delivers applications as an on-demand service to users anywhere using any device

  11. Citrix XenApp Architecture • Utilizes a Farm concept • A server farm is a logical grouping of servers running XenApp that share a data store (Diagram: farm servers in California, New York, and Florida sharing a single Data Store)

  12. Citrix XenApp Architecture – What is IMA? • Independent Management Architecture (IMA) is the infrastructure for inter-server communication • A collection of subsystems that control the various features of the Citrix XenApp family of products • IMA enables centralized administration of the farm • Implemented as a Windows service (managed by the Service Control Manager)

  13. Citrix XenApp Architecture – IMA Subsystems • A subsystem is a DLL (*.dll) file • Subsystems allow for a modular plug-in architecture • The subsystem DLL files are typically found in: • x86: Program Files\Citrix\System32\Citrix\IMA\Subsystems • x64: Program Files (x86)\Citrix\System32\Citrix\IMA\Subsystems
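
The modular layout can be confirmed by simply listing that directory. A minimal Python sketch, assuming the default install paths from the slide with the drive letter C: prepended (adjust for non-default installations):

```python
import os

# Default IMA subsystem directories from the slide; the C: drive
# letter is an assumption, adjust for non-default installs.
SUBSYSTEM_DIRS = [
    r"C:\Program Files\Citrix\System32\Citrix\IMA\Subsystems",        # x86
    r"C:\Program Files (x86)\Citrix\System32\Citrix\IMA\Subsystems",  # x64
]

def list_ima_subsystems():
    """Print every subsystem DLL found in the first directory that exists."""
    for directory in SUBSYSTEM_DIRS:
        if os.path.isdir(directory):
            dlls = sorted(f for f in os.listdir(directory)
                          if f.lower().endswith(".dll"))
            print(f"{len(dlls)} subsystem DLLs in {directory}:")
            for name in dlls:
                print(" ", name)
            return
    print("No IMA Subsystems directory found on this server.")

if __name__ == "__main__":
    list_ima_subsystems()
```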

  14. Citrix XenApp Architecture – Farm Data • The Farm relies on data • The IMA service is the backbone of the Farm and is responsible for manipulating the Farm's data • Each XenApp server runs the IMA service • There are two main forms of data: • Static Data • Data that changes infrequently, such as published applications, Citrix Administrators, and Citrix policies • Dynamic Data (Dynamic Store) • Data that changes frequently, such as connected sessions

  15. Citrix XenApp Architecture – Farm Data • The Dynamic Store • The dynamic data is stored in in-memory tables on the Data Collector (the Dynamic Store) • This information can be viewed using the QueryDS.exe utility, located in the following directory on the XenApp CD: w2k3\retail\Support\debug\w2k3

  16. Citrix XenApp Architecture – Farm Data • The Local Host Cache (Static Data) • Is a subset of the Data Store containing information required only by that server • Allows the server to operate if the Data Store goes down • Must exist and be accessible for the IMA Service to start • Is an Access database located on every XenApp server in the farm • x86: Program Files\Citrix\Independent Management Architecture\imalhc.mdb • x64: Program Files (x86)\Citrix\Independent Management Architecture\imalhc.mdb
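
Since the IMA Service will not start without an accessible LHC, confirming the file exists is a quick first troubleshooting step. A minimal sketch using the default paths from the slide (the C: drive letter is assumed):

```python
import os

# Default Local Host Cache locations from the slide; C: is assumed.
LHC_PATHS = [
    r"C:\Program Files\Citrix\Independent Management Architecture\imalhc.mdb",        # x86
    r"C:\Program Files (x86)\Citrix\Independent Management Architecture\imalhc.mdb",  # x64
]

def find_lhc():
    """Return the path of the Local Host Cache, or None if it is absent."""
    for path in LHC_PATHS:
        if os.path.isfile(path):
            size_kb = os.path.getsize(path) / 1024
            print(f"LHC found: {path} ({size_kb:.0f} KiB)")
            return path
    print("imalhc.mdb not found; the IMA Service cannot start without it.")
    return None

if __name__ == "__main__":
    find_lhc()
```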

  17. Citrix XenApp Architecture – IMA Startup (Diagram: the Service Control Manager starts IMASrv.exe, which loads the required plug-in ImaRuntimeSs.dll and product plug-in subsystems such as ImaRpcSs.dll, ImaSrvSs.dll, ImaAppSs.dll, MfSrvSs.dll, MfBrowserSs.dll, ImaUserSs.dll, ImaDomain.dll, RMAlertsSS.dll, RMMonitorSS.dll, and RMSummaryDBSS.dll. The PSRequired value under HKLM\Software\Citrix\IMA\Runtime selects the startup path: PSRequired=1 requires reaching the Data Store via the Zone Data Collector, while PSRequired=0 allows the service to start from the LHC.)

  18. Citrix XenApp Architecture - Zones • A server farm can consist of one or more zones • A server farm is typically divided into zones when the servers in the server farm are separated geographically • Each zone has a data collector • The data collector is responsible for collecting data from member servers and distributing it to other data collectors • The first server in the zone is designated as the data collector for the zone, by default
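
Because the default election simply designates the first server in each zone, the assignment is easy to model. A toy sketch with a hypothetical farm layout (real XenApp elections also honor configured data collector preference rankings):

```python
def assign_data_collectors(zones):
    """By default, the first server in each zone becomes its data collector."""
    return {zone: servers[0] for zone, servers in zones.items() if servers}

if __name__ == "__main__":
    farm = {  # hypothetical farm layout echoing the earlier diagram
        "California": ["CA-XA01", "CA-XA02"],
        "NewYork":    ["NY-XA01"],
        "Florida":    ["FL-XA01", "FL-XA02", "FL-XA03"],
    }
    print(assign_data_collectors(farm))
    # -> {'California': 'CA-XA01', 'NewYork': 'NY-XA01', 'Florida': 'FL-XA01'}
```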

  19. Citrix XenApp Architecture - Zones (Diagram: Zones A, B, and C, fronted by the Web Interface and XML Broker)

  20. Citrix XenApp Architecture – Change Notification • 1) A change is made in the Access Management Console, which contacts a member server via TCP port 2513 • 2) The member server writes the change to the Data Store and forwards the information to its Data Collector via TCP port 2512 • 3) The DC updates the LHC on the member servers in its zone via TCP port 2512 and forwards the information to all the other DCs • 4) The other DCs send the information to their member servers (Diagram: Zone A and Zone B, each with a Data Collector and member servers, sharing one Data Store)
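
If change notifications are not propagating, verifying that these ports are reachable between farm servers is a sensible first check. A minimal sketch using only the two port numbers from the slide; the server name is hypothetical:

```python
import socket

# IMA communication ports from the slide: 2512 for server-to-server
# traffic, 2513 for the management console.
IMA_PORTS = {2512: "IMA server-to-server", 2513: "management console"}

def check_ima_ports(host, timeout=3.0):
    """Attempt a TCP connection to each IMA port on the given server."""
    for port, role in IMA_PORTS.items():
        try:
            with socket.create_connection((host, port), timeout=timeout):
                print(f"{host}:{port} ({role}) - reachable")
        except OSError as exc:
            print(f"{host}:{port} ({role}) - FAILED: {exc}")

if __name__ == "__main__":
    check_ima_ports("xenapp01.example.com")  # hypothetical server name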

  21. From Logon to Launch (Diagram: the logon-to-launch path across the Client, Web Interface, XML Broker, Data Collector, Active Directory, LHC, Data Store, Dynamic Store, and member servers, annotated with troubleshooting aids at each hop: authentication, the XML Service, basic networking, CDF tracing, verifying user logon rights, event logs, network traces, IIS logs, and the lists of servers, apps, and trusts held in the LHC and Data Store)

  22. XenDesktop Setup

  23. Active Directory Integration • Uses Kerberos to authenticate DDC-to-VM traffic • Desktops discover DDCs • No schema changes

  24. Active Directory Integration • Create an OU for XD farm • Run Active Directory Configuration Wizard

  25. XenDesktop Setup Wizard • Integrates with Hosting Infrastructure • Creates multiple virtual desktops • Integrates with PVS

  26. Pool Management

  27. Services Involved • Citrix Pool Management Service • Hosting Infrastructure: XenServer Pool Master, VMware VirtualCenter, MS SCVMM (Diagram: the Pool Management Service driving the virtual machines through the Pool Master)

  28. Pool Management (Diagram: the Pool Management Service on the Desktop Delivery Controller communicating with the Pool Master, which manages the virtual machines)

  29. Troubleshooting • Logging in Pool Management Service • CTX117452 • XenServer logs • CDF Tracing • XDPing tool

  30. XenServer

  31. Agenda • XenServer Benefits • Live Migration (XenMotion) • High Availability • Provisioning VMs with PVS • XenApp Performance • Disaster Recovery

  32. Why Virtualize? • IT flexibility/agility • Predictable scaling to dynamically respond to business need • Key part of disaster recovery strategy • Improve application availability • Server or data center consolidation • Higher utilization leads to greater consolidation • Promotes greater centralization and security • "Green Computing" • Consume less power, cooling, and real estate • Support DevTest environments • Works for both IT shops and development houses

  33. XenMotion – Live VM Movement • XenMotion allows minimal downtime movement of VMs between physical systems • Generally 150-200ms of actual “downtime” • Most of the downtime is related to network switch moving IP traffic to new port
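
The 150-200 ms figure can be sanity-checked by probing the VM continuously while it migrates. A rough sketch, assuming a hypothetical VM address with an open TCP port (RDP's 3389 is used here); probe granularity and TCP handshake overhead limit the precision:

```python
import socket
import time

def measure_downtime(host, port=3389, probe_interval=0.05, duration=60.0):
    """Probe a TCP port on the VM in a tight loop and report the longest
    gap between successful connects: a rough proxy for migration downtime."""
    longest_gap = 0.0
    last_ok = time.monotonic()
    deadline = last_ok + duration
    while time.monotonic() < deadline:
        try:
            with socket.create_connection((host, port), timeout=0.2):
                now = time.monotonic()
                longest_gap = max(longest_gap, now - last_ok)
                last_ok = now
        except OSError:
            pass  # connection failed: the VM is mid-move or unreachable
        time.sleep(probe_interval)
    print(f"Longest observed gap: {longest_gap * 1000:.0f} ms")

if __name__ == "__main__":
    # Hypothetical VM address; start this loop, then trigger the XenMotion.
    measure_downtime("vm01.example.com")
```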

  34. XenMotion Enables Zero Downtime (Diagram: a VM moving between Xen Hypervisor hosts attached to Shared Storage)

  35. XenApp Optimizations • Specific performance optimizations for XenApp • Pre-built VM Templates for installing XenApp on XenServer

  36. Simplifying Disaster Recovery • 1) Automated backup of VM metadata to the SR • 2) Replication of the SR includes the virtual disks and VM metadata • 3) Attach the replicated SR at the DR site • 4) Restoring the VM metadata recreates the VMs (Diagram: Production Site and DR Site, each with Xen Hypervisor hosts on Shared Storage; the numbered steps trace the replication path between the sites)

  37. High Availability • High availability (HA) provides automatic restarts for VMs in a resource pool • When HA is enabled: • XenServer continually monitors the health of the servers in a resource pool • XenServer uses heartbeats on the network and a storage device (Heartbeat SR) to determine the state of the servers in the resource pool • If a server in the resource pool fails, the VMs running on it automatically restart on another server • If the master fails, a new server is automatically selected to take over the master role

  38. HA Requirements Requirements for enabling the HA feature include: • Shared storage, including at least one iSCSI or Fibre Channel LUN of 356 MiB or greater for the heartbeat storage repository • A XenServer resource pool • Adequate licenses on all hosts • Agile VMs Note: a separate shared storage setup is required for the VM metadata

  39. Considerations for HA • The iSCSI or Fibre Channel LUN is only required for the storage heartbeat. • Only agile VMs can be protected by the HA feature • An agile VM: • Has its virtual disks on shared storage • Does not have a connection to a local DVD drive configured • Has its virtual network interfaces on pool-wide networks Note: It is a good practice to use a bonded management interface on the servers in the pool if HA is enabled, and multipathed storage for the Heartbeat SR
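
The agility rules reduce to three boolean checks. A minimal sketch with a hypothetical VM record (XenServer itself evaluates this from the pool database):

```python
from dataclasses import dataclass

@dataclass
class VM:
    """Hypothetical summary of the VM properties that matter for agility."""
    name: str
    disks_on_shared_storage: bool
    local_dvd_attached: bool
    nics_on_pool_wide_networks: bool

def is_agile(vm: VM) -> bool:
    """All three conditions from the slide must hold for HA protection."""
    return (vm.disks_on_shared_storage
            and not vm.local_dvd_attached
            and vm.nics_on_pool_wide_networks)

if __name__ == "__main__":
    vm = VM("web01", disks_on_shared_storage=True,
            local_dvd_attached=True,  # an attached local DVD blocks agility
            nics_on_pool_wide_networks=True)
    print(f"{vm.name} agile: {is_agile(vm)}")  # -> web01 agile: False
```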

  40. Configuring HA (XenCenter) • 1) Verify the storage repository is compatible and is attached to the XenServer pool • 2) Click the entry for your resource pool in XenCenter; the HA tab appears in the main view • 3) If HA is already configured, an overview of the system status is displayed; if not, a message states that HA is not enabled. Click Configure HA

  41. Configuring HA: High Availability Wizard (XenCenter) • 4) Click Next after the High Availability dialog opens • 5) Select a storage repository and click Next • 6) Specify restart protection levels and click Next • 7) Click Finish

  42. Host Fencing • If a server failure occurs, the XenServer self-fences to ensure that the VMs are not running on two servers simultaneously • Server failure examples: • Loss of network connectivity • A problem with the control stack • When a fence action is taken, the server is immediately restarted, causing all of the VMs running on it to stop. The other servers detect that the VMs are no longer running and restart them according to the assigned priorities. The fenced server enters a reboot sequence and, when it has restarted, attempts to rejoin the resource pool

  43. High Availability – XenServer Host • Three components: • High Availability recovery plans, created at startup and stored in State.DB • Storage heartbeat to the Quorum VDisk • Network heartbeat over the management interface (Diagram: 1) heartbeat to the SR's SAN Quorum, 2) the State.DB database VDIs, 3) heartbeat over the network, plus the recovery plans)

  44. High Availability – XenServer Host • Peer-based – enable the recovery plan when: • Servers 2 and 3 have not heard from Server 1 on the network • Servers 2 and 3 have not seen an update from Server 1 on the Quorum disk • Self-aware – assume the HA plans are in play when: • Server 1 cannot see the Quorum disk • Server 1 has not heard from Server 2 or Server 3 • Server 1 self-fences – its VMs are expected to be started elsewhere • HA runs in the hypervisor: kernel mode, direct control over local interfaces, never out of resources (Diagram: Servers 1, 2, and 3 sharing the SAN Quorum and database VDIs)
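
The two decision rules on this slide are symmetric: peers act only when a host goes silent on both heartbeats, and a host fences only when it loses both itself. A simplified sketch of that logic (the real implementation runs in kernel mode inside the hypervisor and deals with timing and retries, not single boolean snapshots):

```python
def peer_should_enact_recovery(net_heartbeat_ok, quorum_update_seen):
    """Peer view: start the recovery plan only when a host is silent on
    BOTH the network heartbeat and the quorum (storage) heartbeat."""
    return not net_heartbeat_ok and not quorum_update_seen

def host_should_self_fence(can_see_quorum, can_hear_any_peer):
    """Local view: a host that has lost both the quorum disk and all of
    its peers must assume its VMs will be restarted elsewhere and fence."""
    return not can_see_quorum and not can_hear_any_peer

if __name__ == "__main__":
    # Server 1 drops off both heartbeats: peers recover, Server 1 fences.
    print(peer_should_enact_recovery(net_heartbeat_ok=False,
                                     quorum_update_seen=False))  # True
    print(host_should_self_fence(can_see_quorum=False,
                                 can_hear_any_peer=False))       # True
```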

  45. Citrix Provisioning Server

  46. Agenda • What Active Directory issues arise when streaming a vDisk • Common issues and best practices • How Provisioning Server resolves these issues

  47. Two Main Streaming Concerns with AD (Diagram: a PVS Server with its SQL database and a Domain Controller; four target devices booted from the same vDisk all report the same hostname, TD1)
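
The diagram's problem, four devices all claiming hostname TD1, is easy to detect from a device inventory. A minimal sketch with a hypothetical device list echoing the slide:

```python
from collections import Counter

def find_duplicate_hostnames(target_devices):
    """Return hostnames assigned to more than one target device.
    Duplicates cause AD machine-account conflicts when the devices
    all stream from the same vDisk."""
    counts = Counter(name.upper() for name in target_devices)
    return [name for name, n in counts.items() if n > 1]

if __name__ == "__main__":
    devices = ["TD1", "TD1", "TD1", "TD1"]  # hypothetical inventory
    print(find_duplicate_hostnames(devices))  # -> ['TD1']
```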

  48. Unique Hostnames

  49. Machine Account Creation (Diagram: the PVS Server directs the Domain Controller to add Target1 to the domain, then boots Target1)
