Presentation Transcript

  1. The SRTR Program-Specific Reporting Tools: Key Points

  2. Using SRTR Slides • We welcome the use of SRTR slides as we value the distribution of our research for the benefit of patient care and transplant research. • Because the SRTR data and analyses on the following slides are published in the public domain, we ask that users honor our guidelines when using the slides for their own research.

  3. SRTR Slide Use Guidelines • Modifying SRTR data, analyses, tables, and graphics in any way is not permitted without prior approval from SRTR staff at the Chronic Disease Research Group of the Minneapolis Medical Research Foundation. • Each SRTR slide use must include the citation for the associated publication and feature the corresponding SRTR logo.

  4. About SRTR • The Scientific Registry of Transplant Recipients (SRTR) is an ever-expanding national database of transplantation statistics. • Founded in 1987. • SRTR maintains data on the full spectrum of transplant activity, ranging from organ donation and waitlist candidates to transplant recipients and survival statistics.

  5. SRTR Background • SRTR supports ongoing evaluation of the scientific and clinical status of solid organ transplantation, including kidney, heart, liver, lung, pancreas, and intestine transplants. • SRTR provides data on all solid organ transplants and donations in the United States. • The Health Resources and Services Administration (HRSA), a division of the US Department of Health and Human Services, provides oversight of and funding for SRTR. • The Chronic Disease Research Group of the Minneapolis Medical Research Foundation administers the SRTR contract, awarded by HRSA.

  6. Outline • How SRTR differs from the Organ Procurement and Transplantation Network (OPTN). • Program-specific reports and their intended audience. • Timeline and cohort selection. • Patients who are lost to follow-up: censoring and extra ascertainment. • Expected survival and risk adjustment. • Comparison points: norms vs. targets. • Interpretation of survival statistics: what is important to whom?

  7. Complementary Roles of HRSA Contracts

  8. Key SRTR Roles and Responsibilities • Provide research support to the OPTN Board of Directors, OPTN committees, the Health Resources and Services Administration, the Secretary of Health and Human Services’ Advisory Committee on Organ Transplantation, and the scientific and transplant communities. • Perform ongoing evaluations of the scientific and clinical status of solid organ transplantation. • Develop and publish Program- and OPO-Specific Reports. • Create an annual report on trends in solid organ transplantation in the US. • Create a biennial report to Congress detailing the state of solid organ transplantation. • Facilitate outside research on transplantation by releasing data to outside researchers.

  9. Relationships: HRSA/OPTN/UNOS/MMRF/SRTR

  10. Where Does the SRTR Fit In? [Organizational diagram: Secretary of HHS; ACOT; HRSA/DOT; OPTN; transplant centers and OPOs; SRTR; CMS; STAC; candidates and recipients; living donors; investigators and the public.]

  11. SRTR Data Sources • Transplant centers and OPOs (OPTN, self-reported). • Other transplant centers and OPOs (secondary OPTN data via patient linking). • Social Security Administration (SSDMF). • Centers for Medicare & Medicaid Services (CMS). • National Center for Health Statistics. • Cancer registries (SEER). SRTR and OPTN cooperate to ensure quality and consistency.

  12. CMS-ESRD SRTR Information Flow [Flow diagram: person linking and monthly transfers bring data from CMS-ESRD, the SSDMF, OPTN, SEER, NCHS, etc. into SRTR; analysis file creation (reorganization for research, cleaning and validation, analysis variables added) produces standard analysis files, with data-quality feedback and data fixes returned to sources; analytic procedures and products include program-specific reports, OPO-specific reports, OPTN and ACOT committee analyses, annual and biennial reports, conference presentations, journal articles, the TAC RFI, and the OPTN MPSC; external research proceeds under data use agreements and public release.]

  13. Program-Specific Reporting: Different Formats for Different Audiences • Program-specific reports: • For the entire transplant community: patients and families, payers, transplant centers, and government. • Quarterly reports to the OPTN Membership and Professional Standards Committee. • Outcome assessment for further investigation. • Standardized request for information (RFI). • Part F, experience data. • Centers submit to payers.

  14. Different Audiences, Different Questions: Different Statistics and Interpretations • Patients and families • What will happen to me? • Percent survival at 1 year, 3 years. • Chances of transplant or death while on the waiting list. • CMS-required consent process. • Payers (including CMS) and MPSC • Does a program perform up to standard or systematically fail to do so? • Transplant programs • What choices do our patients have? • What can we tell our patients about waiting time and survival? • How well are we doing? How can we improve?

  15. Advantages to Users: SRTR vs. Self-Reported Data • Consistent and audited data collection. • Consistent statistical methodology. • No duplication of effort by facilities. • Extra ascertainment of mortality and graft failure. • Risk adjustment and comparison points.

  16. Program-Specific Report Contents • Detailed tables • Transplant center activity. • Characteristics of patients. • Patient outcomes: waiting list, posttransplant. • Interpretation of statistics • User’s guide text summary (least complex). • Table notes (more depth). • Technical notes (most depth).

  17. Two Different Program-Specific Reports • Traditional reports show 11 tables describing activities of a specific transplant program. • Graphical reports show both tables and figures describing activities of a specific transplant program. The new-format reports include very little information that is not included in the traditional reports.

  18. Traditional Program-Specific Report Tables • Table 1: Waitlist activity (e.g., additions, reasons for removal). • Table 2: Characteristics of waitlist candidates. • Tables 3-6: Transplant and mortality rates for waitlist candidates. • Tables 7-9: Characteristics of recipients, donors, and transplant procedures. • Tables 10-11: Graft and patient survival rates compared with expected values (source of outcomes in transplant regulations).

  19. Graphical Format Program-Specific Report Sections Section A: Program Summary • A brief summary of important program metrics. Section B: Waitlist Information • Information about candidates on the program's waiting list, including transplant rates, waitlist mortality rates, and demographic and medical characteristics of candidates. Section C: Transplant Information • Information about transplant recipients at the program, including graft and patient survival rates and recipient demographic and medical characteristics.

  20. Timeline for Program-Specific Reports • Updated every 6 months (June, December). • Patient and graft survival tables report 1-month, 1-year, and 3-year outcomes for 2.5-year cohorts of recipients.

  21. Choosing Cohorts for Analysis • The most recent information is usually the most interesting and relevant. • Balance timeliness and completeness. • Cohorts for 3-year survival must end earlier than cohorts for 1-year survival. • Include follow-up information until a transplant “anniversary,” after which a follow-up is due. • Censor at this date, because adverse outcomes are more likely to be reported early. • Allow enough lag time for completeness of all relevant sources.

  22. Why Don’t We Have 1-Year Survival for the Last 6 Months of Transplants? • 1-year outcomes are not available for 18 months. • One year needed to determine 1-year survival. • Time needed for programs to submit 1-year follow-up forms to OPTN. • Two months needed for SRTR to calculate statistics and for centers to comment. • Must include enough transplants to allow stable estimates; PSRs use 2.5 years. Together, these factors require a 2.5-year cohort ending 12 to 18 months before the report date, with some transplants occurring as long as 3.5 to 4 years before.
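The timing constraints above reduce to simple date arithmetic. The sketch below uses the slide's figures (an 18-month lag and a 30-month cohort); `cohort_window` is a hypothetical helper written for illustration, not SRTR code.

```python
from datetime import date

def cohort_window(report: date, lag_months: int = 18, cohort_months: int = 30):
    """Return (start, end) months of the transplant cohort for a report date.

    lag_months    : time for 1-year outcomes, follow-up form submission,
                    and SRTR calculation/center comment (12-18 months).
    cohort_months : the 2.5-year cohort the PSRs use for stable estimates.
    """
    def shift_back(d: date, months: int) -> date:
        years, month_index = divmod(d.year * 12 + (d.month - 1) - months, 12)
        return date(years, month_index + 1, 1)

    end = shift_back(report, lag_months)
    start = shift_back(end, cohort_months)
    return start, end

# A June 2009 report under an 18-month lag covers transplants from the
# month beginning June 2005 through the month beginning December 2007.
print(cohort_window(date(2009, 6, 1)))
```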

  23. Incomplete Data and Loss to Follow-Up, Solution 1: Censored-Data Methods • Actuarial methods (Kaplan-Meier/Cox) can extend estimates to the end of the follow-up period even with some incomplete (censored) data. • When follow-up time is unknown, results are imputed based on the results of other patients in the same risk state at the time of censoring. • Statistical comparisons should be based on actual follow-up, not projected follow-up. • Estimates become unstable as fewer patients remain under follow-up.
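A minimal sketch of the censored-data idea, using invented follow-up times (`kaplan_meier` is an illustrative helper, not SRTR code): censored patients contribute time at risk but are never counted as failures.

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimates.

    times  : follow-up in months.
    events : 1 = death/graft failure observed, 0 = censored.
    Returns [(t, S(t))] at each distinct failure time.
    """
    pairs = sorted(zip(times, events))
    n_at_risk = len(pairs)
    survival, estimates = 1.0, []
    i = 0
    while i < len(pairs):
        t = pairs[i][0]
        failures = sum(e for tt, e in pairs if tt == t)
        leaving = sum(1 for tt, _ in pairs if tt == t)
        if failures:  # censored-only times do not change S(t)
            survival *= 1 - failures / n_at_risk
            estimates.append((t, survival))
        n_at_risk -= leaving  # failures and censored both leave the risk set
        i += leaving
    return estimates

# Six hypothetical recipients: deaths at 2 and 6 months, four censored.
print(kaplan_meier([2, 3, 6, 8, 10, 12], [1, 0, 1, 0, 0, 0]))
# S(2) = 5/6 ≈ 0.833; S(6) = 5/6 × 3/4 = 0.625
```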

  24. Calculating Survival Using Incomplete Follow-Up

  25. Incomplete Data and Loss to Follow-Up, Solution 2: Extra Ascertainment • Censoring (Kaplan-Meier/Cox) works only if failure rates for “lost” patients are similar to failure rates for followed patients (i.e., censoring is unbiased). • Censoring can produce unstable estimates for small samples. • Use additional sources of data. • Helps address concerns that centers may under-report or report on a biased sample.

  26. Implications of Extra Ascertainment for Survival Analyses • An NDI study indicates that SRTR identifies > 99% of deaths. • Patients are assumed alive unless known otherwise during the period when all sources are expected to be complete (and censored at the end of that period). • Estimated survival may increase (healthy patients may have been lost) or decrease because of added time at risk. • Impact: minimal nationally, but large for a few facilities.

  27. Adjusted Outcomes • SRTR computes rates of adverse posttransplant outcomes for each transplant program on a regular schedule. • Observed rates are compared with rates that would be expected based on characteristics of recipients and donors at each program.

  28. Different Outcomes Attributed To: • Differences in: • Patients served by the center. • Characteristics of available donors. • Treatment practices at the center. • Random chance.

  29. Why Compare Observed and Expected Outcomes? • Allows fair comparison among programs that treat different types of patients. • Programs that treat older or sicker patients might provide excellent care even though outcomes are worse than average. • Programs that treat healthier patients might not provide excellent care even though outcomes are better than average.

  30. Survival Rate vs. Death Rate vs. Death Count: Delta or Ratio

  31. “Expected Deaths” Calculation Accounts for Survival Time

  32. Risk Adjustment • What rate would be expected for patients at this center if their outcomes were comparable to national outcomes for similar patients? • “Similar” is defined by characteristics that affect the rate, such as: • Demographics • Etiology • Severity of illness • Differences between observed and expected outcomes are not due to these adjustment factors.
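The expected rate can be illustrated as the sum of national-model predicted probabilities over a center's own patients; the probabilities and death count below are invented for illustration.

```python
# Model-predicted 1-year death probability for each of a center's
# patients, taken (hypothetically) from the national risk model.
predicted = [0.05, 0.20, 0.10, 0.30, 0.15]
observed_deaths = 2  # hypothetical observed count at this center

expected_deaths = sum(predicted)              # ≈ 0.80 expected deaths
oe_ratio = observed_deaths / expected_deaths  # ≈ 2.5: observed exceeds expected
print(round(expected_deaths, 2), round(oe_ratio, 2))
```

An observed/expected ratio near 1 means the center performs about as the national model predicts for its case mix; ratios well above 1 flag worse-than-expected outcomes.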

  33. Choosing Model Variables • Include variables that are statistically significant or near-significant. • Include variables that are clinically important and increase the face validity of the model. • Reject variables that adjust for practice patterns. • Reject variables that produce unstable values. • Review of models by committees for input and feedback.

  34. Model Review • Models are reviewed on an ongoing basis and updated as necessary with: • Input from clinicians and statisticians. • Input from OPTN committees. • As new data elements become available, they are considered for inclusion in the models using previously mentioned criteria.

  35. Examples of Factors Used for Risk Adjustment • Recipient and donor demographic characteristics • ABO compatibility • Primary disease • Donor cause of death • Ischemia time • Previous transplant • Life support • HLA mismatch and CPRA (KI) • Duration on dialysis (KI) • Creatinine (LI)

  36. Documentation of Risk-Adjustment Models

  37. Interpreting Model Coefficients • Hazard ratio > 1: failure/death more likely, so expected survival is lower. • Hazard ratio < 1: failure/death less likely, so expected survival is higher.

  38. Example: Adjusting for Age • Nationally: • Average survival, 85%. • 50% of patients are young with 95% survival. • 50% of patients are old with 75% survival. • Program A treats only older patients, 80% survival: • Program survival of 80% is worse than national average of 85%. • But, 100% are older patients with expected 75% survival. • Thus, Center A patients have better than expected survival compared with similar patients nationwide.
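The slide's arithmetic, checked directly with its own numbers:

```python
young_survival, old_survival = 0.95, 0.75
national_average = 0.5 * young_survival + 0.5 * old_survival  # 0.85

# Program A treats only older patients, so its expected survival is 75%.
expected_a = old_survival
observed_a = 0.80

# Worse than the 85% national average, yet better than expected for its
# case mix: the risk-adjusted comparison favors Program A.
print(observed_a < national_average, observed_a > expected_a)  # True True
```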

  39. Risk-Adjustment Example: Expanded Criteria Kidney Donors • Will accepting ECD donors adversely affect center survival statistics? No, because ECD risk adjustment controls for donor factors: • Hypertension: hazard ratio 1.14. • Creatinine, per 0.5 mg/dL above 1.5: hazard ratio 1.06. • Donor age ≥ 65 yr (reference, 35-49 yr): hazard ratio 1.71. • COD stroke (vs. other COD): hazard ratio 1.16. • ECD classification: hazard ratio 1.01. • All adjustments are documented; HRs calculated as exp(B) from the 1-yr kidney graft survival model, PSRs released 07/11/2008.
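Under a proportional-hazards model, per-factor hazard ratios multiply. Combining the slide's coefficients for one hypothetical donor profile (hypertensive, creatinine one 0.5 mg/dL step above 1.5, age ≥ 65, stroke COD, ECD-classified) sketches how the adjustment accumulates; this is an illustration, not an SRTR calculation.

```python
import math

# Hazard ratios from the slide (1-yr kidney graft survival model).
hazard_ratios = {
    "hypertension": 1.14,
    "creatinine_step": 1.06,    # per 0.5 mg/dL above 1.5
    "donor_age_65_plus": 1.71,  # vs. reference 35-49 yr
    "cod_stroke": 1.16,
    "ecd_classification": 1.01,
}

# Proportional hazards: the combined relative hazard is the product.
combined = math.prod(hazard_ratios.values())
print(round(combined, 2))  # ≈ 2.42
```

Because the model expects roughly 2.4 times the reference hazard for such a donor, a center accepting these organs is compared against that higher expected rate rather than the national average.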

  40. The Importance of Adjustment • Center size. • Length of follow-up. • Case mix.

  41. No Adjustment The number of deaths in Center X is much higher than the national average (per center). Should Center X be flagged?

  42. Adjustment: Account for Center Size Center X performed 586 transplants, while the average center performed 137. Consider the proportion of patients who die in 3 years, not the number of deaths.

  43. Adjustment: Account for Center Size The percentage of deaths in Center X is much higher than the national average. Should Center X be flagged?

  44. Center X Treats More Older Recipients Than the National Average

  45. Adjustment: Account for Case Mix The older recipient age at Center X (along with other factors) gives Center X an expected 13.1% deaths, compared with the national average of 9.5%. Use the ratio of observed to expected deaths.
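With the slide's figures (586 transplants, 13.1% expected deaths), the observed/expected ratio works out as below; the observed death count is hypothetical, chosen only to show the calculation.

```python
n_transplants = 586
expected_rate = 0.131   # risk-adjusted expectation for Center X
observed_deaths = 86    # hypothetical observed count

expected_deaths = expected_rate * n_transplants   # ≈ 76.8
oe_ratio = observed_deaths / expected_deaths      # ≈ 1.12
print(round(expected_deaths, 1), round(oe_ratio, 2))
```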

  46. Adjustment: Metric and Confounding Factors Center X has a higher observed/expected death ratio. Should Center X be flagged?

  47. Wide Range of Expected Values Source: PSRs released January 2009

  48. Programs Outside Each Review Criterion (January 2009 PSR, Adult Survival)

  49. For More Information • In the OPTN/SRTR Annual Report: • Analytical Approaches section. • Data Sources section. • Technical notes to the PSRs: • SRTR Help Desk: • e-mail: • phone: (877) 970-SRTR