
A Survey of AMC Security Practices



Presentation Transcript


  1. A Survey of AMC Security Practices J. David Kirby KirbyIMC.com For the AMC Security and Privacy Conference 9/26/05 Dave@KirbyIMC.com

  2. Survey Basics • Conducted in February 2005 • Via phone interviews with information security leaders at 10 U.S. AMCs • Conducted by the speaker

  3. Purpose of the survey • Gather security practice status from a sample of AMCs in a way that would help both the surveyed AMCs and those not surveyed. • Use it to compare a single AMC against the group • Use it to compare the group’s status against the group’s goals (e.g. HIPAA Security Rule compliance) • Use it to compare AMCs on single areas of practice (e.g. timely account termination) • Create an efficient way to connect people who have solved a problem well with those in need of a viable solution.

  4. Survey Process • AMCs were nominated because they were thought to be peers of a client. • AMCs were contacted via email and phone; 90% of those contacted participated. • Participants had to be ISOs/CPOs or others highly knowledgeable about the security status of their AMC. • Participants were provided the survey content via email. • The survey items were completed during a conference call (~2 hours). • Participants were later provided their individual results to confirm that the responses were as intended. • Participants were given a deidentified version of all participants’ responses, in spreadsheet form.

  5. Survey Content Overview: Institutional Profile Section • Consists of several items that focus on the “size” and “shape” of the institution. • The measures were chosen so that the extent to which these factors affect an AMC’s security program could be analyzed. • Key measures include: • Number of employees and faculty • Number of patient visits per year • Number of physical sites, large and small • Number of IT support staff • Number of IT security support staff

  6. Survey Content Overview: Security Practice Section • Structure: 18 major areas with, typically, 3-5 assessment items for each area. • Areas: accountability for security, inventory, audit, documentation, system availability, data integrity, risk analysis, business associates, emailing ePHI, password management, incident handling, training, sanctioning, most valuable practices, practices most in need of improvement. • Item type: Almost all items were framed as Likert assertions to which the participants responded with their level of agreement (e.g. “We terminate accounts promptly,” rated from strongly disagree to strongly agree).

  7. Survey Content Overview: Security Practice • Ratings: Items were framed so that responses in the “agree” and “strongly agree” range implied robust security practice. The items in aggregate covered all of the Security Rule requirements. When a rater selected “disagree” or “strongly disagree,” a comment was sought as to whether the rater believed that a higher rating would reflect better security practice. • Managing survey bias: Participants could add “other” comments to each item, and N/A was an option for each item. • Item weighting: In calculating aggregate average responses, items were weighted so that more substantial compliance/practice issues counted more than others. The weights were chosen by the surveyor and validated by the participants. A sketch of this aggregation appears below.
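The weighting scheme on this slide is straightforward to reproduce. Below is a minimal sketch in Python; the item names, weights, and responses are hypothetical (the survey's actual weights were chosen by the surveyor and validated by participants). It shows how per-item Likert ratings, with N/A items excluded, combine into a weighted aggregate score.

```python
# Minimal sketch of the weighted-aggregation approach described above.
# Item names, weights, and responses are hypothetical; the survey's
# actual weights were chosen by the surveyor and validated by participants.

def weighted_likert_average(responses, weights):
    """Aggregate 1-5 Likert responses into one score, skipping N/A items.

    responses: dict of item id -> rating (1-5), or None for N/A
    weights:   dict of item id -> weight (heavier = more substantial
               compliance/practice issue)
    """
    total_weight = weighted_sum = 0.0
    for item, rating in responses.items():
        if rating is None:              # N/A was an option on every item
            continue
        w = weights.get(item, 1.0)      # default weight for unlisted items
        weighted_sum += w * rating
        total_weight += w
    return weighted_sum / total_weight if total_weight else None

# Hypothetical example: one AMC's responses to three audit-area items.
responses = {"logs_reviewed": 2, "triggers_defined": None, "accounts_terminated": 4}
weights = {"logs_reviewed": 2.0, "triggers_defined": 1.5, "accounts_terminated": 2.0}
print(weighted_likert_average(responses, weights))  # -> 3.0
```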

  8. Selected Survey Results We will post the complete deidentified results in spreadsheet form on the conference web site. If you want to talk to one of the surveyed ISOs, contact me; I’ll check with them, and if they agree, I’ll connect the parties.

  9. Participant AMC Profile • Interviewee role: 8 ISOs; 2 other (an information security specialist and a policy advisor for security compliance) • Patient visits per year: 50k-200k: 1 AMC; 200k-400k: 2; 500k-800k: 3; 800k+: 1; don’t know: 3 • Number of sites: 1-5 sites: 2 AMCs; 5-10: 3; 50-100: 3; 100+: 1; don’t know: 1 • Number of large sites (i.e. with >50 staff): 1-5: 4 AMCs; 5-10: 3; 10-30: 2; don’t know: 1 • Number of AMC faculty: 300-500: 1 AMC; 500-900: 3; 1000+: 6 • Observation: Ranges are large enough that best security practice may differ across AMCs (e.g. 1-5 sites may be handled differently than 100+ sites).

  10. Participant AMC Profile: technical labor Observation: Ranges of IT support and InfoSec support are large when adjusted for faculty size. AMCs A#8 and A#9 have exceptionally low primary security labor for their faculty size.

  11. Summary of Security Practice Ratings [Chart: average rating per AMC] Survey items were phrased so that more secure practices would receive higher ratings on the 1-5 scale. Observation: No ISO “agrees” that security, overall, is adequate at their AMC. On average, the AMCs are at 2.9, between “disagree” and “neither agree nor disagree.” Yet some AMCs are doing much better overall than others; the range is 1.9-3.9. Rating scale: 1- strongly disagree, 2- disagree, 3- neither agree nor disagree, 4- agree, 5- strongly agree

  12. InfoSec Labor Level Compared with Overall Security Level [Chart: InfoSec FTEs per 100 faculty vs. overall security rating, with group averages for security level and labor] Observation: InfoSec staff labor level per faculty member is not strongly related to security level. Perhaps it is InfoSec labor level in combination with other elements that fosters good security? There is a huge range of support levels. A sketch of such a comparison follows.
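To see how such a comparison can be made concrete, the sketch below normalizes InfoSec staffing by faculty size and correlates it against the overall security rating. Every per-AMC figure here is hypothetical, not the survey's data, and the Pearson correlation is computed by hand to keep the sketch self-contained.

```python
# Illustrative check of whether InfoSec staffing tracks security rating.
# Every per-AMC figure below is hypothetical, not the survey's data.
from math import sqrt

amcs = [  # (InfoSec FTEs, faculty count, overall security rating on 1-5)
    (2.0, 1200, 3.9),
    (1.0, 900, 2.4),
    (3.5, 1500, 2.9),
    (0.5, 400, 3.1),
]

# Normalize labor by institution size, as the slide does (FTEs per 100 faculty).
fte_per_100 = [ftes / (faculty / 100) for ftes, faculty, _ in amcs]
ratings = [rating for _, _, rating in amcs]

def pearson(xs, ys):
    """Pearson correlation coefficient, computed by hand for self-containment."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

print(pearson(fte_per_100, ratings))  # near 0 => staffing alone doesn't predict rating
```

A coefficient near zero would be consistent with the slide's observation that staffing level alone does not predict security level.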

  13. Ratings and Analysis for Security Section Survey Items

  14. Accountability Ratings Rating scale: 1- strongly disagree, 2- disagree, 3- neither agree nor disagree, 4- agree, 5- strongly agree Observation: On average, ISOs are not satisfied with the process of accountability for security. Most ISOs think that they have executive support AND that labor is inadequate. Relative labor levels are not strongly related to satisfaction with labor levels.

  15. Accountability Process Other Comments • My AMC is strongly oriented to building systems without regard to operating needs. • We have template RFPs with security elements. • I am concerned that nothing effective will happen to create security until there is a catastrophic event. • InfoSec is advisory, not compulsory; more change is expected. • Concerned that they will get surprised by incidents. • Decentralization inhibits accountability. • Because they don’t own the budgets, accountability is inhibited. • There is a general misunderstanding of security: it is corporate risk management, not just data control.

  16. Instant Poll #1 • My AMC’s security program has adequate support from senior executives. • ?? – strongly disagree • ?? - disagree • ?? - neither agree nor disagree • ?? - agree • ?? - strongly agree • There is adequate staffing and other resource available for the security mission at my AMC. • ?? – strongly disagree • ?? - disagree • ?? - neither agree nor disagree • ?? - agree • ?? - strongly agree Comments ??? • (1- strongly disagree, 2- disagree, 3- neither agree nor disagree, 4- agree, 5-strongly agree)

  17. Systems Inventory Ratings Observation: Only 2 AMCs are “satisfied” with their systems inventory. The typical “other” comment was “We are trying to improve this.”

  18. Top 10 systems • We asked participants to list their top 10 PHI-laden systems. • Many ISOs did not know what their top 10 systems were without some research. • Later items refer to these top 10 systems.

  19. Auditing Practices (some items are hidden)

  20. Audit Practice Comments Overall, ISOs are between “disagree” and “neutral” about their audit practices (2.87). One ISO “agrees” (4.0) with the presence of these practices; one is below “disagree” (1.78). Typical comments for those with “disagree” ratings on individual items are “We need to change this” and “We plan to change this.”

  21. Instant Poll #2 • My AMC’s security program manages periodic log review well. • ?? - strongly disagree • ?? - disagree • ?? - neither agree nor disagree • ?? - agree • ?? - strongly agree • We use standard “triggers” to describe which events seen in a log represent potential security incidents. • ?? - strongly disagree • ?? - disagree • ?? - neither agree nor disagree • ?? - agree • ?? - strongly agree Comments ??? • (1- strongly disagree, 2- disagree, 3- neither agree nor disagree, 4- agree, 5- strongly agree)
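The “standard triggers” in the second poll question amount to an agreed set of patterns that promote a log event to a potential security incident. A minimal sketch of trigger-based log review follows; the trigger names and regular expressions are entirely hypothetical examples.

```python
# Minimal sketch of trigger-based log review. The trigger names and
# patterns are hypothetical examples of events a site might agree
# constitute a potential security incident.
import re

TRIGGERS = {
    "auth_failure": re.compile(r"authentication failure", re.I),
    "disabled_account_login": re.compile(r"login .* (disabled|terminated) account", re.I),
    "bulk_record_export": re.compile(r"export(ed)? \d{4,} records", re.I),
}

def scan(log_lines):
    """Yield (trigger name, line) for each line matching any trigger."""
    for line in log_lines:
        for name, pattern in TRIGGERS.items():
            if pattern.search(line):
                yield name, line

sample = [
    "Jan 12 03:14 srv1 sshd: authentication failure for user jdoe",
    "Jan 12 03:20 app2: exported 12000 records to /tmp/out.csv",
]
for name, line in scan(sample):
    print(name, "->", line)
```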


  23. Documentation Practices Observation: A#3, A#4, A#10 do well at this. Overall, ISOs are between “disagree” and “neutral” (2.88). Four ISOs are below “disagree”. Typical comments from “disagrees” are “Need to change this” and “Expect to change this”.

  24. Availability Practices Observation: Overall, ISOs are between “neutral” and “agree”. Four ISOs are at “agree” or better. “Testing and revising” seems to be the challenge for most ISOs.

  25. Risk Analysis Practices Observation: Only two ISOs are above neutral overall. Eight ISOs are at “agree” or better for at least one form of risk analysis. For those who “disagree” on a given item, typical comments are “we need to do this” and “we plan to do this.”

  26. Business Associate Practices Observation: Most ISOs have a strong contracting program. Few are assuring that incidents are reported, though some are. A few ISOs monitor BA safeguards; most don’t.

  27. Emailing ePHI Practices Observation: Most AMCs have a policy against sending ePHI in unencrypted email, and most provide some mechanism to support encryption. None technically enforces its use.

  28. Security Incident Practices Observation: Overall, ISOs are between neutral and agree (3.46). Two AMCs are above “agree”. Typical comment for “disagree” items is “We need to do this” or “We plan to do this.”

  29. Miscellaneous practices Observation: Every practice is done by at least one AMC. Only two practices (g/h) are, overall, at “agree” or better. Typical comment for items rated “disagree” or worse is “We need to do this.”

  30. Instant Poll #3 • Staff are held accountable for compliance with security policies at my AMC • ?? – strongly disagree • ?? - disagree • ?? - neither agree nor disagree • ?? - agree • ?? - strongly agree • The sanction policy at my AMC is applied equally to faculty and staff. • ?? – strongly disagree • ?? - disagree • ?? - neither agree nor disagree • ?? - agree • ?? - strongly agree Comments ??? • (1- strongly disagree, 2- disagree, 3- neither agree nor disagree, 4- agree, 5-strongly agree)

  31. Emerging Practices Observation: Biometrics and PKI are not widely used at any AMC. Otherwise, each practice is done widely at 2-4 AMCs and not at the remaining 6-8.

  32. Improvement Priorities Observation: No pattern!

  33. Training and Awareness Practices Observation: Almost all AMCs train at hire. Reminder programs are nearly universal.

  34. Sanction Practices Observation: Overall, sanctioning practices are below “neutral”. At least one AMC uses each practice.

  35. Key factors in achieving and sustaining a security program Observation: At least two AMCs are at “agree” on each item. Many security programs have a mandate beyond HIPAA (e). Most programs’ status gets Board review.

  36. Q&A If you have questions, ask them now. I have more polls and other material after this slide that may stimulate further conversation, if we have time. Thanks - Dave Kirby. If you want to talk to one of the surveyed ISOs, let me know the A#. Dave@KirbyIMC.com

  37. Instant Poll #4 • The survey is helpful to me in seeing how my AMC’s security program compares with other AMCs’ security programs. • ?? – strongly disagree • ?? - disagree • ?? - neither agree nor disagree • ?? - agree • ?? - strongly agree • I think that my AMC’s security program is, overall, • ?? – above the survey average • ?? - equal to the survey average • ?? - below the survey average Comments ??? • (1- strongly disagree, 2- disagree, 3- neither agree nor disagree, 4- agree, 5-strongly agree)

  38. What area of your security program is most in need of improvement?

  39. What are the chief barriers to adequate security practices at your AMC?
