Student Fraud Ring Project ~ AGA - OIG Partnership Project

Presentation Transcript


  1. Northern Virginia AGA Spring Training Event Student Fraud Ring Project ~ AGA - OIG Partnership Project U.S. Dept. of ED - OIG Data Analytic System April 1st, 2014

  2. In the beginning… GAO Data Mining Reports • DODIG Purchase Card Data Mining Project • Government-Wide Report, June 2003

  3. False Positives • The erroneous identification of a fraudulent event or dangerous condition that turns out to be harmless. • False positives often occur in rule-based detection systems.

  4. Data Mining Defined “Data mining is the ability to predict, with a high degree of probability, anomalies where fraudulent or inaccurate activity is likely, using statistical and mathematical techniques.”

  5. Data Analytics Development Process
  • Data Staging Process – Cleaning and standardizing data elements and tagging fraud indicators.
  • Interim Reporting – Develop trend and rule-based exception reporting using fraud indicators.
  • Data Reduction – Aggregate or amalgamate the information contained in large datasets into manageable (smaller) information nuggets to filter out false positives.
  • Unsupervised Reporting – Run transactions through statistical and AI modules to identify possible fraudulent outliers.
  • Supervised Reporting – Using the knowledge database, run transactions through statistical and AI modules to predict fraudulent anomalies.
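
The slide names the stages without specifying tooling; as a rough illustration, the Unsupervised Reporting step might look something like the sketch below, which assumes transactions in a pandas DataFrame and uses scikit-learn's IsolationForest as the outlier model (column names, thresholds, and the choice of algorithm are assumptions, not the OIG system's actual design).

```python
# Hypothetical sketch of the "Unsupervised Reporting" stage: score transactions
# for outliers without any labeled fraud data. Feature names are illustrative.
import pandas as pd
from sklearn.ensemble import IsolationForest

def unsupervised_outliers(transactions: pd.DataFrame) -> pd.DataFrame:
    features = transactions[["disbursed_amount", "applications_per_address",
                             "schools_attended"]]
    model = IsolationForest(contamination=0.01, random_state=0)  # assumed settings
    model.fit(features)
    scored = transactions.copy()
    # decision_function: lower values = more anomalous, so negate it for a risk score.
    scored["outlier_score"] = -model.decision_function(features)
    scored["flagged"] = model.predict(features) == -1
    return scored.sort_values("outlier_score", ascending=False)
```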

  6. Selecting and Standing up a Successful Data Analytical Project ~ What are the fundamental procedures and/or guidelines to follow? • Senior Management Support • Project Selection • Canvass senior management in key business segments. • Doable in a limited period of time AND has known low-hanging fruit. • Proof of Concept Documentation • Clear, concise project objective that can both be measured and recognized when you get there. • Detailed process/strategy to achieve this objective. • Identify needed staffing, skill sets, and other resources. • Identify any anticipated barriers, delays, or legal issues. • Critical Milestone Dates – the purpose is twofold: first and most important, managing expectations; second, project management best practices. • Project Team – ideally you will have identified and gotten buy-in from the core members of the project team. • Executive Champion – identify an executive champion/sponsor who has an agreed-upon vision of the value and direction of the data analytic project's implementation.

  7. E-Fraud Data Analytical Model ~ Detecting Fraud Rings Within the Title IV Student Loan Arena

  8. Student Fraud Ring Proof of Concept Project • Objective – develop a Proof of Concept module that detects, with a fairly high degree of probability, fraudulent or suspicious activities relating to student fraud rings. • Distance Education – Management controls over these new automated processes are being challenged as never before. • Identifying the increased potential for theft and abuse has also been a challenging pursuit. • Student fraud rings have become a rapidly growing criminal activity that now targets the U.S. Department of Education (ED) FSA programs. • ED processed over 19 million applications for student financial aid and disbursed over $90 billion in FSA funds in SY2010.

  9. Fraud Indicators • Home Address • Phone Number • Multiple IP Addresses • PIN Number • E-Mail • PIN Question/Answer
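
The slide lists the indicators but not how they are combined; the sketch below shows one plausible way to surface candidate rings, by grouping applications that share an indicator value (the column names, the pandas-based approach, and the minimum group size are assumptions, not the deployed logic).

```python
# Hypothetical ring-candidate sketch: group applications that share an
# indicator value and flag unusually large groups. Column names and the
# minimum group size are assumptions for illustration only.
import pandas as pd

INDICATORS = ["home_address", "phone_number", "ip_address", "email", "pin_answer"]
MIN_RING_SIZE = 5  # assumed threshold

def candidate_rings(applications: pd.DataFrame) -> pd.DataFrame:
    hits = []
    for indicator in INDICATORS:
        group_sizes = applications.groupby(indicator)["student_id"].nunique()
        for value, students in group_sizes[group_sizes >= MIN_RING_SIZE].items():
            hits.append({"indicator": indicator, "shared_value": value,
                         "students": students})
    return (pd.DataFrame(hits, columns=["indicator", "shared_value", "students"])
              .sort_values("students", ascending=False))
```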

  10. False Positive Filter and Scoring
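
The slide carries only the title, so the following is a purely hypothetical sketch of how a filter-and-score step could work: each matched indicator contributes a weight, and candidates whose total falls below a cutoff are set aside as likely false positives (all weights and the cutoff are invented for illustration).

```python
# Hypothetical false-positive filter: weight each matched indicator, sum the
# weights into a score, and drop low-scoring candidates. Every weight and the
# cutoff below are invented for illustration, not the OIG's actual values.
WEIGHTS = {"home_address": 3, "phone_number": 2, "ip_address": 2,
           "email": 3, "pin_question_answer": 4}
SCORE_CUTOFF = 6  # assumed threshold

def score(matched_indicators: set) -> int:
    return sum(WEIGHTS.get(name, 0) for name in matched_indicators)

def filter_false_positives(candidates: list) -> list:
    # Each candidate is expected to carry a set of matched indicator names.
    return [c for c in candidates if score(c["matched"]) >= SCORE_CUTOFF]
```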

  11. Student Fraud Ring Actual Immediate Filtered Results • 15 postsecondary schools were selected as part of the Proof of Concept Project.

  12. Investigation Agents' Assessment/Feedback • Results of running this system against 15 schools were disseminated to Investigations for their assessment and feedback on the effectiveness of this model. • Conclusion – the model had identified all known fraud rings from SY2010. Statistically this is very rare, which further gave us a sense of the value generated. • Identified new, previously unknown fraud rings. • Added additional students to fraud rings under investigation.

  13. Unintended Consequences Though our project's objective was to identify fraud rings in direct support of our investigative staff, we were now able to: • Identify what the big picture most likely is. Up to this point, no one really knew how large this problem was. • Impact on the Financial Statements – As the primary lender in the student loan arena, financial auditors now want to know the potential impact on the financial statements. • Are we working the right cases? Effective resource management of limited investigation assets. Until now, our cases have come primarily from hotline or school referrals; there may well be very large rings out there that were until now hidden in the background.

  14. Data Analytical Project Results • Student aid fraud ring activity increased 82 percent from award year (AY) 2009 (18,719 students) to AY 2012 (34,007 students). • We identified a total of over 85,000 recipients who may have participated in this type of student aid fraud ring activity; these recipients received over $874 million in Federal student financial aid. • Applying a statistical model, we estimated that $187 million of this $874 million in Title IV funds is probable fraud loss.

  15. Lessons Learned • Every effective analytical system has an embedded risk model built within it. • Beware of the wall…you will hit it, but do not get discouraged. • Vetting Process – we discovered that this is the final crucial step in Phase III: filtering out false positives. • Unintended consequences are a frequent byproduct of analytical projects if you look for them.

  16. GAO/CIGIE/RATB Data Analytics for Oversight and Law Enforcement • One of the primary points discussed was the limited coordination and cooperation between federal, state, and local oversight entities, which sometimes causes missed opportunities. • Local governments may be reluctant to share data with state and federal agencies because data sharing seemed to be a “one-way street…” • Discussion of sharing interoperable, open-source analytical tools and techniques, including open-source software. Initial benefits identified: • Acquisition and implementation at zero external cost and obligation. • Large support communities that agencies can tap for guidance. • Eliminates the challenge of negotiating licensing agreements for proprietary software tools. • www.gao.gov/products/GAO-13-680SP

  17. AGA Partnership for Intergovernmental Management and Accountability • The Partnership seeks opportunities to increase the effectiveness and efficiency of knowledge sharing between the federal and state governments. • One of the primary goals of the Intergovernmental Partnership: • Improving communication among higher education, the federal government, and state and local governments. • Opportunity – empowering State Education Agencies with effective risk management analytical tools to proactively detect emerging downward trends within the Financial Management and Program Performance areas of their respective Local Education Agencies.

  18. Joint Educational Sharing System (JESS) • JESS is a multi-purpose system that resides within the AWS Cloud. • Its goal is to promote the sharing of information to discover fraudulent financial activities or improper payments, along with identifying key areas of operational risk. • Federal and designated state personnel would have access to this system to ensure cross-collaboration. • The information provided by JESS would be used for ongoing analytical work in support of respective organizational missions. • For example, a state investigator could use purchase card-related analytical information to identify fraudulent and improper use of purchase cards. • These proven methodologies and analytical algorithms could then be translated into their local systems to enhance established detection and prevention controls and techniques.

  19. State and Local Education Agencies Risk Model (SLRM)

  20. Objective of SLRM • Identify ‘evidence of risk’ within State and Local Educational Agencies. • The SLRM risk model will provide audit and investigation management with continuous auditing functionality, thereby enhancing audit planning and investigation resource management.

  21. Methodology of SLRM • Group similar-size Local Education Agencies (LEAs) and rank them based on weighted scores assigned to selected risk factors. • Groups, risk factors, scores, and weights were agreed to and determined by the SLRM Project Team. • LEAs split into six groups. • Risk factors drawn from six primary sources of data. • Risk factor data transformed into scores ranging from zero to 100. • Scores weighted by multiplying by 1, 2, 3, 4, or 5. • Ranking on a scoring system within each group – the highest score represents the highest-risk LEA in the group. • States ranked by combining all group rankings of LEAs within each state.
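
Since the slide describes the scoring mechanics but not an implementation, here is a minimal sketch of that normalize-weight-rank sequence, assuming the LEA data sits in a pandas DataFrame (the factor names, weights, and group column are illustrative, not the actual SLRM inputs).

```python
# Minimal sketch of the SLRM-style scoring described above: scale each risk
# factor to 0-100, apply an integer weight (1-5), sum, and rank LEAs within
# their size group. Factor names and weights are assumptions for illustration.
import pandas as pd

FACTOR_WEIGHTS = {"audit_findings": 5, "late_reporting": 3, "grant_drawdown": 2}

def slrm_rank(leas: pd.DataFrame) -> pd.DataFrame:
    scored = leas.copy()
    scored["risk_score"] = 0.0
    for factor, weight in FACTOR_WEIGHTS.items():
        col = scored[factor]
        normalized = 100 * (col - col.min()) / (col.max() - col.min())  # 0-100 score
        scored["risk_score"] += weight * normalized
    # Highest weighted score = highest-risk LEA within each size group.
    scored["rank_in_group"] = (scored.groupby("group")["risk_score"]
                                     .rank(ascending=False, method="min"))
    return scored.sort_values(["group", "rank_in_group"])
```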

  22. Risk Factors were Derived From Six Primary Sources of Data
  • NCES (National Center for Education Statistics): Common Core Data, which contains the list of LEAs and number of students.
  • Dept. of Education Systems: EDEN and EDFacts data, which contain LEA performance information; G5 – Grant System.
  • Dun and Bradstreet
  • A-133 Single Audit – Federal Audit Clearinghouse
  • SAM – GSA's System for Award Management

  23. Grouping of LEAs

  24. Selecting Your Own Risk Factors and Creating Your Own Groups • You may select which risk factors you want to include in the model. • You may assign your own weights to each risk factor. • You may create your own baskets of risk factors. • You may assign your own weights to each basket.
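
Building on the sketch after slide 21, the user-selected factors, weights, and baskets could be expressed as plain configuration consumed by the same scoring routine (the basket names, factor names, and all weights below are hypothetical).

```python
# Hypothetical "bring your own model" configuration: user-chosen baskets of
# risk factors, each with its own weight, applied to the same 0-100 factor
# scores. All names and weights below are illustrative assumptions.
CUSTOM_MODEL = {
    "financial": {"weight": 3, "factors": {"audit_findings": 5, "grant_drawdown": 2}},
    "reporting": {"weight": 1, "factors": {"late_reporting": 3}},
}

def custom_risk_score(factor_scores: dict) -> float:
    # factor_scores maps a factor name to that LEA's normalized 0-100 score.
    total = 0.0
    for basket in CUSTOM_MODEL.values():
        total += basket["weight"] * sum(
            weight * factor_scores.get(name, 0.0)
            for name, weight in basket["factors"].items()
        )
    return total
```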

  25. GROUP 1 – 27 LEAs

  26. GROUP 3 – 499 LEAs

  27. ALL GROUPS / STATES – 9,616 LEAs

  28. CALIFORNIA / ALL GROUPS – 614 LEAs

  29. Questions • Any questions?
