
Academic Profile Project (APP)


Presentation Transcript


  1. Academic Profile Project (APP) April 2013

  2. Agenda • Introductions • Project Background, Goals, Objectives • Project Organization, Roles and Responsibilities • Enterprise Information Stewardship on APP • System Conceptual Design • Status • Questions, Issues, Your Thoughts

  3. Project Background, Goals, Objectives

  4. Project Goals • The Academic Profile Project will involve a single college (the College of Agriculture and Natural Resources) using the Digital Measures faculty profiling tool, implementing an instance of the application in a manner consistent with the broader data and functional goals required by the University at large. These goals are to: • Improve institutional processes for developing and recording faculty accomplishments • Provide a single institutional system of record for faculty professional activities • Develop a data governance/stewardship program for these data at an institutional level • Comply with the demand for shared researcher profiling from external reporting initiatives (e.g., Star Metrics, SciENCV) • Leverage internally collected faculty accomplishment and activity data to meet external requirements • Facilitate the identification and connection of faculty expertise • Discover relevant collaboration opportunities for faculty and researchers • Promote the accomplishments of faculty both internally and externally

  5. Time Frame • The time frame for the entire pilot project is approximately two years (one year for pilot and one year for CANR implementation). At the end of the first year, pilot results may be evaluated to determine the feasibility of expanding the college scope beyond CANR. • Because of the stated goal (that the application will be implemented in a manner consistent with the broader data and functional goals of the University at large), other colleges will be brought into the pilot in an advisory mode.

  6. Problem/Opportunity Definition • In fall 2012, the APP development team completed an extensive Request for Proposal (RFP) process for a faculty profiling tool; the process was inconclusive. No one tool meets all institutional requirements; furthermore, the market space for such tools remains highly volatile, with functional enhancements either in development or in design, but not actually ready to deploy. • Working with CANR creates a significant value-driven opportunity given the college's experience with the Digital Measures tool and its implementation over the last five years. • It is expected that the CANR experience with the tool will greatly facilitate the functional and data decisions required to expand the view to be more representative of the University as a whole (without losing those facets that are unique to CANR).

  7. Project Urgency • The elements of urgency that precipitated the APP RFP project are still present (and most certainly apply to the pilot with CANR). In fact, the sooner we embark on the pilot, the sooner we will be able to evaluate the efficacy of the tool (for an institutional implementation) in an effort to satisfy the following: • The drive by the federal funding agencies and the Office of Science and Technology Policy (OSTP) to develop SciENCV and Star Metrics • Other institutions are developing or licensing tools to address this expectation as well as meet internal needs. This, in turn, is driving the development of commercial products that can address the federal funding agencies' expectations for reporting. • Certain MSU departments and colleges are already seeking their own solutions to the problems outlined above, and are securing or developing their own profiling software. • The URC project (to be unveiled in the near future) will provide a web site for showcasing corridor institution faculty accomplishments and providing expertise searches. MSU will want to ensure the most complete compilation of accomplishments for its faculty.

  8. Proposed Solution • Engage CANR over a 2-year time frame as a pilot partner to implement a version of the Digital Measures Activity Insight application that supports not only CANR needs, but those of MSU as a whole. The pilot APP development team would bring resources to the project in an effort to: • Provide data from multiple internal sources (e.g., HR, SIS, CLIFMS, etc.) and data from external sources (e.g., publications from SciVal Experts) using automated interfaces; • Permit entry of new elements (that to this point have not been captured in institutional systems of record, some of which will be mandated by federal reporting agencies, such as the relationship of a publication to a specific grant); • Facilitate functions and processes such as annual review and review for promotion and tenure; • Produce a faculty CV; • Provide various kinds of reports (operational, management, status, dashboard metrics, etc.); • Produce data sets in formats that can be exported to other internal systems (e.g., CLIFMS) and external agencies and systems (e.g., Star Metrics, SciVal Experts).
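By way of illustration, here is a minimal sketch of what one such automated interface might look like, assuming a hypothetical CSV extract from HR and a hypothetical REST endpoint on the profiling tool; the URL, column names, and record shape are placeholders, not the project's actual integration.

```python
# Sketch of an automated import interface: read an HR extract and push
# records into the profiling tool. The endpoint URL, column names, and
# record shape are hypothetical placeholders, not the actual integration.
import csv
import json
import urllib.request

PROFILE_API = "https://profiles.example.edu/api/faculty"  # placeholder URL

def load_hr_extract(path):
    """Read a CSV extract from the HR system into a list of dicts."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def push_record(record):
    """POST one faculty record to the profiling tool; return HTTP status."""
    req = urllib.request.Request(
        PROFILE_API,
        data=json.dumps(record).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

if __name__ == "__main__":
    for row in load_hr_extract("hr_extract.csv"):  # placeholder file name
        push_record({
            "netid": row["netid"],           # placeholder column names
            "department": row["dept_code"],
            "rank": row["rank"],
        })
```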

  9. Business Requirements; Project Focus • Collect full faculty accomplishment data (including as much history as we have) from CANR faculty using a variety of methods: • Creating imports from institutional systems (SIS, CLIFMS, Contracts and Grants, HR, etc.) • Creating imports from external systems, primarily for publications (SciVal Experts, Scopus publications) • Converting and ingesting data that may already have been entered in the existing CANR system (first determining the appropriate system of record for institutional purposes) • Manually entering data • Extract data for use in other internal or external systems; for example, professional accomplishment data for CLIFMS (which will obviate the need for faculty and/or department staff to classify and enumerate these data themselves) and SciVal. We may soon be required to extract and format faculty accomplishment data to comply with federal requirements, and must be prepared to do so.
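A minimal sketch of the extract side, assuming a simple delimited layout; the field list is illustrative only, and the actual CLIFMS and federal (e.g., Star Metrics) export formats would differ.

```python
# Sketch of the extract side: flatten accomplishment records into a
# delimited file for a downstream system. The field list and layout are
# illustrative; actual CLIFMS and federal export formats would differ.
import csv

def export_accomplishments(records, out_path):
    """Write accomplishment records as a simple delimited export."""
    fields = ["netid", "activity_type", "title", "year", "grant_id"]
    with open(out_path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fields, extrasaction="ignore")
        writer.writeheader()
        for rec in records:
            writer.writerow(rec)

# Example with one made-up record.
export_accomplishments(
    [{"netid": "jdoe", "activity_type": "publication",
      "title": "Example Publication Title", "year": 2012,
      "grant_id": "G-0001"}],
    "clifms_export.csv",
)
```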

  10. Objectives for Project • Collect full faculty accomplishment data (including as much history as we have) from CANR faculty using a variety of methods: • Creating imports from institutional systems (SIS, CLIFMS, Contracts and Grants, HR, etc.) • Creating imports from external systems, primarily for publications (SciVal Experts, Scopus publications) • Converting and ingesting data that may already have been entered in the existing CANR system (first determining the appropriate system of record for institutional purposes) • Manually entering data • Entering data from CVs using a partially automated scraper/parser • Extract profile data for use in other internal or external systems; for example, professional accomplishment data for CLIFMS (which will obviate the need for faculty and/or department staff to classify and enumerate these data themselves) and SciVal. We may soon be required to extract and format faculty accomplishment data to comply with federal requirements, and must be prepared to do so. • Develop reports that can be used in support of annual review, RPT, and other departmental and college administrative functions.
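The partially automated scraper/parser could work along these lines: split a plain-text CV on common section headings and queue candidate records for human review before anything is loaded. A sketch under those assumptions (the heading list and record shape are not from the project):

```python
# Sketch of a partially automated CV scraper/parser: split a plain-text
# CV on common section headings and emit candidate records for a human
# to review before loading. Heading list and record shape are assumptions.
import re

SECTION_HEADINGS = r"(PUBLICATIONS|GRANTS|TEACHING|SERVICE|EDUCATION)"

def split_sections(cv_text):
    """Return {HEADING: body} for each recognized section of the CV."""
    parts = re.split(rf"^\s*{SECTION_HEADINGS}\s*$", cv_text,
                     flags=re.MULTILINE | re.IGNORECASE)
    # re.split with a capture group yields [preamble, heading, body, ...]
    it = iter(parts[1:])
    return {head.upper(): body.strip() for head, body in zip(it, it)}

def candidate_publications(section_body):
    """Treat each non-blank line as one candidate citation to verify."""
    return [ln.strip() for ln in section_body.splitlines() if ln.strip()]

with open("example_cv.txt") as f:  # placeholder input file
    sections = split_sections(f.read())
for citation in candidate_publications(sections.get("PUBLICATIONS", "")):
    print("REVIEW:", citation)  # queue for manual verification
```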

  11. Project Organization, Roles and Responsibilities

  12. Project Organizational Structure

  13. Roles and Responsibilities: Project Directors (Estelle McGroarty & Mary Black) • Communicate with lead stakeholders regarding • Personnel Resources • Funding • Project Scope and Objectives • Project Status • Escalation of Issues (if necessary) • Review and approve project plan (tasks, dates, priorities, dependencies) • Review and approve project deliverables • Facilitate issue articulation and resolution • Organize and chair Advisory Group • Participate in Data Governance Group (as needed) • Participate in Working Group (as needed)

  14. Roles and Responsibilities: Project Manager (R. Cotter) • Develop project plan (scope, tasks, resources, dependencies, dates) • Meet weekly with Project Directors & Data Resource Administrator to review status • Serve as liaison with Digital Measures • Ensuring the DM project plan is in sync with MSU's objectives for the pilot, and that appropriate progress is being made • Providing lead communication and specification regarding system requirements, elements and attributes, reports, interfaces, etc. • Organize and facilitate working group (meeting twice per week, or as needed to keep project pace) • Provide direction to data governance group: develop agenda, meet with data governance group, identify issues, recommend alternatives • Review working group member tasks, identify issues, facilitate resolution

  15. Roles and Responsibilities: (Tech) Working Group • Responsible for the work plan tasks (system analysis, design and development; data analysis; functional and business analysis; process mapping; system testing and usability, etc.) • Understand the dependencies of the work plan in relationship to their tasks; provide feedback to project manager with regard to status and/or issues • Will meet twice per week to review status, discuss tasks, surface & discuss issues • Provide guidance and/or bring issues to Data Governance and Advisory groups

  16. Roles and Responsibilities: Advisory Group • Members represent data and functional requirements of their disciplinary area, while ensuring the needs of the university community as a whole are met. • Reviews project objectives, scope, tasks, timeline and status. • Reviews and approves project deliverables (system design and documentation, system forms, reports, and other deliverables) • Facilitates resolution of, and provides guidance on, issues brought to them by the working group and/or data governance group.

  17. Enterprise Information Stewardship on APP

  18. MSU Enterprise Information Stewardship defined • Definition: A framework in which key information resources are managed in a way similar to other university resources, with care and intention. These information resources include data, definitions and related technology infrastructure. • Information Stewardship will be deployed across the university. • Information Stewardship will be an on-going initiative. • Why Stewardship instead of Governance in an educational institution? • Data Stewardship at MSU may incorporate Data Governance when required.

  19. Data Stewardship defined further • Brings together IT (data) and business (governance) to foster understanding of the data that drives the business. • Facilitates the formation and integration of information resources. • Provides the framework for cross-functional decision-making. • Ensures that Enterprise data meets the strategic and operational needs of the enterprise.

  20. Governance vs. Stewardship • Data Governance: The execution and enforcement of authority over the management of data assets and the performance of data functions. • Data Stewardship: The formalization of accountability for the management of data resources.

  21. Benefits Of Information Stewardship / Governance • Improve data-based decision-making • Provide consistent and rationalized data definitions • Establish and maintain central repositories for Meta and Master data • Enable consistent reporting across business units • Provide data quality metrics and support service level agreements • Carry out security protocols consistently, based on policy, guidelines, laws and audits

  22. How is EIS facilitating this change? • EIS is an apolitical facilitator to resolve data issues, and will not drive decisions to a predetermined business or academic end. • EIS establishes processes and structures that help MSU arrive at consensus about: • Data definitions • Data structure (data elements, code values to use, data relationships, etc.) • Which systems are the authoritative sources • Policies for how we create, update, delete and use these data

  23. Initial Deliverables of Data Stewardship • Some things EIS has done collaboratively with functional and technical stakeholders: • Developed data flow diagrams of all current and future processes (take a "data-centric" view; tell us how the data flow across the enterprise, and where they land); • Developed conceptual, logical and physical data models • Built and maintains a prototype metadata repository and collaboration web site • Participated in system configurations with focus on master data entities
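As a sketch of what a minimal metadata repository might hold, assume one table of data elements, each with an agreed definition, an authoritative source system, and an accountable steward (the schema and sample values are illustrative, not the EIS prototype itself):

```python
# Sketch of a minimal metadata repository: one table of data elements,
# each with an agreed definition, an authoritative source system, and a
# steward. Schema and sample values are illustrative, not the prototype.
import sqlite3

conn = sqlite3.connect("metadata_repo.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS data_element (
        name          TEXT PRIMARY KEY,  -- e.g. 'faculty_rank'
        definition    TEXT NOT NULL,     -- agreed functional definition
        source_system TEXT NOT NULL,     -- authoritative system of record
        steward       TEXT NOT NULL      -- accountable steward or unit
    )
""")
conn.execute(
    "INSERT OR REPLACE INTO data_element VALUES (?, ?, ?, ?)",
    ("faculty_rank",
     "Current academic rank as recorded in the HR system of record",
     "HR",
     "Office of the Provost"),  # illustrative values only
)
conn.commit()
conn.close()
```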

  24. Dimensions of APP Data Governance • Ensuring Information Usefulness By Providing: • Tools • Definitions • A repeatable process for the on-going coordination of shared data that doesn't end when the project ends! • Defining and Providing for Data Security Through: • Access Policies • Access Management • Best Practices • Seeking Better Data Reliability Using: • Project Plans and Data Models • Business Rules • Data Quality Metrics • Audit

  25. EIS/APP "Working" Group • Define and manage the extract, transform and load (ETL) requirements for conversions and interfaces • Define and manage the ETL requirements for the data warehouse • Facilitate and finalize the data structure of "critical" entities • Provide oversight in the development of user access profiles (roles/templates), ensuring that they reflect the laws, policies, regulations, etc. that apply. • Review and recommend the data confidentiality classifications • Review and recommend functional and technical data definitions
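A sketch of how role/template access profiles might be keyed to the confidentiality classifications, assuming three placeholder roles and a three-level classification scheme (none of these names are the project's approved templates):

```python
# Sketch of role/template access profiles keyed to confidentiality
# classifications. Role names and the three-level scheme are placeholders,
# not the project's approved templates or classifications.
from enum import IntEnum

class Classification(IntEnum):
    PUBLIC = 1
    INTERNAL = 2
    CONFIDENTIAL = 3

# Highest classification each role template may read. Ordering the
# levels lets one ceiling comparison stand in for per-field rules.
ROLE_CEILING = {
    "faculty_self": Classification.CONFIDENTIAL,  # own record only
    "dept_admin":   Classification.INTERNAL,
    "public_view":  Classification.PUBLIC,
}

def can_read(role, data_classification):
    """True if the role's ceiling covers the data's classification."""
    return ROLE_CEILING[role] >= data_classification

assert can_read("dept_admin", Classification.INTERNAL)
assert not can_read("public_view", Classification.CONFIDENTIAL)
```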

  26. APP Advisory Group and Data Stewardship • Review, approve and advocate policies governing access (roles, permissions, content-based access). (Data Security) • Review, approve and advocate data usage practices intended to ensure that the data assets of the institution and the individual are not misused or abused. (Data Security & Audits) • Where the working group or data governance group is not able to resolve definitional or other differences, the issue is defined and routed to the Advisory Group. • Review and approve the APP before "go-live".

  27. Focus on Data Quality early and often • Understand your customer’s problems and offer real solutions. • Implement tools and policies that ensure high quality data. • Establish Service Level Agreements with the stakeholders. • Implement dashboard metrics that monitor data quality and utilization.
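One such dashboard metric might be field completeness across profile records; a minimal sketch, with an assumed field list and made-up sample records:

```python
# Sketch of one dashboard-style data quality metric: per-field
# completeness across profile records. Field names are assumptions.
def completeness(records, fields):
    """Percent of records with a non-empty value, for each field."""
    total = len(records) or 1  # avoid division by zero
    return {f: 100.0 * sum(1 for r in records if r.get(f)) / total
            for f in fields}

profiles = [  # made-up sample records
    {"netid": "jdoe", "rank": "Professor", "orcid": ""},
    {"netid": "asmith", "rank": "", "orcid": "0000-0000-0000-0001"},
]
print(completeness(profiles, ["netid", "rank", "orcid"]))
# -> {'netid': 100.0, 'rank': 50.0, 'orcid': 50.0}
```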

  28. APP Conceptual Model

  29. Status and Next Steps

  30. Status and Next Steps • Finalized committee membership and other project resources; meeting regularly • Contract finalized w/Digital Measures • Developed Project Plan; monitoring progress • Completed Process Mapping (for critical processes: Building and Maintaining the Profile; Annual Review; Review for Promotion and Tenure) • Completed Data Flow Diagrams (identifying data sources, data feeds, update requirements, and data at rest) • Initiated Digital Measures instance, loaded with pilot users, base user data, personal and contact data • Working through updates to prioritized Digital Measures screens and data feeds from MSU systems (more than 50% complete) • Expect "beta" pilot group to begin accessing the system in late May/early June, with the remainder over the course of the summer as more data are loaded

  31. High Level Schedule Overview • PHASE I • Vendor Start Up: 12/1/2012-1/1/2013 • Initiate Working, Data Governance, Advisory Groups: November/December • Conduct Process Mapping • Conduct Initial Data Requirement Review • Phase I is complete • PHASE II • Data Scope Phasing Determination: 12/1/2012-1/1/2013 • Technical Design & Development of Interfaces & Unit Testing, Conversion of CANR Data, Manual Data Entry: 1/1/2013-6/1/2013 • PHASE III • Roll Out Planning, Implementation, Training: 6/1/2013-8/15/2013 • Evaluation of Pilot: 9/1/2013-12/1/2013 • Decision as to future direction: 1/1/2014 • Initiate next steps based on decision: 2/1/2014-12/1/2014 (continue w/CANR implementation, add colleges, look for another tool, etc.)

  32. Questions?
