E-810.1 Data Review and Verification
| Authority | Executive Director of Institutional Research and Effectiveness |
| Effective Date | September 7, 2016 |
| Revision Date | |
| Reviewed Date | December 20, 2023 |
| Related Policies | |
| Related Forms, Policies, Procedures, Statute | E-810 – Data Review and Verification |
The College evaluates the completeness, correctness, and conformance of specific data to their intended purpose. The goal is to ensure and document that the reported data and results are what they purport to be and accurately and appropriately address the intended purpose. If deficiencies in the data are identified, they are documented for data users and, where possible, corrected or resolved.
All Employees responsible for collecting and reporting performance measures to external stakeholders must ensure data integrity and credibility, including the interpretation of the data. This Procedure is especially important before submitting or presenting data related to performance measures to the District Board or external stakeholders.
DATA REVIEW AND VERIFICATION PROCEDURE
Step 1: Clarify the Intended Purpose or Question(s) to be Addressed. The first step is clarifying the intended purpose or question. This helps to clarify the type(s) of information that may be needed and can assist in determining the location and source of these records.
Criteria to consider:
The intended purpose or question is clear, understandable, and unambiguous.
The intended purpose or question aligns with and is appropriate to the mission and strategic plan of the College.
The intended purpose or question can effectively support decision-making.
If the intended purpose or question does not meet these criteria, the responsible Employee should obtain clarification from the requesting person(s).
Step 2: Determine Data Needs. The next step is to determine the data needed to address the intended purpose or question. It is essential to clearly connect the data to the question.
Different data types (e.g., quantitative or qualitative) and multiple data sources and repositories are available. Institutional Effectiveness (IE) at the College can assist as needed.
Criteria to consider:
Data needs to align with the intended purpose or question.
Data needs are well-defined, unambiguous, and understandable to the intended audience.
Data needs do not release confidential or otherwise protected information beyond approved audiences.
Data needs can be sufficiently met to support decision-making.
Should the needed data fail to meet these criteria, the responsible Employee should discuss the specific issue or data limitation with the requesting person(s).
Step 3: Cite Data Sources and Limitations. Appropriate and detailed citation of data sources is essential. Citation helps the intended audience better understand the reported data and results and increases their perceived validity and credibility. This step is a best practice in applied and academic research and supports the Data Review and Verification Procedure.
Criteria to consider:
Data sources are well-defined, unambiguous, and understandable to the intended audience (e.g., free of jargon, unclear terms, and unexplained acronyms).
Sources are clearly and consistently documented, with data definitions and standards used consistently and available for review.
At times, the data will have restrictions on applicability or other limits. This is common in all fields. Providing appropriate data and clearly explaining any limitations that may impact decision-making is important.
Criteria to consider:
Any data restrictions or limitations are well-defined, unambiguous, and understandable to the intended audience (Example: The following information only pertains to Students who have completed a program and should not be generalized to all Students.).
Data limitations are well-defined, including descriptions of methodologies for estimating data, the timeframes for finalizing incomplete information, and so on.
Data that are anomalous compared to other data with similar measures are explained.
Complementary or supplemental data sources are utilized as cross-checks when feasible.
Data and results, even with limitations, can sufficiently support decision-making.
The key criterion is that the data and results address the intended purpose or question and support decision-making. If this criterion cannot be met, the responsible Employee should discuss the limitation with the requesting person(s).
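Several of the criteria in Step 3 (completeness, screening for anomalous values, cross-checks against complementary sources) can be partially automated before data are submitted for review. The sketch below is illustrative only and is not part of this Procedure; every function name, field name, and threshold is a hypothetical assumption rather than a College standard.

```python
# Illustrative sketch only: basic completeness, anomaly, and cross-check
# tests an Employee might run before submitting data for review. All
# field names and thresholds are hypothetical assumptions.

def completeness(records, required_fields):
    """Fraction of records in which every required field is present and non-empty."""
    if not records:
        return 0.0
    complete = sum(
        1 for r in records
        if all(r.get(f) not in (None, "") for f in required_fields)
    )
    return complete / len(records)

def flag_anomalies(values, tolerance=3.0):
    """Flag values more than `tolerance` standard deviations from the mean.

    Note: for very small samples, an outlier inflates the standard
    deviation, so a lower tolerance may be needed to flag anything.
    """
    mean = sum(values) / len(values)
    variance = sum((v - mean) ** 2 for v in values) / len(values)
    std = variance ** 0.5
    if std == 0:
        return []
    return [v for v in values if abs(v - mean) > tolerance * std]

def cross_check(primary_total, supplemental_total, rel_tol=0.02):
    """True if two independent sources agree within a relative tolerance."""
    if supplemental_total == 0:
        return primary_total == 0
    return abs(primary_total - supplemental_total) / supplemental_total <= rel_tol
```

For example, an Employee might compute the completeness of exported enrollment records, flag any anomalous term counts for explanation, and cross-check a headcount total against a supplemental repository before citing both sources in a report.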
Step 4: Data Review and Verification by Institutional Effectiveness. The fourth step is the review and verification by IE that the appropriate steps were taken to address the intended purpose or question and that accurate, reliable, and validated data and results are provided to the intended audience.
The presentation or report on the identified topic areas should be submitted to IE for review and verification at least two (2) weeks before being presented to the District Board or external stakeholders.
Note that the District Board packet materials may be due earlier than this, so plan accordingly.
IE will provide data oversight and certification, which includes a thorough review of the following:
Intended purpose or questions being addressed.
Appropriateness and adequacy of the data used to address the intended purpose or questions.
Appropriateness, adequacy, and accuracy of reported data, including a source data review.
Adequacy of citation of data sources and limitations.
Adequacy of language, terms, tables, and figures for the intended audience.
If the presentation or report meets the criteria identified in this Procedure, IE will officially indicate on the document, with the date, that the information has been reviewed and verified. No further action is required before presentation or reporting.
However, if the presentation or report fails to meet the criteria, IE will contact the responsible Employees to discuss the identified issues and offer suggestions or assistance with corrections. The Employee responsible for presenting or reporting the information must make all corrections and receive IE verification before sharing the information.
DEFINITIONS
Data – All institutional information related to Students, courses, and Employees that is collected and analyzed for College, State, and Federal reporting requirements.
Data Governance – The overall management of the availability, usability, integrity, and security of the data employed in an enterprise. A sound data governance program includes a governing body or council, a defined set of procedures, and a plan to execute those procedures. Adapted from “Essential Guide to Measuring a Data Quality Assurance Program.” TechTarget, 2016.
Data Integrity – Maintaining and assuring the accuracy and consistency of data over its entire life cycle, including design, implementation, and usage of any system that stores, processes, or retrieves data. Adapted from Wikipedia (Boritz, J. "IS Practitioners' Views on Core Concepts of Information Integrity." International Journal of Accounting Information Systems. Elsevier.)
Data Quality – The degree to which a set of data characteristics fulfills requirements. Characteristics include completeness, validity, accuracy, consistency, availability, and timeliness. Requirements are defined as the need or expectation stated, generally implied, or obligatory. In short, data are adequate and appropriate for the intended usage in operations, decision-making, and planning. Adapted from ISO 9000. 2015.
Data Verification – The process of evaluating specific data's completeness, correctness, and conformance/compliance with the intended purpose. Data verification aims to ensure and document that the reported data and results are what they purport to be and accurately and appropriately address the intended purpose. When deficiencies in the data are identified, those deficiencies should be documented for the data user’s review and, where possible, resolved by corrective action. Adapted from “Guidance on Environmental Data Verification and Validation.” US Environmental Protection Agency (EPA/G-8). November 2002.
Source Data Review – A “review of source documentation to check the quality of the source, review protocol compliance, ensure critical processes and source documentation are adequate.” TransCelerate, taken from “Providing Clarity on the Definitions of Source Data Verification (SDV) and Source Data Review (SDR).” Stephen Young. August 2014.
Reliability – The degree to which an assessment, metric, or data point is consistent or produces stable results. Adapted from “Exploring Reliability in Academic Assessment.” Colin Phelan and Julie Wren, University of Northern Iowa Office of Academic Assessment. (2005-06).
Validity – How well an assessment, metric, or data point measures what it is purported to measure. In addition, data are appropriate to the intended use or purpose. Adapted from “Exploring Reliability in Academic Assessment.” Colin Phelan and Julie Wren, University of Northern Iowa Office of Academic Assessment. (2005-06).