Barriers and facilitators of effective evaluation

This section is intended for evaluators, practitioners, senior management, and staff with responsibility for the APP.

Barriers to effective evaluation

Table 1 presents an overview of the barriers to effective evaluation. Key challenges include data inconsistencies, fragmented data infrastructure, and a lack of alignment between institutional goals and evaluation processes. Together, these barriers make it harder to gather, interpret, and use data, and therefore to maintain consistent and accurate evaluation practices.

The barriers fall into four broad categories: ethics/privacy, organisational, data infrastructure, and data.

Table 1: Barriers to effective evaluation
Each barrier is listed below with its effect and possible mitigation(s).

Issue: Data is not clearly defined, so across the institution there is no consistent understanding of what the data means.
Effect: Data is interpreted differently in different institutional contexts, e.g. if the definition of commuter student differs across the institution then data from different courses will not be comparable.
Mitigation(s): Data dictionary: a central source of fact for what data is recorded and how it is defined (a minimal sketch is given after the table).

Issue: Fragmented data infrastructure means a complete dataset requires export from many systems.
Effect: Time-consuming to obtain all the data, which requires manual joining, increasing the chance of data errors.
Mitigation(s): A system that collates data from all the institution’s data stores and acts as the central repository (a cross-system consistency check is sketched after the table).

Issue: Fragmented data infrastructure where the same or similar data is captured in multiple systems.
Effect: Data doesn’t always agree between systems, meaning similar analysis can lead to different results.
Mitigation(s): A system that collates data from all the institution’s data stores and acts as the central repository.

Issue: Fragmented data infrastructure such that different systems have different data owners.
Effect: Hinders access to data and may mean the organisation doesn’t have a complete picture of the data captured.
Mitigation(s): A system that collates data from all the institution’s data stores and acts as the central repository.

Issue: Policy change affects what data is captured or how it is captured.
Effect: Data that is needed for evaluation is not collected, or is collected differently.
Mitigation(s): Register of data users, so that data required for specific purposes not directly specified in legislation continues to be captured.

Issue: Data definition changes over time, e.g. attendance (previously just lectures) now includes seminars, lab classes etc.
Effect: Hinders longitudinal comparisons.
Mitigation(s): When definitions change, consider how that affects interpretation and whether data should also continue to be captured using legacy definitions.

Issue: Data is insufficient without context, e.g. attendance without knowing what the expected attendance is for that student.
Effect: Makes cross-course comparisons difficult.
Mitigation(s): Consider the differing course requirements and identify data that require context, e.g. expected levels of attendance or, for online submissions, expected submission requirements.

Issue: Student engagement with support activities is not recorded.
Effect: Effectiveness of support services, or of the signposting/referral to them, cannot be determined.
Mitigation(s): Consider using a tracking system or learning analytics system to record access, but note the privacy implications of such data.

Issue: Data infrastructure changes mid APP cycle.
Effect: Data previously captured is no longer captured.
Mitigation(s): Register of data users, so that data required for specific purposes continues to be captured.

Issue: Data infrastructure changes mid APP cycle.
Effect: The data definition changes for the same data name.
Mitigation(s): Data dictionary: a central source of fact for what data is recorded and how it is defined.

Issue: Poor institutional memory due to staff churn means the context of the institution at different times isn’t understood.
Effect: Context of data that isn’t recorded is lost, e.g. no-detriment policies during COVID-19.
Mitigation(s): Data like this is hard to record formally. When anomalies show up in the data, such as temporary narrowing or widening of equality gaps or changes in attainment, talk to staff who were involved at the time.

Issue: Lack of data-access provision.
Effect: Data is collected but not used to evaluate service performance.
Mitigation(s): Formalise access to data through an ethics process, with protections in place for data subjects.

Issue: Lack of understanding of the APP and its regulatory importance at executive level.
Effect: Resources and structures necessary for evaluation are not supported within the university.
Mitigation(s): Senior staff responsible for the APP should have a good understanding of evaluation methodologies, extending to resourcing needs (e.g. staff, time, data).

Issue: Lack of understanding of evaluation in senior management.
Effect: Resources and structures necessary for evaluation are not available.
Mitigation(s): Training, or recruitment of student success managers who understand evaluation methodologies.

Issue: Lack of understanding of the APP and its regulatory importance in the planning team.
Effect: Data isn’t captured, or changes to infrastructure are made without reference to the APP.
Mitigation(s): Senior staff responsible for the APP should feed into the governance of the IT infrastructure/planning team.

Issue: The APP cycle is out of step with other institutional reporting cycles (e.g. Teaching Excellence Framework, Research Excellence Framework) and its importance is diminished.
Effect: Data infrastructure is primarily arranged around TEF or REF reporting, leading to changes that affect the APP.
Mitigation(s): Senior staff responsible for the APP should be able to feed into the governance of the digital infrastructure/planning team.

Issue: Lack of a clear ethics process for analysing data prevents evaluation of student support initiatives.
Effect: Reduces incentives among institutional statistical experts to analyse data to understand and tackle equality gaps in the institution.
Mitigation(s): An ethics governance process that sets out clear guidelines for working with secondary data.
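
To make the data dictionary mitigation more concrete, the sketch below shows one minimal way such a dictionary could be represented, with effective dates so that definition changes over time (like the attendance example above) remain visible when making longitudinal comparisons. It is a sketch only: the field names, dates, and definitions are illustrative assumptions rather than a prescribed schema.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class DataDictionaryEntry:
    """One entry in a hypothetical institutional data dictionary."""
    name: str                # canonical field name used across systems
    definition: str          # agreed institutional definition
    source_system: str       # system of record for this field
    effective_from: date     # date this definition came into use
    effective_to: Optional[date] = None  # None means the definition is still current

# Illustrative entries showing how a definition change over time
# (e.g. what counts as "attendance") can be recorded explicitly.
DATA_DICTIONARY = [
    DataDictionaryEntry(
        name="attendance",
        definition="Proportion of timetabled lectures attended",
        source_system="timetabling",
        effective_from=date(2018, 9, 1),
        effective_to=date(2021, 8, 31),
    ),
    DataDictionaryEntry(
        name="attendance",
        definition="Proportion of all timetabled sessions attended "
                   "(lectures, seminars, lab classes)",
        source_system="timetabling",
        effective_from=date(2021, 9, 1),
    ),
]

def definition_on(name: str, when: date) -> Optional[DataDictionaryEntry]:
    """Return the definition that applied to a field on a given date."""
    for entry in DATA_DICTIONARY:
        if entry.name == name and entry.effective_from <= when and (
            entry.effective_to is None or when <= entry.effective_to
        ):
            return entry
    return None

# Which definition of attendance applied to data captured in January 2020?
print(definition_on("attendance", date(2020, 1, 15)).definition)
```

Recording when each definition applied makes it explicit which definition any historical data point was captured under, which is what longitudinal comparison needs.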
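
Similarly, where the same or similar data is captured in multiple systems, a central repository can flag records that are missing from one system, or that disagree between systems, before any analysis is run. The sketch below uses pandas on two hypothetical extracts sharing a student_id key; the system names, columns, and values are assumptions made purely for illustration.

```python
import pandas as pd

# Hypothetical extracts of the "same" data held in two institutional systems.
student_records = pd.DataFrame({
    "student_id": ["s001", "s002", "s003"],
    "commuter_student": [True, False, True],
})
vle_export = pd.DataFrame({
    "student_id": ["s001", "s002", "s004"],
    "commuter_student": [True, True, False],
})

# Outer join on the shared key so records missing from either system are kept.
merged = student_records.merge(
    vle_export,
    on="student_id",
    how="outer",
    suffixes=("_records", "_vle"),
    indicator=True,
)

# Records present in only one system.
missing = merged[merged["_merge"] != "both"]

# Records present in both systems where the two systems disagree.
conflicting = merged[
    (merged["_merge"] == "both")
    & (merged["commuter_student_records"] != merged["commuter_student_vle"])
]

print("Only in one system:\n", missing[["student_id", "_merge"]])
print("Systems disagree:\n", conflicting[["student_id"]])
```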

Facilitators of effective evaluation

In contrast to Table 1, Table 2 highlights best practices that can support institutions in overcoming these challenges. These facilitators include centralised access to data, consistent systems for recording post-entry interventions, and clear ethical guidelines for data analysis. Tools such as learning analytics systems and data dashboards with sufficient granularity can help identify and address equality gaps. A shared understanding of data, coupled with dedicated ethics processes and privacy notices, can improve the effectiveness of evaluation.
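
As a rough illustration of the granularity point, the sketch below shows the kind of breakdown a dashboard might surface: a good-degree rate split by course and by an equality-group flag, with the gap reported per course rather than only at institution level. The data, column names, and choice of measure are hypothetical.

```python
import pandas as pd

# Hypothetical extract: one row per student, with course, an equality-group
# flag (e.g. IMD quintile 1-2 vs 3-5) and whether they achieved a good degree.
df = pd.DataFrame({
    "course": ["BSc Biology"] * 4 + ["BA History"] * 4,
    "imd_q1_q2": [True, True, False, False, True, False, True, False],
    "good_degree": [True, False, True, True, False, True, True, True],
})

# Good-degree rate by course and equality group.
rates = (
    df.groupby(["course", "imd_q1_q2"])["good_degree"]
      .mean()
      .unstack("imd_q1_q2")
)

# The gap is the percentage-point difference between groups, per course:
# the granularity needed to see where an institution-level gap is concentrated.
rates["gap_pp"] = (rates[False] - rates[True]) * 100
print(rates)
```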

The facilitators fall into the same four categories: ethics/privacy, organisational, data infrastructure, and data.

Table 2: Facilitators of effective evaluation
Each facilitator is listed below with examples.

Facilitator: Centralised access to data
Examples: Learning analytics system that collates institutional data

Facilitator: Dedicated system for recording details of post-entry support
Examples: Institutional CRM system; tracking services, e.g. Higher Education Access Tracker (HEAT) or East Midlands Widening Participation Research and Evaluation Partnership (EMWPREP), that utilise the post-entry MOAT

Facilitator: Consistent system for recording post-entry interventions
Examples: Typology of post-entry student support interventions, e.g. post-entry MOAT

Facilitator: Shared understanding of what data means
Examples: Data dictionary

Facilitator: A dedicated ethics process for the analysis of secondary data
Examples: University of Wolverhampton Ethics Process

Facilitator: Privacy notice that informs students that institutional data will be used for research and evaluation purposes
Examples: Open University privacy notice

Facilitator: Privacy notice that informs students that data associated with their engagement with the university may be stored on external systems (e.g. HEAT) for research purposes
Examples: NTU CenSCE extra-curricular and co-curricular activity participant privacy notice