
HEFCE closed at the end of March 2018. The information on this website is historical and is no longer maintained.

Many of HEFCE's functions will be continued by the Office for Students, the new regulator of higher education in England, and Research England, the new council within UK Research and Innovation.

The HEFCE domain will continue to function until September 2018. At that point we will close the site entirely and all its information will be available only from the National Web Archive.



University of Lincoln

This longitudinal study will follow students from different subjects throughout their undergraduate studies. It will include a self-assessment skills audit, and a situational judgement test.

Partner: University of Huddersfield

Project methodologies: Grades; Standardised test; Mixed methods; Other qualitative methods 

Pilot case study


The project assesses possible means of measuring the ‘distance travelled’ by students over the three-year period of their undergraduate studies. Specifically, it combines outputs from standardised psychometric tests and reflective student self-assessments with data on academic achievement, attendance, and engagement in extra-curricular activities.

Aims and objectives

  • To establish the feasibility of measuring undergraduate learning gain through the use of standardised psychometric tests combined with reflective student self-assessments
  • Through the integration of additional university data sets with psychometric test results, to determine the impact on learning gain of student engagement in academic and extra-curricular activities
  • To gauge the potential and suitability of this methodological approach for the measurement of learning gain at UK universities.

Experiences and outcomes

We are convinced that the wealth of data routinely gathered across campuses by UK higher education institutions can be used to provide highly valuable insights into the ways in which different undergraduates develop key transferable skills.

However, at present, there is a need to ensure that the data currently gathered is collated, analysed and reported effectively. We also believe that this data can be further enriched by

  • the introduction of additional opportunities for focused self-reflection; and
  • the use of psychometric tests.

From the student perspective:

A key outcome of the University of Lincoln project will be the creation of integrated student profiles in which data relating to academic activity, student self-reflection, psychometric testing and engagement with extra-curricular opportunities, ranging from society memberships to engagement with democracy, are drawn together for the first time.

By using these comprehensive student profiles, we hope to track the interrelationships of these factors and their individual and combined contributions to the development of learning gain in participating students.


The value of developing this approach through the learning gain project has already been demonstrated, with the university’s Careers and Employability Service and academics within individual schools requesting data to enable them to develop and shape their curricula and study skills courses to meet the individual needs of students. This result underpins our institution’s conviction that the true value of learning gain data is to be found not in its potential to enable a national comparison between universities, but in its ability to influence and improve teaching, learning and student development.

Despite the challenges of student engagement, our project has also reinforced our conviction that the formal measurement of learning gain at university offers useful opportunities for students to make time to reflect on their skills and to be supported in creating bespoke development plans.

At the same time, the results of psychometric tests underline the variety of competencies in first year students, highlighting the challenges associated with the development of a universal measurement of learning gain that would be equally valid across all higher education institutions and emphasising the importance of bespoke teaching. 

Quotes about the impact of the project:


“I think [the self-assessment] does initiate internal thinking and reflection because these aren’t questions you would ask yourself normally.” (2nd year Business student)

“[The self-assessment] is useful as a tool because it promotes [thinking] about your employability skills and [how to] improve them. [Combined with the Situational Judgement Test it provides] a two-pronged, targeted approach.” (2nd year Business student)

“I like things like [the psychometric test] to find out about myself.” (2nd year Psychology student)

“I review my own habits quite a lot, using those questions [as] prompts was useful.” (2nd year Business student)

“It was interesting doing it this year [for a second time] because quite a few of [my scores] changed.” (2nd year Psychology student)

Staff in direct contact with students:

“interesting and good to see a positive result” (Head of Careers and Employability)

“very interesting” (Director of Pharmacy Education)


The project’s single greatest challenge has been student engagement.

Despite both the Students’ Union and relevant academics from each School championing the project, the initial response was relatively poor. In order to increase participation, an incentive worth £10 was introduced for all students who completed both tests. This was in addition to a prize draw for £100 of gift vouchers. Students were repeatedly encouraged and sent courtesy reminders to complete the tests, and school staff representatives received a weekly update on percentage completions in the form of a ‘league table’.

In addition, an email was sent by the Deputy Vice-Chancellor to key contacts in participating schools; the Students’ Union undertook to telephone students who had begun but not completed the tests; and prompting emails (with test links) were sent in the names of personal tutors. Following these incentives, a completion rate of 47% was achieved. As expected, completion statistics for the 10-minute Student Self-Assessment exceeded those for the more time-consuming and labour-intensive 40-minute Situational Judgement Test (SJT): 45% compared with 39%.

This approach was labour intensive and costly. While achievable for a small pilot study it would not be appropriate for a larger study. To both increase the student numbers in the study and test the lessons learnt from the first administration, a second cohort of students from the 2016-17 intake was identified in the schools of Pharmacy, Psychology and Architecture & Design.

The first cohort completion rates were generally highest where the tests were undertaken in formalised co-curricular activities in timetabled workshops. Completion rates were usually lowest where students were introduced to the project by personal tutors but then left to undertake the tests in their own time. Based on these findings, the first administration to Cohort 2 (2016-17 intake) was undertaken in workshop conditions and took place as a scheduled activity during Welcome Week (September 2016). Students were not offered any monetary incentives for their involvement. Reported student numbers indicate that this approach was successful in increasing completion rates while reducing the staff time spent administering the tests (the low completion rate for Architecture & Design was due to late timetabling, which meant some students did not know the workshop was taking place). The total number of Level 1 students completing either one or both of the SSA and SJT tests was 675.


This project is collecting data from eight disciplinary areas: architecture and design, business, computer science, engineering, fine and performing arts, history and heritage, pharmacy, and psychology. This was a purposive sample representing contrasting academic traditions (from STEM, Humanities, Social Sciences and the Creative Arts).

During the study, students have been asked to complete two forms of test; the test results will be combined with other relevant data in order to create Individual Student Profiles, as detailed below:


  • A Situational Judgement Test (SJT) which measures competencies in relation to both critical reflection and problem solving. This test is a form of psychological aptitude test which not only provides a measure of how students approach situations they might encounter in the workplace, but also delivers developmental, formative feedback.
  • A Student Self-Assessment (SSA) which captures each student’s perception of their own capability against seven key employment competencies.
  • Repeat testing (SSA), which will be conducted at set points within each academic year and will enable the project to track and influence development in self-perception over a three-year term.

Other data (including, but not necessarily limited to):

  • Academic Performance Statistics, including: course data; grades (aggregated); attendance
  • Student Engagement: the university and the Students’ Union collect a substantial amount of additional information relating to student participation in training, democracy, work experience, and extra-curricular activities
  • Wider Participation data relating to socio-economic background of students

By combining the above data in Individual Student Profiles, we hope not only to track student development in terms of competence and self-perception, but also to identify patterns, trends and correlations in the data and outputs.
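The merging step described above can be sketched in code. The following is an illustrative outline only: the field names, score scales, and student IDs are hypothetical stand-ins, not the project's actual data model. It shows the general idea of joining per-source records, keyed by an anonymised student ID, into one Individual Student Profile per student.

```python
# Illustrative sketch: field names and values are hypothetical, not the
# project's actual data model.

def build_profiles(sjt_scores, ssa_scores, academic, engagement):
    """Merge per-source records (each a dict keyed by anonymised student ID)
    into one Individual Student Profile per student."""
    profiles = {}
    all_ids = set(sjt_scores) | set(ssa_scores) | set(academic) | set(engagement)
    for sid in all_ids:
        profiles[sid] = {
            "sjt": sjt_scores.get(sid),            # Situational Judgement Test result
            "ssa": ssa_scores.get(sid, []),        # Student Self-Assessment (repeat sittings)
            "academic": academic.get(sid, {}),     # aggregated grades, attendance
            "engagement": engagement.get(sid, []), # extra-curricular participation
        }
    return profiles

profiles = build_profiles(
    sjt_scores={"S001": 62},
    ssa_scores={"S001": [3.4, 3.9], "S002": [2.8]},  # repeat SSA sittings per student
    academic={"S001": {"mean_grade": 64, "attendance": 0.91}},
    engagement={"S001": ["debating society", "course representative"]},
)
```

A profile is created for every student seen in any source, with missing sources left empty, so patterns can be examined even for students who completed only one of the tests.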

Publications and forums

The work has been publicised on UOL and department websites, and at national conferences.

Academic publications since 2016: Paper submitted to Higher Education Pedagogies Journal - Special Edition on Learning Gain (publication date to be confirmed)

Further information

Contact Dr Stephen Haddelsey, Learning gain project manager and case study author; email; tel 01522 886350

Find out more about the University of Lincoln's project

See all learning gain pilot projects


Page last updated 13 December 2017
