Developing Continual Adaptive Learning Techniques for Large Language Models in Neural Information Retrieval
| Program start date | Application deadline |
| --- | --- |
| 2025-10-01 | - |
| 2026-01-01 | - |
| 2026-04-01 | - |
Program Overview
Introduction to the PhD Program
The PhD programme Developing Continual Adaptive Learning Techniques for Large Language Models in Neural Information Retrieval is a research-focused degree that addresses the critical challenge of catastrophic forgetting in neural information retrieval systems.
Qualification(s) Available
- PhD
Entry Requirements
Applicants should have, or expect to achieve, at least a 2:1 Honours degree (or equivalent) in computer science or a related subject. A relevant Master’s degree and/or experience in one or more of the following areas will be an advantage: artificial intelligence, information sciences, or mathematics, together with programming experience.
Fees for Entry
- UK fee: £5,006 per annum (full-time)
- International fee: £28,600 per annum (full-time)
Duration and Start Date
- Duration: Full-time, 3 years
- Start dates: October 2025, January 2026 or April 2026
Application Deadline
- Application deadline: 1 July 2026
Project Reference
- Project reference: CO/GC - SF4/2025
Location
- Location: Loughborough
Subject Area(s)
- Subject area(s): Computer Science
Project Details
This PhD project investigates the critical challenge of catastrophic forgetting in neural information retrieval (NIR) systems. Contemporary NIR architectures exhibit significant performance degradation when integrating new information while attempting to preserve existing knowledge, a fundamental limitation as digital collections continue to expand.
Background
Neural information retrieval has transformed traditional search systems through deep learning. Modern approaches include neural ranking architectures such as DRMM, KNRM and DUET, and pre-training-based frameworks such as BERTdot and ColBERT, which have demonstrated remarkable success in static environments. However, these models face considerable challenges in continual learning scenarios.
The phenomenon of catastrophic forgetting emerges as a central challenge when models incorporate new information, leading to deterioration of previously acquired knowledge. While continual learning strategies have shown promising results across various domains, their application within NIR systems demands deeper exploration, particularly concerning topic distribution shifts and data volume dynamics. Moreover, adaptive learning strategies are necessary to ensure models can adjust effectively to evolving data and retrieval requirements without compromising performance.
Existing NIR systems require complete retraining to integrate new information—an approach that is computationally demanding and impractical for real-world deployment. Recent research suggests promising directions in applying continual learning to NIR, yet fundamental challenges remain in developing specialised strategies, understanding topic shifts, and implementing efficient memory management solutions.
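To make the memory-management idea above concrete, the sketch below illustrates one simple strategy from the continual learning literature, experience replay, applied to a toy dense retriever: newly arrived query-document pairs are mixed at each update with pairs sampled from a fixed-size replay buffer, so that earlier topics are revisited while new ones are learned. The encoder, loss, buffer size and placeholder data are illustrative assumptions for this sketch only; they are not the method prescribed by the project.

```python
# Minimal sketch of replay-based continual fine-tuning for a dense retriever.
# Assumptions: a toy MLP encoder, an in-batch-negative loss and a reservoir-
# sampled replay buffer stand in for a real NIR model, corpus and query stream.
import random
import torch
import torch.nn as nn
import torch.nn.functional as F

class Encoder(nn.Module):
    """Toy stand-in for a dense query/document encoder (e.g. a BERT-style bi-encoder)."""
    def __init__(self, in_dim=300, out_dim=128):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, 256), nn.ReLU(), nn.Linear(256, out_dim))

    def forward(self, x):
        return F.normalize(self.net(x), dim=-1)  # unit-length embeddings

class ReplayBuffer:
    """Fixed-size memory of past (query, relevant-document) pairs via reservoir sampling."""
    def __init__(self, capacity=1000):
        self.capacity, self.pairs, self.seen = capacity, [], 0

    def add(self, query, doc):
        self.seen += 1
        if len(self.pairs) < self.capacity:
            self.pairs.append((query, doc))
        else:
            j = random.randrange(self.seen)
            if j < self.capacity:
                self.pairs[j] = (query, doc)

    def sample(self, k):
        return random.sample(self.pairs, min(k, len(self.pairs)))

def continual_step(encoder, optimizer, new_queries, new_docs, buffer, replay_k=16):
    """One update on newly arrived pairs, mixed with replayed old pairs so that
    earlier topics are not overwritten (mitigating catastrophic forgetting)."""
    replayed = buffer.sample(replay_k)
    if replayed:
        old_q, old_d = map(torch.stack, zip(*replayed))
        queries = torch.cat([new_queries, old_q])
        docs = torch.cat([new_docs, old_d])
    else:
        queries, docs = new_queries, new_docs

    q_emb, d_emb = encoder(queries), encoder(docs)
    scores = q_emb @ d_emb.T / 0.05            # in-batch negatives, temperature 0.05
    labels = torch.arange(len(scores))         # i-th query matches i-th document
    loss = F.cross_entropy(scores, labels)

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

    for q, d in zip(new_queries, new_docs):    # remember the new pairs for later replay
        buffer.add(q.detach(), d.detach())
    return loss.item()

# Usage: simulate a stream of topic-shifted batches arriving over time.
encoder = Encoder()
optimizer = torch.optim.Adam(encoder.parameters(), lr=1e-3)
buffer = ReplayBuffer()
for step in range(100):
    q_batch, d_batch = torch.randn(32, 300), torch.randn(32, 300)  # placeholder data
    continual_step(encoder, optimizer, q_batch, d_batch, buffer)
```

A production NIR system would replace the toy encoder with a pre-trained bi-encoder and would likely combine replay with other strategies such as regularisation or parameter isolation; exploring and evaluating that design space under topic shift and growing data volumes is part of what this project investigates.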
Research Objectives
- Develop a framework for continual and adaptive learning in NIR systems, addressing catastrophic forgetting while enabling the model to dynamically adjust to new data and retrieval tasks.
- Design and optimise advanced continual learning strategies, focusing on memory management, handling topic diversity, and adapting to variations in data volume, ensuring models can learn continuously and flexibly.
- Integrate domain adaptation techniques and refine evaluation metrics, ensuring the scalability and practical efficiency of the proposed strategies across dynamic information retrieval environments.
- Case Study: Apply the developed methods in a real-world NIR system to demonstrate their practical effectiveness in dynamic and ever-evolving environments.
Supervisors
- Primary supervisor: Professor Georgina Cosma
English Language Requirements
Applicants must meet the minimum English language requirements. Further details are available on the International website.
How to Apply
All applications should be made online. Under programme name, select 'Computer Science'. Please quote the advertised reference number CO/GC - SF4/2025 in your application.
Selection Criteria
The following selection criteria will be used by academic schools to help them make a decision on your application. Please note that these criteria are used for both funded and self-funded projects.
The selection criteria include:
- Academic qualifications
- Research experience
- Personal statement
- References
Please note that applications for this project are considered on a rolling basis once submitted, and the project may be withdrawn before the application deadline if a suitable candidate is appointed.
