
MBPV Question Assistant

[Hero diagram: from Ollama on a single laptop, through the MBPV vector database, to AI draft generation (Bloom's level, topic), edit & save, Module Lead governance, Word/PDF export, and a living question bank. Tagline: AI-powered, curriculum-aligned assessment platform for the MSc Molecular Biology & Pathology of Viruses, Imperial College London.]

MBPV Question Assistant

A secure, curriculum-aware AI platform for generating, reviewing, and governing MCQ question banks — built for the MSc in Molecular Biology and Pathology of Viruses.

The MBPV Question Assistant began not as a platform project, but as a learning technologist downloading Ollama onto a laptop after an internal GenAI Shorts session. Agnieszka Malisz-Virtanen, Senior Learning Technologist, fed programme materials to a local LLM to generate draft MCQs: documents were tagged, prompts refined, and outputs copied into Word and sent for academic review. It worked, but it was entirely dependent on one person's machine and availability.

That bottleneck became the design brief. Together with Adrian Cowell (Innovation Lead), Agnieszka rebuilt the workflow as a secure, web-based platform with proper roles, governance, and a programme-specific vector database built from MBPV teaching materials. The result is a system where 30+ academics across 80 lectures can independently generate curriculum-aligned draft questions — with Module Leads overseeing approval, revision, and exam assignment — without any single point of failure.

Presented at Imperial’s GenAI Shorts internal series in February 2026, the project has received strong academic feedback and a funding bid has been submitted to support expansion across the Faculty of Medicine.

Project at a Glance

Status Active — Phase 1 complete; funding bid submitted for Phase 2 expansion
Programme MSc Molecular Biology and Pathology of Viruses (MBPV) — Imperial College London
Scale ~80 lectures · ~30 academics · 25 MCQs per exam · 2 exams per year
Technology Next.js · Supabase (PostgreSQL + vector) · Claude AI · Vercel
Question Types MCQ with Bloom's taxonomy levels, difficulty settings, topic tags, and distractor generation
Export Word (.docx) · PDF · optional answer key · clean exam-ready formatting
Presented Imperial GenAI Shorts — 26 February 2026
Module Leads Prof Peter O'Hare (Module 1) · Dr Rob White (Module 2)

The Six-Step Process

Step 1

Secure Environment

Authenticated platform with role-based access. Teaching materials uploaded and processed safely — no student data, no external sharing, aligned to Imperial governance.
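The role-based access described above can be sketched as a simple permission map. This is a minimal illustration, not the production schema: the role names, action names, and permission sets here are assumptions inferred from the workflow.

```typescript
// Sketch of role-based access control for the platform's roles.
// All names here are illustrative assumptions, not the real schema.
type Role = "academic" | "module_lead" | "admin";

type Action = "generate" | "edit_own" | "approve" | "assign_exam" | "manage_users";

// Each role accumulates the permissions of the roles below it.
const PERMISSIONS: Record<Role, Action[]> = {
  academic: ["generate", "edit_own"],
  module_lead: ["generate", "edit_own", "approve", "assign_exam"],
  admin: ["generate", "edit_own", "approve", "assign_exam", "manage_users"],
};

// Gate every sensitive route or mutation behind a check like this.
function can(role: Role, action: Action): boolean {
  return PERMISSIONS[role].includes(action);
}
```

In practice a check like `can(user.role, "approve")` would guard the governance views, so academics see only generation and editing of their own questions.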

Step 2

Upload & Structure

Programme materials (slides, transcripts) ingested with a tagging structure defined by Module Leads. Builds a bespoke vector database — the system's programme-specific memory.
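The ingestion step above can be sketched as chunking a lecture transcript into tagged passages ready for embedding. The field names (`lectureId`, `topic`) and the chunk-size heuristic are assumptions for illustration; the real pipeline and tagging structure are defined by the Module Leads.

```typescript
// Illustrative chunking of a lecture transcript into tagged passages
// ready for embedding. Field names are assumptions, not the real schema.
interface TaggedChunk {
  lectureId: string;
  topic: string; // tag from the Module Lead's structure
  text: string;
}

// Split on paragraph breaks, then merge short paragraphs up to ~maxChars,
// so each chunk is a coherent unit for embedding and retrieval.
function chunkTranscript(
  lectureId: string,
  topic: string,
  transcript: string,
  maxChars = 800,
): TaggedChunk[] {
  const paragraphs = transcript.split(/\n\s*\n/).map(p => p.trim()).filter(Boolean);
  const chunks: TaggedChunk[] = [];
  let buffer = "";
  for (const p of paragraphs) {
    if (buffer && buffer.length + p.length + 1 > maxChars) {
      chunks.push({ lectureId, topic, text: buffer });
      buffer = p;
    } else {
      buffer = buffer ? `${buffer}\n${p}` : p;
    }
  }
  if (buffer) chunks.push({ lectureId, topic, text: buffer });
  return chunks;
}
```

Each chunk would then be embedded and stored alongside its tags, giving the vector database its programme-specific structure.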

Step 3

Generate Draft Questions

Academics select question type, Bloom's level, difficulty, and topic. The system retrieves from the MBPV vector database and generates a structured draft: question, options, correct answer, explanation, and metadata.
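The retrieval step above can be sketched as a similarity search over tagged chunks. In the real platform this would be a vector query against Supabase; the in-memory version below shows the same logic (cosine similarity, filtered by the selected topic tag), with all names assumed for illustration.

```typescript
// Minimal sketch of retrieval: score stored chunk embeddings against
// the request embedding by cosine similarity, restricted to the chosen
// topic tag, and return the top-k chunks as generation context.
interface StoredChunk {
  topic: string;
  text: string;
  embedding: number[];
}

function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

function retrieveTopK(
  query: number[],
  chunks: StoredChunk[],
  topic: string,
  k = 4,
): StoredChunk[] {
  return chunks
    .filter(c => c.topic === topic)
    .map(c => ({ c, score: cosine(query, c.embedding) }))
    .sort((x, y) => y.score - x.score)
    .slice(0, k)
    .map(x => x.c);
}
```

The retrieved chunks would then be passed to the model together with the selected question type, Bloom's level, and difficulty to produce the structured draft.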

Step 4

Edit & Save

Academics refine wording, adjust distractors, correct answers, or difficulty tags — or discard. Saved questions enter the programme question bank, accessible and editable by the originating academic.
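The saved-question record implied by the workflow can be sketched as a single typed structure: the generated draft (stem, options, answer, explanation) plus the metadata and ownership fields the bank needs. Every field name below is an assumption based on the description, not the production schema.

```typescript
// Sketch of a saved question: the structured draft plus bank metadata.
// Field names and enum values are illustrative assumptions.
interface SavedQuestion {
  id: string;
  authorId: string; // the originating academic retains edit access
  stem: string; // the question text
  options: string[]; // answer options, including distractors
  correctIndex: number; // index into options
  explanation: string;
  bloomLevel: "Remember" | "Understand" | "Apply" | "Analyse" | "Evaluate" | "Create";
  difficulty: "Easy" | "Medium" | "Hard";
  topic: string;
  status: "Draft" | "Approved" | "Needs Revision";
  exam?: "Exam 1" | "Exam 2"; // set later by a Module Lead
}

// A freshly saved draft enters the bank with status "Draft".
function saveDraft(q: Omit<SavedQuestion, "status">): SavedQuestion {
  return { ...q, status: "Draft" };
}
```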

Step 5

Module Lead Governance

Module Leads view all saved questions, filter by contributor, topic, or difficulty, mark each as Approved or Needs Revision, and assign approved questions to Exam 1 or Exam 2.
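The triage view above can be sketched as two small operations: a filter over the bank and a status update. Field names and status values are assumptions carried over from the workflow description, not the real schema.

```typescript
// Sketch of the Module Lead's governance view: filter the bank, then
// record a review decision. Names are illustrative assumptions.
interface BankQuestion {
  id: string;
  authorId: string;
  topic: string;
  difficulty: "Easy" | "Medium" | "Hard";
  status: "Draft" | "Approved" | "Needs Revision";
  exam?: "Exam 1" | "Exam 2";
}

// Narrow the bank by any combination of contributor, topic, difficulty.
function filterBank(
  bank: BankQuestion[],
  by: Partial<Pick<BankQuestion, "authorId" | "topic" | "difficulty">>,
): BankQuestion[] {
  return bank.filter(q =>
    (by.authorId === undefined || q.authorId === by.authorId) &&
    (by.topic === undefined || q.topic === by.topic) &&
    (by.difficulty === undefined || q.difficulty === by.difficulty));
}

// Record a decision, optionally assigning the question to an exam.
function review(
  q: BankQuestion,
  decision: BankQuestion["status"],
  exam?: BankQuestion["exam"],
): BankQuestion {
  return { ...q, status: decision, exam: exam ?? q.exam };
}
```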

Step 6

Exam Assembly & Export

Assigned questions exported as Word or PDF with clean formatting and optional answer key. Eliminates manual copying, formatting inconsistencies, and version confusion.
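The assembly logic above can be sketched in a few lines. The real platform exports Word (.docx) and PDF; this plain-text rendering is an assumption-level illustration of the same steps — numbering the assigned questions, lettering the options, and optionally appending an answer key.

```typescript
// Sketch of exam assembly: render assigned questions as clean exam
// text, with an optional answer key. Plain text stands in for the
// platform's actual Word/PDF export.
interface ExamQuestion {
  stem: string;
  options: string[];
  correctIndex: number;
}

function assembleExam(questions: ExamQuestion[], withAnswerKey = false): string {
  const letters = "ABCDEFGH";
  const body = questions
    .map((q, i) => {
      const opts = q.options.map((o, j) => `  ${letters[j]}. ${o}`).join("\n");
      return `${i + 1}. ${q.stem}\n${opts}`;
    })
    .join("\n\n");
  if (!withAnswerKey) return body;
  const key = questions
    .map((q, i) => `${i + 1}: ${letters[q.correctIndex]}`)
    .join("  ");
  return `${body}\n\nAnswer key: ${key}`;
}
```

Generating the paper and the key from the same records is what removes the copy-paste and version-drift problems the section describes.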

What the Academics Said

“I quite enjoyed using it and I think it generated good questions. The more specific the prompts were, the better the questions turned out. It became much better with specific instructions about which mechanism I wanted the question to be about.”

“It is a very intuitive interface! I tried multiple levels of difficulty and Bloom's level and it does seem to capture well the material taught.”

“There were subtleties to the alternative responses that were great red-herrings — a really appealing set of challenging half-truths and distractions. I wish I had come up with them myself. With a quick edit, they were perfect.”

“Just wanted to say how impressed I was yesterday. It helped design two great questions based on the materials in the lectures. Much better than what we've done before. Congratulations to the team.”

Why This Matters

Generic AI tools can generate questions, but they are not curriculum-aware and they do not accumulate programme knowledge. MBPV Question Assistant changes this: every generation call retrieves from a vector database built from the programme's own teaching materials, tagged by the academics who designed the course. The output reflects what MBPV actually teaches — not what a general model has seen on the internet.

The shift in the unit of work is significant. Traditionally, an academic authoring an MCQ starts from a blank page: drafting, aligning to outcomes, calibrating difficulty, generating plausible distractors. With the MBPV Assistant, that starting point becomes a structured AI draft aligned to the specified parameters. The academic's role shifts from time-consuming authoring to expert validation and refinement — a more appropriate use of their time.

Crucially, this is institutional infrastructure, not an individual workaround. The question bank grows and improves through review cycles. Instead of rebuilding exams each year, teams maintain a living, reusable, reviewable bank that reflects the programme's current teaching content.

Phases & Next Steps

Phase 1 — Complete

Proof of concept within MBPV. Secure platform built, vector database populated, full workflow tested with programme academics. Positive feedback received.

Phase 2 — Pending Funding

Expansion within the Faculty of Medicine. Refining governance, evaluation frameworks, and transferability across modules and teaching styles. Funding bid submitted.

Phase 3 — Future

Institutional infrastructure: a transferable platform for any programme team, Canvas LMS integration, and scalable living question banks across Imperial.

Team & Collaborators

Agnieszka Malisz-Virtanen

Senior Learning Technologist — Digital Education Office

Adrian Cowell

Innovation Lead — Technology & Development

Prof Peter O'Hare

Module 1 Lead — MSc MBPV

Dr Rob White

Module 2 Lead — MSc MBPV