Exam queries answered, insights revealed


A new Cogniti assistant is helping UNE's Exams and eAssessment team manage high-volume student inquiries - and uncovering unexpected insights about student needs in the process. 

The Exams Assistant, developed by Kylie Day (Manager, Exams and eAssessment) and Jeanne Heath (Coordinator, Exams and eAssessment) with technical support from the LabNext70 team, addresses what may sound like a familiar challenge: students calling with routine questions about information that is already easily accessible to them. 

"We get a lot of questions from students who may not have engaged with the messages that we've sent them," Kylie explains. "So they ring us up and say 'How do I do that?' And we don't say 'Read the email we sent you'. We just help them with their question." 

With call volumes climbing during exam periods, the team saw an opportunity to ease this load while maintaining visibility over student concerns. 

Accuracy as the baseline 

The team's measure of success was straightforward: the assistant needed to be as reliable as a trained casual staff member. 

"If we gave the wrong advice to students, it could generate very bad experiences for students. We are held to very high expectations by our colleagues for high stakes exams. UNE's Student Grievance Unit, TEQSA (Tertiary Education Quality and Standards Agency), and the federal Ombudsman can get involved if students' expectations are not met," Kylie notes. "So it's exceptionally important to us that our information is always correct."  

An early prototype that used web search was discarded when it failed to meet those accuracy requirements, and the team opted instead for a retrieval-augmented generation (RAG) approach, strictly limiting the assistant to verified UNE content supplied as source documents. 
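For the technically curious, the pattern is easy to sketch. The toy Python below is illustrative only: it scores documents by simple keyword overlap where a production system like Cogniti would use an embedding model and a vector store, and every name and document in it is hypothetical rather than drawn from the team's actual setup.

    # Minimal sketch of a document-grounded (RAG-style) assistant.
    # Illustrative only: a real deployment would use an embedding model
    # and a vector store; this toy version scores verified documents by
    # keyword overlap instead.
    import math
    from collections import Counter

    # Hypothetical stand-ins for the team's verified source documents.
    VERIFIED_DOCS = {
        "timetables": "Exam timetables are published in MyLearn before each exam period.",
        "online-exams": "Online exams require a webcam check before the exam can begin.",
    }

    def score(query: str, doc: str) -> float:
        """Cosine similarity over word counts - a stand-in for embeddings."""
        q, d = Counter(query.lower().split()), Counter(doc.lower().split())
        dot = sum(q[w] * d[w] for w in q)
        norm = math.sqrt(sum(v * v for v in q.values())) * \
               math.sqrt(sum(v * v for v in d.values()))
        return dot / norm if norm else 0.0

    def retrieve(query: str, k: int = 2, threshold: float = 0.15) -> list[str]:
        """Return only sufficiently relevant verified passages - nothing else."""
        ranked = sorted(VERIFIED_DOCS.values(), key=lambda doc: score(query, doc), reverse=True)
        return [doc for doc in ranked[:k] if score(query, doc) >= threshold]

    def answer(query: str) -> str:
        passages = retrieve(query)
        if not passages:
            # Out of scope: never guess - refer the student to the team.
            return ("I can't answer that from UNE's information. "
                    "Please contact the Exams and eAssessment team.")
        # A real system would hand these passages to the language model
        # with an instruction to answer from them alone.
        return "From verified UNE content: " + " ".join(passages)

    print(answer("When will my exam timetable be published?"))

The key design choice is in the last branch: when nothing relevant is retrieved, the assistant refuses rather than improvises, which is exactly the behaviour described below.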

The knowledge base includes text from existing student communications, web page content provided as PDFs, and staff messaging scripts - what Kylie calls their "magic words" for handling tricky situations. 

The assistant was also instructed to direct students to relevant web pages for self-verification, and to provide contact details when questions fell outside its scope. 

"There's layer upon layer upon layer of exceptions and variations with exams," Kylie says. "So when it couldn't answer a question, we wanted it to tell students to contact us and give our contact details." 

The soft launch reveals unexpected patterns 

Rather than a full rollout, the team embedded a link to the assistant within their existing MyLearn information pages, allowing students to discover it organically during the recent exam period. 

The approach proved educational. Kylie reviewed the questions students were asking and found some surprises. 

"When I looked before the exam period, there were some pretty silly questions in there," she says. "So on the one hand, I'm glad we didn't have to respond to them. But also I don't think a student would feel comfortable asking those silly questions, so the assistant gave them a means to safely ask silly questions, which is nice." 

Maintaining the feedback loop 

The visibility into student questions was deliberate. Kylie's team regularly adjusts communications based on recurring inquiries, and the assistant preserves that insight. 

"That's why it was really important to us to have visibility of what students are asking," she explains. "We didn't want to be arms length from those inquiries. We still wanted to see what gaps do we need to plug in our comms, but also we don't want to have to answer the same questions 300 times a day." 

Expanding for the next exam period 

Following the successful soft launch, the team plans a full rollout for the next exam period with expanded scope. 

"We also plan to include information about other functions we manage," Kylie says. "So not just exams, but results, academic transcripts, special assessment, alternative assessment. All of the things." 

The goal is to provide students with an accessible alternative to scrolling through lengthy FAQs and information pages. 

"They can just ask for what they're after and have it served up to them," Kylie notes. 

Looking ahead 

The assistant arrives as Kylie's team prepares for significant workload increases driven by changes to assessment architecture across the university. 

"Because of the threat of generative AI being used in assessment, the university is making a major change to assessment," she explains. "There'll be a lot more assured assessments where there's identity verification and active control of the student's virtual and personal space." 

The shift means additional workload for the team, and Kylie sees AI as part of the solution: "Our plan is to include the use of generative AI to help us with the volume - repetition work, stuff that can be easily scripted and described." 

The Exams Assistant represents an early step in that direction, demonstrating how AI can handle routine inquiries while preserving the human insight needed to continuously improve student support. 

 