[2024F] James Finch (Postdoc)

TBA

James Finch

Date: 2024-09-27 / 3:00 - 4:00 PM
Location: White Hall 100


Abstract

This presentation explores the automation of clinical interviews using Large Language Models (LLMs) to address time constraints in brain health assessments. We describe our development of a structured yet natural conversational agent that integrates Dialogue State Tracking (DST) with Speech-to-Text (STT) and Text-to-Speech (TTS) systems, enabling dynamic question adjustment, identification of missing diagnostic information, and estimation of diagnostic confidence. Key challenges include zero-shot domain adaptation for DST, inferring new diagnostic slots, and maintaining a low-latency speech interface. We discuss our prototype implementation, which uses deterministic state tracking and scripted responses, along with plans for real-world deployment. This work demonstrates the feasibility of AI-driven clinical assessments, providing a scalable solution that supports clinicians while preserving high-quality patient interactions.
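To make the prototype's "deterministic state tracking with scripted responses" concrete, here is a minimal sketch of how such an interview loop might work: predefined diagnostic slots are filled by keyword matching on the patient's utterance, and the agent asks a scripted question for the first slot still missing. All slot names, keywords, and questions below are illustrative placeholders, not the actual assessment protocol or implementation from the talk.

```python
# Illustrative deterministic dialogue state tracker for a clinical-style
# interview. Slots, keywords, and questions are hypothetical examples.

SLOTS = {
    "sleep_quality": ["sleep", "insomnia", "rest"],
    "memory_issues": ["forget", "memory", "recall"],
    "mood": ["mood", "anxious", "sad", "depressed"],
}

QUESTIONS = {
    "sleep_quality": "How have you been sleeping lately?",
    "memory_issues": "Have you noticed any changes in your memory?",
    "mood": "How would you describe your mood recently?",
}

def update_state(state, utterance):
    """Fill any empty slot whose keywords appear in the patient's utterance."""
    text = utterance.lower()
    for slot, keywords in SLOTS.items():
        if state.get(slot) is None and any(k in text for k in keywords):
            state[slot] = utterance  # store the raw utterance as slot evidence
    return state

def next_question(state):
    """Return a scripted question for the first missing slot, or None when done."""
    for slot in SLOTS:
        if state.get(slot) is None:
            return QUESTIONS[slot]
    return None

state = {slot: None for slot in SLOTS}
state = update_state(state, "I keep forgetting where I put my keys.")
print(next_question(state))  # asks about the first still-missing slot
```

A real system would replace the keyword matcher with an LLM-based tracker capable of zero-shot slot inference, but this fixed-slot loop captures the deterministic prototype behavior the abstract refers to: the interview remains structured (every slot is eventually asked about) while allowing the question order to react to what the patient has already volunteered.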

Link

Presentation