Bijesh Shrestha | Wednesday, May 21 @ 10:00am, Fuller Beckett Conference Room | DS Ph.D. Dissertation Proposal | A Structured Framework for Stakeholder-Driven Design and Evaluation of AI Tools in Analytic Workflows


DATA SCIENCE

Ph.D. Dissertation Proposal

Bijesh Shrestha

Wednesday, May 21, 2025

10:00am - 11:00am

Fuller Beckett Conference Room

Committee:

  • Dr. Lane T. Harrison, Associate Professor, WPI (Advisor)
  • Dr. Nima Kordzadeh, Associate Professor, WPI
  • Dr. Roee Shraga, Assistant Professor, WPI
  • Dr. Nathaniel D. Bastian, Chief Scientist, Army Cyber Institute at West Point (External Member)

Title: A Structured Framework for Stakeholder-Driven Design and Evaluation of AI Tools in Analytic Workflows

Abstract

Analysts in high-stakes domains such as cybersecurity, intelligence, healthcare, and finance rely on data visualization, and increasingly on AI summarization tools, to process complex data, extract critical insights, and generate tailored reports that support decision-making. Despite their widespread use, the effectiveness of these tools depends on how well they align with analytic goals, users' role-based needs, and operational constraints. While advances in artificial intelligence (AI) have driven the adoption of AI-driven tools in visual analytics, existing AI-enabled visualization and summarization tools often fall short in alignment, accuracy, and adaptability to analysts' needs. To address these gaps, this dissertation develops EvalOps, an evaluation-centered design framework that elevates evaluation activities to the beginning of the design process, with the aim of better aligning resulting systems with analysts' needs and making them more trustworthy, adaptable, and usable within operational contexts. The importance of alignment is established in the first research activity, an in-depth qualitative study of the visualization lifecycle within Defensive Cyber Operations (DCO) that demonstrates the need for a wider range of design methodologies in mission-critical contexts. We then explore and develop EvalOps principles through system development, describing three AI-enabled visualization tools tailored to different aspects of intelligence workflows: 1) an interactive summary-tailoring and source-verification tool, 2) a chart-transformation and report-generation tool utilizing large language models, and 3) an AI-enabled visualization tool for speech-to-text triage. Finally, we propose to develop and evaluate a set of EvalOps-centered design activities, building on the success of visualization-focused design-activity frameworks.
Prior design activities have traditionally focused on creative ideation of interface features and capabilities. We propose instead to recast them as evaluation-design activities, and to assess how they help designers elicit meaningful evaluation metrics and analyses and adapt prior successful studies to their own projects. We also describe empirical evaluation studies of our developed systems, including both iterative and milestone-based studies with subject matter experts from the intelligence community, and crowdsourced studies that enable us to rapidly test features and tasks, generating diverse feedback on usability and adaptability. In summary, this dissertation aims to contribute both novel systems and novel methodologies: a structured, domain-agnostic framework (EvalOps) that can enhance user trust and analytic effectiveness, along with practical insights into implementing adaptive AI systems in high-stakes environments.

Department(s): Data Science

Contact Person: Kelsey Briggs