Analytic Rigor and Performance

Analytic Rigor and Performance (ARP) projects focused on improving intelligence analysis and analytic outcomes by helping analysts balance the need for analytic rigor against real-world constraints such as timeliness and resources. The ARP team’s efforts this year focused on (a) measuring, (b) identifying risks to, and (c) proposing interventions to mitigate risks to analytic rigor and performance. This included projects that approached the problem specifically through the lens of analytic rigor (Analytic Rigor) as well as through performance more generally (Analytic Performance). Additionally, members of the ARP team explored the power of applying modern journalism techniques, such as visual storytelling and infographics, to the production of intelligence reports (Modernizing Reporting).

Team Leads: Sue Mi Kim, Christine Brugh

Analytic Rigor

Participants: Tim van Gelder (Univ. of Melbourne), Luke Thornburn (Univ. of Melbourne), Christine Brugh (LAS), Sue Mi Kim (LAS), Michele Kolb (LAS)

In this project we investigated the extent to which the ODNI's Rating Scale, which measures adherence to analytic tradecraft standards as exhibited in analytic products, can be treated as a measure of the analytic rigor in those products. We compared Rating Scale scores for 100 fictional but realistic products with independent assessments of the level of analytic rigor in those products, made by professional intelligence analysts recruited from the US IC. We found a low-to-moderate correlation between Rating Scale scores and rigor assessments. This suggests the Rating Scale is only a weak guide to analytic rigor, and that rigorously measuring analytic rigor will require either a new measure that is demonstrably reliable and valid, or changes to the Rating Scale or the way in which it is applied.
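
To make the comparison concrete, the sketch below shows one way such an analysis could be run, assuming each product has a Rating Scale score and an independent rigor assessment. The synthetic data and the choice of Spearman's rank correlation are illustrative assumptions, not the project's actual method.

```python
# Illustrative sketch: correlating Rating Scale scores with independent
# rigor assessments. The data are synthetic and Spearman's rho is an
# assumed choice; the project's actual statistics may differ.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(seed=0)

n_products = 100                               # fictional-but-realistic products
rigor = rng.uniform(1, 7, n_products)          # independent rigor assessments
# Simulate a Rating Scale score that tracks rigor only weakly.
rating_scale = 0.4 * rigor + rng.normal(0, 1.5, n_products)

rho, p_value = spearmanr(rating_scale, rigor)
print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f})")
```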

Participants: Judith Johnston (Johnston Analytics), Peyton Frye (LAS), Christine Brugh (LAS), Sue Mi Kim (LAS), Jacque Jorns (LAS), Pauline Mevs (LAS)

This project explored questions arising from 2020 LAS research on analytic rigor: do risks to analytic rigor differ based on the type of analysis or analyst, and how can this information inform the identification of risks to analytic rigor and the development of interventions to mitigate them? We created and validated a theoretical model for building rigor profiles to address these questions and tested an exemplar profile through engagement with subject matter experts from the analytic community, academia, and industry. The video gives an overview of the theoretical model and how it was implemented via a use case focused on the rigor profile of a SIGINT analyst participating in an IC-wide task force.
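
As a rough illustration of what a rigor profile might capture, the sketch below encodes a hypothetical profile for the SIGINT task-force use case. The field names, rating scale, and risk factors are invented for illustration and are not the project's actual model.

```python
# Hypothetical rigor profile sketched as a simple data structure.
# Field names, scale, and risk factors are invented for illustration.
from dataclasses import dataclass, field

@dataclass
class RigorProfile:
    analyst_type: str
    work_context: str
    # Risks to analytic rigor, each rated 1 (low) to 5 (high).
    risk_factors: dict[str, int] = field(default_factory=dict)

    def top_risks(self, n: int = 3) -> list[tuple[str, int]]:
        """Return the n highest-rated risks, to prioritize interventions."""
        return sorted(self.risk_factors.items(), key=lambda kv: -kv[1])[:n]

profile = RigorProfile(
    analyst_type="SIGINT analyst",
    work_context="IC-wide task force",
    risk_factors={
        "time pressure": 5,
        "unfamiliar data sources": 4,
        "cross-agency coordination": 3,
    },
)
print(profile.top_risks(2))
```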

Analytic Performance

Participants: Resilient Cognitive Solutions

This year, RCS worked to clarify and capture concepts related to a scientific, quantitative evaluation method that RCS uses to predict the mission performance of systems before they are deployed. This decision-centered testing (DCT) methodology focuses on the joint cognitive system (JCS), analyzing human decision-making performance as supported by tools and systems, and is crucial to preventing brittle systems from reaching the field. The methodology applies cognitive pressure to the JCS to make this evaluation. By evaluating systems rigorously prior to deployment, mission performance can be predicted and appropriate interventions applied before breakdowns occur on mission.
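
To give a feel for the idea, the toy sketch below exercises a simple decision function at escalating difficulty and flags where performance collapses. It is an invented caricature of "cognitive pressure" testing under assumed noise and threshold values, not RCS's actual DCT methodology.

```python
# Toy caricature of "cognitive pressure" testing: exercise a simple
# decision function at escalating difficulty and flag levels where
# performance collapses. Invented illustration only, not RCS's DCT.
import random

def evaluate_under_pressure(decide, levels=range(1, 6), trials=200):
    random.seed(0)
    scores = {}
    for level in levels:
        correct = 0
        for _ in range(trials):
            truth = random.choice([0, 1])
            # Higher pressure -> noisier evidence reaches the decider.
            evidence = truth + random.gauss(0, 0.3 * level)
            correct += int(decide(evidence) == truth)
        scores[level] = correct / trials
    return scores

for level, acc in evaluate_under_pressure(lambda e: int(e > 0.5)).items():
    print(f"pressure {level}: accuracy {acc:.2f}" + ("  <- brittle?" if acc < 0.7 else ""))
```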

Participants: Mark Wilson (NCSU), Michele Kolb (LAS)

The dynamic nature of mission work means that optimizing team performance requires continual effort to maintain team alignment. This project applies Industrial and Organizational Psychology methods in new ways at the team level to enable mission team leaders to find and stay on the critical path for mission success. We provide team leaders with a structured approach to diagnose, prioritize, and address areas where their team may be out of alignment. Team leaders collect data from their team members using a 5-minute diagnostic Rapid Temperature Check prototype. The results are used to prioritize focus areas and select approaches designed to bring their team into alignment in key areas such as performance, mission understanding, customer engagement, and motivation. A second iteration of the Rapid Temperature Check is used to assess whether the approach achieved the desired effect and to determine whether another area may have drifted out of alignment. By using the Rapid Temperature Check iteratively as a team alignment health check, team leaders can continuously optimize team performance in an agile, data-driven way, with the added benefit of building trust within the team.
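
A minimal sketch of how such a diagnostic might be scored is shown below, assuming team members rate each alignment area on a simple numeric scale. The areas come from the description above; the ratings, threshold, and aggregation rule are invented for illustration.

```python
# Illustrative scoring for a Rapid Temperature Check-style diagnostic.
# Team members rate each alignment area from 1 (misaligned) to 5 (aligned);
# the threshold and aggregation rule are invented for illustration.
from statistics import mean

AREAS = ["performance", "mission understanding", "customer engagement", "motivation"]

responses = [  # one dict of ratings per team member
    {"performance": 4, "mission understanding": 2, "customer engagement": 5, "motivation": 3},
    {"performance": 5, "mission understanding": 3, "customer engagement": 4, "motivation": 2},
    {"performance": 4, "mission understanding": 2, "customer engagement": 4, "motivation": 3},
]

def prioritize(responses):
    """Rank areas by average rating, lowest (most misaligned) first."""
    return sorted(
        ((area, mean(r[area] for r in responses)) for area in AREAS),
        key=lambda kv: kv[1],
    )

THRESHOLD = 3.0  # below this, treat the area as out of alignment
for area, avg in prioritize(responses):
    print(f"{area:22s} avg={avg:.1f}" + ("  <- focus area" if avg < THRESHOLD else ""))
```

Re-running the same check after an intervention and comparing the averages would mirror the iterative use described above.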

Modernizing Reporting

Participants: Paul Davis (LAS)

In this pilot study, the research questions were: What are current views about information visualization, infographics, and artificial intelligence in analytic reporting? And can AI be applied to make it easier to include infographics as a way to modernize analytic reporting? The video presents findings from approved, semi-structured interviews with LAS staff that assessed viewpoints about information visualization and the potential for infographics to enhance and modernize analytic reporting. The study also raises awareness of the potential to generate infographics from simple textual sentences by applying natural language processing.
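
As a toy example of the sentence-to-infographic idea, the sketch below pulls label/value pairs out of a simple sentence with a regular expression and renders a text bar chart. A real pipeline would use proper natural language processing; the sentence and extraction pattern here are invented for illustration.

```python
# Toy sketch of turning a simple textual sentence into a chart-like
# infographic. A regular expression stands in for a real NLP pipeline;
# the sentence and extraction pattern are invented for illustration.
import re

sentence = "Exports rose to 42 units in Q1, 58 units in Q2, and 73 units in Q3."

# Capture (value, label) pairs such as "42 units in Q1".
pairs = re.findall(r"(\d+)\s+units\s+in\s+(Q\d)", sentence)

for value, label in pairs:
    bar = "#" * (int(value) // 5)
    print(f"{label}: {bar} {value}")
```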