Advancing Analytic Rigor

The Advancing Analytic Rigor (AAR) projects began the process of establishing LAS as a Center of Excellence for Evaluating Analytic Rigor. Specifically, the team focused on establishing a common operational methodology for evaluating rigor, identifying factors with the greatest impact on analytic performance, and conceptualizing methods that can integrate the evaluation of rigor into the next generation of analytic workflows.

Team Leads: Michele Kolb, Christine Brugh, Judy Johnston

 

This presentation describes the work done by Johnston Analytics with the LAS AAR team in developing an operational definition of analytic rigor via a literature review. It presents an overview of the process, findings, insights, and recommendations for future research.

Participants: Judy Johnston (Johnston Analytics), Christine Brugh (LAS), Michele Kolb (LAS), Mark Bowler (ECU), Sue Mi Kim (LAS), Emily Meier (ECU), Vincent Streiff (LAS), Carmen Vazquez (LAS), Elle Winemiller (LAS), Peyton Frye (NCSU)

In this video, Dr. Mark Wilson and Michele Kolb (with a little help from a hypothetical mission manager named Monica) present the Fundamental 5 Performance Factors and the Significant 7 Work Activity Factors. These findings resulted from analysis of an anonymized dataset of performance and work activity data collected from more than 1,000 analysts at multiple field sites over several years under LAS Project WESTWOLF.
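
The video does not specify the statistical method behind these factors, but a common way to derive a small set of performance factors from many observed measures is exploratory factor analysis. The sketch below is a minimal, hypothetical illustration in Python; the synthetic data, the measure count, and the use of scikit-learn's FactorAnalysis are assumptions for illustration, not the actual WESTWOLF methodology.

```python
# Hypothetical sketch: reducing many analyst performance measures to a
# small set of latent factors, in the spirit of the Fundamental 5.
# The data and column semantics are invented stand-ins.
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_analysts = 1000
n_measures = 12  # e.g., timeliness, sourcing breadth, review outcomes, ...

# Stand-in for the anonymized analyst-by-measure matrix
# (rows = analysts, columns = observed performance measures).
X = rng.normal(size=(n_analysts, n_measures))

# Standardize measures so no single scale dominates the factors.
X_std = StandardScaler().fit_transform(X)

# Extract five latent performance factors.
fa = FactorAnalysis(n_components=5, random_state=0)
scores = fa.fit_transform(X_std)   # per-analyst factor scores
loadings = fa.components_          # factor-to-measure loadings

print(scores.shape)    # (1000, 5)
print(loadings.shape)  # (5, 12)
```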

Participants: Michele Kolb (LAS), Mark Wilson (NCSU)

This video summarizes the RCS effort in support of the Advancing Analytic Rigor (AAR) research program. It provides a short overview of Cognitive Systems Engineering (CSE) and Joint Cognitive Systems (JCS), and of how characteristics of the JCS are reflected directly in an analyst's decision making and the resulting analytic rigor. By detecting latent Cognitive Brittlenesses in the JCS, proven patterns of Advanced Decision Support Interventions can be applied to improve analytic rigor on mission. The results have been adapted into working aids that allow analysts, developers, leadership, and researchers to apply these insights to their efforts: analysts will be better able to self-advocate for effective tools, developers can avoid introducing known brittlenesses into their products, and leadership will be able to focus their programs on delivering improved analytic rigor.
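
As a rough illustration of the pattern-matching idea described above, the sketch below maps detected brittleness patterns to candidate decision support interventions with a simple lookup. Every brittleness and intervention name here is an invented placeholder; RCS's actual catalog of patterns and interventions is not reproduced.

```python
# Hypothetical sketch: map detected cognitive brittlenesses in a joint
# cognitive system to candidate decision-support interventions.
# All names below are invented placeholders, not RCS's catalog.
from dataclasses import dataclass

@dataclass(frozen=True)
class Intervention:
    name: str
    rationale: str

# Assumed lookup from brittleness pattern to proven intervention patterns.
INTERVENTIONS = {
    "keyhole_view": [
        Intervention("broaden_context_display",
                     "Surface adjacent data the narrow view hides."),
    ],
    "anchoring_on_first_hypothesis": [
        Intervention("competing_hypothesis_tracker",
                     "Keep alternative explanations visibly in play."),
    ],
}

def recommend(detected: list[str]) -> list[Intervention]:
    """Return candidate interventions for each detected brittleness."""
    return [iv for b in detected for iv in INTERVENTIONS.get(b, [])]

for iv in recommend(["keyhole_view"]):
    print(iv.name, "-", iv.rationale)
```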

Participants: Alicia McCormick (RCS), William Elm (RCS), Caroline Christ (RCS)

Technology has transformed how people receive and consume information. Yet many Intelligence Community (IC) report production and dissemination methods remain constrained to the longstanding model of long-form text reports. This video provides an overview of LAS's efforts to identify technology-driven trends in journalism and to imagine how they might be applied to evolve intelligence reporting.

Participants: Elle Winemiller (LAS), James Settles (LAS)

This presentation describes the work done by Johnston Analytics with the AAR team on developing a taxonomy of types of intelligence analysis and exploring ways to facilitate the identification and mitigation of lapses in analytic rigor in analysis and production.
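
The taxonomy itself is not reproduced here; the sketch below only illustrates one plausible way to represent a taxonomy of analysis types as a nested structure that can be traversed programmatically. All category names are invented placeholders.

```python
# Hypothetical sketch: a taxonomy of analysis types as a simple tree.
# The category names are placeholders, not Johnston Analytics' taxonomy.
TAXONOMY = {
    "intelligence analysis": {
        "descriptive": ["current reporting", "baseline assessment"],
        "explanatory": ["causal assessment"],
        "estimative": ["forecasting", "warning"],
    },
}

def leaves(node):
    """Yield the leaf analysis types under a taxonomy node."""
    if isinstance(node, dict):
        for child in node.values():
            yield from leaves(child)
    else:
        yield from node

print(sorted(leaves(TAXONOMY)))
```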

Participants: Judy Johnston (Johnston Analytics)

This study was an initial investigation into the behavioral indicators of analytic rigor as demonstrated by a typical intelligence analyst. Analytic rigor, as defined by Johnston (2020), is “an effort by an analyst or researcher to be as complete as possible in order to arrive at the most accurate assessment/results possible in conducting an analysis with integrity” and includes six primary components: objectivity; thoroughness; replicability, reliability, and validity; transparency; credibility; and relevance. Based on this conceptualization, a series of structured interviews was conducted with former intelligence analysts, and a behavioral expectation scale was developed for each of the six components. Overall, participants noted substantial overlap among the six components of analytic rigor; however, a general set of behavioral expectations was identified for each component at three levels of expectation: below, meets, and exceeds. The general relevance and expected frequency of each component were also assessed; here the results were mixed, with participants disagreeing about the relevance and frequency of several of the components of rigor.
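
As a concrete illustration of what a behavioral expectation scale can look like in structured form, the sketch below encodes one component with anchors at the three levels described above. The anchor text and component wording are invented for illustration and are not drawn from the ECU interviews.

```python
# Hypothetical sketch of a behavioral expectation scale: each rigor
# component gets behavioral anchors at three levels. Anchor text is
# invented for illustration only.
from enum import Enum

class Level(Enum):
    BELOW = 1
    MEETS = 2
    EXCEEDS = 3

SCALE = {
    "thoroughness": {
        Level.BELOW: "Relies on a single source without corroboration.",
        Level.MEETS: "Checks multiple independent sources before assessing.",
        Level.EXCEEDS: "Systematically documents gaps and conflicting evidence.",
    },
    # ... anchors for the other five components would follow the same shape.
}

def rate(component: str, observed_level: Level) -> str:
    """Return the behavioral anchor matching an observed level."""
    return SCALE[component][observed_level]

print(rate("thoroughness", Level.MEETS))
```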

Participants: Mark Bowler (East Carolina University), Emily Meier (East Carolina University)
