Four Tiers of Verification

TLDR

Every factual claim in the PAPER TRAIL documentary series is assigned one of four evidence tiers: T1 (government primary source), T2 (corpus-derived), T3 (journalism), or T4 (estimation). Verification counts are published per episode, with EP08 achieving the highest T1 ratio (72%) and EP07 achieving zero unverified claims across 118 total (PAPER TRAIL Project, 2026a; 2026b). This system makes evidentiary strength visible to the audience rather than hidden behind editorial judgment.

The Problem the Tiers Solve

Most investigative reporting relies on an implicit trust model. The journalist or filmmaker makes source quality decisions behind the scenes, and the audience receives polished claims without knowing whether they rest on a government filing, a database query, a newspaper article, or a statistical estimate. This works when trust is high. It fails when the subject matter is as politically charged as the Epstein case, where every claim attracts both defenders and attackers, and where the distinction between documented fact and informed speculation can determine whether an investigation is taken seriously.

The four-tier verification system makes these distinctions explicit. Every claim carries a visible grade, and every episode publishes its tier distribution (PAPER TRAIL Project, 2026c).

The Four Tiers

T1: Government primary source. These are claims sourced directly to official government records -- FAA registry entries, court filings, state financial regulator consent orders, congressional testimony, corporate registration documents. T1 claims carry the highest evidentiary weight because their source is a government body acting in an official capacity. They are independently verifiable by anyone with access to public records.

T2: Corpus-derived. These are claims extracted from the 2.1 million document corpus through the analytical pipeline -- wire transfer parsing, entity extraction, FedEx shipment analysis, co-occurrence networks, temporal change-point detection. T2 claims are verifiable by anyone with access to the same documents and the same analytical tools. Their reliability depends on the accuracy of the pipeline, which is itself documented and calibrated (PAPER TRAIL Project, 2026d).

T3: Journalism. These are claims sourced from vetted media outlets. The verification system uses a 26-domain whitelist for journalism sources, queried through DuckDuckGo (PAPER TRAIL Project, 2026e). T3 claims are useful for context and corroboration but carry less weight than T1 or T2 because they rely on the journalist's own source evaluation.

T4: Estimation/calculation. These are derived quantities -- species richness estimates (statistical projections of how many entities remain undetected), temporal change-point counts, community detection numbers, document prioritization scores. T4 claims are mathematical outputs from documented algorithms applied to documented data. They are reproducible but not directly observed facts.
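The four tiers above can be captured as a small data model. This is an illustrative sketch only, not the project's actual schema; the `Tier` and `Claim` names are hypothetical:

```python
from dataclasses import dataclass
from enum import Enum

class Tier(Enum):
    T1 = "government primary source"
    T2 = "corpus-derived"
    T3 = "journalism"
    T4 = "estimation/calculation"

@dataclass
class Claim:
    text: str       # the factual claim as it appears in the episode
    tier: Tier      # evidentiary grade
    source: str     # e.g. a registry name, a corpus document ID, an outlet domain

# A hypothetical T1 claim record:
claim = Claim(
    text="Registry entry lists the aircraft under a holding company",
    tier=Tier.T1,
    source="FAA aircraft registry",
)
```

Attaching the tier to each individual claim, rather than to the episode as a whole, is what makes per-episode tier distributions computable and publishable later.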

How the Episodes Stack Up

The verification breakdown varies significantly across episodes, reflecting the different source materials each one draws from.

EP08 "Eight Aircraft" achieved the highest T1 ratio: 99 of 138 claims (72%) sourced to government primary records (PAPER TRAIL Project, 2026a). This makes sense -- FAA registry data, corporate filings, and court records are all T1 sources, and an episode about aircraft registration is built almost entirely on public registries.

EP06 "2,894 Packages" had the highest total claim count at 143, with 124 T2 (corpus-derived) claims reflecting the FedEx shipment database. Only one claim in the entire episode could not be verified: the RADIO FENCE shipment costing $1,710 for a 2-pound package (PAPER TRAIL Project, 2026f).

EP07 "The Wrong Robert" -- the episode dedicated to refuting the project's own OBS-1 false identification -- achieved zero unverified claims across 118 total (55 T1, 44 T2, 12 T3, 7 T4) (PAPER TRAIL Project, 2026b). An episode about getting something wrong was more thoroughly sourced than most episodes about getting things right.

EP05 "The SAR" demonstrated that a single source document can carry an entire episode. Built entirely from one 29-page TD Bank Suspicious Activity Report -- the form banks file when transactions look suspicious (document EFTA01656524.pdf) -- every claim in the episode is T2, derived directly from that corpus document (PAPER TRAIL Project, 2026g).

Automated Verification

The tier system is not just an editorial framework. The claim verification tool automates checks across all four tiers (PAPER TRAIL Project, 2026e). For T1, it queries government registries through OpenCorporates, the FAA aircraft registration lookup, and CourtListener. For T2, it searches the corpus database. For T3, it runs DuckDuckGo queries against the 26-domain journalism whitelist. For T4, it flags estimation language in the claim itself.
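The routing described above can be sketched roughly as follows. The function and backend names are hypothetical stand-ins; only the four backends and the estimation-language flagging come from the tool's documented behavior:

```python
# Illustrative routing sketch -- not the actual 26_verify_claim.py implementation.
ESTIMATION_MARKERS = ("estimated", "approximately", "projected", "roughly")

def classify_route(claim_text: str, declared_tier: str) -> str:
    """Decide which verification backend a claim should be sent to."""
    # Estimation language forces T4 handling regardless of the declared tier.
    if declared_tier == "T4" or any(m in claim_text.lower() for m in ESTIMATION_MARKERS):
        return "recompute_from_algorithm"        # T4: re-run the documented calculation
    return {
        "T1": "query_government_registries",     # OpenCorporates, FAA lookup, CourtListener
        "T2": "search_corpus_database",          # the 2.1 million document corpus
        "T3": "search_journalism_whitelist",     # DuckDuckGo over the 26-domain whitelist
    }[declared_tier]
```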

The tool supports single claims, batch processing from files, and interactive mode. Every verification produces a structured output with verdict categories: CONFIRMED, PARTIALLY_CONFIRMED, CONTRADICTED, UNVERIFIABLE, or CALCULATED. These verdicts are exported as JSON and CSV to the verification archive.

Why Publish the Numbers

Publishing verification counts per episode serves two purposes. First, it holds the series to its own standard -- if an episode has a high proportion of T3 or T4 claims, the audience can calibrate their confidence accordingly. Second, it demonstrates that rigorous source attribution is possible at scale. The 14 episodes in the PAPER TRAIL series collectively contain over 1,000 individually sourced claims (PAPER TRAIL Project, 2026c).
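One consequence of publishing the raw counts is that readers can audit them. A quick check of the figures quoted earlier (EP07's tier breakdown, EP08's T1 ratio) confirms they are internally consistent:

```python
# Counts as published in the episode references (2026a; 2026b).
ep07 = {"T1": 55, "T2": 44, "T3": 12, "T4": 7}
assert sum(ep07.values()) == 118                  # EP07: 118 claims total

ep08_t1, ep08_total = 99, 138
assert round(100 * ep08_t1 / ep08_total) == 72    # EP08: 99/138 rounds to 72% T1
```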

Other investigative projects can adopt the same framework. The tier definitions are simple, the automated tools are documented, and the principle is straightforward: tell the audience not just what you found, but how confident they should be in each specific claim.

References

PAPER TRAIL Project. (2026a). EP08 references: Highest T1 ratio (99/138) [Episode documentation]. ep08_slides/references.md

PAPER TRAIL Project. (2026b). EP07 references: Zero unverified (118 claims) [Episode documentation]. ep07_slides/references.md

PAPER TRAIL Project. (2026c). PAPER TRAIL 14-episode documentary series [Series documentation]. communications/

PAPER TRAIL Project. (2026d). Calibration methodology [Research document]. CALIBRATION.md

PAPER TRAIL Project. (2026e). Claim verification CLI: T1-T4 automated verification [Script 26]. 26_verify_claim.py

PAPER TRAIL Project. (2026f). EP06 references: Highest claim count (143, 1 unverified) [Episode documentation]. ep06_slides/references.md

PAPER TRAIL Project. (2026g). EP05 references: Single-source episode (TD Bank SAR) [Episode documentation]. ep05_slides/references.md