Evidence systems lab

Research data pipeline

A controlled interface for the technical layer behind the site: acquisition, normalization, source triage, anomaly review, and publication into maps, dashboards, record panels, and reproducible update commands.

Acquire

Public sources enter the pipeline

Government data, archive tables, public datasets, CSV files, JSON feeds, and source pages are converted into local research assets.
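As a minimal sketch of the acquisition step, the snippet below converts a fetched CSV payload into a local research asset: parsed rows plus provenance metadata keyed by a content hash. The function name, metadata fields, and file layout are illustrative assumptions, not the site's actual implementation.

```python
import csv
import hashlib
import io
import json
from pathlib import Path

def ingest_csv(raw_text: str, source_name: str, out_dir: Path) -> Path:
    """Turn a downloaded CSV payload into a local research asset.

    The asset bundles the parsed rows with provenance metadata
    (source name, content hash) so later stages can trace every
    record back to the exact payload it came from.
    """
    rows = list(csv.DictReader(io.StringIO(raw_text)))
    digest = hashlib.sha256(raw_text.encode("utf-8")).hexdigest()[:12]
    asset = {"source": source_name, "sha256_12": digest, "rows": rows}
    out_path = out_dir / f"{source_name}-{digest}.json"
    out_path.write_text(json.dumps(asset, indent=2))
    return out_path
```

JSON feeds would follow the same shape, skipping the CSV parse; the content hash in the filename keeps repeated fetches of identical payloads from multiplying on disk.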

Normalize

Names, dates, coordinates, and categories

Records are aligned across place names, time intervals, source labels, layer types, and confidence levels before they enter a map.
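The alignment described above can be sketched as a per-record normalizer. The alias table, confidence scale, and field names here are hypothetical placeholders standing in for whatever gazetteer and vocabulary the real pipeline uses.

```python
from datetime import date

# Illustrative alias table; a real pipeline would back this with a gazetteer.
PLACE_ALIASES = {"Kiev": "Kyiv", "Burma": "Myanmar"}

# Map free-text confidence labels onto an ordered scale.
CONFIDENCE_LEVELS = {"low": 1, "medium": 2, "high": 3}

def normalize_record(rec: dict) -> dict:
    """Align one raw record before it can enter a map layer."""
    out = dict(rec)
    place = rec["place"].strip()
    out["place"] = PLACE_ALIASES.get(place, place)
    # Reject anything that is not a valid ISO date.
    out["date"] = date.fromisoformat(rec["date"]).isoformat()
    out["confidence"] = CONFIDENCE_LEVELS[rec["confidence"].lower()]
    return out
```

Normalizing before mapping means every downstream layer can assume one spelling per place, one date format, and one confidence scale.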

Review

Human review remains decisive

AI-assisted triage can flag anomalies or extraction candidates, but source status, historical claims, and publication quality remain under human review control.
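One way to express that division of labor in code: automated checks only route records into a pending queue, and nothing reaches the approved set without a named reviewer. The queue class, flag names, and thresholds below are assumptions for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class ReviewQueue:
    pending: list = field(default_factory=list)
    approved: list = field(default_factory=list)

    def triage(self, rec: dict) -> None:
        """Automated flags route a record; they never publish it."""
        flags = []
        if not -90 <= rec.get("lat", 0) <= 90:
            flags.append("bad-latitude")
        if rec.get("confidence", 0) < 2:
            flags.append("low-confidence")
        rec["flags"] = flags
        self.pending.append(rec)

    def approve(self, idx: int, reviewer: str) -> None:
        """Publication requires an explicit human decision."""
        rec = self.pending.pop(idx)
        rec["reviewed_by"] = reviewer
        self.approved.append(rec)
```

Keeping `approve` as the only path out of `pending` makes the human gate structural rather than a convention.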

Publish

Interfaces make the system visible

The final output is a set of maps, dashboards, record panels, review consoles, and reproducible update commands.
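Since the tags below mention JSON static site payloads, the publish step might look like the sketch here: approved records serialized as a GeoJSON-style layer a static site can load. The function and file naming are assumptions, not the actual publisher.

```python
import json
from pathlib import Path

def publish_layer(records: list, layer_name: str, out_dir: Path) -> Path:
    """Write approved records as a static GeoJSON payload for the site."""
    features = [
        {
            "type": "Feature",
            # GeoJSON orders coordinates as [longitude, latitude].
            "geometry": {"type": "Point", "coordinates": [r["lon"], r["lat"]]},
            "properties": {k: v for k, v in r.items() if k not in ("lat", "lon")},
        }
        for r in records
    ]
    payload = {"type": "FeatureCollection", "name": layer_name, "features": features}
    path = out_dir / f"{layer_name}.geojson"
    path.write_text(json.dumps(payload))
    return path
```

Static payloads keep the published interfaces reproducible: the same approved records always yield byte-identical layer files.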

401,929 conflict rows mapped

Geocoded conflict-system records condensed into map-ready research layers.

152.7M people in displacement context

UNHCR public statistics rendered as origin, asylum, and IDP layers.

2,689 disaster signals

USGS, NASA EONET, and GDACS records normalized into a risk atlas.

rebuild: data rebuild command

A reproducible local pipeline refreshes conflict, displacement, and disaster datasets.
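A reproducible rebuild can be sketched as an orchestrator that runs each dataset refresh in a fixed order and records every outcome in a manifest, so a rerun is auditable. The step names and manifest shape are hypothetical; only the conflict/displacement/disaster grouping comes from the text above.

```python
import json
from pathlib import Path

def rebuild(steps: dict, manifest_path: Path) -> dict:
    """Run refresh steps in order and persist a manifest of outcomes.

    `steps` maps a dataset name to a zero-argument callable that
    refreshes it. A failed step is recorded, not silently skipped,
    so the manifest documents exactly what the rebuild produced.
    """
    manifest = {}
    for name, step in steps.items():
        try:
            manifest[name] = {"status": "ok", "detail": step()}
        except Exception as exc:
            manifest[name] = {"status": "failed", "detail": str(exc)}
    manifest_path.write_text(json.dumps(manifest, indent=2))
    return manifest
```

A single entry point like this is what makes the update command reproducible: one invocation, one ordered set of refreshes, one manifest describing the result.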

GIS: map-ready layers
CSV: structured public data
JSON: static site payloads
Review: human claim control