4 Billion Dead

The Content and Communication Machine
Internal Briefing — Core Team
EngineHouse / ASIP Pipeline
Version 0.1

What This System Is For

  • We are building a content and communication machine with a defined purpose
  • That purpose: translate climate and extinction-risk evidence into messages that reach people outside the existing Overton window
  • The machine handles ingestion, processing, routing, generation, and delivery
  • It does not rely on the assumption that showing people science changes minds
  • It starts with consequence, uses science as an explanatory layer, and routes rebuttal only where the audience signals it is needed

Why Science-First Content Fails

  • Most climate communication starts with the finding, not the consequence
  • Audiences share things that feel personally relevant — not papers, not datasets
  • Fact-dumping routes to cognitive shutdown, not to action
  • Rebuttal without audience context is friction, not persuasion
  • Opening with counter-argument tells the audience we expect them to be wrong
  • We are designing for consequence-first, science-as-layer, rebuttal-as-tool

Core Principle

Start with what the audience already cares about.
Science explains it.
Evidence backs it.
The system routes what comes next.

Not all public-facing content is rebuttal-led. Not all audiences need the same depth. The machine adapts.

The Machine

One document in.
Multiple consequence-anchored outputs out.
Routed by audience, platform, and concern type.

Section 2

The Communication Engine

How evidence enters the machine, how consequence is extracted, and how outputs are routed.

ASIP: The Evidence Source

  • ASIP ingests structured climate and extinction-risk evidence
  • Sources: academic papers, reports, datasets, curated articles, direct paste
  • Content enters as structured documents — with metadata, extracted claims, and citations
  • The knowledge base grows incrementally — every ingest extends reach
  • This is the raw material. Nothing flows without it.
[ Ingest sources → ASIP knowledge base → available for all downstream outputs ]

From Evidence to Human Consequence

  • Raw science is not the output — consequence is
  • Every document is processed through a consequence lens before content generation
  • The translation asks: what does this mean for a person?
  • Who loses what, when, and why — and which communities are affected first?
  • That translation drives all content generation downstream
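
As a sketch only, the consequence lens could emit a record like the one below for each ingested document. The field names (`anchor`, `who_loses`, and so on) are illustrative assumptions, not the pipeline's actual schema:

```python
from dataclasses import dataclass, field

# Hypothetical record emitted by the consequence lens. Field names are
# illustrative assumptions, not the pipeline's actual data model.
@dataclass
class ConsequenceRecord:
    source_ref: str                 # ingested document this was extracted from
    anchor: str                     # "mortality" or "affordability" (primary anchors)
    who_loses: str                  # the person or community affected
    what_is_lost: str               # the concrete loss
    when: str                       # timescale of the loss
    why: str                        # causal chain back to the evidence
    communities_first: list = field(default_factory=list)

record = ConsequenceRecord(
    source_ref="asip:doc-001",
    anchor="affordability",
    who_loses="coastal homeowners",
    what_is_lost="affordable home insurance",
    when="this decade",
    why="insurers withdrawing from flood-exposed regions",
    communities_first=["floodplain towns"],
)
assert record.anchor in {"mortality", "affordability"}
```

Whatever the real shape turns out to be, the point is that content generation reads from this record, not from the raw paper.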

Primary Consequence Anchors

Mortality

The risk of death — preventable, scalable, traceable to specific decisions and inactions. Heat deaths. Flood deaths. Food system collapse. Migration and conflict.

Affordability

The economic destruction of ordinary life — energy, food, housing, insurance. The squeeze that hits before the catastrophe. Already visible to most audiences.

Secondary consequences exist — biodiversity, infrastructure, geopolitical — but they do not replace these two primary anchors. They support them.

Social Hooks: Scene-Setting, Not Rebuttal

  • Public social content opens with a scene the audience already recognises
  • Heat in a city. Food price increases. Insurance withdrawal. Crop failure. Power grid failure.
  • The hook is the lived experience — not the statistic
  • Science arrives as the explanation of what the reader already suspects or feels
  • This structure is not soft. It is strategically correct.
  • Rebuttal is not the opening move

The Website as Routing and Capture Hub

  • The website is not a landing page — it is a routing machine
  • Traffic arrives from social content already matched to a specific article or pathway
  • The article delivers depth, nuance, and evidence
  • The website captures audience signals: reading depth, responses, clicks
  • Those signals feed back into routing decisions and content targeting
[ Social post → matched article → website pathway → signal capture → routing update ]

Questionnaire Logic

  • The questionnaire exists on both social platforms and on the website
  • It is not a survey — it is a routing mechanism
  • Short form on social. Deeper form on the website.
  • It infers: concern type, audience lane, misinformation exposure, rebuttal depth needed
  • Responses route to: specific articles, targeted follow-up, or rebuttal depth ladder

Adaptive Explanation / Rebuttal Depth

  • Not every audience member requires the same explanation
  • Some are already convinced — they need action pathways, not argument
  • Some are genuinely uncertain — they need evidence, not combat
  • Some hold active misinformation — they need graduated counter-evidence
  • The machine adapts. Routing is the mechanism. The questionnaire is the signal.

One Source. Many Outputs.

[ Source document → claims + consequence → social post | WP article draft | presentation deck | newsletter ]
  • Output type is determined by content type, audience routing, and platform context
  • Every output is matched to a source — nothing floats free
  • The machine does not invent. It translates, structures, and routes.

Section 3

Public-Facing Material Structure

How content reaches the public. What it contains. Where it goes.

Social Content Starts with Consequence

  • No social post begins with a scientific finding
  • No social post opens with a rebuttal or a counter-argument
  • Every social post opens with: a scene, a consequence, a human reality
  • The post then explains or questions — it does not lecture
  • Rebuttal, if it appears in social content at all, is lightweight and hooks to a website pathway
  • Consequence is the entry point. Science is the layer. Rebuttal is the ladder.

Social Content Can Carry Questionnaire Elements

  • A social post can open a question the audience already suspects the answer to
  • Poll-style prompts and reaction hooks route people to the website or to a follow-up thread
  • These are not engagement bait — they are audience signal collection
  • They allow the pipeline to infer concern type before the user arrives on site
  • The questionnaire can begin in the feed. It deepens on the website.

Every Public-Facing Item Is Matched to a Pathway

  • A social post without a destination is a dead end
  • Every generated item must be matched to: a source article, a website pathway, or a next-step action
  • This matching happens at ingestion and generation time — not as an afterthought
  • EngineHouse enforces this structurally — no orphan content
[ Social post ] → [ matched article on 4billiondead.org ] → [ pathway routing ] → [ signal capture ]
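
The "no orphan content" rule can be sketched as a minimal gate at generation time. The field names here are assumptions, not EngineHouse's actual data model:

```python
# Minimal sketch of the "no orphan content" rule: every generated item must
# carry at least one destination before it can leave the pipeline.
# Field names are assumptions, not EngineHouse's actual data model.
def has_pathway(item: dict) -> bool:
    destinations = ("source_article", "website_pathway", "next_step_action")
    return any(item.get(key) for key in destinations)

post = {"body": "Heat in a city...", "source_article": "4billiondead.org/heat"}
orphan = {"body": "A post with nowhere to go"}

assert has_pathway(post)
assert not has_pathway(orphan)  # would be rejected at generation time
```
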

Website Handles Depth, Routing, and Rebuttal

  • The social layer is consequence and hook
  • The website is explanation, routing, and rebuttal
  • Skeptical Science methodology applies where formal rebuttal is required
  • Rebuttal pages live on the website, not in the social feed
  • The website captures and routes based on what the visitor does, not only what they say
  • Depth is available. It is not the default entry point.

Section 4

Rebuttal Logic

What rebuttal is, what it is not, and how the machine decides when to use it.

Rebuttal Is Not the Headline

  • Leading with rebuttal tells the audience we expect them to be wrong
  • It triggers defensiveness before any communication has taken place
  • Rebuttal is a tool — not a channel, not an opening, not a brand position
  • The channel is consequence. Rebuttal is used only when the audience signals readiness.
  • Most people who need to hear this do not currently identify as needing to be corrected

Rebuttal Is a Variable-Response Layer

  • Not every audience member needs rebuttal
  • Some are already convinced — they need action pathways, not argument
  • Some are genuinely uncertain — they need evidence, not combat
  • Some hold active misinformation — they need structured, graduated counter-evidence
  • Some are outside the Overton window — they need consequence first, then evidence, then rebuttal last
  • The system must route correctly before it delivers rebuttal material

The R0–R3 Rebuttal Depth Model

R0: No rebuttal. Consequence and scene only. The audience is aligned or does not yet need counter-argument. The majority of social output is R0.
R1: Soft explanation. Science layer delivered. Framing is explanatory, not adversarial. Used when the audience signals genuine uncertainty.
R2: Structured counter-evidence. A specific claim is addressed directly. Source cited. Framing is factual, not combative. Used on the website and in longer-form content.
R3: Full rebuttal chain. Misinformation identified by name. Evidence ladder deployed. Skeptical Science depth. Used in targeted rebuttal pages and deep-routing pathways.

Questionnaire Infers Rebuttal Depth

  • Concern type — what the audience is actually worried about
  • Audience lane — prior knowledge, alignment, stated or inferred hostility
  • Misinformation exposure — known false claims in circulation in that lane
  • Rebuttal depth — R0 through R3, determined from signals above
  • Next path — specific article, follow-up content, rebuttal page, or action pathway
[ Questionnaire signals ] → [ lane classification ] → [ R-depth assignment ] → [ content routing ]
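
One way the signal-to-depth step could work is a simple priority ladder. The thresholds and signal names below are assumptions; only the R0–R3 meanings follow the depth model above:

```python
# Illustrative sketch of mapping questionnaire signals to an R-depth.
# Signal names and ordering are assumptions; R0-R3 follow the depth model above.
def assign_rebuttal_depth(signals: dict) -> str:
    if signals.get("misinformation_exposure"):   # active false claims in circulation
        return "R3"                              # full rebuttal chain
    if signals.get("specific_claim_doubted"):    # one concrete claim in question
        return "R2"                              # structured counter-evidence
    if signals.get("uncertain"):                 # genuine uncertainty, no hostility
        return "R1"                              # soft explanation, science layer
    return "R0"                                  # aligned: consequence and scene only

assert assign_rebuttal_depth({"uncertain": True}) == "R1"
assert assign_rebuttal_depth({}) == "R0"
assert assign_rebuttal_depth({"misinformation_exposure": True}) == "R3"
```

The ladder encodes the principle above: the strongest signal wins, and absence of signal defaults to consequence-only content, never to rebuttal.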

Section 5

The Presentation Platform

Why presentations are a defined output. What the platform is. Where it is going.

Why Presentations Are a Core Output

  • Evidence-heavy content needs to be presentable across multiple contexts: briefings, team sessions, public events, media, stakeholder engagement
  • Presentations are not a bolt-on — they are a defined output channel of the pipeline
  • A machine that generates social posts but cannot produce a deck is not a full communication engine
  • This deck is the first output of that layer

Reveal.js as the Presentation Layer

  • Reveal.js renders web-native slide decks — full HTML/CSS/JS control
  • No vendor lock-in. No proprietary format. No export restrictions.
  • Served directly from EngineHouse infrastructure
  • Can embed diagrams, data tables, video, code blocks, live content
  • Deck source is human-editable HTML — any editor, any time
  • The team owns the format and the infrastructure

The Pipeline Needs an LLM Presentation Mapper

  • Manual deck construction does not scale to pipeline volume
  • The next build step: a mapper that takes structured source content and produces a slide draft
  • Input: ingested document + extracted claims + consequence summary + metadata
  • Process: LLM maps content to slide schema using defined templates
  • Output: JSON slide structure → rendered Reveal deck
  • Human reviews and edits the draft before it is presented or published

Source Content → Slide Structure → Reveal Render

[ Source document → mapper LLM → slide schema JSON → template selection → rendered deck → human review ]
[ Full pipeline diagram — ingest → claims → mapper → schema → template → render → publish ]

Template Library

  • A fixed set of slide templates covers the majority of output needs
  • Templates: title, section divider, headline + bullets, two-column, pull quote, diagram placeholder, data table, call to action
  • Templates are parameterised — the mapper fills them; it does not design
  • New templates are added by the team, not generated dynamically
  • Constraint is intentional — it produces consistent, editable output

Visual Behaviour Presets

Themes

Dark authoritative — for team and stakeholder contexts.
Light editorial — for media and public presentations.
High-contrast public — for large-screen and accessibility.

Motion and Transitions

Minimal by default. Content is the signal — not animation.
Cut or fade only at this stage.
No auto-play. No decorative transitions.

Typography: serif for authority (headings), sans for clarity (body). Both defined as CSS variables — swap globally by editing four lines.

Human-Editable Preview and Refinement

  • Every generated deck surfaces in the EngineHouse presentation page
  • The deck can be reviewed in-browser before sharing
  • Edits are made directly in the HTML or through a future GUI editor
  • No deck is published or shared without human sign-off
  • The machine proposes. The team decides.

Section 6

Phased Build Plan

Six phases. Each phase delivers something usable before the next begins.

Phase 1 — Define and Prove

Objective

Define the first deck workflow manually. Establish Reveal.js serving infrastructure. Confirm the template approach.

Deliverable

This deck, live on EngineHouse, reviewed by the core team. The base HTML structure locked. Presentation server confirmed working.

Status

In progress. Deck v0.1 is this file.

Phase 2 — Structured Presentation Mapper

Objective

Build an LLM-based mapper from source content to slide JSON. Define canonical deck → slide → element schema.

Deliverable

Auto-generated deck draft from a real ingested document. Mapper connected to the ASIP worker pipeline. JSON → rendered HTML confirmed.

Constraint

Mapper must not hallucinate citations or claims. Output is only what the source document contains.

Phase 3 — Template Library and Visual Styles

Objective

Build 8–10 core slide templates. Implement three visual themes. Test mapper output against all template types.

Deliverable

Template library in use. Three themes switchable by parameter. Consistent visual output across all deck types.

Phase 4 — GUI for Generation and Editing

Objective

Add presentation generation trigger to the EngineHouse dashboard. Add in-browser slide preview and basic edit capability.

Deliverable

Non-technical team members can trigger a deck generation, review it, and push minor edits without touching the HTML directly.

Phase 5 — Integrate with Website and Pipeline

Objective

Decks link to source articles and content estate items. Presentation output routed automatically from ingested documents based on content type and flags.

Deliverable

End-to-end flow confirmed: ingest a document → pipeline generates → deck available in EngineHouse → linked to live article on 4billiondead.org.

Phase 6 — Expand Functionality

Objective

Additional output formats. Interactive slide elements. Audience-adaptive deck variants. Publishing and sharing workflows.

Scope items

PDF export. Video/recording export. Live data embeds. Embedded questionnaire slides. Multi-audience deck variants from one source.

Deliverable

Full presentation publishing system. Decks as a first-class EngineHouse output on par with social posts and article drafts.

Section 7

What We Build Next

Concrete build items in sequence. Each one is a dependency for the next.

Canonical Content / Presentation Schema

  • Define the JSON structure for an EngineHouse-compatible deck
  • Fields: deck_id, title, theme, version, sections[], slides[], slide_type, headline, bullets[], diagram_ref, speaker_notes, source_ref
  • This schema is the contract between the mapper and the renderer
  • Must be versioned and documented before any mapper is built
[ Schema definition v1 ] — to be committed to the EngineHouse repository
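
Before v1 is committed, one possible shape for a deck using the fields listed above looks like this. The values and nesting are illustrative only:

```python
import json

# One possible shape for the canonical deck schema, using the fields listed
# above. Values and nesting are illustrative; schema v1 is still to be defined.
deck = {
    "deck_id": "deck-0001",
    "title": "The Content and Communication Machine",
    "theme": "dark-authoritative",
    "version": "0.1",
    "sections": [
        {
            "slides": [
                {
                    "slide_type": "headline_bullets",
                    "headline": "What This System Is For",
                    "bullets": ["Translate evidence into consequence-first messages"],
                    "diagram_ref": None,
                    "speaker_notes": "Open with purpose, not mechanism.",
                    "source_ref": "asip:doc-001",
                }
            ]
        }
    ],
}

required = {"deck_id", "title", "theme", "version", "sections"}
assert required <= deck.keys()
assert json.loads(json.dumps(deck)) == deck  # round-trips cleanly as JSON
```

The `source_ref` on every slide is what makes the schema a contract: the renderer can refuse any slide that does not trace back to an ingested document.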

First Reveal Deck Template

Lock down: base HTML structure — section wrapper, slide hierarchy, Reveal.js initialisation config
Define: CSS custom properties for all themed values — colours, typography, spacing, component styles
Establish: section / slide hierarchy consistent with the canonical schema
Ship: this deck is version 0.1 of that template — used as the reference

The Presentation Mapper

  • LLM prompt that accepts: document text + extracted claims + consequence summary
  • Outputs: structured slide JSON matching the canonical schema
  • Mapper selects slide types from the template library — it does not invent layout
  • Hard constraint: mapper outputs only what the source contains — no hallucinated claims
  • First test target: the climate-heat-poverty document already ingested in ASIP
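
The hard constraint can be enforced with a grounding check on mapper output. Substring matching, shown here, is a deliberately crude stand-in for whatever claim-grounding method the real pipeline adopts:

```python
# Sketch of the hard constraint: every claim the mapper emits must be traceable
# to the source text. Substring matching is a crude illustrative stand-in for
# the real grounding check.
def claims_grounded(slide_bullets: list, source_text: str) -> bool:
    return all(bullet in source_text for bullet in slide_bullets)

source = ("Heat deaths rise with each degree. "
          "Insurance is withdrawing from floodplains.")
ok = ["Heat deaths rise with each degree."]
bad = ["Sea levels will rise 10 metres by 2030."]  # not in the source: rejected

assert claims_grounded(ok, source)
assert not claims_grounded(bad, source)
```

A failed check should block the deck draft, not silently drop the bullet, so the hallucination is visible at review time.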

Presentation Page GUI

  • Extend the EngineHouse presentations page in the dashboard
  • Add: generate new deck trigger, deck list with status, in-browser preview panel
  • Add: edit and export actions per deck
  • First version: edit triggers a download, re-upload publishes the updated deck
  • Simple first. Structured editor comes in Phase 4.

Template System

  • Each template is a named HTML partial: title.html, bullets.html, two-col.html, quote.html, diagram.html, cta.html
  • The mapper selects template by slide_type value in the schema
  • The renderer assembles partials into the full deck HTML
  • The team can add templates without touching the mapper or renderer
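
As a sketch, the renderer's assembly step might look like this. The partials are inlined strings here for brevity; in the described system they would live as the named HTML files:

```python
from string import Template

# Sketch of the renderer: slide_type selects a named partial, and the renderer
# fills it from the slide JSON. Partials are inlined here for brevity; in the
# described system they live as files (title.html, bullets.html, ...).
PARTIALS = {
    "title": Template("<section><h1>$headline</h1></section>"),
    "bullets": Template("<section><h2>$headline</h2><ul>$items</ul></section>"),
}

def render_slide(slide: dict) -> str:
    tpl = PARTIALS[slide["slide_type"]]   # template selected by slide_type
    items = "".join(f"<li>{b}</li>" for b in slide.get("bullets", []))
    return tpl.safe_substitute(headline=slide["headline"], items=items)

html = render_slide({"slide_type": "bullets", "headline": "Core Principle",
                     "bullets": ["Start with consequence"]})
assert "<li>Start with consequence</li>" in html
```

Because templates are looked up by name, adding one means adding a file and a key, with no change to the mapper or the renderer logic.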

Editing Flow

Step 1
Mapper generates draft deck from source content
Step 2
Human reviews in EngineHouse presentation page — in-browser preview
Step 3
Minor edits: in-browser text edit (contenteditable or field form)
Step 4
Major edits: download HTML → edit locally → re-upload to publish
Step 5
Future: structured field editor in the EngineHouse dashboard (Phase 4)

Routing from Ingested Documents

  • When a document is ingested, it is flagged with: consequence type, audience lane, output eligibility
  • Eligible documents are queued for: social generation, WP draft, presentation mapping
  • The routing table determines which outputs are generated automatically and which require human trigger
  • Presentations are human-triggered for now — auto-queuing comes in Phase 5
[ Ingest → consequence flag → routing table → output queue: social / WP draft / deck / newsletter ]
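
A minimal sketch of such a routing table, keyed on ingest flags, might look like the following. The flag values and table shape are assumptions; only the split between automatic and human-triggered outputs mirrors the text above:

```python
# Sketch of a routing table keyed on ingest flags. Flag values and table shape
# are assumptions; presentations staying human-triggered mirrors the text above.
ROUTING_TABLE = {
    # (consequence_type, audience_lane) -> outputs by trigger mode
    ("affordability", "general-public"): {
        "auto": ["social", "wp_draft"],
        "human_trigger": ["deck", "newsletter"],  # decks wait for a human trigger
    },
    ("mortality", "stakeholder"): {
        "auto": ["wp_draft"],
        "human_trigger": ["deck"],
    },
}

def route(doc: dict) -> dict:
    key = (doc["consequence_type"], doc["audience_lane"])
    return ROUTING_TABLE.get(key, {"auto": [], "human_trigger": []})

queued = route({"consequence_type": "affordability",
                "audience_lane": "general-public"})
assert "deck" in queued["human_trigger"]
assert "social" in queued["auto"]
```

An unknown flag combination routes to nothing rather than guessing, which keeps unclassified documents out of the output queues until a human looks at them.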

End of Deck v0.1

This is the machine we are building.
This is how it works.
This is how we will know what to build next.

EngineHouse / ASIP Pipeline — Internal — Core Team Only