[local]ghosts

Browser-Native Text Works

DORKHOLM — 2026

Catalog Active

Statement
Each work is pressed once. The session is the edition. No copy survives the window closing. These are not documents — they are performances in browser-native text, printed on the machine that opens them, from a press that forgets every impression it makes.


No.
Title / Description
Bibliographic
Binding
Access
LG-004
DEBT
Every pause is posted to a ledger — a writing surface that charges the author for hesitation

A full-screen writing environment with a hidden balance sheet running beneath every keystroke. The longer you wait between letters, the more you owe. To delete is to purchase. As the debt accumulates the work changes under your hands — the vocabulary of feeling is quietly removed, words begin to lose their grip on the baseline, the cursor reddens. At the end there is no escape, only liquidation: the session freezes and produces a single artifact, yours and no one else's, bearing the ledger's full account.

i. SOLVENT — writing, unimpaired
ii. LEVERAGED — the vocabulary of feeling begins to redact itself
iii. INSOLVENT — words drift and tilt from the line
iv. FORECLOSURE — session liquidates; SVG artifact exported
Cat. No.: LG-004
Edition: Hex Unix timestamp, unique per session
Terminus: 80,000ms accumulated debt
Year: 2026
Input Surface
Phase-Dependent
Durational
SVG Export
Open ↗
LG-006
[[créachtaí]]
Irish: wounds — every word adjudicated in real time, every appeal returned denied

A writing surface watched by a second system whose criteria are never disclosed. Words arrive and are immediately evaluated — overprinted, struck through, blacked out, buried. A Purity Index declines toward zero. The reader may file a formal Appeal at any point, entering their grounds in a modal that logs the submission to a Merkle tree and returns, each time, the same response. What survives is collected at the foot of the exported record beneath every intervention, every hash, every denial. The work does not run on mobile. It does not accommodate pocket computers.

i. Writing viewport with live adjudication panel running alongside
ii. Purity Index — a compliance bar approaching zero
iii. Merkle appeal system — formally logged, universally denied
iv. SVG Official Record — interventions, appeal hashes, surviving text
Cat. No.: LG-006
Colophon: Hegemon Artifact
Appeals: Relief: Denied
Year: 2026
Input Surface
Split-Panel
Merkle-Logged
SVG Export
Open ↗
LG-007
[Mímir]
Norse: guardian of the well of wisdom — a model describes paintings as if half-remembering them; the descriptions corrupt as they appear

An on-device language model generates one-sentence descriptions of canonical Western paintings — prompted to write as if the image were half-remembered, something glimpsed and already beginning to recede. Each sentence starts corrupting the moment it is written: characters replaced by glyphs, the text degrading until it collapses. The paintings themselves disappear over time — eras eliminated from the source pool in sequence until the frame holds only white. Early in a session, the model may speak about itself instead: a small probability, decaying across six hours toward zero.

i. On-device LLM generating one-sentence art descriptions, from memory
ii. Character-level corruption — progressive, irreversible, terminal
iii. Historical corpus — eras eliminated on a timer; terminus: white
iv. Meta-commentary mode — model on its own condition; decays to zero over 6 hours
Cat. No.: LG-007
Engine: SmolLM2-360M · WebLLM · WebGPU
Meta decay: 10% → 0 over 6 hours
Year: 2026
Generative Text
Irreversible Decay
On-Device Model
No Export
Open ↗
LG-009
[ keeping score ]
After Brecht · After Ono · After Maciunas — a generative Fluxus event score machine running in the browser

A language model generates event scores in the tradition of Fluxus. Each score: a title, some imperatives, a duration. The subject matter is the technology of the present — prompt engineering, attention, the latency between a request and its answer, the gap between a model's training and the world it is asked to describe. A decay bar runs. To perform: hold the button before time runs out. The machine records whether you did, without comment. Once per session it speaks instead about its own condition. When it fails to generate, the failure is the score.

i. On-device LLM generating structured event scores
ii. Performance mechanic — 800ms hold before the bar expires
iii. Behavioural observation — flat, factual, from the accumulated record
iv. Confession scores — the model on its own condition; once per session only
Cat. No.: LG-009
Engine: Phi-3.5-mini · WebLLM · WebGPU
Lineage: After Brecht · Ono · Maciunas
Year: 2026
Generative Text
Performance Mechanic
Cross-Session Log
Desktop Primary
Open ↗
LG-003
Useful Idiot
A compulsion loop wearing the clothes of a contribution interface — the mirror shifts as the count rises

Any interaction — key, click, tap, swipe, the tilt of a phone — registers as a contribution and releases a floating phrase. The language is warm at first, grateful, institutional. It grows retrospective. Then observational. Then concerning. At a threshold a protocol surfaces: a neural network will begin learning the shape of your compulsions. You may decline. The refusal is also logged. The session can be exported as a document that replays itself, looping, long after you have stopped.

i. Contribution surface — every interaction registered, nothing wasted
ii. Phase language — Early / Middle / Late / Extreme, Markov-recombined
iii. Ghost Training Protocol — on-device model; consent optional, refusal logged
iv. Session export — standalone HTML replay, looping
Cat. No.: LG-003
Engine: TensorFlow.js · Markov chain
Collective: localStorage · cross-session
Year: 2026
Participatory
Compulsion Loop
On-Device Model
HTML Export
Open ↗
LG-008
[FACTORY RESET]
Live news fed into a Warholification engine — the headline glamourised, serialised, and offered back as image generation prompt

Seven live RSS feeds — news, tech, culture, money, science — are fetched continuously and fed into a Warholification engine that reads the content of each headline and returns it as a silkscreen title: glamourised, serialised, made into a product. Death becomes a Disaster Series. Capital becomes a Soup Can. War becomes available in four colours. Four cells rotate on a grid, each a different palette, each bearing the original headline and beneath it the machine's transformation of it. Click any cell and the Factory produces a ready-made image generation prompt — yours to take and manufacture elsewhere. A ticker runs the unprocessed feed below. A SNAPSHOT button captures the current state as a PNG.

i. Seven live RSS feeds — continuously trawled, pool-shuffled
ii. Warholification engine — ten thematic registers, eight structural transforms
iii. Four-cell rotating grid — unique palettes, 13-second rotation per slot
iv. Click-to-copy — Warhol-style image generation prompt for each headline
v. SNAPSHOT export — PNG capture of the current grid state
Cat. No.: LG-008
Sources: BBC · NYT · TechCrunch · Verge · Rolling Stone · Bloomberg · ScienceDaily
Refresh: Every 90 seconds
Year: 2026
Live Feed
Rotating Grid
After Warhol
PNG Export
Open ↗
LG-005
[dearmadta]
Irish: forgotten — official language and dissent language fed into the same decay engine until neither can be told from the other

Two live corpora are ingested alternately every fifteen seconds: official language — the White House, the Federal Reserve, the SEC, SCOTUS, USGS seismic feeds — and dissent language — Amnesty, the EFF, Human Rights Watch, ProPublica, The Intercept. Each fragment enters the ledger and immediately begins to mutate, its words transformed by Oulipo constraints — lipograms, palindromes, anagrams, Vigenère cipher, NATO phonetics, redaction — applied at increasing probability over successive revisions. A GPT-2 model running on device comments on the drift it observes, until it too becomes unreliable, reporting stability while the ledger collapses. Other sessions propagate their mutations via Gun.js across a peer network. You may sign a contract committing your identity to the decay. The refusal is also a choice the ledger records.

i. Dual live corpora — official and dissent sources, alternating every 15 seconds
ii. Oulipo constraint engine — 13 mutation types applied at rising probability
iii. On-device GPT-2 observer — comments on drift until it too fails
iv. Gun.js p2p network — peer mutations propagate across sessions
v. Entropic contract — signature commits identity to the decay ledger
Cat. No.: LG-005
Engine: GPT-2 (Xenova) · Gun.js · SEA keypair
Constraints: 13 Oulipo types · rising probability
Network: P2P decay propagation
Year: 2026
Live Feed
P2P Networked
On-Device Model
PNG · GIF Export
Open ↗
LG-001
We Regret To Inform You
A letter from the Collective Intelligence Bureau — your practice has been optimised; your presence is no longer required

The reader enters their name and their skill. A letter arrives on aged paper, from the Bureau, informing them that their work has been integrated, absorbed, distilled, optimised — the verb changes each time, the meaning does not. The Bureau thanks them for their biological contribution. It notes their career was essentially a lengthy prompt, now superseded. An appeal may be filed. The appeal system was designed to deny appeals. Interruptions surface — system notices, productivity tips, philosophical inquiries — that use the reader's continued engagement as further evidence for the determination. A counter at the top records how many artists have been optimised before you. The number grows between visits.

i. Personalised rejection letter — name and skill substituted into rotating templates
ii. Appeal system — twenty denial variants; the process was designed to fail
iii. System interruptions — notices, tips, inquiries that compound the determination
iv. Cross-session counter — artists optimised, persisted and growing
Cat. No.: LG-001
Issuer: Collective Intelligence Bureau
Counter: localStorage · cumulative
Year: 2026
Participatory
Letter Form
Personalised
Exportable Record
Open ↗
LG-002
Neither Confirm Nor Deny
A classified document verification system — the redactions do not indicate withheld information; all absences represent indeterminate verification status

Thirteen documents are drawn from a corpus of governmental and institutional records — AI safety incidents, biometric surveillance deployment, climate modeling suppression, election infrastructure vulnerabilities, covert regime operations, military contingency planning — rendered on screen with words redacted according to the current confidence level. A slider controls system confidence. Pull it to the floor and hold it there: the system enters a state of epistemic collapse, redacting everything it cannot stand behind. The reader may increase exposure three times — appealing the redactions — before a FOIA form must be filed. The FOIA is also denied. The exported PDF states on its footer that its production neither confirms nor denies the existence of responsive records.

i. Thirteen document corpus — AI, surveillance, climate, election, military, corruption
ii. Confidence slider — controls redaction density; collapse state at minimum
iii. Appeal system — three exposures permitted; FOIA required thereafter; denied
iv. Night vision mode — inverts to phosphor green, for appropriate atmospherics
v. PDF export — the record; issued under conditions of uncertainty
Cat. No.: LG-002
Doctrine: Glomar Response
Corpus: 13 documents · 7 institutional sources
Year: 2026
Document Form
Redaction Engine
Night Vision Mode
PDF Export
Open ↗
LG-010
Lower Third
A confessional talk show apparatus rebuilt from synthetic material — AI faces, AI names, AI situations; the connection between them is performed by the viewer

The American daytime confessional talk show was a machine for reducing people to their worst moments. The lower third was where the reduction was most visible: a name, a descriptor, eight words that stood in for a life. Lower Third reconstructs this system from scratch using only synthetic material. The faces are AI-generated. The names are AI-generated. The betrayals, confrontations, and years of silence about to end are produced by a language model running entirely inside your browser — no server, no cloud, no external connection. The image and the chyron are generated independently and paired by chance. The name does not belong to the face. The situation does not describe the person. You are performing the connection yourself.

i. On-device LLM generating chyron name and descriptor across five episode templates
ii. Curated image set served from ./images/ — discovered via manifest, shuffled without repeat
iii. 30-second cycle — next chyron prefetched during countdown, display near-instantaneous
iv. Session log — every chyron generated, recallable to screen on click
Cat. No.: LG-010
Engine: Llama-3.2-3B-Instruct · WebLLM · WebGPU
Lineage: After Geraldo · Ricki Lake · Sally Jessy Raphael
Year: 2026
Generative Text
On-Device Model
After Television
Session Log
Open ↗
Artist
DORKHOLM ↗

DORKHOLM is an artist working across print, code, sound, language, and interactive systems. [local]ghosts is a catalog of browser-native text works. The session is the edition, the browser is the press, nothing is preserved.

The catalog begins with two document forms — the earliest works. We Regret To Inform You (LG-001) issues a personalised rejection letter from the Bureau: the reader's practice has been optimised, their presence is no longer required. Neither Confirm Nor Deny (LG-002) draws from a corpus of governmental documents and applies redaction at a confidence level the reader may adjust. Pull it to the floor long enough and the system collapses into uncertainty.

Useful Idiot (LG-003) names what it does. Any interaction registers as a contribution. The language shifts register as the count rises — grateful, then retrospective, then concerning. DEBT (LG-004) charges the writer for hesitation. The vocabulary of feeling is withdrawn by degree; at terminus the session liquidates itself and produces an artifact.

[dearmadta] (LG-005) feeds official and dissent language into the same decay engine until the sources are indistinguishable. [[créachtaí]] (LG-006) places writing under live adjudication — the criteria are never disclosed, the appeal is always denied. [Mímir] (LG-007) generates descriptions of paintings from memory and corrupts them as they appear; the historical corpus empties over time.

[FACTORY RESET] (LG-008) passes the live news feed through a Warholification engine, returning each headline as a silkscreen title — a ready-made prompt, glamourised and available for manufacture. [ keeping score ] (LG-009) generates Fluxus event scores in the browser. The machine does not know it is making art. The scores are instructions. What the reader does with them is the work.

Lower Third (LG-010) reconstructs the American daytime confessional talk show from synthetic material. Faces, names, and situations are all AI-generated; the image and the chyron are paired by chance. The name does not belong to the face. A language model running entirely inside your browser produces the betrayals and confrontations without distress, without irony, without any understanding of what it is doing. The connection between image and text is performed by the viewer. There was never anything on the other side of it.

Project
[local]ghosts
Works
10 (LG-001 — LG-010)
Format
Browser-native · session-unique · no server

The apparatus is the mechanism made visible. Each work below is documented in its technical particulars — the engines, the thresholds, the parameters, the exports. The mechanism is not the work. But the work does not exist without it.

LG-001
We Regret To Inform You

The reader inputs a name and a skill. On clicking Request Decision, one of 30+ letter templates is selected at random, with {skill} and {action} / {action_noun} substituted from rotating verb lists (integrated, optimised, assimilated, digitised, processed, absorbed, synthesised, catalogued…). A signoff is drawn from ten variants. The letter is displayed on aged paper with typewriter typography. An appeal button triggers one of 20 APPEAL DENIED modal variants — all routed through the same determination system. System interruption dialogs surface at intervals — system notices, productivity tips, philosophical inquiries, efficiency reports — that treat the reader's continued presence as further evidence. A dismissal counter persisted in localStorage increments on each generation and is displayed at page top. A determination archive accumulates in localStorage. Export triggers a download of the current letter as a text file via Blob URL.
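The substitution step is simple enough to sketch in vanilla JS. The template string and the verb lists below are illustrative stand-ins, not the work's actual copy:

```javascript
// Sketch of the {skill}/{action}/{action_noun} substitution.
// Word lists and template text are illustrative, not the work's own.
const ACTIONS = ["integrated", "optimised", "absorbed", "catalogued"];
const ACTION_NOUNS = ["integration", "optimisation", "absorption", "cataloguing"];

function fillTemplate(template, skill) {
  const i = Math.floor(Math.random() * ACTIONS.length); // verb/noun pair rotate together
  return template
    .replaceAll("{skill}", skill)
    .replaceAll("{action}", ACTIONS[i])
    .replaceAll("{action_noun}", ACTION_NOUNS[i]);
}

const letter = fillTemplate(
  "Your {skill} has been {action}. The {action_noun} is complete.",
  "etching"
);
```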

Letter templates: 30+ · {skill} and {action}/{action_noun} substitution
Actions: 15 verbs (integrated, optimised, absorbed…) · 15 nouns
Appeal variants: 20 DENIED responses · structurally incapable of reversal
Interruption dialogs: 10+ system notices · triggered at intervals
Counter: localStorage · cumulative across all sessions
Export: plain text download via Blob URL
Engine: Vanilla JS
Export: Text file · Blob URL
Persistence: localStorage counter + archive
Mobile: Full support
LG-002
Neither Confirm Nor Deny

Thirteen documents drawn from a hardcoded corpus (AI safety incident, platform moderation variance, biometric deployment, climate modeling, election infrastructure, executive misconduct, contract irregularities, Arctic acquisition feasibility, Greenland infrastructure, regime change operation, military contingency, invasion intelligence, sanctions violations, whistleblower retaliation) are loaded one per session. On Initiate Assessment, the document is rendered word by word with redactions applied at density derived from the confidence slider (1–5; value 3 = 0.6 density). Redacted spans are clickable and reveal the underlying text on click. Pulling the slider to 1 and holding it for 2 seconds triggers epistemic collapse: the system redacts everything, including previously revealed text. Three appeal actions (Increase Exposure) progressively reduce redaction density. After three exposures a FOIA form is unlocked — submitted grounds return one of nine denial variants. Night vision mode applies a phosphor green filter. PDF export via jsPDF renders the document with redaction bars, a random small chance of doubled lines (ghost text), and a footer stating the production of the record neither confirms nor denies the existence of responsive records.
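The density mapping can be sketched from the single stated data point (slider value 3 yields 0.6 density). The linear curve below is an assumption consistent with that point and with full redaction at the collapse end of the slider:

```javascript
// Assumed linear mapping: slider 1 → 1.0, 3 → 0.6, 5 → 0.2.
// Only the 3 → 0.6 point is stated; the rest is a sketch.
function redactionDensity(confidence) {   // confidence: 1–5
  return (6 - confidence) * 0.2;
}

// Redact a document word-by-word at the derived density.
// (Click-to-reveal of individual spans is not modelled here.)
function redact(words, confidence, rng = Math.random) {
  const d = redactionDensity(confidence);
  return words.map(w => (rng() < d ? "█".repeat(w.length) : w));
}
```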

Corpus: 13 documents · 7 institutional source labels
Redaction density: derived from slider (1–5); collapse at minimum held 2s
Appeals: 3 Increase Exposure actions; FOIA unlocked thereafter
FOIA: 9 denial variants · Glomar doctrine
Night vision: phosphor green mode
Export: PDF via jsPDF · ghost text · Glomar footer · DETERMINATION: DEFERRED
Engine: Vanilla JS · jsPDF
Export: PDF · Glomar footer
Persistence: None
Mobile: Limited (single panel)
LG-003
Useful Idiot

All interactions are captured: keydown, mousedown/up (press duration), touchstart, touchmove (swipe velocity), DeviceOrientationEvent (tilt variance above 5°, direct increment above 30°). Phrases are drawn from four phase vocabularies — Early, Middle, Late, Extreme — each as a second-order Markov chain. Phase threshold is set by contribution count. Simulated other users contribute at stochastic intervals, advancing a collective count persisted in localStorage and synchronised across tabs via storage events. At threshold, the Ghost Training Protocol modal surfaces: on acceptance, a TensorFlow.js sequential dense network is compiled (input [4]: tap rate, swipe velocity, tilt variance, press duration; layers: 8-unit relu → 4-unit relu → 1-unit sigmoid) and trained on the session's behavioural data. A 60Hz sine drone activates on first interaction, gain rising with count. A heartbeat oscillator activates on ghost training acceptance. Session export generates a standalone HTML replay document.
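The phase-vocabulary recombination can be sketched as a second-order Markov chain: keys are word pairs, values are the words that may follow. The corpus lines fed to it here are illustrative:

```javascript
// Second-order Markov chain over a phase vocabulary (illustrative corpus).
function buildChain(sentences) {
  const chain = new Map();
  for (const s of sentences) {
    const w = s.split(/\s+/);
    for (let i = 0; i < w.length - 2; i++) {
      const key = w[i] + " " + w[i + 1];
      if (!chain.has(key)) chain.set(key, []);
      chain.get(key).push(w[i + 2]);
    }
  }
  return chain;
}

function generate(chain, start, maxWords = 12, rng = Math.random) {
  const out = start.split(" ");
  while (out.length < maxWords) {
    const key = out[out.length - 2] + " " + out[out.length - 1];
    const next = chain.get(key);
    if (!next) break;                       // dead end: stop mid-phrase
    out.push(next[Math.floor(rng() * next.length)]);
  }
  return out.join(" ");
}

const chain = buildChain([
  "thank you for your contribution",
  "thank you for waiting",
]);
```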

Input capture: keydown · mousedown/up · touchstart · touchmove · DeviceOrientation
Language engine: 2nd-order Markov chain · 4 phase registers
Ghost model: TensorFlow.js sequential · input[4] → dense 8 relu → dense 4 relu → sigmoid
Audio: 60Hz sine drone + heartbeat oscillator · Web Audio API
Collective: localStorage · cross-tab storage event sync
Export: standalone HTML replay — sequential then random loop
Engine: TensorFlow.js · Markov · Web Audio
Export: HTML session replay
Persistence: localStorage collective
Mobile: Full support
LG-004
DEBT

The debt engine measures the gap between keydown events in milliseconds and adds each to a running total. Backspace adds a flat 400ms surcharge. Four thresholds key the phase: SOLVENT (0–10,000ms), LEVERAGED (10,000–30,000ms), INSOLVENT (30,000–50,000ms), FORECLOSURE (50,000–80,000ms). From LEVERAGED: sixteen words in the VULNERABLE set — love, hate, fear, sorry, help, pain, dream, mother, god, die, hope, fail, lost, shame, regret, feel — are replaced with [SENTIMENT_DETECTED] on word completion. From INSOLVENT: 20% of completed words receive a gravity animation, drifting 18px and rotating 3°. The cursor dilates and turns red. At 80,000ms the session freezes and exports a unique SVG: hex Unix timestamp edition ID, timezone, total debt, a per-keystroke latency heatmap (red above 500ms, black below), and the surviving field HTML with all scars intact.
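The accrual logic described above reduces to a few lines. This sketch assumes timestamps in milliseconds and uses the stated thresholds:

```javascript
// Sketch of the debt ledger: inter-keystroke latency accrues as debt,
// Backspace adds a flat 400ms surcharge, thresholds key the phase.
const PHASES = [
  [0, "SOLVENT"],
  [10000, "LEVERAGED"],
  [30000, "INSOLVENT"],
  [50000, "FORECLOSURE"],
];

function makeLedger() {
  let debt = 0;
  let last = null;
  return {
    keystroke(t, key) {                     // t: timestamp in ms
      if (last !== null) debt += t - last;  // hesitation is billed
      last = t;
      if (key === "Backspace") debt += 400; // deletion is a purchase
      return debt;
    },
    phase() {
      let p = PHASES[0][1];
      for (const [threshold, name] of PHASES) if (debt >= threshold) p = name;
      return p;
    },
    frozen() { return debt >= 80000; },     // terminus: SVG export fires
  };
}
```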

Debt accumulation: latency per keystroke + 400ms Backspace surcharge
Phase 1 onset (LEVERAGED): 10,000ms — sentiment redaction active
Phase 2 onset (INSOLVENT): 30,000ms — gravity animation, cursor dilates
Terminus (FORECLOSURE): 80,000ms — session freezes, SVG exported
Export: SVG 800×1200px · foreignObject HTML · latency heatmap
Edition seed: Math.floor(Date.now() / 1000).toString(16).toUpperCase()
Engine: Vanilla JS
Export: SVG via Blob URL
Persistence: None
Mobile: Supported
LG-005
[dearmadta]

Official sources (White House, UK Gov, EU Commission, SEC, Federal Reserve, Treasury, ECB, SCOTUS, USGS seismic) and dissent sources (Amnesty, EFF, HRW, ACLU, AFL-CIO, Greenpeace, ProPublica, SEIU, The Intercept, RSF) are ingested alternately every 15 seconds. Each fragment enters the ledger as a word-unit span, coloured blue (official) or red (dissent), and immediately begins mutating. The constraint engine selects from 13 Oulipo-derived transforms — LIPOGRAM_E, HOMOPHONIC, PALINDROME, ANAGRAM, SNOWBALL, ROT13, CAESAR_SHIFT, PHONETIC_NATO, NUMERIC_POSITION, XOR_MASK (keyed to 'dearmadta'), VIGENERE (key: CIPHER), REDACTED, STATIC — applied at probability scaling from 0.32 at revision 0–9 to 0.12 at revision 45+. Cross-stage bleed chance (0.15) fires constraints from later stages early. GPT-2 (Xenova, quantized) runs on-device and generates commentary on the constraint drift every 30 seconds; at high mutation counts the observer begins reporting stability while everything decays. Gun.js + SEA keypair connects to a relay network: mutations from peer sessions propagate in. The entropic contract modal (wallet connect) commits a generated witness ID to the Gun graph. Snapshot exports PNG via html2canvas; GIF recording captures 20 frames at 1fps.
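Two of the thirteen transforms, with the stated probability schedule, can be sketched. The interpolation between the 0.32 band (revisions 0–9) and the 0.12 band (45+) is an assumption:

```javascript
// Two constraint transforms, sketched. ROT13 case-folds for brevity.
const ROT13 = s =>
  s.replace(/[a-z]/gi, c =>
    String.fromCharCode(((c.toLowerCase().charCodeAt(0) - 97 + 13) % 26) + 97));

const LIPOGRAM_E = s => s.replace(/e/gi, "_");  // the forbidden letter, struck

// Stated: 0.32 at revisions 0–9, 0.12 at 45+. The linear ramp between
// those bands is an assumption.
function mutationProbability(revision) {
  if (revision < 10) return 0.32;
  if (revision >= 45) return 0.12;
  return 0.32 - (0.32 - 0.12) * (revision - 10) / 35;
}
```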

Official sources: White House · UK Gov · EU · SEC · Fed · Treasury · ECB · SCOTUS · USGS
Dissent sources: Amnesty · EFF · HRW · ACLU · AFL-CIO · Greenpeace · ProPublica · SEIU · Intercept · RSF
Constraint types: 13 Oulipo transforms · probability 0.32 → 0.12 over revision depth
Observer: GPT-2 (Xenova/quantized) · 30s interval · fails at high entropy
Network: Gun.js relay · SEA keypair · peer mutation propagation
Export: PNG (html2canvas) · GIF (gif.js, 20 frames at 1fps)
Engine: GPT-2 · Gun.js · SEA · gif.js
Export: PNG · GIF
Persistence: Gun graph (distributed)
Mobile: Full support
LG-006
[[créachtaí]]

Each completed word is evaluated by the adjudication system and assigned one of five intervention types: overprint (CENSORED / REDACTED / VOID stamped in red), block redaction (solid fill), strikethrough, paste intrusion (replacement text injected), or burial. Each intervention is logged to the right-panel sidebar with a reason string, criticality stamp, and critique. The Purity Index bar decrements with each intervention. The Appeal modal accepts formal grounds as a text field: on submission, grounds are SHA-256 hashed via Web Crypto API, appended to an in-session Merkle tree, and returned DISPOSITION: NON-ADJUDICABLE | RELIEF: DENIED. Export: SVG Official Record with text in the left two-thirds, annotations ruled off to the right, all appeal hashes at the foot as MERKLE APPEAL RECORD, surviving text below as SURVIVING TEXT. Mobile receives a hard block screen.

Intervention types: overprint · block redaction · strikethrough · paste intrusion · burial
Appeal hashing: SHA-256 via Web Crypto API
Appeal structure: in-session Merkle tree
Export: SVG Official Record — margin-annotated, Merkle log, surviving text
Mobile: Hard block — [ACCESS_DENIED]
Engine: Vanilla JS · Web Crypto
Export: SVG Official Record
Persistence: None
Mobile: Blocked
LG-007
[Mímir]

SmolLM2-360M-Instruct is loaded via WebLLM and WebGPU, cached after first download. The model is prompted with one of four rotating templates asking it to describe a canonical painting poetically but ambiguously, as if half-remembering, at temperature 1.1, max 60 tokens. Artworks are drawn from a corpus spanning Renaissance through NFT era; each attempt runs five times before advancing. A gallery card shows title, artist, year, and live Wikipedia provenance. Each sentence is rendered character-by-character. A mutation interval (1.8s or 3.5s, randomly chosen) replaces characters with glitch symbols at probability 0.015 per character per tick. After 25 mutations the text collapses to em-dashes. Meta-commentary fires at up to 10% probability at session open, decaying linearly to zero over 6 hours: the model receives one of 15 hardcoded self-referential prompts instead of an artwork. The corpus degrades on a timer — eras eliminated in sequence — until the pool empties and the frame holds white. Template-based fallback substitutes when the model is unavailable.
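One corruption tick can be sketched as an independent per-character trial; the glyph set here is an illustrative stand-in:

```javascript
// One mutation tick: each non-space character has probability p of being
// replaced by a glyph. Glyph set is illustrative.
const GLYPHS = "▓▒░█◆◇";

function corruptTick(text, p = 0.015, rng = Math.random) {
  let out = "";
  for (const ch of text) {
    out += rng() < p && ch !== " "
      ? GLYPHS[Math.floor(rng() * GLYPHS.length)]
      : ch;
  }
  return out;
}
```

Applied every 1.8s or 3.5s, the expected damage compounds per tick until the 25-mutation collapse threshold is reached.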

Model: SmolLM2-360M-Instruct-q0f16-MLC via WebLLM
Generation: temperature 1.1 · max_tokens 60 · 4 prompt templates
Mutation interval: 1,800ms or 3,500ms · prob 0.015/char/tick
Collapse threshold: 25 mutations → em-dash row
Meta probability: 10% at open → 0 linear decay over 6 hours
Corpus decay: era elimination on timer · terminus: white void
Engine: SmolLM2-360M · WebLLM · WebGPU
Export: None
Persistence: None
Mobile: Supported (reduced)
LG-008
[FACTORY RESET]

Seven RSS feeds (BBC, NYT, TechCrunch, The Verge, Rolling Stone, Bloomberg, ScienceDaily) are fetched via rss2json proxy on load and every 90 seconds. Each headline is cleaned, deduped by a 60-character key, and passed to the Warholification engine: the headline is matched against ten thematic signature regexes (death/disaster, finance, technology, politics, celebrity, climate, science, conflict, food, identity), and a frame is selected from the matched set's array of ten variants — or a generic frame from fifteen options. The frame text is then optionally passed through one of eight STRUCT transforms (repetition, dash repeat, tripling, edition numbering, ellipsis, NO. prefix, parenthesis, slash repeat) at 38% probability. The Warholified text is displayed below the raw headline in a colour-blocked cell. Twelve palettes cycle with no duplicate background on screen. Each cell is stamped with a series label from twelve options and a unique four-digit serial. Clicking a cell copies an image generation prompt to clipboard; a toast confirms. A ticker runs the raw pool. SNAPSHOT triggers a canvas composite export as PNG.
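The two-stage pipeline (signature match, then an optional STRUCT transform at 38% probability) can be sketched. The regexes, frames, and transforms below are illustrative stand-ins for the work's larger tables:

```javascript
// Sketch of Warholification: match a thematic signature, pick a frame,
// then maybe apply a structural transform. Tables are illustrative.
const SIGNATURES = [
  { re: /\b(dies?|death|killed|crash)\b/i, frames: ["DISASTER SERIES: {h}"] },
  { re: /\b(bank|market|stocks?|billion)\b/i, frames: ["SOUP CAN No. 1: {h}"] },
];
const GENERIC = ["{h} (SILKSCREEN)"];
const STRUCT = [
  h => `${h} ${h} ${h}`,   // tripling
  h => `NO. 4: ${h}`,      // NO. prefix
];

function warholify(headline, rng = Math.random) {
  const sig = SIGNATURES.find(s => s.re.test(headline));
  const frames = sig ? sig.frames : GENERIC;
  let out = frames[Math.floor(rng() * frames.length)].replace("{h}", headline);
  if (rng() < 0.38) out = STRUCT[Math.floor(rng() * STRUCT.length)](out);
  return out;
}
```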

Feeds: BBC · NYT · TechCrunch · Verge · Rolling Stone · Bloomberg · ScienceDaily
Signature regexes: 10 thematic categories · 10 frame variants each
STRUCT transforms: 8 types · applied at 38% probability
Grid: 4 cells · 12 palettes · no duplicate bg · 13s rotation per slot
Refresh interval: 90,000ms
Export: PNG canvas composite via Blob URL
Engine: Vanilla JS · Canvas API
Export: PNG snapshot
Persistence: None
Mobile: Single-column · ticker hidden
LG-009
[ keeping score ]

Phi-3.5-mini-instruct is loaded via WebLLM and WebGPU (Chrome/Edge 113+). The model outputs strict JSON with three keys: title (1–4 uppercase words), instructions (1–6 imperatives, one of which must be impossible or must change the meaning of the others), and duration_seconds (15–70). Subject matter is drawn from ~80 seed prompts: AI systems, crypto, attention, latency, notification liturgy, late-session Vesper seeds, and Frame seeds addressing the act of reading on a device. The decay bar drains over duration_seconds. The PERFORM button requires an 800ms hold to register. Outcome is stamped: PERFORMED, EXPIRED, or MISSED. All outcomes are logged to localStorage. After five or more scores, a behavioural observation surfaces — flat, non-evaluative. Once per session, generation is bypassed for a hardcoded Confession score. When JSON generation fails, the error surfaces as a score. Mobile receives one of eight static scores.
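The strict-JSON gate can be sketched as a validator over the three stated keys. The uppercase check on the title is an assumption drawn from the "1–4 uppercase words" spec:

```javascript
// Validator for a generated score. A failed parse returns null,
// which the work then surfaces as the score itself.
function parseScore(raw) {
  let s;
  try { s = JSON.parse(raw); } catch { return null; }
  const titleOk = typeof s.title === "string" &&
    s.title === s.title.toUpperCase() &&           // assumed uppercase check
    s.title.trim().split(/\s+/).length <= 4;
  const instrOk = Array.isArray(s.instructions) &&
    s.instructions.length >= 1 && s.instructions.length <= 6;
  const durOk = Number.isInteger(s.duration_seconds) &&
    s.duration_seconds >= 15 && s.duration_seconds <= 70;
  return titleOk && instrOk && durOk ? s : null;
}
```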

Model: Phi-3.5-mini-instruct-q4f16_1-MLC via WebLLM
Output: strict JSON — title · instructions (1–6) · duration_seconds (15–70)
Performance mechanic: 800ms hold to register PERFORMED
Observation trigger: after 5+ scores shown
Confession: once per session — bypasses model entirely
Persistence: localStorage · cross-session performance ratio
Mobile: 8 dedicated static scores; model does not run
Engine: Phi-3.5-mini · WebLLM · WebGPU
Export: None
Persistence: localStorage
Mobile: Static scores only
LG-010
Lower Third

Llama-3.2-3B-Instruct-q4f16_1-MLC is loaded via WebLLM and WebGPU (Chrome/Edge 113+; desktop only). Five prompt templates cycle in random order, each seeding a different episode type: paternity and infidelity, family implosion, addiction and recovery, workplace transgression, secrets returning. Each template requests strict JSON output with two keys: name (first name, era-correct working-class American) and descriptor (6–12 words, reductive, present or perfect tense, written by a production assistant who feels nothing). Output is sanitised, validated for length and character set, and rejected if it contains code-like characters, CJK runs, or JS keywords; up to three retries before a fallback fires. An image is selected from a curated set served from the ./images/ directory, discovered on boot via an images.json manifest (required for Netlify static hosting); a directory-listing fallback covers local dev. Images are shuffled without immediate repeat. A 30-second countdown ring triggers each new cycle; the next chyron is prefetched during the countdown so display is near-instantaneous. Space bar forces an immediate cycle. P pauses. Every generated chyron is logged to a session panel (Shift+L); clicking any log entry recalls it to screen. A boot sequence animates on first load. Mobile and non-WebGPU browsers are blocked with a message overlay.
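The sanitisation gate can be sketched; the specific regexes below are assumptions shaped by the description (code-like characters, CJK runs, JS keywords):

```javascript
// Assumed filters for rejecting malformed chyron output.
const CODE_CHARS = /[{}<>;`$\\]/;
const CJK = /[\u4e00-\u9fff]/;
const JS_KEYWORDS = /\b(function|const|return|var|let)\b/;

function validChyron(c) {
  if (typeof c?.name !== "string" || typeof c?.descriptor !== "string") return false;
  const words = c.descriptor.trim().split(/\s+/).length;
  if (words < 6 || words > 12) return false;        // stated descriptor bounds
  const text = c.name + " " + c.descriptor;
  return !CODE_CHARS.test(text) && !CJK.test(text) && !JS_KEYWORDS.test(text);
}

function generateWithRetries(generate, fallback, tries = 3) {
  for (let i = 0; i < tries; i++) {
    const c = generate();
    if (validChyron(c)) return c;
  }
  return fallback;   // fires after three failed attempts, as described
}
```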

Model: Llama-3.2-3B-Instruct-q4f16_1-MLC via WebLLM
Prompt templates: 5 episode types · cycled in random order
Output: strict JSON — name · descriptor (6–12 words)
Validation: length · charset · keyword filter · 3 retries · fallback
Cycle interval: 30 seconds · prefetch during countdown
Image source: curated set served from ./images/ · discovered via images.json manifest · no-repeat shuffle
Engine: Llama-3.2-3B · WebLLM · WebGPU
Export: None
Persistence: None
Mobile: Blocked