Philip K. Dick and the Un-Scraped Real
Artificial intelligence promises foresight. It swallows oceans of text, images, and recordings and spits back something that looks uncannily like thought. The premise is simple: if you know enough, you can predict the rest. But prediction isn’t understanding. It’s just scaffolding, a replica stretched over life. And because AI trains on what can be scraped, indexed, and digitized, its reality is bounded. Anything not posted, logged, or archived might as well not exist.
Philip K. Dick understood that problem long before neural nets. His fiction isn’t about spaceships or laser guns so much as realities inside realities—fragile, counterfeit, barely coherent. The people in his novels are trapped in worlds that look real but prove to be projections, scripts, or someone else’s hallucination. To read Dick in 2025 is to recognize the danger of mistaking predictive imitation for life.
In A Scanner Darkly, undercover agents wear scramble suits: shifting masks of thousands of identities at once. You can’t pin them down. It’s a defense, but also a trap: the wearer eventually forgets who they are.
That’s our digital life. We blur our faces in photos, use burner accounts, scramble our identities for privacy. It protects us from surveillance. It also chips away at coherence. The cost of avoiding predictive capture is fragmentation. You are visible everywhere and recognizable nowhere.
Dick didn’t need the internet to see how anonymity can curdle into decay. One of Dick’s most consistent themes is that entropy is undefeated. Eventually it all unravels.
Do Androids Dream of Electric Sheep? gives us the Voight-Kampff test, a machine for detecting empathy. In Dick’s world, empathy is the last line between human and android. But if it can be tested, it can be faked. Today, “empathetic AI” is the hot feature: chatbots tuned to comfort, sentiment analysis that “reads” emotions. But empathy as an input/output system is just mimicry. It’s performance with no moral weight. The Voight-Kampff in all of us can detect most of what companion AIs give us. And so we’re back in Dick’s reality: once empathy is a script, anyone can run it. A machine. A sociopath. An actor. The test stops measuring humanity and starts measuring compliance. Entropy moves in again.
“Kipple” was Dick’s word for the junk of consumer life: empty packages, broken gadgets, useless debris that reproduces itself until it buries everything. Our kipple is data. Click-trails, duplicates, low-value posts, bot-written filler. AI is trained on it. And worse: AI now trains on AI outputs, piling synthetic kipple on synthetic kipple until the signal drowns.
Entropy isn’t a bug; it’s the whole arc. Dick’s junk-cluttered apartments were prophecy for the age of content farms and hallucinated footnotes. Left alone, kipple wins.
Then there is memory. Or something like a unit of mental past. Dick loved the idea that memory itself could exist apart from our minds. That’s our reality now. Platforms resurface curated “memories” in slideshows. Our past gets packaged and replayed on demand. Deepfakes take it further: memories manufactured outright. But if memories can be sold, who owns the past? The person who lived it, the company that stores it, or the algorithm that edits it? Dick’s characters often can’t tell. Increasingly, neither can we.
Dick’s realities often unravel at the edges. Objects flicker, worlds fall apart, counterfeits expose themselves by mistake. Those glitches aren’t failures. They’re revelations. They prove the world is a construct.
AI’s hallucinations do the same. When a model invents a book or a court case, it’s not just wrong. It’s showing you where its training ends. The glitch is the crack in the wall. That’s where the truth leaks in.
Personalization builds a world tailored just for you. What you see feels true because it feels meant for you. But that “revelation” is just the softest form of control: shaping what you notice, nudging what you believe, selling you what you already want. The cage is invisible, and it’s comfortable. Dick would have called it a prison.
The most radical Dickian idea for our time is the one that resists capture altogether. It doesn’t even come directly from Dick’s writing; instead it’s the logical progression of his ideas, as so much of science fiction is. In his Three-Body trilogy, Liu Cixin imagines “Wallfacers”: people who keep their plans entirely in their heads, inaccessible to any surveillance. In Dick’s work, the survivors are often those who keep something opaque, unshareable, off the record. Precogs share only some of the truth. Replicants can know only what they’re allowed to. There can be no counterfeit reality if what matters is kept in the flesh-and-blood substrate of the last frontier: the human brain. The recurring sin in Dick’s books is the corruption of that frontier.
That’s our last defense against predictive AI. The air-gapped mind. The conversation not posted. The notebook that never gets scanned. The part of life that doesn’t live on a server.
If AI sees only what it can scrape, then the unscripted, the unwritten, the face-to-face becomes a civic virtue. A strategy of keeping reality alive by refusing to let it be indexed.
AI builds an endless surface world: predictive, generative, smooth. Dick’s fiction insists that the real world is fragile, glitchy, entropic, and irreducible. He warns us not to confuse the replica for the real.
The lesson is not nostalgia for a pre-digital past. It’s something sharper: a reminder that the most important parts of life are the ones that can’t be modeled. Empathy that can’t be scored. Memories that can’t be replayed. Decisions that remain undecided until they’re lived.
Dick wrote realities within realities. We live in one now: the reality of the world and the reality of the model trained to imitate it. His stories remind us that the line between them is not academic. It’s real. As real as anything is ever going to be again.

