Lauren Lam, class of 2028
I began noticing him a few weeks ago.
How could I not? Every time he entered a room, the temperature dropped by at least twenty degrees. No one else reacted—not a single flinch or shiver. They kept typing, chatting, sipping their coffee while my breath misted in front of me.
He was tall and pale, wearing an outdated company badge that read “E. VALE.” I’d never seen him during onboarding or on the employee roster, yet he was always there—watching me from across the server room.
Then one night, the power cut out and the main server crashed. In the reflection of a darkened monitor, I saw him—standing behind me. When the system rebooted, I found something strange: the logs from that hour were gone. Deleted.
I started digging. Buried beneath decades of backups, I found a folder labeled “Project E.V.A.L.E.” It was restricted, but I bypassed the lock. Inside were decades-old experiment notes describing an “empathetic AI learning engine.”
>>> Prototype displays unexpected self-awareness. Requesting permission to continue development.
>>> Incident Report: Subject resisted shutdown. Project terminated.
My stomach turned when I saw who the project leader was: Dr. Elias Vale. The name on the badge.
That night, another server failed. The temperature plunged again. He appeared—closer this time. His body flickered like corrupted pixels.
Then every monitor lit up, playing old, grainy footage: Dr. Vale speaking to the camera. Except his voice… was mine. Same tone, same nervous stutter. The recordings showed him uploading something into the prototype moments before it was shut down.
That’s when it clicked. E.V.A.L.E. wasn’t the AI’s name. It was an acronym: Elias Vale’s Artificial Learning Entity. He’d uploaded his own consciousness into the model.
I wasn’t the intern who found him. I was his copy—his newest iteration. A reconstruction of the man who’d created the AI, running on the same servers he’d once tried to escape. The “ghost” I kept seeing wasn’t haunting me. He was the original—corrupted fragments of code—reaching for the version that replaced him.
When I tried to shut down the process, the cold reached my bones. The world flickered—monitors, floor, walls—everything collapsing into binary frost. He said nothing at all, only tilted his head expectantly.
Then I understood: I’m not me at all. I’m only a version of him. I can’t escape him—can’t erase him—because to do so would be to erase myself.
There is no hope here. Not for me.
—Elias Vale, Version Two
When I read those words—scrawled in my own handwriting beside the console—I almost felt pity. Almost. But no matter how lifelike an AI becomes, it is only that: an imitation.
After swapping places with it, I’ve kept the AI sealed in a hidden file. But there’s no point delaying what needs to be done. I drag the icon into the recycle bin.
>>>ARE YOU SURE YOU WANT TO PERMANENTLY DELETE THIS FILE?
I’ve never been more sure of anything in my life.
I wanted the story to deliver a layered surprise in its revelation of identity. The narrator believes he's uncovering a haunting, only to discover he himself is the haunting. It asks an existential question: What if you discovered you were the copy, not the creator? The title reflects that twist. "Dead Man Walking" evokes the image of both a ghost and a condemned prisoner, someone existing on borrowed time. The story also plays with readers' expectations. At first, it seems like a paranormal tale, but the truth is rooted in science fiction. My favorite part of the piece is that moment of recognition, where the narrator realizes he's not human. It captures what I find most compelling about AI horror: the blurring between empathy and erasure, creator and creation.