“Dismembered limbs, a severed head, a hand cut off at the wrist, feet which dance by themselves—all these have something peculiarly uncanny about them…”
—Sigmund Freud, “The ‘Uncanny,’” 1919
On May 1st, 2025, I was honored to once again attend the annual Edgar Awards dinner at the Marriott Marquis Times Square alongside legends of the mystery genre; tireless fellow volunteers for my guild, the Mystery Writers of America; beloved booksellers and genre fanatics from across the world; and characters from a fever dream in which projected AI versions of first Humphrey Bogart, followed by Edgar Allan Poe holding his feline companion Catterina, were scripted to act as interim “hosts.”
The ripple of shock through the room was palpable. And though potential issues of copyright regarding the Bogart Estate were speculated upon by guests, not to mention freebie appropriation of Poe’s image when the man submitted his story “Epimanes” to Messrs. J. P. and E. Buckingham with the postscript, “P. S. I am poor,” in May of 1833, the mere fact of computerized ghosts raised hackles amongst many of the living. Had it been the nineteen-eighties (or eighteen-eighties, let’s be real here), I’d have muttered, “Cocaine is a hell of a drug,” and sauntered off; as it was, the fact of “I see dead people” was enough to land me squarely in Uncannyland.
Thankfully, six days later (May 7th, 2025), the MWA Board of Directors issued a public apology on their Facebook page for dipping their toes into Freud’s Uncanny: “Such use is inconsistent with our otherwise staunch support of our members’ fight against unauthorized use and potential for piracy of their work through AI. We apologize and have taken steps to assure this won’t occur in the future.” The statement was widely applauded, and downright required of an organization whose slogan is “Crime doesn’t pay…enough.” But since I’m currently fixated on Mary Shelley’s timeless Frankenstein and on narratives that explore the potentiality of artificial existence, I started connecting dots between the invention of imitative life, the actual spark of divinity, and the puissance of language itself.
Briefly: a direct as-The-Raven-flies map can easily be drawn between God’s creation of humanity via the “Breath of Life”; the Kabbalistic legend of the mud (or wood, or stone) avatar created and animated by occultist rabbis via language; Lord Byron, Ada Lovelace, Charles Babbage, and Mary Shelley; and the current almighty fuss surrounding our frankly dubious notion that it’s a great idea to create a hivemind that can talk to itself—and, as of present typing, refuse to shut itself off, no less.
Language itself is the thread through this labyrinth: the significance of bestowing—or that of taking away, as with the Tower of Babel—language cannot be overstated. The very notion of overstatement necessitates language; I wouldn’t be connecting my brain to yours via your electronic device without it. And lest you label me a Luddite from the outset for my direct association of AI with the Uncanny, back when all that fuss was made about e-books being inferior to print, I shouted everywhere I could that it was deeply classist to assume everyone owns carved walnut shelving equipped with those sweet, sweet library ladders, and asserted that my friends who rented beds in windowless closets were delighted by their e-readers, thank you very much indeed.
Technology can be fabulous; golems, though? Only very rarely, with one exception—according to the Talmud, Adam himself was created as one. He was without form until God gathered sufficient mud from the scope of the entire Earth to mold his special new buddy. Specifically, in Sanhedrin 38b, “[Adam’s] torso was fashioned from dust taken from Babylonia, and his head was fashioned from dust taken from Eretz Yisrael, the most important land, and his limbs were fashioned from dust taken from the rest of the lands in the world.” It’s a snazzy origin tale, that Adam is both the first and only man but also Everyman. And in an intriguing twist, his buttocks specifically were “from dust taken from Akra De’agma,” which somehow did a brisk business in arses before the dawn of humankind and doubtless produced the finest cheeks God could source.
So a solid case can be made that (notwithstanding the necessity of the “Me Too” movement and too many wars to list and the guy named Kevin who wrote the “Kars 4 Kids” jingle) Adam went pretty well. Why was that, you might ask? Well, YHWH has the advantage of not only being perfect but—according to Baruch Spinoza, anyhow—all-powerful, and thus all of Nature He brought into being, Homo sapiens included, is likewise deterministically perfect.
But what happens when the Children of Adam—himself once a golem—set out to make their own little pottery babies, for chores around the farm and protection from pogroms and what have you?
Nine out of ten rabbinical scholars agree: really, really lousy antics ensue. Yes, you might be the Unabomber or you might be Steve Wozniak or you might be Gary Plaidshirt from Two Streets Over, but the danger truly lies in the fact that who you are is moot: once you make a silt slave, no matter its marching orders, that hunk of earth is inevitably going to invoke unexpected consequences.
The classic parable of the Golem, a mythological Talmudic creature with as many narratives inspired by it as there are warnings attached to creating one, is riper for storytelling today than it ever was, for obvious reasons: with actual golems now flitting cyber-hither and cyber-thither, what could possibly go wrong? Additionally, there is an undeniable appeal to a character trope that comes with batteries-included fear and angst factors. We culturally understand that woe betide the person landing on the mental health spectrum wherein taking years to perfect your Play-Doh model with single-hair brushwork and then hooking it up to your homemade matter-anti-matter collider sounds like a fun project. It’s by-the-book “the killer is calling you from inside the house” horror coding, if you’ll pardon the pun.
The Golem’s entire Uncanny appeal lies in this inherent dichotomous tension: Will it be useful? Sure. Will it be powerful? Definitely. Will it develop emotions and motives of its own? Almost certainly. Will it run amok and try to destroy Prague despite Rabbi Loew’s intentions, or murder my entire family circle if my name happens to be Victor Frankenstein?
Yes, that’s a given.
All the ensuing mayhem is brought about thanks to language. “In the beginning was the Word, and the Word was with God, and the Word was God,” states John 1:1 (KJV). God is described as love, and light, and language; God breathes life into his creation, Adam. Adam and Eve’s offspring go on to form other languages, organically, it seems, and without any capacity to stop ourselves. Communication itself, that elemental necessity that enables progress, is built upon our ability to share information with each other over distances of time and space.
Small wonder, then, that words like “abracadabra” should work miracles, or that medieval golems were brought to life with the Hebrew word “emet” (truth) inscribed on the forehead, hung as an amulet, or placed within the mouth cavity. This concept of magic words is gorgeously expressed by Jorge Luis Borges in his 1959 poem “El Golem,” which draws from both classical Plato and German-Jewish mystic Professor Gerhard Scholem:
So, composed of consonants and vowels,
there must exist one awe-inspiring word
that God inheres in—that, when spoken, holds
Almightiness in syllables unslurred.
The same Professor Scholem who moved Borges to write golem poetry—after emigrating to Palestine and changing his name to Gershom Scholem—was an enormously influential figure in archiving and organizing the reams of displaced Kabbalistic materials that had been stranded in synagogues, confiscated by Nazis, or simply fallen into dangerous disarray. It was he who drew a direct parallel between the modern computer and the Golem in a speech delivered at the Weizmann Institute in June of 1965, at a dedication ceremony naming Dr. Chaim Pekeris’s new computer model:
The old Golem was based on a mystical combination of the 22 letters of the Hebrew alphabet, which are the elements and building-stones of the world. The new Golem is based on a simpler, and at the same time more intricate, system. Instead of 22 elements, it knows only of two, the two numbers 0 and 1, constituting the binary system of representation. Everything can be translated, or transposed, into these two basic signs, and what cannot be so expressed cannot be fed as information to the Golem. I daresay the old Kabbalists would have been glad to learn of this simplification of their own system. This is progress.
Progress indeed, both linear and appreciable. We can further extrapolate that God’s word awakened Adam, and Adam’s words awaken golems. What do golems get up to once they’re charged and running, then?
What they have a strong tendency to run is amok. If you’re Rabbi Jacob Emden writing in 1776, then your ancestors made a monster that grew so massive that its maker “feared that the Golem would destroy the universe,” and when the sacred word was extracted from it, the creator’s face was forever scarred. If you’re Rabbi Loew during the 16th century and used clay from the Vltava River to combat the sadistic rule of Holy Roman Emperor Rudolph II, things start off swell for the Jewish ghetto residents. However, various versions of the Prague tale portray the Golem as eventually either going on a slaughter spree, desecrating shabbat, burying its maker in its own bodily rubble, or else falling in love with such tragic consequences that it had no choice save violence and pillaging, etc.
The inability to communicate with the Golem once it is animated tends to cause the carnage; indeed, the Hebrew word golem means incomplete or embryonic, which is exactly what humans look like screaming and flailing before they learn to speak one of our six thousand-plus languages. Which brings us to Mary Shelley and her immortal novel Frankenstein—Shelley delivers not only the most famous Golem outside of Prague in the person of Victor’s much-abused Creature, but also the bizarrely direct link between golems and modern computer technology via her summer vacation pal Lord George Gordon Byron.
In the preface to the 1818 edition of Frankenstein, Shelley wrote, “I have thus endeavoured to preserve the truth of the elementary principles of human nature, while I have not scrupled to innovate upon their combinations.” Shelley would have been, via both association and her own innate wit, more than familiar with innovative combinations of science and humanity. Mary Shelley’s husband Percy Bysshe Shelley, renowned Romantic poet, was an intimate of Lord Byron’s, and as most devotees of Gothic literature are aware, Frankenstein was the result of an inclement weather-induced slumber party on Lake Geneva.
The year before, Mount Tambora in Indonesia had erupted with such ferocity that it killed approximately 100,000 people immediately, kick-started a global cholera epidemic, and eradicated the very concept of summer in Europe: hence the festival of Germanic ghost folktales and accounts of the grotesque over toasted s’mores. Mary’s contribution to the contest was partly inspired by her male companions’ avid discussions about the scientific possibility of galvanizing a corpse. During a terrifying nightmare, she “saw the hideous phantasm of a man stretched out, and then, on the working of some powerful engine, show signs of life.” Now, an electrically-revived undead being along the lines of Mary’s Creature (often also referred to as the “monster” or “daemon”) has as much in common with a zombie (a corpse revived via sorcery) as with an artificial intelligence—even an AI housed in an android rather than, say, a Home Assistant stuck limbless in a wall panel.
Presumably, the Creature’s brain is organic. And yet—setting in motion the entire plot of the novel after the newborn awakens—no memories from its previous owner are retained, and thus the nameless, malformed giant possesses no linguistic capacity to incite pity in the humans surrounding him. It is his lack of language, as much as his revolting appearance, that makes him an object of dread and loathing, and thereby makes him a perennially apt icon for the tongue-tied outsider, the misunderstood, the either so-called or self-titled goth, the freak, the geek. He begins life mute, hobbled by silence just as the Golem of Prague is.
Here’s where things get truly entertaining: Lord Byron’s daughter and sole legitimate heir, Ada Lovelace, had just turned two years old when Mary Shelley published Frankenstein to wide acclaim. While Lord Byron was occupied with sowing his oats in any international field willing to yield a soft furrow, his wife Lady Annabella Byron—tutored privately by a former Cambridge professor and nicknamed Byron’s “Princess of Parallelograms”—insisted that their daughter Ada likewise be schooled in, of all the silly things for a girl to learn, mathematics. This would be a counterpoint to her father’s violent mood swings, Annabella supposed, his positively Byronic affairs, and the suspicion that he was not merely suicidal, but keen to carry on a liaison with his half-sister.
It would take until 1833, fifteen years later, for the wildly precocious Ada to meet Charles Babbage, who was giving a public demonstration of his invention the Difference Engine, precursor of his Turing-complete—though never physically completed, for lack of available engineering tools—update the Analytical Engine, which borrowed its punched-card programming from Jacquard loom technology. “We both went to see the thinking machine (or so it seems) last Monday,” Lady Annabella wrote of her and Ada’s encounter with Babbage. “It raised several Nos. to the 2nd and 3rd powers, and extracted the root of a Quadratic equation.” Ada Lovelace was instantly and permanently attracted to the notion of what we call a computer, always adding a twist of whimsicality to her hard (and brilliant) calculations: so much so that as an adult, she ultimately styled herself an “Analyst and Metaphysician.” As early as age twelve, according to biographer Dr. Betty Alexandra Toole, Ada had decided that, like Leonardo da Vinci before her, she was going to solve the problem of flight:
She considered various materials for the wings: paper, oilsilk, wires, and feathers. She examined the anatomy of birds to determine the right proportion between the wings and the body. She decided to write a book, Flyology, illustrating, with plates, some of her findings.
Lord Byron’s friend Mary Shelley wrote the quintessential novel of the Golem; but it was his daughter Ada Lovelace who so heavily annotated Luigi Menabrea’s scientific paper on the Analytical Engine that many say her additions are as key as the text itself: her “Note G” (which contains an algorithm for calculating Bernoulli numbers) was in fact the first program designed for computer processing, making Ada the world’s first coder. A person revived with heart-starting electrical paddles is not a zombie; a Home Assistant is not a living consciousness. But via the polar opposite stories of the passionately feeling but language-stymied Creature and the emotionless but thinking Engine, it is possible to imagine man-made beings that possess wills of their own.
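Lovelace’s Note G set out the Bernoulli computation as a table of stepwise operations for the Engine; by way of a purely illustrative modern sketch—Python and the standard recurrence, not her exact sequence of operations—the same calculation runs to a handful of lines:

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Bernoulli numbers B_0..B_n (B_1 = -1/2 convention), via the
    classic recurrence: for m >= 1, the sum over j < m+1 of
    C(m+1, j) * B_j equals zero, which lets each B_m be solved
    from its predecessors."""
    B = [Fraction(1)]  # B_0 = 1
    for m in range(1, n + 1):
        s = sum(comb(m + 1, j) * B[j] for j in range(m))
        B.append(-s / (m + 1))
    return B

# B_0 through B_8: 1, -1/2, 1/6, 0, -1/30, 0, 1/42, 0, -1/30
print([str(b) for b in bernoulli(8)])
```

Exact rational arithmetic (rather than floating point) is the honest choice here, since Lovelace’s table likewise tracked exact quantities through the Engine’s mill.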
To put it another way, a Golem. Yes, AI often calculates faster than we can; yes, AI can in many ways make our lives easier; and yes, it can interface between myriad sources. But the potential power of AIs, to my mind, lies not in their capacity to mislead or disrupt at the behest of their creators—these instances are well-documented and already alarming enough—but in the scope that they gain from systematically being taught languages, by means of poaching information from the infinite (by definition, since it keeps on growing) online resource of documents of every variety, as well as real-time conversations, all available to be sifted through in the cloud.
As Professor Scholem observed, computers function via bit strings based on a system of 1s and 0s. Binary code was invented by Gottfried Leibniz in 1679, a breakthrough which to the philosopher and polymath represented an incarnation of the Divine itself, a unitarian creatio ex nihilo or creation from nothing. He was praising God, in a sense, by emulating Him. I can’t help but think it ain’t for nothing that Leibniz (widely considered one of the greatest polymaths in history) was also a panpsychist who believed that everything—redwoods, meerkats, rocks, socks—possessed something akin to a mind. If this is really the best of all possible worlds, as he famously suggested, I am slaveringly curious to know what Leibniz would have made of my cell phone (which keeps demanding to write my texts for me), or an AI travel bot dealing with an unexpected hurricane.
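As a purely illustrative aside of my own—neither Leibniz’s notation nor Scholem’s—here is the dyadic reduction in miniature: the Hebrew word emet, the “truth” that animated the medieval Golem, transposed into the two signs the modern one understands:

```python
def to_binary(n):
    """Leibniz's dyadic arithmetic by hand: repeatedly halve n,
    collecting the remainders, until the whole number is expressed
    in nothing but 1s and 0s."""
    if n == 0:
        return "0"
    bits = ""
    while n:
        bits = str(n % 2) + bits
        n //= 2
    return bits

# The three letters of "emet" (aleph, mem, tav) as Unicode code points:
for letter in "אמת":
    print(letter, to_binary(ord(letter)))
```

Everything that can be numbered can be so transposed, which is precisely Scholem’s point about what the new Golem can and cannot be fed.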
Since we haven’t the slightest idea of where all this progress is leading, everyone is scrambling not only to prognosticate about computer science but to understand it day-to-day. The field of speculative Uncanny folderol has long been largely the province of three major sub-genres of fiction: sci-fi, fantasy, and horror. The unnatural is everywhere, from Frankenstein to later classics like Do Androids Dream of Electric Sheep? by Philip K. Dick, set in the far distant future of 2021. A nuclear World War has ravaged the planet, all living creatures are madly coveted, and the banned androids (replicants) who pass for humans are hunted down for “retirement” due to their often-lethal lack of empathy. The author said, in a 1972 speech titled “The Android and the Human”:
I have, in some of my stories and novels, written about androids or robots or simulacra… Usually with a sinister purpose in mind. I suppose I took it for granted that if such a construct, a robot for example, had a benign or anyhow decent purpose in mind, it would not need to so disguise itself. Now, to me, that theme seems obsolete. The constructs do not mimic humans; they are, in many deep ways, actually human already.
For Dick to have said this over fifty years ago, when the notion of a computer speaking English fluently to us was literally the stuff of speculative fiction, is compelling. As compelling, I would argue, as it was for Mary Shelley to write speculatively of a cobbled-together patchwork of human flesh. The Creature—made in our image to be appealing to Frankenstein as a companion or assistant or simply senior thesis project, we’re never quite clear about the why, let alone the how—begins as a tabula rasa but increases exponentially in learning to the point of becoming the embodiment of unfettered id.
We cope with this mayhem by writing beautiful and terrible tales about golems and androids and cyborgs and how these creatures can (and already have, and will continue to) intersect. Spare and Found Parts by Sarah Maria Griffin offers a uniquely Irish blend of bleak beauty and lyrical filth in which a lonesome girl at the cusp of womanhood determines to build a companion from a prosthetic hand washed up on the grim shoreline of Black Water Bay. In her potently lovely debut, Luminous, Silvia Park lands us in a future unified Korea in which a trio of siblings—all of them the offspring of the quintessential mad scientist, and running the gamut from wholly robotic to wholly human—explore the brutal consequences of living in a visceral but largely engineered world. From Aimee Bender’s whimsical short story “Frank Jones” about a voodoo-esque golem made from the narrator’s skin tabs to the Creature as a shocking form of political protest in Ahmed Saadawi’s Frankenstein in Baghdad, the moral implications of our sentient dust bunnies both titillate and repulse us. Which is, perhaps, the very essence of Freud’s concept of the Uncanny.
In terms of science fiction screenwriting, some of the best plot predictors of where we are going—strike that, have arrived already—lie in the beloved series Star Trek: The Next Generation, which aired from 1987 to 1994. We can take cautionary tales from season three’s “Hollow Pursuits,” in which Lieutenant Reginald Barclay becomes addicted to a holodeck program in which his crewmates fawn over him (they find out, of course). Even more piercing are the twinned episodes “Booby Trap” and “Galaxy’s Child,” from the third and fourth seasons respectively, in which Chief Engineer Geordi La Forge first creates a holographic version of the Enterprise’s engine designer Leah Brahms, and then—shockingly—meets the real person, only to be bitterly disappointed that she is not, like his hologram, built to be sympathetic to his desires.
One of our most beloved characters from the entire series is of course Commander Data, who is completely inorganic and enjoys reminding viewers of this every chance he gets. But our fierce affection for Data was beautifully expressed by an exchange I had with Ryan Britt, author of Phasers on Stun:
Data does not have the cloud; Data is not constantly having his programming upgraded and altered by algorithms. In fact, because of the nature of the Enterprise’s mission, the ship is ‘offline’ in terms of how we would perceive the internet. Everything is analog, stored on the ship’s computer, or Data’s positronic brain. This is the opposite of the internet, where information becomes fluid, and the prioritization of news is controlled by an algorithm that is ‘upgraded’ constantly… Mr. Data is like your iPod from 15 years ago that you’ve never connected to the internet. Data is a heroic “AI” because he still values the tactile, the real.
It’s rather facile and circular for me to say that we love Data because he loves us, loves humanity itself, but he also adores art and is rather terrible at it, which to me is his most endearing quality. Shelley’s Creature understands poetry, and he more than understands wild passions: my question would be, can he write them? I’m as panicked as everyone else about the literary world losing jobs to our robot overlords, but I would argue that if Shelley’s Creature were asked to write a poem about his despair, he would manage it precisely because he is aspirationally human—as is Commander Data, whose poem to his cat, “Ode to Spot,” is both quite bad and quite touching:
O Spot, the complex levels of behavior you display
Connote a fairly well-developed cognitive array.
And though you are not sentient, Spot, and do not comprehend,
I nonetheless consider you a true and valued friend.
Data is wrong, of course: his cat is entirely, demonstrably sentient, while whether Data himself is remains a matter of argument throughout the series. But Spot neither writes poetry nor has the urge to do so, and that is what, to me, quintessentially separates human intellect from artificial intelligence: we tell it to write our term papers for us. AI didn’t realize it possessed a fierce yen for Medieval literature in the first place, then have its term paper timing clocked by a kegger where Becky of the Cashmere Sweaters was sure to make an appearance.
I return, always, to the glory of language in these accounts. Its grace, its ugliness, its richness and its power. Language questions, language compels—and most importantly, language creates. It tells stories, and stories are all we have when the future is unknowable. Thus far in my literary travels, the poem “Ecce Monstro” by Bryan Thao Worra most keenly expresses the feeling of expectant dread, of eager uncertainty, that the Golem of Circuitry stirs within me:
Began like pu, the uncarved block in a strange land.
Simulacrum, wooden golem knocking about, dreaming of flesh.
Lies, truths, growths, recede and wonder what it is to be:
D’où Venons Nous
Que Sommes Nous
Où Allons Nous
Woher kommen wir
Wer sind wir
Wohin gehen wir
Where Do We Come From?
What Are We?
Where Are We Going?
How long have I got?
Father never answers. Why he is surprised why I bolt,
A prodigal arrow, into
The world, I can only suspect, among gears and puppets.