Generative AI is advancing at a breakneck pace—but what does that mean for writers?
Technological progress is vital—but not at the expense of moral progress, identity, and human creativity.
In this article, we dig into the recent history and implications of generative AI, and what it means for creative expression.
Consider "The Writer"
Consider The Writer—one of the 18th century's most fascinating curiosities. Created in 1774 by the Swiss watchmaker Pierre Jaquet-Droz, this 70-centimeter-tall automaton was wildly ahead of its time, even a time teeming with technological feats. Seated at an elegant writing desk, the self-moving android—driven not by electricity but by an intricate mechanical programming system—simulated the act of writing, penning thousands of notes with a goose quill in fluid human penmanship for its enthralled and confounded audiences.
The Writer's mechanism resembled a modern programmable computer. A stack of gears in its back dictated its penmanship and message; according to Jaquet-Droz, the automaton could be fitted with a custom gear stack for its visitors, allowing it to write any word or sentence—a level of sophistication that raised deep philosophical questions and concerns about artificial intelligence, even then. Yet for all its ingenuity, it was in essence a stunt, a gimmick: an advertisement for a watchmaker who wanted to sell more watches.
From meme to “masterpiece”
Two hundred and fifty years later, we once again found ourselves enthralled by creative automatons. In the span of a few months, generative AI models underwent rapid improvements—internet discourse shifted from the surrealist hellscape of “Weird Dall-E” memes to Midjourney winning its first art-competition blue ribbon. Jason Allen's AI-generated Théâtre D’opéra Spatial beat more than 20 artists in the “digitally manipulated photography” category for a $300 prize, prompting a collective debate (and no small meltdown) among creatives and media figures about the nature and meaning of art—and about the nebulous training methods of visual AI models, which, it seemed, relied on the scraped output of online creators.
By December, the debate had reached a boiling point, culminating in a protest staged by artists on ArtStation, the portfolio platform for professionals and amateurs. Many artists, who had long viewed ArtStation as a community-focused site, felt betrayed by its perceived inaction against the influx of AI-generated work and by the secretive nature of AI “training.” In response, ArtStation released an FAQ defending its inclusion of AI generations, one widely criticized for valuing the interests of AI research and potential commercialization as much as (if not more than) the concerns of its human user base. The result was a quasi-blackout protest, with artists replacing their preview icons with a big red X slapped over “AI.”
But for writers, the conversation was just getting started.
Sudowrite vs. the Omegaverse
Fan fiction writers have a long-standing symbiotic relationship with fan artists, who had raised the alarm during the ArtStation debacle. Fan fiction sits under the umbrella of transformative work: it occupies a nebulous legal zone in which writers cannot claim copyright or sell their stories—fanfic writers write for themselves and their communities, for the love of it, without expecting profit. But although they offer their work freely, the possibility of an AI “capitalizing” on their content stirred unrest, and they soon realized that their imagination was “lining someone else's pockets.” Understandably, they felt “exploited—their sparks of creativity, their moments of inspiration, being commandeered.”
Seeing the writing on the wall, they signed up for tools like Sudowrite (a play on pseudo, the mimicry of writing, and sudo, the Unix command that tells a computer to carry out a task), which was built on the bones of OpenAI’s GPT-3, to investigate. They prompted the tool to elaborate on deep-cut tropes like the Omegaverse, familiar to writers on sites like Archive of Our Own (the beloved nonprofit fanfic archive). And Sudowrite obliged, generating Alphas and Omegas with uncanny accuracy. But it didn’t stop at knotting. It seemed eerily capable of mimicking niche narratives across many other fan fiction genres—and of reproducing recognizable passages from bestselling novels and other copyrighted material.
Generative AIs, or LLMs (large language models), have been compared to eloquent parrots, predicting words and sentences based on massive amounts of text mainlined into their digital synapses during training. They indiscriminately sweep up text, learning to predict patterns, styles, and context by analyzing anything and everything—all the brilliance and trash of collective intellect, discourse, and culture—though exactly which texts were used remains largely a mystery.
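To make the “eloquent parrot” idea concrete, here is a deliberately toy sketch—not how any real LLM is built, just an illustration of the underlying principle of next-word prediction. It counts which word most often follows each word in a tiny corpus (the function names and the corpus are invented for this example):

```python
from collections import defaultdict, Counter
from typing import Optional


def train_bigram_model(corpus: str) -> dict:
    """Count, for each word, which words follow it in the corpus."""
    words = corpus.lower().split()
    follows = defaultdict(Counter)
    for prev, nxt in zip(words, words[1:]):
        follows[prev][nxt] += 1
    return follows


def predict_next(model: dict, word: str) -> Optional[str]:
    """Return the word most often seen after `word`, if any."""
    counts = model.get(word.lower())
    if not counts:
        return None
    return counts.most_common(1)[0][0]


# A miniature "training set": the model has no understanding,
# only statistics about which word tends to come next.
corpus = (
    "the quill moves and the quill writes "
    "the boy watches the quill writes again"
)
model = train_bigram_model(corpus)
print(predict_next(model, "quill"))  # "writes" follows "quill" more often than "moves"
```

A real LLM replaces these raw counts with billions of learned parameters and conditions on whole passages rather than a single word, but the parrot-like core—emitting whatever plausibly comes next, with no notion of ownership or meaning—is the same.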
“These language models have performed almost as well as humans in comprehension of text. It’s really profound,” said James Yu, who co-founded Sudowrite with Amit Gupta. Debuting in 2021 alongside other AI writing tools like NovelAI, Sudowrite billed itself as the “non-judgmental… AI writing partner you’ve always wanted.” Its fusion with GPT-3, while not pioneering in itself, marked a significant milestone—suggesting that machine collaboration might help writers find their voice.
At the very least, it raised some eyebrows.
Navigating AI’s ethical maze
On Twitter, writer David Lee Zweifler highlighted an exchange with Gupta from their time together in an online writing workshop. Zweifler shared screenshots in which Gupta mentioned his involvement in developing an AI tool and said he had experimented with one of Sudowrite's unreleased features, a story-critique function, using Zweifler's story for testing (or training). Zweifler said he had never authorized the use of his work, though Gupta later denied that any author's works were used to train Sudowrite.
But even if Sudowrite had “done no scraping or fine-tuning ourselves without consent,” the company could not say the same for the OpenAI models it is built on—the true engine under the hood—as OpenAI has been notoriously silent about the sources its models scrape.
Recently, a string of lawsuits alleging “systemic theft on a massive scale” has been filed against OpenAI. An article in The Atlantic revealed a cornerstone of the suits: a training dataset known as Books3, which reportedly included more than 170,000 published books. Though little known outside the AI sphere, it had been freely available for download for more than two years—until it vanished around the time the lawsuits appeared, making it far harder to locate and, in effect, camouflaging it.
As the Writers’ Guild of America stated during the writers’ strike (which has since been resolved in favor of humans over AI): "It is important to note that AI software does not create anything. It generates a regurgitation of what it's fed. If it's been fed both copyright-protected and public-domain content, it cannot distinguish between the two. Its output is not eligible for copyright protection, nor can an AI software program sign a certificate of authorship. To the contrary, plagiarism is a feature of the AI process." It remains to be seen whether AI-generated text can be copyrighted (the US Copyright Office has already ruled that the blue-ribbon-winning Midjourney image cannot be).
Legality issues aside, the question remains—is this what writers want?
Productivity over passion
In a heavily ratioed post (8.6 million views, 1,193 likes), co-founder James Yu announced Sudowrite’s new Story Engine feature: a long-form-focused AI tool that would let authors write “a page of words in less time than it takes to make your coffee” or “an entire novel in just a few days.”
“You could rewrite the same sentence 100 times,” says Sudowrite’s website, “or you could make the computer do it.”
Tools like Sudowrite, ostensibly “built by writers” for the benefit of writers, are writing tools for people who hate writing. There’s no denying that writing is hard (ask us how long it took to write this article), and speed is not necessarily an advantage. Writers spend weeks, months, even years crafting the right plot, character arcs, setting, descriptions, themes, and dialogue for works sometimes decades in the making. Craft relies on consideration, creative solutions, and nuance; writers improve only through the time and effort they put in, and collaboration requires subjective judgment.
Pure technologists—watchmakers—are here for the relentless pursuit of productivity, but they often miss the true point of writing, which is expression. The process of releasing an idea into the world—of writing (and publishing)—is full of catharsis. Many communities, like fan fiction writers, do it for the sake of expression, connection, and joy. The act of reading a human-generated work is one of deep communication, of communion, of empathy. The stories that matter are not hollow novelties; they mean something to people.
New frontier or foe?
In the past year, major writing tools have rolled out generative AI features: Google’s Bard is now linked to Google Docs; Microsoft, which holds a 49% stake in OpenAI (having invested a further $10 billion in January), has integrated GPT into its Office suite. LLMs are here to stay, and, for now, they seem inescapable. The fiction that tech giants will protect anything other than their revenue stream in developing and deploying their AI models is fast dissipating. Of course, AI holds real promise across disciplines—detecting cancers, predicting the structures of novel proteins, even finding social consensus. But assuming these companies continue to pursue acceleration and AGI (artificial general intelligence, the stated goal of OpenAI and of labs like Google's DeepMind), how can we be certain that the machines—and their makers—share our values?
For all our innovation, one of the most fundamental human tools (and probably the most undervalued) is human creativity—storytelling, and the authenticity that is inherent to it. That is where we find meaning, our constant north star in a field of dark uncertainty. To preserve it is not merely a convenience, but a moral imperative. Despite all these watches, we don’t have much time.
So—what do we do? Is there anything writers can do beyond watching as our tools and platforms are inundated with hollow facsimiles, hoping our stories endure?
We think we can help. And we believe the next battle is where creativity meets ownership.