Most of you know me as a writer and publisher of fiction, but that isn’t my only job. I’m also a freelance news writer for a tech website. This puts me in a somewhat unusual position when it comes to the topic of generative AI. There is the tech geek in me who can’t help but marvel at the capabilities of the software, there’s the anticapitalist in me who hates to think of all the terrible ways that this technology will be used by corporations to disenfranchise their workers, and then there’s the artist in me who doesn’t want to see the human experience boiled down to an algorithm.

Here’s the thing though. I’m not sure that’s going to happen.

At least, not anytime soon. In my experience, AI-generated text is coherent but generic. Nothing remotely original ever comes out of it. Even when you give it page-long prompts and ask it to imitate the writing style of a certain author, it usually only does so in the most basic way possible. Of course, it’s easy to look at text generation models like ChatGPT, Gemini, and Copilot and think: Wow, if they can do that now, just imagine what they’ll be able to do in a few years!

But I’m not so sure it works that way. AI isn’t magic. Its text is generated by imitating massive quantities of information. This kind of makes its metric for quality the lowest common denominator by default. It doesn’t have a voice, it’s an amalgamation of hundreds of thousands of voices. To imitate exceptional writing, it would actually have to narrow its scope considerably, which would make it even more plagiaristic and legally dicey than it already is.

I’m not here to debate the ethics of AI data mining authors and artists, though. You’ve probably already heard plenty of those arguments, and I doubt very much that I could change your mind one way or the other on the subject. I’m here to talk about whether or not the technology can even pull off a well-written story.

Good fiction is about more than imitating a good writing style. It’s about honesty. It’s about showing human truth through the lie of the narrative framework. AI can convincingly imitate the framework, but not the part of the story that makes people connect with it. You don’t identify with its perspective because it has no perspective. I’m not saying that it never will, but I’m pretty sure it will take a while for software engineers to teach a machine how to do that.

I doubt tech companies are all that interested in the fiction publishing industry anyway. AI will most likely excel in other, more profitable, areas. Copywriting, advertising, content mills, and maybe even news sites will likely become more AI-dependent as the technology continues to evolve. This isn’t harmless. Unless there is legislation to regulate it, it will probably put a lot of people (like me) out of a job.

I think there will be some writers who will use AI in their fiction. It can certainly write descriptions and even dialogue when instructed. But I believe the architect of the story will have to remain human for it to hold any real narrative weight. That’s why I think fiction will be safe for a while, but unfortunately, I can’t see the future.

-Cody