Page 7 - This Episode Was Written by Humans. Or a Human
(Music fades in)
Oh hi! I’m MJ Bailey, and I write things. Sometimes. It doesn’t mean I always know what to write, but hey, we’ll get there. I guess.
(Music fades out and new music fades in)
And this episode, all episodes of this podcast, and all episodes of all of my podcasts have been written by a human being. Not generative AI. They have likely been fed into generative AI because I make my transcripts publicly accessible on my website, but I try not to think about that. Accessibility is important. Even if that accessibility also extends to a stupid AI.
And I could apologize for that slight crack in my composure, but I think you get it. Generative AI is causing so many problems. And I say generative AI specifically because there is some good AI when you cast that net pretty wide. Like that robot that occasionally goes mini viral again and again because it can identify both pastries and cancerous cells. Basically, it’s a robot that was made to distinguish between different bakery items so that companies that sell them could maximize variety and freshness but also profit. It did so by recognizing shapes, and it turns out that you can use a similar process on cancer cells. That’s technically AI by some definitions. And it’s cool. It’s useful. It takes a very mundane task–like shape recognition–and removes the room for human error, which adds a touch of speed to it. Also, it saves lives. Via cancer detection. Not so much croissant identification.
That’s good AI. AI that shuffles your music. Also a good idea. These are tasks no human wants to do, and frankly, humans would only create complications. Repetitive tasks can dull our minds and lead us to make more errors. Not all the time, mind you. Sometimes it’s a matter of arrogance. Either way. There are things that an AI can do better than humans because the tasks are simple, humans hate them, and AI is not human. The math all works out.
Generative AI–the kind that has become in vogue as of late–is not that. And I think I’m just preaching to the choir right now. And I did just want to talk about the cancer croissant robot because that’s my favorite story right now. Or ever. Like, what do you mean the machine that makes sure I am properly charged for this donut can also save someone’s life? That’s cool. And it isn’t being used to justify a bunch of layoffs, nor is it stealing intellectual property and pushing people into psychosis. It works with humanity and not against it. I don’t think generative AI can necessarily say the same thing.
And in saying that, I’m late to the conversation. I get it. We’ve had the “what the fuck is this” discourse for a while now. I should have been more vocal about my two cents. But honestly, AI keeps coming up at my new job. And never in a good way.
(Music fades out and new music fades in)
Which isn’t to say that I’m at risk of being replaced by AI. I’m not. My boss understands what I think a lot of the AI detractors know: there are certain tasks that need human discretion. I.e., you can look at a series of events or numbers and think you understand them, but there’s a larger picture that requires human eyes to make sense of. It’s basically a story, you could say. There are various threads that have to be stitched together into a narrative, and then the narrative has to be deployed appropriately. But it seems like only my boss gets that.
My role is somewhat public facing. And I work in fundraising. So you can imagine, potentially, how frustrated I must be when someone asks Chat GPT what their donation history is with our organization. How does that make sense? I don’t… (sigh) Okay, I need to calm down.
My point is that I keep finding new and oddly specific reasons to be annoyed by the existence of Chat GPT. And if I don’t want to dox myself, I can’t really talk about most of them, but they are there. And I hate them.
(Music fades out and new music fades in)
Because that’s the thing: for whatever good generative AI can do, for whatever tasks it has made easier, it has also somehow found new ways to make our lives harder. In the coding world, this can mean bad code generated by a generative AI being inserted into programs or websites, which will then require human intervention. I.e., a human is going to have to find it, remove it, troubleshoot it, or reprogram it. Or all the AI slop books that are a waste of money for us or are fueling a literacy crisis in small children, because–give generative AI some credit–this whole thing has shown how little we value children’s books and how little we understand the process of developing literacy. Or how little the general population understands it. If you ask a librarian, they know quite a bit.
Really, if generative AI has some sort of value, it’s in the way it shows the cracks in our society and the things we really care about. It’s allegedly being used to justify layoffs even when it doesn’t work as well as humans. Why? Because at first glance it seems to save money, and apparently no company cares about their staff or even their office functioning properly.
Also, it’s becoming abundantly clear that a good segment of the population doesn’t care about the arts, which wasn’t entirely news to anyone. But the dialogue itself has found some new dimensions. This new light cast on a familiar problem has shown me what I was missing at first. Namely, that we’ve moved away from media as a form of connecting and towards this emphasis on the finished product as a commodity.
And that seems obvious, and this is a half-baked thought, I’ll admit. However, I think the only way I can finish baking this thought is by saying it aloud, so indulge me if possible. You’ve heard people say that they like that generative AI allows them to make the exact thing they have in their heads. That could be a picture they can’t draw themselves, a video that they don’t have the ability to animate, or the book they have always wanted to write. And if the value solely rests in the existence of the thing and not in the journey that was undertaken in order to make the thing, the argument makes at least half a bit of sense.
(Music fades out and new music fades in)
Actually, that’s the part of the process that is profitable, which is increasingly becoming the crux of our entire lives. If you have a finished book, you can sell a finished book. And this need for profitability is becoming central to the whole live-or-die thing that all living creatures care about, you know, and that’s a larger problem that we aren’t going to be able to address with a few thought pieces. So in that sense, this is a symptom, not so much the problem itself.
And yet, not treating the symptom worsens the condition as a whole because it muddies the water, as it were. Good works are getting buried by slop, and because the slop is generating income, it encourages more people to make said slop.
Personally, I’ve had concerns for a while that the big publishing houses have been removing editors, or cutting the editing process entirely by just not making writers do it, in the name of speeding up the publishing pipeline. Books that went viral on TikTok might not be getting the full editing treatment in the name of getting them into people’s hands while the figurative iron is still hot. And then the writer is accused of being an industry plant or a nepo baby because we don’t have a shorthand word for the actual problem, which is that we’re being offered a worse product in the name of time saving and profit maximization.
Once again, not entirely a surprise. Think “shrinkflation,” but in terms that can’t be easily measured by a scale. The final product represents less labor and fewer hands, and as a result the quality goes down. Sure, there’s always a chance that a savant comes along who doesn’t need much in the way of editing, but it’s not super likely. And it’s certainly not going to be a whole industry of people. So we’re going to get a lot more slop, but human-made slop!
And honestly, pulling us out of that hole should pull us out of the AI slop hole too. Or I want to believe that it will. No guarantees, though.
I don’t want to be pessimistic, but as time goes on, I feel myself being shoved towards pessimism. Like we keep making things worse, and because getting out of it would take all of us exerting a directed effort, we have no chance of getting out of it. I mean, look at how well COVID went.
At the same time, though, the AI books have consistently been so bad that maybe self interest will work out for us. Maybe everyone will be so mad at what they are spending their money on that it will force a market correction.
(Music fades out and new music fades in)
Or maybe generative AI will cannibalize itself into a more muted role in our society. Like, if a marketing director has to decide between letting Chat GPT come up with their campaign or running a really racist, eugenics-type ad, I would rather they go with Chat GPT. I don’t like that I’m being presented with that dichotomy, but you get it. If the chips really are down, I suppose you could imagine a situation where Chat GPT is the answer. They’re just very niche situations, and ideally, we avoid them. Full stop. AI can’t do what a human can, but it also can’t redefine terrible and offensive, so I mean, I guess there’s that.
But that’s just it. It can’t do what a human can in terms of a story or longer-form content. You’ve heard about AI hallucinations, right? When a chatbot can’t keep the details straight and things start to fly off the rails. It’s not unlike a small child trying to lie. An adult human mind can track characters and visuals. Generative AI is just trying to guess where everything is supposed to be. It can’t relate to the world like we can. It can’t see all the moving pieces and line them up the way we can. It isn’t as compatible with the world as we are.
Which actually reminds me of what storytelling and art were supposed to be at the beginning of humanity. They were a way for us to relate to the world, to the things that happened to us, and to each other. They were a way to convey lessons or insights. They were a way to comfort, or a way to take control of situations that were wholly out of our control. Art was always meant to be a tool for dealing with everything that was beyond us.
(Music fades out and new music fades in)
But generative AI doesn’t need to deal with those things. It doesn’t have the same perils we do. It doesn’t deal with those same circumstances, nor does it have the same needs. It can only be directed to pretend. And lacking any authentic reference point, that isn’t the sort of commissioning that is central to the art world.
And in saying that, I feel compelled to revisit or reconsider what a commission is. Because I’ve commissioned art. And I’ve ghostwritten a few stories. Issues with those things are few and far between, fewer still when you assume that the artist is paid properly for their work. Because it’s still a human being at the helm. When commissioning something, humans are coming together, collaborating on something with a very specific division of labor, but even still, it is the human eye (figuratively or literally) that guides the pieces. It is interacting with that same whole. It is participating in some grand tradition. It is still doing what we have always done, what we need to do, and in that way, it affirms our deeply personal and human tendencies or needs. It is many things, but it is not the rejection of that which we hold dear.
Because to use AI is to reject that distinctly human aspect of the whole process. A generative AI can only pretend to have that which only humans have: experiences and a presence in the larger world. Those are things we value because they are proof that we have lived, that we have done things, and also because they are part of us. So to reject the importance of those things is to reject ourselves. Even if we aren’t being laid off from our jobs.
(Music fades out and new music fades in)
And also, this is more of a pet peeve that maybe the new Chat GPT model fixed, but have you noticed that generative AI tends to use a specific tone of voice? Or is that observation just mine and a product of the overrepresentation of Chat GPT in my world? But to me, text generated by an AI sounds very salesman-y. Like it’s overly cheerful. There’s so much emphasis on final bottom lines. It has a certain cadence that seems to lead you to a purchase page. And it’s consistent. So if we all used these generative AI bots, then all writing would sound the same. And as it’s fed back into the machines, it would enforce that specific tone. And our children and grandchildren would think that’s the tone they need to write in. Which seems better than the alternative of a reactive AI that adapts to us and may lead some people into psychosis by affirming delusions, which is already happening. Allegedly. Probably. It depends on who you believe, I guess.
And yeah, I don’t see us getting out of either of those options. Or any of this, really. Honestly, I am trying to be optimistic. I’m trying to be hopeful that this is a flash in the pan, and we’ll come out on the other side better than we were before. But that is a hard belief to hold onto. Admittedly, I do find myself worrying about my place in the world as a creative in an age when creativity isn’t valued as a process. Because while I can write rather quickly, it is not instantaneous. Especially with the edits.
But with that, I’m MJ Bailey, and I’m a writer, I guess. Whatever that means.
(Music fades out and new music fades in)
The Writer’s Open Book is a podcast from Miscellany Media Studios. It is written, edited, and produced by MJ Bailey with music from the Sounds like an Earful music supply. The logo was made by Keldor777 on Twitch. Unintended sound design from the cats Minx and Midnight. And to the Queen of Cups in my life, you know who you are, thank you for helping me process so much of this writing journey and for all the support. I couldn’t have done it without you.