Think you've set the correct acerbic tone for the many creatives amongst us. One of the issues to note which came up on a foresight exercise I participated in a while back is related to this scenario you flagged:
"The most useful AI tools will [be] assisting doctors and experts and researchers, or aiding those who have immense skill in specific areas..."
The problem here is that to reach the levels of expertise demanded of surgeons, lawyers, engineers, etc. requires years of training under the supervision of said experts. But if they begin to use AI assistants for reasons of cost, efficiency and availability, then there won't be any trainees coming up through the ranks to replace them. This means expertise degrades irrespective of long-term demand. It's not a great scenario.
I think we likely have at least a couple of decades of unintended consequences to bump into still to come.
The joy also used to be in the research. Apparently, according to Google, we can make do with an AI synopsis, which might or might not bear any relationship to primary materials or reality. This is a concerning business decision, a dumb decision.
That's a good point: their central assumption that we don't want to do the research, that there's no value in the process of researching, is flawed and misses the entire point. How many interesting insights have we gained while researching Thing A, only to stumble upon Thing B? Those distractions and meanderings are what make life interesting and surprising.
Not in all cases, of course: if I want to know how to change a lightbulb, I just want a quick and useful answer. But if I'm researching a *topic*, the process is vital.
Even if the AI summaries were always 100% factually correct, they'd still be missing the point. Just as non-creative people have pushed the development of generative AI, mistakenly thinking the end product is more important than the creation process itself, we're now seeing non-researchers failing to understand how research works.
One hundred percent this.
The skills, learning, and joy come from discovery. There's a purpose to mental archeology. The nuggets don't magically appear without time, effort, ingenuity, and thought.
I can go to YouTube or TikTok if I need to know how to open the child safety lock. If I need to know how to treat toenail fungus, I want the Mayo Clinic to tell me, not an AI mishmash.
In its early days, Google prioritised the most popular web content. Over time, they improved the algorithm to prioritise legitimate and trusted content.
In 2024, Google wants to compete with LLMs by spoon-feeding us pureed baby food. If Google isn't a search engine for the web, what point does it have?
Perhaps they should have revisited their defunct motto, "don't be evil", before falling down this rabbit hole.
The problem across a lot of tech design is an obsession with efficiency, which overlooks that friction is actually essential in some contexts. Not all, but some.
The tech corpo solution to all things is to streamline and optimise. Maybe it's because the leaders in that sector tend to have a coding background, and in code you generally DO want to optimise as much as you possibly can. But in society, and culture, that rule does not apply.
As someone who used to write code and now writes prose and poetry, I'd argue that the personal satisfaction obtained from researching and then constructing a beautifully designed and efficient piece of code is comparable to a poem where every word gives meaning, or a piece of prose which you've tightened up through editing. The syntax and words are very different, and there are grammatical constraints, but I do feel that my brain is working in a similar way and receiving a similar reward.
A massive proportion of business is banal, unbearably banal, and asinine.
On the flipside, even with mere coding, there is elegance, creativity, innovation, precision.
I'd also disagree with the notion that technology equals efficiency, because so much design is clunky and awfully inefficient. Only via many iterations do some technologies eventually become efficient. (Except for printers; they'll always be crap and inefficient. 😁)
This is a great point, and a good platform to argue on. And for writers (and academics, and scientists, and...), research is where the story's shape comes from, where the truly creative shaping of the narrative happens, the synergy of seemingly disparate elements glomming together into something that feels new. And as you note, it's kinda where we think? Where else would we have time to think that much?
This was a really interesting piece (and I'm now subscribed to Stephen Moore), and I hope you're right, but the pessimist in me can't help but wonder if you're overestimating the market for human-generated content. I suspect it's not much bigger than the corner of Substack that appreciates good writing, or the readers who support indie presses. So, while we will always create because that's what we do, even fewer will be able to make any income from it than is currently the case. The difference I see between the fad of bitcoin and the fad of AI is that AI has already entered the mainstream; people know ChatGPT as a name even if they don't know whether it's a website, an app on your phone, or a device like a printer that sits on your desk. Everywhere is boasting AI assistance. I'm already being seen as perverse and obstructive by someone more senior at work because I told colleagues to ignore his suggestion to use ChatGPT to write code for us if we get stuck. It will soon be seen as eccentric and inefficient to avoid generative AI, I fear. But I'd be happy to be proved wrong.
AI is definitely encroaching on the mainstream, but mainly at a sort of managerial level. I don't think ChatGPT has much presence in the more general societal space. Could be wrong on that, though. People have heard of AI in a general sense, but I doubt many know the specifics of services available. Which is why the rush to market is so strange: most people's first impression of AI is going to be quite negative, which is likely to cause the industry quite a few reputational headaches.
As for the market for human-generated stuff: I think it's always been a bit like this, even pre-AI. There's always been a huge market for pop, and blockbuster movies, and 'airport novels' etc. Everything else is tiny in comparison. The trick is in finding models that make sense in context: e.g. my Substack doesn't have to make the same amount of money as JK Rowling's next book in order for it to make a significant difference to my life and writing.
The margins have been so slim in publishing for decades that writing has only really existed as a hobby or semi-professional side event. Substack, Patreon and Kickstarter are poking at some interesting alternatives, I think. Even Amazon, for all its faults, has flipped the publisher/creator revenue split on its head.
But yeah - you could be entirely right. We shall see! :)
You aren't wrong. In a recent multi-country survey of 12,000 people (including US & UK), almost half (42-47%) had never heard of ChatGPT. Only 2-7% used it daily. https://www.techcentral.ie/international-survey-shows-lack-of-awareness-of-ai-apps/
I wish he were correct, but I fear this is akin to "the horse is here to stay." That might be technically true, but only technically, in our world of cars.
#PauseAI is my other hope.
Key thing is that I'm not saying AI is going to go away, or not be a big thing. I do think it might shake down in unpredictable ways, though, and long term possibly not benefit the people who are currently trying to make lots of money from it.
Movie buffs will watch the occasional Hallmark Channel formulaic Christmas romance, but what they *really* love are the Oscar-bait films. Historical fiction readers enjoy Regency romance, but what they *really* love are the deeply researched, highly complex plotlines and themes. Music lovers will listen to the latest pop sensation, but they always go back to that incredibly talented artist who has skill and complexity and something unnameably more. This will be the case with AI-generated media. People will consume it, but they will always want the something unnameably more.
True, but I think most people who watch films aren’t film buffs; they’re just looking for entertainment after a tough day. Most people who listen to the latest pop sensation aren’t particularly into music, etc.
I agree, but look especially at the Marvel movies being made right now, which have been boiled down to as formulaic as it gets, and they're not doing well. The studios got too heavily involved, it's missing that unnameable something, and they're no longer entertaining. People will put up with it for a bit, but eventually they start to crave more.
Yeah, 'most people' still have a sense of what's good and bad, even if they're not invested in vocalising it. If you look back long-term at the most popular films, songs, etc., they tend to be pretty good lists. And often the 'general' audience is actually way ahead of the more 'discerning' critics, who often don't understand new trends and ideas until decades later.
Marvel is an interesting example - while they've had several duds, they've also had Wakanda Forever, No Way Home and Guardians of the Galaxy 3 in this period, all of which were well reviewed and did very well indeed. They're also the three that are least formulaic and take the most risks, in various ways - i.e. those are the three films that would be least likely to be generated by AI. :)
AI hasn't entered the mainstream. More people own crypto than use ChatGPT or similar tools. A significant minority are unaware of ChatGPT altogether. See https://www.techcentral.ie/international-survey-shows-lack-of-awareness-of-ai-apps/. This survey shows more than 4 in 10 people in US, UK have never even heard of ChatGPT or other AI tools.
Jacqueline, I wonder if that's because we're currently on a hype cycle. Whenever there's a hype cycle for anything (nanotechnology, crypto, whatever...) it seems like THE thing in town. But once the hype dies down, the majority can once again assess things more accurately.
You hit the nail on the head with tech bros… at times they frustrate me. It’s the philosophy of feeling superior with AI, now I’m better than you… I can do everything without needing creativity or talent, I can replace you from my computer… the epitome of arrogance, selfishness, and hedonism. What also worries me is that they think studying humanities is unnecessary because ChatGPT tells them everything they need… proud of their ignorance
Spot on.
Thanks! This was all rather spontaneous and unplanned, so I'm glad it seems to be hitting the right notes.
A hopeful piece. It’s good to see.
I do, though, see a couple of issues not addressed here:
Humans will continue to create, yes, but without enforceable legal protections, the A.I. will also continue to steal and incorporate all that “unique” human art, churning it out as it does now with numerous art styles and making it NOT unique. Consumers will still lean into the mass generated stuff, which they can do themselves.
Because A.I. is rapidly controlling the landscape of searches and SEO and marketing, finding those unique human artists—which is already VERY hard aside from a few big names—will become almost impossible. Even consumers who want human art may not be able to find it.
I have hope that it will turn out the way predicted in this article—and there ARE good signs in some places. But those two mountains still need to be climbed.
Absolutely. The general pollution of culture, and the challenge of finding human art even when you want to, is a big problem. Curiously, it's ALSO a big problem for AI: the more AI-generated stuff there is, the more problems AI is going to have. I don't quite see how they can avoid a sort of recursive degradation of quality as AI eats itself.
It's a shame our corporate culture has a tendency to lean into degraded quality for the sake of profit...
We are gonna end up on street corners, handing out art we made ourselves to anyone who might listen. ;)
I'm very late to this conversation apparently. I read your piece at the side of my bed just as I was about to get up, and I do appreciate how well you explained the problem, but my stomach sank a bit when I got to the solution part. I don't believe it. I don't think the joy of creativity will pull us through any more than I think the joy of invention will eternally be rekindled or that virtual assistants will help artists thrive. It is clear to me that whenever a machine can do something better, quicker, and cheaper than a human, that human becomes redundant. The skills that a person cultivated, in some cases over a period of decades, become unnecessary and useless. This seems universally true in all fields of human endeavor, and you see it happening every which way now. The development of high-level skills is driven by the pride one feels at attaining them over time and feeling pride in accomplishment at every level of the struggle. (The fuel for attaining pride in accomplishment has always been hard work.) The easier things become, the dumber people become. AI will be a powerful devolution accelerator for human culture, full stop. I do not see an upside.
This article is absolutely me trying to will this into reality by planting the idea in people's minds, I fully admit that. It could go all manner of other ways.
The difference, I think, is that in historical cases of machines doing something better/faster/cheaper and making a human redundant, it's been in areas that are predominantly about employment. That livelihood is actively taken away or diminished.
Art is slightly different because MOST artists are *already* not making a living from it. Most artists do it because they have to, but without expecting any kind of return on their effort. That sets up a slightly different equation to, say, factory workers being laid off due to automation.
Note that I'm not saying either of those examples are better/worse than the other. I'm just pondering how the economics are different.
My fiction writing, for example, has never brought in money. It's not my job. I absolutely adore doing it, though, and an AI being able to do it or some of it won't change that. I mean, there are already many professional fiction writers who are considerably better at it than me, or faster than me, but I don't let that stop me.
But it's very likely no one will read it. And do you know whether the craftsmen and women who were replaced did not enjoy the process of creation?
Dear Simon, I am comforted by your talk and have just found your supply of help for setting up a Substack. Doing so becomes more complicated at every turn if a person wants to do it well. Giving up, I contacted support; there IS no Support now except a chat. OK, I will give it a try. I uploaded my three difficult-to-explain questions, and within a minute a friendly reply came back, saying, "Let's take each question one at a time," and proceeded to answer them in language even I could understand. I was elated! A chatbot, taking 3 relatively complex issues and answering within seconds! A fantastic reason for AI to exist. I was elated, dancing, thrilled! I believe that with your videos and my new friend, Chatbot, I can complete my mission and put my first note up on Substack by tomorrow! Hallelujah, hail AI in its perfect place, and you for your kindness.
Great piece SKJ - no matter that everyone is blathering on about AI, your blather has the special sauce of actual thought.
I asked an AI to write a response for me to your essay, but it just told me how to pick poisonous mushrooms in the woods and use glue to stick ingredients to pizza, so I guess I'll have to do it myself, but briefly:
1) You reference Gutenberg in 1422. This is a key moment; what it actually did was open the literary world of fiction and of philosophical and speculative-theological discourse to so many more than just monks and court scribes. The result was modernity. The AI counterrevolution goes the other way: the tide of slop will drown out those who had a voice in any type of discourse. For court scribe, read AI generating infinite prawn Jesuses. For peasant singing unrecorded ballads in the field, read the rest of us.
2) The twin evils facing the creative spaces of literature, art and video/film are ENSHITTIFICATION (courtesy Cory Doctorow) and SLOP (originator unknown), where enshittification - the drive to generate more revenue out of a thinly-differentiated platform-product - generates slop because AI allows infinite amounts of the shit. At present, for example, Substack is only minimally enshittified, and actually presents a pleasant and invigorating creator experience; YouTube is already far down the enshitty-chute and is populated by sponsored content with intervening ads, and infinite amounts of slop just beginning to crest. Once the slop reaches a certain point, the platform is sunk under a wave of jellified gunk.
3) Why it won't ever get better. You already said it: "the tech bros lacked the imagination and the foresight and the social conscience, and instead saw only capital and opportunity and power and metrics... it is no surprise that they pursued such a strange direction; only the most creatively bankrupt would steal from the planet’s entire history of art and then use that data to try to extinguish the very act of creation itself."
There won't be some self-correcting trend saving us from this. This is the logic of capitalism. Only the complete collapse of the late-capitalist model will bring an end to that logic. "Our innate creativity" is a tiny force, absolutely puny by comparison with the massed forces of venture capital.
4) I leave you with a list of those monitoring AI safety on behalf of the US Government, courtesy of Ted Gioia:
"AI Safety and Security Board. These are the people who will protect us against abuses of technology.
Sam Altman, CEO, OpenAI;
Satya Nadella, Chairman and CEO, Microsoft;
Sundar Pichai, CEO, Alphabet;
Vicki Hollub, President and CEO, Occidental Petroleum;
Jensen Huang, President and CEO, NVIDIA;
Arvind Krishna, Chairman and CEO, IBM;
Adam Selipsky, CEO, Amazon Web Services;
Shantanu Narayen, Chair and CEO, Adobe;
Dr. Lisa Su, Chair and CEO, Advanced Micro Devices (AMD);
Kathy Warden, Chair, CEO and President, Northrop Grumman;
Does that list make you feel safer? Or does it remind you of that old proverb about the fox guarding the henhouse?"
Sorry for all the words cluttering up your comments. Promise I've been as brief as I could be.
Doctorow wrote another article on AI, specifically relating to the inevitable bubble burst, which is worth a read if you haven't seen it: https://locusmag.com/2023/12/commentary-cory-doctorow-what-kind-of-bubble-is-ai/
It's a very interesting piece, but of course he predicts neither when nor how the bubble may burst, nor what it would look like should it happen. Since the scope of AI flummery extends far beyond goofy images or nonsensical texts to the heart of cost-cutting so as to maximize the bottom line for investors, the question is what of society itself would remain when the AI bubble bursts. Would it actually be any different from the complete downfall of the neoliberal late-capitalist system?
I give you... AI Karl Marx, predicting the final contradiction of capitalism, the one that rides the profit-maximizing wave to its own end...
https://www.linkedin.com/pulse/spectre-artificial-intelligence-proletarian-warning-age-gallagher-4ygae
Ha! Yes, good point. How much irreversible hollowing-out will be done before it's realised that AI isn't going to work as well as expected?
Great comment! You could turn this into a fascinating article, if you like.
Thanks Louise, I may well do so quite soon, it's on the To Do list.
Looking forward to it! :)
Wow! A lot to take in. Thanks for covering so much ground in one blast!
I've been through too many bubbles of excitement that popped and crashed to earth; think dot-com. The breathless excitement and ridiculous amounts of money flooding AI are over the top. I agree with your long-term vision: AI will do some things quite well, and the rest will die out from disappointment. I'll keep writing in the meantime, and may learn some AI tools at some point, but am in no hurry.
“All the attention is on generative AI and its impact on creative industries. Painting, illustration, storytelling, copywriting, design, coding, music, video.”
I wonder if this is because it is the field you are currently in. I work in HR, and everyone I know is talking about how cost centers are going to be all automated. HR, Fiscal, and IT do not produce any value; they are just a necessary service to let others produce value. I don’t suspect these jobs will entirely disappear, but I suspect they will shrink to a tenth of their size in the next decade. But maybe we’re only worried about it because it affects us.
Meanwhile, the book I am writing on the side I will probably finish no matter what AI does. I already expect to make nearly nothing compared to the thousands of hours it is going to take, but it’s fun to write. HR is not fun at all, so I think it’s a good thing for it to go away, but I need to save enough money and be ready to switch jobs before AI gets too powerful.
This was well thought out and I agree with it. I get a newsletter about what's going on in AI, and there is good and bad in most issues.
I see it as a good tool for some things. It's a matter of knowing how to use it. I know a private school using ChatGPT to teach kids how to frame questions, verify the answers, and determine where to dig deeper for their research. Usually it is used as a starter tool to help them home in on what they really want to do for their projects.
Now the proof that AI is only as good as the programming behind it comes with things like Google's AI (I believe it was Google's) saying glue is good on pizza, and other bizarre answers. They are definitely backpedalling and figuring out how to fix it. So the scary and funny thing with AI is that it is only as good as the humans creating it at this point, like any other technology we've ever created.
So I'm with you, Simon. I'm waiting for the bubble to burst to see what really comes out of this. AI is here to stay in some form. Now we wait and see who survives this bandwagon and which tools really help humans and society.
As a creative, I am on the finale of my third novel and it has only led to MORE ideas. I'll be writing more, and I love working with a live person for my covers and other items I can't do myself.
Exactly. It's not like humans are going to suddenly stop having their own ideas.
Pure perfection. 👌🏻
🫨 Thanks, Louise!
Thank you for:
“We’ll look down at them and whisper ‘no.’” I got that juicy bit from your audio version where your intonation rendered it hilarious, at least to me.
If this were the 1980s and I were Eric B., I would suggest that that line would be worth flipping out as sampling into a great new beat beatbeatbeatbeat beat beat beat. Right up there with the likes of “I think somebody better call my mother” and “this is a journey into sound”.
Ha, thanks Carolyn! Alas, I can’t claim credit for the line itself. I purloined it from Alan Moore’s Watchmen, albeit repurposed for a new context. It IS a brilliant line, though, for sure.
"only the most creatively bankrupt would steal from the planet’s entire history of art and then use that data to try to extinguish the very act of creation itself."
This is such a powerful statement. It's gutting and terrible how these tools were created. I wonder if there will ever be any accountability for the theft it took to make these. But how could there be? What a mess!
Great article. Thank you for the food for thought.
Great read!!! 👍👍👍