I've found AI to be really helpful sometimes. I also really, really hate generative AI slop. So I created some rules for myself when using AI in my creative process.
You’ve given this subject a lot of thoughtful attention, so let me ask what you would do:
You just found out that one of your favorite songs of all time, let's say it's "Norwegian Wood," was created not by your beloved Beatles but by AI. Do you still listen to it because, whether made by man or machine, it is a great song? Do you stop listening to not only "Norwegian Wood" but everything the Beatles wrote, out of concern that all of their songs may have been generated by AI? Or do you embrace AI, on the basis that if a machine could produce a song like this then you want that machine (à la Estelle Reiner's famous line, "I'll have what she's having," in When Harry Met Sally)?
Haha, it's a great question. I don't think it would happen, but if it did, yes, I think it would impact my enjoyment of the track. Part of the reason we listen to songs is to hear the effort that went into them, to connect with the artist, and because we love stories. So yeah, to be honest, I probably wouldn't listen anymore.
I agree with Stephan, this is thoughtful and nuanced, which is hard to find within creative circles. And if I used AI, I like to think I would also adopt some sort of framework like this. But that said, my gut reaction to this essay, and others similar in approach, is to wonder where the environmental impact sits in all of this for you. From what I've read and understand, it's pretty devastating: much, much worse than the other types of computing we do on a regular basis. And not just environmentally devastating, but also culturally, in the way that data centers are impacting the specific places where they're being built (and imposed upon). I'm worried that this part of AI's impact is being lost in the conversations around it, because we're all simply accepting that it's here to stay. I would say this is the one thing I'm having a hard time wrapping my brain around, and the reason I've taken active steps to remove it from my life wherever possible (switching browsers, etc.). That said, I understand that my feelings around it are probably also wrapped up in the existential threat I feel as a creative... I appreciate your thoughtfulness, so I'd love to know how you're thinking about this.
It’s a really good question and definitely a concern. I suppose my calculation is something like this…
1. I don’t believe personal use of AI is a big driver of its environmental impact. Partly that’s because I know entire engineering teams that are now using it 24/7, and any use you or I could have is just dwarfed by that. It’s like when corporations convinced us that people taking one or two flights a year were causing global warming, as opposed to commercial and industrial emissions, which were always 99% of the problem. Partly it’s because I’ve read that a single AI query uses about as much energy as 5 Google searches, but I’d say a single query is about 10-15x more productive than a single Google search, so that calculation makes my personal use feel fairly reasonable.
2. Generative music and video are much more energy intensive, but yeah, I think that stuff is also bad for lots of other reasons (as outlined above and in Adam’s video), so I’m generally not down with it.
3. I do believe that AI is so useful and productive for certain fields and industries that it’s here to stay long-term, and I would rather be an advocate of moderate, reasonable use of it.
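The back-of-envelope comparison in point 1 can be sketched numerically. Here's a tiny Python check, treating one Google search as the unit of energy and using the claimed figures (5 searches' worth of energy per AI query, 10-15x the productivity) purely as assumptions, not measurements:

```python
# Rough sketch of the energy-per-usefulness argument in point 1.
# All numbers are the claimed figures from the comment, not measurements.
energy_per_search = 1.0        # one Google search = 1 unit of energy
energy_per_ai_query = 5.0      # claimed: one AI query ~ 5 searches of energy

for productivity in (10, 15):  # claimed: one query ~ 10-15x as productive
    # Energy spent per "unit of usefulness" by each tool
    query_cost = energy_per_ai_query / productivity
    print(f"at {productivity}x productivity, an AI query costs "
          f"{query_cost:.2f} units of energy per unit of usefulness, "
          f"vs {energy_per_search:.2f} for a search")
```

On those assumptions the query comes out ahead (0.50 and 0.33 units vs 1.0 per unit of usefulness), but the conclusion is only as good as the two claimed figures.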
And then on top of that, I’m curious and intrigued and find it really helpful at times! I want to learn and grow as a person and do cool things, and I think it can help me with that! And if in my very small field of influence, I can help push people toward uses of AI that don’t replace humans but rather push us to be better, then that’s what I’d like to do.
But also… life is great without it too so if you don’t want to partake, absolutely go for it!
This is all great and very thoughtful. I'm encouraged to do more research on energy usage, and also to be clearer about when what I'm really thinking about is generative AI use specifically. I hope we can talk more about this in person some day! (It's rare to be able to have nuanced conversations about this...)
Some thoughts that come to mind: I struggle with the "inevitability" narrative that we're sold by tech companies and billionaires. The "it's here to stay" thing, so we may as well buy what they're selling. It feels very convenient, especially when billions of dollars have been invested and investors' profits are on the line. I think we're sold this narrative in part so that we're led to believe our individual consumption doesn't matter. Which leads me to the other thing I struggle with, which is the idea that individual use cases don't matter. I certainly agree that, with all resource use, corporations and billionaires are the largest offenders. We need regulation so badly...! But I once again think it's a way of selling the idea that what we do as individuals doesn't matter. I struggle with that because I think we're told that in so many different ways to keep us as passive consumers of everything, not just AI. And I'm not pointing to your specific use here (if only everyone were as thoughtful about this as you!). I'm sincerely happy that it's useful for you.
It feels akin to social media, or even music streaming: individual use cases might be great. There are certainly lots of wonderful things that happen because of these platforms. But if we zoom out, the system as a whole is hugely toxic and hurtful. So at what point does the toll of the whole system outweigh the positive individual uses? And at what point do we say that we're not buying what they're selling because it's not worth it?
It’s a good conversation to have! I would definitely love to have it in person someday!
My bro-in-law is an engineer and runs a software company. He says they basically never write code anymore — AI just does what they used to do 100x more efficiently and productively. So for certain areas of society, it is inevitable, not because some corporate interest is telling you it’s inevitable, but because it’s so valuable, efficient, and impactful that all the incentives are driving its uptake. Stopping it would be like going up to a farmer who has sweated and toiled over his field his whole life to squeeze out a tiny amount of food, but then discovered that a horse and plow let him do it much quicker and harvest far more, and trying to take the horse and plow away from him and make him go back to sweating and toiling. It’s being used because it works incredibly well.
I don’t think the same calculus applies to the arts, because efficiency isn’t the goal there. So yeah, I don’t think it’s inevitable in the arts, and that’s why I’m trying to promote a specific approach to it.
Yeah, I agree and have seen that it's a game changer in certain industries. It's that it's being pushed down individual consumers' throats in every possible place that bothers me so deeply. Maybe it's my stubbornness coming out. But I fear we're losing creativity and judgement and logic and attention at an alarming pace. I fear for our young thinkers. And I fear so deeply for our aching planet. Again, this is not about individual use cases like yours. I'm happy for people like you who have found it helpful. This is just where my brain goes when talking about AI-at-large and my own fears and questions. And it probably doesn't help that I'm not privy to smarter, nuanced conversations around it. Maybe all of this comes from the fact that I feel a creeping, existential dread around how our brains are changing. I don't know, I just know I worry deeply about it.
That said, I think we're essentially on the same page here! I agree with most everything you're saying, I think we're just coming at it from different angles.
I think you have the right of it, Sova. It's more like a toxic pesticide: it works great as a pesticide, but at the same time it kills everyone who comes in contact with it.
I'm also a software developer, and my own anecdotal experience is that it's not as easy as suddenly 100x-ing output. Sure, anyone can generate a billion lines of code all day, but literally writing the code was never the hard part of software delivery.
Just like how you can generate 10k songs every day, but none of them are good.
And by the way, many toxic pesticides are banned, and so was asbestos, which is a great insulator. AI has already been linked to many deaths, and it will be linked to more. The issue is that corporate America is so blinded by the possibility of replacing people with text generators that it's very hard to argue against the framing Ian described, where all the incentives drive uptake because companies can supposedly 100x productivity.
Yeah, I agree with everything you wrote.
I would definitely not use text generators to fact-check myself. It's the opposite: you need to fact-check the LLMs, because their text output has no inherent relationship to facts or reality. All text generated by LLMs is hallucinated; it just often happens not to be wrong.
For example, just a few days ago Ars Technica published a piece that included 3 LLM-generated quotes, which they presented as real quotes.
I’ve been reading your comments, and I agree with a lot of what you’re saying! There’s a heck of a lot not to like about AI, and I’m broadly very sympathetic to your points. I suppose the reason I wrote this is threefold: 1) I personally have found AI to be useful when used in a highly restrained, intentional way that supports rather than replaces creativity (you may disagree, that’s fine), 2) I think people currently using AI in the arts would be better off if they used it this way rather than generatively or to “maximize efficiency,” and 3) I do believe AI is here to stay, and in my very, very small corner of influence I would like to advocate for restrained, intentional use of it. I think if people did that, it would probably also resolve a lot of the environmental impacts, since it wouldn’t be this crazy, all-out race to gobble resources and build data centers or whatever. But I’m really just trying to navigate this new world myself as well! I like hearing your perspective! And I got replies also saying my approach was too restrictive, so I’m getting critiques from both sides here.
Re: fact-checking, see my Principle No. 6. I still find AI very useful for fact-checking, since when you’re writing long-form you often don’t know where your mistakes are, and AI is very good at pointing out places where you might have overstated something.