The hidden cost of AI art: Brandon Sanderson's keynote

44 points by Apos a day ago on tildes | 39 comments

Protected | a day ago

When I was a child, all my musical taste amounted to was the casual enjoyment of fragments of whatever recognizable pop songs played on the car radio on my way home from school. We (well, our parents) had a carpool thing going on, so imagine a car stuffed full of noisy kids, eating, reading, talking, whatever. The radio was on, but maybe it was ads, or comedy, or the adult driving would want to change the station to hear the news or weather report. Whether there was music, whether I was paying attention to it, and whether I found it enjoyable was all random, which was never conducive to developing real taste. Under these circumstances, you merely enjoy what's immediately pleasurable - the musical sugar, let's call it.

I became an adult, and over the years I started paying attention to music and grew to appreciate the talent and skill of the artists who create it. My taste quickly veered towards rock, a genre that features untold heights of virtuosity when it comes to guitar, bass and drums. It grew to encompass metal and prog (and here you start having more keyboards, flutes, violins). But it's not like I require a song to have at least five different time signatures before I respect it. Enter psychedelic. Punk. Broadband Internet arrived and liberalized (in part) music publishing, and now you have new genres, new creativity. There was amazing innovation on display in dubstep, for example.

All the while, the ol' music industry is busy streamlining. It's much more profitable when artists are produced rather than found; new music is pre-planned and designed by a team of people who are very knowledgeable about formulas, appeal and marketing. Variables are eliminated; we want artists who are beautiful, stable, clean, uncontroversial. Can they sing? It doesn't matter, we autotune. Can they play an instrument? Who cares, use pre-recorded tracks. The result is the purest, most refined musical sugar. It is sweet, and sweet is safe, because it's a flavor even a child can immediately enjoy.

I have no trouble believing AI can create this kind of art. It should be able to do it perfectly. Why would it remix clichés any worse than a human? It's literally a remixing machine. That's what it's for.

But as an adult, there is an additional dimension to my enjoyment of music. I want something beyond sweet; let me taste that bitter, that savory, those notes of chocolate and smoke. When I see a traditional (modern) artist shredding their heart off, I think of the years of effort it took them to get that good. When I hear lyrics so touching they marked a generation, I marvel at how there has never been, and never will be, another song quite like that.

When I hear Jon Anderson sing, I think "holy fucking shit, he's literally better than autotune." And I don't give a damn if the artist is ugly, elderly, disabled or wrote every single one of their songs during a three year long nonstop drug binge. They are humans who struggled and sweated to create something new, and every single one of their accomplishments is more valuable than anything that will ever come out of the remixing machine, whether the machine is powered by five audio engineers, a PR manager and twenty marketing experts or by three Nvidia GPUs.

Does that mean AI is useless, undesirable or otherwise doomed to fail? No. To a lot of people, it suffices. What do people want out of art? Different things, for sure, and that's OK. There are people who want enough comfortable repetition out of their entertainment to make recurring themes, genres and formulas profitable in all kinds of fields - TV shows, LitRPGs, FPS games, whatever. One might argue - and this is a suspicion of mine rather than anything I have hard data for - that most people want at least some predictable comfort in the content they consume. Actually parsing what's taking place in an Ursula K. Le Guin novel can demand more mental bandwidth than we have on a day-to-day basis. Sometimes we're tired and just want to see yet another anime boy win a martial arts tournament, or something.

But true artists will always be the ones I admire and respect. I don't want to go without their works; I'm definitely willing to pay to experience them. Once in a while I eat chocolates and biscuits but too much sugar is cloying!

I'm currently helping with Brandon Sanderson's upcoming book, The Fires of December. Some of you may not be fans of his work - on a mental diet? - but I can guarantee that at least he wrote it himself - and a lot of people are working very hard to make sure a good book will be published later this year (note: timeline as announced; I have no privileged information and cannot answer any questions). And when you read a passage and think "that was clever!" or "that was surprising!" isn't it cool to think that was a real human being clever and surprising?

(P.S.: Pre-emptively acknowledging the pretentiousness of my disdain for modern pop music ;) )

first-must-burn | 23 hours ago

I'm currently helping with Brandon Sanderson's upcoming book, The Fires of December.

I mean, holy shit, that's a hell of a throwaway line in an already interesting post. I suppose you can't talk about it, but that's really cool.

Protected | 21 hours ago

I'm just one of the beta/gamma group. I had mentioned it before on tildes!

There are a lot of us, check the acknowledgement pages on his books :)

Grzmot | 7 hours ago

What does that mean, like test readers?

Protected | 6 hours ago

I found this nice video that explains the process (although it focuses more on beta).

kingofsnake | 21 hours ago

You're not wrong about modern pop music. I'm not that old, and I feel like the trajectory from divas to boy bands to pitiful streaming royalties to AI -- having all happened in my teen-adult lifetime -- is a testament to an industry that's not built to serve artists. It's there to squeeze them until they can be replaced.

That all said, I see companies paying big bucks to secure the rights to the cultural hits of bygone eras like it's their last chance at making money. Classic rock still saturates the radio airwaves, and (this may be age talking) popular music doesn't seem to have the pull it once did.

Frankly, I think we're seeing it sign cheques for its own funeral expenses.

DrStone | 20 hours ago

A big part of the lack of pull is just the explosion of choice in what's available and the methods of consumption leading to fragmentation and rapid churn. When all you've got are a dozen radio stations, the local record store and dive bar, and MTV for listening and discovery (mostly local acts plus global sensations), the widely shared experience lets a popular song explode into the public consciousness. Now that practically everyone can instantly broadcast to the entire world in one way or another, and everyone else can consume exactly the niche sounds they want, when and how they want, it's harder for any individual band to generate that sound-of-a-generation impact. If a band does manage to pull it off, there's so much else out there that it's a huge fight to stay relevant before the next fad. The sounds that are still considered timeless and widely loved are mostly stuck in the set from the Before Times. The same thing is happening with television, with all of the streaming platforms, on-demand binge watching, and global content deals.

kingofsnake | 20 hours ago

Nostalgics always long for better days but since looking back is a whole industry now, I'm very curious whether the next generation -- having been raised on the lookback machine with little of their own -- will turn the whole thing upside down.

Sure, that's been happening ever since youth culture took off in the west in the 50s, but like everything else over the past 10 years, it's amplified to the point of upsetting the whole damn structure of the thing.

I wonder if this whole thing will hold.

hobbes64 | 7 hours ago

They are humans who struggled and sweated to create something new, and every single one of their accomplishments is more valuable than anything that will ever come out of the remixing machine…

This part of your post reminded me of Philip K Dick’s book “The Man in the High Castle”.

The book has nothing to do with AI, but it has to do with history and historical objects and what we as humans find valuable about them. In the book, there is a sub plot about someone who is making fake historical objects and selling them. And there are some questions raised about where the value really is; in the objects themselves, or in the historical event, or in our reaction when we hold or observe such an object.
So I’m thinking about a similar thing with music and other art, and how important it is to me when I’m enjoying most art to know something about the (real) people who made it and to think about the difficulty of that work.

Sunbutt23 | 18 hours ago

Your worldliness leaves me in awe. And shame. It’s like watching Brennan Lee Mulligan answer any question. I’m left thinking how uncultured and basic I am. I’m left longing for the inspiration to expand my experience. Yet I stay. Stuck in my comfortable corporate (for lack of a better term) blandness.

Thank you for your treat of a passage. I enjoyed thinking about who you are as a person. And will now continue to wallow in my self made prison lol

Omnicrola | 9 hours ago

You are art. And we are all works in progress.

patience_limited | 5 hours ago

Let me add that it's not simply the quality and the originality of the product, but also the performance. Art is a form of communication among humans.

I've never been as electrified by a piece of music as I was watching a live small-stage performance in a Chicago jazz bar. I was literally watching the performers converse with each other and the audience using their instruments, in real time, improvising as they played.

There's an essential interplay and feedback among the artist, the body of prior artistic work, other artists, and the "consumers of the product". Even artists who claim that they're producing entirely for their own benefit are making intelligible statements to an audience and drawing on the well of works by those who preceded them. AI isn't interpreting and reflecting an audience's feelings or collaborating with other artists to make new statements; it's producing according to a set of fixed parameters based on prior work.

While LLMs and genAI can surface novel interpretations the audience hasn't seen before, based on the breadth of the training corpus, they're subject to the "not everything new is good, and not everything good is new" problem. As you say, training on the likeliest or most popular responses homogenizes the output to pablum (even financiers are complaining about this).

Protected | 4 hours ago

communication

I'm really glad you brought this up, I was wondering if someone would!

Not long ago we had here on Tildes (at least I think it was? hard to keep track of everything) a submission from an article in which a professor quoted Stephen King's On Writing: "writing is telepathy".

The professor argued that they weren't concerned about students using AI to write because AI can't write. There is no one there to communicate, so whose telepathy is the reader receiving, really?

[OP] Apos | a day ago

He also gave a talk about this that he published on his YouTube channel: We Are The Art | Brandon Sanderson’s Keynote Speech.

The surge of AI, large language models, and generated art begs fascinating questions. The industry’s progress so far is enough to force us to explore what art is and why we make it. Brandon Sanderson explores the rise of AI art, the importance of the artistic process, and why he rebels against this new technological and artistic frontier.

"Yes, the message is 'journey before destination.' It's always journey before destination."

I mostly agree with him, though I don't completely get the argument that AIs steal the opportunity for growth. I think that someone who wants to grow can still grow. You can still go through the process of "getting your diploma" even with AIs existing, and you can still become the art.

Also despite the AI not feeling anything itself and not getting changed, I think it's possible for a human to get changed by its output.

What do you guys think?

GunnarRunnar | a day ago

I mostly agree with him though I don't completely get the argument that AIs steal the opportunity for growth.

I interpret that as a warning and not as the full truth. If you're using AI to do X, you aren't "flexing the muscle" that you'd use to create X and therefore aren't getting better at X. I guess it's actually you robbing yourself of the opportunity for growth but I think it's a pretty neutral statement about AI rotting your brain.

skybrian | a day ago

I don’t know about writing novels, but writing code with a coding agent is sort of like managing a software project. (But in easy mode because you don’t have to deal with people issues.) I don’t write the code directly, but I influence it in all sorts of ways, by pointing out bugs, by asking pointed questions, or by asking it to write tools and templates and style guides and other scaffolding.

I’m learning a lot of things, and the project itself is “learning” through evolution. I’m not learning the same things I used to learn by writing code myself, but it’s definitely learning.

There are opportunities for growth that you miss by not using these tools. Of course that’s true of any activity you choose not to take up.

For many skills there are diminishing returns from more practice. I don’t want to discourage anyone from learning by writing code yourself if you’re in the early part of the learning curve, but mixing it up a bit would probably be a good idea too. CS undergrads have few opportunities to work on large-scale projects and could probably learn things from a class where you build something bigger with a coding agent.

archevel | 8 hours ago

I think there is a distinction to be made here about what it is you care about. As a company (and sometimes as a person) you rarely care about the process of producing something. You want it fast and as cheap as possible. If this is the case, using an LLM to build something is likely the pragmatic approach. You can make progress sooo much faster, which then lets you validate the idea faster. Rinse, repeat.

However, from a learning perspective, using an LLM to build software seems a bit shortsighted (a similar argument can be made for initially using an IDE when learning a new programming language). That is, unless what you are trying to learn is building software with the aid of an LLM (which I do think is a valuable skill, and this is likely part of what you are learning). But, assuming it isn't, then why not just tackle building a larger project? You'll run into issues for sure and you will not progress as fast to a "finished product", but if learning was the goal then the product is incidental.

skybrian | 7 hours ago

By large-scale, I mean the sort of thing that would take months or years (or decades) for one person to do. At a company there might be dozens of employees working on them.

By using a coding agent and building something larger than you could by hand, you will be able to get to that scale faster and learn about the kind of issues you run into in larger projects, without having to put all the time in.

TintedJellyfish | 5 hours ago

And on the way, the agent will have made all kinds of decisions that you may or may not ever learn the impact of. When you do this by hand, especially when multiple people are looking at the process, you are much more likely to have someone who notices.

[OP] Apos | a day ago

Yeah, that's more how I feel too. I went back to school this year and it's really easy to try to skip steps. I've seen people use ChatGPT to study for a test but they ended up memorising stuff that wasn't in the course. It could generate mock tests so they felt pretty confident.

chocobean | a day ago

I'm curious about how these users generated mock tests: was the AI fed a wealth of other past papers, or did it generate them only from the textbook?

[OP] Apos | a day ago

There are many ways but sometimes the teacher will give you a study guide to know what you should study for an exam. For example, in a marketing class, the teacher might say that you need to know about the four real costs of losing a client. The answer that the AI will give would sound plausible but it's not what the teacher taught in the class.

Testing it right now, ChatGPT's output is: Lost lifetime revenue; Cost to replace the client; Lost growth and upsell potential; Indirect damage through reputation and referrals.

The actual answer from the course would be: The cost of the lost sale; The cost of lost revenue; The cost of lost profit; The cost of negative publicity (reputational damage). It's close but not quite right. Compound those little errors and you lose quite a lot of points and you waste time since you're trying to memorize the wrong stuff.

Of course the AI would make more accurate tests if it was fed all the notes from the class, but people don't really tend to do that.

Also from what I could see, the tests that the teachers make are way more interesting than what the AI generates. The AI makes tests that are way easier and don't capture as much knowledge.

(Actually, some teachers were tasked by the school to make mini tests using AI (probably copilot). One teacher made us answer out loud. At the end he said that the AI was rather nice since the difficulty was really low.)

WhyCause | 17 hours ago

...I don't completely get the argument that AIs steal the opportunity for growth.

Think back to third grade and multiplication tables. We drilled those multiplication tables over and over and over, with 5-minute quizzes to test our memorization. Eventually, we got good enough that we could do those problems in our head, and could quickly do two and three-digit multiplication problems on paper.

If we had been told to "just use your calculator" in third grade, the step to algebra in eighth grade or freshman year would have been impossible; we just wouldn't have had the knowledge or practice of manipulating numbers to the extent that we could abstract it to working with symbols.

LLMs steal that opportunity to grow from you. You can't learn to twist language to your needs in a story or poem if you haven't already written crappy versions several times. You can't learn to make that final image match exactly what's in your head if you don't practice with the pencil, brush, or line tool over, and over, and over. You'll never wrench that perfect tune from your muse's grasp without getting guitar-string calluses on your fingers or chipping your teeth on your trumpet's mouthpiece.

If you only ever do basic arithmetic with a calculator, you will never get good at math; you'll only be good at pressing buttons. If you only ever use an LLM to write, draw, or make music, you will never get good at those things; you'll only be good at asking a computer to make something for you based on what has come before. Using an LLM robs you, and all of us, of that growth. Depending on an LLM for those outputs chains you to the profit-desperate companies that are trying to convince us all that there is no other way, and that will be $200 a month, please. Only using an LLM for these and other artistic endeavors leads to an eventual "heat-death" of culture, where everything is at the same, average, temperature, color, volume, and level of surprise.

LLMs lead to greige, not growth.

ETA: the final two words.

jcd | a day ago

I easily agree that art is what we define it to be, and that it is ultimately useless, which makes what AI makes not-art.

But my personal issue (a big one) with LLMs is that they can only be owned by very few: they in fact (will) become a means to further control us.

BeardyHat | a day ago

You can setup your own AI instances and run them locally. I've got an LLM setup on my server in my office, which I can then access over my local network on my phone. It keeps my data mine and uses the energy my solar panels generate for compute. I've even got my own local AI image generation. I don't use it much, but it's there if I want it.

This is slower than using something like ChatGPT or Copilot, but at least I'm not feeding into the machine.

jcd | a day ago

I can do that, but I cannot build/train such a model. Which means I can't control it. That goes for everyone who doesn't have access to a whole lot of GPUs and training data.

stu2b50 | 22 hours ago

I feel like that's overstating how hard it is to train a model. First, you have to delineate what "AI" you're talking about - the denoiser models used for image generation and LLMs are very different in structure.

For image denoisers, you really can train a model from scratch with pretty decent quality. Certainly you can fine tune models on truly consumer level hardware (a GPU with a fair amount of VRAM, or any Mac with a good amount of memory).

For LLMs, it's a bit harder. But you can fine tune a model for <$20 of GPU credits.

You can see this manifest as many people intentionally do fine-tuning on available LLM weights to, say, "de-censor" them, as they usually have some guardrails against adult content, weapons manufacturing, and so forth. Not that hard to remove - enough that randos on the internet can muddle their way through it.
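To give a rough sense of why adapter-style fine-tuning is so cheap, here's a back-of-the-envelope sketch (an illustrative calculation, not any particular library's API). LoRA-style methods freeze the original weight matrix and train only two small low-rank factors, so the trainable parameter count collapses; the dimensions below are assumed typical values:

```python
# Illustrative sketch: why LoRA-style fine-tuning fits on consumer
# hardware. Instead of updating a full d x d weight matrix W, you
# freeze W and train two small factors B (d x r) and A (r x d);
# the adapted weight is W + B @ A.

def lora_param_counts(d: int, r: int) -> tuple[int, int]:
    """Return (full fine-tune params, LoRA adapter params) for one d x d layer."""
    full = d * d          # every entry of W is trainable
    lora = d * r + r * d  # only B and A are trainable
    return full, lora

# d=4096 is a common hidden size; r=8 is a typical small rank.
full, lora = lora_param_counts(d=4096, r=8)
print(f"full: {full:,}  lora: {lora:,}  ratio: {full / lora:.0f}x")
# → full: 16,777,216  lora: 65,536  ratio: 256x
```

At a 256x reduction per layer, the optimizer state and gradients shrink accordingly, which is why a single consumer GPU (or a modest amount of rented GPU credit) is enough for fine-tuning.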

Does that take technical knowhow? Sure, but without technical knowhow it's not like you can control normal software either.

Pepetto | 22 hours ago

Presumably you can't make your own GPU from raw sand you collected yourself, or code your own kernel either...
Why would you need to be able to build/train a new model from scratch to control it?
(This might sound like a dig at you; it isn't. I'm genuinely wondering why the isolated demand for rigor.)

Omnicrola | a day ago

I find myself more and more frequently referring to a phrase I heard someone use last year : "AI is a force multiplier ".

Which to me means that if someone with no skill uses AI, you are multiplying by zero, and you will get something with little use and little meaning (slop).

And to try and apply it to Brandon's essay, I think that AI can be a very useful tool for learning and growing. I think it will be exceptionally useful in some ways for helping people reflect and grow their various skills. And in other ways it will multiply by zero and you will emerge on the other side unchanged.

What exactly the difference is, is not entirely clear yet and I think we're stumbling through figuring that part out together as a society.

raze2012 | a day ago

I don't completely get the argument that AIs steal the opportunity for growth. I think that someone that wants to grow can still grow.

New grad unemployment as of now is over 25%. Not being able to start their career and get all the things that come with a full time job definitely hampers people. And then the next batch notice this and don't bother with the huge cost a diploma entails.

[OP] Apos | 22 hours ago

Yeah, that's true. I'm currently back in school and I've spoken with some people who are scared about the future. Some of them are thinking of dropping out. One of my friends who's currently in a computer science program was thinking of switching to something else last semester. He ended up staying, but he doesn't think that he'll ever be good enough.

I've been working as a programmer for many years so I feel like I live in a different world.

AI is definitely something that teachers talk quite often about. I got AI training in multiple different courses. One thing that comes up often is how LLMs don't have any critical thinking skills.

It's definitely a huge paradigm shift.

I wonder how the unemployment ratios for graduates will shift over the coming months or years.

This is the difference between Data and a large language model, at least the ones operating right now. Data created art because he wanted to grow. He wanted to become something. He wanted to understand. Art is the means by which we become what we want to be.

The purpose of writing all those books in my earlier years wasn’t to produce something I could sell, it was to turn me into someone who could create great art. It took an amateur and it made him a professional. I think this is why I rebel against AI art products so much: because they steal the opportunity for growth from us.

Yes, it’s about the friction. What barriers is someone compelled to overcome in the pursuit of birthing some piece of story into the world? I can wholeheartedly say that I find value in some over others. Where is the reciprocity between artist and audience? It is this effort that I want to be in “conversation” with when it comes to art, this particular need and inner drive — anything else simply is not worth my time, because it’s not what I, personally, am looking for.

In my opinion, one of the most interesting aspects of this new AI craze is the schisms it seems to be exposing in how people define what a relationship is, what it means to them, what they believe it should mean to others, what we believe our relationships are worth in and of themselves, and how we communicate within the contexts of them. Often when this topic is discussed, it seems that how we each decide what we perceive to be art very much revolves around how we approach and assign meaning to these particular concepts.

Btw, the realm of fanfiction is currently grappling with the issue of LLM produced ‘content’ flooding the scene, which I find to be particularly interesting. There are a lot of really insightful perspectives to take in on the subjects of what it means to create and/or to consume going on over there.

derv82 | 5 hours ago

The Super Mario World romhacking scene has outright banned any and all AI usage. Their justification: it will erode the community.

It’s an interesting hardline stance. I feel like romhacking and fanfic communities probably have similar dynamics (creativity via popular IP).

kingofsnake | 20 hours ago

Have you read some AI fanfiction? How is it?

There’s nothing wrong with it, and it’s also pointless. It was never going to be difficult in any sense of the word for AI to write fanfic better than a lot of fanfic writers, but the output is just… empty.

There has been an influx of ‘clout-chasers’ attempting to hijack various fandoms, as well as generally younger people just getting into fanfic, and these seem to be the ones most drawn to using AI. It’s not great. They want recognition, they want community, and they’re not really getting either of those things, because they’ve entirely missed the point. They are, however, managing to wreak havoc on the already existing community, and to sow a lot of despondency in their wake as they steal others’ work and run it through LLMs to… idek — “improve” what they’ve found, tweak the fanfics to their liking, add endings to abandoned or still in-progress works.

This whole thing seems to distill down to people wanting instant gratification for doing nothing.

But, that’s just my opinion on it; there are plenty of people who genuinely enjoy and are excited to read AI generated fanfic.

kingofsnake | 15 hours ago

Thanks for the breakdown. I can imagine that the volume and mediocrity of AI work, plus seeing a very human pastime intruded on by non-authors, comes as a shock.

Part of the fun is hoping to glean something about a writer's genius from their interviews, their latest work or their view of the fandom. With AI, you get none of those, and I'm sure it's scaring the community deeply.

Pepetto | 22 hours ago

While I love most of Sanderson's books, I don't think what he says is really relevant...
He is clearly focusing on the writer's point of view, not the reader's. If he wants to grow as a writer, nothing stops him from continuing to write after the AI has replaced him. But he hasn't really provided a reason to prefer consuming human art over AI art. (I'm not saying there are no reasons to do so, but he hasn't given any.)
I don't read his books to feed his ego, or to help him achieve writerly enlightenment; I read books because they are entertaining, or help me apprehend new complex situations, or train my empathy. The end product definitely does matter very much. If AI can write books twice as good as he can, with incredibly tight plots and unforgettable twists, then I'll read AI books.

thecakeisalime | 5 hours ago

Agreed. I think as a creator, he has a valid perspective, but as a consumer of art, he didn't really delve into why AI is or isn't art. I think mostly, his point is that people who only use AI to create something aren't artists. I can probably agree with that (for now), but I also don't know that it matters beyond semantics and philosophy. If I take a picture of something with my camera phone, am I a photographer? Or just someone who took a picture? Is it art?

When I look at the Mona Lisa, something from Deviant Art, or something generated by AI, I get to see the work in my own way. Once it's out there, the artist doesn't get to decide how I feel about their work. I mostly don't care how much blood, sweat, and tears were poured into the work. I'm just there to consume and experience the end result. I strongly believe in the separation of art from the artist, which also applies to AI generated art. Does the work stand on its own merit?

I have reasons for preferring non-AI work (and art created by non-shitty people), but it has little to do with the definition of "art" and who created it, and much more to do with how terrible capitalism is.

lelio | 23 hours ago

Sanderson talks about the journey. The growth that the artist achieves during creation.

Our AI models are experiencing growth. The creation of one piece of content might not have a direct impact on the model that made it. I don't think it works like that. But people are experiencing AI art and responding to it. Other people are developing different AI models, experimenting with different ways to interpret and manipulate data. The growth of AI is humanity's collective growth as well. It has to be; there doesn't appear to be any sentience involved in the models themselves yet.

We are trying to recreate our brains. That was the whole idea of neural networks right? We don't understand how our brains work yet. But trying to design your own version of something is a great way to learn about it. Ideally, the fields of neurology and machine learning should be developing in parallel and informing each other.

I don't believe in God. I think all the amazing things that come from the human brain could be expressed with physics and math.

When AI spits out a soulful blues cover of Warren G's "Regulate" and it actually sparks an emotional response in me, it's kind of a parlour trick. Without really understanding it, we've figured out how to hit all the right buttons in our brains to trigger those emotional responses. I don't believe the current AI models have any significant intelligence or understanding of what they are doing. But, in addition to the mathematically induced emotional response, I feel awe and excitement that human civilization is progressing down this road to understanding how our brains work! What are emotions? Where do they come from? AI art is an artifact of us taking baby steps toward answering those questions.

Being able to understand our brains would allow us to better treat them and improve them. Being able to understand brains in general and intelligence and emotion could allow us to design new and diverse types of brains. It could help our society become something amazing.

I think he hits on the real problem here: These are made to be products. The main incentive to create them is to attract investors and ultimately extract profit.

Because it seems the only way we can do anything in our current civilization is to find a way to make it profitable. Even if a project is started for the betterment of society. We have to find a way to make it profitable or it won't have any resources dedicated to it. Almost immediately profit is the only goal. The whole project is twisted towards profit regardless of whether it's harmful or helpful.

Like Openai starting out open source and non-profit and all that getting chucked out the window a few years later.

So we get stuck in this loop where we see tech being used in harmful ways and resent the tools rather than the system that abuses it.

I find AI art really compelling. Even the bad, weird stuff. It makes me think about what art is, where it comes from, how humans make it and experience it. It's a type of art we've never seen before! I don't really care whether it's "bad" or "authentic" or whatever.

It seems clear to me that all the negative feeling and resentment people have about AI stems from the inequality and fear that is so prevalent in the world today. People are right to be angry about those things. I just wish we could direct the anger at the system that creates them instead of the tools it uses.

Tax these giant tech companies out of existence and use that money to fund R&D on AI and a million other things that can help everyone and be owned by all of us collectively. Free markets are great for efficiency and commodities. Just let them run in their own walled-off sandboxes. Important, long term projects need to be done on purpose and mindfully.