I'm curious about the strong negative feelings towards AI and LLMs. While I don't defend them, I see their usefulness, especially in coding. Is the backlash due to media narratives about AI replacing software engineers? Or is it the theft of training material without attribution? I want to understand why this topic evokes such emotion and why discussions often focus on negativity rather than control, safety, or advancements.
AI has only one problem to solve: salaries
Many people on Lemmy are extremely negative towards AI, which is unfortunate. There are MANY dangers, but there are also many obvious use cases where AI can be of help (summarizing a meeting, cleaning up a piece of text, etc.).
Yes, the way these models have been trained is shameful, but unfortunately that ship has sailed, let's be honest.
Because so far we've only seen the negative impacts on human society, IMO. The latest news hasn't helped at all, not to mention how the USA is moving towards AI. Every positive of AI ends up being used in the workplace, which will most likely lead to layoffs. I'm starting to think that Finch in POI was right all along.
edit: They sell us an unfinished product, which we then build on in the wrong way.
My skepticism is because it's kind of trash for general use. I see great promise in specialized A.I., though. Stuff like Deepfold, or astronomy, where the telescope data comes in so fast it would take years for humans to go through it all.
But I don't think it should be in everything. Google shouldn't be sticking LLM summaries at the top of search results. It hallucinates, so I need to check the veracity anyway. In medicine, it can help double-check, but it can't be the doctor. It's just not there yet and might never get there. Progress has kind of stalled.
So, I don’t “hate” any technology. I hate when people misapply it. To me, it’s (at best) beta software and should not be in production anywhere important. If you want to use it for summarizing Scooby Doo episodes, fine. But it shouldn’t be part of anything we rely on yet.
Also, it should never be used for art. I don't care if you need to make a logo for a company and A.I. spits out whatever. But real art is about humans expressing something. We don't value cave paintings because they're perfect. We value them because someone thousands of years ago made them.
So, that's something I hate about it. People think it can "democratize" art. Art is already democratized. I have a child's drawing on my fridge that means more to me than anything in any museum. The beauty of some things is not that they were generated. It's that someone cared enough to try. I'd rather have a misspelled crayon card from my niece than some shit ChatGPT generated.
Yeah, “democratize art” means “I’m jealous of the cash sloshing around out there.”
People say things like “I’m not as good as this guy on TikTok.” Why do you need to be? Literally, who asked?
Because the goal of "AI" is to make the vast majority of us obsolete. The billion-dollar question AI is trying to solve is "why should we continue to pay wages?" That is bad for everyone who isn't part of the owner class. Even if you personally benefit from using it to make yourself more productive/creative/…, the data you input can and WILL eventually be used against you.
If you only self-host and know what you’re doing, this might be somewhat different, but it still won’t stop the big guys from trying to swallow all the others whole.
Reads like a rant against the industrial revolution. “The industry is only concerned about replacing workers with steam engines!”
You’re probably not wrong. It’s definitely along the same lines… although the repercussions of this particular one will be infinitely greater than those of the industrial revolution.
Also, industrialization made for better products because of better manufacturing processes. I'm by no means sure we can say the same about AI. Maybe some day, but today it's just "an advanced dumbass" in most real-world scenarios.
Read ‘The Communist Manifesto’ if you’d like to understand in which ways the bourgeoisie used the industrial revolution to hurt the proletariat, exactly as they are with AI.
The industrial revolution is what made socialism possible, since a smaller number of workers can now support the elderly, children, etc.
Just look at China before and after industrializing. Life expectancy is way up, and the government can provide services like public transit and medicine (for a nominal fee).
We’re discussing how industry and technology are used against the proletariat, not how state economies form. You can read the pamphlet referenced in the previous post if you’d like to understand the topic at hand.
You should check out this https://thenib.com/im-a-luddite/
I can only speak as an artist.
Because its entire functionality is based on theft. Companies are stealing the works of people and profiting off of them with no payment to the artists whose works its platform is based on.
You often hear the argument that all artists borrow from others, but if I created an anime that blatantly copied the style of Studio Ghibli, I'd rightly be sued. On top of that, AI copies so obviously that it recreates the watermarks from the original artists.
Fuck AI
AI is theft in the first place. None of the current engines have gotten their training data legally. They are based on pirated books and scraped content taken from websites that explicitly forbid use of their data for training LLMs.
And all that to create mediocre parrots with dictionaries that are wrong half the time, and often enough give dangerous, even lethal advice, all while wasting power and computational resources.
My main gripes are more philosophical in nature: should we automate away certain parts of the human experience? Should we automate art? Should we automate human connections?
On top of these, there’s also the concern of spam. AI is quick enough to flood the internet with low-effort garbage.
If you don’t hate AI, you’re not informed enough.
It has the potential to disrupt pretty much everything in a negative way. Especially when regulations always lag behind. AI will be abused by corporations in the worst way possible, while also being bad for the planet.
And the people who are most excited about it, tend to be the biggest shitheads. Basically, no informed person should want AI anywhere near them unless they directly control it.