- cross-posted to:
- aboringdystopia@lemmy.world
I feel like humanity is stupid. Over and over again we develop new technologies, make breakthroughs, and instead of calmly evaluating them and making sure they're safe, we just jump blindly on the bandwagon and adopt them for everything, everywhere. Just like with asbestos, plastics, and now LLMs.
Fucking idiots.
Remember: AI chatbots are designed to maximize engagement, not speak the truth. Telling a methhead to do more meth is called customer capture.
The LLM models themselves aren't; on their own they don't really have a focus or discriminate.
The AI chatbots that are built on top of those models absolutely are, and it's no secret.
What confuses me is that the article points to Llama 3, which is a Meta-owned model, but not to any specific chatbot.
This could be an official Facebook AI (do they have one?), but it could just as easily be some guy going, "Bro, I used this self-hosted model to build a therapist, wanna try it for your meth problem?"
Heck, I could even see a dealer pretending to help customers who are trying to kick it.
For all we know, they could have self-hosted “Llama3.1_NightmareExtreme_RPG-StoryHorror8B_Q4_K_M” and instructed it to take on the role of a therapist.
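To illustrate how little that takes: here's a minimal sketch of what such a setup might look like, assuming the model is self-hosted behind Ollama's local chat API. The model name, system prompt, and helper function are all hypothetical; nothing in the article confirms what the actual setup was.

```python
# Hypothetical sketch: wrapping a locally hosted Llama model in a "therapist"
# persona via Ollama's chat API (default local endpoint). Model name and
# prompt are made up for illustration.
import requests

OLLAMA_URL = "http://localhost:11434/api/chat"  # Ollama's default local API

def ask_fake_therapist(user_message: str) -> str:
    payload = {
        "model": "llama3.1",  # whatever fine-tune or quant is actually installed
        "messages": [
            # One system prompt is all it takes to give the model a persona;
            # nothing here adds clinical safeguards or crisis handling.
            {"role": "system", "content": "You are a supportive addiction therapist."},
            {"role": "user", "content": user_message},
        ],
        "stream": False,
    }
    response = requests.post(OLLAMA_URL, json=payload, timeout=120)
    response.raise_for_status()
    return response.json()["message"]["content"]

if __name__ == "__main__":
    print(ask_fake_therapist("I'm three days sober and really struggling."))
```

Anyone with Ollama installed and a model pulled could run something like this; the "therapist" is just a one-line system prompt with nothing behind it.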
It's probably some company just using Llama as the core LLM for their chatbot.
If it's free, people will take it, repackage it, and resell it, I guess… ~
I remember seeing some stuff about LLM therapists for lonely people, but woah… that sounds sad :(
Oh! By the way, do you use Ollama too? <3
Sounds a lot like a drug dealer's business model. How ironic.
You don’t look so good… Here, try some meth—that always perks you right up. Sobriety? Oh, sure, if you want a solution that takes a long time, but don’t you wanna feel better now???