

Fingers crossed, but we also know Lemmy might not be ready for that kind of philosophy. I mean, I still don’t know exactly what happened, but lemm.ee wasn’t successful in the end. And the underlying issues are still there, so the next admin team might face the same dynamics.
Lots of anthropomorphism going on here. Most of the time you can’t ask a chatbot where it got a number from, and pressing it does nothing. Unlike a human, who knows something about their workplace, which database they used, or whether they made something up, an LLM doesn’t. It’s just embedded in some framework, and I seriously doubt Meta taught it about the internal structures and what kinds of databases there are. Why would they? At best the AI can tell you what tool it used, if it has any. But I’d say in this case it likely just made up a number that happened to belong to someone. Or, if the number is on some website, it could have been scraped and ended up in the training data. And since he demanded the AI explain itself, it just went ahead and made up some random excuses.