A few AI notions
2025-09-09 20:21:07.719145+02 by
Dan Lyke
3 comments
Ben
@franzferdinand2.bsky.social
I remember someone saying that's how you know AI can never truly be a writer.
If you tell AI to stop using em dashes, it'll stop. If you tell a writer to stop using em
dashes, they'll tell you to fuck yourself and that you can pry them from their cold, dead
fingers.
Dr. Damien P. Williams Wants A Better Sociotechnical Paradigm
@wolvendamien.bsky.social
LLM responses which do not reflect consensus reality & facts are produced via
the Same Processes which generate responses which *Do* broadly conform to consensus reality
& facts.
The Same Processes.
The Same Ones.
"Hallucination" is just a word to distance yourself from the outputs you don't
like.
Modern Diplomacy — Finance — Gemini AI Predicts Best Meme Coins to Buy in Q3: Dogwifhat, Fartcoin, Snorter (Via).
JP
@jplebreton@mastodon.social
@mhoye being 100% right about VR, then crypto, then NFTs, then the metaverse,
then LLMs doesn't feel good. it feels like being the sole adult in a peewee football
match. just pitiful. we could be living in a world that is solving real problems.
[ related topics:
Interactive Drama Writing Sports Marketing Cryptography Currency Artificial Intelligence
]
comments in descending chronological order (reverse):
#Comment Re: A few AI notions made: 2025-09-15 09:36:57.565378+02 by:
Dan Lyke
I'll have to work on getting "confabulation" into my lexicon.
And, yeah, on "hallucination": Anthony
@abucci@buc.ci
The only hallucinating happening with LLMs is when you, the user, look at the
output and think it is real, or true, or correct; or that the LLM is answering a question
or following a command. It's not unlike looking at a movie set and believing you're looking
at a real town.
#AI #GenAI #GenerativeAI #LLM #Copilot #Claude #Gemini #ChatGPT
#Comment Re: A few AI notions made: 2025-09-13 00:30:48.635873+02 by:
spc476
I prefer "confabulation" myself, but that never caught on.
#Comment Re: A few AI notions made: 2025-09-12 00:50:26.379281+02 by:
Definitely Not a Bot
"Hallucination" is a category error. It falsely implies perception.