Flutterby™! (short)

Tuesday March 24th, 2026

a collective-level fail-safe feature Dan Lyke / comment 0

Psychology Today: Epistemic Injustice: The Great Gaslighting of Autistic Lives

Via and via, in linking to the latter post Manuèle Ducret @Filambulle@mastodon.social observed:

Instead of empathy deficit, autistic people demonstrate a broader moral concern, extending fairness beyond their tribes. Where researchers had assumed impairment, they found autistic people applying moral principles more consistently—even to strangers, even when costly. In a world increasingly damaged by in-group bias, this isn't a deficit; it's a collective-level fail-safe feature.

DUETCS Dan Lyke / comment 0

2023 IEEE/ACM 45th International Conference on Software Engineering (ICSE): DUETCS: Code Style Transfer through Generation and Retrieval, by Binger Chen and Ziawasch Abedjan. The measure of success includes:

Computational accuracy (AC): the percentage of programs that can be compiled and produce the same output as the ground truth reference when given the same input.

which... uh.... ✧✦Catherine✦✧ @whitequark@treehouse.systems notes

i'm at a loss of words after reading a paper about reformatting code using an ML model that has a measured statistical quantity A_c which says how often the reformatted code behaves the same as the original

the "ideal" (their choice of words) case is 64.2%

Which... explains so much about modern software. Couple of interesting additional notes from the thread.
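For the curious, the AC measure quoted above boils down to: run each restyled program, and count what fraction still behave like the reference. A minimal sketch of that harness, using Python programs as a stand-in for whatever language the benchmark actually uses (the function names and the (source, input, expected output) case format here are my own, not the paper's):

```python
import subprocess
import sys


def run_program(source: str, stdin_data: str):
    """Run a restyled program on the given input.

    Returns its stdout, or None if it fails to run cleanly
    (nonzero exit, or hangs past the timeout).
    """
    try:
        proc = subprocess.run(
            [sys.executable, "-c", source],
            input=stdin_data,
            capture_output=True,
            text=True,
            timeout=5,
        )
    except subprocess.TimeoutExpired:
        return None
    return proc.stdout if proc.returncode == 0 else None


def computational_accuracy(cases) -> float:
    """Fraction of restyled programs that still run and reproduce
    the reference output on the same input -- the paper's AC measure.

    cases: list of (restyled_source, test_input, reference_output).
    """
    if not cases:
        return 0.0
    passed = sum(
        1 for src, inp, want in cases if run_program(src, inp) == want
    )
    return passed / len(cases)
```

With two cases, one of which still works and one of which crashes after restyling, `computational_accuracy` reports 0.5 — and the paper's "ideal" result is that this number reaches 64.2%.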

Monday March 23rd, 2026

restaurant robot rebels Dan Lyke / comment 0

I have never been to a Benihana or similar, but if you're gonna mix performance and food, robots going apeshit, throwing sauces all over, and breaking silverware may be the thing that gets me into an international chain restaurant...

Watch this restaurant robot malfunction and scatter tableware during live performance.

Just look at the wait staff trying to find the setting on the app to turn the thing off. You know they're thinking about the staff meeting where this was introduced, and how this is gonna play with the management that let this happen...

Via.

Age Verification in Linux Dan Lyke / comment 0

Sam Bent: The Engineer Who Tried to Put Age Verification Into Linux

Taylor believes what he's doing is right, which makes him harder to stop than someone acting for money. The day after the systemd PR was merged, he published a post on his personal blog defending Google's new friction-heavy Android sideloading controls as a "fair trade." His argument: power users absorb a one-time inconvenience while vulnerable people (scam victims, children) get protected. He used the phrase "you shouldn't have to choose between open and secure." Taylor's blog post

I can see multiple sides to this; I don't think either issue, age verification or sideloading on Android, is completely cut and dried.

But I sure am thinking again about BSD or something that doesn't use systemd.

operate.txt Dan Lyke / comment 0

You could code your web site for accessibility. Make buttons and form elements obvious and well named. Mark up your ARIA roles.

Or this bullshit: https://github.com/serdem1/operate.txt

The same way every site has robots.txt, every site in the agentic age will need operate.txt.

You know exactly which magic strings I've put in mine...

Paid for making and streaming fake songs Dan Lyke / comment 0

US Attorney's Office Southern District of New York — North Carolina Man Pleads Guilty To Music Streaming Fraud Aided By Artificial Intelligence

“Michael Smith generated thousands of fake songs using artificial intelligence and then streamed those fake songs billions of times,” said U.S. Attorney Jay Clayton. “Although the songs and listeners were fake, the millions of dollars Smith stole was real. Millions of dollars in royalties that Smith diverted from real, deserving artists and rights holders. Smith’s brazen scheme is over, as he stands convicted of a federal crime for his AI-assisted fraud.”

Via.

AI link dump Dan Lyke / comment 0

Nadella paid $650M to recruit his AI chief. After 2 years he's quietly pushing him aside — these brutal numbers are why. Looks like it's not necessarily that people don't want AI in their Microsoft products; it's that Copilot kinda sucks.

Independent research tells a worse story. A Recon Analytics survey of more than 150,000 U.S. paid AI subscribers found that Copilot's market share fell from 18.8% in July 2025 to 11.5% by January 2026 — a 39% contraction. The most damaging finding: when workers only have access to Copilot, adoption sits at 68%. Add ChatGPT as an option and Copilot drops to 18%. Add Gemini on top of that and just 8% choose Copilot.

Via.

Frank Elavsky: Stop saying that AI is just a tool and it only matters how it is used

And tools use us by their design. This is Heidegger’s Gestell (“enframing”): the notion that technologies shape who we are because of their design and use. A hammer isn’t just made of wood and iron, then. A hammer is a hammer because of what it does and who we become when we use it.

Via.

Jeremy Keith on adactio.com and on the Fediverse:

It feels like all my peers are experiencing Deep Blue and having to choose their future career path:

expert in a dying field

or

collaborator in a fascist project.

Five Excuses for Academic Misconduct Dan Lyke / comment 0

Dr Dorothea Baur: Hallucinated References: Five Excuses for Academic Misconduct

This discussion was revealing – not because it changed my position, but because it exposes fundamental patterns in the AI debate.

Defensive deflection, TINA rhetoric, resignation, victim mentality, nihilism. These aren’t fringe phenomena, but precisely the arguments we must contend with, again and again, whenever we talk about AI.

Via.

LLM on Usenet Dan Lyke / comment 0

The Lurking LLM on Usenet and The Lurking LLM on the SmolNet


Flutterby™! is a trademark claimed by Dan Lyke for the web publications at www.flutterby.com and www.flutterby.net. Last modified: Thu Mar 15 12:48:17 PST 2001