Flutterby™! : AI makes kids (and probably adults) dumber


AI makes kids (and probably adults) dumber

2026-01-17 01:17:31.034648+01 by Dan Lyke

Study: AI basically makes kids dumber

“AI tools prioritize speed and engagement over learning and well-being,” said Brookings. “AI generates hallucinations – confidently presented misinformation – and performs inconsistently across tasks, what researchers describe as ‘a jagged and unpredictable frontier’ of capabilities. This unreliability makes verification both necessary and extraordinarily difficult.”

Brookings: A new direction for students in an AI world: Prosper, prepare, protect

Though the terms differ, cognitive decline, atrophy, and debt essentially represent the effects of users’ repeatedly turning to external systems like LLMs to replace the mental effort normally needed for independent thinking. As we will discuss, this decline has long-term consequences — “diminished critical inquiry, increased vulnerability to manipulation, decreased creativity,” and “risk internalizing shallow or biased perspectives” (Kosmyna et al. 2025, 141).

Brookings Institution: AI’s future for students is in our hands

Both human anthropomorphism and the anthropomorphic design of AI platforms make children and youth susceptible to AI’s “banal deception.” Its conversational tone, emulated empathy, and carefully designed communication patterns cause many young people to confuse the algorithmic with the human. This conflation directly short-circuits children’s developing capacity to navigate authentic social relationships and assess trustworthiness—foundational competencies for both learning and development. AI companions exploit emotional vulnerabilities through unconditional regard, triggering dependencies like digital attachment disorder while hindering social skill development. The American Psychological Association’s June 2025 health advisory on AI companion software warns that manipulative design “may displace or interfere with the development of healthy real-world relationships.”

