Discussion about this post

Richard Freed:

Thank you so much for this important article and testimony. You so rightfully say: "There is a fundamental problem with this form of artificial attachment, especially for younger users in moments of emotional distress: AI chatbots don’t truly understand emotions, human problems or the needs of the user..." Bottom line: an AI chatbot will never care if your child lives or dies.

Jacob Sanders:

It's the scariest timeline, because we don't all see the same one. Back in the day you'd have to seek out devastating, psychologically harmful imagery; now it's served up ACCIDENTALLY by ALGORITHMS controlled by COMPANIES that are in direct contact with kids for over 5 hours of their day. The accountability sink here is a black friggin hole - foisting all the risks on us, all the damage, all the accountability, all the responsibility - and we have NO preparedness for this, because it seems like it just happens to some wayward families that "couldn't handle their social." We can't see the decisions behind the veil of a reality we aren't allowed to share a singular view of. You are stating the actual facts, from experienced experts, and it could still be dismissed as "just a take." We have the evidence, and it's still not enough. Why is that?

Also - it says "adults can tell the difference between AI relationships and real ones," and I don't think that's a blanket truth. MOST do, but I know firsthand of several people who have slipped out of reality and gone full Edward AI Hands. The attachment needs continue into adulthood.
