CRYPTOREPORTCLUB
Friday, October 24, 2025
How to ensure youth, parents, educators and tech companies are on the same page on AI

October 23, 2025

Lisa Lock, scientific editor; Alexander Pol, deputy editor

Editors' notes: This article has been reviewed according to Science X's editorial process and policies. Editors have highlighted the following attributes while ensuring the content's credibility: fact-checked, trusted source, written by researcher(s), proofread.

Image: daydreaming at computer. Credit: Unsplash/CC0 Public Domain

Artificial intelligence is now part of everyday life. It's in our phones, schools and homes. For young people, AI shapes how they learn, connect and express themselves. But it also raises real concerns about privacy, fairness and control.

AI systems often promise personalization and convenience. But behind the scenes, they collect vast amounts of personal data, make predictions and influence behavior, often without clear rules or consent.

This is especially troubling for youth, who are often left out of conversations about how AI systems are built and governed.

The author’s guide on how to protect youth privacy in an AI world.

Concerns about privacy

My research team conducted a national study and heard from youth aged 16 to 19 who use AI daily—on social media, in classrooms and in online games.

They told us they want the benefits of AI, but not at the cost of their privacy. While they value tailored content and smart recommendations, they feel uneasy about what happens to their data.

Many expressed concern about who owns their information, how it is used and whether they can ever take it back. They are frustrated by long privacy policies, hidden settings and the sense that you need to be a tech expert just to protect yourself.

As one participant said, "I am mainly concerned about what data is being taken and how it is used. We often aren't informed clearly."

Uncomfortable sharing their data

Young people were the most uncomfortable group when it came to sharing personal data with AI. Even when they got something in return, like convenience or customization, they didn't trust what would happen next. Many worried about being watched, tracked or categorized in ways they can't see.

This goes beyond technical risks. It's about how it feels to be constantly analyzed and predicted by systems you can't question or understand.

AI doesn't just collect data, it draws conclusions, shapes online experiences, and influences choices. That can feel like manipulation.

Parents and teachers are concerned

Adults (educators and parents) in our study shared similar concerns. They want better safeguards and stronger rules.

But many admitted they struggle to keep up with how fast AI is moving. They often don't feel confident helping youth make smart choices about data and privacy.

Some saw this as a gap in digital education. Others pointed to the need for plain-language explanations and more transparency from the tech companies that build and deploy AI systems.

Professionals focus on tools, not people

The study found AI professionals approach these challenges differently. They think about privacy in technical terms such as encryption, data minimization and compliance.

While these are important, they don't always align with what youth and educators care about: trust, control and the right to understand what's going on.

Companies often see privacy as a trade-off for innovation. They value efficiency and performance and tend to trust technical solutions over user input. That can leave out key concerns from the people most affected, especially young users.

Power and control lie elsewhere

AI professionals, parents and educators influence how AI is used. But the biggest decisions happen elsewhere. Powerful tech companies design most digital platforms and decide what data is collected, how systems work and what choices users see.

Even when professionals push for safer practices, they work within systems they did not build. Weak privacy laws and limited enforcement mean that control over data and design stays with a few companies.

This makes transparency and holding platforms accountable even more difficult.

What's missing? A shared understanding

Right now, youth, parents, educators and tech companies are not on the same page. Young people want control, parents want protection and professionals want scalability.

These goals often clash, and without a shared vision, privacy rules are inconsistent, hard to enforce or simply ignored.

Our research shows that ethical AI governance can't be solved by one group alone. We need to bring youth, families, educators and experts together to shape the future of AI.

The PEA-AI model

To guide this process, we developed a framework called PEA-AI: Privacy–Ethics Alignment in Artificial Intelligence. It helps identify where values collide and how to move forward. The model highlights four key tensions:

  1. Control versus trust: Youth want autonomy. Developers want reliability. We need systems that support both.
  2. Transparency versus perception: What counts as "clear" to experts often feels confusing to users.
  3. Parental oversight versus youth voice: Policies must balance protection with respect for youth agency.
  4. Education versus awareness gaps: We can't expect youth to make informed choices without better tools and support.

What can be done?

Our research points to six practical steps:

  • Simplify consent. Use short, visual, plain-language forms. Let youth update settings regularly.
  • Design for privacy. Minimize data collection. Make dashboards that show users what's being stored.
  • Explain the systems. Provide clear, non-technical explanations of how AI works, especially when used in schools.
  • Hold systems accountable. Run audits, allow feedback and create ways for users to report harm.
  • Teach privacy. Bring AI literacy into classrooms. Train teachers and involve parents.
  • Share power. Include youth in tech policy decisions. Build systems with them, not just for them.
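The first two steps, simplified consent and a user-facing privacy dashboard, can be sketched in code. This is a minimal illustration under assumptions of my own, not any real platform's API; the names `ConsentRecord`, `grant`, `revoke` and `summary` are hypothetical.

```python
from dataclasses import dataclass, field

# Hypothetical sketch: a per-user consent record that a youth-facing
# platform could expose in plain language and let users revisit anytime.
@dataclass
class ConsentRecord:
    user_id: str
    purposes: dict = field(default_factory=dict)  # purpose -> allowed?

    def grant(self, purpose: str) -> None:
        self.purposes[purpose] = True

    def revoke(self, purpose: str) -> None:
        # "Let youth update settings regularly": revoking is as easy as granting.
        self.purposes[purpose] = False

    def summary(self) -> str:
        # A short, plain-language view of what is stored and why,
        # instead of a 90,000-word policy.
        lines = [f"Data use for {self.user_id}:"]
        for purpose, allowed in sorted(self.purposes.items()):
            lines.append(f"  {purpose}: {'allowed' if allowed else 'off'}")
        return "\n".join(lines)

record = ConsentRecord("youth_001")
record.grant("content recommendations")
record.grant("ad targeting")
record.revoke("ad targeting")
print(record.summary())
```

The design choice mirrors the article's argument: consent is a living setting the user can read and change, not a one-time checkbox buried in legal text.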

AI can be a powerful tool for learning and connection, but it must be built with care. Right now, our research suggests young people don't feel in control of how AI sees them, uses their data or shapes their world.

Ethical AI starts with listening. If we want digital systems to be fair, safe and trusted, we must give youth a seat at the table and treat their voices as essential, not optional.

Provided by The Conversation

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Citation: How to ensure youth, parents, educators and tech companies are on the same page on AI (2025, October 23), retrieved 23 October 2025 from https://techxplore.com/news/2025-10-youth-parents-tech-companies-page.html

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.

Explore further

Online privacy policies can be 90,000 words long. Here are three ways to simplify them


Disclaimer: Information found on cryptoreportclub.com is those of writers quoted. It does not represent the opinions of cryptoreportclub.com on whether to sell, buy or hold any investments. You are advised to conduct your own research before making any investment decisions. Use provided information at your own risk.
cryptoreportclub.com covers fintech, blockchain and Bitcoin bringing you the latest crypto news and analyses on the future of money.

© 2023-2025 Cryptoreportclub. All Rights Reserved
