CRYPTOREPORTCLUB
  • Crypto news
  • AI
  • Technologies
Wednesday, July 30, 2025
From position to meaning: How AI learns to read

July 7, 2025

Sadie Harley, scientific editor
Robert Egan, associate editor

Editors' notes: This article has been reviewed according to Science X's editorial process and policies. Editors have highlighted the following attributes while ensuring the content's credibility: fact-checked, trusted source, proofread.

Image: Gemini AI. Credit: Unsplash/CC0 Public Domain

The language capabilities of today's artificial intelligence systems are astonishing. We can now engage in natural conversations with systems like ChatGPT, Gemini, and many others, with a fluency nearly comparable to that of a human being. Yet we still know very little about the internal processes in these networks that lead to such remarkable results.

A study titled "A Phase Transition between Positional and Semantic Learning in a Solvable Model of Dot-Product Attention," published in the Journal of Statistical Mechanics: Theory and Experiment, reveals a piece of this mystery.

It shows that when small amounts of data are used for training, neural networks initially rely on the position of words in a sentence. However, as the system is exposed to enough data, it transitions to a new strategy based on the meaning of the words.

The study finds that this transition occurs abruptly, once a critical data threshold is crossed—much like a phase transition in physical systems. The findings offer valuable insights for understanding the workings of these models.

Just like a child learning to read, a neural network starts by understanding sentences based on the positions of words: depending on where words are located in a sentence, the network can infer their relationships (are they subjects, verbs, objects?). However, as the training continues—the network "keeps going to school"—a shift occurs: word meaning becomes the primary source of information.

This, the new study explains, is what happens in a simplified model of the self-attention mechanism—a core building block of transformer language models, like the ones we use every day (ChatGPT, Gemini, Claude, etc.).

A transformer is a neural network architecture designed to process sequences of data, such as text, and it forms the backbone of many modern language models. Transformers specialize in understanding relationships within a sequence and use the self-attention mechanism to assess the importance of each word relative to the others.
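The mechanism described above can be sketched in a few lines of NumPy. This is a minimal single-head dot-product self-attention, not the simplified solvable model analyzed in the paper; the dimensions and random weights are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Single-head dot-product self-attention over a sequence X of shape (T, d)."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])   # pairwise relevance between words
    A = softmax(scores, axis=-1)              # each row: attention weights over the sequence
    return A @ V, A

rng = np.random.default_rng(0)
T, d = 5, 8                                   # 5 "words", embedding dimension 8
X = rng.normal(size=(T, d))
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
out, A = self_attention(X, Wq, Wk, Wv)
print(out.shape)        # (5, 8): one updated representation per word
print(A.sum(axis=1))    # each row of attention weights sums to 1
```

Each output row is a weighted mixture of all the words' value vectors, which is exactly how the network "assesses the importance of each word relative to the others."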

"To assess relationships between words," explains Hugo Cui, a postdoctoral researcher at Harvard University and first author of the study, "the network can use two strategies, one of which is to exploit the positions of words." In a language like English, for example, the subject typically precedes the verb, which in turn precedes the object. "Mary eats the apple" is a simple example of this sequence.

"This is the first strategy that spontaneously emerges when the network is trained," Cui explains. "However, in our study, we observed that if training continues and the network receives enough data, at a certain point—once a threshold is crossed—the strategy abruptly shifts: the network starts relying on meaning instead."

"When we designed this work, we simply wanted to study which strategies, or mix of strategies, the networks would adopt. But what we found was somewhat surprising: below a certain threshold, the network relied exclusively on position, while above it, only on meaning."
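The two strategies Cui describes can be caricatured in a toy example. Below, attention scores are computed either from position embeddings alone or from token (meaning) embeddings alone; the vocabulary, embeddings, and sentences are invented for illustration and are not from the study.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def attn(embeddings):
    """Attention pattern from dot products of the given embeddings."""
    return softmax(embeddings @ embeddings.T)

rng = np.random.default_rng(1)
vocab = {"Mary": 0, "eats": 1, "apple": 2, "John": 3, "pear": 4}
E = rng.normal(size=(len(vocab), 4))   # token (meaning) embeddings
P = rng.normal(size=(3, 4))            # position embeddings for a 3-word sentence

sent1 = ["Mary", "eats", "apple"]
sent2 = ["John", "eats", "pear"]
emb1 = np.stack([E[vocab[w]] for w in sent1])
emb2 = np.stack([E[vocab[w]] for w in sent2])

# Positional strategy: scores depend only on P, so both sentences
# get the *same* attention pattern regardless of which words appear.
A_pos1, A_pos2 = attn(P), attn(P)
# Semantic strategy: scores depend on the token embeddings, so the
# pattern changes with the words themselves.
A_sem1, A_sem2 = attn(emb1), attn(emb2)

print(np.allclose(A_pos1, A_pos2))   # True: position is blind to word identity
print(np.allclose(A_sem1, A_sem2))   # False: meaning distinguishes the sentences
```

The contrast makes the trade-off concrete: the positional strategy is cheap and works with little data, but only the semantic strategy can tell "Mary eats apple" apart from "John eats pear."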

Cui describes this shift as a phase transition, borrowing a concept from physics. Statistical physics studies systems composed of enormous numbers of particles (like atoms or molecules) by describing their collective behavior statistically.

Similarly, neural networks—the foundation of these AI systems—are composed of large numbers of "nodes," or neurons (named by analogy to the human brain), each connected to many others and performing simple operations. The system's intelligence emerges from the interaction of these neurons, a phenomenon that can be described with statistical methods.

This is why we can speak of an abrupt change in network behavior as a phase transition, similar to how water, under certain conditions of temperature and pressure, changes from liquid to gas.
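The abrupt switch can be illustrated with a deliberately simple caricature (not the paper's solvable model): suppose the positional strategy has a fixed generalization error, while the semantic strategy's error shrinks as the number of training samples n grows. Whichever strategy yields the lower error wins, and the winner flips abruptly once n crosses a threshold. The error curves below are invented purely for illustration.

```python
import numpy as np

# Hypothetical error curves: the positional strategy plateaus at a fixed
# error, while the semantic strategy improves as training data accumulates.
def err_positional(n):
    return 0.30 * np.ones_like(n, dtype=float)

def err_semantic(n):
    return 0.10 + 8.0 / n   # estimation error vanishes with more samples

n = np.array([10, 20, 40, 80, 160])
strategy = np.where(err_semantic(n) < err_positional(n), "semantic", "positional")
for samples, s in zip(n, strategy):
    print(samples, s)   # positional below the threshold, semantic above it
```

In this toy version the crossover happens at n = 40: below it the positional strategy dominates, above it the semantic one does, mirroring the sharp transition the authors report.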

"Understanding from a theoretical viewpoint that the strategy shift happens in this manner is important," Cui emphasizes.

"Our networks are simplified compared to the complex models people interact with daily, but they can give us hints to begin to understand the conditions that cause a model to stabilize on one strategy or another. This theoretical knowledge could hopefully be used in the future to make the use of neural networks more efficient and safer."

More information: A Phase Transition between Positional and Semantic Learning in a Solvable Model of Dot-Product Attention, Journal of Statistical Mechanics: Theory and Experiment (2025).

Provided by SISSA Medialab

Citation: From position to meaning: How AI learns to read (2025, July 7), retrieved 7 July 2025 from https://techxplore.com/news/2025-07-position-ai.html

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.


Disclaimer: Information found on cryptoreportclub.com reflects the opinions of the writers quoted. It does not represent the opinions of cryptoreportclub.com on whether to sell, buy, or hold any investments. You are advised to conduct your own research before making any investment decisions. Use the information provided at your own risk.
cryptoreportclub.com covers fintech, blockchain, and Bitcoin, bringing you the latest crypto news and analyses on the future of money.

© 2023-2025 Cryptoreportclub. All Rights Reserved
