
Training robots without robots: Smart glasses capture first-person task demos

June 12, 2025

Ingrid Fadelli, contributing writer
Lisa Lock, scientific editor
Robert Egan, associate editor

Editors' notes

This article has been reviewed according to Science X's editorial process and policies. Editors have highlighted the following attributes while ensuring the content's credibility: fact-checked, preprint, trusted source, proofread.

A new system to collect action-labeled data for robot training using smart glasses
Human demonstrations are done with only black ovens (top). The policy transfers zero-shot to the robot with the same oven (middle) and also generalizes to a new oven instance (bottom). The points are color-coded to represent the correspondence. Credit: Liu et al.

Over the past few decades, robots have gradually started making their way into various real-world settings, including some malls, airports and hospitals, as well as a few offices and households.

For robots to be deployed on a larger scale, serving as reliable everyday assistants, they should be able to complete a wide range of common manual tasks and chores, such as cleaning, washing the dishes, cooking and doing the laundry.

Training machine learning algorithms that allow robots to successfully complete these tasks can be challenging, as it often requires extensive annotated data and/or demonstration videos showing humans completing the tasks. Devising more effective methods of collecting training data could thus be highly advantageous, as it could help to further broaden the capabilities of robots.

Researchers at New York University and UC Berkeley recently introduced EgoZero, a new system to collect ego-centric demonstrations of humans completing specific manual tasks. This system, introduced in a paper posted to the arXiv preprint server, relies on the use of Project Aria glasses, the smart glasses for augmented reality (AR) developed by Meta.

Credit: https://egozero-robot.github.io/

"We believe that general-purpose robotics is bottlenecked by a lack of internet-scale data, and that the best way to address this problem would be to collect and learn from first-person human data," Lerrel Pinto, senior author of the paper, told Tech Xplore.

"The primary objectives of this project were to develop a way to collect accurate action-labeled data for robot training, optimize for the ergonomics of the data collection wearables needed, and transfer human behaviors into robot policies with zero robot data."

EgoZero, the new system developed by Pinto and his colleagues, relies on Project Aria smart glasses to easily collect video demonstrations of humans completing tasks while performing robot-executable actions, captured from the point of view of the person wearing the glasses.

These demonstrations can then be used to train robotics algorithms on new manipulation policies, which could in turn allow robots to successfully complete various manual tasks.
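Training a policy from such demonstrations amounts to supervised learning: each recorded frame yields a (state, action) pair, and the policy is fit to predict the demonstrated action from the state. The sketch below illustrates this behavior-cloning idea on synthetic data with a simple linear policy; the data shapes, the linear model, and all variable names are illustrative assumptions, not the paper's actual Transformer policy.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical demo data: each state is a flat vector of tracked 3D points,
# each action a small end-effector pose delta. Shapes are illustrative only.
n_demos, state_dim, action_dim = 200, 12, 3
states = rng.normal(size=(n_demos, state_dim))
W_true = rng.normal(size=(state_dim, action_dim))       # unknown "expert" mapping
actions = states @ W_true + 0.01 * rng.normal(size=(n_demos, action_dim))

# Behavior cloning = supervised regression from states to demonstrated actions.
W, *_ = np.linalg.lstsq(states, actions, rcond=None)

# A single "policy rollout" step on a new state.
new_state = rng.normal(size=state_dim)
predicted_action = new_state @ W    # shape (3,)
```

With enough demonstrations the fitted policy closely recovers the demonstrator's mapping; EgoZero's contribution is producing the (state, action) pairs themselves from nothing but smart-glasses footage.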

"Unlike prior works that require multiple calibrated cameras, wrist wearables, or motion capture gloves, EgoZero is unique in that it is able to extract these 3D representations with only smart glasses (Project Aria smart glasses)," explained Ademi Adeniji, student and co-lead author of the paper.

"As a result, robots can learn a new task from as little as 20 minutes of human demonstrations, with no teleoperation."

A new system to collect action-labeled data for robot training using smart glasses
Architecture diagram. EgoZero trains policies in a unified state-action space defined as egocentric 3D points. Unlike previous methods, EgoZero localizes object points via triangulation over the camera trajectory, and computes action points via Aria MPS hand pose and a hand estimation model. These points supervise a closed-loop Transformer policy, which is rolled out on unprojected points from an iPhone during inference. Credit: Liu et al.
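The triangulation step mentioned in the caption is the classical multi-view geometry operation: given the same object point observed from several known camera poses along the glasses' trajectory, its 3D position can be solved for linearly. A minimal two-view direct linear transform (DLT) sketch is shown below; EgoZero triangulates over the full camera trajectory, and the camera intrinsics and point values here are synthetic assumptions for illustration.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3D point from two views.
    P1, P2: 3x4 camera projection matrices; x1, x2: pixel coords (u, v)."""
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, vt = np.linalg.svd(A)       # null-space vector of A is the solution
    X = vt[-1]
    return X[:3] / X[3]               # dehomogenize

def project(P, X):
    """Project a 3D point X through projection matrix P to pixel coords."""
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

# Two synthetic cameras (shared intrinsics K) observing the same point,
# the second camera shifted 0.3 m along x, mimicking head motion.
K = np.array([[500., 0., 320.], [0., 500., 240.], [0., 0., 1.]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-0.3], [0.], [0.]])])

X_true = np.array([0.2, 0.1, 2.0])
X_est = triangulate(P1, P2, project(P1, X_true), project(P2, X_true))
```

In the noiseless case the estimate matches the true point exactly; with real pixel noise, accumulating observations over the whole trajectory (as EgoZero does) averages out the error.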

To evaluate their proposed system, the researchers used it to collect video demonstrations of simple actions that are commonly completed in a household environment (e.g., opening an oven door) and then used these demonstrations to train a machine learning algorithm.

The machine learning algorithm was then deployed on a Franka Panda, a robotic arm with a gripper attached at its end. Notably, the researchers found that the robotic arm successfully completed most of the tasks they tested it on, even though the algorithm planning its movements had undergone minimal training.

"EgoZero's biggest contribution is that it can transfer human behaviors into robot policies with zero robot data, with just a pair of smart glasses," said Pinto.

"It extends past work (Point Policy) by showing that 3D representations enable efficient robot learning from humans, but completely in-the-wild. We hope this serves as a foundation for future exploration of representations and algorithms to enable human-to-robot learning at scale."

The code for the data collection system introduced by Pinto and his colleagues was published on GitHub and can be easily accessed by other research teams.

In the future, it could be used to rapidly collect datasets to train robotics algorithms, which could contribute to the further development of robots, ultimately facilitating their deployment in a greater number of households and offices worldwide.

"We now hope to explore the tradeoffs between 2D and 3D representations at a larger scale," added Vincent Liu, student and co-lead author of the paper.

"EgoZero and past work (Point Policy, P3PO) have only explored single-task 3D policies, so it would be interesting to extend this framework of learning from 3D points in the form of a fine-tuned LLM/VLM, similar to how modern VLA models are trained."


More information: Vincent Liu et al, EgoZero: Robot Learning from Smart Glasses, arXiv (2025). DOI: 10.48550/arxiv.2505.20290

Journal information: arXiv

© 2025 Science X Network

Citation: Training robots without robots: Smart glasses capture first-person task demos (2025, June 12) retrieved 12 June 2025 from https://techxplore.com/news/2025-06-robots-smart-glasses-capture-person.html

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.

