New simulation system generates thousands of training examples for robotic hands and arms

July 14, 2025


Lisa Lock, scientific editor
Andrew Zinin, lead editor

Editors' notes

This article has been reviewed according to Science X's editorial process and policies. Editors have highlighted the following attributes while ensuring the content's credibility: fact-checked, preprint, trusted source, proofread.

Simulation-based pipeline tailors training data for dexterous robots
PhysicsGen can multiply a few dozen virtual reality demonstrations into nearly 3,000 simulations per machine for mechanical companions like robotic arms and hands. Credit: Alex Shipps/MIT CSAIL, using photos from the researchers

When ChatGPT or Gemini gives what seems to be an expert response to your burning questions, you may not realize how much information it relies on to produce that reply. Like other popular generative artificial intelligence (AI) models, these chatbots rely on backbone systems called foundation models that train on billions, or even trillions, of data points.

In a similar vein, engineers are hoping to build foundation models that train a range of robots on new skills like picking up, moving, and putting down objects in places like homes and factories. The problem is that it's difficult to collect and transfer instructional data across robotic systems. You could teach your system by teleoperating the hardware step-by-step using technology like virtual reality (VR), but that can be time-consuming. Training on videos from the internet is less instructive, since the clips don't provide a step-by-step, specialized task walk-through for particular robots.

A simulation-driven approach called "PhysicsGen" from MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) and the Robotics and AI Institute customizes robot training data to help robots find the most efficient movements for a task. The system can multiply a few dozen VR demonstrations into nearly 3,000 simulations per machine. These high-quality instructions are then mapped to the precise configurations of mechanical companions like robotic arms and hands.
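The multiplication step can be pictured as fanning out many simulated variants from each recorded demonstration. The sketch below is a minimal illustration of that idea; the function name and the simple Gaussian perturbation are assumptions for illustration, not PhysicsGen's actual method, which re-optimizes each variant in a physics simulator.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def multiply_demo(demo, n_variants=100, noise=0.01):
    """Expand one demonstration (a T x D array of waypoints) into many
    slightly perturbed variants. In a full pipeline each variant would be
    re-simulated and re-optimized; here we only show the fan-out."""
    demo = np.asarray(demo, dtype=float)
    return [demo + rng.normal(0.0, noise, size=demo.shape)
            for _ in range(n_variants)]

# 24 demos x 125 variants each ~ 3,000 training trajectories.
demos = [np.linspace([0.0, 0.0], [1.0, 0.5], num=10) for _ in range(24)]
dataset = [v for d in demos for v in multiply_demo(d, n_variants=125)]
print(len(dataset))  # 3000
```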

PhysicsGen creates data that generalizes to specific robots and conditions via a three-step process. First, a VR headset tracks how humans manipulate objects like blocks using their hands. These interactions are mapped in a 3D physics simulator at the same time, visualizing the key points of our hands as small spheres that mirror our gestures. For example, if you flipped a toy over, you'd see 3D shapes representing different parts of your hands rotating a virtual version of that object.
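To make the first step concrete, here is a minimal sketch of representing tracked hand keypoints as small spheres in a simulated scene. The function name, the dictionary data structure, and the sample coordinates are assumptions for illustration; a real pipeline would hand these spheres to a physics simulator.

```python
import numpy as np

def keypoints_to_spheres(keypoints, radius=0.01):
    """Represent tracked hand keypoints as small spheres in a physics
    scene. `keypoints` is an (N, 3) array of 3D positions from a VR
    headset's hand tracker (shapes and names here are illustrative)."""
    return [{"center": np.asarray(p, dtype=float), "radius": radius}
            for p in keypoints]

# One frame of a made-up five-point fingertip track:
frame = np.array([[0.10, 0.02, 0.30],
                  [0.12, 0.03, 0.31],
                  [0.14, 0.03, 0.30],
                  [0.16, 0.02, 0.29],
                  [0.18, 0.00, 0.28]])
spheres = keypoints_to_spheres(frame)
print(len(spheres), spheres[0]["radius"])
```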

The pipeline then remaps these points to a 3D model of the setup of a specific machine (like a robotic arm), moving them to the precise "joints" where a system twists and turns. Finally, PhysicsGen uses trajectory optimization—essentially simulating the most efficient motions to complete a task—so the robot knows the best ways to do things like repositioning a box.
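The trajectory-optimization step can be sketched in miniature: start from a rough joint-space path and iteratively smooth it while keeping the start and goal fixed. This toy version minimizes only the sum of squared joint velocities; PhysicsGen's actual optimizer also enforces contact and robot dynamics, so treat the function and its parameters as illustrative assumptions.

```python
import numpy as np

def optimize_trajectory(q_start, q_goal, waypoints, iters=500, step=0.2):
    """Toy trajectory optimization: gradient descent on the sum of
    squared velocities of a joint-space path, with both endpoints held
    fixed. Converges toward the shortest (straight-line) path."""
    q = np.vstack([q_start, waypoints, q_goal]).astype(float)
    for _ in range(iters):
        # Descent direction for interior points of sum ||q[t+1]-q[t]||^2.
        grad = 2 * q[1:-1] - q[:-2] - q[2:]
        q[1:-1] -= step * grad
    return q

q0, qg = np.zeros(2), np.ones(2)
noisy = np.array([[0.8, -0.2], [0.1, 0.9], [0.9, 0.3]])
traj = optimize_trajectory(q0, qg, noisy)
```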

Each simulation is a detailed training data point that walks a robot through potential ways to handle objects. When implemented into a policy (or the action plan that the robot follows), the machine has a variety of ways to approach a task, and can try out different motions if one doesn't work.
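That fallback behavior can be sketched as a simple loop over candidate trajectories. `robot_step` is a hypothetical callback standing in for real robot control, and the whole function is an assumption for illustration rather than PhysicsGen's policy interface.

```python
def execute_with_fallback(robot_step, trajectories, tolerance=0.05):
    """Try each candidate trajectory in turn; abandon one and fall back
    to the next whenever the achieved state drifts from the target by
    more than `tolerance`. Returns the trajectory that succeeded, or
    None if all candidates failed."""
    for traj in trajectories:
        succeeded = True
        for target in traj:
            achieved = robot_step(target)
            if abs(achieved - target) > tolerance:
                succeeded = False
                break
        if succeeded:
            return traj
    return None

# Fake robot that saturates at 1.0, so the first candidate fails.
def fake_step(target):
    return min(target, 1.0)

result = execute_with_fallback(fake_step, [[0.5, 2.0], [0.3, 0.9]])
print(result)  # [0.3, 0.9]
```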

"We're creating robot-specific data without needing humans to re-record specialized demonstrations for each machine," says Lujie Yang, an MIT Ph.D. student in electrical engineering and computer science and CSAIL affiliate who is the lead author of a new paper posted to the arXiv preprint server that introduces the project. "We're scaling up the data in an autonomous and efficient way, making task instructions useful to a wider range of machines."

Generating so many instructional trajectories for robots could eventually help engineers build a massive dataset to guide machines like robotic arms and dexterous hands. For example, the pipeline might help two robotic arms collaborate on picking up warehouse items and placing them in the right boxes for deliveries. The system may also guide two robots to work together in a household on tasks like putting away cups.

PhysicsGen's potential also extends to converting data designed for older robots or different environments into useful instructions for new machines. "Despite being collected for a specific type of robot, we can revive these prior datasets to make them more generally useful," adds Yang.

Addition by multiplication

PhysicsGen turned just 24 human demonstrations into thousands of simulated ones, helping both digital and real-world robots reorient objects.

Yang and her colleagues first tested their pipeline in a virtual experiment where a floating robotic hand needed to rotate a block into a target position. The digital robot executed the task with 81% accuracy after training on PhysicsGen's massive dataset, a 60% improvement over a baseline that only learned from human demonstrations.

The researchers also found that PhysicsGen could improve how virtual robotic arms collaborate to manipulate objects. Their system created extra training data that helped two pairs of robots successfully accomplish tasks as much as 30% more often than a purely human-taught baseline.

In an experiment with a pair of real-world robotic arms, the researchers observed similar improvements as the machines teamed up to flip a large box into its designated position. When the robots deviated from the intended trajectory or mishandled the object, they were able to recover mid-task by referencing alternative trajectories from their library of instructional data.

Senior author Russ Tedrake, who is the Toyota Professor of Electrical Engineering and Computer Science, Aeronautics and Astronautics, and Mechanical Engineering at MIT, adds that this imitation-guided data generation technique combines the strengths of human demonstration with the power of robot motion planning algorithms.

"Even a single demonstration from a human can make the motion planning problem much easier," says Tedrake, who is also a senior vice president of large behavior models at the Toyota Research Institute and CSAIL principal investigator. "In the future, perhaps the foundation models will be able to provide this information, and this type of data generation technique will provide a type of post-training recipe for that model."

The future of PhysicsGen

Soon, PhysicsGen may be extended to a new frontier: diversifying the tasks a machine can execute.

"We'd like to use PhysicsGen to teach a robot to pour water when it's only been trained to put away dishes, for example," says Yang. "Our pipeline doesn't just generate dynamically feasible motions for familiar tasks; it also has the potential of creating a diverse library of physical interactions that we believe can serve as building blocks for accomplishing entirely new tasks a human hasn't demonstrated."

Creating lots of widely applicable training data may eventually help build a foundation model for robots, though MIT researchers caution that this is a somewhat distant goal. The CSAIL-led team is investigating how PhysicsGen can harness vast, unstructured resources—like internet videos—as seeds for simulation. The goal: transform everyday visual content into rich, robot-ready data that could teach machines to perform tasks no one explicitly showed them.

Yang and her colleagues also aim to make PhysicsGen even more useful for robots with diverse shapes and configurations in the future. To make that happen, they plan to leverage datasets with demonstrations of real robots, capturing how robotic joints move instead of human ones.

The researchers also plan to incorporate reinforcement learning, where an AI system learns by trial and error, so that PhysicsGen can expand its dataset beyond human-provided examples. They may also augment their pipeline with advanced perception techniques to help a robot perceive and interpret its environment visually, allowing the machine to analyze and adapt to the complexities of the physical world.

For now, PhysicsGen shows how AI can help us teach different robots to manipulate objects within the same category, particularly rigid ones. The pipeline may soon help robots find the best ways to handle soft items (like fruits) and deformable ones (like clay), but those interactions aren't easy to simulate yet.

More information: Lujie Yang et al, Physics-Driven Data Generation for Contact-Rich Manipulation via Trajectory Optimization, arXiv (2025). DOI: 10.48550/arxiv.2502.20382

Journal information: arXiv

Provided by Massachusetts Institute of Technology

This story is republished courtesy of MIT News (web.mit.edu/newsoffice/), a popular site that covers news about MIT research, innovation and teaching.

Citation: New simulation system generates thousands of training examples for robotic hands and arms (2025, July 14), retrieved 15 July 2025 from https://techxplore.com/news/2025-07-simulation-generates-thousands-examples-robotic.html

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.

