Integrated multi-modal sensing and learning system could give robots new capabilities

November 26, 2024 feature

Editors' notes


This article has been reviewed according to Science X's editorial process and policies. Editors have highlighted the following attributes while ensuring the content's credibility:

fact-checked, preprint, trusted source, proofread


A new multi-modal sensing and learning platform that could enhance robot manipulation
Soft robot fingers equipped with tactile sensors grasping an egg. The bottom-right images show the tactile sensing results. Credit: Binghao Huang.

To assist humans with household chores and other everyday manual tasks, robots should be able to effectively manipulate objects that vary in composition, shape and size. The manipulation skills of robots have improved significantly over the past few years, in part due to the development of increasingly sophisticated cameras and tactile sensors.

Researchers at Columbia University have developed a new system that simultaneously captures both visual and tactile information. The tactile sensor they developed, introduced in a paper presented at the Conference on Robot Learning (CoRL) 2024 in Munich, could be integrated onto robotic grippers and hands to further enhance the manipulation skills of robots with varying body structures.

The paper was published on the arXiv preprint server.

"Humans perceive the environment from multiple sensory modalities, among which touch plays a critical role in understanding physical interactions," Yunzhu Li, senior author of the paper, told Tech Xplore. "Our goal is to equip robots with similar capabilities, enabling them to sense the environment through both vision and touch for fine-grained robotic manipulation tasks."

As part of their study, the researchers set out to develop a multi-modal sensing system that gathers both visual data, which can be used to estimate the position and geometry of objects in the camera's field of view, and tactile information, such as contact location, force, and local interaction patterns.

The integrated multi-modal sensing and learning system they developed, called 3D-ViTac, could give robots new sensing capabilities, allowing them to better tackle real-world manipulation tasks.
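The core idea described above is to bring camera data and tactile readings into a single observation that a learning system can consume. The sketch below illustrates one plausible way to do this; it assumes a sensor pad that reports a 16×16 grid of normalized pressure values and a known pose for each pad, and the grid size, spacing, threshold, and function names are illustrative assumptions rather than details taken from the paper.

```python
# Hypothetical sketch: fusing a tactile pressure grid with a camera point cloud
# into one 3D observation, in the spirit of the approach described above.
# Sensor resolution, taxel pitch, and thresholds are illustrative assumptions.
import numpy as np

TAXEL_GRID = (16, 16)          # assumed taxel layout of one sensor pad
TAXEL_PITCH_M = 0.003          # assumed 3 mm spacing between taxels
CONTACT_THRESHOLD = 0.05       # assumed minimum normalized pressure to count as contact


def tactile_to_points(pressure: np.ndarray, pad_pose: np.ndarray) -> np.ndarray:
    """Project active taxels of one sensor pad into the world frame.

    pressure : (16, 16) array of normalized pressure readings in [0, 1]
    pad_pose : (4, 4) homogeneous transform from the pad frame to the world frame
    returns  : (N, 4) array of [x, y, z, pressure] for taxels in contact
    """
    rows, cols = np.nonzero(pressure > CONTACT_THRESHOLD)
    # Taxel centers on the pad surface, expressed in the pad's local frame.
    local = np.stack([
        cols * TAXEL_PITCH_M,
        rows * TAXEL_PITCH_M,
        np.zeros_like(rows, dtype=float),
        np.ones_like(rows, dtype=float),
    ], axis=1)                                  # (N, 4) homogeneous coordinates
    world = (pad_pose @ local.T).T[:, :3]       # (N, 3) points in the world frame
    return np.concatenate([world, pressure[rows, cols][:, None]], axis=1)


def fuse_observation(camera_points, tactile_frames):
    """Stack camera points and tactile points into one labeled point set.

    camera_points  : (M, 3) point cloud from the depth camera
    tactile_frames : list of (pressure_grid, pad_pose) pairs, one per sensor pad
    """
    cam = np.concatenate([camera_points, np.zeros((len(camera_points), 1))], axis=1)
    tac = [tactile_to_points(p, T) for p, T in tactile_frames]
    return np.concatenate([cam] + tac, axis=0)  # last column: 0 = visual, >0 = pressure
```

In a representation along these lines, the last column lets a downstream policy distinguish purely visual points from points that carry contact and pressure information.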

"Compared with existing state-of-the-art solutions, especially optical-based sensors, our sensor is as thin as a piece of paper, flexible, scalable and more robust for long-term use and large-scale data collection," explained Li.

"Coupled with visual observation, we developed an end-to-end imitation framework that enables robots to perform a variety of manipulation tasks, demonstrating significant improvements in safe interactions with fragile items and long-horizon tasks involving in-hand manipulation."

Li and his colleagues tested their sensor and the end-to-end imitation learning framework they developed in a series of experiments employing a real robotic system. Specifically, they integrated two of their sheet-like sensing devices onto each of a robotic gripper's fin-like hands.

The team then tested the gripper's performance on four challenging manipulation tasks: steaming an egg, placing grapes on a plate, grasping a hex key and serving a sandwich. The findings of these initial tests were very promising, as the sensor appeared to improve the gripper's ability to complete all four tasks successfully.

"We demonstrate that our proposed visuo-tactile imitation learning framework enables even low-cost robots to perform precise manipulation tasks," said Li. "It significantly outperforms vision-only approaches, particularly in handling fragile objects and achieving high precision in fine-grained manipulation."

The new sensor developed by this team of researchers could soon be deployed on other robotic systems and assessed on a broader range of object manipulation tasks that require high levels of precision. Meanwhile, Li and his colleagues plan to develop simulation methods and integration strategies that could make their sensor easier to apply and test on other robots.

"In our next studies, we aim to develop simulation techniques for tactile signals, explore ways to integrate the sensor into dexterous robotic hands and larger-scale surfaces (e.g., robot skin) and democratize tactile sensing in robotics," added Li.

"This will facilitate large-scale data collection and contribute toward multimodal robotic foundation models that better understand physical interactions through touch."

More information: Binghao Huang et al, 3D-ViTac: Learning Fine-Grained Manipulation with Visuo-Tactile Sensing, arXiv (2024). DOI: 10.48550/arXiv.2410.24091

Journal information: arXiv

© 2024 Science X Network

Citation: Integrated multi-modal sensing and learning system could give robots new capabilities (2024, November 26) retrieved 26 November 2024 from https://techxplore.com/news/2024-11-multi-modal-robots-capabilities.html

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without the written permission. The content is provided for information purposes only.
