Sensors / Sensing Systems Archives - The Robot Report
https://www.therobotreport.com/category/technologies/sensors-sensing/

binder introduces M16 connectors with compact design, high sealing performance
https://www.therobotreport.com/binder-introduces-m16-connectors-with-compact-design-high-sealing-performance/ | Wed, 04 Dec 2024

Binder USA has released redesigned M16 connectors designed for reliability and performance in harsh conditions.

The new M16 connectors have been redesigned to be modular and easier to handle. Source: binder

For demanding environments, Binder USA LP has introduced a new generation of molded M16 connectors, which it said are engineered to deliver reliability and performance even in the harshest conditions. The M16 circular connectors are designed for applications ranging from heavy-duty machinery like construction cranes and excavators to precision-driven laboratory equipment.

These connectors must meet diverse requirements, ensuring stable and reliable connections in extreme conditions, such as freezing temperatures and exposure to dirt and dust. To address these challenges, they must combine high electrical performance with durability and resilience, noted Camarillo, Calif.-based binder.

binder redesigns connectors to be modular

binder said it has completely redesigned its latest generation of molded M16 connectors. The previous version included many existing parts from field-wireable connectors, not all of which were ideal for the molded version, the company explained.

With an expanding portfolio and increasing demand, the company said it decided to fundamentally redesign the product to use a modular system, enabling many common parts between the unshielded and shielded variants.

“A key feature of the new connector design is the reduction in components,” said Sebastian Ader, product manager at binder. “Thanks to the modular system, we only need one additional part for the shielded and unshielded variants. This allows us to produce much more efficiently, offering cost advantages to customers without compromising on quality.”

Developing the new M16 connector was particularly challenging, said binder, because it had to comply with both the M16 standard (DIN EN 61076-2-106) and the stringent AISG standard (for the eight-pin shielded variant) in terms of IP68 sealing and compatibility between different manufacturers.

By optimizing the sealing system, the new M16 system resolves compatibility problems that have previously led to insufficient sealing, the company said. It added that the new generation of connectors is lead-free, meeting the EU RoHS2 Directive 2011/65/EU, including 2015/863/EU.

M16 suitable for industrial, field applications

When redesigning the M16 molded connectors, binder said it paid particular attention to applications in industrial machinery, camera systems, and pressure sensors. These areas require maximum electrical reliability, and therefore a robust connector system that functions under difficult operating conditions, it noted.

“Crane and excavator applications are a good example. Here, fixed-plug connections are required,” said Ader. “Particularly in critical moments, such as when lifting heavy loads, it is important that the connectors not only fit securely, but are also quick and easy to use.”

A triangular design is intended to make the new M16 connectors easy to handle, even in sub-zero temperatures or when wearing gloves.

“The new triangular design not only makes handling easier, but it also minimizes dirt-prone areas and undercuts, which enables use even in very harsh and demanding environments,” Ader said. “The new connectors can be reliably mated, unmated and locked at any time.”

The molded M16 connectors also meet requirements for shock resistance, vibration tolerance, and tightness, said binder. “In summary, the robust design ensures a reliable connection in extreme temperatures, dirt, and moisture, minimizes the risk of failure, and ensures the continuous operational readiness of the machines,” it asserted.

“With the molded M16 connector, we have succeeded in meeting market demands in terms of technical properties, handling, and price,” Ader said. “All this makes our solution a future-proof choice for demanding industrial applications.”

About binder

Binder USA LP is a subsidiary of binder Group, a leading global manufacturer of circular connectors, custom cord sets, and LED lights. The company’s products are used worldwide in industrial environments for factory automation, process control, and medical technology applications.

Binder said its technical innovations meet the highest standards of quality and reliability. The company’s quality management system is ISO 9001 and 14001-certified, but binder said its solution-focused approach to customer applications and commitment to service differentiate it from the competition.

Project CETI uses AI and robotics to track down sperm whales
https://www.therobotreport.com/project-ceti-uses-ai-and-robotics-to-track-down-sperm-whales/ | Tue, 03 Dec 2024

Project CETI researchers developed the AVATARS framework to make the most out of the small amount of time sperm whales spend on the surface.

Sperm whales spend, on average, 10 minutes of every hour on the surface, presenting challenges for researchers studying them. | Source: Amanda Cotton/Project CETI

In the chilly waters off the New England coast, researchers from the Cetacean Translation Initiative, Project CETI, can spend hours searching and waiting for an elusive sperm whale to surface. During the minutes the whales spend above water, the researchers need to gather as much information as possible before the animals dive back beneath the surface for long periods.

With one of the widest global distributions of any marine mammal species, these whales are difficult to track down, and even more difficult to learn from. Project CETI aims to use robotics and artificial intelligence to decode the vocalizations of sperm whales. It recently released research about how it tracks down sperm whales across the wide ocean.

“The ocean and the natural habitat of the whales is this vast place where we don’t have a lot of infrastructure, so it’s hard to build infrastructure that will always be able to observe the whales,” said Stephanie Gil, an assistant professor of Computer Science at the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) and an advisor on the project.

The project brings together some of the world’s leading scientists in biology, linguistics, robotics, and more. The founder of Project CETI, David Gruber, estimated that it’s one of the largest multi-disciplinary research projects active today.

“Project CETI was formed in March 2020, and we’re now over 50 scientists across eight different disciplines,” he said. “I think we’re over 15 institutions, which I believe puts us as one of the most interdisciplinary, large-scale science projects that’s ever been conducted. It’s incredibly rewarding to see so many disciplines working together.”

Project CETI shares latest research

The researchers at the nonprofit organization have developed a reinforcement learning framework that uses autonomous drones to find sperm whales and predict where they will surface. The paper, published in Science Robotics, said it’s possible to predict when and where a whale may surface using various sensor data and predictive models of sperm whale dive behavior.
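As a rough illustration of the kind of predictive dive-behavior model the paper describes, the Python sketch below alternates a simulated whale between long submerged intervals and short surface intervals, matching the roughly 10-minutes-per-hour surfacing pattern the researchers cite. The Gaussian durations and their parameters are illustrative assumptions of ours, not the fitted models from the study.

```python
# A toy two-state dive-cycle model: a whale alternates between "submerged"
# and "surfaced" states with randomized durations around the ~50 min down /
# ~10 min up pattern described above. Parameters are illustrative guesses.
import random

def simulate_surfacings(hours=3.0, seed=1):
    """Return a list of (surface_start_min, surface_end_min) windows."""
    rng = random.Random(seed)
    t, end, windows = 0.0, hours * 60, []
    while t < end:
        t += rng.gauss(50, 10)            # submerged duration, minutes
        up = max(1.0, rng.gauss(10, 3))   # surfaced duration, minutes
        windows.append((round(t, 1), round(t + up, 1)))
        t += up
    return windows

# A planner could score candidate drone routes against windows sampled
# this way, rewarding routes that reach a window before it closes.
print(simulate_surfacings())
```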

This new study involved various sensing devices, such as Project CETI aerial drones with very high frequency (VHF) signal sensing capability that use signal phase along with the drone’s motion to emulate an “antenna array in the air” for estimating the direction of pings from CETI’s on-whale tags.

“There are two basic advantages of [VHF signals]. One is that they are really low power, so they can operate for a really, really long time in the field, like months or even years. So, once those small beacons are deployed on the tag, you don’t have to really replace the batteries,” said Ninad Jadhav, a co-author on the paper and a robotics and engineering Ph.D. student at Harvard University.

“The second thing is these signals that these tags transmit, the VHF, are very high-frequency signals,” he added. “They can be detected at really long ranges.”

“That’s a really huge advantage because we never know when the whales will surface or where they will surface, but if they have been tagged before, then you can sense, for example, simple information such as the direction of the signal,” Jadhav told The Robot Report. “You can deploy an algorithm on the robot to detect that, and that gives us an advantage of finding where the whales are on the surface.”
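The direction finding that Jadhav describes, in which signal phase plus the drone's own motion emulate an "antenna array in the air," can be illustrated with a toy delay-and-sum beamformer: each point along the flight path acts as one element of a virtual array, and the candidate bearing whose plane-wave phase pattern best matches the measured phases wins. The 150 MHz tone, the straight 10 m flight line, and all names below are simplified assumptions of ours, not Project CETI's implementation; note that a straight-line aperture also leaves a mirror ambiguity about the flight path.

```python
# Toy synthetic-aperture angle-of-arrival estimation from carrier phase.
import numpy as np

C = 3e8            # speed of light, m/s
FREQ = 150e6       # assumed VHF tag frequency, Hz
LAM = C / FREQ     # wavelength, ~2 m

def beamform_aoa(positions, phases, n_candidates=360):
    """positions: (N, 2) drone positions (m) where phase was sampled;
    phases: (N,) measured carrier phases (rad). Returns the candidate
    bearing whose far-field plane-wave model best matches the data."""
    measured = np.exp(1j * phases)
    best_angle, best_power = 0.0, -np.inf
    for theta in np.linspace(0, 2 * np.pi, n_candidates, endpoint=False):
        u = np.array([np.cos(theta), np.sin(theta)])   # unit vector to source
        expected = np.exp(1j * 2 * np.pi * (positions @ u) / LAM)
        power = np.abs(np.sum(measured * np.conj(expected))) ** 2
        if power > best_power:
            best_angle, best_power = theta, power
    return best_angle

# Toy usage: a 10 m flight line, source truly at a 60-degree bearing.
true_theta = np.deg2rad(60)
pos = np.column_stack([np.linspace(0, 10, 50), np.zeros(50)])
u_true = np.array([np.cos(true_theta), np.sin(true_theta)])
phases = 2 * np.pi * (pos @ u_true) / LAM + 0.1 * np.random.randn(50)
print(np.rad2deg(beamform_aoa(pos, phases)))  # ~60 (or its mirror image)
```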

Sperm whales present unique challenges for data collection

From left to right: Stephanie Gil, Sushmita Bhattacharya, and Ninad Jadhav. | Source: Stu Rosner

“Sperm whales are only on the surface for about 10 minutes every hour,” said Gil. “Other than that, they’re diving pretty deep in the ocean, so it’s hard to access information about what the whales are actually doing. That makes them somewhat elusive for us and for science.”

“Even we humans have certain patterns day to day. But if you’re actually out observing whales on a particular day, their behavior is not going to exactly align with the models, no matter how much data you’re using to make those models right. So it’s very difficult to really predict with precision when they might be coming up,” she continued.

“You can imagine, if [the scientists are] out on the water for days and days, only having a few encounters with the whales, we’re not being that efficient. So this is to increase our efficiency,” Gruber told The Robot Report.

Once the Project CETI researchers can track down the whales, they must gather as much information as possible during the short windows of time sperm whales spend on the surface.

“Underwater data collection is quite challenging,” said Sushmita Bhattacharya, a co-author on the paper and a computer science and robotics Ph.D. student at Harvard University. “So, what is easier than underwater data collection is to have data collected when they’re at the surface. We can leverage drones or shallow hydrophones and collect as much data as possible.”


Developing the AVATARS framework

At the center of the research is the Autonomous Vehicles for Whale Tracking And Rendezvous by Remote Sensing, or AVATARS framework. AVATARS is the first co-development of VHF sensing and reinforcement learning decision-making for maximizing the rendezvous of robots and whales at sea.

“We tried to build up a model which would kind of mimic [sperm whale] behavior,” Bhattacharya said of AVATARS. “We do this based on the current information that we gather from the sparse data set.”

Being able to predict when and where the whales will surface allowed the researchers to design algorithms for the most efficient route for a drone to rendezvous with—or encounter—a whale at the surface. Designing these algorithms was challenging on many levels, the researchers said.

“Probably the hardest thing is the fact that it is such an uncertain problem. We don’t have certainty at all in [the whales’] positions when they’re underwater, because you can’t track them with GPS when they’re underwater,” Gil said. “You have to think of other ways of trying to track them, for example, by using their acoustic signals and an angle of arrival to their acoustic signals that give you a rough idea of where they are.”

“Ultimately, these algorithms are routing algorithms. So you’re trying to route a team of robots to be at a particular location in the environment, in the world, at a certain given time when it’s necessary to be there,” she told The Robot Report. “So this is analogous to something like rideshare.”
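To make the rideshare analogy concrete, here is a deliberately simple, single-drone sketch: given candidate surfacing predictions (position, time, and belief), commit to the candidate with the highest expected payoff, counting a candidate as reachable only if the drone can arrive within the surfacing window. The candidate structure, the 10-minute window, and the greedy one-shot policy are illustrative assumptions of ours; AVATARS itself learns a routing policy with reinforcement learning over uncertain whale positions.

```python
# Greedy expected-value rendezvous selection for one drone (illustrative).
from dataclasses import dataclass
import math

@dataclass
class SurfacingCandidate:
    x: float       # predicted surfacing position, m
    y: float
    t: float       # predicted surfacing time, s from now
    prob: float    # belief that the whale surfaces there and then

def expected_value(drone_xy, speed, c):
    """Payoff of committing to a candidate: its probability if the drone
    can arrive before the surfacing window closes, else zero."""
    travel = math.hypot(c.x - drone_xy[0], c.y - drone_xy[1]) / speed
    window = 600.0     # whales stay up roughly 10 minutes per hour
    return c.prob if travel <= c.t + window else 0.0

def choose_rendezvous(drone_xy, speed, candidates):
    return max(candidates, key=lambda c: expected_value(drone_xy, speed, c))

# Toy usage: a far but likely candidate beats a near but unlikely one.
cands = [SurfacingCandidate(4000, 0, 120, 0.7),
         SurfacingCandidate(500, 300, 60, 0.4)]
print(choose_rendezvous((0.0, 0.0), 15.0, cands))
```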

Before bringing the algorithms into the real world with real whales, the team tested them in a controlled environment with devices the team put together to mimic whales.

“We mimicked the whale using an engineered whale,” recalled Bhattacharya. “So basically we used a speed boat, and it had a loud engine. We used that engine noise to mimic the whale vocalization, and we had it move to mimic whale motion. And then we used that as our ground test.”

Project CETI tests AVATARS in the real world

A customized off-the-shelf drone flying to deploy a whale tag developed by Project CETI researchers. | Source: Project CETI

“Every day was a challenge when we were out on the boat, because this was for me, and my co-author Sushmita, the first time we were deploying real autonomous robots from a boat in the middle of the sea trying to collect some information,” Jadhav said.

“One of the major challenges of working in this environment was the noise in the sensor,” he continued. “As opposed to running experiments in the lab environment, which is more controlled, there are fewer sources of noise that impact your experiments or your sensor data.”

“The other key challenge was deploying the drone itself from the boat,” noted Jadhav. “I remember one instance where this was probably the first or second day of the second expedition that we went on last November, and I had the drone ready. It had the payload. It was waterproof.”

“I had already run experiments here in Boston locally, where I had an estimate of how long the drone would fly with the payload. And then we were out on the boat running some initial tests, and the drone took off,” he said. “It was fine, it was doing its thing, and within a minute of it collecting data, there was a sudden gust of wind. The drone just lost control and crashed in the water.”

The team also had to try to predict and react to whale behavior when performing field tests.

“Our algorithm was designed to handle sensor data from a single whale, but what we ended up seeing is that there were four whales together, who were socializing,” Jadhav said. “They were diving and then surfacing at the same time. So, this was tricky, because then it becomes really hard for us on the algorithm side to understand which whale is sending which acoustic signal and which one we are tracking.”

Team tries to gather data without disturbing wildlife

While Project CETI works closely with sperm whales and other sea life that might be around when the whales surface, it aims to leave the whales undisturbed during data collection.

“The main concern that we care about is that even if we fail, we should not harm the whales,” Bhattacharya said. “So we have to be very careful about respecting the boundaries of those animals. That’s why we are looking at a rendezvous radius. Our goal is to go near the whale and not land on it.”

“Being minimally invasive and invisible is a key part of Project CETI,” said Gruber. “[We’re interested in] how to collect this information without interacting directly with the whale.”

This is why the team works mostly with drones that won’t disturb sea life and with specially developed tags that latch onto the whales and collect data. The CETI team eventually collects these tags, and the valuable data they contain, after they fall off the whales.

“A lot of times, people might think of robotics and autonomy as a scary thing, but this is a really important project to showcase that robots can be used to extend the reach of humans and help us understand our world better,” Gil told The Robot Report.

Project CETI aims to decode whale communications

This latest research is just one step in Project CETI’s overarching goal to decode sperm whale vocalizations. In the short term, the organization plans to ramp up data collection, which will be crucial for the project’s long-term goals.

“Once we have all the algorithms worked out, a future outlook is one where we might have, for example, drone ports in the sea that can deploy robots with sensors around the clock to observe whales when they’re available for observation,” Gil said.

“We envision a team of drones that will essentially meet or visit the whales at the right place, at the right time,” Jadhav said. “So whenever the whales surface, you essentially have a kind of autonomous drone, or autonomous robot, very close to the whale to collect information such as visual information or even acoustic if the drone is equipped with that.”

Outside of Project CETI, organizations could use AVATARS to further protect sperm whales in their natural environments. For example, this information could be used to reroute ships away from sperm whale hot spots, reducing the odds of a ship colliding with a pod of sperm whales.

“The idea is that if we understand more about the whales, more about the whale communities, more about their social structures, then this will also enable and motivate conservation projects and understanding of marine life and how it needs to be protected,” Gil said.

In addition, the researchers said they could apply these methods to other sea mammals that vocalize.

“Here at Project CETI, we’re concerned about sperm whales, but I think this can be generalized to other marine mammals, because a lot of marine mammals vocalize, including humpback whales, other types of whales, and dolphins,” Bhattacharya said.

Learn about digitalization in the warehouse in new webinar
https://www.therobotreport.com/learn-about-digitalization-in-the-warehouse-in-webinar/ | Wed, 27 Nov 2024

Digitalization of the warehouse involves several emerging technologies; attendees of this free webinar can learn from industry experts.

Digitalization is bringing emerging technologies into the warehouse. Source: Dexory

Designing and deploying a digital warehouse can be a challenge, with numerous technology options to add to your operations. From robotics and automation to the latest data analytics and artificial intelligence, how can you take advantage of digitalization?

At 2:00 p.m. EST on Wednesday, Dec. 4, expert panelists will discuss how emerging technologies are changing how engineers design warehouse systems and how businesses can gain insights and efficiencies with them. Sensors, digital twins, wearables, and virtual assistants are some of the tools that are part of this digital transformation.

In this free webinar, viewers can learn about:

  • Ways to improve labor productivity with workforce management
  • The orchestration of people and autonomous mobile robots (AMRs) for order picking and fulfillment
  • Where augmented and virtual reality (AR/VR) fit in the warehouse
  • How AI will change how operators use data in a positive feedback cycle
  • How to scale digital transformation across facilities and the supply chain

Register now to attend this webinar on digitalization, and have your questions answered live. Registrants will be able to view it on demand after the broadcast date.

Digitalization speakers to share insights

Robert C. Kennedy is principal at RC Kennedy Consulting. For over four decades, he has planned, developed, and implemented industry-leading supply chain execution systems around the globe. Kennedy and his staff have led more than 200 large-scale implementation projects of supply chain execution software for leading customers in a variety of industries, including pharmaceutical, electronics, third-party logistics (3PL), and food and beverage.

A leading voice in the industry, Kennedy is regularly interviewed by trade media, has published articles, and has presented at numerous trade shows and seminars.

RC Kennedy Consulting provides assistance to companies to improve operational efficiencies through process design and systems. It also helps them develop strategies for growth.

Ken Ramoutar is chief marketing officer at Lucas Systems, which helps companies transform their distribution center by dramatically increasing worker productivity, operational agility, and customer and worker satisfaction using voice and AI optimization technologies.

In his 25 years of customer centric roles in supply chain software and consulting, Ramoutar has navigated companies through uncertainty and volatility as a thought leader and change agent.

Prior to Lucas, Ken was senior vice president and global head of customer experience at Avanade, a $3 billion Accenture and Microsoft-owned company, and he has held leadership roles at IBM, Sterling Commerce, and SAP/Ariba.

Michael Taylor is the chief product officer and co-founder of Duality AI. He has a 20-year career in mobile robotics, with 15 years dedicated to building autonomous field robots at Caterpillar.

While there, Mike led the team developing the autonomy system for Caterpillar’s autonomous dozer, and he helped launch the Autonomous Mining Truck program. His roles included architecting behaviors and planning systems, as well as building a collection of simulation technologies to accelerate deployment to customer sites.

Taylor was also part of the Carnegie Mellon team that won DARPA’s Urban Challenge, where he led both the Controls Team and the Field Calibration Team. Taylor holds dozens of patents in fields ranging from robotics to simulation technologies.

At Duality AI, Taylor leads the company’s Product and Solutions Engineering team. He is responsible for steering Duality’s product strategy, developing technologies to address customer needs, and helping ensure that customers maximize the value they extract from Falcon. This includes projects ranging from a simulation solution to support a drone-based AI perception system, to generating synthetic data for high-volume manufacturing quality assurance, to characterizing and modeling of uncrewed ground vehicles (UGVs) navigating novel environments. 

Eugene Demaitre, moderator, is the editorial director for robotics at WTWH Media, which produces Automated Warehouse, The Robot Report, the Robotics Summit & Expo, and RoboBusiness. Prior to working for WTWH Media, he was an editor at BNA (now part of Bloomberg), Computerworld, TechTarget, Robotics Business Review, and Robotics 24/7.

Demaitre has participated in conferences worldwide, as well as spoken on numerous webcasts and podcasts. He is always interested in learning more about robotics. He has a master’s from the George Washington University and lives in the Boston area.

This webinar is sponsored by Balluff and Dexory.

GE HealthCare unveils new applications for mobile C-arm portfolio
https://www.therobotreport.com/ge-healthcare-unveils-new-applications-mobile-c-arm-portfolio/ | Mon, 25 Nov 2024

GE HealthCare said complex pulmonary and thoracic procedures require precise intraoperative imaging systems.

The OEC 3D Imaging System. | Source: GE HealthCare

GE HealthCare Technologies Inc. last week announced that it has added new clinical applications to its OEC 3D mobile CBCT C-arm portfolio. The Chicago-based company said the additions will enable precise and efficient imaging during endoscopic bronchoscopy procedures in the practice of interventional pulmonology.

Complex pulmonary and thoracic procedures require precise intraoperative imaging systems, explained GE HealthCare. The position of a nodule can differ from pre-operative CT images, it noted. This happens as a result of differences in respiratory patterns, patient positioning, and other factors, resulting in CT-to-body divergence at the time of the procedure, said the company.

GE HealthCare claimed that its OEC 3D intraoperative mobile cone beam computed tomography (CBCT) C-arm offers “imaging excellence” and versatility. It said it can aid in everyday procedures ranging from neuro-spine and orthopedic trauma to interventional procedures such as bronchoscopy.

OEC 3D enables the visualization of both 2D and 3D images of the lung using a single mobile C-arm. The lung suite now includes an augmented fluoroscopy overlay of 3D points of interest and adjustable motorized 3D scans.

OEC interfaces continue to expand

During bronchoscopy procedures, clinicians can use navigation or robotic assistance with the OEC Open interface to automatically transfer 3D volumetric data after reconstruction.

GE HealthCare recently added a verified interface with the Intuitive Ion endoluminal robotic bronchoscopy system. The company said it continues to expand OEC open interfaces for a variety of clinical procedures as an agnostic ecosystem. It’s currently verified with eight third-party systems across robotics, navigation, and augmented reality (AR) vision.

“As we continue to build out our OEC ecosystem, GE HealthCare is excited about the addition of the Intuitive Ion robotic system to our OEC Open interface,” said Christian O’Connor, global general manager for surgery at GE HealthCare. “This interface provides interventional pulmonologists using the OEC 3D C-arm a seamless experience during minimally invasive, robotic-assisted bronchoscopy procedures.”

“With Intuitive’s Ion Robotic Bronchoscopy System now verified to interface with GE HealthCare’s OEC 3D through the OEC Open interface, I believe we can now reach and diagnose almost any nodule in the lung,” stated Dr. Dominique Pepper. She is medical director of bronchoscopy and respiratory care at Providence Swedish South Puget Sound and a consultant for GE HealthCare.

“This is a game-changer for clinicians – this can help us confidently and accurately provide answers when we see a suspicious area of interest,” Pepper said.


About GE HealthCare

GE HealthCare said it is a global medical technology, pharmaceutical diagnostics, and digital solutions innovator. The company said its integrated systems, services, and data analytics can make hospitals more efficient, clinicians more effective, therapies more precise, and patients healthier and happier. It said it is a $19.6 billion business with approximately 51,000 employees worldwide. 

First introduced in 2021, the OEC 3D mobile CBCT C-arm provides precise 3D and 2D imaging in a variety of procedures. During bronchoscopies, clinicians can use CBCT visualization features, such as Lung Preset, to help optimize viewing of airway structures and Augmented Fluoroscopy with Lung Suite to help confirm tool-in-lesion.

The OEC 3D enables a transition from 3D to 2D imaging through one versatile mobile CBCT imaging C-arm. GE said it includes an intuitive user interface and workflow to further optimize space in the bronchoscopy suite.

Editor’s note: This article was syndicated from The Robot Report sibling site MassDevice.

Imagry moves to make buses autonomous without mapping
https://www.therobotreport.com/imagry-moves-to-make-buses-autonomous-without-mapping/ | Mon, 25 Nov 2024

Imagry has developed hardware-agnostic systems to provide Level 4 autonomy to buses with time to market in mind.

Imagry says its software enables buses to autonomously handle complex situations such as roundabouts. Source: Imagry

Autonomous vehicles often rely heavily on prior information about their routes, but new technology promises to improve real-time situational awareness for vehicles including buses. Imagry said its “HD-mapless driving” software stack enables vehicles to react to dynamic contexts and situations more like human drivers.

The company also said its AI Vision 360 eliminates the need for external sensor infrastructure. It claimed that its bio-inspired neural network and hardware-agnostic systems allow for SAE Level 3/4 operations without spending time on mapping.

“We’ve been focusing on two sectors,” said Eran Ofir, CEO of Imagry. “We’ve been selling our perception and motion-planning stack to Tier 1 suppliers and automotive OEMs for autonomous vehicles. We signed a 10-year contract with Continental and are jointly developing a software-defined vehicle platform.”

“And we’ve started working with transportation operators on providing autonomous buses,” he told The Robot Report. “For example, in Turkey, France, Spain, and soon Japan, we’re retrofitting electric buses to be autonomous.”


Imagry trains in real time with supervision

Imagry was established in 2015 with a focus on computer vision for retail. In 2018, it began focusing entirely on autonomous driving. The company now has about 120 employees in San Jose, Calif., and Haifa, Israel.

Imagry said its technology is similar to that of Tesla in relying on 3D vision for perception and motion planning rather than rule-based coding or maps.

“Most players in the industry use HD maps with 5 cm [1.9 in.] resolution, telling the vehicle where lights, signs, and lane markers are,” said Ofir. “Our system teaches itself with supervised learning. It maps in real time while driving. Like a human driver, it gets the route but doesn’t know what it will find.”

How does Imagry deal with the need for massive data sets to train for navigation and obstacle detection and avoidance?

“We wrote a proprietary tool for annotation to train faster, better, and cheaper,” Ofir replied. “The data is collected but doesn’t live in the cloud. The human supervisor tells the vehicle where it was wrong, like a child. We deliver over-the-air updates to customers.”

“The world doesn’t belong to HD maps — it’s a matter of trusting AI-based software for perception and motion planning,” he said.

Ofir cited an example of a vehicle in Arizona on a random route with no communications to centralized computing. Its onboard sensors and compute recognized construction zones, skateboarders, a bike lane, and stop signs.

“The capability to drive out of the box in new places is unique to Imagry,” asserted Ofir. “We can handle righthand and lefthand driving, such as in Tokyo, where we’ve been driving for a year now.”

How does the bus know when to stop for passengers?

It could stop at every bus stop, upon request via a button at the stop (for the elderly, who may not use phone apps), or be summoned by an app that also handles payment, responded Ofir. Imagry’s system also supports “kneeling” for people with disabilities.

Why buses are a better focus for autonomy

Imagry has decided to focus on urban use cases rather than highways. Buses are simpler to get to Level 4 autonomy, said Ofir.

“Autonomous buses are better than ride hailing; they’re simpler than passenger vehicles,” said Ofir. “They drive in specific routes and at a speed of only 50 kph [31 mph] versus 80 kph [50 mph]. It’s a simpler use case, with economies of scale.”

“The time to revenue is much faster — the design cycle is four years, while integrating with a bus takes two to three months,” he explained. “Once we hand it over to the transport operator, we can get to L4 in 18 months, and then they can buy and deploy 40 more buses.”

In addition, the regulations for autonomous buses are clearer, with 22 countries running pilots, he noted.

“We already have projects with a large medical center and on a public road in Israel,” Ofir said. “We’re not doing small pods — most transport operators desire M3-class standard buses for 30 to 45 passengers because of the total cost of ownership, and they know how to operate them.”

In September and October, Imagry submitted bids for autonomous buses in Austria, Portugal, Germany, Sweden, and Japan.

Software focus could save money

By being vehicle-agnostic, Ofir said Imagry avoids being tied to specific, expensive hardware. Fifteen vendors are making systems on chips (SoCs) that are sufficient for Level 3 autonomy, he said.

“OEMs want the agility to use different sets of hardware in different vehicles. A $30,000 car is different from a $60,000 car, with different hardware stacks and bills of materials, such as camera or compute,” said Ofir. “It’s a crowded market, and the autonomy stack still costs $100,000 per vehicle. Ours is only $3,000 and runs on Ambarella, NVIDIA, TI, Qualcomm, and Intel.”

“With our first commercial proof of concept for Continental in Frankfurt, Germany, we calibrated our car and did some localization,” he added. “Three days after arrival, we simply took it out on the road, and it drove, knowing there’s no right on red.”

With shortages of drivers, particularly in Japan, operators could save $40,000 to $70,000 per bus per year, he said. The Japanese government wants 50 locations across the country to be served with autonomous buses by the end of 2025 and 100 by the end of 2027.

Autonomous buses are also reliable around the clock and don’t get sick or go on strike, he said.

“We’re working on fully autonomous parking, traffic jam assist, and Safe Driver Overwatch to help younger or older drivers obey traffic signs, which could be a game-changer in the insurance industry,” he added. “Our buses can handle roundabouts, narrow streets, and mixed traffic and are location-independent.”

Phases of autonomous bus deployment

Technology hurdles aside, getting autonomous buses recognized by the rules of the road requires patience, said Ofir.

“Together with Mobileye, which later moved to the robotaxi market, Imagry helped draft Israel’s regulatory framework for autonomous driving, which was completed in 2022,” recalled Ofir. “We’re working with lawmakers in France and Germany and will launch pilots in three markets in 2025.”

Testing even Level 3 autonomy can take years, depending on the region. He outlined the phases for autonomous bus rollout:

  1. Work with the electric bus for that market, then activate the system on a public road. “In the U.S., we’ve installed the full software and control stack in a vehicle and are testing FSD [full self-driving],” Ofir said.
  2. Pass NCAP (European New Car Assessment Programme) testing for merging and stops in 99 scenarios. “We’re the only company to date to pass those tests with an autonomous bus,” said Ofir. “Japan also has stringent safety standards.”
  3. Pass the cybersecurity framework, then allow passengers onboard buses with a safety driver present.
  4. Autonomously drive 100,000 km (62,137 mi.) on a designated route with one or more buses. After submitting a report to a department of motor vehicles or the equivalent, the bus operator could then remove the human driver.

“The silicon, sensors, and software don’t matter for time to revenue, and getting approvals from the U.S. National Highway Traffic Safety Administration [NHTSA] can take years,” Ofir said. “We expect passenger vehicles with our software on the road in Europe, the U.S., and Japan sometime in 2027.”

Imagry has joined Partners for Automated Vehicle Education (PAVE) and will be exhibiting at CES in January 2025.

ANELLO Photonics secures funding for inertial navigation in GPS-denied environments
https://www.therobotreport.com/anello-photonics-secures-funding-inertial-navigation-gps-denied-environments/ | Tue, 19 Nov 2024

ANELLO Photonics, which has developed compact navigation and positioning for autonomous systems, has closed its Series B round.

ANELLO offers an evaluation kit for its navigation and positioning system. Source: ANELLO Photonics

Self-driving vehicles, mobile robots, and drones need multiple sensors for safe and reliable operation, but the cost and bulk of those sensors have posed challenges for developers and manufacturers. ANELLO Photonics Inc. yesterday said it has closed its Series B funding round for its SiPhOG inertial navigation system, or INS.

“This investment not only validates our SiPhOG technology and products in the marketplace, but will [also] allow us to accelerate our manufacturing and product development as we continue to push the boundaries and leadership for navigation capabilities and performance to our customers who want solutions for GPS-denied environments,” stated Dr. Mario Paniccia, co-founder and CEO of ANELLO Photonics.

Founded in 2018, ANELLO has developed SiPhOG — Silicon Photonics Optical Gyroscope — based on integrated photonic system-on-chip (SoC) technology. The Santa Clara, Calif.-based company said it has more than 28 patents, with 44 pending. Its technologies also include a sensor-fusion engine using artificial intelligence.

“I spent 22 years at Intel and started this field of silicon photonics, which is the idea of building optical devices out of standard silicon processing, mostly focused on the data center,” recalled Paniccia. “Mike Horton, my co-founder, was a sensor gyro expert who started a company called Crossbow coming out of UC Berkeley.”

“Everyone doing autonomy was saying lidar and radar, but customers told Mike that if we could build an integrated photonic chip, they’d be very interested,” he told The Robot Report. “If you look at fiber gyros, they work great but are big, bulky, and expensive.”

“The stuff on our phones are MEMS [micro-electromechanical systems]-based today, which is not very accurate and is very sensitive to temperature, vibration, and EM interference,” Paniccia explained. “With the same concept as a fiber gyro — the idea of light going around a coil, and you measure the phase based on rotation — we integrated all those components on a single chip, added a little laser, and put electronics around it, and you now get SiPhOG, which fits in the palm of your hand.”


SiPhOG combines compactness and precision

SiPhOG brings high-precision into an integrated silicon photonics platform, claimed ANELLO. It is based on the interferometric fiber-optic gyroscope (FOG) but is designed for compactness, said Paniccia.

“It’s literally 2 by 5 mm,” he said. “On that chip, we have all the components — the splitters, the couplers, the phase modulators, and the delay lines. We measure about 50 nano-radians of signal, so a tiny, tiny signal, but we measure it very accurately.”
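For a sense of how small that signal is, the phase a Sagnac interferometer accumulates follows the standard fiber-gyro relation Δφ = 2πLDΩ/(λc), where L is the light-path length, D the loop diameter, Ω the rotation rate, and λ the wavelength. The sketch below plugs in illustrative numbers; the path length, loop diameter, and wavelength are guesses of ours, not ANELLO's actual chip geometry.

```python
# Back-of-the-envelope Sagnac phase shift for a waveguide or fiber gyro.
import math

C = 3e8          # speed of light, m/s
LAM = 1.55e-6    # assumed telecom-band wavelength, m

def sagnac_phase(path_length_m, loop_diameter_m, omega_rad_s):
    """delta_phi = 2*pi*L*D*Omega / (lambda*c)."""
    return (2 * math.pi * path_length_m * loop_diameter_m * omega_rad_s
            / (LAM * C))

# Example: a 1 m delay line coiled at 5 mm diameter sensing Earth's
# rotation rate (~7.3e-5 rad/s) yields single-digit nanoradians, which
# shows why resolving tens of nanoradians, as described above, matters.
print(f"{sagnac_phase(1.0, 5e-3, 7.3e-5):.2e} rad")  # ~4.9e-09 rad
```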

The system also has a non-ASIC, two-sided electronics board with an analog lock-in amplifier, a temperature controller, and an isolator, Paniccia said. It has none of the drawbacks of MEMS and uses 3.3 volts, he added.

Paniccia said the SiPhOG unit includes an optical gyro, triple-redundant MEMS, accelerometers, and magnetometers. It also has two GPS chips and dual antennas and is sealed to be waterproof.

The ANELLO IMU+ is designed for harsh environments including construction, robotics, mining, trucking, and defense. Source: ANELLO

Navigation system ready for multiple markets

Autonomous systems can work with ANELLO’s technology and the Global Navigation Satellite System (GNSS) for navigation, positioning, and motion tracking for a range of applications, said the company.

“We’re shipping to customers now in orchards, where the leaves come in, and the water in them essentially acts like a tunnel, absorbing GPS,” Paniccia said. “Our algorithm says, ‘I’m losing GPS, so weigh the navigation algorithm more to the optical gyro.’ You want the robot to stay within a tenth of a meter across a distance of half a mile. Long-distance, we’re looking at 100 km of driving without GPS with less than 100-m lateral error.”
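The "weigh the navigation algorithm more to the optical gyro" idea can be sketched as a toy complementary filter on heading, where the GPS blend weight scales with fix quality and drops to zero when GPS is denied. This is only our illustration of the weighting concept; ANELLO's production sensor-fusion engine is AI-based and considerably more sophisticated.

```python
# Toy complementary filter: blend gyro dead reckoning with GPS heading,
# trusting GPS less as fix quality degrades (illustrative only).
def fuse_heading(prev_heading, gyro_rate, dt, gps_heading, gps_quality):
    """gps_quality in [0, 1]: 1 = solid fix, 0 = fully GPS-denied."""
    predicted = prev_heading + gyro_rate * dt   # integrate the gyro
    alpha = 0.2 * gps_quality                   # GPS blend weight
    return (1 - alpha) * predicted + alpha * gps_heading

heading = 0.0
for _ in range(3):
    # Under jamming, gps_quality -> 0 and the gyro term dominates entirely.
    heading = fuse_heading(heading, gyro_rate=0.01, dt=0.1,
                           gps_heading=0.05, gps_quality=0.0)
print(heading)  # pure gyro integration: 3 * 0.01 * 0.1 = 0.003 rad
```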

In addition, SiPhOG is built for scalability and cost-effectiveness.

“VC friends tell me that automakers are putting six lidar systems on a car at $10,000 each. It’s never going to get to mass market,” Paniccia said. “We have an optical technology for land, air, and sea. And whether that land vehicle is for agriculture or construction, or in the longer term, trucking or autonomous cars, we can do it.”

“You can literally tape SiPhOG to a dashboard and plug it into the cigarette lighter,” he said. “We have self-alignment correction, and within 15 minutes, you can have GPS-denied navigation capability. We’re also shipping this system for indoor robots like in construction.”

“If I put three SiPhOGs in a cube, I can have the same performance but at one-fifth the size and weight and a quarter of the power for precision in three dimensions,” said Paniccia. “That’s exciting for drones and maritime.”

Investors to accelerate ANELLO 

Lockheed Martin, Catapult Ventures, and One Madison Group co-led ANELLO’s unspecified Series B round. New Legacy, Build Collective, Trousdale Ventures, In-Q-Tel (IQT), K2 Access Fund, Purdue Strategic Ventures, Santuri Ventures, Handshake Ventures, Irongate Capital, and Mana Ventures also participated. 

“We’re committed to fostering the art of the possible with investments in cutting edge technologies, including advancements in inertial navigation that have the potential to enhance autonomous operations in GPS-denied environments,” said Chris Moran, vice president and general manager of Lockheed Martin Ventures. “Our continued investment in ANELLO reflects our mission to accelerate technologies that can ultimately benefit national security.”

ANELLO said it plans to use its latest funding to continue developing and deploying its technology. The company has worked with the U.S. Department of Defense to optimize its algorithms against jamming or spoofing.

“Every week, there’s an article about a commercial flight or defense-related mission getting GPS jammed, like thousands of flights to and from Europe affected by suspected Russian jamming,” noted Tony Fadell, founder of Nest and a principal at investor Build Collective. “GPS has become a single point of failure because it’s too easily compromised with various jamming and spoofing techniques.”

“ANELLO’s proven and commercially available optical gyroscope is the only navigational tool that can take over, [offering] precision over long periods of time, the size of a golf ball, low-power, low-cost, that’s immune to shock and vibration,” he added. “ANELLO will save lives in the air, on the road, and over water.”

CAPTRON joins UR+ to offer sensors through Universal Robots marketplace
https://www.therobotreport.com/captron-partners-universal-robots-makes-products-available-ur-marketplace/ | Thu, 14 Nov 2024

CAPTRON is offering its TCP laser sensors for high-precision tool tip calibration on Universal Robots' marketplace.

The TCP Calibration Sensors, now available in the Universal Robots marketplace, are designed for high repeatability and accuracy. Source: CAPTRON

CAPTRON has announced its official UR+ partnership with Universal Robots A/S. The global sensor provider said its TCP Calibration Sensors, now available on the Universal Robots marketplace, deliver high precision and are integrated with UR collaborative robot arms.

“Our UR+ certified products are designed to deliver maximum precision and reliability to support our customers’ automation goals,” stated Sean Walters, general manager at CAPTRON North America LP. “This partnership with Universal Robots reinforces our commitment to delivering high-performance solutions that are easy to integrate, boosting productivity across the board.”

Designed for applications like dispensing and welding, these sensors can maximize accuracy, reduce scrap, and enhance production quality, said CAPTRON. The company claimed that its URCap software enables manufacturers to achieve plug-and-play integration, speeding up deployment and improving efficiency.


CAPTRON sensors offer ease of use, precision

The TCP Laser Calibration Sensors are available in 40 mm (1.6 in.) and 70 mm (2.8 in.) sizes. They offer precise tool tip calibration with a reproducibility of 0.01 mm (0.0004 in.), said the company, which has locations in the U.S., Germany, China, and Poland.
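The geometry behind tool-center-point calibration against a fixed reference, such as the crossing point of laser beams, reduces to a small least-squares problem: touching the same fixed point with several different flange orientations pins down the tool-tip offset. The sketch below shows this generic textbook method with hypothetical names; it is not CAPTRON's URCap implementation.

```python
# Generic multi-pose TCP calibration: solve R_i @ t + p_i = w for the tool
# offset t by eliminating the unknown fixed point w with pose differences.
import numpy as np

def calibrate_tcp(rotations, translations):
    """rotations: list of (3,3) flange orientations R_i; translations: list
    of (3,) flange positions p_i, each recorded with the tool tip at one
    fixed point. Pairwise differences give (R_i - R_0) @ t = p_0 - p_i."""
    A = np.vstack([R - rotations[0] for R in rotations[1:]])
    b = np.hstack([translations[0] - p for p in translations[1:]])
    t, *_ = np.linalg.lstsq(A, b, rcond=None)
    return t

# Toy usage: synthesize four poses around a known 120 mm Z offset (meters).
rng = np.random.default_rng(0)
true_t, w = np.array([0.0, 0.0, 0.12]), np.array([0.5, 0.1, 0.3])
Rs = [np.linalg.qr(rng.normal(size=(3, 3)))[0] for _ in range(4)]
ps = [w - R @ true_t for R in Rs]
print(calibrate_tcp(Rs, ps))   # ~[0, 0, 0.12]
```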

CAPTRON listed the following features and benefits for its URCap-compatible TCP Calibration Sensors:

  • Ease of use: CAPTRON said its URCap software simplifies sensor setup, drastically cutting integration time.
  • Reliability: The sensors promise precise and repeatable accuracy, minimizing tool deviations and errors.
  • Flexibility: With user-friendly software interfaces, manufacturers can switch between tasks quickly and easily, said CAPTRON, making its systems highly adaptable across various applications.

Customer says TCP Calibration Sensor a ‘game changer’

Popular tool center point applications include checking the welder gun wire tip after cleaning and calibrating the glue-dispensing tip before each process to ensure precision.

One of CAPTRON’s newest customers recently implemented the TCP Calibration Sensor and described it as a “game changer.” The customer said it has seen dramatic improvements in tool precision, significantly reducing errors and downtime.

CAPTRON said the use case highlights how its technology can enhance accuracy and efficiency in automated industrial processes.

“Universal Robots is thrilled to partner with CAPTRON to drive the next wave of innovation in collaborative robotics,” said Michael DeGrace, ecosystem success manager for the Americas at Universal Robots. “By combining CAPTRON’s cutting-edge sensor technology with our flexible, user-friendly robotic arms, we are opening up new possibilities for industries worldwide to enhance automation, improve precision, and achieve greater efficiency in their operations.”

CAPTRON’s TCP Calibration Sensors and URCap Software are now available through the Universal Robots Marketplace. For more information, visit CAPTRON UR Products.

Silicon Sensing to supply PinPoint gyros for Martian moons exploration
https://www.therobotreport.com/silicon-sensing-supply-pinpoint-gyros-martian-moons-exploration/ | Wed, 13 Nov 2024

The size of a fingernail, PinPoint is the smallest gyro in Silicon Sensing Systems' MEMS product range and has been tested for space.

Silicon Sensing’s miniature CRM200 PinPoint gyro. | Source: Silicon Sensing Systems

Robotic missions to Earth’s moon are challenging enough, but motion control on the moons of Mars requires precision technology. Silicon Sensing Systems Ltd. has been contracted by the German Aerospace Centre to supply two miniature PinPoint gyros for use in the Martian Moons eXploration mission. The mission aims to send rovers to survey Deimos and Phobos. 

The German Aerospace Centre (DLR) will use the company’s CRM200 gyros in the vehicle that will explore the larger of these moons, Phobos. There, this rover will collect surface samples. The set of PinPoint gyros will help detect unintended movement of the rover on the unknown surface.

Depending on the initial checkout of the drivetrain that includes the gyros, the team will activate an optional safety module in the software. This module will automatically prevent instability during drive sessions of the rover.
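A gyro-based drive-safety check of this kind can be sketched in a few lines: compare the body rates the gyros measure against what the drive command should produce, and halt on unexplained rotation (for instance, wheel slip on loose regolith). The thresholds and structure below are illustrative assumptions of ours, not DLR's flight software.

```python
# Toy drive-safety monitor: halt when measured body rates diverge from
# the rates the drive command explains (illustrative thresholds).
def unexpected_motion(measured, commanded, threshold=0.05):
    """Rates in rad/s about one axis."""
    return abs(measured - commanded) > threshold

def drive_step(gyro_rates, commanded_rates):
    """gyro_rates / commanded_rates: (pitch, yaw, roll) tuples in rad/s."""
    for measured, commanded in zip(gyro_rates, commanded_rates):
        if unexpected_motion(measured, commanded):
            return "HALT"       # stop and wait for ground assessment
    return "CONTINUE"

# Toy usage: yaw matches the commanded turn, but pitch is drifting -> halt.
print(drive_step(gyro_rates=(0.12, 0.20, 0.0),
                 commanded_rates=(0.0, 0.20, 0.0)))
```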

“PinPoint has a proven track record in space applications, but this will be a landmark use on a remarkable mission where this gyro’s reliability and endurance will be critical,” stated David Somerville, the general manager of Silicon Sensing Systems.

Founded in 1999, Silicon Sensing Systems engineers gyroscope and inertial systems. Jointly owned by Collins Aerospace and Sumitomo Precision Products, the company develops silicon, micro-electromechanical systems (MEMS)-based navigation and stabilization technology. 

Silicon Sensing said it has supplied millions of MEMS gyroscopes and accelerometers to thousands of customers.

Silicon Sensing designs compact, robust gyros

Just the size of a fingernail, at approximately 5 mm x 6 mm (0.2 in. x 0.24 in.), PinPoint is the smallest gyro in Silicon Sensing’s MEMS product range. The company said it is a proven, low-drift, single-axis, angular-rate sensor with many applications across diverse market sectors.

In combination, these robust sensors can precisely measure angular rate across multiple axes. This includes any combination of pitch, yaw, and roll – all while consuming very little power, according to Silicon Sensing.

As part of the rigorous selection process for this exploration program, PinPoint completed total ionizing dose (TID) testing at 17 krad, as well as proton testing at up to 68 MeV per proton. This testing demonstrated the gyro’s suitability for space requirements.

“We are also seeing increasing space-sector application for our latest tactical grade IMU [inertial measurement unit] — the DMU41 — which has recently been selected for a number of low-Earth orbit programs,” said Somerville. “This growing interest in our MEMS-based inertial sensors and systems reflects the potential of this technology, with its rugged reliability, compact size, and low power consumption, for the sector.”


Mission gets ready for Martian moons

JAXA, the Japanese space agency, is leading the Martian Moons eXploration (MMX) mission. It will explore the two moons of Mars with contributions from NASA, ESA, CNES, and DLR. CNES, the French national space agency, and the DLR are jointly contributing a 25-kg (55.1 lb.) rover. 

Approximately one year after leaving Earth, the spacecraft will arrive in Martian space and enter into an orbit around the planet. It will then move into a quasi-satellite orbit (QSO) around Phobos to collect scientific data, drop the rover, and gather a sample of the moon’s surface.

After observation and sample collection, the spacecraft will return to Earth carrying the material gathered from Phobos.

The current schedule has a launch date in 2026, followed by a Martian orbit insertion in 2027. The team said it hopes the probe will return to Earth in 2031.

The MMX rover vehicle will eventually gather samples from the surface of one of Mars’ moons. | Source: Silicon Sensing Systems

Arbe Robotics to bring in up to $49M in IPO for perception radar systems
https://www.therobotreport.com/arbe-robotics-to-bring-in-up-to-49m-in-ipo-for-perception-radar-systems/ | Fri, 08 Nov 2024

Arbe Robotics expects to bring in around $15 million in gross proceeds from the IPO, before deducting various offering expenses.

The Phoenix Perception Radar enriches algorithms for advanced capabilities including free space mapping, object tracking, and SLAM. | Source: Arbe Robotics

Arbe Robotics Ltd. this week closed its initial public offering of 8,250,000 ordinary shares or pre-funded warrants in lieu thereof. The developer of perception radar systems said it expects to bring in around $15 million in gross proceeds from the IPO.

The company said it plans to use the net proceeds from this offering for working capital and general corporate purposes. Arbe said it aims to empower automakers, Tier 1 suppliers, autonomous ground vehicles (AGVs), commercial and industrial vehicles, and a wide array of safety applications with advanced sensing.

Founded in 2015, Arbe Robotics is based in Tel Aviv, Israel, and has offices in China, Germany, and the U.S.


Arbe Robotics to move from ADAS to AVs

Arbe Robotics said it is starting with sensors for advanced driver-assist systems (ADAS), paving the way to full autonomous vehicles (AVs) later. The company claimed that its radar technology is 100 times more detailed than other radars on the market and that it is a critical sensor for SAE Level 2 and higher autonomy.

In September, Arbe said Tier 1 supplier Sensrad is providing 4D imaging radars using Arbe’s chipset to Tianyi Transportation Technology. It also announced that Tier 1 HiRain Technologies is using its chipset to develop an ADAS for another Chinese automaker.

HiRain's LRR610 4D Imaging Radar, Powered by Arbe’s Chipset

HiRain’s LRR610 4D Imaging radar uses Arbe’s chipset. Source: Arbe Robotics

More about the IPO

Each ordinary share (or pre-funded warrant in lieu thereof) was sold together with accompanying Tranche A and Tranche B warrants at a combined public offering price of $1.82. In aggregate, the Tranche A warrants cover the purchase of up to 8.2 million ordinary shares, as do the Tranche B warrants.

The Tranche A warrants had an exercise price of $2.35 per share, were immediately exercisable upon issuance, and expired on Nov. 4. The Tranche B warrants, also immediately exercisable upon issuance, have an exercise price of $1.82 per share.

Arbe Robotics said the Tranche B warrants will expire on Nov. 4, 2027, or 20 trading days after the company achieves all of the following, whichever comes first:

  • The company publicly announces that it has entered into a definitive supply agreement with a named European automotive original equipment manufacturer (OEM) that agrees to purchase a minimum of 500,000 radar chipsets over the term of the agreement (this announcement being the “Definitive Agreement Announcement”).
  • The volume-weighted average price, or VWAP (as defined in the Tranche B warrant), for each trading day in any period of 10 consecutive trading days within one calendar year of the date of the Definitive Agreement Announcement is equal to or exceeds $2.25, subject to certain adjustments.
  • The trading volume of the ordinary shares (as reported by Bloomberg L.P.) on each trading day of the measurement period is at least 250,000 ordinary shares, subject to certain adjustments.
  • The ordinary shares underlying the Tranche B warrants and any ordinary shares issuable upon the exercise of any pre-funded warrants issued upon the exercise of a Tranche B warrant (collectively, the “saleable shares”) are then covered by an effective registration statement and a current prospectus, which can be used for the sale or other disposition of the saleable shares. The company said it has no reason to believe that such registration statement and prospectus will not continue to be available for the saleable shares for the next 30 trading days (collectively, the “triggering event”).

This deal was led by institutional investors including AWM Investment Co., the investment adviser of the Special Situations Funds, which also participated in Arbe’s previous $23 million financing round. Canaccord Genuity acted as the sole bookrunner for the offering. Roth Capital Partners acted as the co-manager for the offering.

The aggregate gross proceeds to Arbe from this offering were approximately $15 million, before deducting the underwriters’ discounts and commissions and other offering expenses payable by the company. It said the potential additional gross proceeds from the Tranche A and Tranche B warrants, if fully exercised on a cash basis, would be about $34.4 million.
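
As a quick sanity check on that $34.4 million figure, assuming the rounded 8.2 million warrant counts above correspond to the offering's full 8,250,000 shares and that both tranches are exercised fully in cash:

```latex
\underbrace{8{,}250{,}000 \times \$2.35}_{\text{Tranche A}\,\approx\,\$19.39\text{M}}
\;+\;
\underbrace{8{,}250{,}000 \times \$1.82}_{\text{Tranche B}\,\approx\,\$15.02\text{M}}
\;\approx\; \$34.4\text{M}
```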

The securities described above were offered pursuant to a registration statement on Form F-3 (File No. 333-269235), originally filed with the Securities and Exchange Commission on Jan. 13, 2023, and declared effective by the SEC on Feb. 24, 2023. Arbe offered the securities through a prospectus and a prospectus supplement forming part of that registration statement.

Fulcrum provides inspection data pipeline for Cantilever analysis, explains Gecko Robotics
https://www.therobotreport.com/fulcrum-provides-inspection-data-pipeline-for-cantilever-analysis-explains-gecko-robotics/
Fri, 08 Nov 2024 14:38:46 +0000
Gecko Robotics has developed Fulcrum, which uses AI to provide high-quality infrastructure data to its Cantilever analytics software.

Screenshot of Gecko Robotics' Cantilever software analyzing data from a robotic tank inspection.

Fulcrum can ensure that Cantilever has high-quality infrastructure data to analyze. Source: Gecko Robotics

Robotic maintenance of large structures and critical infrastructure is only as useful as the data it yields. Gecko Robotics Inc. has announced its Fulcrum software for data acquisition and quality. Its first public use was this week.

The Pittsburgh-based company, best known for its robots that can climb and maintain tanks, has also developed drones and software. It said its Cantilever operating platform uses artificial intelligence and robotics (AIR) for data analysis and to support fast decision-making at scale.

Jake Loosararian, co-founder and CEO, and Jennifer Padgett, engineering manager at Gecko Robotics, explained to The Robot Report how Fulcrum and Cantilever can enable new levels of insights from robotic inspection.

Fulcrum enables data analytics from multiple sources

What is Fulcrum?

Loosararian: Jenn designed and built Fulcrum. Its design is centered around creating an API [application programming interface] for robots.

It’s all in support of our goal for Gecko — to protect critical infrastructure. This requires information about the built world.

Robots armed with different sensors turn the physical world of atoms into bits. The key is ensuring those bits drive useful outcomes.

The sensors on robots and drones can collect a lot of data — how do you determine what’s useful?

Loosararian: We collect many different types of information with our robots that climb walls or from fixed sensors. It’s not enough to just gather and post-process this data. We want to get as close to the process as possible.

Fulcrum is specifically built to gather data sets for high-fidelity foundation models. It’s designed not just to ensure quality data from all types of robots and sensors, but also to accelerate our ability to capture data layers for our Cantilever enterprise software.

For example, they can be used to predict when a tank might leak, a bridge might collapse, or a naval vessel might need to be modernized.

Padgett: We’re building a validation framework with our subject-matter expertise. We’ve collected millions of data points, while humans typically gather data points every square foot or two.

With Fulcrum, we understand the data as you’re collecting it and double-check it. We’ve optimized for inspections of concrete in missile silos, as well as tanks and boilers.
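
Gecko has not published Fulcrum's internals, but the in-stream checking Padgett describes can be pictured with a minimal sketch. Everything below (the names, bounds, and Reading type) is hypothetical, chosen only to show the idea of validating ultrasonic thickness readings while the robot is still crawling:

```python
from dataclasses import dataclass

# Hypothetical plausibility bounds for an ultrasonic thickness (UT) reading
# on a steel tank wall; real limits would come from asset metadata.
MIN_MM, MAX_MM = 2.0, 40.0
MAX_STEP_MM = 1.5  # max plausible change between adjacent readings

@dataclass
class Reading:
    x_mm: float        # position along the crawl path
    thickness_mm: float

def validate_stream(readings):
    """Flag readings as they arrive, so a bad probe couple or dropout
    is caught during the crawl instead of in post-processing."""
    flagged, prev = [], None
    for r in readings:
        ok = MIN_MM <= r.thickness_mm <= MAX_MM
        if ok and prev is not None:
            # A jump larger than MAX_STEP_MM between neighboring points
            # is more likely sensor noise than real wall loss.
            ok = abs(r.thickness_mm - prev.thickness_mm) <= MAX_STEP_MM
        if not ok:
            flagged.append(r)
        prev = r
    return flagged

bad = validate_stream([Reading(0, 12.1), Reading(25, 12.0), Reading(50, 3.9)])
print(f"{len(bad)} reading(s) flagged for re-scan")  # -> 1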

Gecko Robotics offers understanding of infrastructure health

How is Gecko Robotics pivoting from robotics hardware to AI?

Loosararian: We’ve traditionally developed hardware for data collection. Data quality is the starting point.

We’re helping people to understand what their livelihoods are based on by giving a full picture. Inspections affect everything from driving across a bridge to turning on the electricity.

We believe in democratizing data. We can’t build all the robots ourselves, and I recently talked onstage about the potential for humanoid robots like Tesla’s Optimus.

We’ve developed AI and built an ontology to connect things to monitor and maintain infrastructure health. Building and operating critical infrastructure is a matter of global competitiveness.

Padgett: With AI for pre-processing and low-level heuristics on key modules, Gecko can deliver useful data for customers. Fulcrum is really meant to provide higher-level analytics at the edge.

Jake, you mentioned the API and working with other robots. Who are you working with?

Loosararian: We’ve already made partnerships and are vetting a dozen companies for the kinds of tools that will be certified under the Gecko umbrella. We want to onboard as many robots as we can.

At the same time, we’re very skeptical of which robots are actually valuable. As we go to market with the platform, we understand which tools are good for marketing versus actually helping the business.

We’re not interested in research projects; we’re interested in companies that want specific, real-world impacts within 90 days. Right now, there’s a lot of skepticism around hardware and software, but with our robots and AI-powered software, the savings are real.

We’ve built up abstractions for how to interact with certain types of robots, drones, and marine systems. This makes it easy to add new systems; by working them into our communications protocol, we’re language-agnostic.
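
To make the idea of those abstractions concrete, here is a brief, hypothetical sketch of a per-robot adapter layer; none of the class or field names below come from Gecko's actual API, which has not been published:

```python
from abc import ABC, abstractmethod

class RobotAdapter(ABC):
    """Hypothetical adapter: each robot type maps its native telemetry
    into one common record format the pipeline understands."""

    @abstractmethod
    def to_common(self, native: dict) -> dict: ...

class CrawlerAdapter(RobotAdapter):
    def to_common(self, native: dict) -> dict:
        return {"kind": "ut_thickness",
                "pos": (native["x"], native["y"]),
                "value_mm": native["ut_mm"]}

class DroneAdapter(RobotAdapter):
    def to_common(self, native: dict) -> dict:
        return {"kind": "visual",
                "pos": tuple(native["gps"]),
                "value_mm": None,
                "image_uri": native["frame"]}

def ingest(adapter: RobotAdapter, packets: list[dict]) -> list[dict]:
    # Downstream analytics never see vendor-specific fields, only common records.
    return [adapter.to_common(p) for p in packets]
```

With an adapter per vendor, new hardware plugs into the same ingestion path without changes to the analytics layer, which is one common way to keep a pipeline "language-agnostic" at the protocol boundary.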

We’re also interested in new types of sensors and how they can affect outcomes.




Predictive maintenance key to value proposition

What industries can this help?

Loosararian: It’s not one industry; it’s everyone. Infrastructure is huge — from aircraft carriers to mining companies. We’ve got products and services that help them understand the state of their assets.

Right now, we’re focusing on built structures using next-generation IoT [Internet of Things] sensors. With fixed robots, mesh networks, and 5G, we’re imagining beyond that.

Cantilever is already providing data on 500,000 assets, and it’s already making changes in the way customers operate.

We’re constantly being pinged by companies that want us to integrate automated repairs and cleaning, which are important functions to maintaining safety and environmental sustainability.

We want to ensure that we can meet growing demand for things like shipyard maintenance with the growing scarcity of qualified people. Fulcrum has the ability to offer relevant information, changing the standard operating procedures from human-collected data.

So is the goal to apply IoT and AI to spot issues before they become problems?

Loosararian: We can know what the robot is doing, what it should be collecting, and get the analysis. With the life-extension AIR module, we can look at the data layers in concrete, carbon steel, and stainless steel to extend the useful life of critical infrastructure.

Fulcrum is also part of capex [capital expenditure] optimization. Users want to avoid replacing things, having downtime, or suffering from catastrophic failures. They need specific data rather than broad strokes so they don’t have to worry about overpaying to replace something that doesn’t yet need to be replaced.

Another opportunity is process optimization. For example, an oil company needs to understand how a higher sodium concentration in the Gulf of Mexico will impact its assets. That’s built into the Cantilever data pipeline from Fulcrum.

GelSight, Meta AI release Digit 360 tactile sensor for robotic fingers
https://www.therobotreport.com/gelsight-meta-ai-release-digit-360-tactile-sensor-for-robotic-fingers/
Sat, 02 Nov 2024 12:26:32 +0000
Digit 360 deepens GelSight and Meta AI’s existing partnership and fosters a community-driven approach to robotics research.

A robotic hand with glowing fingertips reaching out to touch a sheer fabric.

Digit 360 uses GelSight’s tactile sensing technology for high sensitivity and micron-level resolution. | Source: Meta AI

GelSight, a developer of tactile technology, and Meta AI announced Digit 360, a tactile sensor for robotic fingers. This signifies the next stage of the partnership between the companies, which was established in 2021 with the launch of the Digit tactile sensor.

Digit 360 is equipped with more than 18 sensing features. The companies said these will enable advancements in touch perception research and allow researchers to either combine its various sensing technologies or isolate individual signals for in-depth analysis of each modality.
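
Meta AI has not published a data format for Digit 360, so the combine-or-isolate workflow described above can only be sketched with hypothetical channel names:

```python
import numpy as np

# Hypothetical layout of one multimodal tactile sample; the real Digit 360
# data format is not described in the announcement.
sample = {
    "image":    np.zeros((240, 320, 3), dtype=np.uint8),  # fingertip imprint
    "audio":    np.zeros(4800, dtype=np.float32),         # contact vibration
    "pressure": np.float32(0.0),
    "imu":      np.zeros(6, dtype=np.float32),
}

def isolate(sample: dict, modality: str):
    """Study one signal on its own, e.g. isolate(sample, 'audio')."""
    return sample[modality]

def combine(sample: dict, modalities: tuple[str, ...]) -> np.ndarray:
    """Naive fusion: flatten and concatenate the chosen channels into
    one feature vector for a downstream model."""
    return np.concatenate([np.ravel(np.asarray(sample[m], dtype=np.float32))
                           for m in modalities])

features = combine(sample, ("audio", "imu"))
print(features.shape)  # (4806,)
```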

Digit 360’s tactile-specific optical lens can see imprints all around the artificial fingertip, capturing fine details wherever the fingertip contacts an object.

“GelSight and Meta AI share the same vision to make tactile sensing more ubiquitous and accessible,” said Youssef Benmokhtar, CEO of GelSight. “Digit 360 will advance the digitization of touch and unlock new applications in robotics with its ability to capture omnidirectional deformations on the fingertip surface.”

GelSight is a developer of imaging-based tactile intelligence. The company’s proprietary technology was invented at the Massachusetts Institute of Technology. It provides detailed and rapid surface characterization, enabling several surface measurement applications and robotic sensing capabilities. 

GelSight said its elastomeric 3D imaging systems are currently used in aerospace, automotive, forensics, and robotics research labs worldwide.




Digit 360 uses optics for a sense of touch

Over time, researchers can use Digit 360 to develop AI that can better understand and model the real world, including the physicality of objects, human-object interaction, and contact physics. The sensor can detect minute changes in spatial detail and capture forces as small as 1 millinewton.

GelSight’s elastomeric and imaging-based tactile sensing digitizes the sense of touch, enabling robotic engineers to develop solutions for the analysis of any surface regardless of material type or reflectivity, complex object manipulation, and many other dexterous tasks.

Beyond advancing robot dexterity, GelSight said Digit 360 has potential applications in medicine, prosthetics, virtual reality, telepresence, and more. For virtual worlds, Digit 360 can help better ground virtual interactions with the environment to more realistic representations of object properties beyond their visual appearances. Meta AI said it will open-source all code and designs developed using Digit 360.

Meta AI integrates sensors and AI

Meta AI also partnered with South Korea-based Wonik Robotics to develop the Allegro Hand. The company said this will be a fully integrated robotic hand with tactile sensors. 

Building on the Meta Digit Plexus platform, the next generation of Allegro Hand could help advance robotics research by making it easier for researchers to conduct experiments. Wonik Robotics will manufacture and distribute the Allegro Hand, which will be made available next year. 

“Wonik Robotics and Meta FAIR aim to introduce robotic hands to global companies, research institutes, and universities so they can continue developing robotic hand technology that is safe and helpful to humankind,” said Dr. Yonmook Park, executive director and the head of future technology headquarters at Wonik Robotics.

Silicon Sensing Systems tests IMU with Science Tokyo for space applications
https://www.therobotreport.com/silicon-sensing-systems-tests-imu-with-science-tokyo-for-space-applications/
Wed, 30 Oct 2024 12:47:00 +0000
Silicon Sensing Systems partnered with the Institute of Science Tokyo to put its new inertial measurement unit through rigorous testing.

DMU41 inertial measurement unit (IMU) un-housed.

The DMU41 inertial measurement unit (IMU) un-housed. Source: Silicon Sensing Systems

Silicon Sensing Systems Ltd. and the Institute of Science Tokyo, a leading Japanese science and technology university, yesterday said that they have jointly tested Silicon Sensing’s DMU41 inertial measurement unit, or IMU, for low-Earth-orbit applications.

“Working with the prestigious Science Tokyo on this new test program to demonstrate the capabilities of our DMU41 IMU has been a milestone development for us, as we celebrate our 25th year of business,” stated David Somerville, general manager of Silicon Sensing. 

Institute of Science Tokyo was established on Oct. 1, following the merger between Tokyo Medical and Dental University and Tokyo Institute of Technology. The organization’s stated mission is “Advancing science and human well-being to create value for and with society.”

DMU41 undergoes rigorous testing

The DMU41 is a tactical-grade IMU with nine degrees of freedom, said Silicon Sensing Systems. The company said the sensor is a robust micro-electro-mechanical systems (MEMS) product that operates in temperatures ranging from -40°C to +85°C (-40°F to +185°F), delivering low noise, low bias instability, and low angle random walk.
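
For readers unfamiliar with those noise terms, a standard relation from inertial navigation (textbook material, not a Silicon Sensing figure) shows why angle random walk (ARW) matters: gyro white noise integrates into an attitude error that grows with the square root of time. The ARW value below is purely illustrative; the article does not quote the DMU41’s specification.

```latex
\sigma_{\theta}(t) \approx \mathrm{ARW}\cdot\sqrt{t},
\qquad \text{e.g. } \mathrm{ARW}=0.1\,^{\circ}/\sqrt{\mathrm{h}}
\;\Rightarrow\; \sigma_{\theta}(4\,\mathrm{h}) \approx 0.2^{\circ}.
```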

The IMU’s performance “challenges that of typical fiber-optic gyro IMUs in a far more compact package,” Silicon Sensing claimed. It measures just 50 x 50 x 50 mm (1.9 x 1.9 x 1.9 in.), weighs less than 180 g (6.3 oz.), and consumes less than 2.5W, it said. 

The test program explored the use of the high-performance DMU41 for space platform guidance and attitude control in commercial low Earth orbit (LEO). This involved exposing the DMU41 to several rounds of radiation to simulate the naturally occurring radiation of LEO, including single-event effect (SEE) and total ionizing dose (TID) tests.

Silicon Sensing Systems sees its future in satellite market

“The global LEO satellite market, with a CAGR [compound annual growth rate] predicted at around 17% over the next four to five years, is an important future market for us — and an area where we are already experiencing growing demand,” said Somerville. “In this environment, performance, size, endurance, and power consumption are all critical factors where we believe our technology can make a real performance difference.”

Founded in 1999, Silicon Sensing Systems develops and engineers gyroscope and inertial systems. Collins Aerospace and Sumitomo Precision Products jointly own the Plymouth, U.K.-based company.

Silicon Sensing said it is a market leader in silicon, micro MEMS-based navigation and stabilization technology. The company said it has supplied millions of MEMS gyroscopes and accelerometers to thousands of customers since its formation. An example user is the autonomous Mayflower, which crossed the Atlantic Ocean in 2022.




Doctor describes first robotic high intensity ultrasound procedures with Focal One, Unfold AI
https://www.therobotreport.com/doctor-describes-first-robotic-hifu-procedures-focal-one-unfold-ai/
Tue, 29 Oct 2024 21:12:13 +0000
Avenda Health’s Unfold AI is a multimodal AI decision support platform cleared by the Food and Drug Administration.


Last month, EDAP TMS SA and Avenda Health said they will jointly offer personalized prostate cancer care using Avenda’s Unfold AI technology. Their stated goal is to launch the Focal One robot, which uses artificial intelligence, for high-intensity focused ultrasound, or HIFU, procedures.

Avenda Health’s Unfold AI is a multimodal AI decision-support platform cleared by the U.S. Food and Drug Administration (FDA). It can build 3D patient-specific cancer maps that reveal the extent of tumors that would otherwise be invisible, it said. This enables physicians to avoid leaving cancerous tissue behind while sparing healthy surrounding tissue. 

By combining Unfold AI’s planning with the Focal One robotic HIFU platform, EDAP and Avenda claimed that urologists can provide a more tailored, patient-specific HIFU ablation procedure for their prostate cancer patients.

Wayne G. Brisbane, M.D., an assistant professor of urology at the University of California Los Angeles David Geffen School of Medicine, worked with Avenda and Focal One to launch the system. Dr. Brisbane gave The Robot Report additional insight into how this technology could change urology procedures. 

How Focal One can change urology procedures

A headshot of Dr. Wayne Brisbane.

Wayne G. Brisbane, M.D., an assistant professor of urology at the UCLA David Geffen School of Medicine. | Source: UCLA Health

From the surgeon’s perspective, how does Focal One change how you approach treating patients with prostate cancer?
Brisbane: Focal One enables us to ablate a specific area in the prostate rather than treating the entire prostate. While this approach is newer and not as well-established, it has been widely sought after by patients to reduce side effects of treatment.

Unfold AI’s 3D cancer-mapping technology helps us identify where to ablate. Traditional imaging methods like MRI can underestimate the extent of the cancer.

Unfold AI combines data from MRI, PSA levels, biopsy locations, and Gleason scores to create a detailed 3D map, which guides us in performing more accurate tumor ablation.
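
Avenda’s algorithms are proprietary, but the general idea of turning a voxel-wise cancer-probability map into an ablation target can be sketched as follows; the grid size, threshold, and margin values are purely illustrative:

```python
import numpy as np
from scipy.ndimage import binary_dilation

# Hypothetical voxel-wise tumor-probability grid (1 mm voxels); a real map
# would be inferred from MRI, PSA levels, biopsy cores, and Gleason scores.
rng = np.random.default_rng(0)
prob = rng.random((64, 64, 48)).astype(np.float32)

TUMOR_THRESHOLD = 0.7   # illustrative operating point
MARGIN_VOXELS = 3       # ~3 mm safety margin around the predicted tumor

tumor = prob >= TUMOR_THRESHOLD
# Dilate the predicted tumor to get a treatment volume with margin: the
# region a focal ablation plan would need to cover to avoid leaving
# cancerous tissue behind while sparing the rest of the gland.
treatment = binary_dilation(tumor, iterations=MARGIN_VOXELS)

print(f"tumor voxels: {tumor.sum()}, treatment voxels: {treatment.sum()}")
```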

How do you think this technology will affect patient outcomes?
Brisbane: This technology will improve our ability to predict tumor extent and then deliver treatment guided by robotics. It takes the guesswork out of treating the tumor.

Unfold AI also allows us to select patients for focal therapy more confidently, particularly those who fall between active surveillance and more aggressive treatments.

What risks come with HIFU procedures, and how does Focal One minimize those?
Brisbane: HIFU treatments, while non-invasive and generally safe, do carry risks. These include changes in urinary and sexual function. However, the main concern about HIFU is cancer recurrence from incomplete treatment.

By fusing Unfold AI and Focal One, we hope to limit one of the major risks of HIFU: incomplete ablation.

How does Focal One change your workflow?
Brisbane: Focal One provides an intuitive workflow and easily integrates with Unfold AI.

EDAP and Unfold AI prepped for prostate procedure

How closely did you collaborate with Avenda Health before this first procedure?
Brisbane: Before the first procedure, I collaborated with Avenda Health to ensure a thorough understanding of the Focal One system and its capabilities. Our partnership involved discussions about treatment protocols, technology integration, and how to leverage Unfold AI’s 3D cancer maps effectively.

How have patients reacted to this new technology?
Brisbane: Patients have responded positively to Focal One. Many feel relieved that we can target the tumor more precisely, reducing the risk of harming healthy tissue. They appreciate the less-invasive nature of the procedure, especially since traditional treatments often come with concerns about side effects.

Patients also value the insights from Unfold AI’s 3D cancer maps. Some have reported fewer side effects compared to conventional treatments, which has improved their perception of the procedure. The technology has helped build greater confidence among patients and improved their overall treatment experience.

In your opinion, could this technology be applied to other kinds of procedures in the future?
Brisbane: I do expect AI maps to be implemented through additional robotic platforms.

RBR50 Spotlight: MOXIE completes historic oxygen-making mission on Mars
https://www.therobotreport.com/rbr50-spotlight-moxie-completes-historic-oxygen-making-mission-on-mars/
Thu, 24 Oct 2024 13:00:49 +0000
The Mars Oxygen In-Situ Resource Utilization Experiment, or MOXIE, generated oxygen aboard NASA’s Perseverance Rover.



Organization: NASA JPL
Country: U.S.
Website: jpl.nasa.gov
Year Founded: 1936
Number of Employees: 500+
Innovation Class: Technology

The Mars Oxygen In-Situ Resource Utilization Experiment (MOXIE) generated oxygen for the 16th and final time aboard NASA’s Perseverance Rover on Aug. 7, 2023. The microwave-sized device surpassed all expectations from its creators at the National Aeronautics and Space Administration Jet Propulsion Laboratory (NASA JPL) and the Massachusetts Institute of Technology (MIT).

After Perseverance landed on Mars on Feb. 18, 2021, MOXIE generated 122 grams of oxygen, which is about what a small dog breathes in 10 hours, NASA said. At its most efficient, MOXIE produced 12 grams of oxygen per hour at a purity of 98% or better. This is twice as much as NASA’s original goal for the instrument. On its final run, MOXIE made 9.8 grams of oxygen.

MOXIE produces molecular oxygen through an electrochemical process that separates one oxygen atom from each molecule of carbon dioxide pumped in from Mars’ thin atmosphere. As the gases flow through the system, they’re analyzed to check the purity and quantity of the oxygen produced.
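
For reference, NASA has described MOXIE as a solid oxide electrolyzer; the half-reactions below are the standard chemistry for that process, with oxygen ions recombining into molecular oxygen at the anode:

```latex
\text{cathode: } \mathrm{CO_2} + 2e^- \;\rightarrow\; \mathrm{CO} + \mathrm{O^{2-}}
\qquad
\text{anode: } 2\,\mathrm{O^{2-}} \;\rightarrow\; \mathrm{O_2} + 4e^-
```

The net reaction is 2 CO2 → 2 CO + O2: each pair of carbon dioxide molecules yields one molecule of oxygen, with carbon monoxide vented as the byproduct.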

Devices based on the technology at work in MOXIE will be key for future missions. It seems obvious that astronauts who could someday find themselves on Mars will need an oxygen source to breathe, but NASA has other uses in mind.

In particular, the technologies inside MOXIE could serve as a source of rocket propellant, which will be required in industrial quantities to launch rockets with astronauts for their return trip home.

During its time on Mars, MOXIE completed all of its technical requirements and lasted through the varying conditions of a full Mars year. It performed the first-ever demonstration of an oxygen-producing system on the red planet.

Explore the RBR50 Robotics Innovation Awards 2024.


RBR50 Robotics Innovation Awards 2024

Organization | Innovation
ABB Robotics | Modular industrial robot arms offer flexibility
Advanced Construction Robotics | IronBOT makes rebar installation faster, safer
Agility Robotics | Digit humanoid gets feet wet with logistics work
Amazon Robotics | Amazon strengthens portfolio with heavy-duty AGV
Ambi Robotics | AmbiSort uses real-world data to improve picking
Apptronik | Apollo humanoid features bespoke linear actuators
Boston Dynamics | Atlas shows off unique skills for humanoid
Brightpick | Autopicker applies mobile manipulation, AI to warehouses
Capra Robotics | Hircus AMR bridges gap between indoor, outdoor logistics
Dexterity | Dexterity stacks robotics and AI for truck loading
Disney | Disney brings beloved characters to life through robotics
Doosan | App-like Dart-Suite eases cobot programming
Electric Sheep | Vertical integration positions landscaping startup for success
Exotec | Skypod ASRS scales to serve automotive supplier
FANUC | FANUC ships one-millionth industrial robot
Figure | Startup builds working humanoid within one year
Fraunhofer Institute for Material Flow and Logistics | evoBot features unique mobile manipulator design
Gardarika Tres | Develops de-mining robot for Ukraine
Geek+ | Upgrades PopPick goods-to-person system
Glidance | Provides independence to visually impaired individuals
Harvard University | Exoskeleton improves walking for people with Parkinson’s disease
ifm efector | Obstacle Detection System simplifies mobile robot development
igus | ReBeL cobot gets low-cost, human-like hand
Instock | Instock turns fulfillment processes upside down with ASRS
Kodama Systems | Startup uses robotics to prevent wildfires
Kodiak Robotics | Autonomous pickup truck to enhance U.S. military operations
KUKA | Robotic arm leader doubles down on mobile robots for logistics
Locus Robotics | Mobile robot leader surpasses 2 billion picks
MassRobotics Accelerator | Equity-free accelerator positions startups for success
Mecademic | MCS500 SCARA robot accelerates micro-automation
MIT | Robotic ventricle advances understanding of heart disease
Mujin | TruckBot accelerates automated truck unloading
Mushiny | Intelligent 3D sorter ramps up throughput, flexibility
NASA | MOXIE completes historic oxygen-making mission on Mars
Neya Systems | Development of cybersecurity standards hardens AGVs
NVIDIA | Nova Carter gives mobile robots all-around sight
Olive Robotics | EdgeROS eases robotics development process
OpenAI | LLMs enable embedded AI to flourish
Opteran | Applies insect intelligence to mobile robot navigation
Renovate Robotics | Rufus robot automates installation of roof shingles
Robel | Automates railway repairs to overcome labor shortage
Robust AI | Carter AMR joins DHL's impressive robotics portfolio
Rockwell Automation | Adds OTTO Motors mobile robots to manufacturing lineup
Sereact | PickGPT harnesses power of generative AI for robotics
Simbe Robotics | Scales inventory robotics deal with BJ’s Wholesale Club
Slip Robotics | Simplifies trailer loading/unloading with heavy-duty AMR
Symbotic | Walmart-backed company rides wave of logistics automation demand
Toyota Research Institute | Builds large behavior models for fast robot teaching
ULC Technologies | Cable Splicing Machine improves safety, power grid reliability
Universal Robots | Cobot leader strengthens lineup with UR30

SonicSense robot hand perceives objects via acoustic vibration
https://www.therobotreport.com/sonicsense-lets-robots-perceive-objects-via-in-hand-acoustic-vibration/
Wed, 23 Oct 2024 18:22:46 +0000
Researchers give robots a sense of touch by “listening” to vibrations, allowing them to identify materials, understand shapes and recognize objects.


Researchers at Duke University have developed a system called SonicSense that gives robots a sense of touch by “listening” to vibrations. The researchers said this allows the robots to identify materials, understand shapes and recognize objects.

SonicSense is a four-fingered robotic hand with a contact microphone embedded in each fingertip. These sensors detect and record vibrations generated when the robot taps, grasps, or shakes an object. And because the microphones are in contact with the object, they allow the robot to tune out ambient noise.

“Robots today mostly rely on vision to interpret the world,” explained Jiaxun Liu, lead author of the paper and a first-year Ph.D. student in the laboratory of Boyuan Chen, professor of mechanical engineering and materials science at Duke. “We wanted to create a solution that could work with complex and diverse objects found on a daily basis, giving robots a much richer ability to ‘feel’ and understand the world.”

Based on the interactions and detected signals, SonicSense extracts frequency features and uses its previous knowledge, paired with recent advancements in AI, to figure out what material the object is made out of and its 3D shape. The researchers said if it’s an object the system has never seen before, it might take 20 different interactions for the system to come to a conclusion. But if it’s an object already in its database, it can correctly identify it in as little as four.
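
The Duke team’s exact pipeline isn’t reproduced in the article, but the approach it describes, extracting frequency features from a contact-mic recording and matching them against previously encountered objects, can be sketched roughly as follows (all names and signals here are toy stand-ins):

```python
import numpy as np

def spectral_features(signal: np.ndarray, n_bands: int = 32) -> np.ndarray:
    """Crude frequency signature: log-energy in evenly spaced FFT bands."""
    spectrum = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
    bands = np.array_split(spectrum, n_bands)
    return np.log1p(np.array([b.sum() for b in bands]))

def nearest_object(query: np.ndarray, database: dict[str, np.ndarray]) -> str:
    """Match a tap's signature against previously recorded objects."""
    return min(database, key=lambda name: np.linalg.norm(database[name] - query))

sr = 48_000
taps = {  # toy stand-ins for recorded tap responses
    "ceramic mug": np.sin(2 * np.pi * 1800 * np.arange(sr // 10) / sr),
    "plastic cup": np.sin(2 * np.pi * 400 * np.arange(sr // 10) / sr),
}
db = {name: spectral_features(sig) for name, sig in taps.items()}
query = spectral_features(np.sin(2 * np.pi * 1750 * np.arange(sr // 10) / sr))
print(nearest_object(query, db))  # -> "ceramic mug"
```

A real system would accumulate many such interactions per object, which is consistent with the article’s note that an unfamiliar object may take around 20 interactions while a known one can be identified in as few as four.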

“SonicSense gives robots a new way to hear and feel, much like humans, which can transform how current robots perceive and interact with objects,” said Chen, who also has appointments and students from electrical and computer engineering and computer science. “While vision is essential, sound adds layers of information that can reveal things the eye might miss.”

SonicSense enables robot object perception through in-hand acoustic vibration sensing.

Chen and his laboratory showcase a number of capabilities enabled by SonicSense. By turning or shaking a box filled with dice, it can count the number held within as well as their shape. By doing the same with a bottle of water, it can tell how much liquid is contained inside. And by tapping around the outside of an object, much like how humans explore objects in the dark, it can build a 3D reconstruction of the object’s shape and determine what material it’s made from.

“While most datasets are collected in controlled lab settings or with human intervention, we needed our robot to interact with objects independently in an open lab environment,” said Liu. “It’s difficult to replicate that level of complexity in simulations. This gap between controlled and real-world data is critical, and SonicSense bridges that by enabling robots to interact directly with the diverse, messy realities of the physical world.”

The team said these abilities make SonicSense a robust foundation for training robots to perceive objects in dynamic, unstructured environments. So does its cost: built with the same contact microphones that musicians use to record sound from guitars, plus 3D printing and other commercially available components, the hand costs just over $200 to construct, according to Duke University.

The researchers are working to enhance the system’s ability to interact with multiple objects. By integrating object-tracking algorithms, robots will be able to handle dynamic, cluttered environments — bringing them closer to human-like adaptability in real-world tasks.

Another key development lies in the design of the robot hand itself. “This is only the beginning. In the future, we envision SonicSense being used in more advanced robotic hands with dexterous manipulation skills, allowing robots to perform tasks that require a nuanced sense of touch,” Chen said. “We’re excited to explore how this technology can be further developed to integrate multiple sensory modalities, such as pressure and temperature, for even more complex interactions.”

SonicSense

The SonicSense robot hand includes four fingers where each fingertip is equipped with one contact microphone. | Credit: Duke University
