End Effectors / Grippers Archives - The Robot Report

COVAL releases MPXS, its smallest micro vacuum pump to date
December 4, 2024

With a width of just 12.5 mm and a weight of only 87 grams, the MPXS is the smallest vacuum pump designed by COVAL.

The MPXS micro vacuum pump puts the features of COVAL’s intelligent vacuum pumps into a smaller physical space. | Source: COVAL

COVAL SAS, a designer, producer, and marketer of vacuum components and systems, has released its latest micro vacuum pump, the MPXS. The Montélier, France-based company said it designed the pump to be pilot-controlled, ultra-compact, and equipped with high-performance communication capabilities. 

The new MPXS series is intended to provide manufacturers with an efficient tool for handling non-porous parts at high speeds on robots or automated systems, said COVAL. The micro vacuum pump follows the design principles of the company's intelligent vacuum pumps, which COVAL said combine energy efficiency, high performance, and communication I/O.

With a width of just 12.5 mm (0.4 in.) and a weight of only 87 g (3 oz.), the company said the MPXS is the smallest vacuum pump it has created. This size means it can be installed as close as possible to suction cups or inside restricted spaces for reduced pick-up time with no loss of load, guaranteeing high speeds.

COVAL is an ISO 9001 V2015-certified company that specializes in vacuum handling systems for multiple industries. It has clients in fields including packaging, automotive, food processing, plastic processing, and aeronautics. COVAL markets its products and services internationally through its subsidiaries and its network of authorized distributors.

More details about the MPXS

Thanks to single-stage Venturi technology, MPXS series micro vacuum pumps can quickly reach a maximum vacuum of 85%. This makes them suited to dynamic applications requiring very short cycle times.

COVAL said the two power levels of 0.53 and 0.92 SCFM add to the system’s versatility and enable it to adapt to the needs of each application.

The MPXS also provides the user with useful information at every stage of operation. COVAL said it equipped the system with a human-machine interface (HMI) that makes it easy to read operating, diagnostic, and maintenance information. It also enables rapid parameter setting.

In addition, the integrated IO-Link communication interface supports fast, cost-effective installation, continuous diagnostics, centralized parameter setting, and efficient communication with higher-level protocols such as EtherNet/IP, PROFINET, and EtherCAT.

MPXS micro vacuum pumps feature air-saving control (ASC) technology. COVAL said it intelligently regulates vacuum generation, enabling energy savings of 90% on average by stopping air consumption once the desired vacuum level has been reached.
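A common way to implement this kind of regulation is a simple hysteresis loop: generate vacuum until a setpoint is reached, cut the compressed-air supply, and restart only if leakage lets the level fall below a lower threshold. The Python sketch below illustrates that behavior; the thresholds, rates, and function names are illustrative assumptions, not COVAL specifications.

```python
# Minimal sketch of hysteresis-style air-saving control.
# Thresholds and the leak/generation rates are invented for illustration.

def asc_step(vacuum_pct, supply_on, setpoint=75.0, restart=65.0):
    """Decide whether the Venturi air supply should be on for this control cycle."""
    if vacuum_pct >= setpoint:
        return False   # setpoint reached: stop consuming compressed air
    if vacuum_pct <= restart:
        return True    # leakage pulled the vacuum down: regenerate
    return supply_on   # between thresholds: keep the previous state


vacuum, supply_on = 0.0, True
for t in range(40):
    supply_on = asc_step(vacuum, supply_on)
    vacuum += 8.0 if supply_on else -1.5   # crude generation/leakage model
    vacuum = max(0.0, min(85.0, vacuum))   # 85% is the pump's stated maximum
    print(f"t={t:02d}  vacuum={vacuum:5.1f}%  supply={'ON' if supply_on else 'off'}")
```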

The modularity of the MPXS series offers a wide choice of configurations, ensuring flexibility during installation and use. It is available as stand-alone modules or in islands of up to eight modules, with standard or powerful adjustable blower options.

COVAL said the MPXS micro vacuum pump’s small size, high performance, and wide range of functions and configurations make it suitable for industrial applications requiring high speeds. These include high-speed pick-and-place systems, robot manipulators, and automated production. It is especially useful for the plastics, electronics, and pharmaceutical industries, according to the company.


KIMM develops automated mooring system for docking autonomous vessels
November 19, 2024

The manual mooring process demanded substantial manpower and time, while KIMM said its automated method removes these barriers.

Principal researcher Dr. Yongjin Kim (right) and senior researcher Dr. Young-ki Kim (left) from the Department of Reliability at KIMM developed the automated mooring system. | Source: Korea Institute of Machinery and Materials

The Korea Institute of Machinery and Materials, or KIMM, has developed an automated mooring system to enhance the safety and efficiency of docking operations for autonomous vessels. The institute designed the system to overcome the limitations of conventional wire-based mooring methods. KIMM said it expects the innovation to be commercially available by 2025.

“This automated mooring system represents a key advancement in the safe docking of autonomous vessels and will play a pivotal role in the development of smart port infrastructure,” stated Dr. Yongjin Kim, principal researcher in the Department of Reliability at KIMM. “We expect this solution to set a new standard in operational safety and efficiency across the marine industry.”

The Korea Institute of Machinery and Materials is a non-profit government-funded research institute under the Ministry of Science and Information and Communication Technology. Since its foundation in 1976, KIMM has contributed to South Korea’s economic growth by researching and developing key technologies in machinery and materials, conducting reliability evaluations, and commercializing products.


KIMM aims to make the mooring process safer, faster

Previously, workers secured vessels to the port manually using thick mooring lines. These lines had to withstand high tensile loads that depended on the ship's size and weight. If a wire broke, there was a risk of accidents, and the manual mooring process demanded substantial manpower and time.

KIMM said its automated mooring system directly addresses these challenges. It uses vacuum suction pads for secure attachment and a flexible, four degree-of-freedom hydraulic system for automated control.
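To make the docking sequence easier to picture, the short Python sketch below steps through an assumed supervisory state machine: approach the hull, confirm contact, confirm suction, then hold. The states, thresholds, and sensor values are invented for explanation and are not details of KIMM's controller.

```python
# Illustrative state machine for a suction-pad mooring sequence.
# States and thresholds are assumptions, not KIMM's implementation.
from enum import Enum, auto

class MooringState(Enum):
    APPROACH = auto()   # arm extends toward the hull
    CONTACT = auto()    # pads touching, vacuum not yet confirmed
    HOLDING = auto()    # attachment secure, vessel moored
    FAULT = auto()      # vacuum lost while holding

def next_state(state, hull_distance_m, vacuum_pct):
    if state is MooringState.APPROACH and hull_distance_m < 0.05:
        return MooringState.CONTACT
    if state is MooringState.CONTACT and vacuum_pct > 70.0:
        return MooringState.HOLDING          # suction confirmed
    if state is MooringState.HOLDING and vacuum_pct < 50.0:
        return MooringState.FAULT            # leak detected: raise an alarm
    return state

state = MooringState.APPROACH
for dist, vac in [(0.50, 0), (0.04, 0), (0.04, 35), (0.04, 80), (0.04, 82)]:
    state = next_state(state, dist, vac)
    print(f"distance={dist:.2f} m  vacuum={vac:3d}%  state={state.name}")
```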

The new technology can streamline the mooring process, increasing both speed and accuracy while reducing accident risks and labor needs, according to the researchers.

The fixture used for quantitative evaluation of suction force. Source: KIMM

Korean team receives recognition, prepares for commercialization

Dr. Yongjin Kim led the team at KIMM under President Seog-Hyeon Ryu. Dr. Young-ki Kim served as a senior researcher.

The institute's project was conducted under the “Development of Smart Port-Autonomous Ship linkage Technology” initiative, supported by Korea’s Ministry of Oceans and Fisheries. For its innovation and impact, the technology has been recognized by the Korea Federation of Mechanical Engineering Societies as one of “Korea’s Top 10 Mechanical Technologies of the Year.”

The system's final performance will be verified at sea in 2025, after which development will be completed and efforts to commercialize the system will follow.

The automated ship-mooring platform, from concept to manufactured product. Source: KIMM

NRL develops robots that can service satellites in orbit
November 15, 2024

As DARPA’s robotic payload developer for the RSGS program, NRL looked to the future to build and test satellite servicing capabilities.

The Robotic Servicing of Geosynchronous Satellites payload sits in the cryogenic thermal vacuum chamber at the Naval Center for Space Technology. | Credit: Sarah Peterson, U.S. Navy 

Satellites in geosynchronous orbit about 22,000 miles above Earth are crucial for military, government, and commercial communications, as well as Earth-observing science. The U.S. Naval Research Laboratory, or NRL, and the Defense Advanced Research Projects Agency (DARPA) last month completed the development of a spaceflight-qualified robotics suite capable of servicing satellites in orbit.

Under DARPA funding, the laboratory's Naval Center for Space Technology (NCST) developed the Robotic Servicing of Geosynchronous Satellites (RSGS) Integrated Robotic Payload (IRP). The organization delivered this new space capability to commercial partner Northrop Grumman’s SpaceLogistics. The company will integrate the robotic payload with its spacecraft bus, the Mission Robotics Vehicle (MRV).

“The recent completion of thermal vacuum testing marks a major milestone toward achieving the program’s goal of demonstrating robotic servicing capabilities on orbit in the near future,” said Dr. Bruce Danly, NRL director of research.

“NRL’s contributions to the robotic payload are an essential part of realizing this vision, which promises to transform satellite operations in geostationary orbit, reduce costs for satellite operators, and enable capabilities well beyond what we have today,” he continued. “In fact, the anticipated capabilities are potentially revolutionary for both national security and civil applications.”

NRL has longstanding relationships with academia and industry as a collaborator and contractor. It participates in technology-transfer efforts such as commercial licensing, cooperative research and development, and educational partnerships.

The scientific and engineering command is dedicated to research for the U.S. Navy and Marine Corps, from the seafloor to space and the information domain. NRL is headquartered in Washington, D.C., with major field sites in Stennis Space Center, Miss.; Key West, Fla.; and Monterey, Calif. It employs approximately 3,000 civilian scientists, engineers, and support personnel.


NRL aims to unlock satellite servicing opportunities

Currently, spacecraft face significant challenges, in part because of the inability to perform in-orbit repairs or upgrades. To compensate for the lack of servicing options, satellites are often loaded with backup systems and excess fuel, leading to increased complexity, weight, and cost. 

“The military regularly fixes aircraft, tanks, ships, and trucks that break,” said Glen Henshaw, Ph.D., senior scientist for robotics and autonomous systems at NRL. “We upgrade aircraft and ships with the latest radars, computers, and engines.”

“Satellites are the only expensive equipment we buy that can’t be repaired or upgraded once they are in the field, and this costs the taxpayer money,” he added. “RSGS is intended to change this situation. We intend to demonstrate that we can upgrade and repair these valuable assets using robots.”

As DARPA’s robotic payload developer for the RSGS program, NRL looked to design, build, integrate, and test new satellite-servicing capabilities. Should this project prove successful, satellites could receive in-orbit upgrades to extend their service lives, said Bernie Kelm, superintendent of the Spacecraft Engineering Division at NRL NCST.

“This collaboration unlocks new servicing opportunities for both commercial and government satellites, enabling up-close inspections, orbital adjustments, hardware upgrades, and repairs,” he said. “We’ve created advanced spaceflight hardware and software that will significantly enhance satellite servicing operations, including all robotic controls.”

Steven Butcher, Technology Service Corp. space robotics and mechanisms engineer, inspects the RSGS payload after completing testing at the NRL’s Naval Center for Space Technology. | Credit: Sarah Peterson, U.S. Navy

About the Thermal Vacuum (TVAC) testing process

The test campaign put the robotic payload through its paces across the range of temperatures it will face while in orbit and under vacuum conditions similar to space. Engineers tested all aspects of the payload including avionics, cameras, and lights.

They also demonstrated all operations with each of the two robotic arms, including launch lock deployments, calibrations, and tool changing. The test also verified SpaceWire communications, robotic compliance, and visual servo control modes.

NRL worked for over two decades to mature the technology enabling the RSGS program. Its intent is to safely and reliably repair and upgrade satellites, some of which cost over $1 billion.

In the near future, robotic satellite “mechanics” may extend the useful life of satellites with new electronics, propulsion, or sensor capabilities, said NRL. RSGS robots could demonstrate broad servicing as a precursor to building large structures in orbit, such as observatories or solar power stations, said the researchers.

Following its anticipated 2026 launch on Northrop Grumman’s MRV spacecraft bus, the robotic payload will undergo initial checkout and calibration, with full operational servicing missions to follow.

“NRL’s Team RSGS has spent nearly 10 years focused on the goal of completing this first-of-a-kind, robotic servicing payload,” said William Vincent, NRL RSGS program manager. “The completion of IRP TVAC represents a huge milestone and countless hours of work from an incredible group of dedicated personnel. Like sending a child off to college for the first time, shipping the IRP to Dulles is a bittersweet experience.”

Once on-orbit, the RSGS payload will inspect and service satellites in geosynchronous orbit. | Source: Sarah Peterson, U.S. Navy

GelSight, Meta AI release Digit 360 tactile sensor for robotic fingers
November 2, 2024

Digit 360 deepens GelSight and Meta AI’s existing partnership and fosters a community-driven approach to robotics research.

Digit 360 uses GelSight’s tactile sensing technology for high sensitivity and micron-level resolution. | Source: Meta AI

GelSight, a developer of tactile technology, and Meta AI announced Digit 360, a tactile sensor for robotic fingers. This signifies the next stage of the partnership between the companies, which was established in 2021 with the launch of the Digit tactile sensor.

Digit 360 is equipped with more than 18 sensing features. The companies said these will enable advancements in touch perception research and allow researchers to either combine its various sensing technologies or isolate individual signals for in-depth analysis of each modality.

The sensor's tactile-specific optical lens can see imprints all around the artificial fingertip, capturing finer details about the surface in contact with an object.

“GelSight and Meta AI share the same vision to make tactile sensing more ubiquitous and accessible,” said Youssef Benmokhtar, CEO, GelSight. “Digit 360 will advance the digitization of touch and unlock new applications in robotics with its ability to capture omnidirectional deformations on the fingertip surface.”

GelSight is a developer of imaging-based tactile intelligence. The company’s proprietary technology was invented at the Massachusetts Institute of Technology. It provides detailed and rapid surface characterization, enabling several surface measurement applications and robotic sensing capabilities. 

GelSight said its elastomeric 3D imaging systems are currently used in aerospace, automotive, forensics, and robotics research labs worldwide.


Digit 360 uses optics for a sense of touch

Over time, researchers can use Digit 360 to develop AI that can better understand and model the real world, including the physicality of objects, human-object interaction, and contact physics. The sensor can detect minute changes in spatial detail and capture forces as small as 1 millinewton.

GelSight’s elastomeric and imaging-based tactile sensing digitizes the sense of touch, enabling robotic engineers to develop solutions for the analysis of any surface regardless of material type or reflectivity, complex object manipulation, and many other dexterous tasks.

Beyond advancing robot dexterity, GelSight said Digit 360 has potential applications in medicine, prosthetics, virtual reality, telepresence, and more. For virtual worlds, Digit 360 can help better ground virtual interactions with the environment to more realistic representations of object properties beyond their visual appearances. Meta AI said it will open-source all code and designs developed using Digit 360.

Meta AI integrates sensors and AI

Meta AI also partnered with South Korea-based Wonik Robotics to develop the Allegro Hand. The company said this will be a fully integrated robotic hand with tactile sensors. 

Building on the Meta Digit Plexus platform, the next generation of Allegro Hand could help advance robotics research by making it easier for researchers to conduct experiments. Wonik Robotics will manufacture and distribute the Allegro Hand, which will be made available next year. 

“Wonik Robotics and Meta FAIR aim to introduce robotic hands to global companies, research institutes, and universities so they can continue developing robotic hand technology that is safe and helpful to humankind,” said Dr. Yonmook Park, executive director and the head of future technology headquarters at Wonik Robotics.

Life Line Emergency Vehicles deploys robotic sander from GrayMatter Robotics
October 31, 2024

Life Line didn't have enough staffers to go around making ambulances, so it applied GrayMatter's Scan&Sand system.

GrayMatter Robotics recently provided a peek into its deployment with Life Line Emergency Vehicles. Sumner, Iowa-based Life Line Emergency Vehicles has 35 years of experience building life-saving ambulances, with an emphasis on quality, safety, and durability. 

Each of Life Line’s vehicles is one of a kind, built according to the customer’s needs and specifications. The company said it has built a reputation around customer service and craftsmanship to help its customers provide better patient care.

Robustness is key in this line of work. These vehicles will be saving people’s lives, so malfunctions aren’t an option. 

However, the process of building an ambulance is incredibly labor-intensive. In recent years, Life Line said it has struggled to find enough people to continue serving its customers.

Sanding, in particular, is a difficult part of the vehicle-construction process. It involves going up and down ladders, dragging hoses around, and hauling heavy equipment. It’s also time-consuming and pulls Life Line’s employees away from work that is better suited for humans.

This is where GrayMatter Robotics comes in. The Carson, Calif.-based company offers its robotic sanders through a robotics-as-a-service (RaaS) model.

It said they can relieve shop floor workers of tedious and ergonomically challenging tasks. They can also enhance production capacity and reduce scrap, repair, and rework costs, said GrayMatter. 


Scan&Sand designed for reliable finishing

Scan&Sand is an AI-powered system that scans an object and sands it with the push of a button. GrayMatter created the system for high-mix manufacturing, where there is great variety in the shape and size of pieces that need to be sanded. 

The company said its technology can handle complex geometries with ease. The robots can prepare a surface and sand for shape correction. 

GrayMatter’s system autonomously replaces worn sandpaper and switches between different grits as needed. It can also swap end-of-arm tools (EOAT) to shift from sanding to grinding or other applications in just minutes. 

With Scan&Sand, operators can precisely define sanding areas by marking them directly on the part. The system’s advanced packages feature built-in quality control for precise measurement of roughness, gloss, or surface imperfections.

GrayMatter also uses high-fidelity sensors to swiftly scan and model parts within minutes, creating a precise, unique working model for each component placed in the cell.

The company offers multiple Scan&Sand configurations, so the system can fit into a variety of workflows. These include dual-arm and rail configurations, as well as mobile arm and rail configurations.

Life Line shares its results with GrayMatter

Life Line said that GrayMatter’s technology filled essential gaps in its production. The vehicle maker noted that this autonomous technology wouldn’t be replacing any workers, as there simply weren’t enough people to get everything done without the robots. 

In addition to filling these gaps, GrayMatter made production faster for Life Line. Employees said the system shaved 45 minutes off of sanding a single raw truck. For a primer truck, it cut the sanding time by 30 minutes to an hour.

The Life Line team also noted that GrayMatter was a very present partner when it came to deploying this technology. The vendor didn’t just send them a robot and disappear, it said. In addition, GrayMatter was upfront about the learning curve of its robots, said Life Line’s team. 

SonicSense robot hand perceives objects via acoustic vibration
October 23, 2024

Researchers give robots a sense of touch by “listening” to vibrations, allowing them to identify materials, understand shapes, and recognize objects.

Researchers at Duke University have developed a system called SonicSense that gives robots a sense of touch by “listening” to vibrations. The researchers said this allows the robots to identify materials, understand shapes and recognize objects.

SonicSense is a four-fingered robotic hand with a contact microphone embedded in each fingertip. These sensors detect and record vibrations generated when the robot taps, grasps, or shakes an object. Because the microphones are in contact with the object, the robot can tune out ambient noise.

“Robots today mostly rely on vision to interpret the world,” explained Jiaxun Liu, lead author of the paper and a first-year Ph.D. student in the laboratory of Boyuan Chen, professor of mechanical engineering and materials science at Duke. “We wanted to create a solution that could work with complex and diverse objects found on a daily basis, giving robots a much richer ability to ‘feel’ and understand the world.”

Based on the interactions and detected signals, SonicSense extracts frequency features and uses its previous knowledge, paired with recent advancements in AI, to figure out what material the object is made out of and its 3D shape. The researchers said if it’s an object the system has never seen before, it might take 20 different interactions for the system to come to a conclusion. But if it’s an object already in its database, it can correctly identify it in as little as four.
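To make the idea concrete, the Python sketch below shows one simplified way such a pipeline could work: reduce a vibration recording to band-energy features and match them against a small database of known objects. The feature choice, classifier, and synthetic signals are assumptions for illustration, not Duke's published method.

```python
# Simplified stand-in for acoustic material recognition: FFT band energies
# plus nearest-neighbor matching against known objects.
import numpy as np

def frequency_features(signal, n_bands=16):
    """Summarize a vibration recording as normalized energy per frequency band."""
    spectrum = np.abs(np.fft.rfft(signal))
    bands = np.array_split(spectrum, n_bands)
    feats = np.array([band.mean() for band in bands])
    return feats / (feats.sum() + 1e-9)       # normalize so overall loudness cancels

def classify(feats, database):
    """Return the label whose stored feature vector is closest to `feats`."""
    return min(database, key=lambda label: np.linalg.norm(feats - database[label]))

# Toy usage: synthetic sine waves stand in for tap recordings of known objects.
t = np.linspace(0.0, 0.1, 1600)               # 0.1 s at roughly 16 kHz
database = {
    "metal": frequency_features(np.sin(2 * np.pi * 3000 * t)),
    "wood":  frequency_features(np.sin(2 * np.pi * 400 * t)),
}
rng = np.random.default_rng(0)
query = frequency_features(np.sin(2 * np.pi * 2900 * t) + 0.1 * rng.standard_normal(t.size))
print(classify(query, database))               # expected: "metal"
```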

“SonicSense gives robots a new way to hear and feel, much like humans, which can transform how current robots perceive and interact with objects,” said Chen, who also has appointments and students from electrical and computer engineering and computer science. “While vision is essential, sound adds layers of information that can reveal things the eye might miss.”

SonicSense enables robot object perception through in-hand acoustic vibration sensing.

Chen and his laboratory showcased a number of capabilities enabled by SonicSense. By turning or shaking a box filled with dice, the system can count the dice inside and determine their shape. By doing the same with a bottle of water, it can tell how much liquid is inside. And by tapping around the outside of an object, much as humans explore objects in the dark, it can build a 3D reconstruction of the object’s shape and determine what material it is made from.

“While most datasets are collected in controlled lab settings or with human intervention, we needed our robot to interact with objects independently in an open lab environment,” said Liu. “It’s difficult to replicate that level of complexity in simulations. This gap between controlled and real-world data is critical, and SonicSense bridges that by enabling robots to interact directly with the diverse, messy realities of the physical world.”

The team said these abilities make SonicSense a robust foundation for training robots to perceive objects in dynamic, unstructured environments. So does its cost: built with the same contact microphones that musicians use to record guitars, plus 3D printing and other commercially available components, the hand costs just over $200 to construct, according to Duke University.

The researchers are working to enhance the system’s ability to interact with multiple objects. By integrating object-tracking algorithms, robots will be able to handle dynamic, cluttered environments — bringing them closer to human-like adaptability in real-world tasks.

Another key development lies in the design of the robot hand itself. “This is only the beginning. In the future, we envision SonicSense being used in more advanced robotic hands with dexterous manipulation skills, allowing robots to perform tasks that require a nuanced sense of touch,” Chen said. “We’re excited to explore how this technology can be further developed to integrate multiple sensory modalities, such as pressure and temperature, for even more complex interactions.”

The SonicSense robot hand includes four fingers where each fingertip is equipped with one contact microphone. | Credit: Duke University

Researchers create robotic finger that could perform medical examinations
October 9, 2024

The researchers said this technology could make it easier for doctors to detect diseases like breast cancer when they're more treatable.

The researchers’ robotic finger contains conductive fiber coils and a twisted liquid-metal fiber at the fingertip. | Source: Hongbo Wang

Researchers at the University of Science and Technology of China said they have developed a soft robotic “finger” with a sense of touch that can perform routine doctor office examinations, including taking a patient’s pulse and checking for abnormal lumps.

The scientists said this technology could make it easier for doctors to detect diseases like breast cancer early, when they are more treatable. It could also put patients at ease during physical exams that can seem uncomfortable and invasive.

“By further development to improve its efficiency, we also believe that a dexterous hand made of such fingers can act as a ‘Robodoctor’ in a future hospital, like a physician,” stated Hongbo Wang, a sensing technologies researcher at the University of Science and Technology of China and an author of the study.

“Combined with machine learning, automatic robotic examination and diagnosis can be achieved, particularly beneficial for these undeveloped areas where there is a serious shortage in health workers,” he said.


Robotic finger is delicate enough for human contact

While rigid robotic fingers already exist, experts have raised concerns that these devices might not be up to the delicate tasks required in a doctor’s office setting. Some have pointed to potential safety issues, including a fear that overzealous robotic fingers could rupture lumps during examinations. 

More recently, scientists have developed lightweight, safe, and low-cost soft robotics that can recreate the movements of human hands. However, these devices typically haven’t been able to sense the complex properties of objects they touch the way real fingers do.

“Despite the remarkable progress in the last decade, most soft fingers presented in the literature still have substantial gaps compared to human hands,” the authors wrote. As a result, robotic fingers have not been ready to handle real-world scenarios, they said.

To overcome this challenge, the University of Science and Technology of China team developed a simple device with two conductive-fiber sensing elements. One is a coil wound on each air chamber of the device’s bending actuators, and the other is a twisted liquid-metal fiber mounted at the fingertip. This way, the device can perceive an object’s properties as effectively as human touch.

The robotic finger is designed to be soft and sensitive enough for medical diagnosis. Source: Hongbo Wang

Researchers put soft finger to the test

To test the device, the researchers started by brushing a feather against its fingertip.

“The magnified view clearly shows the resistance change, indicating its high sensitivity in force sensing,” the authors wrote.

Next, they tapped and pushed the fingertip with a glass rod and repeatedly bent the finger, observing that the device’s sensors accurately perceived the type and quantity of force they applied.

To test the finger’s medical chops, they mounted it on a robotic arm and watched as it identified three lumps embedded in a large silicone sheet, pressing on them like a doctor would. While mounted on the robotic arm, the finger also correctly located an artery on a participant’s wrist and took their pulse.

“Humans can easily recognize the stiffness of diverse objects by simply pressing it with their finger,” said the researchers. “Similarly, since the [device] has the ability to sense both its bending deformation and the force at the fingertip, it can detect stiffness similar to our human hand by simply pressing an object.”
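A rough sketch of that stiffness estimate, assuming force and displacement readings are available from the finger's sensors, is a simple slope fit of force against pressing depth. The readings below are invented, not data from the study.

```python
# Stiffness as the slope of force versus displacement while pressing.
import numpy as np

def estimate_stiffness(displacement_mm, force_n):
    """Least-squares slope of force versus displacement, in N/mm."""
    slope, _intercept = np.polyfit(displacement_mm, force_n, 1)
    return slope

# Made-up readings from pressing two different samples to the same depth.
soft_tissue = estimate_stiffness([0.0, 0.5, 1.0, 1.5], [0.00, 0.05, 0.11, 0.16])
hard_lump   = estimate_stiffness([0.0, 0.5, 1.0, 1.5], [0.00, 0.40, 0.82, 1.21])
print(f"soft tissue: {soft_tissue:.2f} N/mm, lump: {hard_lump:.2f} N/mm")
```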

In addition to taking pulses and examining simulated lumps, the researchers found that the robotic finger can type “like a human hand,” spelling out the word “hello.”

Additional sensors provide even more flexibility in the robotic finger’s joints. They allow the device to move in multiple directions like a human finger, so it may be ready to perform effective and efficient medical examinations in the near future, the team concluded.

“We hope to develop an intelligent, dexterous hand, together with a sensorized artificial muscle-driven robotic arm, to mimic the unparalleled functions and fine manipulations of the human hands,” said Wang.

This work was published in the Cell Press journal Cell Reports Physical Science.

Fourier launches GR-2 humanoid, software platform
September 30, 2024

Fourier launches the second generation of its humanoid robot, the GR-2, and the accompanying development platform.

Shanghai-based Fourier today launched GR-2, the latest generation of its GRx humanoid robot series. The company has upgraded its hardware, design, and software.

“GR-2 is a big step into the future of humanoid robotics,” stated Alex Gu, CEO of Fourier. “We’re passionate about building the most intuitive embodied agent for AI, allowing it to engage with the physical world in ways like never before. Fourier is excited to have developers, researchers, and enterprises join us on this incredible journey.”

This announcement followed the company's rebranding from Fourier Intelligence to Fourier earlier this year, and the GR-2 release builds on the production release of the first-generation GR-1 in late 2023.


Fourier improves hardware design

The GR-2 stands 175 cm (68.9 in.) tall and weighs 63 kg (139 lb.), whereas the GR-1 is 165 cm (65 in.) tall and weighs 55 kg (121 lb.). GR-2 offers 53 degrees of freedom and a single-arm load capacity of 3 kg (6.6 lb.).

A new feature of Fourier’s humanoid is a detachable battery with twice the capacity of its predecessor and a runtime of up to two hours. Because the battery is detachable, users can swap it out quickly and return GR-2 to work.

GR-2 features an integrated cabling design for power and communication transmission, allowing concealed wires and more compact packaging. The efficient layout optimizes space for easier modularization and greater adaptiveness for application-oriented customization.

To simplify the control system and reduce maintenance, Fourier redesigned GR-2’s joint configuration, shifting from a parallel to a serial structure. It said this improves debugging, lowers manufacturing costs, and enhances the robot’s ability to rapidly learn and transition from AI simulation to real-world applications.

Dexterous hands have 12 degrees of freedom

The 12-DoF Dexterous Hand is equipped with six array-type tactile sensors. | Credit: Fourier

The robot includes hands with 12 degrees of freedom, doubling the dexterity of previous models. The hands are designed to mirror the flexibility of human physiology and offer greater precision in the tasks that the robot will be asked to complete, said Fourier.

The fingers on GR-2 also have six array-type tactile sensors that sense force and can identify object shapes and materials. This enables new algorithms to optimize parts handling, the company said.

Supporting multiple upper-limb teaching modes—virtual reality remote control, lead-through programming, and direct command—GR-2 can record a comprehensive set of operational data, from motion paths to tactile responses. Fourier said it expects robust data collection to bridge the gap between virtual models and real-world applications, pushing the boundaries of robot training and deployment further.

FSA 2.0 powers dynamic mobility

To optimize its movement, Fourier developed seven distinct Fourier Smart Actuators (FSA) for GR-2, each tailored to meet the specific torque demands of each joint.

With peak torques exceeding 380 N.m (280.3 ft.-lb.), the FSA 2.0 actuators boost GR-2’s agility and dynamic capabilities. The dual-encoder system doubles control accuracy, ensuring precise movements even in high-pressure environments.

Designed for both speed and precision, FSA 2.0 empowers GR-2 to navigate complex tasks with greater flexibility, said Fourier.

Fourier optimizes tools for open-source software development

Fourier GRx series outlines six key areas for humanoid development—locomotion, manipulation, cognition, bionic design, user experience, and commercial viability. | Credit: Fourier

Fourier optimized GR-2’s development platform by introducing a new software development kit (SDK) compatible with frameworks such as ROS. Developers can access a suite of pre-optimized modules for machine vision, path planning, and force feedback control through application programming interfaces (APIs), said the company.

Supporting frameworks such as NVIDIA Isaac Lab and MuJoCo, the new platform empowers developers to focus on innovation and streamline their workflows, claimed Fourier.

Google DeepMind discusses latest advances in robot dexterity
September 27, 2024

The DeepMind team said that for robots to be more useful, they need to get better at making contact with objects in dynamic environments.

ALOHA Unleashed achieves a high level of dexterity in bi-arm manipulation. | Source: Google DeepMind

Google DeepMind recently gave insight into two artificial intelligence systems it has created: ALOHA Unleashed and DemoStart. The company said that both of these systems aim to help robots perform complex tasks that require dexterous movement. 

Dexterity is a deceptively difficult skill to acquire. There are many tasks that we do every day without thinking twice, like tying our shoelaces or tightening a screw, that could take weeks of training for a robot to do reliably.

The DeepMind team asserted that for robots to be more useful in people’s lives, they need to get better at making contact with physical objects in dynamic environments.

The Alphabet unit's ALOHA Unleashed is aimed at helping robots learn to perform complex and novel two-armed manipulation tasks. DemoStart uses simulations to improve real-world performance on a multi-fingered robotic hand.

By helping robots learn from human demonstrations and translate images to action, these systems are paving the way for robots that can perform a wide variety of helpful tasks, said DeepMind.


ALOHA Unleashed enables manipulation with two robotic arms

Until now, most advanced AI robots have only been able to pick up and place objects using a single arm. ALOHA Unleashed achieves a high level of dexterity in bi-arm manipulation, according to Google DeepMind. 

The researchers said that with this new method, Google’s robot learned to tie a shoelace, hang a shirt, repair another robot, insert a gear, and even clean a kitchen.

ALOHA Unleashed builds on DeepMind’s ALOHA 2 platform, which was based on the original ALOHA low-cost, open-source hardware for bimanual teleoperation from Stanford University. ALOHA 2 is more dexterous than prior systems because it has two hands that can be teleoperated for training and data-collection purposes. It also allows robots to learn how to perform new tasks with fewer demonstrations. 

Google also said it has improved upon the robotic hardware’s ergonomics and enhanced the learning process in its latest system. First, it collected demonstration data by remotely operating the robot’s behavior, performing difficult tasks such as tying shoelaces and hanging T-shirts.

Next, it applied a diffusion method, predicting robot actions from random noise, similar to how the Imagen model generates images. This helps the robot learn from the data, so it can perform the same tasks on its own, said DeepMind.
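The Python sketch below illustrates the general shape of that idea: start from random noise and iteratively refine it toward an action conditioned on an observation. The denoiser here is a toy stand-in, not DeepMind's trained model, and the schedule and dimensions are assumptions.

```python
# Toy illustration of diffusion-style action sampling: noise -> repeated
# denoising steps -> action. The "denoiser" is a placeholder, not a network.
import numpy as np

def toy_denoiser(noisy_action, observation, step, total_steps):
    """Stand-in for a trained network: nudge the sample toward a target action."""
    target = observation["goal_action"]
    alpha = (step + 1) / total_steps           # denoise more strongly near the end
    return noisy_action + alpha * (target - noisy_action)

def sample_action(observation, action_dim=7, steps=10, seed=0):
    rng = np.random.default_rng(seed)
    action = rng.standard_normal(action_dim)   # begin from pure noise
    for step in range(steps):
        action = toy_denoiser(action, observation, step, steps)
    return action

obs = {"goal_action": np.linspace(-0.2, 0.2, 7)}   # e.g. seven joint targets
print(np.round(sample_action(obs), 3))
```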

DeepMind uses reinforcement learning to teach dexterity

Controlling a dexterous, robotic hand is a complex task. It becomes even more complex with each additional finger, joint, and sensor. This is a challenge Google DeepMind is hoping to tackle with DemoStart, which it presented in a new paper. DemoStart uses a reinforcement learning algorithm to help new robots acquire dexterous behaviors in simulation. 

These learned behaviors can be especially useful for complex environments, like multi-fingered hands. DemoStart begins learning from easy states, and, over time, the researchers add in more complex states until it masters a task to the best of its ability.
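The toy Python sketch below conveys the flavor of such an auto-curriculum: track the success rate at the current difficulty and raise the difficulty once the policy masters it. The environment, thresholds, and update rule are simplified stand-ins, not DemoStart's actual algorithm.

```python
# Toy auto-curriculum: make episodes harder only after the current stage is mastered.
import random

def run_episode(difficulty, skill):
    """Toy environment: success is more likely when skill exceeds difficulty."""
    return random.random() < min(0.95, max(0.05, skill - difficulty + 0.5))

random.seed(0)
difficulty, skill = 0.0, 0.0
for epoch in range(15):
    episodes = 50
    successes = sum(run_episode(difficulty, skill) for _ in range(episodes))
    rate = successes / episodes
    skill += 0.05 * rate                        # stand-in for a policy update
    if rate > 0.8:                              # stage mastered: start farther from the goal
        difficulty = min(1.0, difficulty + 0.1)
    print(f"epoch {epoch:02d}  difficulty={difficulty:.1f}  success={rate:.0%}")
```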

This system requires 100x fewer simulated demonstrations to learn how to solve a task in simulation than what’s usually needed when learning from real-world examples for the same purpose, said DeepMind. 

After training, the research robot achieved a success rate of over 98% on a number of different tasks in simulation. These include reorienting cubes with a certain color showing, tightening a nut and bolt, and tidying up tools.

In the real-world setup, it achieved a 97% success rate on cube reorientation and lifting, and 64% on a plug-socket insertion task that required a high degree of finger coordination and precision.

The DEX-EE dexterous robotic hand, developed by Shadow Robot, in collaboration with the Google DeepMind robotics team. | Source: Shadow Robot

Training in simulation offers benefits, challenges

Google said it developed DemoStart with MuJoCo, its open-source physics simulator. After mastering a range of tasks in simulation and using standard techniques to reduce the sim-to-real gap, such as domain randomization, its approach was able to transfer nearly zero-shot to the physical world.
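Domain randomization itself is straightforward to illustrate: each simulated episode draws different physical and visual parameters so the learned policy cannot overfit to one exact model. The parameter names and ranges below are arbitrary examples, not values used by DeepMind.

```python
# Minimal domain randomization sketch: fresh simulation parameters per episode.
import random

def randomized_sim_params():
    """Draw a new set of simulation parameters for one training episode."""
    return {
        "friction":          random.uniform(0.4, 1.2),
        "object_mass_kg":    random.uniform(0.05, 0.3),
        "actuator_delay_ms": random.uniform(0.0, 20.0),
        "camera_hue_shift":  random.uniform(-0.1, 0.1),
    }

random.seed(0)
for episode in range(3):
    params = randomized_sim_params()
    print(f"episode {episode}: {params}")
    # train_policy_on_episode(make_env(**params))  # placeholder for the actual RL update
```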

Robotic learning in simulation can reduce the cost and time needed to run actual, physical experiments. Google said it’s difficult to design these simulations, and they don’t always translate successfully back into real-world performance.

By combining reinforcement learning with learning from a few demonstrations, DemoStart’s progressive learning automatically generates a curriculum that bridges the sim-to-real gap. This makes it easier to transfer knowledge from simulation to a physical robot and reduces the cost and time of physical experiments.

To enable more advanced robot learning through intensive experimentation, Google tested this new approach on a three-fingered robotic hand, called DEX-EE, which was developed in collaboration with Shadow Robot.

Google said that while it still has a long way to go before robots can grasp and handle objects with the ease and precision of people, it is making significant progress.

Flexiv develops fish fillet shaping application for its Rizon 4 robot
September 25, 2024

Flexiv said its new fish fillet shaping application can boost speed and accuracy without compromising quality.

A new fish fillet shaping application uses a Rizon 4 adaptive robot to create uniform portions and shapes. | Source: Flexiv

The result of increasingly adaptive robots may soon be no farther than your next meal. Flexiv yesterday said its Rizon 4 robot is part of a new food-processing application.

Traditionally, the task of shaping fish fillets has been performed by manual labor. Employees need to maintain a high level of concentration to shape each fillet so that the end product is consistent in size, explained Flexiv. The exhausting nature of the work led to high staff turnover and operational challenges.

The Santa Clara, Calif.-based company said it developed its latest system for a leading Asian seafood producer. The system uses a combination of computer vision and force control to identify and shape breaded cod fillets.

“We were told that the shaping stage was difficult for the client to staff, so we used our adaptive technology to not only solve the problem, but [also] improve how it’s completed,” stated Liang Mao, Flexiv’s new market solution director. “Our fillet-shaping solution ensures consistency, speeds up production, and maintains the highest standards of quality, all while completing a manual labor type task that no one wants to do.”


Tooling and AI bring superhuman senses, speed to fillets

Flexiv claimed that its custom end-of-arm tooling (EOAT) and computer vision “enable precision and efficiency far beyond human capabilities.” The system uses AI-powered computer vision to accurately detect the location of each fillet on the production line.

Then, operating in sync with the conveyor line, Flexiv’s Rizon 4 robots shape the portions without lifting them. They use force control to apply the precise amount of pressure needed to form the fillets without causing damage.
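The following Python sketch shows the basic shape of such a force-regulation loop: compare the measured contact force against a target and nudge the tool height a little each control cycle. The gains, targets, and contact model are invented values, not Flexiv parameters.

```python
# Toy force-regulation loop for pressing a part without damaging it.
def force_control_step(measured_force_n, target_force_n=2.0, gain_mm_per_n=0.5):
    """Return a small vertical correction in mm; negative means press down harder."""
    error = target_force_n - measured_force_n
    return -gain_mm_per_n * error

# Example run: the tool starts with too little contact force and converges on target.
force_n, height_mm = 0.5, 10.0
for cycle in range(6):
    correction = force_control_step(force_n)
    height_mm += correction
    force_n += -correction * 0.8              # crude contact model: lower tool -> more force
    print(f"cycle {cycle}: height={height_mm:5.2f} mm  force={force_n:4.2f} N")
```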

The robots can operate at speeds and consistency that are impossible for human workers, boosting productivity without compromising quality, asserted Flexiv. The fully automated system also includes a self-cleaning feature to ensure hygiene and operational effectiveness, it added.

Flexiv designs cobot for ease of deployment

The company equipped its Rizon 4 collaborative robot with seven degrees of freedom and an 8.8 lb. (4 kg) payload capacity. It features industrial-grade capabilities, including fine end force sensing, precise end force control accuracy, multi-dimensional hybrid motion/force control, disturbance rejection, and whole-body multi-point force control.

Flexiv added that its system is designed to be easy to use and quick to deploy. Users can integrate the fish fillet-shaping system into an existing production line in half a working day, and it requires no specialized training or prior experience in robotics, the company said.

Other applications of Rizon 4 include production of car seats and ultrasound scanning for healthcare diagnosis. Flexiv also recently announced Moonlight, an adaptive parallel robot (see video below).

Founded in 2016, Flexiv has established offices in Silicon Valley, Shanghai, Beijing, Taiwan, and Singapore. With expertise in integrating force control, computer vision, and AI, the company said its turnkey automation can improve efficiency while reducing operational costs and environmental impact.

ATI Industrial Automation’s GBX 10 Gigabit Ethernet module aims to speed up robotic tool changing
August 28, 2024

ATI says its GBX 10 Ethernet module excels in high-speed data and signal communication with vision, inspection, metrology, motion control, and more.

The GBX Ethernet Module includes plug-and-play connectors for pass-through of Cat6/Cat6a industrial Ethernet cables. | Source: ATI Industrial Automation

ATI Industrial Automation today announced the GBX 10 Gigabit Tool Changer Ethernet Module. The company said it designed this module to optimize performance and flexibility of smart manufacturing processes with timely, accurate communication. 

When industrial robots first joined production lines, manufacturers expected a robot to repeatedly perform a single task for hours on end, noted ATI. Now, as end users look for more flexibility in their operations, they increasingly expect robots to not only perform a variety of tasks, but also to be able to seamlessly switch between those tasks on their own. 

This is where tool changers, which allow robots to automatically change end effectors or other peripherals, come into play, said the company.

Robotic tool changers help users get the most out of their robots, assuming they work quickly and seamlessly. Every second a robot spends changing over a tool is a second not spent doing the task at hand, according to ATI.


GBX 10 Gigabit module designed for multiple applications

ATI said its Ethernet module provides safe and reliable transmission of high-speed data, signals, and fieldbus communication for advanced automation and artificial intelligence. It claimed that the new module provides signal communications for a variety of industries and applications, including vision, inspection, metrology, motion control, and fieldbus systems.

Additional features and benefits of the GBX 10 Gigabit Tool Changer Ethernet Module include:

  • Plug-and-play M12 8-pin X-code connectors for safe and reliable pass-through of Cat6/Cat6a industrial Ethernet cables, saving time during integration, with no stripping, crimping, or soldering required
  • Configurable cable exit options between -90°, 0°, and +90° radial positions, allowing users to optimize cable dress and routing to reduce cable strain and maximize system uptime
  • Superior longevity, handling up to 1 million mating cycles to support applications in advanced industries such as automotive, aerospace, testing, and more
  • Broad compatibility with ATI-standard tool changers, heavy-duty tool changers, and utility couplers for widespread module standardization
The GBX 10 Gigabit Ethernet tool changer provides connectivity for a range of robot applications. Source: ATI Industrial Automation

About ATI Industrial Automation

Apex, N.C.-based ATI is a developer of robotic accessories. Its products include automatic tool changers, multi-axis force/torque sensing systems, utility couplers, material removal tools, robotic collision sensors, manual tool changers, and compliance devices.

Founded in 1989, the company said it aims to develop cost-effective, state-of-the-art end-effector products that improve robotic productivity. 

In November 2023, the company released the MC-50 Manual Tool Changer, which it said provides high performance, reliability, and quality for the manual exchange of robotic tooling. This compact, robust tool changer is designed for applications on collaborative robots that support payloads up to 25 kg (55.1 lb.) and small industrial robots supporting payloads up to 10 kg (22 lb.).

NSF awards Northwestern with $26M for dexterous hand research
August 23, 2024

The center at Northwestern aims to create robots capable of intelligent and versatile grasping, fine motor skills, and hand-eye coordination.

The HAND project at the new Engineering Research Center will develop dexterous robot hands to assist with manufacturing and more. | Source: Northwestern University

The human hand is made up of 27 bones, 27 joints, 34 muscles, and more than 100 ligaments and tendons. With all of that hardware, it’s no surprise that roboticists have struggled to equal that level of dexterity. With a $26 million grant from the National Science Foundation, Northwestern University will launch a new Engineering Research Center, or ERC, focusing on improving the ability of robots to amplify human labor. 

The Human AugmentatioN via Dexterity (HAND) ERC plans to develop a robot hand with the dexterity to assist humans with manufacturing, caregiving, handling precious or dangerous materials, and more. The center said its goal is to build robots capable of intelligent and versatile grasping, fine motor skills, and hand-eye coordination. It also intends to develop technologies that are versatile and easy to integrate. 

“This new NSF award is a historic milestone that builds on Northwestern’s well-recognized expertise in robotics and human-machine systems,” stated Eric Perreault, vice president for research at Northwestern University. “The HAND proposal is bold and visionary.”

“It will have a long-lasting, positive effect on manufacturing, food processing, healthcare and many other areas that rely on dexterous manipulation,” he added. “Ed Colgate, Kevin Lynch, and their exceptional colleagues across Northwestern have built a world-class team of industrial and academic partners to ensure this cutting-edge research creates practical outcomes.”

The NSF grant will fund the new center across five years, with the ability to renew for another $26 million for an additional five years. This marks the first ERC led by the university.

Core partners include Carnegie Mellon University, Florida A&M, and Texas A&M. Additional faculty support is coming from Syracuse University, the University of Wisconsin-Madison, and the Massachusetts Institute of Technology.


Researchers plan to improve robot utility  

While robots already play an important role in manufacturing and can improve workers’ job quality and raise their wages, Colgate said their full potential has been limited. Developing robotic hands that are as versatile and dexterous as human hands will enable robots to expand human capabilities and boost industry competitiveness, he said.

But dexterity isn’t the new center’s only goal. The researchers said they also want to ensure that new robotic hands are inexpensive, easy to operate without expertise, robust, durable, and ready for mass production. The robotics researchers plan to work across disciplines to engage experts in education, policy, and accessibility.

Northwestern touted potential benefits including increased worker productivity, improved job opportunities, reshoring of manufacturing, and reduced supply chain vulnerability. Other potential outcomes include enhanced food safety, improved quality of life, and democratization of technology.

The HAND ERC research at Northwestern University includes teleoperated soldering.

The HAND ERC is working toward AI-powered dexterous skills. Source: Northwestern University

Researchers come from across Northwestern University

An expert in robots and haptics, J. Edward Colgate, a Walter P. Murphy Professor of Mechanical Engineering at Northwestern’s McCormick School of Engineering, will lead the center.

Kevin Lynch, a professor of mechanical engineering at McCormick and director of Northwestern’s Center for Robotics and Biosystems, will serve as the center’s research director.

Other collaborators at the university include McCormick’s Brenna Argall, Jian Cao, Matthew Elwin, Elizabeth Gerber, Todd Murphey, and Ryan Truby and the School of Education and Social Policy’s Lois Trautvetter.

About the ERC program

Since its founding in 1985, the ERC program has supported convergent research, education, and technology translation at U.S. universities. Each ERC unites members from academia, industry, and government to produce transformational engineered systems along with engineering graduates who are adept at innovation and primed for leadership in the global economy.

“NSF’s Engineering Research Centers ask big questions in order to catalyze solutions with far-reaching impacts,” said NSF Director Sethuraman Panchanathan.

“NSF Engineering Research Centers are powerhouses of discovery and innovation, bringing America’s great engineering minds to bear on our toughest challenges,” he said. “By collaborating with industry and training the workforce of the future, ERCs create an innovation ecosystem that can accelerate engineering innovations, producing tremendous economic and societal benefits for the nation.”

Since its founding in 1985, the NSF’s ERC program has funded 83 centers (including four announced this week) that receive support for up to 10 years. The centers build partnerships with educational institutions, government agencies, and industry stakeholders to support innovation and inclusion in established and emerging engineering research.

The post NSF awards Northwestern with $26M for dexterous hand research appeared first on The Robot Report.

]]>
https://www.therobotreport.com/nsf-awards-northwestern-with-26m-for-dexterous-hand-research/feed/ 0
GITAI to share challenges of building a robotic workforce in space at RoboBusiness https://www.therobotreport.com/gitai-shares-challenges-of-building-a-robotic-workforce-in-space/ https://www.therobotreport.com/gitai-shares-challenges-of-building-a-robotic-workforce-in-space/#respond Thu, 22 Aug 2024 18:59:43 +0000 https://www.therobotreport.com/?p=580372 During RoboBusiness, GITAI will examine what a labor-abundant space industry will look like with robotics at its center.

The post GITAI to share challenges of building a robotic workforce in space at RoboBusiness appeared first on The Robot Report.

]]>

Some of the most promising tasks for robots are those that humans cannot safely perform. They include decommissioning nuclear plants, exploring the ocean’s depths, and gathering information on other planets and moons. GITAI USA Inc., which is developing systems to support space exploration, will be among the innovators to see at RoboBusiness this October. 

The space robotics market was estimated to be worth $4.4 billion in 2022 and is projected to grow at a compound annual growth rate (CAGR) of 8.8% through 2030, according to Grand View Research. This leaves ample opportunity for robotics developers.
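For a rough sense of scale, compounding the cited 2022 estimate at 8.8% per year through 2030 roughly doubles the market. The snippet below is only a back-of-the-envelope illustration of that arithmetic, using nothing beyond the figures quoted above.

```python
# Back-of-the-envelope projection from the figures cited above: a $4.4B market
# in 2022 compounding at an 8.8% CAGR through 2030 (illustrative only).
base_value_usd_bn = 4.4        # estimated 2022 market size, in billions of dollars
cagr = 0.088                   # compound annual growth rate
years = 2030 - 2022            # eight years of compounding

projected_2030 = base_value_usd_bn * (1 + cagr) ** years
print(f"Projected 2030 space robotics market: ~${projected_2030:.1f}B")
# Prints roughly $8.6B, about double the 2022 estimate.
```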

GITAI has developed the S2 dual-armed robot, which was part of missions earlier this year aboard the International Space Station (ISS). The Torrance, Calif.-based company's system was mounted on the Nanoracks Bishop Airlock to conduct an external demonstration of in-space servicing, assembly, and manufacturing (ISAM).

GITAI’s previous demonstration onboard the ISS involved its S1 one-armed robot. It was able to execute two tasks: assembling structures and panels for in-space assembly (ISA), and operating switches and cables for intravehicular activity (IVA). 

The company’s other offerings include its Inchworm lunar robot. The robotic arm has grapple end effectors on both ends. The proprietary technology allows users to connect various tools to the robot to perform multiple tasks. It also allows the arm to move autonomously.

In addition, GITAI has developed a Lunar Rover, which can aid in constructing solar panels, communication antennas, and habitat modules. The company builds all of its components completely in house so that it can commercialize them in the future.

Learn how to build a robotic workforce in space

At RoboBusiness 2024, Dr. Satoshi Kitano, vice president of hardware engineering at GITAI, will present a session on “Challenges in Building a Robotic Workforce in Space.” He will cover the robotics technologies that are transforming our capabilities to not only reach new planets but also to support human activity while we are there.

Kitano will take a deep dive into the state of the space labor industry and highlight the interesting challenges that inhibit our progress toward rapidly expanding into the final frontier. He will also examine what a labor-abundant space industry would look like with robotics at its center.

Dedicated to developing robots for extreme environments, Kitano has a Ph.D. in mechanical and aerospace engineering from the Tokyo Institute of Technology. He previously served as an executive officer at industrial robotics company HiBot.

As lead mechanical designer and systems engineer at GITAI, Kitano spearheaded the company’s first extravehicular robotics project, S2.


Hear from GITAI and more at RoboBusiness

RoboBusiness is scheduled for Oct. 16 and 17 in Santa Clara, Calif. The event will feature more than 60 speakers, over 100 exhibitors and demonstrations on the expo floor, 10+ hours of dedicated networking time, the Pitchfire Robotics Startup Competition, and more.

Thousands of robotics experts from around the world will convene at the event. Kitano’s talk will take place at 12:00 p.m. PT on the second day of the show. 

In addition to enabling tech and robotics innovation, RoboBusiness 2024 focuses on investments and business topics related to running a robotics company. Keynote talks at the event include:

  • Rodney Brooks, co-founder and chief technology officer at Robust AI
  • Sergey Levine, co-founder of Physical Intelligence and an associate professor at UC Berkeley
  • Claire Delaunay, the CTO at farm-ng
  • Torrey Smith, the co-founder and CEO of Endiatx

The show will also include a keynote panel on “Driving the Future of Robotics Innovation,” featuring:

  • Amit Goel, head of robotics and edge AI ecosystem at NVIDIA
  • John Bubnikovich, president of ABB Robotics US
  • Eric Truebenbach, managing director of Teradyne Robotics Ventures
  • Joan-Wilhelm Schwarze, a senior global innovation manager at DHL

The post GITAI to share challenges of building a robotic workforce in space at RoboBusiness appeared first on The Robot Report.

]]>
https://www.therobotreport.com/gitai-shares-challenges-of-building-a-robotic-workforce-in-space/feed/ 0
Shadow Robot to show how to build robots to survive real world training at RoboBusiness https://www.therobotreport.com/shadow-robot-shows-how-build-robots-survive-real-world-training-robobusiness/ https://www.therobotreport.com/shadow-robot-shows-how-build-robots-survive-real-world-training-robobusiness/#respond Thu, 15 Aug 2024 12:29:25 +0000 https://www.therobotreport.com/?p=580241 At RoboBusiness, Shadow Robot will show how solutions it found while developing its robotic hand could be applied to other robotics problems.

The post Shadow Robot to show how to build robots to survive real world training at RoboBusiness appeared first on The Robot Report.

]]>
Shadow Robot Company's dexterous robotic hand, DEX-EE.

Shadow Robot’s dexterous robotic hand, DEX-EE. | Source: Shadow Robot Co.

Shadow Robot Co. recently finished a multi-year project working with Google DeepMind to produce a new class of robot hand. The team aimed for the hand to be reliable and robust enough to survive the challenges of reinforcement learning in the real world. At the same time, it had to be dexterous and capable enough to perform human-like manipulation tasks.

This is a key challenge the robotics industry is facing. Robots often learn through trial and error, which requires them to safely test things in the real world. Throughout this testing, the robot will sometimes execute motions at the limit of its abilities, resulting in damage to the hardware. Fixing this damage can be costly and slow down experiments. 

Shadow Robot said its new hand has several new technologies that can be used in the development of next-generation robot hardware. These include high-speed control architectures based on force-controlled N+1 actuation and new stereo tactile fingertips capable of both sensing the lightest of touches and surviving the high forces typical of grasping and manipulation research.
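For readers unfamiliar with the term, “N+1 actuation” generally refers to driving N joints with N+1 tendons, so that the extra tendon can keep every tendon under tension (tendons can only pull). The sketch below is a generic illustration of that idea with made-up moment arms; it is not Shadow Robot's implementation.

```python
import numpy as np

# Generic sketch of force-controlled "N+1" tendon actuation: N joints driven by
# N+1 tendons. Values are invented for illustration; this is not Shadow Robot's design.
# Here N = 2 joints, so 3 tendons. R[i, j] = tendon i's moment arm about joint j (meters).
R = np.array([
    [ 0.010,  0.000],   # tendon 0 flexes joint 0
    [ 0.000,  0.010],   # tendon 1 flexes joint 1
    [-0.008, -0.008],   # tendon 2 extends both joints (the "+1" tendon)
])

def tendon_tensions(tau_desired, f_min=2.0):
    """Map desired joint torques (N*m) to tendon tensions (N), keeping every
    tension at or above f_min so no tendon ever goes slack (tendons only pull)."""
    f = np.linalg.pinv(R.T) @ tau_desired          # minimum-norm solution to tau = R.T @ f
    null = np.linalg.svd(R.T)[2][-1]               # nullspace of R.T: internal pretension direction
    if null.sum() < 0:
        null = -null                               # orient it toward increasing tension
    alpha = max(0.0, np.max((f_min - f) / null))   # just enough pretension to reach f_min everywhere
    return f + alpha * null

tensions = tendon_tensions(np.array([0.02, 0.02]))  # 0.02 N*m of flexion on each joint
print(np.round(tensions, 2))                        # all tensions positive; torques unchanged
assert np.allclose(R.T @ tensions, [0.02, 0.02])
```

In a real hand, a mapping like this would run inside a fast force-control loop, which is presumably where the high-speed control architectures mentioned above come in.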

The London-based company tested the new robot hand for thousands of hours – from component-level wear tests to high-force self-collision tests and impact testing – and produced a modular robot finger that can be assembled into a wide range of robots, with serviceability and reliability at the core of the design.

See Shadow Robot in Santa Clara

In a session at RoboBusiness 2024, which will be on Oct. 16 and 17 in Santa Clara, Calif., Shadow Robot will discuss design challenges, how it mapped the development space, and the technical and practical solutions it developed to address various challenges.

The company will also show how these solutions could be applied to other robotics problems.

Rich Walker, director of Shadow Robot, will lead the session. He spent 14 years as managing director of Shadow Robot before moving into his current role in 2022. Walker now works with partners to build strategic relationships, develop policy and programs around robotics and AI, and help define the next generation of what’s possible with robotics technology.

About RoboBusiness 2024

Produced by The Robot Report, RoboBusiness takes place Oct. 16-17 in Santa Clara, Calif. and is the leading event focused on developing commercial robots and robotics businesses. Walker’s talk will take place at 11:45 a.m. PT on the first day of the show. 

RoboBusiness will feature more than 60 speakers, over 100 exhibitors and demonstrations on the expo floor, 10+ hours of dedicated networking time, the Pitchfire Robotics Startup Competition, and more. Thousands of robotics experts from around the world will convene at the event.

In addition to enabling tech and robotics innovation, RoboBusiness 2024 focuses on investments and business topics related to running a robotics company. Keynote talks at the event include:

  • Rodney Brooks, the co-founder and chief technology officer at Robust AI
  • Sergey Levine, co-founder of Physical Intelligence and an associate professor at UC Berkeley
  • Claire Delaunay, the CTO at farm-ng
  • Torrey Smith, the co-founder and CEO of Endiatx

The show will also include a keynote panel on “Driving the Future of Robotics Innovation,” featuring:

  • Amit Goel, head of robotics and edge AI ecosystem at NVIDIA
  • John Bubnikovich, president of ABB Robotics US
  • Eric Truebenbach, managing director of Teradyne Robotics Ventures
  • Joan-Wilhelm Schwarze, a senior global innovation manager at DHL

The post Shadow Robot to show how to build robots to survive real world training at RoboBusiness appeared first on The Robot Report.

]]>
https://www.therobotreport.com/shadow-robot-shows-how-build-robots-survive-real-world-training-robobusiness/feed/ 0
Soft Robotics exits gripper business, launches AI-focused company https://www.therobotreport.com/soft-robotics-exits-gripper-business-launches-ai-focused-company/ https://www.therobotreport.com/soft-robotics-exits-gripper-business-launches-ai-focused-company/#respond Tue, 06 Aug 2024 12:59:45 +0000 https://www.therobotreport.com/?p=580117 Soft Robotics divested its robotic gripper business to Schmalz for an undisclosed amount. A new spinoff, Oxipital AI, will focus on 3D vision and AI for inspection and robotic picking.

The post Soft Robotics exits gripper business, launches AI-focused company appeared first on The Robot Report.

]]>

Soft Robotics spinoff Oxipital AI’s vision tech can detect bruises on produce, excess fat on proteins, burn marks on snacks, and more. | Credit: Oxipital AI

After two straight record years during the height of the COVID-19 pandemic, industrial robot sales have crashed back to reality, especially in North America. According to industry association A3, sales of industrial robots in North America declined 30% in 2023. Sales dipped 6% in North America during Q1 2024 compared with Q1 2023.

While many analysts agree that industrial robots will eventually become ubiquitous, the timeframe for that happening remains unknown. The slowdown is partially responsible for several robotics companies shutting down or laying off staff. Mark Chiappetta, president and CEO of soft gripper maker Soft Robotics Inc., was determined not to fall into that category.

Soft Robotics today divested its soft robotic gripper business to J. Schmalz GmbH for an undisclosed amount. Glatten, Germany-based Schmalz is a leading developer of vacuum technology, making everything from suction cups and vacuum generators to complete gripping and clamping systems.

Chiappetta told The Robot Report that Schmalz is acquiring Soft Robotics’ intellectual property as well as a number of employees and facilities.

“When COVID was happening, the talk was, ‘We don’t have a choice. [Installing robots] is a matter of keeping up with demand,'” he said. “We all fully expected those buying habits to stay, which would’ve led to a tectonic shift in robotics. But those habits didn’t stay.”

Soft Robotics was founded in 2012 by Dr. George M. Whitesides of Harvard University. He envisioned the use of soft materials and microfluidics to change the way robots were made, opening the door for new markets and applications. He keynoted the inaugural Robotics Summit & Expo, produced by The Robot Report, in 2018.

Oxipital AI diversifies the business

While Soft Robotics’ grippers are now under the Schmalz umbrella, the company is no longer. It has spun off its mGripAI 3D vision and artificial intelligence technologies into a new company called Oxipital AI, for which Chiappetta holds the same job title.

Oxipital AI will focus on visual inspection tasks such as defect detection, volume estimation, SKU classification, attribute segmentation, and conveyor counting. It will also focus on robotic picking in various industries, starting primarily in the food business, where Soft Robotics had built its reputation.

The company plans to create core object models that Chiappetta said are pre-trained using 100% synthetic data. He added that Oxipital AI requires no real-world imagery to be gathered and no human labeling.

A no-code feature enables customers to set rules for what constitutes a good product or bad product for inspection tasks, and a cloud-based dashboard collects and analyzes real-world data, he explained. Oxipital AI’s technology stack interfaces with all existing industrial robot arms, as well as conventional automation systems such as conveyors, said Chiappetta.
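To make the rule-setting idea concrete, here is a hypothetical sketch of how pass/fail rules might be applied to per-item measurements from a vision system. The rule names, fields, and thresholds are invented for illustration and do not reflect Oxipital AI's actual no-code interface or data model.

```python
from dataclasses import dataclass

# Hypothetical sketch of rule-based inspection in the spirit of the "no-code"
# pass/fail rules described above. Names and thresholds are invented for
# illustration; this is not Oxipital AI's configuration format.
@dataclass
class InspectionRule:
    attribute: str       # measurement reported by the vision system, e.g. "defect_area_pct"
    max_allowed: float   # the item fails if the measured value exceeds this threshold

RULES = [
    InspectionRule("defect_area_pct", 2.0),   # e.g. bruising or burn marks
    InspectionRule("undersize_pct", 5.0),     # e.g. undersized produce
]

def inspect(measurements):
    """Return True if an item's measurements pass every configured rule."""
    return all(measurements.get(rule.attribute, 0.0) <= rule.max_allowed for rule in RULES)

# Example: one item measured by a (hypothetical) vision pipeline.
item = {"defect_area_pct": 3.1, "undersize_pct": 1.0}
print("PASS" if inspect(item) else "FAIL -> flag for quality control")
```

In a deployed system, the cloud dashboard described above would presumably aggregate these per-item results for process analytics.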

Oxipital AI is a new company focusing on AI visual inspection tasks.

Oxipital AI is a new company focusing on AI visual inspection tasks. | Credit: Oxipital AI

Food industry forces Soft Robotics shift in focus

Besides no longer selling robotic grippers, the main difference from the former company is that Oxipital AI will place a major emphasis on applications that don’t use robots, he noted. For example, in the food industry, AI-based vision technologies can improve yield, increase throughput, and reduce waste, Chiappetta said.

“Food processors aren’t ready to rip out human picking lines and replace them with robotic lines,” Chiappetta said. “They are willing to put in a camera to expose how to optimize their current processes.”

Chiappetta said floor space is the biggest reason food processors aren’t adopting robotics. Most of the larger organizations, he said, are built by acquiring smaller producers. This makes every manufacturing plant different and puts floor space at a premium.

Another major problem, according to Chiappetta, is that food-processing companies are reluctant to take the initial leap of faith into robotics.

“[The food industry] hasn’t been a strong adopter of robotics to date,” he said. “Processors need to allocate capital with high interest rates, select a bidder, have a solution developed, take an existing line down, put the solution in place, have acceptance and quality done, and then they’ll know if the investment was worth it.”

“Once you take out humans, it’s hard to go back,” said Chiappetta.

He said on LinkedIn that Soft Robotics had nearly 1,000 soft grippers deployed in the field.

OnRobot is another well-known developer of soft robotic grippers. Founded in 2015, the Odense, Denmark-based company initially offered a variety of robotic end effectors.

However, it too has diversified its business by launching various sensors, tool changers, and software packages for applications such as palletizing, packaging, CNC machining, and more.


Former Soft Robotics divests in move to AI

The Bedford, Mass.-based company has updated its website to reflect the new direction. One page details how a “leading sweet corn producer in the United States” recently implemented the AI-powered vision technology to inspect its produce for defects.

This system looks for various flaws such as missing kernels and misshapen or undersized produce. It then relays this information to the quality control team for necessary actions.

Why did Soft Robotics divest its gripper business instead of just increasing its focus on 3D vision and AI?

Chiappetta replied that this wasn’t an “adapt-or-die” situation, but that the increased cash from the divestiture certainly helps. The main benefit is to keep the company focused, he said.

“Robotic picking is really hard, and grippers are a niche business,” said Chiappetta. “The visual AI needed to do these things is the common denominator to do all these applications. And we’ve got the tech to do it.”

“Without focus, it’s difficult to survive,” he added. “I had many conversations with strategic partners and others who didn’t know how to look at a company that’s eyes (vision), hands (grippers), and brains (AI).”

Not 100% dependent on robot sales

December 2022 was the best month Soft Robotics ever had financially, Chiappetta told The Robot Report. January is typically a slow month as companies figure out their budgets, but January 2023 was horrible, and February didn’t get any better, he said.

“Our partners were seeing the same thing,” recalled Chiappetta.

Soft Robotics last raised $26 million in November 2022. It had raised $86 million since its founding, according to Crunchbase.

Soft Robotics’ business was 100% dependent on robot sales, but Oxipital’s won’t be, Chiappetta asserted.

“The Schmalz transaction is the start of what we hope is a strategic partnership,” he said. “They have a great reputation and global distribution. It’s a natural fit for us. We need a company like Schmalz to grow soft robotic grippers. And the more soft gripping becomes the standard, the more opportunities we’ll have for our AI vision tech.”

Schmalz had developed its own soft grippers, such as the OFG HYG SI-70.

Schmalz had developed its own soft robotic grippers, such as the OFG HYG SI-70. Source: Schmalz

The post Soft Robotics exits gripper business, launches AI-focused company appeared first on The Robot Report.

]]>
https://www.therobotreport.com/soft-robotics-exits-gripper-business-launches-ai-focused-company/feed/ 0