Technologies Archives - The Robot Report
Robotics news, research and analysis

Oxipital AI partners with Stäubli Robotics on food-safe picking

December 7, 2024: The two companies plan to combine Stäubli's hygienic robots and Oxipital AI's machine vision technology.

A grey Stäubli SCARA robot with a green suction cup end effector picking a hamburger patty using Oxipital AI's software.

Oxipital AI’s inspection and picking solutions and Stäubli’s hygienic robot were demonstrated at Pack Expo 2024. | Source: Oxipital AI

Oxipital AI, a developer of machine vision technologies for robotic automation and product inspection, is teaming up with Stäubli Robotics. Formerly Soft Robotics, Oxipital is targeting food processing, agriculture, and consumer goods production for its technology.

Stäubli Robotics’ product portfolio contains 4- and 6-axis industrial robots, cobots, mobile robots, and automated guided vehicles. The Duncan, S.C.-based company said its robots can work in a variety of industrial sectors, including automotive, metalworking, photovoltaics, food, pharmaceutical, and more. 

“Stäubli Robotics is honored to become Oxipital AI’s first Preferred Partner,” said Mathias Konne, North American business head at Stäubli Robotics. “This milestone recognizes the previous joint efforts and existing collaboration between our two industry-leading organizations while paving the way for an even brighter and bolder future. With this official recognition, and along with our common partners, we continue to deliver robotic systems offering the highest value and technological advancements to our clients.”

The companies said that by combining Stäubli’s hygienic robots with Oxipital AI’s inspection and picking solutions, they can help ensure consistent, food-safe production without depending on human labor for profitability.

“This partnership with Stäubli solidifies our joint efforts in helping manufacturers overcome some of the most difficult challenges in food processing by utilizing AI-enabled vision solutions paired with hygienic, high-speed robotic solutions,” said Harley Green, VP of strategic accounts at Oxipital AI.

Oxipital AI spins out from Soft Robotics

In August 2024, Soft Robotics divested its soft robotic gripper business and spun off its mGripAI 3D vision and artificial intelligence technologies into Oxipital AI. Oxipital focuses on visual inspection tasks such as defect detection, volume estimation, SKU classification, attribute segmentation, and conveyor counting. It will also focus on robotic picking in various industries, starting primarily in the food business, where Soft Robotics built its reputation.

Last month, Oxipital AI launched its VX2 Vision System, which uses AI for inspection and high-speed picking applications across food-grade and industrial sectors. Built on the company’s proprietary Visual AI platform, the VX2 comes in a more compact package at a more accessible price than its predecessor.

The VX2 has enhanced capabilities for inspection, high-speed picking, and high-speed picking with inspection, said Oxipital. It asserted that the system ensures optimal efficiency and precision in a wide variety of environments.

AMP Robotics raises $91M to accelerate deployment of recycling systems

December 5, 2024: AMP Robotics will use its latest funding to deploy its AMP ONE system, which is designed to improve sortation of municipal solid waste.

AMP ONE is designed to make recycling of municipal solid waste, shown here, more economical.

AMP ONE is designed to capture billions of dollars in value otherwise lost to landfills or incineration annually. Source: AMP Robotics

AMP Robotics Corp. today said it has raised $91 million in corporate equity in a Series D financing. The Louisville, Colo.-based company plans to use its latest funding to accelerate deployment of its AMP ONE systems, which use artificial intelligence and robotics to sort municipal solid waste, or MSW.

“Recycling rates have stagnated in the United States, despite the positive benefits recycling offers local economies and the environment,” said Matanya Horowitz, founder of AMP. “This latest investment enables us to tackle larger projects and deliver real outcomes for waste companies and municipalities – by lowering sortation costs, capturing more material value, diverting organic waste, and extending landfill life – all while helping the industry optimize its strategic assets.”

Founded in 2014, AMP Robotics said its AI platform has identified 150 billion items and guided the sortation of more than 2.5 million tons of recyclables. The company said its technology can help modernize and change the economics of resource recovery. It has three full-scale facilities and more than 400 AI systems deployed across North America, Asia, and Europe.

From sortation to AMP ONE

AMP Robotics said its AI uses deep learning to continuously train itself by processing millions of material images into data. The software uses pattern recognition of colors, textures, shapes, sizes, and logos to identify recyclables and contaminants in real time, enabling new offtake chemistries and capabilities, it added.
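As a rough illustration of this kind of real-time sort decision, the sketch below turns hypothetical per-frame detections into pick targets. The detection record, class names, belt speed, and latency value are invented placeholders for illustration, not AMP Robotics' actual software or data.

    from collections import namedtuple

    # Hypothetical detection record, as a trained vision model might produce
    Detection = namedtuple("Detection", "label confidence x y")
    RECYCLABLE = {"PET_bottle", "HDPE_jug", "aluminum_can", "cardboard"}

    def pick_targets(detections, belt_speed_mps, latency_s=0.15, min_conf=0.8):
        """Turn per-frame detections into pick coordinates, projecting each item
        forward along the belt to compensate for perception-to-pick latency."""
        targets = []
        for d in detections:
            if d.label in RECYCLABLE and d.confidence >= min_conf:
                targets.append((d.x + belt_speed_mps * latency_s, d.y, d.label))
        return targets

    # Example frame with two detections (made-up values)
    frame = [Detection("PET_bottle", 0.93, 0.42, 0.10),
             Detection("film_plastic", 0.60, 0.80, 0.25)]
    print(pick_targets(frame, belt_speed_mps=2.5))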

The company noted that its first products were a series of sorting robots deployed with minimal retrofit into existing recycling facilities. AMP then developed facilities that it claimed involve almost no manual sorting, are reliable, and provide “pervasive data.”

“These facilities make the recovery of commodities safer and more cost-effective than ever and have grown to encompass MSW sorting, an offering out of reach to the industry prior to the advent of AMP’s technology,” it said. “AMP ONE provides a full-scale facility solution to sort various material streams and capture more of the billions of dollars in value otherwise lost to landfills or incinerated annually.”


AMP Robotics marks recent deployments, new CEO

Recycling and Disposal Solutions demonstrated AMP ONE’s ability to cost-effectively sort MSW at its facility in Portsmouth, Va. The system has processed 150 tons per day of local waste with more than 90% uptime, said the company.

Last month, AMP Robotics entered into an agreement with Waste Connections Inc. to equip and operate one of Waste Connections’ single-stream recycling facilities in Colorado. 

“AMP provides meaningfully lower-cost, higher-performance systems to recover commodities and increase landfill diversion, and we’re uniquely positioned to reshape the waste and recycling landscape at a critical time,” said Tim Stuart, CEO of AMP. “We’re grateful to our longstanding and newest investors for their support in helping us chart a new path for sustainable materials management and resource efficiency.”

AMP last month augmented its leadership team with the appointment of Stuart, former chief operating officer for Republic Services Inc. Horowitz transitioned from CEO into the role of chief technology officer.

Congruent Ventures leads round

Congruent Ventures led AMP Robotics’ Series D round. Current and new investors participated, including Sequoia Capital, XN, Blue Earth Capital, Liberty Mutual Investments, California State Teachers Retirement System (CalSTRS), Wellington Management, Range Ventures, and Tao Capital Partners.

“AMP’s AI sortation systems enable consumers to recycle both with and without curbside separation and communities to benefit from the recovery of recycled commodities while reducing dependence on landfills,” added Abe Yokell, co-founder and managing partner of Congruent Ventures. “AMP is an example of the real-world impacts of AI; solutions like AMP’s will divert billions of tons of recyclable material from landfills while reducing emissions.”

Congruent Ventures is a leading early-stage venture firm focused on partnering with entrepreneurs to build companies addressing climate and sustainability challenges. The firm has more than $1 billion in assets under management across early-stage climate tech funds and 59 companies in its portfolio.

COVAL releases MPXS, its smallest micro vacuum pump to date

December 4, 2024: With a width of just 12.5 mm and a weight of only 87 grams, the MPXS is the smallest vacuum pump designed by COVAL.

A white hand holding COVAL's MPXS micro vacuum pump.

The MPXS micro vacuum pump puts the features of COVAL’s intelligent vacuum pumps into a smaller physical space. | Source: COVAL

COVAL SAS, a designer, producer, and marketer of vacuum components and systems, has released its latest micro vacuum pump, the MPXS. The Montélier, France-based company said it designed the pump to be pilot-controlled, ultra-compact, and equipped with high-performance communication capabilities. 

The new MPXS series is intended to provide manufacturers with an efficient tool for handling non-porous parts at high speeds on robots or automated systems, said COVAL. The micro vacuum pump follows the design principles of the company’s intelligent vacuum pumps, which COVAL said combine energy efficiency, high performance, and communications I/O.

With a width of just 12.5 mm (0.4 in.) and a weight of only 87 g (3 oz.), the company said the MPXS is the smallest vacuum pump it has created. This size means it can be installed as close as possible to suction cups or inside restricted spaces for reduced pick-up time with no loss of load, guaranteeing high speeds.

COVAL is an ISO 9001 V2015-certified company that specializes in vacuum handling systems for multiple industries. It has clients in fields including packaging, automotive, food processing, plastic processing, and aeronautics. COVAL markets its products and services internationally through its subsidiaries and its network of authorized distributors.

More details about the MPXS

Thanks to single-stage Venturi technology, MPXS series micro vacuum pumps can quickly reach a maximum vacuum of 85%. This makes them well suited to dynamic applications requiring very short cycle times.

COVAL said the two power levels of 0.53 and 0.92 SCFM add to the system’s versatility and enable it to adapt to the needs of each application.

The MPXS also provides the user with useful information at every stage of operation. COVAL said it equipped the system with a human-machine interface (HMI) that makes it easy to read operating, diagnostic, and maintenance information. It also enables rapid parameter setting.

In addition, the integrated IO-Link communication interface supports fast, cost-effective installation, continuous diagnostics, centralized parameter setting, and efficient communication with higher-level protocols such as EtherNet/IP, PROFINET, and EtherCAT.

MPXS micro vacuum pumps feature air-saving control (ASC) technology. COVAL said it intelligently regulates vacuum generation, enabling energy savings of 90% on average by stopping air consumption once the desired vacuum level has been reached.
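As a rough illustration of how this kind of air-saving control behaves, the following sketch implements a simple on/off hysteresis loop that stops the ejector once a target vacuum is reached and restarts it only if the vacuum decays. The thresholds and the crude leak model are illustrative assumptions, not COVAL's firmware.

    TARGET_VACUUM_PCT = 75.0   # stop the Venturi ejector above this level
    RESTART_VACUUM_PCT = 65.0  # restart it if leakage drops the vacuum below this

    def asc_step(vacuum_pct: float, ejector_on: bool) -> bool:
        """Return the new ejector state given the measured vacuum level."""
        if ejector_on and vacuum_pct >= TARGET_VACUUM_PCT:
            return False  # part is held; stop consuming compressed air
        if not ejector_on and vacuum_pct <= RESTART_VACUUM_PCT:
            return True   # vacuum decayed, e.g. from a slight leak; regenerate
        return ejector_on

    # Tiny simulation of a hold phase with a slow leak
    ejector, vacuum = True, 0.0
    for _ in range(30):
        vacuum += 8.0 if ejector else -1.5  # crude model: pump up fast, leak slowly
        ejector = asc_step(vacuum, ejector)
    print(f"final vacuum {vacuum:.1f}%, ejector {'on' if ejector else 'off'}")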

The modularity of the MPXS series offers a wide choice of configurations, ensuring flexibility during installation and use. It is available as stand-alone modules or in islands of up to eight modules, with standard or powerful adjustable blower options.

COVAL said the MPXS micro vacuum pump’s small size, high performance, and wide range of functions and configurations make it suitable for industrial applications requiring high speeds. These include high-speed pick-and-place systems, robot manipulators, and automated production. It is especially useful for the plastics, electronics, and pharmaceutical industries, according to the company.


binder introduces M16 connectors with compact design, high sealing performance

December 4, 2024: Binder USA has released redesigned M16 connectors engineered for reliability and performance in harsh conditions.

The new M16 connectors have been redesigned to be modular and easier to handle. Source: binder

For demanding environments, Binder USA LP has introduced a new generation of molded M16 connectors, which it said are engineered to deliver reliability and performance even in the harshest conditions. The M16 circular connectors are designed for applications ranging from heavy-duty machinery like construction cranes and excavators to precision-driven laboratory equipment.

These connectors must meet diverse requirements, ensuring stable and reliable connections in extreme conditions, such as freezing temperatures and exposure to dirt and dust. To address these challenges, they must combine high electrical performance with durability and resilience, noted Camarillo, Calif.-based binder.

binder redesigns connectors to be modular

binder said it has completely redesigned its latest generation of molded M16 connectors. The previous version included many existing parts from field-wireable connectors, not all of which were ideal for the molded version, the company explained.

With an expanding portfolio and increasing demand, the company said it decided to fundamentally redesign the product to use a modular system, enabling many common parts between the unshielded and shielded variants.

“A key feature of the new connector design is the reduction in components,” said Sebastian Ader, product manager at binder. “Thanks to the modular system, we only need one additional part for the shielded and unshielded variants. This allows us to produce much more efficiently, offering cost advantages to customers without compromising on quality.”

Developing the new M16 connector was particularly challenging, said binder, because it had to comply with both the M16 standard (DIN EN 61076-2-106) and the stringent AISG standard (for the eight-pin shielded variant) in terms of IP68 sealing and compatibility between different manufacturers.

By optimizing the sealing system, the new M16 system resolves compatibility problems that have previously led to insufficient sealing, the company said. It added that the new generation of connectors is lead-free, meeting the EU RoHS2 Directive 2011/65/EU, including 2015/863/EU.


M16 suitable for industrial, field applications

When redesigning the M16 molded connectors, binder said it paid particular attention to applications in industrial machinery, camera systems, and pressure sensors. These areas require maximum electrical reliability, and therefore a robust connector system that functions under difficult operating conditions, it noted.

“Crane and excavator applications are a good example. Here, fixed-plug connections are required,” said Ader. “Particularly in critical moments, such as when lifting heavy loads, it is important that the connectors not only fit securely, but are also quick and easy to use.”

A triangular design is intended to make the new M16 connectors easy to handle, even in sub-zero temperatures or when wearing gloves, for example.

“The new triangular design not only makes handling easier, but it also minimizes dirt-prone areas and undercuts, which enables use even in very harsh and demanding environments,” Ader said. “The new connectors can be reliably mated, unmated, and locked at any time.”

The molded M16 connectors also meet requirements for shock resistance, vibration tolerance, and tightness, said binder. “In summary, the robust design ensures a reliable connection in extreme temperatures, dirt, and moisture, minimizes the risk of failure, and ensures the continuous operational readiness of the machines,” it asserted.

“With the molded M16 connector, we have succeeded in meeting market demands in terms of technical properties, handling, and price,” Ader said. “All this makes our solution a future-proof choice for demanding industrial applications.”

About binder

Binder USA LP is a subsidiary of binder Group, a leading global manufacturer of circular connectors, custom cord sets, and LED lights. The company‘s products are used worldwide in industrial environments for factory automation, process control, and medical technology applications.

Binder said its technical innovations meet the highest standards of quality and reliability. The company’s quality management system is ISO 9001 and 14001-certified, but binder said its solution-focused approach to customer applications and commitment to service differentiate it from the competition.

Project CETI uses AI and robotics to track down sperm whales

December 3, 2024: Project CETI researchers developed the AVATARS framework to make the most of the small amount of time sperm whales spend on the surface.

An image of a pod of sperm whales swimming underwater.

Sperm whales spend, on average, 10 minutes of every hour on the surface, presenting challenges for researchers studying them. | Source: Amanda Cotton/Project CETI

In the chilly waters off the New England coast, researchers from the Cetacean Translation Initiative, Project CETI, can spend hours searching and waiting for an elusive sperm whale to surface. During the minutes the whales spend above water, the researchers need to gather as much information as possible before the animals dive back beneath the surface for long periods.

With one of the widest global distributions of any marine mammal species, these whales are difficult to track down, and even more difficult to learn from. Project CETI aims to use robotics and artificial intelligence to decode the vocalizing of sperm whales. It recently released research about how it tracks down sperm whales across the wide ocean.

“The ocean and the natural habitat of the whales is this vast place where we don’t have a lot of infrastructure, so it’s hard to build infrastructure that will always be able to observe the whales,” said Stephanie Gil, an assistant professor of Computer Science at the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) and an advisor on the project.

The project brings together some of the world’s leading scientists in biology, linguistics, robotics, and more. The founder of Project CETI, David Gruber, estimated that it’s one of the largest multi-disciplinary research projects active today.

“Project CETI was formed in March 2020, and we’re now over 50 scientists across eight different disciplines,” he said. “I think we’re over 15 institutions, which I believe puts us as one of the most interdisciplinary, large-scale science projects that’s ever been conducted. It’s incredibly rewarding to see so many disciplines working together.”

Project CETI shares latest research

The researchers at the nonprofit organization have developed a reinforcement learning framework that uses autonomous drones to find sperm whales and predict where they will surface. The paper, published in Science Robotics, said it’s possible to predict when and where a whale may surface using various sensor data and predictive models of sperm whale dive behavior.

This new study involved various sensing devices, such as Project CETI aerial drones with very high frequency (VHF) signal sensing capability that use signal phase along with the drone’s motion to emulate an “antenna array in the air” for estimating the direction of pings from CETI’s on-whale tags.

“There are two basic advantages of [VHF signals]. One is that they are really low power, so they can operate for a really, really long time in the field, like months or even years. So, once those small beacons are deployed on the tag, you don’t have to really replace the batteries,” said Ninad Jadhav, a co-author on the paper and a robotics and engineering Ph.D. student at Harvard University.

“The second thing is these signals that these tags transmit, the VHF, are very high-frequency signals,” he added. “They can be detected at really long ranges.”

“That’s a really huge advantage because we never know when the whales will surface or where they will surface, but if they have been tagged before, then you can sense, for example, simple information such as the direction of the signal,” Jadhav told The Robot Report. “You can deploy an algorithm on the robot to detect that, and that gives us an advantage of finding where the whales are on the surface.”
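The “antenna array in the air” idea can be sketched numerically: as the drone moves, the received phase of the tag's VHF signal shifts in proportion to the drone's displacement toward or away from the whale, and sweeping candidate bearings for the best phase match recovers the direction. The following is a generic far-field, single-frequency illustration with made-up positions, wavelength, and noise, not Project CETI's actual estimator.

    import numpy as np

    positions = np.array([[0.0, 0.0], [0.5, 0.1], [1.0, 0.3], [1.5, 0.6]])  # drone path (m)
    wavelength = 2.0  # roughly a 150 MHz VHF carrier, for illustration

    def simulate_phases(true_bearing_rad, noise=0.05):
        """Far-field model: phase advances with the projection of the drone's
        position onto the direction toward the tag."""
        u = np.array([np.cos(true_bearing_rad), np.sin(true_bearing_rad)])
        return 2 * np.pi / wavelength * positions @ u + noise * np.random.randn(len(positions))

    def estimate_bearing(measured_phases):
        """Grid search for the bearing whose predicted phase progression best
        matches the measurements (a Bartlett-style synthetic-aperture estimate)."""
        best, best_score = None, -np.inf
        for theta in np.linspace(-np.pi, np.pi, 720):
            u = np.array([np.cos(theta), np.sin(theta)])
            predicted = 2 * np.pi / wavelength * positions @ u
            score = abs(np.sum(np.exp(1j * (measured_phases - predicted))))
            if score > best_score:
                best, best_score = theta, score
        return best

    phases = simulate_phases(np.deg2rad(40.0))
    print(f"Estimated bearing: {np.rad2deg(estimate_bearing(phases)):.1f} deg")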

Sperm whales present unique challenges for data collection

From left to right, Stephanie Gil, Sushmita Bhattacharya, and Ninad Jadhav working on a laptop with an orange drone in the foreground.

From left to right: Stephanie Gil, Sushmita Bhattacharya, and Ninad Jadhav. | Source: Stu Rosner

“Sperm whales are only on the surface for about 10 minutes every hour,” said Gil. “Other than that, they’re diving pretty deep in the ocean, so it’s hard to access information about what the whales are actually doing. That makes them somewhat elusive for us and for science.”

“Even we humans have certain patterns day to day. But if you’re actually out observing whales on a particular day, their behavior is not going to exactly align with the models, no matter how much data you’re using to make those models right. So it’s very difficult to really predict with precision when they might be coming up,” she continued.

“You can imagine, if [the scientists are] out on the water for days and days, only having a few encounters with the whales, we’re not being that efficient. So this is to increase our efficiency,” Gruber told The Robot Report.

Once the Project CETI researchers can track down the whales, they must gather as much information as possible during the short windows of time sperm whales spend on the surface.

“Underwater data collection is quite challenging,” said Sushmita Bhattacharya, a co-author on the paper and a computer science and robotics Ph.D. student at Harvard University. “So, what is easier than underwater data collection is to have data collected when they’re at the surface. We can leverage drones or shallow hydrophones and collect as much data as possible.”


Developing the AVATARS framework

At the center of the research is the Autonomous Vehicles for Whale Tracking And Rendezvous by Remote Sensing, or AVATARS framework. AVATARS is the first co-development of VHF sensing and reinforcement learning decision-making for maximizing the rendezvous of robots and whales at sea.

“We tried to build up a model which would kind of mimic [sperm whale] behavior,” Bhattacharya said of AVATARS. “We do this based on the current information that we gather from the sparse data set.”

Being able to predict when and where the whales will surface allowed the researchers to design algorithms for the most efficient route for a drone to rendezvous with, or encounter, a whale at the surface. Designing these algorithms was challenging on many levels, the researchers said.

“Probably the hardest thing is the fact that it is such an uncertain problem. We don’t have certainty at all in [the whales’] positions when they’re underwater, because you can’t track them with GPS when they’re underwater,” Gil said. “You have to think of other ways of trying to track them, for example, by using their acoustic signals and an angle of arrival to their acoustic signals that give you a rough idea of where they are.”

“Ultimately, these algorithms are routing algorithms. So you’re trying to route a team of robots to be at a particular location in the environment, in the world, at a certain given time when it’s necessary to be there,” she told The Robot Report. “So this is analogous to something like rideshare.”
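A toy version of that routing objective might look like the following sketch, which greedily dispatches each drone toward the predicted surfacing it can reach with the most time to spare. The positions, speed, and surfacing predictions are invented, and AVATARS itself uses reinforcement learning under uncertainty rather than this kind of greedy assignment.

    import math

    # Hypothetical predicted surfacings: (x_km, y_km, minutes until the whale surfaces)
    surfacings = [(2.0, 1.5, 12.0), (5.5, 0.8, 25.0)]
    # Hypothetical drone positions (km); all drones assumed to cruise at 40 km/h
    drones = [(0.0, 0.0), (4.0, 4.0)]
    SPEED_KMH = 40.0

    def travel_minutes(drone, target):
        return 60.0 * math.hypot(target[0] - drone[0], target[1] - drone[1]) / SPEED_KMH

    free = list(range(len(drones)))
    for j, (x, y, eta) in enumerate(surfacings):
        # Pick the free drone that arrives with the largest margin before surfacing
        i = max(free, key=lambda d: eta - travel_minutes(drones[d], (x, y)))
        free.remove(i)
        slack = eta - travel_minutes(drones[i], (x, y))
        print(f"Drone {i} -> surfacing {j}: {slack:+.1f} min margin")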

Before bringing the algorithms into the real world with real whales, the team tested them in a controlled environment with devices the team put together to mimic whales.

“We mimicked the whale using an engineered whale,” recalled Bhattacharya. “So basically we used a speed boat, and it had a loud engine. We used that engine noise to mimic the whale vocalization, and we had it move to mimic whale motion. And then we used that as our ground test.”

Project CETI tests AVATARS in the real world

An image of a small white drone flying over the ocean. The top of a whale can be seen poking out of the water.

A customized off-the-shelf drone flying to deploy a whale tag developed by Project CETI researchers. | Source: Project CETI

“Every day was a challenge when we were out on the boat, because this was for me, and my co-author Sushmita, the first time we were deploying real autonomous robots from a boat in the middle of the sea trying to collect some information,” Jadhav said.

“One of the major challenges of working in this environment was the noise in the sensor,” he continued. “As opposed to running experiments in the lab environment, which is more controlled, there are fewer sources of noise that impact your experiments or your sensor data.”

“The other key challenge was deploying the drone itself from the boat,” noted Jadhav. “I remember one instance where this was probably the first or second day of the second expedition that we went on last November, and I had the drone ready. It had the payload. It was waterproof.”

“I had already run experiments here in Boston locally, where I had an estimate of how long the drone would fly with the payload. And then we were out on the boat running some initial tests, and the drone took off,” he said. “It was fine, it was doing its thing, and within a minute of it collecting data, there was a sudden gust of wind. The drone just lost control and crashed in the water.”

The team also had to try to predict and react to whale behavior when performing field tests.

“Our algorithm was designed to handle sensor data from a single whale, but what we ended up seeing is that there were four whales together, who were socializing,” Jadhav said. “They were diving and then surfacing at the same time. So, this was tricky, because then it becomes really hard for us on the algorithm side to understand which whale is sending which acoustic signal and which one we are tracking.”

Team tries to gather data without disturbing wildlife

While Project CETI works closely with sperm whales and other sea life that might be around when the whales surface, it aims to leave the whales undisturbed during data collection.

“The main concern that we care about is that even if we fail, we should not harm the whales,” Bhattacharya said. “So we have to be very careful about respecting the boundaries of those animals. That’s why we are looking at a rendezvous radius. Our goal is to go near the whale and not land on it.”

“Being minimally invasive and invisible is a key part of Project CETI,” said Gruber. “[We’re interested in] how to collect this information without interacting directly with the whale.”

This is why the team works mostly with drones that won’t disturb sea life and with specially developed tags that latch onto the whales and collect data. The CETI team eventually collects these tags, and the valuable data they contain, after they fall off the whales.

“A lot of times, people might think of robotics and autonomy as a scary thing, but this is a really important project to showcase that robots can be used to extend the reach of humans and help us understand our world better,” Gil told The Robot Report.

Project CETI aims to decode whale communications

This latest research is just one step in Project CETI’s overarching goal to decode sperm whale vocalizations. In the short term, the organization plans to ramp up data collection, which will be crucial for the project’s long-term goals.

“Once we have all the algorithms worked out, a future outlook is one where we might have, for example, drone ports in the sea that can deploy robots with sensors around the clock to observe whales when they’re available for observation,” Gil said.

“We envision a team of drones that will essentially meet or visit the whales at the right place, at the right time,” Jadhav said. “So whenever the whales surface, you essentially have a kind of autonomous drone, or autonomous robot, very close to the whale to collect information such as visual information or even acoustic if the drone is equipped with that.”

Outside of Project CETI, organizations could use AVATARS to further protect sperm whales in their natural environments. For example, this information could be used to reroute ships away from sperm whale hot spots, reducing the odds of a ship colliding with a pod of sperm whales.

“The idea is that if we understand more about the whales, more about the whale communities, more about their social structures, then this will also enable and motivate conservation projects and understanding of marine life and how it needs to be protected,” Gil said.

In addition, the researchers said they could apply these methods to other sea mammals that vocalize.

“Here at Project CETI, we’re concerned about sperm whales, but I think this can be generalized to other marine mammals, because a lot of marine mammals vocalize, including humpback whales, other types of whales, and dolphins,” Bhattacharya said.

AWS offers accelerated robotics simulation with NVIDIA

December 3, 2024: AWS and NVIDIA said that Isaac Sim on Amazon Web Services can significantly accelerate and scale robot simulation and AI training.

AWS and Isaac Sim can help accelerate robotics development, says NVIDIA.

NVIDIA Corp. today announced at AWS re:Invent enhanced tools for robotics developers, as well as the availability of NVIDIA DGX Cloud on Amazon Web Services and offerings for artificial intelligence and quantum computing.

The company said that NVIDIA Isaac Sim is now available on NVIDIA L40S graphics processing units (GPUs) in Amazon Elastic Compute Cloud (EC2) G6e instances. It said this could double performance when scaling robotics simulation and accelerate AI model training. Isaac Sim is a reference application built on NVIDIA Omniverse for developers to simulate and test AI-driven robots in physically based virtual environments.

With NVIDIA OSMO, a cloud-native orchestration platform, developers can easily manage their complex robotics workflows across their AWS computing infrastructure, claimed the company.

“This combination of NVIDIA-accelerated hardware and software — available on the cloud — allows teams of any size to scale their physical AI workflows,” wrote Akhil Docca, senior product marketing manager for Omniverse at NVIDIA.


What is ‘physical AI?’

According to NVIDIA, “physical AI” describes AI models that can understand and interact with the physical world. The company said it “embodies the next wave of autonomous machines,” such as self-driving cars, industrial manipulators, mobile robots, humanoids, and even robot-run infrastructure like factories and warehouses.

With physical AI, developers are embracing a “three-computer solution” for training, simulation, and inference to make breakthroughs, NVIDIA said. Yet physical AI for robotics systems requires robust training datasets to achieve precision inference in deployment. Developing such datasets and testing them in real situations can be impractical and costly.

Simulation offers an answer, as it can accelerate the training, testing and deployment of AI-driven robots, the company asserted.

L40S GPUs in the cloud scale simulation and training

Developers can use simulation to verify, validate, and optimize robot designs as well as the systems and their algorithms before deployment, said NVIDIA. It added that simulation can optimize facility and system designs before construction or remodeling starts for maximum efficiencies, reducing costly manufacturing change orders.

Amazon EC2 G6e instances accelerated by NVIDIA L40S GPUs can double performance over the prior architecture, while allowing the flexibility to scale as scene and simulation complexity grows, NVIDIA said. Roboticists can use these instances to train many computer vision models that power AI-driven robots.

This means the same instances can be extended for various tasks, from data generation and simulation to model training. NVIDIA added that OSMO allows teams to orchestrate and scale complex robotics development workflows across distributed computing resources, whether on premises or in the AWS cloud.
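For teams provisioning this capacity themselves, launching an L40S-backed G6e instance is a standard EC2 call. The snippet below is a hedged sketch using boto3; the AMI ID and key pair are placeholders (in practice, one would use the Isaac Sim listing on AWS Marketplace or a custom image), and instance sizing depends on scene complexity.

    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")
    response = ec2.run_instances(
        ImageId="ami-0123456789abcdef0",  # placeholder: Isaac Sim / Omniverse image
        InstanceType="g6e.xlarge",        # G6e family = NVIDIA L40S GPU instances
        MinCount=1,
        MaxCount=1,
        KeyName="my-keypair",             # placeholder SSH key pair
        TagSpecifications=[{
            "ResourceType": "instance",
            "Tags": [{"Key": "purpose", "Value": "isaac-sim-training"}],
        }],
    )
    print(response["Instances"][0]["InstanceId"])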

NVIDIA said Isaac Sim can foster collaboration and critical workflows, such as generating synthetic data for perception model training.

A reference workflow combines NVIDIA Omniverse Replicator, a framework for building custom synthetic data generation (SDG) pipelines and a core extension of Isaac Sim, with NVIDIA NIM microservices. With it, developers can build generative AI-enabled SDG pipelines, it said.

These include the USD Code NIM microservice for generating Python USD code and answering OpenUSD queries, plus the USD Search NIM microservice for exploring OpenUSD assets using natural language or image inputs.

The Edify 360 HDRi NIM microservice can generate 360-degree environment maps, while the Edify 3D NIM microservice can create ready-to-edit 3D assets from text or image prompts. Generative AI can thus ease the synthetic data generation process by reducing many tedious and manual steps, from asset creation to image augmentation, said NVIDIA.

  • Rendered.ai’s synthetic data engineering platform is integrated with Omniverse Replicator. It enables companies to generate synthetic data for computer vision models used in industries from security and intelligence to manufacturing and agriculture.
  • SoftServe Inc., an IT consulting and digital services provider, uses Isaac Sim to generate synthetic data and validate robots used in vertical farming with Pfeifer & Langen, a leading European food producer.
  • Tata Consultancy Services is building custom synthetic data generation pipelines to power its Mobility AI suite to address automotive and autonomous use cases by simulating real-world scenarios. Its applications include defect detection, end-of-line quality inspection, and hazard avoidance.

NVIDIA, AWS help robots learn in simulation

While Isaac Sim enables developers to test and validate robots in physically accurate simulation, Isaac Lab, an open-source robot learning framework built on Isaac Sim, provides a virtual playground for building robot policies that can run on AWS Batch. Because these simulations are repeatable, developers can troubleshoot and reduce the number of cycles required for validation and testing, said NVIDIA.

The company cited robotics startups that are already using Isaac Sim on AWS: 

  • Field AI is building robot foundation models to enable robots to autonomously manage a wide range of industrial processes. It uses Isaac Sim and Isaac Lab to evaluate the performance of these models in complex, unstructured environments in construction, manufacturing, oil and gas, mining, and more.
  • Vention, which offers a full-stack cloud-based automation platform, is creating pretrained skills to ease development of robotic tasks, noted NVIDIA. It is using Isaac Sim to develop and test new capabilities for robot cells used by small to midsize manufacturers.
  • Cobot offers Proxie, its AI-powered collaborative mobile manipulator. It uses Isaac Sim to enable the robot to adapt to dynamic environments, work alongside people, and streamline logistics in warehouses, hospitals, airports, and more.
  • Standard Bots is simulating and validating the performance of its R01 robot used in manufacturing and machining setup.
  • Swiss-Mile is using Isaac Sim and Isaac Lab for robot learning so that its wheeled quadruped robots can perform tasks autonomously with new levels of efficiency in factories and warehouses.
  • Cohesive Robotics has integrated Isaac Sim into its software framework called Argus OS for developing and deploying robotic workcells used in high-mix manufacturing environments.
  • Aescape’s robots are able to provide precision-tailored massages by accurately modeling and tuning the onboard sensors in Isaac Sim.

NVIDIA made other announcements in addition to the availability of Isaac Sim 4.2 on Amazon EC2 G6e Instances powered by NVIDIA L40S GPUs on AWS Marketplace.

It said that NVIDIA DGX Cloud can run on AWS for training AI models; that AWS liquid cooling is available for data centers using its Blackwell platform; and that NVIDIA BioNeMo NIM microservices and AI Blueprints, developed to advance drug discovery, are now integrated into AWS HealthOmics.

The company also said that its latest AI Blueprints are available on AWS for video search and cybersecurity, that NVIDIA CUDA-Q is now integrated with Amazon Braket for quantum computing development, and that RAPIDS Quick Start Notebooks are available on Amazon EMR.

Clearpath Robotics discusses development of Husky A300 ground vehicle

December 3, 2024: The Husky A300 uncrewed ground vehicle from Clearpath includes features for both expert robot developers and non-expert users.

The Husky A300, shown here, includes several design improvements over the A200, says Clearpath Robotics.

The Husky A300 is designed to be tougher and have longer endurance than the A200. Source: Clearpath Robotics

Developers of robots for indoor or outdoor use have a new platform to build on. In October, Clearpath Robotics Inc. released the Husky A300, the latest version of its flagship mobile robot for research and development. The Waterloo, Ontario-based company said it has improved the system’s speed, weather resistance, payload capacity, and runtime.

“Husky A200 has been on the market for over 10 years,” said Robbie Edwards, director of technology at Clearpath Robotics. “We have lots of experience figuring out what people want. We’ve had different configurations, upgrades, batteries and chargers, computers, and motors.”

“We’ve also had different configurations of the internal chassis and ingress protection, as well as custom payloads,” he told The Robot Report. “A lot of that functionality that you had to pay to add on is now stock.”

Husky A300 hardware is rugged, faster

The Husky A300 includes a high-torque drivetrain with four brushless motors that enable speeds of up to 2 m/s (4.4 mph), twice as fast as the previous version. It can carry payloads up to 100 kg (220.4 lb.) and has a runtime of up to 12 hours, said Clearpath Robotics.

The company, which Rockwell Automation acquired last year, noted that the platform can integrate third-party components and accessories including depth cameras, directional lidar, dual-antenna GPS, and manipulators. Husky A300 has an IP54 rating against dust and water and can withstand industrial environments or extreme temperatures outdoors, it said. 

“Before, the Husky was configured on a bespoke basis,” said Edwards. “Now we’re off at a more competitive price, which is great for our customers, and it now comes off our production line instead of our integration line.”

Founded in 2009, the company has tested its hardware and software near its office in a wide range of weather conditions.

Clearpath’s integration with Rockwell has gone smoothly so far, with Rockwell’s procurement team easing access to components and manufacturing, said Edwards. He observed that some of Rockwell’s customers in mining or other industrial automation could find new use cases in time.

The Husky A300 can withstand dust and temperature variances. Source: Clearpath Robotics

Clearpath includes ROS 2 support with A300

Husky A300 ships with Robot Operating System (ROS) 2 Jazzy plus demonstrations of Nav2, MoveIt 2, and other developer utilities.

“Over the past two years, there was a big push to get all Clearpath products to ROS 2 Humble because its configuration management system made life easier for our integration team and customers,” recalled Edwards. “We also provide support for simulation, and URDF [Unified Robot Description Format] is configured.”

Many of Clearpath’s R&D customers were familiar with ROS, C++, and Python, so it offered visualization and simulation tools in addition to the ROS stack, he added. However, as the company got non-expert customers, it wanted to enable them to also work with Husky.

“Academics who aren’t roboticists but want to do data collection can now do so with a simple Python interface, without learning ROS,” Edwards said. “We’ve maintained a level of flexibility with integrating different payloads and compute options while still giving a pretty good price point and usability.”
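As an example of what the shipped ROS 2 tooling enables, the sketch below sends a single navigation goal through Nav2's Python commander. It assumes Nav2 is already configured and running for the robot; the frame name and goal coordinates are placeholders rather than Clearpath-specific values.

    import rclpy
    from geometry_msgs.msg import PoseStamped
    from nav2_simple_commander.robot_navigator import BasicNavigator

    rclpy.init()
    navigator = BasicNavigator()
    navigator.waitUntilNav2Active()  # block until the Nav2 stack reports ready

    goal = PoseStamped()
    goal.header.frame_id = "map"
    goal.header.stamp = navigator.get_clock().now().to_msg()
    goal.pose.position.x = 5.0       # hypothetical waypoint in the map frame (m)
    goal.pose.position.y = 2.0
    goal.pose.orientation.w = 1.0

    navigator.goToPose(goal)
    while not navigator.isTaskComplete():
        pass  # a real application would inspect feedback here

    print(navigator.getResult())
    rclpy.shutdown()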


Husky AMP a ‘turnkey’ option

Clearpath Robotics is offering a “turnkey” version of the robot dubbed Husky AMP, or autonomous mobile platform. It comes with a sensor suite for navigation, pre-installed and configured OutdoorNav software, a Web-based user interface, and an optional wireless charging dock.

“Robotics developers can easily integrate payloads onto the mounting deck, carry out a simple software integration through the OutdoorNav interface, and get their system working in the field faster and more efficiently,” said Clearpath.

“We’ve lowered the barrier to entry by providing all software function calls and a navigation stack,” Edwards asserted. “The RTK [real-time kinematic positioning] GPS is augmented with sensor fusion, including wheel odometry, and visual and lidar sensors.”

“With a waypoint following system, the robotics stack does the path planning, which is constrained and well-tested,” he said. “Non-roboticists can use Husky A300 as a ground drone.”

More robot enhancements, use cases to come

Clearpath Robotics is considering variant drive trains for the Husky A300, such as tracks for softer terrain as in agriculture, said Edwards.

“Husky is a general-purpose platform,” he said. “We’re serving outdoors developers rather than end users directly, but there’s a lot of demand for larger, high-endurance materials transport.”

For the A300, the company surveyed its client base, which came back with 150 use cases.

“I’ve seen lots of cool stuff — robots herding animals, helping to grow plants, working in mines, participating in the DARPA Subterranean Challenge in fleets of Husky and [Boston Dynamics’] Spot,” Edwards said. “Husky Observer conducts inspections of sites such as solar farms.”

“The benefits for industrial users also help researchers,” he said. “Making the robot cheaper to deploy for faster time to value also means better battery life, weatherproofing, and integrations.”

Edwards added that Clearpath has received a lot of interest in mobile manipulation with its Ridgeback omnidirectional platform.

“This trend is finding its way outdoors as well,” he said. “On the application engineering side, developers have put two large Universal Robots arms on our Warthog UGV [uncrewed ground vehicle] for things like changing tires.”

The Husky A300 can carry different sensor payloads or robotic arms. Source: Clearpath Robotics

Realtime Robotics appoints Ville Lehtonen vice president of product

December 1, 2024: Realtime Robotics has named Ville Lehtonen, who previously worked at HighRes Biosolutions and Pickle Robot, to lead its product efforts.

Optimization evaluates multiple paths, sequences, poses, end-of-arm tool rotations, and interlocks for robots within a workcell. Source: Realtime Robotics.

Realtime Robotics, a leader in collision-free autonomous motion planning for industrial robots, last week named industry veteran Ville Lehtonen as its vice president of product.

Lehtonen brings experience in technology, product, and management, said Realtime Robotics. He most recently served as head of product at Pickle Robot Co., which he guided to a leadership position in the truck and container loading and unloading industry.

“Ville’s track record speaks for itself, and we’re confident he will be an excellent addition to the team,” said Kevin Carlin, chief commercial officer at Realtime Robotics.

“Our Optimization solution is already helping several manufacturing companies to reduce cycle times and improve productivity,” Carlin stated. “With Ville’s expertise, we can evolve to meet additional customer needs and expand its adoption throughout the manufacturing and logistics industries.”


Lehtonen expects ‘a massive gear change’

Prior to Pickle, Lehtonen was head of product for HighRes Biosolutions, a laboratory automation software company, and he was a co-founder and CEO of LabMinds Ltd., a laboratory automation company.

Lehtonen holds a BS and an MS in computer science from the Helsinki University of Technology and an MBA from Oxford University.

Ville Lehtonen. Source: LinkedIn

“I look forward to helping already highly automated production lines become even more efficient and cost-effective with the use of Realtime’s Optimization technology,” he said. “I am confident we can help manufacturers save tens of thousands of hours on their industrial robotics projects.”

“What Realtime is doing is a massive gear change in deploying automation,” Lehtonen added. “While this will be incredibly helpful for current manufacturers, the most exciting opportunities come from unlocking the economics for companies operating on a far smaller scale than the heavy users of robots. Realtime’s technology stack also can do for kinematics what real-time object-detection frameworks like YOLO [You Only Look Once] have done for computer vision, further lowering the barriers to entry in the robotics space.”

About Realtime Robotics

Boston-based Realtime Robotics said its technology generates optimized motion plans and interlocks to achieve the shortest possible cycle time in single and multi-robot workcells. The company claimed that its systems expand the potential of automation, empowering multiple robots to work closely together in unstructured and collaborative workspaces, reacting to dynamic obstacles the instant changes are perceived.

Realtime said its Optimization product uses a combination of proprietary software and experienced robotics and application engineering insights to drastically improve a manufacturer’s overall productivity. The system analyzes a customer’s existing digital twin, identifying bottleneck areas and recommending improvements based on desired parameters. 

Optimization can do all of this without interfering with ongoing production efforts, said Realtime Robotics.
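To give a feel for the kind of search cycle-time optimization involves, here is a toy brute-force sketch that scores alternative task sequences for a single robot. The task set, durations, and transition model are invented for illustration and do not represent Realtime Robotics' planner.

    from itertools import permutations

    # Hypothetical task durations (seconds) for one robot in a workcell
    task_time = {"pick_A": 1.2, "weld_1": 3.5, "weld_2": 3.1, "place_A": 1.0}

    def move_time(a, b):
        # Crude stand-in for motion planning: assume related tasks are close together
        return 0.4 if a.split("_")[0] == b.split("_")[0] else 0.9

    def cycle_time(sequence):
        work = sum(task_time[t] for t in sequence)
        travel = sum(move_time(a, b) for a, b in zip(sequence, sequence[1:]))
        return work + travel

    best = min(permutations(task_time), key=cycle_time)
    print(best, f"-> {cycle_time(best):.1f} s")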

ASTM developing testing standards for mobile manipulators

November 30, 2024: The ASTM International F45 committee is developing a new standard and an apparatus to test mobile manipulator precision.

The new MC600 is designed for reliable mobile manipulation, says MiR. Source: Mobile Industrial Robots

While humanoid robots are taking their first steps into industrial applications, ASTM International is working on standards for mobile manipulators. Its F45.05 Committee on Robotics, Automation, and Autonomous Systems is developing a standard designated as WK92144.

A mobile manipulator is broadly defined as an autonomous mobile robot (AMR) base with an attached multi-axis robotic arm. The ANSI/RIA R15.08-1-2020 standard defines a classification scheme for industrial mobile robots (IMR), which includes mobile manipulators as a class.

The goal of ASTM’s standard is to demonstrate the precision of such a robot and provide a series of quantifiable tests for measuring the accuracy of the manipulator and mobile base’s movements. The research for the emerging testing standard is based on a National Institute of Standards and Technology (NIST) paper.


Standards efforts focus on definitions, testing

“We started this standards effort in 2014 or 2015,” said Omar Aboul-Enein, co-author of the NIST paper and an F45 committee member. “In many cases, there’s noncontinuous performance. For example, the arm and vehicle don’t move simultaneously for machine tending or assembly tasks. We needed terminology for discrete pose-constrained tasks.”

In 2022, ASTM International expanded its scope to include robotics and automation, he told The Robot Report. “NIST had tested noncontinuous performance with multiple robots, such as AGVs [automated guided vehicles] and then experimented with continuous systems for large-scale manufacturing, like for aircraft wings, ship bows, and wind turbine blades.”

With R15.08, ASTM has focused on AMR testing, with task groups for mobile manipulation, grasp-type end effectors, and robotic assembly, Aboul-Enein explained. The mobile manipulation group has more than 30 members.

By supporting foundational tests for workpiece properties, ASTM wants to help industry create consistent documentation of robot configurations. Aboul-Enein described a configurable test apparatus for mobile manipulation that uses low-cost components and is designed to be easy to reproduce and allow for in-situ testing.

However, the new standards would not apply to end effectors, payloads, or fleet behavior. They could be used to develop simulations of robots and their behaviors, acknowledged Aboul-Enein.

“It definitely has potential, but there are always factors lurking in the real world, such as a dip in the lab floor or the weight of the arm when it’s fully stretched out to one side,” he said. “We’ve been working on items to assess mobile manipulators and measure their behavior, all based on consensus out of the committee. These standards are living documents.”

ASTM introduces testing table for mobile manipulators

The Robot Report also reached out to Aaron Prather, director of the Robotics and Autonomous Systems Program at ASTM International, for more detail on the WK92144 standard and where it’s headed.

The organization‘s F45 committee is introducing a new testing table, a tool that helps show the precision of a mobile manipulator, linking its arm and base movements. The robot must try to maneuver around the table while its arm performs tasks on the surface. These tasks include tracking an S-shaped black area for welding or gluing and inserting pegs.

This image, provided by Prather, is an early prototype built by the F45 team to test the emerging standard. | Credit: Aaron Prather, ASTM

Operators can adjust the tabletop to stand at 90 degrees, tilt to 45 degrees, or lay flat at 0 degrees. To make the tests more challenging, they can attach a shaker that adds motion and vibrations.

“The table design will be standardized, and the committee will provide instructions on how everyone can build their table,” said Prather. “Several test standards are planned based on the table. The goal is to have NIST task boards and this new table be the basis for how we test new grasping/manipulation/assembly applications for accuracy and repeatability.”

“Also, expect to see our new Student Competition Challenges to use the boards and table,” he added. “This will help get students involved in how to use standards and send them out into the community with the knowledge on how to leverage these new test tools we are going to keep launching to ensure new robot systems can pass them.”

“Our hope is that we see humanoids and mobile manipulators having to show their results to help end users better understand capabilities and ensure they are getting the right system for their application,” Prather said.
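As a simple illustration of how accuracy and repeatability might be scored from repeated runs against such a test apparatus, the sketch below compares measured end-effector positions with a commanded target. The numbers are invented, and the metrics actually specified in WK92144 may differ.

    import numpy as np

    # Commanded target and repeated measured end-effector positions (mm), made up
    target = np.array([250.0, 100.0, 50.0])
    measured = np.array([
        [250.8,  99.6, 50.3],
        [249.7, 100.4, 49.8],
        [250.3, 100.1, 50.5],
        [250.5,  99.8, 49.9],
    ])

    barycenter = measured.mean(axis=0)
    accuracy = np.linalg.norm(barycenter - target)  # bias from the commanded target
    repeatability = np.linalg.norm(measured - barycenter, axis=1).std()  # spread (1-sigma)

    print(f"accuracy: {accuracy:.2f} mm, repeatability: {repeatability:.2f} mm")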

Oxipital AI releases VX2 Vision System for inspection and picking https://www.therobotreport.com/oxipital-ai-releases-vx2-vision-system-for-inspection-and-picking/ Fri, 29 Nov 2024 13:05:04 +0000 Oxipital AI says its advanced vision system is more compact, delivers greater precision, and is more affordable than its predecessor.


The VX2 Vision System uses AI for food-grade inspection and picking, says Oxipital AI.

Oxipital AI this month launched its VX2 Vision System, which uses artificial intelligence for inspection and high-speed picking applications across food-grade and industrial sectors. Built on the company’s proprietary Visual AI platform, the VX2 comes in a more compact package at a more accessible price than its predecessor.

“At Oxipital AI, we believe that listening to our customers and learning from real-world applications is the key to driving innovation,” said Austin Harvey, vice president of product at Oxipital. “The VX2 is the result of that philosophy in action. It’s smaller, more powerful, and more versatile, enabling our customers to build more resilient manufacturing processes.”

Formerly Soft Robotics, Oxipital is developing machine vision for product inspection and robotic process automation in critical industries such as food processing, agriculture, and consumer goods production.

The Bedford, Mass.-based company’s stated mission is “to deliver actionable insights through deep object understanding to customers as they embrace Industry 5.0 and unlock previously unachievable levels of resiliency, efficiency, and sustainability in their manufacturing operations.”





VX2 Vision System includes several enhancements

Oxipital AI said the VX2 Vision System represents a significant improvement over its first-generation vision platform. The company said it incorporated customer feedback and extensive field learning to meet the evolving needs of the industry.

The VX2 has enhanced capabilities for inspection, high-speed picking, and high-speed picking with inspection, said Oxipital. It asserted that the system ensures optimal efficiency and precision in a wide variety of environments and listed the following benefits:

  • Compact and powerful: The VX2 packs more processing power into a smaller, more efficient design, providing greater flexibility for installations in tight spaces or complex environments, said Oxipital.
  • Versatile application: Designed for food-grade and industrial use, the VX2 excels in inspection tasks, high-speed handling, and combining both, ensuring accuracy and speed in demanding workflows.
  • Enhanced Visual AI platform: Oxipital said its platform delivers faster, more accurate decision-making capabilities, ensuring high-performance, real-time operations.
  • Better price point: Despite significant improvements in power and versatility, the VX2 is available at a more competitive price, said the company. This makes it an attractive option for businesses seeking to upgrade their capabilities without incurring significant costs, it added.

The VX2 Vision System continues Oxipital’s response to user feedback. Source: Oxipital AI

Oxipital AI applies vision to industry needs

With the VX2 launch at PACK EXPO this month, Oxipital said the technology demonstrates its commitment to innovations that address the challenges industry currently faces.

“Oxipital AI continues to push the boundaries of what is possible with vision systems in automated environments,” it said. Soft Robotics previously made compliant grippers before pivoting to vision AI.

Oxipital has partnered with Schmalz and Velec, and it was nominated as a PACK EXPO Food and Beverage Technology Excellence Award finalist.

Renesas launches its highest performing MPU for industrial equipment https://www.therobotreport.com/renesas-launches-highest-performing-mpu-industrial-equipment/ Thu, 28 Nov 2024 13:02:54 +0000 The RZ/T2H comes with the Renesas Flexible Software Package and a Linux package that comes with long-term support.

An illustration of the RZ/T2H MPU and a blue industrial robot arm.

Renesas said the RZ/T2H MPU provides powerful application processing and fast real-time control. | Source: Renesas Electronics Corporation

Renesas Electronics Corp. this week launched the RZ/T2H, its highest-performance microprocessor for industrial equipment. Thanks to its powerful application processing and real-time performance, the RZ/T2H is capable of high-speed, high-precision control of industrial robot motors for up to nine axes, the company said.

As demand grows to augment scarce labor, manufacturers are deploying industrial automation such as vertically articulated robots and industrial controller equipment. Renesas claimed that the RZ/T2H microprocessor (MPU) combines all the functionality and performance needed for developing production applications.

Industrial systems traditionally required multiple MPUs or a combination of MPUs and field-programmable gate arrays (FPGAs) to control these applications. However, the RZ/T2H MPU offers the same functionality on a single chip, said Renesas. This can reduce the number of components and save the time and cost of FPGA program development.

The MPU supports a variety of network communications including Industrial Ethernet on a single chip. It targets industrial controller equipment such as programmable logic controllers (PLCs), motion controllers, distributed control systems (DCSs), and computerized numerical controls (CNCs).

“We have enjoyed outstanding market success with RZ/T2M and RZ/T2L,” said Daryl Khoo, the vice president of the Embedded Processing 1st Business Division at Renesas. “The RZ/T2H builds on that momentum, allowing our industrial customers to leverage their existing design assets while addressing even more innovative, demanding industrial motor control and Linux applications. Our customers have been particularly impressed that the RZ/T2H enables them to implement a nine-axis motor control all on just one chip.”

A global provider of microcontrollers, Renesas combines expertise in embedded processing, analog, power, and connectivity to deliver complete semiconductor solutions. The Tokyo-based company said its products accelerate time to market for automotive, industrial, infrastructure, and Internet of Things (IoT) applications.




RZ/T2H can generate robot trajectories

The RZ/T2H is equipped with four Arm Cortex-A55 application CPUs with a maximum operating frequency of 1.2 GHz. For external memory, it supports 32-bit LPDDR4-3200 SDRAM. Two Cortex-R52 CPUs with a maximum operating frequency of 1 GHz handle the real-time processing, with each core equipped with a total of 576 KB of high-capacity tightly coupled memory (TCM).

This allows CPU- and memory-intensive tasks such as running Linux applications, robot trajectory generation, and PLC sequence processing to be executed on a single chip. At the same time, the RZ/T2H can handle fast and precise real-time control, such as motor control and Industrial Ethernet protocol processing, said Renesas.
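
As a rough illustration of the trajectory-generation workload that could run on the application cores, here is a minimal, hardware-independent sketch in Python. It is not Renesas code and does not use the Flexible Software Package or any vendor API; the joint values, move duration, and control rate are placeholders.

```python
import numpy as np

def quintic_scaling(t: np.ndarray, duration: float) -> np.ndarray:
    """Quintic time scaling s(t) from 0 to 1 with zero start/end velocity and acceleration."""
    tau = np.clip(t / duration, 0.0, 1.0)
    return 10 * tau**3 - 15 * tau**4 + 6 * tau**5

def joint_trajectory(q_start, q_goal, duration_s=2.0, rate_hz=1000):
    """Interpolate all axes from q_start to q_goal with a shared time scaling.

    Returns an (N, n_axes) array of joint setpoints at the control rate,
    which a real-time core would then track with per-axis current/PWM loops.
    """
    q_start = np.asarray(q_start, dtype=float)
    q_goal = np.asarray(q_goal, dtype=float)
    t = np.arange(0.0, duration_s, 1.0 / rate_hz)
    s = quintic_scaling(t, duration_s)                 # shape (N,)
    return q_start + s[:, None] * (q_goal - q_start)   # broadcast over axes

if __name__ == "__main__":
    # Nine axes in radians; the start and goal values are illustrative only.
    start = np.zeros(9)
    goal = np.array([0.5, -0.3, 1.2, 0.0, 0.8, -1.0, 0.4, 0.1, -0.2])
    setpoints = joint_trajectory(start, goal)
    print(setpoints.shape)  # (2000, 9) for a 2 s move at 1 kHz
```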

The RZ/T2H can control up to nine axes, with peripherals including three-phase PWM timers, delta-sigma interfaces for measuring current values, and encoder interfaces. It supports the A-format, EnDat, BiSS, HIPERFACE DSL, and FA-CODER encoder protocols.

In addition, the company placed peripheral functions for motor control on a low-latency peripheral port (LLPP) bus of the Cortex-R52 real-time CPU core, allowing high-speed access from the CPU.

The RZ/T2H has four Ethernet ports, three Gigabit Ethernet MACs (GMACs), and an Ethernet switch. It also supports EtherCAT, PROFINET, EtherNet/IP, OPC UA, and the next-generation Time-Sensitive Networking (TSN) standard.

The combination of these Ethernet switches and GMAC allows the MPU to support multiple Industrial Ethernet controllers and devices. Renesas said this allows the system to adapt to a wide range of controller requirements, such as upper-layer Ethernet communications.

Block diagram of the new RZ/T2H SoC. Source: Renesas

Renesas offers specialized boards and software

The RZ/T2H comes with the Renesas Flexible Software Package (FSP), the same package used across Renesas MPUs, and a Linux package with long-term support. An out-of-the-box multi-axis motor control evaluation system is available. It includes inverter boards for driving nine-axis motors, a multi-axis motor control software package, and the Motion Utility Tool, a motor control software tool.

Renesas has also included sample protocols for industrial Ethernet and software PLC packages to kick-start system development.

The company offers a “9-axis Industrial Motor Control with Ethernet” solution that combines the RZ/T2H with numerous compatible devices such as the RV1S9231A IGBT Drive Photocoupler and RV1S9353A Optically Isolated Delta-Sigma Modulator.

It said these combinations of compatible devices work together to bring optimized, low-risk designs to market faster. Renesas offers more than 400 of these combinations with a wide range of products from its portfolio.

The RZ/T2H is now available. Renesas said it plans to release the new RZ/N2H device, which offers the same performance as the RZ/T2H in a smaller package, in the first quarter of 2025. It said this will be suitable for industrial controller equipment such as PLCs and motion controllers.

The RZ/T2H is managed under the Product Longevity Program (PLP) for industrial equipment that requires long life cycles.

Learn about digitalization in the warehouse in new webinar https://www.therobotreport.com/learn-about-digitalization-in-the-warehouse-in-webinar/ Wed, 27 Nov 2024 14:30:49 +0000 Digitalization of the warehouse involves several emerging technologies; attendees of this free webinar can learn from industry experts.

Digital tools, such as the simulation shown here from Dexory, are part of digitalization in the warehouse.

Digitalization is bringing emerging technologies into the warehouse. Source: Dexory

Designing and deploying a digital warehouse can be a challenge, with numerous technology options to add to your operations. From robotics and automation to the latest data analytics and artificial intelligence, how can you take advantage of digitalization?

At 2:00 p.m. EST on Wednesday, Dec. 4, expert panelists will discuss how emerging technologies are changing how engineers design warehouse systems and how businesses can gain insights and efficiencies with them. Sensors, digital twins, wearables, and virtual assistants are some of the tools that are part of this digital transformation.

In this free webinar, viewers can learn about:

  • Ways to improve labor productivity with workforce management
  • The orchestration of people and autonomous mobile robots (AMRs) for order picking and fulfillment
  • Where augmented and virtual reality (AR/VR) fit in the warehouse
  • How AI will change how operators use data in a positive feedback cycle
  • How to scale digital transformation across facilities and the supply chain

Register now to attend this webinar on digitalization, and have your questions answered live. Registrants will be able to view it on demand after the broadcast date.

Digitalization speakers to share insights

Robert C. Kennedy, principal at RC Kennedy Consulting, will discuss digitalization in the warehouse.

Robert C. Kennedy is principal at RC Kennedy Consulting. For over four decades, he has planned, developed, and implemented industry-leading supply chain execution systems around the globe. Kennedy and his staff have led more than 200 large-scale implementation projects of supply chain execution software for leading customers in a variety of industries, including pharmaceutical, electronics, third-party logistics (3PL), and food and beverage.

A leading voice in the field, Kennedy is regularly interviewed by industry media, has published articles, and has presented at numerous trade shows and seminars.

RC Kennedy Consulting provides assistance to companies to improve operational efficiencies through process design and systems. It also helps them develop strategies for growth.

Ken Ramoutar will discuss digitalization in the warehouse.

Ken Ramoutar is chief marketing officer at Lucas Systems, which helps companies transform their distribution center by dramatically increasing worker productivity, operational agility, and customer and worker satisfaction using voice and AI optimization technologies.

In his 25 years of customer-centric roles in supply chain software and consulting, Ramoutar has navigated companies through uncertainty and volatility as a thought leader and change agent.

Prior to Lucas, Ken was senior vice president and global head of customer experience at Avanade, a $3 billion Accenture and Microsoft-owned company, and he has held leadership roles at IBM, Sterling Commerce, and SAP/Ariba.

Michael Taylor is chief product officer and co-founder of Duality AI.

Michael Taylor is the chief product officer and co-founder of Duality AI. He has a 20-year career in mobile robotics, with 15 years dedicated to building autonomous field robots at Caterpillar.

While there, Mike led the team developing the autonomy system for Caterpillar’s autonomous dozer, and he helped launch the Autonomous Mining Truck program. His roles included architecting behaviors and planning systems, as well as building a collection of simulation technologies to accelerate deployment to customer sites.

Taylor was also part of the Carnegie Mellon team that won DARPA’s Urban Challenge, where he led both the Controls Team and the Field Calibration Team. Taylor holds dozens of patents in fields ranging from robotics to simulation technologies.

At Duality AI, Taylor leads the company’s Product and Solutions Engineering team. He is responsible for steering Duality’s product strategy, developing technologies to address customer needs, and helping ensure that customers maximize the value they extract from Falcon. This includes projects ranging from a simulation solution to support a drone-based AI perception system, to generating synthetic data for high-volume manufacturing quality assurance, to characterizing and modeling of uncrewed ground vehicles (UGVs) navigating novel environments. 

Eugene Demaitre, editorial director for robotics at WTWH Media

Eugene Demaitre, moderator, is the editorial director for robotics at WTWH Media, which produces Automated Warehouse, The Robot Report, the Robotics Summit & Expo, and RoboBusiness. Prior to working for WTWH Media, he was an editor at BNA (now part of Bloomberg), Computerworld, TechTarget, Robotics Business Review, and Robotics 24/7.

Demaitre has participated in conferences worldwide, as well as spoken on numerous webcasts and podcasts. He is always interested in learning more about robotics. He has a master’s from the George Washington University and lives in the Boston area.

This webinar is sponsored by Balluff and Dexory.


SS Innovations completes its first robotic cardiac surgery in Indonesia https://www.therobotreport.com/ss-innovations-completes-its-first-robotic-cardiac-surgery-in-indonesia/ Tue, 26 Nov 2024 22:33:47 +0000 SS Innovations' SSi Mantra system assisted with the procedure at the Harapan Kita National Cardiac Hospital in Jakarta.

Five white surgical robotic arms on grey and blue carts.

The SSI Mantra surgical robotic system can use three to five modular robotic arm carts. | Source: SS Innovations

SS Innovations International Inc. today said it has successfully performed its first-ever robotic cardiac surgery in Indonesia. The surgery was performed with the SSi Mantra system, which the company said demonstrates its commitment to making advanced robotic surgeries cost-effective and globally accessible.

The procedures included a bilateral, internal mammary artery coronary artery bypass graft (IMA CABG); an atrial septal defect (ASD) repair; and a beating heart totally endoscopic coronary artery bypass (TECAB). They were conducted at the Harapan Kita National Cardiac Hospital in Jakarta.

Dr. Sudhir Srivastava, the founder, chairman, and CEO of SS Innovations, performed the TECAB with support from his team from the company and the dedicated support staff of Harapan Kita led by Dr. Dudy Hanafy.

SSi Mantra offers modularity, visualization

Gurugram, India-based SS Innovations International offers the proprietary SSi Mantra Surgical Robotic System and SSi Mudra instrumentation. 

The SSi Mantra 3, which was released in July, is modular and allows surgeons to use three to five robotic arms. The system has an open-faced ergonomic Surgeon Command Centre, a 32-in. 3D 4K monitor, and a 23-in. 2D touch-panel monitor that displays all patient-related information.

It also provides a virtual real-time image of the robotic Patient Side Arm Carts and can superimpose 3D models of diagnostic imaging. The system also supports telesurgery.

The Vision Cart gives the table-side team the same magnified 3D 4K view as the surgeon to provide better safety and efficiency, said SS Innovations. Meanwhile, the modular robotic arms provide flexibility in positioning and the number of arms to be used. This allows for collision-free conduct of surgical operations, it said.

The SSi Mantra includes more than 40 different types of robotic endo-surgical instruments that can be used for different specialties, including cardiac surgery. SS Innovations said the learning curve for surgeons using this technology is short because of its ergonomic design and user-friendly features.

The SSi Mantra has been clinically validated in India in more than 80 different types of surgical procedures. In September, SS Innovations said more than 2,000 procedures have been conducted with its system. It also added Dr. Frederic Moll, founder of Intuitive Surgical, to its board.

The company said it has started the regulatory approval process in the U.S. and the European Union. It anticipates receiving Food and Drug Administration and CE Mark approval in the second half of 2025.

SS Innovations hopes to raise standard of cardiac care in Indonesia

SS Innovations asserted that launching the robotic cardiac surgery program at Harapan Kita represents the beginning of a new era in cardiac care in Indonesia. The island nation has a population of over 284 million and almost 3,000 hospitals.

“We have been exploring the integration of robotic surgery in cardiovascular care at the Harapan Kita hospital and reviewed the globally available surgical robotic systems,” noted Dr. Iwan Dakota, the director and intervention cardiologist at NCVC Harapan Kita. “We found that the SSi Mantra is the only system supporting the full spectrum of robotic cardiac surgery, including TECAB.”

“Given its unique capabilities, we invited Dr. Srivastava’s team to demonstrate how the SSi Mantra performs as compared to other systems,” Dr. Dakota added. “The potential for implementing robotic surgery in our country is highly promising.”

Following the initial TECAB procedure, SS Innovations, in collaboration with Harapan Kita’s surgical team, performed seven to eight additional robotic-assisted procedures over the next several days. These included LIMA, BIMA, TECAB, and ASD Repair procedures. This collaboration aims to elevate the standard of cardiac care in the region.

“We are proud to partner with Harapan Kita National Cardiac Hospital to bring the most technologically advanced and cost-effective solutions to cardiac surgery to Indonesia,” stated Dr. Sudhir Srivastava, founder, chairman, and CEO of SS Innovations. “Reaching this milestone with SSi Mantra is a testament to our vision of transforming surgical practices, enhancing access, and driving the global adoption of cost-effective robotic surgery.”

“This collaboration addresses the critical need for safe, timely, and affordable cardiac care while offering patients less-invasive options and an improved quality of life,” he said.

GE HealthCare unveils new applications for mobile C-arm portfolio https://www.therobotreport.com/ge-healthcare-unveils-new-applications-mobile-c-arm-portfolio/ Mon, 25 Nov 2024 20:28:59 +0000 GE HealthCare said complex pulmonary and thoracic procedures require precise intraoperative imaging systems.

The OEC 3D Imaging System, which is made up of three carts with monitors, and one cart with a large, C shaped device.

The OEC 3D Imaging System. | Source: GE HealthCare

GE HealthCare Technologies Inc. last week announced that it has added new clinical applications to its OEC 3D mobile CBCT C-arm portfolio. The Chicago-based company said the additions will enable precise and efficient imaging during endoscopic bronchoscopy procedures in the practice of interventional pulmonology.

Complex pulmonary and thoracic procedures require precise intraoperative imaging systems, explained GE HealthCare. The position of a nodule can differ from pre-operative CT images, it noted. Differences in respiratory patterns, patient positioning, and other factors cause this CT-to-body divergence at the time of the procedure, said the company.

GE HealthCare claimed that its OEC 3D intraoperative mobile cone-beam computed tomography (CBCT) C-arm offers “imaging excellence” and versatility. It said it can aid in everyday procedures ranging from neuro-spine and orthopedic trauma to interventional procedures such as bronchoscopy.

OEC 3D enables the visualization of both 2D and 3D images of the lung using a single mobile C-arm. The lung suite now includes an augmented fluoroscopy overlay of 3D points of interest and adjustable motorized 3D scans.

OEC interfaces continue to expand

During bronchoscopy procedures, clinicians can use navigation or robotic assistance with the OEC Open interface to automatically transfer 3D volumetric data after reconstruction.

GE HealthCare recently added a verified interface with the Intuitive Ion endoluminal robotic bronchoscopy system. The company said it continues to expand OEC open interfaces for a variety of clinical procedures as an agnostic ecosystem. It’s currently verified with eight third-party systems across robotics, navigation, and augmented reality (AR) vision.

“As we continue to build out our OEC ecosystem, GE HealthCare is excited about the addition of the Intuitive Ion robotic system to our OEC Open interface,” said Christian O’Connor, global general manager for surgery at GE HealthCare. “This interface provides interventional pulmonologists using the OEC 3D C-arm a seamless experience during minimally invasive, robotic-assisted bronchoscopy procedures.”

“With Intuitive’s Ion Robotic Bronchoscopy System now verified to interface with GE HealthCare’s OEC 3D through the OEC Open interface, I believe we can now reach and diagnose almost any nodule in the lung,” stated Dr. Dominique Pepper. She is medical director of bronchoscopy and respiratory care at Providence Swedish South Puget Sound and a consultant for GE HealthCare.

“This is a game-changer for clinicians – this can help us confidently and accurately provide answers when we see a suspicious area of interest,” Pepper said.




About GE HealthCare

GE HealthCare said it is a global medical technology, pharmaceutical diagnostics, and digital solutions innovator. The company said its integrated systems, services, and data analytics can make hospitals more efficient, clinicians more effective, therapies more precise, and patients healthier and happier. It said it is a $19.6 billion business with approximately 51,000 employees worldwide. 

First introduced in 2021, the OEC 3D mobile CBCT C-arm provides precise 3D and 2D imaging in a variety of procedures. During bronchoscopies, clinicians can use CBCT visualization features, such as Lung Preset, to help optimize viewing of airway structures and Augmented Fluoroscopy with Lung Suite to help confirm tool-in-lesion.

The OEC 3D enables a transition from 3D to 2D imaging through one versatile mobile CBCT imaging C-arm. GE said it includes an intuitive user interface and workflow to further optimize space in the bronchoscopy suite.

Editor’s note: This article was syndicated from The Robot Report sibling site MassDevice.

Imagry moves to make buses autonomous without mapping https://www.therobotreport.com/imagry-moves-to-make-buses-autonomous-without-mapping/ Mon, 25 Nov 2024 19:18:36 +0000 Imagry has developed hardware-agnostic systems to provide Level 4 autonomy to buses with time to market in mind.


Imagry says its software enables buses to autonomously handle complex situations such as roundabouts. Source: Imagry

Autonomous vehicles often rely heavily on prior information about their routes, but new technology promises to improve real-time situational awareness for vehicles including buses. Imagry said its “HD-mapless driving” software stack enables vehicles to react to dynamic contexts and situations more like human drivers.

The company also said its AI Vision 360 eliminates the need for external sensor infrastructure. It claimed that its bio-inspired neural network and hardware-agnostic systems allow for SAE Level 3/4 operations without spending time on mapping.

“We’ve been focusing on two sectors,” said Eran Ofir, CEO of Imagry. “We’ve been selling our perception and motion-planning stack to Tier 1 suppliers and automotive OEMs for autonomous vehicles. We signed a 10-year contract with Continental and are jointly developing a software-defined vehicle platform.”

“And we’ve started working with transportation operators on providing autonomous buses,” he told The Robot Report. “For example, in Turkey, France, Spain, and soon Japan, we’re retrofitting electric buses to be autonomous.”




Imagry trains in real time with supervision

Imagry was established in 2015 with a focus on computer vision for retail. In 2018, it began focusing entirely on autonomous driving. The company now has about 120 employees in San Jose, Calif., and Haifa, Israel.

Imagry said its technology is similar to that of Tesla in relying on 3D vision for perception and motion planning rather than rule-based coding or maps.

“Most players in the industry use HD maps with 5 cm [1.9 in.] resolution, telling the vehicle where lights, signs, and lane markers are,” said Ofir. “Our system teaches itself with supervised learning. It maps in real time while driving. Like a human driver, it gets the route but doesn’t know what it will find.”

How does Imagry deal with the need for massive data sets to train for navigation and obstacle detection and avoidance?

“We wrote a proprietary tool for annotation to train faster, better, and cheaper,” Ofir replied. “The data is collected but doesn’t live in the cloud. The human supervisor tells the vehicle where it was wrong, like a child. We deliver over-the-air updates to customers.”

“The world doesn’t belong to HD maps — it’s a matter of trusting AI-based software for perception and motion planning,” he said.
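
Conceptually, an HD-mapless stack rebuilds its world model every control cycle from live sensing instead of looking poses up in a prior map. The sketch below is only a schematic of that idea, with placeholder function names and values; it is not Imagry's software or any real autonomy API.

```python
import numpy as np

def perceive_local_scene(camera_frames):
    """Placeholder: a learned perception model would return the drivable area,
    lane geometry, and dynamic obstacles visible in the current frames."""
    return {"drivable": np.ones((200, 200), dtype=bool), "obstacles": []}

def plan_motion(scene, route_hint, current_speed_mps):
    """Placeholder: a learned or optimization-based planner would compute the
    next steering and speed commands from the freshly built local scene."""
    target = min(current_speed_mps + 0.5, 13.9)   # cap near 50 kph, per the article
    return {"steering_rad": 0.0, "target_speed_mps": target}

def control_cycle(camera_frames, route_hint, current_speed_mps):
    """One perception-then-planning step. No prior HD map is consulted; only the
    route hint (e.g., 'turn left in 200 m') and live sensing are used."""
    scene = perceive_local_scene(camera_frames)
    return plan_motion(scene, route_hint, current_speed_mps)

if __name__ == "__main__":
    command = control_cycle(camera_frames=[], route_hint="continue straight",
                            current_speed_mps=10.0)
    print(command)
```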

Ofir cited an example of a vehicle in Arizona on a random route with no communications to centralized computing. Its onboard sensors and compute recognized construction zones, skateboarders, a bike lane, and stop signs.

“The capability to drive out of the box in new places is unique to Imagry,” asserted Ofir. “We can handle righthand and lefthand driving, such as in Tokyo, where we’ve been driving for a year now.”

How does the bus know when to stop for passengers?

It could stop at every bus stop, upon request via a button at the stop (for the elderly, who may not use phone apps), or be summoned by an app that also handles payment, responded Ofir. Imagry’s system also supports “kneeling” for people with disabilities.

Why buses are a better focus for autonomy

Imagry has decided to focus on urban use cases rather than highways. Buses offer a simpler path to Level 4 autonomy, said Ofir.

“Autonomous buses are better than ride hailing; they’re simpler than passenger vehicles,” said Ofir. “They drive in specific routes and at a speed of only 50 kph [31 mph] versus 80 kph [50 mph]. It’s a simpler use case, with economies of scale.”

“The time to revenue is much faster — the design cycle is four years, while integrating with a bus takes two to three months,” he explained. “Once we hand it over to the transport operator, we can get to L4 in 18 months, and then they can buy and deploy 40 more buses.”

In addition, the regulations for autonomous buses are clearer, with 22 countries running pilots, he noted.

“We already have projects with a large medical center and on a public road in Israel,” Ofir said. “We’re not doing small pods — most transport operators desire M3-class standard buses for 30 to 45 passengers because of the total cost of ownership, and they know how to operate them.”

In September and October, Imagry submitted bids for autonomous buses in Austria, Portugal, Germany, Sweden, and Japan.

Software focus could save money

Because its software is vehicle-agnostic, Imagry avoids being tied to specific, expensive hardware, Ofir said. Fifteen vendors are making systems-on-chip (SoCs) that are sufficient for Level 3 autonomy, he said.

“OEMs want the agility to use different sets of hardware in different vehicles. A $30,000 car is different from a $60,000 car, with different hardware stacks and bills of materials, such as camera or compute,” said Ofir. “It’s a crowded market, and the autonomy stack still costs $100,000 per vehicle. Ours is only $3,000 and runs on Ambarella, NVIDIA, TI, Qualcomm, and Intel.”

“With our first commercial proof of concept for Continental in Frankfurt, Germany, we calibrated our car and did some localization,” he added. “Three days after arrival, we simply took it out on the road, and it drove, knowing there’s no right on red.”

With shortages of drivers, particularly in Japan, operators could save $40,000 to $70,000 per bus per year, he said. The Japanese government wants 50 locations across the country to be served with autonomous buses by the end of 2025 and 100 by the end of 2027.
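
As a back-of-the-envelope illustration of those savings, the short calculation below uses the $40,000-to-$70,000 annual figure cited above; the one-time retrofit cost is an assumed placeholder, not a figure from Imagry.

```python
# Illustrative payback estimate for retrofitting one bus for autonomy.
# Only the $40,000-$70,000 annual driver-savings range comes from the article;
# the retrofit cost below is an assumed placeholder, not an Imagry figure.
ASSUMED_RETROFIT_COST_USD = 150_000
DRIVER_SAVINGS_PER_YEAR_USD = (40_000, 70_000)

for savings in DRIVER_SAVINGS_PER_YEAR_USD:
    payback_years = ASSUMED_RETROFIT_COST_USD / savings
    print(f"At ${savings:,} saved per year, payback takes about {payback_years:.1f} years")
```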

Autonomous buses are also reliable around the clock and don’t get sick or go on strike, he said.

“We’re working on fully autonomous parking, traffic jam assist, and Safe Driver Overwatch to help younger or older drivers obey traffic signs, which could be a game-changer in the insurance industry,” he added. “Our buses can handle roundabouts, narrow streets, and mixed traffic and are location-independent.”

Phases of autonomous bus deployment

Technology hurdles aside, getting autonomous buses recognized by the rules of the road requires patience, said Ofir.

“Together with Mobileye, which later moved to the robotaxi market, Imagry helped draft Israel’s regulatory framework for autonomous driving, which was completed in 2022,” recalled Ofir. “We’re working with lawmakers in France and Germany and will launch pilots in three markets in 2025.”

Testing even Level 3 autonomy can take years, depending on the region. He outlined the phases for autonomous bus rollout:

  1. Work with the electric bus for that market, then activate the system on a public road. “In the U.S., we’ve installed the full software and control stack in a vehicle and are testing FSD [full self-driving],” Ofir said.
  2. Pass NCAP (European New Car Assessment Programme) testing for merging and stops in 99 scenarios. “We’re the only company to date to pass those tests with an autonomous bus,” said Ofir. “Japan also has stringent safety standards.”
  3. Pass the cybersecurity framework, then allow passengers onboard buses with a safety driver present.
  4. Autonomously drive 100,000 km (62,137 mi.) on a designated route with one or more buses. After submitting a report to a department of motor vehicles or the equivalent, the bus operator could then remove the human driver.

“The silicon, sensors, and software don’t matter for time to revenue, and getting approvals from the U.S. National Highway Traffic Safety Administration [NHTSA] can take years,” Ofir said. “We expect passenger vehicles with our software on the road in Europe, the U.S., and Japan sometime in 2027.”

Imagry has joined Partners for Automated Vehicle Education (PAVE) and will be exhibiting at CES in January 2025.
