Design / Development Archives - The Robot Report
https://www.therobotreport.com/category/design-development/ | Robotics news, research and analysis

Funding the next wave of robotics
https://www.therobotreport.com/funding-the-next-wave-of-robotics/ | Fri, 06 Dec 2024
Episode features conversations with two VCs and explores robotics and AI investment trends.


 

In Episode 176 of The Robot Report Podcast, we feature interviews with venture capitalists Juliette Chevallier, principal at Scale Venture Partners, and Jasmeet Singh, founder of JMOON Ventures.

It’s VC week here at the podcast.

This episode features interviews with Juliette Chevallier from Scale Venture Partners and Jasmeet Singh from JMOON Ventures and covers investment trends in robotics, emphasizing the importance of execution risk over technical risk.

Juliette Chevallier, Principal, Investments, Scale Venture Partners

Juliette Chevallier has a background in autonomous vehicles and robotics, having previously worked at companies like Google Chauffeur (now Waymo) and MIT spinoff Optimus Ride. She joined Scale Venture Partners about 2 years ago to lead their investment thesis on robotics, AI applications, and cybersecurity. Scale Venture Partners’ approach focuses on investing at the point of execution risk rather than technical risk, looking for companies with a working product and proven product-market fit. Juliette emphasizes the importance of understanding the customer ROI and business model as key criteria.

In her role as a VC, Juliette prefers deep, hands-on involvement with portfolio companies, acting as a strategic sounding board and collaborating closely with founders to work through tough problems. She sees her role as helping founders navigate operational and go-to-market challenges. Juliette notes a renewed interest in robotics from VCs, though she is cautious about some “wild” valuations and funding rounds, preferring bottom-up market analysis over top-down figures.

Juliette is bullish on the potential of robotics foundation models (RFMs) to drive transformation, emphasizing the need for more multi-modal AI models that integrate vision, action, and communication. She is excited about the possibilities of AI to enhance robotics, but cautions about the risks of AI development burning through funding. Overall, Juliette’s approach focuses on de-risking execution and operational challenges for robotics startups, leveraging her deep technical and business expertise to support founders.

Learn more at: www.scalevp.com/

Jasmeet Singh, Founder, JMOON Ventures

Jasmeet Singh has a diverse background spanning robotics engineering, founding startups, and investing since 2012. As an investor at JMOON Ventures, he focuses on “physical AI” startups – those combining hardware, electronics, and AI in areas like robotics, IoT, and 3D printing.

Jasmeet emphasizes the importance of solving real problems, not just building cool technology. He looks for startups with a strong understanding of the user and business model, noting operational challenges like scaling manufacturing and finding the right business model.
Compared to the more risk-averse Canadian market, Jasmeet sees the US as a better environment for robotics fundraising. He advises founders to target large, underserved problems and focus on customer service and support.

Some of Jasmeet’s investments include Orange Wood Labs, Brisk AI, and Rural Hologram. As he launches JMOON Ventures, he is particularly interested in opportunities in agriculture, construction, medical technology, and sustainability.

Overall, Jasmeet brings a unique perspective as an investor with deep technical expertise and operational experience in robotics. He is focused on backing founders solving real-world problems with innovative hardware-software solutions.

Learn more at: jmoon.ventures/

Show timeline

  • 8:40 – News of the week
  • 26:38 – Interview with Juliette Chevallier
  • 1:03:00 – Interview with Jasmeet Singh, AKA The Bearded Maker



News of the week

Humanoid video of the week

[TikTok video from @plugfc7: “Kai’s 1X Robot didn’t last long after getting rebooted.”]

Recent videos featuring internet influencer Kai Cenat and his 1X EVE robot have sparked a significant discussion about the readiness of humanoid robots for domestic use. In one particular incident (seen in the TikTok video above), the robot abruptly powered down and fell over, raising concerns about potential safety hazards and the current limitations of humanoid technology. This event highlights the need for rigorous testing and development before deploying such robots in homes, as opposed to the more controlled industrial environments where they are currently being trialed.

ASTM developing testing standards for mobile manipulators

The ASTM F45 subcommittee is developing a new standard to evaluate the agility of mobile manipulators. This standard aims to provide a standardized testing procedure similar to automotive evaluations, allowing manufacturers to benchmark their solutions and identify areas for improvement. The proposed tests involve tracking a specific path on a table surface and inserting pegs, assessing the robot’s precision and coordination between arm and base movements. This initiative and other ASTM F45 efforts in mobile robot testing underscore the growing importance of standardized evaluation methods for advancing robotics technology.

GEODIS reaches 10M picks with Locus mobile robots

Locus Robotics and GEODIS have reached a major milestone with over 10 million units picked using autonomous mobile robots (AMRs) at a GEODIS distribution center in Pennsylvania. Locus’s AI-powered platform, LocusONE, optimizes worker productivity by directing them to the next pick location, reducing wasted time and boosting efficiency. This partnership highlights the increasing adoption of warehouse automation to meet growing e-commerce demands and improve operational efficiency.


2025 RBR50 Robotics Innovation Awards open for nominations

You can now submit nominations for the 2025 RBR50 innovation awards. They will recognize technology and business innovations from the 2024 calendar year, and the awards are open to any company worldwide that produces robotics or automation technology.

The categories include:

  1. Technologies, products, and services: This category includes primary or applied research focusing on robotics and supporting technologies such as motion control, vision, or machine learning. It also includes new products and business, engineering, or technology services.
  2. Business and management: This category covers initiatives positioning a company as a market leader or an organization as an important thought leader in the robotics ecosystem. Significant mergers and acquisitions are relevant, as are supplier, partner, and integrator relationships.
  3. Applications and markets: The RBR50 will also recognize innovations that improve productivity, quality, and cost-effectiveness, as well as those that automate new tasks.

In addition, the 2025 RBR50 awards will celebrate the following:

  • Startup of the Year
  • Application of the Year
  • Robot of the Year
  • Robots for Good Award

The deadline for submissions is Friday, Dec. 20, 2024.


Podcast sponsored by FlexQube

The show this week is sponsored by FlexQube. Move material of any size, shape, and weight with the FlexQube Navigator AMR, the world’s first multi-purpose, non-load-carrying robot.

The FlexQube Navigator AMR features a standardized coupling interface to connect with an ecosystem of different load carriers depending on the customer’s needs.

The system also features safety-rated identification of the load carrier footprint to ensure a safe and efficient scale-up across different use cases in a factory or warehouse.

FlexQube Navigator – robotics that delivers! 

To learn more about FlexQube’s solutions, go to: https://www.flexqube.com


 

Hello Robot’s Stretch AI toolkit explores embodied intelligence
https://www.therobotreport.com/stretch-ai-toolkit-explore-embodied-intelligence/ | Fri, 06 Dec 2024
Stretch AI is a powerful toolkit designed to help researchers and developers create intelligent behaviors for Hello Robot’s Stretch 3 mobile manipulator.


Hello Robot released an open-source collection of tools, tutorials, and reference code called Stretch AI that empowers developers to explore the future of embodied AI on the Stretch 3 mobile manipulator. Stretch 3, released in February 2024, is gaining traction with university labs as both a platform for AI research and a system for real-world deployments.

The release follows recent advances in robot utility models, a precursor to embodied AI capabilities. Available learning policies include ACT, VQ-BeT, and Diffusion Policy.

Stretch AI is a powerful toolkit designed to empower researchers and developers to create intelligent behaviors for the Stretch 3 mobile manipulator. This platform offers a range of capabilities, including:

  • Code for precise grasping and manipulation
  • Advanced mapping and navigation techniques
  • Integration with LLM agents for sophisticated decision-making
  • Seamless text-to-speech and speech-to-text functionality
  • Robust visualization and debugging tools to streamline development and testing

Stretch AI integrates open-source AI models, allowing it to accomplish home tasks with natural verbal requests such as “Stretch, pick up the toy, and put it in the bin.” There is a dedicated GitHub repo for Stretch AI.
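
The pattern behind such verbal requests is an LLM-based agent that maps language onto a small set of robot skills. The schematic sketch below illustrates that loop; the skill names, prompt, and canned plan are invented for illustration and are not taken from the Stretch AI repository.

```python
# Schematic LLM-agent loop for a verbal pick-and-place request.
# Skill names and the LLM plumbing are illustrative inventions; see
# Hello Robot's Stretch AI GitHub repo for the real interfaces.
import json

SKILLS = {
    "navigate_to": lambda target: print(f"navigating to {target}"),
    "pick": lambda obj: print(f"picking {obj}"),
    "place": lambda receptacle: print(f"placing in {receptacle}"),
}

PROMPT = (
    "Turn the user's request into a JSON list of steps, each "
    '{"skill": <one of navigate_to|pick|place>, "arg": <string>}.'
)

def plan_with_llm(request: str) -> list[dict]:
    # Stand-in for an LLM call; a real agent would send PROMPT plus the
    # request to a language model and parse its JSON reply.
    return json.loads(
        '[{"skill": "navigate_to", "arg": "toy"},'
        ' {"skill": "pick", "arg": "toy"},'
        ' {"skill": "navigate_to", "arg": "bin"},'
        ' {"skill": "place", "arg": "bin"}]'
    )

for step in plan_with_llm("Stretch, pick up the toy and put it in the bin."):
    SKILLS[step["skill"]](step["arg"])
```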

“With Stretch AI, we wanted to open up access to the latest Embodied AI techniques and make them available to the fast-growing community of Stretch developers,” said Chris Paxton, senior embodied AI lead at Hello Robot. “We’re moving towards a world where robots can perform complex, multi-step tasks in homes. Stretch AI advances the ability to simply develop autonomous systems such as these using AI.”




Taking AI from labs to living rooms

“Thanks to advances in AI, general-purpose home robots like Stretch are developing faster than expected,” said Hello Robot CEO Aaron Edsinger. “However, it is uncommon to see these robots actually working in real homes with real people. With Stretch AI, roboticists can take their work from the lab and begin developing real applications for realistic home settings.”

Stretch AI offers a distinct vision of the future in which AI-powered robots benefit everyone, including older adults, children, and people with disabilities. “Homes are an inclusive place. To truly succeed in homes, robots, and the AI that powers them, should be made for everyone,” said Edsinger.

Hello Robot said its Stretch mobile manipulator is used by developers in 20 countries, from leading universities to innovative companies. With Stretch AI, Hello Robot invites the research community to collaborate on shaping the future of embodied intelligence.

The Stretch 3 is priced at $24,950 and is available on Hello Robot’s website.

Stretch 3 is portable, lightweight, and designed from the ground up to work around people. | Credit: Hello Robot

AMP Robotics raises $91M to accelerate deployment of recycling systems
https://www.therobotreport.com/amp-robotics-raises-91m-accelerate-deployment-recycling-systems/ | Thu, 05 Dec 2024
AMP Robotics will use its latest funding to deploy its AMP ONE system, which is designed to improve sortation of municipal solid waste.


AMP ONE is designed to capture billions of dollars in value otherwise lost to landfills or incineration annually. Source: AMP Robotics

AMP Robotics Corp. today said it has raised $91 million in corporate equity in a Series D financing. The Louisville, Colo.-based company plans to use its latest funding to accelerate deployment of its AMP ONE systems, which use artificial intelligence and robotics to sort municipal solid waste, or MSW.

“Recycling rates have stagnated in the United States, despite the positive benefits recycling offers local economies and the environment,” said Matanya Horowitz, founder of AMP. “This latest investment enables us to tackle larger projects and deliver real outcomes for waste companies and municipalities – by lowering sortation costs, capturing more material value, diverting organic waste, and extending landfill life – all while helping the industry optimize its strategic assets.”

Founded in 2014, AMP Robotics said its AI platform has identified 150 billion items and guided the sortation of more than 2.5 million tons of recyclables. The company said its technology can help modernize and change the economics of resource recovery. It has three full-scale facilities and more than 400 AI systems deployed across North America, Asia, and Europe.

From sortation to AMP ONE

AMP Robotics said its AI uses deep learning to continuously train itself by processing millions of material images into data. The software uses pattern recognition of colors, textures, shapes, sizes, and logos to identify recyclables and contaminants in real time, enabling new offtake chemistries and capabilities, it added.
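
In outline, that kind of real-time identification is a standard vision-inference loop: each camera frame passes through a trained network, and the predicted material class drives the sorting hardware. The generic sketch below uses a stand-in PyTorch classifier; it is not AMP's code, and the model and labels are illustrative.

```python
# Generic real-time sortation loop (not AMP's code): classify each frame
# from the line camera and act on recognized recyclables.
import torch
import torchvision.models as models

# Stand-in detector: a ResNet-18 backbone with a 4-class head. A real
# system would use a model trained on millions of material images.
model = models.resnet18(num_classes=4).eval()
LABELS = ["PET bottle", "HDPE jug", "aluminum can", "contaminant"]

def sort_frame(frame: torch.Tensor) -> str:
    """Classify one normalized 3x224x224 camera frame and return its label."""
    with torch.no_grad():
        logits = model(frame.unsqueeze(0))   # add a batch dimension
    return LABELS[int(logits.argmax(dim=1))]

# Example: a random tensor stands in for a real line-camera image.
print(sort_frame(torch.rand(3, 224, 224)))
```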

The company noted that its first products were a series of sorting robots deployed with minimal retrofit into existing recycling facilities. AMP then developed facilities that it claimed involve almost no manual sorting, are reliable, and provide “pervasive data.”

“These facilities make the recovery of commodities safer and more cost-effective than ever and have grown to encompass MSW sorting, an offering out of reach to the industry prior to the advent of AMP’s technology,” it said. “AMP ONE provides a full-scale facility solution to sort various material streams and capture more of the billions of dollars in value otherwise lost to landfills or incinerated annually.”




AMP Robotics marks recent deployments, new CEO

Recycling and Disposal Solutions demonstrated AMP ONE’s ability to cost-effectively sort MSW at its facility in Portsmouth, Va. The system has processed 150 tons per day of local waste with more than 90% uptime, said the company.

Last month, AMP Robotics entered into an agreement with Waste Connections Inc. to equip and operate one of Waste Connections’ single-stream recycling facilities in Colorado. 

“AMP provides meaningfully lower-cost, higher-performance systems to recover commodities and increase landfill diversion, and we’re uniquely positioned to reshape the waste and recycling landscape at a critical time,” said Tim Stuart, CEO of AMP. “We’re grateful to our longstanding and newest investors for their support in helping us chart a new path for sustainable materials management and resource efficiency.”

AMP last month augmented its leadership team with the appointment of Stuart, former chief operating officer for Republic Services Inc. Horowitz transitioned from CEO into the role of chief technology officer.

Congruent Ventures leads round

Congruent Ventures led AMP Robotics’ Series D round. Current and new investors participated, including Sequoia Capital, XN, Blue Earth Capital, Liberty Mutual Investments, the California State Teachers’ Retirement System (CalSTRS), Wellington Management, Range Ventures, and Tao Capital Partners.

“AMP’s AI sortation systems enable consumers to recycle both with and without curbside separation and communities to benefit from the recovery of recycled commodities while reducing dependence on landfills,” added Abe Yokell, co-founder and managing partner of Congruent Ventures. “AMP is an example of the real-world impacts of AI; solutions like AMP’s will divert billions of tons of recyclable material from landfills while reducing emissions.”

Congruent Ventures is a leading early-stage venture firm focused on partnering with entrepreneurs to build companies addressing climate and sustainability challenges. The firm has more than $1 billion in assets under management across early-stage climate tech funds and 59 companies in its portfolio.

Project CETI uses AI and robotics to track down sperm whales
https://www.therobotreport.com/project-ceti-uses-ai-and-robotics-to-track-down-sperm-whales/ | Tue, 03 Dec 2024
Project CETI researchers developed the AVATARS framework to make the most of the small amount of time sperm whales spend on the surface.


Sperm whales spend, on average, 10 minutes of every hour on the surface, presenting challenges for researchers studying them. | Source: Amanda Cotton/Project CETI

In the chilly waters off the New England coast, researchers from the Cetacean Translation Initiative, Project CETI, can spend hours searching and waiting for an elusive sperm whale to surface. During the minutes the whales spend above water, the researchers need to gather as much information as possible before the animals dive back beneath the surface for long periods.

With one of the widest global distributions of any marine mammal species, these whales are difficult to track down, and even more difficult to learn from. Project CETI aims to use robotics and artificial intelligence to decode the vocalizing of sperm whales. It recently released research about how it tracks down sperm whales across the wide ocean.

“The ocean and the natural habitat of the whales is this vast place where we don’t have a lot of infrastructure, so it’s hard to build infrastructure that will always be able to observe the whales,” said Stephanie Gil, an assistant professor of Computer Science at the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) and an advisor on the project.

The project brings together some of the world’s leading scientists in biology, linguistics, robotics, and more. The founder of Project CETI, David Gruber, estimated that it’s one of the largest multi-disciplinary research projects active today.

“Project CETI was formed in March 2020, and we’re now over 50 scientists across eight different disciplines,” he said. “I think we’re over 15 institutions, which I believe puts us as one of the most interdisciplinary, large-scale science projects that’s ever been conducted. It’s incredibly rewarding to see so many disciplines working together.”

Project CETI shares latest research

The researchers at the nonprofit organization have developed a reinforcement learning framework that uses autonomous drones to find sperm whales and predict where they will surface. The paper, published in Science Robotics, said it’s possible to predict when and where a whale may surface using various sensor data and predictive models of sperm whale dive behavior.

This new study involved various sensing devices, such as Project CETI aerial drones with very high frequency (VHF) signal sensing capability that use signal phase along with the drone’s motion to emulate an “antenna array in the air” for estimating the direction of pings from CETI’s on-whale tags.

“There are two basic advantages of [VHF signals]. One is that they are really low power, so they can operate for a really, really long time in the field, like months or even years. So, once those small beacons are deployed on the tag, you don’t have to really replace the batteries,” said Ninad Jadhav, a co-author on the paper and a robotics and engineering Ph.D. student at Harvard University.

“The second thing is these signals that these tags transmit, the VHF, are very high-frequency signals,” he added. “They can be detected at really long ranges.”

“That’s a really huge advantage because we never know when the whales will surface or where they will surface, but if they have been tagged before, then you can sense, for example, simple information such as the direction of the signal,” Jadhav told The Robot Report. “You can deploy an algorithm on the robot to detect that, and that gives us an advantage of finding where the whales are on the surface.”
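
The “antenna array in the air” idea can be sketched numerically: as the drone moves, the phase of the tag’s VHF carrier changes with path length, and comparing measured phases against those predicted for candidate bearings yields a direction estimate. The toy example below invents all geometry and signal parameters for illustration; the actual estimator is described in the team’s Science Robotics paper.

```python
# Toy synthetic-aperture bearing estimate: a moving receiver compares
# measured carrier phases against phases predicted for candidate bearings.
# All geometry and signal parameters are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)
WAVELENGTH = 2.0                     # meters, roughly VHF band (~150 MHz)
true_bearing = np.deg2rad(40.0)      # direction to the tag (unknown in practice)

# Drone positions along a short, straight flight leg: the "array in the air."
xs = np.linspace(0.0, 10.0, 50)
positions = np.stack([xs, np.zeros_like(xs)], axis=1)

def predicted_phases(bearing: float) -> np.ndarray:
    """Far-field model: phase grows with position projected onto the bearing."""
    direction = np.array([np.cos(bearing), np.sin(bearing)])
    return 2.0 * np.pi * (positions @ direction) / WAVELENGTH

measured = predicted_phases(true_bearing) + rng.normal(0.0, 0.3, len(xs))

# Grid search: the candidate whose predictions align best coherently wins.
candidates = np.deg2rad(np.arange(0.0, 180.0, 0.5))
scores = [abs(np.exp(1j * (measured - predicted_phases(b))).sum())
          for b in candidates]
best = candidates[int(np.argmax(scores))]
print(f"Estimated bearing: {np.rad2deg(best):.1f} degrees")  # expect ~40
```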

Sperm whales present unique challenges for data collection

From left to right: Stephanie Gil, Sushmita Bhattacharya, and Ninad Jadhav. | Source: Stu Rosner

“Sperm whales are only on the surface for about 10 minutes every hour,” said Gil. “Other than that, they’re diving pretty deep in the ocean, so it’s hard to access information about what the whales are actually doing. That makes them somewhat elusive for us and for science.”

“Even we humans have certain patterns day to day. But if you’re actually out observing whales on a particular day, their behavior is not going to exactly align with the models, no matter how much data you’re using to make those models right. So it’s very difficult to really predict with precision when they might be coming up,” she continued.

“You can imagine, if [the scientists are] out on the water for days and days, only having a few encounters with the whales, we’re not being that efficient. So this is to increase our efficiency,” Gruber told The Robot Report.

Once the Project CETI researchers can track down the whales, they must gather as much information as possible during the short windows of time sperm whales spend on the surface.

“Underwater data collection is quite challenging,” said Sushmita Bhattacharya, a co-author on the paper and a computer science and robotics Ph.D. student at Harvard University. “So, what is easier than underwater data collection is to have data collected when they’re at the surface. We can leverage drones or shallow hydrophones and collect as much data as possible.”




Developing the AVATARS framework

At the center of the research is the Autonomous Vehicles for Whale Tracking And Rendezvous by Remote Sensing, or AVATARS framework. AVATARS is the first co-development of VHF sensing and reinforcement learning decision-making for maximizing the rendezvous of robots and whales at sea.

“We tried to build up a model which would kind of mimic [sperm whale] behavior,” Bhattacharya said of AVATARS. “We do this based on the current information that we gather from the sparse data set.”

Being able to predict when and where the whales will surface allowed the researchers to design algorithms for the most efficient route for a drone to rendezvous with—or encounter—a whale at the surface. Designing these algorithms was challenging on many levels, the researchers said.

“Probably the hardest thing is the fact that it is such an uncertain problem. We don’t have certainty at all in [the whales’] positions when they’re underwater, because you can’t track them with GPS when they’re underwater,” Gil said. “You have to think of other ways of trying to track them, for example, by using their acoustic signals and an angle of arrival to their acoustic signals that give you a rough idea of where they are.”

“Ultimately, these algorithms are routing algorithms. So you’re trying to route a team of robots to be at a particular location in the environment, in the world, at a certain given time when it’s necessary to be there,” she told The Robot Report. “So this is analogous to something like rideshare.”
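
That rideshare analogy suggests a simple baseline: given predicted surfacing events with times, places, and confidences, send each drone to the reachable event with the highest confidence. The sketch below is such a deliberately simplified greedy baseline, not the reinforcement-learning policy from the paper.

```python
# Greedy rendezvous baseline: send each drone to the reachable predicted
# surfacing event with the highest confidence. A deliberately simplified
# illustration, not the AVATARS reinforcement-learning policy.
from dataclasses import dataclass
import math

@dataclass
class Surfacing:
    x: float        # predicted easting of the surfacing, meters
    y: float        # predicted northing of the surfacing, meters
    t: float        # predicted surfacing time, seconds from now
    prob: float     # confidence in the prediction

@dataclass
class Drone:
    x: float
    y: float
    speed: float    # cruise speed, m/s

def assign(drones: list[Drone], events: list[Surfacing]) -> dict[int, Surfacing]:
    """Map drone index -> event, greedily favoring high-confidence events."""
    plan: dict[int, Surfacing] = {}
    unclaimed = list(events)
    for i, d in enumerate(drones):
        reachable = [e for e in unclaimed
                     if math.hypot(e.x - d.x, e.y - d.y) / d.speed <= e.t]
        if reachable:
            best = max(reachable, key=lambda e: e.prob)
            plan[i] = best
            unclaimed.remove(best)
    return plan

drones = [Drone(0, 0, 15.0), Drone(500, 0, 15.0)]
events = [Surfacing(300, 400, 60, 0.7), Surfacing(900, 100, 45, 0.4)]
print(assign(drones, events))   # drone 0 -> first event, drone 1 -> second
```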

Before bringing the algorithms into the real world with real whales, the team tested them in a controlled environment with devices the team put together to mimic whales.

“We mimicked the whale using an engineered whale,” recalled Bhattacharya. “So basically we used a speed boat, and it had a loud engine. We used that engine noise to mimic the whale vocalization, and we had it move to mimic whale motion. And then we used that as our ground test.”

Project CETI tests AVATARS in the real world

A customized off-the-shelf drone flying to deploy a whale tag developed by Project CETI researchers. | Source: Project CETI

“Every day was a challenge when we were out on the boat, because this was for me, and my co-author Sushmita, the first time we were deploying real autonomous robots from a boat in the middle of the sea trying to collect some information,” Jadhav said.

“One of the major challenges of working in this environment was the noise in the sensor,” he continued. “As opposed to running experiments in the lab environment, which is more controlled, there are fewer sources of noise that impact your experiments or your sensor data.”

“The other key challenge was deploying the drone itself from the boat,” noted Jadhav. “I remember one instance where this was probably the first or second day of the second expedition that we went on last November, and I had the drone ready. It had the payload. It was waterproof.”

“I had already run experiments here in Boston locally, where I had an estimate of how long the drone would fly with the payload. And then we were out on the boat running some initial tests, and the drone took off,” he said. “It was fine, it was doing its thing, and within a minute of it collecting data, there was a sudden gust of wind. The drone just lost control and crashed in the water.”

The team also had to try to predict and react to whale behavior when performing field tests.

“Our algorithm was designed to handle sensor data from a single whale, but what we ended up seeing is that there were four whales together, who were socializing,” Jadhav said. “They were diving and then surfacing at the same time. So, this was tricky, because then it becomes really hard for us on the algorithm side to understand which whale is sending which acoustic signal and which one we are tracking.”

Team tries to gather data without disturbing wildlife

While Project CETI works closely with sperm whales and other sea life that might be around when the whales surface, it aims to leave the whales undisturbed during data collection.

“The main concern that we care about is that even if we fail, we should not harm the whales,” Bhattacharya said. “So we have to be very careful about respecting the boundaries of those animals. That’s why we are looking at a rendezvous radius. Our goal is to go near the whale and not land on it.”

“Being minimally invasive and invisible is a key part of Project CETI,” said Gruber. “[We’re interested in] how to collect this information without interacting directly with the whale.”

This is why the team works mostly with drones that won’t disturb sea life and with specially developed tags that latch onto the whales and collect data. The CETI team eventually collects these tags, and the valuable data they contain, after they fall off the whales.

“A lot of times, people might think of robotics and autonomy as a scary thing, but this is a really important project to showcase that robots can be used to extend the reach of humans and help us understand our world better,” Gil told The Robot Report.

Project CETI aims to decode whale communications

This latest research is just one step in Project CETI’s overarching goal to decode sperm whale vocalizations. In the short term, the organization plans to ramp up data collection, which will be crucial for the project’s long-term goals.

“Once we have all the algorithms worked out, a future outlook is one where we might have, for example, drone ports in the sea that can deploy robots with sensors around the clock to observe whales when they’re available for observation,” Gil said.

“We envision a team of drones that will essentially meet or visit the whales at the right place, at the right time,” Jadhav said. “So whenever the whales surface, you essentially have a kind of autonomous drone, or autonomous robot, very close to the whale to collect information such as visual information or even acoustic if the drone is equipped with that.”

Outside of Project CETI, organizations could use AVATARS to further protect sperm whales in their natural environments. For example, this information could be used to reroute ships away from sperm whale hot spots, reducing the odds of a ship colliding with a pod of sperm whales.

“The idea is that if we understand more about the whales, more about the whale communities, more about their social structures, then this will also enable and motivate conservation projects and understanding of marine life and how it needs to be protected,” Gil said.

In addition, the researchers said they could apply these methods to other sea mammals that vocalize.

“Here at Project CETI, we’re concerned about sperm whales, but I think this can be generalized to other marine mammals, because a lot of marine mammals vocalize, including humpback whales, other types of whales, and dolphins,” Bhattacharya said.

AWS offers accelerated robotics simulation with NVIDIA
https://www.therobotreport.com/aws-offers-accelerated-robotics-simulation-nvidia/ | Tue, 03 Dec 2024
AWS and NVIDIA said that Isaac Sim on Amazon Web Services can significantly accelerate and scale robot simulation and AI training.


AWS and Isaac Sim can help accelerate robotics development, says NVIDIA.

NVIDIA Corp. today announced at AWS re:Invent enhanced tools for robotics developers, as well as the availability of NVIDIA DGX Cloud on Amazon Web Services and offerings for artificial intelligence and quantum computing.

The company said that NVIDIA Isaac Sim is now available on NVIDIA L40S graphics processing units (GPUs) in Amazon Elastic Compute Cloud (EC2) G6e instances. It said this could double performance for scaling robotics simulation and accelerate AI model training. Isaac Sim is a reference application built on NVIDIA Omniverse for developers to simulate and test AI-driven robots in physically based virtual environments.

With NVIDIA OSMO, a cloud-native orchestration platform, developers can easily manage their complex robotics workflows across their AWS computing infrastructure, claimed the company.

“This combination of NVIDIA-accelerated hardware and software — available on the cloud — allows teams of any size to scale their physical AI workflows,” wrote Akhil Docca, senior product marketing manager for Omniverse at NVIDIA.




What is ‘physical AI’?

According to NVIDIA, “physical AI” describes AI models that can understand and interact with the physical world. The company said it “embodies the next wave of autonomous machines,” such as self-driving cars, industrial manipulators, mobile robots, humanoids, and even robot-run infrastructure like factories and warehouses.

With physical AI, developers are embracing a “three-computer solution” for training, simulation, and inference to make breakthroughs, NVIDIA said. Yet physical AI for robotics systems requires robust training datasets to achieve precision inference in deployment. Developing such datasets and testing them in real situations can be impractical and costly.

Simulation offers an answer, as it can accelerate the training, testing and deployment of AI-driven robots, the company asserted.

L40S GPUs in the cloud offer to scale simulation, training

Developers can use simulation to verify, validate, and optimize robot designs as well as the systems and their algorithms before deployment, said NVIDIA. It added that simulation can optimize facility and system designs before construction or remodeling starts for maximum efficiencies, reducing costly manufacturing change orders.

Amazon EC2 G6e instances accelerated by NVIDIA L40S GPUs can double performance over the prior architecture, while allowing the flexibility to scale as scene and simulation complexity grows, NVIDIA said. Roboticists can use these instances to train many computer vision models that power AI-driven robots.

This means the same instances can be extended for various tasks, from data generation and simulation to model training. NVIDIA added that OSMO allows teams to orchestrate and scale complex robotics development workflows across distributed computing resources, whether on premises or in the AWS cloud.

NVIDIA said Isaac Sim can foster collaboration and critical workflows, such as generating synthetic data for perception model training.

A reference workflow combines NVIDIA Omniverse Replicator, a framework for building custom synthetic data generation (SDG) pipelines and a core extension of Isaac Sim, with NVIDIA NIM microservices. With it, developers can build generative AI-enabled SDG pipelines, it said.

These include the USD Code NIM microservice for generating Python USD code and answering OpenUSD queries, plus the USD Search NIM microservice for exploring OpenUSD assets using natural language or image inputs.

The Edify 360 HDRi NIM microservice can generate 360-degree environment maps, while the Edify 3D NIM microservice can create ready-to-edit 3D assets from text or image prompts. Generative AI can thus ease the synthetic data generation process by reducing many tedious and manual steps, from asset creation to image augmentation, said NVIDIA.
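
For a concrete picture of what an SDG pipeline looks like, here is a minimal domain-randomization sketch in the style of Omniverse Replicator’s documented Python API. It is illustrative only: call signatures vary across Isaac Sim releases, and the NIM-microservice steps described above are omitted.

```python
# Minimal synthetic-data sketch in the style of Omniverse Replicator.
# Requires Isaac Sim with the omni.replicator.core extension; exact call
# signatures vary by release, so treat this as illustrative, not canonical.
import omni.replicator.core as rep

with rep.new_layer():
    # A camera plus a render product define what gets captured each frame.
    camera = rep.create.camera(position=(0, 0, 5))
    render_product = rep.create.render_product(camera, (1024, 1024))

    # A simple scene: a ground plane and labeled boxes to randomize.
    plane = rep.create.plane(scale=10)
    boxes = rep.create.cube(count=8, semantics=[("class", "box")])

    # Randomize box poses on every captured frame.
    with rep.trigger.on_frame(num_frames=200):
        with boxes:
            rep.modify.pose(
                position=rep.distribution.uniform((-2, -2, 0.1), (2, 2, 0.5)),
                rotation=rep.distribution.uniform((0, 0, 0), (0, 0, 360)),
            )

    # Write RGB images plus 2D bounding boxes for perception training.
    writer = rep.WriterRegistry.get("BasicWriter")
    writer.initialize(output_dir="_out_sdg", rgb=True,
                      bounding_box_2d_tight=True)
    writer.attach([render_product])
```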

  • Rendered.ai’s synthetic data engineering platform is integrated with Omniverse Replicator. It enables companies to generate synthetic data for computer vision models used in industries from security and intelligence to manufacturing and agriculture.
  • SoftServe Inc., an IT consulting and digital services provider, uses Isaac Sim to generate synthetic data and validate robots used in vertical farming with Pfeifer & Langen, a leading European food producer.
  • Tata Consultancy Services is building custom synthetic data generation pipelines to power its Mobility AI suite to address automotive and autonomous use cases by simulating real-world scenarios. Its applications include defect detection, end-of-line quality inspection, and hazard avoidance.

NVIDIA, AWS help robots learn in simulation

While Isaac Sim enables developers to test and validate robots in physically accurate simulation, Isaac Lab, an open-source robot learning framework built on Isaac Sim, provides a virtual playground for building robot policies that can run on AWS Batch. Because these simulations are repeatable, developers can troubleshoot and reduce the number of cycles required for validation and testing, said NVIDIA.
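
Running Isaac Lab on AWS Batch amounts to submitting a containerized training job. The sketch below uses boto3’s Batch client, which is a real API, but the queue, job definition, task name, and script arguments are placeholders you would replace with your own setup.

```python
# Submit a containerized Isaac Lab training run to AWS Batch with boto3.
# The queue, job definition, task, and script names below are placeholder
# assumptions for illustration, not values from NVIDIA or AWS docs.
import boto3

batch = boto3.client("batch", region_name="us-east-1")

response = batch.submit_job(
    jobName="isaaclab-train-reach-policy",
    jobQueue="gpu-l40s-queue",           # a queue backed by GPU instances
    jobDefinition="isaaclab-train:3",    # container image with Isaac Lab
    containerOverrides={
        "command": [
            "python", "train.py",
            "--task", "Isaac-Reach-Franka-v0",
            "--headless",
        ],
        "resourceRequirements": [
            {"type": "GPU", "value": "1"},
        ],
    },
)
print("Submitted job:", response["jobId"])
```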

The company cited robotics startups that are already using Isaac Sim on AWS: 

  • Field AI is building robot foundation models to enable robots to autonomously manage a wide range of industrial processes. It uses Isaac Sim and Isaac Lab to evaluate the performance of these models in complex, unstructured environments in construction, manufacturing, oil and gas, mining, and more.
  • Vention, which offers a full-stack cloud-based automation platform, is creating pretrained skills to ease development of robotic tasks, noted NVIDIA. It is using Isaac Sim to develop and test new capabilities for robot cells used by small to midsize manufacturers.
  • Cobot offers Proxie, its AI-powered collaborative mobile manipulator. It uses Isaac Sim to enable the robot to adapt to dynamic environments, work alongside people, and streamline logistics in warehouses, hospitals, airports, and more.
  • Standard Bots is simulating and validating the performance of its R01 robot used in manufacturing and machining setup.
  • Swiss-Mile is using Isaac Sim and Isaac Lab for robot learning so that its wheeled quadruped robots can perform tasks autonomously with new levels of efficiency in factories and warehouses.
  • Cohesive Robotics has integrated Isaac Sim into its software framework called Argus OS for developing and deploying robotic workcells used in high-mix manufacturing environments.
  • Aescape’s robots are able to provide precision-tailored massages by accurately modeling and tuning the onboard sensors in Isaac Sim.

NVIDIA made other announcements in addition to the availability of Isaac Sim 4.2 on Amazon EC2 G6e Instances powered by NVIDIA L40S GPUs on AWS Marketplace.

It said that NVIDIA DGX Cloud can run on AWS for training AI models; that AWS liquid cooling is available for data centers using its Blackwell platform; and that NVIDIA BioNeMo NIM microservices and AI Blueprints, developed to advance drug discovery, are now integrated into AWS HealthOmics.

The company also said that its latest AI Blueprints are available on AWS for video search and cybersecurity, that NVIDIA CUDA-Q is integrated with Amazon Braket for quantum computing development, and that RAPIDS Quick Start Notebooks are available on Amazon EMR.

Clearpath Robotics discusses development of Husky A300 ground vehicle
https://www.therobotreport.com/a300-clearpath-robotics-discusses-development/ | Tue, 03 Dec 2024
The Husky A300 uncrewed ground vehicle from Clearpath includes features for both expert robot developers and non-expert users.


The Husky A300 is designed to be tougher and have longer endurance than the A200. Source: Clearpath Robotics

Developers of robots for indoor or outdoor use have a new platform to build on. In October, Clearpath Robotics Inc. released the Husky A300, the latest version of its flagship mobile robot for research and development. The Waterloo, Ontario-based company said it has improved the system’s speed, weather resistance, payload capacity, and runtime.

“Husky A200 has been on the market for over 10 years,” said Robbie Edwards, director of technology at Clearpath Robotics. “We have lots of experience figuring out what people want. We’ve had different configurations, upgrades, batteries and chargers, computers, and motors.”

“We’ve also had different configurations of the internal chassis and ingress protection, as well as custom payloads,” he told The Robot Report. “A lot of that functionality that you had to pay to add on is now stock.”

Husky A300 hardware is rugged, faster

The Husky A300 includes a high-torque drivetrain with four brushless motors that enable speeds of up to 2 m/s (4.4 mph), twice as fast as the previous version. It can carry payloads up to 100 kg (220.4 lb.) and has a runtime of up to 12 hours, said Clearpath Robotics.

The company, which Rockwell Automation acquired last year, noted that the platform can integrate third-party components and accessories including depth cameras, directional lidar, dual-antenna GPS, and manipulators. Husky A300 has an IP54 rating against dust and water and can withstand industrial environments or extreme temperatures outdoors, it said. 

“Before, the Husky was configured on a bespoke basis,” said Edwards. “Now we’re off at a more competitive price, which is great for our customers, and it now comes off our production line instead of our integration line.”

Founded in 2009, the company has tested its hardware and software near its office in a wide range of weather conditions.

Clearpath’s integration with Rockwell has gone smoothly so far, with Rockwell’s procurement team easing access to components and manufacturing, said Edwards. He observed that some of Rockwell’s customers in mining or other industrial automation could find new use cases in time.

The Husky A300 can withstand dust and temperature variances. Source: Clearpath Robotics

Clearpath includes ROS 2 support with A300

Husky A300 ships with Robot Operating System (ROS) 2 Jazzy plus demonstrations of Nav2, MoveIt 2, and other developer utilities.

“Over the past two years, there was a big push to get all Clearpath products to ROS 2 Humble because its configuration management system made life easier for our integration team and customers,” recalled Edwards. “We also provide support for simulation, and URDF [Unified Robot Description Format] is configured.”

Many of Clearpath’s R&D customers were familiar with ROS, C++, and Python, so it offered visualization and simulation tools in addition to the ROS stack, he added. However, as the company got non-expert customers, it wanted to enable them to also work with Husky.

“Academics who aren’t roboticists but want to do data collection can now do so with a simple Python interface, without learning ROS,” Edwards said. “We’ve maintained a level of flexibility with integrating different payloads and compute options while still giving a pretty good price point and usability.”
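
For developers who do use ROS 2, driving a Husky-class base takes only a small rclpy node publishing velocity commands. The sketch below assumes a `cmd_vel` Twist topic; Clearpath platforms typically namespace their topics, so the exact name depends on the robot’s configuration.

```python
# Minimal ROS 2 (rclpy) node that drives a Husky-class base forward.
# The topic name is an assumption; check your robot's configuration,
# since Clearpath platforms often namespace cmd_vel per robot.
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import Twist


class SimpleDriver(Node):
    def __init__(self):
        super().__init__("simple_driver")
        self.pub = self.create_publisher(Twist, "cmd_vel", 10)
        self.timer = self.create_timer(0.1, self.send_cmd)  # 10 Hz

    def send_cmd(self):
        msg = Twist()
        msg.linear.x = 0.5   # m/s forward, well under the 2 m/s maximum
        msg.angular.z = 0.0  # no rotation
        self.pub.publish(msg)


def main():
    rclpy.init()
    node = SimpleDriver()
    try:
        rclpy.spin(node)
    finally:
        node.destroy_node()
        rclpy.shutdown()


if __name__ == "__main__":
    main()
```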




Husky AMP a ‘turnkey’ option

Clearpath Robotics is offering a “turnkey” version of the robot dubbed Husky AMP, or autonomous mobile platform. It comes with a sensor suite for navigation, pre-installed and configured OutdoorNav software, a Web-based user interface, and an optional wireless charging dock.

“Robotics developers can easily integrate payloads onto the mounting deck, carry out a simple software integration through the OutdoorNav interface, and get their system working in the field faster and more efficiently,” said Clearpath.

“We’ve lowered the barrier to entry by providing all software function calls and a navigation stack,” Edwards asserted. “The RTK [real-time kinematic positioning] GPS is augmented with sensor fusion, including wheel odometry, and visual and lidar sensors.”

“With a waypoint following system, the robotics stack does the path planning, which is constrained and well-tested,” he said. “Non-roboticists can use Husky A300 as a ground drone.”
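
In practice, a “ground drone” mission reduces to an ordered list of GPS waypoints handed to the navigation stack. The sketch below is hypothetical: the client object and its goto/wait/hold calls are invented for illustration, since this article does not document OutdoorNav’s actual function calls.

```python
# Hypothetical waypoint mission for an autonomous ground vehicle. The
# client object and method names are invented for illustration; consult
# Clearpath's OutdoorNav documentation for the real interface.
from dataclasses import dataclass

@dataclass
class Waypoint:
    lat: float
    lon: float
    hold_s: float = 0.0   # pause at the waypoint, e.g. for data collection

MISSION = [
    Waypoint(43.4643, -80.5204),             # field entrance
    Waypoint(43.4651, -80.5190, hold_s=30),  # sampling site, pause 30 s
    Waypoint(43.4643, -80.5204),             # return home
]

def run_mission(nav_client, mission):
    """Send each waypoint and block until the platform reports arrival."""
    for wp in mission:
        nav_client.goto(wp.lat, wp.lon)   # invented call
        nav_client.wait_until_arrived()   # invented call
        if wp.hold_s:
            nav_client.hold(wp.hold_s)    # invented call
```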

More robot enhancements, use cases to come

Clearpath Robotics is considering variant drive trains for the Husky A300, such as tracks for softer terrain as in agriculture, said Edwards.

“Husky is a general-purpose platform,” he said. “We’re serving outdoors developers rather than end users directly, but there’s a lot of demand for larger, high-endurance materials transport.”

For the A300, the company surveyed its client base, which came back with 150 use cases.

“I’ve seen lots of cool stuff — robots herding animals, helping to grow plants, working in mines, participating in the DARPA Subterranean Challenge in fleets of Husky and [Boston Dynamics’] Spot,” Edwards said. “Husky Observer conducts inspections of sites such as solar farms.”

“The benefits for industrial users also help researchers,” he said. “Making the robot cheaper to deploy for faster time to value also means better battery life, weatherproofing, and integrations.”

Edwards added that Clearpath has received a lot of interest in mobile manipulation with its Ridgeback omnidirectional platform.

“This trend is finding its way outdoors as well,” he said. “On the application engineering side, developers have put two large Universal Robots arms on our Warthog UGV [uncrewed ground vehicle] for things like changing tires.”

The Husky A300 can carry different sensor payloads or robotic arms. Source: Clearpath Robotics

Top 10 robotics developments of November 2024
https://www.therobotreport.com/november-2024-top-10-robotics-developments/ | Mon, 02 Dec 2024
In November 2024, stories about the future of robotics, big robot milestones, and new product unveilings grabbed our readers’ attention.

The start of the holiday season hasn’t slowed down the robotics industry. In November 2024, stories about the future of robotics, big robot milestones, and new product unveilings grabbed our readers’ attention.

Here are the top 10 most popular stories on The Robot Report in the past month. Subscribe to The Robot Report Newsletter and listen to The Robot Report Podcast to stay up to date on the robotics developments you need to know about.


10. Europe has a key role to play in the development of robots, humanoids

While headlines often spotlight U.S. and Asian companies in the humanoid robotics race, startups in the tech hubs of Europe are making strides in developing human-like robots. From Norway to Switzerland, innovative European firms are pushing the boundaries of robotics technology, creating machines that can sense, feel, and interact with their environments in increasingly human-like ways. Read more.


9. Moxi reaches milestone of 100,000 autonomous elevator rides in hospitals

As development continues on humanoid robots, one mobile robot is already at work in hospitals. Diligent Robotics announced that its Moxi robot has completed 110,000 autonomous elevator rides at health systems across the U.S. The mobile manipulator has a single arm for opening doors and pushing buttons to operate elevators. Read more.


8. AeroVironment acquiring BlueHalo for $4.1B to boost defense tech

Defense contractor AeroVironment has agreed to acquire BlueHalo in an all-stock transaction worth approximately $4.1 billion. BlueHalo is best known for its drone swarm and counter-drone technology. The acquisition, which has been approved by both companies’ boards of directors, is expected to close in the first half of 2025. Read more.


7. Kassow Robots’ new cobots designed for mobile manipulation

Kassow Robots in November 2024 introduced a new line of compact collaborative robots designed to integrate with mobile robots. The new Edge Edition cobots are smaller robot arms designed for mobile manipulation applications. They feature a direct DC connection from battery power, enabling them to operate while mounted to a mobile robot. Read more.


6. Collaborative Robotics unveils Proxie mobile manipulator

Collaborative Robotics Inc. unveiled its Proxie mobile manipulator publicly for the first time. The startup had been secretive about the design of the robot since Brad Porter founded the company in 2022. Porter has hinted at the design of the robot by alluding to the importance of a mobile manipulator for applications within the warehouse, with a kinematic design that could be better suited for warehouse workflows than a humanoid. Read more.


5. Physical Intelligence raises $400M for foundation models for robotics

Foundation models promise to give robots the ability to generalize actions from fewer examples than traditional artificial intelligence approaches. Physical Intelligence said it has raised $400 million to continue its development of artificial intelligence for a range of robots. Read more.


4. Schaeffler plans global use of Agility Robotics’ Digit humanoid

Schaeffler AG, a global leader in motion technology, is making a minority investment into Agility Robotics and buying Digit humanoid robots for use across its global plant network. The companies did not disclose the size of the November 2024 investment, the number of humanoids being purchased, or what they will be used for. Read more.


3. Pickle Robot gets orders for over 30 unloading systems plus $50M in funding

Robotic truck unloading fits the classic definition of dull, dirty, or dangerous jobs worth automating. Pickle Robot has raised $50 million in Series B funding and said that six customers placed orders during the third quarter for more than 30 robots to deploy in the first half of 2025. The new orders include pilot conversions, existing customer expansions, and new customer adoption. Read more.


2. Chicago’s South Suburbs see the future of manufacturing as American and robotic

For decades, the Chicagoland area has played a pivotal role in American manufacturing capability. Unfortunately, the once-strong bastion of manufacturing and fabrication has lost much of its fervor following years of economic stagnation, outmigration, and a declining tax base. However, as the global marketplace continues to evolve, U.S. manufacturers must contend with an aging ownership base, greater competition, and a severe labor shortage. Read more.


1. Red Cat wins U.S. Army next-gen drone contract over Skydio

Red Cat Holdings Inc. announced that it won the U.S. Army’s Short Range Reconnaissance, or SRR, program-of-record contract. The company replaced Skydio on this contract. The U.S. Army set an initial acquisition target of 5,880 systems over a five-year period. Read more.

Robotics Summit 2025 opens call for speakers
https://www.therobotreport.com/robotics-summit-2025-call-for-speakers/ | Mon, 02 Dec 2024
The robotics industry’s must-attend technical development event returns to Boston in 2025 with deepened coverage, an expanded expo floor, and more.


The Robot Report invites you to submit a session abstract to be considered for presentation at the Robotics Summit & Expo, which will be held April 30-May 1, 2025 in Boston at the Boston Convention and Exhibition Center. The Robotics Summit is the leading event focused on the technical issues involved with developing commercial robots.

We are seeking technical, educational sessions that help attendees overcome hurdles related to commercial robotics development. Topics can be submitted based on the following tracks:

  • Technologies, Tools and Platforms: Advances in the core technologies common to all classes of commercial robots
  • Design, Development & Manufacturability: Technologies, tools and methodologies to simulate, prototype and build commercial robots
  • Artificial Intelligence: How machine learning, generative AI, foundation models and other AI technologies impact commercial robotics development
  • Healthcare Robotics: Advances in medical robotics, from surgical systems to patient care solutions, key enabling technologies, and emerging applications
  • Automated Warehouse: Advances in warehouse robotics, key enabling technologies, and emerging applications

The event is also looking for technical workshops, robot demos, and off-site technical tours of local robotics organizations or universities.

Submission Form
The entry deadline for submitting speaker proposals is December 4, 2024. To apply to deliver a session at the Robotics Summit & Expo, submit a proposal through the event website.

All speakers receive:

  • Complimentary full conference registration with admission to all keynotes, sessions, panels, networking receptions and special events
  • Complimentary guest registrations for up to two attendees

Co-Located Events
The Robotics Summit & Expo will be co-located with DeviceTalks Boston, the premier industry event for medical technology professionals. Both events attract engineering and business professionals from a broad range of healthcare and medical technology backgrounds.

Sponsorship Opportunities
For information about sponsorship and exhibition opportunities, download the prospectus. Questions regarding sponsorship opportunities should be directed to Colleen Sepich at csepich[AT]wtwhmedia.com.

Conference Programming
For questions regarding Robotics Summit conference programming, contact Steve Crowe at scrowe[AT]wtwhmedia.com.

Jonathan Hurst, co-founder and chief robot officer at Agility Robotics, delivered the opening keynote at Robotics Summit 2024. | Credit: Jeff Pinette

Oxipital AI releases VX2 Vision System for inspection and picking
https://www.therobotreport.com/oxipital-ai-releases-vx2-vision-system-for-inspection-and-picking/ | Fri, 29 Nov 2024
Oxipital AI says its advanced vision system is more compact, delivers greater precision, and is more affordable than its predecessor.


The VX2 Vision System uses AI for food-grade inspection and picking, says Oxipital AI.

Oxipital AI this month launched its VX2 Vision System, which uses artificial intelligence for inspection and high-speed picking applications across food-grade and industrial sectors. Built on the company’s proprietary Visual AI platform, the VX2 comes in a more compact package at a more accessible price than its predecessor.

“At Oxipital AI, we believe that listening to our customers and learning from real-world applications is the key to driving innovation,” said Austin Harvey, vice president of product at Oxipital. “The VX2 is the result of that philosophy in action. It’s smaller, more powerful, and more versatile, enabling our customers to build more resilient manufacturing processes.”

Formerly Soft Robotics, Oxipital is developing machine vision for product inspection and robotic process automation in critical industries such as food processing, agriculture, and consumer goods production.

The Bedford, Mass.-based company’s stated mission is “to deliver actionable insights through deep object understanding to customers as they embrace Industry 5.0 and unlock previously unachievable levels of resiliency, efficiency, and sustainability in their manufacturing operations.”





VX2 Vision System includes several enhancements

Oxipital AI said the VX2 Vision System represents a significant improvement over its first-generation vision platform. The company said it incorporated customer feedback and extensive field learning to meet the evolving needs of the industry.

The VX2 has enhanced capabilities for inspection, high-speed picking, and high-speed picking with inspection, said Oxipital. It asserted that the system ensures optimal efficiency and precision in a wide variety of environments and listed the following benefits:

  • Compact and powerful: The VX2 packs more processing power into a smaller, more efficient design, providing greater flexibility for installations in tight spaces or complex environments, said Oxipital.
  • Versatile application: Designed for food-grade and industrial use, the VX2 excels in inspection tasks, high-speed handling, and combining both, ensuring accuracy and speed in demanding workflows.
  • Enhanced Visual AI platform: Oxipital said its platform delivers faster, more accurate decision-making capabilities, ensuring high-performance, real-time operations.
  • Better price point: Despite significant improvements in power and versatility, the VX2 is available at a more competitive price, said the company. This makes it an attractive option for businesses seeking to upgrade their capabilities without incurring significant costs, it added.

The VX2 Vision System continues Oxipital’s response to user feedback. Source: Oxipital AI

Oxipital AI applies vision to industry needs

With the VX2 launch at PACK EXPO this month, Oxipital said the technology demonstrates its commitment to innovation that addresses the challenges industry currently faces.

“Oxipital AI continues to push the boundaries of what is possible with vision systems in automated environments,” it said. Soft Robotics previously made compliant grippers before pivoting to vision AI.

Oxipital has partnered with Schmalz and Velec, and it was nominated as a PACK EXPO Food and Beverage Technology Excellence Award finalist.

Nuro navigates to fully autonomous driving https://www.therobotreport.com/nuro-navigates-to-fully-autonomous-driving/ https://www.therobotreport.com/nuro-navigates-to-fully-autonomous-driving/#respond Wed, 27 Nov 2024 18:30:36 +0000 https://www.therobotreport.com/?p=581779 Andrew Clare, CTO of Nuro, discusses the current state of self-driving vehicles and software and the road to Level 5 autonomy.


In Episode 174 of The Robot Report Podcast, we feature an interview with Andrew Clare, chief technology officer of Nuro Inc. It’s a short workweek in the U.S. with the Thanksgiving holiday, so we skipped the news this week and went straight into the interview with Clare.

He discusses the company's evolution in the autonomous vehicle space, focusing on its Nuro Driver technology. Clare elaborates on Nuro's expansion of its business model to include partnerships with automotive OEMs and the potential market for AI-based driving.

Clare also highlights the challenges of urban versus highway driving, the importance of safety culture, and the technology stack required for autonomous vehicles. We also touch on the differences between SAE Level 4 and Level 5 autonomy, as well as the future direction of Nuro in integrating hardware and software.

Show timeline

  • 8:12 – Interview with Andrew Clare



News of the week

We’ll discuss the latest news after the Thanksgiving holiday.


2025 RBR50 Robotics Innovation Awards open for nominations

You can now submit nominations for the 2025 RBR50 innovation awards. They will recognize technology and business innovations in the calendar year 2024, and the awards are open to any company worldwide that produces robotics or automation.

The categories include:

  1. Technologies, products, and services: This category includes primary or applied research focusing on robotics and supporting technologies such as motion control, vision, or machine learning. It also includes new products and business, engineering, or technology services.
  2. Business and management: This category covers initiatives positioning a company as a market leader or an organization as an important thought leader in the robotics ecosystem. Significant mergers and acquisitions are relevant, as are supplier, partner, and integrator relationships.
  3. Applications and markets: The RBR50 will also recognize innovations that improve productivity, quality, and cost-effectiveness, as well as those that automate new tasks.

In addition, the 2025 RBR50 awards will celebrate the following:

  • Startup of the Year
  • Application of the Year
  • Robot of the Year
  • Robots for Good Award

The deadline for submissions is Friday, Dec. 20, 2024.


Podcast sponsored by RGo Robotics

The show this week is sponsored by RGo Robotics Inc.

Is your autonomous mobile robot (AMR) struggling in dynamic environments? Is your business stuck because it takes months to commission a new site?

RGo Robotics' Perception Engine is revolutionizing the AMR business through advanced Vision AI perception technology. Unlike traditional solutions, the company's software enables AMRs to adapt to changing environments and navigate complex spaces with unprecedented accuracy, and it makes the commissioning process shorter and simpler.

Leading AMR companies are enhancing their fleets with RGo’s AI-powered perception, enabling their teams to accelerate use of advanced AI capabilities like foundation models and digital twins.

Don’t let outdated navigation hold your business back.

To learn more about RGo’s solutions, go to: https://www.rgorobotics.ai/


 

Learn about digitalization in the warehouse in new webinar https://www.therobotreport.com/learn-about-digitalization-in-the-warehouse-in-webinar/ https://www.therobotreport.com/learn-about-digitalization-in-the-warehouse-in-webinar/#comments Wed, 27 Nov 2024 14:30:49 +0000 https://www.therobotreport.com/?p=581774 Digitalization of the warehouse involves several emerging technologies; attendees of this free webinar can learn from industry experts.

Digitalization is bringing emerging technologies into the warehouse. Source: Dexory

Designing and deploying a digital warehouse can be a challenge, with numerous technology options to add to your operations. From robotics and automation to the latest data analytics and artificial intelligence, how can you take advantage of digitalization?

At 2:00 p.m. EST on Wednesday, Dec. 4, expert panelists will discuss how emerging technologies are changing how engineers design warehouse systems and how businesses can gain insights and efficiencies with them. Sensors, digital twins, wearables, and virtual assistants are some of the tools that are part of this digital transformation.

In this free webinar, viewers can learn about:

  • Ways to improve labor productivity with workforce management
  • The orchestration of people and autonomous mobile robots (AMRs) for order picking and fulfillment
  • Where augmented and virtual reality (AR/VR) fit in the warehouse
  • How AI will change how operators use data in a positive feedback cycle
  • How to scale digital transformation across facilities and the supply chain

Register now to attend this webinar on digitalization, and have your questions answered live. Registrants will be able to view it on demand after the broadcast date.

Digitalization speakers to share insights

Robert C. Kennedy is principal at RC Kennedy Consulting. For over four decades, he has planned, developed, and implemented industry-leading supply chain execution systems around the globe. Kennedy and his staff have led more than 200 large-scale implementation projects of supply chain execution software for leading customers in a variety of industries, including pharmaceutical, electronics, third-party logistics (3PL), and food and beverage.

A leading voice in the field, Kennedy is regularly interviewed by industry media, has published numerous articles, and has presented at many trade shows and seminars.

RC Kennedy Consulting provides assistance to companies to improve operational efficiencies through process design and systems. It also helps them develop strategies for growth.

Ken Ramoutar is chief marketing officer at Lucas Systems, which helps companies transform their distribution center by dramatically increasing worker productivity, operational agility, and customer and worker satisfaction using voice and AI optimization technologies.

In his 25 years of customer centric roles in supply chain software and consulting, Ramoutar has navigated companies through uncertainty and volatility as a thought leader and change agent.

Prior to Lucas, Ramoutar was senior vice president and global head of customer experience at Avanade, a $3 billion Accenture- and Microsoft-owned company, and he has held leadership roles at IBM, Sterling Commerce, and SAP/Ariba.

Michael Taylor is the chief product officer and co-founder of Duality AI. He has a 20-year career in mobile robotics, with 15 years dedicated to building autonomous field robots at Caterpillar.

While there, Taylor led the team developing the autonomy system for Caterpillar's autonomous dozer, and he helped launch the Autonomous Mining Truck program. His roles included architecting behaviors and planning systems, as well as building a collection of simulation technologies to accelerate deployment to customer sites.

Taylor was also part of the Carnegie Mellon team that won DARPA’s Urban Challenge, where he led both the Controls Team and the Field Calibration Team. Taylor holds dozens of patents in fields ranging from robotics to simulation technologies.

At Duality AI, Taylor leads the company's Product and Solutions Engineering team. He is responsible for steering Duality's product strategy, developing technologies to address customer needs, and helping ensure that customers maximize the value they extract from Falcon. This includes projects ranging from a simulation solution to support a drone-based AI perception system, to generating synthetic data for high-volume manufacturing quality assurance, to characterizing and modeling uncrewed ground vehicles (UGVs) navigating novel environments.

Eugene Demaitre, moderator, is the editorial director for robotics at WTWH Media, which produces Automated Warehouse, The Robot Report, the Robotics Summit & Expo, and RoboBusiness. Prior to working for WTWH Media, he was an editor at BNA (now part of Bloomberg), Computerworld, TechTarget, Robotics Business Review, and Robotics 24/7.

Demaitre has participated in conferences worldwide, as well as spoken on numerous webcasts and podcasts. He is always interested in learning more about robotics. He has a master’s from the George Washington University and lives in the Boston area.

This webinar is sponsored by Balluff and Dexory.


Imagry moves to make buses autonomous without mapping https://www.therobotreport.com/imagry-moves-to-make-buses-autonomous-without-mapping/ https://www.therobotreport.com/imagry-moves-to-make-buses-autonomous-without-mapping/#respond Mon, 25 Nov 2024 19:18:36 +0000 https://www.therobotreport.com/?p=581732 Imagry has developed hardware-agnostic systems to provide Level 4 autonomy to buses with time to market in mind.

Imagry says its software enables buses to autonomously handle complex situations such as roundabouts. Source: Imagry

Autonomous vehicles often rely heavily on prior information about their routes, but new technology promises to improve real-time situational awareness for vehicles including buses. Imagry said its “HD-mapless driving” software stack enables vehicles to react to dynamic contexts and situations more like human drivers.

The company also said its AI Vision 360 eliminates the need for external sensor infrastructure. It claimed that its bio-inspired neural network and hardware-agnostic systems allow for SAE Level 3/4 operations without spending time on mapping.

“We’ve been focusing on two sectors,” said Eran Ofir, CEO of Imagry. “We’ve been selling our perception and motion-planning stack to Tier 1 suppliers and automotive OEMs for autonomous vehicles. We signed a 10-year contract with Continental and are jointly developing a software-defined vehicle platform.”

“And we’ve started working with transportation operators on providing autonomous buses,” he told The Robot Report. “For example, in Turkey, France, Spain, and soon Japan, we’re retrofitting electric buses to be autonomous.”




Imagry trains in real time with supervision

Imagry was established in 2015 with a focus on computer vision for retail. In 2018, it began focusing entirely on autonomous driving. The company now has about 120 employees in San Jose, Calif., and Haifa, Israel.

Imagry said its technology is similar to that of Tesla in relying on 3D vision for perception and motion planning rather than rule-based coding or maps.

“Most players in the industry use HD maps with 5 cm [1.9 in.] resolution, telling the vehicle where lights, signs, and lane markers are,” said Ofir. “Our system teaches itself with supervised learning. It maps in real time while driving. Like a human driver, it gets the route but doesn’t know what it will find.”

How does Imagry deal with the need for massive data sets to train for navigation and obstacle detection and avoidance?

“We wrote a proprietary tool for annotation to train faster, better, and cheaper,” Ofir replied. “The data is collected but doesn’t live in the cloud. The human supervisor tells the vehicle where it was wrong, like a child. We deliver over-the-air updates to customers.”

“The world doesn’t belong to HD maps — it’s a matter of trusting AI-based software for perception and motion planning,” he said.

Ofir cited an example of a vehicle in Arizona on a random route with no communications to centralized computing. Its onboard sensors and compute recognized construction zones, skateboarders, a bike lane, and stop signs.

“The capability to drive out of the box in new places is unique to Imagry,” asserted Ofir. “We can handle righthand and lefthand driving, such as in Tokyo, where we’ve been driving for a year now.”

How does the bus know when to stop for passengers?

The bus could stop at every bus stop, stop upon request via a button at the stop (useful for elderly riders who may not use phone apps), or be summoned by an app that also handles payment, responded Ofir. Imagry's system also supports "kneeling" for people with disabilities.
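
As a simple illustration of that stop-request arbitration, here is a minimal Python sketch. The class and function names are hypothetical, not Imagry's API; the logic just combines the three triggers Ofir describes (scheduled stops, a request button, and an app hail) with the kneeling option for riders with disabilities.

```python
from dataclasses import dataclass

@dataclass
class StopRequest:
    # Hypothetical inputs a bus might track for an upcoming stop.
    stop_every_stop: bool       # operator policy: service every stop on the route
    button_pressed: bool        # physical request button at the stop itself
    app_hail: bool              # passenger summoned the bus via an app
    rider_needs_kneeling: bool  # accessibility flag from the button or app profile

def plan_stop(req: StopRequest) -> dict:
    """Decide whether to service the next stop and whether to kneel."""
    should_stop = req.stop_every_stop or req.button_pressed or req.app_hail
    return {
        "stop": should_stop,
        # Kneeling (lowering the bus) only matters if the bus actually stops.
        "kneel": should_stop and req.rider_needs_kneeling,
    }

# Example: an elderly rider pressed the button at the stop.
print(plan_stop(StopRequest(False, True, False, True)))
# -> {'stop': True, 'kneel': True}
```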

Why buses are a better focus for autonomy

Imagry has decided to focus on urban use cases rather than highways. Buses are simpler to get to Level 4 autonomy, said Ofir.

“Autonomous buses are better than ride hailing; they’re simpler than passenger vehicles,” said Ofir. “They drive in specific routes and at a speed of only 50 kph [31 mph] versus 80 kph [50 mph]. It’s a simpler use case, with economies of scale.”

“The time to revenue is much faster — the design cycle is four years, while integrating with a bus takes two to three months,” he explained. “Once we hand it over to the transport operator, we can get to L4 in 18 months, and then they can buy and deploy 40 more buses.”

In addition, the regulations for autonomous buses are clearer, with 22 countries running pilots, he noted.

“We already have projects with a large medical center and on a public road in Israel,” Ofir said. “We’re not doing small pods — most transport operators desire M3-class standard buses for 30 to 45 passengers because of the total cost of ownership, and they know how to operate them.”

In September and October, Imagry submitted bids for autonomous buses in Austria, Portugal, Germany, Sweden, and Japan.

Software focus could save money

By being vehicle-agnostic, Ofir said Imagry avoids being tied to specific, expensive hardware. Fifteen vendors are making systems on chips (SoCs) that are sufficient for Level 3 autonomy, he said.

“OEMs want the agility to use different sets of hardware in different vehicles. A $30,000 car is different from a $60,000 car, with different hardware stacks and bills of materials, such as camera or compute,” said Ofir. “It’s a crowded market, and the autonomy stack still costs $100,000 per vehicle. Ours is only $3,000 and runs on Ambarella, NVIDIA, TI, Qualcomm, and Intel.”

“With our first commercial proof of concept for Continental in Frankfurt, Germany, we calibrated our car and did some localization,” he added. “Three days after arrival, we simply took it out on the road, and it drove, knowing there’s no right on red.”

With shortages of drivers, particularly in Japan, operators could save $40,000 to $70,000 per bus per year, he said. The Japanese government wants 50 locations across the country to be served with autonomous buses by the end of 2025 and 100 by the end of 2027.

Autonomous buses are also reliable around the clock and don’t get sick or go on strike, he said.

“We’re working on fully autonomous parking, traffic jam assist, and Safe Driver Overwatch to help younger or older drivers obey traffic signs, which could be a game-changer in the insurance industry,” he added. “Our buses can handle roundabouts, narrow streets, and mixed traffic and are location-independent.”

Phases of autonomous bus deployment

Technology hurdles aside, getting autonomous buses recognized by the rules of the road requires patience, said Ofir.

“Together with Mobileye, which later moved to the robotaxi market, Imagry helped draft Israel’s regulatory framework for autonomous driving, which was completed in 2022,” recalled Ofir. “We’re working with lawmakers in France and Germany and will launch pilots in three markets in 2025.”

Testing even Level 3 autonomy can take years, depending on the region. He outlined the phases for autonomous bus rollout:

  1. Work with the electric bus for that market, then activate the system on a public road. “In the U.S., we’ve installed the full software and control stack in a vehicle and are testing FSD [full self-driving],” Ofir said.
  2. Pass NCAP (European New Car Assessment Programme) testing for merging and stops in 99 scenarios. “We’re the only company to date to pass those tests with an autonomous bus,” said Ofir. “Japan also has stringent safety standards.”
  3. Pass the cybersecurity framework, then allow passengers onboard buses with a safety driver present.
  4. Autonomously drive 100,000 km (62,137 mi.) on a designated route with one or more buses. After submitting a report to a department of motor vehicles or the equivalent, the bus operator could then remove the human driver.

“The silicon, sensors, and software don’t matter for time to revenue, and getting approvals from the U.S. National Highway Traffic Safety Administration [NHTSA] can take years,” Ofir said. “We expect passenger vehicles with our software on the road in Europe, the U.S., and Japan sometime in 2027.”

Imagry has joined Partners for Automated Vehicle Education (PAVE) and will be exhibiting at CES in January 2025.

Healthcare companion market offers new robotics opportunities https://www.therobotreport.com/healthcare-companion-market-offers-new-robotics-opportunities/ https://www.therobotreport.com/healthcare-companion-market-offers-new-robotics-opportunities/#respond Sun, 24 Nov 2024 13:01:01 +0000 https://www.therobotreport.com/?p=581712 Healthcare applications for robots, from surgery to companionship, are growing for innovators and entrepreneurs, says Research Nester.

Robotic companions are spreading in healthcare settings, according to Research Nester. Credit: Adobe Stock

Back in 1985, a robot named PUMA 560 conducted a stereotactic brain biopsy with 0.05 mm accuracy, marking the beginning of robots' introduction into the healthcare system for various functions.

Demand for healthcare robots has mushroomed since the COVID-19 pandemic. Research Nester has estimated that the market for medical robots increased by 36.5% to around 6,100 units in 2023.

In addition, sales of rehabilitation and non-invasive therapy robots grew by almost 128%. The robots have become companions for healthcare staffers, and the market is booming with opportunities. Let’s examine why numerous companies are willing to invest in this market.




Healthcare robots already on duty in hospitals

Robotics has an array of applications in hospitals. For instance, the robotic-assisted surgery market could reach more than $14 billion (U.S.) by 2026. Here are some of the prominent tasks for which robots are being used in hospitals:

Telepresence: The Sanbot Elf robot was designed to give families and co-workers a remote presence for visiting hospital patients. During the pandemic, Ava Robotics' systems helped doctors see more patients while avoiding infections.

Surgical assistants: Millions of procedures are carried out each year with robotic assistance, and over the past 20 years, over 12 million were performed with Intuitive Surgical‘s da Vinci systems alone. Some of the prominent surgeries that can be done with the help of robots are:

  • Colorectal
  • General surgery
  • Gastric bypass
  • Robotic-assisted laparoscopy
  • Cholecystectomy
  • Kidney transplantation

In addition, by 2030, more than 700,100 robotic-assisted knee reconstruction procedures could be performed globally.

Medical transportation: Unlimited Robotics offers Gary, which it designed to address numerous logistical challenges faced by modern hospitals. Mobile robots can improve operations in the following ways (a rough cost sketch follows the list):

  • Improved efficiency: 26% to 30% reduction in wait times for supply deliveries and patient transfers
  • Time savings: 100 to 180 staff hours could be saved daily
  • Cost savings: A facility’s labor costs could be reduced by $876,100 to $1 million annually
  • Staff satisfaction: 5% to 10% reduction in turnover rates due to decreased non-clinical workload
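
For a rough sense of where figures like these come from, the sketch below reconstructs the annual labor savings from the hours-saved range; the $24 fully loaded hourly labor cost is an assumption for illustration, not a number from the article.

```python
# Back-of-the-envelope check on the labor-savings range above.
# The $24/hour fully loaded labor cost is an assumed value, not from the article.
HOURLY_COST = 24.0                # USD per staff hour, assumed
HOURS_SAVED_PER_DAY = (100, 180)  # staff hours, from the article

for hours in HOURS_SAVED_PER_DAY:
    annual_savings = hours * HOURLY_COST * 365
    print(f"{hours} h/day -> ${annual_savings:,.0f}/year")
# 100 h/day -> $876,000/year
# 180 h/day -> $1,576,800/year
```

The lower bound lines up almost exactly with the article's $876,100 figure; the quoted $1 million upper end presumably reflects different wage or utilization assumptions at a given facility.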

Sanitation and disinfection robots: With the rise in antibiotic resistance, healthcare facilities are using robots to clean surfaces. Ultraviolet disinfection robots are widely used to enhance manual cleaning.

Robots enter nursing homes and elder-care centers

The PARO robotic seal is an example of an interactive care robot designed specifically for older individuals with dementia. These therapeutic robots provide emotional support and companionship to alleviate loneliness and anxiety.

Some of the prominent activities performed by robots such as PARO, Tombot’s Jennie, and Intuition Robotics’ ElliQ are:

  • Assisting with daily activities
  • Increasing mobility and independence for older adults
  • Preventing accidents and detecting falls
  • Rehabilitation and cognitive training
  • Medication reminders

According to the National Institutes of Health, the average implementation cost of a robot is almost $85,000 per year, while hiring human caregivers costs even more.

Surveying the landscape for healthcare companion robots

Research Nester estimates that the companion robot market will reach $1 billion by the end of 2024, and it could grow to $11 billion by the end of 2037; the implied growth rate is worked out after the list below. Some of the growth-propelling factors for the market are:

  • Increasing demand for home assistance for the aging population
  • Rising adoption of automation
  • Rising need to help the differently-abled population
  • Increasing initiatives from the government supporting healthcare robots
  • Robot companions aiding the cognitive and emotional development of children
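
For context, those two market endpoints imply a compound annual growth rate of roughly 20%, as this quick check shows:

```python
# Implied compound annual growth rate (CAGR) from $1B in 2024 to $11B in 2037.
start_value, end_value = 1.0, 11.0  # market size in billions of USD
years = 2037 - 2024                 # 13-year span

cagr = (end_value / start_value) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # -> Implied CAGR: 20.3%
```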

However, high production costs and privacy challenges are restraining the growth of the healthcare companion robot market.

Some of the companies making a positive impact are Aeolus Robotics, Andromeda, ASUSTeK Computer, Blue Frog Robotics, inGen Dynamics, Luvozo, PAL Robotics, and UBTECH.

We project that North America will experience the most promising growth rate, with rising demand for humanoid robots in hospitals. For instance, Aethon’s Zena RX can securely deliver pharmacy and other clinical materials day or night.

The healthcare companion robotics market is expected to grow from 2024 to 2037. Source: Research Nester

The above discussion shows that the healthcare robotics market is offering lucrative growth opportunities. Entrepreneurs and practitioners are willing to make investments to serve this market.

However, a solid understanding of market parameters is essential when working with cutting-edge technologies in a competitive field. Market research reports offer detailed analyses of growth drivers and constraints, regional differences, and more, which can help you make judicious business decisions.

About the author

Aashi Mishra is an experienced research writer, strategist, and marketer with a demonstrated history of research across a myriad of industries. She says she loves to distill complex industry terminology into simpler terms.

Smart Vision Works introduces SiftAI robotic potato sorter https://www.therobotreport.com/smart-vision-works-introduces-siftai-robotic-potato-sorter/ https://www.therobotreport.com/smart-vision-works-introduces-siftai-robotic-potato-sorter/#respond Sat, 23 Nov 2024 13:30:21 +0000 https://www.therobotreport.com/?p=581648 Smart Vision Works said its SiftAI robotic potato sorter will pay for itself in fewer than two years from installation.

Smart Vision Works’ SiftAI vision system uses AI to sort potato defects and sizes with high accuracy. | Source: Smart Vision Works

Fresh-pack potato processors struggle to find workers for the final inspection of potato sorting and grading. Smart Vision Works last week announced the SiftAI Robotic Sorter, which combines a delta robot with an AI-based vision inspection system to sort potatoes.

Even when potato-sorting sheds are adequately staffed, defects still reach customers, and acceptable potatoes are wasted, it said. The Westborough, Mass.-based company said its robotic sorter can automate final inspection, ensuring accurate grading, increasing profits, and allowing managers to redeploy scarce workers to other tasks.

“Because of potato oversupply and rising wages in North America, many potato processors are losing money on every box shipped,” said Curtis Koelling, vice president of product development and innovation at Smart Vision Works.

“Managers are eager to identify technology that can lower their production costs,” he said. “When they see a competitor managing final inspection without labor costs, they become very interested in the technology.”

SiftAI inspects potatoes for defects

Founded in 2012, Smart Vision Works creates AI and machine learning algorithms that reduce the number of images needed to train models, allowing it to take on challenging machine vision problems and deliver high-quality solutions for its customers.

KPM Analytics, a global developer of scientific instrumentation, acquired the Orem, Utah-based company in 2023.

The new product includes a vision-based system, AI software, and a proven potato-inspection model including 19 different defects. Installed over a roller table, SiftAI uses its cameras to collect images of all sides of the potato.

Each system is programmed with AI models for overall potato size and shape or the presence of defects like bruises, cracks, percent green, and other cosmetic features.

Smart Vision Works develops rapid sortation

For any potatoes that grade outside the AI model’s acceptance criteria, SiftAI triggers the robotic arm to pick up and remove the potato from the product stream at rates of 80 to 100 picks per minute with two-robot system configurations.
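
A minimal sketch of that grade-then-reject loop appears below. The class names, threshold, and robot interface are hypothetical stand-ins for illustration, not Smart Vision Works' actual software.

```python
from dataclasses import dataclass

@dataclass
class Grade:
    # Hypothetical output of the vision model for one potato.
    defect_scores: dict  # e.g., {"bruise": 0.8, "green": 0.1}
    size_ok: bool        # size/shape within the customer's spec

DEFECT_THRESHOLD = 0.5   # assumed per-defect acceptance cutoff

def should_reject(grade: Grade) -> bool:
    """Reject if any defect score exceeds the cutoff or size/shape is out of spec."""
    return (not grade.size_ok) or any(
        score > DEFECT_THRESHOLD for score in grade.defect_scores.values()
    )

class FakeDeltaRobot:
    """Stand-in for the delta arm; a real system would command a pick motion."""
    def pick_and_remove(self) -> None:
        print("pick: potato removed from the product stream")

def process(grade: Grade, robot: FakeDeltaRobot) -> None:
    # Outside the acceptance criteria -> trigger the robot to pull the potato.
    if should_reject(grade):
        robot.pick_and_remove()

process(Grade({"bruise": 0.8, "green": 0.1}, size_ok=True), FakeDeltaRobot())
```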

The SiftAI Robotic Sorter inspects potatoes with the same dexterity and speed as a human inspector but with much higher accuracy, increasing profitability and reducing customer chargebacks, claimed Smart Vision Works. Currently, the industry goal is to have no more than 5% of defective potatoes reaching customers, which is the limit set by the U.S. Department of Agriculture.

Human inspectors typically discard 10% to 20% of acceptable potatoes, reducing profits. In beta testing, the new AI-enabled robotic sorter dramatically reduced the percentage of missed defects and misgraded potatoes. Adding increased profitability to the labor savings, Smart Vision Works said the financial impact of this system is significant.

The investment pays for itself in fewer than two years, said the company. It asserted that the system’s high accuracy is possible because its technology is not like the basic AI commonly used by other vision inspection systems.

Instead, SiftAI is built on 12 years of development by AI scientists and years of experience in the potato industry. Unlike systems that use optical scanners, the system takes a full digital image and runs it through a neural network, said Smart Vision Works.

The SiftAI Robotic Sorter is available for order now.




How AI, perception are shaping mobile robotics https://www.therobotreport.com/how-ai-perception-are-shaping-mobile-robotics/ https://www.therobotreport.com/how-ai-perception-are-shaping-mobile-robotics/#respond Fri, 22 Nov 2024 22:28:39 +0000 https://www.therobotreport.com/?p=581710 Amir Bousani, CEO of RGO Robotics, and Jacob Petersen, Chief Commercial Officer from Wheel.Me, discuss the importance of perception and AI for mobile robotics.


In Episode 173 of The Robot Report Podcast, co-host Steve Crowe and I catch up on the news of the week, including several recent stories about mobile manipulators.

Featured interview with RGO Robotics and Wheel.Me

In the featured interview this week, I talk with Amir Bousani, CEO of RGo Robotics, and Jacob Petersen, chief commercial officer of Wheel.Me. We discuss the importance of perception for autonomous mobile robots and Wheel.Me's decision to use RGo Robotics' Perception Engine in its platform.

Show timeline

  • 7:44 – News of the week
  • 11:02 – Update on Proxie from Brad Porter, founder and CEO of Collaborative Robotics
  • 24:15 – Interview with Amir Bousani, CEO of RGo Robotics, and Jacob Petersen, chief commercial officer of Wheel.Me



News of the week

Collaborative Robotics unveils Proxie mobile manipulator

Collaborative Robotics Inc. this week unveiled its Proxie mobile manipulator publicly for the first time. The startup has been secretive about the design of the robot since Porter founded the company in 2022. In April 2024, Collaborative Robotics closed a $100 million Series B round toward commercializing its autonomous mobile robot (AMR).

On Wednesday, the company released images and video of the Proxie AMR, along with a newly redesigned website. The AMR features a swerve drive, a hot-swappable battery, and a fixed linear actuator in its "spine." The robot is designed to be fitted with a variety of onboard actuators, and the first to be productized is a simple cart-acquisition mechanism.

Pickle Robot gets orders for over 30 unloading systems, plus $50M in funding

Pickle Robot Co. raised $50 million in Series B funding this week. It also announced that six customers placed orders during the third quarter for more than 30 robots to deploy in the first half of 2025. Founded in 2018, Pickle Robot said its robots are designed to autonomously unload trucks, trailers, and import containers at human-scale or better performance.

The company said its Series B funding included participation from a strategic customer. Teradyne Robotics Ventures, Toyota Ventures, Ranpak, Third Kind Venture Capital, One Madison Group, Hyperplane, Catapult Ventures, and others also participated. The company said it plans to use its latest funding to accelerate the development of new feature sets. It also plans to build out its commercial teams to unlock new markets and geographies worldwide.

MC600 mobile manipulator combines UR cobot with MiR base

The new MC600 combines the MiR600 AMR with the UR20 and UR30 collaborative robot arms from Universal Robots A/S, which is also owned by Teradyne. Mobile Industrial Robots said it can handle payloads up to 600 kg (1,322 lb.) and automate complex workflows in industrial environments. A unified software platform by MiR Go partner Enabled Robotics controls the MC600. MiR said this coordinates its mobile base and robotic arms, simplifying integration into existing workflows and ensuring smooth operations.

ASTM proposes mobile manipulation standard

In other mobile manipulation news, ASTM International’s F45 committee for robotics, automation, and autonomous systems has proposed a new standard, WK92144. It provides guidelines for documenting disturbances of robot arms, such as by heavy equipment, in unstructured manufacturing environments. The proposed standard describes an example apparatus for testing.


2025 RBR50 Robotics Innovation Awards open for nominations

You can now submit nominations for the 2025 RBR50 innovation awards. They will recognize technology and business innovations in the calendar year 2024, and the awards are open to any company worldwide that produces robotics or automation.

The categories include:

  1. Technologies, products, and services: This category includes primary or applied research focusing on robotics and supporting technologies such as motion control, vision, or machine learning. It also includes new products and business, engineering, or technology services.
  2. Business and management: This category covers initiatives positioning a company as a market leader or an organization as an important thought leader in the robotics ecosystem. Significant mergers and acquisitions are relevant, as are supplier, partner, and integrator relationships.
  3. Applications and markets: The RBR50 will also recognize innovations that improve productivity, quality, and cost-effectiveness, as well as those that automate new tasks.

In addition, the 2025 RBR50 awards will celebrate the following:

  • Startup of the Year
  • Application of the Year
  • Robot of the Year
  • Robots for Good Award

The deadline for submissions is Friday, Dec. 20, 2024.


Podcast sponsored by RGo Robotics

The show this week is sponsored by RGo Robotics.

Is your autonomous mobile robot (AMR) struggling in dynamic environments? Is your business stuck because it takes months to commission a new site?

RGo Robotics' Perception Engine is revolutionizing the AMR business through advanced Vision AI perception technology. Unlike traditional solutions, the company's software enables AMRs to adapt to changing environments and navigate complex spaces with unprecedented accuracy, and it makes the commissioning process shorter and simpler.

Leading AMR companies are enhancing their fleets with RGo’s AI-powered perception, enabling their teams to accelerate use of advanced AI capabilities like foundation models and digital twins.

Don’t let outdated navigation hold your business back.

To learn more about RGo's solutions, go to: https://www.rgorobotics.ai/


 
