Mobility / Navigation Archives - The Robot Report: robotics news, research, and analysis

Clearpath Robotics discusses development of Husky A300 ground vehicle
Tue, 03 Dec 2024
The Husky A300 uncrewed ground vehicle from Clearpath includes features for both expert robot developers and non-expert users.

The post Clearpath Robotics discusses development of Husky A300 ground vehicle appeared first on The Robot Report.


The Husky A300 is designed to be tougher and have longer endurance than the A200. Source: Clearpath Robotics

Developers of robots for indoor or outdoor use have a new platform to build on. In October, Clearpath Robotics Inc. released the Husky A300, the latest version of its flagship mobile robot for research and development. The Waterloo, Ontario-based company said it has improved the system’s speed, weather resistance, payload capacity, and runtime.

“Husky A200 has been on the market for over 10 years,” said Robbie Edwards, director of technology at Clearpath Robotics. “We have lots of experience figuring out what people want. We’ve had different configurations, upgrades, batteries and chargers, computers, and motors.”

“We’ve also had different configurations of the internal chassis and ingress protection, as well as custom payloads,” he told The Robot Report. “A lot of that functionality that you had to pay to add on is now stock.”

Husky A300 hardware is more rugged and faster

The Husky A300 includes a high-torque drivetrain with four brushless motors that enable speeds of up to 2 m/s (4.4 mph), twice as fast as the previous version. It can carry payloads up to 100 kg (220.4 lb.) and has a runtime of up to 12 hours, said Clearpath Robotics.

The company, which Rockwell Automation acquired last year, noted that the platform can integrate third-party components and accessories including depth cameras, directional lidar, dual-antenna GPS, and manipulators. Husky A300 has an IP54 rating against dust and water and can withstand industrial environments or extreme temperatures outdoors, it said. 

“Before, the Husky was configured on a bespoke basis,” said Edwards. “Now we’re off at a more competitive price, which is great for our customers, and it now comes off our production line instead of our integration line.”

Founded in 2009, the company has tested its hardware and software near its office in a wide range of weather conditions.

Clearpath’s integration with Rockwell has gone smoothly so far, with Rockwell’s procurement team easing access to components and manufacturing, said Edwards. He observed that some of Rockwell’s customers in mining or other industrial automation could find new use cases in time.


The Husky A300 can withstand dust and temperature variances. Source: Clearpath Robotics

Clearpath includes ROS 2 support with A300

Husky A300 ships with Robot Operating System (ROS) 2 Jazzy plus demonstrations of Nav2, MoveIt 2, and other developer utilities.

“Over the past two years, there was a big push to get all Clearpath products to ROS 2 Humble because its configuration management system made life easier for our integration team and customers,” recalled Edwards. “We also provide support for simulation, and URDF [Unified Robot Description Format] is configured.”

Many of Clearpath’s R&D customers were familiar with ROS, C++, and Python, so it offered visualization and simulation tools in addition to the ROS stack, he added. However, as the company got non-expert customers, it wanted to enable them to also work with Husky.

“Academics who aren’t roboticists but want to do data collection can now do so with a simple Python interface, without learning ROS,” Edwards said. “We’ve maintained a level of flexibility with integrating different payloads and compute options while still giving a pretty good price point and usability.”
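Clearpath has not published that interface in this article, so the following is only an illustrative sketch of what "data collection without learning ROS" could look like for a non-roboticist; the `HuskyDataLogger` class and all of its method names are hypothetical, not Clearpath's actual Python API:

```python
import csv
import math
import time


class HuskyDataLogger:
    """Hypothetical ROS-free data-collection wrapper (illustrative only).

    Nothing here reflects Clearpath's real interface; it simply shows the
    shape of a simple Python API: drive, sample, save.
    """

    def __init__(self):
        self.x = 0.0        # position, meters
        self.y = 0.0
        self.heading = 0.0  # radians
        self.rows = []      # logged samples

    def drive(self, speed_mps, heading_rad, duration_s):
        """Advance a trivial kinematic model and record one sample."""
        self.heading = heading_rad
        self.x += speed_mps * duration_s * math.cos(heading_rad)
        self.y += speed_mps * duration_s * math.sin(heading_rad)
        self.rows.append((time.time(), self.x, self.y, self.heading))

    def save(self, path):
        """Dump the collected samples as CSV for later analysis."""
        with open(path, "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["timestamp", "x_m", "y_m", "heading_rad"])
            writer.writerows(self.rows)


robot = HuskyDataLogger()
robot.drive(speed_mps=1.0, heading_rad=0.0, duration_s=5.0)          # ~5 m east
robot.drive(speed_mps=1.0, heading_rad=math.pi / 2, duration_s=3.0)  # ~3 m north
```

The point of such a wrapper is exactly what Edwards describes: a researcher scripts a route and gets a CSV back, without touching ROS topics or launch files.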




Husky AMP a ‘turnkey’ option

Clearpath Robotics is offering a “turnkey” version of the robot dubbed Husky AMP, or autonomous mobile platform. It comes with a sensor suite for navigation, pre-installed and configured OutdoorNav software, a Web-based user interface, and an optional wireless charging dock.

“Robotics developers can easily integrate payloads onto the mounting deck, carry out a simple software integration through the OutdoorNav interface, and get their system working in the field faster and more efficiently,” said Clearpath.

“We’ve lowered the barrier to entry by providing all software function calls and a navigation stack,” Edwards asserted. “The RTK [real-time kinematic positioning] GPS is augmented with sensor fusion, including wheel odometry, and visual and lidar sensors.”

“With a waypoint following system, the robotics stack does the path planning, which is constrained and well-tested,” he said. “Non-roboticists can use Husky A300 as a ground drone.”
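The article does not spell out the math behind the wheel-odometry input to that sensor fusion, but for a skid-steer platform like Husky it is typically a differential-drive dead-reckoning update. A minimal sketch of that standard kinematic step follows; the 0.55 m track width and the wheel distances are illustrative numbers, not A300 specifications:

```python
import math


def odometry_step(x, y, theta, d_left, d_right, track_width):
    """One differential-drive dead-reckoning update.

    d_left / d_right: wheel travel (m) since the last step.
    track_width: lateral distance between wheel centers (m).
    Returns the updated pose (x, y, theta).
    """
    d_center = (d_left + d_right) / 2.0          # distance traveled by the body
    d_theta = (d_right - d_left) / track_width   # heading change
    # Integrate along the mean heading over the step (midpoint rule).
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    theta = (theta + d_theta) % (2.0 * math.pi)
    return x, y, theta


# Drive straight about 1 m, then arc gently to the left.
pose = (0.0, 0.0, 0.0)
pose = odometry_step(*pose, d_left=1.0, d_right=1.0, track_width=0.55)
pose = odometry_step(*pose, d_left=0.9, d_right=1.1, track_width=0.55)
```

In a fused stack of the kind Edwards describes, a pose increment like this is one input among several; RTK GPS and visual/lidar measurements correct the drift that pure dead reckoning accumulates.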

More robot enhancements, use cases to come

Clearpath Robotics is considering variant drive trains for the Husky A300, such as tracks for softer terrain as in agriculture, said Edwards.

“Husky is a general-purpose platform,” he said. “We’re serving outdoors developers rather than end users directly, but there’s a lot of demand for larger, high-endurance materials transport.”

For the A300, the company surveyed its client base, which came back with 150 use cases.

“I’ve seen lots of cool stuff — robots herding animals, helping to grow plants, working in mines, participating in the DARPA Subterranean Challenge in fleets of Husky and [Boston Dynamics’] Spot,” Edwards said. “Husky Observer conducts inspections of sites such as solar farms.”

“The benefits for industrial users also help researchers,” he said. “Making the robot cheaper to deploy for faster time to value also means better battery life, weatherproofing, and integrations.”

Edwards added that Clearpath has received a lot of interest in mobile manipulation with its Ridgeback omnidirectional platform.

“This trend is finding its way outdoors as well,” he said. “On the application engineering side, developers have put two large Universal Robots arms on our Warthog UGV [uncrewed ground vehicle] for things like changing tires.”


The Husky A300 can carry different sensor payloads or robotic arms. Source: Clearpath Robotics

Imagry moves to make buses autonomous without mapping
Mon, 25 Nov 2024
Imagry has developed hardware-agnostic systems to provide Level 4 autonomy to buses with time to market in mind.


Imagry says its software enables buses to autonomously handle complex situations such as roundabouts. Source: Imagry

Autonomous vehicles often rely heavily on prior information about their routes, but new technology promises to improve real-time situational awareness for vehicles including buses. Imagry said its “HD-mapless driving” software stack enables vehicles to react to dynamic contexts and situations more like human drivers.

The company also said its AI Vision 360 eliminates the need for external sensor infrastructure. It claimed that its bio-inspired neural network and hardware-agnostic systems allow for SAE Level 3/4 operations without spending time on mapping.

“We’ve been focusing on two sectors,” said Eran Ofir, CEO of Imagry. “We’ve been selling our perception and motion-planning stack to Tier 1 suppliers and automotive OEMs for autonomous vehicles. We signed a 10-year contract with Continental and are jointly developing a software-defined vehicle platform.”

“And we’ve started working with transportation operators on providing autonomous buses,” he told The Robot Report. “For example, in Turkey, France, Spain, and soon Japan, we’re retrofitting electric buses to be autonomous.”




Imagry trains in real time with supervision

Imagry was established in 2015 with a focus on computer vision for retail. In 2018, it began focusing entirely on autonomous driving. The company now has about 120 employees in San Jose, Calif., and Haifa, Israel.

Imagry said its technology is similar to that of Tesla in relying on 3D vision for perception and motion planning rather than rule-based coding or maps.

“Most players in the industry use HD maps with 5 cm [1.9 in.] resolution, telling the vehicle where lights, signs, and lane markers are,” said Ofir. “Our system teaches itself with supervised learning. It maps in real time while driving. Like a human driver, it gets the route but doesn’t know what it will find.”

How does Imagry deal with the need for massive data sets to train for navigation and obstacle detection and avoidance?

“We wrote a proprietary tool for annotation to train faster, better, and cheaper,” Ofir replied. “The data is collected but doesn’t live in the cloud. The human supervisor tells the vehicle where it was wrong, like a child. We deliver over-the-air updates to customers.”

“The world doesn’t belong to HD maps — it’s a matter of trusting AI-based software for perception and motion planning,” he said.

Ofir cited an example of a vehicle in Arizona on a random route with no communications to centralized computing. Its onboard sensors and compute recognized construction zones, skateboarders, a bike lane, and stop signs.

“The capability to drive out of the box in new places is unique to Imagry,” asserted Ofir. “We can handle righthand and lefthand driving, such as in Tokyo, where we’ve been driving for a year now.”

How does the bus know when to stop for passengers?

It could stop at every bus stop, upon request via a button at the stop (for the elderly, who may not use phone apps), or be summoned by an app that also handles payment, responded Ofir. Imagry’s system also supports “kneeling” for people with disabilities.

Why buses are a better focus for autonomy

Imagry has decided to focus on urban use cases rather than highways. Buses are simpler to get to Level 4 autonomy, said Ofir.

“Autonomous buses are better than ride hailing; they’re simpler than passenger vehicles,” said Ofir. “They drive in specific routes and at a speed of only 50 kph [31 mph] versus 80 kph [50 mph]. It’s a simpler use case, with economies of scale.”

“The time to revenue is much faster — the design cycle is four years, while integrating with a bus takes two to three months,” he explained. “Once we hand it over to the transport operator, we can get to L4 in 18 months, and then they can buy and deploy 40 more buses.”

In addition, the regulations for autonomous buses are clearer, with 22 countries running pilots, he noted.

“We already have projects with a large medical center and on a public road in Israel,” Ofir said. “We’re not doing small pods — most transport operators desire M3-class standard buses for 30 to 45 passengers because of the total cost of ownership, and they know how to operate them.”

In September and October, Imagry submitted bids for autonomous buses in Austria, Portugal, Germany, Sweden, and Japan.

Software focus could save money

By being vehicle-agnostic, Ofir said Imagry avoids being tied to specific, expensive hardware. Fifteen vendors are making systems on chips (SoCs) that are sufficient for Level 3 autonomy, he said.

“OEMs want the agility to use different sets of hardware in different vehicles. A $30,000 car is different from a $60,000 car, with different hardware stacks and bills of materials, such as camera or compute,” said Ofir. “It’s a crowded market, and the autonomy stack still costs $100,000 per vehicle. Ours is only $3,000 and runs on Ambarella, NVIDIA, TI, Qualcomm, and Intel.”

“With our first commercial proof of concept for Continental in Frankfurt, Germany, we calibrated our car and did some localization,” he added. “Three days after arrival, we simply took it out on the road, and it drove, knowing there’s no right on red.”

With shortages of drivers, particularly in Japan, operators could save $40,000 to $70,000 per bus per year, he said. The Japanese government wants 50 locations across the country to be served with autonomous buses by the end of 2025 and 100 by the end of 2027.

Autonomous buses are also reliable around the clock and don’t get sick or go on strike, he said.

“We’re working on fully autonomous parking, traffic jam assist, and Safe Driver Overwatch to help younger or older drivers obey traffic signs, which could be a game-changer in the insurance industry,” he added. “Our buses can handle roundabouts, narrow streets, and mixed traffic and are location-independent.”

Phases of autonomous bus deployment

Technology hurdles aside, getting autonomous buses recognized by the rules of the road requires patience, said Ofir.

“Together with Mobileye, which later moved to the robotaxi market, Imagry helped draft Israel’s regulatory framework for autonomous driving, which was completed in 2022,” recalled Ofir. “We’re working with lawmakers in France and Germany and will launch pilots in three markets in 2025.”

Testing even Level 3 autonomy can take years, depending on the region. He outlined the phases for autonomous bus rollout:

  1. Work with the electric bus for that market, then activate the system on a public road. “In the U.S., we’ve installed the full software and control stack in a vehicle and are testing FSD [full self-driving],” Ofir said.
  2. Pass NCAP (European New Car Assessment Programme) testing for merging and stops in 99 scenarios. “We’re the only company to date to pass those tests with an autonomous bus,” said Ofir. “Japan also has stringent safety standards.”
  3. Pass the cybersecurity framework, then allow passengers onboard buses with a safety driver present.
  4. Autonomously drive 100,000 km (62,137 mi.) on a designated route with one or more buses. After submitting a report to a department of motor vehicles or the equivalent, the bus operator could then remove the human driver.

“The silicon, sensors, and software don’t matter for time to revenue, and getting approvals from the U.S. National Highway Traffic Safety Administration [NHTSA] can take years,” Ofir said. “We expect passenger vehicles with our software on the road in Europe, the U.S., and Japan sometime in 2027.”

Imagry has joined Partners for Automated Vehicle Education (PAVE) and will be exhibiting at CES in January 2025.

How AI, perception are shaping mobile robotics
Fri, 22 Nov 2024
Amir Bousani, CEO of RGO Robotics, and Jacob Petersen, chief commercial officer of Wheel.Me, discuss the importance of perception and AI for mobile robotics.


In Episode 173 of The Robot Report Podcast, co-host Steve Crowe and I catch up on the news of the week, including several recent stories about mobile manipulators.

Featured interview with RGO Robotics and Wheel.Me

In the featured interview this week, I talk to Amir Bousani, CEO of RGO Robotics, and Jacob Petersen, chief commercial officer of Wheel.Me. We discuss the importance of perception for autonomous mobile robots and Wheel.Me’s decision to leverage RGO Robotics’ perception engine in its platform.

Show timeline

  • 7:44 – News of the week
  • 11:02 – Update on Proxie from Brad Porter, founder and CEO of Collaborative Robotics
  • 24:15 – Interview with Amir Bousani, CEO of RGO Robotics, and Jacob Petersen, chief commercial officer of Wheel.Me



News of the week

Collaborative Robotics unveils Proxie mobile manipulator

Collaborative Robotics Inc. this week unveiled its Proxie mobile manipulator publicly for the first time. The startup had been tight-lipped about the robot’s design since Brad Porter founded the company in 2022. In April 2024, Collaborative Robotics closed a $100 million Series B round toward commercializing its autonomous mobile robot (AMR).

On Wednesday, the company released images and video of the Proxie AMR, along with a newly redesigned website. The AMR features a swerve drive, a hot-swappable battery, and a fixed linear actuator in its “spine.” The robot is designed to be fitted with a variety of onboard actuators, and the first to be productized is simple cart acquisition.

Pickle Robot gets orders for over 30 unloading systems, plus $50M in funding

Pickle Robot Co. raised $50 million in Series B funding this week. It also announced that six customers placed orders during the third quarter for more than 30 robots to deploy in the first half of 2025. Founded in 2018, Pickle Robot said its robots are designed to autonomously unload trucks, trailers, and import containers at human-scale or better performance.

The company said its Series B funding included participation from a strategic customer. Teradyne Robotics Ventures, Toyota Ventures, Ranpak, Third Kind Venture Capital, One Madison Group, Hyperplane, Catapult Ventures, and others also participated. The company said it plans to use its latest funding to accelerate the development of new feature sets. It also plans to build out its commercial teams to unlock new markets and geographies worldwide.

MC600 mobile manipulator combines UR cobot with MiR base

The new MC600 combines the MiR600 AMR with the UR20 and UR30 collaborative robot arms from Universal Robots A/S, which is also owned by Teradyne. Mobile Industrial Robots said it can handle payloads up to 600 kg (1,322 lb.) and automate complex workflows in industrial environments. A unified software platform by MiR Go partner Enabled Robotics controls the MC600. MiR said this coordinates its mobile base and robotic arms, simplifying integration into existing workflows and ensuring smooth operations.

ASTM proposes mobile manipulation standard

In other mobile manipulation news, ASTM International’s F45 committee for robotics, automation, and autonomous systems has proposed a new standard, WK92144. It provides guidelines for documenting disturbances of robot arms, such as by heavy equipment, in unstructured manufacturing environments. The proposed standard describes an example apparatus for testing.


2025 RBR50 Robotics Innovation Awards open for nominations

You can now submit nominations for the 2025 RBR50 innovation awards. They will recognize technology and business innovations in the calendar year 2024, and the awards are open to any company worldwide that produces robotics or automation.

The categories include:

  1. Technologies, products, and services: This category includes primary or applied research focusing on robotics and supporting technologies such as motion control, vision, or machine learning. It also includes new products and business, engineering, or technology services.
  2. Business and management: This category covers initiatives positioning a company as a market leader or an organization as an important thought leader in the robotics ecosystem. Significant mergers and acquisitions are relevant, as are supplier, partner, and integrator relationships.
  3. Applications and markets: The RBR50 will also recognize innovations that improve productivity, quality, and cost-effectiveness, as well as those that automate new tasks.

In addition, the 2025 RBR50 awards will celebrate the following:

  • Startup of the Year
  • Application of the Year
  • Robot of the Year
  • Robots for Good Award

The deadline for submissions is Friday, Dec. 20, 2024.


Podcast sponsored by RGO Robotics

The show this week is sponsored by RGO Robotics 

Is your autonomous mobile robot (AMR) struggling in dynamic environments? Is your business stuck because it takes months to commission a new site?

RGo Robotics’ Perception Engine is revolutionizing the AMR business through advanced Vision AI perception technology. Unlike traditional solutions, the company’s software enables AMRs to adapt to changing environments and navigate complex spaces with unprecedented accuracy, while making the commissioning process shorter and simpler.

Leading AMR companies are enhancing their fleets with RGo’s AI-powered perception, enabling their teams to accelerate use of advanced AI capabilities like foundation models and digital twins.

Don’t let outdated navigation hold your business back.

To learn more about RGO’s solutions, go to: https://www.rgorobotics.ai/

ANELLO Photonics secures funding for inertial navigation in GPS-denied environments
Tue, 19 Nov 2024
ANELLO Photonics, which has developed compact navigation and positioning for autonomous systems, has closed its Series B round.


ANELLO offers an evaluation kit for its navigation and positioning system. Source: ANELLO Photonics

Self-driving vehicles, mobile robots, and drones need multiple sensors for safe and reliable operation, but the cost and bulk of those sensors have posed challenges for developers and manufacturers. ANELLO Photonics Inc. yesterday said it has closed its Series B funding round for its SiPhOG inertial navigation system, or INS.

“This investment not only validates our SiPhOG technology and products in the marketplace, but will [also] allow us to accelerate our manufacturing and product development as we continue to push the boundaries and leadership for navigation capabilities and performance to our customers who want solutions for GPS-denied environments,” stated Dr. Mario Paniccia, co-founder and CEO of ANELLO Photonics.

Founded in 2018, ANELLO has developed SiPhOG — Silicon Photonics Optical Gyroscope — based on integrated photonic system-on-chip (SoC) technology. The Santa Clara, Calif.-based company said it has more than 28 patents, with 44 pending. Its technologies also include a sensor-fusion engine using artificial intelligence.

“I spent 22 years at Intel and started this field of silicon photonics, which is the idea of building optical devices out of standard silicon processing, mostly focused on the data center,” recalled Paniccia. “Mike Horton, my co-founder, was a sensor gyro expert who started a company called Crossbow coming out of UC Berkeley.”

“Everyone doing autonomy was saying lidar and radar, but customers told Mike that if we could build an integrated photonic chip, they’d be very interested,” he told The Robot Report. “If you look at fiber gyros, they work great but are big, bulky, and expensive.”

“The stuff on our phones is MEMS [micro-electromechanical systems]-based today, which is not very accurate and is very sensitive to temperature, vibration, and EM interference,” Paniccia explained. “With the same concept as a fiber gyro — the idea of light going around a coil, and you measure the phase based on rotation — we integrated all those components on a single chip, added a little laser, and put electronics around it, and you now get SiPhOG, which fits in the palm of your hand.”




SiPhOG combines compactness and precision

SiPhOG brings high-precision into an integrated silicon photonics platform, claimed ANELLO. It is based on the interferometric fiber-optic gyroscope (FOG) but is designed for compactness, said Paniccia.

“It’s literally 2 by 5 mm,” he said. “On that chip, we have all the components — the splitters, the couplers, the phase modulators, and the delay lines. We measure about 50 nano-radians of signal, so a tiny, tiny signal, but we measure it very accurately.”
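The article does not give the underlying relation, but for any Sagnac-effect gyro the measured phase scales with rotation rate as Δφ = 2πLDΩ/(λc) for a sensing coil of length L and diameter D. A quick sketch with purely illustrative numbers (not ANELLO's actual coil geometry) shows why the signal lands in the nano-radian range Paniccia describes:

```python
import math

C = 299_792_458.0  # speed of light, m/s


def sagnac_phase_rad(coil_length_m, coil_diameter_m, wavelength_m, omega_rad_s):
    """Sagnac phase shift for a fiber/waveguide coil:
    delta_phi = 2 * pi * L * D * Omega / (lambda * c)
    """
    return (2.0 * math.pi * coil_length_m * coil_diameter_m * omega_rad_s
            / (wavelength_m * C))


# Illustrative inputs: a 10 m waveguide delay line wound at a 20 mm
# diameter, 1550 nm light, and Earth's rotation rate as the test signal.
earth_rate = 7.292e-5  # rad/s
phi = sagnac_phase_rad(10.0, 0.02, 1550e-9, earth_rate)
# phi comes out around 2e-7 rad, i.e. a few hundred nano-radians,
# the same order of magnitude as the tiny signals described above.
```

The exercise makes the engineering point concrete: detecting rotations as slow as Earth's spin with a chip-scale coil means resolving phases orders of magnitude below a single optical fringe.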

The system also has a non-ASIC, two-sided electronics board with an analog lock-in amplifier, a temperature controller, and an isolator, Paniccia said. It has none of the drawbacks of MEMS and uses 3.3 volts, he added.

Paniccia said the SiPhOG unit includes an optical gyro, triple-redundant MEMS, accelerometers, and magnetometers. It also has two GPS chips and dual antennas and is sealed to be waterproof.

The ANELLO IMU+ is designed for harsh environments including construction, robotics, mining, trucking, and defense. Source: ANELLO

Navigation system ready for multiple markets

Autonomous systems can work with ANELLO’s technology and the Global Navigation Satellite System (GNSS) for navigation, positioning, and motion tracking for a range of applications, said the company.

“We’re shipping to customers now in orchards, where the leaves come in, and the water in them essentially acts like a tunnel, absorbing GPS,” Paniccia said. “Our algorithm says, ‘I’m losing GPS, so weight the navigation algorithm more toward the optical gyro.’ You want the robot to stay within a tenth of a meter across a distance of half a mile. Long-distance, we’re looking at 100 km of driving without GPS with less than 100-m lateral error.”
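ANELLO has not published its fusion algorithm; the function below is only a generic complementary blend, sketched to illustrate the re-weighting idea in the quote above. The quality score, its range, and the sample values are all invented:

```python
def fuse_heading(gyro_heading, gps_heading, gps_quality):
    """Blend two heading estimates (radians) by a GPS quality score in [0, 1].

    gps_quality = 1.0 means full GNSS confidence; 0.0 means GPS-denied,
    so the estimate falls back entirely on the optical gyro. This is a
    generic complementary blend, not ANELLO's algorithm, and it ignores
    angle wraparound for simplicity.
    """
    w = max(0.0, min(1.0, gps_quality))  # clamp the confidence score
    return w * gps_heading + (1.0 - w) * gyro_heading


# In the open, the GPS heading dominates; under tree canopy the quality
# score collapses and the estimate falls back on the gyro alone.
in_open = fuse_heading(gyro_heading=0.10, gps_heading=0.12, gps_quality=0.9)
denied = fuse_heading(gyro_heading=0.10, gps_heading=0.50, gps_quality=0.0)
```

Production systems typically do this weighting inside a Kalman-style filter, where each sensor's influence is set by its estimated error covariance rather than a single scalar score.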

In addition, SiPhOG is built for scalability and cost-effectiveness.

“VC friends tell me that automakers are putting six lidar systems on a car, and each one is $10,000. It’s never going to get to mass market,” Paniccia said. “We have an optical technology for land, air, and sea. And whether that land vehicle is for agriculture or construction, or in the longer term, trucking or autonomous cars, we can do it.”

“You can literally tape SiPhOG to a dashboard and plug it into the cigarette lighter,” he said. “We have self-alignment correction, and within 15 minutes, you can have GPS-denied navigation capability. We’re also shipping this system for indoor robots like in construction.”

“If I put three SiPhOGs in a cube, I can have the same performance but at one-fifth the size and weight and a quarter of the power for precision in three dimensions,” said Paniccia. “That’s exciting for drones and maritime.”

Investors to accelerate ANELLO 

Lockheed Martin, Catapult Ventures, and One Madison Group co-led ANELLO’s unspecified Series B round. New Legacy, Build Collective, Trousdale Ventures, In-Q-Tel (IQT), K2 Access Fund, Purdue Strategic Ventures, Santuri Ventures, Handshake Ventures, Irongate Capital, and Mana Ventures also participated. 

“We’re committed to fostering the art of the possible with investments in cutting edge technologies, including advancements in inertial navigation that have the potential to enhance autonomous operations in GPS-denied environments,” said Chris Moran, vice president and general manager of Lockheed Martin Ventures. “Our continued investment in ANELLO reflects our mission to accelerate technologies that can ultimately benefit national security.”

ANELLO said it plans to use its latest funding to continue developing and deploying its technology. The company has worked with the U.S. Department of Defense to optimize its algorithms against jamming or spoofing.

“Every week, there’s an article about a commercial flight or defense-related mission getting GPS jammed, like thousands of flights to and from Europe affected by suspected Russian jamming,” noted Tony Fadell, founder of Nest and a principal at investor Build Collective. “GPS has become a single point of failure because it’s too easily compromised with various jamming and spoofing techniques.”

“ANELLO’s proven and commercially available optical gyroscope is the only navigational tool that can take over, [offering] precision over long periods of time, the size of a golf ball, low-power, low-cost, that’s immune to shock and vibration,” he added. “ANELLO will save lives in the air, on the road, and over water.”

Nuro Driver expands Level 4 autonomous fleet in California and Texas
Tue, 19 Nov 2024
With this expanded deployment of zero-occupant vehicles, the company said Nuro Driver is ready to autonomously transport people and goods.


Nuro’s custom L4 vehicles use the Nuro Driver to safely carry food and drink, with no human present in the vehicle. | Source: Nuro

Nuro Inc. today announced a significant expansion of its driverless capabilities using zero-occupant vehicles with the artificial intelligence-powered Nuro Driver system. The company said this expansion covers multiple cities in two states and includes significant operational advancements.

The expanded deployment of autonomous vehicles demonstrates foundational technology for transporting people and goods, asserted Nuro. It plans to expand in Mountain View and Palo Alto, Calif., where the company increased its deployment area by 83%. Nuro also plans to increase its deployment area in Houston by 70%, in terms of linear miles. 

In September, Nuro expanded its business model to include licensing Nuro Driver to automotive OEMs. As part of the new licensing model, the company also announced the Nuro AI Platform, which consists of scalable and performant developer tools to support AI development and validation for the Nuro Driver.

“Since publicly unveiling our new direction a little over a month ago, we have seen tremendous interest in our AI-driven autonomy platform from automotive OEMs and mobility companies,” stated Jiajun Zhu, the co-founder and CEO of Nuro. “Our latest driverless deployment demonstrates the maturity and capability of our AI platform, and we’re excited for potential partners to capitalize on the performance, safety, and sophistication of the Nuro Driver to build their own incredible autonomy products.”




Nuro Driver ready to take on new challenges

Founded in 2016, Nuro said its newly expanded operational design domain (ODD) encompasses advances including:

  • Multi-lane road operation at speeds up to 35 mph (56.3 kph)
  • Improvements related to complex scenario handling, such as reacting to active emergency vehicles, navigating construction zones, and responding to active school buses
  • Night operation, expanding service availability

Nuro said its system now covers a wider portion of everyday driving conditions. The Mountain View-based company said this expanded operational scope demonstrates the growing sophistication and reliability of its autonomous vehicles in real-world applications.

To date, Nuro said its fleet has logged more than 1 million autonomous miles with zero at-fault incidents, underscoring the company’s commitment to safety and technological excellence. Its custom L4 vehicle is designed with cost-effective, automotive-grade components.

Nuro claimed that its approach ensures that its technology is not only highly capable but also practical for large-scale deployment across various vehicle types and use cases. The company said Nuro Driver can accelerate autonomous vehicle development by enabling up to SAE Level 4 autonomy on mobility platforms and personally-owned vehicles.

LOXO expands into Germany with self-driving logistics subsidiary https://www.therobotreport.com/loxo-expands-into-germany-with-self-driving-logistics-subsidiary/ https://www.therobotreport.com/loxo-expands-into-germany-with-self-driving-logistics-subsidiary/#respond Fri, 15 Nov 2024 07:00:49 +0000 https://www.therobotreport.com/?p=581602 LOXO, which has already successfully demonstrated autonomous deliveries with its software and vehicles, is opening an office in Munich.

The post LOXO expands into Germany with self-driving logistics subsidiary appeared first on The Robot Report.

The LOXO Alpha autonomous vehicle makes a delivery.

LOXO’s Migronomous delivery service demonstrated the Alpha self-driving vehicle. Source: Schindler Group

LOXO AG today opened its first international subsidiary in Munich, Germany. The Bern, Switzerland-based company has developed and deployed software-as-a-service, or SaaS, systems to provide autonomy to delivery vehicles.

“Germany is a natural next step for LOXO as we continue our mission to revolutionize commercial vehicle automation in Europe,” said Amin Amini, CEO of LOXO. “Germany’s strong logistics market, advanced automotive infrastructure, and progressive legislation surrounding autonomous vehicles make it the ideal location for us to further our middle-mile and mobile distribution projects.”

Amin Amini, Lara Amini, and Claudio Panizza founded the company in 2018. LOXO claimed that its vehicle-agnostic LOXO Digital Driver (LDD) software can give nearly any commercial vehicle SAE Level 4 autonomy.

In addition, the company operates its own fleet of logistics vehicles, the electric LOXO Alpha and LOXO R1.




LOXO begins with autonomous deliveries in Bern

LOXO recently rolled out what it said was Europe’s first L4 self-driving technology, which is operational on public roads in Bern. The company, which began with remote operation, has authorized autonomous driving routes spanning 65 km (40 mi.) within the city.

“The LDD software serves as the core of every autonomous operation, combining advanced AI with an autonomous sensor stack to enable vehicles to drive autonomously,” explained the company on its website. “Our virtual mapping approach significantly enhances scalability, ensuring operations remain fast and cost-effective.”

Last year, it conducted projects with Schindler Group and Migros, Switzerland’s largest online retailer. Its autonomous vehicles are currently proving their capabilities in projects with Planzer, a major Swiss logistics company.

German middle-mile market presents an opportunity

“Germany’s middle-mile logistics sector, valued at approximately $379.89 billion in 2023 and projected to surpass $504 billion by 2032, has enormous untapped potential,” asserted LOXO. “The repetitive nature of middle-mile routes between business hubs presents an ideal opportunity for autonomous driving solutions.”

Lara Amini, co-founder and current chief business officer of LOXO Switzerland, will lead the new Munich subsidiary.

“We’re not just focused on replacing traditional vehicles with autonomous ones. Our goal is to foster innovation that will catalyze the transformation of the entire logistics sector,” said Lara Amini. “By collaborating with local partners and tapping into Germany’s pool of talent, we aim to take LOXO’s successful model and scale it across new markets.”

The company said its expansion into Germany signals its commitment to advancing autonomous driving in one of Europe’s largest logistics markets. It said it aims to further integrate its vehicles into the supply chain and advance the state of commercial vehicle automation.

Silicon Sensing to supply PinPoint gyros for Martian moons exploration https://www.therobotreport.com/silicon-sensing-supply-pinpoint-gyros-martian-moons-exploration/ https://www.therobotreport.com/silicon-sensing-supply-pinpoint-gyros-martian-moons-exploration/#respond Wed, 13 Nov 2024 20:57:56 +0000 https://www.therobotreport.com/?p=581578 The size of a fingernail, PinPoint is the smallest gyro in Silicon Sensing Systems' MEMS product range and has been tested for space.

The post Silicon Sensing to supply PinPoint gyros for Martian moons exploration appeared first on The Robot Report.

Silicon Sensing’s miniature CRM200 PinPoint gyro. | Source: Silicon Sensing Systems

Robotic missions to Earth’s moon are challenging enough, but motion control on the moons of Mars requires precision technology. Silicon Sensing Systems Ltd. has been contracted by the German Aerospace Centre to supply two miniature PinPoint gyros for use in the Martian Moons eXploration mission. The mission aims to send rovers to survey Deimos and Phobos. 

The German Aerospace Centre (DLR) will use the company’s CRM200 gyros in the vehicle that will explore the larger of these moons, Phobos. There, this rover will collect surface samples. The set of PinPoint gyros will help detect unintended movement of the rover on the unknown surface.

Depending on the initial checkout of the drivetrain that includes the gyros, the team will activate an optional safety module in the software. This module will automatically prevent instability during drive sessions of the rover.
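Neither DLR nor Silicon Sensing has published the module's logic, but the underlying idea can be sketched in a few lines. Note that the function name, the 0.5 rad/s limit, and the sample values below are all invented for illustration; this is not the MMX rover's software.

```python
# Hypothetical sketch of a gyro-based drive-safety check. The limit and
# all names are illustrative, not taken from the MMX rover software.
def drive_is_stable(rate_samples_rad_s, limit_rad_s=0.5):
    """Return True unless any angular-rate sample exceeds the limit."""
    return all(abs(rate) <= limit_rad_s for rate in rate_samples_rad_s)

# Slow, expected pitching passes the check; a sudden lurch trips it.
assert drive_is_stable([0.01, -0.02, 0.03])
assert not drive_is_stable([0.01, 0.9, 0.02])
```

In practice, a flight-software version would debounce transients and feed a motion planner rather than a simple boolean, but the thresholding principle is the same.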

“PinPoint has a proven track record in space applications, but this will be a landmark use on a remarkable mission where this gyro’s reliability and endurance will be critical,” stated David Sommerville, the general manager of Silicon Sensing Systems. 

Founded in 1999, Silicon Sensing Systems engineers gyroscope and inertial systems. Jointly owned by Collins Aerospace and Sumitomo Precision Products, the company develops silicon, micro-electromechanical systems (MEMS)-based navigation and stabilization technology. 

Silicon Sensing said it has supplied millions of MEMS gyroscopes and accelerometers to thousands of customers.

Silicon Sensing designs compact, robust gyros

Just the size of a fingernail, at approximately 5 mm x 6 mm (0.2 x 0.24 in.), PinPoint is the smallest gyro in Silicon Sensing’s MEMS product range. The company said it is a proven, low-drift, single-axis angular-rate sensor with many applications across diverse market sectors.

In combination, these robust sensors can precisely measure angular rate across multiple axes. This includes any combination of pitch, yaw, and roll – all while consuming very little power, according to Silicon Sensing.
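As a rough illustration of that idea (not Silicon Sensing code; the sample values and function name are invented), each single-axis gyro contributes one component of a body-rate vector, and integrating a component's readings over time yields an angle estimate:

```python
import math

# Illustrative only: three orthogonally mounted single-axis rate gyros give
# roll, pitch, and yaw rates; integrating one axis approximates its angle.
# Real systems must also handle bias drift and axis cross-coupling.
def integrate_rate(rates_rad_s, dt_s):
    """Accumulate an angle (rad) from evenly sampled angular-rate readings."""
    return sum(rate * dt_s for rate in rates_rad_s)

# 100 samples of 0.1 rad/s taken every 0.01 s -> about 0.1 rad of rotation.
yaw_rad = integrate_rate([0.1] * 100, dt_s=0.01)
assert math.isclose(yaw_rad, 0.1)
```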

As part of the rigorous selection process for this exploration program, PinPoint completed total ionizing dose (TID) testing at 17 krad of radiation and proton testing at up to 68 MeV per proton. This testing demonstrated the gyro’s suitability for space requirements.

“We are also seeing increasing space-sector application for our latest tactical grade IMU [inertial measurement unit] — the DMU41 — which has recently been selected for a number of low-Earth orbit programs,” said Sommerville. “This growing interest in our MEMS-based inertial sensors and systems reflects the potential of this technology, with its rugged reliability, compact size, and low power consumption, for the sector.”




Mission gets ready for Martian moons

JAXA, the Japanese space agency, is leading the Martian Moons eXploration (MMX) mission. It will explore the two moons of Mars with contributions from NASA, ESA, CNES, and DLR. CNES, the French national space agency, and the DLR are jointly contributing a 25-kg (55.1 lb.) rover. 

Approximately one year after leaving Earth, the spacecraft will arrive in Martian space and enter into an orbit around the planet. It will then move into a quasi-satellite orbit (QSO) around Phobos to collect scientific data, drop the rover, and gather a sample of the moon’s surface.

After observation and sample collection, the spacecraft will return to Earth carrying the material gathered from Phobos.

The current schedule has a launch date in 2026, followed by a Martian orbit insertion in 2027. The team said it hopes the probe will return to Earth in 2031.

An illustration of the MMX rover, a boxy vehicle with four wheels attached on legs, driving on a moon of Mars.

The MMX rover vehicle will eventually gather samples from the surface of one of Mars’ moons. | Source: Silicon Sensing Systems

ROAMEO specs revealed by AITX RAD as company marks 1,000 security systems https://www.therobotreport.com/aitxs-rad-reveals-specs-for-new-autonomous-security-robot/ https://www.therobotreport.com/aitxs-rad-reveals-specs-for-new-autonomous-security-robot/#respond Thu, 07 Nov 2024 15:50:10 +0000 https://www.therobotreport.com/?p=581454 AITX RAD unveiled the capabilities of ROAMEO Gen 4, its autonomous security robot that will begin shipping in March 2025.

The post ROAMEO specs revealed by AITX RAD as company marks 1,000 security systems appeared first on The Robot Report.

hero image of an illustration of the ROAMEO Gen4 autonomous mobile robot.

The ROAMEO Gen 4 autonomous security robot is equipped with an array of sensors that enable it to patrol remote areas of a facility. | Credit AITX RAD

Robotic Assistance Devices Inc., or RAD, this month announced the specifications of the fourth generation of its ROAMEO wheeled security robot. The Ferndale, Mich.-based company also today celebrated the deployment of 1,000 devices.

RAD said that ROAMEO Gen 4 will provide security and concierge services with enhanced capabilities and artificial intelligence integration. The vehicle can autonomously patrol large corporate, college, and university campuses, stated the subsidiary of Artificial Intelligence Technology Solutions Inc. (AITX).

“The market for a robot like ROAMEO remains untapped, and based on our years serving this space, we are certain that it is a significantly large market,” said Steve Reinharz, chief technology officer and CEO of AITX and RAD.

“Having solved a myriad of technical challenges and deployed many earlier versions of ROAMEO, we are perfectly positioned to define and capture this market,” he added. “This is the culmination of years of creation, testing, and perseverance. We are beyond thrilled with today’s big reveal and excited for what this robot will do for the industry and AITX.”

ROAMEO builds on previous models for faster ROI

RAD said it gained insights from deployments of ROAMEO 1.x and 2.x, as well as from work on the unreleased Version 3.x. The latest design includes fully autonomous recharging, as well as a software architecture built around AITX’s proprietary Autonomous Intelligent Response (AIR) technology.

Patrolling outdoor spaces can be tedious, expensive, and dangerous work, noted the company. It said the robot enables campus security teams to extend their presence and mission scope without expending additional resources. 

“From an industry perspective, demand for ROAMEO is so strong because it addresses a significant pain point for these end users,” said Reinharz. “That’s why we’re seeing so much interest and unsolicited outreach to us.”

“ROAMEO can make a positive financial impact even with as few as 25 deployments,” he said. “We’ll be working to hit 100 deployments as soon as possible, forecasted by the end of 2026, and that will be quite remarkable. Looking ahead, we’re anticipating hundreds of units being deployed in the coming years, marking a major step forward in our growth, path to profitability, and continued innovation.”

RAD acknowledged delays in bringing ROAMEO to market, citing the need to shift priorities from other systems including stationary sensors. It has completed work on ROSA and RIO Gen 4, as well as AVA Gen 4. The company will initially integrate its AIR technology into RADCam, set to begin shipping in December. RAD will then roll it into all of its other systems and ROSS software.




New vehicle is taller to see over traffic

ROAMEO Gen 4 is the largest autonomous security device RAD said it has produced to date, and it is designed for high-visibility outdoor applications. 

ROAMEO stands at a height of 6 ft., 9 in. (205.7 cm), width of 5 ft., 5 in. (165 cm), and length of 8 ft. 4 in. (254 cm). This is smaller than a midsize automobile but larger than the typical security golf cart. The taller robot has a clear line of sight over vehicles and people in high-traffic areas for both advanced detection and person/vehicle engagement, said RAD.

The vehicle weighs 1,609 lb. (729.8 kg) and delivers up to 16 hours of continuous run time, offering two-shift coverage on a single charge. With a ground clearance of up to 9.4 in. (23.8 cm) and four-wheel drive, ROAMEO can climb up to a 20% incline.

ROAMEO Gen 4 features lidar, radar, vision, microphones, and ultrasonic sensors. It sends data to an onsite command center in real time, alerting security staff to unusual situations.

ROAMEO Gen 4 is equipped with 215/45R17 wheels and tires, making the transition to snow or mud-terrain tires quick and inexpensive, added RAD. The company claimed that ROAMEO Gen 4 will be capable of SAE Level 5 autonomous operation in weather including rain and snowstorms.

illustration of the RAD ROAMEO Gen4 vehicle next to a woman and a pickup truck for size.

The ROAMEO Gen 4 is tall enough that its sensors can see over most parked vehicles. | Credit: AITX RAD

RAD celebrates growth, expects more public safety demand

Robotic Assistance Devices today said that it has surpassed 1,000 deployed and contracted security devices as of late October.

“Reaching this 1,000-device milestone validates RAD’s role in transforming the security landscape,” said Reinharz. “Our clients are choosing RAD to address their most pressing security needs with efficiency and impact.”

The company said it anticipates a renewed commitment to public safety and security from federal, state, and local governments in the coming months. RAD asserted that it is prepared to support evolving government initiatives with responsive security systems that also offer cost savings.

“We’ve seen a significant increase in interest and opportunity from municipalities and regional jurisdictions looking for advanced security solutions that can meet their communities’ needs,” Reinharz added. “Our solutions provide an efficient approach that enhances security without the high costs traditionally associated with physical security measures. We’re committed to supporting these evolving public safety initiatives with technology that’s accessible, non-biased, and impactful.”

The company estimated that 250 deployed ROAMEO units could generate as much as $20 million in recurring revenue. RAD said it has a prospective sales pipeline of more than 35 Fortune 500 companies and numerous other client opportunities.

Geek+ and Intel launch Vision Only Robot Solution for smart logistics https://www.therobotreport.com/geekplus-intel-launch-vision-only-robot-system-logistics/ https://www.therobotreport.com/geekplus-intel-launch-vision-only-robot-system-logistics/#respond Mon, 04 Nov 2024 19:19:25 +0000 https://www.therobotreport.com/?p=581397 Geek+ expects these robots to work in factory and warehouse transportation, helping customers build agile, digital, and intelligent supply chains.

The post Geek+ and Intel launch Vision Only Robot Solution for smart logistics appeared first on The Robot Report.

An image of Intel's robotic vision hub. You can see the outline of an AMR, with the hardware of the vision hub being the only thing visible.

The Robotic Vision Hub, which contains components such as the Intel Core i7-1270P processor and connection modules. | Source: Geek+

Geekplus Technology Co. today launched its Vision Only Robot Solution. The system includes Intel Visual Navigation Modules, which Geek+ said will drive the digital transformation of the logistics industry. 

“The Vision Only Robot Solution, developed in collaboration with Intel, effectively leverages the depth vision perception of the Intel RealSense camera,” stated Solomon Lee, vice president of product at Geek+. “Together with the deep algorithmic innovations from both sides, it results in a boost in business growth and efficiency for customers, driving the digital and intelligent upgrade of smart logistics.”

Geek+ claimed that its new system is the world’s first vision-only autonomous mobile robot (AMR) using Intel Corp.‘s Visual Navigation Modules. It also features algorithmic innovations in V-SLAM (visual simultaneous localization and mapping) positioning, composite detection networks, and robot following, the partners said. This allows for highly accurate navigation and obstacle avoidance, helping enterprises cope with diverse and complex logistics scenarios while enhancing both efficiency and accuracy, said Geek+.

The vision-only robots equipped with the Intel Visual Navigation Modules will debut this week at CeMAT in Shanghai. Geek+ said it plans to strengthen its partnership with Intel to develop more smart logistics systems.

Founded in 2015, Geek+ said that more than 1,000 customers use its AMRs for warehouses and supply chain management. The company has offices in the U.S., Germany, the U.K., Japan, South Korea, China, and Singapore. Last month, it opened a 40,000-sq.-ft. facility near Atlanta, announced a 12 m (40 ft.) tall automated storage system, and partnered with Floatic.

Intel RealSense supports vision-based AI

Geek+ explained that its Vision Only Robot Solution integrates the Intel RealSense camera. This camera has an all-in-one design that enables all depth calculations to be performed directly within the device. This will result in low power consumption and independence from specific platforms or hardware, said the companies.

The Intel RealSense also supports various vision-based AI, noted Intel. When paired with a dedicated visual processor, it can accelerate the machine-learning process and shorten the deployment cycle for new automation.

Thanks to the Intel RealSense camera, Geek+ said its Vision Only Robot can observe, understand, and learn from its environment. By obtaining highly accurate and consistent depth data, the robot can accurately recognize and interact with its surroundings, the company said.

“Highly accurate and consistent depth vision data is critical for [an] AMR to achieve environmental perception, significantly influencing its performance in positioning, navigation, and obstacle avoidance,” said Mark Yahiro, vice president of corporate strategy and ventures and the general manager of the RealSense business unit within Intel’s Corporate Strategy Office.

“Through collaboration with Geek+, we are driving AMR innovations based on depth vision data, enabling logistics robots to deliver highly stable and accurate transport services in complex environments, thereby empowering agile, digital, and intelligent supply chains,” he said.

In addition to the camera, the Intel Visual Navigation Module includes the Robotic Vision Hub, which contains components such as the Intel Core i7-1270P processor and connection modules. The module also enables cloud-edge collaboration through high-speed networks, said the partners.




Geek+ aims for algorithmic innovation 

Geek+ said it is building on the Intel Visual Navigation Module to provide reliable computational support for algorithms running on its Vision Only Robot:

  • V-SLAM positioning algorithm: This fuses multi-sensor data and various visual feature elements to generate composite maps, such as point feature maps, line feature maps, object maps, and special area maps. It can deliver reliable and precise positioning in complex and dynamic environments, said the companies.
  • Composite detection network: With both a traditional object-detection network and a validation network, it processes detection data from multiple dimensions, thus enhancing accuracy and reducing the false detection rate.
  • Robot following: By integrating modules such as personnel detection, re-identification, and visual target tracking, Geek+ said it has developed a flexible and efficient visual perception pipeline. Once the relative position between the target personnel and the AMR is determined, the local planning algorithm in Geek+’s self-developed RoboGo, a robotic standalone system, will enable autonomous obstacle avoidance for smooth AMR following of target personnel.
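Geek+ has not released this code, but the obstacle-avoidance step described above can be caricatured with a minimal, hypothetical depth-thresholding check. The function name, distances, and pixel counts below are all invented for illustration; this is not Geek+ or Intel code.

```python
# Hypothetical sketch: declare an obstacle if enough depth pixels in the
# camera's view fall below a stop distance. Not Geek+ or Intel code.
def obstacle_ahead(depth_m, stop_dist_m=0.5, min_pixels=3):
    """depth_m: 2D list of per-pixel depths in meters (0 means no return)."""
    close = sum(1 for row in depth_m for d in row if 0 < d < stop_dist_m)
    return close >= min_pixels

assert not obstacle_ahead([[1.2, 1.5], [2.0, 1.8]])  # path is clear
assert obstacle_ahead([[0.3, 0.2], [0.4, 1.8]])      # boxes in the way
```

A production AMR would run this kind of test per region of interest on every depth frame and combine it with the V-SLAM pose estimate before planning a detour.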

Geek+ said the combination of the Intel Visual Navigation Module’s depth perception and collaborative algorithmic innovations will ensure efficiency for its Vision Only Robot. It will also provide high precision and efficiency for environmental perception, positioning, and tracking, the company said.

Intel and Geek+ said they expect to see widespread adoption of these robots in areas such as factory and warehouse transportation.

Geek+ and Intel have debuted the Vision Only Robot Solution. Source: Geek+

Top 10 robotics developments of October 2024 https://www.therobotreport.com/top-10-robotics-developments-of-october-2024/ https://www.therobotreport.com/top-10-robotics-developments-of-october-2024/#respond Fri, 01 Nov 2024 17:47:03 +0000 https://www.therobotreport.com/?p=581379 In October 2024, large funding rounds, new AI product developments, and, of course, humanoids drew readers' attention.

The post Top 10 robotics developments of October 2024 appeared first on The Robot Report.

October was another busy month for the robotics industry. It included many new developments and the return of exciting events like RoboBusiness, which took place in Santa Clara, Calif. Large funding rounds, new AI product developments, and, of course, humanoids were just a few of the things that drew our readers' attention this month.

Here are the top 10 most popular stories on The Robot Report in October 2024. Subscribe to The Robot Report Newsletter or listen to The Robot Report Podcast to stay up to date on the robotics developments you need to know about.


10. Relay Robotics proposes levels of autonomous navigation for indoor robots

When we think of autonomous navigation, the first thing that usually comes to mind is self-driving cars. Although their development has spanned decades, recent years have seen significant advancements. One important framework that is used ubiquitously in the self-driving car industry is the classification of levels of driving automation. Defined by the Society of Automotive Engineers (SAE) in 2014, this framework remains a standard reference in the field. Read More
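For readers unfamiliar with that framework, the six SAE J3016 levels can be summarized as a simple lookup table. The descriptions are paraphrased, and the code framing is ours, not Relay's:

```python
# SAE J3016 levels of driving automation, paraphrased for reference.
SAE_LEVELS = {
    0: "No automation: the human does all the driving",
    1: "Driver assistance: steering or speed support",
    2: "Partial automation: steering and speed, human supervises",
    3: "Conditional automation: system drives, human takes over on request",
    4: "High automation: no human needed within a defined domain",
    5: "Full automation: no human needed anywhere",
}

assert SAE_LEVELS[4].startswith("High automation")
```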


9. Robot Utility Models: the coolest thing you never heard about (yet)

Robot Utility Models (RUMs) are a new area of research and development for the advancement of AI training for robotics. Lerrel Pinto, an assistant professor of computer science, and a team at New York University created RUMs. This open-source research project is trying to generalize training for robots so that one doesn't have to train thousands of examples of a task. Read More


8. Innovative motion solutions are supporting the latest trends in robotics

Rapidly growing markets for robot innovation include applications that enhance human health and wellbeing, such as bionics and robotic surgery. Robots in these fields rely on miniature DC motion technology, which require state-of-the-art motion control. Motors with increasing torque density and dynamics are helping robot designers improve precision and control. Read More


7. Atlas humanoid robot shows increasing competence in latest Boston Dynamics video

Boston Dynamics Inc. released a new video of its Atlas humanoid robot. The video shows the electric robot handling large automotive parts autonomously. According to the company, the robot uses machine learning to execute its tasks and 3D vision to perceive the world around it. Read More


6. Robotics investments near $1B in August

Fifty producers of robots and robotics-enabling technologies received funding in August 2024, pulling in a total of approximately $1 billion. This figure is on par with the $1.2 billion average The Robot Report has tracked each of the previous 12 months. Investment targeted to robotics companies for the first eight months of 2024 equals about $10.86 billion.  Read More


5. Fourier launches GR-2 humanoid, software platform

Shanghai-based Fourier launched GR-2, the latest generation of its GRx humanoid robot series. It has upgraded its hardware, design, and software. This announcement followed the company's rebranding from Fourier Intelligence to Fourier earlier this year, and the GR-2 release builds on the production release of the first-generation GR-1 in late 2023. Read More


4. PULS acquires Wiferion’s wireless charging business

DIN rail power supply provider PULS has acquired Wiferion from Tesla. This deal comes after Tesla acquired Wiferion for an undisclosed amount in June 2023. PULS said it plans to continue manufacturing, marketing, and selling the company's wireless charging products worldwide. Read More


3. Renishaw and RLS help to drive a robot revolution

A revolution in collaborative robots promises to change how assistive care is delivered to the elderly, how people interact with their work environment, and even how surgeons perform heart surgery. RLS d.o.o. has cultivated a long-standing, value-added partnership with the German company TQ-RoboDrive, part of the TQ-Group. Read More


2. Universal Robots AI Accelerator offers to ease development of cobot applications

The latest advances in artificial intelligence promise to improve robot capabilities, but engineers need to bring the technologies together. Universal Robots announced its UR AI Accelerator, a hardware and software toolkit to enable the development of AI-powered collaborative robot applications. Read More


1. Corvus Robotics soars to new heights with Series A round for drone inventory

Corvus Robotics Inc. has closed an $18 million Series A round and seed funding led by S2G Ventures and Spero Ventures. The Mountain View, Calif.-based company has been engineering and validating its inventory drone system since it was founded in 2017. Read More

RBR50 Spotlight: evoBot offers unique design for autonomous mobile robots https://www.therobotreport.com/rbr50-spotlight-evobot-offers-unique-design-for-autonomous-mobile-robots/ https://www.therobotreport.com/rbr50-spotlight-evobot-offers-unique-design-for-autonomous-mobile-robots/#respond Thu, 31 Oct 2024 20:01:03 +0000 https://www.therobotreport.com/?p=581365 evoBOT from Fraunhofer IML has a seemingly simple design that doesn’t resemble any other mobile robot on the market today.

The post RBR50 Spotlight: evoBot offers unique design for autonomous mobile robots appeared first on The Robot Report.



Organization: Fraunhofer Institute for Material Flow and Logistics
Country: Germany
Website: https://www.iml.fraunhofer.de/en/fields_of_activity/material-flow-systems/iot-and-embedded-systems/evobot.html
Year Founded: 1981
Number of Employees: 500+
Innovation Class: Technology, Product & Services

evoBOT is the kind of robot that stands out the moment you look at it. Its unique and seemingly simple design doesn’t resemble any other autonomous mobile robot (AMR) on the market today. Designed at the Fraunhofer Institute for Material Flow and Logistics (IML), the robot consists of two wheels and gripper arms. If you’re looking at it head-on while it’s zooming around without any cargo, the robot simply looks like an arch with wheels attached at the bottom.

The robot uses these arms to grip boxes by applying pressure from each side, and it maintains balance as it zips across airports. It keeps itself upright with a dynamically stable system based on the principle of an inverted pendulum, meaning it has no external counterweight. This ability to balance enables it to move on different and uneven surfaces.
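A toy simulation shows why this works: a feedback controller that measures tilt and tilt rate can hold an inverted pendulum upright with no counterweight. The proportional-derivative gains and physical constants below are invented for illustration; this is not Fraunhofer IML's controller.

```python
import math

# Toy inverted-pendulum balance loop; all constants are illustrative.
def final_tilt(theta0=0.1, steps=500, dt=0.01, g=9.81, length=0.5,
               kp=40.0, kd=8.0):
    """Simulate a PD-stabilized inverted pendulum; return final tilt (rad)."""
    theta, omega = theta0, 0.0
    for _ in range(steps):
        # Gravity tips the pendulum over; the control torque pushes back.
        accel = (g / length) * math.sin(theta) - (kp * theta + kd * omega)
        omega += accel * dt
        theta += omega * dt
    return theta

assert abs(final_tilt()) < 1e-3  # the controller drives the tilt to zero
```

With the feedback terms removed, the same loop diverges, which is the whole point of a dynamically stable design: the balance comes from active control, not from mass distribution.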

evoBOT can reach a maximum speed of up to 60 kph (37.2 mph) and can transport a load of up to 100 kg (220.4 lb.). It can handle hazardous goods, transport parcels for longer recurring distances, and relieve employees during lifting and overhead work. The mobile manipulator can also procure materials and provide support during the loading and unloading of an aircraft.

Last year, evoBOT completed its first test run at the Munich Airport. There, it performed a practical test in the cargo terminal and on the apron of the airport.

These tests further proved the versatility of Fraunhofer IML’s system, setting it up for potential deployments in numerous industries. evoBOT’s innovative design sets it apart from its AMR counterparts, and it has the capabilities to back that up.

Explore the RBR50 Robotics Innovation Awards 2024.


RBR50 Robotics Innovation Awards 2024

Organization | Innovation
ABB Robotics | Modular industrial robot arms offer flexibility
Advanced Construction Robotics | IronBOT makes rebar installation faster, safer
Agility Robotics | Digit humanoid gets feet wet with logistics work
Amazon Robotics | Amazon strengthens portfolio with heavy-duty AGV
Ambi Robotics | AmbiSort uses real-world data to improve picking
Apptronik | Apollo humanoid features bespoke linear actuators
Boston Dynamics | Atlas shows off unique skills for humanoid
Brightpick | Autopicker applies mobile manipulation, AI to warehouses
Capra Robotics | Hircus AMR bridges gap between indoor, outdoor logistics
Dexterity | Dexterity stacks robotics and AI for truck loading
Disney | Disney brings beloved characters to life through robotics
Doosan | App-like Dart-Suite eases cobot programming
Electric Sheep | Vertical integration positions landscaping startup for success
Exotec | Skypod ASRS scales to serve automotive supplier
FANUC | FANUC ships one-millionth industrial robot
Figure | Startup builds working humanoid within one year
Fraunhofer Institute for Material Flow and Logistics | evoBot features unique mobile manipulator design
Gardarika Tres | Develops de-mining robot for Ukraine
Geek+ | Upgrades PopPick goods-to-person system
Glidance | Provides independence to visually impaired individuals
Harvard University | Exoskeleton improves walking for people with Parkinson’s disease
ifm efector | Obstacle Detection System simplifies mobile robot development
igus | ReBeL cobot gets low-cost, human-like hand
Instock | Instock turns fulfillment processes upside down with ASRS
Kodama Systems | Startup uses robotics to prevent wildfires
Kodiak Robotics | Autonomous pickup truck to enhance U.S. military operations
KUKA | Robotic arm leader doubles down on mobile robots for logistics
Locus Robotics | Mobile robot leader surpasses 2 billion picks
MassRobotics Accelerator | Equity-free accelerator positions startups for success
Mecademic | MCS500 SCARA robot accelerates micro-automation
MIT | Robotic ventricle advances understanding of heart disease
Mujin | TruckBot accelerates automated truck unloading
Mushiny | Intelligent 3D sorter ramps up throughput, flexibility
NASA | MOXIE completes historic oxygen-making mission on Mars
Neya Systems | Development of cybersecurity standards hardens AGVs
NVIDIA | Nova Carter gives mobile robots all-around sight
Olive Robotics | EdgeROS eases robotics development process
OpenAI | LLMs enable embedded AI to flourish
Opteran | Applies insect intelligence to mobile robot navigation
Renovate Robotics | Rufus robot automates installation of roof shingles
Robel | Automates railway repairs to overcome labor shortage
Robust AI | Carter AMR joins DHL's impressive robotics portfolio
Rockwell Automation | Adds OTTO Motors mobile robots to manufacturing lineup
Sereact | PickGPT harnesses power of generative AI for robotics
Simbe Robotics | Scales inventory robotics deal with BJ’s Wholesale Club
Slip Robotics | Simplifies trailer loading/unloading with heavy-duty AMR
Symbotic | Walmart-backed company rides wave of logistics automation demand
Toyota Research Institute | Builds large behavior models for fast robot teaching
ULC Technologies | Cable Splicing Machine improves safety, power grid reliability
Universal Robots | Cobot leader strengthens lineup with UR30

Advantech partners with oToBrite to create low-latency AI for AMRs
https://www.therobotreport.com/advantech-partners-with-otobrite-to-create-low-latency-ai-for-amrs/
Thu, 31 Oct 2024 12:30:22 +0000
Advantech and oToBrite said the joint system will enable high-resolution, low-latency AI for next-generation AMRs.

The post Advantech partners with oToBrite to create low-latency AI for AMRs appeared first on The Robot Report.


Advantech and oToBrite said their joint system will benefit industries from logistics to manufacturing. | Source: oToBrite

oToBrite Electronics Inc. this week announced a strategic partnership with Advantech to co-develop high-performance, cost-effective perception for mobile robots. oToBrite will bring its experience with artificial intelligence, machine vision, and automotive-grade cameras, while Advantech will provide expertise in the global industrial Internet of Things.

The collaborators said they will integrate oToBrite’s high-speed automotive Gigabit Multimedia Serial Link (GMSL) cameras with Advantech’s AFE-R360 platform, powered by the Intel Core Ultra H/U (Meteor Lake).

The joint system will enable high-resolution, low-latency AI for next-generation autonomous mobile robots (AMRs), benefiting industries from logistics to manufacturing, said the companies.

oToBrite says GMSL cameras meet industry needs

AMR applications have expanded into warehouse logistics, last-mile delivery, and terminal or yard tractors. In response to this, oToBrite said integrating GMSL technology addresses the increasing need for real-time, uncompressed, and high-resolution perception. The company said its technologies enable accurate autonomous navigation in diverse environments.

As a provider of advanced driver-assist systems (ADAS), oToBrite has manufactured several vision-AI products for major automakers. Those products rely on high-speed data transmission to handle the large data flow from multiple cameras and enable real-time processing in vehicles.

To meet demand, oToBrite has integrated GMSL technology in products like SAE Level 2+ ADAS and Level 4 autonomous valet parking. The Hsinchu, Taiwan-based company said its automotive experience and technology will enable customers to successfully deploy AMRs with Advantech.

The oToCAM222 series with 2.5M pixels offers multiple viewing angles (63.9°/120.6°/195.9°). The company said this makes it suitable for low-speed AMR applications in challenging industrial environments. The camera offers high-speed, low-latency data processing and IP67/69K-rated durability, said oToBrite.

The company also noted that it has created advanced vision-AI models, embedded system software for various platforms, and active alignment technology for IP67/69K automotive cameras in its IATF16949-certified factory. 




Advantech initiates partnerships with other camera providers

The AFE-R360 platform is powered by Intel’s Core Ultra 16-core processor with Arc graphics and an integrated neural processing unit (NPU) delivering up to 32 trillion operations per second (TOPS), enhanced by the OpenVINO toolkit for optimized AI performance, said Advantech.
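To put a claimed 32-TOPS budget in context, here is a hypothetical back-of-envelope calculation. The per-frame workload, utilization factor, and camera count below are assumptions for illustration, not Advantech or Intel figures:

```python
# Back-of-envelope: what a 32-TOPS NPU budget could mean per camera frame.
# All workload figures below are hypothetical, for illustration only.

NPU_TOPS = 32            # trillions of ops per second (claimed peak)
OPS_PER_FRAME = 40e9     # assume a detection model needs ~40 GOPs per frame
UTILIZATION = 0.3        # real pipelines rarely sustain peak throughput

effective_ops = NPU_TOPS * 1e12 * UTILIZATION
frames_per_second = effective_ops / OPS_PER_FRAME  # total across cameras
per_camera_fps = frames_per_second / 4             # e.g. four GMSL cameras
```

Under these assumptions, the accelerator could sustain roughly 60 frames per second per camera across a four-camera GMSL rig, which is the kind of headroom low-latency AMR perception needs.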

In addition to its partnership with oToBrite, the Taipei, Taiwan-based company recently initiated a partner alignment strategy, gathering top camera providers to develop systems for the AFE-R360. These include Intel RealSense, e-con Systems, and Innodisk. 

“Advantech is proud to collaborate with Intel and camera vendors to strengthen our AMR solution,” stated James Wang, director of embedded applications at Advantech. “By incorporating MIPI and GMSL interfaces into AFE-R360, Advantech is committed to providing our customers with cutting-edge technology that meets the challenges of tomorrow. These interfaces not only enhance performance but also enable new possibilities in imaging applications across various industries.” 

The offering also includes a 3.5-in. (8.8-cm) single board computer (SBC) supporting up to eight MIPI-CSI lanes for seamless GMSL input, ensuring low latency and high noise immunity essential for autonomous operations, said the companies. It also has three LAN and three USB-C ports for integrating depth and lidar sensors.

oToBrite and Advantech said their combination of AI and advanced GMSL camera technology will enhance cost-effective AMR systems.

Relay Robotics proposes levels of autonomous navigation for indoor robots
https://www.therobotreport.com/relay-robotics-explains-autonomous-navigation-levels-indoor-robots/
Sun, 27 Oct 2024 15:36:41 +0000
Autonomous navigation is best understood in terms of levels of autonomy, similar to those for self-driving cars, notes Relay Robotics.

The post Relay Robotics proposes levels of autonomous navigation for indoor robots appeared first on The Robot Report.


A Relay robot makes a hotel delivery using autonomous navigation. Source: Relay Robotics

When we think of autonomous navigation, the first thing that usually comes to mind is self-driving cars. Although their development has spanned decades, recent years have seen significant advancements.

One important framework that is used ubiquitously in the self-driving car industry is the classification of levels of driving automation. Defined by the Society of Automotive Engineers (SAE) in 2014, this framework remains a standard reference in the field.

While indoor mobile robots have enjoyed nowhere near the fame that self-driving cars have, they’ve evolved substantially in the past decade as well. Driven by staff shortages, service robots are increasingly being deployed across various industries, including hospitality, healthcare, warehouse and logistics, food service, and cleaning.

Relay robots, in particular, are being deployed in busy hospitals and hotels across the world. However, unlike automated driving, there is currently no widely adopted standard for levels of autonomous navigation for indoor robots. Our objective is to present such a framework.

Given the inherent availability of a human driver as fallback in self-driving cars, much of the SAE framework is based on the distribution of driving responsibilities between the human driver and the self-driving agent. Level 0 indicates no automation where the human driver is completely in control.

Levels 1, 2, and 3 have varying degrees of partial automation. At Level 4, the vehicle is fully self-driving, but only under certain defined conditions. Leading self-driving companies like Waymo have achieved this level of autonomy.

Finally, Level 5 is full automation everywhere and in all conditions. This level has not been achieved yet.
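The SAE levels summarized above fit naturally in a small lookup table. The one-line descriptions here are paraphrased for illustration, not the standard's official wording:

```python
# The SAE driving-automation levels summarized above, as a lookup table.
# Descriptions are paraphrased for illustration, not SAE's official text.
SAE_LEVELS = {
    0: "No automation: the human driver is completely in control",
    1: "Driver assistance: a single automated function, driver supervises",
    2: "Partial automation: combined functions, driver must supervise",
    3: "Conditional automation: system drives, human must take over on request",
    4: "High automation: fully self-driving within defined conditions",
    5: "Full automation: self-driving everywhere, in all conditions",
}

def requires_human_fallback(level):
    """Levels 0-3 keep a human in the loop as fallback; Levels 4-5 do not."""
    return level <= 3
```

The fallback distinction is the key fact the article builds on next: indoor robots have no driver to fall back on, so they effectively start where the human handoff ends.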

What influences levels of autonomous navigation for indoor robots?

Installation complexity

Indoor robots do not have an inherent partnership with a human driver. Essentially, they begin at Level 4 of the SAE framework in this regard. But indoor robots do have a different advantage, another crutch to rely on instead at initial levels of autonomy — the ability to modify their environment.

For example, modifying a building’s infrastructure by painting lines on the floor or placing landmarks on the walls is not as difficult relative to modifying all road infrastructure. Such markers can be very helpful aids for automated guided vehicle (AGV) navigation.

In general, indoor robots today go through an installation process before being put into operation. In addition to modifying building infrastructure, mapping, labeling, and other required setup can be a part of this process. This can often be cost-, time-, and labor-intensive.

The more advanced the navigation skills of the robot though, the less complicated the installation process tends to be. And lower installation complexity leads to lower cost and friction for adoption.

Installation complexity is thus an important factor to consider while defining the levels of autonomous navigation for indoor robots.


Mobile robot considerations include safety and efficiency in dynamic settings like hospitals. Source: Relay Robotics

Social navigation

Another major distinction between self-driving cars and indoor autonomous robots is of course the difference in environments. With the exception of factory-like environments, most indoor environments are very unstructured. There are no lanes or signals, no dedicated crosswalks for people, and no well-defined rules of the road.

Instead, indoor environments are highly social spaces. Robots have to co-navigate with all other agents, human and robot, that are also using the space. Well-defined rules of the road are replaced by a loosely defined set of social rules that change based on country, environment, situation and many other factors. For instance, do robots, people, or other vehicles pass on the left or the right?

Successfully navigating in these highly unstructured and social environments requires skills and behaviors that are usually placed under the label “social navigation.” At a high level, social navigation is a set of behaviors that allows a robot to navigate in human-populated environments in a way that preserves or even enhances the experience of the humans around it.

While functional navigation focuses on safety and efficiency, resulting in robots that can complete a task but often need humans to adapt to them, social navigation focuses on the quality of human experience and allows robots to adapt to humans. This may not be crucial for controlled, human-sparse environments like factories and warehouses but becomes increasingly important for unstructured, human-populated environments.

Operational domain helps define autonomous navigation

A robot’s operational domain is the kinds of environments it can be successful in. Not all indoor environments are the same. Different environments have different needs and might require different levels of navigation sophistication.

For instance, warehouses and factories allow for robots with simpler, safety focused navigation to be successful. On the other hand, environments like hotels or restaurants are unstructured, unpredictable and require higher levels of navigation skill, particularly social navigation. Even more challenging are highly crowded environments or sensitive environments like hospitals and elder care homes.

Not every indoor environment requires a robot of the highest social navigation level, but placing a robot with low social navigation skill in environments like hospitals can result in poor performance. So it is important to define the operational domain of a robot.

Multi-floor autonomous navigation

Self-driving cars need only worry about single-level roads. But a large number of buildings in the world are multi-floor, and robots need to be able to traverse those floors to be effective. Overcoming this challenge of vertical navigation can result in a huge increase in a robot’s operational domain and is an important factor to consider when defining a robot’s level.

So installation complexity, social navigation, and operational domain are the three barometers against which we can measure the level of autonomous navigation for indoor robots.

Multi-floor navigation, while hugely important, is somewhat orthogonal to 2D navigation skill, and robots at every navigation level could potentially gain it. So we create a modifier for this capability that can be added to any level.
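As a sketch, the three barometers plus the "+" modifier can be captured in a small data structure. The field names and label values below are our own illustration, not a formal taxonomy:

```python
from dataclasses import dataclass

# Sketch of the proposed framework: a level (0-5) rated on the three
# barometers, plus the orthogonal multi-floor "+" modifier. Field names
# and label values are illustrative, not a formal taxonomy.

@dataclass
class IndoorNavLevel:
    level: int                    # 0..5
    installation_complexity: str  # e.g. "high", "moderate", "low", "none"
    social_navigation: str        # e.g. "none", "low", "moderate", "human-level"
    operational_domain: str       # environments the robot can succeed in
    multi_floor: bool = False     # the "+" modifier

    def designation(self) -> str:
        return f"Level {self.level}{'+' if self.multi_floor else ''}"

# Example: a service robot with onboard mapping that also rides elevators.
service_robot = IndoorNavLevel(
    3, "low", "moderate", "unstructured, human-populated", multi_floor=True)
```

Calling `service_robot.designation()` yields "Level 3+", the kind of designation used for elevator-riding service robots later in this article.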

With that, let’s dive into defining levels of indoor robot navigation.

Levels of autonomous navigation for indoor robots

Level 0

These are robots that have no autonomous navigation capabilities and rely entirely on humans to operate them. This category includes telepresence robots and remote-controlled robots such as remote-controlled cars.

Level 1

These robots have a minimal sensor suite and can only navigate on paths that are predefined using physical mechanisms such as wires buried in the floor, magnetic tape, or paint. Level 1 robots have no ability to leave these predefined paths.

Such AGVs have no concept of location, using only the distance traveled along the path to make decisions. They can typically detect obstacles and slow down or stop for them, but they do not have the ability to avoid obstacles.
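The behavior described above reduces to a very simple control step: steer back onto the physical guide line, use only distance along the path for decisions, and stop (never swerve) for obstacles. The sensor inputs, station positions, and gains below are hypothetical:

```python
# Sketch of one control tick for a Level 1 AGV. Inputs, station positions,
# and gains are hypothetical; real AGVs use vendor-specific controllers.

def agv_step(line_offset, obstacle_distance_m, distance_along_path_m,
             stations=(10.0, 25.0), stop_distance_m=0.5,
             speed_mps=1.0, kp=2.0):
    """Return (left, right) wheel speeds in m/s for one control tick."""
    # Detect and stop for obstacles; a Level 1 AGV cannot avoid them.
    if obstacle_distance_m < stop_distance_m:
        return (0.0, 0.0)
    # No concept of location: decisions use only distance along the path,
    # e.g. halting at predefined pickup/drop-off stations.
    if any(abs(distance_along_path_m - s) < 0.05 for s in stations):
        return (0.0, 0.0)
    # Proportional steering back onto the buried wire or magnetic tape.
    correction = kp * line_offset
    return (speed_mps - correction, speed_mps + correction)
```

Note there is no map, no localization, and no path choice anywhere in this loop, which is exactly what confines such vehicles to structured environments.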


A Mouse AGC 3A10-20T automated guided cart. Source: Toyota

Level 1 robots need extensive changes to a building’s infrastructure during installation, leading to significant cost. They have almost no social navigation capability, so their operational domain is mainly highly structured and controlled manufacturing and logistics environments.


Level 1 characteristics. Source: Relay Robotics

Level 2

Robots operating at Level 2 are AGVs that do not need physical path definition but still rely on paths that are digitally defined during installation. These mobile robots can localize themselves within a site using external aids such as reflectors, fiducials or beacons that are placed in strategic locations at the site. They can use this location to follow the virtually defined paths.

Like Level 1 robots, these robots also cannot leave their virtual predefined paths and can only detect and stop for obstacles but cannot avoid them.
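Localizing from surveyed reflectors or beacons is classic 2D trilateration, which can be sketched directly. The beacon positions and the example pose below are made up for illustration; real systems fuse many more measurements:

```python
import math

# Sketch of Level 2 localization from ranges to three surveyed reflectors:
# classic 2D trilateration. Beacon positions and the example pose are
# invented for illustration.

def trilaterate(beacons, ranges):
    """Solve for (x, y) given three beacon positions and measured ranges."""
    (x1, y1), (x2, y2), (x3, y3) = beacons
    r1, r2, r3 = ranges
    # Subtracting the first circle equation from the other two yields a
    # linear 2x2 system A @ [x, y] = b, solved here by Cramer's rule.
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21
    return ((b1 * a22 - b2 * a12) / det,
            (a11 * b2 - a21 * b1) / det)

beacons = [(0.0, 0.0), (10.0, 0.0), (0.0, 8.0)]
true_pos = (3.0, 4.0)
ranges = [math.dist(true_pos, b) for b in beacons]
estimate = trilaterate(beacons, ranges)
```

The robot then compares this estimated position against its digitally defined path; note that nothing here helps it leave that path, which is what separates Level 2 from Level 3.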


Demonstration of an AGV triangulating using reflectors on walls. Source: Cisco-Eagle

Although the required infrastructure changes are less intrusive than at Level 1, the need to install external localization aids still gives these robots moderate installation complexity. The fixed paths mean that they have low social navigation skill, so they are still best used in relatively structured environments with little to no human interaction.


Level 2 autonomous navigation characteristics. Source: Relay Robotics

Level 3

Robots operating at Level 3 rely entirely on onboard sensors for navigation. They use lidars and/or cameras to form a map of their environment and localize themselves within it. Using this map, they can plan their own paths through the site. They can also dynamically change their path if they detect obstacles on it. So they can not only detect obstacles, but can also avoid them.
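The plan-then-replan behavior can be sketched with a toy occupancy grid and a shortest-path search. Real AMRs use costmaps with planners like A* or D* Lite; the deliberately tiny breadth-first search below just illustrates the idea:

```python
from collections import deque

# Sketch of the Level 3 capability: plan a path on an occupancy grid, then
# replan when a new obstacle appears on the route. Real AMRs use costmaps
# and planners such as A* or D* Lite; this toy BFS only illustrates the idea.

def plan(grid, start, goal):
    """Breadth-first search over free cells; returns a cell list or None."""
    rows, cols = len(grid), len(grid[0])
    queue, came_from = deque([start]), {start: None}
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in came_from):
                came_from[(nr, nc)] = cell
                queue.append((nr, nc))
    return None

grid = [[0, 0, 0],
        [0, 0, 0],
        [0, 0, 0]]
path1 = plan(grid, (0, 0), (2, 2))   # initial plan on the mapped grid
grid[0][1] = 1                       # obstacles appear on the route...
grid[1][1] = 1
path2 = plan(grid, (0, 0), (2, 2))   # ...so the robot replans around them
```

The ability to recompute `path2` on the fly, rather than stopping and waiting, is precisely the detect-and-avoid capability that distinguishes Level 3 from Levels 1 and 2.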


A 3D lidar point-cloud visualization. Source: Jose Guivant, "Autonomous Navigation using a Real-Time 3D Point Cloud"

This independence and flexibility of Level 3 robots results in moderate social navigation skills and significantly reduced installation complexity since no infrastructure changes are required.

Level 3 robots can be used in unstructured environments where they can navigate alongside humans. They represent a significant increase in intelligence, and systems of this level and higher are called autonomous mobile robots (AMRs). Most modern service robots belong to this category.


Level 3 autonomous navigation characteristics. Source: Relay Robotics

Level 4

Even though Level 3 robots cross the threshold of navigating in unstructured environments alongside humans, they still do so with only moderate social navigation skill. They lack the advanced social navigation skills needed to adapt to every human interaction scenario with sophistication, which sometimes requires the humans they interact with to compensate for their behavioral limitations.

In contrast, Level 4 robots are AMRs with social navigation skills evolved enough to be on par with humans. They can capably navigate in any indoor environment in any situation provided there aren’t any physical limitations.

This means that their operational domain can include all indoor environments. Another ramification of this is that Level 4 robots should never need human intervention to navigate.

This level has not yet been fully achieved, and defining and evaluating everything that is required for such sophisticated social navigation is challenging and remains an active area of research. Here is an infographic from a recent attempt to capture all the facets of social navigation:

To navigate capably in all indoor environments, robots need to be able to optimize within a complex, ill-defined, and constantly changing set of rules. This is something that humans handle effortlessly and often without conscious thought, but that ease belies a lot of complexity. Here are a few challenges that lie on the path to achieving human-level social navigation:

  • Proxemics: Every person has a space around them that is considered personal space. Invading that space can make them uncomfortable, and robots need to respect that while navigating. However, the size and shape of this space bubble can vary based on culture, environment, situation, crowd density, age, gender, etc. For example, a person with a walker might need a larger-than-average space bubble around them for comfort, but this space has to shrink considerably when taking an elevator. Specifying rules for every situation can quickly become intractable.
  • Shared resources: The use of doors, elevators, and other shared resources in a building have their own implicit set of rules. Navigation patterns that hold for the rest of the building might not apply here. In addition, robots need to follow certain social norms while using these resources. Opening doors for others is considered polite. Waiting for people to exit an elevator before trying to enter, making space for people trying to get off a crowded elevator, or even temporarily getting off the elevator entirely to make space for people to exit are common courtesies that robots need to observe.
  • Communicating intent: Robots need to be able to communicate their intent while co-navigating with other agents. Not doing so can sometimes create uncertainty and confusion. Humans do this with body language, eye contact, or verbal communication. We rely on this particularly when we find ourselves in deadlock situations like walking toward another person in a narrow corridor or when approaching the same door at the same time. Robots also need to be able to resolve situations like these while preserving the safety and comfort of the humans they’re interacting with.
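The proxemics challenge above can be made concrete as a cost term a social planner might add to its path optimizer: each person contributes a penalty that falls off with distance, with a comfort radius that depends on context. Every radius and weight below is invented for illustration; the whole point of the passage is that no fixed table of values covers every situation:

```python
import math

# Sketch of a proxemic cost term for a social path planner. Each person
# contributes a penalty that falls off with distance, with a comfort radius
# that depends on context. All radii and weights are invented for
# illustration -- no fixed table of rules covers every real situation.

def comfort_radius(base_m=0.5, uses_walker=False, in_elevator=False):
    """Very rough context-dependent personal-space radius, in meters."""
    r = base_m
    if uses_walker:
        r += 0.4   # e.g. a larger bubble for someone using a walker
    if in_elevator:
        r *= 0.4   # bubbles shrink in confined, shared spaces
    return r

def proxemic_cost(robot_xy, person_xy, radius_m, weight=10.0):
    """Gaussian penalty for approaching a person's personal space."""
    d = math.dist(robot_xy, person_xy)
    return weight * math.exp(-(d * d) / (2 * radius_m * radius_m))
```

A planner would sum this penalty over all nearby people and add it to its normal path cost, so routes that cut through personal space become expensive and the robot naturally gives people room.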

All in all, achieving this level of social navigation is extremely challenging. While some Level 3 robots may have partially solved some of these problems, there is still quite a ways to go to reach true Level 4 autonomy.


Level 4 indoor navigation characteristics. Source: Relay Robotics

Level 5

As humans, we are able to find our way even in new, unfamiliar buildings by relying on signage, using semantic knowledge, and by asking for directions when necessary. Robots today cannot do this. At the very least, the site needs to be fully mapped during installation.


Level 5 autonomous indoor navigation of a service robot. Source: Relay Robotics, generated with Google Gemini

Level 5 robots are those that could navigate in all indoor environments on par with human skill, and do so even in a completely new environment without detailed prebuilt maps or a manually intensive installation process. This would remove installation complexity entirely, allowing robots to become operational in new environments instantly, reducing friction for adoption, and paving the way for robots to become more widespread.

This level is missing from the framework for self-driving cars, since those vehicles go through a similar process in which high-precision 3D maps of an area are created and annotated before a self-driving car can operate in it. Developments in artificial intelligence could help realize Level 5 capability.


Level 5 mobile robot navigation characteristics. Source: Relay Robotics

Multi-floor autonomous navigation+

Robots that can either climb stairs or call, board, and leave elevators unlock the ability to do multi-floor navigation and get the “plus” designation. Any robot that operates in multi-floor buildings also requires highly reliable sensors to detect and avoid safety hazards such as staircases and escalators. So a Level 2 robot that can successfully ride elevators would be designated Level 2+.

Elevator riding is the more common of the two approaches to this capability and may require infrastructure changes to the elevator system to achieve. So this introduces additional installation complexity.

It is also worth noting that in human-populated environments, elevators provide robots an additional social navigation challenge. This is because it requires movement in a confined space with many other agents, tight time constraints for elevator entry and exit, and dealing with special behavioral patterns that humans engage in while riding elevators.

In summary, robots of Levels 1 and 2 rely heavily on infrastructure changes for navigation and have low social navigation, so they are best suited for structured, human-sparse environments.

Robots of Level 3 are more intelligent and self-reliant. They require almost no infrastructure changes during installation, but at minimum they require the environment to be mapped and labeled. They possess moderate social navigation skills and can operate in unstructured, human-populated environments.

Level 4 represents an advancement to human-level navigation skill, allowing for safe deployment in any indoor environment. Level 5 robots take this a step further, navigating with the same proficiency even in entirely new, unfamiliar spaces. Any of these robots that can do multi-floor navigation get the additional “+” designation.


Trends across levels. All infographics created by Irina Kim and Jason Hu, Relay Robotics

Autonomous navigation must be reliable

A crucial factor for success that is not represented in this framework is the overall robustness and reliability of the product. It is easy to underestimate the complexity and unpredictability of real-world environments. Robotic systems typically take several years of field experience to go from a cool lab demonstration to a robust and reliable product that people can rely on.

For example, Relay Robotics offers Level 3+ robots that have already completed over 1.5 million successful deliveries and accumulated years of real-world operational experience. With this mature technology as a foundation, the company is making strides toward Level 4+ navigation.

Relay’s focus on creating sophisticated social navigation that can handle even busy and stressful environments like hospital emergency departments has made our AMRs among the most sophisticated on the market today. For Relay and the broader industry, the key to advancing further lies in enhancing social navigation capabilities.

Even though there is still much work to do, Relay Robotics is using breakthroughs in AI and deep learning to get there.

About the authors

Sonali Deshpande is senior navigation engineer at Relay Robotics. Prior to that, she was a robotics software engineer at Mayfield Robotics, a perception systems engineer at General Motors, and a robotics engineer at Discovery Robotics.

Deshpande has a master’s in robotic systems development from Carnegie Mellon University.

Jim Slater is a robot systems architect and member of the executive staff at Relay Robotics, acting as a consultant. Prior to that, he was the founder and CEO of two successful startups including Nomadic Technologies (mobile robotics) and Alliant Networks (wireless networks). 

Slater has his master’s in engineering from Stanford University, where he was a research assistant in the Computer Science Robotics lab.  He also holds an MBA from the University of Wisconsin – Madison.

The authors also thank Steve Cousins for his insight and feedback in creating this piece. This article is posted with permission.

Chinese AV developer WeRide brings in $458.5M from IPO, private placement
https://www.therobotreport.com/chinese-av-developer-weride-brings-in-458-5m-from-ipo-private-placement/
Fri, 25 Oct 2024 21:16:22 +0000
WeRide currently holds autonomous driving licenses in China, the UAE, Singapore, and the U.S., and it has operations in over 30 cities.

The post Chinese AV developer WeRide brings in $458.5M from IPO, private placement appeared first on The Robot Report.

A WeRide advertisement on the Nasdaq building in New York City.

WeRide is one of the only autonomous vehicle companies listed on the Nasdaq currently. | Source: WeRide

Guangzhou, China-based WeRide was officially listed on the Nasdaq Global Select Market today under the ticker symbol “WRD.” The autonomous vehicle, or AV, developer set its initial public offering price at $15.50 per American depositary share, or ADS. This price was at the lower end of the company’s targeted range.

If the underwriters fully exercise the granted option, the company will issue a total of 8,903,760 ADSs, reported Reuters. In that case, WeRide expects total proceeds from the public offering, combined with a $320 million concurrent private placement, to amount to $458.5 million.

“Seven years of perseverance have led to WeRide’s new journey today,” stated Tony Han, the founder and CEO of WeRide. “We are deeply grateful to our investors, clients, employees, and all our partners for their trust and support.”

“Together, we have achieved this important milestone,” he said. “For us, going public is a new beginning, and WeRide will continue to drive technological innovation, delivering safe, comfortable, and convenient autonomous driving products and services to more countries and regions.” 

Established in 2017, WeRide provides autonomous driving systems ranging from SAE Level 2 to Level 4. The company holds autonomous driving licenses in China, the UAE, Singapore, and the U.S. So far, it has conducted research and development, testing, and operations across 30 cities in seven countries, with more than 1,700 days of operation.

WeRide commercializes autonomous mobility

WeRide’s portfolio includes its Robotaxi, Robobus, Robovan, and Robosweeper, plus an advanced driver-assistance system (ADAS). It said application scenarios span smart mobility, smart logistics, and smart sanitation.

The company’s debut product, its Robotaxi, provides L4 services using WeRide’s shared mobility network as well as third-party shared mobility networks. It cited partnerships with world-class OEMs, including GAC and Nissan, to design and manufacture these vehicles.

Its next offering, the Robobus, is a fully driverless, L4 bus designed for mass transit in urban settings. WeRide partnered with Yutong Group to produce the system. The Robobus has a top speed of 40 kph (24.8 mph). In addition, WeRide said it can handle open roads in all weather conditions. 

The company launched Robovan in September 2021. It built this offering with JMC-Ford Motors for intra-city delivery in urban areas. So far, WeRide has conducted road testing for this vehicle, which it claimed can offer a variety of logistics products and services.

WeRide offers two versions of its Robosweeper, the smaller S1 and the larger S6. These autonomous sanitation vehicles can sweep garbage and dust from road surfaces. The company said that on a single charge, a Robosweeper can cover more than 120,000 sq. m (about 1.3 million sq. ft.), autonomously dump garbage, and park itself. 


(From left to right) WeRide’s Robovan, Robobus, an ADAS-enabled vehicle, S2 Robosweeper, Robotaxi, and its S6 Robosweeper. | Source: WeRide

AV companies continue advancements

WeRide isn’t the only AV company to announce large funding rounds today. Waymo, the self-driving unit of Alphabet, raised $5.6 billion in Series C funding. Pony.ai is also reportedly working toward an IPO.

With this latest investment, Waymo said it will continue to welcome more riders into its Waymo One robotaxi service in San Francisco, Phoenix, and Los Angeles, and in Austin and Atlanta through its expanded partnership with Uber. 

Looking ahead, WeRide said it remains committed to its mission, continuing to advance its technology, and striving to provide innovative products and services that contribute to greener, low-carbon, and sustainable urban living.

The post Chinese AV developer WeRide brings in $458.5M from IPO, private placement appeared first on The Robot Report.

A&K Robotics refocuses micromobility testing in select airports https://www.therobotreport.com/ak-robotics-refocuses-micromobility-testing-select-airports/ https://www.therobotreport.com/ak-robotics-refocuses-micromobility-testing-select-airports/#respond Wed, 16 Oct 2024 12:30:12 +0000 https://www.therobotreport.com/?p=581153 A&K Robotics is testing its connected robotic pods in airports as part of its strategy to expand mobility worldwide.

The post A&K Robotics refocuses micromobility testing in select airports appeared first on The Robot Report.

A&K Robotics is testing its micromobility platform in Vancouver International Airport.

Micromobility trials in Vancouver International Airport could lead to wider deployments. Source: A&K Robotics

Demand for mobility assistance in spaces such as airports is increasing as populations age and more people travel. Robots and autonomous vehicles can help meet that demand amid persistent labor shortages, according to A&K Robotics Inc.

Since 2016, the Vancouver, B.C.-based company has been developing electric micromobility platforms and self-driving robotic pods to help improve quality of life and environmental sustainability.

“We’re not replacing wheelchairs in airports and other facilities,” said Jessica Yip, co-founder of A&K Robotics. “Our pods are intended to help people with mobility limitations.”

A&K Robotics rolls out airport robots in phases

In July, A&K Robotics said it is bringing its Cruz self-driving robotic pods to Vancouver International Airport (YVR). The company had already tested its systems at Hartsfield-Jackson Atlanta International Airport in 2022. 

Jessica Yip, co-founder of A&K Robotics

“[Co-founder] Matt [Anderson] and I envisioned going to several airports when we started the company,” acknowledged Yip. “We then decided to focus on quality over quantity.”

“We had previously delivered one or two robots for relatively short durations,” she told The Robot Report. “It takes resources to bring a team and a 400-lb. mobile robot to each facility.”

“We knew we’d take a multi-stage approach to commercialization,” added Yip. “We researched the problem space and came to the conclusion that the automated mobility experience is really important to an airport’s customers — and to its business.”

“We’re prioritizing airports with high standards for operations and efficiency versus those that want robots as a novelty for marketing,” she said. “We’re focused on real-world operations and building our product to enable airports and airlines to have a high level of customization for branding.”

Finding value at the dawn of digitalization

In most airports, wheelchairs and shuttles must be manually fetched and brought to passengers and gates, noted Yip. Just knowing where they are in facilities spanning more than a million square feet, or in parking lots, can be a challenge, requiring staffers to walk long distances and spend precious time.

“YVR has 10 million sq. ft., fire and ambulance service, IT, wildlife and aquariums, and plumbing — it’s actually a small city,” Yip said. “We have a great opportunity to test a mobility use case where there’s demand right now.”

“Our pods are connected IoT [Internet of Things] devices, and we’re building dashboards and tools for airports to know where their fleets are and their battery status,” she explained. “By digitizing control, they could even remotely deploy a pod to a gate.”
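As a rough illustration of the kind of fleet-status telemetry Yip describes, a pod might report its location, battery level, and state, which a dashboard then aggregates. This is a minimal sketch; all field names, gate labels, and functions here are hypothetical, not A&K Robotics' actual API:

```python
from dataclasses import dataclass

@dataclass
class PodStatus:
    pod_id: str
    gate: str           # nearest gate or zone in the terminal
    battery_pct: float  # 0-100
    state: str          # e.g. "idle", "en_route", "charging"

def pods_needing_charge(fleet, threshold=20.0):
    """Return pods whose battery is at or below the threshold."""
    return [p for p in fleet if p.battery_pct <= threshold]

def dispatch_idle_pod(fleet):
    """Pick an idle pod to send to a gate (this sketch just takes the first idle match,
    ignoring distance)."""
    idle = [p for p in fleet if p.state == "idle"]
    return idle[0] if idle else None

fleet = [
    PodStatus("pod-01", "D52", 82.0, "idle"),
    PodStatus("pod-02", "C30", 15.5, "en_route"),
]

print([p.pod_id for p in pods_needing_charge(fleet)])  # ['pod-02']
print(dispatch_idle_pod(fleet).pod_id)                 # 'pod-01'
```

In a real deployment, the battery and location fields would come from each pod's IoT heartbeat over the 5G/AWS backhaul the company mentions, rather than a hardcoded list.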

“We’re just on the cusp of learning what value we can bring with robots enabled by AI, sensors, and data,” she said.

Partnerships to boost Canadian robotics

Last month, A&K Robotics announced strategic partnerships to promote the adoption of robotics across Canada. It is working with telecommunications leader Bell Canada, battery and charging provider Delta-Q Technologies, and assistive charity the Rick Hansen Foundation to build an ecosystem of robots, cloud infrastructure, electric vehicle systems, and new manufacturing facilities.

“Some critical technologies are necessary for self-driving systems to scale,” explained Yip. “For quality of service, we need 5G connectivity, and we’re working with Amazon Web Services for cloud services to deploy fleets.”

“When robotic sensors pick up environmental changes, our systems will perform better at scale than in ones or twos,” she added. “They can detect if a gate is boarding and divert other robots to avoid congestion.”
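The congestion-avoidance behavior Yip describes, where a boarding gate causes other pods to divert, can be sketched as a simple routing rule. The gate names and detour table below are made up for illustration and do not reflect A&K's implementation:

```python
def plan_route(dest_gate, boarding_gates, detour_map):
    """Divert a pod away from gates that are actively boarding.

    detour_map gives an alternate hold point near each gate; if a
    destination is boarding and has no listed detour, the pod keeps
    its original destination in this sketch.
    """
    if dest_gate in boarding_gates:
        return detour_map.get(dest_gate, dest_gate)
    return dest_gate

detours = {"D52": "D50"}  # hold point two gates away
print(plan_route("D52", {"D52"}, detours))  # D50 (diverted)
print(plan_route("C30", {"D52"}, detours))  # C30 (unaffected)
```

The point of the fleet-scale argument is that the `boarding_gates` set is shared: one pod's sensors detecting a crowd updates the routing of every other pod.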




A&K Robotics expects micromobility to grow globally

The global micromobility market could expand from $79.1 billion (U.S.) to $243.2 billion by 2030 at a compound annual growth rate of 17.4%, predicted Maximize Market Research Pvt. Ltd. It cited advances in IoT and battery technology. A&K Robotics said it is poised to lead in that growth.
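The projection is easy to sanity-check: $79.1 billion compounding at 17.4% per year for seven years (assuming a 2023 base year, which the article does not state) lands within rounding of the quoted $243.2 billion:

```python
base = 79.1   # market size at base year, $B
cagr = 0.174  # 17.4% compound annual growth rate
years = 7     # e.g. 2023 -> 2030

projected = base * (1 + cagr) ** years
print(round(projected, 1))  # 243.1 ($B), within rounding of the quoted $243.2B
```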

“We’re currently focused on a few strategic accounts in Canada, the U.S., and Europe that each have five to 10 units,” said Yip. “We need boots on the ground and want to develop an initial model to implement mobility that we can then replicate.”

What are some of the differences between regions?

“In the EU, the responsibility for providing wheelchair assistance lies with the airport and its service provider, while in North America, that responsibility lies with the airline, whether budget or luxury,” replied Yip. “From a passenger standpoint, the EU model is better, especially if one gets bounced around among connecting flights.”

North American airports are beginning to realize that they need to invest in mobility assistance for older passengers, she said.

“Our long-term goal is to integrate mobility in a way that’s sustainable for us and the airport,” concluded Yip. “Airports are also launchpads for smart-city applications. It doesn’t make sense to deliver a pizza with a five-seater car; there have to be more sustainable options.”
