Top 10 robotics stories of September 2022

Big acquisitions, bipedal robots and an FTC investigation captured your attention in September. 

Here are the 10 most popular robotics stories on The Robot Report in September. Subscribe to The Robot Report Newsletter to stay updated on the robotics stories you need to know about.


diagram showing architecture of a robot vacuum cleaner

10. Sensor breakdown: how robot vacuums navigate

Over the past few years, robot vacuums have advanced immensely. Initial models tended to randomly bump their way around the room, often missing key areas on the floor during their runtime. Since those early days, these cons have turned into pros with the innovative use of sensors and motor controllers in combination with dedicated open-source software and drivers. Here is a look at some of the different sensors used in today’s robot vacuums for improved navigation and cleaning. Read More


combined image of the AMD Instinct and NVIDIA chip overlaid with the Chinese flag

9. How AI chipset bans could impact Chinese robotics companies

NVIDIA and AMD said that the United States government has ordered them to halt exports of certain AI chipsets to China, which is the world’s second-largest economy. Both companies now require licenses for the sale of AI chipsets to China. Read More


Doosan cobot in a manufacturing use case

8. Doosan Robotics signs cobot distributor in Northeast

Doosan Robotics formed a strategic partnership with Industrial Automation Supply (IAS) in Portland, Maine. IAS will serve as a partner and reseller of Doosan’s M, H and A-SERIES collaborative robotic arms across the Northeast. Doosan’s four M-SERIES cobot models are all equipped with six torque sensors – one in each joint. The models have a working radius of 900 to 1,700 millimeters and a payload capacity of 6 to 15 kilograms. Read More


7. Will Tesla’s Optimus robot be transformative?

Let’s be frank, Optimus feels a bit dystopian, as if we’re all going to be imminently replaced by a sleek, slender, cold electronic robot. It feels like Optimus inhabits a world of beautiful black-and-white design, while the rest of us get to drive around in stainless-steel Cybertrucks overseeing our hole-drilling operations on Mars. Read More


osu bipedal robot

6. Watch a Cassie bipedal robot run 100 meters

Cassie, a bipedal robot developed at the Oregon State University (OSU) College of Engineering and produced by OSU-spinout company Agility Robotics, recently ran 100 meters with no falls in 24.73 seconds at OSU’s Whyte Track and Field Center. The robot established a Guinness World Record for the fastest 100 meters by a bipedal robot. Read More


Cloostermans legacy process production machine

5. Amazon acquiring warehouse robotics maker Cloostermans

Amazon is continuing its acquisitions streak. Amazon has agreed to acquire Cloostermans, a Belgium-based company that specializes in mechatronics. Cloostermans has been selling products to Amazon since at least 2019, including technology Amazon uses in its operation to move and stack heavy pallets and totes and robots to package products for customer orders. Read More


MAXXgrip gripper

4. The Gripper Company launches MAXXgrip

The Gripper Company officially launched MAXXgrip, its first gripper designed specifically for warehouse and logistics applications. MAXXgrip combines a vacuum with four moving soft fingers to address a common weakness of robot grippers: handling the wide variety of items found in warehouse picking and sorting jobs. The articulating vacuum gripper acquires each item first, then the fingers deploy to stabilize it while the robot transfers it. Read More


Amazon robot

3. Amazon testing pinch-grasping robots for e-commerce fulfillment

Robots picking items in Amazon’s warehouses need to be able to handle millions of different items of various shapes, sizes and weights. Right now, the company primarily uses suction grippers, which use air and a tight seal to lift items, but Amazon’s robotics team is developing a more flexible gripper to reliably pick up items suction grippers struggle to pick. Read More


irobot on the floor

2. FTC investigating Amazon’s acquisition of iRobot

The Federal Trade Commission (FTC) has officially started an antitrust investigation into Amazon’s plans to acquire robot vacuum maker iRobot for $1.7 billion. Politico reports the FTC is investigating a number of potential issues. The FTC’s investigation will reportedly focus on whether the data provided by iRobot’s Roomba robot vacuum gives Amazon an unfair advantage in the retail industry. Read More


rust linux

1. Linux embracing Rust will boost robotics community

Linux’s Benevolent Dictator For Life Linus Torvalds recently mentioned that the Rust programming language would be used in the upcoming Linux 6.1 kernel. Currently, the Linux kernel is at preview version 6.0-rc6 (codenamed “Hurr durr I’ma ninja sloth”), so we have a bit of time before Rust powers the kernel, but the mere announcement is newsworthy. It’s the author’s opinion that this embrace of Rust at the very core of Linux will be a huge boost to the robotics community. Read More

The post Top 10 robotics stories of September 2022 appeared first on The Robot Report.

A system for automating robot design inspired by the evolution of vertebrates

Researchers at Kyoto University and Nagoya University in Japan have recently devised a new, automatic approach for designing robots that could simultaneously improve their shape, structure, movements, and controller components. This approach, presented in a paper published in Artificial Life and Robotics, draws inspiration from the evolution of vertebrates, the broad category of animals that possess a backbone or spinal column, which includes mammals, reptiles, birds, amphibians, and fishes.

Exoskeleton walks out into the real world

For years, the Stanford Biomechatronics Laboratory has captured imaginations with their exoskeleton emulators—lab-based robotic devices that help wearers walk and run faster, with less effort. Now, these researchers will turn heads out in the "wild" with their first untethered exoskeleton, featured in a paper published Oct. 12 in Nature.

Stanford researchers create robotic boot that helps people walk

Engineers at Stanford University have created a boot-like robotic exoskeleton that can increase walking speed and reduce walking effort in the real world outside of the lab. The team’s research was published in Nature.

The exoskeleton gives users personalized walking assistance, allowing people to walk 9% faster and use 17% less energy per distance traveled. The energy savings and speed boost that the exoskeleton provides are equivalent to taking off a 30-pound backpack, according to the team.

The goal is to help people with mobility impairments, especially older people, to more easily move throughout the world, and the Stanford team believes that its technology will be ready for commercialization in the next few years.

Using a motor that works with calf muscles, the robotic boot gives wearers an extra push with every step. The push is personalized using a machine learning-based model that was trained through years of work with emulators, or large, immobile and expensive lab setups that can rapidly test how to best assist people. 

Students and volunteers were hooked up to the exoskeleton emulators while researchers collected motion and energy expenditure data. This data helped the research team to understand how the way a person walks with the exoskeleton relates to how much energy they’re using. The team gained more details about the relative benefits of different kinds of assistance offered by the emulator, and used the information to inform a machine-learning model that the real-world exoskeleton now uses to adapt to each wearer. 

To adapt to an individual’s unique way of walking, the exoskeleton will provide a slightly different pattern of assistance each time the user walks. The exoskeleton then measures the resulting motion so that the machine learning model can determine how to better assist the user the next time they walk. In total, it takes the exoskeleton about an hour to customize its support to a new user. 
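The adapt-measure loop described above can be loosely sketched as a simple search over assistance parameters. This is an illustrative toy, not the Stanford team's actual method; the parameter names, the surrogate cost function, and all numeric values are assumptions:

```python
import random

# Loose, hypothetical sketch of an adapt-measure loop: try a perturbed
# assistance pattern each walk, score it with a surrogate cost model
# (a stand-in for measured energy expenditure), and keep the best so far.

def surrogate_cost(params):
    """Stand-in for a learned energy-cost model; purely illustrative."""
    peak_torque, timing = params
    return (peak_torque - 0.6) ** 2 + (timing - 0.53) ** 2

def adapt(params, walks=50, step=0.05, seed=0):
    """Iteratively refine assistance parameters over a number of walks."""
    rng = random.Random(seed)
    best, best_cost = params, surrogate_cost(params)
    for _ in range(walks):
        candidate = tuple(p + rng.uniform(-step, step) for p in best)
        cost = surrogate_cost(candidate)
        if cost < best_cost:
            best, best_cost = candidate, cost
    return best
```

In this toy version the "hour to customize" corresponds to the budget of `walks`; the real system instead feeds motion measurements into a machine-learning model trained on emulator data.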

Moving forward, the Stanford researchers hope to test what the exoskeleton can do for its target demographic, older adults and people who are experiencing mobility decline from disability. The team also wants to plan design variations that target improving balance and reducing joint pain, and work with commercial partners to turn the device into a product. 

The post Stanford researchers create robotic boot that helps people walk appeared first on The Robot Report.

New walking robot design could revolutionize how we build things in space

Researchers have designed a state-of-the-art walking robot that could revolutionize large construction projects in space. They tested the feasibility of the robot for the in-space assembly of a 25m Large Aperture Space Telescope. They present their findings in Frontiers in Robotics and AI. A scaled-down prototype of the robot also showed promise for large construction applications on Earth.

Tiny, caterpillar-like soft robot folds, rolls, grabs and degrades

When you hear the term "robot," you might think of complicated machinery working in factories or roving on other planets. But "millirobots" might change that. They're robots about as wide as a finger that someday could deliver drugs or perform minimally invasive surgery. Now, researchers reporting in ACS Applied Polymer Materials have developed a soft, biodegradable, magnetic millirobot inspired by the walking and grabbing capabilities of insects.

Expressing and recognizing intentions in robots

The digital and physical worlds are becoming more and more populated by intelligent computer programs called agents. Agents have the potential to intelligently automate many daily tasks such as maintaining an agenda, driving, interacting with a phone or computer, and many more. However, there are many challenges to solve before getting there. One of them is that agents need to recognize and express intentions, Michele Persiani shows in his thesis in computing science at Umeå University.

Sensor breakdown: how robot vacuums navigate

block diagram robot vacuum

An example block diagram for a robot vacuum. | Credit: InvenSense, a TDK company

Over the past few years, robot vacuums have advanced immensely. Initial models tended to randomly bump their way around the room, often missing key areas on the floor during their runtime. They also became trapped on thick rugs, and if vacuuming upstairs, came tumbling down with a heavy thud. Their runtime was also relatively short, and you’d often come home hoping for a nice and clean room only to discover that it had run out of juice halfway through.

Since those early days, these cons have turned into pros with the innovative use of sensors and motor controllers in combination with dedicated open-source software and drivers. Here is a look at some of the different sensors used in today’s robot vacuums for improved navigation and cleaning.

Ultrasonic time-of-flight sensors
Ultrasonic time-of-flight (ToF) sensors work in any lighting conditions and can provide millimeter-accurate range measurements independent of the target’s color and optical transparency. The sensor’s wide field-of-view (FoV) enables simultaneous range measurements of multiple objects. In a robot vacuum, they are used to detect if an object, such as a dog or children’s toy, is in its way and whether it needs to deviate its route to avoid a collision.
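The underlying range calculation is simple: the sensor times an ultrasonic ping's round trip and halves it. A minimal sketch, assuming a fixed speed of sound and an illustrative obstacle threshold (neither taken from a specific sensor datasheet):

```python
# Hypothetical sketch: converting an ultrasonic round-trip time to range.
# 343 m/s assumes dry air at roughly 20 degrees C; real sensors often
# compensate for temperature.

SPEED_OF_SOUND_M_S = 343.0

def tof_range_mm(round_trip_s: float) -> float:
    """Return the one-way distance in millimeters for a measured echo time."""
    return (SPEED_OF_SOUND_M_S * round_trip_s / 2.0) * 1000.0

def obstacle_ahead(round_trip_s: float, threshold_mm: float = 150.0) -> bool:
    """Flag an obstacle (e.g. a toy) close enough to require a route change."""
    return tof_range_mm(round_trip_s) < threshold_mm
```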

Short-range ultrasonic ToF sensors
Short-range ultrasonic ToF sensors can be used to determine different floor types. The application uses the average amplitude of a reflected ultrasonic signal to determine if the target surface is hard or soft. If the robot vacuum detects that it has moved from a carpet onto a hardwood floor, it can slow the motors down because they do not need to work as hard compared to carpet use.

The cliff detection feature can enable the robot vacuum to determine when it’s at the top of a set of stairs to prevent a fall.
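The floor-type logic above can be sketched as a simple amplitude threshold: hard floors reflect more ultrasonic energy back than soft carpet. This is a hypothetical illustration; the threshold and motor-speed values are assumptions, not figures from a real vacuum:

```python
# Hypothetical amplitude-based floor classification. The threshold and
# duty-cycle values below are illustrative assumptions.

CARPET_AMPLITUDE_THRESHOLD = 0.4  # normalized reflected amplitude, assumed

def classify_floor(amplitudes: list[float]) -> str:
    """Classify the surface from the average reflected-signal amplitude."""
    avg = sum(amplitudes) / len(amplitudes)
    return "carpet" if avg < CARPET_AMPLITUDE_THRESHOLD else "hard_floor"

def motor_duty_for(floor: str) -> float:
    """Slow the motors on hard floors, where they need not work as hard."""
    return 1.0 if floor == "carpet" else 0.7
```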

VSLAM and LiDAR
Most companies developing high-end robot vacuums use visual simultaneous localization and mapping (VSLAM) or LiDAR technology to build a virtual map of the room. These technologies enable the robot vacuum to move around more efficiently, covering an entire level of a home with multiple rooms. However, if you lift the robot and put it down elsewhere, it will not know its new location. To find out where it is, the robot must set off in a random direction and, once it detects an object and starts tracing the walls, it can work out where it is relative to the map.

VSLAM or LiDAR technologies may not be applicable for low-light areas, for example, if the robot vacuum goes under a table or couch, where it is unable to read the map.

An example of the mapping capabilities of iRobot’s j7 robot vacuum. | Credit: iRobot

Inertial Measurement Units (IMU)
IMUs measure the robot vacuum’s movement in the real world, capturing roll, pitch and yaw from both a linear and a rotational perspective. When the robot vacuum is driving in circles or moving in a straight line, it knows where it is supposed to go and how it is actually moving. There may be a slight error between where it should be and where it is, and the IMU can track that position very accurately.

Based on rotational and linear movement, plus the mapping of the room, the robot vacuum can determine that it is not going over the same areas twice and can pick up where it left off if the battery dies. And, if someone picks up the robot vacuum and places it somewhere else or turns it around, it can detect what is happening and know where it is in real space. The IMU is essential to making robot vacuums efficient.

For robot vacuums that do not use VSLAM or LiDAR mapping technology, their position and navigation can be determined using dead reckoning by combining measurements from the wheel’s rotations with the inertial measurements from the IMU and object detection from the ToF sensors.
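A minimal dead-reckoning sketch, assuming planar motion, an illustrative wheel circumference, and an IMU that reports absolute yaw:

```python
import math

# Hedged sketch of planar dead reckoning: wheel encoders supply the distance
# traveled each step, the IMU supplies heading, and the (x, y) pose is
# integrated over time. The wheel circumference is an assumed value.

WHEEL_CIRCUMFERENCE_M = 0.22  # illustrative wheel size

def dead_reckon(pose, wheel_revs, yaw_rad):
    """Advance (x, y) by the encoder distance along the IMU heading."""
    x, y = pose
    d = wheel_revs * WHEEL_CIRCUMFERENCE_M
    return (x + d * math.cos(yaw_rad), y + d * math.sin(yaw_rad))

pose = (0.0, 0.0)
# One wheel revolution heading east, then one revolution heading north.
pose = dead_reckon(pose, 1.0, 0.0)
pose = dead_reckon(pose, 1.0, math.pi / 2)
```

In practice the encoder and gyro errors both accumulate, which is why the ToF object detections are used to correct the estimate.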

Smart speaker microphones
As developers of robot vacuums continue to implement artificial intelligence (AI) with the ability to use voice assistants, microphones become an essential sensor technology. Take beamforming, for example: a signal-processing technique that steers the microphone array’s sensitivity toward the speaker, refined with AI. At the moment, the noise of the motors and the turning brushes on the robot vacuum is a bit loud. However, as microphone technology progresses and motors and brushes become quieter, beamforming will enable microphones to pick out the user’s voice in the not-too-distant future.

Algorithms can also be trained to disregard certain noises and listen specifically for the user’s voice. Ideally, the user can call for the vacuum cleaner to clean something up or tell it to go home without going through an app or a separate voice assistant product, and that should happen in real time inside the robot vacuum’s host processor. Alternatively, if the microphone detects that something is being spoken, the robot vacuum could stop all of its motors to listen for the command.

Embedded motor controllers
Embedded motor controllers turn the gears that drive the wheels, keeping the robot vacuum moving in the correct direction with enough accuracy to tell when a wheel has actually turned 90 degrees as opposed to 88 degrees. Without this level of accuracy, the robot vacuum will be far off track after a short time. The embedded motor controller works with or without additional sensors, making the robot vacuum design scalable.
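The correction loop implied here can be sketched as a proportional controller on wheel angle: the encoder reports the measured angle, and the error against the commanded angle drives a small duty-cycle correction before it accumulates into position drift. The gain is an assumed, illustrative value:

```python
# Hypothetical proportional wheel-angle controller. The gain is an
# illustrative assumption, not a tuned value from a real product.

KP = 0.05  # assumed proportional gain, duty-cycle units per degree

def control_step(target_deg: float, measured_deg: float) -> float:
    """Return a motor duty-cycle correction from the angle error
    (e.g. a wheel that turned 88 degrees when 90 were commanded)."""
    return KP * (target_deg - measured_deg)
```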

Pressure sensors
The level of dust inside the dust box is estimated by monitoring the airflow through the dustbin with a pressure sensor. Compared to the air pressure when the dustbin is empty, the pressure inside the dustbin drops as the airflow stagnates due to accumulating dust or a clogging filter. For more accurate detection, it is better to measure a differential pressure, using a second, similar pressure sensor to read the outside air pressure.
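A hypothetical sketch of the differential-pressure approach, with assumed empty-bin and full-bin pressure drops; comparing against an identical ambient sensor cancels weather- and altitude-driven baseline shifts:

```python
# Hypothetical dustbin-fullness estimate from differential pressure.
# The empty- and full-bin pressure drops below are illustrative values.

EMPTY_DIFF_PA = 120.0  # assumed drop across an empty bin and clean filter
FULL_DIFF_PA = 480.0   # assumed drop when airflow has largely stagnated

def fullness_estimate(dustbin_pa: float, ambient_pa: float) -> float:
    """Return a 0.0-1.0 fullness estimate from the differential pressure."""
    diff = ambient_pa - dustbin_pa  # suction lowers pressure inside the bin
    frac = (diff - EMPTY_DIFF_PA) / (FULL_DIFF_PA - EMPTY_DIFF_PA)
    return min(1.0, max(0.0, frac))
```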

A lot of the high-end bases have the capability to suck out the contents of the dust box automatically. The robot vacuum can then return to base, empty its contents, return to its last known position and continue cleaning.

Auto-recharging
To determine the battery’s state of charge (SoC), you need accurate current and voltage measurements. The coulomb counters and NTC thermistors in the battery pack provide this information.

When the battery reaches a preset SoC level, it signals the robot vacuum to stop cleaning and return to the base for a recharge. When fully charged, the robot vacuum goes back to its last known position and continues cleaning. In theory, regardless of the size of the room, with multiple chargers and multiple opportunities to empty the dustbin, the robot vacuum can cover the entire floor space.
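Coulomb counting itself is straightforward: integrate the measured current over time against the pack's rated capacity. A minimal sketch with an assumed capacity and return-to-base threshold:

```python
# Hedged sketch of coulomb counting. Pack capacity and the recharge
# threshold are illustrative assumptions.

PACK_CAPACITY_AH = 3.0     # assumed pack capacity in amp-hours
RETURN_TO_BASE_SOC = 0.15  # assumed low-charge threshold

def update_soc(soc: float, current_a: float, dt_s: float) -> float:
    """Integrate one current sample (discharge positive) into the SoC."""
    delta_ah = current_a * dt_s / 3600.0
    return min(1.0, max(0.0, soc - delta_ah / PACK_CAPACITY_AH))

def should_recharge(soc: float) -> bool:
    """True when the vacuum should stop cleaning and head to the base."""
    return soc <= RETURN_TO_BASE_SOC
```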

Thermistors
Thermistors, which are a type of temperature sensor, can be used to monitor the running temperature of the MCU or MPU. They can also be used to monitor the temperatures of the motors and brush gears. If they are running way too hot, the robot vacuum is instructed to take a break and perhaps run a few system diagnostics to find out what is causing the problem. Also, items caught in the brushes, like an elastic band or excess hair, can make the motors overcompensate and overheat.
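Converting an NTC thermistor reading to a temperature typically uses the beta-parameter equation. A sketch assuming typical 10 kΩ NTC datasheet values (R0, T0, beta) and an illustrative overheat limit:

```python
import math

# Hedged sketch: NTC thermistor resistance to temperature via the
# beta-parameter equation. R0, T0 and BETA are typical values for a
# 10 kOhm NTC, used here only for illustration.

R0_OHM = 10_000.0  # resistance at the reference temperature
T0_K = 298.15      # reference temperature (25 degrees C) in kelvin
BETA = 3950.0      # assumed beta coefficient

def ntc_temp_c(resistance_ohm: float) -> float:
    """Return temperature in Celsius from the measured NTC resistance."""
    inv_t = 1.0 / T0_K + math.log(resistance_ohm / R0_OHM) / BETA
    return 1.0 / inv_t - 273.15

def overheating(resistance_ohm: float, limit_c: float = 80.0) -> bool:
    """Signal the vacuum to take a break past the assumed motor limit."""
    return ntc_temp_c(resistance_ohm) > limit_c
```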

Robot vacuum developers should understand what the motors are supposed to sound like within a certain frequency range. A microphone can detect whether the motors are running abnormally, catching the early stages of motor degradation. Likewise, by running diagnostics, abnormal noise from the brushes could indicate that they have picked up debris.

Conclusion
The retail price of a robot vacuum goes hand in hand with functionality and accuracy; some of the high-end models can be as much as $1,100. You can get a robot vacuum for closer to $200, but you will be sacrificing some of the bells and whistles. It all depends on the value the robot vacuum developer wants to create and the cost structure that works best for the user.

As component costs come down, it seems likely that more mid-tier robot vacuums will enter the market. Technologies like ToF sensors, pressure sensors, IMUs and motor controllers, along with improvements in battery efficiency, will drive this growth.

About the Author
For seven years, Peter Hartwell has been the chief technology officer at InvenSense, a TDK company. He holds more than 40 patents and oversees 600 engineers who have developed a broad range of technologies and sensors for drones, automotive, industrial and, more broadly, IoT applications. Hartwell has 25-plus years of experience commercializing silicon MEMS products, working on advanced sensors and actuators, and specializes in MEMS testing techniques.

Prior to joining InvenSense, he spent four years as an architect of sensing hardware at Apple where he built and led a team responsible for the integration of accelerometer, gyroscope, magnetometer, pressure, proximity, and ambient light sensors across the entire product line. Hartwell holds a B.S. in Materials Science from the University of Michigan and a Ph.D. in Electrical Engineering from Cornell University.

The post Sensor breakdown: how robot vacuums navigate appeared first on The Robot Report.