Robotics Summit: how to manage robot autonomy with behavior trees

Boston Dynamics’ Spot robot (center) and Agility Robotics’ Digit humanoid at the 2023 Robotics Summit & Expo.

Behavior trees are a mechanism for representing and abstracting complex behavior for robots. While they originated in the video game industry, they have been steadily gaining popularity in robotics over the past several years.
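
As a rough illustration of the idea, here is a minimal, hand-rolled behavior tree in Python: leaf actions report a status when "ticked," and composite nodes such as sequences and fallbacks decide which child to tick next. This sketch is purely illustrative and is not drawn from the speakers' software; the node names are made up.

    # Minimal behavior-tree sketch (illustrative only).
    from enum import Enum

    class Status(Enum):
        SUCCESS = 1
        FAILURE = 2
        RUNNING = 3

    class Action:
        """Leaf node: wraps a function that returns a Status when ticked."""
        def __init__(self, name, fn):
            self.name, self.fn = name, fn
        def tick(self):
            return self.fn()

    class Sequence:
        """Composite node: succeeds only if every child succeeds, in order."""
        def __init__(self, children):
            self.children = children
        def tick(self):
            for child in self.children:
                status = child.tick()
                if status != Status.SUCCESS:
                    return status  # FAILURE or RUNNING halts the sequence
            return Status.SUCCESS

    class Fallback:
        """Composite node: tries children in order until one does not fail."""
        def __init__(self, children):
            self.children = children
        def tick(self):
            for child in self.children:
                status = child.tick()
                if status != Status.FAILURE:
                    return status  # SUCCESS or RUNNING halts the fallback
            return Status.FAILURE

    # Hypothetical "pick an object" behavior: try to detect and grasp, and
    # fall back to asking a person for help if that fails.
    tree = Fallback([
        Sequence([
            Action("detect_object", lambda: Status.SUCCESS),
            Action("grasp_object", lambda: Status.SUCCESS),
        ]),
        Action("ask_for_help", lambda: Status.RUNNING),
    ])

    print(tree.tick())  # Status.SUCCESS

In practice, robotics developers usually build on an existing library such as BehaviorTree.CPP or py_trees rather than rolling their own, but the tick-based composition above is the core idea behind both.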

At the Robotics Summit & Expo on May 1-2 in Boston, attendees will learn how to manage their robot’s autonomy with behavior trees. Sebastian Castro, a senior robotics engineer at PickNik Robotics, and Andrew Stout, a roboticist and software engineer at The AI Institute, will contrast behavior trees with other widely used approaches such as finite-state machines.

This presentation, which runs from 2:45-3:30 on May 1, will cover the basics of behavior trees, detail the main software packages and resources available for getting started, and present lessons learned from applying behavior trees in various robotics projects.

At PickNik Robotics, Castro works on the MoveIt Studio developer platform for robot behavior using ROS. Previously, he managed educational content for robotics student competitions using MATLAB and Simulink, stood up a home service robotics software framework at MIT CSAIL, and participated in behavior development and field testing for autonomous truck unloading with Boston Dynamics.

Stout previously led the adoption of behavior trees for autonomous behavior management at Diligent Robotics and has extensive experience integrating foundational robotic capabilities into coherent, autonomous, interactive, human-centered applications for new robot products. A veteran of several early-to-mid-stage robot startups, he has shipped production software to hundreds of thousands of deployed robots in consumer, healthcare, and education markets.

Robotics Summit brings together 5,000+ robotics developers

Launched in 2018, the Robotics Summit has quickly become a must-attend event for commercial robotics developers. The 2024 edition will bring together more than 5,000 attendees from across the robotics ecosystem.

The Robotics Summit & Expo will feature more than 60 speakers and over 40 technical sessions. Keynote speakers will include:

  • Jonathan Hurst, co-founder and chief robot officer, Agility Robotics
  • Tye Brady, chief technologist, Amazon Robotics
  • Ujjwal Kumar, president, Teradyne Robotics
  • Moritz Baecher, Tony Dohi, and Morgan Pope from Disney
  • Medtronic, which will demonstrate a remote robotic-assisted surgical system

You can view the complete Robotics Summit agenda here. Speakers are still being added.

This will be the largest Robotics Summit ever. It will include more than 200 exhibitors, various networking opportunities, a women in robotics breakfast, a career fair, an engineering theater, a startup showcase, and more!

New to the event is the RBR50 Robotics Innovation Awards Gala. The event will include a cocktail hour, a plated dinner, photo opportunities, and the chance to hear from the Robot of the Year, Startup of the Year, and Application of the Year winners. Each RBR50 winner will receive two complimentary tickets to the Robotics Summit and RBR50 Gala. A limited number of tickets are also available to summit attendees. 

Registration is now open for the event. Register by March 8 to take advantage of early-bird pricing.


Researchers develop interface for quadriplegics to control robots

Carnegie Mellon University researchers lived with Henry and Jane Evans for a week to test their Head-Worn Assistive Teleoperation (HAT) device with Henry, who lost his ability to speak and move his limbs 20 years ago. | Credit: CMU

No one could blame Carnegie Mellon University students Akhil Padmanabha and Janavi Gupta if they were a bit anxious this past August as they traveled to the Bay Area home of Henry and Jane Evans.

The students were about to live with strangers for the next seven days. On top of that, Henry, a person with quadriplegia, would spend the week putting their Head-Worn Assistive Teleoperation (HAT) — an experimental interface to control a mobile robot — to the test.

HAT requires fewer fine motor skills than other interfaces, making it suitable for people with paralysis or similar motor impairments who want to control a mobile robot and manipulator. It lets users drive the robot with head motion and speech recognition, and versions of the device have featured a hands-free microphone and a head-worn sensor.
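
The article does not detail how head motion is translated into robot commands. As a minimal sketch of the general approach, head pitch and yaw from a wearable sensor could be mapped to base velocities, with a dead zone to reject small, unintentional movements. The gains, thresholds, and function names below are assumptions for illustration, not the HAT implementation.

    # Illustrative mapping from head orientation to base velocity commands.
    # Not the HAT implementation; sensor interface and gains are assumed.

    DEAD_ZONE_DEG = 5.0   # ignore small, unintentional head motion
    MAX_LINEAR = 0.2      # m/s at full head tilt
    MAX_ANGULAR = 0.5     # rad/s at full head turn
    MAX_TILT_DEG = 30.0   # head angle that maps to full speed

    def head_to_velocity(pitch_deg, yaw_deg):
        """Map head pitch (nod) to forward speed and yaw (turn) to rotation."""
        def scale(angle, max_out):
            if abs(angle) < DEAD_ZONE_DEG:
                return 0.0
            frac = max(-1.0, min(1.0, angle / MAX_TILT_DEG))
            return frac * max_out
        return scale(pitch_deg, MAX_LINEAR), scale(yaw_deg, MAX_ANGULAR)

    print(head_to_velocity(12.0, -8.0))  # modest forward speed, slight turn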

Padmanabha and Gupta quickly realized that any trepidation they may have felt was misplaced. Henry, who lost the ability to move his limbs and talk after a brain-stem stroke two decades ago, enjoyed using HAT to control the robot by moving his head and in some situations preferred HAT to the computer screen he normally uses.

“We were excited to see it work well in the real world,” said Padmanabha, a Ph.D. student in robotics who leads the HAT research team. “Henry became increasingly proficient in using HAT over the week and gave us lots of valuable feedback.”

During the home trial, the researchers had Henry perform predefined tasks, such as fetching a drink, feeding himself and scratching an itch. Using HAT, he directed Stretch, a commercially available mobile robot outfitted with a pincer-like gripper on its single arm.

Daily, Henry performed the so-called blanket+tissue+trash task, which involved moving a blanket off his body, grabbing a tissue and wiping his face with it, and then throwing the tissue away. As the week progressed, Henry could do it faster and faster and with fewer errors.

Henry said he preferred using HAT with a robot for certain tasks rather than depending on a caregiver.

“Definitely scratching itches,” he said. “I would be happy to have it stand next to me all day, ready to do that or hold a towel to my mouth. Also, feeding me soft foods, operating the blinds and doing odd jobs around the room.”

One innovation in particular, software called Driver Assistance that helps align the robot’s gripper with an object the user wants to pick up, was “awesome,” Henry said. Driver Assistance leaves the user in control while it makes the fine adjustments and corrections that can make controlling a robot both tedious and demanding.
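
The article does not say how Driver Assistance is implemented. As a hedged sketch, shared-control grasp assistance is often built by blending the user's commanded motion with a correction toward the detected object; the blending rule, gain, and names below are assumptions, not the CMU software.

    # Illustrative shared-control blending for grasp alignment.
    # Not the Driver Assistance implementation; the blending rule is assumed.

    def assisted_command(user_cmd, gripper_pos, object_pos, assist_gain=0.5):
        """Blend the user's motion command with a correction toward the object.

        user_cmd, gripper_pos, object_pos: (x, y, z) tuples in the robot frame.
        assist_gain: 0 keeps pure teleoperation, 1 fully automates alignment.
        """
        correction = tuple(o - g for o, g in zip(object_pos, gripper_pos))
        return tuple((1.0 - assist_gain) * u + assist_gain * c
                     for u, c in zip(user_cmd, correction))

    # The user pushes mostly forward; assistance nudges the gripper sideways
    # and down toward the object at the same time.
    print(assisted_command((0.1, 0.0, 0.0), (0.4, 0.1, 0.9), (0.5, 0.0, 0.8)))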

“That’s better than anything I have tried for grasping,” Henry said, adding that he would like to see Driver Assistance used for every interface that controls Stretch robots.

Praise from Henry, as well as his suggestions for improving HAT, is no small thing. He has collaborated in multiple research projects, including the development of Stretch, and his expertise is widely admired within the assistive robotics community. He’s even been featured by The Washington Post and last year appeared on the cover of IEEE Spectrum.

Via email, Henry said his incentive for participating in research is simple. “Without technology I would spend each day staring at the ceiling waiting to die,” he said. “To be able to manipulate my environment again according to my will is motivation enough.”

Padmanabha said user-centered or participatory design is important within the assistive device community and requires getting feedback from potential users at every step. Henry’s feedback proved extremely helpful and gave the team new ideas to think about as they move forward.

The HAT researchers will present the results of their study at the ACM/IEEE International Conference on Human-Robot Interaction March 11–15 in Boulder, Colorado.

HAT originated more than two years ago in a project course taught by Zackory Erickson, an assistant professor in the Robotics Institute. The students contacted Henry as part of their customer discovery process. Even then, he was excited about the possibility of using a prototype.

The project showed promise and later was spun out of the class. An early version of HAT was developed and tested in the lab by participants both with and without motor impairments. When it came time to do an in-home case study, Henry seemed the logical person to start with.

During the weeklong study, Padmanabha and Gupta lived in the Evans home around the clock, both for travel convenience and to be able to perform testing whenever Henry was ready. Having strangers in the house 24/7 is typical of the studies Henry’s been involved in and is no big deal for him or Jane.

“We’re both from large families,” he said.

Padmanabha and Gupta, a computer science major, likewise adjusted quickly to the new surroundings and got used to communicating with Henry using a letterboard, a tool that allows Henry to spell out words by looking at or pointing a laser at each letter. The pair even played poker with Henry and Jane, with Henry using Stretch to manipulate his cards.

In the earlier tests, HAT used head movements and voice commands to control a robot. Henry can’t speak, but he can move his left thumb just enough to click a computer mouse. So the team reconfigured HAT for the Evans trial, substituting computer clicks for voice commands as a way to shift between modes that include controlling the movement of the robot base, arm or wrist, or pausing the robot.
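
As a small sketch of what click-driven mode switching could look like, each click simply advances to the next control mode; the mode names follow the article, but the cycling logic is an assumption rather than the actual HAT software.

    # Illustrative click-driven mode switching; the real HAT logic may differ.
    from itertools import cycle

    MODES = ["drive_base", "move_arm", "move_wrist", "paused"]

    class ModeSwitcher:
        def __init__(self):
            self._modes = cycle(MODES)
            self.mode = next(self._modes)

        def on_click(self):
            """A single mouse click advances to the next control mode."""
            self.mode = next(self._modes)
            return self.mode

    switcher = ModeSwitcher()
    print(switcher.mode)        # drive_base
    print(switcher.on_click())  # move_arm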

“Among people with motor impairments, everyone has different levels of motor function,” Padmanabha said. “Some may have head movement, others may only have speech, others just have clicking capabilities. So it’s important that you allow for customization of your interface.”

Head motions are key to using HAT, which detects head movement using a sensor in a cap, headband or — in Henry’s case — a chin strap.

“People use head gesturing as a way to communicate with each other and I think it’s a natural way of controlling or gesturing to a robot,” Padmanabha said.

A graphical user interface — a computer screen — is more typical for controlling robots. But Gupta said users don’t like using a computer screen to control a robot that is operating around their body.

“It can be scary to have a robot close to your face, trying to feed you or wipe your face,” she said. Many user studies therefore shy away from attempting tasks that come close to the face. But once Henry got used to HAT, he didn’t hesitate to perform such tasks, she added.

A computer screen is available to control Stretch in tasks that are out of the user’s line of sight, such as sending the robot to fetch something from another room. At Henry’s suggestion, the researchers made it possible to use HAT to control a computer cursor with head movements.

In addition to Gupta, Padmanabha and Erickson, the research team includes CMU’s Carmel Majidi, the Clarence H. Adamson Professor of Mechanical Engineering; Douglas Weber, the Akhtar and Bhutta Professor of Mechanical Engineering; and Jehan Yang, a Ph.D. student in mechanical engineering. Also included are Vy Nguyen of Hello Robot, maker of Stretch; and Chen Chen, an undergraduate at Tsinghua University in Beijing, who implemented the Driver Assistance software.

Though Stretch is commercially available, it is still used primarily by researchers; CMU has 10 to 15 of them. It's a simple robot with limited capabilities, but Padmanabha said its price tag of approximately $25,000 inspires hope for expanded use of mobile robots.

“We’re getting to the price point where we think robots could be in the home in the near future,” he said.

Henry said Stretch/HAT still needs systemwide debugging and added features before it is more widely adopted. He thinks that might occur in as little as five years, though that will depend not only on price and features, but the choice of market.

“I believe the market for elderly people is larger and more affluent and will therefore develop faster than the market for people with disabilities,” he said.

Editor’s Note: This article was republished from Carnegie Mellon University.


Watch an autonomous helicopter demo wildfire response skills

Sikorsky’s Optionally Piloted Black Hawk helicopter equipped with MATRIX and Rain autonomy systems during fire localization and targeting demos at Sikorsky HQ in Stratford, Connecticut. | Source: Rain

Rain, a developer of aerial wildfire containment technology, and Sikorsky, a Lockheed Martin company specializing in advanced rotorcraft, are researching how to fight wildfires autonomously. The companies completed test flights in late 2023 using an autonomous helicopter that carries water and drops it on wildfires in their earliest stages.

The flight demonstration took place at Sikorsky’s headquarters in Stratford, Conn. The Optionally Piloted Black Hawk helicopter flew in autonomous mode with Sikorsky safety pilots on board.

“In 2023, in collaboration with Sikorsky, we set out to prove that we could receive an alert about a possible wildfire; send commands to launch and fly an autonomous helicopter capable of moving a large amount of suppressant to a fire’s location; then command the helicopter to accurately drop water onto the fire,” said Rain CEO Maxwell Brodie. “We are very pleased with the results that successfully demonstrate autonomous early detection and rapid response.”

Accelerating aerial response to wildfires

Rain and Sikorsky say the capabilities they’ve demonstrated can enable an accelerated aerial response to wildfires. These capabilities include: 

  • Integrating wildfire early-detection cameras, powered by Alchera X’s FireScout AI, with automated aircraft dispatch and routing powered by Rain
  • Integrating an optionally piloted Black Hawk helicopter running Sikorsky’s MATRIX flight autonomy system with Rain’s wildfire mission autonomy system
  • Processing imagery from an on-board high-resolution thermal camera with Rain’s wildfire mission autonomy system to localize, target, and suppress the fire, directing both the aircraft’s flight path and water-release timing

“By combining Sikorsky’s Matrix aircraft autonomy system with Rain’s wildfire mission autonomy capability, we have shown the potential to put out a wildfire in its initial stage before it becomes a huge problem,” said Igor Cherepinsky, director of Sikorsky Innovations.

Rain says the fully integrated system performed end-to-end autonomous wildfire response. This includes early detection, dispatch, route planning, preflight, takeoff, flight, Bambi bucket operations, targeting, suppression, and landing.
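
Rain has not published its software interfaces, so the following is only an illustrative sketch of one stage in such a pipeline: choosing a water-drop point from a thermal image by taking the centroid of the hottest pixels. Every name and threshold here is an assumption, not Rain's algorithm.

    # Hypothetical fire-localization step: pick a drop point from a thermal
    # frame by finding the centroid of pixels above a temperature threshold.
    import numpy as np

    def drop_point_from_thermal(image, hot_threshold=500.0):
        """Return the (row, col) centroid of hot pixels, or None if no fire."""
        hot = np.argwhere(image > hot_threshold)
        if hot.size == 0:
            return None
        return tuple(hot.mean(axis=0))

    frame = np.zeros((240, 320))
    frame[100:110, 200:210] = 800.0          # synthetic hot spot
    print(drop_point_from_thermal(frame))    # (104.5, 204.5)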

Moving forward, Rain says it will continue to expand its engagement with fire agencies to integrate autonomous systems into operations and enhance safety for responders and the community. The company says it’s looking forward to advancing capabilities for autonomous firefighting in collaboration with Sikorsky.

“Ultimately, Rain envisions equipping fire agencies with the capability to strategically position a future fleet of firefighting helicopters capable of receiving and carrying out mission commands to autonomously suppress a wildfire in its earliest stage,” Brodie said.
