A robotic painting system developed for a leading aerospace manufacturer. | Photo Credit: Aerobotix
When it comes to operating and controlling robots, there are a variety of options that engineers can consider. These include robotic simulation software, artificial intelligence (AI), and a host of other off-the-shelf software packages that have been designed for specific applications.
When clients present our robotics company, Aerobotix, with challenging problems, we often turn to an open-source middleware option such as the Robot Operating System (ROS). ROS is built on a framework focused on automation, reliability and flexibility. The benefit of an open-source framework is its large contributing community, which is continuously developing and improving it.
Why my team chooses ROS
ROS provides a dynamic backbone for creating new systems with a whole host of sensor packages. This freedom is perfect for our company's robotic systems, as we integrate hardware such as motors, lasers, LiDARs and safety devices. We've been able to find manufacturers that have developed their own hardware drivers and interfaces to easily pair with ROS.
Pairing these drivers with our custom solutions is a complex process due to the dynamic framework on which ROS is built. Some of these solutions were developed on short timelines, so we looked to the ROS community for support and contracted individuals skilled in ROS development. These contractors helped us build expertise in areas such as point cloud manipulation and automated navigation.
The building blocks of robotics automation traditionally include: a human-machine interface (HMI), a programmable logic controller (PLC) and the robot itself. In this basic setup, the PLC acts as the main interface layer — or middleman — for the control system, and all communication goes through the PLC. If you have a request from the HMI or the robot, the PLC answers it. The main constraint with this setup is that you’re stuck with “simple bits and bytes” and more advanced problems can’t be solved.
Using ROS alongside a traditional setup introduces additional capabilities to these bits and bytes. These additions include advanced devices, such as LiDAR, which may be used to create your own vision system. For example, LiDARs create “point clouds” that can be used for navigation, part detection and even object recognition.
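As an illustration of that pattern, and not the production code described here, a minimal ROS 1 node might subscribe to a LiDAR point cloud and publish a simple presence flag that a PLC-facing bridge could consume. The topic names and threshold below are hypothetical.

```python
# Illustrative sketch only: a ROS 1 node that watches a LiDAR point cloud and
# publishes a boolean "part detected" flag. Topic names and the threshold are
# hypothetical, not taken from any system described in this article.
import rospy
from std_msgs.msg import Bool
from sensor_msgs.msg import PointCloud2
import sensor_msgs.point_cloud2 as pc2

def on_cloud(msg, pub):
    # Crude presence check: count returns within 1 m of the sensor.
    near = 0
    for x, y, z in pc2.read_points(msg, field_names=("x", "y", "z"), skip_nans=True):
        if x * x + y * y + z * z < 1.0:
            near += 1
    pub.publish(Bool(data=near > 500))

if __name__ == "__main__":
    rospy.init_node("part_presence_monitor")
    pub = rospy.Publisher("/part_detected", Bool, queue_size=1)
    rospy.Subscriber("/lidar/points", PointCloud2, on_cloud, callback_args=pub)
    rospy.spin()
```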
Case study: collaborative mobile robot for Air Force maintenance depots
Our company’s first application of ROS was while working as the robotics partner on what became an award-winning project — an adaptive radome diagnostic system (ARDS). This introduced the use of a collaborative mobile robot in U.S. Air Force maintenance depots.
This system uses sensors that transmit microwave signals to perform non-destructive evaluation (NDE) of aircraft radomes and identify defects such as delamination or water ingress in the composite structure. We developed a system integrating a FANUC CRX-10iA collaborative robot, a LiDAR vision system and a custom automated guided vehicle (AGV). This robot scans the warehouse with the LiDAR, navigates to the part, orients normal to the part, creates an inspection path, and outputs a detailed part analysis.
As this was our first application of ROS, we faced a steep learning curve in understanding the various ROS components: services, nodes, publishers and topics. Online documentation and vast community support helped demystify the experience.
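For readers unfamiliar with those terms, the toy ROS 1 node below ties them together: one node, one topic with a publisher, and one service. The names are made up; std_srvs/Trigger is a stock ROS service type.

```python
# Toy sketch of the building blocks named above: a node, a publisher, a topic
# and a service. Names are made up; std_srvs/Trigger is a standard ROS type.
import rospy
from std_msgs.msg import String
from std_srvs.srv import Trigger, TriggerResponse

def handle_start_scan(req):
    # A service lets another node request work and get a synchronous reply.
    status_pub.publish(String(data="scan started"))
    return TriggerResponse(success=True, message="scan queued")

if __name__ == "__main__":
    rospy.init_node("inspection_manager")                                      # node
    status_pub = rospy.Publisher("/inspection/status", String, queue_size=10)  # publisher on a topic
    rospy.Service("/inspection/start_scan", Trigger, handle_start_scan)        # service
    rate = rospy.Rate(1)
    while not rospy.is_shutdown():
        status_pub.publish(String(data="idle"))  # periodic heartbeat on the topic
        rate.sleep()
```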
Case study: robotic painting system for leading aerospace manufacturer
This client was looking toward the future and wanted a more dynamic solution than traditional robotics methods could achieve. The request was for an automated part detection system with a long list of features: non-contact detection, requiring no robotic motion, that locates multiple aircraft components within a hazardous C1D1-rated paint booth to ±0.50-inch accuracy, all from a single click.
ROS is at the core of the vision system we developed. This system begins with a recorded point cloud containing the robots and the aircraft components. By associating 3D models provided by the customer with the point cloud, we were able to locate the parts relative to the robot. This relationship allows us to adjust robotic motion paths for newly loaded parts in the paint booth, pushing the boundaries of what is possible.
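One generic way to perform this kind of alignment (a sketch, not necessarily the pipeline described above) is iterative closest point (ICP) registration with the open-source Open3D library, which can fit a part model sampled to points against the scanned scene and return the transform that places the part in the scan frame. The file names and correspondence tolerance below are hypothetical.

```python
# Generic illustration of model-to-scan alignment with ICP (Open3D); this is a
# sketch, not the pipeline described above. File names and the correspondence
# distance are hypothetical.
import numpy as np
import open3d as o3d

scene = o3d.io.read_point_cloud("booth_scan.pcd")         # recorded LiDAR point cloud
model = o3d.io.read_point_cloud("part_model_points.pcd")  # points sampled from the 3D model

init = np.eye(4)  # rough initial guess; in practice this comes from a global registration step
result = o3d.pipelines.registration.registration_icp(
    model, scene, max_correspondence_distance=0.02, init=init,
    estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPoint())

part_pose = result.transformation  # 4x4 transform placing the part in the scan frame
print("Part pose relative to the scan origin:\n", part_pose)
```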
ROS works for you
Every project has its own unique challenges, which means each must be assessed and solved using a customized solution. Delving into the ROS ecosystem has aided my team in expanding beyond traditional robotics and furthered our understanding of advanced sensor technology.
We would encourage any engineer to add ROS to their toolkit and start exploring its unique applications.
About the Author
Aaron Feick is a lead software engineer at Aerobotix, an innovative leader in robotic solutions for the aerospace and defense industries. Headquartered in Huntsville, Alabama, the company specializes in the creation of cutting-edge automated robotic solutions for high-value, high-precision components, aircraft and vehicles.
Australian-based robotics company Lyro Robotics creates an autonomous packing robot. | Source: Lyro Robotics
Ed Husic, Australia’s Minister for Industry and Science, appointed a National Robotics Strategy Advisory Committee. The committee will help to guide Australia’s strategy for emerging automation technologies.
The committee will develop a national robotics strategy to help the country harness robotics and automation opportunities. The committee will examine robotics from every industry, from advanced manufacturing to agriculture.
“We have brought together some of the nation’s leading robotics and technology thinkers and practitioners to guide the way we develop and use robotics,” Husic said. “Australia has a lot of the key elements that can help in the development of national robotics capabilities: our people, research and manufacturing skills. And while we’re recognized as possessing strength in field robotics, we can do better, across a wider range of activities.”
The National Robotics Strategy Advisory Committee is chaired by Professor Bronwyn Fox, the Chief Scientist of CSIRO, Australia’s national science agency.
Other members of the committee include:
Catherine Ball, an associate professor at the Australian National University
Andrew Dettmer, the National President of the Australian Manufacturing Workers’ Union
Hugh Durrant-Whyte, the NSW chief scientist and engineer
Sue Keay, the founder and chair of the Robotics Australia Group
Simon Lucey, the director of the Australian Institute of Machine Learning
Julia Powles, the director of the UWA Minderoo Tech & Policy Lab
Mike Zimmerman, a partner at Main Sequence Ventures
“Australian-made and maintained robotics and automation systems have the potential to boost local manufacturing, open up export opportunities and create safer and more productive work environments,” Husic said.
Husic also said that the National Robotics Strategy Advisory Committee will aim to develop robotic strength while also developing human skills so that Australians still have access to secure, well-paying jobs. Husic asked for the strategy to be finalized by March 2023.
This year, we covered nearly 50 mergers and acquisitions worth billions of dollars. The SPAC craze of 2021 seemingly came to a crashing halt, but there was still plenty of M&A activity.
Below are 10 robotics acquisitions, in chronological order, that stood out to us in 2022 and two notable mergers. Take a look back at the notable acquisitions of 2021.
Aptiv acquires Wind River
Aptiv kicked off the year with a bang, announcing it was acquiring Wind River for $4.3 billion. The deal just closed on Dec. 23, but at a lower price point of $3.5 billion. The companies said the price was amended because of certain changes in Wind River's operating structure required to bring the regulatory approval process to a satisfactory conclusion.
Aptiv, a global mobility company, formed the autonomous driving joint venture now known as Motional with Hyundai Motor Group in 2019. Wind River is a global leader in developing software for the edge; its software runs on more than two billion edge devices across more than 1,700 customers globally, and the company generated approximately $400 million in revenue in 2021. Intel acquired Wind River in 2008 for $884 million and sold it to TPG Capital in 2018 for an undisclosed amount.
Aptiv said the acquisition allows it to execute against the large software-defined mobility opportunity and expand into multiple industries. Aptiv said it will combine Wind River Studio, a cloud-native intelligent systems software platform, with its SVA platform and automotive expertise.
Zebra Technologies acquires Matrox Imaging
Date Announced: March 15, 2022 Amount: $875M
Zebra Technologies acquired Matrox Imaging for $875 million. Matrox Imaging develops machine vision components and systems and generates annual sales of approximately $100 million.
Matrox offers platform-independent software, software development kits (SDKs), smart cameras, 3D sensors, vision controllers, input/output (I/O) cards, and frame grabbers, which are used to capture, inspect, assess, and record data from industrial vision systems in factory automation, electronics and pharmaceutical packaging, semiconductor inspection, and more.
Zebra Technologies in 2021 acquired Adaptive Vision, a machine vision software company, and launched its own line of fixed industrial scanning and machine vision systems. Zebra said the Matrox Imaging deal will complement the aforementioned products.
The deal expanded Zebra’s presence in the automation market. In 2021, Zebra acquired Fetch Robotics, a San Jose, Calif.-based developer of autonomous mobile robots for $290 million.
Bosch Rexroth acquires cobot maker Kassow
Date Announced: March 21, 2022 Amount: Undisclosed
Kassow Robots is developing 7-DOF cobot arms. (Credit: Kassow Robots)
Bosch Rexroth acquired a majority stake in Kassow Robots, a Denmark-based developer of 7-axis collaborative robot arms. Bosch said the acquisition enables it to offer one-stop solutions, especially for the consumer goods and mobility industries, including battery production, as well as for semiconductor production. Kassow has five cobots in its portfolio, with reaches from 850 mm to 1,800 mm and payloads from 5 kg to 18 kg.
Kassow Robots exited stealth mode at Automatica 2018. It was co-founded in 2014 by Kristian Kassow, a co-founder of Universal Robots, the leading developer of cobots. Universal Robots, founded in 2005 by Kassow, Esben Østergaard and Kasper Støy, was acquired a mere 10 years later by Teradyne for $285 million.
Sarcos buys RE2 Robotics
Date Announced: March 28, 2022 Amount: $100M
Sarcos Technology and Robotics Corporation acquired RE2 Robotics, a Pittsburgh-based developer of autonomous and teleoperated mobile robotic systems, for $100 million. The deal consists of $30 million in cash and $70 million of Sarcos common stock. Both companies have had long-term success with military and defense customers, but they had never collaborated.
The combined company now offers an extended product portfolio, which will enable it to target a much broader spectrum of customer needs across the commercial (aviation, construction, medical and subsea) and defense sectors. For example, the company has been working with Changi Airport Group to develop an outdoor-based baggage loading system that can automate the loading and unloading of loose passenger bags from a narrow-bodied aircraft.
American Robotics acquires Airobotics
Date Announced: July 5, 2022 Amount: Undisclosed
American Robotics acquired Airobotics, an Israeli developer of autonomous unmanned aircraft systems (UAS). The acquisition is intended to accelerate American Robotics’ technical development and regulatory roadmap and expand the breadth of applications, use cases and vertical targets.
Combining American Robotics and Airobotics also means bringing together leading engineering and aviation talent and two world-class technology platforms. In January 2021, Marlborough, Mass.-based American Robotics became the first company approved by the Federal Aviation Administration to operate fully autonomous drones without visual observers on-site. Before this approval, waivers and certifications awarded by the FAA required visual observers to be stationed along the flight path to monitor a drone’s airspace.
Amazon buying iRobot
Date Announced: August 5, 2022 Amount: $1.7B
Amazon agreed to buy consumer robotics giant iRobot for $1.7 billion back in August, but the deal is under review by the Federal Trade Commission (FTC). The antitrust investigation is focusing on whether the data provided by iRobot’s Roomba robot vacuum gives Amazon an unfair advantage in the retail industry. The FTC is also analyzing how the line of robot vacuums would fit in with Amazon’s existing smart home products.
Amazon and iRobot have had a relationship dating back to 2005 before iRobot went public. iRobot listed Explore Holdings LLC as an investor and named Elizabeth Korrell as its manager. Explore Holdings was another name for Bezos Expeditions, Jeff Bezos’ personal investment firm at the time, and Elizabeth Korrell was an attorney for Bezos. Beyond that, iRobot uses Amazon Web Services (AWS) and developed voice integration between Alexa and the Roomba.
Both companies could seemingly benefit from this deal. Amazon has struggled to enter the consumer robotics market and underwhelmed the industry with its Astro robot. iRobot’s second quarter revenue fell 30% this year due to weak demand and cancellations from retailers in North America and Europe, Middle East and Africa. The company also warned of weaker growth going forward. iRobot has diversified its product portfolio in recent years with non-robotics products, including a handheld vacuum and air purifiers.
Walmart acquires ASRS maker Alert Innovation
Date Announced: October 6, 2022 Amount: Undisclosed
Walmart, the world’s largest retailer, agreed to acquire Alert Innovation for an undisclosed price. Alert Innovation is a North Billerica, Mass.-based developer of robotic e-grocery fulfillment technologies.
Walmart began working with Alert Innovation in 2016 to build custom technology for its micro-fulfillment centers (MFCs). Walmart piloted its first MFC in Salem, N.H. in late 2019 using custom technology from Alert Innovation. The autonomous robot, named Alphabot, can store, retrieve and dispense orders by moving horizontally, laterally and vertically across three temperature zones without any lifts or conveyors.
At the time of the deal, Walmart said that “bringing the best of Alert’s technology and capabilities in-house will enable us to reach more customers quicker by deploying MFCs with greater speed, providing both an unmatched shopping experience and a competitive advantage in omnichannel fulfillment.”
Intrinsic acquires Open Source Robotics Corp
Date Announced: Dec. 15, 2022 Amount: Undisclosed
Intrinsic, a software company that launched out of the X moonshot division of Alphabet in mid-2021 to simplify the use of industrial robots, acquired the Open Source Robotics Corporation (OSRC). The OSRC is the for-profit arm of Open Source Robotics Foundation, which is the developer of the popular Robot Operating System (ROS). Intrinsic is also acquiring Open Source Robotics Corporation Singapore (OSRC-SG), the division of the company that led directly to the release of Open-RMF for interoperability.
To be clear, Intrinsic did not acquire the Open Source Robotics Foundation (OSRF), the non-profit that has been, and will continue to be, responsible for the day-to-day activities and development of ROS, Gazebo, Open-RMF and the entire ROS community.
Here’s why this deal is important going forward.
“As a small independent company at OSRC, it’s become increasingly challenging for us to meet the diverse needs of our large and growing user community and continue the commercial business of OSRC,” said Brian Gerkey, co-founder and now-former CEO of Open Robotics, who is joining Intrinsic along with many of his colleagues. “Greater institutional support from Intrinsic and the resources from this transaction allow our team to focus on what we do best and accelerate the development of ROS, Gazebo, and Open-RMF in a sustainable way.
“Together we will give the robotics community great new features in ROS, Gazebo, and Open-RMF, while also building new products and services on top. We will continue to improve ROS, Gazebo, and Open-RMF so that they can be used in even more domains, with ever-higher demands for software quality, testing, and platform support.”
Notable mergers
There were also several notable mergers during 2022. Teradyne companies Mobile Industrial Robots (MiR) and AutoGuide Mobile Robots merged to become a single supplier of autonomous mobile robots (AMRs). At the end of September 2022, the integrated company officially became known as Mobile Industrial Robots (MiR).
Prior to the merger, MiR offered a range of AMRs capable of carrying payloads and pallets up to 3,000 lb. (1350 kg). By combining with AutoGuide, the portfolio will expand to include high payload AMR tuggers and forklifts that will operate on the MiRFleet software.
Another notable merger took place between LiDAR makers Ouster and Velodyne. At press time, this merger wasn’t complete, but the combined company plans to leverage the complementary customer base, partners and distribution channels to accelerate LiDAR adoption. Combined, Ouster and Velodyne have 173 granted and 504 pending patents, and a cash balance of approximately $355 million as of the end of September 2022.
Pony.ai and Baidu announced that they have been issued a fully driverless autonomous vehicle road test permit by the Beijing Intelligent Connected Vehicle Policy Pilot Zone.
With this new permit, Pony.ai will test 10 driverless robotaxis in the pilot zone in Yizhuang, Beijing, over an area of 20 square kilometers (7.7 square miles). The robotaxis will be tested in difficult urban traffic situations with no one in the vehicle; a safety officer will monitor the test vehicles remotely. As with other regulatory approvals in Beijing over the past year, Pony.ai is one of only two AV companies to receive approval in the first group.
Over the past year, the Beijing Intelligent Connected Vehicle Policy Pilot Zone has announced a number of industry-leading autonomous driving policies in succession. In October 2021, the pilot zone opened up autonomous driving unmanned road tests for the first time and divided “autonomous” testing into three stages:
Nobody behind the steering wheel but there is a safety operator in the passenger seat
Nobody in the front row, but with a safety operator in the back row
"Fully driverless" – the stage covered by the newly issued permit for autonomous testing in Beijing
To enter this third stage, AV test vehicles need to have met strict technical and operational requirements such as test mileage and disengagement rate; Pony.ai’s ten test vehicles successfully passed the tests without any safety issues.
This authorization builds on other recent autonomous driving milestones for Pony.ai in Beijing. In November 2022, Pony.ai received approval to test with a safety operator in the back seat (the second of the three stages). Earlier in 2022, it acquired permission to offer fare-charging robotaxi services, and in April 2022 it became the first and only AV firm to acquire a taxi license in China.
The SAIC Marvel R model will be equipped with the latest generation of Pony.ai’s autonomous driving technology. | Credit: Pony.ai
Pony.ai recently announced that it is collaborating with SAIC AI Lab, a division of SAIC Motor (Shanghai Automotive Industry Corporation), China's largest auto manufacturer, to jointly explore and advance driverless technology. Together with SAIC AI Lab, Pony.ai launched a concept vehicle based on the SAIC Marvel R model and will build out a fleet of autonomous vehicles equipped with Pony.ai's L4-level driverless solutions over time.
Baidu wants to become the largest robotaxi provider in the world
An Apollo Go fully driverless robotaxi running on a public road in Wuhan during evening hours.
Baidu has been offering ride-hailing services with no human drivers since August 2022 in the cities of Chongqing and Wuhan, where they can operate over hundreds of square kilometers. Baidu will keep growing its operational area into 2023 as it plans to create the largest service area for fully driverless robotaxis in the world. The company is planning to add 200 vehicles to its fleet in 2023.
A total of 10 fully driverless test vehicles will travel across a 20 square kilometer area in Beijing Yizhuang Economic Development Zone, covering a series of complex urban road scenarios.
Baidu’s Apollo Go covers more than 10 Chinese cities, including all first-tier cities. Apollo Go conducted 474,000 rides in Q3 2022, up 311% year over year and 65% from the previous quarter. In first-tier cities like Beijing and Shanghai, each Apollo Go robotaxi can give 15 rides a day on average. Apollo Go had given 1.4 million rides by Q3 2022. As Baidu expands robotaxi service, it is one step closer to providing autonomous driving services to more people and consolidating its leading position in the worldwide autonomous ride-hailing market.
Earlier this week, Baidu also announced a major expansion of its commercialized fully driverless robotaxi service in Wuhan, tripling the size of its operation area, increasing the number of robotaxis in service and expanding operating time to include key evening hours.
“Backed by its solid AI technology, Baidu Apollo has created a safe, intelligent and efficient autonomous driving technology system, bringing robotaxi services from designated zones to open roads at scale,” said Jingkai Chen, an autonomous driving technology expert at Baidu, at the event. The generalization ability of Baidu's autonomous driving technology has progressed faster than expected: the lead time to deploy the technology in a new city is now only 20 days.
The Overture Maps Foundation, created by the Linux Foundation, aims to help developers who build map services or use geospatial data. | Source: Overture Maps Foundation
The Linux Foundation announced it formed the Overture Maps Foundation, a collaborative effort to create interoperable open map data as a shared asset. The Overture Maps Foundation aims to strengthen mapping services worldwide and enable current and next-generation mapping products. These mapping services could be crucial to robotic applications like autonomous driving.
Currently, companies developing and rolling out autonomous vehicles have to spend massive amounts of time and money meticulously mapping the cities they’re deploying in. Additionally, those companies have to continuously remap those cities to account for any changes in road work or traffic laws.
The foundation was founded by Amazon Web Services (AWS), Meta, Microsoft and TomTom. Overture hopes to add more members in the future to incorporate a wide range of signals and data inputs. Members of the foundation will combine their resources to create map data that is complete, accurate and refreshed as the physical world changes. The resulting data will be open and extensible under an open data license.
“Mapping the physical environment and every community in the world, even as they grow and change, is a massively complex challenge that no one organization can manage. Industry needs to come together to do this for the benefit of all,” Jim Zemlin, executive director for the Linux Foundation, said. “We are excited to facilitate this open collaboration among leading technology companies to develop high quality, open map data that will enable untold innovations for the benefit of people, companies, and communities.”
The Overture Maps Foundation aims to build maps using data from multiple sources, including Overture members, civic organizations and open data sources, and to simplify interoperability by creating a system that links entities from different datasets to the same real-world entities. All data used by Overture will undergo validation to ensure there are no map errors, breakage or vandalism in the mapping data.
Overture also aims to help drive the adoption of a common, structured and documented data schema to create an easy-to-use ecosystem of map data. Currently, developers looking to create detailed maps have to source and curate their data from disparate sources, which can be difficult and expensive. Not to mention, many datasets use different conventions and vocabulary to reference the same real-world entities.
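Overture has not yet published its schema, but the idea of linking records from different datasets to one real-world entity can be sketched generically. In the hypothetical example below, two place records are merged under a single canonical ID when their names are similar and their coordinates are within roughly 50 meters.

```python
# Hypothetical illustration of map-data conflation, not Overture's actual schema:
# link records from two datasets to one canonical entity when the names are
# similar and the locations are within roughly 50 m of each other.
import math
from difflib import SequenceMatcher

dataset_a = [{"id": "a1", "name": "Main St Cafe", "lat": 47.6205, "lon": -122.3493}]
dataset_b = [{"id": "b7", "name": "Main Street Cafe", "lat": 47.6206, "lon": -122.3494}]

def meters_apart(p, q):
    # Small-distance approximation, adequate for a conflation threshold.
    dlat = (p["lat"] - q["lat"]) * 111_320
    dlon = (p["lon"] - q["lon"]) * 111_320 * math.cos(math.radians(p["lat"]))
    return math.hypot(dlat, dlon)

links = []
for a in dataset_a:
    for b in dataset_b:
        name_sim = SequenceMatcher(None, a["name"].lower(), b["name"].lower()).ratio()
        if name_sim > 0.8 and meters_apart(a, b) < 50:
            links.append({"canonical_id": f"place-{a['id']}-{b['id']}",
                          "sources": [a["id"], b["id"]]})

print(links)  # both source records now point at the same real-world place
```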
“Microsoft is committed to closing the data divide and helping organizations of all sizes to realize the benefits of data as well as the new technologies it powers, including geospatial data,” Russell Dicker, Corporate Vice President, Product, Maps and Local at Microsoft, said. “Current and next-generation map products require open map data built using AI that’s reliable, easy-to-use and interoperable. We’re proud to contribute to this important work to help empower the global developer community as they build the next generation of location-based applications.”
Overture hopes to release its first datasets in the first half of 2023. The initial release will include basic layers including buildings, road and administrative information, but Overture plans to steadily add more layers like places, routing or 3D building data.
We already recapped the most memorable and most popular stories of 2022, as well as the major acquisitions. You can find all of The Robot Report‘s 2022 Year in Review coverage here.
With 2023 just underway, we asked some of the robotics industry’s leading minds to look to the future. Here’s what they’ll be keeping an eye on in 2023. This article will be updated if additional experts weigh in.
Ken Goldberg, professor of industrial engineering and operations research and William S. Floyd Jr. Distinguished Chair in Engineering, UC Berkeley; co-founder and chief scientist, Ambi Robotics
Two of my predictions for 2022 were accurate (the rise of tactile sensing and Sim2Real), but the division of labor between robots and humans is still evolving. On the other hand, I didn't anticipate the quantum leap in performance of Transformer architectures, seen in Large Language Models (LLMs) such as ChatGPT and in generative image models such as Stable Diffusion.
Here are three predictions for 2023:
Transformer architectures will have an increasing influence on robotics
LLMs learn by ingesting vast quantities of human-written text to set millions of weights in a transformer sequential network architecture. LLMs are not grounded in physical experience, but textual captions and images can be integrated to produce surprisingly interesting hybrid images. A recent project by Google researchers shows how LLMs can provide semantic links between human requests (“please help me clean up this spill”) and robot affordances (a sponge within reach). It’s not clear yet exactly how, but I think we’ll see Transformer architectures applied to robotics in 2023 using similarly large sample sizes, such as examples of driving that are being collected by Google, Cruise, Toyota, Tesla, and others.
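That Google work (the "clean up this spill" example appears to come from the SayCan project) pairs the language model's estimate of how relevant a skill is with the robot's own estimate of whether the skill is currently feasible. The toy sketch below shows that scoring pattern with hard-coded numbers standing in for both models.

```python
# Toy sketch of the grounding pattern described above: an LLM's relevance score
# multiplied by an affordance (feasibility) score. The numbers are hard-coded
# stand-ins, not outputs of any real model.
request = "please help me clean up this spill"
skills = ["find a sponge", "pick up the sponge", "go to the table", "bring me a soda"]

# Stand-in for LLM relevance: how useful is each skill for the request?
llm_relevance = {"find a sponge": 0.85, "pick up the sponge": 0.60,
                 "go to the table": 0.40, "bring me a soda": 0.05}

# Stand-in for the robot's affordance model: can it actually do this right now?
affordance = {"find a sponge": 0.90, "pick up the sponge": 0.20,
              "go to the table": 0.95, "bring me a soda": 0.70}

# A skill is selected only if it is both relevant and feasible.
scores = {s: llm_relevance[s] * affordance[s] for s in skills}
best = max(scores, key=scores.get)
print(f"For '{request}', execute first: {best} (score {scores[best]:.2f})")
```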
ROS 2 will gain traction as a standard for industrial robots
With ROS 2, the Open Source Robotics Corporation (OSRC) has dramatically revised ROS to make it much more reliable and compatible with industry standards. In December, Intrinsic, a division of Alphabet, acquired OSRC to combine forces, boost the speed, reliability and security of this standard, and integrate the latest advances in software architectures and cloud computing. The process will take longer than one year, but I think we'll see ROS 2 taken much more seriously by major robotics and automation companies in 2023.
Indoor farming using agricultural robotics will mature
Advances in LED lighting and hydroponics developed for “recreational” crops are being adopted by indoor farming centers located in large warehouses close to urban centers. Robotics can be used to observe plant conditions and to monitor and fine-tune lighting and temperature, allowing fresh crops to be harvested every week. Indoor crops avoid pesticides and require far less water than traditional farming because there is little evaporation and almost no washing, and local farms reduce transportation costs. I look forward to eating more spotless, fresh lettuce and produce in 2023.
Aaron Prather, director of robotics and autonomous systems program, ASTM International
2023 is going to be a year with more opportunities for robotics research through numerous government programs. Two of the biggest will be the Manufacturing Extension Partnership (MEP) and Manufacturing USA, both of which are seeing boosts in no small part due to the CHIPS Act. Both programs will see massive increases in their federal funding: MEP will see an over 70% increase, while Manufacturing USA will see a whopping increase of nearly 500%.
Much of the increase in funding for Manufacturing USA will go toward opening more institutes to join existing Manufacturing USA organizations like the ARM Institute in Pittsburgh, CESMII in Los Angeles and MxD in Chicago. These new institutes will focus on the semiconductor industry, from materials to production to shipping. Each vertical will require automation and robotics research.
This does not mean the 16 existing Manufacturing USA institutes will go lacking. Not only are most of them seeing increases in their core funding, but they will also get funding from new sources to expand into more areas. All of this is going to lead to many more projects between industry, academia and government.
Other organizations, like the National Science Foundation (NSF) and National Institutes of Health (NIH), are seeing increases in funding that could go into further robotics research.
One potential downside will be the selection criteria the U.S. government puts on this R&D work. The federal government's growing concern about Chinese technologies could limit who can participate in these funded projects. In October, the U.S. Department of Defense made its ban on Chinese drone maker DJI official. DJI is now one of several dozen Chinese companies the U.S. government deems too closely tied to China's military.
However, the opportunities this funding will create for the robotics industry will be huge. The recent request by some of these institutions for more SMEs to participate, especially integrators and installers of automation equipment, shows how much impact this additional funding will have from the lab to the factory floor.
An Argo AI vehicle performing a driverless test ride in Austin. Argo AI shut down in October. | Source: Argo AI
William Sitch, chief business officer, MSA
Do you remember three years ago when the pandemic started, the workplace shut down, and the future of humanity was uncertain? It turns out that was a good time to raise money: U.S. VC investment in 2020 went up 15% over 2019. But 2021 was literally twice as nice for raising dollars – truly the golden age for starting companies!
In 2022, all that irrational exuberance came crashing down. Inflation roared, the Fed raised rates, and the markets are blood red – the NASDAQ is down 35% at the time I'm writing this. VC funding is now back to pre-exuberance levels. Exits mostly stopped.
Amid the bad macro backdrop, robots and automation got crushed. Firms shuttered, good engineers were laid off, and lots of autonomy work product was wasted. Maybe these failed concepts were problematic, or early, or whatever, but individuals and the industry suffer when these things happen. So with a boom-bust cycle reverting us back to the mean, here are my predictions for robotics and autonomy businesses in 2023:
More pain
The macro picture just doesn't look good. The Fed will continue to raise rates, and I think our current recession will continue through midyear. Layoffs to extend startup runway will continue. More shutdowns will happen. Some big names are teetering on the brink and will fail in 23H1. Fingers crossed for TuSimple.
More startups
AI is all the rage, AgTech is on a tear (farmland appreciated 14% from 2021), logistics automation is ROI-positive, and some failures will be the catalyst for new companies. Good ideas and bravery don’t just go away during times of pain, and there’s still money out there for raising. VC funding will be horrible in the first quarter but will accelerate through the end of the year.
Aggressive robotaxi expansion
I don’t buy the thesis that Argo’s shutdown was the end of AV. Cruise and Waymo have demonstrated product/market fit. Failures will happen, but vehicles will continue to get incrementally safer. GM says it will spend $2B on Cruise’s expansion into new markets with the purpose-built Origin. Alphabet’s Waymo One is also expanding. Zoox will launch. 2023 will be the first year robotaxis get mainstream awareness.
I just can’t with Tesla
$800B value destruction by a distracted CEO who needs a social media timeout. California, Euro- and U.S. federal and state regulators coming for Tesla FSD. Deaths attributed to driver error by last-millisecond autonomy disengagements. Traditional OEMs with positive brand equity showing up with L2+ and DMS. Headwinds for sure. Still, $18B net cash and $9B FCF in 2022 is remarkable. I’m out of my league here; I don’t know what’s going to happen and can’t make a prediction. It would be nice if they deployed radar, fused sensor data, and stopped running over mannequins.
Functionality becomes leading indicator of success
Restrictive ODDs, development during deployment and minimally-viable products that are too minimal – these things will cripple adoption, limit growth, and restrict funding. Robots that only work 95% of the time will cause companies to fail. Success will come to those who develop robust prototypes that work before mass deployment.
TL;DR I see 2023 as a second-half recovery story. I’d love to hear your feedback. Tell me how I’m wrong!
Deepu Talla, VP of embedded and edge computing, NVIDIA
Demand for intelligent robots will continue to grow as more industries embrace automation to address supply chain challenges and labor shortages. We see two key trends emerging, as developing and deploying these new AI-based robots drives the need for advanced simulation technology that places them in realistic scenarios.
Millions of virtual proving grounds: photorealistic rendering and accurate physics modeling combined with the ability to simulate in parallel millions of instances of a robot on GPUs in the cloud will enable more robots to be trained and validated in virtual worlds. And generative AI techniques will make it easier to create highly realistic 3D simulation scenarios and further accelerate the adoption of simulation and synthetic data for developing more capable robots.
Expanding the horizon: the majority of robots today operate in constrained environments where there is minimal human activity. Advances in AI and edge computing will give robots multi-modal perception for better semantic understanding of their environments. Roboticists will be able to teach robots to perform increasingly complex tasks while making them faster, flexible and safer to operate in collaboration with humans in dynamic environments. This will drive increased adoption in brownfield facilities and public spaces such as hospitals, hotels, retail stores and more.
Several folks from Tangram Vision, a startup that helps robotics companies solve perception challenges, sent us their thoughts.
Brandon Minor, CEO & co-founder
ROS 2 will eclipse ROS as the platform of choice for roboticists. This is partially due to the looming deprecation of ROS, but also due to the fact that ROS 2 has seen significant development on the part of the robotics community that has made it much more tenable as a solution.
There will be a lot more consolidation in the world of autonomous vehicles. Despite a flight to more constrained ODDs, there are likely a number of AV startups that may still find themselves requiring investment. A negative narrative around AVs, coupled with ambivalent investors, will force them to sell, merge, or close their doors, unfortunately.
Julie Matheney, director of marketing
Thermal cameras will transition from an exotic sensor choice to a typical sensor choice. As a result, we’ll see them as part of the sensor array on many more robotic and autonomous vehicle platforms in 2023.
Adam Rodnitzky, COO & co-founder
We’ll see attempts to use generative tools to create robotics code like ROS nodes. These initial attempts won’t be that successful, but they’ll be the first step towards generative code finding its way into the world of robotics.
Jeremy Steward, senior perception architect
Rust uptake will increase at robotics and AV companies. The Linux kernel just released a version with Rust support in it, and more and more userspace libraries for working with ROS 2 from Rust are now available.
Robotics companies will be bearish on the hiring front. They will instead expect existing engineering teams to produce more code with fewer resources.
Joel Carter, chief marketing officer, Softeq; managing partner, Softeq Venture Fund
As contract developers for some of the world’s top tech and robotics companies, Softeq views 2023 as a breakout year for robotics in the 3D internet. Metaverse applications are real and now widely available for virtual robot design, creation, programming, and testing thanks to new tools like Omniverse from NVIDIA. The platform is compatible with the popular open-source Robot Operating System (ROS) and includes a terrific physics engine explicitly tuned for industrial automation applications. We’ll also continue seeing the widespread availability of edge and cloud AI/ML algorithms to make machines even faster, smarter, and more intuitive.
Keith Pfeifer, president, Aerobotix
Increased job satisfaction and retention for humans working with robots
Especially for jobs that humans find dull, dangerous and dirty, robots will continue to lessen the burden of performing these tasks. As a result, humans will be freed to perform jobs that are more interesting, including supervising the robots.
There’s been a fear that the growing use of robots will cause more unemployment, but to the contrary, the World Economic Forum believes there will be a net positive of 12 million jobs created for humans by the year 2025.
A reduction in workplace-related injuries and diseases
The EPA recently released a new human health assessment for hexavalent chromium, which can cause cancer and is just one of the many chemical compounds that workers are often exposed to during industrial processes. Robots exposed to contaminant substances obviously can’t contract the same kinds of diseases that humans can, nor can they suffer the many injuries that humans can and do in the workplace.
Workplace-related injuries and diseases will decrease for robot-friendly companies, and both employees and employers will be better off for it.
Companies using robots will save tremendous amounts of money
While the upfront costs to install automated systems can be significant, organizations that embrace robot technology will achieve major cost savings as a result of improved labor and time efficiencies. They’ll also have a safer work environment, which means lower insurance costs and less exposure to civil or criminal liability.
Cassie, the bipedal robot developed at Oregon State University (OSU), averaged just over 4 m/s during its 100-meter sprint, slightly slower than its top speed, because it started from a standing position and returned to that position after the run, a challenging aspect of the robot's development, according to the researchers behind it.
“Starting and stopping in a standing position are more difficult than the running part, similar to how taking off and landing are harder than actually flying a plane,” OSU AI Professor and collaborator on the project Alan Fern said. “This 100-meter result was achieved by a deep collaboration between mechanical hardware design and advanced artificial intelligence for the control of that hardware.”
Cassie has knees that bend like those of an ostrich, the fastest-running bird on the planet, capable of running about 43 mph. The robot has no cameras or external sensors, meaning it is blind to its environment and is not autonomous.
Since Cassie's introduction in 2017, OSU students have been exploring machine learning options in Oregon State's Dynamic Robotics and AI Lab, where Cassie has been learning how to run, walk and even go up and down stairs. To develop its robot control, the Dynamic Robotics and AI Lab melded physics with AI approaches that are typically used with data and simulation.
The team compressed a year's worth of simulated training into just a week by using a computing technique called parallelization, in which multiple processes and calculations happen at the same time. This allowed Cassie to go through a range of training experiences simultaneously.
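OSU has not said which tools were used for this, but the parallelization idea itself is simple: run many independent simulation episodes at once so wall-clock time shrinks roughly with the number of workers. The sketch below illustrates the pattern with Python's standard multiprocessing module and a stand-in episode function.

```python
# Generic illustration of parallelized simulation, not OSU's training code:
# independent episodes run concurrently across worker processes.
import random
from multiprocessing import Pool

def run_episode(seed):
    # Stand-in for one simulated training episode; returns a fake reward.
    rng = random.Random(seed)
    return sum(rng.uniform(-1, 1) for _ in range(1000))

if __name__ == "__main__":
    seeds = range(10_000)             # 10,000 episodes to simulate
    with Pool(processes=16) as pool:  # 16 workers run episodes at the same time
        rewards = pool.map(run_episode, seeds)
    print(f"mean reward over {len(rewards)} episodes: {sum(rewards) / len(rewards):.3f}")
```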
In 2021, Cassie traveled 5 kilometers in just over 53 minutes across OSU’s campus, untethered and on a single battery charge. During the run, Cassie used machine learning to control a running gait on outdoor terrain.
The bipedal robot was developed under the direction of Jonathan Hurst, an Oregon State robotics professor and the co-founder and chief technology officer of Agility Robotics, with a 16-month, $1 million grant from the Defense Advanced Research Projects Agency (DARPA) and additional funding from the National Science Foundation.
“This may be the first bipedal robot to learn to run, but it won’t be the last,” Hurst said. “I believe control approaches like this are going to be a huge part of the future of robotics. The exciting part of this race is the potential. Using learned policies for robot control is a very new field, and this 100-meter dash is showing better performance than other control methods. I think progress is going to accelerate from here.”
An early prototype of Tesla Inc.'s proposed Optimus humanoid robot slowly and awkwardly walked onto a stage, turned, and waved to a cheering crowd at the company's artificial intelligence event Friday.