When artificial intelligence systems encounter scenes where objects are not fully visible, they have to make estimates based only on the visible parts of the objects. This partial information leads to detection errors, and large amounts of training data are required to correctly recognize such scenes. Now, researchers at the Gwangju Institute of Science and Technology have developed a framework that allows robot vision to detect such objects successfully, in much the same way that we perceive them.
In a global first, scientists have demonstrated that molecular robots can deliver cargo by swarming, achieving a transport efficiency five times that of single robots.
Materials scientists aim to develop biomimetic soft robotic crawlers, including earthworm-like and inchworm-like crawlers, to realize locomotion via in-plane and out-of-plane contractions for a variety of engineering applications. While such devices can move effectively in confined spaces, miniaturizing the concept is challenging due to complex and limited actuation. In a new report now published in Science Advances, Qiji Ze and a team of scientists in mechanical engineering and aerospace engineering at Stanford University and the Ohio State University, U.S., described a magnetically actuated, small-scale origami crawler exhibiting in-plane contraction. The team achieved the contraction mechanism via a four-unit Kresling origami assembly to facilitate the motion of an untethered robot with crawling and steering capabilities. The crawler overcame large resistances in severely confined spaces due to its magnetically tunable structural stiffness and anisotropy. The setup also provided a mechanism for drug storage and release, with the potential to serve as a minimally invasive device in biomedicine.
Yes, those 7-foot-tall machines at Dallas Love Field are watching you. They want to make sure you're wearing a mask if you're boarding a flight, and that you're not parking too long at the curb if you're picking up a returning traveler.
New research from the University of Hertfordshire reveals how humans could develop more natural, social interactions with robots in the future.
In a lab at the University of Washington, robots are playing air hockey.
MIT engineers have developed a telerobotic system to help surgeons quickly and remotely treat patients experiencing a stroke or aneurysm. With a modified joystick, surgeons in one hospital can control a robotic arm at another location to safely operate on a patient during a critical window of time that could save the patient's life and preserve their brain function.
Can robots adapt their own working methods to solve complex tasks? Researchers at Chalmers University of Technology, Sweden, have developed a new form of AI that, by observing human behavior, can adapt to perform its tasks in a changing environment. The hope is that robots that can be flexible in this way will be able to work alongside humans to a much greater degree.
The notion of a large metallic robot that speaks in monotone and moves in lumbering, deliberate steps is somewhat hard to shake. But practitioners in the field of soft robotics have an entirely different image in mind—autonomous devices composed of compliant parts that are gentle to the touch, more closely resembling human fingers than R2-D2 or Robby the Robot.
To effectively interact with humans in crowded social settings, such as malls, hospitals, and other public spaces, robots should be able to actively participate in both group and one-to-one interactions. Most existing robots, however, have been found to perform much better when communicating with individual users than with groups of conversing humans.