By Louis DiPietro

An insect-like robot developed by researchers from North Carolina State and Cornell University may be the future of robot-assisted farming. A fleet of nimble, coordinated drones developed by Syracuse University could help save lives after natural disasters. And fresh machine learning (ML) models developed at Cornell could help train robots to detect their own missteps by reading reactions from human teammates, like verbal cues or even visible frustration.

This trio of robotics projects represents a small sampling of the dizzying 122 projects featured at the Northeastern Robotics Conference (NERC), held Saturday, Oct. 11, in the Statler Hotel and Duffield Hall on the Cornell campus. 

“The feedback has been overwhelmingly positive, and it’s inspiring to see growing enthusiasm for regional conferences like NERC,” said Tapomayukh “Tapo” Bhattacharjee, assistant professor of computer science and NERC faculty lead. “What makes this event special is its focus on the remarkably diverse robotics research ecosystem in the Northeast.” 

The day-long conference – the largest NERC to date, with 250 attendees from around the world – featured four keynote talks from leading robotics researchers, “Rising Star” talks from eight promising early-career scholars, lab demos, and enough robotics research to fill the Duffield Hall atrium, including Unitree’s shadowboxing humanoid and a backflipping robot dog. Cornell Bowers, Cornell Engineering, Fourier, the Robotics and AI Institute, Unitree, Clearpath Robotics, and XDOF sponsored the event.

The keynote speakers were: Laura Herlant, research director at the Robotics and AI Institute; Wendy Ju, associate professor of information science at Cornell Tech, the Cornell Ann S. Bowers College of Computing and Information Science, the Jacobs Technion-Cornell Institute, and the multicollege Department of Design Tech; Anirudha Majumdar, associate professor of mechanical and aerospace engineering at Princeton University; and Vickie Webster-Wood, associate professor of mechanical engineering at Carnegie Mellon University.

“The talks showcased an impressive range – from soft and bio-hybrid robotics and robot learning to human–robot interaction, planning and control, robot safety, multi-agent coordination, marine robotics, and assistive technologies, to name a few,” Bhattacharjee said. “It’s exciting to see how these areas are converging to push the boundaries of what robots can perceive, learn, and accomplish in the real world.”

During the first poster session on Saturday morning, Zhenghua Zhang, a doctoral student in biological and agricultural engineering at North Carolina State University, demoed a hexapod robot about the size of a small dog. Called AgHexaNav, the robot is designed to see and step its way through dense vine crops to survey plants, water seedlings, and administer herbicides. Its six legs and high clearance are key features, Zhang said, allowing AgHexaNav to navigate delicately through crops while minimizing plant damage, unlike a wheeled robot that would steamroll anything in its path. AgHexaNav’s developers include Lirong Xiang, assistant professor of biological and environmental engineering in the Cornell College of Agriculture and Life Sciences (CALS).

Nearby, Neon Srinivasu and Nazanin Hashkavaei, two doctoral students in mechanical and aerospace engineering at Syracuse University, presented a poster on their system to coordinate a fleet of drones – from a few to dozens – to monitor and track natural disasters and assist in recovery efforts. Each drone is equipped with multiple sensors and an optical camera that allow the drones to “see” each other in formation and share information with one another, Srinivasu said.

“With this scheme that we developed, we can scale it up to any number of drones,” Srinivasu said, citing as an example a large wildfire, which might require dozens of drones to monitor and track. “We’re not using any sort of machine learning or AI [artificial intelligence] tools, which can be highly computationally demanding. That makes our scheme much faster than other existing schemes.”
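Srinivasu didn’t detail the team’s control law, but camera-based fleets like this are often coordinated with consensus-style updates, in which each drone adjusts its position using only the relative positions of the neighbors it can see – no central planner and no learned model, which is why the cost per drone stays flat as the fleet grows. The Python sketch below is a minimal, hypothetical illustration of that general idea; the function, variable names, and gains are assumptions for illustration, not the Syracuse team’s actual scheme.

```python
import numpy as np

def formation_step(positions, offsets, neighbors, gain=0.5, dt=0.1):
    """One decentralized formation-control update (illustrative only).

    positions : (N, 2) array of current drone positions
    offsets   : (N, 2) array of desired positions in the formation frame
    neighbors : dict mapping each drone to the drones its camera can see
    """
    new_positions = positions.copy()
    for i, visible in neighbors.items():
        error = np.zeros(2)
        for j in visible:
            # Disagreement between desired and actual relative placement.
            error += (offsets[i] - offsets[j]) - (positions[i] - positions[j])
        # Each drone nudges itself toward agreement with visible neighbors;
        # no central planner, so per-drone computation is constant no matter
        # how large the fleet gets.
        new_positions[i] = positions[i] + gain * dt * error
    return new_positions

# Example: four drones converging to a 5 m square from random start points.
pos = np.random.rand(4, 2) * 10
offs = np.array([[0.0, 0.0], [5.0, 0.0], [5.0, 5.0], [0.0, 5.0]])
nbrs = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}
for _ in range(200):
    pos = formation_step(pos, offs, nbrs)
```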

Across the atrium, Shannon Liu ’27, a computer science major at Cornell, presented her project, “Training Models to Detect Successive Robot Errors from Human Reactions.” Robots don’t know when they’ve failed, but humans do, and sometimes react with hesitation, confusion, frustration, or a mix of all three. These human verbal and nonverbal expressions are useful to researchers like Liu, who are exploring how ML models can help robots know when they’ve erred based on responses from their human teammates.

“We’ve found that, yes, machine learning models can perform well in detecting successive robot errors based on human reactions,” said Liu, who is the lead author of a paper detailing this work, “‘I’m Done’: Describing Human Reactions to Successive Robot Failure,” which was published in the Proceedings of the Association for Computing Machinery (ACM)/Institute of Electrical and Electronics Engineers (IEEE) Conference on Human-Robot Interaction earlier this year.  
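Liu’s paper spells out the actual features and models; purely as an illustration of the framing, error detection of this kind can be cast as supervised classification over features extracted from a human’s reaction. The sketch below uses synthetic data and hypothetical features (gaze shifts, pause length, a speech-sentiment score) with an off-the-shelf scikit-learn classifier – assumptions for illustration, not the paper’s method.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-ins for per-reaction features: [gaze_shifts, pause_sec,
# sentiment]. A real system would extract these from video and audio.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
# Synthetic labels: 1 if the robot had just erred. Here we fake a pattern
# in which long pauses and frequent gaze shifts tend to follow errors.
y = (X[:, 1] + 0.5 * X[:, 0] > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(random_state=0).fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```

Under this framing, generalizing to new people – the question Liu raises next – amounts to asking whether the model has learned reaction patterns typical of humans in general rather than idiosyncrasies of the study’s 26 participants.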

Liu next intends to evaluate whether these same models can generalize to reading behaviors from people beyond the 26 participants in her study.

“If a robot has never seen this particular human react in a very frustrated way but knows that typically humans will react in this certain way, then the robot could predict that it’s made multiple errors in a row,” she said.

With more than 100 projects presented, NERC also provided ample opportunity for networking among attendees. Srinivasu, from the Syracuse University drone project, said this was his primary reason for attending the conference.

“I just met a guy who’s doing similar work,” he said. “To meet people and see what they’re working on – maybe we can integrate or collaborate with them” in the future.

Louis DiPietro is a writer for the Cornell Ann S. Bowers College of Computing and Information Science.