
UC Berkeley Team Commands Drone Fleet with HoloLens

Mar 21, 2017

User interfaces for computers have evolved over the years, from the introduction of the keyboard and mouse on the personal computer, to touchscreens on mobile devices, to natural voice recognition. However, the same cannot be said for robots or drones—until now.

There are two standard ways for humans to interface with robots today: a control panel or a command-line tool. Neither method is as intuitive as a touchscreen is on a smartphone or tablet.

Dr. Allen Yang, executive director at the Center for Augmented Cognition at the University of California, Berkeley, believes that augmented reality is the next evolution of interfaces for robots and drones.

"The interface for robots have not changed for the past fifty years. The emerging market of AR may provide new ways for human users to more intuitively interact with robots," said Dr. Yang in an interview with NextReality.


Dr. Allen Yang, executive director at the Center for Augmented Cognition, believes augmented reality could be the first major innovation in robot control interfaces in fifty years.

Expanding AR Research

Based in the College of Engineering, the Center for Augmented Cognition started as a division of the Berkeley Robotics and Intelligent Machines lab. The center supports research by faculty and students in applying augmented and virtual reality technologies to human cognition modeling, human-computer interaction, and human-robot collaboration.

Last week, UC Berkeley announced that a corporate gift from virtual reality entertainment company Immerex would facilitate the opening of a new lab and collaboration space for the Center for Augmented Cognition.

"We expect the new lab will make a positive impact in strengthening Berkeley's leading role in Silicon Valley and globally on the innovation of disruptive technologies that connect information, people and society," said S. Shankar Sastry, dean of the College of Engineering, as well as a co-director of the Center for Augmented Cognition, in a press release.


The new lab for augmented and virtual reality research was made possible by a donation from Immerex, a virtual reality entertainment company.

Named the Immerex Virtual Reality Lab, the space will give students access to advanced equipment for developing their projects. In addition, the corporate gift will fund fellowships for student support and AR/VR classroom renovations.

"In this new lab, we will be able to create synergies among Berkeley's various programs in AR/VR and connect researchers and students to Immerex's industry-leading technologies and channels in the emerging virtual reality global market."

ISAACS: AR as Human-Machine Interface

Among the projects students and faculty at the center are undertaking is a method for controlling drones with augmented reality. Immersive Semi-Autonomous Aerial Command System (ISAACS) is an open-source project that uses Microsoft HoloLens as the command interface for a drone fleet, though it could eventually be applied to robots as well.

The ISAACS program is led by Dr. Yang, who brings expertise in image recognition, along with Dean Sastry and Prof. Claire J. Tomlin, both of whom have studied drone control and safety for the past ten years.

"What we've proposed with the ISAACS program is to combine the new capability to understand humans with the new capability to understand robots using control theory and create a new kind of interface," Dr. Yang told NextReality. "That interface has to be intuitive, so you don't have to learn how to program robots, and it also has to be immersive, the reason being that the robots are physically sharing the same space with humans."


Augmented reality can provide a more natural interface for humans to command drones and robots and for those machines to understand humans.

According to Dr. Yang, AR allows for a more natural connection between man and machine by facilitating two-way communication. Through AR and machine learning, robots can understand commands and intent from humans via voice, gestures, and posture. At the same time, humans can better understand robots by seeing their field of view from a first-person perspective, as well as by having information such as battery level or current task displayed in the same view.
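
As a rough illustration of that two-way loop, here is a minimal sketch in Python. It is not ISAACS code; the DroneLink and ArDisplay classes and the command table are hypothetical stand-ins for a recognized voice phrase or gesture going out as a drone command and telemetry flowing back into the operator's heads-up display.

```python
# Minimal sketch (not the ISAACS codebase): route a recognized voice phrase
# or gesture to a drone command, then echo telemetry back to the AR display.
# DroneLink, ArDisplay, and the command table are hypothetical stand-ins.

DRONE_COMMANDS = {
    "take off": "takeoff",
    "land": "land",
    "follow me": "follow",
    "air_tap": "goto_waypoint",  # e.g., an air-tap gesture on a mapped point
}

class DroneLink:
    """Stand-in for a command/telemetry channel to a single drone."""
    def send(self, command):
        print(f"-> drone: {command}")

    def telemetry(self):
        return {"battery_pct": 87, "task": "hovering"}

class ArDisplay:
    """Stand-in for the operator's heads-up display in the HoloLens."""
    def show(self, info):
        print(f"[HUD] {info}")

def handle_input(drone, hud, recognized):
    """Map a recognized input to a drone command; surface state back to the HUD."""
    command = DRONE_COMMANDS.get(recognized)
    if command is None:
        hud.show(f"Unrecognized input: {recognized!r}")
        return
    drone.send(command)
    hud.show(drone.telemetry())  # close the loop: show battery and current task

handle_input(DroneLink(), ArDisplay(), "take off")
```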

In addition to the AR-based interface, ISAACS also coordinates localization and visualization between operator and vehicle and provides a framework for vehicle safety assurance. Using a real-time simultaneous localization and mapping (SLAM) solution, ISAACS can localize 3D coordinates between the drones and the HoloLens operator. ISAACS also guards against accidents caused by human error by connecting with a drone's low-level controller and by optimizing the operator's situational awareness through the AR interface.
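
To make the localization step concrete, the sketch below (again hypothetical, not ISAACS code) assumes the SLAM solution yields 4x4 world-frame poses for both the headset and a drone; a waypoint the operator selects in the HoloLens frame can then be chained through the shared world frame into the drone's frame.

```python
# A minimal sketch, assuming a SLAM solution provides 4x4 homogeneous poses
# for the HoloLens and a drone in a shared world frame. The poses and the
# waypoint below are made-up numbers for illustration.
import numpy as np

def pose(R, t):
    """Build a 4x4 homogeneous transform from a rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical SLAM output: world <- hololens and world <- drone.
T_world_hololens = pose(np.eye(3), np.array([0.0, 1.6, 0.0]))  # operator's head
T_world_drone = pose(np.eye(3), np.array([5.0, 3.0, 2.0]))     # drone nearby

# Waypoint the operator picked, 2 m in front of the headset (homogeneous coords).
p_hololens = np.array([0.0, 0.0, 2.0, 1.0])

# Chain the transforms: drone <- world <- hololens.
p_drone = np.linalg.inv(T_world_drone) @ T_world_hololens @ p_hololens
print(p_drone[:3])  # the same waypoint, expressed in the drone's frame
```

A safety layer of the kind the team describes would then sit between this waypoint and the drone's low-level controller, rejecting or clamping commands that would violate a geofence or collision constraint.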

The goal is to make sure that robots can work for normal people. Those are the people who might not necessarily have a computer science or electronic engineering background.

— Dr. Allen Yang, Center for Augmented Cognition

The team has already impressed the business community. Last year, ISAACS was named among the inaugural recipients of the Microsoft HoloLens Research Grant. Another stakeholder, DJI, provides the ISAACS team with test drones for its lab.

Though ISAACS is not yet a commercially ready platform, it gives the industry an opportunity to share knowledge for potential future developments that include these safety and natural-interface features.

"Their purpose as part of the open research is to motivate open research for academia and the industry to collaborate," said Dr. Yang. "Our industry partners, by working with us, can learn from our experience and learn from our research."

DJI provided the ISAACS team with test drones, such as the DJI Matrice 100, shown here in a test flight.

Coming Soon: OpenARK SDK

While ISAACS is still far removed from production deployment, the Center for Augmented Cognition has another AR-based project that will be released soon, pending university approval for the open-source license.

OpenARK is an open-source software development kit that will provide tools for calibration, gesture recognition, and character input. When ISAACS is mature enough, Dr. Yang expects it to be released as part of OpenARK.


According to the Center for Augmented Cognition website, OpenARK "offers innovative core functionalities to power a wide range of off-the-shelf AR components, including see-through glasses, depth cameras, and IMUs."
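
To give a feel for what such a toolkit enables, here is a short, purely hypothetical Python sketch; the class and function names are invented for illustration and are not the actual OpenARK API. It shows the shape of a gesture-recognition pipeline: depth frames in, a segmented hand, a classified gesture out.

```python
# Hypothetical sketch only -- these names are invented for illustration and
# are NOT the OpenARK API. It shows the shape of a pipeline a toolkit like
# OpenARK powers: depth frames in, recognized gestures out.
import random

class DepthCamera:
    """Stand-in for an off-the-shelf depth camera driver."""
    def read_frame(self):
        # Dummy 4x4 depth image (meters); a real frame is e.g. 640x480.
        return [[random.uniform(0.3, 1.5) for _ in range(4)] for _ in range(4)]

def segment_hand(depth_frame, max_range_m=0.8):
    """Toy segmentation: treat near pixels as the hand region."""
    near = [d for row in depth_frame for d in row if d < max_range_m]
    return {"pixel_count": len(near)}

def classify_gesture(hand):
    """Toy classifier: more 'hand' pixels -> open palm, fewer -> pointing."""
    return "open_palm" if hand["pixel_count"] > 8 else "point"

camera = DepthCamera()
hand = segment_hand(camera.read_frame())
print(classify_gesture(hand))  # e.g. "point", which an app could map to a command
```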

With these projects, Dr. Yang believes the university is poised to take a leadership role in establishing standards that meet the needs of the industry and the development community, much as it has in the past with UNIX for operating systems and Caffe for deep learning.

"OPENARK is trying to set the foundation for human-computer interaction using the augmented reality modality," said Dr. Yang. "In the past, Berkeley has a good tradition of publishing open source projects, and that has helped the industry a lot."

Are you excited about the prospects of controlling drones through augmented reality? Let's talk about it in the comments section below.

Cover image via UC Berkeley
