NASA Selects Augmented Reality Projects for SBIR Awards

NASA astronaut Scott Kelly performing checkouts for NASA’s Project Sidekick, which makes use of Microsoft’s HoloLens device. (Credit: NASA)

NASA has selected three augmented reality projects for funding under its Small Business Innovation Research (SBIR) Phase I program. The projects are:

  • The Station Manipulator Arm Augmented Reality Trainer — Systems Technology, Inc.
  • Context-Sensitive Augmented Reality for Mission Operations — TRACLabs, Inc.
  • MonitAR — Adventium Enterprises, LLC

Descriptions of the three selected projects follow.

The Station Manipulator Arm Augmented Reality Trainer
Subtopic: Augmented Reality

Small Business Concern
Systems Technology, Inc.
Hawthorne, CA

Principal Investigator/Project Manager
Mr. David H. Klyde

Estimated Technology Readiness Level (TRL) at beginning and end of contract:
Begin: 2
End: 3

Technical Abstract

One of the most demanding and high-stakes crew tasks aboard the International Space Station (ISS) is the capture of a visiting spacecraft by manual operation of the Space Station Remote Manipulator System (SSRMS, or Canadarm2). The cost of a missed capture or improper arm/vehicle contact is likely to be very high. Since these operations may be performed up to six months after the most recent ground-based training, crews aboard the ISS prepare for such manual robotic tasks with the Robotics On-Board Trainer, a laptop-based graphical/dynamic simulator using NASA's Dynamic Onboard Ubiquitous Graphics (DOUG) software from Johnson Space Center's Virtual Reality Laboratory. This system, however, does not include any real-world, 3-D, out-the-window views. Building upon recent advances in head-mounted augmented reality systems, the team of Systems Technology, Inc. and Dr. Stephen Robinson of UC Davis proposes the Station Manipulator Arm Augmented Reality Trainer (SMAART), which will offer ISS crews significantly more realistic on-board refresher training for vehicle capture: the astronaut manipulates the actual SSRMS with real out-the-Cupola-window views, while a graphically simulated vehicle is overlaid on that non-simulated view via a head-mounted display. Providing multi-sensory realism in on-board training for such high-cognitive-demand skills is expected to increase crew readiness and therefore reduce the operational risk of visiting-vehicle capture.
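
The abstract does not specify an implementation, but the core idea, keeping a graphically simulated vehicle registered to the astronaut's real, head-tracked view while the actual SSRMS is flown, can be illustrated with a minimal sketch. The type names and the simple transform chain below are hypothetical assumptions, not STI's design:

```typescript
// Minimal sketch (hypothetical names) of keeping a simulated visiting vehicle
// registered to the astronaut's head-tracked, see-through view while the real
// SSRMS is flown.

type Mat4 = number[]; // 4x4 transform, column-major

// Multiply two 4x4 column-major matrices.
function multiply(a: Mat4, b: Mat4): Mat4 {
  const out: Mat4 = new Array(16).fill(0);
  for (let r = 0; r < 4; r++)
    for (let c = 0; c < 4; c++)
      for (let k = 0; k < 4; k++)
        out[c * 4 + r] += a[k * 4 + r] * b[c * 4 + k];
  return out;
}

interface FramePose {
  headFromStation: Mat4;  // head-tracker estimate of the station frame in view space
  vehicleInStation: Mat4; // simulated capture target, propagated by the trainer's dynamics
}

// Each frame, place the virtual vehicle in the astronaut's current view so it
// stays registered to the real out-the-Cupola scene as the head moves.
function vehicleInView(pose: FramePose): Mat4 {
  return multiply(pose.headFromStation, pose.vehicleInStation);
}
```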

Potential NASA Commercial Applications

The proposed Station Manipulator Arm Augmented Reality Trainer (SMAART) technology directly supports the Augmented Reality topic of the NASA Space Technology Mission Directorate (STMD) by allowing astronauts to train for and rehearse the complex resupply-capsule capture task with the mission-specific hardware. Currently, all training conducted on board the ISS uses a simulation program that neither employs the actual SSRMS inceptors nor features real-world views. The proposed program also complements the current STMD Game Changing Development program IDEAS (Integrated Display and Environmental Awareness System). That program, led by Kennedy Space Center with support from Ames Research Center, is developing a transparent head-mounted-display augmented reality system to support situational awareness for operations on Earth and in space. The emerging SMAART technology can be used to support and/or enhance IDEAS-based applications.

Potential Non-NASA Commercial Applications

The SMAART technology is a leap forward from conventional augmented reality in that it allows virtual objects to interact directly, not passively, with real-world objects in the combined mixed-world environment. Furthermore, using the new geometry-based keying technology, any vehicle or environment can potentially become an in situ simulator: an aircraft cockpit, a shipyard crane cab, a UAS ground station, and many others. A naval aviator, for example, could repeatedly practice carrier landings in the actual aircraft at a safe altitude, building proficiency in a wide variety of conditions before ever attempting a real landing on the carrier. The technology will also find applications in hazardous-duty training for first responders and the military, along with broad commercial applications in training for industrial maintenance tasks and operator training with complex machines and vehicles.

Technology Taxonomy Mapping

  • Display
  • Image Processing
  • Man-Machine Interaction
  • Mission Training
  • Prototyping
  • Robotics (see also Control & Monitoring; Sensors)
  • Simulation & Modeling
  • Tools/EVA Tools

Context-Sensitive Augmented Reality for Mission Operations
Subtopic: Augmented Reality

Small Business Concern
TRACLabs, Inc.
San Antonio, TX

Principal Investigator/Project Manager
Dr. David Kortenkamp

Estimated Technology Readiness Level (TRL) at beginning and end of contract:
Begin: 4
End: 5

Technical Abstract

Current NASA missions to the International Space Station are heavily dependent upon ground controllers to assist crew members in performing routine operations and maintenance as well as responses to off-nominal situations. Standard operating procedures are at the heart of spacecraft operations, with almost 5,000 procedures for the ISS alone. Performing these procedures often requires close collaboration between ground controllers, who have deep knowledge of the spacecraft's systems, and crew members, who have on-board situation awareness. This close collaboration will become more difficult in extended missions, and crew members will need more autonomy. Augmented reality technology can help replace some of the guidance that ground controllers offer to crew members during procedure execution. Augmented reality can also provide continuous and just-in-time training opportunities during extended missions, as well as entertainment and social connection opportunities. Context-sensitive augmented reality provides different support depending upon the on-board situation and ties directly to procedures, system data, daily plans, background information, and robotic assistants. TRACLabs has developed a procedure integrated development environment called PRIDE that is currently being used by NASA for ISS and Orion procedures. TRACLabs proposes to integrate augmented reality technologies into PRIDE in collaboration with the Georgia Tech Augmented Environments Lab. In particular, Georgia Tech has developed an augmented-reality-capable web browser and JavaScript framework that will complement the PRIDE web-based procedure execution system. These two industry-leading technologies will form the platform on which a suite of context-sensitive augmented reality applications can be quickly developed and deployed for a variety of NASA applications.
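
As a rough illustration of what "context-sensitive" means here, AR cues that change with the current procedure step and on-board telemetry, the following sketch shows one way such a guide might be wired up. The interfaces are illustrative assumptions only, not the PRIDE or Georgia Tech APIs:

```typescript
// Hypothetical sketch of context-sensitive AR cues driven by a web-based
// procedure executor. These interfaces are illustrative, not the PRIDE or
// Georgia Tech APIs.

interface ProcedureStep {
  id: string;
  instruction: string;  // e.g. "Open the MPEV"
  targetAnchor: string; // label of the physical item this step refers to
}

interface TelemetrySnapshot {
  [channel: string]: number | string;
}

// Abstraction over whatever AR browser or head-mounted display is in use.
interface ArOverlay {
  highlight(anchor: string, text: string): void;
  clear(): void;
}

class ContextSensitiveGuide {
  constructor(private overlay: ArOverlay) {}

  // Called when the executor advances to a new step or telemetry changes:
  // the cue shown depends on the on-board situation, not just the step text.
  update(step: ProcedureStep, telemetry: TelemetrySnapshot): void {
    this.overlay.clear();
    const caution = telemetry["caution_warning"] === "active";
    const text = caution
      ? `HOLD: clear the caution before "${step.instruction}"`
      : step.instruction;
    this.overlay.highlight(step.targetAnchor, text);
  }
}
```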

Potential NASA Commercial Applications

This research can have immediate application to ISS operations because there are several iPads already on the ISS and a Microsoft HoloLens on its way. We will use ISS procedures as test cases for this project. We will work closely with a variety of NASA personnel who are working on next-generation procedure displays. We also see applications to robotic missions, including control of NASA's R2 and R5 robots and robots on other planetary surfaces. PRIDE is being evaluated for use by ground operators for the Resource Prospector robotic mission to the Moon, which is being jointly developed by NASA JSC and ARC; ground operations personnel are currently evaluating PRIDE, and this technology would be able to assist them in their operations. We also have a close relationship with the Autonomous Mission Operations TOCA Autonomous Operations Project (AMO-TOCA) being tested on board the ISS, which could serve as a potential Phase III testbed on the ISS. Finally, we have connections to the Human Research Program (HRP) at NASA and will work with those personnel to identify applications, including analog test environments, for this work. We will meet with all of these individuals in Phase I to determine their requirements and use cases for augmented reality.

Potential Non-NASA Commercial Applications

TRACLabs is already selling PRIDE as a commercial product, with oil field services company Baker Hughes as a launch customer. Baker Hughes is field-testing PRIDE at several sites worldwide before deployment in actual operations in mid-2016. PRIDE is providing automation assistance to drilling operations. Augmented reality would immediately increase the effectiveness of the PRIDE software in drilling operations by providing assistance in performing complex and dangerous procedures. TRACLabs expects additional customers in the oil and gas industry will deploy PRIDE once it has been proven effective by Baker Hughes. TRACLabs also sees application of this technology in automotive manufacturing. TRACLabs performed a small pilot project on flexible robotic assembly for automotive supplier Magna (the second largest in the world, with 285 manufacturing facilities and over 125,000 employees). This was successful, and after a tour of several Magna manufacturing facilities in North America, TRACLabs personnel are negotiating a follow-on contract for research and development. Augmented reality would be used to assist personnel on the manufacturing floor in performing their tasks and validating their work. We expect other manufacturing companies to be interested as well. Sierra Nevada Corporation has also purchased PRIDE licenses for use in its Dream Chaser program, which was recently selected to deliver cargo to the ISS.

Technology Taxonomy Mapping

  • Intelligence
  • Man-Machine Interaction
  • Mission Training
  • Perception/Vision

MonitAR
Subtopic: Augmented Reality

Small Business Concern
Adventium Enterprises, LLC
Minneapolis, MN

Principal Investigator/Project Manager
Ms. Hayley Borck

Estimated Technology Readiness Level (TRL) at beginning and end of contract:
Begin: 1
End: 3

Technical Abstract

We propose to develop MonitAR, an Augmented Reality (AR) system that provides procedure completion guidance to astronauts. MonitAR will replace guidance from mission control during periods of long time delay or when communication with Earth is not possible. Astronauts using AR glasses will receive feedback from MonitAR via visual cues as they progress through procedures on the spacecraft. The visual cues will be provided when MonitAR determines the astronaut is executing a task (a specific step in the procedure) that deviates from the current procedure. MonitAR will then guide the astronaut back to completing the task in a way that fits with the procedure. During execution, the current and upcoming tasks are proactively displayed to the astronaut in a readable form.

The key innovation is to apply Case-Based Reasoning (CBR) to enable MonitAR to predict the task the astronaut is beginning to execute rather than recognize it when completed. This look-ahead capability enables guidance to be provided early enough to avoid procedure/task failure. Moreover, CBR takes advantage of the astronaut's extensive training to capture how procedures/tasks are completed and, thereby, avoid a cumbersome and brittle modeling effort. Astronaut procedures will be represented as tasks in a plan using Action Notation Modeling Language (ANML), a planning language already being used to represent astronaut procedures. By representing the procedures as plans, the different ways a procedure can be correctly executed will be captured directly from the existing procedures.
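
A minimal sketch of the retrieval step in such a CBR predictor follows. The case structure, feature encoding, and nearest-neighbor retrieval are illustrative assumptions, not Adventium's actual design:

```typescript
// Illustrative sketch of case-based task prediction: retrieve the stored case
// closest to a partial observation of the astronaut's actions, then compare
// the predicted task against the plan. The feature encoding is a placeholder.

interface Case {
  taskId: string;     // task the stored observation eventually completed
  features: number[]; // e.g. gaze target, tool in hand, panel state
}

// Euclidean distance between two feature vectors of equal length.
function distance(a: number[], b: number[]): number {
  return Math.sqrt(a.reduce((sum, v, i) => sum + (v - b[i]) ** 2, 0));
}

// Nearest-neighbor retrieval over a non-empty case base: predict which task
// is being started from the observation so far.
function predictTask(caseBase: Case[], observation: number[]): string {
  let best = caseBase[0];
  for (const c of caseBase) {
    if (distance(c.features, observation) < distance(best.features, observation)) {
      best = c;
    }
  }
  return best.taskId;
}

// Deviation check: warn early if the predicted task is not one the plan
// (e.g. an ANML-derived task network) expects next.
function deviates(predicted: string, expectedNext: string[]): boolean {
  return !expectedNext.includes(predicted);
}
```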

Potential NASA Commercial Applications

The initial target application for this work is to support astronaut training operations. MonitAR will enable astronauts to operate and maintain the many systems that might unexpectedly break, or to provide emergency first aid, without support from Earth. By adding an AR-enabled look-ahead capability for monitoring procedure performance, MonitAR will provide timely guidance to keep astronauts on task.

Potential Non-NASA Commercial Applications

Other potential applications for MonitAR include situations where users are under stress, must follow procedures to accomplish their tasks, and must train to maintain proficiency in those procedures. Examples include maintenance operations on complex systems such as airplanes, ships, and industrial control systems. MonitAR could also be used for providing first aid in the field when a medical doctor is not present. Potential customers include the DoD's Telemedicine and Advanced Technology Research Center (TATRC), FEMA for relief and response operations, and the Centers for Disease Control and Prevention (CDC) for epidemic control procedures. In the private sector, companies that provide industrial control systems would find MonitAR useful in handling abnormal situations in refineries, electrical and nuclear facilities, and manufacturing settings.

Technology Taxonomy Mapping

  • Autonomous Control (see also Control & Monitoring)
  • Data Input/Output Devices (Displays, Storage)
  • Intelligence
  • Man-Machine Interaction
  • Mission Training
  • Perception/Vision
  • Tools/EVA Tools