In today’s fast-paced world, engineers, scientists, war-fighters, and first responders all need immediate access to ever-increasing amounts of data and information. Expanded sensor capabilities are providing more data than ever; the challenge is how a user can access the right data at the right time and in the right context. For example, engineers and scientists may collect significant amounts of data from a live-fire rocket test in real time but be unable to act on that data due to sensory overload. The same applies to war-fighters and first responders: when an event occurs, how can they take in, process, and act on real-time data effectively?
Geocent recognized the potential for Augmented Reality (AR) to provide new and innovative ways to display data, particularly in a real-time environment. To explore this further, Geocent executed an Internal Research and Development (IRAD) project investigating how Augmented Reality could be integrated with real-time sensor data to enable innovative information displays. In addition to Augmented Reality capabilities, we utilized IoT technology for sensor implementation and cloud technology for data integration.
The idea came from the SLS rockets being tested at NASA Stennis Space Center. During these tests, the command terminals monitor numerous pressure, temperature, fluid-flow, and other sensor feeds, which is a great deal of information to track in real time. By integrating the sensor feeds into an AR display such as the HoloLens, one can visually watch the rocket during a test and see whether there is an unusual buildup of pressure at a particular fuel valve, whether by changing the colors of the equipment, displaying the actual readout, or even providing real-time graphs of the sensor data while watching the rocket under test. Thresholds can be set to trigger visual changes in color or notifications. This implementation required two main phases of development:
- Getting multiple IoT sensors to feed into a SensorHub, which provides the access point that the AR tool, the HoloLens, consumes
- Developing an “app” for the HoloLens so that it knows where to overlay the data within the user’s field of view. This is done by feeding the camera’s video into image-recognition software built into the AR libraries; a picture of the tank, valve, sensor, or other item of interest can be used to identify where to overlay the information.
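The threshold-driven color changes described above can be sketched in a few lines. This is a minimal, illustrative example only: the threshold values, sensor names, and readings are hypothetical, not actual test-stand parameters.

```python
# Minimal sketch: map real-time sensor readings to AR overlay colors.
# Threshold values and sensor names are hypothetical.

WARN_PSI = 850.0    # assumed warning threshold
ALERT_PSI = 1000.0  # assumed alert threshold

def overlay_color(reading_psi: float) -> str:
    """Return an overlay color name for a pressure reading."""
    if reading_psi >= ALERT_PSI:
        return "red"
    if reading_psi >= WARN_PSI:
        return "yellow"
    return "green"

# Example readings keyed by a hypothetical sensor identifier.
readings = {"fuel_valve_3": 1020.5, "lox_tank": 640.0}
colors = {name: overlay_color(value) for name, value in readings.items()}
```

In an actual HoloLens app, the resulting color would drive the material tint or notification state of the overlaid object rather than a string value.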
By using open standards such as the Open Geospatial Consortium (OGC) Sensor Observation Service (SOS) protocol to access the sensor data, along with a catalog of the sensors’ metadata, this architecture has no limit to its uses, from manufacturing to Next Generation First Responders.
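To make the SOS access pattern concrete, the sketch below builds a standard SOS 2.0 GetObservation request using the key-value-pair (KVP) binding. The endpoint URL, offering, and observed-property identifiers are hypothetical placeholders; a real deployment would use the identifiers advertised by the SensorHub’s GetCapabilities response.

```python
from urllib.parse import urlencode

def sos_get_observation_url(endpoint: str, offering: str,
                            observed_property: str) -> str:
    """Build an OGC SOS 2.0 GetObservation request URL (KVP binding)."""
    params = {
        "service": "SOS",
        "version": "2.0.0",
        "request": "GetObservation",
        "offering": offering,
        "observedProperty": observed_property,
    }
    return endpoint + "?" + urlencode(params)

# Hypothetical SensorHub endpoint and identifiers:
url = sos_get_observation_url(
    "http://sensorhub.example.com/sos",
    "urn:example:offering:fuel_valve_3",
    "urn:example:property:pressure",
)
```

The AR client can issue this request on a polling interval (or use a streaming binding where the server supports one) and feed the returned observations into the overlay logic.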
The result of this IRAD is an architecture that integrates sensor data directly into an augmented-reality visualization, providing a next-generation data-visualization capability. By augmenting the real world with data, the context of the information is instantly understandable: an object turning red can be immediately recognized as overheating or exceeding pressure limits, depending on the object being viewed.