People's Choice Nominee

ObserVR received a People's Choice Nomination.


Develop a web tool, mobile device app or add-on for existing apps or websites that leverages NASA imagery and climate data to illustrate the impacts of our changing Earth in areas of interest to you. Some ideas to explore include:

  • Use NASA Earth observations data, social media, smartphones, and Short Message Service (SMS) text messaging to collect Earth observations and connect the public in local, regional, and national networks to communicate about our changing planet.
  • Examine current natural events curated by NASA’s EONET (Earth Observatory Natural Event Tracker) by browsing global historical and near real-time imagery from space.
  • Upload images or other data points demonstrating visible observations and how they compare to satellite data. For example, generating early-warning alerts or validating precipitation rates reported from NASA’s Global Precipitation Measurement Mission.
  • Integrate NASA imagery with a mobile assistant to allow for dynamic image generation based on a basic request structure.

Traditional data visualization attempts to show three-dimensional (spatial) or even four-dimensional (spatial + time) data on the two-dimensional plane of your screen. ObserVR uses virtual reality to break that barrier and show 3D spatial data, such as satellite orbits, in a more intuitive way. It is a platform built to consolidate both historical and live data and display them using meaningful visualization techniques over the Earth, the Solar System, or even larger scales! ObserVR aims to be a fully immersive experience, ditching the keyboard in favor of hand gestures and gaze for controls.

The goal is to make ObserVR an open source data visualization platform. The base, or template, may be the Earth, the planets of the Solar System, or even galaxies, laid out on an easy-to-use coordinate system. Data sources and visualizations are modular and can be added or removed according to the user's needs. If someone would like to see a new set of data, or see data visualized in a certain way, they can build it and share it. This way, people around the world can all contribute to visualizing data better.

The demo video shows some of the types and sources of data that can be brought in, and the ways they can be visualized.

  • Satellite positions and orbits can be found online in the form of Two-Line Elements (TLEs), which include parameters that can be used to calculate a satellite's position a given number of minutes after an epoch. Satellites are represented as spheres of various sizes in space (in the future, maybe even 3D models!).
  • Meteorite landing locations are available from NASA's own collection of REST APIs, which provide their latitude and longitude coordinates as well as their mass. Each landing location is marked on the globe with a red column whose height is scaled by the meteorite's mass.
  • Many cloud coverage maps exist on the internet, but few are both good quality and free to use. The cloud map used in this demo was free, but only updated once per day. Cloud coverage maps are applied to the globe by setting them as the texture of a slightly larger, transparent globe.
  • NOAA provides 30-minute aurora forecasts in the form of a 1024x512 array of integers between 0 and 100, each representing the chance of seeing the aurora at that latitude/longitude. This array is parsed and converted into a bitmap image that is applied to the globe in the same fashion as the clouds.
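To give a feel for what a TLE contains, here is a minimal Python sketch (not ObserVR's actual Unity code) that parses the fixed-column fields of a TLE's second line and derives the orbit's semi-major axis from the mean motion via Kepler's third law. The sample TLE values are illustrative, in the standard format, not a live fix for any satellite.

```python
import math

# Sample TLE line 2 in the standard fixed-column format (illustrative values)
TLE_LINE2 = "2 25544  51.6416 247.4627 0006703 130.5360 325.0288 15.72125391563537"

MU_EARTH = 398600.4418  # Earth's gravitational parameter, km^3/s^2

def parse_tle_line2(line):
    """Extract the orbital elements from line 2 of a TLE (fixed columns)."""
    return {
        "inclination_deg": float(line[8:16]),
        "raan_deg": float(line[17:25]),
        "eccentricity": float("0." + line[26:33]),  # decimal point is implied
        "arg_perigee_deg": float(line[34:42]),
        "mean_anomaly_deg": float(line[43:51]),
        "mean_motion_rev_per_day": float(line[52:63]),
    }

def semi_major_axis_km(mean_motion_rev_per_day):
    """Kepler's third law: a = (mu / n^2)^(1/3), with n in rad/s."""
    n = mean_motion_rev_per_day * 2 * math.pi / 86400.0
    return (MU_EARTH / n ** 2) ** (1.0 / 3.0)

elements = parse_tle_line2(TLE_LINE2)
a = semi_major_axis_km(elements["mean_motion_rev_per_day"])
altitude = a - 6371.0  # rough altitude above the mean Earth radius, km
```

Full position propagation (as done by SGP4) also accounts for drag and perturbations, but this shows how the raw TLE text maps onto usable orbital parameters.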
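The meteorite columns boil down to a latitude/longitude-to-sphere conversion plus a height scale. The sketch below is a hypothetical Python version of that idea (ObserVR itself is built in Unity); the axis convention and the logarithmic scale factor are assumptions chosen for illustration.

```python
import math

def latlon_to_unit_vector(lat_deg, lon_deg):
    """Point on a unit sphere. Convention (an assumption, not ObserVR's
    actual code): x axis through (0 deg, 0 deg), z axis through the north pole."""
    lat = math.radians(lat_deg)
    lon = math.radians(lon_deg)
    return (math.cos(lat) * math.cos(lon),
            math.cos(lat) * math.sin(lon),
            math.sin(lat))

def column_height(mass_g, scale=0.02):
    """Log-scale the height so a 60-tonne meteorite doesn't dwarf a
    10-gram one; the scale factor is arbitrary."""
    return scale * math.log10(max(mass_g, 1.0))

def column_endpoints(lat_deg, lon_deg, mass_g, globe_radius=1.0):
    """Base and tip of a radial column marking a landing site."""
    ux, uy, uz = latlon_to_unit_vector(lat_deg, lon_deg)
    r_tip = globe_radius + column_height(mass_g)
    base = (globe_radius * ux, globe_radius * uy, globe_radius * uz)
    tip = (r_tip * ux, r_tip * uy, r_tip * uz)
    return base, tip

# Example: the Hoba meteorite (~60 tonnes) near 19.6 S, 17.9 E
base, tip = column_endpoints(-19.6, 17.9, 60_000_000)
```

The log scale is the design choice worth noting: meteorite masses span many orders of magnitude, so a linear height mapping would leave all but the largest columns invisible.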
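The aurora step is a straightforward array-to-bitmap conversion. Here is a minimal sketch, assuming the 1024x512 grid of 0-100 percentages described above; the binary PGM output format is our choice here, standing in for whatever texture format the engine actually consumes.

```python
WIDTH, HEIGHT = 1024, 512  # grid size from the NOAA forecast description

def aurora_grid_to_pgm(values, width=WIDTH, height=HEIGHT):
    """Map 0-100 aurora probabilities onto 0-255 gray levels and emit a
    binary PGM (P5) image that engine-side code could load as a texture."""
    if len(values) != width * height:
        raise ValueError("expected %d values, got %d" % (width * height, len(values)))
    header = ("P5 %d %d 255\n" % (width, height)).encode("ascii")
    # 0 -> fully dark, 100 -> full brightness
    pixels = bytes(round(v * 255 / 100) for v in values)
    return header + pixels

# Smoke test: a uniform 50% grid
demo = aurora_grid_to_pgm([50] * (WIDTH * HEIGHT))
```

Applied as the texture of the transparent overlay sphere, brighter pixels then correspond to a higher chance of seeing the aurora at that latitude/longitude.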

Although it's not apparent in the video, seeing this in VR is quite different compared to seeing it on a screen. Because your perspective changes with your head movements, the third dimension is much more pronounced: the altitude of each satellite and the height of each column representing meteorite mass become very apparent.

Resources Used

The Oculus Rift DK2 and the Unity game engine are used to create the VR experience. The Leap Motion is used for controls. The satellite orbit data is pulled from CelesTrak's collection of orbital TLEs. The aurora coverage data is parsed from NOAA's 30-minute Aurora Forecast. The meteorite data is pulled from NASA's Meteorite Landings API. The cloud cover data is acquired from xplanetclouds.com.

Made in Waterloo, ON, Canada