BOOM

THE CHALLENGE: Low Boom
Aeronautics

Using data generated by actual flight tests conducted at NASA Armstrong Flight Research Center and data collected from NASA noise laboratories, app developers should construct a visualization of low boom as it compares to normal sonic boom. Currently, noise data is either illustrated with ‘contours’ around airport runways and surrounding areas or presented numerically in decibels. Can an app be developed that allows people to ‘see’ the difference between low boom and normal sonic boom over their geographical area? Such an app would help visual learners grasp the difference more rapidly than traditional data displays.

Explanation

Our goal is to help design faster aircraft. To do this, we need to help engineers visualize sound files from redesigned aircraft so they can compare performance improvements. These sound files are recordings of the sonic booms created by the aircraft being tested.

The problem is that the process of getting data out of sound files is tedious and time-consuming. That's where our solution comes in. We have created both a mobile and a desktop app that take care of this process, so that all engineers have to do is upload a sound file and then see all the data they need in any form (i.e. graphical, audio, and even haptic).

Our iOS app is meant to be simple and represents audio files in three forms. The first is an animation of the contours of the sound file (see demo), which speeds up or slows down according to the sound intensity of the file. Then there is the actual sound of the file when it is played, which adjusts the volume of the mobile device according to the sound intensity of the file. Finally, there is haptic feedback, which adjusts the vibration of the mobile device according to the sound intensity of the file. All of this can be seen in the demo above.
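To illustrate how a single intensity signal can drive all three feedback channels, here is a minimal Python sketch (not the app's actual Objective-C code; the window size and the synthetic decaying-boom signal are assumptions made for the example) that computes a normalized per-window RMS level from PCM samples:

```python
import math

def rms_intensity(samples, window=1024):
    """Split PCM samples into fixed windows and return a normalized
    (0..1) RMS level per window."""
    levels = []
    for start in range(0, len(samples) - window + 1, window):
        chunk = samples[start:start + window]
        levels.append(math.sqrt(sum(s * s for s in chunk) / window))
    peak = max(levels) or 1.0
    return [lv / peak for lv in levels]

# Synthetic boom: a tone with a decaying envelope (stand-in for real data).
boom = [math.sin(2 * math.pi * 5 * t / 8000) * math.exp(-t / 2000)
        for t in range(8192)]
levels = rms_intensity(boom)
# Each normalized level could then scale the animation speed,
# the playback volume, and the haptic vibration strength.
```

In this scheme the same `levels` array feeds all three presentations, which keeps the animation, audio, and haptics in sync.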

Our desktop app is meant to provide more in-depth information for users who want to get more detail out of a sound file. This app is the brain behind the project and does the sound processing in real time using MATLAB. It is what powers the iOS app. (NOTE: the desktop app's GitHub page is here: https://github.com/shirsks14/BoomDesktop)

Resources Used

The following resources were used to make BOOM:

Desktop App: built with C# and MATLAB
Mobile App: built with Objective-C
IDEs: Visual Studio, Xcode
Data Analysis:

- Pulse Code Modulation: for extracting the data from the sound files

- Fast Fourier Transform: to compare and analyze the sound files

- Doppler Effect: the theory used to plot the contours of the sound files

- Lagrange Interpolation: used to map sound data to other forms of feedback (haptic and audio level)

Data Source: https://data.nasa.gov/docs/aeronautics/lowboom.html
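The first two analysis steps above, PCM extraction and FFT comparison, can be sketched in Python as follows. This is an illustrative sketch, not the project's actual C#/MATLAB code; the WAV path and the pure-tone test signal are assumptions:

```python
import cmath
import math
import struct
import wave

def read_pcm(path):
    """Extract raw 16-bit PCM samples from a mono WAV file
    (Pulse Code Modulation is just the raw sample stream)."""
    with wave.open(path, "rb") as wf:
        frames = wf.readframes(wf.getnframes())
    return list(struct.unpack("<%dh" % (len(frames) // 2), frames))

def fft(x):
    """Radix-2 Cooley-Tukey FFT; len(x) must be a power of two."""
    n = len(x)
    if n == 1:
        return list(x)
    even, odd = fft(x[0::2]), fft(x[1::2])
    t = [cmath.exp(-2j * math.pi * k / n) * odd[k] for k in range(n // 2)]
    return ([even[k] + t[k] for k in range(n // 2)] +
            [even[k] - t[k] for k in range(n // 2)])

def spectrum(samples):
    """Magnitude spectrum (positive frequencies only), suitable for
    comparing a low boom against a normal sonic boom."""
    mags = [abs(c) for c in fft(samples)]
    return mags[:len(mags) // 2]

# A pure tone with 3 cycles over 64 samples peaks at spectral bin 3.
tone = [math.sin(2 * math.pi * 3 * i / 64) for i in range(64)]
spec = spectrum(tone)
peak_bin = max(range(len(spec)), key=spec.__getitem__)
```

Two booms could then be compared bin-by-bin, e.g. a low boom would show less energy in the bins that dominate a normal sonic boom's spectrum.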
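The Lagrange interpolation step can be sketched as below. The calibration points mapping normalized sound intensity to a device vibration level are hypothetical values invented for this example, not the project's actual mapping:

```python
def lagrange(points, x):
    """Evaluate the Lagrange interpolating polynomial through `points`
    (a list of (x, y) pairs with distinct x) at position x."""
    total = 0.0
    for i, (xi, yi) in enumerate(points):
        term = yi
        for j, (xj, _) in enumerate(points):
            if i != j:
                term *= (x - xj) / (xi - xj)
        total += term
    return total

# Hypothetical calibration: normalized intensity (0..1) -> vibration level (0..100).
calibration = [(0.0, 0.0), (0.5, 30.0), (1.0, 100.0)]
haptic_level = lagrange(calibration, 0.75)
```

Passing a few measured calibration pairs gives a smooth curve, so any intermediate sound intensity maps to a haptic or volume level without a lookup table.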


Made in Waterloo, ON, Canada