# Development of an emission model

On June 2nd I will present a paper titled Determining an Empirical Emission Model for the Auralization of Jet Aircraft at the Euronoise conference in Maastricht, The Netherlands. My presentation will be in the Auralisation of urban sound session. The conference is just a couple of weeks away, and I am still gathering and analysing results. Nevertheless, I thought it would be nice to give some insight into what I'm working on now and what I will present at Euronoise.

## Emission model for auralizations

Currently I'm developing an emission model for jet aircraft that can be used for auralizations. Existing emission models for noise prediction generally predict sound pressure levels in 1/3-octave bands. That is sufficient for noise prediction; for auralization, however, a finer spectral resolution is needed: one has to be able to model individual tones and, in certain cases, modulations as well.
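
To see why 1/3-octave levels are too coarse for synthesizing tones, consider the bandwidth of a single band. A quick sketch (the 4 kHz centre frequency is just an example):

```python
import numpy as np

# A 1/3-octave band spans a factor of 2**(1/3) in frequency, so its
# bandwidth is roughly 23% of the centre frequency. Any tonal detail
# finer than that is smeared out in a 1/3-octave level.
fc = 4000.0                # centre frequency in Hz (example value)
f_lower = fc / 2**(1/6)    # lower band edge
f_upper = fc * 2**(1/6)    # upper band edge
bandwidth = f_upper - f_lower

print(f"Band edges: {f_lower:.0f} Hz to {f_upper:.0f} Hz")
print(f"Bandwidth: {bandwidth:.0f} Hz ({bandwidth / fc:.1%} of fc)")
```

Two tones several hundred hertz apart can thus end up in the same band and become indistinguishable, which is exactly the information an auralization needs.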

At Empa we have in the past conducted measurements for the sonAIR project, resulting in a large dataset that includes audio recordings at multiple sites, cockpit data and flight track data. Colleagues of mine are now using this dataset to develop a next-generation emission model for noise prediction, and I'm using it to develop an emission model for auralizations.

## Analysis of an event

Let's now have a look at one specific event and how I analyse such an event. I've included the Python code I use for the analysis, so you get a better idea of how I'm working. To give you an idea of the scale: there is over 500 GB of audio recordings and several hundred MB of other data. The audio is stored in a single HDF5 file using h5py and all other data in a SQLite database. All data is handled using the amazing Blaze and pandas modules.
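
The storage scheme can be sketched as follows. The file names, dataset paths and table layout below are made up for illustration; only the general pattern (audio per event/receiver in one HDF5 file, everything else in SQLite via pandas) matches what I described above:

```python
import sqlite3

import h5py
import numpy as np
import pandas as pd

fs = 44100
samples = np.random.default_rng(0).standard_normal(fs)  # dummy "audio"

# Audio: one HDF5 file, one dataset per event/receiver combination.
with h5py.File('recordings.h5', 'w') as hf:
    hf.create_dataset('10_004_A320/receiver_01', data=samples)

# Everything else: tables in a SQLite database, written via pandas.
with sqlite3.connect('events.db') as con:
    pd.DataFrame({'event': ['10_004_A320'], 'pressure': [977.4]}).to_sql(
        'events', con, index=False, if_exists='replace')

# Reading back works the same way in reverse.
with h5py.File('recordings.h5', 'r') as hf:
    audio = hf['10_004_A320/receiver_01'][:]

with sqlite3.connect('events.db') as con:
    events = pd.read_sql_query('SELECT * FROM events', con)
```

HDF5 keeps the bulky, regularly-shaped audio in one place with fast slicing, while SQLite is convenient for the smaller, relational metadata.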

In [1]:
from sonair.processing import *
%matplotlib inline


We begin by loading the data belonging to a combination of event (i.e. aircraft passage) and receiver, between a certain start and stop time. Here, start and stop are seconds relative to an event reference time. This reference time is the time at which the aircraft is closest to any of the receivers, which turned out to be quite convenient to use.
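
The reference time can be found directly from the flight track: pick the track sample that minimises the distance to the nearest receiver. A minimal sketch with made-up track and receiver coordinates:

```python
import numpy as np

# Hypothetical straight, level flyover sampled once per second.
times = np.linspace(0.0, 60.0, 61)             # seconds
track = np.column_stack([
    np.linspace(-3000.0, 3000.0, 61),          # x
    np.zeros(61),                              # y
    np.full(61, 500.0),                        # z (altitude)
])
receivers = np.array([[0.0, 100.0, 0.0],       # two example receivers
                      [1000.0, -200.0, 0.0]])

# Distance from every track point to every receiver, shape (61, 2).
distances = np.linalg.norm(track[:, None, :] - receivers[None, :, :],
                           axis=-1)

# Reference time: sample where the aircraft is closest to ANY receiver.
reference_time = times[distances.min(axis=1).argmin()]
print(reference_time)  # closest approach to the first receiver, at t = 30 s
```

In practice the track is interpolated rather than sampled this coarsely, but the idea is the same.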

In [2]:
event = '10_004_A320'
receiver = 'receiver_01'  # receiver identifier (example value)
start = -5.
stop = +5.

analysis = EventAnalysis(event, receiver, start, stop)


We now have a nice object that gives easy access to all the data. For example, we can request the atmospheric pressure (in mbar) during that event:

In [3]:
analysis.event.pressure

Out[3]:
977.39999999999998

or the coordinates of the receiver (in the Swiss grid):

In [4]:
analysis.receiver.x, analysis.receiver.y, analysis.receiver.z

Out[4]:
(682692.67099999997, 257054.26800000001, 422.048)

Obviously, we can also listen to the recording:

In [5]:
from IPython.display import Audio
Audio(data=analysis.recording_as_signal, rate=analysis.recording_as_signal.fs)

Out[5]: