Data acquisition systems
HOW TO TRANSFORM THE
DATA AVALANCHE INTO INSIGHT
In a world of increasingly complex products and faster
release cycles, the ability to accumulate and efficiently
analyze test data has never been more important
// STEPHAN PLOEGMAN
Due to the conflicting trends of
increasingly complex structures and
systems and drastically shortened
development times, test labs are under
immense pressure to produce test results
faster and at lower cost, despite acquiring
more data from more sensors. Test
engineers are continuously looking for
ways to reduce test time and risk. To work
faster and more efficiently, these engineers
must be able to monitor and respond to
test data in real time, regardless of the
data volume.
Depending on the type of test, the
duration, and measurement frequency, an
overwhelming avalanche of data is
generated. The challenge is not only
to acquire the data, but to store and
preserve large volumes of it and to keep
it accessible for fast, continuous online
analysis. Large volumes of both structured
and unstructured data require increased
processing power, storage, and a reliable
data infrastructure. When all of these
elements are combined into a scalable data
backend, organizations can greatly improve
time to market, reduce costs, and build
better products.
ADAPTIVE AND SCALABLE
DATA BACKEND
An adaptive and scalable data backend
provides a scalable storage and compute
platform for acquiring data streams from
instruments, storing configurations, and
performing analyses.
To cope with constantly changing
requirements, setup configuration,
parameter extensions, and varying sample
rates, a separation between hot and cold
data is the best choice. Raw data that is
less frequently accessed and only needed
for auditing or test post-processing (‘cold
data’) is stored in a distributed streaming
platform that scales extremely efficiently.
When hundreds of thousands of samples
per second must be stored, processed, and
used to calculate new variables across
hundreds of channels simultaneously, this
distributed streaming architecture shows
its strength.
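As a simple illustration of this hot/cold split, the Python sketch below routes every raw sample to an append-only ‘cold’ log while forwarding only a downsampled stream to a ‘hot’ store. The class and channel names are invented for illustration; the article does not name a specific streaming platform or database.

```python
from collections import defaultdict

class ColdLog:
    """Append-only per-channel store for raw samples ('cold' data).
    Stands in for a partitioned, distributed streaming platform."""
    def __init__(self):
        self.partitions = defaultdict(list)

    def append(self, channel, timestamp, value):
        self.partitions[channel].append((timestamp, value))

class HotStore:
    """Downsampled 'hot' data for immediate analysis.
    Stands in for a NoSQL time-series database."""
    def __init__(self, decimation=10):
        self.decimation = decimation          # keep 1 in N samples
        self.series = defaultdict(list)
        self._count = defaultdict(int)

    def ingest(self, channel, timestamp, value):
        self._count[channel] += 1
        if self._count[channel] % self.decimation == 0:
            self.series[channel].append((timestamp, value))

def acquire(samples, cold, hot):
    """Every raw sample goes to the cold log; a reduced stream goes hot."""
    for channel, timestamp, value in samples:
        cold.append(channel, timestamp, value)
        hot.ingest(channel, timestamp, value)

cold, hot = ColdLog(), HotStore(decimation=10)
acquire([("strain_01", t, t * 0.5) for t in range(1000)], cold, hot)
print(len(cold.partitions["strain_01"]))  # 1000 raw samples retained
print(len(hot.series["strain_01"]))       # 100 downsampled points
```

The key design point is that the cold log never discards anything, so the hot store can always be rebuilt from it with a different decimation or aggregation.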
So-called ‘hot data’, measurement data
that must be accessed immediately for
analysis, is provided in a NoSQL time
series database. This database stores data
securely in redundant, fault-tolerant
clusters. All measurement data is
automatically backed up. Flexible data
aggregation ensures that measurement
data is continuously processed from the
streaming platform into the database,
with predefined datasets for easy
computation of test metrics and KPIs such
as mean value, standard deviation, and
minimum/maximum. However, the same
data can be replayed and aggregated
differently when detailed analysis around
a particular test event is required. This
approach minimizes the investment and
operational cost for IT and storage
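The predefined aggregates described above, and the ability to replay the same raw data with a different window around a test event, can be sketched in a few lines of Python. This is a minimal stdlib illustration; the window sizes and the sample channel are invented, not taken from a specific product.

```python
import statistics

def aggregate(samples, window):
    """Roll raw samples into fixed-size windows of KPIs:
    mean, standard deviation, minimum, and maximum."""
    out = []
    for i in range(0, len(samples) - window + 1, window):
        chunk = samples[i:i + window]
        out.append({
            "mean": statistics.fmean(chunk),
            "stdev": statistics.stdev(chunk),
            "min": min(chunk),
            "max": max(chunk),
        })
    return out

raw = [float(v % 50) for v in range(1000)]   # stand-in sensor channel

# Predefined coarse dataset for dashboards and KPIs.
coarse = aggregate(raw, window=100)
print(len(coarse))   # 10 windows

# Replay the same raw data with a finer window around a test event.
fine = aggregate(raw[400:500], window=10)
print(len(fine))     # 10 windows
```

Because the raw samples are preserved in the cold store, re-aggregating around an event is just a second pass with different parameters, not a new acquisition.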
/AEROSPACETESTINGINTERNATIONAL.COM