INTERNET OF THINGS DATA
REVOLUTIONISING
DATA HANDLING
In-memory computing is proving a smarter way to
access and process data. By Neil Tyler
The whole concept of in-memory
computing has been gaining
attention at a time when data
has become the fundamental tool and
building block on which businesses
are built.
In-memory computing involves storing
information in the main random
access memory (RAM) of dedicated
servers, rather than in often messy
and complicated relational databases
that operate on slower disk drives.
According to Nikita Ivanov, the
CTO of GridGain, a company that
specialises in offering enterprise-grade
in-memory computing solutions,
“Memory is less about storage and
more about processing. Historically,
we stored data on tape drives in the
1950s; it was the first mass storage
solution. In the 1960s we developed
hard drives, which were significantly
faster, and then in the 1980s
progressed to flash.
“We’ve known for a long time that
the ultimate place to store data was
in memory, but that has always proved
to be prohibitively expensive as a
solution and it could only handle a
small amount of data.
“In-memory is a natural evolution
which has been helped by the drop
in memory prices which has played a
major part in the growing popularity of
in-memory as a solution. In-memory
computing is now economically viable
across a number of different sectors
from retail banking to utilities. The
economics has played a crucial part in
its development and acceptance.”
Some of the key benefits of
in-memory computing include the
ability to cache large amounts of
data constantly, which ensures fast
responses for searches; it can store
session data, which allows live
sessions to be customised and can
improve the performance of websites;
and it can process events to support
complex event processing.
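The caching and session-data uses mentioned above can be illustrated with a minimal sketch of an in-memory key-value store with per-entry expiry. This is a hypothetical example for illustration, not the design of any product named in the article; the class and method names are invented here.

```python
import time

class InMemoryCache:
    """Minimal in-memory key-value cache with per-entry time-to-live."""

    def __init__(self, default_ttl=300):
        self._store = {}            # key -> (value, expiry timestamp)
        self._default_ttl = default_ttl

    def put(self, key, value, ttl=None):
        expiry = time.time() + (ttl if ttl is not None else self._default_ttl)
        self._store[key] = (value, expiry)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        value, expiry = entry
        if time.time() > expiry:
            del self._store[key]    # lazily evict expired entries
            return None
        return value

# Session data held in RAM: each lookup is a dictionary access,
# avoiding a round trip to a disk-backed database.
sessions = InMemoryCache(default_ttl=1800)
sessions.put("user-42", {"basket": ["widget"], "theme": "dark"})
```

Because every read is served from RAM, lookups complete in nanoseconds rather than the milliseconds a disk seek would cost; the trade-off is that the data is volatile and bounded by available memory.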
“If you go back 20-30 years the
kinds of applications that were then
in use involved thousands of users,
at most. Today, with the likes of
Facebook, applications are being
used by millions of users, so the
requirement to scale has changed
dramatically.
“In order to support this there is no
other way than removing all the slow-moving
parts in a system. As a result
we are storing data in RAM in a move
that is being described as the ‘final
frontier’,” according to Ivanov.
“There is nowhere else that we can
go. Over the next few years, unless we
fundamentally change the architecture
of computers, we will have to continue
to develop this particular paradigm.”
Many companies are looking to use
this technology.
SAP for example, has developed an
in-memory computing technology that
uses sophisticated data compression
to store data in random access
memory and which is 10,000 times
faster than standard disks.
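The gap between RAM and disk access that underlies such claims can be roughly illustrated with a timing comparison. This is a hedged sketch, not SAP's implementation: it times repeated reads of the same one-kilobyte record from a dictionary in RAM versus a file on disk. Note that the file reads are likely served from the operating system's page cache, so the true gap against a spinning disk is far larger than what this measures.

```python
import os
import tempfile
import time

N = 2000
payload = b"x" * 1024  # a 1 KiB record

# Disk-backed read: each lookup re-opens and reads a file.
fd, path = tempfile.mkstemp()
with os.fdopen(fd, "wb") as f:
    f.write(payload)

start = time.perf_counter()
for _ in range(N):
    with open(path, "rb") as f:
        data_disk = f.read()
disk_time = time.perf_counter() - start

# In-memory read: the same record held in a dict in RAM.
store = {"record": payload}
start = time.perf_counter()
for _ in range(N):
    data_ram = store["record"]
ram_time = time.perf_counter() - start

os.remove(path)
print(f"disk: {disk_time:.4f}s  ram: {ram_time:.6f}s")
```

Even with the page cache absorbing the physical disk access, the per-read system-call overhead leaves the in-memory path orders of magnitude faster.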
“We are currently experiencing
what is described as a digital
transformation,” says Ivanov. “But
that can be a loaded topic. How is
it different from computerisation?
Well, the latter uses computers and
software to optimise existing business
processes, while the former is about
radically transforming processes
in order to cope with new ways of
handling and processing data.
“Look at companies like General
Motors. In a matter of a few years
most of their revenue will be derived
from selling data that’s been collected
from sensors on their vehicles, rather
than selling the vehicles themselves,
and the same is true of many other
industries.
“In-memory will be a key enabler
and support companies looking
to power their business through
processing and performing analytics
in real time. Companies want to be
able to use complex modelling to
www.newelectronics.co.uk 10 September 2019 29