technical staff at Motorola Solutions,
voiced his support for Kirkaldy’s
comments and said: “An important
area where you have something that is
sharing frequencies in a standardised
way is can we build in the hooks
for prioritisation? So can we build a
hook into the standard that gives you
government-level priority and lets
you take over half of that spectrum,
two-thirds of that spectrum… and if
you can do that in a way that builds a
level of assurance and resilience into
your grab of that spectrum in that
situation, that helps reinforce using
these techniques, which are designed
for more light industrial capabilities; it
puts another tool into the public safety
toolbox, so those are the things that we
should be bringing into the standards.”
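The "priority hook" idea described above can be sketched as a toy model: a shared band in which a government-level user may preempt ordinary grants, but only up to a capped share (two-thirds here, echoing the figure quoted). All names, the cap and the preemption policy are illustrative assumptions, not part of any standard.

```python
# Toy model of a priority hook in a shared band: a priority user can
# preempt ordinary grants up to a fixed fraction of the channels.
from dataclasses import dataclass, field

@dataclass
class SharedBand:
    channels: int                               # total channels in the band
    grants: dict = field(default_factory=dict)  # user -> channels held

    def request(self, user: str, n: int, priority: bool = False) -> int:
        free = self.channels - sum(self.grants.values())
        if priority:
            # Priority users may preempt, but only up to two-thirds of
            # the band (the kind of cap discussed above).
            cap = (2 * self.channels) // 3
            allowed = min(n, cap - self.grants.get(user, 0))
            shortfall = allowed - free
            if shortfall > 0:
                self._preempt(shortfall)
            granted = max(allowed, 0)
        else:
            granted = min(n, free)
        if granted > 0:
            self.grants[user] = self.grants.get(user, 0) + granted
        return granted

    def _preempt(self, n: int) -> None:
        # Reclaim channels from existing grants, largest holders first.
        for user, held in sorted(self.grants.items(), key=lambda kv: -kv[1]):
            take = min(held, n)
            self.grants[user] = held - take
            n -= take
            if n == 0:
                break

band = SharedBand(channels=12)
band.request("industrial_a", 8)                   # ordinary shared user
band.request("public_safety", 8, priority=True)   # preempts up to 2/3 of band
```

In this run the industrial user is scaled back from eight channels to four so the priority user can take its capped two-thirds share; a real standardised hook would also need the assurance and resilience guarantees Whitehead mentions.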
Kirkaldy highlighted the momentum
behind approaches such as licensed
shared access (LSA), citing the use
of CBRS in the US, “and we have
similar applications and deployments
in Europe, Middle East, Africa as
well… it’s working very successfully”.
However, he noted the work that is
required when opening up spectrum
for shared use, given it can involve
“bumping some fairly sensitive
organisations or refarming existing
spectrum”. Kirkaldy also noted the
UK spectrum regulator Ofcom’s recent
consultation on sharing some spectrum
in the 1,800MHz, 2,300MHz and
3.8-4.2GHz bands (see: https://bit.ly/2FlBzc5),
describing it as “a fantastic
benchmark that other markets can
look towards”.
Curnow-Ford drew attention to the
computationally intense nature of some
approaches, saying a full-blown TV
White Space approach with an active
database in the 3.5GHz band would
need a supercomputer on the scale of
IBM Watson to do the calculations
quickly enough. He suggested a more
pragmatic approach would be to use
5G New Radio in unlicensed, shared
spectrum access, together with a set of
tools: databases with an element of
static allocation, pre-planning to
inform the creation of those static
databases (which can also be adjusted
over time in response to data on
interference levels), and licences that
can be adapted to that regime. “That
would give you a much more pragmatic,
lower-cost, easier-to-implement
solution.”
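The static-database-plus-adjustment approach Curnow-Ford describes can be sketched in a few lines: allocations are pre-planned, then periodically revised when measured interference on a channel exceeds a threshold. The site names, channel numbers, threshold and 3dB back-off below are all illustrative assumptions, not from the article.

```python
# Hedged sketch: a pre-planned (static) spectrum database that is
# revised over time using interference measurements, rather than being
# recomputed continuously in the TV White Space style.
PRE_PLANNED = {
    "site_a": {"channel": 40, "max_eirp_dbm": 30},
    "site_b": {"channel": 41, "max_eirp_dbm": 30},
}

def adjust(db: dict, interference_reports: dict,
           threshold_dbm: float = -90.0) -> dict:
    """Return a revised copy of the static database: sites whose channel
    shows interference above the threshold get their power reduced."""
    revised = {site: dict(entry) for site, entry in db.items()}
    for entry in revised.values():
        level = interference_reports.get(entry["channel"])
        if level is not None and level > threshold_dbm:
            # Back off 3 dB; a real scheme might also reassign the channel.
            entry["max_eirp_dbm"] -= 3
    return revised

# Toy measured interference per channel, in dBm.
reports = {40: -85.0, 41: -100.0}
new_db = adjust(PRE_PLANNED, reports)
```

Here site_a, whose channel shows interference above the threshold, is backed off to 27dBm while site_b is untouched; the pre-planned table itself is never mutated, matching the idea of a static database that is periodically reissued.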
Staying on the topic of shared
spectrum, Chater-Lea asked whether
it is possible for public safety
organisations to get priority access to
it, adding “is there going to be enough
of it?” given that spectrum regulatory
administrations work to maximise the
value of the spectrum that they control,
“both in monetary terms and in terms
of utilisation”. While he said using
LSA to provide “spectrum for new
industries to allow them to develop and
bring value to a country’s economy is a
good thing to do”, this still needs to be
weighed against the value that can be
realised by permanently allocating that
spectrum to a commercial operator. In
his view, “LSA is too new to know how
that equation pans out”.
Keeping AI honest
Eric Davalo, head of strategic
development at Airbus Secure Land
Communications, discussed the role
that artificial intelligence (AI) can
play in public safety. He identified
four main domains where it can play
a role: automating administrative and
routine tasks (such as transcribing legal
hearings), video analytics, predictive
policing and policing cyberspace.
However, he said there is a “but” in
that AI has to be trained for specific
tasks, “so the way you define the
operational conditions and what
you want to get out of the system is
extremely important, you cannot just
take an image recognition solution,
plug it into any CCTV camera and
expect to get results – it doesn’t work,
you need the right dataset and you
need to be able to specify exactly what
you want to get out of it”.
While he said it is too early to
properly evaluate the efficacy of
predictive policing, Davalo emphasised
the dangers that can arise when
training AI systems on biased datasets
– giving an example of an image
recognition system that was trained
on pictures of people cooking in
kitchens and ended up miscategorising
men in images as women, due to
the latter’s greater prevalence in the
dataset. He also noted that there is
no logical or mathematical proof that
AI systems work and it is extremely
hard to identify any errors they create
and determine how they are caused,
so public safety organisations have to
carefully consider at which level of the
decision-making process they are used.
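The kitchen example above can be reduced to a toy demonstration (not Davalo's system) of how class imbalance skews predictions: if most "kitchen" images in the training data show women, a naive classifier that learns label co-occurrence will predict "woman" for anyone in a kitchen. The 70/30 split and labels are hypothetical.

```python
# Toy illustration of dataset bias: a majority-label classifier trained
# on skewed scene/label pairs miscategorises the minority class.
from collections import Counter

# Hypothetical training labels for images tagged with the scene "kitchen":
# 70 show women, 30 show men.
train = [("kitchen", "woman")] * 70 + [("kitchen", "man")] * 30

def majority_label(scene: str, data: list) -> str:
    """Predict whichever label co-occurs most often with the scene."""
    counts = Counter(label for s, label in data if s == scene)
    return counts.most_common(1)[0][0]

# A man photographed in a kitchen is miscategorised as a woman,
# purely because of the dataset's skew.
prediction = majority_label("kitchen", train)
```

Real image classifiers are far more complex, but the failure mode is the same: the model faithfully reproduces whatever imbalance its training set contains, which is why Davalo stresses transparency about how datasets are built.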
Davalo also emphasised the importance
of transparency around how datasets
are built and used, given public safety
organisations’ need to maintain
legitimacy and public trust.
As ICT giants such as Google
are developing the algorithms that
underpin AI and these are readily
available, Davalo sees the underlying
challenge as being “how do you build
the dataset, define the task and make
[Photo caption: L-R: Tony Gray, chief executive of TCCA; and Adrian Scrase, ETSI’s CTO and 3GPP’s head of MCC]
April Supplement 2019 @CritCommsToday 11