OK, ready? Right, apparently we're ready, because we're broadcasting to the world. So, good afternoon everybody, and welcome to this huddle session. You can come a bit forward if you'd like to huddle, because apparently we're supposed to huddle in this session. Welcome to the huddle, for a discussion about what future tech looks like.
And good afternoon, I'm Robin Davis. I head the Future Technologies Group at the
Critical Communications Association.
I also have a day job, where I am one of the directors at Actica.
So, before we go into a few slides: I know most people are always interested in the technology trends and what we have found, particularly in the TCCA, in terms of what was trending in the last year, and we're going to use that for a bit of discussion with our panellists. Let's introduce our panellists from left to right, and I'm actually going to let them pronounce their own surnames so they get them correct, because I will get them incorrect with my West Country accent.
So we'll start at this end with Wayne. Unfortunately, due to technology, and we are a technology group, we only have one microphone, so it's great. We will start at this end, and each person will tell you who they are and where they are from.
Over to Wayne. Thank you, Robin. Wayne Parks is not particularly difficult, really. I'm one of those "used to be"s: I used to be in policing for just over 30 years, all around police technology, the last seven or eight years of that at the centre, around Home Office programmes, and I chaired the National Technology Board. I'm now consulting widely, still within policing, trying to help forces and suppliers, and I guess trying to predict what the future of technology is, so hopefully I can bring some insight from that today.
Thank you very much. I'm Jason Somerville. I work for South Central Ambulance Service, as it is up there. I've been in the ambulance service for around about 30 years or so, as of this month, I think. I started on the road as a technician and as a clinician, and I've moved over into the digital space, so I've been in the digital space now for about 15 or 16 years.
Thank you. Gillian Fyfe, digital, tech and cyber lead with the NFCC, fairly new to fire and rescue as a sector. Before that I was with the Scottish Government in the Digital Health and Care Directorate, and with Alzheimer Scotland, and I'm seeing that a lot of the digital challenges in this world are not unique to the sector.
Thanks. Steve Russell, director of data strategy and technology at Warwickshire Police, so all things technology sit under my portfolio. My background is more data policy, and I somehow ended up being a director of communications and marketing for three to four years, which is interesting now being a technologist. So I'm not a deep technologist, but in some ways I hope that works in my favour sometimes. Excellent. Thank you.
Yeah, good afternoon. I'm Heidi Clewitt, head of the technology team for HM Coastguard, and I've been with HM Coastguard for about 11 years, this month actually, initially in operations, then more in the sort of domain awareness, vessel traffic management and policy space, and more recently, in the last couple of years, within the technology team. So thanks.
Brilliant. OK, so I'll just go through a few slides to set
the scene. So we're here to have a discussion about what the future of tech looks like and, particularly, as I said, some of those interesting trends and what our panellists think. We've got police, fire, ambulance, Wayne, who used to do lots of things, and of course Heidi from the Coastguard at the end, so we're going to have a complete view of where we are going with technology, right up until we all finish the panel at about 3 o'clock, if we can keep going for that long. So what does future tech look like?
Let's just start with the Critical Communications Association, which has been here now for 30 years. We're chairing this slot, and we're going to chair another slot at 3:15. Our chairman, Mr Mladen, who's just to my right here, has asked for all of you to come back at 3:15. So hopefully we will keep this going, it will be fun, and you will all come back for the next lot, where I will actually then be in a chair. So I've got the whole afternoon huddling in the huddle.
But what is important is that the Critical Communications Association's vision is about advancing global critical communications for a safer, more connected world. We are very much about standards, developing standards, and driving forward trusted and standardised technology, and, as I said, we've been doing that. Some of us have been involved in the association, and some of you sat out there have been involved in the association, for quite a few years. We have 11 working groups, of which the Future Technologies Group was set up about five years ago by our CEO at the time, Mr Tony Grey, who is in fact sat there in the front row, with the aim of: let's try and do some horizon scanning, let's try and inform the members, the industry, the users, what sort of technologies are out there, what sort of emerging things are coming up that might be able to be adopted and might be interesting. And if I press this too much, I will see how we get on. Technology problems.
No, we're fine. And what we're trying to do as a group, linked to the Critical Broadband Group, is publish papers, publish material that people can pick up and go, do you know what, that's quite a good idea, let's try and see what we can do with that. And this is just an example of some of the papers and some of the output that, as the TCCA, we have produced, and some of the things we are looking at. If you want to know more about that, you can download the slides, and you can always go on our website and get that information. The Future Technologies Group, though, is very
much looking at producing material that is sent to all the members. So we have people who contribute, and they will scan
academic media and what's going on in universities. We will look at what's going on in Silicon Valley, and we will scan various material across scientific journals and engineering journals, and we will pick out the things that we feel are, or may be, interesting to the industry. What that ultimately means is that each year we have a set of trends. We very much look at what is linked to critical communications, so where critical communications can be used as a bearer. We're not just interested in the apps; we might be interested in the power technology, the battery technology, the screens, wearables, and all those things that are linked and that ultimately can help the end user. And clearly, as you know, in the private domain, with lots of things like IoT and people wearing Apple Watches and VR headsets, there's lots of technology out there. Some of it is fads, some of it doesn't last long, and some of it may well have practical applications, particularly in the public safety domain. So when we're trying to look at these emerging elements, we try to look at how they relate to the end users, ultimately the people who will be delivered some sort of mission critical broadband capability. So let's have a look at some of the trends.
Are you pressing this or am I pressing this? The technology's out the window. And the most important thing: we're going to stick with this pie chart, and it's going to stay there forever. So, the trends last year. We do horizon scanning about every quarter, so we did four last year, and we've done one so far this year. I added all those articles up, all those snippets, and I came up with, as somebody said, a 3D pie chart, which looked at what ultimately were the trends. It won't be a surprise to anybody, and I do have a laser pointer on here, without taking my eye out, but it won't show on the screen, that AI and augmented reality was at the top of the list. Then we got beyond 5G and extending connectivity. You might say, well, that's because you're looking for those things, and therefore everybody's focusing on faster broadband and 5G and even 6G now and what will come out of that. We then got into an interesting piece around wearables, UI/UX and devices, being 16%, and a smattering of a number of other smaller ones; quantum computing started to appear. Robotics went down; the previous year robotics and drones was even higher, the same with IoT. But the one that interests me, as I said to some of the panellists just a minute ago, is always power and energy, because we have mobile devices and battery technology. You want to get more out of these things; you always want more power. So watching what's going on with power and energy is always an interesting topic.
So what we'll do now is open this up to some views from the panel. As I've said to them all, there is no right or wrong answer. We have not rehearsed this. Everybody did get sent the questions, so they sort of know what the questions are, but nobody has told me what they are going to say, so I will apologise in advance if it's gobbledygook; equally, it might be really informative. So we will go into having a debate. Normally we would have a screen in front of us so they can see the pie chart, so they're probably all going to sue me afterwards for a crooked neck, or equally we can just pass the iPad around.
So looking at those technology trends, and Wayne can remember them because he's got a photographic memory, the question is: out of those trending ones, which are of particular interest to you or your organisation, I don't mind which, and why? And we'll start there. So Heidi has the microphone, so she can't pass it on yet. We'll ask Heidi, from a coastguard point of view, what sort of things jump out at you? Is it, do you know what, that's really interesting to me, or, I've been doing some stuff around this in my day job and that quite interests us?
Yeah, well, from a coastguard perspective, probably quite predictably, for the next year our focus, and quite a lot of investment, is going to be in the AI space; as I say, obviously the largest section there. So really looking at: how can we get help to the casualty quicker? How can we help refine our searching? How can we refine the information we're looking at? We have loads of data, so how can we use that data, specifically proactively? How can we get more proactive with where we're sending our assets, where we have our resources, and also things like our search plans and search patterns? How can we utilise AI to create better search plans faster? And also things like our satellite images: how can we help refine satellite images and get the information we need out of them without the kind of slow, quite manual human intervention? Probably another one in there that is slightly linked is the drones as well. We're doing trials with drones at the moment, and have been for the last couple of years, and again with that, how can we use AI to help us identify what we need to identify within the drone images and the footage, so that, without having the human intervention of sitting there physically looking through it all, we can pick that out much quicker and easier?
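As a rough illustration of the kind of pre-filtering described here, and not HM Coastguard's actual pipeline, a sketch in Python might look like the following. It uses a crude high-visibility-colour heuristic as a stand-in for a trained detection model, purely to show how automated triage could cut the manual review of drone footage; the file name and thresholds are made up.

```python
import cv2
import numpy as np

def hi_vis_score(frame) -> float:
    """Fraction of pixels in the orange/red range typical of lifejackets and drysuits."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (5, 120, 120), (20, 255, 255))
    return float(np.count_nonzero(mask)) / mask.size

def flag_frames(video_path: str, threshold: float = 0.002, stride: int = 30):
    """Yield (frame_index, score) for sampled frames worth a human look."""
    cap = cv2.VideoCapture(video_path)
    index = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if index % stride == 0 and (score := hi_vis_score(frame)) >= threshold:
            yield index, score
        index += 1
    cap.release()

# e.g. for idx, score in flag_frames("sortie_042.mp4"): print(idx, score)
```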
OK, brilliant, that's a good starting point. Let's keep going. Stephen, what's your thought? What particularly jumps out there from yourself, from a police perspective? Yeah, and I'll try and keep it interesting and not just repeat AI and AR at 21%; it could have been 99.9% based on my articles and reading and this conference. So I won't repeat that, and we might come back to it. I guess one that stood out for me is security and privacy.
I'm not unique, but I'm a CTO that gets quite nervous about how much technology we rely on. You know, the example of CrowdStrike last year, and how I'm getting an email going, are we going to lose it? And I was like, what's happening? It was on the BBC, and I didn't know at that point whether I was exposed, whether I was going to lose all my critical services. And I guess, in the kind of rush to digitise and connect everything, I reflect on whether we are compounding risks. I joke that the biggest risk in my force is an air conditioning unit and a server that I don't know about. Actually, we digitise lots of processes; we've just moved everyone onto a digital pocket notebook. The risk is I could lose everybody's evidential chain, whereas before an officer might stick it in the bin or leave it on a train, and whilst the officer will get a slap on the wrist and there'll be a consequence, I could now lose 1,200 police officers' evidential chains. That's the thing that kind of keeps me up at night, that you can't come back from, in this onslaught of technology as an enabler for good. But I do think we are compounding risk, and therefore the large scale events, the black swan events, become ever more likely. So I have this excitement around the hype, and I'm bought into it like anybody, AI included, but I sit there thinking that the business continuity and disaster recovery bit, and how we focus on that, is the thing that I concern myself with. Because I do worry that the compounding of risk and the connectivity means an individual in a server room can bring down a police force, or a bad actor, heaven forbid. So I think security and privacy is about how we make society overall better, rather than it being really, really good and then we crash when we have a big compounded black swan, and I think the CrowdStrike issue last year was a kind of good reminder of the downside of when we connect all this technology. That's one I think particularly stood out for me.
That's an interesting point, isn't it?
And there's this term that's been around for a couple of years but is, I think, getting more focus again: secure by design, right? Trying to build security into those things from the start, to consider what the risks are and be able to mitigate them, whilst not being so oversensitive that you say, do you know what, I don't trust this thing, I won't use it at all, because it's technology. And you are right; I always say to people that actually one of the greatest threats is from within, like you say, even inadvertently. Does somebody just pull something out, or try to upgrade some technology or some servers and actually upgrade both by mistake and pull the whole thing down? That might have happened in some worlds that I work in, you know, which was totally an error, by mistake, not on purpose, but it certainly had a deep effect. So the more you rely on this, the more you need to build those things in.
So Gillian, from a fire point of view, you can either say AI as well,
or you can pick something else if you want to be fun.
I'll try and be fun.
I think actually some of the things that we're looking at cover two or three of these. So, looking at things like dynamic risk modelling, yes, looking at where we bring data in from. Instead of the kind of flat model that we have, and the way that services are resourced at the moment, there are things that are known: is there going to be a Taylor Swift concert at the weekend? Is there going to be a thunderstorm at the same time? Is there also a football match on? And do we know that hospitals are experiencing increased demand around the same time? How do we put all that together so that we can tailor our response? So I think it looks at that interconnectedness, while also being wary of the risks that brings, because every new connection in that chain is a potential point of failure. But we need to try and be more intelligent in the way that we're using the resources that we have, and to look again at IoT sensors and how we can use that smart city technology, that connectivity with 5G and 6G, and how that might tailor our response as well. So being able to have something more than just a smoke alarm going off, which then triggers a response: what else can we glean before we get to the fire ground to understand what we'll need to do?
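As a rough sketch of the kind of data fusion described here, and purely illustrative, with made-up feeds and weights rather than any NFCC model, combining known demand signals into a simple area score might look like this:

```python
from dataclasses import dataclass

@dataclass
class Signal:
    name: str           # e.g. "stadium concert", "thunderstorm warning"
    likelihood: float   # 0..1: how likely the event is to materialise
    impact: float       # 0..1: relative pressure it would put on the service

def area_risk(signals: list[Signal]) -> float:
    """Naive additive score; a real model would be calibrated on incident data."""
    return min(1.0, sum(s.likelihood * s.impact for s in signals))

weekend = [
    Signal("stadium concert", likelihood=1.0, impact=0.3),
    Signal("thunderstorm warning", likelihood=0.6, impact=0.4),
    Signal("football match", likelihood=1.0, impact=0.2),
    Signal("hospital handover delays", likelihood=0.8, impact=0.3),
]
print(f"Illustrative area risk score for the weekend: {area_risk(weekend):.2f}")
```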
Yeah. And again, it's not on here as a particular topic, it's sort of built into a number of those, but situational awareness is another couple of buzzwords, and people have been talking a lot about how can I improve situational awareness. That's not necessarily about wearing wearables and having things in buildings, although there have been some case studies and scenarios of heads-up displays in breathing apparatus, being able to tell people where the exits are. And once upon a time there was a great thing that Motorola did, a great video where they chased the criminals and they knew where the criminals were going to run to, Wayne, and they could go to the back door and wait for them to come out. But somebody else said to me, I used to have a chalkboard and do that, just catch people robbing places and catch them as they jumped out of the window. But situational awareness, and using that data and making good use of it, as you say, whether it's IoT devices, smoke sensors, CCTV or other things, and then putting analytics and even AI on top of that to help inform you with decisions. And it's interesting, I was looking at fire and fire control rooms, you know, predetermined attendance, and actually some bits of fire have been doing this stuff for years; everybody else is, you know, in the past. But anyway, now I'm being prejudiced towards fire.
Let's move on to the ambulance, then. At the risk of being a little bit predictable, because I think I might have said this last year, but I'm not quite sure. They won't remember, they won't remember, no. I'm just going to ask for a little bit of participation if I can: hands up anyone who's got a smartwatch on. One, two, yeah, me as well. Keep them up if you keep them on for pretty much 24 hours of the day, like sleep, stress, all of those things. Yeah, brilliant.
So if I look at it from: what's the ambulance service there for? It's for the patient, right, so we're there for the patient. There's lots of information about us on these devices, right: blood pressures, some do respiration rates, some can give you indications of abnormal heart rates, pulse rates, etc., can't they? Just imagine a scenario whereby one of my colleagues, John's just sat down, goes to a patient who is unconscious, so we can't ask any questions of this patient. They're unconscious, they're not going to respond to anything; there's a protocol and procedures that we go through. But if we could interrogate that device, maybe that device might give us some insight: maybe this patient has had a cardiac event, or something has happened in the last 24 hours that essentially has led to this event. So I think wearables, if we're not thinking about sort of VR headsets and those sorts of things, but thinking about the things we have on us every day, I think they're really powerful for the first part of the patient journey. I mean, I was going to say AI, because we're looking at AI as well, but essentially, if I start right at the start of the patient journey, I think if we could somehow harness the power of these things, and that might be some AI application and an electronic patient record that can identify which device is on the patient's wrist and therefore load the appropriate interface required, I don't know, but I think that, or a standardised approach to that sort of data capture, would be really, really helpful. Yeah.
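As a purely hypothetical sketch of the standardised wearable summary being asked for here, and not any real device's API, a crew-facing check over the last 24 hours of readings might look like this:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Reading:
    taken_at: datetime
    heart_rate_bpm: int
    rhythm_flag: str   # "normal" or "irregular", as many consumer devices report

def recent_concerns(readings: list[Reading], now: datetime) -> list[str]:
    """Return human-readable flags from the last 24 hours of wearable data."""
    window = [r for r in readings if now - r.taken_at <= timedelta(hours=24)]
    concerns = []
    if any(r.rhythm_flag == "irregular" for r in window):
        concerns.append("irregular rhythm notification in the last 24 hours")
    if any(r.heart_rate_bpm > 150 or r.heart_rate_bpm < 40 for r in window):
        concerns.append("heart rate outside 40-150 bpm in the last 24 hours")
    return concerns

# e.g. recent_concerns(readings_pulled_from_device, datetime.now())
```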
I don't think you said that last year, actually. OK, I got away with that, then. Well, I certainly didn't do the patient bit last year; I can't remember.
And it's interesting, because again I've read some articles where people talk about AI and prediction and hoovering all this stuff up, and then there's that interesting point, coming back to privacy and security. Some people don't want all that data broadcast, because of civil liberties and other things, and we start getting into politics. But there have been a few articles saying, well, if I had all this data and I could mine it, I could deploy an ambulance crew to you before some major event happens with your body, because I know that in one or two hours' time I'm going to knock on the door and say, do you know what, you need an ambulance. What? Well, those are absolutely the sorts of things people have talked about, right?
You're absolutely right, Robin, and if you think now, you've probably seen adverts on the TV for these monitors that people are going to have to look at blood glucose. You've got people doing it, having these things on because they want to do some sort of diet, so they can watch their blood sugars and all this type of stuff. Well, actually, again, that's medical information that could be of real benefit to us. Yeah, I just had a pint of Coca-Cola, so that won't be any good, a spike then. Wayne.
There's a downside to going last; they've covered all the good stuff, haven't they? I'm just going to talk about AI again, but from a different perspective. What I can talk about, particularly in the policing world, is the criminal use of AI. You know, criminals are working from home nowadays, that's how it happens, and the majority of crime is now fraud, it is online. Steve talked about the cyber threat. There are fantastically powerful AI tools out there that can find the vulnerabilities, that can help them work all of that out, and I guess we've got that bizarre kind of machine-on-machine situation where there are great AI tools to discover that and stop them getting in. So it's almost the fight of the machines in many, many ways. That's one side, in terms of the kind of attack vector from a security point of view, but also fraud, for all of us in our personal lives: AI tools to seek out those vulnerabilities, to find out your behaviours, what's going on when, how they can interrupt and intervene with that, and turn that into criminal activity, is obviously a growing area. So there are kind of two sides they're struggling with, which I see everywhere: people looking at automation and AI to drive out better service and productivity, particularly around cost savings, but also the growing use by the criminal fraternity, which is becoming a real challenge. That crime is going up and up, and I think everyone's aware that the vast majority of fraud, where you've lost money through banking or whatever, doesn't really end up on anybody's list anywhere, and it's huge and growing all the time.
So that's just perhaps a different perspective on AI. And that's interesting, right, and we'll touch on this, because we'll put a bit of spin on my question number two, which was: when I look at how fast technology is going, some of these things, and we talk about the private world, let's cover the criminals. When we look at how quickly technology is moving, criminals can just adopt stuff, right? Some say the criminals are better equipped than public safety organisations; they have no procurement process. So why do you think that is? Why do you think that these end user organisations, maybe some of yours, maybe Stephen says, no, I'm the director, I can get anything I want, he's laughed at that one. But why, in public safety, do we think that some of these technologies, some of this stuff that you see, is really brilliant, yet it takes forever, or even never, to get into the hands of the users? What are your thoughts on that? Do you want to start, and then we'll go that way? Yeah, sure.
Money is one side of it. Criminals tend to have very, very big budgets to be able to do this kind of stuff, because that's the nature of an organised crime group: they're all about money, so they can move things forward very quickly. They don't have procurement, they don't have standards, they don't have policies, they don't have all the other things that go with our organisations and services, that you have to go through. And they have the skills and capability because, again, they can buy it in. They can go, with big money, to the really, really clever people that any of our services could never afford to bring in. So it's so much easier, and all those things become blockers for us. They do worry about security to a certain extent, but nowhere near as much as we have to. Yeah.
Jason, what are your thoughts in terms of trying to get things out there into the hands of users, from a speed point of view? At the risk of parroting the money thing, it's definitely money and procurement, but I think it's quite often the way we approach the implementation of things. From an ambulance service perspective, we're really, really risk averse, so we tend to like to see technologies already out there, already working, with use cases, or it's evidence-based; we're sort of an evidence-based organisation, really. So I think it's the way we go around it, the way we approach bringing something new in. There should be proofs of concept, really, or something along those lines, which are not necessarily tied to bringing them in. But if we can have those sorts of opportunities to test and trial things in the real world, funded potentially by the vendor, I mean, I'm just putting it out there, vendors, it's quite possible then that we can measure those benefits and provide those benefits, and then implement something based upon the risk being somewhere else, right? The financial risk is somewhere else; we've only done it small scale, we've tested it, we're not committed, right? If we could do something like that, and our vendors understand that as well, then I think we'd see quicker implementation.
OK, Gillian, what are your thoughts on trying to get things a bit quicker into the hands of users? I think some of it is just trying to focus your priorities. There are so many new things out there that you can't do it all, so you have to kind of cut your cloth, and you have to work out what are the things that are actually going to bring you good outcomes and concentrate on those things. I remember, a few years ago, being in a user testing group with people with a dementia diagnosis, and we were looking at different products that might help them around the home, and one of them described a product as a solution looking for a problem. That always stayed with me, and I thought, there are a lot of things out there in the world like that. But to be able to do that, we have to be able to spend some time on innovation, and a lot of fire services are stretched in that capacity. They're doing what they need to do to keep the lights on, to make sure that the systems are running. You can't afford to get it wrong, because it's public safety that you're talking about, so there is that kind of risk averse nature. And we see maybe somebody doing a proof of concept in another sector, where you would think, well, actually, OK, if it works for the financial sector, surely that's got to be good enough for us. But then we think, well, we would like to see a proof of concept in the fire sector, and then we say, well, actually, that's a different service, we'd like to see a proof of concept for our service before we decide to move on. So I think we have to move away from that a little bit, and work out what are the standards that we need to see in some of those proofs of concept and say, OK, well, if it meets those standards, then that's good enough for us across the board, so that we can actually get it to the next stage.
I've got another question about pilots and things, and we'll come on to that, but coming back to that second question around the speed of technology: why do end users seem to be behind, where it seems to take ages? I've been out with a few people where they go, well, I've got my Airwave radio, and don't get me wrong, Airwave and TETRA have been great for those users, but they then go, do you know what, I want my broadband, I want the broadband applications. I'll use my phone, I use Google Maps, I'll use what3words on that, I've got apps, I'll just use my phone. So, views and thoughts from a policing point of view about getting technology into the end users' hands. Steven.
Yeah, we can't get away from the reality of what we are as an organisation. Amazon is feted; there were 1,000 Amazons and probably 999 failed. With 43 forces, you can't accept failing. So there's always going to be that, and it's relative, but I come back to the risk. The force has quite a unique history in terms of its evolution. It was in a big partnership, as I mentioned previously, and we had a year to rebuild the force from scratch, the hardest bit being rebuilding IT from scratch, and we don't make beans, so we still have to serve the public in life and death situations. So the whole culture and the risk appetite of the organisation orientated around that task, and I've been really keen to harness that as we went into what I call peacetime. Because, as an organisation, we have tried to add in bits of governance every time, and we'll have a technical design authority or we'll have a triage, and I've been pretty brutal in trying to keep that spirit and risk appetite that we had during that period. It's hard, because we add things in every single time, and on their own they all make sense. I'm relatively fortunate in that the Chief Constable and the deputy were around at that time and they have a relatively high risk appetite, and we always come back to what's the cost of doing nothing; no decision is a decision.
An example would be, and I mentioned it, we were the first force to roll out Apple products as a mobile device; my business case was about half a page. Now, the only reason I got away with that was because there was huge trust. We were a small force; we didn't bombard it with governance. The chief was like, that's what you think's right. And then you've got to deliver on that execution, so if you get it right, it becomes a virtuous circle. The next time you come, so we're bringing in AI, there's a kind of trust built in that you've delivered. It's hard to create that virtuous cycle, and it's easily broken. So I do think, and I describe my deputy chief constable as someone who'll take you there in terms of risk, because he knows you'll only go so far, but if he goes there, you'll go there, and the organisation is like quicksand. So sometimes we're a little bit hesitant. Not saying we follow Donald Trump in any way, but in the sense that the more outlandish you are, the more the starting point moves, and I think you have to be quite brave as an organisation, and that's not easy. I think we're lucky we have the experience of the separation and how that galvanised the force, because organisations are like quicksand. And I think that leadership, to be brave, it's not easy, it's not easy in emergency services, but it's absolutely the most critical thing to get things done.
And you can turn from hero to zero overnight, right? Very much so. If you've got a five or ten year life as a chief constable, why would you take that approach? Yeah, I get it: you deploy something, you've got to live with it. So it's, yeah, it's interesting. Heidi.
So, the coastguard view on all of this, or do you just buy what you want and throw it out there in the sea? Well, I guess taking it back to a more user-focused view, because my team is mainly focused on the user, I think really for us we'd be looking at: what is it going to do for the user, what information is the user going to be presented with, what data is coming through the system for the user? Is that data verifiable and reliable? With a lot of these things, it might not actually be either of those, and obviously for us that's quite a big and quite an important part of it, so making sure that we can match whatever we have out there with something that is verifiable and reliable for our users, and also that they're being presented with that information in the right way, and that we know the processes we want them to follow when they're using that information. So from my perspective, that's kind of how we are within the Coastguard, other than the procurement and the other bits we've already discussed, so yeah.
Interesting, interesting. OK, so let's move on to the next one, which is sort of linked, and some of you have touched on it a bit. Looking at the conference and the exhibition, if anybody has had much time, some people have probably been sat in here all afternoon, there are lots of innovative solutions, some new, I'm going to call them new toys, new gadgets, new widgets. Some of them are quite powerful, some of them are, yeah, interesting; I'm not going to say who or what, you can work that out. But I'm always told that a lot of those things that are interesting don't ever go beyond a pilot or a trial, so somebody will pilot something. And I always used to say, Wayne, and I'm going to look at Wayne and Stephen, maybe we'll start there, you know, police were always very good at trying to pilot things and even play with things in a small form, but then it would never really move on. I know there have been some national initiatives where people have taken hold of things and driven them through, so maybe we'll start with Steve and then jump to Wayne, because I know he's been very much involved in pushing some of those national initiatives. Why do we think, again, without saying the same things about commercial people and money, why do you think a lot of those pilots and those trials just stay a pilot and a trial and there's no real thing beyond that? I'll just get your views, and then I'll add to that in a minute.
And, you know, we don't have a kind of shareholder value, so proving benefits is always really a challenge, and in that sense it's quite easy to quantify the cost of software, but there are a thousand things that impact on police productivity, and therefore isolating that in the evaluation is hard. So I think we're just quite upfront that it's really messy to evidence an outcome in policing, so we try and have a balanced portfolio. There are a few things where we think we've ended up being at the front of the game, but probably not deliberately. Equally, it's quite pleasing that in policing there are 43 forces all doing the same thing, so I will happily borrow with pride if I know Bedfordshire is doing something; I don't see it as an ego thing. And I think policing gets a bit of a hard time from the centre, because it says do it once, well, and I think we would if the centre was a bit quicker. I'll use an example: Copilot, a tool that most people who work in the private sector have had a go at. We can't get it assured in policing yet. Some forces are doing it; I'm speaking to those forces. The centre is kind of arguing over money and who does it, and you kind of go, I'm sat here with people walking in, and Microsoft are pushing it through every door to try and get into my service; I'm just going to have to go. So I do think we get a bit of a hard time as local forces, that we go off on our own and do things. I think the inspectorate has a view; if I take Right Care, Right Person, actually, if the inspectorate and the college endorse something, I do think policing is quite good at taking it on, and we had auto redaction, which went across forces, notwithstanding the reality of the commercials of who picks which provider. But I do think that's probably a little bit harsh; it's probably better on the ground, in my experience, and I've only been around in policing for six years, in that if something does work, policing will take it on. Some of the challenges are that the architecture and design were slightly different, but I think if something works, people will take it on. I think the benefits realisation is just a challenge. At some point you can make police officers so productive, but I still need a thousand officers; I've still got the cost of cashing them. I've still got the officers on the books, and I can't change that, it's a political number. So sometimes it makes sense, but politically I can't actually bring it in, because people don't value the technology. They value seeing officers out on the beat.
And benefits realisation and monitoring, that's been brought up a few times: what are the benefits, what are the real outcomes of these things? A couple of us, me and Tony, were involved in a mobile data pilot and project with a force in the very, very early days, and some of the benefits that were collected basically said that if they had this mobile data platform, I think their view was, they'd have replaced all the police officers. But in reality they would never replace the police officers, and then you sort of go, oh well, they've saved time, but no, actually it means they can focus on priority things. You've got to look at those benefits, and you need to almost put them in the right buckets and apply the right logic to them. And sometimes we read things, and we did then, Tony, you'd read them and say, that will never happen, this is absolutely rubbish, and that doesn't help when you try to take that business case upstairs to people and say, I want to try this, I want to invest in this, and ultimately I'm going to get rid of all the police officers in the force. Well, no, you're not, you're never going to do that. So it does need to be balanced, it does need to be right.
Wayne, we were leaving the best to last here. Yeah, thanks. It's quite unbelievable the number of proofs of concept that go on, and I think that's not just policing, that's right across the sector. I've done reviews and all kinds of work within different forces, and lots of them are a solution looking for a problem. They always start from the technology end: let's have a look at this technology and trial it in one area, and then there isn't a clear problem that it's trying to solve at the end of it. That's valid to a certain degree, testing some of these elements, but I talk to chief officers in policing quite a lot, saying just stick to the wicked problems, stick to the things that would really make a difference and let your technologists go out and find the answer to those problems. But everybody seems to want to get on the AI bandwagon. We need some AI, you know, and too many people say we need more AI. It's a bit like saying we need more car. Well, what does that mean? Stick to the problems, and I think that's the biggest issue I see with proofs of concept: they just stop, because there isn't a good business case you can create around them, because they're not trying to solve one of those wicked problems.
Yeah, and like I say, again, you see some of that stuff around here, or you see some of those videos. There was a famous one that the Americans did, and I'm going to look at Tony, we don't have a microphone for him. A few years ago, when we were looking at developing 3GPP standards, the Americans came up with a glossy broadband video, didn't they, with Microsoft, with big tables in the control room and 3D imaging and IoT and heads-up displays in fire trucks which would point you to where the fire was, and you'd love it. You'd love it. But then people have shown some of this technology and it's never actually got into the hands of the people that really need it, and on some hands, yeah, some of it dies. As I say, there was a company that had a great headset which showed where criminals would go, and maps of buildings, and would triangulate that they were going to escape through a certain door so you could stand out the back and arrest them. There was a case of a guy in America who had built some AI tool to look at triangulating where people get cars dumped next, and a policeman friend of ours, who unfortunately passed away a few years ago, said, you know, I used to do that in the 60s on a chalkboard; I'd just draw around where people were getting robbed and then stand on the street corner and grab them as they walked by. But there are elements of technology where you go, do you know what, that would be really good, and there are others. I just think that whole piloting piece is about sorting out almost the wheat from the chaff, the good from the bad, the ones that do deliver the benefits that really, really matter to those end users. As well as, yes, it does cost money and you do want to make some efficiencies and savings, they don't need to be hard cash; that could be releasing people's time so they can focus on other things while it's doing things in the background. There are bits of technology, there's stuff out there that can be used, even in the coastguard world, with helicopters and video footage: you could collect that stuff and some AI tools could look at it and spot things a lot quicker than you could, scanning lots of cliff tops or whatever. But it's about adopting the right technology in the right place.
So let's now get the views of the other colleagues, now we've done police. I was going to say done police to death then, but that was very, very interesting about pilots and prototypes. I don't know about ambulance, do you have pilots and prototypes? I'd say, is Chris Lucas a pilot or a prototype? But do you have many pilots and prototypes that really go anywhere, do you? Yeah, go on. And what about Chris Lucas, is he a pilot or a prototype? He's not here.
No, no, we do actually, we do tend to have pilots. They take a long time to get going, if I'm honest, in the ambulance service world. But in terms of turning them into something tangible, I think the problem we encounter, and you sort of touched on it, I think, is other departments just picking the bits of shiny stuff off the shelf and saying, can we have a go at this, please? We suffer that a lot in the ambulance service. We're trying to turn that round now, and I think you said that, Robin, in terms of: let's have the problem, deal with the problem. What is the problem you're trying to solve? And then, as I think you said about technologists, didn't you, let the technologists find the solution. We're trying to change that culture, certainly within my ambulance service. And I think we're at a bit of a cusp now as well. We're finding a lot more of the English ambulance service trusts starting to collaborate, two trusts, three trusts, not a sort of big UK-wide approach, but actually that in itself is quite interesting, in the fact that when we work with another trust, there's a different set of energy, maybe even a different set of experts who can see something that you can't, or they've tried a different product. And that's, I think, also where we sort of fall short sometimes: we go proof of concept with one product, not proof of concept with three products and which is the best of those three, and that's what lets us down sometimes. Oh, that one didn't work very well, so it's been that, without sort of testing that market. But of course, and I will say it comes down to the cost thing, I've got to say it again, it does, but that for me, I think, is on the cusp. It's a bit of a change for us now; I can see that proofs of concept may very well go somewhere in the not too distant future for us.
Interesting, yeah, interesting. Gillian, what are your thoughts? Again, on pilots and trials: we've seen bits and pieces coming out of the fire service generally, looking at bits of tech and trying it and piloting it, but then, beyond that, rolling it out wider. What are your thoughts on this? We describe them as pockets of innovation. Right, yeah, that's a good one. Yeah, yeah. I think, naively, when I joined fire and rescue as a sector, I expected that there might be more similarities across the governance models that would make it easier to get beyond the pilot stage, so that definitely complicates things. But I think we need to use things like the new procurement regulations to be able to engage better with suppliers at an early stage. We've got opportunities there, instead of seeing it as a kind of them-and-us relationship, to work cooperatively to develop products. And again, I'm absolutely a proponent of service design techniques, having that approach to problem solving, not looking at the technology. Because when you think about what we're trying to do, the people who are receiving our service, members of the public, don't care who delivers it, how it's delivered, or what department is responsible for it. All they want to see is that they get the service they need when they need it, and we need to start thinking more in those terms, rather than seeing the boundaries across the way that we deliver things, and to start to look at it from the public's perspective in a lot of these things. And it does come down to cost as well, because you can get money for a pilot, you can get money for a proof of concept, but there's never any money to scale it up, to roll it out, and to think about what happens for the development of that product across three years, five years: who maintains it, who's going to drive that forward? We need to think, right from the start, beyond that pilot phase: what happens when the pilot is successful, if the pilot is successful, what comes next?
Yeah, yeah. Interesting. Heidi, do you want to put in some final comments? Yeah, well, I guess just building on that as well, thinking about our end user, which for a lot of our areas means the maritime domain. Most people are quite aware that the maritime domain and the industry move quite slowly with things. You might have a vessel that's in operation for 40 or 50 years, and we have to take that into account when we're trying to roll out something new. With some of these things there is a 10 to 20 year roll-out process that we have to follow, and we're working very closely with the International Maritime Organization on those things, to try and get them moving. Just as an example, one of the things at the moment is the VHF data exchange system, VDES, which was supposed to be the second generation vessel traffic solution, but we have been talking about this internationally for many, many years now, at least a good ten years, and we still really haven't got any major progress on it. So we move very slowly with these things; we're working slowly with them. I think a lot of that is to do with standardisation. So for us, again, I think it's important to have standardisation across those different systems and solutions, particularly when we're talking about the interfaces, APIs for example, and making sure that is available within those systems. Going back to our own end users of the systems, they might have lots of different technologies and, as we have mentioned, all these exciting new things that come out, but actually how can we ensure that they interface and fit together and present the right information to them at the right time?
So yeah, interesting, interesting, and you touched there on standards and things, so just remember, when you come back at 3:15, after a small break between this one and the next one, Mr Mladen and I will talk more about standards and the importance of standards, won't we? So we'll cover that one off then. We're getting on quite swiftly to my final question. We've got one more slide, we'll talk about that, and then we'll open the floor up to see if the audience want to ask you anything. If they do, we'll stay a bit longer; if not, we can always knock off a bit early, because nobody really gave me a time to finish other than by 3 o'clock, so that's fine. So, is there anything particularly innovative, that wasn't necessarily even on my chart, and I'm leaving my pictures to last, that might change the way that mission critical services are delivered in the future? And BAPCO has asked me to include in that discussion: what about 6G? I mean, I don't know if 7G is around yet, but it's bound to follow, right, because 7 is a bigger number than 6, but it's all about 6G. So is there anything particularly innovative that jumps out, that you think will be like that killer app, that thing that would really, really make a difference? And as Heidi has got the microphone, Heidi can go first.
Thanks. I guess, in terms of communications particularly, obviously connectivity is always quite top of our list, particularly in the maritime field. In that industry there are trials ongoing at the moment with the emergency locator beacons and being able to have some two-way comms with them. So I think that will really improve our communications in the maritime industry in that way.
That's interesting. Because I don't have an answer to the question, I'll answer a different question. So, like a politician. I guess for me, if I come back to AI: if there were no more development in AI, if it stopped today, if Google, Microsoft and Anthropic did nothing from today, I still think the technology that exists today means I could spend five years transforming policing. Now, they won't do nothing, and on AGI it's quite clear they're increasingly confident. What does that mean? I've got two young children, and I sit there and look to the future for them. What will the future look like? And I sit there thinking that how I work today, how I live my life, will be completely different to how theirs are. You see it in bits, but I think it truly is that transformative, as a general purpose technology. It feels a little bit overwhelming in terms of what it will be, but equally I think it will dominate. It's general purpose, so we won't talk about it in the same way, just as we don't talk about the internet and the web in that generic sense. But I do think it will transform every aspect of life, and therefore the emergency services, whether it's criminality, productivity, how we attract people, and in five years' time I think it will be a very different conversation.
And it's very interesting, because even now, we talked a bit about commercial people delaying things, but you see in some tenders now, when they're sent out: have you used AI to write your tender response? And if you say yes, you've got to say how and why. I don't know, ask the AI. I had one the other day, a question which was, what will the environmental impact be of deploying AI? I don't know, I'll ask ChatGPT. It gave me pages of stuff, but I still never understood it, so I just left that question.
The concept of what knowledge is, what work is, what it is to be human as an officer, you know, there are a couple of stories I mentioned about officers, the things we do that are not so much process but the parts that are so human. Actually, that will remain, and I think that separation between what is valuable work for a human and what is just waste and can be automated will keep increasing over the next few years. Yeah.
And it's really interesting. I've got one picture in a minute, I've got several pictures on a slide, and I've got one picture of one thing that is really interesting to me and just makes me laugh, and we'll touch a bit on that human aspect in a moment. Gillian, what are your thoughts?
This might be a bit disappointing, but it's a fairly similar answer, actually. I don't see there being something in the next couple of years that's going to be as big a sea change as we're seeing with AI at the moment. I think it's going to take us time to really understand, and to standardise our data, to make sure that it's of good quality, so that we can use it to get good results from the technologies that are available to us, and that's going to take a bit of time. I think it will lead to us reviewing how we view people's roles within services, and how we recruit differently, because we're going to expect them to have a different skill set. We really need people, regardless of their role across all these services, to be not just competent at writing a Word document or sending an email. We need them to have a good understanding of being data consumers at the very least, and to understand what that data is telling them; and then, moving on through, we need them to be competent, adept, skilled at ingesting and analysing and using all these different data sources.
Yeah. And it's interesting. There are quite a lot of projects out there, aren't there, around enterprise-wide data platforms, people building data platforms and collecting all this data, and the question is about the quality of the data and how do you know what's right or wrong. I used to have this old saying of garbage in, garbage out. It's very interesting, and there is quite a bit of investment going on in that space at the moment. I could turn around to the audience and look at somebody who's doing some of that, but I won't. But it is, again, interesting, and like you say, when will it come, and how do we use it and adopt it? Do you let it have control, or are you just using it as a tool to aid some decision making? Then I'd come back and go, do you know what, in fire, I've been doing this PDA thing for a long, long time. It tells me what sort of event I've got and how many pumps I need to send, and things like that, and I quite like that, so there we are. But maybe I'm getting too old now.
Jason, what's your thoughts?
Ah, at the risk of being on the AI bandwagon. Yeah, carry on, sorry about that. But it is going to be AI, I think, you know. Again, we are starting to embrace it even at a lower level, you know, the Copilot and ChatGPT stuff within our, um, corporate things. I mean, how many reports have I done now with a bit of Copilot, right, just to start me off, because I've sat there scratching my head before, um, and it's been really, really helpful just to, you know, kickstart something. But let's take it from the patient perspective again. I'm sorry, it's my job, right, I'm going to sit with the patient thing, but I'm going to look in the control room.
And, um, a bit like our police and fire colleagues, all the way through the journey, from the time the, um, call is made and hits our switch, we've got targets all the way through that journey, from the time that call is answered through to, you know, the time we have to dispatch an ambulance, triage, etc. etc. You can imagine, right, um, a really short period of time, but there are lots of little actions within that journey that AI, for example, could have some impact on. And I'm going to pick a really easy one, if you like, because I had a walk around earlier and I saw a product.
If we can imagine, er, someone in peril phoning the control room, the control centre, and, um, they speak a different language. Right, now the process for the ambulance service is to then phone something called Language Line, right, to get somebody to do that interpretation, right, in real time. So that extends things, and of course, if the dialect is wrong, all of those types of things, that extends that period. Yeah. So if, just by one little thing, we could add an AI type approach, so something recognises that this person is speaking in Hindi or whatever the case may be, such and such a dialect, right, and can interpret some of that conversation for us, for our call taker. I can see that being an absolute benefit and a time benefit for us, as well as for the patient, of course, because we'll be able to communicate now with the patient or the caller, right, and hopefully bring their anxiety down in a much quicker way. And of course we've all listened now, haven't we, to some of these AIs, and they can actually have that conversation, right, and you're sort of, is that an AI or is that somebody else on the other end? You know, they're getting really quite clever at doing it themselves, aren't they? So, for me, I think it's going to be a game changer for us.
And it is interesting.
There are some people out there, and I remember last year we had a panel where a guy from Norfolk Police was talking about adopting some AI, well, AI, you know, it's a bit of a buzzword, but some technology that would look at filtering out calls and reduce the amount of calls to, you know, to the actual call handling people, um, and again, like you say, you know, translation. I always think of Dragon NaturallySpeaking, is that still going, you know? But these things can improve those things, and they can help with some very simple tasks, and hopefully that's where they will add benefit, right? And that's where you manage that risk, is it? You do something small, you do one thing and see if it works, I think, and, uh, you know, we're all too encompassed, I think, in trying to go, let's do AI, let's do something really big, and let's have a nice new capability, you know. Just pick one thing and start there. I mean, most of you know I spend a lot of time in National Highways; we have the same thing, particularly with Polish lorry drivers and emergency roadside telephones, and we use Language Line as well, right? So why can't you turn that into some sort of automated thing, um, and put some AI and some clever stuff in the front? That you could do, right? So let's see where that goes.
Finally, well, not quite finally, we'll have one more round of thoughts in a minute, but I'll put a slide up after this and tell a few stories. But Wayne, what are your thoughts?
So, something that will really make a change. It's AI related, but I think it's the human to machine interface, I think that's going to be the revolution, particularly for frontline staff, because the natural language type of interface, I think, is going to do away with systems and applications as we know them. You know, absolutely they'll be used in the back office, but, you know, to be able to do things from a smartphone, a tablet, whatever, to say, you know, how many mountain bikes have been stolen in Leamington in the last 6 months, and plot it on a map. That's going to be your interface, you know, and it's going to keep bringing back information in whatever format, or I want to update crime number whatever, I want to update incident number whatever, you know, and it's just going to walk you through those things. I think that's absolutely going to transform, for our frontline staff, the day-to-day business of logging into yet another app, you know, going through quite clumsy interfaces, you know, particularly when you're mobile. You know, that part of AI, I think, is developing quite quickly now, but I think that's going to be transformative.
Yeah, and it's interesting, isn't it? So let's move on to this. We're going to get some of your thoughts on some of these stories, and we'll add a few things. You touched there on the human interface. I mean, there is somebody out there, I'm not going to say where, it's a treasure hunt now, you're going to have to go and find it out there, but there is, on somebody's stand, uh, a virtual reality control room where you can put that on and you've got your command and control system. You can find that out there, but that's out there, and again this is interesting: AI, sorry, augmented reality and virtual reality headsets and things have developed a lot. They're a lot lighter now, so will the control room of the future not have all those screens? Will it just have one of those?
I don't know. Another interesting question is whether there will be control rooms at all, or will people sit at home? There are different interesting views on that, about privacy and protection and so on and so forth. I thought at the end you might touch on, you talked there about, you know, EPIRBs and beacons and clever other things and two-way communication. I mean, clearly satellite is interesting, isn't it, because obviously I'm not going to mention certain companies, but you know you can get satellite on your phone now, on some phones, and that's again been aligned to some of the standards work we've been doing. So satellite communications, and more satellites out there, so improving things from that point of view.
Um, has anybody here in the audience ever met Spot the dog? I really do not like Spot the dog; everywhere I go, there it is. I don't think there is one here. Has anybody seen Spot the dog at this event? That'll be the first event in about 10 years I have not seen this dog. So the New York Police Department decided to roll out Spot the dog, um, and the New Yorkers didn't really like Spot the dog because it wasn't very human, and they thought it was quite creepy, really. And it's interesting, what you touched on about that human, uh, perspective. In terms of, you know, I think some of these are being used for securing perimeters and looking at various things, with scanners and things on, great for that, but you start trying to replace the human being with robots, and Dubai is a very interesting place. There's also a police robot in Singapore that wanders around; occasionally it falls over when it gets off the edge of the kerb, like it's drunk. But using robots to replace humans is interesting. And please don't tell Boston Dynamics, but I don't like Spot the dog. I'm really sorry, um, but he sort of touched on some of that.
Um, devices, and putting things in people's hands. We haven't, uh, you know, we've focused a lot on AI, um, and probably the control room, things like that. There's an interesting, you know, view about the devices and whether the devices will change. You know, to come back to Wayne's point, that might be more, I don't know, on the body, might be more wearables, as you say, might be different. But with those devices, those technologies, as I said before, the battery technology always interests me, because you want them to last longer. I don't want to have to carry all these battery packs and power, so I've got a bag full of battery packs, as my colleagues will tell you. I've got a power station in my bag, right, and I'm charging all this stuff and all these gizmos. I'd quite like them to last longer, and do I use solar? We saw somebody at CCW who had, like, body armour, didn't we, Tony, with solar panels in it. So hopefully he's not really in Britain much, because it rains, but he's somewhere, you know, where the sun is shining and he can charge all these devices with his solar panels. So, interesting use of technology and how we take it there.
We've got 15 minutes, so I'll look at my time. We've got 10 minutes. Um, any closing thoughts, before we ask the audience if they've got anything for us? If they have, we'll answer it. If not, we can go and have a tea break and a nature break before Mr Blain takes the stage. Any closing comments, Wayne?
I think you had a good time? Yeah, thoroughly enjoyed it, thank you, Robin. I think the last one is the kind of skills that go with all of this, the people element. You know, I think that's a massive challenge for all the services: getting the right people, who understand some of this, to do it in the right way. So I think that is an enduring challenge, because all the suppliers here in the room are going to pinch all the good people.
It's also interesting whether there should be a bit more around innovation and innovation hubs. Previously, you know, we've talked a bit about the Police Digital Service and policing. We talked about the Home Office, and Archie and Simon came here last year and talked a bit about, you know, piloting things and doing things in 6 weeks. There's the MOD defence research part that will also take things and try things. I don't know whether the maritime side have a similar thing, but there are different bits, and fire probably, I don't know. We talked about the College of Policing. Different people will play and do different things. Trying to join that up, maybe share that knowledge, have some innovation hubs, do some things with the organisations, because some organisations can afford, and do have, people that monitor and look at innovation and track those things in their interest. Other people don't, right? So therefore, and you sort of touched on it: I don't want to necessarily be at the bleeding edge. I want to see what lessons have been learned where other people have played with some of this stuff. If it's added some benefit, can I do something with it? And I think that's a very, very important message. So go on, Jason.
I'm just going to pick on this one, if that's all right: the bendy phone. But if we look at it as devices, right? I'm going to think of the ambulance crew, ambulance paramedic, whatever, you know, um, the amount of kit they've got to take into someone's house, or to the roadside or wherever, um, when they're dealing with something. What we would love to get to is a position where they've got one device, it doesn't matter which necessarily, you know, with an ecosystem of applications, uh, linked to other devices, etc., not necessarily one manufacturer. We can pick and choose the best of the best for the job that we need to do, but interface it into one device, right, because I've got to make their job easier. You know, you can imagine going into the house: they've got a defib, electronic patient record, radio, phone, you know, all of those things at one point. And trust me, I've done it, I've left something at a patient's house. I've taken them to the hospital and I can't get back to get that kit, right, because they're now in hospital. Right, so by limiting that, but still having all of the tools they need on a device, or, um, you know, a platform that allows integration of all the other things they want, it's going to reduce that for me and make the job easier.
Yeah, yeah, it's interesting, very, very interesting. Gillian, last thoughts?
Last thoughts: I think, um, what we talked about a bit earlier, you know, about taking something small, doing it, doing it well, trying not to overcomplicate it. Um, and look to what other sectors are doing, what other countries are doing, um, and where there are examples of good practice, adopt them; we don't need to be the ones that are inventing it all the time. Um, so, for example, we've just rolled out through the NFCC, to all the services, a document translation tool. Um, it is an AI tool, um, and it was something that we saw being used by a local authority, um, and we recognised the value for people going out doing protection work, for example, you know, where business owners speak a different language as a primary language. So now they're able to actually take the documents to them in their own language, to have them presented to them. Um, and it just makes that interaction with the communities much more seamless. Um, and I think that's what we always have to remember whenever we are looking at doing something: we have to think, who are the end users and what will the difference be for them? Um, whether that's our staff or whether that's the communities that we're serving, because that's the most important part of all.
OK, yeah, interesting. Stephen, last thoughts from you. Yeah, I think it's just that, when you work in technology, it's no longer 'can we do it?'. The question now is how we do it and should we do it, you know. When users come to us, for pretty much every question they ask of me I kind of say, yeah, it can be done; it's whether we should do it and how we do it. I think there's a big case for a bias for action. You'll never get the stars to align, systems, everything else, there's always a reason. And so I have a huge bias for action, and I'm fairly comfortable that the next CTO might look at some of my decisions and think, why did you do that? But I think I've got a moral imperative now, and I use the example of a senior detective who spends 10 hours watching CCTV, and I can do it in 2 hours. There's risk to that, but what's the risk of that detective watching CCTV and not helping the child that's being groomed, or the domestic abuse victims? So I think there is a moral imperative for a bias for action in what we do as emergency services. Yeah, it's very interesting. Making the right use of the tools, and again, you know, you're not freeing up their time as in they've got the time off; they can focus on those things that are more important because those tools help. And yeah, you've hit the nail on the head there. Heidi?
Saved the best for last, did we? Thanks. Um, yeah, I mean, I'll come back again to, um, you know, reducing the number of applications, systems and devices that people need, uh, you know, that our operators, um, and our users need, and bringing that down, uh, you know, making it as simple as possible. Um, so they don't have to carry around multiple devices, and they don't have to be trying to flick between different devices and all of these things. They're all exciting and they've got lots of information on there, but, you know, they do overlap to some extent as well, so it's understanding how to, you know, make sure that they've got the right, reliable information. Um, and I guess just coming back to the people part as well. I mean, I think it's really important at the moment to have those critical thinkers, to have people, um, who have that analytical mindset. Um, I was listening to something the other day which was asking what you teach the next generation with all this technology, all this AI. What's important for the next generation to learn? Um, and, you know, time and time again I hear that it's critical thinking, because when you're presented with data, and you're presented with answers to things all the time, you need someone to critically determine, you know, is that the right information that I'm being presented with?
Brilliant, OK. So what we're going to do is, I'm not going to open the floor up to questions, because we'll take more than 5 minutes and we've only got 5 minutes. So what I'm going to do is, we'll close now. If anybody's got any particular questions for the panel, the panel will be around for at least the next 5 or 6 minutes, so you can ask the panel anything you want, and we'll do it that way, because otherwise we'll end up being 10 or 15 minutes in and we've got another panel at 1:15. So if we end now, then at least it's a bit less formal, and people might be a bit more open to answering questions, and to asking questions, because sometimes people don't like to ask questions.
So I'm just going to close by saying thank you very much. Some really, really interesting points, and some interesting synergies there between the services, and even Wayne, who was once here and now does lots of other things, you know, gave very informative, very interesting independent views and thoughts. There are some snippets of definite synergy, and similar ideas and similar challenges and similar problems, and even similar needs and wants and what's important. One day we'll write a paper on some of these things that come out. Maybe we should collect and record the notes and turn them into something for other people to see. It's a really interesting topic for me, because I spend a lot of time doing this horizon scanning, over various years, with colleagues, and collecting some of this stuff, so it's nice to talk about it. It's nice to get your views. So I'd just like to say thank you to the panel of esteemed colleagues and guests. I know it takes a bit to come and sit up front. I know some of you have been here many years before and you just keep coming back for more fun, but I would just like to close in the traditional fashion with a clap of the hands to say thank you very much, panel. We will see you, TCCA, at 1:15 here. If anybody else has any questions for any of us, we're here, so please come up at the end. So thank you very much, and I'll clap my iPad and they'll clap their hands. Thank you very much.
What does the future of tech look like?
4 May 2025
Panel discussion on the future of tech at the BAPCO Annual Event 2025.
Chair: Robin Davis, Executive Director - Actica
Panellists:
- Heidi Clevett, Head of Technical Infrastructure - HM Coastguard Technology Team
- Gillian Fyfe, Strategic Digital, Technology & Cyber Lead - National Fire Chiefs Council (NFCC)
- Jason Somerville, Head of Clinical Communications and Telemetry - South Central Ambulance Service NHS Trust
- Charlotte Hails, Head of Public Sector Vertical Strategy - Virgin Media O2 Business
- Wayne Parkes, Independent DDaT consultant and strategic advisor, specialising in PoliceTech
The BAPCO Annual Event is the UK's leading public safety event. It is your gateway to discovering the latest advancements, networking with industry experts, and exploring cutting-edge technologies that are transforming the way we ensure public safety.