Accelerate Digital Transformation with IoT Data Using Data Analytics and AI (Cloud Next ’19)

[MUSIC PLAYING] ANTONY PASSEMARD: So,
thanks for being here. Welcome to that session
on how to accelerate your digital
transformation through IoT, data analytics, and AI. My name is Antony Passemard. I lead the product management
team for Cloud IoT at Google. I’m really happy
to be here today. I have a few customers
who will join me on stage during the session. I have also a great partner and
he is going to come on stage and talk about their experience. Let’s go ahead and get started. If you’re here today, it’s
probably because IoT– you know that IoT already
is producing value. It may already be, or you
know it can produce value for your business. Either it’s part of a
small implementation that you’re using
in some use case, or it’s part of a broader
digital transformation initiative inside your company. You know IoT’s
producing value, but you want to learn more about that. We’ve seen that across
different industries, for example, in
agriculture, how to reduce water usage, increase yield. In oil and gas, it’s more
about predictive maintenance and increasing security of
workers in those oil platforms. We see the logistics
for asset tracking. We’ll talk a little bit
about an example there. I mean, it’s really
across every industry. Those are just three industries
that I want to point out. But it’s really across
all those industries that we’re seeing IoT
having an impact today. And that impact is
pretty undeniable. The drivers behind
those use cases are oftentimes seen
through three megatrends. Those megatrends are
smart factories– that’s a very
important one happening right now– the
rapid urbanization, and the digitization
of workflows. In smart factories, you may
hear manufacturing 4.0, or maybe IIoT, or however you call it. The manufacturing process is
really changing right now. It’s adding a lot more
real-time insights, a lot more just-in-time provisioning of
the assets of the raw material of the parts,
just-in-time deliveries. You have now robots inside the
factories either moving around or stationary doing their work. All this is driven by
machine learning and AI. The addition of machine
learning and AI in factories has transformed how
things are built today. And that’s a big
transformation and a big shift that’s driving almost
everything in the value chain. In urbanization, there’s pretty
staggering statistics from the UN that by 2050, 64%
of the developing world and 86% of the developed
world will be urbanized. That means that shift from the
rural areas to the urban areas is pretty massive. And it’s happening
at a rapid pace. So cities have to get data
from sensors– traffic lights, you know, traffic,
buses, transportation, waste management, energy
consumption for the city. All of those have to
be equipped with IoT so the cities can plan
better for that growth and serve their residents
better, and increase security also, which is a big challenge
with that rapid urbanization. So that’s a big, big
trend that’s happening. The last one is almost
across every industry. It’s about the
digitization of workflow. It’s replacing things
that were done manually or were really difficult
to do, through automated
workflows using machine learning, using AI to
really accelerate that. So it’s about decreasing
production costs. It’s about faster
turnaround time. It’s about increasing
efficiency. That digitization
of workflow can look like replacing clicks
from a call center agent with a script that just goes
through that whole process, gets data in ASAP and
brings it into the case. Could be that simple, but it’s
really happening everywhere. And that’s really
accelerating everything that’s happening today. The real-time nature of
the digitization of workflows is really important there. Real time is something I’ll come
back to a lot during this session, because we’re seeing
that with real-time data, you can really realize
and maximize the value from your IoT implementation. This is a statistic from IDC. By 2025, which is not too far
from now, a quarter of all data generated in
the data sphere will be real-time in nature. And what’s even
more staggering is that 95% of that quarter
of data is driven by IoT. So IoT is real time. That’s really what’s driving
all the real-time data in the world. So that’s a pretty
impressive statistic. So real time is an
important concept. And it’s not a surprise to
see that after a few years– I’ve been in that space
for quite some time now. And I’ve seen kind of the
hype and the disillusionment curve a little bit. Now it’s coming back, but it’s
not a surprise to see that– that’s a study from
IoT Analytics– that 50% of the companies
in the next two years are going to start to
see value from IoT. And that’s up from 12% in 2018. That’s different from the
stats you’ve seen before, which were around how people
will do an IoT project, or will start an IoT project. We’ve seen a lot of those stats. This is about generating
value from IoT projects. That’s a shift. That’s really going from
a PoC to production. And that’s really
starting to get the reality of what
IoT should be doing, which is producing value. We see that real-time insights
and that real-time value in different areas,
obviously reducing costs, driving efficiency. We’ve talked about that. But in logistics,
for example, we had a customer in health care who’s
transporting biopsies from
a hospital to a lab. Having visibility
of that biopsy, having the visibility of the
temperature of that biopsy throughout the whole route
that that sample is taking is extremely important. If the biopsy has
a bad temperature or goes bad during the
transport, it’s useless and it may be a little bit
of a problem for the patient, obviously. So that real-time nature of
data is really important. You can’t have
that the next day. You can’t see the next day, oh,
that that trip was not good. In manufacturing
and oil and gas, we’ve seen use cases where they
want to track workers and see if they have their hard hat on,
if they’re supposed to be in certain areas in the plant. That’s super important for
security and safety reasons in particular. Same thing– if you don’t
have the information right there, right now,
that’s pretty useless to have the next day. If there’s an accident,
it’s too late. The real-time nature of IoT is
really what’s driving value. And we’d argue that
the time is now. It’s happening now. As you’ve seen the stats on the
50% of companies that will see value, we’re really seeing it. But it’s actually
pretty complex. There’s still a lot of
hurdles to go through. You may be sitting there
saying, OK, that’s great. Real time is great. OK, all that. I get it. But how am I doing that? How am I going to apply
that to my business? Am I doing the right projects? Am I thinking broad enough? Am I maximizing value? It’s all great to think about
all those business cases, but you oftentimes
overlook the complexity of the infrastructure. There’s no real standard in IoT. It’s not great. It’s a pretty fragmented market. Managing an
infrastructure, scaling it, securing it– all
of that is very complex, and sometimes it causes
delays, causes issues, and even cancels launches. We’ve seen some of those. So choosing the provider for
your platform for the IoT deployment is really important. And keep that
complexity in mind. Try to limit the
complexity so you can focus on the right value for you. With IoT, there’s a
big data challenge. That’s really across the board. And, you know, if you want to
do machine learning and AI, you need a lot of data. Actually, the joke in
the data scientist world is, how much data do you
need to train a model? A billion. That’s always a billion. You need a billion
stuff to train a model. That’s what they want. So a lot of data
to train models, and to be able to get access to
those real-time insights that are actually accurate. If you want safety, if you
want security on your devices, and you have a large
fleet, you need to be able to scan that fleet
in real time at very high scale. So again, it’s a
big data challenge. As soon as you
start having volume, it becomes a big data challenge. And choosing the right platform
to do that, to serve that need, is really, really important. And that’s what’s going
to make your deployment or your implementation
competitive or not. So last year, we announced
a product called Cloud IoT Core that was the main
component, the main ingestion brick, of our
Cloud IoT platform. That was in March, I
believe, or February. Cloud IoT Core is a
fully managed service that will scale automatically
up and down, and it enables you to
connect millions of devices securely to the cloud. And then from there,
the data will flow in the rest of the platform. I’ll talk a little bit about
the different components of that platform a
little bit later. But the idea is that Cloud IoT
Core is your entry point and your device
management, and beside that sits all the data analytics,
transformation, and insights. One of our customers,
called Derive Systems, implemented IoT Core in
the Cloud IoT platform for their deployment. So, Derive Systems is
pretty interesting. They’re in the $18 billion
fleet technology business. They actually equip about
two million vehicles today, and put a little OBD2
dongle into the car to merge the engine data and the
driver data into a single view. Their goal is to really redefine
how the cars are configured and operate in the field. It’s mostly for fleets. The thing that they noticed
is the OEM manufacturers, or the car manufacturers, ship
millions of cars every year with the same software. It’s the same
software for everyone. And it makes no sense. Nobody drives the
same way, and nobody drives into the same place. It’s different
temperature, different way of driving, different
frequency, different lengths. Why are all the cars
configured exactly the same? It’s definitely not optimal. And Derive Systems
comes with a solution to change that, upgrade
the car, and control them better to fit them to
the mission that they have. The output of that
is higher safety– we’ll see an example– better sustainability with less
emission, and lower cost, obviously. So the way they’ve done it is they
have that little OBD2 dongle. They plug it in. That connects to IoT Core. Then they use– no, that’s
just two of the major tools. They use BigQuery. I don’t know if
you’ve seen a lot of the announcements for
BigQuery during the session. It’s pretty incredible. And then Cloud Machine
Learning for deriving insights in real time. So they really use that. And then they present all that
information through a portal, through a mobile app,
through dashboards, to the fleet managers. The result was
pretty staggering. So let’s look at an
example that they did. They wanted to limit
the dangerous driving for a major fleet provider. So they took two groups, a
control group and another group of pretty bad drivers that were
doing speed violations quite often. And they plugged
that little thing in, and they started tracking
what they were doing. And as soon as they
turned it on and started alerting the drivers about speed
limits and speed violations, it turned out they did
zero speed violations. So that was actually
a pretty good result. They didn’t need
to do any training. They didn’t need
to do any coaching. And it was just, hey,
let’s plug that in, and it’s right away– that, I
mean, that’s pretty impressive. First, it’s fewer accidents. That’s pretty good. So it saves lives. That’s good. They drive safer, so
they consume less gas in their cars. Actually, the fuel savings
here were between 8% and 12% per year, which is
pretty interesting. And the return on investment
just on the cost savings justifies the Derive Systems
implementation right away. So it’s actually
pretty impressive. So that was a very concrete
output that we really like with that use case. So, the Derive example
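To make the Derive flow concrete, here is a minimal, purely illustrative sketch of real-time speed-violation alerting on OBD2-style telemetry. The field names and the 90 km/h limit are hypothetical, not Derive Systems' actual implementation:

```python
# Illustrative only: real-time speed-violation alerting over a stream
# of OBD2-style telemetry readings. All field names are hypothetical.

def check_speed_violations(readings, speed_limit_kmh=90):
    """Return an alert record for every reading above the speed limit."""
    alerts = []
    for r in readings:
        if r["speed_kmh"] > speed_limit_kmh:
            alerts.append({
                "vehicle_id": r["vehicle_id"],
                "timestamp": r["timestamp"],
                "speed_kmh": r["speed_kmh"],
                "over_by": r["speed_kmh"] - speed_limit_kmh,
            })
    return alerts

readings = [
    {"vehicle_id": "v1", "timestamp": 1, "speed_kmh": 72},
    {"vehicle_id": "v2", "timestamp": 2, "speed_kmh": 104},
]
print(check_speed_violations(readings))  # one alert, for v2, 14 km/h over
```

In a production setup this check would run against live telemetry flowing through the ingestion pipeline, with alerts pushed back to the driver immediately rather than batched.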
is pretty cool. It shows how it could scale. It’s 2 million vehicles. It’s real time. And let’s look a little bit
about the platform itself and kind of the flow of data. So as I said, IoT Core itself is
serverless and highly scalable. But the rest of the Cloud IoT
platform is also serverless. It’s fully managed. You never have to configure
the scaling policies, the load balancing, the region
sometimes, even. It’s all included– what
you have to focus on is what kind of
data you want, what kind of analytics
you want to do, and how you’re going
to present it. So, really focusing on your
differentiation and your value. So, we’ll go
through that process of connecting, processing,
storing, and analyzing data through a little bit
of a journey for data. So let’s start with
the ingestion of data. I talked about IoT Core. IoT Core has a global front end. And that’s important
because when you configure your
devices, they only have one URL across the globe. You send that device
anywhere in the world, it will connect to that URL. But usually when you have one
URL, there’s a DNS resolution, and the data will hop
through the internet to get to that endpoint. With IoT Core, we use the
global front ends from Google. So we use over a hundred
front ends across the globe. And you never really
hop across the internet. You always hit the
closest front end– the one that serves Google
Search and YouTube and all that. You hit that, and then
you’re on our fiber network and go through your region. So you’re not hopping
through the internet. It’s much faster,
much more scalable. Then you land into
Pub/Sub– the data lands into Pub/Sub, which is
also a global service. That means that if the data
is generated in Australia, in EMEA, in the US, when you
pull the data out of Pub/Sub, you don’t have to know
where it was posted. You don’t have to know
that this vehicle was in Australia or in the US and
what topic do I need to get to. You just pull it from that
Pub/Sub topic out of your IoT Core and you just
get all the data. So that simplifies a lot. Pub/Sub scales massively. I mean, it’s used
tremendously inside Google and by our customers. So that’s pretty cool. So that data flows in. We even have QoS 1 on
Pub/Sub for the data. And there’s seven
days of storage included in Pub/Sub
for the data, so you can always
pick it up later. Then we go into transformation. Now, transformation
historically was done through Dataflow, which
will scale also and do your job processing over
a window of time. We just announced yesterday– I don’t know if you saw– Cloud Data Fusion in beta. Cloud Data Fusion allows you
to do data processing pipelines with drag-and-drop blocks. They have over a hundred
of those open-source blocks that you can add with
sources and destinations and transformation
capabilities in the middle. It’s all drag-and-drop. Looks like a big rules engine. Really, really cool. So this actually– Data Fusion is becoming
the glue between all the different services
on the GCP platform, especially for IoT data. So that same thing–
it’s a managed service. It scales. You don’t have to worry
too much about that. Then you go into the
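The transformation step just described, processing data over a window of time, can be illustrated with a toy tumbling-window aggregation. This is plain Python for illustration, not the Dataflow or Data Fusion APIs:

```python
# Toy illustration of windowed processing: group telemetry events into
# fixed (tumbling) time windows and average each window. A real pipeline
# would do this in Dataflow or a Data Fusion pipeline, not by hand.

from collections import defaultdict

def tumbling_window_avg(events, window_secs=60):
    """events: list of (timestamp_secs, value). Returns {window_start: average}."""
    windows = defaultdict(list)
    for ts, value in events:
        window_start = (ts // window_secs) * window_secs
        windows[window_start].append(value)
    return {w: sum(vs) / len(vs) for w, vs in sorted(windows.items())}

events = [(5, 10.0), (30, 20.0), (65, 30.0)]
print(tumbling_window_avg(events))  # {0: 15.0, 60: 30.0}
```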
analysis of that data. So you went into Data Fusion. Fusion is going to
drop it into BigQuery. And then you can do
all your queries. It’s very high scale,
petabyte-scale-type queries in a few minutes. I think we’ll have an example
from Birdz a little later, who is using a lot of BigQuery. Machine Learning
will pull the data from BigQuery or other
sources and do your training, and then you’ll be able
to visualize that data through Cloud Data Studio. And there’s no IoT value without
some kind of act at the end. You can ingest,
transform, analyze. Now you need to act. We’ve talked about real-time
insights, real-time actions. You need to act on the data. And one good example that we
like to talk to our customers about is the application
of a machine learning model to our own data
center for the HVACs, and how we manage
the cold, and how to cool the servers
in our data centers. When we enabled the
adaptive control systems for our data centers,
we saved 40% of the energy used for cooling
our data centers. 40%. It was so big that the
team for the data center turned off the machine
learning models and the adaptive controls to
see if it wasn’t something else. And it went back up right away. This is pretty impressive. Our data centers are probably
the most efficient already, and that was on top of the
very efficient data center that we have. So instead of talking
about our own use case, I wanted to bring
Ludovic on stage. Ludovic Millier–
he’s from Birdz. It’s a subsidiary of Veolia. So, thanks for
being here, Ludovic. LUDOVIC MILLIER:
Thank you very much. ANTONY PASSEMARD:
Good to have you. LUDOVIC MILLIER: Antony. [INAUDIBLE] ANTONY PASSEMARD: So tell
us a little bit about Birdz and Veolia in particular. LUDOVIC MILLIER:
Yeah, of course. So, thank you very
much for being here. Veolia is a major French
company with 170,000 employees across 50 countries. We focus on three
main businesses, which are water, waste, and energy. And Veolia clearly
cares about how all these resources are used. And Birdz invented the monitoring
of these resources. Birdz is a French company– a Veolia company,
as you understood. And we design,
deploy, configure, and operate IoT devices. We do the data
collection, processing, aggregation, and delivery of
services to our end customers. Our IoT devices are designed
to last more than 15 years. And we are working
through an LPWAN network, which is a low-power network,
like Sigfox, LoRaWAN, and NB-IoT. We also have our own
homemade protocol. Today, we are handling more
than 3 million devices. 90% of the devices are
used for water use cases. But we also do tank monitoring,
electricity, and waste management devices. ANTONY PASSEMARD:
So, maybe if you want to tell us
a little bit more about the water usage in
particular– what kind of value you’re bringing there. LUDOVIC MILLIER:
Yeah, so, here it is. As you can see, there
is a water meter. And on top of it
sits the IoT device. And this IoT device
just sends us data and sends us alarms. We can be alerted on
leak detection, which is very important for
the end customers. We also have alarms for fraud,
if someone removes the IoT device. Maybe sometimes people think
that if they remove it, they won’t be charged. And it’s not really the case. And we have
over-clever people that remove it, and turn the meter,
fill their swimming pool, and think that we won’t
be able to see it. But it’s not the case. We see it. We see that. We also have some alarms
when the meter is blocked and when the meter is
under- or oversized. But on top of these
alarms, we also have some services coming
through the [INAUDIBLE] data [INAUDIBLE] from these devices. We can prevent sanitary risks. For example, if
you have backflow– if you think about
something like dirty water inside a
private water network, and this water comes back
from the private network, it could be something not
very good for the water supply. And we also can make
some temperature analysis, and also some
network efficiency analysis to see if there’s
unbilled water. We also can do some consumption
forecasting using big data and Cloud Machine
Learning– we have customer
consumption models. We can make some forecasts, and
also do predictive maintenance. ANTONY PASSEMARD:
So you’re talking about customers over there. Who was your customer? Can you give us
an example of some of the value provided to a
typical customer that you have? LUDOVIC MILLIER: Yeah. We have the city
of Lyon, for example. ANTONY PASSEMARD: The
cities are your customers. LUDOVIC MILLIER:
Yes, cities and– yeah, city is the customer. Yeah. ANTONY PASSEMARD: So,
tell us about Lyon. What did they do? LUDOVIC MILLIER: Lyon, yeah. ANTONY PASSEMARD: That’s the
second largest city in France. LUDOVIC MILLIER: Yes it is. For me, it’s the best. ANTONY PASSEMARD:
[LAUGHS] I’m from Paris. He’s from Lyon. LUDOVIC MILLIER: Yeah. [LAUGHTER] And, well, we have something
like 400,000 smart meters installed in Lyon. And we have a nice use
case at the end of January. The weather forecast
told us that we would have a week or two
weeks with very, very low temperatures. So we have a risk of damage
on the meters, of course. So what we did– we took the
data we have in BigQuery. And in BigQuery– we have
something like two years of raw data coming from all of
these meters, something like 10 terabytes in one table. So we did a little analysis and
get all the temperature data from this BigQuery table. And we get the list
of all the meters which will be under very,
very low temperature. And we give this list
to the town of Lyon, and they send the
mails, SMS, and maybe some mails and emails, to
prevent the end customer. Because in France, if the
meter has some problem, if the meter is damaged, the
end user has to pay for that. So the city of Lyon provides
a kind of service to warn the end customers. ANTONY PASSEMARD: Great. That’s concrete value. LUDOVIC MILLIER: Yes, it is. ANTONY PASSEMARD: Thank
you very much, Ludovic. That’s a great use case. LUDOVIC MILLIER:
Thanks for having me. Thank you very much. Thank you for the [INAUDIBLE]. [APPLAUSE] ANTONY PASSEMARD:
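The frost-risk analysis Ludovic describes can be sketched in miniature as follows. The field names and threshold are hypothetical; Birdz's real analysis ran as queries over roughly 10 terabytes of raw meter readings in BigQuery:

```python
# Miniature, purely illustrative version of the frost-risk analysis:
# given meter locations and a temperature forecast per zone, list the
# meters at risk of freezing. Field names are made up for the sketch.

def meters_at_risk(meters, forecast_by_zone, freeze_threshold_c=-5.0):
    """Return IDs of meters whose zone is forecast at or below the threshold."""
    return [
        m["meter_id"]
        for m in meters
        if forecast_by_zone.get(m["zone"], 0.0) <= freeze_threshold_c
    ]

meters = [
    {"meter_id": "m-001", "zone": "hilltop"},
    {"meter_id": "m-002", "zone": "riverside"},
]
forecast_by_zone = {"hilltop": -8.0, "riverside": -2.0}
print(meters_at_risk(meters, forecast_by_zone))  # ['m-001']
```

The output of a query like this is exactly the kind of list the city could use to send warnings to the affected customers.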
So you see companies like Birdz innovating. And so on our end, we’re trying
to provide more innovation to our customers so they can
build these kind of use cases. What we’ve done
this year– we’ve launched quite a few features
like the groups feature for devices. We’ve launched advanced logging. We’ve launched fast
commands down to the device. We can do a hundred commands
per second to devices also, so we’re not just ingesting,
we’re also pushing data down. So we’ve launched
a bunch of stuff. But I wanted to spend
time on two major features that we launched last year. One of them is the
gateway feature, and the other one is the
token service feature. So, the gateway
feature is something that enables you to declare a
gateway in the device manager and declare a bunch of
devices behind that gateway. And that enables
the gateway to be a proxy for authentication
for those devices. And what that means is you
can now connect or identify devices that are non-IP
devices directly to IoT Core. That means that if you have
a Modbus device, or a BACnet device, or a Zigbee device, as
long as they can generate a JWT and sign it with
their own identity, they can pass that
JWT to the gateway. The gateway will pass
the JWT to IoT Core. And the device will
be seen as an entity, as a first-party entity. And that’s really important
because usually when you have a gateway, the gateway
does the authentication. It proxies all that
identity for the devices. And the devices have less
secure identification. With this system, the devices
are authenticating themselves and can be seen in IoT Core. That was used by one of our
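The device-signed JWT he describes can be sketched roughly as below. One loud caveat: Cloud IoT Core actually requires RS256 or ES256 signatures with a public key registered in the device manager; this sketch signs with HS256 purely so it runs on the Python standard library, and the project ID and secret are made up:

```python
# Rough sketch of the JWT a device presents for authentication.
# ASSUMPTIONS: IoT Core requires RS256/ES256 with a registered public
# key; HS256 is used here only so the sketch runs without an RSA
# library. Project ID and secret are illustrative, not real values.

import base64, hashlib, hmac, json, time

def b64url(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def make_device_jwt(project_id: str, secret: bytes, ttl_secs: int = 3600) -> str:
    header = {"alg": "HS256", "typ": "JWT"}
    now = int(time.time())
    # 'aud' is the cloud project; 'iat'/'exp' bound the token's lifetime.
    payload = {"aud": project_id, "iat": now, "exp": now + ttl_secs}
    signing_input = (
        b64url(json.dumps(header).encode())
        + "."
        + b64url(json.dumps(payload).encode())
    )
    signature = hmac.new(secret, signing_input.encode(), hashlib.sha256).digest()
    return signing_input + "." + b64url(signature)

token = make_device_jwt("my-gcp-project", b"device-private-secret")
print(token.count("."))  # 2: header.payload.signature
```

In the gateway flow described above, a constrained device would hand a token like this to the gateway, and the gateway would forward it unchanged so IoT Core sees the device as its own entity.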
customers called Smart Parking. They have sensors
in parking lots, and they have a gateway in
each of those parking lots, because they also have
their mesh network there. And they were mostly seeing
the gateway itself, and not– I mean, they were
seeing the sensors, but not as different entities. With that, now they
see all those sensors as individual entities,
that they can act on. So that was a pretty
cool use case. The other service
is token service. Token service came from a
lot of demand from customers. Customers were saying, hey, my
device connects to IoT Core. I’m using my public
and private key. That’s great. I can authenticate. But now I need to
download a patch. Now I need to upload
an image to GCS. Now I need to get a machine
learning model update because my model drifted. So how do I do that? I need another authentication. I need another identity now in GCP. So what’s the use of
having my JWT? Now I have another
identity for my device. So that’s what token
service solves. When you authenticate to
IoT Core with your JWT, you can request a
token that allows you to authenticate into
GCS or Cloud Functions, or a bunch of other
services, the container services, and things like that,
to download or upload data without having to have another
identity in the rest of the GCP platform. So it opens up the
entire GCP platform to devices through the
authentication of IoT Core. So that was a big demand. And that’s going to be
extremely useful, in particular for models that
need to be refreshed on a fairly regular
basis, or to do security patches over the air (OTA), for example. We also integrated
with an LPWAN partner. So, Ludovic was
talking about the fact that Birdz is using
those LPWAN technologies. They use LoRa. They use Sigfox. They use a bunch of others. So we actually released
an integration with Sigfox in particular. So if you don’t know, they’re
a very low-power network. They’re a fairly global
company right now, and you can have
super, super low power– five years, ten
years, on a battery– to connect to the
Sigfox network and then send data to Cloud IoT. And the same thing– we
did some integration, I think, with
Objenious, and there’s another one on the LoRaWAN
technology, the TTN network. So we have those integrations
here with the LoRaWAN network. So those are the
kind of innovation that we’re releasing. And we’ll keep going and
adding more and more features. So I want to bring on
stage another customer, Bill Schwebel from Zebra. Zebra– and you’ll
tell us what it is. And welcome, Bill. Thank you very much
for being here. BILL SCHWEBEL: You bet. [APPLAUSE] ANTONY PASSEMARD: So tell us
a bit about what Zebra is. BILL SCHWEBEL: So, Zebra is
one of the companies that’s probably everywhere that maybe
you don’t know much about us. But we do barcode
scanning and printing and mobile computers that
power frontline workers pretty much in most
industries, in manufacturing and distribution, in retail,
in transportation logistics, in health care, field mobility. Matter of fact, your
badge is printed by a Zebra printer, which could
have been embedded with an RFID chip. I’m not sure if we
did that here, but– ANTONY PASSEMARD: I
would probably say yes. BILL SCHWEBEL: Yes,
of course it is. You’re being tracked. But really, what we
do is we’re connecting those frontline workers
into our cloud platform, a platform we call Savanna,
to create something we call enterprise
asset intelligence. And essentially, that’s tracking those
millions of users’ information from those millions of worker
users, and the assets– whether they’re a forklift
or some control equipment or the goods themselves. So you can think of
billions of goods that may be tracked with RFID,
may have a barcode, et cetera. That enterprise
asset intelligence follows the same
trajectory that we talked about earlier
with the ingest, transform, analyze, act. For us, we call it
sense, analyze, act. And a lot of what has
been happening on the edge has been around sense. And really, for
that to have value, it has to go through the other
two steps of analyze, act. ANTONY PASSEMARD:
OK, so you’re very much at the edge, all local. So how does the
cloud play in there? What’s the use of
the cloud for you? BILL SCHWEBEL: So the cloud
really does two things for us. One is it’s the
main tool to unlock that data that’s on the edge. So that data is trapped today. And it’s really that kind of
physical world we’ve built. And we’re building a digital
representation, whether it’s through the device. So we’re collecting about
250 data points continuously from our devices that tell a
lot of things about condition, location, et cetera. But also the other
things that you could see, whether it’s
a visual sensor just to look at workflow,
and how, for example, somebody could be
loading a trailer, whether it’s assets in
a hospital, et cetera. So we use that, and you get kind
of facilitating some use cases, whether it’s a retail situation
where maybe a customer buys something on the
internet and they need to pick it up in the store. How many people have
gone to a store, and the thing’s not there? That’s a real-time action that
needs to happen in that sense, analyze, act to
fulfill on that order. Same in health care, serving
a patient in distress. So that’s one. The second is it
becomes a key interface point for our ISV partners. We have 2,500 ISV partners, as
well as about 10,000 to 20,000 resale partners. That becomes the
interface point for them, rather than having an
interface on the edge. ANTONY PASSEMARD: Oh, yeah. That’s actually a good point. So how did how did Google
in particular and Cloud IoT help to solve those challenges? BILL SCHWEBEL: So, we’re
just beginning our journey with Google Cloud IoT. And we’ve been doing some PoCs. We built our own IoT platform. Our group has been– you know, we’re about eight
years into our endeavor around IoT. But we really looked
at it as an opportunity to move into the
platform as a service. And one of our
biggest challenges is operating on the edge. So most of these
facilities– think of a distribution hub or a
retail facility or a hospital– they’re not on the internet. And we need to do a lot
of processing on the edge, and then seamlessly be able
to take that data, whether I’m running inference
on the edge or running models in the cloud, and
be able to manage and simplify that piece. ANTONY PASSEMARD: Cool. Well, that’s a
pretty cool use case. I really like it. And thanks for starting with us. It’s the beginning
of your journey. BILL SCHWEBEL: Beginning
of the journey. ANTONY PASSEMARD: Well
thank you very much, Bill. BILL SCHWEBEL: Yeah, thank you. ANTONY PASSEMARD:
Thanks for having– [APPLAUSE] It’s very interesting
because you’re seeing the megatrend
of rapid urbanization that Veolia and
Birdz are tackling. You’re really seeing that
digitization of workflows is happening in
manufacturing 4.0 with Bill. So you’re seeing those trends
across the different customer use cases. So as Bill mentioned, when you
build your solution, a lot– I don’t know about you
in the room specifically, but a lot of our customers have
tried to do it on their own. And that’s definitely
a way to do it. If you want to build
your own platform, you can really build from
scratch from virtual machines, really. I don’t advise doing
it, but you can. And you can also use the
different services, the managed services that we have in
the Cloud IoT platform and build your own solution. That’s no problem. A lot of the customers
come to us and say, we’d like a solution. I don’t want just
bricks to put together. I want a solution
for my use case. So what do you do for agro? What do you do
for manufacturing? What do you do for retail? What do you do for
asset tracking? Those are all kinds of
solutions that people are coming to us for. And we’ve worked with partners
to build solution accelerators. We believe that working
with partners in that case is actually more important,
because they know the business actually better than Google, which is more of an
infrastructure player. They know the business. They know the verticals. They know the customers. And they can build out solution
accelerators that are maybe 80% of what you need
out of the gate, and then they’ll customize that. So working with partners
is a really important part. In most implementations
that we’ve seen in IoT, there are partners. It could be on the hardware. It could be on the software
that’s running on your device. It could be the
connectivity partner. It could be the SI, could
be the ODM, could be the platform, the cloud. Will be some of
the analytics, some of the more vertical players. There’s going to be
some partners in there. And we see that as a natural
fit for any IoT deployment. There are going to be several
vendors working with it. So what we’re
trying to do is work with partners to make sure
that we’re all integrated. We’ve done the groundwork
before you actually have to do it yourself. So it’s cool that you can
build the bricks together. But if we’re bringing you
solutions from partners, they're already integrated,
they’re already scaled, and they’re already using a lot
of the benefits of the Google Cloud Platform. So that's really important to us. So you're going to see a lot
of the partners coming in. To that end, I want to show you three solution accelerators. So, Accenture released an
artificial intelligence visual inspection
solution that’s really using Edge TPU that
we announced last year. Edge TPU is our inference accelerator for TensorFlow Lite models that you can run on devices. It's a tiny chip that goes on a device, consumes only 2 to 4 watts, and can run machine learning models at 30 to 40 frames per second pretty easily on a video stream. So it's a very efficient chip that we released. Actually, it's available now. You can go on Mouser.com and buy a module for it. So Accenture has been
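(As a rough illustration of the kind of logic that runs on such a module: per frame, you run the model and act on its scores. The sketch below stubs the model call out with fixed scores; in a real deployment it would go through the TensorFlow Lite interpreter with the Edge TPU delegate, and the labels and threshold here are invented for illustration.)

```python
def run_model(frame):
    # Stand-in for on-device inference; a real deployment would invoke a
    # TensorFlow Lite model through the Edge TPU delegate here.
    return [0.05, 0.95]  # hypothetical per-class scores for ["ok", "defect"]

def classify_stream(frames, labels, threshold=0.8):
    """Classify each frame, flagging low-confidence results for human review."""
    verdicts = []
    for frame in frames:
        scores = run_model(frame)
        best = max(range(len(scores)), key=scores.__getitem__)
        verdicts.append(labels[best] if scores[best] >= threshold else "review")
    return verdicts
```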
using that module to do visual inspection– so instead of somebody looking at the different hardware, there's a camera there doing the visual inspection. They use it on a drone to do power line inspections as well. And they use it to do analysis of bacteria samples in health care as well. Those tasks were usually done manually by a person. Going up a power line and climbing it, there's risk. It's dangerous. It takes time. It costs a lot of money. This is much more efficient. So I believe they have a
booth where they present that. So there's a visual inspection use case using the Edge TPU on the show floor that you can see. The second solution
is from Deloitte. And they focus more on security. So they have two solutions–
the Product Security Manager, which allows you to do a lot of the patch management– scanning your fleet for vulnerabilities, knowing which IoT devices are at risk, and what patch level they have. And so it's really
having a fleet view of your IoT deployments. And that’s really
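(A fleet patch-level view of that kind can be pictured with a tiny sketch; the device IDs and version tuples below are invented for illustration, not taken from any real product.)

```python
def devices_at_risk(fleet, min_patch):
    """Return IDs of devices whose installed patch version is below the minimum.
    `fleet` maps device ID -> patch version as a (major, minor, patch) tuple."""
    return sorted(dev for dev, version in fleet.items() if version < min_patch)

# Hypothetical fleet inventory: one device is behind on patches.
fleet = {
    "sensor-001": (2, 1, 0),
    "sensor-002": (1, 9, 4),  # behind: a potential entry point to the network
    "gateway-01": (2, 2, 1),
}
```

Calling `devices_at_risk(fleet, (2, 0, 0))` would flag `sensor-002` as needing a patch.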
important as, you know, every new IoT device is
a potential entry point to your network. So the more you have, the
more security risk you have. It’s not like we have one
firewall that everything’s behind. You deploy these
things in the field. They are sometimes
accessible physically. So you have to have good
visibility on security. So, Product Security
Manager is that. The Technical Security
Testing is a solution to do more of the ongoing vulnerability scans of your fleet. It would be like scanning
your web servers or your mail servers to see if
there are any problems. They do that for IoT devices. They have robots
also going in your– if you have a little data center, they can have robots roaming around, looking at cables, and seeing if the layout of what's plugged in is what it's supposed to be and there haven't been any changes. So they try to
automate– that’s really in the workflow automation
that we discussed earlier. So those are two solutions. But for the third
one, I’m actually honored to have Shashank
here from Hitachi Consulting. Shashank is driving digital transformation for Hitachi Consulting. [APPLAUSE] So, Shashank, tell us about
Hitachi Process Intelligence. SHASHANK VOLETY: Sure. I’m going to talk about
Hitachi and our global partnership, and then talk about the Hitachi
Process Intelligence solution. Hitachi, as all of you know,
is a really old company. They’ve been around
for over 100 years. And we’ve used the deep
expertise and experience that we have gained in
operational technology and IT in building these solutions. So these solutions are
powered by Google Cloud. And really what
we are doing here is helping businesses
transform and embark on digital transformation
to really drive business, improve processes,
and increase agility. HPI, or Hitachi Process Intelligence, is one such solution. Antony has asked me
to talk about HPI and take a customer example
where Hitachi partnering with Google has implemented
components of HPI to help accelerate their
digital transformation. I’m only talking about GROWMARK. ANTONY PASSEMARD:
Yeah, so it’s GROWMARK. So this is HPI for agriculture. Does this HPI work for other
industries as well, or? SHASHANK VOLETY:
HPI is specifically for the smart agriculture space. Really, what it does is take advanced processes and combine them with analytics to support IoT-driven decisions for the agriculture and livestock industries. ANTONY PASSEMARD: So,
you have a customer live with this using HPI. And that’s GROWMARK. So tell us more about what– SHASHANK VOLETY: So,
this was a pilot project that we had done with Google. And really, we are in
the subsequent phases of the project. What we have done here is
help GROWMARK implement an IoT and an ML solution. What they really
wanted to do was to take some of their
logistics solutions where they are actually deploying tanks. And these are fuel tanks– propane, gasoline,
and refined fuels. This particular solution is
focused on propane tanks. It’s an IoT solution
where we’re actually measuring the tank level,
the inventory in the tank, and predicting the demand. So it’s really
forecasting the demand, and we’re looking at a two-day
forecast for the level of fuel in the tank. Just to give you some sense
for where and how it actually has an impact on the
business, GROWMARK works with their members. And it’s an agricultural
cooperative– very large agricultural
cooperative. So they work with the members. And they deploy these
propane tanks in fields. And imagine having a tank
in Sheldon, Iowa, which is a small town of 1,500
people, and 40 miles away from the nearest
interstate, right? These tanks actually
are providing the fuel in the winter
seasons for these farms for various functions. And as the fuel
level depletes, what happens in the traditional way is that GROWMARK is looking at the refilling and processing the orders. With this IoT solution,
with the two-day forecast,
combined with weather data, we are able to predict
what the demand looks like. And it’s really automating the
order and the reorder process. ANTONY PASSEMARD: Excellent. So tell us about how Google
helped in that case there. SHASHANK VOLETY: So, yeah. So I think Google
obviously played a big part in the solution. So, when we started having
the discussion with GROWMARK, really, they wanted
to take the solution, put it on the cloud for
scalability and reliability. We looked at Google’s
ML and IoT capabilities, and that was a
no-brainer, really. We actually proposed
our solution along with Google. And you're really
looking at being able to scale the
solution, and being able to provide some real-time
actionable insights using ML. ANTONY PASSEMARD: Well, great. Well, thank you very much
for sharing that story. SHASHANK VOLETY: Thank you. ANTONY PASSEMARD: It’s great. Thanks, Shashank,
for being here. SHASHANK VOLETY: Thanks
for having me here. [APPLAUSE] ANTONY PASSEMARD: Now,
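(The tank-level forecasting Shashank described can be pictured with a toy sketch: fit a drain rate to recent level readings and project two days ahead. The linear fit, thresholds, and readings here are invented stand-ins for the real ML model, which also folds in weather data.)

```python
def forecast_tank_level(readings, horizon_hours=48):
    """Project the tank fill level `horizon_hours` ahead from (hour, percent)
    readings, using a least-squares linear drain rate as a stand-in model."""
    n = len(readings)
    mean_t = sum(t for t, _ in readings) / n
    mean_l = sum(l for _, l in readings) / n
    num = sum((t - mean_t) * (l - mean_l) for t, l in readings)
    den = sum((t - mean_t) ** 2 for t, _ in readings)
    rate = num / den  # percent per hour (negative while draining)
    _, last_l = readings[-1]
    return max(last_l + rate * horizon_hours, 0.0)

def needs_reorder(readings, threshold=30.0):
    """Automate the reorder decision: order when the two-day projection
    falls below the threshold."""
    return forecast_tank_level(readings) < threshold
```

With readings like `[(0, 80.0), (24, 74.0), (48, 68.0)]`, the projection two days out is 56 percent, so no order is triggered yet.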
we’re actually getting to the end of the presentation. And I was really
fortunate to have real customers coming and
talking about their use cases. It’s always better
than more slides. We like to show
what’s been possible, what’s possible in the
platform, and what people have done at a pretty high scale. You know, you have
3 million devices analyzed with Birdz. And you have 2 million on Derive Systems. We have Zebra starting
to deploy that across their fleet, Shashank
with Hitachi really bringing value to GROWMARK right away. So this is really great. If you want to
learn more, there’s a few things in the intelligence
neighborhood show floor. There’s a little
demo of the Edge TPU. As I said, you can write a number on a little roll. It will do an analysis
in a few milliseconds and drop it in the right bucket. You can see how Edge
TPU is pretty powerful. It runs on something that
looks like a Raspberry Pi, so it’s pretty cool. You can find a lot of
information online as well at
cloud.google.com/solutions/iot. It’ll have more information
about the platform itself. And in /iot/partners, you’ll
find some of the partner accelerator solutions, and
some of the hardware as well. So, thank you very much for
being here this morning. And I hope that
was useful for you. And thanks for coming. And I hope you enjoy
the rest of the show. [MUSIC PLAYING]
