Mike Branch, vice president of data & analytics at Geotab, talked to automotiveIT International at the company’s recent “Mobility Connect” customer event in Barcelona.


Geotab’s platform processes 30 billion records a day. Says Branch: “We have AI because that’s too much for the human mind to make sense of.” (Photo: Geotab)

Geotab has built a software platform that provides fleet managers with information and analysis designed to improve all aspects of their operations. The Canadian telematics company, founded in 2000, offers software that can help improve safety, increase productivity and boost fuel efficiency. It provides its own hardware connector for vehicles’ OBD ports, but it is happy to take information from cars and commercial vehicles in other ways as well. Key to Geotab’s strategy is its offer to make sense of the billions of data points gathered from cars and trucks and the environments they operate in.

How does Geotab differ from the many other providers of telematics products and services in the automotive space?

We have an ecosystem approach, where we focus heavily on the engineering side and then allow our partners to build on top of that. We’re also making available a lot of the data we have gathered, which lets people see what’s possible. And we want to make it easy to work with a platform like the one Geotab has built.

What happens when you approach a fleet manager and start talking about data? Do you generally get a warm reception?

It varies. Some fleet managers are still very much stuck in the past and don’t have a firm understanding of what their data can do. We try to help them from an education perspective. But we also meet some really forward-thinking fleet managers. There tends to be a lot of focus on predictive maintenance, which is very easy to understand. It’s more difficult to see where else you can leverage data, and vehicle data in particular. More and more fleet managers and their CIOs are starting to grasp that they have an amazing set of vehicle data and that, once you extract this data, you can come up with all sorts of solutions.

Are CIOs also involved in how companies work with Geotab?

More and more you see that folks in the analytics realm fall under the CIO. The worst situation is when the fleet segment of a company is in a separate silo and stays there. You need to have the data discussion at board level, which is where it becomes really transformational.

Does the commitment to use data to improve fleet management vary between global regions?

On the electric-vehicle side, I would say Europe is ahead. When it comes to understanding how to activate some of the data from multiple business systems and bring them together, the US might be farther ahead.

With the availability of vehicle and other data growing exponentially, where are we today in analyzing and making business use of this data?

I think we’re still just at the beginning, though we’ve made some incredible strides in the past two years. Take machine learning: people now understand how it can be used and what kind of power it has. It’s transformational in helping businesses achieve scale. Artificial intelligence is perfect for that, but we’ve only seen the tip of the iceberg. What’s happening now is the democratization of machine learning and AI. Google and Amazon have done an amazing job bringing it down to a realm where, if I have some expertise in using traditional databases, I can now very easily activate a machine-learning model. That wasn’t the case a few years ago.
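To illustrate the point about database skills being enough to “activate” a model: Google’s BigQuery ML, for instance, lets a model be trained with a single SQL statement. The sketch below is hypothetical; the dataset, table and column names are invented, and the interview does not confirm that Geotab uses BigQuery ML specifically.

```python
# Hypothetical sketch: training a regression model with nothing but SQL,
# via BigQuery ML. Dataset/table/column names are invented for illustration.
from google.cloud import bigquery

client = bigquery.Client()

# One SQL statement trains the model inside the warehouse -- no servers to
# provision, no ML framework to learn, no data to move.
client.query("""
    CREATE OR REPLACE MODEL fleet.fuel_model
    OPTIONS (model_type = 'linear_reg',
             input_label_cols = ['fuel_used_l']) AS
    SELECT distance_km, avg_speed_kmh, idle_minutes, fuel_used_l
    FROM fleet.trip_summaries
""").result()
```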


Branch says Google and Amazon are instrumental in “the democratization of machine learning and AI” (Photo: Geotab)

You’re a big fan of BigQuery, Google’s enterprise data warehouse that allows companies like Geotab to store and query massive datasets.

BigQuery, coupled with the IP we create in our Geotab platform, is the single biggest reason we are where we are. I can write a query across millions of records and get results in seconds, and I don’t have to train up. Google’s platform also allows us to scale: if we need more servers to execute a query, Google makes them available. Because our data is resident on the Google cloud platform, our data scientists can just write a simple query and don’t have to worry about scaling up. We’re the 11th-biggest user of BigQuery worldwide. We also executed the biggest BigQuery request so far. It involved more than a trillion records.

Please give us an example of such a query.

For example, you might want to look at the second-by-second GPS locations of all 1.3 million vehicles connected to our platform for a particular week, and then marry that information with temperature data and windshield-wiper activation data. Something like that becomes very difficult to do if you don’t have a service like BigQuery.
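A rough sketch of what such a query might look like from a data scientist’s seat, using Google’s Python client for BigQuery. The table and column names are hypothetical stand-ins, not Geotab’s actual schema.

```python
# Hypothetical sketch of the query Branch describes: a week of per-second GPS
# points, joined against temperature and windshield-wiper activation data.
# Table/column names are invented; this is not Geotab's real schema.
from google.cloud import bigquery

client = bigquery.Client()

sql = """
    SELECT g.vehicle_id, g.ts, g.latitude, g.longitude,
           w.temperature_c, s.wipers_on
    FROM fleet.gps_seconds AS g
    JOIN fleet.weather AS w
      ON w.region = g.region
     AND w.hour = TIMESTAMP_TRUNC(g.ts, HOUR)
    JOIN fleet.wiper_status AS s
      ON s.vehicle_id = g.vehicle_id AND s.ts = g.ts
    WHERE g.ts >= '2019-03-04' AND g.ts < '2019-03-11'
"""

# BigQuery parallelizes the scan server-side; the client simply streams rows.
for row in client.query(sql).result():
    print(row.vehicle_id, row.ts, row.temperature_c, row.wipers_on)
```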

Why did you opt for Google to host your data as opposed to Amazon or Microsoft?

We needed BigQuery. Amazon and Microsoft Azure offered similar products but they just didn’t match the speed and ease of use of BigQuery. When we started back in 2014, BigQuery was the best choice and still is.

How does BigQuery compare with an open-source data processing framework such as Hadoop?

With Hadoop, we would have to manage the data infrastructure ourselves, which we don’t want to do. And it’s a lot more complex to query Hadoop. The nice thing about BigQuery, too, is the real-time response you get.

You mentioned that you have a team of data scientists extracting insights from your data. So not everything can be done by software?

We’ve got 28 people on my team, up from five just a few years ago. The team is a combination of data scientists, data engineers and data-visualization specialists. You need to visualize the data so that machine-learning algorithms can learn from it. My department is set to grow to 40 or 50 people. The majority are in Ontario, but we also have a few people in Spain.

Is it difficult to find data specialists?

Two years ago it was exceptionally difficult to find data-scientist talent, but now it is exceptionally easy. Where we struggle is on the data engineering side, where we need experts who are good at ingesting massive amounts of data and delivering insights in near-real time. That is a different skillset and it’s still a challenge to find people who can do that.

Both cars and the infrastructure will need to have more sensors to provide the data. Who foots the bill for that?

The end-customers are paying, but keep in mind that you don’t always need ubiquitous coverage to get good data. In Toronto, we have shown that, with 50 vehicles moving around in particular areas for a couple of hours, we get 85 percent of the information we need. If you have really good coverage in some areas, you can rely on the data coming from those areas.

You project that, with cars staying on the road a long time, only 50 percent of all vehicles will actually be connected 10 years from now. Is that a problem?

Getting vehicle data will be a combination of aftermarket technology and systems that are embedded by the manufacturer. We’re working more and more with carmakers to get data directly. But cars in North America, on average, stay on the road for 11 years, so the aftermarket will continue to play a big role.

How do you deal with the absence of standards for automotive data?

Geotab is a great equalizer. When we get data from different companies, we normalize all of it and present it through our APIs. Fuel economy, for example, is handled differently by each automaker, but when I’m interested in fuel economy as an end-user or fleet manager, I want it to be normalized across the board. For us, it doesn’t matter whether we’re talking about a Nissan or a Ford; a seatbelt-buckle-unbuckle event is just an event. Our platform handles the normalization. We’re working with the W3C consortium to adopt standards.
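As a concrete illustration of the kind of normalization Branch describes, the sketch below converts fuel-economy readings that different OEMs report in different units into one canonical unit before they would be exposed through an API. The unit codes and the function are assumptions for illustration, not Geotab’s actual implementation.

```python
# Illustrative only: normalizing fuel-economy readings reported in different
# units into one canonical unit (litres per 100 km). The unit codes and this
# function are invented for illustration, not Geotab's API.

def to_l_per_100km(value: float, unit: str) -> float:
    """Convert a fuel-economy reading to litres per 100 km."""
    if unit == "l_per_100km":
        return value
    if unit == "mpg_us":       # 1 US gal = 3.7854 L, 1 mile = 1.6093 km
        return 235.215 / value
    if unit == "km_per_l":
        return 100.0 / value
    raise ValueError(f"unknown fuel-economy unit: {unit}")

# A vehicle reporting 6.5 L/100 km and another reporting 36 US mpg become
# directly comparable after normalization.
print(to_l_per_100km(6.5, "l_per_100km"))  # 6.5
print(to_l_per_100km(36.0, "mpg_us"))      # ~6.53
```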