Q&A: How fire departments can predict their needs
Many private sector businesses are taking complex data and distilling it into an accurate predictor of future needs; it's something the fire service may embrace
Chief William Hyde is the deputy fire chief of special operations, training and emergency medical services for the Rogers (Ark.) Fire Department. He’s well educated with, among other designations, an MBA, EFO and CFO under his belt. He serves as the International Association of Fire Chiefs’ director of the Executive Fire Officers Section.
He’s also keenly aware of the potential big data holds for the fire service and how that’s being applied to improve efficiency and service delivery in other industries. He sat down with Fire Chief to discuss that potential.
Fire Chief: What technologies not being used by the fire service have the greatest potential for adaptation?
Chief Hyde: What is on the forefront for us as an industry is the use of predictive analytics. Right now, it is more for the business environment. But there’s a lot of potential in that domain for fire and emergency services to harness big data extraction and analysis so that we can make better local, regional, state and, to some extent, national-level decisions.
What are the barriers?
Cost. This type of data collection and data diving is focused on private corporations right now, and that’s because of the budgets they have. Whether you are in an economy that’s growing and healthy or if you are in an economy that still is not to that point, you’re still limited by the public dollar, and we want to be good stewards of that without a doubt.
[Equipment, staffing and station placement] has the potential to have a very solid answer with predictive analytics, but I don’t know that we’ll ever be able to get there without some good solid funding. That’s not me pleading for money for it; it is just the realization of having to divert funds from real assets to something that may make you have a more efficient use of assets.
Is this a long-term vs. short-term problem with how budgets are set?
The challenge is you can’t get to the long term if you can’t stifle some of the short-term immediate needs. For so many communities it is just not practical for them to divert anything away from their immediate needs. If you look at the potential for analytics, in some way, you’d have to sideline some immediate short-term budget to invest in the future.
That’s not a role local government will typically play. The higher up in government you go, the more investment they can make in those avenues. But conversely, the real benefit comes in the opposite form in that local units may actually benefit more from it.
Can this be created once as a template and applied broadly or does it have to be reinvented each time?
I don’t think there’s a reinvention, but at the same time I don’t see it being a template, nor would it be static. If I make a dozen orders a month through Amazon, in a short period of time, Amazon is going to have enough data on my ordering habits to predict when I may need my next batch of household cleaners.
If you were to apply that to a community in predicting what resources will be needed and where they will be needed in an emergency fashion, [given] a certain margin of error, that allows us to be much more efficient. That wouldn’t be a template that goes across the borders, but at the same time it won’t have to be completely reinvented in each locale.
It could not be static, because the needs would fluctuate as the needs of the population changed and as the population itself grows or diminishes.
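To make the idea concrete, here is a minimal, hypothetical sketch of the kind of continuously updated prediction the chief describes: an exponentially weighted moving average over monthly call counts that keeps adapting as new data arrives. The district names and counts are invented for illustration; a real deployment would use far richer data and models.

```python
# Hypothetical sketch: forecast next month's call volume per district with
# an exponentially weighted moving average. The forecast is never static;
# each new month's observed count shifts it. All numbers are invented.

def update_forecast(forecast, observed, alpha=0.3):
    """Blend the newest observation into the running forecast."""
    if forecast is None:
        return observed
    return alpha * observed + (1 - alpha) * forecast

# Monthly call counts per district (fabricated data).
history = {
    "north": [112, 120, 118, 131, 140],
    "south": [64, 61, 70, 68, 75],
}

forecasts = {}
for district, counts in history.items():
    f = None
    for c in counts:          # replay history; new months would extend this
        f = update_forecast(f, c)
    forecasts[district] = round(f, 1)

print(forecasts)
```

The point of the sketch is the loop, not the model: because every new month's data updates the forecast, the output tracks a growing or shrinking population instead of freezing an initial finding in place.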
So you need continuous data input to verify or change initial findings?
Absolutely, and there are current and pending technologies that are going to assist with all of that. A lot of the data we collect right now is input manually by people. Most of it is based on what happens at our responses.
Think about 10 years ago when OnStar was a pretty new advent for vehicles. Now, OnStar or some similar internet-linked system is present in almost every car manufactured. And that kind of data is automatically dumped into a storehouse for each of those manufacturers where they can learn about what the driving behaviors are and how the car behaves. There are a lot of benefits for the auto manufacturer, and they are not having to manually input that.
I’m not clairvoyant on this, I just see a lot of promise.
There’s always going to have to be some element of manual input. Manual data input is only as good as the attitude of the person putting it in. If they completely comprehend how important that data is, they will have 100 percent attention to accuracy. If we’re being realistic, that’s not the case in all data entry, especially the information regarding incidents and calls for service.
The intriguing thing to me about auto data input, say from our fleet of fire trucks and ambulances, is this: if they are already populating data into our manufacturers’ databases, why couldn’t we capture that data? The data already exists, so why couldn’t we capture driving behaviors and consumption issues if we were trying to optimize fuel economy?
What we currently have to do is put another system in our fleet so we can get that data. In the short- and mid-term, we are going to see a proliferation of more devices that are capable of collecting those types of things.
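As a hypothetical illustration of what capturing that existing telemetry could look like, the sketch below aggregates invented fuel and mileage records per vehicle into a fuel-economy figure. The unit names and readings are fabricated; real telematics feeds would arrive in a manufacturer-specific format.

```python
# Hypothetical sketch: mine auto-collected fleet telemetry for fuel economy.
# Each record is one trip's odometer and fuel reading; unit IDs and values
# are invented for illustration.

records = [
    {"unit": "engine-1", "miles": 14.2, "gallons": 4.1},
    {"unit": "engine-1", "miles": 9.8, "gallons": 3.0},
    {"unit": "medic-3", "miles": 22.5, "gallons": 2.3},
]

# Sum miles and gallons per vehicle.
totals = {}
for r in records:
    miles, gallons = totals.get(r["unit"], (0.0, 0.0))
    totals[r["unit"]] = (miles + r["miles"], gallons + r["gallons"])

# Miles per gallon per vehicle, rounded for reporting.
mpg = {unit: round(m / g, 2) for unit, (m, g) in totals.items()}
print(mpg)
```

Even a simple roll-up like this would let a fleet manager spot a unit whose consumption drifts over time, without anyone keying in data by hand.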
What will that technology do to the cost of apparatus or fire stations?
Well, the Internet of Things is what gives me a glimmer [of insight] into this. I hate to say that it is certainly going to increase costs; I’m hesitant to even believe that. Our way of thinking over the last decade has been that more automation and more features are supposed to increase costs.
But what I think we are actually seeing is that automation and data collection doesn’t increase the cost so much as it allows businesses that use it to increase their efficiency, which drives down the cost in the end. The intuition might be that it increases cost, but through efficiencies you save in the long term.
If we looked at a heart monitor 20 years ago and were to forecast two decades later what capabilities they would have, I don’t think you could have predicted it. It’s really space-age stuff that we’re sitting in the middle of now and I don’t know that anybody could have completely predicted 20 years ago.
In 1998, I held my first thermal imaging camera. That thing weighed about 6 pounds, and after about 10 minutes of carrying it, I was tired and it was always in the way. That thing was $14,000. Now, I have one that I carry in my pocket and it hooks into my phone, and it cost $250. It’s the same technology, just not ruggedized for interior firefighting.
What private-sector industry is getting this right?
Because I live and work in Rogers, Ark., we are super familiar with Wal-Mart and its entrance in direct competition with Amazon in home delivery and ordering online. That’s a glaring example on a very large scale. You are talking about companies that have hundreds of millions of customers per day and they are able to personalize it down to the individual level.
If a corporation can be that large and focus that small, I certainly see how, in a mid-range timeline, we could find benefit in the fire and emergency services industry doing the same, from a national perspective down to a specific locale.
How far off is that?
Farther than we should be (laughing). There are tremendous minds not just in the response element of what we do, but in the research element that are working on so many aspects of what our industry manages. The talent is there.
I couldn’t measure in a timeline how long until we get there because I don’t know of any segment focused on this component of it. It may happen as a result of other applications coming into our business, which seems to be historically how we adopt things to the emergency services. It works well for someone else, we bring it in and try it out in a little element and it grows.
Are we waiting for a guinea pig fire department to chart the way?
There are a lot of departments that are willing to step into that role today. I can come up with probably 20 departments where it would be great to do something like this. The problem for any of those that might want to be on the leading edge is that there’s going to be a little bit of a price to pay.
There’s also a relationship that has to be forged with the private component of it: the analytics companies that have the computing power and the know-how to correctly collect and analyze the data.
The way I like to look at it is this: two or three decades ago, the predominant number of fire engines came off the assembly line for general purposes, and secondary manufacturers would modify them into fire engines.
Over time, you’ve seen this relationship where manufacturers really customize from frame to top every bit of a fire engine and ambulance. It was that corporate element that really spawned these vehicles being highly customized to what we do. While we’re not talking about a tangible product, it is that private partnership and getting the right companies in to understand what we need to do and how we can do it better.
How much appetite for failure will these early adopters need?
Failure, especially in an early adoptive phase, is expected. I wouldn’t expect any of this to build up, come into use and fall flat on its face. I would expect some failure elements in the entire system. It’s not perfect, and that’s what the early adopter phase is for.
Every new wave is going to have some element of failure, but an element of failure does not equal failure as a whole. Any of those who are going to be early adopters realize that it is not going to be perfect. Anybody who’s waiting for perfection in a system will be the late adopters.