Case Study: The ROI of Digital Video Surveillance

Allen Rude, security manager at Intel, invested more than four years in an ROI study to justify the cost of digital video surveillance

Way back in 2000, digital video recorders (DVRs) were the next big thing in surveillance, and Intel Security Manager Allen Rude, who had seen some at a trade show, knew it. But as a veteran security executive, Rude also knew something else: Even if the value seemed obvious, it wouldn't be easy to prove.

So, with management on one side of Rude wondering aloud why they'd ever spend money replacing a perfectly good CCTV system, and with eager vendors on the other side wondering aloud how he could not see the benefits of digital, Rude started an ROI study.

"We wanted to jump on it, but we're always challenged to show why it's better than what we've got, in a tangible way," says Rude. "ROI is extremely frustrating, but it's also a reality. If you don't do it, you will not get past the finance hurdle. It's that simple. And if you don't do it centrally, it becomes a game of which department is best at procuring a little bit of funding rather than are we doing the right thing. So managing ROI from the corporate level is key too."

It wasn't until earlier this year that Rude finally proved a positive ROI on the DVR technology that wowed him and his colleagues half a decade ago. Five years of expected and unexpected challenges later, Rude shares his plan and how he and his team brought the next big thing in surveillance to Intel, eventually.

Phase 1: 2000 to 2001—Technology Identification and Benchmarking

To start, Rude culled expertise. He talked to members of Intel's Joint Engineering Technical Team (JETT), which documents and implements new technologies and also does Intel's security construction projects. The JETT team subsequently established a digital technology subgroup, which managed site projects and data gathering efforts for the ROI study. Rude also talked to the purchasing folks at integration companies. He researched the technology at trade shows and in trade journals. Finally, he built a list of DVR vendors and then pared this down to a shortlist, based on his newfound expertise. The vendors that made the shortlist had to fit three criteria:

1. Have good technology and a technology plan (adopting standards and so on)

2. Be geographically able to support Intel's manifold facilities

3. Have long-term viability as a company

Rude paid special attention to number three, "especially in surveillance," he says, "where at every trade show there are 20 new vendors and 15 from last year's show are already gone."

Shortlist in hand, Rude got some DVR systems from those vendors and called Intel Architecture Labs. He had the lab benchmark the DVRs versus the time-lapse VCRs he had deployed. This benchmarking, while rigorous, was purely technical. Rude wasn't interested in the real-world issues that would affect performance. He only wanted a lab-controlled horsepower comparison to see if the new technology, in and of itself, was better than the old. And it was.

Phase 2: 2001 to 2003—Pilot Systems and Productivity Benchmarking

Next, Rude added the real world to the mix. Ideally when he starts a pilot, Rude wants to visit other sites that have adopted the technology already and learn from them so that the pilot program will avoid simple mistakes that could derail the project. "It was difficult finding enterprise DVR customers at that time," Rude recalls. "But the gaming industry was adopting it. So we visited a lot of casinos."

Rude also had to choose between a greenfield and a legacy setting for the pilot. Testing the new system in new buildings would be easier; retrofitting older facilities would make deployment harder. Still, he chose older facilities rather than a greenfield, for two reasons: One, it made a more even comparison with the time-lapse VCR systems he had in other old facilities, and two, if his pilot proved the value of DVR technology there, he'd have done the hardest work already.

Rude had 808 cameras connected to dozens of DVRs at four sites, which were chosen for reasons both practical and arbitrary. Mainly, he wanted their sizes to roughly match the sizes of the sites with VCR systems, to which he would compare them.

Rude's team then began collecting the data for his productivity benchmark, starting with a calculation for each site. He based his metric on the main recurring expense of video surveillance (capital expense would come up in Phase 3): man-hours spent retrieving and reviewing footage.

Once he had this figure for each site, he averaged all the sites of each type (DVR or VCR) together to get one average monthly viewing time per camera.

The per-camera-per-month numbers, however, were small, hard to apply and not very useful, because most deployments for Intel would number in the hundreds of cameras. So Rude made his data more real-world and executive-friendly by calculating the viewing time for 100 cameras over the course of a full year: He took his per-camera-per-month number and multiplied it by 100 (for the camera total) and then by 12 (for a full year).

This gave him the time staff spent looking at tape over a full year at a small Intel site (100 cameras), a figure that could be easily adjusted for larger sites, those with, say, 500 cameras.

Still, dollars are what matter. Rude had to come up with an expense for all of this camera viewing. To do that he made one final determination: cost of an event per hour. He chose $50 per hour as the cost of dealing with video surveillance events. While it wasn't a perfectly scientific number, it wasn't a capricious one either. To arrive at the figure, Rude factored in the cost of paying staffers to go get the video, the cost of devoting investigation time and energy to an event (thus taking those away from other jobs), the cost of any related investigations coming out of the event, and other factors.

By multiplying his annual hours of viewing time for 100 cameras by $50 per hour, Rude arrived at his productivity benchmark: a full 33 percent savings by using digital video surveillance.
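
To make the arithmetic concrete, here is a minimal sketch of the benchmark in Python. The per-camera viewing hours in it are hypothetical placeholders, since the article reports only the final percentage; the $50 hourly rate and the 100-camera, 12-month scaling are Rude's, as described above.

```python
# A sketch of Rude's productivity benchmark. The per-camera viewing hours
# below are hypothetical placeholders (the article reports only the end
# result); the $50 hourly rate and the 100-camera, 12-month scaling come
# from the method described above.

HOURLY_COST = 50   # Rude's cost of dealing with surveillance events, per hour
CAMERAS = 100      # a small Intel site
MONTHS = 12        # scale to a full year

# Hypothetical average monthly viewing hours per camera, one value per site.
vcr_sites = [0.55, 0.60, 0.65, 0.60]   # time-lapse VCR sites
dvr_sites = [0.38, 0.42, 0.40, 0.40]   # DVR pilot sites

def annual_cost(per_camera_monthly_hours):
    """Average the sites, scale to 100 cameras over 12 months, then cost it."""
    avg = sum(per_camera_monthly_hours) / len(per_camera_monthly_hours)
    return avg * CAMERAS * MONTHS * HOURLY_COST

vcr = annual_cost(vcr_sites)   # 0.60 hrs x 100 x 12 x $50 = $36,000
dvr = annual_cost(dvr_sites)   # 0.40 hrs x 100 x 12 x $50 = $24,000
savings = (vcr - dvr) / vcr

print(f"VCR sites: ${vcr:,.0f} per year")
print(f"DVR sites: ${dvr:,.0f} per year")
print(f"Savings:   {savings:.0%}")     # ~33% with these placeholder hours
```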

It's important to note that life expectancy of systems, maintenance and, most important, the capital expense of installing a new system were not factored in here. Rude's pilot was done to prove productivity gains once the system was installed. Now came the hard part for Rude: He had to prove that the large-scale capital investment and associated costs of maintaining digital video surveillance would save his company money on the bottom line.

Diversion: 2001 to 2004—The Battle with IT

One of Rude's greatest challenges as he sought to prove digital video surveillance's worth was remaking the relationship between his security group and the information technology department. Rude says it took three full years to get the two teams to support each other on this project, and to build what is now an excellent relationship. Still, recalling the early days of his DVR project, Rude sounds frustrated.

"The network for security was a 10Mbps shared network that we owned," says Rude. "We had to plan for a 100Mbps switched network for digital video surveillance. We spent years just getting IT to take ownership of the security network."

Another key point of contention was that the network upgrade would have to stay out of the DVR project budget, meaning IT would have to fund it. IT was having none of it, because (Rude eventually discovered) they thought the security group wanted to stream live video from every camera on the network. Nearly a dozen meetings over three years, and still no buy-in from IT. So Rude changed tack.

Instead of asking IT to support DVR surveillance, as he had tried and failed to do for years, Rude played it coy. He simply asked the IT department to benchmark some new technology and applications that, he mentioned with an air of inevitability, were landing on IT's network whether or not IT supported the project.

"I told them, 'We'd prefer it if you did support it, but the project's going forward, so you need to tell us what the bandwidth consumption will be,'" Rude recalls.

IT capitulated to the benchmarking, and that was the turning point. They were surprised to find that the bandwidth consumption wasn't as absurd as they assumed it would be. "There was this fear of the unknown, this assumption about what video would do to the network," says Rude. "They began to see it was just another app, and they said, 'Oh, we can absorb this.'

"I've learned what you present to them and what they hear are two different things," Rude continues. "We weren't talking about putting cameras with live feeds streaming over the network. We were talking about DVRs, data collection devices. But all they heard was, 'We're going to put cameras on the network.' I just had to control the message."

Though it took far longer than he anticipated, Rude and the IT department now have a good working relationship, and have started some other video surveillance benchmarking projects together.

Phase 3: 2003 to 2005—Hard ROI

The pilot went well, but Rude knew he still had a major problem in proving ROI: capital investment. Though the DVR systems performed better, they also cost more. For a long while, Rude couldn't get the numbers to a point where the return in performance made up for the pool of capital required to get the digital system up and running.

"Even with better technology and performance, we've got to be able to save money before they let us do it," says Rude. "And that was the hardest part. It took over two years, working with suppliers and, frankly, waiting for prices to come down."

Rude says that many of the small vendors in the digital video surveillance space haven't reached commodity stage yet, meaning they're still trying to recoup R&D dollars. That means higher prices. But price wasn't the only problem he faced.

Rude laughs when he recalls how many vendors would downplay the price of computers when working with him on a way to make their systems economically viable. "They had this idea that we were Intel, and we could just go grab computers in the back room for free or something." They also never took into account the cost of operating and maintaining the equipment, only the cost of buying it. Other vendors simply discounted Rude's entire ROI exercise. "They'd say, 'You're Intel. You've got so much money. Just buy the stuff.'"

Rude also met resistance from his own executives, many of whom thought a camera's a camera's a camera. Rude had to show how the quality of images drastically improved from analog to digital and, what's more, could be tuned on the fly so that the camera kicks into high resolution during an event.

His point was made for him several times over when Intel was able to catch bad guys and resolve incidents based on high-quality visual evidence stored in the DVRs—visuals the old system couldn't have captured.

Finally, the system's worth was showing itself, and the market was cooperating, too, as equipment prices slowly, surely declined. "We've finally, just now, gotten to the point where we can show a break-even, and maybe even a slightly positive ROI," says Rude. "It was a massive effort, but we got funded."

Postscript: DVRs on the Edge

A new trend in digital video surveillance puts a DVR out on the edge. Rude says the benefits look tremendous. "Rather than running fiber from the cameras back to a cluster of PCs with video boards connected to a big DVR array, the DVR sits out there with the camera. So if you lose a DVR, you lose only one camera. We can also use off-the-shelf IT equipment with this new generation." Of course, Rude says, "We've got to see if it saves us money too." Just as he's finishing deploying DVRs, he's starting the ROI process again with the next next big thing.

