Blog Archives

Posts Tagged ‘video analytics’

 
Thursday, September 10th, 2009
by Gary Myers

Since my post last week, I’ve received a number of questions and comments, so I thought I’d address them in a post (hopefully for everyone’s benefit).

As Steve Mitchell points out, the end-user definitely has an impact on the performance of the analytics, and training is needed. He drew an analogy last week about pilots and airplanes. Stretching this pilot/plane analogy a bit, the end-users are the pilots and OV is a part of the airplane (with the whole plane being the end product delivered to market by our partners). OV builds a component that works in many different planes, and our responsibility is to make sure it performs in a wide variety of settings. Our OEM partners deliver the complete plane, which includes working with the pilots (users) to understand how to operate the plane (product) most effectively.

As part of the release process, we qualify our software in several ways:

  • Science testing to validate that the newest release is at least as good as, if not better than, prior releases. These automated tests utilize thousands of hours of video and corresponding rules to approximate real-world scenarios, and the results are compared against baseline results using the metrics described in my previous post (see the sketch after this list).
  • Product testing to ensure that the whole product works end-to-end, including manual testing to approximate the end-user experience.
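To make that baseline comparison concrete, here is a minimal sketch of how an automated regression check of this kind might look. The metric names, data structures, and the specific numbers are hypothetical illustrations, not ObjectVideo’s actual test harness or results.

```python
from dataclasses import dataclass


@dataclass
class RunMetrics:
    """Aggregate results for one build over a fixed set of test videos and rules."""
    detected_events: int  # expected events that were correctly detected
    false_events: int     # reported events that did not actually occur
    missed_events: int    # expected events that were never reported


def regression_check(candidate: RunMetrics, baseline: RunMetrics) -> bool:
    """Return True if the candidate build is at least as good as the baseline:
    no fewer correct detections, and no more false or missed events."""
    return (candidate.detected_events >= baseline.detected_events
            and candidate.false_events <= baseline.false_events
            and candidate.missed_events <= baseline.missed_events)


# Hypothetical numbers for illustration only.
baseline = RunMetrics(detected_events=960, false_events=40, missed_events=25)
candidate = RunMetrics(detected_events=972, false_events=35, missed_events=21)
print(regression_check(candidate, baseline))  # True: candidate meets the bar
```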

ObjectVideo focuses on testing our software for release to our partners. Different partners focus on different areas so our partners are in the best position to provide the performance criteria to the end market based upon their own test methodologies, results and sales programs. In this way, they can effectively support their analytics-enabled products and know, as well, that those products are meeting the needs of their customers.

 
Thursday, September 3rd, 2009
by Gary Myers

Since OV is the leader in the industry, we get asked a lot about analytics performance.  This can be hard to quantify as there are a lot of contributing factors. In general, accurate event detection is affected by some combination of camera angle, camera placement, lighting conditions, other environmental factors and system configuration. The goal when deploying and configuring an analytics-enabled system is to strike the proper balance between being too sensitive (causing false events) and not sensitive enough (causing missed events).

Over the years of building and testing our software, we’ve focused on three primary criteria when determining performance metrics: the number of detected events, false events and missed events. The ideal case is to detect all expected events while keeping the numbers of false and missed events low. If you catch all the expected events but still generate a lot of false ones, we would consider performance low because there will be too many nuisance events. Likewise with missed events: miss too many and overall user confidence goes down.
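As an illustration of how these three counts relate, here is a minimal sketch that scores a detector’s output against ground truth. Matching is simplified to exact event IDs for brevity; real scoring would match events by time and location, and none of the names or numbers below reflect ObjectVideo’s internal tooling.

```python
def score_events(expected: set[str], reported: set[str]) -> dict[str, int]:
    """Compare reported events to the expected (ground-truth) events.

    Returns the three counts discussed above:
      detected - expected events that were reported
      false    - reported events with no matching expected event
      missed   - expected events that were never reported
    """
    return {
        "detected": len(expected & reported),
        "false": len(reported - expected),
        "missed": len(expected - reported),
    }


# Hypothetical example: one perimeter rule over one test video.
expected = {"intrusion-01", "intrusion-02", "intrusion-03"}
reported = {"intrusion-01", "intrusion-03", "shadow-artifact-07"}
print(score_events(expected, reported))
# {'detected': 2, 'false': 1, 'missed': 1}
```

Tuning sensitivity moves events between these buckets: a more sensitive configuration tends to shrink the missed count while growing the false count, and vice versa, which is the balance described above.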

In future posts, I’ll cover some ways to improve effectiveness, either through camera setup or system adjustments, to enable the user to get the most from their investment in analytics.

 
Wednesday, August 12th, 2009
by Brian Baker

“Video analytics don’t work.”

I’m still tired of hearing this.

Video analytics, and software as a whole, require a different approach – regardless of the installation platform. Many in our industry correlate the need for trained users and the need to configure the analytics with the notion that analytics, as a whole, are immature and unreliable. Nothing could be further from the truth. This thinking highlights the resistance to change that exists within the security industry.

I suspect companies like SAP and Oracle have a different take on products that require training and configuration. These companies make complex and feature-rich software products, yet they wouldn’t imagine a customer deploying their stuff without key people going through product-specific configuration and user training. Don’t tell me SAP and Oracle are immature products that haven’t yet hit their sweet spot in the market. And what about Adobe products? People who are trained and certified to use the Adobe Creative Suite get paid big bucks!

Software is like that. It is made of bits and bytes. You can’t hold it in your hand and turn the dial or push the button. If you take the time necessary to learn how to configure and use it, the real value will show through the hype.