by Gary Myers
Since OV is the leader in the industry, we get asked a lot about analytics performance. Performance can be hard to quantify because many factors contribute: in general, accurate event detection depends on some combination of camera angle, camera placement, lighting conditions, other environmental factors, and system configuration. The goal when deploying and configuring an analytics-enabled system is to strike the right balance between being too sensitive (causing false events) and not sensitive enough (causing missed events).
Over the years of building and testing our software, we’ve focused on three primary criteria when determining performance metrics: the number of detected events, false events, and missed events. The ideal case is to detect all expected events while keeping false and missed events low. If you catch all the expected events but still generate a lot of false ones, we would consider performance low, since users will face too many nuisance alerts. The same goes for missed events: miss too many, and overall user confidence goes down.
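One common way to fold those three counts into a single pair of numbers is precision (what fraction of alerts were real) and recall (what fraction of real events were caught). The sketch below is illustrative only, assuming simple integer counts from a test run; the function name and the choice of metrics are mine, not part of OV's testing methodology:

```python
def event_metrics(detected, false_events, missed_events):
    """Summarize analytics test results from three counts (hypothetical helper).

    detected      -- expected events the system correctly detected
    false_events  -- nuisance alerts with no real event behind them
    missed_events -- expected events the system failed to detect
    """
    expected = detected + missed_events   # ground-truth events in the test
    raised = detected + false_events      # everything the system alerted on
    precision = detected / raised if raised else 0.0
    recall = detected / expected if expected else 0.0
    return {"precision": precision, "recall": recall}

# Example: 95 of 100 real events caught, but 20 nuisance alerts raised.
result = event_metrics(detected=95, false_events=20, missed_events=5)
```

Here recall is high (95/100) but precision suffers (95/115), which matches the intuition above: catching everything is not enough if nuisance alerts erode user confidence.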
In future posts, I’ll cover some ways to improve effectiveness, either through camera setup or system adjustments, to enable the user to get the most from their investment in analytics.