NCSC – Combining Surveillance and Session Analysis

Update 05/13/15: I made two changes to this article. First, I decided to use the term “session” instead of “episode”. I mean the same thing, but session seems to flow better. Second, I fixed some typos and rewrote the conclusion to make it clear that this article is designed to explain the initial problem space, but is not meant to be negative about the ultimate system performance. 

A lot has happened since my last blog article. For example, we assembled a video recording and cloud publishing system for the National Collegiate Sales Competition at Kennesaw State University, April 7-10. It was a long journey, with lots of technology twists and turns. I will write follow-up articles that examine specific aspects of the technical problems and solutions, but this article steps back to examine the conceptual distinctions that formed the heart of the challenge.

What We Had to Do

We were asked to video record, stream and publish approximately 350 videos of sales role plays, each about 20 minutes long. The videos were recorded simultaneously in 9 different rooms. The role plays were done by student competitors who had to give their sales pitch to a corporate buyer. The video shows Ashton Carter (the National Champion from the University of Georgia) at work. Pretty impressive!

Each session was judged by real sales managers who watched it on live video in nearby rooms. Independently, the videos for each 20-minute session were recorded, transcoded and uploaded to a cloud video-sharing site (a sort of private YouTube) … with associated labeling and meta-tagging. Finally, the competition’s corporate sponsors could watch the uploaded videos through a web browser within minutes of each session. The sponsors used this information to target the competitors that they wanted to bring in for job interviews.

If you need a quick point of comparison … think of the NFL Combine.

The Technology We Used

We were operating under a very tight budget. To make everything work, we assembled a system with two primary sources of technology:

  • Bosch Security Video System – We bought 9 Bosch Dinion 5000 HD IP cameras and a Bosch Divar IP 3000 Video Management System. We were very pleased with the quality, performance, reliability and ease of management that this system provided. I am sure there are equally good video surveillance systems, but this one was stellar and easy to work with. The cost for the server and 9 cameras was less than $8000. Configuring the system took some time because the cameras and the IP 3000 have an amazing number of options and features. Ultimately, the Bosch system recorded every second of HD video in the 9 rooms, with the ability to go back and review anything that occurred.
    Bosch was the system that guaranteed that every session was recorded and every dispute could be resolved by “watching the film”.
  • Dartfish Video Software – As previous blog articles have noted, Dartfish (www.dartfish.com) makes software that is designed to capture, analyze, tag and publish video from sports events. While the students weren’t doing handsprings, they were performing for the judges … and their coaches and corporate recruiters wanted to “review the films”. The Dartfish software has a worldwide reputation as the gold standard for these functions … and it contains all of the tools to make this happen.
    Dartfish was the system that captured, labeled and published each competitor’s session on the web.

Sounds easy, right?  Well … no.

All of this excellent hardware and software worked as well as or better than advertised … yet at every turn we encountered some small gap or mismatch that killed a promising idea. It took many (over a hundred) hours of testing, configuring, retesting, and reconfiguring to finally find one lonely configuration that would work reliably. That configuration ultimately worked, and worked well … but it was very tense almost up to the day of the competition.

Which leads to the question and the point of this article. With two excellent and seemingly complementary software technologies, why was it so hard?

What Did We Learn?

There are lots of little reasons why the systems did not mate up well: a file format difference here, a slow export there. However, I am convinced that our experience exposed a fundamental difference in the way people think about recording video of ongoing activities. The surveillance industry (Bosch) thinks about it one way and the performance analysis industry (Dartfish) thinks about it another.

The surveillance industry wants complete historical recordings and a way to flag (e.g., alarm) or tag (i.e., bookmark) interesting historical events. The industry has little interest in pre-defined sessions. Why would it? It only wants to see the unexpected events (a robbery, a disturbance) that merit review. It also wants the historical record to be tamper-proof for possible use as evidence. The entire mindset is historical and event-driven.
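As a point of contrast with the session model described next, here is a tiny Python sketch of the surveillance mindset: recording is continuous, and interesting moments are simply bookmarks (alarms, flags) laid on top of a timeline that never stops. This is illustrative only, not any vendor's actual API.

```python
from datetime import datetime
from typing import List, Tuple

class ContinuousRecording:
    """Hypothetical surveillance-style model: one unbroken recording per camera,
    with bookmarks (alarm or flag events) pointing into the historical timeline."""

    def __init__(self, camera_id: str) -> None:
        self.camera_id = camera_id
        self.started = datetime.now()          # recording never stops on purpose
        self.bookmarks: List[Tuple[datetime, str]] = []

    def bookmark(self, note: str) -> None:
        """Flag an interesting moment (e.g., a disturbance) for later review."""
        self.bookmarks.append((datetime.now(), note))

# The continuous record itself is the product; bookmarks only say where to look.
rec = ContinuousRecording("room-3")
rec.bookmark("disturbance near the entrance")
```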

The performance analysis industry, by contrast, cares little for continuous recording. It needs detailed recordings of the sessions (e.g., games) that will be reviewed and studied. The coach doesn’t want to review video of an empty stadium. The coach wants to see videos of the games and practice sessions … past and future. The schedules of these sessions are roughly known in advance, but exact times may change. There may be a delay. A practice may be moved up. It’s not possible to use a timer. Someone or something (a button or an alarm) must trigger the start and stop to match the actual times of each session.
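The performance-analysis counterpart looks quite different. Here is a minimal Python sketch of that trigger-driven model; it is purely illustrative (not Dartfish's actual API): capture is bounded by explicit start and stop triggers from an operator or a cue in the room, rather than by a clock or a motion alarm.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import List, Optional

@dataclass
class SessionClip:
    """One competitor's session: a labeled start/stop span of recorded video."""
    label: str
    start: datetime
    stop: Optional[datetime] = None

class SessionRecorder:
    """Hypothetical session-driven recorder: recording begins and ends on
    explicit triggers, so each clip matches the actual times of the session."""

    def __init__(self) -> None:
        self.current: Optional[SessionClip] = None
        self.completed: List[SessionClip] = []

    def start_session(self, label: str) -> None:
        if self.current is not None:
            raise RuntimeError("A session is already being recorded")
        self.current = SessionClip(label=label, start=datetime.now())

    def stop_session(self) -> SessionClip:
        if self.current is None:
            raise RuntimeError("No session in progress")
        self.current.stop = datetime.now()
        clip = self.current
        self.completed.append(clip)
        self.current = None
        return clip  # ready for tagging and upload

# Example: an operator triggers the start and stop of one role play.
recorder = SessionRecorder()
recorder.start_session("Room 3 - Competitor 117")
# ... 20 minutes of role play ...
clip = recorder.stop_session()
```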

As shown below, the NCSC competition and most similar competitions require that the two perspectives be combined in one system. The historical recording is essential as a backup and “master record” of the competition. In the event of a problem or dispute, there must be some way that officials can see what actually occurred. However, the surveillance function is arguably secondary to the main task of correctly recording the actual individual competitive sessions for review, tagging and immediate publication.

The Problem Was Performance

Our problem was that the surveillance and performance analysis industries focus on different aspects … and they don’t communicate very well. Each technology does a great job of looking at a different aspect of the video record … and neither technology nor industry sees a reason to mesh with the other perspective. So neither industry has invested in the software tools and bridges to make the combination work well.

That’s not to say that the surveillance and performance analysis software can’t work together. For normal, day-to-day classroom exercises (equivalent to recording practice sessions in sports), there are several ways to configure the Bosch and Dartfish systems to work in harmony. You can use the surveillance system to record all of the action, then select and export the sessions that you want for detailed study. Alternatively, you can use the surveillance system as a backup and camera management tool and use Dartfish to record sessions as needed. In both cases, the raw session videos would be opened in Dartfish for meta-tagging and upload to the Dartfish.TV cloud video platform.
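As a rough illustration of the first workflow (pull each session’s span out of the continuous recording, then tag and upload it), here is a short Python sketch. The function names, paths and session details are placeholders of my own; they are not actual Bosch or Dartfish API calls.

```python
from datetime import datetime

# Placeholder stubs: the real export, tagging and upload steps would go through
# the vendors' own tools; these names and paths are illustrative only.
def export_span_from_surveillance(camera_id: str, start: datetime, stop: datetime) -> str:
    """Pull one session's time span out of the continuous recording; return a clip path."""
    return f"/exports/{camera_id}_{start:%H%M}-{stop:%H%M}.mp4"

def tag_and_upload(clip_path: str, metadata: dict) -> None:
    """Attach competitor/room/round metadata and publish the clip to the cloud platform."""
    print(f"uploading {clip_path} with {metadata}")

# One scheduled session in one room (times and labels are made up).
session = {
    "camera_id": "room-3",
    "start": datetime(2015, 4, 9, 10, 0),
    "stop": datetime(2015, 4, 9, 10, 20),
    "metadata": {"competitor": "117", "room": "3", "round": "2"},
}

clip_path = export_span_from_surveillance(session["camera_id"], session["start"], session["stop"])
tag_and_upload(clip_path, session["metadata"])
```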

The limitations surfaced when we needed to capture 250+ twenty-minute sessions of HD video from nine rooms in one day. The schedule allowed 10 minutes between sessions, and that was not enough time to find and export 9 HD video files from the surveillance system before the next round began. We could have tried to simultaneously read and write the 9 streams from the surveillance server, but the timing was touch and go and a server failure would have been catastrophic. At the same time, the Dartfish software is not designed to do live capture from large numbers of cameras … it handles 2 cameras very well … and everything happened over a standard university LAN in a typical mid-semester state of confusion.
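To put a rough number on that turnaround problem, here is a back-of-the-envelope calculation. The per-camera bitrate (about 8 Mbps) is my assumption for an HD IP stream at a reasonable quality setting; the room count, session length and gap come from the schedule described above.

```python
# Back-of-the-envelope turnaround estimate. The 8 Mbps per-camera bitrate is an
# assumption for HD IP video; rooms, session length and gap are from the schedule.
ROOMS = 9
SESSION_MINUTES = 20
GAP_MINUTES = 10
BITRATE_MBPS = 8  # assumed average HD stream bitrate per camera

per_room_gb = BITRATE_MBPS * SESSION_MINUTES * 60 / 8 / 1000   # ~1.2 GB per session
total_gb = per_room_gb * ROOMS                                  # ~10.8 GB per round
required_mb_per_s = total_gb * 1000 / (GAP_MINUTES * 60)        # ~18 MB/s sustained

print(f"Per-room clip: {per_room_gb:.1f} GB")
print(f"All nine rooms: {total_gb:.1f} GB to locate, export and move")
print(f"Sustained rate needed in the 10-minute gap: {required_mb_per_s:.0f} MB/s")
```

Under those assumptions that is roughly 18 MB/s (about 145 Mbps) of sustained finding, exporting and copying in every 10-minute gap, all day long, over a shared campus LAN; that is why pulling the nine streams off the surveillance server between rounds felt too risky.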

In short, the competition conditions pushed the two systems past their collaborative comfort zone.  I am convinced that the surveillance and performance analysis systems had the intrinsic capacity to handle everything in real time, but the disconnect between their conceptual models meant that their developers never invested in the connections that would have allowed them to share content at full speed.

With 200 student competitors flying in from all over the country, we couldn’t undersize the system and risk a failure in competition … but if we invested in extra software and higher-performance equipment, the system would have been massive overkill for day-to-day teaching. It would also have been more complicated and harder to maintain.

So What Did We Do?

This is a classic bad news/good news story. This article summarized the bad news: it was a really tough system design problem with high stakes, demanding performance criteria and a very tight budget. Ultimately, we ran both technologies in parallel. The surveillance system recorded everything for backup and last-resort problem solving. The Dartfish session and performance analysis system did the heavy lifting of recording the video clips of each competitive session and getting them onto the web quickly, where coaches and recruiters could view them. I will describe and explain our configuration (the good news) in subsequent articles.

 
