I’m reviving this blog post from 2012 as it is relevant to the amount of video and imagery the FBI has had to deal with in support of the manhunt in progress in Boston.
There was a time that intelligence was performed in a vacuum.
I served 14 years in the US Naval Intelligence field, during which I observed and lived the changes brought about by new technologies and methodologies. I was deployed on the USS Nassau, which was the host test bed for a networked intelligence computer system. We played with it and tried to augment our standard methodology of reviewing paper message traffic, consulting encyclopedias and picking the brains of the older, saltier sailors.
That system didn’t perform very well due to communications problems, but it wasn’t considered a failure: it became very apparent that communication pipelines were as important as, if not more important than, the information being transmitted and retrieved.
Fast forward a couple of years, and the computers and data pipelines are now sufficient to support analysis of data held offsite from one’s assigned location. That analysis, however, is still performed in a silo: one has no insight into the data others are analyzing, or into whether one has the most current data. But it was a step forward.
Go forward a few more years to the late 1990s, and the silo is mostly gone: chat and live, online collaboration are a reality. The computers and networks are more than sufficient to support the information analysis task.
Now consider the modern-day intelligence analyst. She has a data pipeline that was at one time considered unnecessary, and her computer may actually be a lightweight terminal acting as a gateway to a cloud-based architecture. The amount of data available at one’s fingertips would probably convince someone such as Orwell that we had actually arrived at his futuristic vision.
Enter the problem. There’s too much data for a single analyst to digest, so it seems that “networked” intelligence may be necessary. In previous blogs I’ve pushed back against the hive-mind and groupthink mentality, but there may be instances where it works. How one exploits the data, though, will drive the effectiveness of the networked intelligence. I’m not suggesting collaboration, as that is already in use. Collaboration is working with known data or toward a known desired goal. Conversely, networked intelligence is performed at the raw-data phase. Imagine pouring five gallons of water into a drinking glass and not stopping when the glass is full. That is exactly what any knowledge worker is up against today, especially an intelligence analyst. Now imagine the drinking glass replaced by a five-gallon bucket; when that bucket fills, it simply passes its contents to the next bucket. Now we need work management, data management, network management and real-time information parsing.
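The bucket analogy maps loosely onto a familiar engineering idea: a bounded buffer that, instead of spilling excess data on the floor, hands it off to a second store. Here is a minimal sketch of that idea in Python; the `BucketBrigade` class and its names are my own illustration, not any system in actual use.

```python
from collections import deque

class BucketBrigade:
    """Bounded buffer that passes overflow to a second store
    instead of dropping data -- the second five-gallon bucket."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.primary = deque()   # the bucket in hand
        self.overflow = deque()  # the bucket waiting behind it

    def pour(self, item):
        # Fill the primary bucket until it is full, then pass
        # everything else along to the overflow bucket.
        if len(self.primary) < self.capacity:
            self.primary.append(item)
        else:
            self.overflow.append(item)

    def drain(self):
        # Process the oldest item; top the primary bucket back up
        # from overflow so data is only deferred, never lost.
        item = self.primary.popleft()
        if self.overflow:
            self.primary.append(self.overflow.popleft())
        return item

brigade = BucketBrigade(capacity=5)
for n in range(12):              # pour more data than the bucket holds
    brigade.pour(n)

print(len(brigade.primary))      # 5 items in hand
print(len(brigade.overflow))     # 7 items deferred
print(brigade.drain())           # 0 -- oldest data is processed first
```

The point of the sketch is that "work management" here is just a deliberate decision about where excess data waits, rather than letting it overflow unmanaged.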
How, though, does one share the workload in a logical manner to ensure redundancy is minimal? Certainly the data analysis cannot be performed serially, as that would be no better than a single analyst working a problem set. The data analysis must be performed in parallel. But that is a problem in itself, as the analysts may not be co-located.
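One common way to split a workload in parallel with minimal redundancy, even among people who are not co-located, is deterministic hash partitioning: every node computes the same assignment from the item itself, so no coordination traffic is needed and no item is worked twice. This is a sketch of the general technique, not a description of any fielded intelligence system; the analyst names and `report-N` identifiers are placeholders.

```python
import hashlib

def assign_analyst(item_id, analysts):
    """Deterministically route an item to exactly one analyst.
    Any node running this function gets the same answer, so the
    split needs no central coordinator."""
    digest = hashlib.sha256(item_id.encode()).hexdigest()
    return analysts[int(digest, 16) % len(analysts)]

analysts = ["alice", "bob", "carol"]          # hypothetical team
items = [f"report-{n}" for n in range(9)]     # hypothetical raw data

workload = {name: [] for name in analysts}
for item in items:
    workload[assign_analyst(item, analysts)].append(item)

# Every item lands in exactly one queue; nothing is duplicated.
assert sum(len(q) for q in workload.values()) == len(items)
```

The trade-off is that a purely deterministic split ignores item difficulty and analyst skill; real work management would layer balancing on top, but the zero-redundancy property is what matters here.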
There is simply too much information for an individual to consume and process. So, then, how does one separate the wheat from the chaff to arrive at the right answer? Conceivably, you cannot arrive at a precise answer. Instead you will arrive at an answer that serves the purpose for that moment in time. The same data may well yield a different answer the next time, even to the same question. It is the timeliness and freshness of the data that matter.
It is human experience that determines the best option for that moment in time, which is something a computer cannot yet do. Sure, we’ve all seen Watson beat contestants on the game show Jeopardy, but there will likely never be any urgency attached to knowing when a king was crowned or assassinated. It is the nuance of data, the contextual feel of data, that is becoming more important, and only a trained analyst can divine that information at this moment in history.
Informed, intelligent decisions are what matter most, whether they come from a disconnected analyst or from a fully connected team. Funneling the mass of available data down to usable information is, and will be, the biggest challenge for the intelligence industry to overcome. This is quite the opposite of the challenge faced by our predecessors, who had little data to work with.
So, what do you think?