A Wall Street Journal story last updated on January 9, 2015 offers a comprehensive account of what happened over three days in the Charlie Hebdo attack and the parallel shooting of a French police officer and hostage standoff at a kosher supermarket, which together left 17 victims dead, three perpetrators killed, and nearly a dozen other people injured, in addition to disrupting the lives of millions of French people. The story continued beyond that as well: the reactions to the incidents, like a massive unity march and the "Je suis Charlie" and "Je suis Ahmed" slogans; the policy initiatives being formulated in response to the attacks; and a raid on a possibly related terrorist cell in Belgium.
The Wall Street Journal story includes details that never made the headlines, such as the fact that text messages from a print shop employee, hiding while the two Charlie Hebdo gunmen holed up there during a police siege, were critical to the police succeeding in killing the gunmen before they could kill anyone else.
The problem is that while good stories like this eventually get written, anyone searching for news as it happens on the Internet faces two obstacles.
First, there are hundreds of reports, and it is hard to distinguish those that recycle the same source and the same information from those that add something new to the coverage. It is easy to find the basic core of information about a big story like the Charlie Hebdo attack, but hard to find the stories that go beyond that core of early news wire reports. It is harder still to find the much later corrections of facts that were inaccurately reported in the early coverage. Even when one source issues a correction (which is often hard enough to find in the original), the many other media outlets that rebroadcast the original story will not note the correction in what is, by the time it comes along, cold news.
Second, it is not always easy to be confident that you have located all of the updates to a developing story that unfolds through multiple related incidents over days, weeks, months, or even years. This is particularly an issue for local news stories with diffuse coverage, which only a handful of media outlets may cover in half a dozen stories over many months, like the developments in a just barely newsworthy local murder case or land use fight.
Even in the case of a big story like the Charlie Hebdo attack, while it is easy to learn, for example, how many people were initially reported to be injured, it is much harder to learn, a week later, how well (or poorly) the injured are recovering, or how they feel about the incidents. That kind of reporting is relatively easy to do with a little elbow grease, and it often is done, but by smaller circulation outlets or much later, buried in the lifestyle section far from the front page.
An ideal service would rapidly assimilate all information relevant to the story (not just from media outlets, but also from sources like social media, blog posts, and public and commercial records), over the entire period of the story's development, including long-after-the-fact follow-up reporting, while purging redundant information and eliminating the time and effort needed to compile it all. The service would also track when you last looked at the story, so it could highlight everything that has happened since you last read about it.
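To make the idea concrete, here is a minimal sketch in Python of the two most mechanical pieces of such a service: purging redundant reports and tracking what a reader has already seen. It is only an illustration under a strong simplifying assumption (that duplicate reports are near-verbatim copies of one another); the names `Report` and `StoryTracker` are hypothetical, not any real product's API.

```python
import hashlib
import time
from dataclasses import dataclass

@dataclass
class Report:
    source: str        # e.g. a media outlet, blog, or social media feed
    text: str          # body of the report
    published: float   # Unix timestamp

class StoryTracker:
    """Accumulates reports on one developing story, purging duplicates
    and highlighting anything added since the reader's last visit."""

    def __init__(self):
        self._reports = []
        self._seen_digests = set()
        self._last_read = 0.0

    @staticmethod
    def _digest(text: str) -> str:
        # Crude dedup key: hash of whitespace/case-normalized text.
        normalized = " ".join(text.lower().split())
        return hashlib.sha256(normalized.encode()).hexdigest()

    def add(self, report: Report) -> bool:
        """Store the report unless it duplicates one already seen."""
        key = self._digest(report.text)
        if key in self._seen_digests:
            return False                 # redundant rebroadcast; purge it
        self._seen_digests.add(key)
        self._reports.append(report)
        return True

    def unread(self) -> list:
        """Everything published since the reader last looked at the story."""
        return [r for r in self._reports if r.published > self._last_read]

    def mark_read(self) -> None:
        self._last_read = time.time()

# Usage: feed in reports as they arrive; unread() surfaces only what's new.
tracker = StoryTracker()
tracker.add(Report("wire service", "Gunmen attack magazine office.", time.time()))
tracker.add(Report("tv rebroadcast", "Gunmen attack magazine office.", time.time()))
tracker.mark_read()
tracker.add(Report("local paper", "Print shop employee's texts aided police.", time.time()))
for r in tracker.unread():
    print(r.source, "-", r.text)
```

In practice the dedup key would have to be fuzzy rather than an exact hash, since rebroadcast wire copy is usually lightly reworded rather than copied verbatim, and that is exactly what makes the problem hard enough that it remains unsolved.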
One could imagine an artificially intelligent computer program with Google-class processing capacity that could do all of that automatically, in real time, for every developing story covered by any media outlet on the Internet.
Systems that do this are commonplace in flashy spy movies set in the present or near future, but in reality they don't exist. The function can be, and sometimes is, carried out, but much more slowly and laboriously, by smart (and expensive) people.