Platt Perspective on Business and Technology

Learnable lessons from Manning, Snowden and inevitable others 5 – the myth of the technically sweet solution, or of the quick procedural fix

Posted in business and convergent technologies, in the News by Timothy Platt on August 30, 2013

This is my sixth posting to date on what is becoming a series of leaks and unauthorized disclosures of classified US government documents that relate to its War on Terror. See:

• John Peter Zenger, Henry L. Stimson, Edward J. Snowden and the challenge of free speech, and
• The first four installments of this series at Ubiquitous Computing and Communications – everywhere all the time 2 as postings 225, 227, 229 and 230.

I have been discussing a complex and far-reaching set of problems and challenges that, I have to add, is largely self-inflicted, arising from the manner in which the US government, and other governments as well, conduct clandestine information gathering and surveillance on members of the civilian population – often without specific cause for gathering information on the many individuals so tracked and monitored. And I begin this posting by citing an old adage that fits our developing situation all too well, as word of surveillance programs such as PRISM surfaces, with their massive and massively indiscriminate collection of phone call logs and personal online information.

• “When the only tool you have is a hammer, everything begins to look like a nail.”

I could easily start developing this posting from there in either of two directions.

• I could write about how the selection of tools and approaches deemed available to conduct a War on Terror have fundamentally shaped our understanding of the problems and challenges that we would resolve through it, or
• I could write about the underlying challenges that our national War on Terror has created for us, through blowback from the programs we develop to conduct it. There, as a specific case in point, I would focus on the developing crisis of trust that disclosure of PRISM and programs like it has created for us.

For the first of those perspectives, I simply note that one of the most searing, soul-searching lessons to come out of the 9/11 attacks was that a relatively large and highly organized team of al-Qaeda terrorists entered this country as tourists and routine visitors, yet left enough of a trail, picked up and entered into records systems, that in principle they could have been identified for who they were before their attacks – if that information could have been brought together and a pattern spotted. I have already written about the practical impossibility of actually doing that in advance of any action on their part (in Part 2 of this series), but that detail does not matter here. The overarching response to this attack, at least on an intelligence systems level, was two-fold:

• Bring all intelligence-gathering, national security-oriented agencies and services together under one roof in the form of a single overarching US Department of Homeland Security, and
• Collect as much information as possible about as many individual people as possible and about as many organizations as possible and add all of that into this idealized universal and (ideally) near omniscient data and knowledge base.

When you start out with those two points as an organizing conceptual framework, much of the rest of what has happened follows more or less automatically. No one knew who the 9/11 attackers were until after the fact; they blended into the general public too well to be identifiable as sources of risk. So look at, monitor, track, and keep detailed data-trail records on everyone, to prevent a repeat of that type of tragedy. And as you exponentially expand the scale of the overall surveillance system and its data management needs, you need to exponentially expand its scale of operations too (see Part 4).

And this brings me back to PRISM as a poster-child example of the consequences of looking to some simple, specially selected tools or organizational quick fixes for a reliable long-term resolution to a complex and rapidly evolving challenge.

• Collect every piece of data on everyone that you can and put all of it into one big organizational box.
• Add some really effective big data tools for searching and filtering from the overwhelming volumes of data and processed information so included, including newly developed tools for searching and organizing the semantic web that might help organize and automate the search.
• Get enough people working on this to track enough possible threat vectors and possibilities all at once and all the time, and
• You will be safe.
• Note how this recipe mixes technology and business-process quick fixes in setting things up for that fourth bullet point – a goal that, even with that combined effort, becomes more elusive than ever.
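As a toy illustration of the "collect everything, then filter" model sketched in the bullet points above, consider the following Python sketch. Everything here is hypothetical: the records, the watch terms, and the flagging rule are invented for illustration, and bear no resemblance to any actual surveillance system. The point is structural: naive keyword filtering over an indiscriminately collected corpus flags matches regardless of context.

```python
# Toy sketch of the "collect everything, then filter" model.
# All data and watch terms below are hypothetical illustrations.

records = [
    {"sender": "alice", "text": "lunch at noon?"},
    {"sender": "bob",   "text": "the attack plan for the chess match"},
    {"sender": "carol", "text": "bomb dot com recipe blog is great"},
    {"sender": "dave",  "text": "confirming flight to Boston"},
]

WATCH_TERMS = {"attack", "bomb"}  # hypothetical filter terms


def flag(record):
    """Flag a record if any watch term appears among its words."""
    words = set(record["text"].lower().split())
    return bool(words & WATCH_TERMS)


flagged = [r["sender"] for r in records if flag(r)]
print(flagged)  # every hit here is innocuous chatter
```

Both flagged messages are harmless, and as the corpus grows, the volume of such false positives grows with it – the "practical impossibility" of spotting the pattern in advance that Part 2 of this series discussed, restated in a few lines of code.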

And this brings me to that second way this situation can be viewed as having taken a "looks like a nail" turn: dealing with the aftermath of word of PRISM's existence getting out.

The fundamental nature of national security threats has been changing at a very rapid pace, both from the evolution and implementation of new attack options such as cyber warfare, and from the development of new vulnerabilities and targets at risk – such as our ever-expanding reliance on cyberspace in our everyday lives.

• When the Japanese attacked Pearl Harbor, there was no internet, and there were no computerized command, control and communications systems in place that they could have targeted in any way.
• Today, any country considering a military assault of that scale would be certain to launch a cyber-attack of at least matching scale as an immediate prelude to it, and during it as well. In fact, the cyber-attack might, and probably would, constitute the entire attack.
• Direct physical attacks are most likely to come from smaller organizations that lack the wherewithal to launch a stealthier and more easily denied cyber-attack – though as technology advances, that becomes a progressively more viable option for progressively smaller and less well funded organizations and groups too, even as a comprehensive and large-scale effort (see Stuxnet and the Democratization of Warfare).

New and emerging vulnerabilities have to be identified, understood and safeguarded against, whether at an individual or family level, a business level, or a community or national level. But any effective response, and certainly any effective proactive response aimed at warding off or limiting the impact of new forms of threat, has to be more systematically organized. I know, in the context of this discussion, that the leadership of Homeland Security, and the Obama administration in general, would argue that their responses and the systems they have developed for safeguarding national security are systematic and well thought out. But I find myself pointing to the fallout from the outing of PRISM to suggest that their systems have been built out and implemented with an enormous systematic gap in their overall approach. We need some fundamentally new tools, and new ways to look at and understand the threat sources that they would be used to address.

I am going to continue this discussion in my next series installment in a few days. Meanwhile, you can find this and related postings at Ubiquitous Computing and Communications – everywhere all the time 2 and in my first Ubiquitous Computing and Communications directory page. I am also listing this under my In the News posting category.

