
Learnable lessons from Manning, Snowden and inevitable others 6 – the myth of the technically sweet solution, or of the quick procedural fix, continued

Posted in business and convergent technologies, in the News by Timothy Platt on September 5, 2013

This is my seventh posting to date on what is becoming a series of leaks and unauthorized disclosures of classified US government documents that relate to its War on Terror. See:

• John Peter Zenger, Henry L. Stimson, Edward J. Snowden and the challenge of free speech, and
• The first five installments to this series at Ubiquitous Computing and Communications – everywhere all the time 2 as postings 225 and loosely following.

I ended Part 5 of this series by noting that the Department of Homeland Security (DHS), and the Obama administration in general, have been pursuing an approach to national security, certainly in the cyber-arena, that suffers from fundamental strategic and overall conceptual gaps. I stated there that I would offer some thoughts on the precise nature of those gaps in this series installment, and that I would at least offer, in general outline, a possible approach for addressing them. But developing events have in effect compelled me to digress from that first. And in this case, this is a digression that addresses developments that I have expected to see hit the news; only their precise timing was unpredictable. To set the stage for what I will discuss here:

• When I first began posting on this developing news story with my John Peter Zenger, Henry L. Stimson, Edward J. Snowden and the challenge of free speech, I already knew that I was writing about a much larger story than had yet come out in the news, even with the ongoing coverage of events revolving around both Bradley Manning and Edward Snowden.
• Focusing here on the Snowden disclosures: he outed one program called PRISM, and that raised a firestorm of protest and concern both within the United States and from our allies. But I was already at least very close to 100% certain that PRISM was only one component of a much wider, closely coordinated and integrated system of comparably scaled programs that the National Security Agency (NSA) and its partner agencies within the DHS are also running.
• I have been writing about the collection of all online data and information that it is technically possible to capture, for addition to US national security databases. And I cited technical capabilities such as semantic web and Web 3.0 tools that would be used for tagging and identifying captured data of any type, and for query-based searching across all of it (see the sketch after this list).
• Given the readily apparent overall strategic goals that the DHS was tasked with addressing here, as discussed in brief outline in Part 5, and the demonstrable existence of PRISM, if nothing else, it was already clear that parallel programs must be underway for capturing search engine and even general web browser activity data too, and probably in as open-ended a manner as was built into PRISM. And then three days ago as of this writing, on July 30, 2013, a second and explicitly web-oriented global surveillance program was publicly identified: XKeyscore.
• This is a program jointly run by the US NSA and its Australian counterpart, the Defence Signals Directorate, with the goal of tracking essentially everything that people do online, from web browser and search engine use patterns to online chat, email activity and more.
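
To make that tagging and query-based searching idea more concrete, here is a minimal sketch of how semantic web-style triple tagging can unify searching across captured data of different raw types. Everything in it, the store design, the tag vocabulary and the sample items, is my own illustrative assumption, not a detail of any actual government system.

```python
# A minimal, hypothetical sketch of semantic web-style tagging and
# query-based search. The store design, tag vocabulary and sample
# items are illustrative assumptions, not details of any actual system.

class TripleStore:
    """Toy store of (subject, predicate, object) triples, RDF-style."""

    def __init__(self):
        self.triples = set()

    def tag(self, subject, predicate, obj):
        """Attach one machine-readable tag to a captured item."""
        self.triples.add((subject, predicate, obj))

    def query(self, subject=None, predicate=None, obj=None):
        """Return all triples matching the pattern; None acts as a wildcard."""
        return [t for t in self.triples
                if (subject is None or t[0] == subject)
                and (predicate is None or t[1] == predicate)
                and (obj is None or t[2] == obj)]


store = TripleStore()

# Tag captured items of two different raw types with a shared vocabulary.
store.tag("email:0001", "contentType", "email")
store.tag("email:0001", "mentionsLocation", "Sana'a")
store.tag("chat:0042", "contentType", "chat")
store.tag("chat:0042", "mentionsLocation", "Sana'a")

# One query now reaches across all data types: everything that mentions
# a given location, regardless of whether it arrived as email or chat.
print(store.query(predicate="mentionsLocation", obj="Sana'a"))
```

The point of this design pattern is that once every captured item, whatever its raw form, carries tags drawn from a shared machine-readable vocabulary, a single query can reach across all of it at once.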

I wrote in Part 5 of this series, and in earlier installments, about how all of this information is being collected into a centralized, all-inclusive repository system, with a goal of capturing as much of everything as possible about as close to everyone as possible, for all of our online and telephone activity. I wrote about the new and emergent challenges, and even fundamental barriers to effective operational action, that trying to capture everything entails. If you put absolutely everything in one big box, how could you possibly find and assemble the critical pieces of a hidden puzzle from it, such as the puzzle of a developing 9/11-style attack, and preemptively so as to prevent that attack? And more specifically, how can you expect to do this when you re-partition the intended unification of Homeland Security, with all intelligence resources brought together under one roof, by adding in new barriers to access such as the new two-man rule implementations? You cannot expect even the best intelligence analysts to find patterns in what initially seem to be unrelated data points coming in from diverse sources, if you limit them to accessing only combinations of data sources that you already have significant reason to know are related. This type of system can help you verify what you already know, but it cannot help you preemptively find the unexpected so that you can take proactive measures to protect against novel adversarial attacks, as the sketch below illustrates.
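
Here is a minimal sketch of how compartmentalized access of the two-man rule variety can block exactly the cross-source discovery that preemptive analysis requires. The source names, events and approval rule are hypothetical assumptions, offered only to make the argument concrete.

```python
# A minimal, hypothetical sketch of how compartmentalized access can block
# cross-source pattern discovery. Source names, events and the approval
# rule are illustrative assumptions only.

# Events from two separate collection programs, as (who, did, what) tuples.
EVENTS = {
    "phone_metadata": {("subject-A", "called", "subject-B")},
    "web_activity": {("subject-B", "searched_for", "flight schools")},
}

# Compartmentalized policy: analysts may only combine source sets that are
# already approved, i.e. already known to be related.
APPROVED_COMBINATIONS = {
    frozenset({"phone_metadata"}),
    frozenset({"web_activity"}),
}

def correlate(sources):
    """Merge events across the requested sources, if policy allows it."""
    if frozenset(sources) not in APPROVED_COMBINATIONS:
        return None  # access denied: the cross-source join never happens
    merged = set()
    for source in sources:
        merged |= EVENTS[source]
    return merged

# Each source can be examined on its own...
print(correlate({"phone_metadata"}))
# ...but the combination that would reveal the unexpected connection is
# blocked, because no one yet knows these sources are related.
print(correlate({"phone_metadata", "web_activity"}))  # -> None
```

An analyst working under this kind of policy can confirm relationships that are already known, but the join that would surface a novel, unexpected pattern is never allowed to happen.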

But this only addresses the technical feasibility and the potential technical effectiveness of these tools, and of the operational processes put in place to run them. The real problem here is much more fundamental, and this brings me to the gap in strategic reasoning, and the gap in even really perceiving the nature of the threats faced, that I noted at the end of Part 5. The public revelation of the existence of PRISM opened a door that the Obama administration wanted to keep closed, and expected to be able to. The outing of XKeyscore as a second, massively open-ended surveillance program sheds light on the fact that PRISM was no anomaly, and that it was not and is not a unique, stand-alone program. I think that I can state at this time that PRISM and XKeyscore together are only two components of a much larger system that all but certainly includes other data collection component programs. And this overall system all but certainly includes supportive programs for organizing and searching all of this data for patterns of operational significance.

As an aside, the disclosures of PRISM and then XKeyscore have shed some real light on why the Defense Advanced Research Projects Agency (DARPA) has requested project proposal submissions for some of the big data and semantic web-oriented technology development challenges that it has been pursuing. At least one of the reasons for pursuing best practices technology solutions there has almost certainly involved developing new tools and approaches for making sense, and use, of this incredible flood of raw new data, of essentially every conceivable type, as gathered in by these surveillance programs.

But this still leaves the question of whether this pursuit of technology perfection is in fact the right one. And this still leaves us with the developing and ongoing crisis that word of these programs has created for the Obama administration, and for other governments as well, and with those systematic gaps in how we think about the War on Terror and how we conduct it, certainly where the internet and our communications systems are concerned. With this story kicked up another notch by the revelation of XKeyscore, I will write in Part 7, the next series installment, about the issues that I was initially planning on discussing here. Meanwhile, you can find this and related postings at Ubiquitous Computing and Communications – everywhere all the time 2 and in my first Ubiquitous Computing and Communications directory page. I am also listing this under my In the News posting category.
