Platt Perspective on Business and Technology

Learnable lessons from Manning, Snowden and inevitable others 4 – some thoughts on possible lessons themselves 2

Posted in business and convergent technologies, in the News by Timothy Platt on August 25, 2013

This is my fifth posting to date on what is becoming a series of leaks and unauthorized disclosures of classified US government documents that relate to its war on terror. See:

John Peter Zenger, Henry L. Stimson, Edward J. Snowden and the challenge of free speech and
Part 1, Part 2 and Part 3 of this developing series.

One of the points that I have raised in this series is that:

• Since the 9/11 attacks, the US government has been actively waging a proactive, global war on terror, starting with the George W. Bush administration and continuing very actively into the administration of Barack Obama.
• As a part of that, the amount and variety of files and documents, and of information in general, that is included under the national security umbrella has grown to unprecedented levels (see my Zenger, Stimson, et al. posting as cited above for more details on that.)
• And the overall national security system has grown in scale accordingly.

As a part of that, I have also at least briefly discussed how the US Department of Homeland Security was formed as an attempt to more closely and effectively integrate all of the government’s intelligence gathering agencies, with a goal of better organizing and coordinating all relevant resources and capabilities. And I have written in particular, in Part 1 of this series, of how one of the core goals in that was to more effectively bring together all of the raw intelligence that had previously been gathered, evaluated and stored in disconnected silos, so as to at least increase the likelihood that threats would be identified earlier – and in fact before any attacks that this information could reveal can take place. The goal there, in simplest terms, was and is to prevent another 9/11, and certainly to prevent a scenario in which it comes out afterwards that enough information was already in US hands beforehand that, in principle, the attack might have been prevented.

I wrote in that posting about how certain of the changes now being implemented in this system as a response to Manning and Snowden will, in all probability, blunt the effectiveness of that centralization of information resources. But for purposes of this discussion, the key detail coming out of that is that everything from raw, unanalyzed data through to the most highly processed and analyzed findings and reports has been assembled into essentially one single data and knowledge base, even if one that is partitioned off to control allowable access.

My goal for this posting is to at least raise the issue that a sufficiently large increase in scale for such a system can in and of itself create new sources of vulnerability for the overall national security system. So in a fundamental sense this posting is all about information systems scalability.

• The US government put agencies and organizations that have their own cultures and histories, and their own systems and approaches, all under one roof, with a clear directive that they unify and coordinate where they had traditionally faced each other more as rivals: rivals for budget dollars, and for precedence in the policy decision making processes that take place at a national governmental level.
• Then, with many issues of coordinated data and intelligence information sharing and collective organization still unresolved, new barriers to doing this are being added in. One of the most difficult challenges faced is in managing a flow of raw and externally processed data that is seemingly endless in volume, all of it arriving at least initially unvetted for accuracy, currency or relevancy, and turning that into a source of reliable analysis and prediction.
• So old systems were partly scrapped, to be replaced by a more globally combined-agency approach. The kinks resulting from that were and are still being worked out, certainly at the lower-level employee and task, operational level. And this is all happening at a time of massive increases in the scale of these now combined systems.
• Systems of processes and procedures that are still being formalized and tuned at the operational level – at a hands-on, data source and instance by data source and instance level – are being scaled up to include exponentially larger volumes of data and processed information that are deemed classified and that have to be securely included here, while striving to make everything readily findable and usable when it is legitimately needed, and where there may be no easy way to know that a priori.

The number of individual people who have sought and been awarded top secret security clearance, and who have had to be background-check vetted for it, has grown into the hundreds of thousands. That places a great deal of strain on the Federal Bureau of Investigation (FBI) and on every other agency and office drawn into this effort, on top of their ongoing work of vetting candidates for lower security clearance levels. And with the workload that this increased scale demands, speed of vetting and of conferring necessary security clearance becomes an important consideration, and a source of pressure to start and complete each review and move on to the next candidate as quickly as possible.

The words “after the fact” come to mind here, and so do 9/11 and the names Manning and Snowden. Another, and I add closely related, issue that comes to mind is that of “everything”: PRISM scooping up all of the data it technically can, and the US federal government classifying essentially anything and everything that it can, with an equal lack of discrimination.

I would not argue a case against classifying information in the name of national security, as that can be vital to our national interest, and I add to the security and interests of our allies too. But nothing should be classified simply because it can be. If scale, and a seemingly uncontrolled rate of scaling up, have created new vulnerabilities to go along with the old ones we still have to address as a nation, then scaling up through a process of reasoned planning and design, instead of simply building bigger, is a key to actually maintaining our national security.

I am going to end this posting here at this point. As of this writing (on July 23, 2013) I am planning on writing one more series installment, with that one focusing on the strengths and perils of seeking purely technology solutions to the issues that I have been discussing here and in related postings, or of seeking simple procedural quick fixes either. Meanwhile, you can find this and related postings at Ubiquitous Computing and Communications – everywhere all the time 2 and in my first Ubiquitous Computing and Communications directory page. I am also listing this under my In the News posting category.

