Platt Perspective on Business and Technology

Cloud storage and the potential for loss of data security control

Posted in strategy and planning by Timothy Platt on July 3, 2011

I recently posted a brief series on cloud computing and ubiquitous computing, with a focus on how they coordinately impact individuals and families in their communities (see Cloud Computing as Enabler to True Ubiquitous Computing and Communications Part 1 and Part 2). And in the course of writing Part 2 of that, I noted that for the purposes of that series, I was “setting aside larger organizational and strategic requirements and preferences that businesses would see in the cloud.”

I decided, even as I wrote that qualifier, that I would turn to at least some of those issues in a second series, and I start that here with a consideration of security. I divide cloud security into two distinct areas:

• Proactive identification of and response to external threats: unauthorized access to, or interference in, an organization’s cloud-based information infrastructure, and
• Proactive identification of and response to internal threats to cloud security, arising from user practices within the organization itself.

The first of these sets of issues is frequently discussed, both from the business practice perspective and, even more so, from the technology fix perspective. Organizations are advised on setting up and managing multi-layered security systems, with password controls that users actively and consciously engage in, and with user-transparent barriers as well. An example of the latter would be restricting access, certainly to the more security-sensitive areas of a cloud, to connections and queries coming from select lists of permitted IP addresses. Hardware and software-based secure tokens, virtual private networks (VPNs), and a range of other technologies and approaches are added in to create what, in principle at least, would be defense-in-depth systems. And actual security in practice would be tested periodically through white hat hacker penetration tests, security audits and other means. I may very well come back to this topic area to discuss it in fuller detail, but for the purposes of this series I simply note it in broad brush stroke outline.
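A user-transparent barrier of the kind just described, restricting access to connections from permitted IP addresses, can be sketched in a few lines. The network ranges below are placeholders drawn from the reserved documentation blocks, not real addresses; an actual deployment would take its allowlist from the organization's own network inventory.

```python
import ipaddress

# Hypothetical allowlist: office and VPN egress ranges.
# These are reserved documentation networks, used here purely as examples.
PERMITTED_NETWORKS = [
    ipaddress.ip_network("203.0.113.0/24"),   # example: office range
    ipaddress.ip_network("198.51.100.0/24"),  # example: VPN egress range
]

def is_permitted(source_ip: str) -> bool:
    """Return True if the connecting address falls within a permitted network."""
    addr = ipaddress.ip_address(source_ip)
    return any(addr in net for net in PERMITTED_NETWORKS)
```

The point of such a check is that legitimate users connecting from approved locations never see it; only connections from outside the listed ranges are turned away.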

The second area of concern here, as noted above, tends to be a lot less discussed or even considered. So I will focus on that here: the practices, good, bad and uncertain, of legitimate insiders who would in effect be offered passage as a matter of course through many if not all of the first bullet point security barriers. And in fact the barriers created and maintained to meet first bullet point security needs can actually create second bullet point vulnerabilities and lead directly to second bullet point security breaches, and vice versa.

I will start by noting a point that I have already raised several times, and especially in postings like Developing and Enforcing Password Best Practices. Passwords, of necessity, require active and cooperative participation by users, and to prevent breaches good password practice needs to be followed by all legitimate users. For the purpose of this posting I generalize the point made there.

• Everything done in addressing external threat risks as per the first bullet point above needs to be designed and implemented so as to limit bad practice breaches from insiders.
• That means making as many protective measures as possible transparent to the end user. And it means designing the security details that are user-visible so that acceptable, secure usage practice is easier than the alternatives.
• That is always going to be a not-quite attainable goal; the one thing you can always rely on end users to do is to try whatever they can come up with that would at least seem to make their immediate, here-and-now work easier in completing their tasks at hand. And people are very clever at finding workarounds where they see even the slightest delay or impediment to finishing a task.
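To make the second bullet point concrete with respect to passwords, here is a minimal sketch of a user-visible check that tells people exactly what to fix rather than silently rejecting them, one small way of making acceptable practice the easy path. The thresholds and character-class rules are illustrative assumptions, not a recommended policy.

```python
import re

# Illustrative policy threshold; real requirements would come from the
# organization's own security standards, not from this sketch.
MIN_LENGTH = 12

def password_problems(candidate: str) -> list[str]:
    """Return a list of human-readable problems; empty means the password passes."""
    problems = []
    if len(candidate) < MIN_LENGTH:
        problems.append(f"must be at least {MIN_LENGTH} characters")
    if not re.search(r"[A-Z]", candidate):
        problems.append("must contain an uppercase letter")
    if not re.search(r"[a-z]", candidate):
        problems.append("must contain a lowercase letter")
    if not re.search(r"\d", candidate):
        problems.append("must contain a digit")
    return problems
```

Surfacing every problem at once, instead of failing on the first one, is itself a second bullet point design choice: it shortens the trial-and-error loop that tempts users toward workarounds.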

It is well known that John M. Deutch, a now former CIA director, violated security by bringing highly classified work home with him to complete on his personal home computer. What is perhaps less known is that he did this because the security vetting process in place for determining what hardware and software could be used in the national security organizations was (and still is) so slow and so onerous that it was (and to a large extent still is) impossible to fully and quickly do your job on vetted, approved systems. They are always too out of date, too technically obsolete, and too limited to meet current real world needs. Quite simply, Mr. Deutch was in a quandary: he had work responsibilities he was required to complete, but the only way he could complete them was to use an unvetted, unsecured and much more modern computer and software than anything approved for his work use.

Here, I cite hardware and software security vetting as a first bullet point due diligence response that can, and for national security purposes does, collide with the identification and management of second bullet point risks. And this, of course, applies in both local network and cloud contexts, as well as when considering stand-alone computers and their software.

I add that this problem of vetting computers and information technology to meet bullet point one needs contributed at least as much as the failure to communicate between government agencies (FBI, CIA, NSA, and so on) to the 9/11 attacks succeeding. First, all of these agencies are continuously inundated with raw “intelligence” data, which may or may not be true and which can be contradictory and full of gaps. So while the crucial information that must be known and acted upon is almost always there, it too often only becomes found, and identified as such, after the fact. And restrictions in the capabilities of the information processing technologies needed to sort through this flood of intelligence significantly increase the chance that the right information will not be found and brought together in time. And if the analysts at any one of these agencies cannot find the pieces of the puzzle they have, and recognize them as such, they are not exactly going to be able to communicate them to others in the national security network. This is an old story, and while John M. Deutch was singled out for his security breaches, his actions were much less the exception than anyone would want to admit.

I chose a national security example here, but the basic point I would draw from it is very clear and very generally applicable, and in some respects with greater force in distributed cloud-based systems than anywhere else.

• Pursuing the first, external threat bullet point without an acute awareness of the second, internal practice bullet point can only, long term, create impossible conflicts of need and action, with security breaches and loss of business effectiveness to prove it.

When a business’ information technology system, with its physical hardware, is distant, and everything is done through outside networks for at least key parts of any information flow, then responses to the two bullet points have to be closely coordinated. And at the same time, it becomes that much harder to fully know the details of any information flow process, or to track, monitor or security-manage it.

I am going to follow up on this posting with a second series installment in which I will focus on data and document management with version synchronization added into the security and risk management mix. You can find this and related postings in Business Strategy and Operations.
