Platt Perspective on Business and Technology

Monoculture and ecological diversity as a paradigm for modeling cyber risk – 2

Posted in business and convergent technologies, strategy and planning by Timothy Platt on May 25, 2011

A week ago I posted a thought piece on standardization and its implications for risk in a cyber-security context. At the end of that posting I said that I would follow up on it with a discussion of how open standards bear on this, and of how emerging monoculture-based risk factors might be addressed in an open standards environment. As context for that discussion:

• Progressively larger volumes of information, as raw data, partly processed information and highly processed and organized knowledge drive business and the overall economy.
• This same information flow informs and enables social media, online social networking and connectivity in general, and has thereby become part of the core backbone of both professional and private lives, and of the organizations that sustain them.
• The processing, storage, access to and sharing of this vast and ever-increasing flow of information forms the rationale behind the internet as it has evolved.
• And the need for this information flow to be ubiquitously available, independent of the particular specifications of end-user platforms, has compelled the development and implementation of open connectivity standards.
• The World Wide Web Consortium (W3C) is the central clearing house for approved connectivity and user-interface presentation standards, and its goal is to provide generally accepted and universally implemented standards for the transmission, reception and use of data of all types. See their Standards and Drafts for more information on specific protocol standards, the scripting languages used to support them, and related specifications.

And just as malware and the motivations behind its use challenge individual computers and more local/restricted computer networks, as discussed in Part 1 of this series, the internet as a whole faces challenges and threats too. Much of this is broad-based now, but more and more of it will become highly focused and target-specific. I cite my series on Stuxnet and related threats in that regard, noting that I will add to that series in a couple of days with a posting on a new targeted threat identified as Stars. (See Ubiquitous Computing and Communications, postings 58-60, 62-65, 67, 68, 70 and 73-75 for series installments 1-13.)

As a basis for thought and discussion, I would divide possible responses to this into two basic areas:

• A counterpart to the threat identification and response systems currently in place: malware-definition screening and filtering, and related reactive technologies for managing computer viruses, worms, rootkits, and so on.
• A counterpart to the in-principle more proactive mechanism of systems diversity that I also raised in Part 1.

A wide range of for-profit businesses conduct system audits and threat assessments, carry out penetration tests to actively validate systems and find their gaps, and help with clean-up and remediation when systems are compromised, but most of them operate at an organizational level that makes them more appropriate as responses to Part 1 issues here. One organization, however, serves as a best-practices clearing house for addressing online threats at a level where they apply to the internet as a whole, and to its local and system-wide vulnerabilities. So in response to the first bullet point above, I would cite The Open Web Application Security Project (OWASP). And as a specific orienting point I would cite their Top 10 Project.

For a second approach here I would cite the opportunities and best practices that can be added into the security layers of internet protocol stacks (e.g. adding packet authentication into implementations of the underlying protocol stack to limit distributed denial of service attacks, and making sender authentication a requirement for validating message source to limit the impact of email source spoofing in spam and phishing attacks).
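The sender-authentication idea in that second example can be sketched in miniature. The policy table and names below are hypothetical; in a real SPF-style deployment, the list of authorized sending addresses is published by the domain owner as a DNS record rather than hard-coded, but the core check is the same:

```python
import ipaddress

# Hypothetical, hard-coded policy: in real SPF, this list would be
# published by the sending domain in a DNS TXT record.
SPF_POLICY = {
    "example.com": ["192.0.2.0/24", "198.51.100.17/32"],
}

def sender_ip_authorized(domain: str, sending_ip: str) -> bool:
    """Return True only if sending_ip appears in the domain's published
    list of authorized sender networks."""
    networks = SPF_POLICY.get(domain, [])
    ip = ipaddress.ip_address(sending_ip)
    return any(ip in ipaddress.ip_network(net) for net in networks)

# A message claiming to be from example.com but sent from an address
# outside the policy fails the check and can be rejected or flagged.
print(sender_ip_authorized("example.com", "192.0.2.45"))   # True
print(sender_ip_authorized("example.com", "203.0.113.9"))  # False
```

The point is that the receiving system does the enforcement automatically; no judgment call by the end user is required.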

And I point out what should be obvious to anyone reading this: botnets could not exist if everyone, or even just most people, kept the antivirus and related protection systems on their computers up to date, and showed at least moderate judgment before clicking on links emailed to them by strangers. And even a cursory review of the bad practices listed in OWASP’s Top 10 Project brings out a point they all hold in common: all of them are well known and of very long standing. The specific details of implementation for SQL injection attacks or cross-site scripting may change as the technology evolves, but the attacks themselves are all depressingly familiar.
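To illustrate just how well understood these attacks and their remedies are, here is a minimal sketch (table and names are my own, purely illustrative) contrasting a SQL-injection-vulnerable query with the parameterized form that guidance like OWASP's has recommended for years:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.executemany("INSERT INTO users VALUES (?, ?)",
                 [("alice", "admin"), ("bob", "user")])

def find_user_unsafe(name):
    # Vulnerable: user input is pasted directly into the SQL text.
    return conn.execute(
        f"SELECT role FROM users WHERE name = '{name}'").fetchall()

def find_user_safe(name):
    # Parameterized: the driver treats input strictly as data, never as SQL.
    return conn.execute(
        "SELECT role FROM users WHERE name = ?", (name,)).fetchall()

payload = "x' OR '1'='1"
print(find_user_unsafe(payload))  # returns every row: injection succeeded
print(find_user_safe(payload))    # returns []: payload matched no name
```

The defense has been standard practice for a very long time; the problem is uneven adoption, not missing knowledge.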

• A reactive approach always means playing catch-up, even when widespread preventative measures are taken for many users and on many systems.
• Even at best, many users and many systems will not be proactively responsible here.

And this brings me to that second bullet point response option of systems diversity. And I admit, my goal here is not so much to offer nicely packaged, analytically reasoned resolutions to these issues, but rather to raise questions and perhaps offer a possible direction of approach.

• Standardization is important, at least as much for adherence to internet-wide open connectivity standards as for developing server farms and other scalable local and restricted systems. An ad hoc approach just would not work in either case.
• But real-world users, and the paths of least resistance they follow in developing their usage practices, mean that networked systems, the internet as a whole included, face genuine risk. That is not to say that the entire internet is in danger of going down, but users, groups of users, businesses and even widespread regions can be imperiled, certainly with regard to ongoing continuity of online service.
• Should certain systems be maintained separately, and if they are basically maintained using open standards, should they also be protected with more restricted, proprietary gatekeeper layers?
• I do not consider completely separate, parallel systems here, as that approach would be cost-prohibitive for any but perhaps certain national governments, would be excessively limiting in communications reach for most purposes even where cost-effective, and a sufficiently wide-ranging separate system would itself become a target for attack too.

So I leave this posting with some open questions.

• How important would it be to develop systems and protocol diversity so as to limit the impact of more generic attack threats?
• Which systems would merit this effort, and for which would it prove cost-effective?
• How would cost-effectiveness best be determined?
• How could this best be done while still retaining the wide-ranging connectivity that comes from linking into the internet as a whole?

Here, as a final thought, I note that many of the standard online security mechanisms we rely on now no longer offer the types or levels of protection for us or our systems that we tend to presume. As an example, HTTPS cannot be considered secure in and of itself, as so many organizations have gained certificate authority control and oversight capabilities, and have sold rights to that authority to others. Basically, HTTPS oversight has been resold and resold again, and so widely that no one actually knows even roughly how many organizations and individuals can offer authentication validation, or what standards they follow in practice when granting it. And certificates can be spoofed too. So new approaches are needed, and they cannot simply depend on proactive compliance by individuals or organizations online. They have to be automatic and transparent, the way HTTPS was intended to be.
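One direction such automatic, transparent approaches can take is certificate pinning, in which a client trusts only a specific, previously recorded certificate fingerprint rather than the full chain of resellable certificate authorities. A minimal sketch of the idea, with stand-in bytes in place of a real DER-encoded certificate (in practice a client would obtain those bytes from the live TLS connection):

```python
import hashlib

# Hypothetical stand-in: in practice this would be the DER-encoded
# certificate obtained in advance over a trusted channel.
pinned_cert_der = b"-- stand-in for a DER-encoded certificate --"
PINNED_FINGERPRINT = hashlib.sha256(pinned_cert_der).hexdigest()

def connection_trusted(presented_cert_der: bytes) -> bool:
    """Trust the peer only if its certificate's SHA-256 fingerprint matches
    the pinned value, regardless of which authority signed it."""
    fingerprint = hashlib.sha256(presented_cert_der).hexdigest()
    return fingerprint == PINNED_FINGERPRINT

print(connection_trusted(pinned_cert_der))                  # True
print(connection_trusted(b"-- some other certificate --"))  # False
```

The check runs without any user involvement, which is exactly the automatic, transparent quality argued for above; its cost is that the pinned fingerprint must be distributed and rotated out of band.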

These are important issues, and they are becoming more so. I leave this here, except to note that I am sure to come back to them again.
