by Edward Goldstick
I often discover things in threes, whether the things themselves or the thoughts that immediately come to mind while considering them... which is a topic for discussion in its own right. Here is an initial example...
-- a computer virus brings down a state-wide ambulance service in southwest Australia, and pen and paper plus telephones take up the banner...
-- Bruce Schneier has a new book in the works that will attempt to describe the differences between individual and societal security...
-- ...and an observation to tie it together.
1) From Slashdot moments ago:
...to which I will only add that these sorts of failures amaze me, almost as much as the problems that distributed, mission-critical systems or sites encounter when nobody seriously considers the necessity of "turning off" all or part of them occasionally, whether in a scheduled or a spontaneous fashion, and whether for directly related maintenance or for peripheral matters such as cleaning.
These folks were lucky because someone still knew the protocols for manual communications from before the computer-based dispatching was deployed.
2) This from Bruce Schneier's latest Crypto-Gram newsletter in my email moments after:
The rationale behind his new book project on "Societal Security" (here on his website).
This could be a really important contribution, but what I find disconcerting is that some of his initial presumptions instantly seemed misguided to me. Then again, he admits this openly, and that may in fact be at the core of the conundrum he proposes to illuminate, if not unravel completely (though we can always hope).
I completely concur, however, with his choice of the Prisoner's Dilemma as a conceptual framework and will be very interested to see how he applies it. I also hope that he will delve into some of the philosophical underpinnings of the great human civilizations that endure in our time; in particular, I hope he will consider the similarities and differences between the Golden Rule as expressed within the Abrahamic traditions: "Treat your fellow human beings as you would want to be treated by them."
...as compared to its mirror image in the Confucian sense (especially as expressed by Mencius): "Do not treat your fellow human beings in ways that you would not want them to treat you."
P.S.: I considered two "Do [not] unto others..." forms, but my ecclesiastical rhetoric is rusty... and it does sound rather pretentious...
A note to adherents of the other great religious traditions -- in particular, Hinduism and Buddhism: I mean no disrespect by not integrating your civilizations into this "instant" philosophy... on the contrary, I simply do not know or appreciate those worldviews well enough to invoke them with confidence in this venue.
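For readers less familiar with the framework mentioned above, here is a minimal sketch of the Prisoner's Dilemma using the standard textbook payoffs (the numbers are the conventional ones, not anything from Schneier's book). It shows the tension at the heart of individual-versus-societal security: defecting is always the individually rational move, yet everyone is worse off when everyone defects.

```python
# Classic Prisoner's Dilemma payoffs; higher is better.
# Standard textbook values, not taken from Schneier's project.
# Each entry: (my_payoff, their_payoff) for (my_move, their_move).
PAYOFFS = {
    ("cooperate", "cooperate"): (3, 3),  # mutual trust: good for both
    ("cooperate", "defect"):    (0, 5),  # I am exploited
    ("defect",    "cooperate"): (5, 0),  # I exploit the other
    ("defect",    "defect"):    (1, 1),  # mutual distrust: bad for everyone
}

def best_response(their_move):
    """The individually rational move, whatever the other party does."""
    return max(("cooperate", "defect"),
               key=lambda my_move: PAYOFFS[(my_move, their_move)][0])

# Defection dominates individually...
assert best_response("cooperate") == "defect"
assert best_response("defect") == "defect"
# ...yet mutual cooperation beats the mutual-defection outcome:
assert PAYOFFS[("cooperate", "cooperate")][0] > PAYOFFS[("defect", "defect")][0]
```

That gap between the dominant individual strategy and the better collective outcome is, presumably, exactly the terrain a book on "Societal Security" would explore.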
3) Why "robust systems" in the title of this post?
I firmly believe that security is about more than worrying about the bogeyman at the door, or under the bed, or in the darkness of the future unseen. Was the failure of the Australian EMS communications network the result of an "insecure" system that was susceptible to a malicious intruder (a virus), or of a system that could not fail gracefully when facing an unanticipated circumstance?
Many people have written about this sort of thing, from Stephen Flynn to Bruce Schneier himself. The underlying point, of course, is that we fear the failure of robust systems less when we have a clear notion not only that they might fail but also of how they might fail gracefully, as compared to the fear of insecurity we feel when facing the unknown, whether real, imagined, or "theatrical."
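The graceful-failure idea can be sketched in a few lines. This is purely hypothetical code (the names and structure are mine, not anything from the Australian system): a dispatcher that degrades to a rehearsed manual protocol when its automated path fails, rather than simply going dark.

```python
# Hypothetical sketch of graceful degradation: the automated path may
# fail, but a known fallback keeps the service functioning.
def automated_dispatch(incident):
    """Stand-in for the computer-based dispatch path."""
    raise RuntimeError("dispatch system offline (e.g., virus infection)")

def manual_dispatch(incident):
    """Stand-in for the pen-and-paper-plus-telephone protocol."""
    return f"manual: phoned nearest station for {incident}"

def dispatch(incident):
    # Fail gracefully: fall back to the manual protocol instead of
    # letting the whole service stop.
    try:
        return automated_dispatch(incident)
    except Exception:
        return manual_dispatch(incident)

print(dispatch("cardiac arrest, 12 Main St"))
```

The fallback only works, of course, if someone still knows the manual protocol, which is precisely why the Australian crews were lucky.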
Edward Goldstick is a veteran of the high-tech, software, defense, and energy-technology worlds in the U.S. and France.