Wednesday, October 27, 2004

Burt Rutan Comes to Moontown

I've been reading the umpteenth web discussion of 'simple systems are better', 'innovation as disruption occurs through simple systems', and 'early vs. late adopters', and reflecting on what Burt Rutan, designer of the craft that won the Ansari X Prize, has to say on the subject.

Rutan recently visited a local private airport called Moontown. Moontown is a grass strip. We stood just outside a hangar packed with pilots and kids. While rain dripped on us through a crack in the hangar door, Burt talked long into the night about 'leaving out the unnecessary' and how 'simpler systems are better'. He asked why the space shuttle was developed without exploiting the earlier systems that were simpler and reliable. He asked why NASA did not fly more civilians and failed to use the simpler systems for that as the Russians have. He didn't like the answer that those who signed the checks didn't want to pay for that. He said, "It's your money, so what's the disconnect in interest?" I told him, "There is no disconnect. We're here, aren't we?"

His charts that show the reliability of the simpler systems are factually correct. His interpretations are historically inaccurate.

1) Mercury capsules never killed a pilot. Sounds good. Fact: the second manned flight nearly killed Gus Grissom when the hatch blew unexpectedly and the capsule sank. The Apollo 1 capsule killed him on the pad. Scott Carpenter missed his landing target by hundreds of miles. Gordon Cooper's 22-orbit Faith 7 flight had almost every onboard system fail; he lined up the vehicle on the horizon through his window and reentered manually. The magnetosphere is quite a test of 1950s-vintage simple technologies. We all know what happened to Apollo 13, and yes, it was the complexity of the redundant systems, combined with the real-time ingenuity of very smart people, that enabled that crew to come home.

2) While devoting a lot of his speech to kicking NASA in the butt (he calls them 'Nay Say') and lauding his simpler system, he neglected to mention that the composites that made his craft orders of magnitude lighter than the X-15 were developed by ... NASA. High-performance materials require substantial investments before they can be used on simple systems.

3) The checks for the launch vehicles used in the manned space program, with the sole exception of the Saturn series, were signed by the United States Department of Defense, which, while happy to see them used for manned flight, was mainly interested in delivering weighty bundles of thermonuclear destruction with them. Nuclear weapons and spy satellites don't weigh enough to rate a Saturn V, so they were happy to have the civilians sign those checks.

You see, the mission determines the necessary complexity and the acceptable risks.

Simple systems in the hands of very skilled and intelligent people can do a good job. Complex systems are often designed for the less skillful. Early adopters pay the price of becoming skillful enough to mature a system to the point where a late adopter can succeed at a tenth of the cost. So the rule of thumb for early adopters is: adopt only if you absolutely need it now or are getting into the core business of the technology. As for why NASA did not go forward except into the shuttle program:

1) The shuttle design requirements are for... ta da... the delivery of military hardware to low Earth orbit.

2) The designers of the early 'simpler systems' were gutted from NASA when it was determined that the next American space program should not be tainted by 'ex-Nazi rocket scientists'. So the accomplished teams and visionaries were sent packing. Oddly enough, Rutan comes to our city because he wants to rub elbows with the remaining living members of that team. When your main financiers are a Microsoft manager made good and a British knight, it might be bad salesmanship to mention the history of that team, although he says up front that Wernher von Braun is his top hero. Tom Hanks, Philip Kaufman and Tom Wolfe did a good job of minimizing von Braun's role; perhaps that makes it acceptable for Rutan to say so.

While he is right that it is impossible to prove that a system is safe, it is possible to test one long enough to rid oneself of most of the bugs, as long as the bugs manifest in the time and environment given to testing. He did talk a lot about the fact that more regulators from the Office of Commercial Space Launch monitored his work than there are members of his team doing the work. I have to agree with his assessment of that situation. It sucks.

As to buying a ticket, given that he has yet to ride SpaceShipOne himself and doesn't intend to (I asked), and he is busily selling his concept for Sir Richard Branson's space line, I think I will be a late purchaser of a Virgin Galactic ticket. Rutan never mentioned the effects of radiation through composites in sub-orbital flights. Early adopters often pay the price not only for the ticket, but for the tests. Thrill or no, I'd like to keep the tan I have free of splotches.

Homeland Security Systems and Event Types

One of the requirements set forth in the Markle Foundation Report is that access to private information has to be policy-driven; thus, there is a need for doctrine with respect to the design to be implemented for SHARE. The concept I propose for this is based on the notion of event types, or to be precise, an ontology of events in which information is recorded such that the event type determines who can access that information, with or without the consent of the observed. An event is a combination of a time, a location and an event type. Any individual who participates in the event, by dint of being at that location at that time, obtains a role.
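To make the idea concrete, here is a minimal sketch in Python; the type names and example values are mine, not SHARE's, and serve only to illustrate the event-and-role model described above.

```python
# Hypothetical sketch: an event is a time, a location, and an event type;
# anyone present at that time and place obtains a role within the event.
from dataclasses import dataclass
from datetime import datetime
from enum import Enum

class EventType(Enum):
    TRAFFIC_STOP = "traffic_stop"
    OUTDOOR_CONCERT = "outdoor_concert"

class Role(Enum):
    OFFICER = "officer"
    DRIVER = "driver"
    TICKET_HOLDER = "ticket_holder"
    CONCERT_EMPLOYEE = "concert_employee"

@dataclass(frozen=True)
class Event:
    event_type: EventType
    time: datetime
    location: str               # a geocode or venue name

@dataclass(frozen=True)
class Participation:
    event: Event
    subject: str                # the person present at that time and place
    role: Role                  # the role obtained by being there

# A traffic stop: the officer and the driver each obtain a role in the event.
stop = Event(EventType.TRAFFIC_STOP, datetime(2004, 10, 27, 21, 15), "Hwy 72 West")
records = [
    Participation(stop, "officer-4411", Role.OFFICER),
    Participation(stop, "driver-0017", Role.DRIVER),
]
```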

This is not a new idea. This is fundamental to police records management now and to measurements of policing behaviors. For instance, a traffic stop is a simple example of an event type. Many states require the reason for the traffic stop to be recorded in the records management system so that post-stop analysis can be used to discover trends that might indicate racial or other kinds of profiling. Use of force information is recorded to determine if officers are brutalizing citizens, or to justify the use of force given the circumstances of the 'event'. Role-based access to information is the basis for much security management in computer systems today.

Event types may be cultural. For example, a concert held in an outdoor setting is a cultural event. The opt-in to the rules of behavior at the concert is the ticket purchase. Holders of tickets are legal participants of one type; concert employees are another type. Each of these is a role in a system of roles, where the contract to obtain the role determines the allowable and disallowable behaviors. It also determines the rights to be observed, just as telephone and email communications are legally observable once one takes on the role of an employee. This means that someone attending a public event is observable, and that identity obtained by biometric observation is legally usable, because the opt-in is contracted and the contract is a service of the event type.

The implication is that semantic web technologies such as OWL, used for declaring ontologies, can be used to create legally recognizable event types, and that the policies for observation, and for later access to information beyond what was observed (such as access to private data), can be described in terms of those event types. This is how subpoenas work now. Suspicion of activity must be based on reasonable grounds. Access to private information is also based on these grounds.
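As a toy illustration of the OWL point, here is what declaring a couple of event types as an ontology might look like using the rdflib library in Python. The namespace URI, class names and the policy property are placeholders of my own, not any recognized justice or homeland security vocabulary.

```python
# Toy sketch: declare event types as OWL classes and hang a policy on one.
from rdflib import Graph, Namespace, Literal, RDF, RDFS
from rdflib.namespace import OWL

EX = Namespace("http://example.org/event-types#")   # placeholder namespace
g = Graph()
g.bind("ex", EX)
g.bind("owl", OWL)

# Event is the root class; TrafficStop and OutdoorConcert are event types.
g.add((EX.Event, RDF.type, OWL.Class))
for event_type in ("TrafficStop", "OutdoorConcert"):
    g.add((EX[event_type], RDF.type, OWL.Class))
    g.add((EX[event_type], RDFS.subClassOf, EX.Event))

# An observation policy can be attached to an event type as a property.
g.add((EX.observationPolicy, RDF.type, OWL.DatatypeProperty))
g.add((EX.OutdoorConcert, EX.observationPolicy,
       Literal("biometric observation permitted under the ticket contract")))

turtle = g.serialize(format="turtle")   # a string in recent rdflib versions
```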

It is likely that the seminal concept that unites these, and that could enable the homeland security industry to organize a standard for policy-constrained access, is the concept of event types as a cross-product with roles.
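Read concretely, that cross-product is just a policy table keyed on (event type, role), with anything not listed denied by default. A minimal sketch, with entries invented for illustration:

```python
# Policy-constrained access as a cross-product of event type and role.
# The table entries are made up; the point is the shape of the lookup.
POLICY = {
    ("traffic_stop", "investigating_officer"): {"stop_reason", "use_of_force", "video"},
    ("traffic_stop", "auditor"):               {"stop_reason", "use_of_force"},
    ("outdoor_concert", "venue_security"):     {"biometric_match", "location"},
    ("outdoor_concert", "ticket_holder"):      {"location"},   # your own track only
}

def may_access(event_type: str, requester_role: str, item: str) -> bool:
    """Allowed only if the (event type, role) pair explicitly grants the item."""
    return item in POLICY.get((event_type, requester_role), set())

assert may_access("traffic_stop", "auditor", "stop_reason")
assert not may_access("outdoor_concert", "ticket_holder", "biometric_match")
```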

Monday, October 25, 2004

Scared Stupid

I commented on a C/Net article (Scared Witless) by Charles Cooper about people being scared for their personal safety and how that was playing out in the election. As usual, the not-so-latent paranoia about surveillance systems emerged from the readers. What follows are my replies. I'm posting them here as part of a topic I blogged about earlier: effective systems.

Take a Look At Your 911 Systems


By 911, I mean your public safety systems used by first responders for near-real-time operations (say, computer-aided dispatch, or CAD) and the police and fire records databases.

1. Until 9/11 (the event), these were local systems, a.k.a. vertical smokestacks, with little interoperation even across neighboring jurisdictions.

2. Public safety is a late-adopter market with a lifecycle of about 12 years (time from purchase to retirement). This means that technologies taken for granted on the public Internet are not widely used in public safety systems (security concerns aside for the moment).

3. 9/11 was the wakeup call the industry needed to finally take the issue of interoperability seriously and begin to work on standards for CAD-to-CAD (APCO 36) and database interchange (GJXDM).

4. The larger picture of the SHARE system as envisioned by the Markle Foundation is still blurry. The issues here are how to create a policy-aware system with immutable auditing, and the doctrines that must be provided by the governing authorities.

While it is fashionably paranoid to think the police have far-reaching powers to snoop, they are actually more restricted than your neighborhood supermarket. On the other hand, the effect of government incursion, given its other powers, is great enough to warrant the restrictions.

Until after this bitter election, one won't see much action, even though the public safety vendors began solving these problems before the 9/11 Commission and the Foundation did their work. Pundits such as yourself should begin to understand the difference between investigation and enforcement as defined first by local and state authorities, then by the Federal authorities.

And the political parties should quit scaring people with both sides of this argument. There is a tremendous amount of online information and it has been there for years. What 9/11 has provided is an accelerant to opening up the private sources and integrating the existing public safety resources. It will make for a safer community, but it is very necessary for the public to pay attention to the legislation and the doctrine that will come from the post-election administration.

Fear kills rational thought. This is a topic for rationality and imagination. There is a lot we can do with the technology to *prevent* us from becoming an Orwellian society. You need to spend less time looking for 'if it bleeds it leads' stories and start researching how public safety systems actually work.

First, the Homework


To discuss this without the deep doses of election hysteria, first it is necessary to read the Markle Foundation Report and to compare it to the 9/11 Commission Report. Then there is a Senate Bill and a House Bill to be reviewed to determine what is to be offered out to vendors to build. Pay particular attention to the sections on identity management, access by policy, and immutable auditing.
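Since immutable auditing is the piece most readers have never seen in the flesh, here is one minimal sketch of what it can mean: a hash-chained audit log in which any after-the-fact alteration breaks the chain and is detectable. This illustrates the concept only; neither the bills nor the Markle report prescribes this particular mechanism.

```python
# Tamper-evident audit trail: each entry carries a hash of the previous one.
import hashlib
import json

def append_entry(log: list, actor: str, action: str) -> None:
    prev_hash = log[-1]["hash"] if log else "0" * 64
    entry = {"actor": actor, "action": action, "prev": prev_hash}
    entry["hash"] = hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()
    log.append(entry)

def verify(log: list) -> bool:
    prev = "0" * 64
    for entry in log:
        body = {k: v for k, v in entry.items() if k != "hash"}
        if entry["prev"] != prev:
            return False
        if hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest() != entry["hash"]:
            return False
        prev = entry["hash"]
    return True

audit_log = []
append_entry(audit_log, "analyst-17", "queried traffic-stop records")
append_entry(audit_log, "analyst-17", "viewed subject identity")
assert verify(audit_log)
audit_log[0]["action"] = "nothing to see here"   # tampering...
assert not verify(audit_log)                     # ...is detected
```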

If you are only using web news sources, blogs, etc., you are missing the details that would enable you to understand the concern that the authors of those reports share with you. I share your concerns, and given the way the House bill has been created and its contents, they are justified. If you want to have impact, you need to take your Congressmen and Senators to task and emphasize the Senate version.

National driver's license standards are long overdue, but that is a simple problem compared to the issue of the breeder documents that enable one to establish false identities. The ease with which Al Qaeda was able to penetrate security systems and execute their plans SHOULD scare you. The general incompetence of the administration that allowed that to happen SHOULD scare you. On the whole, the 9/11 terrorists were boobs. We were beaten by boobs. If that doesn't scare you, you may want to check your pulse.

Instead, you are scared of the people who investigated the event and responded with approaches to this kind of event. That is predictable but not smart.

Technology has a role to play in a solution that protects and enables, and it doesn't have to mean we give up freedoms. It does mean that the freedoms we have are smartly applied. It does mean that you make the effort to understand the proposals for technology and decide for yourself which are acceptable solutions.

The parallel with the elections is this: you can yell or you can get involved. That's your decision. But that's as far as the comparison goes. The vendors will build what is described in the RFPs, so if you want to influence that, you have to read the details and understand them. This IS technology and you are paying for it.

Purging Vs Archival


That is one of the doctrinal and policy questions. Currently, your police records databases are mandated to purge records (excise all data and references to data) at some interval depending on crime type, age of offender, etc. So your 'permanent record' is not that permanent. On the other hand, the laws aren't that clear with respect to the collection of private data; for example, who determines how long Google caches this web page reply?

But these are the issues to be worked. A problem with purge rules is that they are often confused with archival rules. Then there is the problem of jurisdiction and a judge: a judge can order your local agency to purge a record, then tell them a year later that they need to restore it. Given the technical problems of referential integrity, that can be a hard task, so some systems purge the data to removable media, which is then stored in a cabinet 'just in case'.
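The distinction is easy enough to state in code. A minimal sketch, with retention periods invented for illustration:

```python
# Purge vs. archive: purging excises the record and its references; archiving
# copies it to offline media first, so a later order to restore it can be met.
from datetime import date, timedelta

RETENTION = {                      # illustrative, not any state's real schedule
    "traffic_citation": timedelta(days=3 * 365),
    "juvenile_misdemeanor": timedelta(days=365),
}

def due_for_purge(crime_type: str, recorded_on: date, today: date) -> bool:
    return today - recorded_on > RETENTION.get(crime_type, timedelta.max)

def purge(records: dict, offline_media: dict, record_id: str, archive_first: bool = True) -> None:
    """Remove the record and its cross-references; optionally archive a copy."""
    record = records.pop(record_id)
    if archive_first:
        offline_media[record_id] = record          # the cabinet, 'just in case'
    for other in records.values():                 # drop dangling references
        other["related"] = [r for r in other.get("related", []) if r != record_id]
```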

But yes, when the high school teacher tells you 'this will go on your permanent record', it can mean just that.

You will get a system only as good and as protective as the sensible mandates your elected officials provide. That is only as scary as you and your community are smart voters. The technology doesn't care, so you have to.

Again, systems that acquire identity biometrically don't need your buy-in to get you into the system; a driver's license does. For the former, you need legal policy that depends, for example, on the real-time event being monitored. Your local police agency has the right to demand your license at a traffic stop or accident. Does a sensor web have the right to match its biometric reading to its image of you smuggling beer into an outdoor concert? Well, that depends on the contract you opted into when you bought the ticket. Should it be tracking you for that? Well, you might actually want your cell phone to tell you when an act on the other side of the venue is coming on, how long it will take you to walk there through the current crowd density, and, by the way, where the nearest bathroom is. So you opted into services that enabled that tracking by identity and location. All perfectly reasonable.

What I've been saying here is that people are too often 'scared' of the wrong things and too trusting of the right ones. It is entirely possible to build a reasonable system, but it is not possible for it to protect you from unreasonable people using your reasonable services. That is why we have laws and elect smart people (we hope) and why elections built over fear and religion and Spy Vs Spy agendas are ever more dangerous to conduct.

The first years of powered flight and powered carriages were not safe. The difference is that you had to buy one of those. Neither I nor any product we create can make you feel safer; we and such products can help YOU create a safer world. You have to understand the means and the threat, and such understanding will not come of simply tossing pithy barbs at each other.

Summary: smarter electors == fewer regulators AND smarter systems.

Comment Policy

If you don't sign it, I won't post it. To quote an ancient source: "All your private property is target for your enemy. And your enemy is me."