Take a Look At Your 911 Systems
By 911, I mean your public safety systems used by first responders for near-real-time operations (say, Computer-Aided Dispatch) and the police and fire records databases.
1. Until 9/11 (the event), these have been local systems, so-called vertical smokestacks, with little interoperation even across neighboring jurisdictions.
2. Public safety is a late-adopter market with a lifecycle of about 12 years (the time from purchase to retirement). This means that technologies taken for granted on the public Internet are not widely used in public safety systems (security concerns aside for the moment).
3. 9/11 was the wakeup call the industry needed to finally take interoperability seriously and begin work on standards for CAD-to-CAD exchange (APCO Project 36) and database interchange (GJXDM, the Global Justice XML Data Model).
4. The larger picture of the SHARE system as envisioned by the Markle Foundation is still blurry. The open issues are how to create a policy-aware system with immutable auditing, and what doctrine must be provided by the governing authorities.
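To make "immutable auditing" concrete: one common approach is an append-only log in which each entry carries a hash of the previous entry, so that altering any past record breaks every hash after it. The sketch below is illustrative only; the class and field names are my own invention, not anything from the Markle report.

```python
import hashlib
import json

class AuditLog:
    """Append-only audit log; each entry chains to the previous via SHA-256."""

    GENESIS = "0" * 64  # placeholder hash for the first entry's predecessor

    def __init__(self):
        self.entries = []          # list of (record, digest) pairs
        self.last_hash = self.GENESIS

    def _digest(self, record):
        # Canonical JSON so the same record always hashes the same way.
        return hashlib.sha256(
            json.dumps(record, sort_keys=True).encode()
        ).hexdigest()

    def append(self, actor, action, resource):
        record = {"actor": actor, "action": action,
                  "resource": resource, "prev": self.last_hash}
        digest = self._digest(record)
        self.entries.append((record, digest))
        self.last_hash = digest

    def verify(self):
        """Recompute the chain; any altered entry invalidates the log."""
        prev = self.GENESIS
        for record, digest in self.entries:
            if record["prev"] != prev or self._digest(record) != digest:
                return False
            prev = digest
        return True
```

A real system would also write the chain head to tamper-resistant storage, since an attacker who can rewrite the whole log can rebuild the chain; the hash chain only makes tampering detectable, not impossible.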
While it is fashionably paranoid to think the police have far-reaching powers to snoop, they are actually more restricted than your neighborhood supermarket. On the other hand, the effect of government incursion, given its other powers, is great enough to warrant the restrictions.
Until this bitter election is over, we won't see much action, even though the public safety vendors began solving these problems before the 9/11 Commission and the Foundation did their work. Pundits such as yourself should begin to understand the differences between investigation and enforcement as defined first by local and state authorities, then by the Federal authorities.
And the political parties should quit scaring people with both sides of this argument. There is a tremendous amount of online information and it has been there for years. What 9/11 has provided is an accelerant to opening up the private sources and integrating the existing public safety resources. It will make for a safer community, but it is very necessary for the public to pay attention to the legislation and the doctrine that will come from the post-election administration.
Fear kills rational thought. This is a topic for rationality and imagination. There is a lot we can do with the technology to *prevent* us from becoming an Orwellian society. You need to spend less time looking for 'if it bleeds it leads' stories and start researching how public safety systems actually work.
First, the Homework
To discuss this without deep doses of election hysteria, first read the Markle Foundation report and compare it to the 9/11 Commission Report. Then review the Senate and House bills to determine what will be offered to vendors to build. Pay particular attention to the sections on identity management, access by policy, and immutable auditing.
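"Access by policy" is less abstract than it sounds: the system denies access to a resource unless the requester's role and stated purpose are explicitly authorized for it. A minimal sketch, with resource names, roles, and purposes that are entirely made up for illustration:

```python
# Hypothetical policy table: resource -> set of permitted (role, purpose) pairs.
# Default-deny: anything not listed is refused.
POLICIES = {
    "cad.incident":     {("dispatcher", "dispatch"), ("officer", "response")},
    "records.juvenile": {("records_clerk", "court_order")},
}

def is_allowed(resource: str, role: str, purpose: str) -> bool:
    """Access by policy: grant only if (role, purpose) is explicitly listed."""
    return (role, purpose) in POLICIES.get(resource, set())
```

In a deployed system the policy table would be maintained by the governing authority, and every call to a check like this would itself be written to the immutable audit log, which is what ties these two requirements together.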
If you are only using web news sources, blogs, etc., you are missing the details that would let you understand the concerns the authors of those reports share with you. I share your concerns, and given the way the House bill was created and its contents, they are justified. If you want to have impact, take your Congressmen and Senators to task and emphasize the Senate version.
National driver's license standards are long overdue, but that is a simple problem compared to the issue of the breeder documents that enable one to establish false identities. The ease with which Al Qaeda was able to penetrate security systems and execute their plans SHOULD scare you. The general incompetence of the administration that allowed that to happen SHOULD scare you. On the whole, the 9/11 terrorists were boobs. We were beaten by boobs. If that doesn't scare you, you may want to check your pulse.
Instead, you are scared of the people who investigated the event and responded with approaches to this kind of event. That is predictable but not smart.
Technology has a role to play in a solution that protects and enables, and it doesn't have to mean we give up freedoms. It does mean that the freedoms we have are smartly applied. It does mean that you make the effort to understand the proposals for technology and decide for yourself which are acceptable solutions.
The parallel with the elections is this: you can yell or you can get involved. That's your decision. But that's as far as the comparison goes. The vendors will build what is described in the RFPs, so if you want to influence that, you have to read the details and understand them. This IS technology, and you are paying for it.
Purging Vs Archival
That is one of the doctrinal and policy questions. Currently, your police records databases are mandated to purge records (excise all data and references to that data) at some interval depending on crime type, age of the offender, and so on. So your 'permanent record' is not that permanent. On the other hand, the laws aren't that clear with respect to the collection of private data: for example, who determines how long Google caches this web page reply?
But these are the issues to be worked. One problem is that purge rules are often confused with archival rules. Then there is the problem of jurisdiction and a judge: a judge can order your local agency to purge a record, then tell it a year later that the record must be restored. Given the technical problems of referential integrity, that can be a hard task, so some systems purge the data to media which are then stored in a cabinet 'just in case'.
But yes, when the high school teacher tells you 'this will go on your permanent record', it can mean just that.
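The purge-versus-archival distinction described above can be sketched in a few lines: a record whose retention interval has elapsed is excised from the live system, but a copy goes to offline media so a later court order to restore it can be honored. The retention intervals and record fields here are invented for illustration; real schedules are set by statute and vary by state.

```python
from datetime import date, timedelta

# Illustrative retention intervals keyed by (crime type, offender class).
# Real mandates differ by jurisdiction; these numbers are made up.
RETENTION = {
    ("misdemeanor", "juvenile"): timedelta(days=365 * 2),
    ("misdemeanor", "adult"):    timedelta(days=365 * 5),
    ("felony", "adult"):         timedelta(days=365 * 25),
}

active_records = {}   # record_id -> (crime_type, offender_class, filed_on, data)
archive_media = {}    # the offline copy kept in the cabinet 'just in case'

def purge_due(record_id, today):
    """True once the record's mandated retention interval has elapsed."""
    crime, offender, filed_on, _data = active_records[record_id]
    return today - filed_on >= RETENTION[(crime, offender)]

def purge(record_id):
    """Purge: excise from the live system, but copy to offline media first."""
    archive_media[record_id] = active_records.pop(record_id)

def restore(record_id):
    """A judge may later order the purged record restored."""
    active_records[record_id] = archive_media.pop(record_id)
```

The hard part this sketch glosses over is referential integrity: in a real records system the purged record is referenced from incident logs, court filings, and linked cases, and all of those references must be excised or repaired too, which is exactly why restoration a year later is painful.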
You will get as good and as protective a system as you have elected officials to provide sensible mandates. That is only as scary as you and your community are smart voters. The technology doesn't care, so you have to.

Again, systems that acquire identity biometrically don't need your buy-in to get you into the system; a driver's license does. For the former, you need legal policy, depending for example on the real-time event being monitored. Your local police agency has the right to demand your license at a traffic stop or accident. Does a sensor web have the right to match its biometric reading to its image of you smuggling beer into an outdoor concert? That depends on the contract you opted into when you bought the ticket. Should it be tracking you for that? Well, you might actually want your cell phone to tell you when an act on the other side of the venue is coming on, how long it will take you to walk there through the current crowd density, and, by the way, where the nearest bathroom is. So you opted into services that enabled that tracking by identity and location. All perfectly reasonable.
What I've been saying here is that people are too often 'scared' of the wrong things and too trusting of the right ones. It is entirely possible to build a reasonable system, but it is not possible for it to protect you from unreasonable people using your reasonable services. That is why we have laws and elect smart people (we hope) and why elections built over fear and religion and Spy Vs Spy agendas are ever more dangerous to conduct.
The first years of powered flight and powered carriages were not safe. The difference is that you had to buy one of those. Neither I nor any product we create can make you feel safer; we and such products can help YOU create a safer world. You have to understand the means and the threat, and such understanding will not come of simply tossing pithy barbs at each other.
Summary: smarter electors == fewer regulators AND smarter systems.