On 8/18/13 1:19 PM, Steve Smith wrote:
> Just as adding deadbolts to my doors at home or putting a lo-jack in
> my vehicle, or keeping a loaded gun in my bedstand would feel like
> inviting in the bad things they are supposed to keep out.
I look at it not that differently from how I hope a legislator looks at
a problem like this.
A legislator creates systems of rules which aim to serve his or her
constituency and the country in general. From this, at least in
principle, products need to be reviewed for safety, health care
providers are held to a standard of performance, buildings are built to
code, people are entitled to see their credit reports, privacy is
ensured, and so on.
Putting aside the secret law that allows for opportunistic use of
intercepts, here, in the United States, there is the public law that
search and seizure requires a warrant, and for that warrant there needs
to be probable cause. It's reasonable to be concerned about what
probable cause means in the case of large scale data mining. It's
appropriate to be skeptical about the statistical integrity of
conclusions drawn from a mechanism whose only useful purpose is to
generate hypotheses -- the definition of data mining. If an analyst
does not need to test the hypothesis against other independent
observables, and argue their case to critical ears, then it is just
guesswork. They might as well type a record into their database with
"50.001% suspicion" and begin their target intercepts. Assuming they even need
to do that, it's not good enough. It's especially not good enough if
some half-cocked search and seizure occurs without any strong technical
system in place to record that it occurred, or any recourse to complain.
This is a recipe for abuse. I think the overriding, long-standing
public law needs to have some technical teeth to make sure it is enforced.
I don't expect my non-technical friends and family to armor their
systems. At this point I wouldn't bother myself, except as an
intellectual exercise and as a bit of open source activism. But people
do have the right to armored systems, and there is no
reason to have bad juju about it. The way I would imagine it working at
scale is that customers would create demand for hardware and services to
show (say, by an automated process to bootstrap the appliance from
source code and then run related open test suites to vindicate it) that
the e-mail appliance they purchased was secure to the best known
practices. That, for example, a single byte would never hit disk/SSD or
be sent over a wire that was unencrypted. The bar for how secure is
secure can be an ongoing discussion. One could imagine even the RAM
buffers and caches holding unencrypted data needing physical
protection, of the kind a TPM provides.
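To make the "not a single unencrypted byte hits disk or wire" invariant concrete, here is a toy sketch in Python of the kind of check an open test suite could run against an appliance. Everything here is mine, not from any real product, and the XOR keystream is a throwaway stand-in for a real cipher such as AES-GCM -- it only exists so the example is self-contained:

```python
# Toy sketch: a sink wrapper that guarantees only ciphertext passes
# through write(). The keystream below is hashlib-based XOR -- a
# placeholder for a real AEAD cipher, NOT actual cryptography.
import hashlib
import io


def keystream(key: bytes, n: int) -> bytes:
    """Derive n pseudo-random bytes from key (toy construction)."""
    out = bytearray()
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(out[:n])


class EncryptingSink:
    """Wraps a raw byte sink so plaintext can never reach it."""

    def __init__(self, raw, key: bytes):
        self._raw = raw
        self._key = key
        self._offset = 0  # position, mixed into the keystream

    def write(self, plaintext: bytes) -> int:
        ks = keystream(self._key + self._offset.to_bytes(8, "big"),
                       len(plaintext))
        ciphertext = bytes(p ^ k for p, k in zip(plaintext, ks))
        self._offset += len(plaintext)
        return self._raw.write(ciphertext)


# A test suite "vindicating" the appliance can assert the invariant:
buf = io.BytesIO()                      # stands in for disk or wire
sink = EncryptingSink(buf, key=b"appliance-key")
sink.write(b"Dear Alice, ...")
assert b"Alice" not in buf.getvalue()   # no plaintext reached the sink
```

The point is not this particular wrapper but the testability: because the check is mechanical, a customer-run suite built from published source could verify it without trusting the vendor's word.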
The bad guys planning their jihad over open e-mail or cloud services are
a dumb and dying breed. It should be clear by now that it is
irresponsible for the U.S. to count on that remaining true in the future.
Marcus
============================================================
FRIAM Applied Complexity Group listserv
Meets Fridays 9a-11:30 at cafe at St. John's College
to unsubscribe http://redfish.com/mailman/listinfo/friam_redfish.com