> Yes, this is a somewhat interesting problem - probably not that difficult
> considering that the goal here is to create plausible deniability in a
> setting like a court of law.  Generating traffic patterns that convince
> other cryptographers (or even sysadmins) is much harder than generating
> traffic patterns that simply create reasonable doubt.
>
> > Also, unless you have some very odd friends, user activity will
> > vary in statistically likely ways over time, so the ideal system
> > would "randomly" compensate for that.

I've only caught the tail end of this, but from what I can understand, some of this might be overkill.


Let's imagine you're an ISP that "sells" privacy in your customers' surfing. So you make your server regularly go fetch random web pages so that your customers have a cloak of privacy, and this is what you tell the court. (Provided, of course, that they have some way to 'prove' that the traffic into and out of your server is fake traffic!) In these kinds of scenarios, 'realistic' web traffic might be unnecessary. (Or maybe there's some other 'cover' story, such as the need to update weblinks or whatever...)

Or do I misunderstand here?

-TD






From: John Kozubik <[EMAIL PROTECTED]>
To: ken <[EMAIL PROTECTED]>
CC: cypherpunks <[EMAIL PROTECTED]>
Subject: Re: Extent of UK snooping revealed
Date: Wed, 28 May 2003 10:36:23 -0700 (PDT)

On Wed, 28 May 2003, ken wrote:

> John Kozubik wrote:
>
> > d) set up an automated script on the server that _constantly_ fetches
> > random web pages, thus creating a constant stream of http traffic in and
> > out of the server, again diminishing traffic patterns. Log the actual
> > proxy requests in some temporary fashion and randomly hit those web sites
> > in an automated fashion throughout the day, regardless of whether someone
> > is requesting them through the proxy or not...and then, script a constant
> > stream of requests to the proxy as well
>
> Fun & difficult part is setting up fetching of "random" web pages
> that looks like real user activity.


Yes, this is a somewhat interesting problem - probably not that difficult
considering that the goal here is to create plausible deniability in a
setting like a court of law.  Generating traffic patterns that convince
other cryptographers (or even sysadmins) is much harder than generating
traffic patterns that simply create reasonable doubt.
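
For concreteness, here is a rough Python sketch of the replay half of item (d) quoted above. The log path /var/log/proxy-urls is an assumption (one URL per line, written by the proxy); the script only aims to muddy the pattern by making cover fetches look like real fetches, not to fool a careful statistician.

    #!/usr/bin/env python3
    # Rough sketch of the replay half of item (d): re-fetch previously
    # proxied URLs at random times so cover traffic and real traffic
    # look alike.  The log path is hypothetical (one URL per line).
    import random
    import time
    import urllib.request

    LOG_PATH = "/var/log/proxy-urls"   # hypothetical proxy request log

    def replay_forever(max_pause=600):
        while True:
            with open(LOG_PATH) as f:
                urls = [line.strip() for line in f if line.strip()]
            if urls:
                url = random.choice(urls)
                try:
                    # Fetch and discard the body; only the traffic matters.
                    urllib.request.urlopen(url, timeout=30).read()
                except Exception:
                    pass   # dead links are fine -- real users hit those too
            time.sleep(random.uniform(1, max_pause))   # irregular spacing

    if __name__ == "__main__":
        replay_forever()

Run as a long-lived daemon alongside the proxy; the interesting tuning is in the sleep distribution, not the fetch itself.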

> Also, unless you have some very odd friends, user activity will
> vary in statistically likely ways over time, so the ideal system
> would "randomly" compensate for that.

Exactly. The ideal system would monitor inbound and outbound:

- web requests
- bytes transferred
- bytes per page
- pictures per page
- binary files transferred
- (all of those) / second

and generate pseudo-random browsing to smooth these variables over time.
Perhaps a script that chose random word pairs from the dictionary, googled
them, and browsed the pages that were returned would be a good platform.
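
A minimal sketch of such a script, assuming Python, a Unix word list at /usr/share/dict/words, and a generic search URL; real search engines change their markup and throttle automated clients, so the link extraction below is illustrative only.

    #!/usr/bin/env python3
    # Minimal sketch of the word-pair idea: pull two random dictionary
    # words, run them through a search page, and fetch a few of the
    # returned links.  The search URL and the naive link regex are
    # placeholders, not a robust scraper.
    import random
    import re
    import time
    import urllib.parse
    import urllib.request

    WORDLIST = "/usr/share/dict/words"           # common on Unix systems
    SEARCH = "https://www.google.com/search?q="  # placeholder endpoint
    UA = {"User-Agent": "Mozilla/5.0"}

    def fetch(url):
        req = urllib.request.Request(url, headers=UA)
        return urllib.request.urlopen(req, timeout=30).read().decode("utf-8", "replace")

    def browse_once(pages=3):
        words = [w.strip() for w in open(WORDLIST) if w.strip().isalpha()]
        query = " ".join(random.sample(words, 2))
        html = fetch(SEARCH + urllib.parse.quote_plus(query))
        # Naive link extraction; a real implementation would parse the
        # result markup properly.
        links = re.findall(r'https?://[^"&<>\s]+', html)
        for url in random.sample(links, min(pages, len(links))):
            try:
                body = fetch(url)
                print(f"{query!r}: fetched {len(body)} bytes from {url}")
            except Exception:
                pass
            time.sleep(random.uniform(5, 120))   # spread requests over time

    if __name__ == "__main__":
        while True:
            browse_once()
            time.sleep(random.uniform(30, 900))

The smoothing described above would live in how the sleeps and page counts are chosen, e.g. drawn to track the monitored requests-per-second and bytes-per-page figures, which is where the real work is.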

-----
John Kozubik - [EMAIL PROTECTED] - http://www.kozubik.com




