On Fri, Aug 29, 2003 at 05:30:37PM +0200, Leopold Toetsch wrote:
> I think, we need a general solution for freeze, dump and clone. As shown

I don't know if this is relevant here, but I'll mention it just in case.
For perl5 there isn't a single good generic clone system. Probably the
best (in terms of quality of what it can cope with, if not speed) is
Storable. Storable provides a dclone function which internally serialises
the passed-in reference, and then deserialises a new reference from the
result. This is a lot of extra effort, but it does work. Obviously this
fails Parrot's aim of speed.
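
For reference, here's dclone in use (dclone is Storable's real exported
interface; the data is just an illustration):

use Storable qw(dclone);

my $original = { name => 'parrot', list => [1, 2, 3] };

# serialise to an in-memory byte string, then deserialise a fresh copy
my $copy = dclone($original);

$copy->{list}[0] = 42;   # a deep copy, so $original is untouched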

I don't know whether this is relevant either:
The internal representation of Storable's serialisation is a stream of
tagged data. There was a discussion on p5p a while back about whether it
would be possible to make a callback interface to the tags, so that people
could write custom deserialisers (eg vetting the data as it comes in).
This sort of interface for hooking deserialisation would be useful.
Likewise, being able to hook into the traversal/serialisation interface
to provide custom output (eg XML, when the standard serialisation is
compressed YAML) would save people having to reinvent traversal/introspection
routines. (eg perl has both Data::Dumper and Storable in core, each doing
its own separate traversal.)
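
To make the idea concrete, here's a rough sketch of what such a tag hook
might look like. Nothing like this exists in Storable today; every name
in it (set_tag_hook, seen_key, reject) is invented for illustration:

# hypothetical: register a callback that runs for each tag as it is read
Storable::set_tag_hook('HASH_KEY' => sub {
    my ($context, $key) = @_;
    # vet the data as it comes in, eg refuse duplicate keys outright
    $context->reject("duplicate key '$key'")
        if $context->seen_key($key);
    return $key;
});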

I suspect that this is relevant:
You can't trust your data deserialiser. It can do evil to you before it
returns.

I forget who worked this attack out, but Jonathan Stowe presented a talk on
it at YAPC::EU in Amsterdam. The specific form of the attack on Storable is
as follows, but it should generalise to any system capable of deserialising
objects:

You create a serialisation which holds a hash. The hash happens to have 2
identical keys (the serialisation format does not prevent this).
The second key is innocent. When Storable writes this key into the hash
it's deserialising, perl's hash API automatically frees any previous
value for that key. This is how hashes are supposed to work.
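
"Frees" is the loaded word there: if the value being freed is an object,
freeing it runs its destructor on the spot. You can see this without
Storable at all (TimeBomb is just an illustrative class):

package TimeBomb;
sub DESTROY { warn "destructor just ran\n" }

package main;
my %h;
$h{key} = bless {}, 'TimeBomb';   # the first (duplicate) key's value
$h{key} = 'innocent';             # the overwrite frees it: DESTROY fires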

Obviously while deserialising there should never be a previous value for
that key (a legitimate file generated from a real hash could not have had
repeated keys). The problem is that things like perl let you (de)serialise
objects, and objects have destructors that can run arbitrary code. So your
attacker puts, as the value associated with the first copy of the key, an
object with attributes that cause it to do something "interesting". The
suggestion for attacking a Perl 5 CGI was to use a CGITempFile object. Its
destructor looks like this:

sub DESTROY {
    my($self) = @_;
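    # (a CGITempFile object is a blessed reference to a scalar holding
    # the temporary file's path, so $$self below is the filename)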
    unlink $$self;              # get rid of the file
}

The attacker can craft a bogus CGITempFile object that refers to any file on
the system, and when this object is destroyed it will attempt to delete that
file at whatever privilege level the CGI runs. And because that object
is getting destroyed inside Storable's deserialise routine, this all
happens without the user's code getting any chance to inspect the
data. And even Storable can't do anything about it, because by the time it
encounters the repeated hash key, it has already deserialised this
time-bomb. How does it defuse it?
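
In ordinary Perl the object being smuggled in is nothing exotic; it is
morally equivalent to this (the target path is made up for illustration):

my $path = '/home/victim/.profile';       # any file the CGI can unlink
my $bomb = bless \$path, 'CGITempFile';
# the moment $bomb's last reference goes away, CGITempFile::DESTROY
# runs and unlinks that path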

Simple solutions such as checking that the hash keys are unique don't work
either. All the attacker does then is increase the abstraction slightly.
Put the time-bomb as the first element in an array. Put one of these
bogus hashes as the second element. Fine, you can realise that you've
got bad keys in the bogus hash and never build it. But at this
point the time-bomb object already exists. You'd have to validate the
entire serialised stream before constructing anything. And if deserialisers
for objects are allowed to fail, then you're still stuffed, because the
attacker simply crafts a time-bomb object followed by a second, malformed
object that is known to cause its class's deserialiser to fail.
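
Spelled out, the attacker's stream describes something of this shape, in
deserialisation order (Perl source can't express the duplicate keys, so
the comments stand in for the malformed parts):

my $payload = [
    $bomb,   # element 1 is built first: the CGITempFile time-bomb
    { },     # element 2 arrives with duplicate keys in the byte stream,
             # or is an object whose class's deserialiser is known to fail
];
# even a checker that rejects element 2 and aborts is too late:
# element 1 already exists, and cleaning up destroys it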

I presume that parrot is going to be able to deserialise objects, in
which case we are exposed to this sort of attack.

Nicholas Clark
