On Fri, Sep 4, 2009 at 4:57 PM, Joshua Russo <josh.r.ru...@gmail.com> wrote:
> My goal here is to create a backup and recovery scheme, where recovery is
> possible on any computer.
> I've been performing incremental updates to an application that people have
> started to use. The incremental updates seem to have created a problem for
> the dump and load data functions when trying to reload into a fresh
> database. I tried to use dumpdata to create an initial .json file,
> but I received a duplication error on load, I think from the unique index
> (not the primary key) on the content type. I believe this is because
> the tables (and thus content types) are created in a different order when
> doing a syncdb from scratch, as opposed to incrementally on the old
> production machine.

> Firstly, does anyone have a more elegant solution to the dump and load data
> problem specifically?

You've pretty much correctly diagnosed the problem that you are
experiencing. It's well known to the core team, and is logged as
ticket #7052.

I've had a solution in mind for a while, but never got around to
implementing it. However, just before v1.1 was released, a patch was
offered that implements my proposed solution. I expect this patch (or
something like it) will be in v1.2.
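
In the meantime, the workaround most people use is to exclude the
contenttypes app when dumping, since syncdb recreates those rows on
the fresh database, and they're what trigger the unique-index
collision. A rough sketch, assuming a stock manage.py setup:

    # Dump everything except django.contrib.contenttypes; syncdb
    # will recreate those rows on the new database. auth permissions
    # can collide for the same reason and may need excluding too.
    python manage.py dumpdata --exclude contenttypes > full_dump.json

    # On the fresh machine, after syncdb has built the tables:
    python manage.py loaddata full_dump.json

The caveat is that this only works cleanly if nothing in your data
references content types by primary key (generic relations,
permissions); fixing that id-remapping problem properly is what the
proposed patch is meant to address.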

That said...

> Second, what do you use for a data backup and recovery scheme for your
> Django data?

I'm not sure why normal MySQL backup tools aren't the obvious first
solution here. Yes, Django does provide data dumping and loading
tools, but they are primarily designed for the small fixtures used in
tests and initial database setup.
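
To give a sense of the scale they're aimed at, this is the sort of
job fixtures are designed for; a minimal sketch, where the app,
model and fixture names are all hypothetical:

    from django.test import TestCase
    from polls.models import Poll   # hypothetical app and model

    class PollTests(TestCase):
        # Hypothetical fixture file, loaded into the scratch test
        # database before each test and discarded afterwards.
        fixtures = ['test_polls.json']

        def test_fixture_loaded(self):
            self.assertEqual(Poll.objects.count(), 3)

A few dozen rows of reference data is comfortable territory for
loaddata; an entire production database is not.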

I'm not saying that loaddata and dumpdata _can't_ be used on large
data sets, but when you're talking about backing up an entire
database, the database itself is going to provide much better tools
than Django will. Trying to build a backup and recovery system
without using the tools closest to the data seems like a lot of
extra work to me.
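
Concretely, with MySQL that can be as simple as the following; the
user and database names here are placeholders:

    # Back up the entire database - schema and data - in one file.
    mysqldump -u myuser -p mydatabase > backup.sql

    # Recover on any machine with MySQL installed:
    mysql -u myuser -p mydatabase < backup.sql

That also covers the "recovery on any computer" requirement: the
dump is a plain SQL script that any MySQL server can replay.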

You also appear to be conflating two separate problems. The
fragility of fixtures when the schema changes isn't specific to
fixtures - any data dumping scheme will hit the same problem. Some
loading schemes might tolerate simple changes better, but for
anything beyond trivial changes, you're going to need a much more
sophisticated approach than "dump the data and reload it". This is a
schema evolution issue, not a backup issue.

Yours,
Russ Magee %_)
