On 22/10/10 17:34, Kevin Monceaux wrote:
> I think the OP was referring to mysqldump's --compatible option which
> one could use, for example, as mysqldump --compatible=postgresql ...
Doesn't work, btw (really doesn't help much, at least in my experience
- you just end up with a dump that's neither valid MySQL nor valid
PostgreSQL).
On Fri, Oct 22, 2010 at 07:58:20AM +0100, Chris Withers wrote:
> On 21/10/2010 15:40, ringemup wrote:
> >MySQL has a tool (mysqldump) that will output the contents of an
> >entire database to a SQL file that can then be loaded directly into
> >another database. Does Postgres not have anything analogous?
hm, seriously... why not use Django 1.2 and multidb? It's very easy, no?
Configure two dbs: default is your postgres db, I'll call your mysql db
'db2', and make sure you've run syncdb against both...
then for each model you do:
for item in Model.objects.all(): item.save(using='db2', force_insert=True)
Pro
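That loop, factored into a reusable function - a minimal sketch, assuming a Django 1.2+ project where both databases appear in settings.DATABASES and syncdb has already been run against each (the 'db2' alias is just the name used above):

```python
def copy_all(model_classes, target='db2'):
    """Copy every row of each Django model into the target database.

    Assumes the target database's tables already exist (syncdb has run
    against it) and that its alias is configured in settings.DATABASES.
    """
    for Model in model_classes:
        for item in Model.objects.all():
            # force_insert makes Django issue an INSERT rather than try
            # an UPDATE, so primary keys are preserved in the target db.
            item.save(using=target, force_insert=True)
```

You'd call it with an explicit list, e.g. copy_all([Author, Book]), or build the list with django.db.models.get_models(); order matters when models have foreign keys to each other (copy parents before children).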
Not spectacular, but mysqldump has flags that can increase the chances
of portability, and one would hope pg_dumpall does too. I figured
something like that might be worth a try before you write a custom
migration script.
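For the record, a sketch of the flags meant here (the database name 'mydb' is a placeholder, and as noted above the output will still usually need hand-editing):

```shell
# mysqldump: ask for postgres-ish syntax, one INSERT per row, data only.
mysqldump --compatible=postgresql --skip-extended-insert \
    --no-create-info mydb > mysql_data.sql

# pg_dump: plain INSERT statements travel better than COPY blocks.
# (pg_dumpall --inserts is the whole-cluster equivalent.)
pg_dump --inserts --no-owner mydb > pg_data.sql
```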
On Oct 22, 2:58 am, Chris Withers wrote:
> On 21/10/2010 15:40, ringemup wrote:
On Fri, 2010-10-22 at 07:58 +0100, Chris Withers wrote:
> On 21/10/2010 15:40, ringemup wrote:
> > MySQL has a tool (mysqldump) that will output the contents of an
> > entire database to a SQL file that can then be loaded directly into
> > another database. Does Postgres not have anything analogous?
On 21/10/2010 15:40, ringemup wrote:
MySQL has a tool (mysqldump) that will output the contents of an
entire database to a SQL file that can then be loaded directly into
another database. Does Postgres not have anything analogous?
Sure, pg_dumpall. Now, what're the chances of the SQL it spits out
loading cleanly into MySQL?
On 21/10/10 15:06, Chris Withers wrote:
> ...but, why would dumpdata dump out something invalid?
Why indeed, but that doesn't mean it isn't.
(aside: of course the dumb regex match I suggested wasn't a proper date
parse either - you might want to try an actual parse in the loop.)
MySQL has a tool (mysqldump) that will output the contents of an
entire database to a SQL file that can then be loaded directly into
another database. Does Postgres not have anything analogous?
On Oct 11, 8:58 am, Chris Withers wrote:
> Hi All,
>
> I have an existing Django app with lots of data in it.
On 21/10/2010 14:48, David De La Harpe Golden wrote:
On 21/10/10 13:31, Chris Withers wrote:
...which is a little odd, given that the file was created by 'dumpdata'.
Any ideas?
Do you see any genuine weirdness in the format of any stringified
datetimes in the dumped json? Yes I know you've got 132 megs, but I
don't mean check totally manually
On 21/10/10 13:31, Chris Withers wrote:
> ...which is a little odd, given that the file was created by 'dumpdata'.
> Any ideas?
>
Do you see any genuine weirdness in the format of any stringified
datetimes in the dumped json? Yes I know you've got 132 megs, but I
don't mean check totally manually
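One way to do that check without reading 132 megs by eye - a sketch assuming the fixture is the usual dumpdata shape, a list of {"pk": ..., "fields": {...}} objects with datetimes serialized as "YYYY-MM-DD HH:MM:SS":

```python
import re
from datetime import datetime

# Anything that merely *looks* like a serialized datetime...
DATETIME_RE = re.compile(r'^\d{4}-\d{2}-\d{2}[ T]\d{2}:\d{2}:\d{2}')

def find_bad_datetimes(records):
    """Return (pk, field, value) for string values that look like
    datetimes but fail an actual parse, not just a regex match."""
    bad = []
    for obj in records:
        for field, value in obj.get('fields', {}).items():
            if isinstance(value, str) and DATETIME_RE.match(value):
                try:
                    datetime.strptime(value[:19].replace('T', ' '),
                                      '%Y-%m-%d %H:%M:%S')
                except ValueError:
                    # Matches the shape but isn't a real date,
                    # e.g. "2010-02-30 10:00:00".
                    bad.append((obj.get('pk'), field, value))
    return bad
```

Feed it json.load(open('everything.json')) and print what comes back; an empty list means the stringified datetimes at least parse.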
On 21/10/2010 14:06, Jeff Green wrote:
When I was using loaddata I found out that if I did not have a True or
False value for any boolean fields, I would have an issue loading the
data. Once I set the value for those records to True or False, I was
successfully able to use loaddata. Hope that helps
When I was using loaddata I found out that if I did not have a True or False
value for any boolean fields, I would have an issue loading the data. Once
I set the value for those records to True or False, I was successfully able
to use loaddata. Hope that helps
On Thu, Oct 21, 2010 at 7:31 AM, Chris Withers wrote:
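A sketch of making that fix mechanically rather than by hand, again assuming dumpdata's {"pk": ..., "fields": {...}} fixture shape (the field names you pass in are your own):

```python
def fill_null_booleans(records, bool_fields, default=False):
    """Give null boolean fields an explicit True/False so loaddata
    will accept them. Mutates and returns the record list."""
    for obj in records:
        fields = obj.get('fields', {})
        for name in bool_fields:
            # Only touch fields that are present but null.
            if name in fields and fields[name] is None:
                fields[name] = default
    return records
```

Usage would be something like json.dump(fill_null_booleans(json.load(f), ['is_active']), out) to rewrite the fixture before running loaddata.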
On 11/10/2010 14:03, Shawn Milochik wrote:
One way would be to use the dumpdata command to export everything, change your
settings to point to the new database, then loaddata to restore.
Okay, so I'm using buildout and djangorecipe for my deployment.
On the postgres-backed server, I did:
bin/
I am in the process of migrating a postgresql database with a lot of data
to oracle.
I tried to use the dumpdata command, but because of the amount of data I
had to scrap that plan. My alternative solution with Django 1.2 is to do
a read from postgresql and a write to oracle.
On Mon, Oct 11,
One way would be to use the dumpdata command to export everything, change your
settings to point to the new database, then loaddata to restore.
There may be a better way, but this way allows you to dump to a
database-agnostic backup so it seems like it would suit your needs.
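Spelled out as commands, assuming a stock manage.py project (the fixture filename is a placeholder; with buildout/djangorecipe the entry point would be bin/django instead):

```shell
# Against the old (postgres) settings:
python manage.py dumpdata --indent=2 > everything.json

# Switch the DATABASE settings to MySQL, create the schema, then load:
python manage.py syncdb --noinput
python manage.py loaddata everything.json
```

If loaddata then complains about duplicate content types (syncdb pre-populates that table), the usual workaround is to exclude them at dump time with dumpdata --exclude contenttypes.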
Shawn
Hi All,
I have an existing Django app with lots of data in it. For reasons
beyond my control, this app needs to move from Postgres to MySQL.
What's the best way of going doing this?
cheers,
Chris
--
Simplistix - Content Management, Batch Processing & Python Consulting
- http://w
16 matches