Hi Ivo,

On 11/20/2011 09:29 PM, Ivo Brodien wrote:
> hi,
>
> on MySQL I just do this to copy a MySQL DB to my local machine, which
> runs in MAMP on Mac OS X.
>
> Something similar should be possible with PostgreSQL as well.
>
> ssh USERNAME@SERVER "mysqldump -u USERNAME_REMOTE -p'DB_PASSWORD_REMOTE' --single-transaction DB_NAME | gzip -c" | gunzip -c | /Applications/MAMP/Library/bin/mysql -u USERNAME_LOCAL -p'DB_PASSWORD_LOCAL' DB_NAME
>
> Since the password is provided on the command line it should only be run
> on a system with a single user.
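For reference, the PostgreSQL analogue of this pipe would use pg_dump and psql; the sketch below only composes and prints the command (USERNAME@SERVER, USERNAME_REMOTE, USERNAME_LOCAL, and DB_NAME are the same placeholders as in the quoted one-liner, and the echo makes it a dry run):

```shell
#!/bin/sh
# Sketch (assumption, not from the thread): PostgreSQL analogue of the
# MySQL one-liner above -- pg_dump on the remote side, psql locally.
REMOTE='USERNAME@SERVER'
DB='DB_NAME'
CMD="ssh $REMOTE \"pg_dump -U USERNAME_REMOTE $DB | gzip -c\" | gunzip -c | psql -U USERNAME_LOCAL $DB"
# Printed as a dry run; pipe the output to sh (or drop the echo) to run it.
echo "$CMD"
```

Unlike the MySQL version there is no password on the command line: psql and pg_dump read credentials from a ~/.pgpass file instead.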
Thanks a lot. Yes, this is definitely an option if source and target run
the same database engine.

The reason why I would like to avoid this approach, and why I tried to
stick with Django commands, is that the approach above will not work if I
want to dump a PostgreSQL database to a test machine which has only SQLite
or MySQL. Therefore my motivation was to use dumpdata and loaddata of
manage.py.

Concerning the MySQL approach above: I think it is not necessary to
provide the password on the command line. If I remember well, one can
create a temp file with permissions 600, store the user/password in it,
and specify it on the command line with --defaults-file=tmp_my.cnf

However, as I don't use PostgreSQL that much, I don't know whether it can
do the same (specify a non-default config file).

> On Nov 20, 2011, at 19:09, Gelonida N wrote:
>
>> On 11/20/2011 06:37 PM, Gelonida N wrote:
>>> Hi,
>>>
>>> For debugging purposes I'd like to clone the database of one machine
>>> to another one (both are not necessarily using the same database
>>> engine).
>>>
>>> What would be the suggested procedure?
>>>
>>> What I tried, but what failed, is the following:
>>>
>>> 1.) On the remote machine:
>>> ./manage.py dumpdata > dumpall.json
>>>
>>> 2.) Copy dumpall.json to the target machine.
>>>
>>> 3.) On the machine to be cloned to:
>>> ./manage.py flush
>>> ./manage.py loaddata dumpall.json
>>>
>>> The error that I get is:
>>>
>>> IntegrityError: duplicate key value violates unique constraint
>>> "django_content_type_app_label_key"
>>>
>>> The only ideas that I have are:
>>> - delete all conflicting tables on the target host before applying my
>>>   fixture (I would do this with ./manage.py shell or with a tiny
>>>   script)
>>> - dump all apps except the ones causing problems.
>>>
>>> Thanks a lot in advance for any suggestion.
>>>
>> What I did now is the following (this time the source and destination
>> databases were both PostgreSQL):
>>
>> 1.)
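The temp-file idea above can be sketched as follows; the [client] section is the standard MySQL option-file group, while the file location produced by mktemp is just an example:

```shell
#!/bin/sh
# Sketch of the 600-permission option file described above. mktemp
# creates the file with mode 600, so only the owner can read the
# password and it never shows up on the command line or in `ps` output.
set -e
CNF="$(mktemp /tmp/tmp_my.XXXXXX)"
cat > "$CNF" <<'EOF'
[client]
user = USERNAME_REMOTE
password = DB_PASSWORD_REMOTE
EOF
# Example invocation (not run here); note that --defaults-file must be
# the first option given to MySQL programs:
#   mysqldump --defaults-file="$CNF" --single-transaction DB_NAME
```

To answer the PostgreSQL side of the question: the rough equivalent is a ~/.pgpass file (host:port:database:user:password, also mode 600), which psql and pg_dump pick up automatically.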
>> On the remote machine:
>> ./manage.py dumpdata > dumpall.json
>>
>> 2.) Copy dumpall.json to the target machine.
>>
>> 3.) On the machine to be cloned to:
>> ./manage.py sqlflush
>> Copy the TRUNCATE statement from the output to the clipboard.
>> ./manage.py dbshell
>> Enter the password, paste the TRUNCATE statement, and quit the db shell.
>>
>> ./manage.py loaddata dumpall.json
>>
>> This is working, but not that nice to automate, as the SQL snippet that
>> I have to extract from sqlflush depends on the db engine, and as I have
>> to make sure to create a .pgpass entry for postgres in order to avoid
>> the password prompt.

--
You received this message because you are subscribed to the Google Groups
"Django users" group.
To post to this group, send email to django-users@googlegroups.com.
To unsubscribe from this group, send email to
django-users+unsubscr...@googlegroups.com.
For more options, visit this group at
http://groups.google.com/group/django-users?hl=en.
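The manual flush-and-reload cycle quoted above can be scripted. This is only a sketch based on the thread (it assumes the dumpall.json fixture and, for PostgreSQL, a ~/.pgpass entry with mode 600 so psql never prompts), and it defaults to a dry run:

```shell
#!/bin/sh
# Sketch: automate the sqlflush / dbshell / loaddata steps above.
# Defaults to a dry run that only prints the commands; set DRY_RUN=
# (empty) to actually execute them.
set -e
DRY_RUN="${DRY_RUN-1}"
run() {
    if [ -n "$DRY_RUN" ]; then echo "+ $1"; else sh -c "$1"; fi
}
# sqlflush prints the engine-specific TRUNCATE statements, and dbshell
# pipes them straight into the database client -- no copy/paste and no
# interactive shell needed.
run './manage.py sqlflush | ./manage.py dbshell'
run './manage.py loaddata dumpall.json'
```

Piping sqlflush into dbshell sidesteps the engine-specific SQL problem, since sqlflush already emits the right statements for whatever backend is configured.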