On 12/30/12 10:40 AM, Wolfgang Denk wrote:
Dear Marco van Wieringen,
In message <loom.20121228t234109-...@post.gmane.org> you wrote:
I have no idea what to try next. Any pointers? All help welcome.
Maybe try the solution that worked in 2010, available here
http://mtu.net/~jpschewe/blog/2010/06/migrating-bacula-from-mysql-to-postgresql/
which is far more up to date than anything in the current docs, which talk
about a 1.x version from 8+ years ago.
Indeed - the document says:
This process was tested using the following software
versions:
Linux Mandrake 10/Kernel 2.4.22-10 SMP
Mysql Ver 12.21 Distrib 4.0.15, for mandrake-linux-gnu (i586)
PostgreSQL 7.3.4
Bacula 1.34.5
No surprise this is breaking down.
Yes, a 2.4 kernel and a 7-series PostgreSQL should ring a bell. All from
long, long ago.
B.t.w., this info comes from the wiki, which most of the time has much more
up-to-date docs than the official docs. We can only do so much; nobody
seems to be interested in updating the docs, and there are more interesting
things to explore.
With your description I get no errors, but I'm still not through.
Observations:
- With the options used here, mysqldump needs much longer, and it
consumes an awful lot of memory - in my case close to 14 GB. I had
to add extra swap space because the 4 GB RAM / 4 GB swap I had on my
old DB server did not work.
Try the --quick (-q) option:
http://dev.mysql.com/doc/refman/5.5/en/mysqldump.html#option_mysqldump_quick
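For example, a minimal sketch (the database name "bacula" and the
credentials are assumptions, adjust to your setup). Note that --quick is
normally on by default via --opt, but if the recipe you followed passes
--skip-opt for PostgreSQL compatibility, that turns --quick off as well,
so add it back explicitly:

    # --quick streams rows to the client one at a time instead of
    # buffering each whole table in memory, which is what eats the RAM
    mysqldump --skip-opt --quick --compatible=postgresql \
        -u bacula -p bacula > bacula.sql

The MySQL manual warns that --compatible=postgresql does not guarantee
full compatibility; it mostly adjusts quoting and syntax.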
- The generated SQL dump is about twice as large as the dumps I got
before.
That could be because it's using a compatible format. It's probably also
not the smartest way of inserting data into PostgreSQL, e.g. it is not
using PostgreSQL's COPY construct etc.
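As a rough sketch of the COPY idea (the table and database names are
assumptions, and real data needs a conversion step: mysql prints NULL as
the literal string "NULL" while COPY expects \N):

    # dump one table as tab-separated rows, then bulk-load it with a
    # single client-side \copy instead of one INSERT per row
    mysql -u bacula -p -N -B -e 'SELECT * FROM File' bacula > /tmp/File.tsv
    psql -d bacula -c "\copy file from '/tmp/File.tsv'"
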
Maybe you should look here and see if there is something better than
mysqldump etc.:
http://wiki.postgresql.org/wiki/Converting_from_other_Databases_to_PostgreSQL
- The import into PostgreSQL is terribly slow. It spits out a large
  number of lines "INSERT 0 1" on the terminal (7.7 million such
  lines so far, after about 20 hours of run time); but my SQL dump
  file contains more than 72 million 'INSERT INTO' lines. Extrapolating
  from that (72 / 7.7 x 20 h), the import will take at least 190 hours,
  or about 8 days. And this is only the smaller of my Bacula databases...
  Is there something going wrong? I guess there is no way to check
  before the process completes, right?
Did you tune the PostgreSQL database at all? With the amount of data you
are trying to insert, you first need to tune your PostgreSQL database with
bigger memory buffers etc.; the PostgreSQL defaults are like something
from the 80's. Also make sure the box running the database has a proper
amount of RAM for this amount of data. On top of that, it is probably
complaining about doing a log (WAL) switch every x seconds (or maybe even
several per second), also because the defaults don't work too well for
these enormous numbers of inserts.
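For instance, a rough postgresql.conf sketch for a one-off bulk load (the
values are assumptions, scale them to your RAM; checkpoint_segments
applies to the pre-9.5 releases current at the time):

    shared_buffers = 1GB            # the stock defaults are tiny
    maintenance_work_mem = 512MB    # speeds up index builds afterwards
    checkpoint_segments = 64        # fewer WAL ("log") switches during the load
    wal_buffers = 16MB
    synchronous_commit = off        # acceptable risk for a throwaway migration

Running the whole dump in a single transaction (psql --single-transaction
-f dump.sql) also avoids paying a commit for every one of those 72 million
rows.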
If you don't know what to do, you could start with pgtune, which at least
may lead to somewhat saner settings than the defaults. But it seems
you need to do some serious tuning to get this running.
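A sketch of running it (the paths are assumptions, and the option names
are those of Greg Smith's original pgtune script):

    pgtune -i /var/lib/pgsql/data/postgresql.conf \
           -o /var/lib/pgsql/data/postgresql.conf.pgtune

Review the generated file before swapping it in.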
The last option would be to look into tools like this one:
http://www.enterprisedb.com/products-services-training/products-overview/postgres-plus-solution-pack/migration-toolkit
I think it's free to use, and it will probably work for migrating to a
plain PostgreSQL database too (as EnterpriseDB is just a commercial
version of PostgreSQL).
Marco