You can search Google for "mysql2pg", e.g.:
https://www.google.com/search?q=mysql2pg&ie=utf-8&oe=utf-8&aq=t&rls=org.mozilla:en-US:official&client=firefox-a
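If you want to stay with the mysqldump route described below, here is a minimal sketch of that pipeline (the database and file names are assumptions, not from the UMLS distribution; the last command just demonstrates what the sed step does to a single line):

```shell
# Hypothetical end-to-end sketch of the mysqldump -> sed -> psql recipe.

# 1) Dump in (approximately) PostgreSQL-compatible SQL:
#    mysqldump -v --compatible=postgresql umls_test > dumpfile.sql

# 2) MySQL escapes embedded quotes as \' while PostgreSQL wants '' :
#    sed -i "s/\\\\'/''/g" dumpfile.sql

# 3) Load the result:
#    psql -d umls -f dumpfile.sql

# Demo of what step 2 does, in isolation, on one INSERT line:
printf "%s\n" "INSERT INTO t VALUES ('it\\'s');" | sed "s/\\\\'/''/g"
# -> INSERT INTO t VALUES ('it''s');
```

Note this only fixes quote escaping; type differences (e.g. MySQL's backtick identifiers, TINYINT booleans, zero dates) usually still need hand editing or a dedicated converter.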


On Sat, Jan 12, 2013 at 7:54 AM, Ken Tanzer <ken.tan...@gmail.com> wrote:

> I'm wondering if anyone can point me towards a good method for moving
> mysql data into Postgres?  I've done some web searching, and found
> documentation from various years, but it's not clear what's current and
> what works best.  Much of what I found seems to be flame war material (why
> Postgres is better), or is both old and seemingly involved and complex.
>
> Here's the fuller description of what I'm trying to do.  I've got a
> dataset (a UMLS Metathesaurus subset) that I need to get into a
> Postgres database.  It's all reference data, and so will be read-only.
> There are no functions or logic involved. I anticipate having to update it at
> least quarterly, so I'd like to get to a well-grooved import process.
>
> The data as distributed can be had in Oracle or Mysql formats.  (I already
> gave them my two cents to include Postgres.)  I did see some information
> about modifying the Mysql distribution files to make them
> Postgres-compatible, but I thought (perhaps foolishly) it would be easier
> to bring them into Mysql, and from there export them to Postgres.
>
> A recurring idea seemed to be to use:
>
> mysqldump -v --compatible=postgresql umls_test > dumpfile.sql
>
> followed by
>
> sed -i "s/\\\'/\'\'/g" dumpfile.sql
>
>
> but that didn't bring me much success.  I figure this has to be a fairly
> common need, and hopefully by 2013 there's an easy solution.  Thanks in
> advance!
>
> Ken
>
> --
> AGENCY Software
> A data system that puts you in control
> http://agency-software.org/
> ken.tan...@agency-software.org
> (253) 245-3801
>
>