On Wed, Mar 8, 2017 at 8:45 AM, vinny <vi...@xs4all.nl> wrote:

> On 2017-03-08 10:13, Günce Kaya wrote:
>
>> Hi all,
>>
>> I want to import the contents of a CSV file into a table via a bash
>> script, without creating a temporary table, and I also want to skip
>> some columns in the CSV file (for instance, the CSV file has 12
>> columns and the main table has only 2 columns; if possible I would
>> use only 2 columns from the CSV file). Is there any way to do it?
>>
>> Regards,
>>
>> --
>>
>> Gunce Kaya
>>
>
> This is more a programming question than a database question, and there
> are many possible solutions.
> Do *not*, whatever you do, try to write your own piece of code to parse
> the CSV. There are lots of unexpected ways a CSV file can differ slightly
> from what you expect, and figuring all of those out is a waste of time.
> Embedded commas are just one example; there can also be embedded
> newlines, line breaks, UTF-8 escape characters, etc.
>
> Personally I'd go the Python route because it's simple and
> straightforward, but anything you are comfortable with will do.
> If you are going to install additional software to do this, then remember
> that you'll need that same software again whenever you re-run the import,
> or when you move this code to a different server.


I agree. I went with a "pure BASH" approach because it is what the user
asked for & I wasn't sure what language she might be comfortable with. I
use PERL a lot. Or maybe I should say that I abuse PERL a lot. Such as a
PERL script which writes out another PERL script, based on some input files
& parameters, then runs the just-written PERL script, which does the load
into a PostgreSQL database (multiple tables). Ya, a bit perverted.
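
That said, a minimal sketch of the Python route vinny mentions might look
something like the script below. The table name, column names, CSV column
positions, and connection string are just placeholders, and it assumes
psycopg2 is installed; adjust all of those to the real schema.

#!/usr/bin/env python3
"""Hypothetical sketch: load two columns of a 12-column CSV into a table."""
import csv
import io

import psycopg2

CSV_PATH = "input.csv"          # placeholder: path to the 12-column CSV
TARGET_TABLE = "main_table"     # placeholder: table with just the 2 columns
WANTED = (0, 3)                 # placeholder: CSV column positions to keep


def main():
    # Re-write only the wanted columns into an in-memory CSV buffer,
    # letting the csv module handle quoting, embedded commas, newlines, etc.
    buf = io.StringIO()
    writer = csv.writer(buf)
    with open(CSV_PATH, newline="") as f:
        reader = csv.reader(f)
        next(reader)            # skip the header row, if there is one
        for row in reader:
            writer.writerow(row[i] for i in WANTED)
    buf.seek(0)

    # COPY the filtered rows straight into the table -- no temporary table.
    with psycopg2.connect("dbname=mydb") as conn, conn.cursor() as cur:
        cur.copy_expert(
            f"COPY {TARGET_TABLE} (col_a, col_b) FROM STDIN WITH (FORMAT csv)",
            buf,
        )


if __name__ == "__main__":
    main()

Run it as "python3 load_csv.py" on the server; the connection context
manager commits on success, so the rows land in the table in one shot.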

-- 
"Irrigation of the land with seawater desalinated by fusion power is
ancient. It's called 'rain'." -- Michael McClary, in alt.fusion

Maranatha! <><
John McKown
