On 2009-03-19 13:40, Tim Chase wrote:
>> DB-API 2.0 has cursor.executemany() to make this differentiation
>> at the API level. mxODBC will lift this requirement in the next
>> version, promised :-)
>
> Glad to hear it... will executemany() take an arbitrary iterable? My
> (albeit somewhat-antiquated) version balked at anything that
> wasn't a list/tuple.
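For reference, a minimal sketch of the executemany() pattern under discussion; the connection object, table and column names are made up, and the '%s' placeholder style is an assumption (mxODBC would use '?'):

    # Sketch only: conn is an already-open DB-API 2.0 connection and
    # people(name, age) an existing table; '%s' placeholders assumed.
    sql = "INSERT INTO people (name, age) VALUES (%s, %s)"
    rows = [("Alice", 30), ("Bob", 25)]    # one parameter tuple per row

    cur = conn.cursor()
    cur.executemany(sql, rows)             # runs the INSERT once per tuple
    conn.commit()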
On 2009-03-19 00:30, Tim Chase wrote:
> Bruno Desthuilliers wrote:
>> Tim Chase wrote:
>>> (if your columns in your CSV happen to match the order of your INSERT
>>> statement, you can just use
>>>
>>> execute(sql, tuple(row))
>>
>> Or more simply:
>>
>> cursor.execute(sql, row)
>
> That's always annoyed me with the mxODBC drivers I've used -- they
> required a list/tuple rather than an arbitrary sequence.
> You have to know the original encoding (I mean, the one used for the csv
> file), else there's nothing you can do. Then it's just a matter of
> decoding (to unicode) then encoding (to utf8), ie (if your source is in
> latin1):
>
> utf_string = latin1_string.decode("latin1").encode("utf8")
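Spelling that out for a whole row, a sketch only, in the Python 2 idiom of this thread; the latin-1 source encoding is an assumption:

    # Sketch: convert every cell of a row from the source encoding
    # (latin-1 assumed) to utf-8 before the INSERT, so pgdb does not
    # reject the non-utf-8 bytes.
    def to_utf8(cell, source_encoding="latin1"):
        # decode to unicode first, then encode to what the db expects
        return cell.decode(source_encoding).encode("utf8")

    row = ["1", "23", "34", "caf\xe9"]     # '\xe9' is latin-1 for 'e' acute
    utf8_row = [to_utf8(cell) for cell in row]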
rewonka wrote:
(snip)
> Now I'm stuck when I tried to put it into the db,
> because I have some cell that is in some kind of unicoded text,

You mean "encoded in something else than utf8"?

> and I'm looking for a solution how to put this into the db (my db is in
> utf-8 format).
(snip)
> but something binary in a cell ...
Tim Chase wrote:
> sql = ''' INSERT INTO table (column1, column2, ...) VALUES ( %s,
> %s, ); '''
> for row in rows:
>     connection.cursor.execute(sql % (row[0], row[1],))
>     connection.corsur.commit()
(snip)
The first step is to use the database's quoting to prevent problems
where miscreant ...
sql = ''' INSERT INTO table (column1, column2, ...) VALUES ( %s,
%s, ); '''
for row in rows:
    connection.cursor.execute(sql % (row[0], row[1],))
    connection.corsur.commit()

but when there is something binary in a cell, pgdb says it is not in
utf-8 format, or something like this.
I know it's a newbie question ...
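For what it's worth, a corrected sketch of that loop using parameter binding, as suggested elsewhere in the thread, instead of building the SQL with the % operator; table and column names are placeholders:

    # connection is an open pgdb connection, rows the parsed rows
    sql = "INSERT INTO table1 (column1, column2) VALUES (%s, %s)"

    cur = connection.cursor()
    for row in rows:
        cur.execute(sql, (row[0], row[1]))  # driver quotes/escapes values
    connection.commit()                     # commit() is on the connection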
Hi,
I have a problem: I would like to process a file into PSQL, but in
the file the delimiter char is ':',
and I have the same character inside the text field also.
Something like this:
text = '1:23:34:"sample: text":" something"'
If I use text.split(':')
it will be ['1', '23', '34', '"sample', ' text"', '" something"'] ...
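One way to parse such a line without splitting inside the quotes is the csv module with ':' as the delimiter; a minimal sketch, purely illustrative and not necessarily what was suggested in the replies:

    import csv

    line = '1:23:34:"sample: text":" something"'
    # quotechar='"' makes the ':' inside "..." part of the field,
    # not a delimiter
    for row in csv.reader([line], delimiter=':', quotechar='"'):
        print(row)   # ['1', '23', '34', 'sample: text', ' something']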