Thank you.
On Wed, May 10, 2017 at 12:54 PM, Adrian Klaver wrote:
> On 05/10/2017 08:48 AM, Sandeep Gupta wrote:
>>
>> Currently, the postgres database by default has SQL_ASCII encoding.
>>
>> psql -p 5771 postgres -l
>> List of databases
>>    Name    |  Owner  | Encoding |
On 05/10/2017 08:48 AM, Sandeep Gupta wrote:
Currently, the postgres database by default has SQL_ASCII encoding.
psql -p 5771 postgres -l
List of databases
   Name    |  Owner  | Encoding  | Collate | Ctype | Access privileges
-----------+---------+-----------+---------+-------+-------------------
On 5/10/17 11:48, Sandeep Gupta wrote:
> Currently, the postgres database by default has SQL_ASCII encoding.
> Is it possible to start the postgres database with UTF-8 encoding, instead
> of modifying it later?
This is done when initdb is run, with the --locale and/or --encoding option.
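For example, a rough sketch of a fresh cluster initialized that way (the data
directory and locale below are placeholders, not taken from this thread):

  initdb -D /path/to/data --encoding=UTF8 --locale=en_US.UTF-8
  pg_ctl -D /path/to/data start
  psql -l        # the template databases should now report UTF8

Databases created afterwards inherit that encoding unless CREATE DATABASE
overrides it.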
--
Peter Eisentraut
Currently, the postgres database by default has SQL_ASCII encoding.
psql -p 5771 postgres -l
List of databases
   Name    |  Owner  | Encoding  | Collate | Ctype | Access privileges
-----------+---------+-----------+---------+-------+-------------------
 postgres  | sandeep
On 2012-12-06 17:30, Adrian Klaver wrote:
On 12/06/2012 07:20 AM, Doug Kunzman wrote:
I'm trying to support an automatic character encoding to UNICODE so Java
strings with non-ASCII characters can be stored in a table.
I've edited my postgresql.conf with the following setting,
PGCLIENTENCODING=UNICODE
On 12/06/2012 07:20 AM, Doug Kunzman wrote:
> I'm trying to support an automatic character encoding to UNICODE so Java
> strings with non-ASCII characters can be stored in a table.
>
> I've edited my postgresql.conf with the following setting,
> PGCLIENTENCODING=UNICODE
>
> And I'm getting this
I'm trying to support an automatic character encoding to UNICODE so Java
strings with non-ASCII characters can be stored in a table.
I've edited my postgresql.conf with the following setting,
PGCLIENTENCODING=UNICODE
And I'm getting this error message,
FATAL: unrecognized configuration parameter
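For what it's worth, PGCLIENTENCODING is a libpq environment variable, not a
postgresql.conf parameter; inside postgresql.conf the setting is called
client_encoding. A sketch of the usual options (values are assumed, and the
database itself needs a real encoding such as UTF8 rather than SQL_ASCII for
any conversion to happen):

  # in postgresql.conf
  client_encoding = 'UTF8'

  # or in the client's environment before starting the application
  export PGCLIENTENCODING=UTF8

  # or per session
  SET client_encoding TO 'UTF8';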
On 12/08/11 7:54 PM, Bruce Clay wrote:
Is there a "proper" encoding type that I should use to load the word lists so
they can be interoperable with the WordNet dataset that happily uses the UTF8 encoding?
some of your input data may be in other encodings, not UTF8, for
instance, LATIN1. if
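A hedged sketch of handling that case (file and table names below are invented
for illustration): either normalize the word lists to UTF-8 before loading, or
tell the server what encoding the incoming file uses and let it convert:

  # convert a LATIN1 word list to UTF-8 up front
  iconv -f LATIN1 -t UTF-8 wordlist_latin1.txt > wordlist_utf8.txt

  # or have the server convert during the load
  PGCLIENTENCODING=LATIN1 psql -d wordnet -c "COPY words(word) FROM STDIN" < wordlist_latin1.txt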
Sorry for the duplicate postings. I have only received one reply so far and
that was a suggestion to post to this forum.
I am trying to build a database to support natural language processing from a
variety of data files posted on the internet. Many of them are identified as
using UTF-8 encoding
"Daniel Verite" <[EMAIL PROTECTED]>
wrote on 07-12-2005 12:30:17:
> It should depend on the locale. Can you tell the results of
> `show lc_ctype` and `show lc_collate`?
I've fried one of the disks. When the new one arrives,
I'll send you the results (will be sometime next week).
Sincerely,
Rich
"Daniel Verite" <[EMAIL PROTECTED]> writes:
> Richard van den Berg wrote:
>> During the
>> pg_restore, it complained about failing to create several indexes
>> because of duplicates. Inspecting these cases showed that the bogus
>> characters are to blame. It seems that these characters (that
Richard van den Berg wrote:
> During the
> pg_restore, it complained about failing to create several indexes
> because of duplicates. Inspecting these cases showed that the bogus
> characters are to blame. It seems that these characters (that are
> probably not defined in LATIN1) are now t
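A hedged way to see which rows collide before re-running the index builds
(table and column names here are invented, not from the thread):

  SELECT email, count(*)
  FROM   customers
  GROUP  BY email
  HAVING count(*) > 1;

Rows reported by that query have to be cleaned up (or the bogus characters
repaired) before the unique index can be created again.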
I just upgraded our database from PostgreSQL 7.4.7 to 8.1.0. Our
original character encoding was set to the default of 7.4, being
SQL_ASCII. Along the way I realized our data does have non-ASCII
strings, so I decided to use this upgrade to fix this setting. Our new
8.1.0 database now has the LATIN1
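A dump/restore with the encodings made explicit is the usual way to carry out
such a move; a rough sketch, assuming the stored bytes really are LATIN1
(database names and port are placeholders):

  # label the dump as LATIN1 so the restore knows how to read the bytes
  PGCLIENTENCODING=LATIN1 pg_dump -p 5432 olddb > olddb.sql

  # create the target with a declared encoding and restore into it
  createdb -E LATIN1 newdb
  psql -d newdb -f olddb.sql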
Hello,
I have a problem using latin2 in PostgreSQL 8 (beta2) with Delphi.
With client_encoding = server_encoding = latin2 set for the database
there is no way I can post special characters
to the database; I tried with BDE, dbExpress, Delphi 5 & 7 and
usually receive: ignoring unconvertible UTF-8 character
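That notice is produced while the server converts UTF-8 into LATIN2, which
suggests the session's client_encoding ended up as UNICODE even though latin2
was configured (some driver layers force it). A hedged check over the same
connection the Delphi components use:

  SHOW server_encoding;
  SHOW client_encoding;
  -- if the driver actually sends latin2 bytes, say so explicitly:
  SET client_encoding TO 'LATIN2';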
On Mon, 8 Mar 2004, Markus Wollny wrote:
> Hi!
>
> As ODBC seems to be blissfully unaware of any character encodings
> whatsoever, so were we - our databases are encoded in SQL_ASCII,
> although we have stored german special chars (ÄÖÜäöü and ß), and from
> what I have read so far, these are st
Hi!
We've been using PostgreSQL (currently 7.4) via ODBC with ColdFusion
until now and didn't experience any problems. But now we want to move
from ColdFusion 4.5 to ColdFusion MX, thus abandoning ODBC and migrating
to JDBC.
As ODBC seems to be blissfully unaware of any character encodings
whatsoever
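Before recoding anything it can help to find out what the SQL_ASCII databases
actually contain; a hedged sketch (the database name is a placeholder):

  pg_dump --data-only mydb > data.sql
  # valid UTF-8 already, or single-byte LATIN1/WIN1252 data?
  iconv -f UTF-8 -t UTF-8 data.sql > /dev/null && echo "valid UTF-8" || echo "not UTF-8"

If the data turns out to be LATIN1, the dump can be recoded with iconv (or
restored with client_encoding set to LATIN1) into a properly encoded database.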
Tom Lane wrote:
>"Johann Woeckinger" <[EMAIL PROTECTED]> writes:
>> But when using a Tcl-based interface (e.g. pgaccess or home-made Tcl
>> based programs) to insert such characters into a table, they are not correctly
>> displayed on queries by use of psql - they appear as two 'unreadable'
"Johann Woeckinger" <[EMAIL PROTECTED]> writes:
> But when using a Tcl-based interface (e.g. pgaccess or home-made Tcl
> based programs) to insert such characters into a table, they are not correctly
> displayed on queries by use of psql - they appear as two 'unreadable'
> characters (they appear
I use PostgreSQL 7.0.3 on Linux (i386) platform, Tcl 8.3 is installed.
When entering national special characters (e.g. accented German
characters) by use of psql into a database, all seems to be OK; the special
characters appear correctly in queries (the locale seems to work well in this case).
But
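The "two unreadable characters" symptom usually means the Tcl side (which
works in UTF-8 internally since Tcl 8.1) sent multi-byte UTF-8 into a
single-byte database, so each accented character is stored as two bytes. A
hedged way to confirm what was stored (table and column are invented):

  SELECT name, length(name) FROM people;
  -- two "unreadable" characters per umlaut in psql, and length() counting
  -- 2 where one character is expected, both point at stored UTF-8 byte pairs.

One usual remedy is to convert on the Tcl side before the insert, e.g. with
[encoding convertto iso8859-1 $s], or to set the client encoding so the server
does the conversion.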