[no subject]

2020-07-12 Thread Anto Aravinth
Hello All, I have the following table:

postgres=# \d so_rum;
                Table "public.so_rum"
 Column | Type    | Collation | Nullable | Default
--------+---------+-----------+----------+---------
 id     | integer

How to do phrase search?

2020-07-10 Thread Anto Aravinth
Hello, I have the following table: so2, which has the following column details: id, title, posts, body (tsvector). And I created the index on the following: "so2_pkey" PRIMARY KEY, btree (id) "body" gin (body) And I wanted to query on my tsvector with the string: `Is it possible to toggle
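
A minimal sketch of a phrase search against that setup (table and column names are taken from the message above; the query string is just an example):

-- phraseto_tsquery turns the string into a positional query: adjacent lexemes,
-- with <N> gaps where stop words such as "is"/"it"/"to" are dropped
SELECT id, title
FROM so2
WHERE body @@ phraseto_tsquery('english', 'is it possible to toggle');

-- the <-> / <N> operators can also be written by hand in to_tsquery;
-- 'possible <2> toggle' means "toggle two positions after possible"
SELECT id, title
FROM so2
WHERE body @@ to_tsquery('english', 'possible <2> toggle');

Both queries can use the GIN index on body, since they match against the stored tsvector column.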

Re: Building a notification system.

2018-07-15 Thread Anto Aravinth
On Mon, Jul 16, 2018 at 8:02 AM, Christopher Browne wrote: > On Sun, Jul 15, 2018, 5:30 AM Anto Aravinth, > wrote: > >> Hello Everyone, >> >> >> I'm playing around with postgresql with SO datasets. In the process, I >> have dumped 60M questions dat

Re: Building a notification system.

2018-07-15 Thread Anto Aravinth
15, 2018, David G. Johnston > wrote: > >> On Sunday, July 15, 2018, Anto Aravinth >> wrote: >>> >>> I'm not sure, how to get started with this. Read about NOTIFY: >>> https://www.postgresql.org/docs/current/static/sql-notify.html >>>
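
A minimal LISTEN/NOTIFY sketch along the lines of that documentation page (the questions table, channel name, and trigger names are assumptions for illustration, not part of the thread):

-- send the edited question's id on the 'question_edited' channel
CREATE OR REPLACE FUNCTION notify_question_edit() RETURNS trigger AS $$
BEGIN
  PERFORM pg_notify('question_edited', NEW.id::text);
  RETURN NEW;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER question_edit_notify
  AFTER UPDATE ON questions
  FOR EACH ROW EXECUTE PROCEDURE notify_question_edit();

-- in the application's database session:
LISTEN question_edited;

Only sessions currently listening receive the payload; notifications for users who are offline would still need to be persisted in a table so they can be shown at next login.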

Building a notification system.

2018-07-15 Thread Anto Aravinth
Hello Everyone, I'm playing around with postgresql with SO datasets. In the process, I have dumped 60M questions' data into PostgreSQL. I'm trying to build a notification system on top of this, so that when a user edits a question, I need to show a notification to the user when he/she logs in

Index Gin Creation is taking long time..

2018-06-28 Thread Anto Aravinth
Hello, I'm trying to create an index: create index search_idx on so2 using gin (to_tsvector('english',posts)); Looks like it's been running for at least 8 hours :( In total I have 47M records in so2. Not sure why it's taking so long. Any idea or tips to debug while the index creation is going on? T
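
One knob that often matters for large GIN builds is maintenance_work_mem; a hedged sketch using the same index definition as above (the 2GB figure is an assumption, size it to the available RAM):

-- session-local setting; gives the GIN build more working memory
SET maintenance_work_mem = '2GB';
CREATE INDEX search_idx ON so2 USING gin (to_tsvector('english', posts));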

Re: Using COPY to import large xml file

2018-06-26 Thread Anto Aravinth
things like vacuum etc. Really loving postgres! Thanks, Anto. On Tue, Jun 26, 2018 at 3:40 AM, Tim Cross wrote: > > Anto Aravinth writes: > > > Thanks a lot. But I do got lot of challenges! Looks like SO data contains > > lot of tabs within itself.. So tabs delimiter

Re: Using COPY to import large xml file

2018-06-25 Thread Anto Aravinth
On Mon, Jun 25, 2018 at 8:54 PM, Anto Aravinth wrote: > > > On Mon, Jun 25, 2018 at 8:20 PM, Nicolas Paris > wrote: > >> >> 2018-06-25 16:25 GMT+02:00 Anto Aravinth : > >>> Thanks a lot. But I do got lot of challenges! Looks like SO data >

Re: Using COPY to import large xml file

2018-06-25 Thread Anto Aravinth
On Mon, Jun 25, 2018 at 8:20 PM, Nicolas Paris wrote: > > 2018-06-25 16:25 GMT+02:00 Anto Aravinth : > >> Thanks a lot. But I do got lot of challenges! Looks like SO data contains >> lot of tabs within itself.. So tabs delimiter didn't work for me. I thought >> I
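
One way around delimiters that occur inside the data is COPY's CSV mode, where quoted fields may contain the delimiter; a sketch assuming a pre-converted input file and a staging table (the table name and path are placeholders):

-- fields containing commas, tabs or newlines must be quoted in the input file
COPY so_staging (id, title, posts)
FROM '/path/to/posts.csv'
WITH (FORMAT csv, HEADER true, QUOTE '"', ESCAPE '"');

Server-side COPY reads the file as the database server user; from psql, \copy with the same options reads the file on the client side instead.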

Re: Using COPY to import large xml file

2018-06-25 Thread Anto Aravinth
do a thorough serialization of my data into something that COPY can understand. On Mon, Jun 25, 2018 at 8:09 AM, Tim Cross wrote: > > > On Mon, 25 Jun 2018 at 11:38, Anto Aravinth > wrote: > >> >> >> On Mon, Jun 25, 2018 at 3:44 AM, Tim Cross wrote: >> >

Re: Using COPY to import large xml file

2018-06-24 Thread Anto Aravinth
On Mon, Jun 25, 2018 at 3:44 AM, Tim Cross wrote: > > Anto Aravinth writes: > > > Thanks for the response. I'm not sure, how long does this tool takes for > > the 70GB data. > > > > I used node to stream the xml files into inserts.. which was very slow..

Re: Using COPY to import large xml file

2018-06-24 Thread Anto Aravinth
COPY command, as suggested on the internet. Definitely will try the code and let you know.. but it looks like it uses the same INSERT, not COPY.. interesting to see if it runs quickly on my machine. On Sun, Jun 24, 2018 at 9:23 PM, Adrien Nayrat wrote: > On 06/24/2018 05:25 PM, Anto Aravinth wrote:

Using COPY to import large xml file

2018-06-24 Thread Anto Aravinth
Hello Everyone, I have downloaded the Stack Overflow posts XML (contains all SO questions to date).. the file is around 70GB.. I want to import the data from that XML into my table.. is there a way to do so in Postgres? Thanks, Anto.
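
A hedged sketch of the COPY-based route discussed in the replies above: convert the XML to a delimited format with an external script and stream it into the table (the script name and column list here are placeholders, and FROM PROGRAM requires superuser rights on the server):

-- stream the converted dump straight into the table without a temporary file
COPY so2 (id, title, posts)
FROM PROGRAM 'python3 xml_to_csv.py Posts.xml'
WITH (FORMAT csv);

Without superuser rights, psql's client-side equivalent is \copy ... from program '...' with (format csv).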