Thank you all very much for this information.
On Sat, Sep 22, 2018 at 12:38 AM Alban Hertroys wrote:
Considering how hard you try to get rid of duplicates, I'm quite convinced that
you're at least short a few join conditions. Getting rid of duplicates early
has the added benefit of having to aggregate fewer rows, which should
drastically reduce the amount of work left for the sort and unique steps.
> I can see a lot of max(string-field) (for instance, LastName,
> MiddleName, FirstName).
> Wild guess: completely broken design, but I don't know your application
> and use-case for that.
> Again, as I said already, I think this is a case for an in-depth
> consultation.

My thoughts exactly.
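To make the "short a few join conditions" point concrete: if table2 has more than one
row per "vchSubmittersCode", every extra match multiplies the join output, and DISTINCT
(or MAX() per column) then has to collapse those copies again. A minimal sketch of the
kind of rewrite being suggested; the second join column is purely an assumption and
stands in for whatever really completes the key:

    select rec."vchFileName",
           rec."vchFundUnitPrice",
           sub_head."vchValuationDate"
    from table1 rec
    join table2 sub_head
      on  rec."vchSubmittersCode" = sub_head."vchSubmittersCode"
      and rec."vchValuationDate"  = sub_head."vchValuationDate"  -- assumed extra key column, adjust to the real schema
    ;

With the full key in the ON clause the join no longer fabricates duplicates, so the
DISTINCT becomes cheap or unnecessary.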
On 21.09.2018 at 17:49, Durgamahesh Manne wrote:
> Please find the attached query plan file below.

The query and plan are still hard to read :-(

Query:

SELECT distinct
  Max(v."vchSubmittersCode") as vchSubmittersCode
  , Max(v."vchRecordType") as vchRecordType
  , Max(v."vchSequenceNumber") as vchSeq
  ...
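A side note on the shape of that query: wrapping every output column in Max() and then
putting DISTINCT on top usually means the grouping key went missing. If the intent is
one row per submitter, a GROUP BY states that directly. A sketch only, with the
grouping key assumed rather than taken from the real schema:

    SELECT v."vchSubmittersCode",
           Max(v."vchRecordType")     AS vchRecordType,
           Max(v."vchSequenceNumber") AS vchSeq
    FROM ...                          -- same FROM/JOIN clauses as the original query
    GROUP BY v."vchSubmittersCode";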
On 21.09.2018 at 17:13, Durgamahesh Manne wrote:
> query is below

The query and plan are still not readable. Store them in a text file and attach
that here.

Andreas

--
2ndQuadrant - The PostgreSQL Support Company.
www.2ndQuadrant.com
Your plan is not readable to me (perhaps because of gmail). Does
https://explain.depesz.com/ give you any useful insights?
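A practical way to get an attachable, depesz-friendly plan is to let psql write it to a
file instead of copying it out of the mail client; a small sketch, with the query itself
standing in as "...":

    \o plan.txt
    EXPLAIN (ANALYZE, BUFFERS) SELECT ... ;   -- the real query goes here
    \o

The resulting plan.txt can be attached to the mail as-is or pasted into
https://explain.depesz.com/.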
On Fri, Sep 21, 2018 at 7:38 PM Durgamahesh Manne wrote:
Hi

The complex query takes around 30 minutes to execute, even though I have
temporarily increased the work_mem value to 4GB (total RAM is 16 GB).

EXPLAIN ANALYZE of the query also takes around 30 minutes, even though I have
created partial indexes with a WHERE condition on the required columns.

Below is the query plan ...
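For reference, this is roughly what the two steps described above look like; the
work_mem value is the one quoted, while the index definition is only an assumed example,
not the real schema:

    -- raise work_mem for the current session only
    SET work_mem = '4GB';

    -- shape of a partial index with a WHERE condition (placeholder index name and predicate)
    CREATE INDEX idx_table1_submitter_partial
        ON table1 ("vchSubmittersCode")
        WHERE "vchAdvisorLabel" IS NOT NULL;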
On 20.09.2018 at 13:11, Durgamahesh Manne wrote:
> The query executed in less time without DISTINCT.
>
> With DISTINCT, the query was taking around 7 minutes to complete.
>
> select distinct sub_head."vchSubmittersCode" ,rec."vchFileName" ,
> rec."vchCusipFundIDSubFundID" , rec."vc...
Hi

As per your request, I ran the below query without DISTINCT:

select sub_head."vchSubmittersCode" ,rec."vchFileName" ,
rec."vchCusipFundIDSubFundID" , rec."vchFundUnitPrice" ,
sub_head."vchValuationDate" ,rec."vchAdvisorLabel" from table1 rec join
table2 sub_head on rec."vchSubmittersCode"=sub_head...
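If it helps to quantify how much DISTINCT has to throw away, the same join can be
counted with and without de-duplication in one pass; this assumes the join really is on
"vchSubmittersCode" on both sides, as the fragments above suggest:

    SELECT count(*) AS joined_rows,
           count(DISTINCT (sub_head."vchSubmittersCode", rec."vchFileName",
                           rec."vchCusipFundIDSubFundID", rec."vchFundUnitPrice",
                           sub_head."vchValuationDate", rec."vchAdvisorLabel")) AS distinct_rows
    FROM table1 rec
    JOIN table2 sub_head ON rec."vchSubmittersCode" = sub_head."vchSubmittersCode";

A large gap between the two numbers points at the join, not at DISTINCT itself.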
Hi,

the problem is there:

On 20.09.2018 at 11:48, Durgamahesh Manne wrote:
> Unique (cost=5871873.64..6580630.73 rows=7400074 width=89) (actual
>   time=326397.551..389515.863 rows=370 loops=1)
>   -> Sort (cost=5871873.64..5973124.65 rows...
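Reading that node: the timing on the Unique and the Sort below it (roughly 326 to 390
seconds) is where the runtime goes, and the planner expected about 7.4 million rows
where only 370 distinct rows actually come out, which fits a join that multiplies rows
and a DISTINCT that collapses them again. A quick check of how many rows each join key
matches, using table and column names taken from the queries above:

    SELECT "vchSubmittersCode", count(*) AS rows_per_key
    FROM table2
    GROUP BY "vchSubmittersCode"
    ORDER BY count(*) DESC
    LIMIT 10;

Counts much larger than 1 here mean the ON clause needs more columns, as suggested
elsewhere in the thread.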
Hi

I have a complex query, for example:

select distinct
sub_head."vchSubmittersCode" ,rec."vchFileName" ,
rec."vchCusipFundIDSubFundID" , rec."vchFundUnitPrice" ,
sub_head."vchValuationDate" ,rec."vchAdvisorLabel" from "table1" rec join
"table2" sub_head on rec."vchSubmittersCode"=sub_head."vc...
Hi

Respected Postgres community members,

I have created BRIN indexes on a few columns of the table without any issues.
But I am unable to create a BRIN index on one column of the table, as I get the
error listed below:

[local]:6263 postgres@huawei=# CREATE INDEX brin_idx on huawei using brin
("dFetch");
ERROR: ...
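The error text is cut off above, but the usual reason CREATE INDEX ... USING brin fails
for exactly one column is that the column's data type has no default operator class for
the brin access method. One way to check which types BRIN can index on a given server
(a plain catalog query, nothing schema-specific):

    -- list the operator classes available to the BRIN access method
    SELECT opcname, opcintype::regtype AS indexed_type
    FROM pg_opclass
    WHERE opcmethod = (SELECT oid FROM pg_am WHERE amname = 'brin')
    ORDER BY opcname;

If the data type of "dFetch" does not show up in that list, that would explain the
failure; whether a btree index, an expression index, or no index at all is the right
fallback depends on how the column is queried.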