"mean": 8}
}
}'::json->'ports'
))
) T
WHERE (value::json->>'mean')::float >= 7;
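Against the sample document this returns port_def and port_ghi (means 7 and 8).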
From: David Gauthier
Sent: Tuesday, November 30, 2021 9:40 PM
To: P
On Tue, Nov 30, 2021 at 1:40 PM David Gauthier wrote:
> {
>   "ports": {
>     "port_abc": {"min": 5, "max": 7, "mean": 6},
>     "port_def": {"min": 5, "max": 9, "mean": 7},
>     "port_ghi": {"min": 6, "max": 10, "mean": 8}
>   }
> }
>
> select 1 from mytbl where cast(test_results#>'{ports,***,mean}' as float)
> >= 7 ;
>
PG 11.5 on Linux
Let's say I store a jsonb in a column called test_results that looks like
this...
{
  "ports": {
    "port_abc": {"min": 5, "max": 7, "mean": 6},
    "port_def": {"min": 5, "max": 9, "mean": 7},
    "port_ghi": {"min": 6, "max": 10, "mean": 8}
  }
}
And I want to get all the port names where the mean is >= 7.
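An untested sketch of the same filter run per row against the table named in the thread (mytbl, with test_results stored as jsonb); jsonb_each expands the "ports" object into one row per port, so the port names fall out as the keys:

SELECT p.key AS port_name
FROM mytbl t
CROSS JOIN LATERAL jsonb_each(t.test_results->'ports') AS p(key, value)
WHERE (p.value->>'mean')::float >= 7;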
> On Oct 18, 2021, at 10:02 PM, David G. Johnston wrote:
>
> (jsonb - text[]) = '{}'::jsonb …?
Aha, thank you!
On Monday, October 18, 2021, Scott Ribe wrote:
>
> "containing only keys from this list of keys"
>
>
(jsonb - text[]) = '{}'::jsonb …?
Combine with (jsonb ?& text[]) if all tested keys need to be present as well.
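A minimal sketch of the combined check (the table and column names here are invented for illustration):

SELECT *
FROM mytable
WHERE data - ARRAY['a', 'b', 'c'] = '{}'::jsonb  -- no keys outside the list remain
  AND data ?& ARRAY['a', 'b', 'c'];              -- and every listed key is present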
David J.
What's a good way to query a jsonb column for
"no keys other than those in this list of keys",
in other words,
"containing only keys from this list of keys"?
--
Scott Ribe
scott_r...@elevated-dev.com
https://www.linkedin.com/in/scottribe/
Paul Jones writes:
> I may have discovered a situation in 10.1 where EXECUTEing a PREPARED
> statement acting on JSON data in partitioned tables hangs in an
> infinite loop for a particular set of data. Unfortunately, the data is
> proprietary, so I did the best I could below to describe what happened.
Version 10.1, Community version from PGDG repo
OS RHEL 7.3
I may have discovered a situation in 10.1 where EXECUTEing a PREPARED
statement acting on JSON data in partitioned tables hangs in an
infinite loop for a particular set of data. Unfortunately, the data is
proprietary, so I did the best I could below to describe what happened.
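Since the actual data and statements are proprietary, the following is only an invented illustration of the general shape of the setup described (a range-partitioned table holding jsonb, queried through PREPARE/EXECUTE); none of these names come from the report:

-- Invented names; not the reporter's schema or data.
CREATE TABLE events (id int NOT NULL, payload jsonb) PARTITION BY RANGE (id);
CREATE TABLE events_p1 PARTITION OF events FOR VALUES FROM (1) TO (1000);
CREATE TABLE events_p2 PARTITION OF events FOR VALUES FROM (1000) TO (2000);

PREPARE find_events(text) AS
    SELECT id FROM events WHERE payload->>'status' = $1;

EXECUTE find_events('failed');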