I would start with something like this
http://www.ebay.com/itm/2U-24-bay-2-5-Supermicro-Server-X8DTH-iF-2x-Xeon-Quad-Core-32GB-RAM-SAS2-216EL1-/222132081393?hash=item33b81a92f1:g:UzYAAOSwR5dXSQVw
With it being 2U, you can then pop out the motherboard later and drop in anything more modern you want.
On Thu, Apr 21, 2016 at 9:55 AM, Melvin Davidson wrote:
> Please, just ONE LOGICAL VALID argument, not speculation. Otherwise, stop
> with the nay saying.
I think you should look seriously at the suggestion offered earlier of using an
event trigger to get what you desire here. To me that is the most logical approach.
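I don't know which earlier thread this refers back to, so purely as a minimal sketch (the function name, trigger name and the "no dropping tables in public" rule are all made up for illustration), an event trigger pair looks like this:

create or replace function guard_drops()
returns event_trigger
language plpgsql as $$
declare
    obj record;
begin
    -- pg_event_trigger_dropped_objects() lists every object the DROP touched
    for obj in select * from pg_event_trigger_dropped_objects() loop
        if obj.object_type = 'table' and obj.schema_name = 'public' then
            raise exception 'dropping table % is not allowed', obj.object_identity;
        end if;
    end loop;
end;
$$;

create event trigger guard_drops_trg
    on sql_drop
    execute procedure guard_drops();

The same pattern works against ddl_command_start or ddl_command_end if the rule you want is about CREATE or ALTER rather than DROP.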
Assuming 3 things:
Table name - test
Column names - start_time, end_time
An added id column (int) to distinguish each record in the table

You can go with this (my apologies for formatting issues):
with
slots as (
    select *
    from generate_series(0, 1439) as s(slot)
),
slots_hours as (
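The rest of that query got cut off above, so just to give the idea, here is a rough sketch of the shape it was probably heading for: counting how many rows in test cover each minute-of-day slot and then ranking the slots. Everything past the slots CTE is my own guess, not the original answer:

-- Sketch only: assumes test(id int, start_time, end_time) as described above.
with
slots as (
    select s.slot
    from generate_series(0, 1439) as s(slot)
),
slot_counts as (
    select sl.slot,
           count(t.id) as concurrent
    from slots sl
    left join test t
      on sl.slot >= extract(hour from t.start_time) * 60 + extract(minute from t.start_time)
     and sl.slot <  extract(hour from t.end_time) * 60 + extract(minute from t.end_time)
    group by sl.slot
)
select slot,
       concurrent,
       dense_rank() over (order by concurrent desc) as busy_rank
from slot_counts
order by busy_rank, slot;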
create table json_data(row_id int, json_text jsonb);

insert into json_data(row_id, json_text)
values (1, '[{"ID":"1","location_name":"Test"},{"ID":"2","location_name":"Examples"}]');
To search for an ID:

select row_id, parsed.*
from json_data, lateral jsonb_to_recordset(json_data.json_text)
    as parsed("ID" text, location_name text)
where parsed."ID" = '1';
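For what it's worth, with the single sample row inserted above and the where clause left off, jsonb_to_recordset expands the array into one output row per element, roughly:

 row_id | ID | location_name
--------+----+---------------
      1 | 1  | Test
      1 | 2  | Examples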
https://gist.github.com/wishdev/635f7a839877d79a6781
Sorry for the 3rd party site - just easier to get the layout correct.
A CTE and dense_rank is all it takes. I am always amazed at what one can
now pack into such small amounts of code.
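I have not copied the gist's query here, but for anyone who has not used dense_rank(), a throwaway example of why it keeps things compact (ties share a rank and no rank numbers are skipped):

select v,
       dense_rank() over (order by v) as dr,
       rank()       over (order by v) as r
from (values (10), (10), (20), (30)) as t(v);
-- dr comes back as 1, 1, 2, 3 while r comes back as 1, 1, 3, 4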
On Wed, Jul 23, 2014 at 4:00 PM, Jim Garrison wrote:
Afternoon Frank,
I believe what you might wish to look at is a single database with a set of
schemas[1], which would separate your data in a logical way. You could have
a single connection URL, and then each individual connection could create a
schema (or reuse one if you wish), set the search_path (fir
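To make that concrete, here is only a rough sketch with made-up names (client_a, orders), not anything taken from your actual setup:

-- each client gets its own schema inside the one database
create schema if not exists client_a;

-- put the client's schema first so unqualified names resolve there
set search_path to client_a, public;

-- anything created from here on lands in client_a unless schema-qualified
create table if not exists orders (
    id         serial primary key,
    created_at timestamptz not null default now()
);

Cross-schema access still works by qualifying the name, e.g. other_client.orders.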