Slightly off topic, but has anyone tried TimescaleDB for timeseries databases?

The issues discussed here would still apply, since TimescaleDB sits on top of the
underlying Postgres ORDBMS.

We solve the problem (around 4 billion records of instrument sensor readings) 
by using UTC as the "native" timestamp and working in that, even though we 
are halfway around the world. Local times can easily be determined and applied 
if desired, but by standardising on a reference time zone at the start, 
things have "just worked" for around 15 years now.
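The convention is simple enough to sketch in a few lines of Python (the zone
name and timestamp here are illustrative, not from our actual schema):

```python
# Store UTC as the native timestamp; derive local time only for display.
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # Python 3.9+

# A sensor reading, stored with its UTC timestamp as the reference value.
reading_utc = datetime(2023, 10, 4, 20, 30, tzinfo=timezone.utc)

# Local time is computed on demand, never stored as the reference.
local = reading_utc.astimezone(ZoneInfo("Pacific/Auckland"))
print(local.isoformat())  # 2023-10-05T09:30:00+13:00 (NZDT)
```

The same pattern holds in Postgres: keep a `timestamptz` (which is UTC under
the hood) and apply `AT TIME ZONE` only at query time.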


Brent Wood

Principal Technician, Fisheries
NIWA
DDI:  +64 (4) 3860529
________________________________
From: Lincoln Swaine-Moore <lswainemo...@gmail.com>
Sent: Thursday, October 5, 2023 08:30
To: Alban Hertroys <haram...@gmail.com>
Cc: Marian Wendt <marian.we...@yahoo.com>; pgsql-general 
<pgsql-general@lists.postgresql.org>
Subject: Re: Strategies for converting UTC data to local windows for arbitrary 
resolutions and timezones

> What I do in such cases is to add an extra column with the UTC timestamp to 
> serve as a linear scale to the local timestamps. That also helps with 
> ordering buckets in reports and such during DST changes (especially the ones 
> where an hour repeats).

> For hours and quarter hours I found it to be fairly convenient to base a view 
> on a join between a date calendar and an (quarter of an) hour per UTC day 
> table, but materialising that with some indexes may perform better (at the 
> cost of disk space). I do materialise that currently, but our database server 
> doesn’t have a lot of memory so I’m often not hitting the cache and 
> performance suffers a bit (infrastructure is about to change for the better 
> though).
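A rough Python analogue of the quoted calendar-join approach (the real thing
is a SQL view joining a date calendar to a quarter-hour-per-day table; the
zone, date, and function name here are illustrative):

```python
# Cross one local calendar day with its 96 quarter-hours, keeping a UTC
# column as the linear ordering key, as described in the quoted mail.
from datetime import date, datetime, timedelta
from zoneinfo import ZoneInfo  # Python 3.9+

UTC = ZoneInfo("UTC")

def quarter_hour_buckets(day: date, tz: ZoneInfo):
    """Pair each local quarter-hour label of `day` with its UTC instant."""
    midnight = datetime.combine(day, datetime.min.time(), tzinfo=tz)
    rows = []
    for q in range(96):  # 96 quarter-hours per wall-clock day
        local = midnight + timedelta(minutes=15 * q)  # wall-clock arithmetic
        rows.append((local, local.astimezone(UTC)))
    # Sorting on the UTC column keeps ordering linear even across DST changes,
    # where the local labels alone would repeat or skip.
    return sorted(rows, key=lambda row: row[1])

# A DST-transition day in the Netherlands, to show why the UTC column matters.
buckets = quarter_hour_buckets(date(2023, 10, 29), ZoneInfo("Europe/Amsterdam"))
print(len(buckets))  # 96 wall-clock labels
```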

That's an interesting idea, but I'm not sure I fully understand. Assuming 
you're aggregating data: what do you group by? For instance, at an hourly 
resolution, if you group by both the UTC timestamp and the local one, you might 
end up, say, dividing an hour-long bucket in two for time zones with 
half-hour-based offsets, no?
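To make the concern concrete, here's a small sketch using Asia/Kolkata (+5:30)
as an example of a half-hour-offset zone:

```python
# Grouping on both the truncated UTC hour and the truncated local hour:
# for a +5:30 zone, one UTC hour straddles two local hours, so each
# hourly bucket splits in two.
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # Python 3.9+

kolkata = ZoneInfo("Asia/Kolkata")  # UTC+5:30, no DST
samples = [datetime(2023, 10, 5, 8, m, tzinfo=timezone.utc) for m in (0, 20, 40)]

groups = set()
for ts in samples:
    utc_hour = ts.replace(minute=0, second=0, microsecond=0)
    local_hour = ts.astimezone(kolkata).replace(minute=0, second=0, microsecond=0)
    groups.add((utc_hour, local_hour))

# 08:00-09:00 UTC maps to 13:30-14:30 local, crossing the 14:00 boundary.
print(len(groups))  # 2 groups for what should be one hourly bucket
```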

Thanks for the detailed writeup! Definitely helpful to learn more about what 
people are using in production to handle this sort of thing.

--
Lincoln Swaine-Moore
