@tomhughes commented on this pull request.
> @@ -0,0 +1,7 @@
+class AddNotesUserIdCreatedAtIndex < ActiveRecord::Migration[7.2]
+ disable_ddl_transaction!
+
+ def change
+ add_index :notes, [:user_id, :created_at], :algorithm => :concurrently
I'd suggest adding `:where => "user_id IS NOT NULL"` here so that we don't
index anonymous notes.
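To make that concrete, the migration might then look roughly like this (just a sketch of the suggested change, not the code in the PR):

```ruby
class AddNotesUserIdCreatedAtIndex < ActiveRecord::Migration[7.2]
  disable_ddl_transaction!

  def change
    # Partial index: rows with a NULL user_id (anonymous notes) are
    # excluded, keeping the index smaller while still covering
    # "notes by this user" queries.
    add_index :notes, [:user_id, :created_at],
              :where => "user_id IS NOT NULL",
              :algorithm => :concurrently
  end
end
```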
> @@ -0,0 +1,7 @@
+class AddGpxFilesUserIdTimestampIndex < ActiveRecord::Migration[7.2]
+ disable_ddl_transaction!
+
+ def change
+ add_index :gpx_files, [:user_id, :timestamp], :algorithm => :concurrently
I think it would be better to index on `:user_id, :id` here, as IDs are a proxy for
timestamp. Such an index would already be useful for paginating a user's
traces and can be used for efficient lookup of recent traces even if you want to
limit by time, as this test shows:
```
apis_master=# explain select * from gpx_files where user_id = 1 and timestamp > '2024-01-01' order by id desc limit 10;
                                              QUERY PLAN
-----------------------------------------------------------------------------------------------------
 Limit  (cost=2.51..2.51 rows=1 width=106)
   ->  Sort  (cost=2.51..2.51 rows=1 width=106)
         Sort Key: id DESC
         ->  Index Scan using gpx_files_user_id_idx on gpx_files  (cost=0.28..2.50 rows=1 width=106)
               Index Cond: (user_id = 1)
               Filter: ("timestamp" > '2024-01-01 00:00:00'::timestamp without time zone)
(6 rows)

apis_master=# create index on gpx_files(user_id, id);
CREATE INDEX

apis_master=# explain select * from gpx_files where user_id = 1 and timestamp > '2024-01-01' order by id desc limit 10;
                                                 QUERY PLAN
-----------------------------------------------------------------------------------------------------------
 Limit  (cost=0.28..2.50 rows=1 width=106)
   ->  Index Scan Backward using gpx_files_user_id_id_idx on gpx_files  (cost=0.28..2.50 rows=1 width=106)
         Index Cond: (user_id = 1)
         Filter: ("timestamp" > '2024-01-01 00:00:00'::timestamp without time zone)
(4 rows)
```
Adding the index switches to a descending index scan on ID, which will also be
more or less in timestamp order, so as long as you have some limit on the number
of traces you will consider it should be reasonably efficient.
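If the index is changed as suggested, the migration might end up looking something like this (a sketch only; the class name here is hypothetical and the PR may name it differently):

```ruby
class AddGpxFilesUserIdIdIndex < ActiveRecord::Migration[7.2]
  disable_ddl_transaction!

  def change
    # Index on (user_id, id): since IDs increase roughly with time,
    # a backward scan of this index returns a user's traces in
    # approximately newest-first order without a separate sort.
    add_index :gpx_files, [:user_id, :id], :algorithm => :concurrently
  end
end
```

A pagination query along the lines of `Trace.where(:user_id => user.id).order(:id => :desc).limit(10)` (assuming the model backed by `gpx_files` is `Trace`) would then match the backward index scan shown in the plan above.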