I came across this paper making a case for indices that use machine
learning to optimise search.

https://arxiv.org/pdf/1712.01208.pdf

The gist seems to be to use a linear regression model, or a TensorFlow
model when the data calls for a more complicated distribution, to predict
a key's position, and to let SIMD instructions on GPUs / TPUs speed up the
lookups. The speedup observed is anywhere from 40-60%.
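To make the idea concrete, here is a toy sketch of my own (not code from the paper) of the core trick as I understand it: fit a model mapping key -> position over the sorted keys, record the worst-case prediction error, and at lookup time only search the small error window around the predicted position.

```python
import bisect
import numpy as np

class LearnedIndex:
    """Toy one-level learned index: a linear model predicts a key's
    approximate position in a sorted array; a bounded local search
    within the recorded max error makes lookups exact."""

    def __init__(self, keys):
        self.keys = np.asarray(sorted(keys))
        positions = np.arange(len(self.keys))
        # Ordinary least squares fit of key -> position (degree-1 poly).
        self.slope, self.intercept = np.polyfit(self.keys, positions, 1)
        preds = np.rint(self.slope * self.keys + self.intercept).astype(int)
        # Worst-case prediction error bounds the correcting search.
        self.max_err = int(np.max(np.abs(preds - positions)))

    def lookup(self, key):
        pred = int(round(self.slope * key + self.intercept))
        lo = max(0, pred - self.max_err)
        hi = min(len(self.keys), pred + self.max_err + 1)
        # Binary-search only the error window around the prediction.
        i = lo + bisect.bisect_left(self.keys[lo:hi].tolist(), key)
        if i < len(self.keys) and self.keys[i] == key:
            return i
        return None
```

Of course this glosses over the part I'm asking about: inserting a key means re-fitting (or at least patching) the model, which is where the comparison to B-tree rebalancing comes in.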

That result looks impressive, but I don't have enough context on, say,
the cost of rebuilding a neural net on every DML operation. The closest
equivalent I can relate to in PG would be rebalancing the B-tree on DML
operations.

In your opinion, would an ML model work for a table whose workload is
both write- and read-heavy? I'd love to hear your thoughts on the paper.

Thanks for reading
- Deepak
