The binary I'm trying to create should be able to read data from a Postgres
instance automatically, without users having to run backup commands such as
pg_dump. Having access to the appropriate source headers would let me read
the data directly.
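
Concretely, this is roughly what I had in mind, based on the buffer manager
and page layout headers (storage/bufmgr.h, storage/bufpage.h, access/table.h).
It's an untested sketch that would have to run inside the backend (e.g. as a
C extension); walk_heap_pages and relid are placeholders, and MVCC visibility
checks are skipped:

/* Rough sketch: walk a table's heap pages from inside the backend. */
#include "postgres.h"

#include "access/htup_details.h"
#include "access/table.h"
#include "storage/bufmgr.h"
#include "storage/bufpage.h"
#include "utils/rel.h"

static void
walk_heap_pages(Oid relid)
{
    Relation    rel = table_open(relid, AccessShareLock);
    BlockNumber nblocks = RelationGetNumberOfBlocks(rel);

    for (BlockNumber blkno = 0; blkno < nblocks; blkno++)
    {
        Buffer       buf = ReadBuffer(rel, blkno);
        Page         page;
        OffsetNumber maxoff;

        LockBuffer(buf, BUFFER_LOCK_SHARE);
        page = BufferGetPage(buf);
        maxoff = PageGetMaxOffsetNumber(page);

        for (OffsetNumber off = FirstOffsetNumber; off <= maxoff; off++)
        {
            ItemId          itemid = PageGetItemId(page, off);
            HeapTupleHeader tuphdr;

            if (!ItemIdIsNormal(itemid))
                continue;       /* skip unused/dead/redirect line pointers */

            /* Raw tuple header; no visibility check is done here. */
            tuphdr = (HeapTupleHeader) PageGetItem(page, itemid);
            (void) tuphdr;      /* hand the tuple off to export logic */
        }

        UnlockReleaseBuffer(buf);
    }

    table_close(rel, AccessShareLock);
}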

On Tue, Mar 19, 2024 at 8:03 PM Sushrut Shivaswamy <
sushrut.shivasw...@gmail.com> wrote:

> I'd like to read individual rows from the pages as they are updated and
> stream them to a server to create a copy of the data.
> The data will be rewritten to columnar format for analytics queries.
>
> On Tue, Mar 19, 2024 at 7:58 PM Alexander Korotkov <aekorot...@gmail.com>
> wrote:
>
>> Hi
>>
>> On Tue, Mar 19, 2024 at 4:23 PM Sushrut Shivaswamy
>> <sushrut.shivasw...@gmail.com> wrote:
>> > I'm trying to build a postgres export tool that reads data from table
>> > pages and exports it to an S3 bucket. I'd like to avoid manual commands
>> > like pg_dump; I need access to the raw data.
>> >
>> > Can you please point me to the postgres source header / cc files that
>> > encapsulate this functionality?
>> > - List all pages for a table
>> > - Read a given page for a table
>> >
>> > Any pointers to the relevant source code would be appreciated.
>>
>> Why do you need to work at the source code level?
>> Please check this documentation about taking a binary copy of the database
>> at the filesystem level:
>> https://www.postgresql.org/docs/current/backup-file.html
>>
>> ------
>> Regards,
>> Alexander Korotkov
>>
>
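
For the "as they are updated" part, an alternative to re-reading pages would
be logical decoding. Below is a rough, untested libpq sketch; the connection
string and slot name are placeholders, and it assumes a slot was created
beforehand with
SELECT pg_create_logical_replication_slot('export_slot', 'test_decoding');

#include <stdio.h>
#include <libpq-fe.h>

int
main(void)
{
    /* Placeholder connection string. */
    PGconn *conn = PQconnectdb("dbname=mydb");

    if (PQstatus(conn) != CONNECTION_OK)
    {
        fprintf(stderr, "connection failed: %s", PQerrorMessage(conn));
        PQfinish(conn);
        return 1;
    }

    /* Each call consumes the changes decoded since the previous call. */
    PGresult *res = PQexec(conn,
        "SELECT lsn, xid, data FROM pg_logical_slot_get_changes("
        "'export_slot', NULL, NULL)");

    if (PQresultStatus(res) == PGRES_TUPLES_OK)
    {
        for (int i = 0; i < PQntuples(res); i++)
            printf("%s\t%s\n", PQgetvalue(res, i, 0), PQgetvalue(res, i, 2));
    }
    else
        fprintf(stderr, "query failed: %s", PQerrorMessage(conn));

    PQclear(res);
    PQfinish(conn);
    return 0;
}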
