ate table-space and
> we need to clear table and index bloat values using jobs as well. How can I
> achieve this? I need the exact command.
>
> Regards,
> Prakash.R
>
> On Wed, Jun 5, 2019 at 6:16 PM Sathish Kumar wrote:
>
>> Hi Prakash,
>>
>> You can run the below command.
Hi Prakash,
You can run the below command.
pg_repack -d dbname -E DEBUG
On Wed, Jun 5, 2019, 7:55 PM Prakash Ramakrishnan <
prakash.ramakrishnan...@nielsen.com> wrote:
> Hi Peter,
>
> Thanks, I have successfully created the extension. How do I run a full
> vacuum using pg_repack?
>
> Regards,
> Prakash.
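To answer the bloat-clearing question above, a fuller pg_repack invocation might look like the sketch below. The database name, table name, and job count are placeholders, not from the thread; pg_repack rewrites tables online, roughly a lock-free VACUUM FULL.

```shell
# Repack a single bloated table online; -j parallelizes the index rebuilds
pg_repack -d production --table bloated_table -j 4

# Repack every table in the database
pg_repack -d production -j 4
```

The `-E DEBUG` flag from the reply above can be added to either command to see what pg_repack is doing step by step.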
Hi Team,
We have a database and keep creating new tables as requirements come in.
Every time, we have to grant read-only permission on the new tables to the
db user. Is there a way to inherit privileges instead?
Basically, we have a read-only user who should run only SELECT statements.
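One way to do this is with default privileges. A sketch, assuming placeholder names: the role names readonly_user and table_owner, the database production, and the schema public are not from the thread.

```shell
# Grant SELECT on tables that already exist
psql -d production -c "GRANT SELECT ON ALL TABLES IN SCHEMA public TO readonly_user;"

# Make SELECT the default for tables that table_owner creates in future
psql -d production -c "ALTER DEFAULT PRIVILEGES FOR ROLE table_owner IN SCHEMA public GRANT SELECT ON TABLES TO readonly_user;"
```

Note that ALTER DEFAULT PRIVILEGES only affects objects created afterwards, and only those created by the named role, so both statements are needed.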
>
>
> > On 05.05.19 at 19:26, Ron wrote:
> > On 5/5/19 12:20 PM, Andreas Kretschmer wrote:
> >>
> >>
> >> On 05.05.19 at 18:47, Sathish Kumar wrote:
> >>> Is there a way to speed up the importing process by tweaking
> >>> Postgresql config like maintenance_work_mem, work_mem, shared_buffers?
Hi All,
Postgresql version: 9.6
On Mon, May 6, 2019, 7:14 AM Sathish Kumar wrote:
> Hi,
>
> I am trying to export our database in GCE instance to Google Cloud SQL.
>
> Below are the commands used to export/import the database. I am exporting
> only 1 database, which is required.
Hi,
I am trying to export our database in GCE instance to Google Cloud SQL.
Below are the commands used to export/import the database. I am exporting
only 1 database which is required.
Export:
pg_dump -h olddbserver -U dbuser --format=plain --no-owner --no-acl
production | sed -E 's/(DROP|C
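For the Cloud SQL side, one import path is to stage the dump in Cloud Storage and import it with gcloud. A sketch; the bucket name and instance name are placeholders, and the Cloud SQL service account needs read access to the bucket.

```shell
# Upload the plain-SQL dump to a Cloud Storage bucket
gsutil cp prod.sql gs://my-bucket/prod.sql

# Import it into the Cloud SQL instance
gcloud sql import sql my-instance gs://my-bucket/prod.sql --database=production
```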
Hi,
I am trying to import a database of size 300+GB to another server. It's
taking a long time.
4vCPU
15GB RAM
psql -h newserver -U dbuser -d production -W < prod.sql
Is there a way to speed up the importing process by tweaking Postgresql
config like maintenance_work_mem, work_mem, shared_buffers?
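Beyond config tweaks, the restore itself can be parallelized if the dump is redone in custom format instead of plain SQL. A sketch; the hostnames follow the thread, but the filename and job count are placeholders.

```shell
# Custom-format dump (-Fc) enables pg_restore's parallel mode
pg_dump -h oldserver -U dbuser -Fc -f prod.dump production

# -j 4 restores several tables/indexes at once, matching the 4 vCPUs
pg_restore -h newserver -U dbuser -d production -j 4 prod.dump
```

Raising maintenance_work_mem on the target during the restore mainly speeds up the index builds at the end; the parallel jobs usually help more for a 300+GB database.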
Hi All,
Can you tell me a way to replicate or sync a table, with minimal
downtime, from dbserver1 to dbserver2 on Postgresql 9.5?
Table Size: 160gb
4VCPU, 16gb RAM
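For a one-shot copy during a maintenance window, the table can be piped directly between the servers. A sketch; the table name is a placeholder, and the application must not write to the table while the copy runs.

```shell
# Dump a single table (-t) and stream it straight into the other server
pg_dump -h dbserver1 -U dbuser -t my_table production | psql -h dbserver2 -U dbuser -d production
```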
On Tue, Apr 2, 2019, 12:19 AM Sathish Kumar wrote:
> The table size is 160gb. We would like to move/copy this table
The table size is 160gb. We would like to move/copy this table from one db
server to another db server.
On Tue, Apr 2, 2019, 12:17 AM Michel Pelletier
wrote:
> On Mon, Apr 1, 2019 at 7:47 AM Sathish Kumar wrote:
>
>> Hi Adrian,
>> We are exporting live table data to a new database.
Hi Adrian,
We are exporting live table data to a new database, so we need to stop our
application until the export/import is completed. We would like to minimise
this downtime.
On Mon, Apr 1, 2019, 10:22 PM Adrian Klaver
wrote:
> On 3/31/19 11:09 PM, Sathish Kumar wrote:
> > Hi Team,
Hi Ros,
Using server on Cloud.
On Mon, Apr 1, 2019, 5:26 PM ROS Didier wrote:
> Hi
>
> One solution could be to use intel technology: FPGA :
> https://www.intel.fr/content/www/fr/fr/products/programmable.html
>
> the principle is to add a PCI electronic card to the server, with its own
> CPUs and RAM.
Hi Team,
We have a requirement to copy a table from one database server to another
database server. We are looking for a solution to achieve this with
minimal downtime on Prod. Can you help us with this?
Table Size: 160GB
Postgresql Server Version: 9.5
Hi All,
I have created a read-only user to perform select statements on our
database, but whenever we create new tables on the database this user is
unable to view them unless I grant SELECT again for each table. Is there a
way I can make SELECT a default permission for this user so that in future
it applies to newly created tables?
Hi All,
I would like to duplicate our existing db on the same server. What would
be the fastest way to achieve it?
DB size is around 300gb.
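On the same server, the template mechanism is usually the fastest route, since the copy is done at the file level. A sketch; the copy's name is a placeholder, and no sessions may be connected to the source database while it is being copied.

```shell
# Clone the database file-by-file using it as a template
psql -U dbuser -c "CREATE DATABASE production_copy TEMPLATE production;"
```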
Hi Adrian,
I am looking to do it either during export or while importing data in the
secondary db.
On Tue, Jan 29, 2019, 10:43 PM Adrian Klaver wrote:
> On 1/29/19 2:08 AM, Sathish Kumar wrote:
> > Hi Team,
> >
> > I am trying to protect some data on a few tables when exporting to other environments.
>
> On Tue, Jan 29, 2019 at 11:08, Sathish Kumar wrote:
>
>> Hi Team,
>>
>> I am trying to protect some data on a few tables when exporting to other
>> environments. Is there any way, or an extension, which can anonymize
>> personal data like names, credit card numbers, etc.?
Hi,
We are trying to replicate a few tables from one Postgresql server to
another. We are currently using Postgresql 9.5.x; is there any way to
achieve it without a Postgresql upgrade?
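On 9.5, without upgrading, the third-party pglogical extension can replicate individual tables. A rough sketch; the node names, connection strings, and table name are placeholders, and the extension must be installed and configured in shared_preload_libraries on both servers first.

```shell
# On the provider (dbserver1)
psql -d production -c "CREATE EXTENSION pglogical;"
psql -d production -c "SELECT pglogical.create_node('provider1', 'host=dbserver1 dbname=production');"
psql -d production -c "SELECT pglogical.replication_set_add_table('default', 'public.my_table');"

# On the subscriber (dbserver2)
psql -d production -c "CREATE EXTENSION pglogical;"
psql -d production -c "SELECT pglogical.create_node('subscriber1', 'host=dbserver2 dbname=production');"
psql -d production -c "SELECT pglogical.create_subscription('sub1', 'host=dbserver1 dbname=production');"
```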
Hi Team,
I am trying to protect some data on a few tables when exporting to other
environments. Is there any way, or an extension, which can anonymize
personal data like names, credit card numbers, etc. after import?
Thanks & Regards
Sathish Kumar.V
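Without an extension, one common approach is to mask the sensitive columns after importing into the target environment. A sketch; the table and column names are invented for illustration only.

```shell
# Overwrite personal data in the non-prod copy after import
psql -d staging -c "UPDATE customers SET name = 'user_' || id, card_number = 'XXXX-XXXX-XXXX-' || right(card_number, 4);"
```

This keeps row counts and formats realistic for testing while removing the real values.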
Hi Pavel,
We would like to use it with Google Cloud SQL, where third-party
extensions are not supported.
On Fri, Dec 7, 2018, 9:55 PM Pavel Stehule wrote:
> Hi
>
> On Fri, Dec 7, 2018 at 14:48, Sathish Kumar wrote:
>
>> Hi Team,
>>
>> Does PL/pgSQL support creating a function to make HTTP requests?
Hi Team,
Does PL/pgSQL support creating a function to make HTTP requests? We have a
requirement to send data to an external server from the Postgres DB using
the HTTP/HTTPS POST method.
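Core PL/pgSQL has no HTTP client; on a self-managed server this is usually done with a third-party extension such as pgsql-http. A sketch; the URL and payload are placeholders, and as noted elsewhere in the thread, such extensions are not available on Cloud SQL.

```shell
# Install the pgsql-http extension (must be present on the server)
psql -d production -c "CREATE EXTENSION http;"

# POST a payload from SQL and read back the response status
psql -d production -c "SELECT status FROM http_post('https://example.com/endpoint', 'payload', 'text/plain');"
```

An alternative without extensions is to NOTIFY an external worker process that listens on the channel and performs the HTTP call outside the database.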
Hi,
I am trying to export and import an SQL file of a database. I would like
to know whether importing the SQL dump will execute all the Insert,
Update, or Delete triggers.
Export:pg_dump -h test -U db_admin --format=plain --no-owner --no-acl
production | sed -E 's/(DROP|CR
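As far as pg_dump's standard ordering goes: in a full schema-plus-data dump, the data is restored before the triggers are created, so they do not fire. In a data-only restore into an existing schema they do fire, unless disabled. A sketch using the hostnames from the message above:

```shell
# Data-only dump that emits DISABLE/ENABLE TRIGGER statements around the data
# (restoring it requires superuser rights)
pg_dump -h test -U db_admin --data-only --disable-triggers production > prod_data.sql
```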
Hi Team,
We would like to migrate our Postgresql VM instance on Google Cloud
Platform to Google Cloud SQL with minimal downtime. As I checked, we have
to export and import the SQL file, and our database is too large to
afford a long downtime.
Does anyone have a solution to achieve this?
Thanks