
How to Backup postgres globals without sysadmin permission in RDS

For those of my friends who are on Postgres RDS and need to back up the globals for an RDS instance: you will soon find that you can't do it with pg_dumpall --globals-only. The default user that gets created along with the RDS Postgres instance does not have adequate permissions to get through the backup process. Below is the message you would receive.


pg_dumpall.exe : pg_dumpall: query failed: ERROR:  permission denied for relation pg_authid
At line:1 char:1

+ & 'C:\setup_msi\postgres\bin\pg_dumpall.exe'  -h XXXXXX.cXXXX...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
   + CategoryInfo          : NotSpecified: (pg_dumpall: que...ation pg_authid:String) [], RemoteException
   + FullyQualifiedErrorId : NativeCommandError

pg_dumpall: query was: SELECT oid, rolname, rolsuper, rolinherit, rolcreaterole, rolcreatedb, rolcanlogin, rolconnlimit, rolpassword, rolvaliduntil, rolreplication, rolbypassrls,
pg_catalog.shobj_description(oid, 'pg_authid') as rolcomment, rolname = current_user AS is_current_user FROM pg_authid WHERE rolname !~ '^pg_' ORDER BY 2

The default user created with the instance is not a real superuser (even though it holds the rds_superuser role), and I can actually see why AWS didn't treat it like a typical superuser. The pg_authid catalog exposes the role password hashes, and that's a potential security risk, so the good AWS folks made a conscious decision to deny that user access to it. That said, the rds_superuser user can read the pg_roles view, which is similar to pg_authid but with the passwords masked.
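The distinction is easy to check from the command line. A minimal sketch, assuming a hypothetical endpoint and master user name (neither is from a real instance); the script only prints the two psql invocations rather than running them, since executing them would need a live RDS endpoint:

```shell
#!/bin/sh
# Hypothetical connection details -- substitute your own RDS endpoint and master user.
HOST="myinstance.abc123.us-east-1.rds.amazonaws.com"
MASTER_USER="masteruser"

# Denied for the RDS master user: pg_authid holds the password hashes.
AUTHID_QUERY="SELECT rolname, rolpassword FROM pg_authid;"

# Allowed: pg_roles mirrors pg_authid but masks rolpassword as '********'.
ROLES_QUERY="SELECT rolname, rolpassword FROM pg_roles;"

echo "psql -h $HOST -U $MASTER_USER -d postgres -c \"$AUTHID_QUERY\""
echo "psql -h $HOST -U $MASTER_USER -d postgres -c \"$ROLES_QUERY\""
```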

I was hoping to write a method to override the default behaviour when I stumbled on the --no-role-passwords option in PostgreSQL 10. What this means is that pg_dumpall can be executed with --globals-only and the passwords will not make it into the backup file. This introduces a different challenge in getting the restored database functional, but I have left that fight for a different day.

i.e. pg_dumpall -h <host> -U <user> -l <database> --no-role-passwords --globals-only -s -f <outputfile>
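Spelled out with hypothetical values (the endpoint, user, database, and output file are placeholders, not from a real instance), the working command looks like the sketch below; the script echoes the command rather than executing it, since it assumes a reachable RDS endpoint:

```shell
#!/bin/sh
# Hypothetical connection details -- substitute your own.
HOST="myinstance.abc123.us-east-1.rds.amazonaws.com"
MASTER_USER="masteruser"
DB="postgres"
OUT="globals.sql"

# --no-role-passwords (PostgreSQL 10+) makes pg_dumpall read pg_roles
# instead of pg_authid, so the dump succeeds without superuser rights;
# --globals-only limits the output to roles and tablespaces.
CMD="pg_dumpall -h $HOST -U $MASTER_USER -l $DB --no-role-passwords --globals-only -f $OUT"

# Dry run: print the command instead of invoking it.
echo "$CMD"
```

After restoring such a dump, every role comes back without a password, so each one needs its password re-set (for example with ALTER ROLE) before logins work again.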



FYI: for those folks who are Postgres-fluent, you may find that I don't always use the common Postgres terminology, as I am a work-in-progress fella.









