- Denish Patel
- 21st October 2015
Last year, I presented on Postgres RDS on Amazon at a couple of conferences, but I haven't had a chance to follow up on my experience working with RDS since then. If you aren't aware, I recently changed jobs and started working at WithMe, a disruptive start-up that focuses on three primary solutions for commerce: WithMe Business, Financial and Retail.
At WithMe, we use a fair number of Postgres RDS instances. After joining, I reviewed the setup and found that pgbadger log analysis was not set up on the database servers. Even though the databases are still in the development phase, it makes sense to have reports that give software developers more insight into slow queries. I had other projects going on (schema design, database migrations, a DW system), so I wasn't able to spend time identifying and tuning queries myself to help the software engineers expedite their development. We therefore decided to prioritize the log analysis project.
At my previous job, I set up pgbadger reports on a number of servers, but most of the time I was dealing with real database servers, or at the very least had ssh access to them. On RDS, you only get psql access; there is no shell-level access to the instance. Additionally, you have to deal with AWS tools and access methods.
I followed these steps to set up pgbadger reports for our Postgres RDS instances:
- Brought up a pgadmin EC2 instance to use as a utility server.
- Installed the AWS CLI tools to interact with the API.
- Made sure the /root/.aws/aws_credential_file file was in place:
[denishpatel@pgadmin1 ~]$ aws configure --profile oregon
AWS Access Key ID [None]: XXX
AWS Secret Access Key [None]: XXX
Default region name [None]: us-west-2
Default output format [None]: text
[denishpatel@pgadmin1 ~]$ aws rds describe-db-instances --profile oregon --output table
- Made sure the Postgres logging parameters are set in the instance's parameter group:
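Parameter group settings can also be applied through the AWS CLI. The helper below is only a sketch: the group name and parameter values are examples, not necessarily the ones we use, and you should tune log_min_duration_statement for your own workload.

```shell
# Sketch: enable pgbadger-friendly logging on an RDS parameter group.
# Group name and values below are illustrative examples.
set_rds_logging_params() {
  local group="$1"
  aws rds modify-db-parameter-group \
    --db-parameter-group-name "$group" \
    --parameters \
      "ParameterName=log_min_duration_statement,ParameterValue=100,ApplyMethod=immediate" \
      "ParameterName=log_checkpoints,ParameterValue=1,ApplyMethod=immediate" \
      "ParameterName=log_connections,ParameterValue=1,ApplyMethod=immediate" \
      "ParameterName=log_disconnections,ParameterValue=1,ApplyMethod=immediate" \
      "ParameterName=log_lock_waits,ParameterValue=1,ApplyMethod=immediate" \
      "ParameterName=log_temp_files,ParameterValue=0,ApplyMethod=immediate"
}

# Usage: set_rds_logging_params my-postgres-params
```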
- I came up with a quick shell script to run the reports. The script is located at https://github.com/denishpatel/pgbadger_on_rds
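The essence of the approach can be sketched as below. This is not the actual script from the repo; the function and paths are illustrative. RDS exposes log files through the API, so the sketch lists the instance's log files, downloads the current day's portions with `aws rds download-db-log-file-portion`, and feeds them to pgbadger using the default RDS log_line_prefix.

```shell
# Illustrative sketch of a pgbadger-on-RDS run (not the actual repo script).
# Downloads today's Postgres log files for one instance and builds a report.
run_pgbadger_rds() {
  local instance="$1" region="$2" outdir="${3:-/var/www/pgbadger}"
  local day tmp logs f
  day=$(date -u +%Y-%m-%d)
  tmp=$(mktemp -d)
  mkdir -p "$outdir/$instance"

  # List the instance's log files and keep only today's.
  logs=$(aws rds describe-db-log-files \
          --db-instance-identifier "$instance" \
          --region "$region" --output text \
          --query 'DescribeDBLogFiles[].LogFileName' | tr '\t' '\n' | grep "$day")

  # NOTE: download-db-log-file-portion returns only a portion by default;
  # very large log files may require paginating with --starting-token.
  for f in $logs; do
    aws rds download-db-log-file-portion \
      --db-instance-identifier "$instance" \
      --region "$region" --output text \
      --log-file-name "$f" > "$tmp/$(basename "$f")"
  done

  # RDS PostgreSQL uses log_line_prefix '%t:%r:%u@%d:[%p]:' by default.
  pgbadger -p '%t:%r:%u@%d:[%p]:' "$tmp"/*.log* \
    -o "$outdir/$instance/$day.html"
  rm -rf "$tmp"
}
```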
- You can easily cron the reports. As new servers are added, I will add new cron entries for them:
# pgbadger reports
0 7 * * * ~/run_pgbadger_rds.sh -i devel-postgres -r us-west-2 -c 1 >/dev/null 2>&1
15 7 * * * ~/run_pgbadger_rds.sh -i prod-postgres-rds -r us-east-1 -c 1 >/dev/null 2>&1
- I set up an nginx web server on the pgadmin server to serve the HTML reports, so the team doesn't have to download the files to review them.
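The nginx piece is just a static file server. A minimal server block might look like the following (the hostname and report directory are examples):

```nginx
server {
    listen 80;
    server_name pgadmin1.example.com;  # example hostname

    # Directory where the cron jobs drop the pgbadger HTML reports
    root /var/www/pgbadger;
    autoindex on;  # let the team browse per-instance report directories
}
```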
In case you are ever tasked with setting up pgbadger reports on a Postgres RDS instance, this post might help! I haven't spent a lot of time on the script, so it provides only the bare minimum needed to run reports, and there is plenty of room for improvement. I hope to see your contributions to improve the script 🙂
As always, I'm looking forward to seeing comments about your experience working with Postgres RDS 🙂