Cloud-provided managed databases are great, especially when you're the CTO of a small company, like me. No sleepless nights over (nonexistent) backup procedures, encryption-at-rest, firewalling, or critical software updates. Production-ready, enterprise-grade Redis and PostgreSQL at your fingertips in a matter of minutes.
Sounds easy, but there are several things you need to consider. The database is not yours, it's theirs - so take good care who you entrust your (customers') data to and what you put in place to make it as secure as can be.
Here are 8 pieces of advice that are on my "managed database provider" checklist, in no particular order of importance. The following is pretty much a copy-paste from our own risk management assessment and security baseline documents at Firmhouse. Use at your own will (or peril!)
Ensure compliance with legislation and ensure secure standards
Picking just any provider because they offer the easiest and cheapest "one-click install" cloud databases is simply naïve. Always be sure that these providers hold an information security management certification, like ISO 27001. Also make sure that their physical datacenters are operated under PCI DSS standards and have SOC 2 Type II reports available.
On top of that, if you cannot publicly access their security procedures or documentation, that's a bad sign.
Oh, and being able to sign a GDPR-compliant Data Processing Agreement/Addendum is also pretty much a must-have. A hard must-have if you're a European company, and pretty important if you don't want your company to skip the whole European market, readily awaiting to give you money for your service.
We use Aiven.io as our managed database service, and they have pretty detailed information on both their compliance documentation (https://aiven.io/security-compliance) and logs of detailed information about their security, storage, and backup procedures (https://help.aiven.io/security/cloud-security-overview).
Service accounts and role-based authorization
Always create a separate service account/user per system that uses a database. If you only have one application that reads and writes to your database, create a dedicated user for that application on that specific database.
Have some external or 3rd-party reporting tool that only reads information from your database for dashboarding or business intelligence? Go ahead and create a read-only service account for just that tool.
Bottom line, just like for any other account: never share an account between multiple users or services.
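In PostgreSQL, that split could look something like the following DDL sketch. The role names, passwords, and database name are illustrative, and your provider's console may wrap these commands in its own UI:

```sql
-- Read/write account for the main application (names are illustrative)
CREATE ROLE app_rw LOGIN PASSWORD 'change-me';
GRANT CONNECT ON DATABASE myapp_production TO app_rw;
GRANT SELECT, INSERT, UPDATE, DELETE ON ALL TABLES IN SCHEMA public TO app_rw;

-- Read-only account for the reporting/BI tool
CREATE ROLE reporting_ro LOGIN PASSWORD 'change-me-too';
GRANT CONNECT ON DATABASE myapp_production TO reporting_ro;
GRANT SELECT ON ALL TABLES IN SCHEMA public TO reporting_ro;

-- Note: GRANT ... ON ALL TABLES only covers existing tables; use
-- ALTER DEFAULT PRIVILEGES to cover tables created later.
```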
Backup snapshots and point-in-time recovery
A database without automated backups cannot call itself a serious managed database service. Make sure your database provider offers automated backup snapshots and that they also support point-in-time recovery. With point-in-time recovery you can quickly get your database back into a state from a few hours ago without losing too much recent data.
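Under the hood, PostgreSQL point-in-time recovery replays archived WAL on top of a base backup until a target timestamp. A managed provider handles all of this for you, but conceptually the recovery settings look like this sketch (paths and timestamp are illustrative; PostgreSQL 12+ additionally expects an empty `recovery.signal` file in the data directory):

```
# postgresql.conf - recovering on top of a restored base backup
restore_command = 'cp /mnt/wal_archive/%f "%p"'   # fetch archived WAL segments
recovery_target_time = '2024-05-01 14:30:00+00'   # stop replaying at this point
```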
I've never had the need for it luckily, but I'm sure I'm shooting myself in the foot by typing this now.
Have off-vendor backups
Yes, backups are great. But what if your vendor goes bankrupt, or due to some lawsuit is legally required to stop any active services they're providing? For these reasons, always export database snapshots to a 3rd-party location and keep them stored there for 14 to 30 days. In the case that your vendor is wiped from the earth for whatever reason, you can at least start a recovery procedure at a different vendor that way.
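A nightly off-vendor export can be as simple as a cron job that dumps the database and ships it to object storage at a different vendor. A sketch of such a crontab entry (the connection URL, bucket name, and schedule are all illustrative assumptions):

```
# crontab: nightly logical dump, shipped to a bucket at a different vendor
0 3 * * * pg_dump "$DATABASE_URL" --format=custom --file=/tmp/nightly.dump \
          && aws s3 cp /tmp/nightly.dump s3://offsite-db-backups/nightly-$(date +\%F).dump
```

Pair this with a lifecycle rule on the bucket that expires objects after 14 to 30 days, so the retention window takes care of itself.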
Encryption-at-rest for data and backups
Now, let's make sure an attacker can't do anything with the data in case it does get stolen somehow.
Making sure live data and backups are encrypted is a must-have. It is a good security measure against the once-in-a-lifetime occurrence of someone ripping a hard drive out of a server and walking off with your data. But it's also just something practical: if you want to sell software to The Enterprise and Corporate, this is simply an important security requirement.
Encrypting your data "at rest" is pretty important. But encrypting it in-transit is even more important. Head over to the next paragraph.
SSLmode enabled by default
All serious databases (like PostgreSQL or Redis) allow you to connect to them over SSL. If your managed provider does not support this, run away fast. Encryption of data-in-transit is just a must-have to keep people from sniffing around in your clients' data.
However, don't think you're already done by simply flipping on the SSL mode of your database connection! Nasty things can happen if you don't configure the thing from the next paragraph.
Configure SSL certificate pinning
Most managed database services generate a self-signed certificate for securing the connection to the database. Without your applications verifying that the database they are talking to is actually the database they are meant to be talking to, SSL encryption pretty much doesn't matter.
You need to make sure that your applications will only connect to a database service presenting the exact SSL certificate they are expecting. If your applications accept a connection to any database service with any SSL certificate, you can get caught by "the man in the middle". With this technique (and some additional hacks in/around your network) someone can spoof the database service and collect all the connection information it needs. When this happens, the "man in the middle" essentially gets access to your full database, if you haven't applied the next and last security measure.
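In practice, most PostgreSQL client libraries handle this for you: set `sslmode=verify-full` in your connection string and point `sslrootcert` at the CA certificate your provider publishes. If you want to understand what pinning boils down to, the core is just comparing a fingerprint you recorded ahead of time against the certificate the server presents. A minimal sketch in Python (the function names are illustrative; in real code the DER bytes would come from the TLS handshake, e.g. `ssl.SSLSocket.getpeercert(binary_form=True)`):

```python
import hashlib
import hmac

def cert_fingerprint(der_bytes: bytes) -> str:
    """SHA-256 fingerprint of a DER-encoded certificate, as lowercase hex."""
    return hashlib.sha256(der_bytes).hexdigest()

def matches_pin(der_bytes: bytes, pinned_fingerprint: str) -> bool:
    """Constant-time comparison of a presented certificate against the pin
    you recorded out-of-band (e.g. from your provider's dashboard)."""
    return hmac.compare_digest(cert_fingerprint(der_bytes),
                               pinned_fingerprint.lower())
```

The constant-time comparison isn't strictly necessary for a public fingerprint, but it's a cheap habit that avoids timing side channels elsewhere.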
Private Networking/Firewall/IP whitelisting
Last but not least: make sure your managed database service allows some kind of network constraint. This can either be a private network, where your database lives in the same (virtual) network as your application servers, or a true public firewall with an IP whitelist if the service is accessed over the public internet.
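If the database does sit on the public internet, the whitelist boils down to firewall rules like the following sketch (the CIDR range is an illustrative placeholder; on a managed service you'd typically enter the allowed ranges in the provider's console instead of running these yourself):

```
# Allow PostgreSQL only from your application subnet, deny everything else
ufw allow from 203.0.113.0/24 to any port 5432 proto tcp
ufw deny 5432/tcp
```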
That's it for now! Hope you enjoyed this post and that this "checklist" helps you in your day-to-day job. Or that it made you realize that you have a security gap somewhere. No biggie! Just calmly fill the hole and you're good to go for a good night's sleep again.