Websites, like tests, start small and grow. Building a small website as though it were an established large website is often not the best way to get started, which means the small website may need to be converted to microservices at some stage.
Let’s say I have a legacy website that is currently in use but has not been seriously worked on for some time, or just needs an overhaul. I’m going to look at a specific example; let’s call it example.com. This is not any one particular website: it is based on several of my own personal websites. Here is a summary of the website…
Current functionality…
New functionality needed…
So, the team wants to make some fairly major changes to the website. The look and feel is being modernised, and it’s expected that the deployment process will also need to change.
Possible changes…
If CI/CD is a goal, to get there we need to separate the database from the instance/container we’re deploying to. If we have a small number of images we could keep them in Git, but if the number of images is large we won’t want to keep them there, which means separating them out into an S3 bucket, or similar. Moving the database and images is a fairly small step that can be done pretty quickly, so we can do it first. Then, with the database and media files out of the way, we can choose what we want to do next.
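Once the database is external, the application has to pick up its connection details from the environment rather than from a config file baked into the deployable artifact. A minimal sketch of that idea, where every variable name and value below is a hypothetical example, not taken from the real site:

```shell
# Sketch only: hypothetical connection settings supplied via the
# environment, so the deployed instance/container itself stays stateless.
export DB_HOST="remote.db.address.com"
export DB_PORT="3306"
export DB_NAME="db_name"
export DB_USER="user"

# The app (or an entrypoint script) would read these at start-up.
echo "Connecting to ${DB_USER}@${DB_HOST}:${DB_PORT}/${DB_NAME}"
```

With the settings injected per environment, the same artifact can be pointed at a local, staging, or production database without rebuilding.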
Do we change the deployment process first, before moving to a framework such as Laravel and redesigning the site? If we have a development server it doesn’t really matter which order we do things in from here as long as we don’t make multiple large changes at once.
If the external API functionality will be separated out into a microservice, maybe we can work on that at the same time as something else. Removing the API functionality from the main codebase also makes it smaller, and a smaller codebase makes the two remaining large jobs (migrating to CI/CD and to Laravel) slightly smaller too, which can only be a good thing. The aim is to make each step as small as possible.
This is a simplified plan of the upgrade process…
But, first things first, I need to make sure everything is backed up, and we’re ready to deploy in a new way.
Let’s make sure we have daily backups of the database to S3…
mysqldump -u'user' -p'********' db_name > 2021-10-11-database-dump.sql
But, we really want something like this format…
mysqldump -u'user' -p'********' db_name | aws s3 cp - s3://where-i-store-my-backups/2021-10-11-database-dump.sql
Which, with a dynamic date, is this…
mysqldump -u'user' -p'********' db_name | aws s3 cp - s3://where-i-store-my-backups/`date +%Y-%m-%d`-database-dump.sql
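The backtick substitution works, but `$(...)` is easier to read and to nest, and building the key in a variable first makes mistakes easier to spot. A quick sketch using the bucket name from the example above:

```shell
# Same dated key as in the command above, using $(...) instead of backticks.
key="s3://where-i-store-my-backups/$(date +%Y-%m-%d)-database-dump.sql"
echo "$key"
```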
Then, if we want to make sure the database is backed up daily we can do this…
23 12 * * * /bin/bash -c "mysqldump -u'user' -p'********' db_name | aws s3 cp - s3://where-i-store-my-backups/\`date +\%Y-\%m-\%d\`-database-dump.sql"
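The `\%` escapes are needed because cron treats a bare `%` as a newline. An alternative that avoids crontab escaping entirely is to keep the pipeline in a small script and point cron at that. A sketch, where the script name and location are my own assumption:

```shell
# Write the backup pipeline into a standalone script (sketch only;
# the credentials and bucket name are the article's placeholders).
cat > db-backup.sh <<'EOF'
#!/bin/bash
set -euo pipefail
mysqldump -u'user' -p'********' db_name |
  aws s3 cp - "s3://where-i-store-my-backups/$(date +%Y-%m-%d)-database-dump.sql"
EOF
chmod +x db-backup.sh
```

The crontab entry then shrinks to `23 12 * * * /path/to/db-backup.sh`, with no escaping needed, because cron’s `%` rule only applies to the crontab line itself.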
To copy the backed-up data to a new remote database instance, first create an empty database on the remote server…

mysql -u'user' -p'********' -h 'remote.db.address.com' -e 'CREATE DATABASE db_name;'

…then load the dump into it…

mysql -u'user' -p'********' -h 'remote.db.address.com' db_name < 2021-10-11-database-dump.sql
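It’s worth sanity-checking the dump file around the transfer: by default, mysqldump writes a “Dump completed” comment as the final line, so a truncated file is easy to spot with `tail`. A minimal sketch, where the file below is a stand-in created for the example, not a real dump:

```shell
# By default a complete mysqldump file ends with a "-- Dump completed" comment,
# so checking the last line catches truncated transfers.
# The dump here is a stand-in created for the example.
dump=2021-10-11-database-dump.sql
printf -- '-- MySQL dump\nCREATE TABLE t (id INT);\n-- Dump completed on 2021-10-11\n' > "$dump"

if tail -n 1 "$dump" | grep -q 'Dump completed'; then
  echo "dump looks complete"
fi
```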
We can move the images to S3 in the same way as we’ve just backed up the database. I don’t want an images directory inside the bucket, just its contents, so I can run…
aws s3 sync ./images s3://bucketname/