As a WordPress developer you have lots of WordPress sites installed on your computer and on your live or staging servers. Every WordPress site needs a folder to store its files and a database to read content from and write content to. For this reason it’s always good to keep the files nicely organized and to keep the exported database inside the site’s root directory or one level above it. That way, when you back up the site using your favorite archiving tool (zip, tar, 7zip), you know that everything necessary is packaged together.

Of course, to really prepare the site for migration you also need to export the site’s details, such as its site URL, current directory and salt keys, but that’s a topic for another post.

The following script scans a root folder that may contain the sites, looking for wp-config.php files. When it finds one, it records that folder for later processing.
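A minimal sketch of that discovery step could look like this (the demo layout created under `mktemp` is hypothetical; in practice you would point `SCAN_ROOT` at your real web root):

```shell
#!/usr/bin/env bash
# Sketch: locate site roots by finding wp-config.php files under a scan root.
# The temp directory and site1/site2 names are demo-only assumptions.
SCAN_ROOT="$(mktemp -d)"                      # replace with e.g. your vhosts root
mkdir -p "$SCAN_ROOT/site1" "$SCAN_ROOT/site2"
touch "$SCAN_ROOT/site1/wp-config.php" "$SCAN_ROOT/site2/wp-config.php"

# Every directory that contains a wp-config.php is treated as a site root.
site_dirs="$(find "$SCAN_ROOT" -name wp-config.php -exec dirname {} \;)"
printf '%s\n' "$site_dirs"
```

Each line of `site_dirs` is then a candidate folder for the later export step.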

The script also checks that wp-cli is installed by running it with the `--info` command, just in case.
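That check might be sketched as follows (the function name is my own, not from the script):

```shell
#!/usr/bin/env bash
# Sketch: bail out early unless wp-cli is present and actually runs.
check_wp_cli() {
    # `command -v` catches a missing binary; `wp --info` also fails when
    # wp-cli's PHP runtime is broken, so it is a stronger sanity check.
    command -v wp >/dev/null 2>&1 || { echo "wp-cli not found in PATH" >&2; return 1; }
    wp --info >/dev/null 2>&1 || { echo "wp --info failed" >&2; return 1; }
}
```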

The script also checks whether it is running on Linux as root. If so, it switches to the owner of the folder before invoking wp-cli; otherwise wp-cli would refuse to execute, which is reasonable, since running it as root is risky. You may still need to run the script itself as root when the sites are owned by different users. For example, /var/www/vhosts/site1 is owned by user1 and /var/www/vhosts/site2 is owned by user2, and both folders have 0700 permissions.
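A sketch of that owner-switching logic, assuming Linux with GNU stat (the helper names are mine, not the script's):

```shell
#!/usr/bin/env bash
# Sketch: when running as root, invoke wp-cli as the site folder's owner,
# because wp-cli refuses to run as root by default.
site_owner() {
    stat -c '%U' "$1"        # GNU stat; on BSD/macOS it would be: stat -f '%Su'
}

run_wp() {
    local site_dir="$1"; shift
    if [ "$(uname -s)" = "Linux" ] && [ "$(id -u)" -eq 0 ]; then
        # Drop to the folder owner so wp-cli runs with that user's rights.
        sudo -u "$(site_owner "$site_dir")" -- wp --path="$site_dir" "$@"
    else
        wp --path="$site_dir" "$@"
    fi
}
# Usage: run_wp /var/www/vhosts/site1 db export /var/www/vhosts/site1/.ht_site_db.sql
```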

The exported database file is named .ht_site_db.sql for security reasons: web servers usually block access to files whose names start with .ht (such as .htaccess). This protects you in case you forget about the exported file and somebody tries to download it. If they manage to download the export, they can import it and access client information such as names and email addresses.
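For reference, this is the kind of rule that Apache ships in its stock 2.4 configuration to deny those files (shown here as a sketch of the default, not something you need to add):

```apache
# Deny web access to any file whose name begins with .ht
<Files ".ht*">
    Require all denied
</Files>
```

Note that nginx has no such built-in rule; there you would need something like `location ~ /\.ht { deny all; }` in the server block, so double-check your server before relying on the file name alone.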

The tool tries to be efficient and avoids descending into the expected WordPress folders, as there’s no need to scan for wp-config.php inside wp-content or wp-admin.
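With find, that skipping can be expressed with `-prune` (the demo tree below, including the decoy file, is hypothetical and only there to show the effect):

```shell
#!/usr/bin/env bash
# Sketch: prune WordPress's own directories so find never descends into them.
SCAN_ROOT="$(mktemp -d)"
mkdir -p "$SCAN_ROOT/site1/wp-content/plugins" "$SCAN_ROOT/site1/wp-admin"
touch "$SCAN_ROOT/site1/wp-config.php"
touch "$SCAN_ROOT/site1/wp-content/plugins/wp-config.php"   # decoy, must be skipped

matches="$(find "$SCAN_ROOT" \
    \( -name wp-content -o -name wp-admin -o -name wp-includes \) -prune \
    -o -name wp-config.php -print)"
printf '%s\n' "$matches"    # only site1/wp-config.php, not the decoy
```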

Of course there is room for improvement, such as compressing the database dump, or speeding up the scan with the Linux find command, which can limit the search to a given depth.
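Both improvements could be sketched like this (the demo paths are mine; the wp-cli line is illustrative and assumes wp-cli's support for dumping to stdout with `-`):

```shell
#!/usr/bin/env bash
# Sketch of both suggested improvements, using a hypothetical demo tree.
ROOT="$(mktemp -d)"
mkdir -p "$ROOT/a/b/c"
touch "$ROOT/a/wp-config.php" "$ROOT/a/b/c/wp-config.php"

# Descend at most two levels (GNU find's -maxdepth):
# finds the shallow config, skips the deep one.
shallow="$(find "$ROOT" -maxdepth 2 -name wp-config.php)"
printf '%s\n' "$shallow"

# Compressing the export: dump to stdout and pipe through gzip, so the
# SQL never hits disk uncompressed (illustrative, requires wp-cli):
# wp db export - --path="$site_dir" | gzip > "$site_dir/.ht_site_db.sql.gz"
```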

Feel free to submit ideas and suggestions for improvements.

Disclaimer: The content in this post is for educational purposes only. Always remember to take a backup before doing any of the suggested steps just to be on the safe side.
Referral Note: When you purchase through a referral link (if any) on this page, we may earn a commission.
If you're feeling thankful, you can buy me a coffee or a beer