Here’s a fun little issue that took a newer developer a mere 2.5 hours to figure out!
Let’s say you plan to migrate a site from another host to Rackspace Cloud Sites, so you grab the files from the FTP server and move them over to the new host. You then download the database, plain, zipped, or gzipped, and begin importing it through phpMyAdmin. After a few seconds you get a white screen with the sidebar still on the left. You inspect the page and find a 500 error in the console: unable to find /login/. Strange. So you try again and again with different zipped versions of the database, and run into assorted SQL errors when importing it piecemeal. The main culprit is, without a doubt, the size of the database, even zipped; it is most likely causing PHP to time out, and you don’t want to take the time to figure out which tables depend on which to avoid those SQL errors.
We will also assume you don’t have a sysadmin available at 8 in the evening, you don’t have much experience using SSH or the command line to work with databases, and you’ve exhausted nearly every Google query you can think of to describe your problem. Well, luckily for you, Rackspace has a more beginner-friendly solution! Unfortunately, that resource doesn’t name the sections of its FAQ, so linking directly to the section you need isn’t possible. Below are the instructions from Rackspace for importing a large MySQL database, which can be found toward the bottom of the FAQ.
Major Tip!
After creating this cron job through Rackspace, MAKE SURE you delete it once the import has finished. Otherwise, if the database file remains in the same spot, the import WILL run again and overwrite the database every time. There is currently no option to run the job just once through the Rackspace interface, so be wary of this.
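If you’re worried about forgetting, one defensive tweak (my own suggestion, not something Rackspace documents) is to have the import script described below move the dump aside after a successful run, so any rerun of the cron job becomes a no-op:

#!/bin/sh
# Hypothetical variant of the restore script: import, then move the dump
# aside so the next scheduled run finds nothing to import.
if [ -f /path/to/file/db_import.sql ]; then
  mysql -h DB_HOST -u DB_USER -p'DB_PASSWORD' DB_NAME < /path/to/file/db_import.sql \
    && mv /path/to/file/db_import.sql /path/to/file/db_import.sql.done
fi

You should still delete the cron job when you’re done; this just guards against an accidental overwrite in the meantime.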
You should be able to use the Online Manager (see PHPmyAdmin Database Management Interface) to import databases that are less than 16 MB in size without issue.
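Not sure whether your dump clears that limit? Assuming you have shell access wherever the dump currently lives (the file names here are hypothetical), a quick check:

ls -lh db_import.sql       # size of an uncompressed dump
gzip -l db_import.sql.gz   # reports the uncompressed size of a gzipped dump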
However, if you need to import database data that exceeds 16 MB into your Cloud Sites environment, you can use a script which you can schedule via a cron job.
Following is an example to help you set up this script:
#!/bin/sh
mysql -h DB_HOST -u DB_USER -p'DB_PASSWORD' DB_NAME < /path/to/file/db_import.sql
- Create a new text file, add the preceding code to it, and save it as restore.sh. In the preceding script, replace the placeholders with your actual information, as follows (a filled-in example follows this list):
- /path/to/file – The absolute path to your files. You can find this path by clicking Hosting > Cloud Sites > Features in the Cloud Sites Control Panel and scrolling down the page; it is the Linux Path listed there. An example is /mnt/target02/123456/www.domain.com. (Note that the file name is not included in this path.)
- DB_HOST – The name of the database host. An example is mysql5-9.wc1.
- DB_USER – The name of the database user.
- DB_PASSWORD – The password for the database user.
- DB_NAME – The name of the database to which you are restoring.
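For reference, a filled-in restore.sh might look like the following; every value is hypothetical (the host and path are simply the examples from the list above):

#!/bin/sh
# Hypothetical filled-in example; substitute your own credentials and path.
mysql -h mysql5-9.wc1 -u mydbuser -p'mydbpassword' mydatabase < /mnt/target02/123456/www.domain.com/db_import.sql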
- Upload this file to your Cloud Site and create a “perl” cron job to schedule the task (the Perl option runs shell scripts as well).
Note: For security purposes, we recommend that you keep this file outside your webroot directory.
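For example, if your Linux Path is /mnt/target02/123456/www.domain.com (the webroot), a sibling location such as /mnt/target02/123456/restore.sh would sit outside it. This is an assumption about the directory layout, so double-check your own paths in the Control Panel.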
- Put the name of the file, which in this example is restore.sh, in the “Command To Run” field.
- Select Perl as the Command Language, and then select the minimum interval at which you want the script to run (5 minutes is recommended). Logs are written to your /logs/ folder each time a cron job runs, and you receive an email confirmation telling you whether the job succeeded.
Note: The cron job has 15 minutes to complete. If the script takes longer than that, it will time out. If this is a problem for you, break the import up into smaller pieces (several .sql files) and reschedule the job as necessary; one way to produce those pieces is sketched below.
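If you still have access to the old host, a rough way to split the dump (OLD_HOST, OLD_USER, OLD_PASSWORD, and OLD_DB are placeholders for the old server’s details, and this is a sketch rather than a one-size-fits-all script) is to export each table to its own file:

#!/bin/sh
# Hypothetical sketch: dump each table of the old database to its own
# .sql file so the pieces can be imported one at a time.
for TABLE in $(mysql -h OLD_HOST -u OLD_USER -p'OLD_PASSWORD' -N -e 'SHOW TABLES' OLD_DB); do
  mysqldump -h OLD_HOST -u OLD_USER -p'OLD_PASSWORD' OLD_DB "$TABLE" > "$TABLE.sql"
done

You can then point the cron job at each piece in turn (remembering the Major Tip above). Keep in mind that table order can matter if you have foreign keys; importing parent tables before their children helps avoid the SQL errors mentioned earlier.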