Import/Export (SQL Instance -> Cloud Storage Bucket) Using Deployment Manager on GCP

This article explains how you can export your data from a SQL Instance to a Google Cloud Storage Bucket and vice versa, i.e. populate a new SQL Instance from a SQL dump file stored in Cloud Storage.

This article mainly focuses on importing and exporting the data, not on how to create a database and tables or add records to a table; for that, you may refer to the following Quickstart:

It is assumed that you have installed the GCP SDK (the command-line interface for Google Cloud); if not, refer to:

The approach followed is:

i. Jinja templates that define the structure of the SQL Instance and the import and export configuration.

ii. Yaml files that import the Jinja templates and provide the properties for the above structure.

iii. A Python script that executes the Deployment Manager and grants the necessary permissions (ACLs) on the SQL Instance and Cloud Storage.

First Things First (Create a Cloud Storage Bucket):

We create a Cloud Storage Bucket located in us-central1 with storage class “NEARLINE”. We choose NEARLINE because we only want to store backups of our SQL Instance and will access them about once a month.

The Jinja file for our Storage Bucket is:
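A minimal sketch of such a template is shown below; the property names (`bucketName`, `location`, `storageClass`) are assumptions for illustration, not necessarily the article's exact template:

```jinja
{# storagebucket.jinja - illustrative sketch of a Deployment Manager bucket template #}
resources:
- name: {{ properties["bucketName"] }}
  type: storage.v1.bucket
  properties:
    location: {{ properties["location"] }}
    storageClass: {{ properties["storageClass"] }}
```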

The Yaml file is:
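A sketch of the config that imports the template above, with illustrative values (the bucket name is a placeholder; bucket names must be globally unique):

```yaml
# storagebucket.yaml - illustrative values, matching the sketch template
imports:
- path: storagebucket.jinja

resources:
- name: storage-bucket
  type: storagebucket.jinja
  properties:
    bucketName: your-unique-backup-bucket   # assumption: replace with your bucket name
    location: us-central1
    storageClass: NEARLINE
```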

Now, it's time to create the SQL Instance:

We create the SQL Instance in the same region, us-central1.

We create a single Jinja file that will be used to create the SQL Instance, export data to Cloud Storage, and import data from Cloud Storage.
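A simplified sketch of such a template is below. It shows only the instance-creation part; the article's actual template also carries the export/import branches driven by the optional `sqlexportstructure` / `sqlimportstructure` properties shown later. Property names and the tier are illustrative assumptions:

```jinja
{# create_sql_instance.jinja - simplified sketch; the real template also
   handles export/import via optional properties #}
resources:
- name: {{ properties["instanceName"] }}
  type: sqladmin.v1beta4.instance
  properties:
    region: {{ properties["region"] }}
    databaseVersion: MYSQL_5_7
    settings:
      tier: db-n1-standard-1
```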

The Yaml file to create the SQL Instance is:
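A sketch with illustrative values (the instance name is an assumption):

```yaml
# create_sql_instance.yaml - illustrative values
imports:
- path: create_sql_instance.jinja

resources:
- name: sql-instance
  type: create_sql_instance.jinja
  properties:
    instanceName: my-sql-instance   # assumption: replace with your instance name
    region: us-central1
```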

The Yaml file to export the SQL Instance data (a backup) to the Storage Bucket is:

In the above Yaml file, replace “ENTER_SQL_BUCKETPATH” with the actual Storage Bucket path, which looks like “gs://your-bucket/”:

sqlexportstructure: ENTER_SQL_BUCKETPATH
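Putting that property in context, the export config might look like the following sketch (everything except the `sqlexportstructure` line is an illustrative assumption):

```yaml
# export config - same template as creation, with the export property added
imports:
- path: create_sql_instance.jinja

resources:
- name: sql-instance
  type: create_sql_instance.jinja
  properties:
    instanceName: my-sql-instance   # assumption
    region: us-central1
    sqlexportstructure: ENTER_SQL_BUCKETPATH   # e.g. gs://your-bucket/
```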

The Yaml file to import data from the Storage Bucket into the SQL Instance is:

In the above Yaml file, replace “ENTER_SQL_DUMPFILE” with the actual Storage Bucket path plus the dump file (object name) with its extension, which looks like “gs://your-bucket/object”:

sqlimportstructure: ENTER_SQL_DUMPFILE
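In context, the import config might look like this sketch (again, everything except the `sqlimportstructure` line is an illustrative assumption):

```yaml
# import config - same template, with the import property added
imports:
- path: create_sql_instance.jinja

resources:
- name: sql-instance
  type: create_sql_instance.jinja
  properties:
    instanceName: my-sql-instance   # assumption
    region: us-central1
    sqlimportstructure: ENTER_SQL_DUMPFILE   # e.g. gs://your-bucket/object
```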

Executing the scripts:

Finally, after creating all the Jinja templates and Yaml files, we are all set to deploy them using the GCP Deployment Manager.

To execute, all we need to do is create a Python script that runs the Deployment Manager and also adds the Access Control List (ACL) permissions required.
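A minimal sketch of what such a driver script could look like, assuming it shells out to `gcloud` and `gsutil`. The function names, prompts, and deployment names are assumptions for illustration, not the article's actual code:

```python
# deploy.py - illustrative sketch of the driver script described above.
import subprocess


def create_deployment_cmd(deployment, yaml_file):
    """Build the gcloud command that creates a new deployment."""
    return ["gcloud", "deployment-manager", "deployments", "create",
            deployment, "--config", yaml_file]


def update_deployment_cmd(deployment, yaml_file):
    """Build the gcloud command that updates an existing deployment
    (used for the export/import steps)."""
    return ["gcloud", "deployment-manager", "deployments", "update",
            deployment, "--config", yaml_file]


def grant_bucket_write_cmd(service_account, bucket):
    """Build the gsutil command that grants the SQL Instance's service
    account write access (ACL) on the backup bucket."""
    return ["gsutil", "acl", "ch", "-u",
            f"{service_account}:W", f"gs://{bucket}"]


def run(cmd):
    """Execute a command, raising on failure."""
    subprocess.run(cmd, check=True)


# Example usage (not executed here; names are placeholders):
#   run(create_deployment_cmd("storage-bucket-dm", "storagebucket.yaml"))
#   run(create_deployment_cmd("sql-instance-dm", "create_sql_instance.yaml"))
#   run(grant_bucket_write_cmd("sa@project.iam.gserviceaccount.com", "your-bucket"))
```

The real script additionally prompts for the deployment name, the Yaml file, and the service account, as described in the steps below.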

Steps to follow while executing the Python Script:

  1. When you execute the script for the first time, you will need to enable the APIs.
  2. Here, we will be creating two deployments: one for the Storage Bucket and one for the SQL Instance.
  3. The first time, you will need to create a new deployment, for which you will be asked to provide:

i. Deployment Manager Name

ii. Yaml File Name

4. While creating the deployment for the Storage Bucket, the Yaml file is “storagebucket.yaml”; for the SQL Instance it is “create_sql_instance.yaml”.

5. Now export the data to the Storage Bucket: choose “2. Update Deployment Manager” when prompted, then choose “2. Export to Cloud Storage”.

6. The script will then add a write ACL on that Storage Bucket for the SQL Instance, using the “Service Account” of your SQL Instance. You will be prompted to enter the service account; you can find your SQL service account here.

7. Now, to import the data (backup), again choose “2. Update Deployment Manager” when prompted, then choose “1. Import to SQL Instance”, and follow step 6 for entering the service account.
