Last Updated: 2020-06-17
- DBeaver is a free, open-source, cross-platform database tool for developers, SQL programmers, and database administrators. It supports many databases, including PostgreSQL, MariaDB, MySQL, and YugabyteDB, and plugins and extensions add support for other databases that provide a JDBC driver.
- Connecting DBeaver to the Cloud SQL Proxy: this step is the same whether or not you are running the Cloud SQL Proxy in a container. Once the proxy is running, connecting DBeaver is straightforward.
Visualize and optimize slow queries from Google Cloud SQL. The next and final step is to install EverSQL's Chrome extension; no configuration is needed. Once you're done, you can navigate to the Logging tab in the Google Cloud logs dashboard and scroll through the logs, or jump to a specific point in time where you suspect a slow query occurred.
What is Cloud Functions?
Cloud Functions is a lightweight compute solution for developers to create single-purpose, stand-alone functions that respond to Cloud events without needing to manage a server or runtime environment.
In this codelab, you'll write a Cloud Function in Python. The function:
- Connects to a Cloud SQL Database instance.
- Sends an insert statement to a table in the database.
What you'll learn:
- How to access the Cloud Functions web UI in the Google Cloud Console.
- How to create a Cloud Function.
- How to test a Cloud Function.
- How to connect to a Cloud SQL database instance (either MySQL or PostgreSQL) using Python.
- How to write to a Cloud SQL database using Python.
What you'll need:
- A browser, such as Chrome or Firefox.
- A Google Cloud Platform project that contains your Cloud SQL instance.
- If you don't already have one, we have a tutorial for that. Do all the steps except for deleting the instance.
- Your instance contains a MySQL or PostgreSQL database with a table.
- Your instance connection name, the database and table names, the database user name and the user's password.
- You can find the connection name in the Console UI. Navigate to your SQL instance's Overview page. The connection name is in the format: PROJECT_ID:REGION:INSTANCE_NAME. (You can also retrieve it from the command line; see the gcloud sketch after this list.)
- The database names are your own. If you have not already created a Cloud SQL database, we have a tutorial for that. Here is an example of a simple database and table for MySQL:
CREATE DATABASE library;
USE library;
CREATE TABLE books (title VARCHAR(100));
INSERT INTO books (title) VALUES ('Cloud SQL for Winners');
- A Service account with the Cloud SQL Client role.
- In the Console UI, navigate to the IAM & Admin > IAM page.
- You can use the default Compute Engine service account. It has the suffix compute@developer.gserviceaccount.com.
- Select Edit using the pencil icon.
- Click ADD ANOTHER ROLE and add Cloud SQL > Client.
- Click SAVE.
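If you prefer the command line, both of these prerequisites can be handled with gcloud. This is a sketch assuming the gcloud CLI is installed; PROJECT_ID, INSTANCE_NAME, and SERVICE_ACCOUNT_EMAIL are placeholders for your own values:

```bash
# Look up the instance connection name (PROJECT_ID:REGION:INSTANCE_NAME).
gcloud sql instances describe INSTANCE_NAME --format='value(connectionName)'

# Grant the Cloud SQL Client role to the service account.
gcloud projects add-iam-policy-binding PROJECT_ID \
    --member='serviceAccount:SERVICE_ACCOUNT_EMAIL' \
    --role='roles/cloudsql.client'
```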
The Cloud Function code for connecting to a Cloud SQL database is shown below. Some of the variable values depend on whether your Cloud SQL database is MySQL or PostgreSQL, and on your own database information.
The Cloud Functions UI in the Cloud Console includes a text editor. You can copy/paste and edit the code there, or edit the code locally first, and then copy/paste it into the UI.
main.py
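The full listing is only sketched here. The following is a minimal example of what main.py can look like for the MySQL case, assuming the PyMySQL driver (listed in requirements.txt) and the Unix socket that Cloud Functions mounts for Cloud SQL; the uppercase values are placeholders for your own details:

```python
import pymysql

# Placeholders; replace with your own instance details.
CONNECTION_NAME = 'PROJECT_ID:REGION:INSTANCE_NAME'
DB_USER = 'DB_USER'
DB_PASSWORD = 'DB_PASSWORD'
DB_NAME = 'library'


def insert(request):
    """HTTP Cloud Function that inserts one row into the books table."""
    # Cloud Functions reaches Cloud SQL over a Unix domain socket
    # mounted at /cloudsql/<instance connection name>.
    connection = pymysql.connect(
        unix_socket='/cloudsql/' + CONNECTION_NAME,
        user=DB_USER,
        password=DB_PASSWORD,
        db=DB_NAME,
    )
    try:
        with connection.cursor() as cursor:
            cursor.execute(
                "INSERT INTO books (title) VALUES (%s)",
                ('Cloud SQL for Winners',),
            )
        connection.commit()
    finally:
        connection.close()
    return 'ok'
```

In this sketch, requirements.txt contains the single line pymysql. For PostgreSQL you would swap in a driver such as pg8000 and connect through the same /cloudsql/ socket directory.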
- In a browser, go to the Google Cloud Platform Console UI.
- Select Cloud Functions from the Navigation menu.
- Click CREATE FUNCTION on the button bar.
- Enter a name for the function.
- Select the HTTP trigger. (Make a note of the URL displayed beneath the trigger item. It will be in this format: https://REGION-PROJECT_ID.cloudfunctions.net/FUNCTION_NAME)
- Select Inline editor for the source code option.
- Select Python 3.7 for the runtime option.
- In the source code editor, delete the existing content of both requirements.txt and main.py, and replace them with your edited versions of the code above.
- Enter insert as the name of the Function to execute.
- In the Advanced options, select a Service account that has the Cloud SQL Client role.
- Click Create and wait for the spinner to stop. A green check appears when the function is ready to use.
- In a browser, go to the Google Cloud Platform Console UI.
- Select Cloud Functions from the Navigation menu.
- Click on the name of the function you created earlier.
- Select the TESTING link in the middle of the page.
- Select TEST THE FUNCTION.
- The result should appear: ok (If the test fails, you'll see a stack trace to help with debugging.)
- In a browser, go to the URL that you saved earlier, when you created the function.
- The ok result should appear in the browser as well.
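Assuming the function allows unauthenticated invocations, you can also trigger it from a terminal with curl, using the trigger URL you noted when creating the function:

```bash
# Invoke the HTTP-triggered function; it should respond with "ok".
curl https://REGION-PROJECT_ID.cloudfunctions.net/FUNCTION_NAME
```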
Congratulations, you've successfully built a Cloud Function that works with Cloud SQL.
Specifically, you've created a Cloud Function that connects and writes to a Cloud SQL database instance.
Create a SQL Server instance, configuration screen - Connectivity section
Create a SQL Server instance, popup asking to enable Service Networking API
Backups / Automate backups: Disabled
Create a SQL Server instance, configuration screen - Backups, recovery, and high availability section
5. Click Create. Creating the SQL Server instance will take about 10 minutes to complete.
Wait for SQL Server instance to be created screen
SQL Server instance screen
3. Migrate the database
Next, we need to migrate the database using Entity Framework.
First, open the Cloud SQL Overview page for the SQL Server instance that we just created, and make a note of the connection name.
Second, we need to connect to the database from our local machine. For this, the Cloud SQL Proxy for SQL Server is needed. Download it and follow the instructions to install it from here: https://cloud.google.com/sql/docs/sqlserver/sql-proxy
Then, start the proxy. The instances parameter is the connection name, with =tcp:1433 added at the end.
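The exact command depends on your platform; a likely invocation, assuming the standard binary names from the download page:

```bash
# Windows:
cloud_sql_proxy.exe -instances=PROJECT_ID:REGION:INSTANCE_NAME=tcp:1433

# macOS or Linux:
./cloud_sql_proxy -instances=PROJECT_ID:REGION:INSTANCE_NAME=tcp:1433
```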
NOTE: Make sure that you don’t have a SQL Server instance running on the default port (1433) on your local machine. If so, stop it before attempting to run the above command.
Next, on your local machine, run the database migration according to the documentation (Section “Configuring the sample to use SQL Server”), making sure to edit src/Web/appsettings.json to use the following connection strings:
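Based on the proxy settings above and the connection-string format used later for Cloud Run, the entries in src/Web/appsettings.json take roughly this form (a sketch assuming the proxy is listening on 127.0.0.1:1433; YOUR_PASSWORD is a placeholder for the generated password):

```json
{
  "ConnectionStrings": {
    "CatalogConnection": "Server=127.0.0.1;User Id=sqlserver;Password=YOUR_PASSWORD;Initial Catalog=Microsoft.eShopOnWeb.CatalogDb;",
    "IdentityConnection": "Server=127.0.0.1;User Id=sqlserver;Password=YOUR_PASSWORD;Initial Catalog=Microsoft.eShopOnWeb.Identity;"
  }
}
```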
Use the generated password from the “Create a SQL Server instance” step, then proceed to run the migration according to the eShopOnWeb documentation (the steps involving dotnet ef database update ...)
NOTE: Skip step “1. Ensure your connection strings in appsettings.json point to a local SQL Server instance” from the eShopOnWeb documentation, as we’ll be using the connection strings above instead
4. Browse the SQL Server instance (optional)
Use Azure Data Studio, or SQL Server Management Studio (Windows only), making sure that the Cloud SQL Proxy for SQL Server is still running.
In the main connection settings, use:
Server: 127.0.0.1
User name: sqlserver
Password: The generated password from the “Create a SQL Server instance” step
You should now be able to connect to the instance, and browse the recently created Microsoft.eShopOnWeb.CatalogDb and Microsoft.eShopOnWeb.Identity databases.
5. Connect the Cloud Run service to the Cloud SQL instance
1. Open Cloud Run.
2. Click the Cloud Run service that we created earlier (“eshop-on-web”).
3. Click Edit & Deploy New Revision.
Container: Default values (make sure Port is 8080; this should be the default).
Variables / Environment Variables:
First, remove ASPNETCORE_ENVIRONMENT
Deploy revision screen - Advanced settings, variables, before deleting item
Then, configure:

| Variable | Value |
| --- | --- |
| ASPNETCORE_URLS (already present) | http://*:8080 |
| ConnectionStrings__CatalogConnection | Server=PRIVATE_IP_ADDRESS_OF_SQL_SERVER_INSTANCE;User Id=sqlserver;Password=PASSWORD_FROM_CREATE_SQL_SERVER_INSTANCE_STEP;Initial Catalog=Microsoft.eShopOnWeb.CatalogDb; |
| ConnectionStrings__IdentityConnection | Server=PRIVATE_IP_ADDRESS_OF_SQL_SERVER_INSTANCE;User Id=sqlserver;Password=PASSWORD_FROM_CREATE_SQL_SERVER_INSTANCE_STEP;Initial Catalog=Microsoft.eShopOnWeb.Identity; |
The last two are connection strings in the format required by SQL Server.
In both connection strings, replace the Server parameter with the Private IP address that was assigned to the SQL Server instance you just created. This is visible in the dashboard for Cloud SQL and will be a private IP address, e.g. in the format 10.x.x.x
Also, replace the Password parameter with the generated password from the “Create a SQL Server instance” step.
Make sure that, after removing ASPNETCORE_ENVIRONMENT, exactly three environment variables are defined:
- ASPNETCORE_URLS
- ConnectionStrings__CatalogConnection
- ConnectionStrings__IdentityConnection
Deploy revision screen - Advanced settings, variables, after adding new environment variables
4. Continue to the Connections tab, and under “VPC Connector”, click “Generate a serverless connector”.
Deploy revision screen - Advanced settings, connections
5. Click Enable to enable the Serverless VPC Access API.
6. Click Create connector.
Enable Serverless VPC Access API screen
Serverless VPC access screen
Create connector:
Name: eshop-on-web
Region: australia-southeast1
Network: default
IP range: e.g. 10.8.0.0/28 (any unused /28 CIDR range in your VPC network)
Create connector screen
7. Click Create. Creating the connector will take a few minutes to complete.
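Equivalently, assuming you have the gcloud CLI set up, the same connector can be created from the command line:

```bash
# Create a Serverless VPC Access connector matching the settings above.
gcloud compute networks vpc-access connectors create eshop-on-web \
    --region=australia-southeast1 \
    --network=default \
    --range=10.8.0.0/28
```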
8. Go back to the deployment page for Cloud Run which was open previously, and refresh the page, then click the “Connections” tab again.
9. Select the eshop-on-web connector in the VPC Connector dropdown (it should be available now).
Deploy revision screen - Advanced settings, connections, new eshop-on-web VPC connector selected
10. Click Deploy. Deploying the new container image will take a few minutes to complete.
Note that there was no need for a new build (from Cloud Build), as the container image already supports the necessary configuration values.
6. Test the deployed website
Navigate to the website again using Firefox with the “Modify Header Value (HTTP Headers)” add-on, as outlined in the previous blog post, making sure to renew the token using gcloud auth print-identity-token, as the token expires after 1 hour.
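If you prefer a terminal to the browser add-on, an equivalent check is to pass a fresh identity token as a bearer token with curl (SERVICE_URL is a placeholder for your Cloud Run service URL):

```bash
# Fetch a fresh identity token and call the Cloud Run service with it.
curl -H "Authorization: Bearer $(gcloud auth print-identity-token)" SERVICE_URL
```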
The website should function much like it did previously, but now using a real database instance for persistence.
Conclusion
Building on the setup from the previous post, we now have a managed database instance connected to our ASP.NET web application on Google Cloud that can easily be scaled to suit operational requirements.
The managed database instance can also easily be reconfigured for a High Availability setup with automated backups, which may be important in a production environment.