Maybe five questions were repeated from the 2022 exam. I’m surprised to see more MySQL-related questions than PostgreSQL-related ones.
- Learn about Cloud SQL serverless export; it is part of the best practices for importing and exporting data.
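  A minimal sketch of a serverless export, assuming a hypothetical instance `my-instance`, bucket `gs://my-exports`, and database `mydb`; the `--offload` flag is what makes the export serverless so it does not consume instance resources:

  ```sh
  # Serverless (offloaded) export of one database to Cloud Storage; all names are placeholders
  gcloud sql export sql my-instance gs://my-exports/mydb-backup.sql \
    --database=mydb \
    --offload
  ```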
- When a Cloud SQL instance suffers from an HDD disk I/O bottleneck, what’s the solution? You can’t edit an existing instance to change its storage from HDD to SSD. You’d need to create a read replica that uses SSD storage and promote it to be the primary (see the sketch below).
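  A hedged sketch of the replica-and-promote path; the instance names are placeholders, and I’m assuming `--storage-type=SSD` is set when the replica is created:

  ```sh
  # Create a read replica backed by SSD storage (names are placeholders)
  gcloud sql instances create my-instance-ssd \
    --master-instance-name=my-instance \
    --storage-type=SSD

  # After replication catches up, promote the replica to be a standalone primary
  gcloud sql instances promote-replica my-instance-ssd
  ```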
- When you migrate SQL Server to Cloud SQL for SQL Server and the existing server’s disk performance peaks at 25,000 IOPS, how do you choose the target machine configuration to maximize disk performance? Choose the largest disk, because disk I/O performance scales with disk size.
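  A hedged illustration of provisioning a large SSD volume up front, since IOPS scales with provisioned size; the instance name, tier, and sizes below are placeholders, not recommendations:

  ```sh
  # Provision a large SSD volume so the target instance can reach the required IOPS (values are illustrative)
  gcloud sql instances create sqlserver-target \
    --database-version=SQLSERVER_2019_STANDARD \
    --tier=db-custom-16-65536 \
    --storage-type=SSD \
    --storage-size=4096
  ```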
- You notice a write-heavy application has normal performance with a regional Cloud Spanner instance but slow performance on a multi-regional Cloud Spanner instance. What’s the solution? a. increase read-write (leader) replicas; b. bring the application close to the leader region; in a multi-region configuration, writes are coordinated through the leader region, so distance to the leader drives write latency.
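  A related knob, sketched with a hypothetical instance `prod-spanner` and database `ordersdb`: the default leader region of a multi-region database can be changed with a DDL statement, which is another way to bring the leader closer to the writers:

  ```sh
  # Move the default leader region of a multi-region database (names and region are placeholders)
  gcloud spanner databases ddl update ordersdb \
    --instance=prod-spanner \
    --ddl="ALTER DATABASE ordersdb SET OPTIONS (default_leader = 'us-east1')"
  ```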
- At least 2 questions appeared about choosing Cloud SQL Query Insights to troubleshoot an application’s slow-query problems.
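  For reference, a hedged one-liner to turn Query Insights on for an existing instance (instance name is a placeholder):

  ```sh
  # Enable Query Insights on an existing Cloud SQL instance (instance name is a placeholder)
  gcloud sql instances patch my-instance \
    --insights-config-query-insights-enabled
  ```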
- Understand the difference between Datastream and Database Migration Service. Datastream only supports Cloud Storage and BigQuery as targets. What are the targets for Database Migration Service?
- For migrating on-premises PostgreSQL or MySQL databases, the cloud-native method is Database Migration Service rather than the pg_dump or mysqldump commands. Because it replicates changes continuously, Database Migration Service requires less downtime than a dump-and-import.
- ML engineers need a daily export of a database in a Cloud SQL for MySQL instance; what’s the Google-recommended method? a. execute mysqldump manually; b. (correct) configure Cloud Scheduler to trigger a Cloud Function that calls the REST API to export the database (remember the REST API endpoint format, sqladmin.googleapis.com, and its parameters); c. (I chose this wrong answer) configure Cloud Scheduler to trigger a Cloud Function that calls the REST API to export with SELECT * FROM Table, which is wrong because the export REST API does not accept a query.
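  A hedged sketch of the export call such a Cloud Function could make; the project, instance, bucket, and database names are placeholders, and it assumes the caller has an access token and the Cloud SQL service account can write to the bucket:

  ```sh
  # Call the Cloud SQL Admin API to export a database to Cloud Storage (all names are placeholders)
  curl -X POST \
    -H "Authorization: Bearer $(gcloud auth print-access-token)" \
    -H "Content-Type: application/json" \
    -d '{"exportContext": {"fileType": "SQL", "uri": "gs://my-exports/daily.sql", "databases": ["mydb"]}}' \
    "https://sqladmin.googleapis.com/v1/projects/my-project/instances/my-instance/export"
  ```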
- Convert an existing Python script that executes queries to assess a business scenario (I forgot the full story) and only reads from a Cloud SQL instance. a. configure Cloud Scheduler and a Cloud Function triggered by Pub/Sub; b. (correct) configure Cloud Scheduler to trigger a Cloud Function directly; c, d. create a Cloud Composer instance and deploy a DAG, which is too heavyweight for a single script.
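  A hedged sketch of the scheduling side, assuming the script has already been deployed as an HTTP-triggered Cloud Function; the URL, schedule, region, and service account are placeholders:

  ```sh
  # Invoke an HTTP Cloud Function on a daily schedule (all values are placeholders)
  gcloud scheduler jobs create http daily-report-job \
    --location=us-central1 \
    --schedule="0 6 * * *" \
    --uri="https://us-central1-my-project.cloudfunctions.net/run-report" \
    --http-method=POST \
    --oidc-service-account-email=scheduler-invoker@my-project.iam.gserviceaccount.com
  ```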
- Choose 2 answers for migrating a 100 TB SQL Server database over a 1 Gbps interconnect with less than 48 hours of downtime: a. (correct) enable CDC to replicate the database to Cloud SQL for SQL Server, which should keep downtime under an hour; b. (correct) increase the interconnect bandwidth to 10 Gbps, then export the database on-premises and import it into the Cloud SQL for SQL Server instance over the interconnect; c, d. (wrong) export on-premises and import over the 1 Gbps or 2 Gbps interconnect; neither is fast enough (see the rough math below).
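  Back-of-the-envelope transfer times for the 100 TB payload, ignoring protocol overhead and compression (real numbers would differ):

  ```sh
  # 100 TB ≈ 800,000 Gb of data to move
  #  1 Gbps: 800,000 Gb /  1 Gbps ≈ 800,000 s ≈ 9.3 days  -> misses the 48 h window
  #  2 Gbps: 800,000 Gb /  2 Gbps ≈ 400,000 s ≈ 4.6 days  -> still misses it
  # 10 Gbps: 800,000 Gb / 10 Gbps ≈  80,000 s ≈ 22 hours  -> fits within 48 h
  ```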
- Choose the best database service for ingesting IoT sensor data arriving 10 times a second: prefer Bigtable.
- Choose the best database service for an Android application’s backend where devices have intermittent connectivity: prefer Firebase/Firestore.
- How do you increase an existing Cloud SQL instance’s vCPU count and memory with minimal effort? Prefer the `gcloud sql instances patch` command (see the sketch below).
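  A hedged example of that in-place resize; the instance name and sizes are placeholders, and the change restarts the instance briefly:

  ```sh
  # Scale vCPUs and memory in place (instance name and sizes are placeholders; this restarts the instance)
  gcloud sql instances patch my-instance \
    --cpu=8 \
    --memory=32GiB
  ```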
- What’s the cloud-native method to run analytical, read-only queries against an on-premises PostgreSQL server? Create a Cloud SQL read replica that replicates from the on-premises database (also called external replication), then create a BigQuery connection to the Cloud SQL instance and run federated queries against it. pg_dump-related answers are wrong.
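  A hedged sketch of the federated-query step, assuming a BigQuery connection named `us.cloudsql_pg` has already been created for the Cloud SQL instance; the connection name, table, and columns are placeholders:

  ```sh
  # Run a federated query against the Cloud SQL replica through BigQuery (names are placeholders)
  bq query --use_legacy_sql=false \
    "SELECT * FROM EXTERNAL_QUERY('us.cloudsql_pg', 'SELECT order_id, total FROM orders LIMIT 10');"
  ```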
- How do you configure the Bigtable application profile to support multi-cluster routing? You want a Bigtable instance with 2 clusters in different regions to keep accepting write requests after one cluster fails, and you want every application to benefit from it. a. create a custom app profile with single-cluster routing; b. create a custom app profile with multi-cluster routing; c. configure the default app profile with single-cluster routing; d. (correct) configure the default app profile with multi-cluster routing, so applications that don’t specify a profile benefit automatically (see the sketch below).
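  A hedged sketch of updating the default app profile; the instance name is a placeholder, and I’m assuming the default profile keeps its standard id of `default`:

  ```sh
  # Switch the default app profile to multi-cluster (any-cluster) routing; instance name is a placeholder
  gcloud bigtable app-profiles update default \
    --instance=my-bigtable-instance \
    --route-any
  ```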
- At a retail client, how do you prevent Cloud SQL instances from applying security updates during the maintenance window? The client wants to avoid any database downtime during the Q4 holiday shopping season. Set a deny maintenance period between November 1st and January 15th. Setting it between November 1st and February 15th is wrong because the maximum length is 90 days and February is not in the holiday shopping season. Using Cloud Scheduler to set the maintenance window in Cloud SQL is wrong.
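  A hedged example of setting that deny maintenance period; the instance name and exact dates are placeholders, and the flag names are from memory, so verify them against the Cloud SQL docs:

  ```sh
  # Block maintenance from Nov 1 through Jan 15 (instance name and dates are placeholders)
  gcloud sql instances patch my-instance \
    --deny-maintenance-period-start-date=2024-11-01 \
    --deny-maintenance-period-end-date=2025-01-15 \
    --deny-maintenance-period-time=00:00:00
  ```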
- At a company, how do you isolate a group of Cloud SQL instances so they can only be created in 2 allowed regions? This group of instances serves specific regulatory use cases, so its data can’t be stored in other regions; other Cloud SQL instances at the company have no such restriction. Create a separate project and set an organization policy that allows Cloud SQL instances to be created only in the 2 regions. Don’t create 2 organizations or implement any manual checking of regions at instance creation time.
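  A hedged sketch of that project-scoped policy using the `constraints/gcp.resourceLocations` list constraint; the project id and regions are placeholders, and the exact value syntax (plain region names vs. `in:` value groups) is worth double-checking in the docs:

  ```sh
  # Allow resources in the regulated project to be created only in two regions (project and regions are placeholders)
  gcloud resource-manager org-policies allow constraints/gcp.resourceLocations \
    us-east1 us-west1 \
    --project=regulated-db-project
  ```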