5 Tips to Become a Google Cloud Certified Professional Architect

Category: Google Cloud Platform | Posted: Oct 13, 2020 | By: Serena Josh

If you aspire to become a Google Cloud certified architect, below are some pointers for you. These are based on my experience preparing for the exam and observations from the actual test.

1. Understand the Concepts of the Hybrid Cloud

There is a lot of focus on connecting on-premises infrastructure to the Google Cloud Platform. You need to fully understand the options and tradeoffs involved in extending an enterprise data center to GCP.

Google, like its competitors, offers several channels for connecting on-prem resources to the cloud. Each channel has unique characteristics that address specific enterprise scenarios. You need to know the benefits and drawbacks of each solution against the others when applying a hybrid strategy. Concentrate on the hybrid networking solutions provided by Google.

Cloud VPN securely connects on-prem resources to a GCP VPC over the public internet. It is the least expensive option for customers to open a secure tunnel between their data center and the cloud.

Cloud Interconnect provides a dedicated 10 Gbps link directly to a location where Google Cloud has a point of presence. This delivers unmatched connectivity but is expensive.

Direct Peering is a cheaper alternative to Cloud Interconnect that provides better performance than a VPN. While it does not come with an SLA, Direct Peering lets customers connect directly to Google's network, cutting egress fees substantially.
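
As a study aid, the tradeoffs above can be condensed into a small decision helper. This is an illustrative sketch for exam-style reasoning only; the thresholds and inputs are my assumptions, not official Google guidance:

```python
def pick_hybrid_connectivity(needs_sla: bool, bandwidth_gbps: float,
                             budget_sensitive: bool) -> str:
    """Illustrative chooser among GCP hybrid networking options.
    Thresholds are assumptions for study purposes, not Google guidance."""
    if needs_sla and bandwidth_gbps >= 10:
        # Dedicated 10 Gbps+ link at a Google point of presence; pricey.
        return "Cloud Interconnect"
    if not needs_sla and budget_sensitive and bandwidth_gbps > 1:
        # No SLA, but direct connectivity and sharply reduced egress fees.
        return "Direct Peering"
    # Secure tunnel over the public internet; the cheapest entry point.
    return "Cloud VPN"
```

A scenario question that mentions an SLA requirement and heavy sustained throughput usually points at Interconnect; one that stresses cost and tolerates best-effort connectivity points at Peering or VPN.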

2. Know How to Move Data to Google Cloud

Moving data to the cloud is a crucial step in any migration. Google offers several services for migrating data to GCP. You should be able to select the right service for a given business scenario.

Become familiar with the gsutil command-line tool for performing standard operations on Google Cloud Storage. In many situations, this tool comes in handy for migrating a large number of files from local storage to the cloud. Understand how to parallelize uploads, configure security, and automate data movement with gsutil. It only makes sense to use this CLI when the data is in the range of a few GBs. Consider other options when you have to upload terabytes or petabytes of data.
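
A quick back-of-the-envelope calculation shows why uploading over the network stops making sense beyond a certain size. The sketch below is illustrative; the 100 Mbps uplink is an assumed figure:

```python
def transfer_hours(data_gb: float, link_mbps: float) -> float:
    """Hours to move `data_gb` gigabytes over a `link_mbps` megabit/s link,
    ignoring protocol overhead, retries, and throttling."""
    bits = data_gb * 8e9               # decimal gigabytes -> bits
    seconds = bits / (link_mbps * 1e6)
    return seconds / 3600

# 10 GB over an assumed 100 Mbps uplink: well under an hour.
print(round(transfer_hours(10, 100), 2))       # → 0.22
# A petabyte over the same link: years, hence the Transfer Appliance.
print(round(transfer_hours(1_000_000, 100)))   # → 22222
```

At roughly 22,000 hours (about 2.5 years) for a petabyte, shipping physical hardware wins easily.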

Storage Transfer Service is meant for migrating data from an online source such as Amazon S3, Azure Storage, or even an HTTP endpoint. Since Google doesn't bill for ingress, this is an ideal option for moving large amounts of data from other cloud platforms or storage services.

Transfer Appliance is the least expensive and fastest option when you need to securely move terabytes or petabytes of data to GCP. Both Google and the customer's team take part in the migration process.

If the customer needs to move large datasets directly into BigQuery, consider the BigQuery Data Transfer Service, which automates data movement from SaaS applications to Google BigQuery on a scheduled, managed basis.
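
The transfer options in this section can likewise be summarized as a rough selection function. The 10 TB cutoff and the source labels are assumptions for illustration, not official sizing guidance:

```python
def pick_transfer_service(size_tb: float, source: str) -> str:
    """Illustrative chooser among GCP data-migration options.
    Cutoffs and labels are assumptions for study purposes."""
    if source == "saas":
        return "BigQuery Data Transfer Service"   # scheduled loads into BigQuery
    if source in ("s3", "azure-storage", "http"):
        return "Storage Transfer Service"         # online sources; ingress is free
    if size_tb >= 10:
        return "Transfer Appliance"               # offline, shipped hardware
    return "gsutil"                               # fine for smaller datasets
```
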

3. Learn About Google Cloud IAM Inside Out

Google Cloud Identity and Access Management (IAM) is a service for implementing granular, fine-grained security policies. It's a comprehensive framework for securing any type of Google Cloud resource.

Learn the key differences between user accounts and service accounts. If you are familiar with AWS IAM, service accounts are a lot like IAM roles for EC2, where instances assume the context of a role. In GCP, service accounts can be used by any application that needs fine-grained access to a cloud resource. For example, you need a service account to connect Compute Engine VMs to Cloud SQL instances.

Understand how permissions propagate within the IAM hierarchy. The permissions defined at the parent level are always inherited by the child resources.
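
The inheritance rule can be modeled in a few lines. This is a conceptual sketch only; real IAM policies bind roles to members and support folders and conditions, all of which this toy model omits:

```python
# Conceptual sketch of GCP IAM inheritance: a resource's effective
# permissions are the union of the bindings on the resource itself and
# on every ancestor (org -> project -> resource). Names are made up.
hierarchy = {"org": None, "project-a": "org", "vm-1": "project-a"}
bindings = {
    "org": {"roles/viewer"},
    "project-a": {"roles/editor"},
    "vm-1": set(),
}

def effective_roles(resource: str) -> set:
    """Walk up the hierarchy, accumulating every inherited binding."""
    roles = set()
    node = resource
    while node is not None:
        roles |= bindings[node]
        node = hierarchy[node]
    return roles

# vm-1 inherits roles granted at both the project and organization level.
print(sorted(effective_roles("vm-1")))  # → ['roles/editor', 'roles/viewer']
```

The key exam implication: you cannot revoke at a child level a permission that was granted at a parent level.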

Explore the use cases for Google Groups vs. individual user accounts when defining a policy.

Google Cloud Storage supports both IAM and ACL policies. IAM is preferred when you want to secure buckets, while ACLs are good for protecting individual objects stored in buckets. It is essential to understand the effective policy when both are active.

4. Choose the Right Storage and Database Offerings

GCP has multiple object storage classes that can offer customers more value at lower prices than the competition. The same is true of GCP's database and big data offerings.

You should know when to use the regional, multi-regional, Nearline, and Coldline storage classes when uploading and storing data in object storage. When you don't need replication across regions, a regional storage bucket is the best choice. Nearline makes sense when data is accessed no more than about once a month. Coldline is optimal when the data is accessed only about once a year. Make sure you learn the principles of object versioning and object lifecycle management, which help automate the archival and deletion process.
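
Lifecycle management is typically expressed as a JSON policy applied with `gsutil lifecycle set`. Below is a sketch of a policy that transitions objects to Coldline after 30 days and deletes them after a year; the structure follows the documented lifecycle configuration format, while the specific ages are illustrative:

```python
import json

# Sketch of a GCS object lifecycle policy in the JSON shape accepted by
# `gsutil lifecycle set rules.json gs://<bucket>`. The ages and the
# bucket placeholder are examples only.
lifecycle = {
    "rule": [
        {"action": {"type": "SetStorageClass", "storageClass": "COLDLINE"},
         "condition": {"age": 30}},
        {"action": {"type": "Delete"},
         "condition": {"age": 365}},
    ]
}

print(json.dumps(lifecycle, indent=2))
```
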

Architects will need to choose among a range of databases based on the use case. Become familiar with the core principles of Datastore, Cloud SQL, Cloud Spanner, Cloud Bigtable, and BigQuery. From an exam point of view, you can safely ignore Firebase and Firestore.

Datastore is great for web and mobile backends that need to store schema-less documents. But if the application needs exceptionally low latency at scale and compatibility with HBase, go with Bigtable. Cloud SQL offers compatibility with existing MySQL and PostgreSQL databases. Cloud Spanner is an expensive solution that's used when you need a globally distributed database with transactional support. BigQuery is meant for storing and querying large datasets with support for ANSI SQL. It's not a substitute for a NoSQL or relational database server.
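
For exam preparation, the database guidance above can be boiled down to a keyword-driven chooser. The requirement keywords and their priority order are my own assumptions for illustration, not an official decision tree:

```python
def pick_database(requirements: set) -> str:
    """Illustrative exam-style mapping from requirement keywords
    to a GCP database service. Keywords are made up for study purposes."""
    if "global-relational-transactions" in requirements:
        return "Cloud Spanner"        # globally distributed, transactional
    if "analytics-ansi-sql" in requirements:
        return "BigQuery"             # large-scale storage and querying
    if "hbase-compatible" in requirements:
        return "Cloud Bigtable"       # very low latency, wide-column
    if requirements & {"mysql", "postgresql"}:
        return "Cloud SQL"            # lift-and-shift relational workloads
    return "Datastore"                # schema-less web/mobile backends
```
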

Cloud Dataproc can be a substitute for Apache Hadoop and Spark jobs running in an on-prem environment. Cloud Dataflow is used when you need to build data pipelines for streaming and batch processing scenarios. Cloud Pub/Sub is meant for ingesting high volumes of data and can be connected to a Dataflow pipeline to eventually store the final output in BigQuery for analytics.

5. Get a Hold of the Enterprise Case Studies

Google did a wonderful job with scenario-based questions aligned with enterprise case studies. These case studies are publicly available, which means you can access them even before you register for the exam. Pay attention to detail when you read them. The choice of words used in these requirements conveys many intricate details about the design and architecture of the solution.

  • The Dress4Win study is a typical example of an enterprise considering the cloud for development and test. The key takeaway is the replication of the existing environment in the cloud with minimal changes.
  • TerramEarth is a classic connected vehicle/IoT use case with plenty of scope for building a data processing pipeline. Subtle hints and wording used in the case study point to significant architectural choices and design decisions that influence the selection of the right data platform services in GCP.
  • Mountkirk Games is an example of a classic mobile gaming backend running in the cloud. It has remarkable scope for implementing a scalable backend integrated with an analytics engine.


As part of your preparation, print these case studies and highlight the keywords used in the technical and business requirement sections. Mapping them to the appropriate GCP services and tools will save you valuable time during the test.

For more such informative and engaging articles on Google Cloud Platform, feel free to visit our website.

Also, at ZaranTech we offer self-paced online training on Google Cloud Platform. To learn more about our courses, feel free to visit our website.
