Complete GCP DevOps Tutorial with Usage Examples

Table of Contents

  1. What is GCP DevOps?
  2. GCP DevOps Pillars
  3. Getting Started with GCP
  4. Resource Hierarchy & IAM
  5. Version Control (Cloud Source Repositories)
  6. Continuous Integration (Cloud Build)
  7. Artifact Management (Artifact Registry)
  8. Container Orchestration (Google Kubernetes Engine - GKE)
  9. Serverless (Cloud Run, Cloud Functions)
  10. Continuous Delivery (Cloud Deploy)
  11. Infrastructure as Code (Cloud Deployment Manager)
  12. Monitoring & Logging (Cloud Monitoring, Cloud Logging)
  13. Security (Cloud Armor, Security Command Center)
  14. Cost Management & Optimization
  15. Best Practices (GCP Well-Architected Framework)

1. What is GCP DevOps?

GCP DevOps refers to the practice of leveraging Google Cloud Platform (GCP) services and tools to implement DevOps methodologies. DevOps is a cultural and operational shift that aims to unify software development and IT operations, emphasizing automation, collaboration, and continuous improvement across the entire software delivery lifecycle.

GCP offers a comprehensive, integrated suite of services that are inherently designed for DevOps workflows, from code commit to deployment, monitoring, and operations.

Key Goals of GCP DevOps:

2. GCP DevOps Pillars

GCP's approach to DevOps aligns with industry best practices, focusing on key areas:

Getting Started: To follow this tutorial, you'll need a Google Cloud account. Many services fall under the GCP Free Tier, making it ideal for learning. Always remember to clean up resources after practice to avoid unexpected costs.

3. Getting Started with GCP

A. Create a Google Cloud Account:

Go to cloud.google.com/free/ to sign up for a free Google Cloud account. New customers typically receive $300 in free credits for 90 days and free usage limits for certain products.

Security Best Practice: After creating your account, immediately enable **2-Step Verification** (Google's multi-factor authentication) on your Google account, and familiarize yourself with Google's Shared Responsibility Model.

B. The Google Cloud Console:

This is the web-based graphical user interface (GUI) for managing your GCP projects and resources. This tutorial will focus on using the Console GUI.

# Access: console.cloud.google.com

Upon logging in, you'll see the **Google Cloud Console Dashboard**, providing an overview of your active projects, resources, and billing information.

C. Cloud Shell:

An interactive, browser-accessible shell environment within the Console, pre-installed with the `gcloud` CLI and other development tools. Useful for executing quick commands without local setup.

# Access: Click the Cloud Shell icon (top right of Console).
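Once Cloud Shell opens, a few `gcloud` commands confirm the environment is ready. A quick sketch — `my-devops-project` below is a placeholder; substitute your own project ID:

```shell
# Show which account and project the CLI is currently using
gcloud auth list
gcloud config list project

# Set a default project for subsequent commands (placeholder ID)
gcloud config set project my-devops-project

# List all projects your account can access
gcloud projects list
```

These commands require an authenticated session, which Cloud Shell provides automatically.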

4. Resource Hierarchy & IAM

GCP organizes your resources hierarchically. **Cloud IAM (Identity and Access Management)** controls who has what access to these resources.

Cloud IAM - GUI Usage:

  1. Navigate to IAM:
    Google Cloud Console > Navigation menu (☰) > IAM & Admin > IAM
  2. Granting Permissions:
    • Click **+ GRANT ACCESS**.
    • In the "New principals" field, enter the email of the user/service account.
    • In the "Select a role" dropdown, search for and select the desired **Predefined Role** (e.g., `Compute Instance Admin`, `Storage Object Viewer`) or a **Custom Role**.
    • Click **SAVE**.
  3. Creating a Service Account: Used by applications/services to authenticate to GCP.
    • Go to:
      Google Cloud Console > Navigation menu (☰) > IAM & Admin > Service Accounts
    • Click **+ CREATE SERVICE ACCOUNT**.
    • Follow the wizard to provide details and assign roles.
    • To create a JSON key for the service account (for non-GCP hosted apps): Click the service account, go to **KEYS** tab > **ADD KEY > Create new key** > **JSON**. Store securely.
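The same service-account workflow can be scripted with `gcloud`. A sketch, assuming a placeholder project `my-devops-project` and a hypothetical service account named `ci-deployer`:

```shell
# Create the service account
gcloud iam service-accounts create ci-deployer \
  --display-name="CI deployment account" \
  --project=my-devops-project

# Grant it a predefined role on the project
gcloud projects add-iam-policy-binding my-devops-project \
  --member="serviceAccount:ci-deployer@my-devops-project.iam.gserviceaccount.com" \
  --role="roles/storage.objectViewer"

# Create a JSON key for use outside GCP (store securely; never commit to Git)
gcloud iam service-accounts keys create key.json \
  --iam-account=ci-deployer@my-devops-project.iam.gserviceaccount.com
```

Prefer Workload Identity or attached service accounts over downloaded keys whenever your workload runs on GCP.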

5. Version Control (Cloud Source Repositories)

Cloud Source Repositories provides private Git repositories hosted on Google Cloud. It integrates seamlessly with other GCP services.

**Note on Cloud Source Repositories:** As of June 17, 2024, Cloud Source Repositories is not available to new customers. Existing users are unaffected. For new projects, use GitHub, GitLab, or Bitbucket. This tutorial will assume you use an external Git repo for CI/CD, but the steps for connecting Cloud Source Repositories would be similar.

Usage Example: Connecting an External Repository (e.g., GitHub) for CI/CD:

  1. Navigate to Cloud Build Repositories:
    Google Cloud Console > Navigation menu (☰) > CI/CD > Cloud Build > Repositories
  2. Connect Repository:
    • Click **CONNECT REPOSITORY**.
    • Choose your source (e.g., `GitHub`).
    • Follow the authentication and selection process for your GitHub account or organization and repository.
    • Click **CONNECT**.
  3. Browse Code: Your code stays hosted on GitHub/GitLab; manage and review it there as usual.
  4. For Pipeline Integration: The connected repository will now be available as a source when creating Cloud Build triggers.

6. Continuous Integration (Cloud Build)

Cloud Build is a fully managed CI service that executes your builds on Google Cloud infrastructure. It can import source code, execute build steps (compile, test, package), and produce artifacts.

Usage Example: Create a Cloud Build Trigger for CI (GUI):

This example sets up a CI pipeline for a Node.js application, triggered by Git pushes. Assume your app code and a `cloudbuild.yaml` file are in your connected Git repository.

# cloudbuild.yaml (for a Node.js project)
steps:
- name: 'gcr.io/cloud-builders/npm' # Use official npm builder image
  args: ['install'] # Install dependencies
  dir: 'app/' # Assuming your Node.js app is in an 'app' directory

- name: 'gcr.io/cloud-builders/npm'
  args: ['test'] # Run tests
  dir: 'app/'

- name: 'gcr.io/cloud-builders/npm'
  args: ['run', 'build'] # Run your build script (e.g., webpack)
  dir: 'app/'

- name: 'gcr.io/cloud-builders/docker' # Use official Docker builder
  args: ['build', '-t', 'europe-west1-docker.pkg.dev/$PROJECT_ID/my-app-repo/my-nodejs-app:$COMMIT_SHA', 'app/'] # Build Docker image
  # Build a Docker image and tag it with project ID, repository, and commit SHA

- name: 'gcr.io/cloud-builders/docker'
  args: ['push', 'europe-west1-docker.pkg.dev/$PROJECT_ID/my-app-repo/my-nodejs-app:$COMMIT_SHA'] # Push to Artifact Registry
images:
- 'europe-west1-docker.pkg.dev/$PROJECT_ID/my-app-repo/my-nodejs-app:$COMMIT_SHA' # Declare the final image artifact

  1. Enable Cloud Build API:
    Google Cloud Console > Navigation menu (☰) > APIs & Services > Library > Search for "Cloud Build API" > Enable
  2. Navigate to Cloud Build:
    Google Cloud Console > Navigation menu (☰) > CI/CD > Cloud Build
  3. Create Trigger:
    • In the left pane, click **Triggers** > **+ CREATE TRIGGER**.
    • Name: (e.g., `my-app-ci-trigger`).
    • Region: Select a region (e.g., `europe-west1`).
    • Event: `Push to a branch`.
    • Source:
      • Repository: Select your connected GitHub/GitLab/Cloud Source Repository.
      • Branch: `^main$` (or `^master$`).
    • Build configuration:
      • Type: `Cloud Build configuration file (yaml or json)`.
      • Location: `/cloudbuild.yaml` (default, assuming file is in root).
    • Click **CREATE**.
  4. Trigger a Build:
    • Perform a `git push` to your connected repository's `main` branch.
    • Alternatively, from the "Triggers" page, click the **Run** button next to your trigger.
  5. Monitor Builds:
    • In the left pane, click **History**. You'll see your build runs.
    • Click on a build ID to view detailed logs, build steps, and artifacts.
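The same build can also be run without a trigger by submitting source directly from Cloud Shell or a local checkout. A sketch, assuming the `cloudbuild.yaml` shown above sits at the repository root:

```shell
# Submit the current directory as build source, using cloudbuild.yaml
gcloud builds submit --config=cloudbuild.yaml .

# Inspect recent builds; stream logs for one by substituting its ID
gcloud builds list --limit=5
# gcloud builds log <BUILD_ID>
```

This is handy for debugging a pipeline before wiring it to a trigger.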

7. Artifact Management (Artifact Registry)

Artifact Registry is a universal package manager for all your build artifacts (Docker images, Maven, npm, Python packages, etc.). It replaces Container Registry for Docker images and provides a single place for all package types.

Usage Example: Create a Docker Repository in Artifact Registry (GUI):

  1. Enable Artifact Registry API:
    Google Cloud Console > Navigation menu (☰) > APIs & Services > Library > Search for "Artifact Registry API" > Enable
  2. Navigate to Artifact Registry:
    Google Cloud Console > Navigation menu (☰) > CI/CD > Artifact Registry
  3. Create Repository:
    • Click **+ CREATE REPOSITORY**.
    • Name: (e.g., `my-app-repo`).
    • Format: `Docker`.
    • Mode: `Standard`.
    • Location Type: `Region` > select your region (e.g., `europe-west1`).
    • Click **CREATE**.
  4. Push/Pull Images:
    • Click on your created repository name.
    • Click **SETUP INSTRUCTIONS**. Follow the instructions for your client (Docker, npm, Maven etc.) to authenticate and push/pull.
      # Example Docker authentication for europe-west1:
      gcloud auth configure-docker europe-west1-docker.pkg.dev
      
      # Example Docker Push (after building an image locally):
      docker tag my-app:1.0 europe-west1-docker.pkg.dev/<YOUR_PROJECT_ID>/my-app-repo/my-app:1.0
      docker push europe-west1-docker.pkg.dev/<YOUR_PROJECT_ID>/my-app-repo/my-app:1.0
  5. Integration with Cloud Build: As seen in the `cloudbuild.yaml` example, Cloud Build automatically authenticates and pushes/pulls images from Artifact Registry.
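A sketch of the equivalent CLI workflow, using the illustrative repository and placeholder project names from above:

```shell
# Create the Docker repository
gcloud artifacts repositories create my-app-repo \
  --repository-format=docker \
  --location=europe-west1

# Configure local Docker to authenticate to this registry host
gcloud auth configure-docker europe-west1-docker.pkg.dev

# Tag and push a locally built image, then list stored images
docker tag my-app:1.0 europe-west1-docker.pkg.dev/my-devops-project/my-app-repo/my-app:1.0
docker push europe-west1-docker.pkg.dev/my-devops-project/my-app-repo/my-app:1.0
gcloud artifacts docker images list europe-west1-docker.pkg.dev/my-devops-project/my-app-repo
```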

8. Container Orchestration (Google Kubernetes Engine - GKE)

Google Kubernetes Engine (GKE) is a managed environment for deploying, managing, and scaling containerized applications using Kubernetes. It automates cluster management tasks, allowing you to focus on your applications.

Usage Example: Create a GKE Cluster & Deploy a Simple App (GUI):

  1. Enable Kubernetes Engine API:
    Google Cloud Console > Navigation menu (☰) > APIs & Services > Library > Search for "Kubernetes Engine API" > Enable
  2. Navigate to Kubernetes Engine:
    Google Cloud Console > Navigation menu (☰) > Kubernetes Engine > Clusters
  3. Create a Cluster:
    • Click **+ CREATE CLUSTER**.
    • Choose a deployment mode:
      • `Autopilot`: Recommended; Google manages node provisioning, scaling, and security configuration for you.
      • `Standard`: Provides more control over node configuration and management.
    • For this example, choose **Standard cluster**.
    • Cluster name: (e.g., `my-gke-cluster`).
    • Location type: `Regional` or `Zonal`. Choose a **Region** (e.g., `us-central1`).
    • **Node pools:** Default settings are usually fine for testing (`e2-medium` machine type, 3 nodes).
    • Click **CREATE**. (Cluster creation can take 5-10 minutes).
  4. Connect to Cluster & Deploy an Application:
    • Once the cluster status is "Running," click on its name.
    • Click **CONNECT** in the top bar.
    • Select "Connect with Cloud Shell" and click **RUN IN CLOUD SHELL**. This will open Cloud Shell and automatically configure `kubectl` to your cluster.
    • Deploy Nginx (in Cloud Shell):
      # Create an Nginx deployment and service (using default Docker Hub image)
      kubectl create deployment nginx --image=nginx:latest
      kubectl expose deployment nginx --type=LoadBalancer --port=80
      
      # Check deployment and service status
      kubectl get deployments
      kubectl get services
      # Wait for the external IP address to be assigned to 'nginx' service.
    • Access your Nginx app by navigating to the external IP address in your browser.
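The two `kubectl` commands above create objects imperatively. The same deployment can be expressed declaratively, which fits Git-based workflows better; a minimal sketch applied via a heredoc (requires an authenticated `kubectl` context):

```shell
kubectl apply -f - <<'EOF'
apiVersion: apps/v1
kind: Deployment
metadata:
  name: nginx
spec:
  replicas: 2
  selector:
    matchLabels:
      app: nginx
  template:
    metadata:
      labels:
        app: nginx
    spec:
      containers:
      - name: nginx
        image: nginx:latest
        ports:
        - containerPort: 80
---
apiVersion: v1
kind: Service
metadata:
  name: nginx
spec:
  type: LoadBalancer
  selector:
    app: nginx
  ports:
  - port: 80
    targetPort: 80
EOF

# Watch until the LoadBalancer is assigned an external IP
kubectl get service nginx --watch
```

Storing manifests like this in your repository is the first step toward GitOps-style delivery.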

9. Serverless (Cloud Run, Cloud Functions)

GCP offers powerful serverless compute options, allowing you to run code without managing servers.

A. Cloud Run - GUI Usage:

Fully managed serverless platform for running containerized applications. Scales automatically from zero to millions of requests.

  1. Navigate to Cloud Run:
    Google Cloud Console > Navigation menu (☰) > Serverless > Cloud Run
  2. Create Service:
    • Click **+ CREATE SERVICE**.
    • Deployment option: `Deploy one revision from an existing container image`.
    • Container image URL: Provide the path to your Docker image in Artifact Registry (e.g., `europe-west1-docker.pkg.dev/<YOUR_PROJECT_ID>/my-app-repo/my-nodejs-app:latest`).
    • Service name: (e.g., `my-cloud-run-service`).
    • Region: Select.
    • **Authentication:** Choose "Allow unauthenticated invocations" for public web apps.
    • Click **CREATE**.
  3. Test: Once deployed, the service URL will be provided. Access this URL in your browser.
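The GUI steps above map to a single `gcloud run deploy` command. A sketch, using the illustrative image path and placeholder project from earlier sections:

```shell
# Deploy (or update) the service from an image in Artifact Registry
gcloud run deploy my-cloud-run-service \
  --image=europe-west1-docker.pkg.dev/my-devops-project/my-app-repo/my-nodejs-app:latest \
  --region=europe-west1 \
  --allow-unauthenticated

# Print the service URL for smoke-testing
gcloud run services describe my-cloud-run-service \
  --region=europe-west1 --format='value(status.url)'
```

Running the same command with a new image tag creates a new revision, and traffic shifts automatically by default.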

B. Cloud Functions - GUI Usage:

Event-driven serverless FaaS (Functions as a Service) for running small, single-purpose functions in response to events.

  1. Navigate to Cloud Functions:
    Google Cloud Console > Navigation menu (☰) > Serverless > Cloud Functions
  2. Create Function:
    • Click **+ CREATE FUNCTION**.
    • Environment: `2nd generation` (recommended for new functions).
    • Function name: (e.g., `my-http-trigger`).
    • Region: (e.g., `us-central1`).
    • Trigger type: `HTTP`. Check "Allow unauthenticated invocations".
    • Runtime: (e.g., `Python 3.9`).
    • Entry point: `hello_http` (the name of your function in the code).
    • In the "Source code" editor, keep the Console's default `hello_http` sample or paste your own function logic.
    • Click **DEPLOY**.
  3. Test: Once deployed, click on the function > **TESTING** tab. You can manually test or get the "Triggering URL" to test via browser/curl.
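Equivalent deployment from the CLI, assuming your function source (with a `hello_http` entry point) is in the current directory:

```shell
# Deploy a 2nd-gen HTTP function
gcloud functions deploy my-http-trigger \
  --gen2 \
  --runtime=python39 \
  --region=us-central1 \
  --entry-point=hello_http \
  --trigger-http \
  --allow-unauthenticated \
  --source=.

# Fetch the trigger URL and call it
URL=$(gcloud functions describe my-http-trigger --gen2 --region=us-central1 \
  --format='value(serviceConfig.uri)')
curl "$URL"
```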

10. Continuous Delivery (Cloud Deploy)

Cloud Deploy is a fully managed service that automates delivery to a series of target environments (e.g., dev, staging, prod) for GKE and Cloud Run.

Usage Example: Create a Cloud Deploy Pipeline (GUI):

This example will create a simple pipeline to deploy to a GKE cluster.

  1. Enable Cloud Deploy API:
    Google Cloud Console > Navigation menu (☰) > APIs & Services > Library > Search for "Cloud Deploy API" > Enable
  2. Navigate to Cloud Deploy:
    Google Cloud Console > Navigation menu (☰) > CI/CD > Cloud Deploy
  3. Create Delivery Pipeline:
    • Click **+ CREATE DELIVERY PIPELINE**.
    • Pipeline name: (e.g., `my-app-delivery`).
    • Description: (Optional).
    • Click **CREATE**.
  4. Configure Targets (Environments):
    • On the pipeline details page, click **ADD TARGET**.
    • Target name: (e.g., `staging-gke`).
    • Target type: `Google Kubernetes Engine`.
    • GKE Cluster: Select your `my-gke-cluster`.
    • Click **CREATE**.
    • Repeat for `prod-gke` target, potentially adding a "requires approval" option.
  5. Create Release (to deploy your image):
    • On the pipeline overview, click **CREATE RELEASE**.
    • Name: (e.g., `release-1.0.0`).
    • Source: "Container image" (e.g., `europe-west1-docker.pkg.dev/<YOUR_PROJECT_ID>/my-app-repo/my-nodejs-app:latest`).
    • Click **CREATE**.
    • Cloud Deploy will create a release and begin deploying to the first target (e.g., `staging-gke`). You can then manually promote it to `prod-gke` (if approval is configured).
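In practice, releases are usually created from the CLI as part of CI. A sketch, using the pipeline and target names above (Cloud Deploy also expects a `skaffold.yaml` in the source describing the Kubernetes manifests to render):

```shell
# Create a release against the delivery pipeline; rollout to the first
# target (staging-gke) starts automatically
gcloud deploy releases create release-1-0-0 \
  --delivery-pipeline=my-app-delivery \
  --region=europe-west1 \
  --images=my-nodejs-app=europe-west1-docker.pkg.dev/my-devops-project/my-app-repo/my-nodejs-app:latest

# Promote the release to the next target (e.g., prod-gke) once staging looks good
gcloud deploy releases promote \
  --release=release-1-0-0 \
  --delivery-pipeline=my-app-delivery \
  --region=europe-west1
```

If the prod target requires approval, the promotion pauses until an approver acts in the Console or via `gcloud deploy rollouts approve`.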

11. Infrastructure as Code (Cloud Deployment Manager)

Cloud Deployment Manager is GCP's native IaC service for provisioning and managing Google Cloud resources. It uses declarative configuration written in YAML or Python templates.

Usage Example: Create a Compute Engine VM with Deployment Manager (GUI):

This will create a VM, its network interface, public IP, and firewall rules.

  1. Enable Deployment Manager API:
    Google Cloud Console > Navigation menu (☰) > APIs & Services > Library > Search for "Cloud Deployment Manager V2 API" > Enable
  2. Navigate to Deployment Manager:
    Google Cloud Console > Navigation menu (☰) > CI/CD > Deployment Manager
  3. Create Deployment:
    • Click **+ CREATE DEPLOYMENT**.
    • Name: (e.g., `my-devops-vm-deployment`).
    • Specify your configuration: Choose "Upload your configuration" and select your YAML configuration file.
    • Example `vm_template.yaml` (from your local machine, then upload):
      # vm_template.yaml
      resources:
      - name: my-web-instance
        type: compute.v1.instance
        properties:
          zone: us-central1-a
          machineType: zones/us-central1-a/machineTypes/e2-micro
          # Tag the instance so the allow-http firewall rule below applies to it
          tags:
            items:
            - http-server
          disks:
          - deviceName: boot
            type: PERSISTENT
            boot: true
            autoDelete: true
            initializeParams:
              sourceImage: projects/debian-cloud/global/images/family/debian-11
          networkInterfaces:
          - network: global/networks/default
            accessConfigs:
            - name: External NAT
              type: ONE_TO_ONE_NAT
          metadata:
            items:
            - key: startup-script
              value: |
                #!/bin/bash
                sudo apt update
                sudo apt install -y nginx
                sudo systemctl start nginx
                sudo systemctl enable nginx
                echo "Hello from Deployment Manager!" | sudo tee /var/www/html/index.html
      - name: allow-http
        type: compute.v1.firewall
        properties:
          network: global/networks/default
          allowed:
          - IPProtocol: tcp
            ports: ["80"]
          sourceRanges: ["0.0.0.0/0"]
          targetTags: ["http-server"]
    • Click **DEPLOY**.
  4. Monitor & Delete: Monitor deployment status. Once "Deployed," find your VM in Compute Engine and get its IP. To delete all resources created by this deployment, select it and click **DELETE**.
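The same deployment lifecycle from the CLI, assuming `vm_template.yaml` is in the current directory:

```shell
# Create the deployment from the configuration file
gcloud deployment-manager deployments create my-devops-vm-deployment \
  --config=vm_template.yaml

# Inspect the resources it created
gcloud deployment-manager deployments describe my-devops-vm-deployment

# Tear everything down in one step when finished
gcloud deployment-manager deployments delete my-devops-vm-deployment
```

Deleting the deployment removes every resource it created, which is the main operational benefit of IaC over hand-built infrastructure.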

12. Monitoring & Logging (Cloud Monitoring, Cloud Logging)

Google Cloud's operations suite (formerly Stackdriver) provides comprehensive monitoring, logging, and diagnostics for your applications and infrastructure.

A. Cloud Monitoring - GUI Usage:

Collects metrics, events, and metadata to provide visibility into performance and health.

  1. Navigate to Monitoring:
    Google Cloud Console > Navigation menu (☰) > Operations > Monitoring
  2. Explore Dashboards:
    • View pre-built dashboards (e.g., "VM Instances," "GKE Cluster") or create custom ones.
    • Click **+ CREATE DASHBOARD**. Add various widgets (charts, gauges, text) using metrics (e.g., `VM Instance > CPU utilization`, `Kubernetes Pod > CPU usage`).
  3. Create Alerts:
    • In the left pane, click **Alerting** > **+ CREATE POLICY**.
    • **Select a metric:** (e.g., `VM Instance > CPU utilization`).
    • Define **Trigger condition** (e.g., Average CPU > 80% for 5 minutes).
    • **Notification channels:** Configure email, Slack, PagerDuty, etc.
    • Click **CREATE POLICY**.

B. Cloud Logging - GUI Usage:

A fully managed service that collects, stores, and analyzes logs from all your GCP resources, as well as on-premises and hybrid cloud environments.

  1. Navigate to Logging:
    Google Cloud Console > Navigation menu (☰) > Operations > Logging > Logs Explorer
  2. Explore Logs:
    • Use the "Query builder" to filter logs by resource type, log name, severity, etc.
    • Apply time range filters.
    • View log entries and expand them for detailed JSON.
    • Click "Stream logs" to see new logs in real-time.
  3. Create Log Sinks: Export logs to BigQuery, Cloud Storage, or Pub/Sub for long-term storage or further analysis.
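Logs Explorer queries and sinks can also be driven from the CLI; a sketch with illustrative names (the destination bucket must already exist):

```shell
# Read the 10 most recent ERROR-level entries from Compute Engine instances
gcloud logging read 'resource.type="gce_instance" AND severity>=ERROR' \
  --limit=10 --format=json

# Export matching logs to a Cloud Storage bucket for long-term retention
gcloud logging sinks create my-error-sink \
  storage.googleapis.com/my-log-archive \
  --log-filter='severity>=ERROR'
```

After creating a sink, grant its writer identity (shown in the command output) write access to the destination.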

13. Security (Cloud Armor, Security Command Center)

Integrating security throughout the DevOps lifecycle (DevSecOps) on GCP.

A. Cloud Armor (DDoS Protection & WAF) - GUI Usage:

Provides DDoS protection and Web Application Firewall (WAF) capabilities for applications behind Cloud Load Balancing.

  1. Navigate to Cloud Armor:
    Google Cloud Console > Navigation menu (☰) > Network Security > Cloud Armor
  2. Create Security Policy:
    • Click **+ CREATE POLICY**.
    • **Name:** (e.g., `my-web-app-waf`).
    • **Default rule action:** `Allow`.
    • Click **NEXT STEP**.
    • **Add rules:** Click **ADD RULE**.
      • Mode: `Advanced mode` for WAF rules.
      • Match: Use preconfigured WAF rules (e.g., `evaluatePreconfiguredWaf('sqli-stable')` for SQL injection).
      • Action: `Deny` (e.g., `403 Forbidden`).
    • Click **DONE** > **CREATE POLICY**.
  3. Apply Policy to Target:
    • In the policy details, go to the **Targets** tab.
    • Click **APPLY POLICY TO NEW TARGET**. Select your HTTP(S) Load Balancer or backend service.
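A sketch of the same policy built with `gcloud` (policy and backend-service names are illustrative):

```shell
# Create the security policy (the default rule allows traffic)
gcloud compute security-policies create my-web-app-waf \
  --description="WAF for my web app"

# Add a rule that blocks SQL injection attempts with a 403
gcloud compute security-policies rules create 1000 \
  --security-policy=my-web-app-waf \
  --expression="evaluatePreconfiguredWaf('sqli-stable')" \
  --action=deny-403

# Attach the policy to a load balancer backend service
gcloud compute backend-services update my-backend-service \
  --security-policy=my-web-app-waf \
  --global
```

Lower rule priorities are evaluated first, so place more specific deny rules ahead of broad allows.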

B. Security Command Center - GUI Usage:

A centralized security and risk management platform for GCP, aggregating findings from various GCP security services.

  1. Navigate to Security Command Center:
    Google Cloud Console > Navigation menu (☰) > Security > Security Command Center
  2. Enable and Explore: Follow the prompts to enable SCC for your organization/project. Explore the "Overview," "Vulnerabilities," and "Threats" dashboards for security posture and findings.

C. Binary Authorization (for GKE/Cloud Run):

A deploy-time security control that ensures only trusted container images are deployed on GKE or Cloud Run.

  1. Enable Binary Authorization API: (As with other APIs).
  2. Navigate to Binary Authorization:
    Google Cloud Console > Navigation menu (☰) > Security > Binary Authorization
  3. Configure Policy: Define rules (e.g., require images to be signed by a specific Cloud Build pipeline) and apply them to specific clusters or projects.

14. Cost Management & Optimization

DevOps practices on GCP contribute to cost optimization through automation and efficient resource usage.

  1. Cloud Billing Reports:
    Google Cloud Console > Navigation menu (☰) > Billing > Reports
    • View detailed cost reports, filter by project, service, SKU, region, and custom labels.
  2. Budgets & Alerts:
    • In the Billing section, go to **Budgets & alerts** > **+ CREATE BUDGET**.
    • Define a budget name, time range, scope (project, services), and amount.
    • Configure **Thresholds** to trigger email notifications when spending reaches a certain percentage of the budget.
  3. Cost Optimization Recommendations:
    • Review **Recommendations** (powered by the Active Assist Recommender service) on the Console home page or on individual product pages (e.g., Compute Engine > VM instances).
    • Receive personalized recommendations to reduce costs (e.g., right-size VMs, delete idle resources).
  4. Leverage Free Tier & Sustained Use Discounts (SUDs): Use free services and benefit from automatic discounts for long-running Compute Engine VMs.
  5. Committed Use Discounts (CUDs): For predictable workloads, commit to resource usage for 1 or 3 years for significant savings.
  6. Spot VMs (formerly Preemptible VMs): Low-cost, interruptible Compute Engine instances ideal for fault-tolerant CI/CD jobs or batch processing.
  7. Lifecycle Management (Cloud Storage): Automatically transition data to cheaper storage classes (Nearline, Coldline, Archive) or delete it based on age.
  8. Tagging & Labels: Apply labels to resources to categorize and track costs by project, team, environment.
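Two of the practices above, budgets and labels, can be scripted. A sketch with placeholder identifiers (`BILLING_ACCOUNT_ID` is the value shown on your Billing page; the VM name matches the Deployment Manager example):

```shell
# Create a $100/month budget that notifies billing admins at 50% and 90% spend
gcloud billing budgets create \
  --billing-account=BILLING_ACCOUNT_ID \
  --display-name="monthly-devops-budget" \
  --budget-amount=100USD \
  --threshold-rule=percent=0.5 \
  --threshold-rule=percent=0.9

# Label a VM so its cost appears under team/env filters in billing reports
gcloud compute instances update my-web-instance \
  --zone=us-central1-a \
  --update-labels=team=devops,env=staging
```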

15. Best Practices (GCP Well-Architected Framework)

The Google Cloud Well-Architected Framework provides guidance for building secure, reliable, efficient, and cost-effective workloads. It aligns well with DevOps principles.

GCP DevOps Specific Best Practices:

Your GCP DevOps Journey: Innovation at Scale!

GCP offers a cutting-edge and integrated platform for implementing modern DevOps practices. By understanding its core services, mastering their GUI-based usage, and applying the recommended best practices, you can build highly automated, scalable, secure, and reliable software delivery pipelines. Continuous hands-on practice, combined with a deep dive into GCP documentation and the Well-Architected Framework, will accelerate your journey to becoming a proficient GCP DevOps engineer.