Junior Cloud Engineer GCP Interview Questions: Complete Guide

Milad Bonakdar
Author
Master essential GCP fundamentals with comprehensive interview questions covering Compute Engine, Cloud Storage, VPC, IAM, and core Google Cloud concepts for junior cloud engineer roles.
Introduction
Google Cloud Platform (GCP) is a comprehensive suite of cloud computing services offering compute, storage, networking, big data, and machine learning capabilities. As a junior cloud engineer, you'll need foundational knowledge of core GCP services to build and manage cloud infrastructure.
This guide covers essential interview questions for junior GCP cloud engineers, focusing on Compute Engine, Cloud Storage, VPC, and IAM.
GCP Compute Engine
1. What is Google Compute Engine and what are its main use cases?
Answer: Compute Engine provides scalable virtual machines running in Google's data centers.
Key Features:
- Custom or predefined machine types
- Persistent disks and local SSDs
- Preemptible VMs for cost savings
- Live migration for maintenance
- Global load balancing
Use Cases:
- Web hosting
- Application servers
- Batch processing
- High-performance computing
```shell
# Create VM instance
gcloud compute instances create my-instance \
    --zone=us-central1-a \
    --machine-type=e2-medium \
    --image-family=debian-11 \
    --image-project=debian-cloud

# List instances
gcloud compute instances list

# SSH into instance
gcloud compute ssh my-instance --zone=us-central1-a

# Stop instance
gcloud compute instances stop my-instance --zone=us-central1-a
```

Rarity: Very Common
Difficulty: Easy
2. Explain the difference between Persistent Disks and Local SSDs.
Answer:
| Feature | Persistent Disk | Local SSD |
|---|---|---|
| Durability | Data persists independently | Data lost when VM stops |
| Performance | Good | Excellent (low latency) |
| Size | Up to 64 TB | Up to 9 TB |
| Use Case | Boot disks, data storage | Temporary cache, scratch space |
| Cost | Lower | Higher |
| Snapshots | Supported | Not supported |
Example:
```shell
# Create VM with persistent disk
gcloud compute instances create my-vm \
    --boot-disk-size=50GB \
    --boot-disk-type=pd-ssd

# Create VM with local SSD (local SSDs require a compatible machine type)
gcloud compute instances create my-vm-ssd \
    --local-ssd=interface=NVME
```

Rarity: Common
Difficulty: Easy-Medium
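Since the table notes that snapshots are supported only for persistent disks, the snapshot workflow is worth knowing; a minimal sketch (disk, snapshot, and zone names are placeholders):

```shell
# Snapshot an existing persistent disk
gcloud compute disks snapshot my-disk \
    --zone=us-central1-a \
    --snapshot-names=my-disk-snapshot

# Restore by creating a new disk from the snapshot
gcloud compute disks create my-restored-disk \
    --zone=us-central1-a \
    --source-snapshot=my-disk-snapshot
```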
GCP Cloud Storage
3. What are the different storage classes in Cloud Storage?
Answer: Cloud Storage offers multiple classes for different access patterns:
| Class | Use Case | Availability | Min Duration | Cost |
|---|---|---|---|---|
| Standard | Frequently accessed | 99.95% | None | Highest |
| Nearline | < once/month | 99.9% | 30 days | Lower |
| Coldline | < once/quarter | 99.9% | 90 days | Very low |
| Archive | < once/year | 99.9% | 365 days | Lowest |
```shell
# Create bucket
gsutil mb -c STANDARD -l us-central1 gs://my-bucket

# Upload file
gsutil cp myfile.txt gs://my-bucket/

# List objects
gsutil ls gs://my-bucket/

# Download file
gsutil cp gs://my-bucket/myfile.txt ./

# Change storage class
gsutil rewrite -s NEARLINE gs://my-bucket/myfile.txt
```

Rarity: Very Common
Difficulty: Easy-Medium
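The minimum-duration column pairs naturally with Object Lifecycle Management: a bucket can downgrade objects to a colder class automatically instead of rewriting them by hand. A sketch of a lifecycle.json (the ages and classes here are illustrative), applied with `gsutil lifecycle set lifecycle.json gs://my-bucket`:

```json
{
  "rule": [
    {
      "action": {"type": "SetStorageClass", "storageClass": "NEARLINE"},
      "condition": {"age": 30}
    },
    {
      "action": {"type": "Delete"},
      "condition": {"age": 365}
    }
  ]
}
```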
GCP VPC (Virtual Private Cloud)
4. What is a VPC and what are its key components?
Answer: VPC is a virtual network that provides connectivity for GCP resources.
Key Components:
- Subnets: Regional IP ranges
- Firewall Rules: Control traffic
- Routes: Define traffic paths
- VPC Peering: Connect VPCs
- Cloud VPN: Connect to on-premises
```shell
# Create VPC
gcloud compute networks create my-vpc \
    --subnet-mode=custom

# Create subnet
gcloud compute networks subnets create my-subnet \
    --network=my-vpc \
    --region=us-central1 \
    --range=10.0.1.0/24

# Create firewall rule (allow SSH)
gcloud compute firewall-rules create allow-ssh \
    --network=my-vpc \
    --allow=tcp:22 \
    --source-ranges=0.0.0.0/0  # open to the world; restrict in production
```

Rarity: Very Common
Difficulty: Medium
5. How do firewall rules work in GCP?
Answer: Firewall rules control incoming and outgoing traffic.
Characteristics:
- Stateful (return traffic automatically allowed)
- Applied to network or specific instances
- Priority-based (0-65535, lower = higher priority)
- Default: Allow egress, deny ingress
Rule Components:
- Direction (ingress/egress)
- Priority
- Action (allow/deny)
- Source/destination
- Protocols and ports
```shell
# Allow HTTP traffic
gcloud compute firewall-rules create allow-http \
    --network=my-vpc \
    --allow=tcp:80 \
    --source-ranges=0.0.0.0/0 \
    --target-tags=web-server

# Allow internal communication
gcloud compute firewall-rules create allow-internal \
    --network=my-vpc \
    --allow=tcp:0-65535,udp:0-65535,icmp \
    --source-ranges=10.0.0.0/8

# Deny specific traffic
gcloud compute firewall-rules create deny-telnet \
    --network=my-vpc \
    --action=DENY \
    --rules=tcp:23 \
    --priority=1000
```

Rarity: Very Common
Difficulty: Medium
GCP IAM
6. Explain IAM roles and permissions in GCP.
Answer: IAM controls who can do what on which resources.
Key Concepts:
- Member: User, service account, or group
- Role: Collection of permissions
- Policy: Binds members to roles
Role Types:
- Basic (formerly Primitive): Owner, Editor, Viewer (broad)
- Predefined: Service-specific (e.g., Compute Admin)
- Custom: User-defined permissions
```shell
# Grant role to user
gcloud projects add-iam-policy-binding my-project \
    --member=user:alice@example.com \
    --role=roles/compute.instanceAdmin.v1

# Grant role to service account
gcloud projects add-iam-policy-binding my-project \
    --member=serviceAccount:my-sa@my-project.iam.gserviceaccount.com \
    --role=roles/storage.objectViewer

# List IAM policy
gcloud projects get-iam-policy my-project

# Remove role
gcloud projects remove-iam-policy-binding my-project \
    --member=user:alice@example.com \
    --role=roles/compute.instanceAdmin.v1
```

Best Practices:
- Use predefined roles when possible
- Follow least privilege principle
- Use service accounts for applications
- Regular audit of permissions
Rarity: Very Common
Difficulty: Medium
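The custom role type listed above has no example in this section; a hedged sketch of creating and binding one (the project, role ID, and permission set are placeholders chosen for illustration):

```shell
# Define a custom role containing only the permissions needed
gcloud iam roles create instanceViewer \
    --project=my-project \
    --title="Instance Viewer" \
    --permissions=compute.instances.get,compute.instances.list

# Bind it like any other role (custom project roles use the
# projects/PROJECT/roles/ROLE_ID reference format)
gcloud projects add-iam-policy-binding my-project \
    --member=user:alice@example.com \
    --role=projects/my-project/roles/instanceViewer
```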
GCP Core Concepts
7. What are GCP regions and zones?
Answer:
Region:
- Geographic location (e.g., us-central1, europe-west1)
- Contains multiple zones
- Independent failure domains
- Choose based on latency, compliance, cost
Zone:
- Isolated location within a region
- Single failure domain
- Deploy across zones for high availability
Example:
```shell
# List regions
gcloud compute regions list

# List zones
gcloud compute zones list

# Create instance in specific zone
gcloud compute instances create my-vm \
    --zone=us-central1-a
```

Rarity: Very Common
Difficulty: Easy
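The high-availability point above can be made concrete: placing identical instances in two zones of one region lets the workload survive a single-zone outage. A minimal sketch (names and zones are placeholders; production setups would typically use a managed instance group instead):

```shell
# Two identical instances in different zones of us-central1
gcloud compute instances create web-1 \
    --zone=us-central1-a \
    --machine-type=e2-medium

gcloud compute instances create web-2 \
    --zone=us-central1-b \
    --machine-type=e2-medium
```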
8. What is a service account and when do you use it?
Answer: Service Account is a special account for applications and VMs.
Characteristics:
- Not for humans
- Used by applications
- Can have IAM roles
- Can create keys for authentication
Use Cases:
- VM instances accessing Cloud Storage
- Applications calling GCP APIs
- CI/CD pipelines
- Cross-project access
```shell
# Create service account
gcloud iam service-accounts create my-sa \
    --display-name="My Service Account"

# Grant role to service account
gcloud projects add-iam-policy-binding my-project \
    --member=serviceAccount:my-sa@my-project.iam.gserviceaccount.com \
    --role=roles/storage.objectViewer

# Attach to VM
gcloud compute instances create my-vm \
    --service-account=my-sa@my-project.iam.gserviceaccount.com \
    --scopes=cloud-platform
```

Rarity: Common
Difficulty: Easy-Medium
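Since the answer mentions key-based authentication, the flow is worth sketching (treat key.json as a secret, and prefer attached service accounts over downloaded keys where possible):

```shell
# Create a key for the service account
gcloud iam service-accounts keys create key.json \
    --iam-account=my-sa@my-project.iam.gserviceaccount.com

# Authenticate gcloud as the service account
gcloud auth activate-service-account \
    --key-file=key.json
```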
Serverless & Messaging
9. What is Cloud Pub/Sub and when do you use it?
Answer: Cloud Pub/Sub is a fully managed messaging service for asynchronous communication.
Key Concepts:
- Topic: Named resource to which messages are sent
- Subscription: Named resource representing message stream
- Publisher: Sends messages to topics
- Subscriber: Receives messages from subscriptions
Basic Operations:
```shell
# Create topic
gcloud pubsub topics create my-topic

# Create subscription
gcloud pubsub subscriptions create my-subscription \
    --topic=my-topic \
    --ack-deadline=60

# Publish message
gcloud pubsub topics publish my-topic \
    --message="Hello, World!"

# Pull messages
gcloud pubsub subscriptions pull my-subscription \
    --auto-ack \
    --limit=10
```

Publisher Example (Python):
```python
from google.cloud import pubsub_v1
import json

# Create publisher client
publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path('my-project', 'my-topic')

# Publish message
def publish_message(data):
    message_json = json.dumps(data)
    message_bytes = message_json.encode('utf-8')
    # Publish with attributes
    future = publisher.publish(
        topic_path,
        message_bytes,
        event_type='order_created',
        user_id='123'
    )
    print(f'Published message ID: {future.result()}')

# Batch publishing for efficiency
def publish_batch(messages):
    futures = []
    for message in messages:
        message_bytes = json.dumps(message).encode('utf-8')
        future = publisher.publish(topic_path, message_bytes)
        futures.append(future)
    # Wait for all messages to be published
    for future in futures:
        future.result()
```

Subscriber Example (Python):
```python
from google.cloud import pubsub_v1

subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path('my-project', 'my-subscription')

def callback(message):
    print(f'Received message: {message.data.decode("utf-8")}')
    print(f'Attributes: {message.attributes}')
    # Process message
    try:
        process_message(message.data)
        message.ack()  # Acknowledge successful processing
    except Exception as e:
        print(f'Error processing message: {e}')
        message.nack()  # Negative acknowledge (retry)

# Subscribe
streaming_pull_future = subscriber.subscribe(subscription_path, callback=callback)
print(f'Listening for messages on {subscription_path}...')
try:
    streaming_pull_future.result()
except KeyboardInterrupt:
    streaming_pull_future.cancel()
```

Subscription Types:
1. Pull Subscription:

```shell
# Subscriber pulls messages on demand
gcloud pubsub subscriptions create pull-sub \
    --topic=my-topic
```

2. Push Subscription:

```shell
# Pub/Sub pushes messages to HTTPS endpoint
gcloud pubsub subscriptions create push-sub \
    --topic=my-topic \
    --push-endpoint=https://myapp.example.com/webhook
```

Use Cases:
- Event-driven architectures
- Microservices communication
- Stream processing pipelines
- IoT data ingestion
- Asynchronous task processing
Best Practices:
- Use message attributes for filtering
- Implement idempotent message processing
- Set appropriate acknowledgment deadlines
- Use dead-letter topics for failed messages
- Monitor subscription backlog
Rarity: Common
Difficulty: Medium
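The idempotent-processing best practice can be sketched in plain Python: because Pub/Sub delivery is at-least-once, the same message ID may arrive twice, so the handler tracks processed IDs and makes redeliveries a no-op. The handler and message shape here are simplified stand-ins, not the real client API.

```python
# Idempotent consumer sketch: duplicate deliveries (same ID) are skipped.
processed_ids = set()   # in production: a durable store (database/cache)
results = []            # stands in for the real side effect

def handle(message_id, payload):
    if message_id in processed_ids:
        return False                # duplicate delivery: no-op
    results.append(payload)         # perform the side effect exactly once
    processed_ids.add(message_id)
    return True

handle("m1", "order_created")
handle("m2", "order_paid")
handle("m1", "order_created")       # redelivery of m1 is ignored
print(results)                      # → ['order_created', 'order_paid']
```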
10. What is Cloud Functions and how do you deploy one?
Answer: Cloud Functions is a serverless execution environment for building event-driven applications.
Triggers:
- HTTP requests
- Cloud Pub/Sub messages
- Cloud Storage events
- Firestore events
- Firebase events
HTTP Function Example:
```python
# main.py
import functions_framework
from flask import jsonify

@functions_framework.http
def hello_http(request):
    """HTTP Cloud Function"""
    request_json = request.get_json(silent=True)
    if request_json and 'name' in request_json:
        name = request_json['name']
    else:
        name = 'World'
    return jsonify({
        'message': f'Hello, {name}!',
        'status': 'success'
    })
```

Pub/Sub Function Example:
```python
import base64
import json
import functions_framework

@functions_framework.cloud_event
def process_pubsub(cloud_event):
    """Triggered by Pub/Sub message"""
    # Decode message
    message_data = base64.b64decode(cloud_event.data["message"]["data"]).decode()
    message_json = json.loads(message_data)
    print(f'Processing message: {message_json}')
    # Process the message
    result = process_data(message_json)
    return result
```

Storage Function Example:
```python
import functions_framework

@functions_framework.cloud_event
def process_file(cloud_event):
    """Triggered by Cloud Storage object creation"""
    data = cloud_event.data
    bucket = data["bucket"]
    name = data["name"]
    print(f'File {name} uploaded to {bucket}')
    # Process the file
    process_uploaded_file(bucket, name)
```

Deployment:
```shell
# Deploy HTTP function
gcloud functions deploy hello_http \
    --runtime=python39 \
    --trigger-http \
    --allow-unauthenticated \
    --entry-point=hello_http \
    --region=us-central1

# Deploy Pub/Sub function
gcloud functions deploy process_pubsub \
    --runtime=python39 \
    --trigger-topic=my-topic \
    --entry-point=process_pubsub \
    --region=us-central1

# Deploy Storage function
gcloud functions deploy process_file \
    --runtime=python39 \
    --trigger-resource=my-bucket \
    --trigger-event=google.storage.object.finalize \
    --entry-point=process_file \
    --region=us-central1

# Deploy with environment variables
gcloud functions deploy my_function \
    --runtime=python39 \
    --trigger-http \
    --set-env-vars DATABASE_URL=...,API_KEY=...

# Deploy with specific memory and timeout
gcloud functions deploy my_function \
    --runtime=python39 \
    --trigger-http \
    --memory=512MB \
    --timeout=300s
```

Requirements File:
```text
# requirements.txt
functions-framework==3.*
google-cloud-storage==2.*
google-cloud-pubsub==2.*
requests==2.*
```

Testing Locally:
```shell
# Install Functions Framework
pip install functions-framework

# Run locally
functions-framework --target=hello_http --port=8080

# Test with curl
curl -X POST http://localhost:8080 \
    -H "Content-Type: application/json" \
    -d '{"name": "Alice"}'
```

Monitoring:
```shell
# View logs
gcloud functions logs read hello_http \
    --region=us-central1 \
    --limit=50

# View function details
gcloud functions describe hello_http \
    --region=us-central1
```

Best Practices:
- Keep functions small and focused
- Use environment variables for configuration
- Implement proper error handling
- Set appropriate timeout values
- Use Cloud Logging for debugging
- Minimize cold start time
Rarity: Very Common
Difficulty: Easy-Medium
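The cold-start tip above boils down to one pattern: do expensive initialization at module scope so it runs once per function instance rather than once per request. A plain-Python sketch, where make_client is a stand-in for slow client or connection setup:

```python
# Module-scope (global) setup runs once at cold start; handlers reuse it.
init_count = 0

def make_client():
    global init_count
    init_count += 1              # stands in for slow connection setup
    return {"connected": True}

client = make_client()           # executed once per function instance

def handler(request):
    # Reuse the shared client instead of building one per request
    return f"handled {request}: connected={client['connected']}"

for i in range(3):
    handler(i)
print(init_count)                # → 1 (not 3)
```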
CLI & Tools
11. Explain common gcloud CLI commands and configuration.
Answer: The gcloud CLI is the primary tool for managing GCP resources.
Initial Setup:
```shell
# Install gcloud SDK (macOS)
curl https://sdk.cloud.google.com | bash
exec -l $SHELL

# Initialize and authenticate
gcloud init

# Login
gcloud auth login

# Set default project
gcloud config set project my-project-id

# Set default region/zone
gcloud config set compute/region us-central1
gcloud config set compute/zone us-central1-a
```

Configuration Management:
```shell
# List configurations
gcloud config configurations list

# Create new configuration
gcloud config configurations create dev-config

# Activate configuration
gcloud config configurations activate dev-config

# Set properties
gcloud config set account dev@example.com
gcloud config set project dev-project

# View current configuration
gcloud config list

# Unset property
gcloud config unset compute/zone
```

Common Commands by Service:
Compute Engine:
```shell
# List instances
gcloud compute instances list

# Create instance
gcloud compute instances create my-vm \
    --machine-type=e2-medium \
    --zone=us-central1-a

# SSH into instance
gcloud compute ssh my-vm --zone=us-central1-a

# Stop/start instance
gcloud compute instances stop my-vm --zone=us-central1-a
gcloud compute instances start my-vm --zone=us-central1-a

# Delete instance
gcloud compute instances delete my-vm --zone=us-central1-a
```

Cloud Storage:
```shell
# List buckets
gsutil ls

# Create bucket
gsutil mb -l us-central1 gs://my-bucket

# Upload file
gsutil cp myfile.txt gs://my-bucket/

# Download file
gsutil cp gs://my-bucket/myfile.txt ./

# Sync directory
gsutil -m rsync -r ./local-dir gs://my-bucket/remote-dir

# Set bucket lifecycle
gsutil lifecycle set lifecycle.json gs://my-bucket
```

IAM:
```shell
# List IAM policies
gcloud projects get-iam-policy my-project

# Add IAM binding
gcloud projects add-iam-policy-binding my-project \
    --member=user:alice@example.com \
    --role=roles/viewer

# Create service account
gcloud iam service-accounts create my-sa \
    --display-name="My Service Account"

# Create and download key
gcloud iam service-accounts keys create key.json \
    --iam-account=my-sa@my-project.iam.gserviceaccount.com
```

Kubernetes Engine:
```shell
# List clusters
gcloud container clusters list

# Get credentials
gcloud container clusters get-credentials my-cluster \
    --zone=us-central1-a

# Create cluster
gcloud container clusters create my-cluster \
    --num-nodes=3 \
    --zone=us-central1-a
```

Useful Flags:
```shell
# Format output
gcloud compute instances list --format=json
gcloud compute instances list --format=yaml
gcloud compute instances list --format="table(name,zone,status)"

# Filter results
gcloud compute instances list --filter="zone:us-central1-a"
gcloud compute instances list --filter="status=RUNNING"

# Limit results
gcloud compute instances list --limit=10

# Sort results
gcloud compute instances list --sort-by=creationTimestamp
```

Helpful Commands:
```shell
# Get help
gcloud help
gcloud compute instances create --help

# View project info
gcloud projects describe my-project

# List available regions/zones
gcloud compute regions list
gcloud compute zones list

# View quotas
gcloud compute project-info describe \
    --project=my-project

# Enable API
gcloud services enable compute.googleapis.com

# List enabled APIs
gcloud services list --enabled
```

Best Practices:
- Use configurations for different environments
- Set default project and region
- Use --format for scripting
- Use --filter to narrow results
- Enable command completion
- Keep gcloud SDK updated
Rarity: Very Common
Difficulty: Easy-Medium
Conclusion
Preparing for a junior GCP cloud engineer interview requires understanding core services and cloud concepts. Focus on:
- Compute Engine: VM instances, machine types, disks
- Cloud Storage: Storage classes, buckets, lifecycle
- VPC: Networking, subnets, firewall rules
- IAM: Roles, permissions, service accounts
- Core Concepts: Regions, zones, projects
Practice using the GCP Console and gcloud CLI to gain hands-on experience. Good luck!




