GCP Cloud Connector Cloud Storage Setup

Configure Google Cloud Storage, IAM, lifecycle rules, and service account access for the GCP Cloud Connector storage mode.

  • Cloud Storage IAM
  • Bucket lifecycle policies
  • Storage mode troubleshooting

GCP proof

Use Cloud Functions and Cloud Storage for project-level speed, scale, and IAM control

The documented GCP pattern uses Cloud Functions for execution, direct or Cloud Storage modes for document handling, and Google Cloud IAM controls for a customer-managed sealing workflow.

  • 32 MB speed path: Direct mode handles base64 PDF requests up to about 32 MB for a simple single-request integration path.
  • Gen2 scale model: The connector runs as a second-generation Cloud Function and can shift large-file workflows to Cloud Storage mode.
  • IAM security controls: Invoker bindings, dedicated service accounts, and bucket-scoped roles keep runtime and document access tightly scoped.
  • SHA-256 data boundary: Only the PDF digest is sent to Trusted Signatures while source and sealed files remain in your GCP project.
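The digest-only boundary is easy to reproduce locally. A minimal sketch, assuming the digest is the SHA-256 of the raw PDF bytes, computed in chunks so large files never need to fit in memory:

```python
import hashlib

def pdf_sha256(path: str, chunk_size: int = 1 << 20) -> str:
    """Compute the SHA-256 hex digest of a file in 1 MiB chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()
```

Only this hex string would cross the trust boundary; the PDF bytes themselves stay inside your project.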

Google Cloud Storage Setup

This guide explains how to configure Google Cloud Storage for use with the GCP Cloud Connector storage mode.

Overview

The GCP Cloud Connector supports two modes:

  • Direct Mode: PDF sent as base64 in the request body (up to ~32 MB)
  • Storage Mode: PDF stored in Cloud Storage buckets (no size limit)

Storage mode is recommended for:

  • Large PDF files (>32MB)
  • High-volume processing
  • Better security (PDFs don’t transit through HTTP requests)
  • Integration with existing Cloud Storage workflows
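The two request shapes can be sketched in Python. The storage-mode fields match the API calls shown later in this guide; the direct-mode field name (`pdfBase64` here) is a placeholder for illustration — check the connector's API reference for the actual name.

```python
import base64
import json

def direct_mode_payload(pdf_bytes: bytes, api_key: str, api_key_id: str) -> str:
    # Direct mode: the whole PDF travels base64-encoded in the request body.
    # "pdfBase64" is a hypothetical field name used for illustration.
    return json.dumps({
        "pdfBase64": base64.b64encode(pdf_bytes).decode("ascii"),
        "apiKey": api_key,
        "apiKeyId": api_key_id,
    })

def storage_mode_payload(src_bucket: str, src_key: str,
                         dst_bucket: str, dst_key: str,
                         api_key: str, api_key_id: str) -> str:
    # Storage mode: only bucket/object references travel in the request;
    # the PDF itself never leaves Cloud Storage.
    return json.dumps({
        "sourceBucket": src_bucket,
        "sourceKey": src_key,
        "destinationBucket": dst_bucket,
        "destinationKey": dst_key,
        "apiKey": api_key,
        "apiKeyId": api_key_id,
    })
```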

Prerequisites

  • Google Cloud Project with billing enabled
  • Cloud Functions API enabled
  • Cloud Storage API enabled
  • gcloud CLI installed and authenticated

Service Account Setup

1. Create the Service Account

Create a dedicated service account for the Cloud Function:

# Create service account
gcloud iam service-accounts create pdf-sealer-function \
  --display-name="PDF Sealer Function" \
  --description="Service account for PDF Sealer Cloud Function"

# Get the service account email
export SERVICE_ACCOUNT_EMAIL="pdf-sealer-function@PROJECT_ID.iam.gserviceaccount.com"

2. Grant Required IAM Roles

The service account needs specific Cloud Storage permissions:

# For source buckets (read access)
gcloud projects add-iam-policy-binding PROJECT_ID \
  --member="serviceAccount:$SERVICE_ACCOUNT_EMAIL" \
  --role="roles/storage.objectViewer"

# For destination buckets (write access)
gcloud projects add-iam-policy-binding PROJECT_ID \
  --member="serviceAccount:$SERVICE_ACCOUNT_EMAIL" \
  --role="roles/storage.objectCreator"

Alternative: Bucket-level permissions (more restrictive):

# Grant access to specific buckets only
gsutil iam ch serviceAccount:$SERVICE_ACCOUNT_EMAIL:objectViewer gs://your-source-bucket
gsutil iam ch serviceAccount:$SERVICE_ACCOUNT_EMAIL:objectCreator gs://your-destination-bucket

3. Required IAM Roles Summary

Role                          Purpose                                     Scope
roles/storage.objectViewer    Download PDFs from source buckets           Project or bucket-level
roles/storage.objectCreator   Upload sealed PDFs to destination buckets   Project or bucket-level

Note: The function does NOT need:

  • storage.buckets.get (bucket metadata access)
  • storage.objects.delete (file deletion)
  • storage.objects.update (file modification)
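The boundary above can be expressed as permission sets and checked against whatever a role grants. A local sketch; the permission names are the standard Cloud Storage IAM permissions that underlie objectViewer and objectCreator:

```python
# Permissions the connector actually exercises (read source, write sealed output).
REQUIRED = {"storage.objects.get", "storage.objects.create"}

# Permissions the function should NOT hold, per the note above.
FORBIDDEN = {"storage.buckets.get", "storage.objects.delete", "storage.objects.update"}

def least_privilege_ok(granted: set[str]) -> bool:
    """True if a grant covers everything required and includes nothing forbidden."""
    return REQUIRED <= granted and not (FORBIDDEN & granted)
```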

Cloud Storage Bucket Setup

1. Create Buckets

# Create source bucket
gsutil mb -p PROJECT_ID -c STANDARD -l REGION gs://your-pdf-source-bucket

# Create destination bucket (can be the same as source)
gsutil mb -p PROJECT_ID -c STANDARD -l REGION gs://your-pdf-destination-bucket

2. Configure Bucket Permissions

If using bucket-level permissions instead of project-level:

# Source bucket - allow function to read objects
gsutil iam ch serviceAccount:$SERVICE_ACCOUNT_EMAIL:objectViewer gs://your-pdf-source-bucket

# Destination bucket - allow function to create objects
gsutil iam ch serviceAccount:$SERVICE_ACCOUNT_EMAIL:objectCreator gs://your-pdf-destination-bucket

3. Optional: Configure Lifecycle Policies

Automatically delete temporary files:

# Create lifecycle configuration
cat > lifecycle.json << EOF
{
  "lifecycle": {
    "rule": [
      {
        "action": {"type": "Delete"},
        "condition": {
          "age": 1,
          "matchesPrefix": ["temp/", "processing/"]
        }
      }
    ]
  }
}
EOF

# Apply to buckets
gsutil lifecycle set lifecycle.json gs://your-pdf-source-bucket
gsutil lifecycle set lifecycle.json gs://your-pdf-destination-bucket

Cloud Function Deployment

1. Deploy with Custom Service Account

gcloud functions deploy pdf-sealer-gateway \
  --gen2 \
  --runtime=nodejs22 \
  --region=REGION \
  --source=. \
  --entry-point=pdfSealer \
  --trigger-http \
  --service-account=$SERVICE_ACCOUNT_EMAIL \
  --memory=512MB \
  --timeout=540s

2. Verify Permissions

Test the function’s access to Cloud Storage:

# Upload a test file (use a real PDF for an end-to-end sealing test)
echo "test pdf content" > test.pdf
gsutil cp test.pdf gs://your-pdf-source-bucket/test.pdf

# Test function (replace with your function URL)
curl -X POST https://REGION-PROJECT_ID.cloudfunctions.net/pdf-sealer-gateway/seal \
  -H "Content-Type: application/json" \
  -d '{
    "sourceBucket": "your-pdf-source-bucket",
    "sourceKey": "test.pdf",
    "destinationBucket": "your-pdf-destination-bucket", 
    "destinationKey": "sealed-test.pdf",
    "apiKey": "your-hex-api-key",
    "apiKeyId": "your-key-id",
    "tsaTimestamp": true,
    "includeLtv": true
  }'

# Clean up
gsutil rm gs://your-pdf-source-bucket/test.pdf
gsutil rm gs://your-pdf-destination-bucket/sealed-test.pdf

Security Best Practices

1. Principle of Least Privilege

  • Use bucket-level permissions instead of project-level when possible
  • Create separate buckets for different environments (dev/staging/prod)
  • Use different service accounts for different functions

2. Network Security

# Deploy function with private networking (optional)
gcloud functions deploy pdf-sealer-gateway \
  --gen2 \
  --runtime=nodejs22 \
  --region=REGION \
  --source=. \
  --entry-point=pdfSealer \
  --trigger-http \
  --service-account=$SERVICE_ACCOUNT_EMAIL \
  --vpc-connector=projects/PROJECT_ID/locations/REGION/connectors/CONNECTOR_NAME \
  --egress-settings=private-ranges-only

3. Access Control

# Restrict function access (optional)
gcloud functions add-iam-policy-binding pdf-sealer-gateway \
  --region=REGION \
  --member="user:admin@company.com" \
  --role="roles/cloudfunctions.invoker"

Note: Gen2 functions run on Cloud Run, so invocation may instead need to be granted as roles/run.invoker on the underlying Cloud Run service.

Cross-Project Access

To access buckets in different projects:

1. Grant Cross-Project Permissions

# Grant service account access to bucket in different project
gsutil iam ch serviceAccount:pdf-sealer-function@PROJECT_A.iam.gserviceaccount.com:objectViewer gs://bucket-in-project-b

2. Update Function Configuration

No code changes needed - just use the full bucket names in API calls:

{
  "sourceBucket": "bucket-in-project-b",
  "sourceKey": "document.pdf",
  "destinationBucket": "bucket-in-project-a",
  "destinationKey": "sealed-document.pdf"
}

Monitoring and Logging

1. Enable Audit Logs

# Route Cloud Storage audit log entries to BigQuery
# (Data Access audit logs must also be enabled for storage.googleapis.com)
gcloud logging sinks create storage-audit-sink \
  bigquery.googleapis.com/projects/PROJECT_ID/datasets/audit_logs \
  --log-filter='protoPayload.serviceName="storage.googleapis.com"'

2. Monitor Function Performance

# View function logs
gcloud functions logs read pdf-sealer-gateway \
  --gen2 \
  --region=REGION \
  --limit=50

# Monitor metrics in Cloud Monitoring
gcloud alpha monitoring dashboards create --config-from-file=dashboard.json

Troubleshooting

Common Permission Errors

403 Forbidden - Source Bucket:

Access denied to source: gs://bucket/key. Check Cloud Function service account permissions.

Solution:

# Verify service account has objectViewer role
gsutil iam get gs://your-source-bucket | grep $SERVICE_ACCOUNT_EMAIL

403 Forbidden - Destination Bucket:

Access denied to destination: gs://bucket/key. Check Cloud Function service account permissions.

Solution:

# Verify service account has objectCreator role
gsutil iam get gs://your-destination-bucket | grep $SERVICE_ACCOUNT_EMAIL

404 Not Found:

Source PDF not found: gs://bucket/key

Solution:

# Verify file exists
gsutil ls gs://your-source-bucket/path/to/file.pdf

Performance Issues

Slow Downloads/Uploads:

  • Use regional buckets in same region as Cloud Function
  • Consider using Cloud Storage Transfer Service for large files
  • Monitor network egress costs

Memory Issues:

  • Increase Cloud Function memory allocation
  • Use streaming for very large files (requires code modification)
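The streaming idea can be sketched with plain file objects: process the PDF in fixed-size chunks so memory use stays flat regardless of file size. With the google-cloud-storage client, `blob.open("rb")` / `blob.open("wb")` return file-like objects that slot into the same pattern.

```python
from typing import BinaryIO

def stream_copy(src: BinaryIO, dst: BinaryIO, chunk_size: int = 1 << 20) -> int:
    """Copy src to dst in 1 MiB chunks; returns total bytes copied.

    Memory use is bounded by chunk_size, not by the size of the file.
    """
    total = 0
    for chunk in iter(lambda: src.read(chunk_size), b""):
        dst.write(chunk)
        total += len(chunk)
    return total
```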

Cost Optimization

1. Storage Classes

# Use appropriate storage class for your use case
gsutil mb -c NEARLINE gs://archive-bucket  # For infrequent access
gsutil mb -c COLDLINE gs://backup-bucket   # For archival

2. Lifecycle Management

# Automatically transition to cheaper storage classes
cat > lifecycle-cost.json << EOF
{
  "lifecycle": {
    "rule": [
      {
        "action": {"type": "SetStorageClass", "storageClass": "NEARLINE"},
        "condition": {"age": 30}
      },
      {
        "action": {"type": "SetStorageClass", "storageClass": "COLDLINE"}, 
        "condition": {"age": 90}
      },
      {
        "action": {"type": "Delete"},
        "condition": {"age": 365}
      }
    ]
  }
}
EOF

gsutil lifecycle set lifecycle-cost.json gs://your-bucket

3. Regional Considerations

  • Deploy function and buckets in same region to minimize egress costs
  • Use multi-regional buckets only if global access is required
  • Monitor Cloud Storage usage in billing reports

Support

For Cloud Storage-specific issues, start with the Troubleshooting section above.

Need architectural review?

Book a technical walkthrough

For enterprise rollout, we can review trust model, controls, and integration patterns with your team.