1. Introduction
Cloud Run allows you to run stateless containers in a fully managed environment. It is built from open-source Knative, letting you choose to run your containers either fully managed with Cloud Run, or in your Google Kubernetes Engine cluster with Cloud Run for Anthos.
Eventarc makes it easy to connect various services (Cloud Run, Cloud Functions, Workflows) with events from a variety of sources. It allows you to build event-driven architectures in which microservices are loosely coupled and distributed. It also takes care of event ingestion, delivery, security, authorization, and error handling for you, which improves developer agility and application resilience.
In this codelab, you will learn about Eventarc. More specifically, you will listen to events from Pub/Sub, Cloud Storage, and Cloud Audit Logs with Eventarc and pass them to a Cloud Run service.
What you'll learn
- Vision of Eventarc
- Discover events in Eventarc
- Create a Cloud Run sink
- Create a trigger for Pub/Sub
- Create a trigger for Cloud Storage
- Create a trigger for Cloud Audit Logs
- Explore the Eventarc UI
2. Vision of Eventarc
Eventarc aims to deliver events from various Google, Google Cloud, and 3rd party event sources to Google Cloud event destinations.
Google Cloud sources: Event sources that are Google Cloud-owned products.
Google sources: Event sources that are Google-owned products such as Gmail, Hangouts, Android Management, and more.
Custom sources: Event sources that are not Google-owned products and are created by end users themselves.
3rd party sources: Event sources that are neither Google-owned nor customer-produced. This includes popular event sources such as Check Point CloudGuard, Datadog, ForgeRock, Lacework, etc. that are owned and maintained by 3rd party providers and partners.
Events are normalized to CloudEvents v1.0 format for cross-service interoperability. CloudEvents is a vendor-neutral open spec describing event data in common formats, enabling interoperability across services, platforms and systems.
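To make the normalization concrete, here is a minimal sketch of a CloudEvents v1.0 envelope for a Pub/Sub message-published event. The attribute names (`specversion`, `id`, `source`, `type`, etc.) come from the CloudEvents specification; the values below are illustrative placeholders, not output captured from a live trigger:

```python
import json

# Hypothetical CloudEvents v1.0 envelope; values are placeholders.
event = {
    "specversion": "1.0",  # CloudEvents spec version
    "id": "3103425958979415",  # unique per source
    "source": "//pubsub.googleapis.com/projects/my-project/topics/my-topic",
    "type": "google.cloud.pubsub.topic.v1.messagePublished",
    "time": "2024-01-01T12:00:00Z",
    "datacontenttype": "application/json",
    # The "data" payload varies by event type; for Pub/Sub it wraps the message.
    "data": {"message": {"data": "SGVsbG8gV29ybGQ="}},
}

# specversion, id, source, and type are the REQUIRED attributes in v1.0.
required = {"specversion", "id", "source", "type"}
print(required.issubset(event))  # True
```

Because every source is normalized to this shape, a single sink can handle events from Pub/Sub, Cloud Storage, and Audit Logs by switching on the `type` attribute.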
3. Setup and Requirements
Self-paced environment setup
- Sign in to the Google Cloud Console and create a new project or reuse an existing one. If you don't already have a Gmail or Google Workspace account, you must create one.
- The Project name is the display name for this project's participants. It is a character string not used by Google APIs, and you can update it at any time.
- The Project ID must be unique across all Google Cloud projects and is immutable (it cannot be changed after it has been set). The Cloud Console auto-generates a unique string; usually you don't care what it is. In most codelabs, you'll need to reference the Project ID (typically identified as PROJECT_ID), so if you don't like it, generate another random one, or try your own and see if it's available. It is "frozen" after the project is created.
- There is a third value, a Project Number, which some APIs use. Learn more about all three of these values in the documentation.
- Next, you'll need to enable billing in the Cloud Console in order to use Cloud resources/APIs. Running through this codelab shouldn't cost much, if anything at all. To shut down resources so you don't incur billing beyond this tutorial, follow any "clean-up" instructions found at the end of the codelab. New users of Google Cloud are eligible for the $300 USD Free Trial program.
Start Cloud Shell
While Google Cloud can be operated remotely from your laptop, in this codelab you will be using Google Cloud Shell, a command line environment running in the Cloud.
From the GCP Console click the Cloud Shell icon on the top right toolbar:
It should only take a few moments to provision and connect to the environment. When it is finished, you should see something like this:
This virtual machine is loaded with all the development tools you'll need. It offers a persistent 5GB home directory, and runs on Google Cloud, greatly enhancing network performance and authentication. All of your work in this lab can be done with simply a browser.
Before you begin
Inside Cloud Shell, make sure that your project ID is set:

PROJECT_ID=your-project-id
gcloud config set project $PROJECT_ID
4. Deploy a Cloud Run service
Deploy a Cloud Run service to receive events. You will deploy Cloud Run's Hello container that logs the contents of CloudEvents.
First, enable required services for Cloud Run:
gcloud services enable run.googleapis.com
Deploy the hello container to Cloud Run:
REGION=us-central1
SERVICE_NAME=hello
gcloud run deploy $SERVICE_NAME \
  --allow-unauthenticated \
  --image=gcr.io/cloudrun/hello \
  --region=$REGION
On success, the command line displays the service URL. You can open the service URL in any browser window to double check that the service is now deployed.
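A sink like this simply accepts HTTP POST requests and logs what it receives. The sketch below (standard library only, an assumption rather than the actual source of the gcr.io/cloudrun/hello container) shows the essential behavior: Eventarc delivers CloudEvents attributes as `ce-*` HTTP headers, the event payload arrives as the request body, and a 2xx response acknowledges delivery:

```python
import os
from http.server import BaseHTTPRequestHandler, HTTPServer

def format_event(ce_type, body):
    """Build the log line for one received event."""
    return f"Received event of type {ce_type}: {body}"

class EventHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read the event payload from the request body.
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length).decode()
        # Eventarc delivers CloudEvents attributes as ce-* HTTP headers.
        ce_type = self.headers.get("ce-type", "unknown")
        print(format_event(ce_type, body))
        self.send_response(200)  # 2xx acknowledges the event to Eventarc
        self.end_headers()

if os.environ.get("K_SERVICE"):  # environment variable set by Cloud Run
    # Cloud Run injects the PORT environment variable (default 8080).
    port = int(os.environ.get("PORT", 8080))
    HTTPServer(("", port), EventHandler).serve_forever()
```

Any HTTP server that listens on $PORT and returns a success status works as an Eventarc destination; the hello container just adds a UI on top of this pattern.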
5. Event Discovery
Before creating triggers in Eventarc, you can discover what the event sources are, the types of events they can emit, and how to configure triggers in order to consume them.
To see the list of different types of events:
gcloud beta eventarc attributes types list

NAME                                            DESCRIPTION
google.cloud.audit.log.v1.written               Cloud Audit Log written
google.cloud.pubsub.topic.v1.messagePublished   Cloud Pub/Sub message published
google.cloud.storage.object.v1.archived         Cloud Storage: Sent when a live version of an (object versioned) object is archived or deleted.
google.cloud.storage.object.v1.deleted          Cloud Storage: Sent when an object has been permanently deleted.
google.cloud.storage.object.v1.finalized        Cloud Storage: Sent when a new object is created (or a new generation of an existing object).
google.cloud.storage.object.v1.metadataUpdated  Cloud Storage: Sent when the metadata of an existing object changes.
To get more information about each event type:
gcloud beta eventarc attributes types describe google.cloud.audit.log.v1.written

attributes: type,serviceName,methodName,resourceName
description: 'Cloud Audit Log: Sent when a log is written.'
name: google.cloud.audit.log.v1.written
To see the list of services that emit a certain event type:
gcloud beta eventarc attributes service-names list --type=google.cloud.audit.log.v1.written

SERVICE_NAME                         DISPLAY_NAME
accessapproval.googleapis.com        Access Approval
accesscontextmanager.googleapis.com  Access Context Manager
admin.googleapis.com                 Google Workspace Admin
aiplatform.googleapis.com            AI Platform (under Vertex AI)
apigee.googleapis.com                Apigee
apigeeconnect.googleapis.com         Apigee Connect
...
workflows.googleapis.com             Workflows
To see the list of method names (sub-events) that each service can emit:
gcloud beta eventarc attributes method-names list \
  --type=google.cloud.audit.log.v1.written \
  --service-name=workflows.googleapis.com

METHOD_NAME
google.cloud.workflows.v1.Workflows.CreateWorkflow
google.cloud.workflows.v1.Workflows.DeleteWorkflow
google.cloud.workflows.v1.Workflows.GetWorkflow
google.cloud.workflows.v1.Workflows.ListWorkflows
google.cloud.workflows.v1.Workflows.UpdateWorkflow
google.cloud.workflows.v1beta.Workflows.CreateWorkflow
google.cloud.workflows.v1beta.Workflows.DeleteWorkflow
google.cloud.workflows.v1beta.Workflows.GetWorkflow
google.cloud.workflows.v1beta.Workflows.ListWorkflows
google.cloud.workflows.v1beta.Workflows.UpdateWorkflow
6. Create a Pub/Sub trigger
One way of receiving events is through Cloud Pub/Sub. Any application can publish messages to Pub/Sub and these messages can be delivered to Cloud Run via Eventarc.
Setup
Before creating any triggers, enable required services for Eventarc:
gcloud services enable eventarc.googleapis.com
You also need a service account to be used by triggers. Create a service account:
SERVICE_ACCOUNT=eventarc-trigger-sa
gcloud iam service-accounts create $SERVICE_ACCOUNT
Create
Create a trigger that filters events published to the Pub/Sub topic and routes them to our deployed Cloud Run service:
TRIGGER_NAME=trigger-pubsub
gcloud eventarc triggers create $TRIGGER_NAME \
  --destination-run-service=$SERVICE_NAME \
  --destination-run-region=$REGION \
  --event-filters="type=google.cloud.pubsub.topic.v1.messagePublished" \
  --location=$REGION \
  --service-account=$SERVICE_ACCOUNT@$PROJECT_ID.iam.gserviceaccount.com
Test
The Pub/Sub trigger creates a topic under the covers. Find that topic and assign it to a variable:
TOPIC_ID=$(gcloud eventarc triggers describe $TRIGGER_NAME --location $REGION --format='value(transport.pubsub.topic)')
Use gcloud to publish a message to the topic:
gcloud pubsub topics publish $TOPIC_ID --message="Hello World"
The Cloud Run service logs the body of the incoming message. You can view this in the Logs section of your Cloud Run instance:
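Note that the message body arrives base64-encoded inside the CloudEvent payload. A sketch of how a sink might decode it, using an envelope shaped like the google.cloud.pubsub.topic.v1.messagePublished event (the message contents below are a fabricated example):

```python
import base64
import json

# Hypothetical CloudEvent body for a messagePublished event; the Pub/Sub
# message "data" field is base64-encoded, per the Pub/Sub message format.
event_body = json.dumps({
    "message": {
        "data": base64.b64encode(b"Hello World").decode(),
        "messageId": "1234567890",
    },
    "subscription": "projects/my-project/subscriptions/eventarc-sub",
})

# What a sink would do with the POST body it receives:
payload = json.loads(event_body)
text = base64.b64decode(payload["message"]["data"]).decode()
print(text)  # Hello World
```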
Create with an existing Pub/Sub topic
By default, when you create a Pub/Sub trigger, Eventarc creates a topic under the covers for you to use as a transport topic between your application and a Cloud Run service. This is useful to easily and quickly create a Pub/Sub-backed trigger, but sometimes you might want to use an existing topic. Eventarc allows you to specify an existing Pub/Sub topic in the same project with the --transport-topic gcloud flag.
To see how this works, create a Pub/Sub topic to use as transport topic:
TOPIC_ID=eventarc-topic
gcloud pubsub topics create $TOPIC_ID
Create a trigger:
TRIGGER_NAME=trigger-pubsub-existing
gcloud eventarc triggers create $TRIGGER_NAME \
  --destination-run-service=$SERVICE_NAME \
  --destination-run-region=$REGION \
  --event-filters="type=google.cloud.pubsub.topic.v1.messagePublished" \
  --location=$REGION \
  --transport-topic=projects/$PROJECT_ID/topics/$TOPIC_ID \
  --service-account=$SERVICE_ACCOUNT@$PROJECT_ID.iam.gserviceaccount.com
You can test the trigger by sending a message to the topic:
gcloud pubsub topics publish $TOPIC_ID --message="Hello again"
7. Create a Cloud Storage trigger
In this step, you will create a trigger to listen for events from Cloud Storage.
Setup
First, create a bucket to receive events from:
BUCKET_NAME=eventarc-gcs-$PROJECT_ID
gsutil mb -l $REGION gs://$BUCKET_NAME
Grant the eventarc.eventReceiver role, so the service account can be used in a Cloud Storage trigger:
gcloud projects add-iam-policy-binding $PROJECT_ID \
  --role roles/eventarc.eventReceiver \
  --member serviceAccount:$SERVICE_ACCOUNT@$PROJECT_ID.iam.gserviceaccount.com
You also need to add the pubsub.publisher role to the Cloud Storage service account for Cloud Storage triggers:
SERVICE_ACCOUNT_STORAGE=$(gsutil kms serviceaccount -p $PROJECT_ID)
gcloud projects add-iam-policy-binding $PROJECT_ID \
  --member serviceAccount:$SERVICE_ACCOUNT_STORAGE \
  --role roles/pubsub.publisher
Create
Create a trigger to route new file creation events from the bucket to your service:
TRIGGER_NAME=trigger-storage
gcloud eventarc triggers create $TRIGGER_NAME \
  --destination-run-service=$SERVICE_NAME \
  --destination-run-region=$REGION \
  --event-filters="type=google.cloud.storage.object.v1.finalized" \
  --event-filters="bucket=$BUCKET_NAME" \
  --location=$REGION \
  --service-account=$SERVICE_ACCOUNT@$PROJECT_ID.iam.gserviceaccount.com
Test
List all triggers to confirm that the trigger was successfully created:
gcloud eventarc triggers list
Upload a file to the Cloud Storage bucket:
echo "Hello World" > random.txt
gsutil cp random.txt gs://$BUCKET_NAME/random.txt
If you check the logs of the Cloud Run service in Cloud Console, you should see the received event:
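For a google.cloud.storage.object.v1.finalized event, the CloudEvent data is the Cloud Storage object resource in JSON form. The sketch below fabricates such a body (the field values are illustrative, not captured from a live bucket) and shows how a sink might pull out the fields it cares about:

```python
import json

# Illustrative body of a storage object "finalized" event: the payload is
# the Cloud Storage object resource. Values here are made up.
event_body = json.dumps({
    "kind": "storage#object",
    "bucket": "eventarc-gcs-my-project",
    "name": "random.txt",
    "size": "12",  # object sizes are serialized as strings
    "contentType": "text/plain",
    "timeCreated": "2024-01-01T12:00:00.000Z",
})

obj = json.loads(event_body)
print(f"New object gs://{obj['bucket']}/{obj['name']} ({obj['size']} bytes)")
```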
8. Create a Cloud Audit Logs trigger
Although a Cloud Storage trigger is the better way to listen for Cloud Storage events, in this step you create a Cloud Audit Logs trigger that does the same.
Setup
In order to receive events from a service, you need to enable Cloud Audit Logs. From the Cloud Console, select IAM & Admin and then Audit Logs from the upper left-hand menu. In the list of services, check Google Cloud Storage:
On the right-hand side, make sure Admin, Read, and Write are selected, and click Save:
Create
Create a trigger to route new file creation events from the bucket to your service:
TRIGGER_NAME=trigger-auditlog-storage
gcloud eventarc triggers create $TRIGGER_NAME \
  --destination-run-service=$SERVICE_NAME \
  --destination-run-region=$REGION \
  --event-filters="type=google.cloud.audit.log.v1.written" \
  --event-filters="serviceName=storage.googleapis.com" \
  --event-filters="methodName=storage.objects.create" \
  --event-filters-path-pattern="resourceName=/projects/_/buckets/$BUCKET_NAME/objects/*" \
  --location=$REGION \
  --service-account=$SERVICE_ACCOUNT@$PROJECT_ID.iam.gserviceaccount.com
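The resourceName filter uses Eventarc's path-pattern syntax, where * matches within a single path segment and ** matches across segments. A rough sketch of that matching behavior (not Eventarc's actual implementation, just an illustration of the semantics):

```python
import re

def path_pattern_to_regex(pattern):
    """Translate a path pattern into a regex: * stays within one
    path segment, ** spans segments. Illustrative, not Eventarc's code."""
    escaped = re.escape(pattern)
    escaped = escaped.replace(r"\*\*", ".*")   # ** matches across segments
    escaped = escaped.replace(r"\*", "[^/]*")  # * matches within a segment
    return re.compile("^" + escaped + "$")

pattern = path_pattern_to_regex("/projects/_/buckets/my-bucket/objects/*")
print(bool(pattern.match("/projects/_/buckets/my-bucket/objects/random.txt")))  # True
print(bool(pattern.match("/projects/_/buckets/my-bucket/objects/dir/file")))    # False
```

With the single * in the trigger above, objects whose names contain a slash would need ** instead.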
Test
Audit Logs triggers take a little bit of time to initialize. You can check that the trigger is created by listing all triggers:
gcloud eventarc triggers list
You should see that the ACTIVE field is Yes:

NAME                      TYPE                               DESTINATION               ACTIVE
trigger-auditlog-storage  google.cloud.audit.log.v1.written  Cloud Run service: hello  Yes
Upload the same file to the Cloud Storage bucket as you did earlier:
gsutil cp random.txt gs://$BUCKET_NAME/random.txt
If you check the logs of the Cloud Run service in Cloud Console, you should see the received event:
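For a google.cloud.audit.log.v1.written event, the CloudEvent data is a Cloud Audit Logs LogEntry, and the fields the trigger filtered on (serviceName, methodName, resourceName) appear under protoPayload. A sketch with a fabricated entry (values are illustrative):

```python
import json

# Illustrative body of an audit-log "written" event: the payload is a
# Cloud Audit Logs LogEntry. Values here are made up.
event_body = json.dumps({
    "protoPayload": {
        "serviceName": "storage.googleapis.com",
        "methodName": "storage.objects.create",
        "resourceName": "projects/_/buckets/eventarc-gcs-my-project/objects/random.txt",
    },
    "severity": "INFO",
})

entry = json.loads(event_body)["protoPayload"]
print(f"{entry['methodName']} on {entry['resourceName']}")
```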
9. Explore the Eventarc UI
In this step, you will explore the Eventarc UI in the Google Cloud Console, where you can get an overview of all your triggers, edit or delete them, and create new ones.
Go to the Eventarc section of Google Cloud:
You'll see the list of triggers you created earlier:
If you click on a trigger, you can see its details and edit or delete it:
You can also create a new trigger by selecting Create trigger and filling in the details of the trigger:
10. Congratulations!
Congratulations on completing the codelab!
What we've covered
- Vision of Eventarc
- Discover events in Eventarc
- Create a Cloud Run sink
- Create a trigger for Pub/Sub
- Create a trigger for Cloud Storage
- Create a trigger for Cloud Audit Logs
- Explore the Eventarc UI