Creating Eventarc triggers with Terraform

Terraform is increasingly the preferred tool for building, changing, and versioning infrastructure in Google Cloud and across clouds. In an earlier post, I showed how to create Eventarc triggers using Google Cloud Console or via the command line with gcloud. In this post, I show how to create the same triggers with the google_eventarc_trigger Terraform resource. See eventarc-samples/terraform on GitHub for the prerequisites and the full Terraform configuration. Define a Cloud Run service as an event sink: before you can create a trigger, you need a Cloud Run service to act as the event sink for that trigger. Read More ↗︎
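As a taste of what the full post covers, here is a minimal sketch of a google_eventarc_trigger resource routing Cloud Storage events to a Cloud Run service. The trigger, bucket, and referenced resource names are placeholders, not taken from the post.

```hcl
# Sketch: an Eventarc trigger that sends object-finalized events from
# a Cloud Storage bucket to a Cloud Run service. All names below are
# placeholders; see eventarc-samples/terraform for the real configuration.
resource "google_eventarc_trigger" "storage_trigger" {
  name     = "trigger-storage"
  location = "us-central1"

  matching_criteria {
    attribute = "type"
    value     = "google.cloud.storage.object.v1.finalized"
  }
  matching_criteria {
    attribute = "bucket"
    value     = "my-bucket" # placeholder
  }

  destination {
    cloud_run_service {
      service = google_cloud_run_service.default.name
      region  = "us-central1"
    }
  }

  service_account = google_service_account.eventarc.email
}
```

The destination block references a Cloud Run service defined elsewhere in the configuration, which is why the post creates the service before the trigger.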

Applying a path pattern when filtering in Eventarc

You can now apply a path pattern when filtering in Eventarc. This is especially useful when you need to filter on resource names beyond an exact match. The path pattern syntax lets you define a regex-like expression that matches events as broadly or as narrowly as you like. Let’s take a look at a concrete example. Without path patterns: let’s say you want to listen for new file creations in a Cloud Storage bucket with an AuditLog trigger. Read More →
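For illustration, here is a sketch of an AuditLog trigger using the gcloud `--event-filters-path-pattern` flag to match any object in a bucket instead of one exact resource name. The bucket, service, and account names are placeholders.

```shell
# Sketch: route storage.objects.create audit log events for any object
# in a bucket, using a path pattern rather than an exact resourceName.
# "my-bucket", "my-service", and SA_EMAIL are placeholders.
gcloud eventarc triggers create trigger-pathpattern \
  --location=us-central1 \
  --destination-run-service=my-service \
  --event-filters="type=google.cloud.audit.log.v1.written" \
  --event-filters="serviceName=storage.googleapis.com" \
  --event-filters="methodName=storage.objects.create" \
  --event-filters-path-pattern="resourceName=/projects/_/buckets/my-bucket/objects/*" \
  --service-account=SA_EMAIL
```

Without the path pattern, the same filter would need the full resource name of a single object, which rarely matches what you actually want to listen for.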

Building APIs with Cloud Functions and API Gateway

Building APIs with Cloud Run: if I want to build an API, I usually use Cloud Run. In Cloud Run, you run a container, and in that container you run a web server that handles a base URL in this format: https://<service-name>-<hash>-<region>.a.run.app. You can then have the web server handle any path under that base URL. Building APIs with Cloud Functions: in Cloud Functions, you only have access to a function (no web server), and that function can only handle the base path. Read More →
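One common workaround for the single-path limitation, before reaching for API Gateway, is to dispatch on the request path inside a single HTTP function. This is a minimal, framework-free sketch of that idea; the handler names and routing table are made up, and in a real Cloud Function the path would come from the incoming request object.

```python
# Sketch of path-based dispatch inside one HTTP function entry point.
# A plain string stands in for the request path here; in a real
# function you would read it from the request object.

def list_users():
    return "all users"

def get_user(user_id):
    return f"user {user_id}"

def dispatch(path):
    """Route a request path to a handler (hypothetical routing table)."""
    parts = [p for p in path.split("/") if p]
    if parts == ["users"]:
        return list_users()
    if len(parts) == 2 and parts[0] == "users":
        return get_user(parts[1])
    return "not found"
```

This keeps routing in your own code; the post's point is that API Gateway can take that job over and map paths to functions declaratively instead.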

Long-running containers with Workflows and Compute Engine

Sometimes, you need to run a piece of code for hours, days, or even weeks. Cloud Functions and Cloud Run are my default choices to run code. However, they both limit how long a function or container can run, which rules out executing long-running code in a serverless way. Thanks to Workflows and Compute Engine, you can have an almost serverless experience with long-running code. Read More ↗︎
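The shape of the approach can be sketched as a workflow that starts a VM with the Compute Engine connector, waits, and stops it again. This is only an assumed outline; the project, zone, and instance names are placeholders, and a real workflow would poll the job rather than sleep a fixed time.

```yaml
# Sketch: start an existing VM for a long-running job, then stop it.
# Project, zone, and instance names are placeholders.
main:
  steps:
    - start_vm:
        call: googleapis.compute.v1.instances.start
        args:
          project: my-project
          zone: us-central1-a
          instance: long-running-vm
    - wait_for_job:
        # A real workflow would poll the job's status in a loop here.
        call: sys.sleep
        args:
          seconds: 3600
    - stop_vm:
        call: googleapis.compute.v1.instances.stop
        args:
          project: my-project
          zone: us-central1-a
          instance: long-running-vm
```

Because the workflow itself can run for a long time, it can babysit the VM's lifecycle while you only pay for the VM while the job runs.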

Implementing the saga pattern in Workflows

It’s common to have a separate database for each service in microservices-based architectures. This pattern ensures that the independently designed and deployed microservices remain independent in the data layer as well. But it also introduces a new problem: How do you implement transactions (single units of work, usually made up of multiple operations) that span multiple microservices each with their own local database? In a traditional monolith architecture, you can rely on ACID transactions (atomicity, consistency, isolation, durability) against a single database. Read More ↗︎
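The core of the saga pattern is pairing each step with a compensating action that undoes it on failure. A minimal sketch in Workflows syntax, with placeholder service URLs (not the post's actual services), might look like this:

```yaml
# Sketch of a compensating transaction: if the payment step fails,
# roll back the earlier reservation. Service URLs are placeholders.
main:
  steps:
    - reserve:
        call: http.post
        args:
          url: https://reservation-service.example.com/reserve
    - charge:
        try:
          call: http.post
          args:
            url: https://payment-service.example.com/charge
        except:
          as: e
          steps:
            - compensate_reserve:
                call: http.post
                args:
                  url: https://reservation-service.example.com/cancel
            - reraise:
                raise: ${e}
```

The try/except block gives each step a place to trigger compensation for everything that succeeded before it, which is what stands in for a cross-database ACID transaction.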

.NET 6 support in Google Cloud Buildpacks and Cloud Run

I really like the idea of containers: a reproducible context that our apps can rely on. However, I really dislike having to create a Dockerfile to define my container images. I always need to copy and paste one from somewhere, and it takes me some time to get it right. That’s why I like using Google Cloud Buildpacks to build containers without having to worry about a Dockerfile. I’m happy to announce that, as of yesterday, Google Cloud Buildpacks now supports .NET 6. Read More →
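As a sketch of the Dockerfile-free flow, building and deploying with buildpacks looks roughly like this; the project and app names are placeholders.

```shell
# Sketch: build a container image from source with Google Cloud
# Buildpacks (no Dockerfile), then deploy it to Cloud Run.
# "my-project" and "my-app" are placeholders.
gcloud builds submit --pack image=gcr.io/my-project/my-app
gcloud run deploy my-app \
  --image=gcr.io/my-project/my-app \
  --region=us-central1
```

The buildpack detects the runtime from the source tree, which is why .NET 6 support means .NET 6 projects now build with no image-definition work at all.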

Auto-completion for Workflows JSON and YAML on Visual Studio Code

If you’re like me, you probably use VS Code to author your Workflows JSON or YAML. You probably also expect some kind of syntax validation or auto-completion as you work on your workflow. Unfortunately, there’s no VS Code extension for Workflows, and Cloud Code for VS Code does not support Workflows either. However, there’s a way to get partial auto-completion for Workflows in VS Code. VS Code and JSON Schema: VS Code can display auto-complete suggestions for JSON and YAML files out of the box. Read More →
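The mechanism is a settings.json mapping from file patterns to a JSON Schema. Here is a sketch of the shape; the schema URL and file patterns are placeholders (point them at whichever Workflows schema you use), and the `yaml.schemas` key additionally assumes the YAML extension is installed.

```jsonc
// Sketch: map a JSON Schema to workflow files in VS Code settings.
// The URL and glob patterns below are placeholders.
{
  "json.schemas": [
    {
      "fileMatch": ["*.workflows.json"],
      "url": "https://example.com/workflows-schema.json"
    }
  ],
  "yaml.schemas": {
    "https://example.com/workflows-schema.json": "*.workflows.yaml"
  }
}
```

Once mapped, VS Code validates the file against the schema and suggests the keys the schema declares, which is where the "partial" auto-completion comes from.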

Introducing the new Eventarc UI, Cloud Run for Anthos destinations

December was a busy month for the Eventarc team, who landed a number of features at the end of the year. Let’s take a closer look at some of these new capabilities. Cloud Storage trigger is GA: back in September, we announced the public preview of Cloud Storage triggers as the preferred way of routing Cloud Storage events to Cloud Run targets. They are now generally available. For more details, see the documentation on how to create a Cloud Storage trigger, and check out my previous blog post on Cloud Storage triggers. Read More ↗︎
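For reference, creating a now-GA Cloud Storage trigger from the command line looks roughly like this; the bucket, service, and account names are placeholders.

```shell
# Sketch: a Cloud Storage trigger routing object-finalized events to a
# Cloud Run service. "my-bucket", "my-service", and SA_EMAIL are placeholders.
gcloud eventarc triggers create trigger-storage \
  --location=us-central1 \
  --destination-run-service=my-service \
  --event-filters="type=google.cloud.storage.object.v1.finalized" \
  --event-filters="bucket=my-bucket" \
  --service-account=SA_EMAIL
```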

Cross-region and cross-project event routing with Eventarc and Pub/Sub

With event-driven architectures, it’s quite common to read events from a source in one region or project and route them to a destination in another region or another project. Let’s take a look at how you can implement cross-region and cross-project event routing in Google Cloud. Cross-region event routing is straightforward in Google Cloud, whether you’re using Pub/Sub directly or Eventarc. Pub/Sub routes messages globally. When applications hosted in any region publish messages to a topic, subscribers from any region can pull from that topic. Read More ↗︎
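The cross-project case largely comes down to using the topic's full resource name, since it encodes the owning project. A sketch with placeholder project IDs:

```shell
# Sketch: from project-b, subscribe to a topic that lives in project-a
# by using the topic's fully qualified name. Project IDs are placeholders.
gcloud pubsub subscriptions create cross-project-sub \
  --project=project-b \
  --topic=projects/project-a/topics/my-topic
```

The identity creating the subscription still needs the appropriate Pub/Sub permissions on the topic in the other project.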

A closer look at locations in Eventarc

New locations in Eventarc: back in August, we announced more Eventarc locations (17 new regions, as well as 6 new dual-region and multi-region locations, to be precise). This takes the total number of locations in Eventarc to more than 30. You can see the full list on the Eventarc locations page or by running gcloud eventarc locations list. What does location mean in Eventarc? An Eventarc location usually refers to the single region that the Eventarc trigger is created in. Read More ↗︎
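In practice, the location is just the `--location` flag on the trigger. A sketch, with placeholder service and account names:

```shell
# List the locations Eventarc currently supports.
gcloud eventarc locations list

# A trigger is created in a single location, chosen with --location.
# "my-service" and SA_EMAIL are placeholders.
gcloud eventarc triggers create trigger-regional \
  --location=us-central1 \
  --destination-run-service=my-service \
  --event-filters="type=google.cloud.pubsub.topic.v1.messagePublished" \
  --service-account=SA_EMAIL
```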