Creating Workflows that pause and wait for events

In Workflows, it’s easy to chain various services together into an automated workflow. For some use cases, you might need to pause workflow execution and wait for some input. This input could be a human approval or an external service calling back with data needed to complete the workflow. With Workflows callbacks, a workflow can create an HTTP endpoint and pause execution until it receives an HTTP callback to that endpoint. Read More →
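A minimal sketch of the callback pattern in Workflows YAML (the step names and timeout here are illustrative):

```yaml
main:
  steps:
    # Create an HTTP endpoint that this execution will listen on.
    - create_callback:
        call: events.create_callback_endpoint
        args:
          http_callback_method: "POST"
        result: callback_details
    # Log the URL so an approver or external service can POST to it.
    - log_url:
        call: sys.log
        args:
          text: ${"Awaiting callback at " + callback_details.url}
    # Pause execution until the callback arrives (or the timeout hits).
    - await_callback:
        call: events.await_callback
        args:
          callback: ${callback_details}
          timeout: 86400  # seconds; wait up to a day
        result: callback_request
    - done:
        return: ${callback_request.http_request.body}
```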

Taking screenshots of web pages with Cloud Run jobs, Workflows, and Eventarc

At Google Cloud I/O, we announced the public preview of Cloud Run jobs. Unlike Cloud Run services that run continuously to respond to web requests or events, Cloud Run jobs run code that performs some work and quits when the work is done. Cloud Run jobs are a good fit for administrative tasks such as database migration, scheduled work such as nightly reports, or batch data transformation. In this post, I show you a fully serverless, event-driven application to take screenshots of web pages, powered by Cloud Run jobs, Workflows, and Eventarc. Read More ↗︎
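As a rough sketch of the job lifecycle, assuming a prebuilt screenshot container image and placeholder names (at preview time the commands live under gcloud beta):

```bash
# Create a job from a container image that runs to completion.
gcloud beta run jobs create screenshot \
  --image=us-docker.pkg.dev/my-project/containers/screenshot \
  --tasks=10 \
  --max-retries=3 \
  --region=us-central1

# Execute the job; each task processes its slice of the work and exits.
gcloud beta run jobs execute screenshot --region=us-central1
```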

Workflows state management with Firestore

In Workflows, you sometimes need to store some state (a key/value pair) in a step of one execution and later read that state in a step of another execution. There’s no built-in key/value store in Workflows. However, you can use Firestore as a key/value store, and that’s what I want to show you here. If you want to skip ahead to the samples, check out workflow.yaml. If you want to learn more, keep reading. Read More →
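Here is a condensed sketch of the idea using the Workflows Firestore connector; the collection and document names are made up, and workflow.yaml has the real sample:

```yaml
main:
  steps:
    - init:
        assign:
          - project: ${sys.get_env("GOOGLE_CLOUD_PROJECT_ID")}
          # Hypothetical collection/document acting as the key/value pair.
          - doc_path: ${"projects/" + project + "/databases/(default)/documents/workflow-state/my-key"}
    # Write the state in one execution...
    - write_state:
        call: googleapis.firestore.v1.projects.databases.documents.patch
        args:
          name: ${doc_path}
          body:
            fields:
              value:
                stringValue: "my-value"
    # ...and read it back in a later step or another execution.
    - read_state:
        call: googleapis.firestore.v1.projects.databases.documents.get
        args:
          name: ${doc_path}
        result: state
    - done:
        return: ${state.fields.value.stringValue}
```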

Introducing Eventarc triggers for Workflows

We’re happy to announce that you can now create Eventarc triggers that directly target Workflows destinations. Available as a preview feature, this simplifies event-driven orchestrations by letting you route Eventarc events to Workflows without an intermediary service. In a previous post, we talked about how to integrate Eventarc and Workflows. Since there was no direct integration between the two services back then, we had to deploy an intermediary Cloud Run service to receive events from Eventarc and then use the Workflows API to kick off a Workflows execution. Read More ↗︎
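As a sketch of what the direct integration looks like with gcloud (all names here are placeholders):

```bash
# Route Pub/Sub events directly to a workflow, no intermediary service.
gcloud eventarc triggers create trigger-pubsub-workflow \
  --location=us-central1 \
  --destination-workflow=my-workflow \
  --destination-workflow-location=us-central1 \
  --event-filters="type=google.cloud.pubsub.topic.v1.messagePublished" \
  --service-account=my-sa@my-project.iam.gserviceaccount.com
```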

Creating Eventarc triggers with Terraform

Terraform is increasingly the preferred tool for building, changing, and versioning infrastructure in Google Cloud and across clouds. In an earlier post, I showed how to create Eventarc triggers using the Google Cloud Console or the gcloud command line. In this post, I show how to create the same triggers with the google_eventarc_trigger Terraform resource. See eventarc-samples/terraform on GitHub for the prerequisites and main.tf for the full Terraform configuration. Before you can create a trigger, you first need to define a Cloud Run service as an event sink for the trigger. Read More ↗︎
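The general shape of the resource, with placeholder names (main.tf has the real configuration):

```hcl
resource "google_eventarc_trigger" "trigger_pubsub" {
  name     = "trigger-pubsub-tf"
  location = "us-central1"

  # Filter: only Pub/Sub messagePublished events.
  matching_criteria {
    attribute = "type"
    value     = "google.cloud.pubsub.topic.v1.messagePublished"
  }

  # Event sink: the Cloud Run service defined elsewhere in main.tf.
  destination {
    cloud_run_service {
      service = google_cloud_run_service.default.name
      region  = "us-central1"
    }
  }

  service_account = google_service_account.eventarc.email
}
```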

Applying a path pattern when filtering in Eventarc

You can now apply a path pattern when filtering in Eventarc. This is especially useful when you need to filter on resource names beyond an exact match. The path pattern syntax lets you define a regex-like expression that matches events as broadly or as narrowly as you like. Let’s take a look at a concrete example: without path patterns, say you want to listen for new file creations in a Cloud Storage bucket with an AuditLog trigger. Read More →
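To give a flavor of the syntax, a hypothetical trigger that matches only .png objects under a folder, instead of one exact resource name:

```bash
# Match AuditLog events for any .png created under images/,
# using wildcards in resourceName instead of an exact match.
gcloud eventarc triggers create new-file-trigger \
  --location=us-central1 \
  --destination-run-service=my-service \
  --event-filters="type=google.cloud.audit.log.v1.written" \
  --event-filters="serviceName=storage.googleapis.com" \
  --event-filters="methodName=storage.objects.create" \
  --event-filters-path-pattern="resourceName=/projects/_/buckets/my-bucket/objects/images/*.png" \
  --service-account=my-sa@my-project.iam.gserviceaccount.com
```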

Building APIs with Cloud Functions and API Gateway

If I want to build an API, I usually use Cloud Run. In Cloud Run, you run a container, and in that container you run a web server that handles a base URL in this format: https://&lt;service-name&gt;-&lt;hash&gt;-&lt;region&gt;.a.run.app. You can then have the web server handle any path under that base URL, such as https://&lt;service-name&gt;-&lt;hash&gt;-&lt;region&gt;.a.run.app/hello or https://&lt;service-name&gt;-&lt;hash&gt;-&lt;region&gt;.a.run.app/bye. In Cloud Functions, on the other hand, you only have access to a function (no web server), and that function can only handle the base path. Read More →
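This is where API Gateway comes in: an OpenAPI spec maps paths onto function backends. A minimal sketch with hypothetical project and function names:

```yaml
# openapi.yaml: route API paths to Cloud Functions backends.
swagger: "2.0"
info:
  title: my-api
  version: "1.0.0"
schemes:
  - https
paths:
  /hello:
    get:
      operationId: hello
      # x-google-backend points this path at a function's URL.
      x-google-backend:
        address: https://us-central1-my-project.cloudfunctions.net/hello
      responses:
        "200":
          description: OK
  /bye:
    get:
      operationId: bye
      x-google-backend:
        address: https://us-central1-my-project.cloudfunctions.net/bye
      responses:
        "200":
          description: OK
```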

Long-running containers with Workflows and Compute Engine

Sometimes, you need to run a piece of code for hours, days, or even weeks. Cloud Functions and Cloud Run are my default choices to run code. However, they both have limits on how long a function or container can run, which seemingly rules out executing long-running code in a serverless way. Thanks to Workflows and Compute Engine, you can still get an almost serverless experience with long-running code. Read More ↗︎
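Heavily condensed, the pattern looks roughly like this in Workflows YAML (machine settings are placeholders, and the completion-detection step is elided):

```yaml
main:
  steps:
    - init:
        assign:
          - project: ${sys.get_env("GOOGLE_CLOUD_PROJECT_ID")}
          - zone: "us-central1-a"
          - machine: "long-running-vm"
    # Create a VM; the connector waits for the create operation to finish.
    - create_vm:
        call: googleapis.compute.v1.instances.insert
        args:
          project: ${project}
          zone: ${zone}
          body:
            name: ${machine}
            machineType: ${"zones/" + zone + "/machineTypes/e2-small"}
            disks:
              - boot: true
                initializeParams:
                  sourceImage: "projects/cos-cloud/global/images/family/cos-stable"
            networkInterfaces:
              - network: "global/networks/default"
            # Container startup configuration omitted for brevity.
    # ...a polling or callback step to detect completion goes here...
    # Clean up the VM once the long-running work is done.
    - delete_vm:
        call: googleapis.compute.v1.instances.delete
        args:
          project: ${project}
          zone: ${zone}
          instance: ${machine}
```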

Implementing the saga pattern in Workflows

It’s common to have a separate database for each service in microservices-based architectures. This pattern ensures that the independently designed and deployed microservices remain independent in the data layer as well. But it also introduces a new problem: How do you implement transactions (single units of work, usually made up of multiple operations) that span multiple microservices, each with its own local database? In a traditional monolith architecture, you can rely on ACID transactions (atomicity, consistency, isolation, durability) against a single database. Read More ↗︎
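In Workflows, the saga pattern boils down to try/except blocks with compensating steps; a sketch with made-up service URLs:

```yaml
main:
  steps:
    # Forward step 1: charge the customer.
    - charge_payment:
        call: http.post
        args:
          url: https://payment-service.example.com/charge  # hypothetical
    # Forward step 2: reserve inventory; on failure, compensate step 1.
    - reserve_inventory:
        try:
          call: http.post
          args:
            url: https://inventory-service.example.com/reserve  # hypothetical
        except:
          as: e
          steps:
            # Compensating transaction: undo the charge.
            - refund_payment:
                call: http.post
                args:
                  url: https://payment-service.example.com/refund
            - reraise:
                raise: ${e}
    - done:
        return: "order completed"
```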

.NET 6 support in Google Cloud Buildpacks and Cloud Run

I really like the idea of containers: a reproducible context that our apps can rely on. However, I really dislike having to create a Dockerfile to define my container images. I always end up copying and pasting one from somewhere, and it takes me some time to get it right. That’s why I like using Google Cloud Buildpacks to build containers without having to worry about a Dockerfile. I’m happy to announce that, as of yesterday, Google Cloud Buildpacks now supports .NET 6. Read More →
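In practice that means deploying straight from source and letting Buildpacks produce the image; for example (service name is a placeholder):

```bash
# From a .NET 6 project directory with no Dockerfile:
# Buildpacks detect the runtime, build the image, and deploy it.
gcloud run deploy myservice --source .
```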