Multi-environment service orchestrations

In a previous post, I showed how to use a GitOps approach to manage the deployment lifecycle of a service orchestration. This approach makes it easy to deploy changes to a workflow in a staging environment, run tests against it, and gradually roll out these changes to the production environment. While GitOps helps to manage the deployment lifecycle, it’s not enough. Sometimes, you need to make changes to the workflow before deploying to different environments. Read More ↗︎

GitOps your service orchestrations

GitOps takes DevOps best practices used for application development (such as version control and CI/CD) and applies them to infrastructure automation. In GitOps, the Git repository serves as the source of truth and the CD pipeline is responsible for building, testing, and deploying the application code and the underlying infrastructure. Nowadays, an application is not just code running on infrastructure that you own and operate. It is usually a set of first-party and third-party microservices working together in an event-driven architecture or with a central service orchestrator such as Workflows. Read More ↗︎
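To make the CD part of that loop concrete, here is a minimal sketch (not the exact pipeline from the post) of a Cloud Build configuration that redeploys a workflow whenever its definition changes in the repository; the workflow name, region, and source path are placeholders.

```yaml
# cloudbuild.yaml (illustrative sketch): redeploy the workflow on every push.
# "my-workflow", the region, and the source path are placeholders.
steps:
  - name: 'gcr.io/cloud-builders/gcloud'
    args:
      - 'workflows'
      - 'deploy'
      - 'my-workflow'
      - '--source=workflow.yaml'
      - '--location=us-central1'
```

Wire a configuration like this to a Cloud Build trigger on your main branch and you get the Git-as-source-of-truth loop: merge a change to workflow.yaml and the pipeline deploys it.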

Triggering Workflows from Google Sheets

Is it possible to integrate Google Workspace tools such as Calendar, Sheets, and Forms with Workflows? For example, can you trigger a workflow from a Google Form or a Sheet? It turns out this is not only possible but also easier than you might think. Let me show you how with a sample use case: imagine you are an administrator in charge of allocating virtual machines (VMs) in your cloud infrastructure to users. Read More →
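As a rough sketch of the receiving end, the workflow below simply accepts whatever values the Sheet-side script passes as execution arguments and logs them; the requester and machineType fields are hypothetical names used only for illustration.

```yaml
# Illustrative sketch: a workflow that receives runtime arguments passed by
# whatever starts the execution (for example, a script attached to the Sheet).
# The requester and machineType field names are hypothetical.
main:
  params: [args]
  steps:
    - log_request:
        call: sys.log
        args:
          text: ${"VM request from " + args.requester + " for machine type " + args.machineType}
    - done:
        return: ${args}
```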

Route Datadog monitoring alerts to Google Cloud with Eventarc

A few weeks ago, we announced third-party event sources for Eventarc, with a first cohort of third-party providers from our ecosystem partners. This blog post describes how to listen for events from one of those third-party providers, Datadog, and route them to a service in Google Cloud via Eventarc. Datadog is a monitoring platform for cloud applications. It brings together end-to-end traces, metrics, and logs to make your applications and infrastructure observable. Read More ↗︎

Creating Workflows that pause and wait for events

In Workflows, it’s easy to chain various services together into an automated workflow. For some use cases, you might need to pause workflow execution and wait for some input. This input could be a human approval or an external service calling back with data needed to complete the workflow. With Workflows callbacks, a workflow can create an HTTP endpoint and pause execution until it receives an HTTP callback to that endpoint. Read More →
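A minimal sketch of the pattern looks like this; the one-hour timeout and the returned value are arbitrary choices for illustration.

```yaml
main:
  steps:
    - create_callback:
        # Create an HTTP endpoint that is unique to this execution.
        call: events.create_callback_endpoint
        args:
          http_callback_method: "POST"
        result: callback_details
    - log_callback_url:
        # Share callback_details.url with the approver or the external service.
        call: sys.log
        args:
          text: ${"Waiting for a callback at " + callback_details.url}
    - await_callback:
        # Pause the execution until the endpoint receives a POST or the timeout expires.
        call: events.await_callback
        args:
          callback: ${callback_details}
          timeout: 3600
        result: callback_request
    - done:
        return: ${callback_request.http_request.body}
```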

Taking screenshots of web pages with Cloud Run jobs, Workflows, and Eventarc

At Google Cloud I/O, we announced the public preview of Cloud Run jobs. Unlike Cloud Run services, which run continuously to respond to web requests or events, Cloud Run jobs run code that performs some work and quits when the work is done. Cloud Run jobs are a good fit for administrative tasks such as database migrations, scheduled work like nightly reports, or batch data transformations. In this post, I show you a fully serverless, event-driven application that takes screenshots of web pages, powered by Cloud Run jobs, Workflows, and Eventarc. Read More ↗︎

Workflows state management with Firestore

In Workflows, you sometimes need to store some state (a key/value pair) in a step of one execution and later read that state in a step of another execution. There’s no built-in key/value store in Workflows. However, you can use Firestore as a key/value store, and that’s what I want to show you here. If you want to skip ahead and see some samples, check out workflow.yaml. If you want to learn more, keep reading. Read More →
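As a rough sketch of that idea, the Firestore connector can write and then read a document that acts as the key/value store; the env-vars collection, config document, and url field below are placeholder names.

```yaml
# Illustrative sketch: store and read a key/value pair in a Firestore document.
# The env-vars collection, config document, and url field are placeholders.
main:
  steps:
    - init:
        assign:
          - project: ${sys.get_env("GOOGLE_CLOUD_PROJECT_ID")}
          - doc: ${"projects/" + project + "/databases/(default)/documents/env-vars/config"}
    - write_key:
        call: googleapis.firestore.v1.projects.databases.documents.patch
        args:
          name: ${doc}
          updateMask:
            fieldPaths: ["url"]
          body:
            fields:
              url:
                stringValue: "https://example.com"
    - read_key:
        call: googleapis.firestore.v1.projects.databases.documents.get
        args:
          name: ${doc}
        result: document
    - done:
        return: ${document.fields.url.stringValue}
```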

Introducing Eventarc triggers for Workflows

We’re happy to announce that you can now create Eventarc triggers that directly target Workflows destinations. Available as a preview feature, this simplifies event-driven orchestrations by letting you route Eventarc events to Workflows without needing an intermediary service. In a previous post, we talked about how to integrate Eventarc and Workflows: since there was no direct integration between the two services, we had to deploy an intermediary Cloud Run service to receive events from Eventarc and then use the Workflows API to kick off a Workflows execution. Read More ↗︎
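From the workflow’s side, the Eventarc event simply arrives as a runtime argument; a minimal sketch of a destination workflow (the event parameter name is just a convention) looks like this.

```yaml
# Illustrative sketch: a workflow used as an Eventarc destination receives
# the event as a runtime argument; "event" is just a parameter name.
main:
  params: [event]
  steps:
    - log_event:
        call: sys.log
        args:
          text: ${"Received event " + json.encode_to_string(event)}
    - done:
        return: ${event}
```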

Creating Eventarc triggers with Terraform

Terraform is increasingly the preferred tool for building, changing, and versioning infrastructure in Google Cloud and across clouds. In an earlier post, I showed how to create Eventarc triggers using the Google Cloud Console or via the command line with gcloud. In this post, I show how to create the same triggers with the google_eventarc_trigger Terraform resource. See eventarc-samples/terraform on GitHub for the prerequisites and main.tf for the full Terraform configuration. Define a Cloud Run service as an event sink: before you can create a trigger, you need a Cloud Run service to serve as the event sink for the trigger. Read More ↗︎

Applying a path pattern when filtering in Eventarc

You can now apply a path pattern when filtering in Eventarc. This is especially useful when you need to filter on resource names beyond an exact match. The path pattern syntax allows you to define a regex-like expression that matches events as broadly as you like. Let’s take a look at a concrete example. Without path patterns: let’s say you want to listen for new file creations in a Cloud Storage bucket with an AuditLog trigger. Read More →