Implementing the saga pattern in Workflows

It’s common to have a separate database for each service in microservices-based architectures. This pattern ensures that the independently designed and deployed microservices remain independent in the data layer as well. But it also introduces a new problem: How do you implement transactions (single units of work, usually made up of multiple operations) that span multiple microservices, each with its own local database?

In a traditional monolith architecture, you can rely on ACID transactions (atomicity, consistency, isolation, durability) against a single database. In a microservices architecture, ensuring data consistency across multiple service-specific databases becomes more challenging. You cannot simply rely on local transactions. You need a cross-service transaction strategy. That’s where the saga pattern comes into play.
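As a rough illustration of the idea, a compensating step in Workflows YAML might look something like the sketch below. The service URLs and the order ID are hypothetical placeholders; the post walks through a complete saga implementation.

    main:
      steps:
        - reserveInventory:
            call: http.post
            args:
              url: https://inventory-service.example.com/reserve   # hypothetical service
              body:
                orderId: "1234"
        - chargePayment:
            try:
              call: http.post
              args:
                url: https://payment-service.example.com/charge    # hypothetical service
                body:
                  orderId: "1234"
            except:
              as: e
              steps:
                - cancelReservation:
                    # Compensating action: undo the reservation when payment fails.
                    call: http.post
                    args:
                      url: https://inventory-service.example.com/cancel
                      body:
                        orderId: "1234"
                - reraiseError:
                    raise: ${e}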

.NET 6 support in Google Cloud Buildpacks and Cloud Run

I really like the idea of containers, a reproducible context that our apps can rely on. However, I really dislike having to create a Dockerfile to define my container images. I always need to copy & paste it from somewhere and it takes me some time to get it right.

That’s why I like using Google Cloud Buildpacks to build containers without having to worry about a Dockerfile.

I’m happy to announce that, as of yesterday, Google Cloud Buildpacks now supports .NET 6 (the latest LTS version) apps as well!
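As a quick illustration, building and deploying a .NET 6 app without a Dockerfile might look roughly like this (the image, service, and project names are placeholders):

    # Build locally with the pack CLI and the Google Cloud Buildpacks builder:
    pack build my-dotnet-app --builder gcr.io/buildpacks/builder:v1

    # Or build remotely with Cloud Build and deploy the image to Cloud Run:
    gcloud builds submit --pack image=gcr.io/PROJECT_ID/my-dotnet-app
    gcloud run deploy my-dotnet-app --image gcr.io/PROJECT_ID/my-dotnet-app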

Read More →

Auto-completion for Workflows JSON and YAML on Visual Studio Code

If you’re like me, you probably use VS Code to author your Workflows JSON or YAML. You also probably expect some kind of syntax validation or auto-completion as you work on your workflow. Unfortunately, there’s no VS Code extension for Workflows, and Cloud Code for VS Code does not support Workflows.

However, there’s a way to get partial auto-completion for Workflows in VS Code.

VS Code and JSON Schema

VS Code can display auto-complete suggestions for JSON and YAML files out of the box. It relies on the JSON Schema Store, which hosts JSON Schemas for popular configuration files in JSON and YAML.
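To give a rough idea of the mechanism, here’s a sketch of how a JSON Schema can be associated with workflow files in VS Code’s settings.json. The schema URL and file patterns below are placeholders; the post covers the actual schema to use.

    {
      // Map a schema to JSON workflow definitions (built into VS Code).
      "json.schemas": [
        {
          "fileMatch": ["*.workflows.json"],
          "url": "https://example.com/workflows-schema.json"
        }
      ],
      // YAML needs the Red Hat YAML extension, which reads yaml.schemas.
      "yaml.schemas": {
        "https://example.com/workflows-schema.json": "*.workflows.yaml"
      }
    }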

Read More →

Introducing the new Eventarc UI, Cloud Run for Anthos destinations

December was a busy month for the Eventarc team, who landed a number of features at the end of the year. Let’s take a closer look at some of these new capabilities.

Cloud Storage trigger is GA

Back in September, we announced the public preview of Cloud Storage triggers as the preferred way of routing Cloud Storage events to Cloud Run targets. They are now generally available. For more details, see the documentation on how to create a Cloud Storage trigger and check out my previous blog post on Cloud Storage triggers.
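As a quick sketch, creating a Cloud Storage trigger looks roughly like this (the trigger, service, bucket, and service account names are placeholders):

    gcloud eventarc triggers create storage-trigger \
      --location=us-central1 \
      --destination-run-service=my-service \
      --event-filters="type=google.cloud.storage.object.v1.finalized" \
      --event-filters="bucket=my-bucket" \
      --service-account=PROJECT_NUMBER-compute@developer.gserviceaccount.com

Note that a Cloud Storage trigger needs to be created in the same location as the bucket it watches.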

Read More ↗︎

Cross-region and cross-project event routing with Eventarc and Pub/Sub

With event-driven architectures, it’s quite common to read events from a source in one region or project and route them to a destination in another region or another project. Let’s take a look at how you can implement cross-region and cross-project event routing in Google Cloud.

Cross-region event routing is straightforward in Google Cloud, whether you’re using Pub/Sub directly or Eventarc. Pub/Sub routes messages globally. When applications hosted in any region publish messages to a topic, subscribers from any region can pull from that topic. Eventarc enables you to route events across regions by creating a trigger in the region of the event’s source and specifying a destination in a different region. For more details, take a look at my previous blog post on Eventarc locations.
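For example, here’s a rough sketch of an Eventarc trigger created in the source’s region that sends events to a Cloud Run service deployed in a different region (all names and regions below are placeholders):

    gcloud eventarc triggers create cross-region-trigger \
      --location=us-east1 \
      --destination-run-service=my-service \
      --destination-run-region=europe-west1 \
      --event-filters="type=google.cloud.storage.object.v1.finalized" \
      --event-filters="bucket=my-bucket"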

A closer look at locations in Eventarc

New locations in Eventarc

Back in August, we announced more Eventarc locations (17 new regions, as well as 6 new dual-region and multi-region locations, to be precise). This takes the total number of locations in Eventarc to more than 30. You can see the full list on the Eventarc locations page or by running gcloud eventarc locations list.
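For instance, one of the multi-region locations comes into play when the event source itself is multi-regional, such as a multi-region Cloud Storage bucket. A sketch, with placeholder names:

    # A Cloud Storage trigger must live in the bucket's location, which can be
    # a multi-region such as "us"; the Cloud Run destination stays regional.
    gcloud eventarc triggers create multi-region-trigger \
      --location=us \
      --destination-run-service=my-service \
      --destination-run-region=us-central1 \
      --event-filters="type=google.cloud.storage.object.v1.finalized" \
      --event-filters="bucket=my-multi-region-bucket"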

What does location mean in Eventarc?

An Eventarc location usually refers to the single region that the Eventarc trigger gets created in. However, depending on the trigger type, the location can be more than a single region:

Read More ↗︎

Trying out source-based deployment in Cloud Run

Until recently, this was how you deployed code to Cloud Run:

1. Define your container-based app with a Dockerfile.
2. Build the container image and push it to the Container Registry (typically with Cloud Build).
3. Deploy the container image to Cloud Run.

Back in December, we announced the beta release of source-based deployments for Cloud Run. This combines steps 2 and 3 above into a single command. Perhaps more importantly, it also eliminates the need for a Dockerfile for supported language versions.
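For comparison, here’s a sketch of the two approaches (the service and project names are placeholders):

    # Before: build and push the image with Cloud Build, then deploy it.
    gcloud builds submit --tag gcr.io/PROJECT_ID/my-app
    gcloud run deploy my-app --image gcr.io/PROJECT_ID/my-app

    # With source-based deployments: a single command, and no Dockerfile needed
    # for supported language versions.
    gcloud run deploy my-app --source .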

Read More ↗︎

Analyzing Twitter sentiment with new Workflows processing capabilities

The Workflows team recently announced the general availability of iteration syntax and connectors!

Iteration syntax makes it easier to create and read workflows that process many items. You can use a for loop to iterate through a collection of data in a list or map, and keep track of the current index. If you have a specific range of numeric values to iterate through, you can also use range-based iteration.
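A minimal sketch of both forms (the list contents and step names here are just examples):

    main:
      steps:
        - defineList:
            assign:
              - items: ["apple", "banana", "cherry"]
        - processItems:
            for:
              value: item
              index: i
              in: ${items}
              steps:
                - logItem:
                    call: sys.log
                    args:
                      text: ${"item " + string(i) + ": " + item}
        - rangeLoop:
            for:
              value: n
              range: [1, 5]
              steps:
                - logNumber:
                    call: sys.log
                    args:
                      text: ${"number " + string(n)}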

Read More ↗︎

Introducing the new Cloud Storage trigger in Eventarc

Eventarc now supports a new Cloud Storage trigger to receive events from Cloud Storage buckets!

Wait a minute. Didn’t Eventarc already support receiving Cloud Storage events? You’re absolutely right! Eventarc has long supported Cloud Storage events via the Cloud Audit Logs trigger. However, the new Cloud Storage trigger has a number of advantages and it’s now the preferred way of receiving Cloud Storage events. Let’s take a look at the details.
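To make the difference concrete, here’s a rough before-and-after sketch (names are placeholders, and additional flags such as a service account may also be required):

    # Before: Cloud Storage events via a Cloud Audit Logs trigger
    # (requires Data Access audit logs to be enabled for Cloud Storage).
    gcloud eventarc triggers create audit-log-trigger \
      --location=us-central1 \
      --destination-run-service=my-service \
      --event-filters="type=google.cloud.audit.log.v1.written" \
      --event-filters="serviceName=storage.googleapis.com" \
      --event-filters="methodName=storage.objects.create"

    # Now: the dedicated Cloud Storage trigger, no audit logs required.
    gcloud eventarc triggers create storage-trigger \
      --location=us-central1 \
      --destination-run-service=my-service \
      --event-filters="type=google.cloud.storage.object.v1.finalized" \
      --event-filters="bucket=my-bucket"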

Read More ↗︎

Get notified when an expensive BigQuery job executes using Eventarc and SendGrid

Events supported by Eventarc

Last week, I put together a list of events supported by Eventarc in our eventarc-samples repo. Thanks to our docs team, this list is now part of our official docs under the reference section.

After looking at the full list, I started thinking about some use cases enabled by these events. I want to talk about one of those use cases today: how to get notified when an expensive BigQuery job executes.
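One way to wire this up, sketched below, is an Eventarc trigger on BigQuery audit logs that routes job-completed events to a Cloud Run service, which can then inspect the job’s billed bytes and send an email with SendGrid. The trigger and service names here are placeholders:

    gcloud eventarc triggers create bigquery-job-trigger \
      --location=us-central1 \
      --destination-run-service=bigquery-usage-notifier \
      --event-filters="type=google.cloud.audit.log.v1.written" \
      --event-filters="serviceName=bigquery.googleapis.com" \
      --event-filters="methodName=jobservice.jobcompleted"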

Read More →