Azure Service Fabric Data Packages

There seems to be very little documentation on how to use data packages in Service Fabric. This post intends to explore this feature.

The purpose of data packages

Data packages are intended to provide static data to your microservices. Service Fabric doesn’t care what format this data is in, so you might have JSON data, XML data, and/or some other type of data. 

The reason that this static data exists at the service level, and not at the application level, is that each of your microservices should be independently deployable.

You might wonder, why should I add this data as a data package instead of just putting it directly in my code? It is, after all, perfectly possible to compile static data as part of your code.

The reason is that data packages provide you with flexibility when you need to change the static data, and you lose that flexibility when you choose to compile the data with your code. 

More specifically, if you don’t use a data package, then when you need to update the static data you will need to re-deploy your microservice to all of the nodes in your cluster in order to get your changes out there. On the other hand, if you use data packages, you can version and update these packages independently without having to re-deploy your code. This allows you to do rolling updates on your data, which keeps your application available in the cluster during data updates. Your code can also watch for updates to this data and act accordingly.

Overview

As you can see in the Service Fabric application model, a Service Fabric service is composed of 3 packages:

  1. Code
  2. Configuration
  3. Data

Together, these 3 packages are bundled up to describe a Service Fabric service. Let’s take a simple example:

Here we have an application called Application1. This application contains a service called Microservice. (There is nothing special about this service; it is simply the default Web API that you get when you create a service in Visual Studio.)

When you’re getting ready to deploy a Service Fabric application to a cluster, the first thing you do is package it. Each application package contains a set of service packages for all of the services in the application.

From the example above, we have only one service. So when we package that application, we get the following:

Again, what we’re looking at is the application package. Now, when we drill down into the service package for Microservice, this is what we see:
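As a sketch, the layout looks roughly like this (folder names here assume the Visual Studio defaults, where the service package folder is named after the service with a Pkg suffix):

```
Application1/
├── ApplicationManifest.xml
└── MicroservicePkg/
    ├── ServiceManifest.xml
    ├── Code/      (the service binaries)
    └── Config/    (Settings.xml)
```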

Notice that we’ve only got our code and configuration. This is because by default, Visual Studio doesn’t include a data package for your service. You need to add it manually if you need it.

Adding a data package to your service

To add a data package, we need to add a Data folder underneath the PackageRoot folder of our service:

Note that Service Fabric doesn’t care about the format of this data. In this case I just added a JSON file.
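The resulting source layout looks something like this (the JSON file name is just a hypothetical placeholder):

```
Microservice/
└── PackageRoot/
    ├── ServiceManifest.xml
    ├── Config/
    │   └── Settings.xml
    └── Data/
        └── MyStaticData.json
```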

Once you’ve added this, you need to actually register it in your service. To do that you need to update your ServiceManifest as follows:
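The registration is a single DataPackage element in ServiceManifest.xml, alongside the existing CodePackage and ConfigPackage elements. The version shown is illustrative; the Name must match the folder you created under PackageRoot:

```xml
<!-- Inside ServiceManifest.xml, after the ConfigPackage element -->
<DataPackage Name="Data" Version="1.0.0" />
```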

Now, when I package this application again, the service package will include the data package with my JSON file.

Using your data package

To access your data package in your code, you will rely upon the ServiceContext object. You can retrieve your data package from the CodePackageActivationContext using the method GetDataPackageObject(string packageName) as shown below.
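A minimal sketch of that retrieval, assuming the data package folder is named Data and contains a hypothetical MyStaticData.json (in a Reliable Service, the ServiceContext is available as this.Context):

```csharp
using System.Fabric;
using System.IO;

// "Data" must match the DataPackage Name registered in ServiceManifest.xml.
DataPackage dataPackage = this.Context.CodePackageActivationContext
    .GetDataPackageObject("Data");

// DataPackage.Path is the local folder on the node that holds the package contents.
// "MyStaticData.json" is a hypothetical file name.
string filePath = Path.Combine(dataPackage.Path, "MyStaticData.json");
string json = File.ReadAllText(filePath);
// Deserialize the string with your JSON library of choice from here.
```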

You can see that when I debug into this code and use the Immediate Window to view my results, my JSON data is successfully parsed:

Note: here’s my JSON file.

Listening for data package changes

The Service Fabric SDK allows you to write code that can react to changes in your data packages. There are 3 events that you can listen for:

  1. Data Package Added
  2. Data Package Modified
  3. Data Package Removed

You can register handlers for these events to take whatever action you want to happen when they occur. Let’s look at the modified example.

You can see that I’m adding an event handler called CodePackageActivationContext_DataPackageModifiedEvent which will be executed when the data package is updated. This event handler has access to both the old and the new package and will allow you to perform whatever operation you need to on the data.
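A sketch of that wiring, assuming it lives inside the service class (the handler name follows the post; the reload step is left as a comment since it depends on your data):

```csharp
using System.Fabric;

// Subscribe to data package change notifications from the activation context.
this.Context.CodePackageActivationContext.DataPackageModifiedEvent +=
    CodePackageActivationContext_DataPackageModifiedEvent;

private void CodePackageActivationContext_DataPackageModifiedEvent(
    object sender, PackageModifiedEventArgs<DataPackage> e)
{
    // Both versions are available: e.OldPackage and e.NewPackage.
    string oldVersion = e.OldPackage.Description.Version;
    string newVersion = e.NewPackage.Description.Version;
    string newDataPath = e.NewPackage.Path;
    // Reload your static data from newDataPath here.
}
```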

The other 2 events can be handled in a similar manner; you just register handlers for the DataPackageAddedEvent and DataPackageRemovedEvent events instead.
