Over the last 12 months I've been spending quite a bit of time looking at Internet-of-Things (IoT) scenarios. For the most part I've been looking at how to build and deploy backend solutions that can process telemetry from sensors.
I'm a bit of a DevOps/ALM guy, so I like to automate the delivery of application components and reduce unnecessary manual intervention as much as possible. For the backend components in IoT solutions there isn't much that changes from a DevOps point of view - the skills you use to set up continuous delivery of a line-of-business application will serve you well.
What I didn't know was how challenging it might be to achieve the same kind of delivery model to low cost, occasionally connected embedded devices. As it turns out, for devices that have a decent API for pushing over the air updates, it isn't that hard at all!
Recently I purchased a Particle Photon board. The Photon is a small prototyping board which supports the Arduino-style wiring APIs and includes an embedded WiFi controller and an event-publishing API that can transmit and receive messages from a cloud-hosted backend. The WiFi controller also allows the device to have its firmware updated over-the-air (OTA).
It was this last feature that I was most interested in, because I wanted to see whether I could build firmware for the device using VSTS Build and then push it out to the device using VSTS Release Management. In the end I managed to get a simple end-to-end scenario working.
The first thing that I did was stand up a GitHub repository (hello-particle) which contains the source code for the firmware, and a few supporting scripts for compiling and flashing devices. Strictly speaking I didn't need to use GitHub for this, but since VSTS doesn't support public access I thought it would be the easiest way to share my scripts.
Using the Particle Tools
Once I had my repository set up I had to figure out how to automate the compilation of the firmware, and then the flashing of that compiled firmware OTA to the Photon. Particle (the creators of the Photon) provide a command-line tool, implemented as an NPM package, which allows you to upload the firmware source files, compile them in the cloud, and get a firmware binary back out. To install this tool you need to issue the following NPM command:
npm install -g particle-cli
The particle-cli package drops a command onto your system called particle, which can be used to configure Photon devices for the first time and perform various bits and pieces of automation. Before you can do much, though, you first need to authenticate to the Particle cloud backend. To do this you use the interactive particle login command, which prompts for a username and password.
This is where the first wrinkle is encountered. Before we can use this CLI on the build server we need to be able to authenticate non-interactively.
Authenticating Particle Commands
After some digging I learned that particle drops a configuration file (particle.config.json) in a directory (.particle) off your home/profile directory. This JSON-formatted file contains the username and access token used to authenticate subsequent particle command invocations after the initial login. All I needed to do was recreate this file on the build/release agents in order to execute commands.
Given that I was already working with NPM to pull down the particle-cli package, I used Gulp to script the rest of the automation. The setup task in Gulpfile.js is designed to work with the command-line arguments --username and --accesstoken. If these arguments aren't passed in then the script will attempt to get the values from the environment variables PARTICLE_USERNAME and PARTICLE_ACCESSTOKEN. The reason for having both approaches is that on your local development rig, using environment variables is easier than constantly typing or pasting in the access token. However, on a VSTS build agent, environment variables for secret build variables aren't populated, and since I'll be marking the access token as a secret I had to have this alternative way of feeding the values in. With VSTS Build the following command ends up being invoked under the hood:
gulp compile --username $(Particle.Username) --accesstoken $(Particle.AccessToken)
Regardless of how the username and access token are supplied, the gulp script generates a configuration file and drops it into a freshly created ~/.particle/particle.config.json so that the commands we run can authenticate against the Particle backend.
In my setup I'm using the hosted build agent to do all this work, so another thing I had to overcome was reliably finding the user's home directory. In Node.js there is a homedir() function on the os module; however, the hosted build agent has quite an old version of Node.js installed which doesn't have this function yet, so I used the os-homedir module to work around that (just in case you were wondering why I did so). Maybe in a future post I'll figure out the best way of forcibly updating Node.js on the hosted build agent!
The next step was to compile the simple firmware sketch that I had in the repository. The particle command to do this is:
particle compile photon firmware.ino
This uploads the sketch to the cloud, performs the compilation, and outputs a *.bin file. The command automatically appends a timestamp to each .bin file that it outputs, which can be annoying to deal with on a CI server (not all tasks/commands handle globbing syntax). To coerce the command into outputting a specific file name you can use the --saveTo option, so the command becomes:
particle compile photon firmware.ino --saveTo firmware.bin
One thing to note is that the --saveTo option is case sensitive, which caught me out initially. I also couldn't find the --saveTo option mentioned anywhere in the documentation; I only found it by looking through the source code for the CLI in their GitHub repository.
Once I had figured out the command I needed, I added a new gulp task called compile. This task invokes the particle command with the correct arguments to compile the firmware. I set a dependency between this task and the setup task so that whenever I wanted to do a compilation, the configuration for the commands would automatically be put in place.
One thing that you might want to consider is what operating system your build agent is running vs. what operating system your local development rig is running. I tend to alternate between Windows and OS X, so I put some logic into the compile task which detects the OS in use and adjusts the command to be invoked accordingly. Because the particle-cli module is installed locally on the build server, to invoke the particle command you have to path your way to it in the node_modules/.bin directory. On Windows the command is invoked via the particle.cmd command-shell script, whereas on OS X you can invoke the particle command directly because that operating system supports the hash-bang syntax (#!) to select the correct interpreter at invocation.
Hooking up VSTS Build
Now that I could successfully compile the firmware locally using my gulp script, I wanted to get the compile happening up on the hosted build agent. For this I created a team project called hello-particle and within it a build definition, also called hello-particle.
The first thing you need to do when you create a build definition is tell VSTS what source repository you are going to be building from. My source code was in GitHub so I first had to set up a connection to my GitHub account so that I could pick the repository that I wanted to build. If you go to the repository tab in the build definition and select GitHub as the repository type, you can then select Manage to go and create a connection between VSTS and GitHub.
Once the repository link was set up I created three tasks. The first task invokes NPM to install all the dependencies required to make the script work (particle-cli, os-homedir, and so on). The second invokes gulp, specifically the compile task, which results in the firmware.bin file being dropped into the working directory. The final task picks up the firmware.bin file along with the gulpfile.js and package.json files and uploads them into the VSTS artifact repository.
The Gulp task requires a little bit more detail: we explicitly want to pass the username and access token in to the compile task.
Finally, before I could kick off the build I needed to provide the username and access token as variables to the script so that the setup task could inject them into the Particle configuration file. This is done on the variables tab in the build definition.
Note that the $(Particle.AccessToken) variable is marked as a secret. This was the reason I needed to adapt the gulp script to support the --accesstoken command-line argument rather than relying solely on environment variables.
With all that configuration in place, the build should run quite successfully. The three files mentioned above will be used later by VSTS Release Management to take the pre-compiled firmware binary and upload it to the Particle cloud to drive an OTA update to the Photon device. In theory only the firmware.bin file is a necessary output from the build process; in practice, however, carrying the gulpfile.js and package.json files makes it easier to invoke commands in the next step.
Flashing with VSTS Release Management
VSTS Release Management is built on top of the same underlying task and agent infrastructure as VSTS Build. As a result we can easily adapt our approach for compiling the firmware into flashing it.
The particle-cli command for flashing firmware is:
particle flash [device name] firmware.bin
Where device name is either the serial number of the device, or the friendly name that you may have provided when setting up the device for the first time. I integrated the flashing process into the sample gulp script under a new task called flash. The flash task also takes a dependency on the setup task and so also supports the --username and --accesstoken arguments - these will need to be set on the Configuration tab in VSTS Release Management.
To get this working I created a new release definition called hello-particle and created an environment called In-house Lab. I then set up the release to pull artifacts from the hello-particle build definition. Once you've got those basics set up you can go and configure the variables mentioned above in much the same way as for the build definition.
Release Management actually supports pulling multiple artifacts into a single release. Artifacts are dropped into the working directory in a sub-folder that corresponds to the name of the build artifact, and that is where the three files from the previous build can be found. Within the In-house Lab environment there are three tasks: the first copies the files from the artifact drop folder into the working directory, the second does an NPM install to pull down particle-cli and the other dependencies required by the gulp script, and the third invokes the flash task on the gulp script.
Whilst I could have invoked Gulp directly in the artifact working directories, I find for simple scenarios it is easier just to relocate the files to the default file system location expected by the tasks, which avoids having to change all the working directories later on.
The Gulp task configuration for the release definition is very similar to the build definition's, except that we call the flash task instead of the compile task. Once everything is configured we should be able to execute the release, picking a recent successful build from the hello-particle build definition. Assuming that everything is configured correctly you should be able to do an end-to-end deployment from firmware change to device via VSTS.
I think the ability to streamline deployment of embedded firmware is an important capability as the IoT wave continues through society, and being able to track the rollout of firmware through environments will only become more important.
In a more complex example I would have hundreds or thousands of devices, each segmented into cohorts: starting with an internal test lab, then pilot groups, and so on. As you start deploying to more devices you would also have to anticipate failure, since some devices might not be immediately available.
In the end, I expect most projects to adopt a specialised OTA deployment tool which can be orchestrated by tools like VSTS Release Management. But for now directly driving the Particle cloud to push out firmware has been a fun experiment.