Update on May 4th, 2020: Azure DevOps now supports this feature: jobs can access output variables from previous stages. So you no longer need this tip, but you can still read this article for fun and to learn a few jq commands ;)
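For reference, the newer built-in approach looks roughly like this (stage, job, step, and variable names below are placeholders): a step marks a variable as an output, and a job in a later stage maps it in via `stageDependencies`.

```yaml
stages:
- stage: A
  jobs:
  - job: JA
    steps:
    # isOutput=true makes the variable visible outside this job.
    - bash: echo "##vso[task.setvariable variable=myVar;isOutput=true]1.2.3"
      name: setVarStep
- stage: B
  dependsOn: A
  jobs:
  - job: JB
    variables:
      # Map the output variable from stage A / job JA / step setVarStep.
      myVarFromA: $[ stageDependencies.A.JA.outputs['setVarStep.myVar'] ]
    steps:
    - bash: echo "myVarFromA is $(myVarFromA)"
```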

Since I wrote my first multi-stage YAML Azure Pipeline, I have played a few times with this new capability to have both Build and Release pipelines in one YAML file, and I really love it!

One of the limitations I have found is that it's not possible to create/write/pass variables from one Stage to another Stage, though it is possible from one Job to another Job within the same Stage.
Donovan Brown recently proposed one way to do that: Passing variables from stage to stage in Azure DevOps release. On the same track, Stephan Stranger recently wrote Passing variables from stage to stage in Azure DevOps Release Pipelines too.
Interesting, but… to be honest I was looking for something lighter and easier to set up and maintain.

After a quick chat with my colleague Chris Wiederspan, we came up with the idea of leveraging Artifacts within Azure Pipelines (not Azure Artifacts): creating a file with key/value pairs in it and then sharing it across Stages.

And here is how I was able to accomplish that with an Ubuntu agent, playing a little bit with jq:

Stage 1

- bash: |
    echo $(jq -n --arg jqVar "$(oneAdoVariable)" '{yourVariableToShare: $jqVar}') > $(build.artifactStagingDirectory)/variables.json
- publish: $(build.artifactStagingDirectory)
  artifact: yourartifact

Note: “$(oneAdoVariable)” could be replaced by either “yourvalue” or “$oneBashVariable”, depending on your use case.
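To sanity-check what Stage 1 publishes, the jq invocation can be run locally, substituting a literal value for the pipeline variable (the path and value below are just placeholders):

```shell
#!/usr/bin/env sh
# Local dry run of the Stage 1 step: /tmp/staging stands in for
# $(build.artifactStagingDirectory), and "1.2.3" for $(oneAdoVariable).
mkdir -p /tmp/staging
echo $(jq -n --arg jqVar "1.2.3" '{yourVariableToShare: $jqVar}') > /tmp/staging/variables.json
cat /tmp/staging/variables.json
# prints: { "yourVariableToShare": "1.2.3" }
```

Note that `jq -n` builds JSON from scratch, and `--arg` injects the value as a properly escaped JSON string.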

Stage 2..n

- download: current
  artifact: yourartifact
- bash: |
    valueOfYourVariableShared=$(jq .yourVariableToShare $(pipeline.workspace)/yourartifact/variables.json -r)
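The read side can be exercised locally too. This sketch recreates the artifact file, reads the value back, and shows (commented out) how it could then be promoted to a pipeline variable for later tasks via the setvariable logging command; paths and names are placeholders:

```shell
#!/usr/bin/env sh
# Recreate the artifact file locally; /tmp/yourartifact stands in for
# $(pipeline.workspace)/yourartifact.
mkdir -p /tmp/yourartifact
jq -n --arg jqVar "1.2.3" '{yourVariableToShare: $jqVar}' > /tmp/yourartifact/variables.json

# Equivalent of the Stage 2..n bash step: -r strips the JSON quotes.
valueOfYourVariableShared=$(jq -r '.yourVariableToShare' /tmp/yourartifact/variables.json)
echo "shared value: $valueOfYourVariableShared"   # prints: shared value: 1.2.3
# In a real pipeline, you could then expose it to subsequent tasks:
# echo "##vso[task.setvariable variable=yourVariableToShare]$valueOfYourVariableShared"
```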

Not that complicated, is it? I hope you can leverage this approach for your own needs.

Cheers!