Configuration Management Strategies
When building pipelines, there are many possible strategies for structuring your configuration in source control as well as in pipeline design. No single method can cover all situations. After reading this document, we hope you feel equipped to select an approach.
Single Repository for Each Foundation
This is the simplest thing that could possibly work. It's the default assumed in all our examples, unless we've articulated a specific reason to choose a different approach. It entails using a single Git repository for each foundation.
Tracking foundation changes is simple, getting started is easy, duplicating foundations is just a matter of cloning a repository, and the configuration files are easy to understand.
Let's examine an example configuration repository that uses the "Single Repository for each Foundation" pattern. A representative layout (file names are illustrative) looks like this:

```
├── auth.yml
├── director.yml
├── download-product-configs
│   ├── opsman.yml
│   └── pks.yml
├── env.yml
├── opsman.yml
├── pks.yml
└── versions.yml
```

Notice that there is only one subdirectory and that all other files are at the repository's base directory.
This minimizes parameter mapping in the platform-automation tasks.
For example, in the `configure-director` job, the config repository resource can be mapped to the task's inputs like this (a sketch; the resource name `configuration` is illustrative):

```yaml
- task: configure-director
  image: platform-automation-image
  file: platform-automation-tasks/tasks/configure-director.yml
  input_mapping:
    config: configuration
    env: configuration
```

We map the config files to the expected input named `env` of the `configure-director` task. Because the `configure-director` task's default `ENV` parameter is `env.yml`, it automatically uses the `env.yml` file in our configuration repo. We do not need to explicitly name the `ENV` parameter for the task. This also works for the other tasks and their default parameter values.
Another option for mapping resources to inputs is discussed in the Matching Resource Names and Input Names section.
For reference, here is the shape of the `configure-director` task definition (an abridged sketch; see the Platform Automation task reference for the authoritative version):

```yaml
# sketch only -- details beyond the `env` input and `ENV` parameter are illustrative
platform: linux
inputs:
- name: config   # directory containing the director configuration
- name: env      # directory containing env.yml
params:
  ENV: env.yml   # default value; this is why env.yml is picked up automatically
run:
  path: platform-automation-tasks/tasks/configure-director.sh
```
Multiple Foundations with One Repository
Multiple foundations may use a single Git configuration source but have different variables loaded from a foundation-specific vars file, Credhub, etc.
This strategy can reduce foundation drift and streamline the configuration promotion process between foundations.
This is the strategy used in our Reference Pipeline.
The design considerations for this strategy as implemented are as follows:
- Prioritization of ease of configuration promotion over minimization of configuration file duplication between foundations.
- Global, non-public variables can be overwritten by foundation-specific variables, based on the order in which vars files are passed to the tasks.
- Product configuration can differ between product versions, so the entire configuration file is promoted between foundations.
- No outside tooling or additional preparation tasks are required to use this strategy. It makes use of only concepts and workflows built-in to Platform Automation and Concourse.
- There are no significant differences between the required setup of the foundations.
This doesn't mean that this strategy cannot be used with more complicated differences. If the pipelines need to be different for one reason or another, you might want the `pipelines` directory to be at the foundation level and the `pipeline.yml` to be foundation-specific.
The Reference Pipeline handles the different environments via a `fly` variable. The pipeline set script is found in the `scripts` directory of the config repo.
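Such a script can be sketched as follows; the Concourse target name, pipeline file path, and variable name are assumptions for illustration. This version echoes the `fly` commands rather than running them, so the loop can be inspected safely (drop the `echo` to set pipelines for real):

```shell
#!/usr/bin/env bash
# Sketch of a pipeline-set script (target name and paths are illustrative).
set -eu

set_pipelines() {
  # one fly set-pipeline per foundation, parameterized by a fly variable
  for foundation in sandbox development; do
    echo fly -t control-plane set-pipeline \
      -p "${foundation}" \
      -c pipelines/pipeline.yml \
      -v "foundation=${foundation}"
  done
}

set_pipelines
```

Because every foundation shares one `pipeline.yml`, adding a foundation is a matter of adding its name to the loop.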
A simplified view of the config-repo is represented below (file names are illustrative):

```
├── download-product-pivnet
│   └── pks.yml
├── foundations
│   ├── config
│   │   └── auth.yml
│   ├── vars
│   │   └── director.yml
│   ├── sandbox
│   │   ├── config
│   │   │   ├── director.yml
│   │   │   ├── env.yml
│   │   │   ├── opsman.yml
│   │   │   └── pks.yml
│   │   └── vars
│   │       ├── pks.yml
│   │       └── versions.yml
│   └── development
│       ├── config
│       └── vars
├── pipelines
│   ├── download-products.yml
│   └── pipeline.yml
└── scripts
    └── set-pipelines.sh
```
Let's start with the top-level folders:
- `download-product-pivnet` contains config files for downloading products from Pivnet and uploading those products to a blobstore.
- `foundations` contains all of the configuration files and variable files for all foundations.
- `pipelines` contains the pipeline files for the resources pipeline and the foundation pipelines.
- `scripts` contains the Bash script for setting all of the pipelines.
Within the `foundations` folder, we have all of our foundations as well as two additional folders:
- `config` contains any global config files -- in our case, `auth.yml`. These files are used by `om`, and their structure is not foundation-dependent. As a result, each foundation pipeline fills out the parameterized variables from Concourse's credential manager.
- `vars` contains foundation-independent variables for any of the configuration files. In this example, all of the foundations are on a single IaaS, so the common vars tend to be IaaS-specific. These files can also include any other variables determined to be consistently the same across foundations.
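The global and foundation-specific vars files combine at the task level. A sketch (paths are hypothetical): with `om`-based tasks, values in later vars files generally take precedence, so listing the foundation-specific file last lets it override global defaults:

```yaml
params:
  # later files take precedence, so the foundation-specific file goes last
  VARS_FILES: |
    vars/foundations/vars/director.yml
    vars/foundations/sandbox/vars/director.yml
```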
For each foundation, we have two folders:
- `config` contains the configuration files that Platform Automation tasks use for:
    - downloading products from a blobstore
    - configuring a product (for example, `pks.yml`)
    - configuring the BOSH director
    - configuring the Ops Manager VM
- `vars` contains any foundation-specific variables used by Platform Automation tasks. These variables fill in any `((parameterized))` values in config files that are not stored in Concourse's credential manager.
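As an illustration of how the pieces fit together (the property and variable names are hypothetical), a config file references a variable that the foundation's vars file supplies:

```yaml
# foundations/sandbox/config/pks.yml (excerpt; property name hypothetical)
product-properties:
  .properties.api_hostname:
    value: ((pks_api_hostname))
```

```yaml
# foundations/sandbox/vars/pks.yml (value hypothetical)
pks_api_hostname: api.pks.sandbox.example.com
```

Anything sensitive would instead live in Concourse's credential manager and be resolved there rather than committed to the vars file.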
Config Promotion Example
In this example, we will be updating PKS from 1.3.8 to 1.4.3.
We will start by updating this tile in our `sandbox` foundation and then promote the configuration to the `development` foundation.
We assume that you are viewing this example
from the root of the Reference Pipeline Config Repo.
```diff
- product-version-regex: ^1\.3\..*$
+ product-version-regex: ^1\.4\..*$
```
Commit this change and run the resource pipeline, which will download the 1.4.3 PKS tile and make it available on S3.
Update the versions file for sandbox:
```diff
- pks-version: 1.3.8
+ pks-version: 1.4.3
```
Run the `upload-and-stage-pks` job, but do not run the `apply-product-changes` job. This makes it so that the `apply-changes` step won't automatically fail if there are configuration changes between what we currently have deployed and the new tile.
Log in to the Ops Manager UI. If the tile has unconfigured properties:

- Manually configure the tile and deploy.
- Re-export the staged config: `om -e env.yml staged-config --include-credentials -p pivotal-container-service`
- Merge the resulting config with the existing `pks.yml`. Diffing the previous `pks.yml` against the new one makes this process much easier.
- Pull out new parameterizable variables and store them in `foundations/sandbox/vars/pks.yml`, or directly in Credhub. Note that there may be nothing new to parameterize; this is okay, and makes the process go faster.
- Commit any changes.
Run the `upload-and-stage-pks` and `apply-product-changes` jobs on the `sandbox` pipeline. Once the `sandbox` pipeline is all green, copy the sandbox `pks.yml` config to its `development` counterpart, and update the `foundations/development/vars/pks.yml` files to have all of the property references that exist in their sandbox counterparts as well as the foundation-specific values. Commit these changes and run the `development` pipeline all the way through.
development Deploy Process
Since all of the legwork was done manually in the `sandbox` environment, there is no need to log in to the `development` Ops Manager environment. If there are no configuration changes, the only file that needs to be promoted is the versions vars file.
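In that happy path, the promotion can be as small as a one-line change to the `development` versions vars file (a sketch, mirroring the sandbox change above):

```diff
- pks-version: 1.3.8
+ pks-version: 1.4.3
```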
Advanced Pipeline Design
Matching Resource Names and Input Names
As an alternative to `input_mapping`, we can create resources that match the input names on our tasks.
Even if these resources map to the same git repository and branch,
they can be declared as separate inputs.
```yaml
# illustrative: two resources, same repository, named to match task inputs
resources:
- name: env
  type: git
  source:
    uri: git@github.com:example-org/config-repo.git
    branch: main
- name: config
  type: git
  source:
    uri: git@github.com:example-org/config-repo.git
    branch: main
```
As long as those resources have an associated `get` step in the job, they will automatically be mapped to the inputs of the tasks in that job:
```yaml
# illustrative job: the get names match the task's expected inputs,
# so no input_mapping is required
jobs:
- name: configure-director
  plan:
  - get: env
    trigger: true
  - get: config
    trigger: true
  - get: platform-automation-image
  - get: platform-automation-tasks
  - task: configure-director
    image: platform-automation-image
    file: platform-automation-tasks/tasks/configure-director.yml
```
If you have two resources defined with the same git repository, such as env and config, and have a passed constraint on only one of them, there is a possibility that they will not be at the same SHA for any given job in your pipeline.
```yaml
# illustrative: env has a passed constraint but config does not,
# so the two gets may resolve to different commits
- get: env
  passed: [previous-job]
- get: config
```
Modifying Resources in-place
Concourse 5+ Only
This section uses a Concourse feature that allows inputs and outputs to have the same name. This feature is only available in Concourse 5+. The following does not work with Concourse 4.
In certain circumstances, resources can be modified by one task in a job for use later in that same job. For tasks that offer this ability, such as `prepare-tasks-with-secrets`, `output_mapping` can be used to "overwrite" an input with a modified version for use by tasks later in that job.
In the following example, `prepare-tasks-with-secrets` takes in the `platform-automation-tasks` input and modifies it for the subsequent task. A more in-depth explanation of this can be found on the secrets-handling page.
```yaml
# illustrative job: prepare-tasks-with-secrets writes its output back
# over the platform-automation-tasks input (same name for input and output)
jobs:
- name: configure-director
  plan:
  - get: platform-automation-image
  - get: platform-automation-tasks
  - get: config
  - task: prepare-tasks-with-secrets
    image: platform-automation-image
    file: platform-automation-tasks/tasks/prepare-tasks-with-secrets.yml
    input_mapping:
      tasks: platform-automation-tasks
      config: config
    output_mapping:
      tasks: platform-automation-tasks  # overwrites the original input
  - task: configure-director
    image: platform-automation-image
    file: platform-automation-tasks/tasks/configure-director.yml
```