Configuration Management Strategies
When building pipelines, there are many possible strategies for structuring your configuration in source control as well as in pipeline design. No single method can cover all situations. After reading this document, we hope you feel equipped to select an approach.
Single Repository for Each Foundation
This is the simplest thing that could possibly work. It's the default assumed in all our examples, unless we've articulated a specific reason to choose a different approach. It entails using a single Git repository for each foundation.
Tracking foundation changes is simple, getting started is easy, duplicating a foundation is just a matter of cloning a repository, and the configuration files are straightforward to understand.
This is the strategy used throughout the Install Ops Man How to Guide and the Upgrading an Existing Ops Manager How to Guide.
Let's examine an example configuration repository that uses the "Single Repository for each Foundation" pattern:
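A representative layout might look something like the following; the specific product and file names are illustrative, not copied from the guides:

```
├── auth.yml
├── director.yml
├── download-opsman.yml
├── download-product.yml
├── env.yml
├── opsman.yml
├── product.yml
└── vars
    ├── director-vars.yml
    ├── opsman-vars.yml
    └── product-vars.yml
```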
Notice that there is only one subdirectory and that all other files are at the repository's base directory.
This minimizes parameter mapping in the platform-automation tasks.
For example, in the configure-director
step:
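A minimal sketch of such a step, assuming the configuration repository is tracked by a Concourse git resource named configuration:

```yaml
- task: configure-director
  image: platform-automation-image
  file: platform-automation-tasks/tasks/configure-director.yml
  input_mapping:
    config: configuration
    env: configuration
```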
We map the config files to the expected input named env of the configure-director task. Because the configure-director task's default ENV parameter is env.yml, it automatically uses the env.yml file in our configuration repo. We do not need to explicitly name the ENV parameter for the task. This also works for director.yml.
Another option for mapping resources to inputs is discussed in the Matching Resource Names and Input Names section.
For reference, here is the configure-director
task:
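An abridged sketch of its interface is shown below; consult the task file shipped in the platform-automation-tasks release you are using for the exact contents:

```yaml
---
platform: linux

inputs:
- name: platform-automation-tasks
- name: config  # directory containing the director configuration file
- name: env     # directory containing env.yml
- name: vars    # variable files referenced by VARS_FILES
  optional: true

params:
  VARS_FILES:
  ENV_FILE: env.yml
  DIRECTOR_CONFIG_FILE: director.yml

run:
  path: platform-automation-tasks/tasks/configure-director.sh
```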
Multiple Foundations with one Repository
Multiple foundations may use a single Git configuration source but have different variables loaded from a foundation-specific vars file, Credhub, etc.
This strategy can reduce foundation drift and streamline the configuration promotion process between foundations.
This is the strategy used in our Reference Pipeline.
Overview
The Reference Pipeline uses a public config repo with all secrets stored in our Concourse's Credhub.
The design considerations for this strategy as implemented are as follows:
- Prioritization of ease of configuration promotion over minimization of configuration file duplication between foundations.
- Global, non-public variables can be overwritten by foundation-specific variables based on VARS_FILES ordering (see the sketch after this list).
- Product configuration can differ between product versions, so the entire configuration file is promoted between foundations.
- No outside tooling or additional preparation tasks are required to use this strategy. It makes use of only concepts and workflows built into Platform Automation and Concourse.
- No significant differences between the required setup of foundations. This doesn't mean that this strategy cannot be used with more complicated differences. If the pipelines need to be different for one reason or another, you might want the pipelines directory to be at the foundation level and the pipeline.yml to be foundation-specific. The Reference Pipeline handles the different environments via a fly variable. The pipeline set script is found in the scripts directory.
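To make the VARS_FILES ordering concrete, a configure-product step might list the global vars file before the foundation-specific one so that the foundation's values take precedence. The resource name, file paths, and the ((foundation)) variable below are assumptions for illustration, not excerpts from the Reference Pipeline:

```yaml
- task: configure-pks
  image: platform-automation-image
  file: platform-automation-tasks/tasks/configure-product.yml
  input_mapping:
    config: configuration
    env: configuration
    vars: configuration
  params:
    CONFIG_FILE: foundations/((foundation))/config/pks.yml
    ENV_FILE: foundations/config/env.yml
    # later files win, so foundation-specific values override the global ones
    VARS_FILES: |
      vars/foundations/vars/pks.yml
      vars/foundations/((foundation))/vars/pks.yml
```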
Structure
A simplified view of the config-repo is represented below:
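The sketch below is representative; the product names and the script name are illustrative rather than copied from the repo:

```
├── download-product-pivnet
│   ├── download-opsman.yml
│   └── download-pks.yml
├── foundations
│   ├── config
│   │   ├── auth.yml
│   │   └── env.yml
│   ├── vars
│   │   └── pks.yml
│   ├── sandbox
│   │   ├── config
│   │   │   ├── director.yml
│   │   │   ├── download-pks.yml
│   │   │   ├── opsman.yml
│   │   │   └── pks.yml
│   │   └── vars
│   │       ├── director-vars.yml
│   │       ├── pks.yml
│   │       └── versions.yml
│   └── development
│       ├── config
│       │   └── (same files as sandbox/config)
│       └── vars
│           └── (same files as sandbox/vars)
├── pipelines
│   ├── download-products.yml
│   └── pipeline.yml
└── scripts
    └── update-reference-pipeline.sh
```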
Let's start with the top-level folders:
- download-product-pivnet contains config files for downloading products from pivnet and uploading those products to a blobstore.
- foundations contains all of the configuration files and variable files for all foundations.
- pipelines contains the pipeline files for the resources pipeline and the foundation pipelines.
- scripts contains the Bash script for setting all of the pipelines.
foundations
Within the foundations
folder, we have all of our foundations as well as two additional folders:
- config contains any global config files -- in our case, env.yml and auth.yml. These files are used by om and their structure is not foundation-dependent. As a result, each foundation pipeline fills out the parameterized variables from Concourse's credential manager.
- vars contains foundation-independent variables for any of the configuration files. In this example, all of the foundations are on a single IaaS, so the common vars tend to be IaaS-specific. These files can also include any other variables determined to be consistently the same across foundations.
foundations/<foundation>
For each foundation, we have two folders:
- config contains the configuration files that om uses for:
  - Downloading products from a blobstore; specified with the prefix download-
  - Configuring a product; specified by <product-name>.yml
  - Configuring the BOSH director; specified with director.yml
  - Configuring the Ops Manager VM; specified with opsman.yml
- vars contains any foundation-specific variables used by Platform Automation tasks. These variables will fill in any ((parameterized)) variables in config files that are not stored in Concourse's credential manager.
Config Promotion Example
In this example, we will be updating PKS from 1.3.8 to 1.4.3.
We will start with updating this tile in our sandbox
foundation
and then promote the configuration to the development
foundation.
We assume that you are viewing this example
from the root of the Reference Pipeline Config Repo.
1.  Update download-product-pivnet/download-pks.yml:

    ```diff
    - product-version-regex: ^1\.3\..*$
    + product-version-regex: ^1\.4\..*$
    ```

2.  Commit this change and run the resource pipeline, which will download the 1.4.3 PKS tile and make it available on S3.

3.  Update the versions file for sandbox:

    ```diff
    - pks-version: 1.3.8
    + pks-version: 1.4.3
    ```

4.  Run the upload-and-stage-pks job, but do not run the configure-pks or apply-product-changes jobs. This makes it so that the apply-changes step won't automatically fail if there are configuration changes between what we currently have deployed and the new tile.

5.  Log in to the Ops Manager UI. If the tile has unconfigured properties:

    - Manually configure the tile and deploy.
    - Re-export the staged-config:

      ```bash
      om -e env.yml staged-config --include-credentials -p pivotal-container-service
      ```

    - Merge the resulting config with the existing foundations/sandbox/config/pks.yml. Diffing the previous pks.yml and the new one makes this process much easier.
    - Pull out new parameterizable variables and store them in foundations/vars/pks.yml or foundations/sandbox/vars/pks.yml, or directly into Credhub. Note that there may be nothing new to parameterize. This is okay, and makes the process go faster.
    - Commit any changes.

6.  Run the configure-pks and apply-product-changes jobs on the sandbox pipeline.

7.  Assuming the sandbox pipeline is all green, copy the foundations/sandbox/config folder into foundations/development/config.

8.  Modify the foundations/development/vars/versions.yml and foundations/development/vars/pks.yml files to have all of the property references that exist in their sandbox counterparts as well as the foundation-specific values.

9.  Commit these changes and run the development pipeline all the way through.
A Quicker development Deploy Process
Since all of the legwork was done manually in the sandbox environment, there is no need to log in to the development Ops Manager environment. If there are no configuration changes, the only file that needs to be promoted is versions.yml.
Advanced Pipeline Design
Matching Resource Names and Input Names
As an alternative to input_mapping, we can create resources that match the input names on our tasks. Even if these resources map to the same git repository and branch, they can be declared as separate inputs.
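A sketch of such resource definitions, where both point at the same repository (the URI variable and branch are placeholders):

```yaml
resources:
- name: config
  type: git
  source:
    uri: ((config-repo-uri))  # placeholder for your configuration repository
    branch: master
- name: env
  type: git
  source:
    uri: ((config-repo-uri))  # same repository and branch, separate resource
    branch: master
```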
As long as those resources have an associated get: <resource-name>
in the job, they will automatically be mapped to the inputs of the tasks in that job:
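For example, in a job shaped like the sketch below, the get names already match the config and env inputs of configure-director, so no input_mapping is needed (resource definitions and fetch params are omitted for brevity):

```yaml
jobs:
- name: configure-director
  plan:
  - get: platform-automation-image
  - get: platform-automation-tasks
  - get: config
  - get: env
  - task: configure-director
    image: platform-automation-image
    file: platform-automation-tasks/tasks/configure-director.yml
```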
Passed Constraints
If you have two resources defined with the same git repository, such as env and config, and have a passed constraint on only one of them, there is a possibility that they will not be at the same SHA for any given job in your pipeline.
Example:
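A sketch of the situation, where previous-job stands in for any upstream job name:

```yaml
- get: config
  passed: [previous-job]  # pinned to the version that passed previous-job
- get: env                # fetched at the latest version, possibly a newer commit
```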
Modifying Resources in-place
Concourse 5+ Only
This section uses a Concourse feature that allows inputs and outputs to have the same name. This feature is only available in Concourse 5+. The following does not work with Concourse 4.
In certain circumstances, resources can be modified by one task in a job for use later in that same job. A few tasks offer this ability, including prepare-tasks-with-secrets, which is used in the example below.
For each of these tasks, output_mapping
can be used to "overwrite"
an input with a modified input for use with tasks later in that job.
In the following example, prepare-tasks-with-secrets
takes in the
platform-automation-tasks
input and modifies it for the download-product
task. A more in-depth explanation of this can be found on the secrets-handling page.
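The sketch below shows the shape of such a job; the job name, product, and params are illustrative, and the authoritative version lives in the Reference Pipeline and on the secrets-handling page:

```yaml
jobs:
- name: fetch-pks
  plan:
  - get: platform-automation-image
  - get: platform-automation-tasks
  - get: config
  - task: prepare-tasks-with-secrets
    image: platform-automation-image
    file: platform-automation-tasks/tasks/prepare-tasks-with-secrets.yml
    input_mapping:
      tasks: platform-automation-tasks
      config: config
    output_mapping:
      tasks: platform-automation-tasks  # same name as the input: later tasks see the modified copy
    params:
      CONFIG_PATHS: config
  - task: download-pks
    image: platform-automation-image
    file: platform-automation-tasks/tasks/download-product.yml
    input_mapping:
      config: config
    params:
      CONFIG_FILE: download-pks.yml
```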