Configuration Management Strategies

When building pipelines, there are many possible strategies for structuring your configuration in source control as well as in pipeline design. No single method can cover all situations. After reading this document, we hope you feel equipped to select an approach.

Single Repository for Each Foundation

This is the simplest thing that could possibly work. It's the default assumed in all our examples, unless we've articulated a specific reason to choose a different approach. It entails using a single Git repository for each foundation.

Tracking foundation changes is simple, getting started is easy, duplicating foundations is simply a matter of cloning a repository, and configuration files are not difficult to understand.

This is the strategy used throughout the Install Ops Man How to Guide and the Upgrading an Existing Ops Manager How to Guide.

Let's examine an example configuration repository that uses the "Single Repository for each Foundation" pattern:

├── auth.yml
├── pas.yml
├── director.yml
├── download-opsman.yml
├── download-product-configs
│   ├── healthwatch.yml
│   ├── opsman.yml
│   ├── pas-windows.yml
│   ├── pas.yml
│   └── telemetry.yml
├── env.yml
├── healthwatch.yml
├── opsman.yml
└── pas-windows.yml

Notice that there is only one subdirectory and that all other files are at the repository's base directory. This minimizes parameter mapping in the platform-automation tasks. For example, in the configure-director step:

- task: configure-director
  image: platform-automation-image
  file: platform-automation-tasks/tasks/configure-director.yml
  input_mapping:
    config: configuration
    env: configuration

We map the configuration resource to both the config and env inputs of the configure-director task. Because the task's ENV_FILE parameter defaults to env.yml, it automatically uses the env.yml file in our configuration repo; we do not need to set ENV_FILE explicitly. The same is true of director.yml via the DIRECTOR_CONFIG_FILE parameter.
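If a foundation's config filenames differ from those defaults, the task parameters can instead be set explicitly. A hedged sketch, assuming hypothetical filenames sandbox-env.yml and sandbox-director.yml:

```yaml
- task: configure-director
  image: platform-automation-image
  file: platform-automation-tasks/tasks/configure-director.yml
  input_mapping:
    config: configuration
    env: configuration
  params:
    # Only needed when the filenames differ from the task defaults
    # (env.yml and director.yml); these names are hypothetical.
    ENV_FILE: sandbox-env.yml
    DIRECTOR_CONFIG_FILE: sandbox-director.yml
```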

Another option for mapping resources to inputs is discussed in the Matching Resource Names and Input Names section.

For reference, here is the configure-director task:

---
platform: linux

inputs:
- name: platform-automation-tasks
- name: config # contains the director configuration file
- name: env # contains the env file with target OpsMan Information
- name: vars # variable files to be made available
  optional: true
- name: secrets
  # secret files to be made available
  # separate from vars, so they can be stored securely
  optional: true
- name: ops-files # operations files to custom configure the product
  optional: true

params:
  VARS_FILES:
  # - Optional
  # - Filepath to the Ops Manager vars yaml file
  # - The path is relative to root of the task build,
  #   so `vars` and `secrets` can be used.

  OPS_FILES:
  # - Optional
  # - Filepath to the Ops Manager operations yaml files
  # - The path is relative to root of the task build

  ENV_FILE: env.yml
  # - Required
  # - Filepath of the env config YAML
  # - The path is relative to root of the `env` input

  DIRECTOR_CONFIG_FILE: director.yml
  # - Required
  # - Filepath to the director configuration yaml file
  # - The path is relative to the root of the `config` input

run:
  path: platform-automation-tasks/tasks/configure-director.sh

Multiple Foundations with one Repository

Multiple foundations may use a single Git configuration source while loading different values from a foundation-specific vars file, Credhub, etc.

This strategy can reduce foundation drift and streamline the configuration promotion process between foundations.

This is the strategy used in our Reference Pipeline.

Overview

The Reference Pipeline uses a public config repo with all secrets stored in our Concourse's Credhub.

The design considerations for this strategy as implemented are as follows:

  • Ease of configuration promotion between foundations is prioritized over minimizing configuration file duplication.
  • Global, non-public variables can be overwritten by foundation-specific variables based on VARS_FILES ordering.
  • Product configuration can differ between product versions, so the entire configuration file is promoted between foundations.
  • No outside tooling or additional preparation tasks are required to use this strategy. It uses only concepts and workflows built into Platform Automation and Concourse.
  • No significant differences between the required setup of foundations.

    This doesn't mean that this strategy cannot be used with more complicated differences. If the pipelines need to differ for one reason or another, you might move the pipelines directory to the foundation level and make pipeline.yml foundation-specific.

    The Reference Pipeline handles the different environments via a fly variable. The pipeline set script is found in the scripts directory.

Structure

A simplified view of the config-repo is represented below:

├── download-product-pivnet
│   ├── download-opsman.yml
│   └── download-pks.yml
├── foundations
│   ├── config
│   │   ├── auth.yml
│   │   └── env.yml
│   ├── development
│   │   ├── config
│   │   │   ├── director.yml
│   │   │   ├── download-opsman.yml
│   │   │   ├── download-pks.yml
│   │   │   ├── opsman.yml
│   │   │   └── pks.yml
│   │   └── vars
│   │       ├── director.yml
│   │       ├── pks.yml
│   │       └── versions.yml
│   ├── sandbox
│   │   ├── config
│   │   │   ├── director.yml
│   │   │   ├── download-opsman.yml
│   │   │   ├── download-pks.yml
│   │   │   ├── opsman.yml
│   │   │   └── pks.yml
│   │   └── vars
│   │       ├── director.yml
│   │       ├── pks.yml
│   │       └── versions.yml
│   └── vars
│       └── director.yml
├── pipelines
│   ├── download-products.yml
│   └── pipeline.yml
└── scripts
    └── update-reference-pipeline.sh

Let's start with the top-level folders:

  • download-product-pivnet contains config files for downloading products from pivnet and uploading those products to a blobstore.
  • foundations contains all of the configuration files and variable files for all foundations.
  • pipelines contains the pipeline files for the resources pipeline and the foundation pipelines.
  • scripts contains the Bash script for setting all of the pipelines.
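The pipeline-setting script can be sketched as follows. This is an illustrative sketch, not the repository's actual update-reference-pipeline.sh; it assumes a fly target named ref and a foundation fly variable consumed by pipeline.yml:

```shell
#!/usr/bin/env bash
set -eu

# Set one pipeline per foundation, passing the foundation name
# as a fly variable consumed by pipelines/pipeline.yml.
for foundation in sandbox development; do
  fly -t ref set-pipeline \
    -p "${foundation}" \
    -c pipelines/pipeline.yml \
    -v foundation="${foundation}"
done

# Set the resources (download) pipeline.
fly -t ref set-pipeline -p download-products -c pipelines/download-products.yml
```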

foundations

Within the foundations folder, we have all of our foundations as well as two additional folders:

  • config contains any global config files -- in our case, env.yml and auth.yml. These files are used by om and their structure is not foundation-dependent. As a result, each foundation pipeline fills out the parameterized variables from Concourse's credential manager.
  • vars contains foundation-independent variables for any of the configuration files. In this example, all of the foundations are on a single IaaS, so the common vars tend to be IaaS-specific. These files can also include any other variables determined to be consistently the same across foundations.

foundations/<foundation>

For each foundation, we have two folders:

  • config contains the configuration files that om uses for:
    • Downloading products from a blobstore; specified with the prefix download-
    • Configuring a product; specified by <product-name>.yml
    • Configuring the BOSH director; specified with director.yml
    • Configuring the Ops Manager VM; specified with opsman.yml
  • vars contains any foundation-specific variables used by Platform Automation tasks. These variables will fill in any variables ((parameterized)) in config files that are not stored in Concourse's credential manager.
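To make the split concrete, here is a hedged sketch of a foundation config file fragment and the vars file that fills it in; the property and variable names are hypothetical:

```yaml
# foundations/sandbox/config/director.yml (fragment)
properties-configuration:
  director_configuration:
    ntp_servers_string: ((ntp-servers))

# foundations/sandbox/vars/director.yml
ntp-servers: time.example.com
```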

Config Promotion Example

In this example, we will be updating PKS from 1.3.8 to 1.4.3. We will start with updating this tile in our sandbox foundation and then promote the configuration to the development foundation. We assume that you are viewing this example from the root of the Reference Pipeline Config Repo.

  1. Update download-product-pivnet/download-pks.yml:

    - product-version-regex: ^1\.3\..*$
    + product-version-regex: ^1\.4\..*$
    
  2. Commit this change and run the resource pipeline which will download the 1.4.3 PKS tile and make it available on S3.

  3. Update the versions file for sandbox:

    - pks-version: 1.3.8
    + pks-version: 1.4.3
    
  4. Run the upload-and-stage-pks job, but do not run the configure-pks or apply-product-changes jobs.

    This prevents the apply-changes step from failing automatically if the configuration of the currently deployed tile differs from that of the new tile.

  5. Log in to the Ops Manager UI. If the tile has unconfigured properties:

    1. Manually configure the tile and deploy

    2. Re-export the staged-config:

      om -e env.yml staged-config --include-credentials -p pivotal-container-service
      
    3. Merge the resulting config with the existing foundations/sandbox/config/pks.yml.

      Diffing the previous pks.yml and the new one makes this process much easier.

    4. Pull out new parameterizable variables and store them in foundations/vars/pks.yml, foundations/sandbox/vars/pks.yml, or directly in Credhub. Note that there may be nothing new to parameterize; this is okay and makes the process go faster.

    5. Commit any changes.

  6. Run the configure-pks and apply-product-changes jobs on the sandbox pipeline.

  7. Assuming the sandbox pipeline is all green, copy the foundations/sandbox/config folder into foundations/development/config.

  8. Modify the foundations/development/vars/versions.yml and foundations/development/vars/pks.yml files to have all of the property references that exist in their sandbox counterparts as well as the foundation-specific values.

  9. Commit these changes and run the development pipeline all the way through.
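Step 7 above can be sketched as a shell command run from the root of the config repo. The scaffolding lines only exist so this sketch runs standalone; in a real repo the directories and files already exist:

```shell
set -eu

# Scaffolding so this sketch runs standalone; a real repo already has these.
mkdir -p foundations/sandbox/config foundations/development/config
echo "product-name: pivotal-container-service" > foundations/sandbox/config/pks.yml

# Step 7: promote the sandbox configuration to development.
cp -r foundations/sandbox/config/. foundations/development/config/
```

After the copy, foundations/development/config is an exact copy of the sandbox configuration, ready for the foundation-specific vars edits in step 8.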

A Quicker development Deploy Process

Since all of the legwork was done manually in the sandbox environment, there is no need to log in to the development Ops Manager environment.

If there are no configuration changes, the only file that needs to be promoted is versions.yml.

Advanced Pipeline Design

Matching Resource Names and Input Names

As an alternative to input_mapping, we can create resources that match the input names on our tasks. Even if these resources map to the same git repository and branch, they can be declared as separate inputs.

- name: config
  type: git
  source:
    private_key: ((repo-key.private_key))
    uri: ((repo-uri))
    branch: develop

- name: env
  type: git
  source:
    private_key: ((repo-key.private_key))
    uri: ((repo-uri))
    branch: develop

As long as those resources have an associated get: <resource-name> in the job, they will automatically be mapped to the inputs of the tasks in that job:

- name: configure-director
  serial: true
  plan:
    - aggregate:
      - get: platform-automation-image
        params:
          unpack: true
      - get: platform-automation-tasks
        params:
          unpack: true
      - get: config
        passed: [previous-job]
      - get: env
        passed: [previous-job]
    - task: configure-director
      image: platform-automation-image
      file: platform-automation-tasks/tasks/configure-director.yml

Passed Constraints

If you have two resources defined with the same git repository, such as env and config, and have a passed constraint on only one of them, there is a possibility that they will not be at the same SHA for any given job in your pipeline.

Example:

- get: config
- get: env
  passed: [previous-job]
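One way to avoid this mismatch is to apply the same passed constraint to both gets, so both resources advance through the pipeline at the same SHA:

```yaml
- get: config
  passed: [previous-job]
- get: env
  passed: [previous-job]
```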

Modifying Resources in-place

Concourse 5+ Only

This section uses a Concourse feature that allows inputs and outputs to have the same name. This feature is only available in Concourse 5+. The following does not work with Concourse 4.

In certain circumstances, resources can be modified by one task in a job for use later in that same job; prepare-tasks-with-secrets is one task that offers this ability.

For such tasks, output_mapping can be used to "overwrite" an input with a modified input for use with tasks later in that job.

In the following example, prepare-tasks-with-secrets takes in the platform-automation-tasks input and modifies it for the download-product task. A more in-depth explanation of this can be found on the secrets-handling page.

- name: configure-director
  serial: true
  plan:
    - aggregate:
      - get: platform-automation-image
        params:
          unpack: true
      - get: platform-automation-tasks
        params:
          unpack: true
      - get: config
      - get: env
    - task: prepare-tasks-with-secrets
      image: platform-automation-image
      file: platform-automation-tasks/tasks/prepare-tasks-with-secrets.yml
      input_mapping:
        tasks: platform-automation-tasks
      output_mapping:
        tasks: platform-automation-tasks
      params:
        CONFIG_PATHS: config
    - task: download-product
      image: platform-automation-image
      # The following platform-automation-tasks have been modified
      # by the prepare-tasks-with-secrets task
      file: platform-automation-tasks/tasks/download-product.yml
      params:
        CONFIG_FILE: download-ops-manager.yml