
PKS Azure

An example of using Platform Automation to deploy PKS on Microsoft Azure.

This repo contains non-foundation-specific files. Foundation-specific (or "promotable") files can be found in pks-azure-config.

Prerequisites

You must have an Azure storage account set up manually with two containers: pivnet and resources.
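The storage account and containers can be created up front with the Azure CLI. A minimal sketch, assuming the `az` CLI is installed and logged in; `my-storage-account` and `my-resource-group` are illustrative placeholders for your own names:

```sh
# Hypothetical names; substitute your own storage account and resource group
az storage account create --name my-storage-account --resource-group my-resource-group --sku Standard_LRS

# Create the two containers the pipelines expect
az storage container create --name pivnet --account-name my-storage-account
az storage container create --name resources --account-name my-storage-account
```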

Fetch Dependencies Pipeline

A pipeline that fetches product files from Pivotal Network and stores them in Azure Blob Storage for use by the other pipelines.

Required parameters

Ensure the following parameters are made available to the pipeline.

| credential | value |
| --- | --- |
| credhub-ca-cert | CA certificate of your credhub |
| credhub-client | Client configured with credhub.read and credhub.write in UAA |
| credhub-secret | Client secret of credhub-client |
| credhub-server | The URL of your credhub |
| github_private_key | Private key for your git repo(s) |
| pivnet_token | Token for Pivotal Network |
| storage_account_key | Key for your Azure storage account |
| storage_account_name | Name of your Azure storage account |

For interpolation into config files, ensure the following parameters are set in credhub on the path fetch-dependencies/<parameter>:

| credential | value |
| --- | --- |
| pivnet_token | Token for Pivotal Network |
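These values can be set with the CredHub CLI. A sketch, assuming you are already logged in via `credhub login`; the exact path prefix (for example a `/concourse/<team>/` prefix) depends on how your Concourse is wired to CredHub:

```sh
# Store the Pivotal Network token where the pipeline can interpolate it from
credhub set --type value --name /fetch-dependencies/pivnet_token --value "<your pivnet token>"
```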

Tasks

merge-files

In my experience, the credhub-interpolate job in the Platform Automation docs causes confusion because it only outputs files that have had values interpolated into them.

For example, if pks-azure-config were run through the credhub-interpolate job, the result would contain only the YAML files. None of the Terraform files would be present, as they contain nothing to interpolate. This makes it unclear which resource to use as an input to downstream tasks.

I find it much easier to pass a single directory between plan steps containing both interpolated and non-interpolated files.

This task copies all the files from the input resource to the output then moves the interpolated files on top.

tfstate-interpolate

When running Terraform from the pipeline, it doesn't make sense to define values in tfvars and then have to pass them into the pipeline tasks as well. It also interrupts automation, because some values needed by the pipeline are generated by Terraform itself.

This task is adapted from Simon O'Brien's work. It works in much the same way as credhub-interpolate, but uses texplate to interpolate values from Terraform state instead.

In order to not clash with credhub vars the placeholders used by tfstate-interpolate are different.

credhub-interpolate: ((parameter))

tfstate-interpolate: {{.parameter}}

Parameters must be defined as outputs in your Terraform configuration.
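For example, to make a Terraform-generated value available to the pipeline configs, declare it as an output. A sketch — the output name `pks_api_hostname` and the resource it reads from are illustrative, not part of this repo:

```hcl
# Hypothetical output; tfstate-interpolate reads it from the state file
output "pks_api_hostname" {
  value = "${azurerm_public_ip.pks_api.fqdn}"
}
```

A config file can then reference it as `{{.pks_api_hostname}}`, and tfstate-interpolate fills it in from the state.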

extract-tf-files

Extracts the Terraform files from terraforming-azure and adds in some custom Terraform files.
