Vault & Openshift: a secret relationship

By Hugo Lesta, EDRANS Cloud Engineer

At first sight, integrating Vault secrets with Openshift can seem like a daunting task, and you can find plenty of blog posts and documentation online about it. It may sound like a lot of trouble, but at the end of the day the implementation turns out to be easier than you might expect.

In this blog article, we would like to explain how to inject secrets from an external Vault into Openshift clusters, following a workaround we implemented for one of our customers.

[Image: Vault and Openshift solution for one of our customers.]


Openshift is a cloud development Platform as a Service (PaaS). It's open source, developed by Red Hat, and it enables developers to code and deploy their applications on cloud infrastructure.

HashiCorp Vault is a secrets management solution that provides programmatic access for both humans and machines. Secrets can be stored, dynamically generated, and, in the case of encryption, keys can be consumed as a service without the need to expose the underlying resources.

The descriptions above come from the tools' creators, and I think they are a good way to explain them.

What should you know before starting?

Getting Vault secrets into an Openshift deployment is a task performed by the Vault Agent Injector: when a deployment happens, pod creation is intercepted by the injector, which mutates the pod manifest to add the agent. It is an implementation of a Kubernetes mutating webhook controller, and it only acts when the deployment manifest includes Vault annotations.

The mutating admission webhook configures a shared memory volume where the secrets will be stored, mounted at the path `/vault/secrets/` within the pod filesystem, as we will see at the end.
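To illustrate, the mutation roughly adds something like the following to the pod spec (a simplified sketch, not the exact output of the webhook):

```yaml
# Simplified sketch of what the injector adds to a mutated pod spec.
spec:
  volumes:
    - name: vault-secrets
      emptyDir:
        medium: Memory              # shared in-memory volume
  initContainers:
    - name: vault-agent-init        # injected Vault agent
      # ...fetches secrets from Vault and renders them into the volume
      volumeMounts:
        - name: vault-secrets
          mountPath: /vault/secrets
  containers:
    - name: app
      volumeMounts:
        - name: vault-secrets
          mountPath: /vault/secrets # rendered secrets land here
```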

For more details, please refer to the official Vault Agent Injector documentation.


You can easily deploy the Vault Agent Injector using Helm; make sure you have it installed, as it's an awesome tool for managing your Kubernetes packages.

1st. Add the Vault Helm chart

$ helm repo add hashicorp https://helm.releases.hashicorp.com

2nd. Create a new project for Vault

Openshift, like any other Kubernetes based solution, can leverage the kubectl command to interact with and manage the cluster. Red Hat also creates and publishes the oc CLI which extends the capabilities of kubectl with many convenience functions that make interacting with both Kubernetes and Openshift clusters easier.

Once you have the oc binary installed on your computer, let's create a new Openshift project named "vault":

$ oc new-project vault

3rd. Install the agent

$ helm install vault hashicorp/vault --set "global.openshift=true" --set "injector.enabled=true" --set "injector.externalVaultAddr=https://${YOUR-VAULT-URL}"

Make sure to replace injector.externalVaultAddr with your own Vault address.

4th. Assign auth delegator role to the new Vault project

$ oc adm policy add-cluster-role-to-user system:auth-delegator system:serviceaccount:vault:vault-agent-injector

The auth-delegator role allows delegating authentication and authorization checks to the Kubernetes API server. By default, the Vault Agent Injector processes all namespaces except kube-system and kube-public.
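For reference, the oc adm command above is roughly equivalent to creating the following ClusterRoleBinding (a sketch; the binding name is arbitrary):

```yaml
# Roughly what `oc adm policy add-cluster-role-to-user` creates.
apiVersion: rbac.authorization.k8s.io/v1
kind: ClusterRoleBinding
metadata:
  name: vault-agent-injector-auth-delegator   # arbitrary name
roleRef:
  apiGroup: rbac.authorization.k8s.io
  kind: ClusterRole
  name: system:auth-delegator
subjects:
  - kind: ServiceAccount
    name: vault-agent-injector
    namespace: vault
```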

5th. Configure Kubernetes auth method in Vault

The first time you perform this integration, you should execute the following commands in order to set up bidirectional communication between Vault and Openshift.

$ vault login ${YOUR-VAULT-TOKEN} 
$ vault auth enable kubernetes

For the next step you will need the vault-agent-injector service account token; the service account was created by the Helm chart installed in the previous step.

6th. Save the service account token in a variable called REVIEWER_SERVICE_ACCOUNT_JWT

$ REVIEWER_SERVICE_ACCOUNT_JWT=$(oc serviceaccounts get-token vault-agent-injector -n vault)

7th. Get Vault pod name

$ POD=$(oc get pods --no-headers -o custom-columns=":metadata.name" -n vault)

8th. Export the CA certificate from the agent injector pod by redirecting it to a local file

$ oc exec $POD -n vault -- cat /var/run/secrets/kubernetes.io/serviceaccount/ca.crt > /tmp/ca.crt

9th. Export a new variable called OPENSHIFT_HOST with your Openshift API endpoint.
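For example (the endpoint below is a placeholder; use your own cluster's API URL, which you can find with `oc whoami --show-server`):

```shell
# Hypothetical endpoint; substitute your own cluster's API URL.
OPENSHIFT_HOST="https://api.example-cluster.com:6443"
export OPENSHIFT_HOST
```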


10th. Write the Kubernetes auth config into the Vault cluster

$ vault write auth/kubernetes/config token_reviewer_jwt=$REVIEWER_SERVICE_ACCOUNT_JWT kubernetes_host=$OPENSHIFT_HOST kubernetes_ca_cert=@/tmp/ca.crt


Openshift now needs to authenticate against Vault; for that, you must create a new role in Vault and attach a policy to it.

1st. Let’s create a new policy called myapp-policy in Vault.

$ vault policy write myapp-policy policy/policy-myapp.hcl

2nd. The policy file should look like the following.


path "secret*" {
  capabilities = ["read", "list"]
}
3rd. Attach the previously created policy to a new role called vault-agent-injector.

$ vault write auth/kubernetes/role/vault-agent-injector \
bound_service_account_names=vault-agent-injector \
bound_service_account_namespaces=vault \
policies=myapp-policy

4th. We can test it by doing the following.

Get the service account token again and store it in a variable called DEFAULT_ACCOUNT_TOKEN

$ DEFAULT_ACCOUNT_TOKEN=$(oc sa get-token vault-agent-injector -n vault)

5th. Once you have that service account token, execute the following command.

$ vault write auth/kubernetes/login role=vault-agent-injector jwt=$DEFAULT_ACCOUNT_TOKEN

At this point, if the login succeeds, the integration is starting to come together.

6th. Finally, you should add the following annotations to your deployment manifest.

annotations:
  vault.hashicorp.com/agent-inject: 'true'
  vault.hashicorp.com/agent-pre-populate-only: 'true'
  vault.hashicorp.com/agent-inject-secret-config: 'secret/data/${YOUR-KEY}'
  vault.hashicorp.com/agent-inject-template-config: |
    {{ with secret "secret/data/${YOUR-KEY}" -}}
    export VAULT_SECRET="{{ .Data.data.${YOUR-KEY} }}"
    {{- end }}
  vault.hashicorp.com/role: 'vault-agent-injector'

Those annotations allow you to pull secrets from Vault into Openshift. In this case, a file called config will be created in `/vault/secrets/`.
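With a template like the one above, the rendered `/vault/secrets/config` file would look something like this (the value shown is, of course, hypothetical):

```shell
# Contents of /vault/secrets/config after injection (hypothetical value)
export VAULT_SECRET="s3cr3t-value"
```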

You can use that secret as an environment variable rendered by the Vault agent inject template; to do so, override the container's args in the same deployment manifest.

args: [“sh”, “-c”, “source /vault/secrets/config && ./app”]
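Putting it all together, a minimal deployment might look like the sketch below. The app name, image, secret path, and key are placeholders; the service account must be one allowed by the Vault role created earlier.

```yaml
# Minimal sketch of a deployment consuming an injected Vault secret.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: myapp                          # placeholder name
spec:
  replicas: 1
  selector:
    matchLabels:
      app: myapp
  template:
    metadata:
      labels:
        app: myapp
      annotations:
        vault.hashicorp.com/agent-inject: 'true'
        vault.hashicorp.com/agent-inject-secret-config: 'secret/data/myapp'  # placeholder path
        vault.hashicorp.com/agent-inject-template-config: |
          {{ with secret "secret/data/myapp" -}}
          export VAULT_SECRET="{{ .Data.data.mykey }}"
          {{- end }}
        vault.hashicorp.com/role: 'vault-agent-injector'
    spec:
      serviceAccountName: vault-agent-injector  # must match the Vault role's bound service account
      containers:
        - name: app
          image: myapp:latest          # placeholder image
          args: ["sh", "-c", "source /vault/secrets/config && ./app"]
```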

Wrapping up

That's all. By following the previous steps, you should be able to get Vault secrets into Openshift without any issues.


Want to talk to us about security?

We can schedule a Cloud Security Assessment to present results to your business, generate a remediation roadmap, and work with your team to close potential security gaps efficiently.

Reach out to us!

We are an AWS Premier Consulting Partner company. Since 2009 we’ve been delivering business outcomes and we want to share our experience with you. Enjoy!