DevOps teams are smart, motivated, and will use whatever works best for the use case they have at hand. This may explain some of the success of HashiCorp Vault (which I will now refer to as just Vault). Vault has taken three key security needs—secrets management, encryption as a service, and privileged access management—to new heights via a lightweight, portable solution that doesn’t require a lot of infrastructure.
And, although other DevOps tools often have their own secrets stores (e.g. Kubernetes secrets, Ansible Vault), they are all different and introduce layers of complexity to the code base. Vault provides a standardized, streamlined solution that abstracts the secret store from applications to reduce and manage “secret sprawl.”
If you need a primer, check out this overview video from Armon Dadgar, HashiCorp’s co-founder and CTO.
DevOps teams have been quick to appreciate, for example, how easy Vault makes it to generate and store SSL/TLS certificates on demand, which has historically been a cumbersome process using OpenSSL or frameworks like CFSSL. For internal certificates, Vault's native PKI engine generates certificates from an untrusted, self-signed root certificate authority (CA). Alternatively, Vault can be configured to issue certificates from a private PKI subordinate CA (e.g. Microsoft CA) via its PKI backend.
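As a rough sketch of that internal workflow, a team might stand up Vault's built-in PKI engine and issue a short-lived certificate like this (the mount path, role name, and `example.com` domain are illustrative placeholders, not values from this article):

```shell
# Enable Vault's built-in PKI secrets engine at the conventional "pki" path
vault secrets enable pki

# Generate a self-signed internal root CA (10-year TTL shown here)
vault write pki/root/generate/internal \
    common_name="example.com" ttl=87600h

# Create a role that constrains what the engine is allowed to issue
vault write pki/roles/internal-servers \
    allowed_domains="example.com" allow_subdomains=true max_ttl=72h

# Issue an internal certificate on demand; the response contains the
# certificate, issuing CA, and private key
vault write pki/issue/internal-servers common_name="app.example.com"
```

Because issuance is a single API call, short-lived certificates can be requested inside a pipeline rather than provisioned ahead of time.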
But savvy DevOps teams still run into challenges with getting external-facing certificates that can be trusted by every browser when the code moves into production. To explain this conundrum, let’s dive into the difference between internal and external certificates and the typical process for getting them.
Internal vs. External Certificates
Most organizations leverage multiple types of CAs. For internal-facing applications, InfoSec or PKI teams generally set up internal issuing CAs and then ensure that all employee browsers trust the internal root CA to prevent browser warnings. However, for public web applications, or to secure traffic with third parties, external-facing certificates from publicly-trusted CAs (e.g. DigiCert, Entrust, GlobalSign) are used, because the certificates need to be trusted by browsers and applications that the organization may not manage.
Getting External Certificates is Challenging
So, how do DevOps teams go about deploying external-facing certificates within continuous integration/continuous delivery (CI/CD) pipelines? Generally, the answer is, they don’t. Instead, they request an external certificate outside of the pipeline by creating a ticket (and waiting). Or, they use a certificate from their public cloud provider (if the infrastructure resides there, but these certificates may not comply with policy). They may also be authorized to use a publicly-trusted CA’s REST API or web interface. Importantly, notice that Vault is nowhere to be found in any of these answers, and that’s because, until now, Vault was best suited for issuing internal certificates.
Vault Can Do More than It’s Doing as a Standalone Solution
Vault's ability to simplify and automate the issuance of internal certificates is a huge accomplishment. But there are still areas where Vault could and should be extended to further benefit DevOps teams, such as allowing them to easily request external certificates. Plus, Vault could do more to help DevOps teams comply with security policies and regulatory frameworks. Granted, Vault's ability to push logs to Splunk and other logging tools is valuable. But Vault could do more to ease the tension between InfoSec and DevOps by providing an easily auditable solution that enables DevOps teams to request policy-enforced certificates. In most situations, InfoSec teams have no real visibility into the types of certificates being issued by Vault, which causes challenges with compliance and audits.
Extending Vault’s Capabilities for the Enterprise
So, once you get past the amazing potential of DevOps, you come to realize that there are still enterprise IT security challenges that need to be solved. Plus, you need to acknowledge that DevOps has unique requirements for achieving its goal of continuous delivery. Thankfully, the Vault team had the foresight to create an integration program, which enables partners to extend its capabilities to better serve enterprises.
As a strategic partner, Venafi's newly-developed PKI backend for Vault facilitates certificate enrollment from both internal and publicly-trusted CAs, while enforcing policy and providing InfoSec reporting on issued certificates. Instead of using Vault's native PKI, it leverages Venafi's integrations with certificate authorities: it generates the private key and certificate signing request, then issues policy-enforced certificates, streamlining the process of getting a certificate from any CA through Vault. This gives InfoSec teams the ability to toggle between certificate authorities and certificate attributes without impacting DevOps teams.
And, DevOps teams just use a standard command when calling Vault to get certificates:
vault write venafi-pki/issue/tpp-backend common_name="test.example.com" alt_names="test-1.example.com,test-2.example.com"
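Before that call works, the Venafi backend has to be mounted and the `tpp-backend` role defined. A hedged sketch of that one-time setup follows; the plugin name, URL, credentials, and zone are placeholders, and the exact parameter names vary between plugin versions, so consult the plugin's documentation for your release:

```shell
# Mount the Venafi PKI backend plugin at the path used in the command above
vault secrets enable -path=venafi-pki -plugin-name=venafi-pki-backend plugin

# Define the "tpp-backend" role referenced in the issue command.
# The Trust Protection Platform URL, credentials, and policy zone below
# are illustrative placeholders, not real values.
vault write venafi-pki/roles/tpp-backend \
    tpp_url="https://tpp.example.com/vedsdk" \
    tpp_user="svc-vault" \
    tpp_password="********" \
    zone="DevOps\\Vault"
```

Once the role exists, the policy zone it points at is what lets InfoSec control which CA and certificate attributes DevOps requests resolve to.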
Venafi also interacts with Vault in a disconnected manner: instead of sitting inline in the certificate issuance process, it monitors certificate issuance activity within Vault and then pulls those certificates into the Venafi platform, giving InfoSec and PKI teams the visibility they need for compliance and audit purposes.
By extending Vault, these capabilities solve the challenges outlined above. DevOps teams get programmatic access to both internal and external certificates via Vault (using a separate PKI role which amounts to a single parameter in the command), and InfoSec and PKI teams get to seamlessly enforce certificate policies while getting visibility to issued certificates. It’s a win for everyone and showcases the power of how a well-developed technology ecosystem can better serve the needs of all stakeholders.