Let's Encrypt Certificate with Key Vault and Azure App Service Import Issues
Hi community,
I have a wildcard certificate issued by Let's Encrypt.
I want to use this certificate with an Azure App Service. Following the documented instructions, I created a password-protected .pfx file using the following command:
# Bundle the full chain and private key into a password-protected PFX
openssl pkcs12 \
  -export \
  -out ./cert.pfx \
  -in ./fullchain.pem \
  -inkey ./privkey.pem \
  -passout pass:test123
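For what it's worth, the resulting .pfx can be sanity-checked locally before uploading, using the same file names and password as above (this is just a quick verification step, not part of the official instructions):

# Print chain and key metadata to confirm the bundle and password are intact
openssl pkcs12 \
  -info \
  -noout \
  -in ./cert.pfx \
  -passin pass:test123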
When I import this .pfx file through the Azure portal using the "Bring Your Own Certificate" option, everything works smoothly: the certificate is imported successfully, and I can bind it to the custom domain.
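For completeness, I believe this portal flow corresponds to the following CLI call (the resource group and app name are placeholders, not my real ones):

az webapp config ssl upload \
  --resource-group my-rg \
  --name my-app \
  --certificate-file ./cert.pfx \
  --certificate-password test123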
However, when I import the same .pfx file into Azure Key Vault as a certificate, the import itself works fine. But when I then attempt to import that certificate from Key Vault into the App Service, I run into a permissions error.
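For reference, the Key Vault import step (the part that succeeds) looks roughly like this, with placeholder vault and certificate names:

az keyvault certificate import \
  --vault-name my-vault \
  --name wildcard-cert \
  --file ./cert.pfx \
  --password test123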
I followed the steps outlined in Microsoft's documentation and assigned the App Service's managed identity both the "Key Vault Certificate User" and "Key Vault Secrets User" roles. That resolved the permissions error, but now, when I attempt to import the certificate into the App Service, I get a generic error:
An error has occurred.
Unfortunately, no further details are provided.
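In case the role setup matters, here is a sketch of how such assignments can be made with the CLI (all names are placeholders, and the scope shown here is the vault itself):

# Managed identity of the App Service and resource ID of the vault
PRINCIPAL_ID=$(az webapp identity show --resource-group my-rg --name my-app --query principalId -o tsv)
VAULT_ID=$(az keyvault show --name my-vault --query id -o tsv)

# Grant the identity read access to certificates and secrets in the vault
az role assignment create --assignee "$PRINCIPAL_ID" --role "Key Vault Certificate User" --scope "$VAULT_ID"
az role assignment create --assignee "$PRINCIPAL_ID" --role "Key Vault Secrets User" --scope "$VAULT_ID"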
Upon checking the Key Vault, I noticed that the associated secret for the certificate was created, although it doesn't appear in the list of secrets (I assume because it's the managed secret that backs the certificate). I was able to access it by name, and its value appears to be a base64-encoded version of the .pfx file.
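A sketch of how that hidden secret can be retrieved and inspected (placeholder names; the empty OpenSSL password reflects my understanding that Key Vault stores the managed PFX without a password):

# Retrieve the secret value and decode it back into a PFX
az keyvault secret show \
  --vault-name my-vault \
  --name wildcard-cert \
  --query value -o tsv | base64 --decode > ./check.pfx

# The managed secret appears to hold the PFX with an empty password
openssl pkcs12 -info -noout -in ./check.pfx -passin pass: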
My question is: What should I do from here? How can I debug this issue? What might be causing the problem?
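One debugging idea I've had is to query the subscription activity log for a more detailed failure message, along these lines (placeholder resource group; the JMESPath query is only an illustration):

az monitor activity-log list \
  --resource-group my-rg \
  --offset 1h \
  --query "[].{operation:operationName.localizedValue, status:status.value, message:properties.statusMessage}" \
  -o table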
The reason I need the certificate in Key Vault is to automate importing it into the App Service with Terraform, and so that a future GitHub Actions workflow can handle certificate renewals.
Please let me know if you need any additional information to get a clearer picture of the issue. I haven't included specific details here because doing so would require redacting a large amount of sensitive information.
Thank you,
Lucian