hiera-eyaml Error was PKCS7[Method: 112, Reason: 115, Data: null] when using multiple public/private keys #306

Open
creativefre opened this issue Oct 28, 2020 · 2 comments


creativefre commented Oct 28, 2020

In our Puppet environment we have one global hiera.yaml in our control repo, which uses a global public/private key pair to decrypt eyaml.
We also have another module with its own hiera.yaml file that points at a different public/private key pair used to decrypt its values.

The strange thing is that these Puppet runs randomly fail with the error Error was PKCS7[Method: 112, Reason: 115, Data: null].
On the next run, Puppet seems to be able to decrypt the value again. We have about 100 servers with that role, but only approximately 5 to 10 of the runs fail; those then recover, and other runs for that role fail instead.
This issue occurred on our Puppet Enterprise 2019.8 master.

Is it possible that hiera is confused about which key it should use on random occasions?

Control hiera file:

---
version: 5
defaults:
  datadir: modules/hieradata
hierarchy:
  - name: 'Eyaml hierarchy'
    lookup_key: eyaml_lookup_key
    paths:
      - "%{::mdi_region}/hosts/%{clientcert}.eyaml"
      - "%{::mdi_region}/platforms/%{::mdi_platform}/%{::mdi_tier}.eyaml"
      - "%{::mdi_region}/platforms/%{::mdi_platform}.eyaml"
      - "%{::mdi_region}.eyaml"
      - global.eyaml
    options:
      pkcs7_private_key: /etc/puppetlabs/puppet/eyaml_keys/private_key.pkcs7.pem
      pkcs7_public_key: /etc/puppetlabs/puppet/eyaml_keys/public_key.pkcs7.pem

Separate module jbosseap7 hiera file:

---
version: 5
defaults:
  datadir: data
  data_hash: yaml_data
  
hierarchy:
  - name: 'Eyaml hierarchy jbosseap7'
    lookup_key: eyaml_lookup_key
    paths:
      - "jboss/%{::mdi_region}/hosts/%{clientcert}.eyaml"
      - "jboss/%{::mdi_region}/platforms/%{::mdi_platform}/%{::mdi_tier}.eyaml"
      - "jboss/%{::mdi_region}/platforms/%{::mdi_platform}.eyaml"
      - "jboss/%{::mdi_region}.eyaml"
      - jboss/global.eyaml
    options:
      pkcs7_private_key: /etc/puppetlabs/puppet/eyaml_keys/jboss_modules_glent/private_key.pkcs7.pem
      pkcs7_public_key: /etc/puppetlabs/puppet/eyaml_keys/jboss_modules_glent/public_key.pkcs7.pem

Kind Regards,
Frédéricq Stuer


otheus commented Feb 3, 2022

Are you saying you have one set of keys to encrypt/decrypt things "globally", but also a set of keys for each host? Are all of those keys contained in the same pkcs7 file, or do you keep the jboss-related key pairs in their own pkcs7 chain?

Does Puppet load both hiera files? If so, that explains it. Puppet is notorious for not dealing well with ordering when merging, especially when two files are in the same directory and read in with a fileglob: the order of the files is unsorted and dependent on the underlying OS, which can change arbitrarily. IMO you will need to use a different label for "eyaml_lookup_key".


redrac commented Apr 25, 2023

+1, this is still happening. To be clearer: if you are using more than one set of pkcs7 keys (each in different modules/eyaml configs), sometimes Puppet will load the incorrect key and try to decrypt with it, exactly as the reporter described. @otheus, I'm not sure what you mean by a different label for eyaml_lookup_key; isn't that the provider name?

Ultimately this seems like a multithreading issue:
https://github.com/puppetlabs/puppet/blob/main/lib/puppet/functions/eyaml_lookup_key.rb#L83-L87

But this kind of makes using multiple pkcs7 keys in an environment unusable, even though that is supported functionality.
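For illustration, here is a minimal, self-contained Ruby sketch of the kind of race being described. The names (GlobalOptions, decrypt, lookup) are hypothetical and this is not the actual eyaml_lookup_key code; it only shows how a process-global key configuration that is rewritten before each decrypt can be clobbered by a concurrent lookup that needs a different key pair:

# A minimal sketch, assuming the key options live in one shared, mutable
# store per process (hypothetical names, not the real eyaml_lookup_key code).

# Stand-in for a class-level/global options hash shared by every lookup.
class GlobalOptions
  class << self
    attr_accessor :options
  end
  @options = {}
end

# Simulated "decryption": it succeeds only if the key currently in the
# global store is the one this value was encrypted for.
def decrypt(value, expected_key)
  actual_key = GlobalOptions.options[:pkcs7_private_key]
  raise "PKCS7-style failure: tried #{actual_key}" unless actual_key == expected_key
  "plaintext for #{value}"
end

# Each hierarchy level writes its own key path into the shared store and then
# decrypts. Without a lock, another thread can overwrite the store in between.
def lookup(level_key, value)
  GlobalOptions.options = { pkcs7_private_key: level_key }
  sleep(rand * 0.001) # widen the race window for the demo
  decrypt(value, level_key)
end

mutex    = Mutex.new
failures = 0
threads = 100.times.map do |i|
  Thread.new do
    key = i.even? ? '/keys/global/private_key.pkcs7.pem' : '/keys/jboss/private_key.pkcs7.pem'
    begin
      lookup(key, "secret-#{i}")
    rescue RuntimeError
      mutex.synchronize { failures += 1 }
    end
  end
end
threads.each(&:join)
puts "#{failures} of 100 lookups decrypted with the wrong key"

Whether this is exactly what happens inside the linked function is an assumption; the point is that any shared, mutable key configuration combined with multithreaded lookups on the Puppet server would reproduce the intermittent wrong-key failures described above.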
