Rebuild of Foreman server doesn't import ALL puppet classes in modules directory

I have a weird issue that's driving me nuts. We had a Foreman server that
died. It was running RHEL 6.6 and we had DB backups.

So I've rebuilt the server, quite easily, on RHEL 7.1. Our custom puppet
manifests are in a git repo so we restored that also from git no problem.

Here is where it gets weird. After we had everything back to where it
belongs, clicking 'Import from Foreman' found most of our puppet modules
but not all…

One module in particular, 'usrs', just does not exist even though it has the
same permissions and is in the same (base)modulepath as everything else.
My puppet config is below.

[main]
logdir = /var/log/puppet
rundir = /var/run/puppet
confdir = /etc/puppet
ssldir = $vardir/ssl

privatekeydir = $ssldir/private_keys { group = service }
hostprivkey = $privatekeydir/$certname.pem { mode = 640 }
autosign       = $confdir/autosign.conf { mode = 664 }
show_diff     = false

hiera_config = $confdir/hiera.yaml
environmentpath  = $confdir/environments/puppet

[agent]
classfile = $vardir/classes.txt

localconfig = $vardir/localconfig
default_schedules = false

report            = true
pluginsync        = true
masterport        = 8140
environment       = production
certname          = foreman.local
server            = foreman.local
listen            = false
splay             = false
splaylimit        = 1800
runinterval       = 1800
noop              = false
configtimeout     = 120
usecacheonfailure = true

[master]
autosign = $confdir/autosign.conf { mode = 664 }
reports = foreman
external_nodes = /etc/puppet/node.rb
node_terminus = exec
ca = true
ssldir = /var/lib/puppet/ssl
certname = foreman.local
strict_variables = false

[root@foreman modules]# ll /etc/puppet/environments/puppet/production/modules/
total 4
lrwxrwxrwx 1 root root 17 May 7 16:39 customize -> fsurcc-customize/
lrwxrwxrwx 1 root root 11 May 7 16:39 env -> fsurcc-env/
drwxr-xr-x 6 root root 4096 May 7 16:39 foreman
drwxr-xr-x 3 root root 22 May 7 16:39 FSUHPCMonitoring
drwxr-xr-x 5 root root 50 May 7 16:39 FSUHPCServices
drwxr-xr-x 7 root root 103 May 7 16:39 fsurcc-customize
drwxr-xr-x 5 root root 84 May 7 16:39 fsurcc-env
drwxr-xr-x 5 root root 100 May 7 16:39 fsurcc-hpcfiles
drwxr-xr-x 4 root root 54 May 7 16:39 fsurcc-licenses
drwxr-xr-x 8 root root 113 May 7 16:39 fsurcc-monitoring
drwxr-xr-x 5 root root 110 May 11 14:53 fsurcc-usrs
drwxr-xr-x 5 root root 48 May 7 16:39 gregsutcliffe-ssh
lrwxrwxrwx 1 root root 16 May 7 16:39 hpcfiles -> fsurcc-hpcfiles/
lrwxrwxrwx 1 root root 16 May 7 16:39 licences -> fsurcc-licenses/
lrwxrwxrwx 1 root root 17 May 7 16:39 monit -> FSUHPCMonitoring/
lrwxrwxrwx 1 root root 18 May 7 16:39 monitoring -> fsurcc-monitoring/
lrwxrwxrwx 1 root root 15 May 7 16:39 svcs -> FSUHPCServices/
lrwxrwxrwx 1 root root 12 May 11 14:54 usrs -> fsurcc-usrs/

RHEL 7.1

[root@foreman modules]# rpm -q foreman foreman-proxy puppet
foreman-1.7.4-1.el7.noarch
foreman-proxy-1.7.4-1.el7.noarch
puppet-3.7.5-1.el7.noarch

So basically it looks like it's boiling down to puppet syntax:

If I comment out all of the lines in the module to where it's a skeleton
class, it gets found by Foreman. E.g.

class svcs::slurm
(
  $slurm_cluster = 'test',
  $svc_ensure    = running,
  $svc_enable    = true,
)
#
#
# provides the slurm service to all nodes.
# see customize:individualhosts:slurm_controler for the rest of the cofig for the controler
#
{
  package { 'slurm': ensure => installed, }
  package { 'slurm-munge': ensure => installed, }
  package { 'slurm-plugins': ensure => installed, }
#
#  service { 'slurm':
#    ensure     => $svc_ensure,
#    enable     => $svc_enable,
#    provider   => 'systemd',
#    hasstatus  => true,
#    hasrestart => true,
#    require    => [Package['slurm'],
#                   File['slurm_conf'],
#                  ],
#  }
###! add something to make sure munge is up first..
#  file { 'slurm_conf':
#    ensure  => present,
#    path    => "/etc/slurm/slurm.conf",
#    owner   => 'slurm',
#    group   => 'slurm',
#    mode    => '0644',
#    content => template("svcs/slurm/"${slurm_cluster}.slurm.conf.erb","svcs/slurm/"${slurm_cluster}.partition.list.erb"),
#    notify  => Service['slurm'],
#    require => [Package['slurm']],
#  }
#  file { 'cgroup_conf':
#    ensure  => present,
#    path    => "/etc/slurm/cgroup.conf",
#    owner   => 'slurm',
#    group   => 'slurm',
#    mode    => '0644',
#    content => template("svcs/slurm/cgroup.conf.erb"),
#    notify  => Service['slurm'],
#    require => [Package['slurm']],
#  }
#  user { "slurm":
#    ensure => present,
#    uid    => '309',
#    gid    => '309',
#    shell  => '/sbin/nologin',
#  }
#  group { "slurm":
#    ensure => present,
#    gid    => '309',
#  }
}
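A quick way to confirm it's a parse problem, independent of Foreman, is to run Puppet's own validator over the suspect manifest. `puppet parser validate` ships with the Puppet 3.x packages listed above; the exact path below is an assumption based on the modulepath shown earlier in the thread:

```shell
# Validate the suspect manifest directly with Puppet's parser; a syntax
# error here is what keeps the class out of Foreman's import list.
puppet parser validate \
  /etc/puppet/environments/puppet/production/modules/svcs/manifests/slurm.pp

# Or sweep every manifest in the environment at once:
find /etc/puppet/environments/puppet/production/modules \
  -name '*.pp' -exec puppet parser validate {} +
```

A non-zero exit status (and the parser's error message) points at the file and line Foreman is silently choking on.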

··· On Monday, May 11, 2015 at 3:03:04 PM UTC-4, Edson Manners wrote:

So basically the takeaway here is that Foreman (or the proxy?) should have
spat out a clear error message explaining why it saw the file:
'/etc/puppet/environments/svcs/manifests/slurm.pp'

with class:
svcs::slurm

but did not consider it a valid class to be imported into Foreman.
I'm still looking into this.
NB: my foreman-proxy is set to debug logging.

··· On Monday, May 11, 2015 at 3:03:04 PM UTC-4, Edson Manners wrote:

This line has some extra quotes in it:

content => template("svcs/slurm/"${slurm_cluster}.slurm.conf.erb","svcs/slurm/"${slurm_cluster}.partition.list.erb"),

Should maybe be:

content => template("svcs/slurm/${slurm_cluster}.slurm.conf.erb","svcs/slurm/${slurm_cluster}.partition.list.erb"),

··· On Tuesday, May 12, 2015 at 11:17 AM, Edson Manners wrote:

Yeah, that's an issue at the moment - there are a few bugs filed, I'm
sure, in either the smart proxy project's Puppet category or Foreman's
Puppet integration category.

We don't have a mechanism in the smart proxy API to return those as
warnings right now; it needs adding into the environment class list
response, I think, and then we can figure out how to display them.

I'd strongly recommend adding a manifest validation step in your source
control pre-commit hook or deployment process if at all possible, as an
interim solution.
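As a sketch of that interim solution: a minimal git pre-commit hook (hypothetical file, written under the assumption that the `puppet` CLI from the 3.x packages above is on PATH) could refuse any commit containing a manifest that fails `puppet parser validate`:

```shell
#!/bin/sh
# .git/hooks/pre-commit (hypothetical sketch) - reject commits whose
# staged Puppet manifests fail a syntax check.

# List staged .pp files (Added/Copied/Modified only).
files=$(git diff --cached --name-only --diff-filter=ACM | grep '\.pp$')

# No manifests staged: nothing to validate, allow the commit.
[ -z "$files" ] && exit 0

status=0
for f in $files; do
    # `puppet parser validate` exits non-zero on a syntax error
    # (such as the mismatched quotes in the template() call above).
    if ! puppet parser validate "$f"; then
        echo "pre-commit: syntax error in $f - commit rejected" >&2
        status=1
    fi
done
exit $status
```

Make it executable with `chmod +x .git/hooks/pre-commit`; the same `puppet parser validate` call can also run in the deployment script before manifests are copied into the environmentpath.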

··· On 12/05/15 16:51, Edson Manners wrote:


Dominic Cleal
Red Hat Engineering

Thanks Dominic. A pre-commit hook is definitely needed internally as this
isn't the first time that a Puppet syntax error has caused me grief.

··· On Wednesday, May 13, 2015 at 5:00:37 AM UTC-4, Dominic Cleal wrote:

Thanks. I didn't catch that as I didn't write the code and also didn't
realize that a syntax error could cause this behavior. As per Dominic's
suggestion below we'll be implementing a few things internally to prevent
this in the future.

··· On Tuesday, May 12, 2015 at 12:13:57 PM UTC-4, Fletcher, Robert wrote: