Connecting to a new host

There isn't a lot of information out there, and what there is isn't solid.
What I'm trying to find out is how to do an SSH login into a newly created
host.

The environment is a Foreman 1.9 server running under Ubuntu 14.04 LTS, in
AWS, creating EC2 hosts (using the same Ubuntu base AMI that was used to
create the Foreman host).
My understanding is that when the new host is created, Foreman injects
SSH key information that it uses to do things like log in and run
"finish template" commands. It generates its own AWS credentials
rather than using one of my existing set, which means that the key isn't
anywhere I can get at it easily.

Google results tell me that I can pull the private key from the key_pairs
table in Foreman's PostgreSQL database, and I have. Not the most
straightforward process in the world, but then again, if it were too easy
for me, it would be too easy for unauthorized personnel to obtain it as well.
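For anyone following along, the extraction can look roughly like this. This is a sketch only, run on the Foreman server itself: the key_pairs table name is as above, but the "secret" column name is an assumption, so check the schema with \d key_pairs first.

```shell
# Sketch: recover the generated private key from Foreman's database.
# "secret" as the column name is an assumption -- verify with \d key_pairs.
# Guarded so it does nothing on a box without a reachable postgres.
QUERY='SELECT secret FROM key_pairs ORDER BY created_at DESC LIMIT 1;'
if sudo -n -u postgres psql foreman -At -c "$QUERY" > foreman_ec2.pem 2>/dev/null; then
  chmod 600 foreman_ec2.pem   # ssh refuses keys with loose permissions
fi
```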

When I use this key, however, I'm prompted for a passphrase. According to
one potential solution, that's because I need to update something in the
Ruby environment, but that particular solution was given for the RPM/YUM
platform and I don't know if or how it carries over to the Ubuntu world.
According to another potential solution, the key may genuinely be
passphrase-protected, in which case I really do need a passphrase. The
problem is, I have no clue what it should be or how to find out.

Can someone please enlighten me? Or is there something I should be doing in
the host setup ("finish template" or otherwise) if I want SSH access to the
new host?

You should customise the finish script to add your own public key or
credentials to the new server instead of trying to use Foreman's private
key.

It might be something as simple as:

cat >> ~root/.ssh/authorized_keys << EOF
ssh-rsa … etc.
EOF

Or use config management (Puppet etc.) to add new user accounts and sudo
rules to the new server.
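Expanding slightly on that fragment: a finish-script version that also creates the directory and sets the permissions sshd insists on. This uses $HOME, which resolves to /root when the finish script runs as root; the key itself is a placeholder to replace with your own public key.

```shell
# Finish-script sketch: append a public key, creating the directory with
# the right permissions first. $HOME is /root when run as root.
# The key below is a placeholder -- substitute your own.
mkdir -p "$HOME/.ssh"
chmod 700 "$HOME/.ssh"
cat >> "$HOME/.ssh/authorized_keys" << 'EOF'
ssh-rsa AAAAB3NzaC1yc2EAAAADAQAB placeholder@example
EOF
chmod 600 "$HOME/.ssh/authorized_keys"
```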

··· On 22/09/15 04:00, Tim Holloway wrote:


Dominic Cleal
dominic@cleal.org

That makes sense. Would be nice to have it in the "Create new Hosts"
documentation, though.

I do have some concerns, though. The VM is managed under a key that's
accessible only to the Foreman internals, and (so far) I don't think
there's a Foreman "SSH console" plugin. So in extremis, there's no obvious
way to connect by brute force if, for example, the setup sawed off the limb
it was sitting on, or if later maintenance did so.

In practical terms, I added the SSH key to my finish script, but I cannot
log in, and as far as I can determine, the new VM doesn't have anything
listening on port 22, so I still cannot connect. I added an apt-get install
for the SSH daemon and forced sshd startup at boot (both items which
presumably were already in effect as part of VM creation). I could not set
the AWS security group (I think there's an open ticket for that problem),
but AWS claims that it's running under a default rule that opens all ports.
So I'm at a loss as to how to get root console access to the new host. Or,
for that matter, any sort of access.

··· On Tuesday, 22 September 2015 03:23:11 UTC-4, Dominic Cleal wrote:

Additional info.

I can now reach the new host's sshd after manually adding an AWS security
group rule (via the AWS console), since Foreman didn't allow doing so.

Still cannot log in, though; all I get is "Permission denied (publickey)".
And lacking a working login, I can't do a post-mortem on the process to see
whether something went wrong injecting the public key into the new host.
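One generic step when stuck at "Permission denied (publickey)" is to run the client in verbose mode and see which identities it actually offers; the host and key names here are placeholders.

```shell
# Capture verbose client output to see which keys are offered and why
# they're refused ("new-host" and mykey.pem are placeholders).
out=$(ssh -vvv -o ConnectTimeout=5 -i mykey.pem root@new-host true 2>&1 || true)
printf '%s\n' "$out" | grep -iE 'offering|denied|identity' || true
```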

BTW, I'm pretty sure that

cat >> ~root/.ssh/authorized_keys << EOF

should have read

cat >> /root/.ssh/authorized_keys << EOF

··· On Tuesday, 22 September 2015 06:45:48 UTC-4, Tim Holloway wrote:

> Additional info.
>
> I can talk to the new host ssh by manually adding an AWS security
> profile (via AWS console), since Foreman didn't allow doing so.
>
> Still cannot login, though, as all I get is "Permission denied
> (publickey)." and lacking a working login, I can't do a post-mortem on
> the process to see if something went wrong injecting the public key into
> the new host.

Yeah, I appreciate the problem. My only thoughts are:

a) Perhaps it isn't the root user that you need to use. Check the image
documentation and what Foreman is configured to use for the image; it
might be cloud-user, ec2-user, ubuntu, etc.

b) Some OSes have root logins disabled by default in sshd_config, you
may need to reconfigure that.
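The sshd_config change in (b) is a one-line edit. Shown here against a scratch copy so it's safe to try anywhere; on the real host, point the same sed at /etc/ssh/sshd_config and reload sshd afterwards. "without-password" (key-only root login) is the usual setting for OpenSSH of the 14.04 era.

```shell
# Demonstrate the PermitRootLogin edit on a scratch copy of sshd_config.
# On the real host, target /etc/ssh/sshd_config and run
# "service ssh reload" afterwards.
cfg=$(mktemp)
printf 'PermitRootLogin no\n' > "$cfg"   # stand-in for the shipped setting
sed -i 's/^#\?PermitRootLogin.*/PermitRootLogin without-password/' "$cfg"
cat "$cfg"
```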

> BTW, I'm pretty sure that
>
> cat >> ~root/.ssh/authorized_keys << EOF
>
> should have read
>
> cat >> /root/.ssh/authorized_keys << EOF

That should be equivalent.

··· On 22/09/15 14:27, Tim Holloway wrote:


Dominic Cleal
dominic@cleal.org