Issues since upgrade to 2.1.1

I think that in /var/log/foreman/production.log there’s timing information about how long pages take. Can you share which ones are so slow? A timeout of 1800 is insane.

I get this error when creating a host with hammer:

[ INFO 2020-09-03T16:14:12 API] POST /api/hosts
[DEBUG 2020-09-03T16:14:12 API] Params: {
        "location_id" => 6,
    "organization_id" => 3,
               "host" => {
                                 "name" => "test-host",
                          "location_id" => 6,
                      "organization_id" => 3,
                            "domain_id" => 5,
                            "subnet_id" => 5,
                  "compute_resource_id" => 9,
                         "hostgroup_id" => 104,
                             "image_id" => 6,
                     "provision_method" => "image",
                   "compute_profile_id" => 1,
                   "compute_attributes" => {
                         "start" => "1",
            "volumes_attributes" => {},
                      "image_id" => "dad8887a-dd2a-4db2-8b9c-f7ce8dd7e7a9"
        },
             "content_facet_attributes" => {},
        "subscription_facet_attributes" => {},
                      "puppet_proxy_id" => 13,
                   "puppet_ca_proxy_id" => 13,
                                "build" => true,
                              "managed" => true,
                              "enabled" => true,
                            "overwrite" => true,
                "interfaces_attributes" => [
            [0] {
                              "name" => "nic1",
                "compute_attributes" => {
                    "network" => "977efbe9-f9e0-4383-a19a-a7ee65ad5027"
                }
            }
        ]
    }
}
[DEBUG 2020-09-03T16:14:12 API] Headers: {}
[DEBUG 2020-09-03T16:14:12 API] Using authenticator:     HammerCLIForeman::Api::InteractiveBasicAuth
[ERROR 2020-09-03T16:15:13 API] 502 Bad Gateway
[DEBUG 2020-09-03T16:15:13 API] "<!DOCTYPE HTML PUBLIC \"-//IETF//DTD HTML 2.0//EN\">\n<html><head>\n<title>502 Proxy Error</title>\n</head><body>\n<h1>Proxy Error</h1>\n<p>The proxy server received an invalid\r\nresponse from an upstream server.<br />\r\nThe proxy server could not handle the request <em><a href=\"/api/hosts\">POST&nbsp;/api/hosts</a></em>.<p>\nReason: <strong>Error reading from remote server</strong></p></p>\n</body></html>\n"
[DEBUG 2020-09-03T16:15:13 Exception] Using exception handler HammerCLIForeman::ExceptionHandler#handle_general_exception
[ERROR 2020-09-03T16:15:13 Exception] Error: 502 Bad Gateway
Could not create the host:
  Error: 502 Bad Gateway
[ERROR 2020-09-03T16:15:13 Exception] 

RestClient::BadGateway (502 Bad Gateway):
    /opt/theforeman/tfm/root/usr/share/gems/gems/rest-client-2.0.2/lib/restclient/abstract_response.rb:223:in `exception_with_response'
    /opt/theforeman/tfm/root/usr/share/gems/gems/rest-client-2.0.2/lib/restclient/abstract_response.rb:103:in `return!'
    /opt/theforeman/tfm/root/usr/share/gems/gems/apipie-bindings-0.3.0/lib/apipie_bindings/api.rb:353:in `block in rest_client_call_block'
    /opt/theforeman/tfm/root/usr/share/gems/gems/rest-client-2.0.2/lib/restclient/request.rb:807:in `process_result'
    /opt/theforeman/tfm/root/usr/share/gems/gems/rest-client-2.0.2/lib/restclient/request.rb:725:in `block in transmit'
    /opt/rh/rh-ruby25/root/usr/share/ruby/net/http.rb:910:in `start'
    /opt/theforeman/tfm/root/usr/share/gems/gems/rest-client-2.0.2/lib/restclient/request.rb:715:in `transmit'
    /opt/theforeman/tfm/root/usr/share/gems/gems/rest-client-2.0.2/lib/restclient/request.rb:145:in `execute'
    /opt/theforeman/tfm/root/usr/share/gems/gems/rest-client-2.0.2/lib/restclient/request.rb:52:in `execute'
    /opt/theforeman/tfm/root/usr/share/gems/gems/rest-client-2.0.2/lib/restclient/resource.rb:67:in `post'
    /opt/theforeman/tfm/root/usr/share/gems/gems/apipie-bindings-0.3.0/lib/apipie_bindings/api.rb:327:in `call_client'
    /opt/theforeman/tfm/root/usr/share/gems/gems/apipie-bindings-0.3.0/lib/apipie_bindings/api.rb:240:in `http_call'
    /opt/theforeman/tfm/root/usr/share/gems/gems/apipie-bindings-0.3.0/lib/apipie_bindings/api.rb:190:in `call_action'
    /opt/theforeman/tfm/root/usr/share/gems/gems/apipie-bindings-0.3.0/lib/apipie_bindings/api.rb:185:in `call'
    /opt/theforeman/tfm/root/usr/share/gems/gems/apipie-bindings-0.3.0/lib/apipie_bindings/resource.rb:21:in `call'
    /opt/theforeman/tfm/root/usr/share/gems/gems/hammer_cli-2.1.1/lib/hammer_cli/apipie/command.rb:53:in `send_request'
    /opt/theforeman/tfm/root/usr/share/gems/gems/hammer_cli_foreman-2.1.2/lib/hammer_cli_foreman/commands.rb:188:in `send_request'
    /opt/theforeman/tfm/root/usr/share/gems/gems/hammer_cli-2.1.1/lib/hammer_cli/apipie/command.rb:34:in `execute'
    /opt/theforeman/tfm/root/usr/share/gems/gems/clamp-1.1.2/lib/clamp/command.rb:63:in `run'
    /opt/theforeman/tfm/root/usr/share/gems/gems/hammer_cli-2.1.1/lib/hammer_cli/abstract.rb:77:in `run'
    /opt/theforeman/tfm/root/usr/share/gems/gems/clamp-1.1.2/lib/clamp/subcommand/execution.rb:11:in `execute'
    /opt/theforeman/tfm/root/usr/share/gems/gems/clamp-1.1.2/lib/clamp/command.rb:63:in `run'
    /opt/theforeman/tfm/root/usr/share/gems/gems/hammer_cli-2.1.1/lib/hammer_cli/abstract.rb:77:in `run'
    /opt/theforeman/tfm/root/usr/share/gems/gems/clamp-1.1.2/lib/clamp/subcommand/execution.rb:11:in `execute'
    /opt/theforeman/tfm/root/usr/share/gems/gems/clamp-1.1.2/lib/clamp/command.rb:63:in `run'
    /opt/theforeman/tfm/root/usr/share/gems/gems/hammer_cli-2.1.1/lib/hammer_cli/abstract.rb:77:in `run'
    /opt/theforeman/tfm/root/usr/share/gems/gems/clamp-1.1.2/lib/clamp/command.rb:132:in `run'
    /opt/theforeman/tfm/root/usr/share/gems/gems/hammer_cli-2.1.1/bin/hammer:147:in `<top (required)>'
    /usr/bin/hammer:23:in `load'
    /usr/bin/hammer:23:in `<main>'

real  1m24.134s
user  0m0.711s
sys 0m0.404s

error in /var/log/httpd/foreman-ssl_error_ssl.log:

[Thu Sep 03 16:15:13.215404 2020] [proxy_http:error] [pid 27428] (70007)The timeout specified has expired: [client 10.130.0.1:35150] AH01102: error reading status line from remote server 127.0.0.1:3000
[Thu Sep 03 16:15:13.215496 2020] [proxy:error] [pid 27428] [client 10.130.0.1:35150] AH00898: Error reading from remote server returned by /api/hosts

However, I don’t see anything wrong in production.log, and the host gets created correctly.

From what I understand, this 60-second timeout comes from the global default Timeout 60 in /etc/httpd/conf/httpd.conf.
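If that global Timeout is indeed the culprit, one option might be to raise only the proxy timeout rather than the global one. A minimal sketch, assuming the Foreman SSL vhost lives at the usual path (that path is my assumption and may differ on your install):

```apache
# Untested sketch: raise only mod_proxy's read timeout instead of the global
# Timeout. Could go into the Foreman SSL vhost (assumed path, e.g.
# /etc/httpd/conf.d/05-foreman-ssl.conf), followed by an httpd reload.
# When ProxyTimeout is unset it falls back to Timeout (60 here), which
# would explain the 502 at exactly the one-minute mark.
ProxyTimeout 600
```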

But can you see the timing in production.log? How long does it take there?

This appears to be using a compute resource and that may be slow to respond. Can you share some info about which compute resource type this is?

Here is the production.log (info level):

2020-09-03T16:14:13 [I|app|4bcde6fd] Started POST "/api/hosts" for 127.0.0.1 at 2020-09-03     16:14:13 +0000
2020-09-03T16:14:13 [I|app|4bcde6fd] Processing by Api::V2::HostsController#create as JSON
2020-09-03T16:14:13 [I|app|4bcde6fd]   Parameters: {"location_id"=>6, "organization_id"=>3, "host"=>{"name"=>"test-host", "location_id"=>6, "organization_id"=>3, "domain_id"=>5, "subnet_id"=>5, "compute_resource_id"=>9, "hostgroup_id"=>104, "image_id"=>6, "provision_method"=>"image", "compute_profile_id"=>1, "compute_attributes"=>{"start"=>"1", "volumes_attributes"=>{}, "image_id"=>"dad8887a-dd2a-4db2-8b9c-f7ce8dd7e7a9"}, "content_facet_attributes"=>{}, "subscription_facet_attributes"=>{}, "puppet_proxy_id"=>13, "puppet_ca_proxy_id"=>13, "build"=>true, "managed"=>true, "enabled"=>true, "overwrite"=>true, "interfaces_attributes"=>[{"name"=>"nic1", "compute_attributes"=>{"network"=>"977efbe9-f9e0-4383-a19a-a7ee65ad5027"}}]}, "apiv"=>"v2"}
2020-09-03T16:14:13 [I|app|4bcde6fd] Authorized user admin(Admin User)
2020-09-03T16:14:15 [I|app|4bcde6fd] Adding Compute instance for test-host.com
2020-09-03T16:16:22 [I|app|4bcde6fd] Create DHCP reservation test-host.com for test-host.com-56:6f:c4:69:00:84/10.32.211.190
2020-09-03T16:16:23 [I|app|4bcde6fd] Add DNS A record for test-host.com/10.32.211.190
2020-09-03T16:16:23 [I|app|4bcde6fd] Add DNS PTR record for 10.32.211.190/test-host.com
2020-09-03T16:16:24 [I|app|4bcde6fd] Powering up Compute instance for test-host.com
2020-09-03T16:16:25 [I|app|4bcde6fd] Processed 7 tasks from queue 'Host::Managed Main', completed 7/7
2020-09-03T16:16:25 [I|aud|4bcde6fd] Nic::Managed (170) create event on mac 56:6f:c4:69:00:84
2020-09-03T16:16:25 [I|aud|4bcde6fd] Nic::Managed (170) create event on ip 10.32.211.190
2020-09-03T16:16:25 [I|aud|4bcde6fd] Nic::Managed (170) create event on type Nic::Managed
2020-09-03T16:16:25 [I|aud|4bcde6fd] Nic::Managed (170) create event on name test-host.com
2020-09-03T16:16:25 [I|aud|4bcde6fd] Nic::Managed (170) create event on host_id 170
2020-09-03T16:16:25 [I|aud|4bcde6fd] Nic::Managed (170) create event on subnet_id 5
2020-09-03T16:16:25 [I|aud|4bcde6fd] Nic::Managed (170) create event on domain_id 5
2020-09-03T16:16:25 [I|aud|4bcde6fd] Nic::Managed (170) create event on attrs {}
2020-09-03T16:16:25 [I|aud|4bcde6fd] Nic::Managed (170) create event on provider 
2020-09-03T16:16:25 [I|aud|4bcde6fd] Nic::Managed (170) create event on username 
2020-09-03T16:16:25 [I|aud|4bcde6fd] Nic::Managed (170) create event on password [redacted]
2020-09-03T16:16:25 [I|aud|4bcde6fd] Nic::Managed (170) create event on virtual false
2020-09-03T16:16:25 [I|aud|4bcde6fd] Nic::Managed (170) create event on link true
2020-09-03T16:16:25 [I|aud|4bcde6fd] Nic::Managed (170) create event on identifier 
2020-09-03T16:16:25 [I|aud|4bcde6fd] Nic::Managed (170) create event on tag 
2020-09-03T16:16:25 [I|aud|4bcde6fd] Nic::Managed (170) create event on attached_to 
2020-09-03T16:16:25 [I|aud|4bcde6fd] Nic::Managed (170) create event on managed true
2020-09-03T16:16:25 [I|aud|4bcde6fd] Nic::Managed (170) create event on mode balance-rr
2020-09-03T16:16:25 [I|aud|4bcde6fd] Nic::Managed (170) create event on attached_devices 
2020-09-03T16:16:25 [I|aud|4bcde6fd] Nic::Managed (170) create event on bond_options 
2020-09-03T16:16:25 [I|aud|4bcde6fd] Nic::Managed (170) create event on primary true
2020-09-03T16:16:25 [I|aud|4bcde6fd] Nic::Managed (170) create event on provision true
2020-09-03T16:16:25 [I|aud|4bcde6fd] Nic::Managed (170) create event on compute_attributes {"name"=>"nic1", "network"=>"977efbe9-f9e0-4383-a19a-a7ee65ad5027", "interface"=>"virtio", "from_profile"=>"1-Small"}
2020-09-03T16:16:25 [I|aud|4bcde6fd] Nic::Managed (170) create event on ip6 
2020-09-03T16:16:25 [I|aud|4bcde6fd] Nic::Managed (170) create event on subnet6_id 
2020-09-03T16:16:25 [I|aud|4bcde6fd] Nic::Managed (170) create event on execution true
2020-09-03T16:16:25 [I|aud|4bcde6fd] Host::Base (170) create event on name test-host.com
2020-09-03T16:16:25 [I|aud|4bcde6fd] Host::Base (170) create event on root_pass $5$REDACTED$REDACTED
2020-09-03T16:16:25 [I|aud|4bcde6fd] Host::Base (170) create event on architecture_id 1
2020-09-03T16:16:25 [I|aud|4bcde6fd] Host::Base (170) create event on operatingsystem_id 8
2020-09-03T16:16:25 [I|aud|4bcde6fd] Host::Base (170) create event on environment_id 1
2020-09-03T16:16:25 [I|aud|4bcde6fd] Host::Base (170) create event on ptable_id 
2020-09-03T16:16:25 [I|aud|4bcde6fd] Host::Base (170) create event on medium_id 
2020-09-03T16:16:25 [I|aud|4bcde6fd] Host::Base (170) create event on build true
2020-09-03T16:16:25 [I|aud|4bcde6fd] Host::Base (170) create event on comment 
2020-09-03T16:16:25 [I|aud|4bcde6fd] Host::Base (170) create event on disk 
2020-09-03T16:16:25 [I|aud|4bcde6fd] Host::Base (170) create event on installed_at 
2020-09-03T16:16:25 [I|aud|4bcde6fd] Host::Base (170) create event on model_id 
2020-09-03T16:16:25 [I|aud|4bcde6fd] Host::Base (170) create event on hostgroup_id 104
2020-09-03T16:16:25 [I|aud|4bcde6fd] Host::Base (170) create event on owner_id 3
2020-09-03T16:16:25 [I|aud|4bcde6fd] Host::Base (170) create event on owner_type User
2020-09-03T16:16:25 [I|aud|4bcde6fd] Host::Base (170) create event on enabled true
2020-09-03T16:16:25 [I|aud|4bcde6fd] Host::Base (170) create event on puppet_ca_proxy_id 13
2020-09-03T16:16:25 [I|aud|4bcde6fd] Host::Base (170) create event on managed true
2020-09-03T16:16:25 [I|aud|4bcde6fd] Host::Base (170) create event on use_image 
2020-09-03T16:16:25 [I|aud|4bcde6fd] Host::Base (170) create event on image_file 
2020-09-03T16:16:25 [I|aud|4bcde6fd] Host::Base (170) create event on uuid 284b2166-a775-4c92-97cb-138ed3c7187c
2020-09-03T16:16:25 [I|aud|4bcde6fd] Host::Base (170) create event on compute_resource_id 9
2020-09-03T16:16:25 [I|aud|4bcde6fd] Host::Base (170) create event on puppet_proxy_id 13
2020-09-03T16:16:25 [I|aud|4bcde6fd] Host::Base (170) create event on certname 
2020-09-03T16:16:25 [I|aud|4bcde6fd] Host::Base (170) create event on image_id 6
2020-09-03T16:16:25 [I|aud|4bcde6fd] Host::Base (170) create event on organization_id 3
2020-09-03T16:16:25 [I|aud|4bcde6fd] Host::Base (170) create event on location_id 6
2020-09-03T16:16:25 [I|aud|4bcde6fd] Host::Base (170) create event on otp 
2020-09-03T16:16:25 [I|aud|4bcde6fd] Host::Base (170) create event on realm_id 
2020-09-03T16:16:25 [I|aud|4bcde6fd] Host::Base (170) create event on compute_profile_id 1
2020-09-03T16:16:25 [I|aud|4bcde6fd] Host::Base (170) create event on provision_method image
2020-09-03T16:16:25 [I|aud|4bcde6fd] Host::Base (170) create event on grub_pass $5$REDACTED$REDACTED
2020-09-03T16:16:25 [I|aud|4bcde6fd] Host::Base (170) create event on pxe_loader 
2020-09-03T16:16:25 [I|aud|4bcde6fd] Host::Base (170) create event on initiated_at 
2020-09-03T16:16:25 [I|aud|4bcde6fd] Host::Base (170) create event on build_errors 
2020-09-03T16:16:25 [I|aud|4bcde6fd] Host::Base (170) create event on openscap_proxy_id 
2020-09-03T16:16:25 [I|aud|4bcde6fd] Host::Base (170) create event on content_facet_attributes {"id"=>nil, "host_id"=>nil, "uuid"=>nil, "content_view_id"=>2, "lifecycle_environment_id"=>2, "kickstart_repository_id"=>nil, "content_source_id"=>1, "installable_security_errata_count"=>0, "installable_enhancement_errata_count"=>0, "installable_bugfix_errata_count"=>0, "applicable_rpm_count"=>0, "upgradable_rpm_count"=>0, "applicable_module_stream_count"=>0, "upgradable_module_stream_count"=>0}
2020-09-03T16:16:25 [I|aud|4bcde6fd] Host::Base (167) create event on host_id 170
2020-09-03T16:16:25 [I|aud|4bcde6fd] Host::Base (167) create event on uuid 
2020-09-03T16:16:25 [I|aud|4bcde6fd] Host::Base (167) create event on content_view_id 2
2020-09-03T16:16:25 [I|aud|4bcde6fd] Host::Base (167) create event on lifecycle_environment_id 2
2020-09-03T16:16:25 [I|aud|4bcde6fd] Host::Base (167) create event on kickstart_repository_id 
2020-09-03T16:16:25 [I|aud|4bcde6fd] Host::Base (167) create event on content_source_id 1
2020-09-03T16:16:25 [I|aud|4bcde6fd] Host::Base (167) create event on installable_security_errata_count 0
2020-09-03T16:16:25 [I|aud|4bcde6fd] Host::Base (167) create event on installable_enhancement_errata_count 0
2020-09-03T16:16:25 [I|aud|4bcde6fd] Host::Base (167) create event on installable_bugfix_errata_count 0
2020-09-03T16:16:25 [I|aud|4bcde6fd] Host::Base (167) create event on applicable_rpm_count 0
2020-09-03T16:16:25 [I|aud|4bcde6fd] Host::Base (167) create event on upgradable_rpm_count 0
2020-09-03T16:16:25 [I|aud|4bcde6fd] Host::Base (167) create event on applicable_module_stream_count 0
2020-09-03T16:16:25 [I|aud|4bcde6fd] Host::Base (167) create event on upgradable_module_stream_count 0
2020-09-03T16:16:25 [I|app|4bcde6fd] Remove puppet certificate for test-host.com
2020-09-03T16:16:25 [I|app|4bcde6fd] Adding autosign entry for test-host.com
2020-09-03T16:16:25 [I|app|4bcde6fd] About to start post launch script on test-host.com
2020-09-03T16:16:25 [I|app|4bcde6fd] generating template to upload to test-host.com
2020-09-03T16:16:25 [W|app|4bcde6fd] Could not find a provider for test-host.com. Providers returned {"Katello::ManagedContentMediumProvider"=>["Kickstart repository was not set for host 'test-host.com'"], "MediumProviders::Default"=>["CentOS 7.8.2003 medium was not set for host 'test-host.com'", "Invalid medium '' for 'CentOS 7.8.2003'"]}
2020-09-03T16:16:25 [W|app|4bcde6fd] Could not find a provider for test-host.com. Providers returned {"Katello::ManagedContentMediumProvider"=>["Kickstart repository was not set for host 'test-host.com'"], "MediumProviders::Default"=>["CentOS 7.8.2003 medium was not set for host 'test-host.com'", "Invalid medium '' for 'CentOS 7.8.2003'"]}
2020-09-03T16:16:25 [W|app|4bcde6fd] Could not find a provider for test-host.com. Providers returned {"Katello::ManagedContentMediumProvider"=>["Kickstart repository was not set for host 'test-host.com'"], "MediumProviders::Default"=>["CentOS 7.8.2003 medium was not set for host 'test-host.com'", "Invalid medium '' for 'CentOS 7.8.2003'"]}
2020-09-03T16:16:25 [W|app|4bcde6fd] Could not find a provider for test-host.com. Providers returned {"Katello::ManagedContentMediumProvider"=>["Kickstart repository was not set for host 'test-host.com'"], "MediumProviders::Default"=>["CentOS 7.8.2003 medium was not set for host 'test-host.com'", "Invalid medium '' for 'CentOS 7.8.2003'"]}
2020-09-03T16:16:25 [W|app|4bcde6fd] Could not find a provider for test-host.com. Providers returned {"Katello::ManagedContentMediumProvider"=>["Kickstart repository was not set for host 'test-host.com'"], "MediumProviders::Default"=>["CentOS 7.8.2003 medium was not set for host 'test-host.com'", "Invalid medium '' for 'CentOS 7.8.2003'"]}
2020-09-03T16:16:25 [W|app|4bcde6fd] Could not find a provider for test-host.com. Providers returned {"Katello::ManagedContentMediumProvider"=>["Kickstart repository was not set for host 'test-host.com'"], "MediumProviders::Default"=>["CentOS 7.8.2003 medium was not set for host 'test-host.com'", "Invalid medium '' for 'CentOS 7.8.2003'"]}
2020-09-03T16:16:25 [W|app|4bcde6fd] Could not find a provider for test-host.com. Providers returned {"Katello::ManagedContentMediumProvider"=>["Kickstart repository was not set for host 'test-host.com'"], "MediumProviders::Default"=>["CentOS 7.8.2003 medium was not set for host 'test-host.com'", "Invalid medium '' for 'CentOS 7.8.2003'"]}
2020-09-03T16:16:25 [W|app|4bcde6fd] Could not find a provider for test-host.com. Providers returned {"Katello::ManagedContentMediumProvider"=>["Kickstart repository was not set for host 'test-host.com'"], "MediumProviders::Default"=>["CentOS 7.8.2003 medium was not set for host 'test-host.com'", "Invalid medium '' for 'CentOS 7.8.2003'"]}
2020-09-03T16:16:25 [W|app|4bcde6fd] Could not find a provider for test-host.com. Providers returned {"Katello::ManagedContentMediumProvider"=>["Kickstart repository was not set for host 'test-host.com'"], "MediumProviders::Default"=>["CentOS 7.8.2003 medium was not set for host 'test-host.com'", "Invalid medium '' for 'CentOS 7.8.2003'"]}
2020-09-03T16:16:25 [W|app|4bcde6fd] Could not find a provider for test-host.com. Providers returned {"Katello::ManagedContentMediumProvider"=>["Kickstart repository was not set for host 'test-host.com'"], "MediumProviders::Default"=>["CentOS 7.8.2003 medium was not set for host 'test-host.com'", "Invalid medium '' for 'CentOS 7.8.2003'"]}
2020-09-03T16:16:25 [I|app|4bcde6fd] Starting SSH provisioning script - waiting for 10.32.211.190 to respond
2020-09-03T16:16:42 [I|app|4bcde6fd] negotiating protocol version
2020-09-03T16:16:42 [I|app|4bcde6fd] sending KEXINIT
2020-09-03T16:16:42 [I|app|4bcde6fd] got KEXINIT from server
2020-09-03T16:16:42 [I|app|4bcde6fd] negotiating algorithms
2020-09-03T16:16:42 [I|app|4bcde6fd] global request received: hostkeys-00@openssh.com false
2020-09-03T16:16:42 [I|app|4bcde6fd] channel_open_confirmation: 0 0 0 32768
2020-09-03T16:16:42 [I|app|4bcde6fd] sending channel request "pty-req"
2020-09-03T16:16:42 [I|app|4bcde6fd] sending channel request "exec"
2020-09-03T16:16:42 [I|app|4bcde6fd] channel_window_adjust: 0 +2097152
2020-09-03T16:16:42 [I|app|4bcde6fd] channel_success: 0
2020-09-03T16:16:42 [I|app|4bcde6fd] channel_request: 0 exit-status false
2020-09-03T16:16:42 [I|app|4bcde6fd] channel_data: 0 7b
2020-09-03T16:16:42 [I|app|4bcde6fd] channel_eof: 0
2020-09-03T16:16:42 [I|app|4bcde6fd] channel_close: 0
2020-09-03T16:16:42 [I|app|4bcde6fd] 10.32.211.190 delete channel 0 which closed locally and remotely
2020-09-03T16:16:42 [I|app|4bcde6fd] closing remaining channels (0 open)
2020-09-03T16:16:43 [I|app|4bcde6fd] SSH connection established to 10.32.211.190 - executing template
2020-09-03T16:16:43 [I|app|4bcde6fd] negotiating protocol version
2020-09-03T16:16:43 [I|app|4bcde6fd] sending KEXINIT
2020-09-03T16:16:43 [I|app|4bcde6fd] got KEXINIT from server
2020-09-03T16:16:43 [I|app|4bcde6fd] negotiating algorithms
2020-09-03T16:16:43 [I|app|4bcde6fd] global request received: hostkeys-00@openssh.com false
2020-09-03T16:16:43 [I|app|4bcde6fd] channel_open_confirmation: 0 0 0 32768
2020-09-03T16:16:43 [I|app|4bcde6fd] sending channel request "exec"
2020-09-03T16:16:43 [I|app|4bcde6fd] channel_window_adjust: 0 +2097152
2020-09-03T16:16:43 [I|app|4bcde6fd] channel_success: 0
2020-09-03T16:16:43 [I|app|4bcde6fd] channel_data: 0 1b
2020-09-03T16:16:43 [I|app|4bcde6fd] channel_data: 0 1b
2020-09-03T16:16:43 [I|app|4bcde6fd] channel_data: 0 1b
2020-09-03T16:16:43 [I|app|4bcde6fd] channel_request: 0 exit-status false
2020-09-03T16:16:43 [I|app|4bcde6fd] channel_eof: 0
2020-09-03T16:16:43 [I|app|4bcde6fd] channel_close: 0
2020-09-03T16:16:43 [I|app|4bcde6fd] 10.32.211.190 delete channel 0 which closed locally and remotely
2020-09-03T16:16:43 [I|app|4bcde6fd] closing remaining channels (0 open)
2020-09-03T16:16:43 [I|app|4bcde6fd] negotiating protocol version
2020-09-03T16:16:43 [I|app|4bcde6fd] sending KEXINIT
2020-09-03T16:16:43 [I|app|4bcde6fd] got KEXINIT from server
2020-09-03T16:16:43 [I|app|4bcde6fd] negotiating algorithms
2020-09-03T16:16:44 [I|app|4bcde6fd] global request received: hostkeys-00@openssh.com false
2020-09-03T16:16:44 [I|app|4bcde6fd] channel_open_confirmation: 0 0 0 32768
2020-09-03T16:16:44 [I|app|4bcde6fd] sending channel request "pty-req"
2020-09-03T16:16:44 [I|app|4bcde6fd] sending channel request "exec"
2020-09-03T16:16:44 [I|app|4bcde6fd] channel_window_adjust: 0 +2097152
2020-09-03T16:16:44 [I|app|4bcde6fd] channel_success: 0
2020-09-03T16:21:49 [I|app|4bcde6fd] channel_data: 0 167b
2020-09-03T16:21:52 [I|app|4bcde6fd] channel_request: 0 exit-status false
2020-09-03T16:21:52 [I|app|4bcde6fd] channel_eof: 0
2020-09-03T16:21:52 [I|app|4bcde6fd] channel_close: 0
2020-09-03T16:21:52 [I|app|4bcde6fd] 10.32.211.190 delete channel 0 which closed locally and remotely
2020-09-03T16:21:52 [I|app|4bcde6fd] closing remaining channels (0 open)
2020-09-03T16:21:53 [I|aud|4bcde6fd] Host::Base (170) update event on installed_at 2020-09-03 16:21:50 UTC, 2020-09-03 16:21:52 UTC
2020-09-03T16:21:53 [I|app|4bcde6fd] Delete the autosign entry for test-host.com
2020-09-03T16:21:53 [I|app|4bcde6fd] Processed 5 tasks from queue 'Host::Managed Post', completed 5/5
2020-09-03T16:21:53 [I|app|4bcde6fd]   Rendering api/v2/hosts/create.json.rabl
2020-09-03T16:21:53 [I|app|4bcde6fd]   Rendered api/v2/hosts/create.json.rabl (Duration: 154.6ms | Allocations: 44249)
2020-09-03T16:21:53 [I|app|4bcde6fd] Completed 201 Created in 460562ms (Views: 130.2ms | ActiveRecord: 216.7ms | Allocations: 2372460)

This is on oVirt, and I get the same on libvirt, OpenStack, VMware and EC2.

Basically, it starts at 16:14:12:

[ INFO 2020-09-03T16:14:12 API] POST /api/hosts

hammer returns 502 at 16:15:13:

[ERROR 2020-09-03T16:15:13 Exception] Error: 502 Bad Gateway
[Thu Sep 03 16:15:13.215404 2020] [proxy_http:error] [pid 27428] (70007)The timeout specified has expired: [client 10.130.0.1:35150] AH01102: error reading status line from remote server 127.0.0.1:3000
[Thu Sep 03 16:15:13.215496 2020] [proxy:error] [pid 27428] [client 10.130.0.1:35150] AH00898: Error reading from remote server returned by /api/hosts

and the host build finishes at 16:21:53:

2020-09-03T16:21:53 [I|aud|4bcde6fd] Host::Base (170) update event on installed_at 2020-09-03 16:21:50 UTC, 2020-09-03 16:21:52 UTC

It looks like the SSH step afterwards takes 5 minutes. That is what’s causing the problems.

Actually, it fails before the SSH finish script even runs. And I see the same on OpenStack/EC2 with user data.

Hmm, you’re right. It takes 2 minutes to create the Compute instance. I wonder why it’s taking so long.

Btw, a heads up that you posted the root_pass hash in your log. I’ll redact it, but you should consider replacing it if it contains a sensitive password.

It is not a sensitive password but thanks anyway!

I wonder if it’s triggering this code. It has a 2-minute timeout.

Perhaps you can increase the log level to debug to see what’s happening in between.

And this really should have been moved to background processing a long time ago.

I’m not sure; I tested on VMware and it takes less than 2 minutes:

2020-09-03T17:18:12 [I|app|ff0b9110] Started POST "/api/hosts" for 127.0.0.1 at 2020-09-03     17:18:12 +0000
2020-09-03T17:18:12 [I|app|ff0b9110] Processing by Api::V2::HostsController#create as JSON
2020-09-03T17:18:12 [I|app|ff0b9110] Authorized user admin(Admin User)
2020-09-03T17:18:13 [I|app|ff0b9110] Adding Compute instance for test-host.com
2020-09-03T17:18:13 [I|app|ff0b9110] Loaded compute resource data for networks in 0.275832785 seconds
2020-09-03T17:19:50 [I|app|ff0b9110] Create DHCP reservation test-host.com for test-host.com-00:50:56:9d:04:44/10.33.18.200

This is just the time it takes to clone the image on slow hardware…

Note that if I set the httpd timeout to something like 180 or 300, it fails in exactly the same way, but during the finish script execution.

This is not linked to provisioning, any hammer commands exceeding 60 seconds fail.

# time hammer proxy import-classes --name $HOSTNAME
Error: 502 Bad Gateway

real	1m2.241s
user	0m1.728s
sys	0m0.252s

It’s like the httpd timeout did not apply when it was running with Passenger, but it does apply with Puma?

@ekohl I think, as @Anthony_Chevalet mentioned, the issue appears to exist on almost every compute resource, and Azure behaves no differently in this case. Even for GCE, the timeout needs to be increased to approx. 200-300, right @kgaikwad?


@ekohl IIUC, for Azure we also have to increase the ProxyTimeout in /etc/httpd/conf.modules.d/proxy.conf to approx. >= 300, otherwise it fails with a 502 proxy error.

That’s correct: Passenger ran inside the Apache process, whereas Puma is a standalone process.

I didn’t realize compute resources were so slow. A few years ago @Marek_Hulan and @iNecas suggested moving orchestration into the tasking system. That would have avoided this problem.

Right now I’m not sure what the best solution is. For one, you can use foreman-installer --foreman-passenger true to revert back to Passenger. However, we would like to migrate to a reverse proxy setup because it allows us to use websockets, so we still need a real solution.

@Jonathon_Turel is experimenting with keepalive options and maybe those help:

Note you can also apply this via custom-hiera.yaml without an installer patch, but I didn’t test this myself:

foreman::config::apache::proxy_params:
  retry: 0
  keepalive: 'On'
  ping: 30

mod_proxy - Apache HTTP Server Version 2.4 has a table of the various options. From reading the docs I’m not sure these options help, but I haven’t tried them myself.
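For reference, if I read the mod_proxy docs correctly, those three hiera params would render as per-ProxyPass key=value options roughly like this (the backend URL is an assumption based on the 127.0.0.1:3000 upstream in the error logs above):

```apache
# Untested sketch of the Apache equivalent of the hiera proxy_params above.
# retry=0: don't mark the backend worker as failed after an error;
# keepalive=On: reuse the backend TCP connection across requests;
# ping=30: send a 100-Continue probe and wait up to 30s for the backend.
ProxyPass "/" "http://127.0.0.1:3000/" retry=0 keepalive=On ping=30
```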


I observed that with the GCE compute resource, it takes more than 60s to finish the orchestration tasks. This time interval varies depending on which method (user data / finish template) the user selects for host provisioning.

Even for GCE, the timeout needs to be increased to approx. 200-300, right @kgaikwad?

Yes, I hit this timeout error frequently with the finish template. I didn’t try a number of combinations, but after setting 200 it worked for me.

I think this can be addressed by setting the ProxyTimeout to a large value (Passenger seemed to have an infinite timeout) using hiera:

apache::mod::proxy::proxy_timeout: 1200  # 20 minutes

And tune it from there. I think we will need to apply a large value globally in the project, and then let users increase it further if they find they have situations that exceed that base timeout.
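To make that survive installer runs, it could go into custom-hiera.yaml; a sketch, assuming the installer’s default file location:

```yaml
# Sketch for custom-hiera.yaml (assumed path: /etc/foreman-installer/custom-hiera.yaml).
# Re-running foreman-installer afterwards should render this into the
# Apache configuration it manages.
apache::mod::proxy::proxy_timeout: 1200  # 20 minutes
```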


I don’t know if this is related in any way, but I tried to add a new subscription manifest and it’s been stuck at 16% for quite a few hours now.

I’m seeing the following in the production.log over and over…

2020-09-04T16:17:24 [E|app|fbc239fe] RestClient::BadRequest: 400 Bad Request
Body: {"displayMessage":"Content access mode does not allow this request.","requestUuid":"288fad6a-7851-4979-abb4-6f6a05e60a5d"}
/opt/theforeman/tfm/root/usr/share/gems/gems/rest-client-2.0.2/lib/restclient/abstract_response.rb:223:in `exception_with_response'
/opt/theforeman/tfm/root/usr/share/gems/gems/rest-client-2.0.2/lib/restclient/abstract_response.rb:103:in `return!'
/opt/theforeman/tfm/root/usr/share/gems/gems/rest-client-2.0.2/lib/restclient/request.rb:809:in `process_result'
/opt/theforeman/tfm/root/usr/share/gems/gems/rest-client-2.0.2/lib/restclient/request.rb:725:in `block in transmit'
/opt/rh/rh-ruby25/root/usr/share/ruby/net/http.rb:910:in `start'
/opt/theforeman/tfm/root/usr/share/gems/gems/rest-client-2.0.2/lib/restclient/request.rb:715:in `transmit'
/opt/theforeman/tfm/root/usr/share/gems/gems/rest-client-2.0.2/lib/restclient/request.rb:145:in `execute'
/opt/theforeman/tfm/root/usr/share/gems/gems/rest-client-2.0.2/lib/restclient/request.rb:52:in `execute'
/opt/theforeman/tfm/root/usr/share/gems/gems/rest-client-2.0.2/lib/restclient/resource.rb:51:in `get'
/opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.16.0/app/lib/katello/resources/candlepin/proxy.rb:26:in `get'
/opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.16.0/app/controllers/katello/api/rhsm/candlepin_proxies_controller.rb:75:in `get'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_controller/metal/basic_implicit_render.rb:6:in `send_action'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/abstract_controller/base.rb:195:in `process_action'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_controller/metal/rendering.rb:30:in `process_action'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/abstract_controller/callbacks.rb:42:in `block in process_action'
/opt/theforeman/tfm/root/usr/share/gems/gems/activesupport-6.0.3.1/lib/active_support/callbacks.rb:112:in `block in run_callbacks'
/opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.16.0/app/controllers/katello/api/rhsm/candlepin_proxies_controller.rb:35:in `repackage_message'
/opt/theforeman/tfm/root/usr/share/gems/gems/activesupport-6.0.3.1/lib/active_support/callbacks.rb:121:in `block in run_callbacks'
/usr/share/foreman/app/controllers/api/v2/base_controller.rb:163:in `disable_json_root'
/opt/theforeman/tfm/root/usr/share/gems/gems/activesupport-6.0.3.1/lib/active_support/callbacks.rb:121:in `block in run_callbacks'
/usr/share/foreman/app/controllers/concerns/foreman/controller/timezone.rb:10:in `set_timezone'
/opt/theforeman/tfm/root/usr/share/gems/gems/activesupport-6.0.3.1/lib/active_support/callbacks.rb:121:in `block in run_callbacks'
/usr/share/foreman/app/models/concerns/foreman/thread_session.rb:32:in `clear_thread'
/opt/theforeman/tfm/root/usr/share/gems/gems/activesupport-6.0.3.1/lib/active_support/callbacks.rb:121:in `block in run_callbacks'
/usr/share/foreman/app/controllers/concerns/foreman/controller/topbar_sweeper.rb:12:in `set_topbar_sweeper_controller'
/opt/theforeman/tfm/root/usr/share/gems/gems/activesupport-6.0.3.1/lib/active_support/callbacks.rb:121:in `block in run_callbacks'
/opt/theforeman/tfm/root/usr/share/gems/gems/audited-4.9.0/lib/audited/sweeper.rb:14:in `around'
/opt/theforeman/tfm/root/usr/share/gems/gems/activesupport-6.0.3.1/lib/active_support/callbacks.rb:121:in `block in run_callbacks'
/opt/theforeman/tfm/root/usr/share/gems/gems/audited-4.9.0/lib/audited/sweeper.rb:14:in `around'
/opt/theforeman/tfm/root/usr/share/gems/gems/activesupport-6.0.3.1/lib/active_support/callbacks.rb:121:in `block in run_callbacks'
/opt/theforeman/tfm/root/usr/share/gems/gems/activesupport-6.0.3.1/lib/active_support/callbacks.rb:139:in `run_callbacks'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/abstract_controller/callbacks.rb:41:in `process_action'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_controller/metal/rescue.rb:22:in `process_action'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_controller/metal/instrumentation.rb:33:in `block in process_action'
/opt/theforeman/tfm/root/usr/share/gems/gems/activesupport-6.0.3.1/lib/active_support/notifications.rb:180:in `block in instrument'
/opt/theforeman/tfm/root/usr/share/gems/gems/activesupport-6.0.3.1/lib/active_support/notifications/instrumenter.rb:24:in `instrument'
/opt/theforeman/tfm/root/usr/share/gems/gems/activesupport-6.0.3.1/lib/active_support/notifications.rb:180:in `instrument'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_controller/metal/instrumentation.rb:32:in `process_action'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_controller/metal/params_wrapper.rb:245:in `process_action'
/opt/theforeman/tfm/root/usr/share/gems/gems/activerecord-6.0.3.1/lib/active_record/railties/controller_runtime.rb:27:in `process_action'
/opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.16.0/app/controllers/katello/concerns/api/api_controller.rb:40:in `process_action'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/abstract_controller/base.rb:136:in `process'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionview-6.0.3.1/lib/action_view/rendering.rb:39:in `process'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_controller/metal.rb:190:in `dispatch'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_controller/metal.rb:254:in `dispatch'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_dispatch/routing/route_set.rb:50:in `dispatch'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_dispatch/routing/route_set.rb:33:in `serve'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_dispatch/journey/router.rb:49:in `block in serve'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_dispatch/journey/router.rb:32:in `each'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_dispatch/journey/router.rb:32:in `serve'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_dispatch/routing/route_set.rb:834:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/railties-6.0.3.1/lib/rails/engine.rb:527:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/railties-6.0.3.1/lib/rails/railtie.rb:190:in `public_send'
/opt/theforeman/tfm/root/usr/share/gems/gems/railties-6.0.3.1/lib/rails/railtie.rb:190:in `method_missing'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_dispatch/routing/mapper.rb:19:in `block in <class:Constraints>'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_dispatch/routing/mapper.rb:48:in `serve'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_dispatch/journey/router.rb:49:in `block in serve'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_dispatch/journey/router.rb:32:in `each'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_dispatch/journey/router.rb:32:in `serve'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_dispatch/routing/route_set.rb:834:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.16.0/lib/katello/middleware/event_daemon.rb:10:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_dispatch/middleware/static.rb:126:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_dispatch/middleware/static.rb:126:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/apipie-dsl-2.2.2/lib/apipie_dsl/static_dispatcher.rb:67:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/apipie-rails-0.5.17/lib/apipie/static_dispatcher.rb:66:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/apipie-rails-0.5.17/lib/apipie/extractor/recorder.rb:137:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_dispatch/middleware/static.rb:126:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_dispatch/middleware/static.rb:126:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_dispatch/middleware/static.rb:126:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_dispatch/middleware/static.rb:126:in `call'
/usr/share/foreman/lib/foreman/middleware/telemetry.rb:10:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/apipie-rails-0.5.17/lib/apipie/middleware/checksum_in_headers.rb:27:in `call'
/usr/share/foreman/lib/foreman/middleware/catch_json_parse_errors.rb:9:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/rack-2.2.2/lib/rack/tempfile_reaper.rb:15:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/rack-2.2.2/lib/rack/etag.rb:27:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/rack-2.2.2/lib/rack/conditional_get.rb:27:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/rack-2.2.2/lib/rack/head.rb:12:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_dispatch/http/content_security_policy.rb:18:in `call'
/usr/share/foreman/lib/foreman/middleware/logging_context_session.rb:22:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/rack-2.2.2/lib/rack/session/abstract/id.rb:266:in `context'
/opt/theforeman/tfm/root/usr/share/gems/gems/rack-2.2.2/lib/rack/session/abstract/id.rb:260:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_dispatch/middleware/cookies.rb:648:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_dispatch/middleware/callbacks.rb:27:in `block in call'
/opt/theforeman/tfm/root/usr/share/gems/gems/activesupport-6.0.3.1/lib/active_support/callbacks.rb:101:in `run_callbacks'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_dispatch/middleware/callbacks.rb:26:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_dispatch/middleware/actionable_exceptions.rb:17:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_dispatch/middleware/debug_exceptions.rb:32:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_dispatch/middleware/show_exceptions.rb:33:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/railties-6.0.3.1/lib/rails/rack/logger.rb:37:in `call_app'
/opt/theforeman/tfm/root/usr/share/gems/gems/railties-6.0.3.1/lib/rails/rack/logger.rb:28:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/sprockets-rails-3.2.1/lib/sprockets/rails/quiet_assets.rb:13:in `call'
/usr/share/foreman/lib/foreman/middleware/logging_context_request.rb:11:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_dispatch/middleware/remote_ip.rb:81:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_dispatch/middleware/request_id.rb:27:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.16.0/lib/katello/prevent_json_parsing.rb:12:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/rack-2.2.2/lib/rack/method_override.rb:24:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/rack-2.2.2/lib/rack/runtime.rb:22:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/activesupport-6.0.3.1/lib/active_support/cache/strategy/local_cache_middleware.rb:29:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_dispatch/middleware/executor.rb:14:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_dispatch/middleware/static.rb:126:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/rack-2.2.2/lib/rack/sendfile.rb:110:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.1/lib/action_dispatch/middleware/host_authorization.rb:76:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/secure_headers-6.3.0/lib/secure_headers/middleware.rb:11:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/railties-6.0.3.1/lib/rails/engine.rb:527:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/railties-6.0.3.1/lib/rails/railtie.rb:190:in `public_send'
/opt/theforeman/tfm/root/usr/share/gems/gems/railties-6.0.3.1/lib/rails/railtie.rb:190:in `method_missing'
/opt/theforeman/tfm/root/usr/share/gems/gems/rack-2.2.2/lib/rack/urlmap.rb:74:in `block in call'
/opt/theforeman/tfm/root/usr/share/gems/gems/rack-2.2.2/lib/rack/urlmap.rb:58:in `each'
/opt/theforeman/tfm/root/usr/share/gems/gems/rack-2.2.2/lib/rack/urlmap.rb:58:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/puma-4.3.3/lib/puma/configuration.rb:228:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/puma-4.3.3/lib/puma/server.rb:682:in `handle_request'
/opt/theforeman/tfm/root/usr/share/gems/gems/puma-4.3.3/lib/puma/server.rb:472:in `process_client'
/opt/theforeman/tfm/root/usr/share/gems/gems/puma-4.3.3/lib/puma/server.rb:328:in `block in run'
/opt/theforeman/tfm/root/usr/share/gems/gems/puma-4.3.3/lib/puma/thread_pool.rb:134:in `block in spawn_thread'
/opt/theforeman/tfm/root/usr/share/gems/gems/logging-2.2.2/lib/logging/diagnostic_context.rb:474:in `block in create_with_logging_context'
2020-09-04T16:17:24 [I|app|fbc239fe]   Rendering text template
2020-09-04T16:17:24 [I|app|fbc239fe]   Rendered text template (Duration: 0.0ms | Allocations: 2)
2020-09-04T16:17:24 [I|app|fbc239fe] Completed 400 Bad Request in 84ms (Views: 0.9ms | ActiveRecord: 3.5ms | Allocations: 4010)

What is the best way to get it either to cancel or to complete?