NGINX spnego-http-auth-nginx-module does not honor the KRB5_CONFIG environment variable


I have nginx built with the spnego-http-auth-nginx-module module. The nginx configuration file looks like this:

user  nginx;
worker_processes  3;

error_log  /var/logs/nginx/error.log warn;
pid        /tmp/pid/nginx.pid;


events
{
    worker_connections  1024;
    multi_accept on;
    use epoll;
}


http
{
    default_type  application/octet-stream;
    include       /usr/local/nginx/mime.types;

    log_format  main  '$remote_addr - $remote_user [$time_local] "$request" '
                      '$status $body_bytes_sent "$http_referer" '
                      '"$http_user_agent" "$http_x_forwarded_for"';

    access_log  /var/logs/nginx/access.log  main;
    error_log   /var/logs/nginx/error.log debug;

    sendfile        on;
    tcp_nopush     on;
    server_tokens off;
    keepalive_timeout  65;
    proxy_connect_timeout       600;
    proxy_send_timeout          600;
    proxy_read_timeout          600;
    send_timeout                600;

    gzip  on;

    proxy_set_header            Host            $http_host;
    proxy_set_header            X-Real-IP       $remote_addr;
    proxy_set_header            X-Forwarded-For $proxy_add_x_forwarded_for;

    server
    {
        listen      50070;
        server_name  hadoopnamenodeui;

        add_header Access-Control-Allow-Origin "*";
        add_header Access-Control-Allow-Methods "GET, OPTIONS, POST";
        add_header Access-Control-Allow-Headers "origin, authorization, accept";
        location /
        {
            auth_gss on;
            auth_gss_keytab /hadoopkrb5/ui.keytab;
            auth_gss_realm HADOOP.COM;
            auth_gss_service_name HTTP/hadoop-1.hadoop.com;
            proxy_pass  https://10.7.1.73:50070/;
        }
    }
}

The problem is that the platform on which all these hadoop nodes are deployed already uses krb5 for its own cluster authentication, so we deployed a separate KDC for the hadoop cluster.

That cluster has its own krb5.conf file, separate from the default /etc/krb5.conf. But nginx keeps using the default /etc/krb5.conf. How can I configure nginx to use the specified krb5.conf file instead of the default one?

I tried starting the nginx process after exporting the KRB5_CONFIG shell environment variable, which is the usual way to point Kerberos tools at a different krb5.conf file. But it did not work.
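
Concretely, the attempt looked roughly like the sketch below; the location of the cluster-specific krb5.conf is an assumption for illustration, only the binary path follows the usual /usr/local/nginx layout.

# Assumed location of the Hadoop cluster's krb5.conf
export KRB5_CONFIG=/hadoopkrb5/krb5.conf
/usr/local/nginx/sbin/nginx

Note that nginx removes environment variables inherited from its parent process before spawning workers unless they are declared with the env directive in the main context of nginx.conf (e.g. env KRB5_CONFIG=/hadoopkrb5/krb5.conf;), which may be one reason an exported variable never becomes visible to the worker processes.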

I also tried building nginx after exporting the same KRB5_CONFIG variable, but it still picks up only the default krb5.conf.
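
For completeness, the rebuild attempt was along these lines; the source layout and configure options are assumptions, only the module name comes from the post.

# Assumed source layout and configure flags
export KRB5_CONFIG=/hadoopkrb5/krb5.conf
./configure --add-module=../spnego-http-auth-nginx-module
make && make install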

kinit and the other krb5 utilities do honor this variable, so how can I force nginx to do the same?
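
For comparison, a minimal check with the command-line tools, using the keytab and service principal from the nginx config above and the same assumed krb5.conf path, would be:

# kinit honors KRB5_CONFIG and obtains a ticket from the Hadoop KDC
KRB5_CONFIG=/hadoopkrb5/krb5.conf kinit -kt /hadoopkrb5/ui.keytab HTTP/hadoop-1.hadoop.com@HADOOP.COM
# klist honors it as well when inspecting the resulting credential cache
KRB5_CONFIG=/hadoopkrb5/krb5.conf klist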

I have also attached the error log that is written when I make an HTTP call:

2023/03/17 00:32:49 [debug] 15366#0: accept on 0.0.0.0:50070, ready: 1
2023/03/17 00:32:49 [debug] 15365#0: accept on 0.0.0.0:50070, ready: 1
2023/03/17 00:32:49 [debug] 15364#0: accept on 0.0.0.0:50070, ready: 1
2023/03/17 00:32:49 [debug] 15366#0: accept() not ready (11: Resource temporarily unavailable)
2023/03/17 00:32:49 [debug] 15365#0: posix_memalign: 0000000001E30C70:512 @16
2023/03/17 00:32:49 [debug] 15364#0: accept() not ready (11: Resource temporarily unavailable)
2023/03/17 00:32:49 [debug] 15365#0: *2 accept: 172.17.42.1:53685 fd:11
2023/03/17 00:32:49 [debug] 15365#0: *2 event timer add: 11: 60000:9258050
2023/03/17 00:32:49 [debug] 15365#0: *2 reusable connection: 1
2023/03/17 00:32:49 [debug] 15365#0: *2 epoll add event: fd:11 op:1 ev:80002001
2023/03/17 00:32:49 [debug] 15365#0: accept() not ready (11: Resource temporarily unavailable)
2023/03/17 00:32:49 [debug] 15365#0: *2 http wait request handler
2023/03/17 00:32:49 [debug] 15365#0: *2 malloc: 0000000001E32790:1024
2023/03/17 00:32:49 [debug] 15365#0: *2 recv: eof:0, avail:-1
2023/03/17 00:32:49 [debug] 15365#0: *2 recv: fd:11 424 of 1024
2023/03/17 00:32:49 [debug] 15365#0: *2 reusable connection: 0
2023/03/17 00:32:49 [debug] 15365#0: *2 posix_memalign: 0000000001DCD8C0:4096 @16
2023/03/17 00:32:49 [debug] 15365#0: *2 http process request line
2023/03/17 00:32:49 [debug] 15365#0: *2 http request line: "GET / HTTP/1.1"
2023/03/17 00:32:49 [debug] 15365#0: *2 http uri: "/"
2023/03/17 00:32:49 [debug] 15365#0: *2 http args: ""
2023/03/17 00:32:49 [debug] 15365#0: *2 http exten: ""
2023/03/17 00:32:49 [debug] 15365#0: *2 posix_memalign: 0000000001DC1800:4096 @16
2023/03/17 00:32:49 [debug] 15365#0: *2 http process request header line
2023/03/17 00:32:49 [debug] 15365#0: *2 http header: "Host: hadoop-1.hadoop.com:50071"
2023/03/17 00:32:49 [debug] 15365#0: *2 http header: "User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:109.0) Gecko/20100101 Firefox/110.0"
2023/03/17 00:32:49 [debug] 15365#0: *2 http header: "Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,*/*;q=0.8"
2023/03/17 00:32:49 [debug] 15365#0: *2 http header: "Accept-Language: en-US,en;q=0.5"
2023/03/17 00:32:49 [debug] 15365#0: *2 http header: "Accept-Encoding: gzip, deflate"
2023/03/17 00:32:49 [debug] 15365#0: *2 http header: "DNT: 1"
2023/03/17 00:32:49 [debug] 15365#0: *2 http header: "Connection: keep-alive"
2023/03/17 00:32:49 [debug] 15365#0: *2 http header: "Upgrade-Insecure-Requests: 1"
2023/03/17 00:32:49 [debug] 15365#0: *2 http header: "Authorization: Basic dGVzdDpkZW1v"
2023/03/17 00:32:49 [debug] 15365#0: *2 http header done
2023/03/17 00:32:49 [debug] 15365#0: *2 event timer del: 11: 9258050
2023/03/17 00:32:49 [debug] 15365#0: *2 rewrite phase: 0
2023/03/17 00:32:49 [debug] 15365#0: *2 test location: "/"
2023/03/17 00:32:49 [debug] 15365#0: *2 using configuration "/"
2023/03/17 00:32:49 [debug] 15365#0: *2 http cl:-1 max:1048576
2023/03/17 00:32:49 [debug] 15365#0: *2 rewrite phase: 2
2023/03/17 00:32:49 [debug] 15365#0: *2 post rewrite phase: 3
2023/03/17 00:32:49 [debug] 15365#0: *2 generic phase: 4
2023/03/17 00:32:49 [debug] 15365#0: *2 generic phase: 5
2023/03/17 00:32:49 [debug] 15365#0: *2 access phase: 6
2023/03/17 00:32:49 [debug] 15365#0: *2 SSO auth handling IN: token.len=0, head=0, ret=401
2023/03/17 00:32:49 [debug] 15365#0: *2 Begin auth
2023/03/17 00:32:49 [debug] 15365#0: *2 Detect basic auth
2023/03/17 00:32:49 [debug] 15365#0: *2 Basic auth credentials supplied by client
2023/03/17 00:32:49 [debug] 15365#0: *2 Attempting authentication with principal root
2023/03/17 00:32:49 [error] 15365#0: *2 Kerberos error: Credentials failed, client: 172.17.42.1, server: nameui, request: "GET / HTTP/1.1", host: "hadoop-1.hadoop.com:50071"
2023/03/17 00:32:49 [debug] 15365#0: *2 Kerberos error: -1765328378, Client '[email protected]' not found in Kerberos database
2023/03/17 00:32:49 [debug] 15365#0: *2 Basic auth failed
2023/03/17 00:32:49 [debug] 15365#0: *2 http finalize request: 401, "/?" a:1, c:1
2023/03/17 00:32:49 [debug] 15365#0: *2 http special response: 401, "/?"
2023/03/17 00:32:49 [debug] 15365#0: *2 http set discard body
2023/03/17 00:32:49 [debug] 15365#0: *2 HTTP/1.1 401 Unauthorized
Server: nginx
Date: Fri, 17 Mar 2023 07:32:49 GMT
Content-Type: text/html
Content-Length: 172
Connection: keep-alive
WWW-Authenticate: Basic realm="HADOOP.COM"

2023/03/17 00:32:49 [debug] 15365#0: *2 write new buf t:1 f:0 0000000001DCE6B0, pos 0000000001DCE6B0, size: 198 file: 0, size: 0
2023/03/17 00:32:49 [debug] 15365#0: *2 http write filter: l:0 f:0 s:198
2023/03/17 00:32:49 [debug] 15365#0: *2 http output filter "/?"
2023/03/17 00:32:49 [debug] 15365#0: *2 http copy filter: "/?"
2023/03/17 00:32:49 [debug] 15365#0: *2 http postpone filter "/?" 0000000001DCE8A0
2023/03/17 00:32:49 [debug] 15365#0: *2 write old buf t:1 f:0 0000000001DCE6B0, pos 0000000001DCE6B0, size: 198 file: 0, size: 0
2023/03/17 00:32:49 [debug] 15365#0: *2 write new buf t:0 f:0 0000000000000000, pos 0000000000B0CE40, size: 126 file: 0, size: 0
2023/03/17 00:32:49 [debug] 15365#0: *2 write new buf t:0 f:0 0000000000000000, pos 0000000000B0D3A0, size: 46 file: 0, size: 0
2023/03/17 00:32:49 [debug] 15365#0: *2 http write filter: l:1 f:0 s:370
2023/03/17 00:32:49 [debug] 15365#0: *2 http write filter limit 2097152
2023/03/17 00:32:49 [debug] 15365#0: *2 writev: 370 of 370
2023/03/17 00:32:49 [debug] 15365#0: *2 http write filter 0000000000000000
2023/03/17 00:32:49 [debug] 15365#0: *2 http copy filter: 0 "/?"
2023/03/17 00:32:49 [debug] 15365#0: *2 http finalize request: 0, "/?" a:1, c:1
2023/03/17 00:32:49 [debug] 15365#0: *2 set http keepalive handler
2023/03/17 00:32:49 [debug] 15365#0: *2 http close request
2023/03/17 00:32:49 [debug] 15365#0: *2 http log handler
2023/03/17 00:32:49 [debug] 15365#0: *2 free: 0000000001DCD8C0, unused: 0
2023/03/17 00:32:49 [debug] 15365#0: *2 free: 0000000001DC1800, unused: 2852
2023/03/17 00:32:49 [debug] 15365#0: *2 free: 0000000001E32790
2023/03/17 00:32:49 [debug] 15365#0: *2 hc free: 0000000000000000
2023/03/17 00:32:49 [debug] 15365#0: *2 hc busy: 0000000000000000 0
2023/03/17 00:32:49 [debug] 15365#0: *2 tcp_nodelay
2023/03/17 00:32:49 [debug] 15365#0: *2 reusable connection: 1
2023/03/17 00:32:49 [debug] 15365#0: *2 event timer add: 11: 65000:9263198

Note that the platform realm here is EXAMPLE.COM and the Hadoop realm is HADOOP.COM. So when I try to log in with the test username, the module looks up test@EXAMPLE.COM (EXAMPLE.COM being the default realm of the default krb5.conf), but that principal does not exist, hence the error. And I cannot modify the default krb5.conf to add the HADOOP.COM realm either.
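
To make the realm mismatch concrete, the cluster-specific krb5.conf that nginx should be reading would look roughly like this; the KDC hostname is an assumption, the relevant part is default_realm = HADOOP.COM as opposed to EXAMPLE.COM in the default /etc/krb5.conf.

# Hadoop-cluster krb5.conf, kept separate from /etc/krb5.conf; kdc host is illustrative
[libdefaults]
    default_realm = HADOOP.COM

[realms]
    HADOOP.COM = {
        kdc = hadoop-1.hadoop.com
        admin_server = hadoop-1.hadoop.com
    }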

So my question is: is there any environment variable or nginx setting through which I can make nginx use a different krb5.conf instead of the default one?

c++ http nginx kerberos spnego