
[Technical Practice] Integrating Kyuubi with Kerberos Authentication

Published: 2019-02-15


  1. Configuration

  Create a principal for the Kyuubi service (optional):

  # Start the kadmin shell

  kadmin.local

  # Add the principal at the kadmin prompt

  addprinc -randkey kyuubi/compile.bigdata@BIGDATA

  WARNING: no policy specified for kyuubi/compile.bigdata@BIGDATA; defaulting to no policy

  Principal "kyuubi/compile.bigdata@BIGDATA" created.

  # Export the keytab

  xst -k /etc/security/keytabs/kyuubi.keytab kyuubi/compile.bigdata@BIGDATA

  # Inspect the keytab file

  klist -kt /etc/security/keytabs/kyuubi.keytab

  Keytab name: FILE:/etc/security/keytabs/kyuubi.keytab

  KVNO Timestamp Principal

  ---- ------------------- ------------------------------------------------------

  2 11/15/2018 11:21:00 kyuubi/compile.bigdata@BIGDATA

  2 11/15/2018 11:21:00 kyuubi/compile.bigdata@BIGDATA

  2 11/15/2018 11:21:00 kyuubi/compile.bigdata@BIGDATA

  2 11/15/2018 11:21:00 kyuubi/compile.bigdata@BIGDATA

  2 11/15/2018 11:21:00 kyuubi/compile.bigdata@BIGDATA

  2 11/15/2018 11:21:00 kyuubi/compile.bigdata@BIGDATA
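  Before wiring a keytab into any service, it can help to confirm that a ticket can actually be obtained from it. A minimal sketch, using the keytab path and principal created above (the script skips quietly on hosts where the krb5 client tools or the keytab are not present):

```shell
# Sanity-check a service keytab before using it (paths from this article).
KEYTAB=/etc/security/keytabs/kyuubi.keytab
PRINC="kyuubi/compile.bigdata@BIGDATA"

if command -v klist >/dev/null 2>&1 && [ -r "$KEYTAB" ]; then
  # List the entries, then try a non-interactive login and discard the ticket.
  klist -kt "$KEYTAB" \
    && kinit -kt "$KEYTAB" "$PRINC" \
    && echo "keytab OK" \
    && kdestroy
else
  echo "SKIP: krb5 tools or keytab not available on this host"
fi
```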

  Edit $SPARK_HOME/conf/spark-defaults.conf to configure the Kerberos credentials (reusing the hive service keytab here):

  spark.yarn.keytab /etc/security/keytabs/hive.service.keytab

  spark.yarn.principal hive/compile.bigdata@BIGDATA

  spark.yarn.principal – Kerberos principal for Kyuubi server.

  spark.yarn.keytab – Keytab for Kyuubi server principal.

  Edit $SPARK_HOME/conf/hive-site.xml to configure the metastore keytab (note: the hostname in hive.metastore.uris must match the HOST part of the principal):

  <property>
    <name>hive.metastore.uris</name>
    <value>thrift://compile.bigdata:9083</value>
  </property>
  <property>
    <name>hive.metastore.warehouse.dir</name>
    <value>/apps/hive/warehouse</value>
  </property>
  <property>
    <name>hive.metastore.kerberos.keytab.file</name>
    <value>/etc/security/keytabs/hive.service.keytab</value>
  </property>
  <property>
    <name>hive.metastore.kerberos.principal</name>
    <value>hive/_HOST@BIGDATA</value>
  </property>
  <property>
    <name>hive.metastore.sasl.enabled</name>
    <value>true</value>
  </property>

  2. Startup

  kinit -kt /etc/security/keytabs/hive.service.keytab hive/compile.bigdata@BIGDATA

  bin/start-kyuubi.sh --driver-memory 1g --conf spark.kyuubi.backend.session.init.timeout=180s \

  --conf spark.driver.extraClassPath=$JARS \

  --conf spark.executor.extraClassPath=$JARS \

  --conf spark.driver.allowMultipleContexts=true \

  --conf spark.kyuubi.authentication=KERBEROS \

  --deploy-mode client

  3. Client Connection

  kinit -kt /etc/security/keytabs/hive.service.keytab hive/compile.bigdata@BIGDATA

  bin/beeline -u "jdbc:hive2://compile.bigdata:10009/;principal=hive/compile.bigdata@BIGDATA"
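  Note that the JDBC URL carries the server's principal, not the client's. A small sketch of how the URL in the command above is assembled (host, port, and realm are the values used throughout this article):

```shell
# Assemble the Kyuubi JDBC URL used above from its parts.
HOST=compile.bigdata
PORT=10009                            # Kyuubi's Thrift port in this setup
SERVER_PRINC="hive/${HOST}@BIGDATA"   # principal of the *server*, not the client

URL="jdbc:hive2://${HOST}:${PORT}/;principal=${SERVER_PRINC}"
echo "$URL"
```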

  Create a new user:

  addprinc -randkey jiadx/compile.bigdata@BIGDATA

  xst -k /etc/security/keytabs/jiadx.keytab jiadx/compile.bigdata@BIGDATA

  Connect as this user:

  # Initialize a ticket with the jiadx keytab

  kinit -kt /etc/security/keytabs/jiadx.keytab jiadx/compile.bigdata@BIGDATA

  # The JDBC connection must still use the hive (server) principal

  bin/beeline -u "jdbc:hive2://compile.bigdata:10009/;principal=hive/compile.bigdata@BIGDATA"

  # Check the user of the currently initialized ticket:

  0: jdbc:hive2://compile.bigdata:10009/> select current_user(); # a Hive command

  +--------+--+
  |  _c0   |  |
  +--------+--+
  | jiadx  |  |
  +--------+--+

  1 row selected (0.992 seconds)
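  current_user() reports the short name derived from the client's Kerberos ticket, not the principal in the JDBC URL. A rough sketch of the default mapping, which simply takes the first component of the principal (real deployments apply hadoop.security.auth_to_local rules instead):

```shell
# Rough sketch of the default principal -> short-name mapping.
# Real deployments apply hadoop.security.auth_to_local rules instead.
PRINC="jiadx/compile.bigdata@BIGDATA"
SHORT="${PRINC%%/*}"   # strip the instance and realm, keeping the first component
echo "$SHORT"
```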

  # The user can also be set with the hive.server2.proxy.user=hdfs parameter

  bin/beeline -u "jdbc:hive2://compile.bigdata:10009/;principal=hive/compile.bigdata@BIGDATA;hive.server2.proxy.user=hdfs"

  0: jdbc:hive2://compile.bigdata:10009/> select current_user();

  +-------+--+
  |  _c0  |  |
  +-------+--+
  | hdfs  |  |
  +-------+--+

  1 row selected (0.675 seconds)
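  For hive.server2.proxy.user to take effect, Hadoop must allow the hive service user to impersonate others, which is configured in core-site.xml. An illustrative fragment (the wildcard group is for demonstration only; restrict hosts and groups appropriately in production):

```xml
<property>
  <name>hadoop.proxyuser.hive.hosts</name>
  <value>compile.bigdata</value>
</property>
<property>
  <name>hadoop.proxyuser.hive.groups</name>
  <value>*</value>
</property>
```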

  Connecting with the jiadx principal fails: the Kyuubi server authenticates with the hive service keytab, so it cannot decrypt a service ticket issued for the jiadx principal:

  bin/beeline -u "jdbc:hive2://compile.bigdata:10009/;principal=jiadx/compile.bigdata@BIGDATA;auth=kerberos"

  javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: Failure unspecified at GSS-API level (Mechanism level: Checksum failed)]

  at com.sun.security.sasl.gsskerb.GssKrb5Server.evaluateResponse(GssKrb5Server.java:199)

  at org.apache.thrift.transport.TSaslTransport$SaslParticipant.evaluateChallengeOrResponse(TSaslTransport.java:539)

  at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:283)

  at org.apache.thrift.transport.TSaslServerTransport.open(TSaslServerTransport.java:41)

  at org.apache.thrift.transport.TSaslServerTransport$Factory.getTransport(TSaslServerTransport.java:216)

  at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory$1.run(HadoopThriftAuthBridge.java:739)

at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingTransportFactory$1.run(HadoopThriftAuthBridge.java:736)

 

  Author: Jia Dexing

  Position: Senior Architect, Cloud Computing Product Center, Cloud Service Group

  Domain: Big Data

  Profile: System software architect with more than twenty years of hands-on software development experience. Led the development of Inspur's big data platform product InsightHD, focusing on the research, application, and development of big data components such as Hadoop, Spark, stream computing, machine learning, and deep learning. Participated in drafting more than twenty national information technology standards, twelve of which have been officially published, and has developed and been granted nine national patents.

 

 
