Configuring HiveServer2 Based on Version 0.14.0 (Part 2)

... GSSException: No valid credentials provided (Mechanism level: Server not found in Kerberos database (7) - UNKNOWN_SERVER)]
    at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)
    at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:94)
    at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:253)
    at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
    at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
    at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
    at java.security.AccessController.doPrivileged(Native Method)
Here I use the hive user, which has proxy (impersonation) privileges, while the machine I connect from over JDBC uses the user intern. First, run the command to start HiveServer2 on the Hive machine:
./bin/hive --service hiveserver2

Then, on the client machine, connect through Beeline, which ships with Hive:
./bin/beeline

Then use the !connect command to connect to HiveServer2:
beeline> !connect jdbc:hive2://hiveserver2-ip:10000/foodmart;principal=hive/xxx@HADOOP.XXX.COM;
scan complete in 34ms
Connecting to jdbc:hive2://bitest0.server.163.org:10000/foodmart;principal=hive/app-20.photo.163.org@HADOOP.HZ.NETEASE.COM;
Enter username for jdbc:hive2://bitest0.server.163.org:10000/foodmart;principal=hive/app-20.photo.163.org@HADOOP.HZ.NETEASE.COM;:
Enter password for jdbc:hive2://bitest0.server.163.org:10000/foodmart;principal=hive/app-20.photo.163.org@HADOOP.HZ.NETEASE.COM;:
Connected to: Apache Hive (version 0.14.0)
Driver: Hive JDBC (version 0.14.0)
Transaction isolation: TRANSACTION_REPEATABLE_READ
0: jdbc:hive2://bitest0.server.163.org:10000/> 

When connecting you need to specify the JDBC URL (the default port is 10000; it can also be changed in the HiveServer2 configuration file), and you also need to specify the server principal, i.e. the hive.server2.authentication.kerberos.principal configured above. The client side simply uses the current user on the client machine, which can be checked with klist.
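For reference, the URL used above breaks down as follows. This is only an illustrative sketch; the host, database and principal are the same placeholders used in the Beeline example and must be replaced with your own values:

// Sketch only: host, database and server principal below are placeholders.
String host = "hiveserver2-ip";                      // HiveServer2 host
int port = 10000;                                    // default HiveServer2 port
String database = "foodmart";                        // database to open
String serverPrincipal = "hive/xxx@HADOOP.XXX.COM";  // hive.server2.authentication.kerberos.principal
String url = "jdbc:hive2://" + host + ":" + port + "/" + database
    + ";principal=" + serverPrincipal + ";";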
Besides connecting with the bundled Beeline, you can also connect from a program through JDBC. The test code is as follows:
import java.io.IOException;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.UserGroupInformation;

public class TestHive {

  public static void main(String[] args) throws SQLException {
    try {
      Class.forName("org.apache.hive.jdbc.HiveDriver");
    } catch (ClassNotFoundException e) {
      e.printStackTrace();
    }

    // Configure the client for Kerberos authentication and log in from the
    // keytab before opening the JDBC connection.
    Configuration conf = new Configuration();
    conf.setBoolean("hadoop.security.authorization", true);
    conf.set("hadoop.security.authentication", "kerberos");
    UserGroupInformation.setConfiguration(conf);
    try {
      UserGroupInformation.loginUserFromKeytab("intern/bigdata",
          "C:\\Users\\Administrator\\Desktop\\intern.keytab");
    } catch (IOException e) {
      e.printStackTrace();
    }

    // The principal in the URL is the HiveServer2 service principal; the
    // client identity comes from the Kerberos login above, so the JDBC
    // username and password are left empty.
    Connection conn = DriverManager
        .getConnection(
            "jdbc:hive2://hiveserver2-ip:10000/foodmart;principal=hive/xxx@HADOOP.XXX.COM;User=;Password=;",
            "", "");
    Statement stmt = conn.createStatement();
    String sql = "select * from account limit 10";
    System.out.println("Running: " + sql);
    ResultSet res = stmt.executeQuery(sql);
    while (res.next()) {
      System.out.println(String.valueOf(res.getInt(1)));
    }
    res.close();
    stmt.close();
    conn.close();
  }
}
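Note that UserGroupInformation.loginUserFromKeytab has to run before DriverManager.getConnection, because the driver picks up the client's Kerberos identity from the current Hadoop login (UserGroupInformation); if the client machine already holds a valid ticket obtained with kinit, the keytab login can usually be skipped. As a minimal sketch reusing the placeholder URL and example table above, the query part of main can also be written with try-with-resources so the JDBC objects are always closed:

// Sketch only: the URL and table name are the placeholders from the example above.
try (Connection conn = DriverManager.getConnection(
         "jdbc:hive2://hiveserver2-ip:10000/foodmart;principal=hive/xxx@HADOOP.XXX.COM;", "", "");
     Statement stmt = conn.createStatement();
     ResultSet res = stmt.executeQuery("select * from account limit 10")) {
  while (res.next()) {
    System.out.println(res.getInt(1));
  }
}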