Using and Configuring Hive HWI (Hive Web Interface)

1. Start the HWI service
hive --service hwi

[hadoopUser@secondmgt conf]$ hive --service hwi
ls: cannot access /home/hadoopUser/cloud/hive/apache-hive-0.13.1-bin/lib/hive-hwi-*.war: No such file or directory
15/01/09 15:49:35 INFO hwi.HWIServer: HWI is starting up
15/01/09 15:49:35 INFO Configuration.deprecation: mapred.reduce.tasks is deprecated. Instead, use mapreduce.job.reduces
15/01/09 15:49:35 INFO Configuration.deprecation: mapred.min.split.size is deprecated. Instead, use mapreduce.input.fileinputformat.split.minsize
15/01/09 15:49:35 INFO Configuration.deprecation: mapred.reduce.tasks.speculative.execution is deprecated. Instead, use mapreduce.reduce.speculative
15/01/09 15:49:35 INFO Configuration.deprecation: mapred.min.split.size.per.node is deprecated. Instead, use mapreduce.input.fileinputformat.split.minsize.per.node
15/01/09 15:49:35 INFO Configuration.deprecation: mapred.input.dir.recursive is deprecated. Instead, use mapreduce.input.fileinputformat.input.dir.recursive
15/01/09 15:49:35 INFO Configuration.deprecation: mapred.min.split.size.per.rack is deprecated. Instead, use mapreduce.input.fileinputformat.split.minsize.per.rack
15/01/09 15:49:35 INFO Configuration.deprecation: mapred.max.split.size is deprecated. Instead, use mapreduce.input.fileinputformat.split.maxsize
15/01/09 15:49:35 INFO Configuration.deprecation: mapred.committer.job.setup.cleanup.needed is deprecated. Instead, use mapreduce.job.committer.setup.cleanup.needed
15/01/09 15:49:35 INFO mortbay.log: Logging to org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via org.mortbay.log.Slf4jLog
15/01/09 15:49:35 INFO mortbay.log: jetty-6.1.26
15/01/09 15:49:35 INFO mortbay.log: Started SocketConnector@0.0.0.0:9999

The startup log reports that the hive-hwi-*.war file cannot be found.

Note: the error occurs because current Hive binary releases do not ship the HWI war file at all (the problem exists in both 0.13 and 0.14), and without this war file the HWI service cannot start.
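
You can confirm the file really is missing with a quick check before doing anything else (a minimal sketch; $HIVE_HOME is assumed to point at the Hive installation directory, e.g. /home/hadoopUser/cloud/hive/apache-hive-0.13.1-bin):

ls $HIVE_HOME/lib/hive-hwi-*.war
# on an unpatched 0.13/0.14 install this prints something like:
# ls: cannot access .../lib/hive-hwi-*.war: No such file or directory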

2. Manually build the HWI war package
1. Download the source tarball for the corresponding Hive version from the official Hive website (here apache-hive-0.13.1-src.tar.gz, matching the installed 0.13.1 release), extract it, and change into the hwi directory. The web directory under hwi holds the HWI web interface files.
2. Run the packaging command jar cvfM0 hive-hwi-0.13.1.war -C web/ . to generate the hive-hwi-0.13.1.war file we need, then copy it into the lib directory of the Hive installation (see the shell sketch after the configuration below).
3. Add the following HWI settings to hive-site.xml:

<property>
  <name>hive.hwi.war.file</name>
  <value>lib/hive-hwi-0.13.1.war</value>
  <description>Path to the HWI war file; must be a relative path under ${HIVE_HOME}</description>
</property>

<property>
  <name>hive.hwi.listen.host</name>
  <value>192.168.33.10</value>
  <description>Host address the HWI server listens on</description>
</property>

<property>
  <name>hive.hwi.listen.port</name>
  <value>9999</value>
  <description>Port the HWI server listens on</description>
</property>
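
Putting steps 1 and 2 together, the build looks roughly like this (a sketch, not a definitive procedure; it assumes the source tarball has been downloaded to the current directory, that the extracted directory name follows the tarball, and that $HIVE_HOME points at the Hive installation):

tar -zxvf apache-hive-0.13.1-src.tar.gz
cd apache-hive-0.13.1-src/hwi
# package the contents of web/ into a war with no manifest and no compression
jar cvfM0 hive-hwi-0.13.1.war -C web/ .
# copy the war into Hive's lib directory so hive.hwi.war.file can find it
cp hive-hwi-0.13.1.war $HIVE_HOME/lib/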

Restart the HWI service and it should work.
Note: if Chinese characters in query results come back garbled, you also need to set the encoding.

Open the packaged HWI war file (hive-hwi-0.13.1.war here), which contains the HWI web files, locate view_result.jsp, and declare the encoding at the top of the file:

<%--
   Licensed to the Apache Software Foundation (ASF) under one or more
   contributor license agreements.  See the NOTICE file distributed with
   this work for additional information regarding copyright ownership.
   The ASF licenses this file to You under the Apache License, Version 2.0
   (the "License"); you may not use this file except in compliance with
   the License.  You may obtain a copy of the License at

       http://www.apache.org/licenses/LICENSE-2.0

   Unless required by applicable law or agreed to in writing, software
   distributed under the License is distributed on an "AS IS" BASIS,
   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
   See the License for the specific language governing permissions and
   limitations under the License.
--%>
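<%-- The page directive on the next line is the key addition: charset=UTF-8 prevents Chinese query results from displaying garbled --%>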
<%@ page contentType="text/html;charset=UTF-8"%>
<%@page errorPage="error_page.jsp" %>
<%@ page import="org.apache.hadoop.hive.hwi.*,java.io.*" %>
<% HWIAuth auth = (HWIAuth) session.getAttribute("auth"); %>
<% HWISessionManager hs = (HWISessionManager) application.getAttribute("hs"); %>
<% if (auth==null) { %>
    <jsp:forward page="/authorize.jsp" />
<% } %>
<% String sessionName=request.getParameter("sessionName"); %>
<% HWISessionItem sess = hs.findSessionItemByName(auth,sessionName);    %>
<% int start=0; 
   if (request.getParameter("start")!=null){
     start = Integer.parseInt( request.getParameter("start") );
   }
%>
<% int bsize=1024; 
   if (request.getParameter("bsize")!=null){
     bsize = Integer.parseInt( request.getParameter("bsize") );
   }
%>

The contentType directive sets the page encoding to UTF-8.
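
One way to apply this change without rebuilding the whole war is to update the jsp in place with the jar tool (a sketch; it assumes the war was packaged from web/ as above, so view_result.jsp sits at the war root, and that unzip is available):

cd $HIVE_HOME/lib
# pull just the result page out of the war
unzip hive-hwi-0.13.1.war view_result.jsp
# edit view_result.jsp to add the contentType directive shown above, then
# write the modified file back into the war at the same path
jar uf hive-hwi-0.13.1.war view_result.jsp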
3. Restart and access HWI
Restart the HWI service, then open http://192.168.33.10:9999/hwi/ in a browser.
(Screenshot: the HWI web interface)
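
If the page does not load in a browser, a quick command-line check of the listener can help (host and port come from the hive.hwi.listen.* settings above):

curl -s -o /dev/null -w "%{http_code}\n" http://192.168.33.10:9999/hwi/
# any HTTP status printed here means the Jetty listener is reachable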
