Common Hive Settings

  1. Switching the execution engine (set inside the Hive CLI; a launch-time alternative is sketched after this list)
    Use MapReduce as the execution engine: set hive.execution.engine=mr;
    Use Spark as the execution engine: set hive.execution.engine=spark;
    Set Spark's master to a standalone cluster: set spark.master=spark://master:7077;
    Set Spark's master to YARN: set spark.master=yarn;
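
These set commands last only for the current session. A minimal sketch of passing the same property at client launch instead, assuming the hive and beeline binaries are on the PATH and HiveServer2 listens at the address used later in this article:

    # Set the engine when the Hive CLI starts, instead of interactively
    hive --hiveconf hive.execution.engine=spark
    # Pass the same property over JDBC via Beeline
    beeline -u jdbc:hive2://secondnamenode-c1:10000/ --hiveconf hive.execution.engine=mr
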
  2. Viewing a table's HDFS location from Hive
0: jdbc:hive2://secondnamenode-c1:10000/> show create table testdb.tttest;
INFO  : Compiling command(queryId=hive_20170721163434_87f6cb07-5fc2-4de7-9792-8ad8782f1fe4): show create table testdb.tttest
INFO  : Semantic Analysis Completed
INFO  : Returning Hive schema: Schema(fieldSchemas:[FieldSchema(name:createtab_stmt, type:string, comment:from deserializer)], properties:null)
INFO  : Completed compiling command(queryId=hive_20170721163434_87f6cb07-5fc2-4de7-9792-8ad8782f1fe4); Time taken: 0.009 seconds
INFO  : Executing command(queryId=hive_20170721163434_87f6cb07-5fc2-4de7-9792-8ad8782f1fe4): show create table testdb.tttest
INFO  : Starting task [Stage-0:DDL] in serial mode
INFO  : Completed executing command(queryId=hive_20170721163434_87f6cb07-5fc2-4de7-9792-8ad8782f1fe4); Time taken: 0.008 seconds
INFO  : OK
+-----------------------------------------------------------------+--+
|                         createtab_stmt                          |
+-----------------------------------------------------------------+--+
| CREATE TABLE `testdb.tttest`(                                   |
|   `username` string,                                            |
|   `sex` string)                                                 |
| COMMENT 'Imported by sqoop on 2017/04/17 10:11:26'              |
| ROW FORMAT SERDE                                                |
|   'org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe'          |
| WITH SERDEPROPERTIES (                                          |
|   'field.delim'='\t',                                           |
|   'line.delim'='\n',                                            |
|   'serialization.format'='\t')                                  |
| STORED AS INPUTFORMAT                                           |
|   'org.apache.hadoop.mapred.TextInputFormat'                    |
| OUTPUTFORMAT                                                    |
|   'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat'  |
| LOCATION                                                        |
|   'hdfs://nameservice1/user/hive/warehouse/testdb.db/tttest'    |
| TBLPROPERTIES (                                                 |
|   'COLUMN_STATS_ACCURATE'='true',                               |
|   'numFiles'='1',                                               |
|   'numRows'='0',                                                |
|   'rawDataSize'='0',                                            |
|   'totalSize'='66',                                             |
|   'transient_lastDdlTime'='1492395090')                         |
+-----------------------------------------------------------------+--+
23 rows selected (0.039 seconds)
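
If only the storage path is needed, DESCRIBE FORMATTED is a shorter route: its output includes a Location row carrying the same hdfs:// path shown above. A sketch against the same table:

0: jdbc:hive2://secondnamenode-c1:10000/> describe formatted testdb.tttest;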

Use the dfs command at the Hive command line to inspect the file storage location:

0: jdbc:hive2://secondnamenode-c1:10000/> dfs -ls hdfs://nameservice1/user/hive/warehouse/testdb.db/tttest;
+-----------------------------------------------------------------------------------------------------------------------------+--+
|                                                         DFS Output                                                          |
+-----------------------------------------------------------------------------------------------------------------------------+--+
| Found 1 items                                                                                                               |
| -rwxrwxrwt   2 hdfs hive         66 2017-04-17 10:11 hdfs://nameservice1/user/hive/warehouse/testdb.db/tttest/part-m-00000  |
+-----------------------------------------------------------------------------------------------------------------------------+--+
2 rows selected (0.008 seconds)
0: jdbc:hive2://secondnamenode-c1:10000/>
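
The same listing is available outside Hive through the HDFS client. A minimal sketch, assuming the hdfs binary is on the PATH and configured for this cluster:

# Fully qualified URI, matching the dfs command above
hdfs dfs -ls hdfs://nameservice1/user/hive/warehouse/testdb.db/tttest
# Or rely on fs.defaultFS and use the bare warehouse path
hdfs dfs -ls /user/hive/warehouse/testdb.db/tttest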
