While testing Hive, queries against a table that splits fields with a regular expression kept failing. The table was created with the following statement:
create table mylog2 (time string, clause string)
row format serde 'org.apache.hadoop.hive.contrib.serde2.RegexSerDe'
with serdeproperties (
  "input.regex" = "(\\d{4}-\\d{2}-\\d{2} \\d{2}:\\d{2}:\\d{2},\\d{3}) - (.*) -(.*)"
);
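To see what this regex actually extracts, here is a small sketch in Python (the sample log line is hypothetical, in the assumed "timestamp - logger - message" layout; note that in a Python raw string one backslash suffices where Hive's DDL needs `\\`):

```python
import re

# Python equivalent of the SerDe regex from the CREATE TABLE statement.
pattern = re.compile(r"(\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2},\d{3}) - (.*) -(.*)")

# Hypothetical log line in the assumed layout.
line = "2012-01-17 00:14:19,301 - org.apache.hadoop.conf.Configuration - parsing config"
m = pattern.match(line)
print(m.group(1))  # "2012-01-17 00:14:19,301" -> the "time" column
print(m.group(2))  # "org.apache.hadoop.conf.Configuration" -> the "clause" column

# Lines that do not match the regex yield NULL in every column, which is
# what produces the NULL NULL rows in the query output further below.
print(pattern.match("some unrelated line"))  # None
```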
The Tracking URL showed that the job threw a ClassNotFoundException: the class
org.apache.hadoop.hive.contrib.serde2.RegexSerDe
could not be found.
Fix:
Add the property below to $HIVE_INSTALL/conf/hive-site.xml. In the value, point the hive-contrib-*.jar path at the jar's actual location on your machine; look for it under $HIVE_INSTALL/lib.
<property>
  <name>hive.aux.jars.path</name>
  <value>file:///home/develop/hive-0.8.0/lib/hive-contrib-0.8.0.jar</value>
  <description>Added by tiger.zeng on 20120202. These JAR files are available to all users for all jobs</description>
</property>
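As an alternative to editing hive-site.xml and restarting, the jar can also be registered for the current session only with the standard ADD JAR command (a sketch using this machine's path; the setting is lost when the shell exits):

```sql
hive> ADD JAR /home/develop/hive-0.8.0/lib/hive-contrib-0.8.0.jar;
```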
Exit the hive shell and start it again; the query now runs normally.
The output:
hive> select time,clause from mylog2;
Total MapReduce jobs = 1
Launching Job 1 out of 1
Number of reduce tasks is set to 0 since there's no reduce operator
Starting Job = job_201202021642_0003, Tracking URL = http://hdp0:50030/jobdetails.jsp?jobid=job_201202021642_0003
Kill Command = /home/hadoop/hadoop-0.20.203.0/bin/hadoop job -Dmapred.job.tracker=hdp0:9001 -kill job_201202021642_0003
Hadoop job information for Stage-1: number of mappers: 1; number of reducers: 0
2012-02-02 16:58:43,327 Stage-1 map = 0%, reduce = 0%
2012-02-02 16:58:49,375 Stage-1 map = 100%, reduce = 0%
2012-02-02 16:58:55,437 Stage-1 map = 100%, reduce = 100%
Ended Job = job_201202021642_0003
MapReduce Jobs Launched:
Job 0: Map: 1 HDFS Read: 1286 HDFS Write: 564 SUCESS
Total MapReduce CPU Time Spent: 0 msec
OK
NULL NULL
NULL NULL
NULL NULL
NULL NULL
2012-01-17 00:14:19,301 org.apache.hadoop.conf.Configuration -0 [main] DEBUG org.apache.hadoop.conf.Configuration
NULL NULL
NULL NULL
NULL NULL
NULL NULL
NULL NULL
2012-01-17 00:14:19,314 org.apache.hadoop.util.NativeCodeLoader -13 [main] DEBUG org.apache.hadoop.util.NativeCodeLoader
2012-01-17 00:14:19,314 org.apache.hadoop.util.NativeCodeLoader -13 [main] INFO org.apache.hadoop.util.NativeCodeLoader
2012-01-17 00:14:19,316 org.apache.hadoop.io.compress.zlib.ZlibFactory -15 [main] INFO org.apache.hadoop.io.compress.zlib.ZlibFactory
NULL NULL
Time taken: 21.28 seconds
hive>
Reference:
Jasper Knulst, 2011-04-06 06:51:20:
Hi(ve),
I created a table like this:
create table testtable (veld1 STRING, veld2 STRING, veld3 STRING) ROW FORMAT
SERDE 'org.apache.hadoop.hive.contrib.serde2.RegexSerDe'
WITH SERDEPROPERTIES ("input.regex" =
"([a-z]{4}[0-9])þ([a-z]{4}[0-9])þ([a-z]{4}[0-9])") STORED AS TEXTFILE;
The table is OK, select * from testtable shows the contents of the
underlying HDFS file.
However when I invoke a MR job by select veld2 from testtable, the MR job
starts but I get mapper errors saying:
"Caused by: org.apache.hadoop.hive.ql.metadata.HiveException:
java.lang.ClassNotFoundException:
org.apache.hadoop.hive.contrib.serde2.RegexSerDe"
I already copied the hive serde jar (in my case hive-serde-0.7.0-CDH3B4.jar)
to $HADOOP_HOME/lib and restarted jobtracker/tasktrackers but that doesn't
help.
Cheers Jasper
Loren Siebert, 2011-04-06 06:58:39:
You need to tell Hive about the JAR. This is how I do it in hive-site.xml:
<property>
<name>hive.aux.jars.path</name>
<value>file:///usr/lib/hive/lib/hive-contrib-0.7.0-CDH3B4.jar</value>
<description>These JAR file are available to all users for all jobs</description>
</property>