
The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rwx------

2016-09-14 15:07:15   Author: MangoCool   Source: MangoCool

Running Spark in local mode on Windows 7 and writing a DataFrame into a Hive table fails with: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rwx------


Error details:

Exception in thread "main" java.lang.reflect.InvocationTargetException
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:258)
	at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:359)
	at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:263)
	at org.apache.spark.sql.hive.HiveSharedState.metadataHive$lzycompute(HiveSharedState.scala:39)
	at org.apache.spark.sql.hive.HiveSharedState.metadataHive(HiveSharedState.scala:38)
	at org.apache.spark.sql.hive.HiveSharedState.externalCatalog$lzycompute(HiveSharedState.scala:46)
	at org.apache.spark.sql.hive.HiveSharedState.externalCatalog(HiveSharedState.scala:45)
	at org.apache.spark.sql.hive.HiveSessionState.catalog$lzycompute(HiveSessionState.scala:50)
	at org.apache.spark.sql.hive.HiveSessionState.catalog(HiveSessionState.scala:48)
	at org.apache.spark.sql.hive.HiveSessionState$$anon$1.<init>(HiveSessionState.scala:63)
	at org.apache.spark.sql.hive.HiveSessionState.analyzer$lzycompute(HiveSessionState.scala:63)
	at org.apache.spark.sql.hive.HiveSessionState.analyzer(HiveSessionState.scala:62)
	at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:49)
	at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:64)
	at org.apache.spark.sql.SparkSession.createDataFrame(SparkSession.scala:542)
	at org.apache.spark.sql.SparkSession.createDataFrame(SparkSession.scala:302)
	at com.dtxy.xbdp.widget.HiveOutputWidget$.main(HiveOutputWidget.scala:101)
	at com.dtxy.xbdp.widget.HiveOutputWidget.main(HiveOutputWidget.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147)
Caused by: java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rwx------
	at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
	at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:171)
	... 27 more
Caused by: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rwx------
	at org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:612)
	at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:554)
	at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:508)
	... 28 more


Problem analysis:

Because Spark is running in local mode, the DataFrame is saved to the Hive table with:

df.write.mode(SaveMode.Overwrite).saveAsTable(tableName)
When this line executes, Spark creates a local default database and writes your DataFrame into the corresponding table (a minimal sketch of this setup follows the analysis below).

Hive's default scratch directory on the file system is /tmp/hive; it is created at runtime, and you can find it on the drive where your project lives.

This directory currently has permissions rwx------, so it lacks the write permission Hive requires.

Note: if your tableName specifies a database other than default, e.g. myDB.user, you will also get an error saying that database cannot be found; for local testing, just use the bare table name.
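
A minimal sketch of the scenario described above, assuming a local-mode SparkSession with Hive support and a made-up DataFrame and table name (the real code in HiveOutputWidget is not shown in this post):

import org.apache.spark.sql.{SaveMode, SparkSession}

object LocalHiveWriteSketch {
  def main(args: Array[String]): Unit = {
    // Local-mode session with Hive support; starting the Hive client is what
    // creates the /tmp/hive scratch directory and triggers the writability
    // check that fails above.
    val spark = SparkSession.builder()
      .appName("LocalHiveWriteSketch")
      .master("local[*]")
      .enableHiveSupport()
      .getOrCreate()

    import spark.implicits._

    // Hypothetical sample data standing in for the real DataFrame.
    val df = Seq((1, "alice"), (2, "bob")).toDF("id", "name")

    // Use a bare table name (no "myDB." prefix) for local testing, as noted
    // above; the table lands in the local default database.
    df.write.mode(SaveMode.Overwrite).saveAsTable("user")

    spark.stop()
  }
}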


Solution

1. Download a Hadoop distribution; I used hadoop-2.7.0.

2. Run the following in a cmd prompt:

E:\Program Files\hadoop-2.7.0\bin\winutils.exe chmod 777 D:\tmp\hive
Note: if there is no winutils.exe under HADOOP_HOME\bin, you can download it from GitHub; D:\ is the drive my project is on.
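
Beyond the chmod step, the JVM also has to be able to locate winutils.exe. A small sketch, assuming the same hadoop-2.7.0 path as in the command above; setting the hadoop.home.dir system property (or the HADOOP_HOME environment variable) before the SparkSession is created is one way to do that:

import org.apache.spark.sql.SparkSession

object WinutilsHomeSketch {
  def main(args: Array[String]): Unit = {
    // Assumption: hadoop-2.7.0 is unpacked at this path and contains
    // bin\winutils.exe. Hadoop's shell utilities consult hadoop.home.dir
    // (or HADOOP_HOME) to find winutils.exe on Windows.
    System.setProperty("hadoop.home.dir", "E:\\Program Files\\hadoop-2.7.0")

    val spark = SparkSession.builder()
      .appName("WinutilsHomeSketch")
      .master("local[*]")
      .enableHiveSupport()
      .getOrCreate()

    // ... write the DataFrame as in the sketch above ...

    spark.stop()
  }
}

To confirm the chmod took effect, running winutils.exe ls D:\tmp\hive should now show rwxrwxrwx on the directory.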


Reference: http://stackoverflow.com/questions/34196302/the-root-scratch-dir-tmp-hive-on-hdfs-should-be-writable-current-permissions

Tags: win sparkSession hive error
