Fixing a Spark initialization error on CentOS 7
2019-03-20 01:20:58 · Views: 213
Tags: fix, CentOS 7, initialization, Spark, error

While initializing Spark on Linux (CentOS 7), I hit "org.apache.spark.SparkException: Invalid Spark URL: spark://HeartbeatReceiver@VM_0_9_centos:34068". The full stack trace:

ERROR SparkContext:91 - Error initializing SparkContext.
org.apache.spark.SparkException: Invalid Spark URL: spark://HeartbeatReceiver@VM_0_9_centos:34068
 at org.apache.spark.rpc.RpcEndpointAddress$.apply(RpcEndpointAddress.scala:66)
 at org.apache.spark.rpc.netty.NettyRpcEnv.asyncSetupEndpointRefByURI(NettyRpcEnv.scala:134)
 at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:101)
 at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:109)
 at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:32)
 at org.apache.spark.executor.Executor.<init>(Executor.scala:178)
 at org.apache.spark.scheduler.local.LocalEndpoint.<init>(LocalSchedulerBackend.scala:59)
 at org.apache.spark.scheduler.local.LocalSchedulerBackend.start(LocalSchedulerBackend.scala:127)
 at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:178)
 at org.apache.spark.SparkContext.<init>(SparkContext.scala:501)
 at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
 at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
 at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
 at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
 at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
 at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
 at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
 at py4j.Gateway.invoke(Gateway.java:238)
 at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
 at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
 at py4j.GatewayConnection.run(GatewayConnection.java:238)
 at java.lang.Thread.run(Thread.java:748)

Solution: the machine hostname VM_0_9_centos contains underscores, which java.net.URI does not accept as a valid host, so Spark cannot parse the driver's RPC URL. Override the local hostname before starting Spark, either in spark-env.sh or in the current shell:

export SPARK_LOCAL_HOSTNAME=localhost

(Setting SPARK_LOCAL_IP=127.0.0.1 instead also works.) After this, the SparkContext initializes normally.
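To make the fix survive new shells, the override can be placed in Spark's environment script; a minimal sketch, assuming $SPARK_HOME points at your Spark installation (adjust the path as needed):

```shell
# Persist the hostname override so every Spark launch uses "localhost"
# instead of the underscore hostname that breaks the RPC URL.
export SPARK_LOCAL_HOSTNAME=localhost
# Alternatively, bind to the loopback address explicitly:
# export SPARK_LOCAL_IP=127.0.0.1

# To persist, append the export to $SPARK_HOME/conf/spark-env.sh, e.g.:
# echo 'export SPARK_LOCAL_HOSTNAME=localhost' >> "$SPARK_HOME/conf/spark-env.sh"
```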
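Since the traceback goes through py4j (i.e. PySpark), the same override can also be set from Python, as long as it happens before the SparkContext (and its JVM) is created; a hedged sketch, assuming pyspark is installed:

```python
import os

# Must be set before the Spark JVM starts: forces Spark to advertise
# "localhost" instead of the machine hostname (VM_0_9_centos), whose
# underscores make the driver RPC URL unparsable.
os.environ["SPARK_LOCAL_HOSTNAME"] = "localhost"

# Then create the context as usual (uncomment with pyspark installed):
# from pyspark import SparkContext
# sc = SparkContext("local[*]", "demo")
```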
