{"rsdb":{"rid":"309139","subhead":"","postdate":"0","aid":"224301","fid":"116","uid":"1","topic":"1","content":"
\n \n \n
\n \n

Spark version: 2.1.2 (bundled Scala version 2.11.8), Spark IDE version 4.7.

Create a WordCount project in Eclipse, set the Scala library container to 2.11.11, and import all of the jars under spark/jars into the project's referenced libraries.

The code:


import org.apache.log4j.Logger
import org.apache.log4j.Level
import org.apache.spark.{ SparkConf, SparkContext }
import org.apache.spark.rdd.RDD

object RunWordCount {
  def main(args: Array[String]): Unit = {
    // Silence Spark's logging and the console progress bar
    Logger.getLogger("org").setLevel(Level.OFF)
    System.setProperty("spark.ui.showConsoleProgress", "false")
    println("Starting RunWordCount")

    val sc = new SparkContext(new SparkConf().setAppName("wordCount").setMaster("local[1]"))
    //val sc = new SparkContext(new SparkConf().setAppName("wordCount").setMaster("local[1]").set("spark.testing.memory", "536870912"))

    println("Reading the text file...")
    val textFile = sc.textFile("data/LICENSE.txt")
    println("Building the RDD...")
    val countsRDD = textFile.flatMap(line => line.split(" "))
      .map(word => (word, 1))
      .reduceByKey(_ + _)
    println("Saving to a text file...")
    try {
      countsRDD.saveAsTextFile("data/output")
      println("Saved successfully")
    } catch {
      case e: Exception => println("The output directory already exists; delete it first")
    }
  }
}


Running it fails with the following error:


Exception in thread "main" java.lang.IllegalArgumentException: System memory 251396096 must be at least 471859200. Please increase heap size using the --driver-memory option or spark.driver.memory in Spark configuration.
    at org.apache.spark.memory.UnifiedMemoryManager$.getMaxMemory(UnifiedMemoryManager.scala:216)
    at org.apache.spark.memory.UnifiedMemoryManager$.apply(UnifiedMemoryManager.scala:198)
    at org.apache.spark.SparkEnv$.create(SparkEnv.scala:330)
    at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:174)
    at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:257)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:432)
    at RunWordCount$.main(RunWordCount.scala:17)
    at RunWordCount.main(RunWordCount.scala)

This is caused by the JVM not being given enough memory.
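For context, 471859200 bytes is 450 MB: in Spark 2.x the unified memory manager reserves roughly 300 MB of system memory and requires the available heap (or spark.testing.memory, when it is set) to be at least 1.5 times that reservation. The snippet below is only a back-of-the-envelope sketch of that check, not Spark's actual source:

// Rough arithmetic behind the error threshold
// (assumes Spark 2.x's 300 MB reserved system memory).
val reservedMemory  = 300L * 1024 * 1024            // 314572800 bytes
val minSystemMemory = (reservedMemory * 1.5).toLong // 471859200 bytes = 450 MB
// The JVM here only offered 251396096 bytes (~240 MB), hence the exception.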

Reference: http://blog.csdn.net/shenshendeai/article/details/54631237

The fix is to set spark.testing.memory in the conf.

Through experimentation, there are two places where it can be set:

1. In your own source code, after creating the conf, add:

val conf = new SparkConf().setAppName("word count")
conf.set("spark.testing.memory", "2147480000") // any value larger than 512 MB works
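Wired into the WordCount program above, this is effectively the commented-out SparkContext line; a minimal sketch (the value is the one used here, but anything at or above 512 MB works):

// Same fix applied to the WordCount driver above.
val conf = new SparkConf()
  .setAppName("wordCount")
  .setMaster("local[1]")
  .set("spark.testing.memory", "2147480000") // roughly 2 GB
val sc = new SparkContext(conf)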

2. In Eclipse's Run Configuration, on the Arguments tab there is a VM arguments field; add the following line there (again, any value larger than 512 MB will do):

-Dspark.testing.memory=1073741824

Other parameters can also be set dynamically here, for example -Dspark.master=spark://hostname:7077.
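This works because SparkConf picks up any JVM system property starting with spark. by default, so everything passed with -D in the VM arguments field is visible to the application. For example, the field could carry both settings at once (the master URL is just a placeholder):

-Dspark.testing.memory=1073741824 -Dspark.master=spark://hostname:7077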

Run it again and this error no longer appears.

I went with the first method, switching to the commented-out line of code, and the program now runs without error.