Copyright notice: This is the blogger's original article and may not be reproduced without permission. https://blog.csdn.net/u012922838/article/details/53889142
1. Installing Spark
For installation, you can follow the tutorial published by the Xiamen University Database Lab: Spark Quick Start Guide - Spark Installation and Basic Usage.
2. Spark Python
See the official documentation: Spark Programming Guide.
Every Spark program starts with a SparkContext, and initializing a SparkContext requires a SparkConf object. So the first lines of any Spark Python program should be:
from pyspark import SparkContext, SparkConf
conf = SparkConf().setAppName(appName).setMaster(master)
sc = SparkContext(conf=conf)
The appName parameter is the application name that will be shown in the cluster UI. master can be the URL of a Spark, Mesos, or YARN cluster, or "local". When actually running a program on a cluster, we should not hard-code master in the program; instead, deploy the application with spark-submit and let it pass in the master. For local testing, use the "local" value.
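As a minimal sketch of this pattern (the application name "MyApp" and the summing job are placeholders, and a local Spark installation is assumed), a script can leave master unset in code so that the `--master` flag of spark-submit decides where it runs:

```python
from pyspark import SparkContext, SparkConf

# No setMaster() here: when launched through spark-submit,
# its --master flag supplies the master URL.
conf = SparkConf().setAppName("MyApp")
sc = SparkContext(conf=conf)

# A trivial job to confirm the context works: sum the integers 1..100.
total = sc.parallelize(range(1, 101)).sum()
print(total)

sc.stop()
```

You would then test locally with `spark-submit --master "local[2]" my_app.py` and deploy to a cluster by changing only the `--master` argument, not the code.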
In the PySpark shell, a special interpreter-aware SparkContext is already created for you, in the variable called sc. Making your own SparkContext will not work.
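In other words, inside the interactive shell you use the pre-created `sc` directly. A sketch of such a session (assuming the pyspark shell was started locally; the sample data is made up):

```python
# Inside the pyspark shell, `sc` already exists; do not construct another one.
rdd = sc.parallelize([1, 2, 3, 4])
rdd.map(lambda x: x * x).collect()  # squares each element and returns a list
```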