Getting Started with Spark in Python
2019-03-19 13:10:40
Tags: Spark, getting started, python
Copyright notice: this is the blogger's original article and may not be reposted without the blogger's permission. https://blog.csdn.net/u012922838/article/details/53889142

1. Installing Spark

For installation, you can follow the tutorial published by the Xiamen University Database Lab: Spark Quick Start Guide - Spark Installation and Basic Usage.

2. Spark with Python

See the official documentation: Spark Programming Guide.

Every Spark program starts from a SparkContext, and initializing a SparkContext requires a SparkConf object. So the first lines of any Spark Python program should be:

from pyspark import SparkContext, SparkConf
conf = SparkConf().setAppName(appName).setMaster(master)
sc = SparkContext(conf=conf)

The appName parameter is the application name that will be shown in the cluster UI. master can be the URL of a Spark, Mesos, or YARN cluster, or "local". When actually running a program on a cluster, we should not hard-code master; instead, deploy the application with spark-submit and receive the master as a parameter there. For local testing, use "local".

In the PySpark shell, a special interpreter-aware SparkContext is already created for you, in the variable called sc. Making your own SparkContext will not work.
