
Viewing a Spark Process's JVM Configuration and Memory Usage


Checking a running Spark process's JVM configuration and its per-generation memory usage is a common way to monitor jobs in production. The steps are:


1. Find the PID with the ps command

  ps -ef | grep 5661

You can locate the PID by grepping for a distinctive string in the process's command line; here 5661 is the spark.ui.port this job was submitted with (it appears again in the jinfo output below). See the one-liner sketched below.
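For example, a one-liner that prints only the PID could look like this (a minimal sketch: it assumes the job was submitted with spark.ui.port=5661, as in this article, and that exactly one matching process is running):

  ps -ef | grep org.apache.spark.deploy.SparkSubmit | grep 5661 | grep -v grep | awk '{print $2}'

The final awk step prints the second column of the ps output, which is the PID.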




2. Query the process's JVM settings with jinfo

  jinfo 105007

This prints the detailed JVM configuration:

  Attaching to process ID 105007, please wait...
  Debugger attached successfully.
  Server compiler detected.
  JVM version is 24.65-b04
  Java System Properties:
  spark.local.dir=/diskb/sparktmp,/diskc/sparktmp,/diskd/sparktmp,/diske/sparktmp,/diskf/sparktmp,/diskg/sparktmp
  java.runtime.name=Java(TM) SE Runtime Environment
  java.vm.version=24.65-b04
  sun.boot.library.path=/usr/java/jdk1.7.0_67-cloudera/jre/lib/amd64
  java.vendor.url=http://java.oracle.com/
  java.vm.vendor=Oracle Corporation
  path.separator=:
  file.encoding.pkg=sun.io
  java.vm.name=Java HotSpot(TM) 64-Bit Server VM
  sun.os.patch.level=unknown
  sun.java.launcher=SUN_STANDARD
  user.country=CN
  user.dir=/opt/bin/spark_dev_job
  java.vm.specification.name=Java Virtual Machine Specification
  java.runtime.version=1.7.0_67-b01
  java.awt.graphicsenv=sun.awt.X11GraphicsEnvironment
  SPARK_SUBMIT=true
  os.arch=amd64
  java.endorsed.dirs=/usr/java/jdk1.7.0_67-cloudera/jre/lib/endorsed
  spark.executor.memory=24g
  line.separator=
  java.io.tmpdir=/tmp
  java.vm.specification.vendor=Oracle Corporation
  os.name=Linux
  spark.driver.memory=15g
  spark.master=spark://10.130.2.220:7077
  sun.jnu.encoding=UTF-8
  java.library.path=:/opt/cloudera/parcels/CDH/lib/hadoop/lib/native:/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
  sun.nio.ch.bugLevel=
  java.class.version=51.0
  java.specification.name=Java Platform API Specification
  sun.management.compiler=HotSpot 64-Bit Tiered Compilers
  spark.submit.deployMode=client
  spark.executor.extraJavaOptions=-XX:PermSize=8m -XX:+PrintGCDetails -XX:+PrintGCTimeStamps
  os.version=2.6.32-573.8.1.el6.x86_64
  user.home=/root
  user.timezone=PRC
  java.awt.printerjob=sun.print.PSPrinterJob
  file.encoding=UTF-8
  java.specification.version=1.7
  spark.app.name=com.hexun.streaming.NewsTopNRealRankOffsetRise
  spark.eventLog.enabled=true
  user.name=root
  java.class.path=/opt/cloudera/parcels/CDH/lib/hadoop/lib/snappy-java-1.0.4.1.jar:/opt/modules/spark-1.6.1-bin-hadoop2.6/conf/:/opt/modules/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar:/opt/modules/spark-1.6.1-bin-hadoop2.6/lib/datanucleus-core-3.2.10.jar:/opt/modules/spark-1.6.1-bin-hadoop2.6/lib/datanucleus-api-jdo-3.2.6.jar:/opt/modules/spark-1.6.1-bin-hadoop2.6/lib/datanucleus-rdbms-3.2.9.jar:/etc/hadoop/conf/
  java.vm.specification.version=1.7
  sun.arch.data.model=64
  sun.java.command=org.apache.spark.deploy.SparkSubmit --master spark://10.130.2.220:7077 --conf spark.driver.memory=15g --conf spark.executor.extraJavaOptions=-XX:PermSize=8m -XX:+PrintGCDetails -XX:+PrintGCTimeStamps --conf spark.ui.port=5661 --class com.hexun.streaming.NewsTopNRealRankOffsetRise --executor-memory 24g --total-executor-cores 24 --jars /opt/bin/sparkJars/kafka_2.10-0.8.2.1.jar,/opt/bin/sparkJars/spark-streaming-kafka_2.10-1.6.1.jar,/opt/bin/sparkJars/metrics-core-2.2.0.jar,/opt/bin/sparkJars/mysql-connector-java-5.1.26-bin.jar NewsTopNRealRankOffsetRise.jar
  java.home=/usr/java/jdk1.7.0_67-cloudera/jre
  user.language=zh
  java.specification.vendor=Oracle Corporation
  awt.toolkit=sun.awt.X11.XToolkit
  spark.ui.port=5661
  java.vm.info=mixed mode
  java.version=1.7.0_67
  java.ext.dirs=/usr/java/jdk1.7.0_67-cloudera/jre/lib/ext:/usr/java/packages/lib/ext
  sun.boot.class.path=/usr/java/jdk1.7.0_67-cloudera/jre/lib/resources.jar:/usr/java/jdk1.7.0_67-cloudera/jre/lib/rt.jar:/usr/java/jdk1.7.0_67-cloudera/jre/lib/sunrsasign.jar:/usr/java/jdk1.7.0_67-cloudera/jre/lib/jsse.jar:/usr/java/jdk1.7.0_67-cloudera/jre/lib/jce.jar:/usr/java/jdk1.7.0_67-cloudera/jre/lib/charsets.jar:/usr/java/jdk1.7.0_67-cloudera/jre/lib/jfr.jar:/usr/java/jdk1.7.0_67-cloudera/jre/classes
  java.vendor=Oracle Corporation
  file.separator=/
  spark.cores.max=24
  spark.eventLog.dir=hdfs://nameservice1/spark-log
  java.vendor.url.bug=http://bugreport.sun.com/bugreport/
  sun.io.unicode.encoding=UnicodeLittle
  sun.cpu.endian=little
  spark.jars=file:/opt/bin/sparkJars/kafka_2.10-0.8.2.1.jar,file:/opt/bin/sparkJars/spark-streaming-kafka_2.10-1.6.1.jar,file:/opt/bin/sparkJars/metrics-core-2.2.0.jar,file:/opt/bin/sparkJars/mysql-connector-java-5.1.26-bin.jar,file:/opt/bin/spark_dev_job/NewsTopNRealRankOffsetRise.jar
  sun.cpu.isalist=
  VM Flags:
  -Xms15g -Xmx15g -XX:MaxPermSize=256m
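If only one or two settings are needed rather than the full dump, jinfo can also be asked for specific pieces of information. A small sketch (options as provided by the JDK 7/8 jinfo tool; the flag name MaxPermSize is just an example taken from the VM flags above):

  jinfo -flag MaxPermSize 105007   # print a single VM flag
  jinfo -flags 105007              # print only the VM flags
  jinfo -sysprops 105007           # print only the Java system properties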


3. Check per-generation heap usage with jmap

  jmap -heap 105007

This shows the Java process's heap usage in detail, including the young and old generations:

  Attaching to process ID 105007, please wait...
  Debugger attached successfully.
  Server compiler detected.
  JVM version is 24.65-b04
  using thread-local object allocation.
  Parallel GC with 18 thread(s)
  Heap Configuration:
     MinHeapFreeRatio = 0
     MaxHeapFreeRatio = 100
     MaxHeapSize      = 16106127360 (15360.0MB)
     NewSize          = 1310720 (1.25MB)
     MaxNewSize       = 17592186044415 MB
     OldSize          = 5439488 (5.1875MB)
     NewRatio         = 2
     SurvivorRatio    = 8
     PermSize         = 21757952 (20.75MB)
     MaxPermSize      = 268435456 (256.0MB)
     G1HeapRegionSize = 0 (0.0MB)
  Heap Usage:
  PS Young Generation
  Eden Space:
     capacity = 4945084416 (4716.0MB)
     used     = 2674205152 (2550.320770263672MB)
     free     = 2270879264 (2165.679229736328MB)
     54.07804856369109% used
  From Space:
     capacity = 217579520 (207.5MB)
     used     = 37486624 (35.750030517578125MB)
     free     = 180092896 (171.74996948242188MB)
     17.22893036991717% used
  To Space:
     capacity = 206045184 (196.5MB)
     used     = 0 (0.0MB)
     free     = 206045184 (196.5MB)
     0.0% used
  PS Old Generation
     capacity = 10737418240 (10240.0MB)
     used     = 7431666880 (7087.389831542969MB)
     free     = 3305751360 (3152.6101684570312MB)
     69.2127913236618% used
  PS Perm Generation
     capacity = 268435456 (256.0MB)
     used     = 128212824 (122.27327728271484MB)
     free     = 140222632 (133.72672271728516MB)
     47.762998938560486% used
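Note that MaxHeapSize (15360.0MB) matches the -Xmx15g reported by jinfo, and the PS Perm Generation capacity matches -XX:MaxPermSize=256m. Since jmap -heap is a point-in-time snapshot, re-running it periodically is a simple way to watch the old generation fill up, for example (a sketch that assumes the standard Linux watch utility is installed):

  watch -n 30 "jmap -heap 105007 | grep -A 4 'PS Old Generation'"

This refreshes the old generation's capacity/used/free figures every 30 seconds.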
