Whenever I try to collect my RDD, I get the error below. It started after I installed Java 10.1, so of course I uninstalled and reinstalled it; same error. I then installed Java 9.04 instead; same error. I then removed and reinstalled Python 2.7.14, Apache Spark 2.3.0, and Hadoop 2.7; same error. Does anyone have any other ideas why I keep getting this error?

>>> from operator import add
>>> from pyspark import SparkConf, SparkContext
>>> import string
>>> import sys
>>> import re
>>>
>>> sc = SparkContext(appName="NEW")
2018-04-21 22:28:45 WARN Utils:66 - Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
>>> rdd = sc.parallelize(xrange(1, 10))
>>> new = rdd.collect()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "C:\spark\spark-2.3.0-bin-hadoop2.7\python\pyspark\rdd.py", line 824, in collect
    port = self.ctx._jvm.PythonRDD.collectAndServe(self._jrdd.rdd())
  File "C:\spark\spark-2.3.0-bin-hadoop2.7\python\lib\py4j-0.10.6-src.zip\py4j\java_gateway.py", line 1160, in __call__
  File "C:\spark\spark-2.3.0-bin-hadoop2.7\python\pyspark\sql\utils.py", line 63, in deco
    return f(*a, **kw)
  File "C:\spark\spark-2.3.0-bin-hadoop2.7\python\lib\py4j-0.10.6-src.zip\py4j\protocol.py", line 320, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling z:org.apache.spark.api.python.PythonRDD.collectAndServe.
: java.lang.IllegalArgumentException
	at org.apache.xbean.asm5.ClassReader.<init>(Unknown Source)
	at org.apache.xbean.asm5.ClassReader.<init>(Unknown Source)
	at org.apache.xbean.asm5.ClassReader.<init>(Unknown Source)
	at org.apache.spark.util.ClosureCleaner$.getClassReader(ClosureCleaner.scala:46)
	at org.apache.spark.util.FieldAccessFinder$$anon$3$$anonfun$visitMethodInsn$2.apply(ClosureCleaner.scala:449)
java.lang.IllegalArgumentException at org.apache
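For context: the trace fails inside ASM5's `ClassReader`, which Spark 2.3's `ClosureCleaner` uses to inspect bytecode. ASM5 raises a bare `IllegalArgumentException` when the class-file format is newer than it understands, which is what happens when Spark 2.3.x (built for Java 8) runs on a Java 9 or 10 JVM. A quick sanity check of the installed Java version can be sketched like this; `spark_23_supports` is a hypothetical helper, not part of Spark or py4j:

```python
import re

def spark_23_supports(java_version):
    """Return True if this Java version string should work with Spark 2.3.x.

    Handles both the legacy scheme ("1.8.0_171") and the post-Java-9
    scheme ("9.0.4", "10.0.1"). Spark 2.3 is only built for Java 8.
    """
    m = re.match(r"(\d+)(?:\.(\d+))?", java_version)
    if not m:
        return False
    major = int(m.group(1))
    if major == 1 and m.group(2):  # legacy "1.x" scheme: real major is the minor
        major = int(m.group(2))
    return major == 8

print(spark_23_supports("1.8.0_171"))  # True  - Java 8, fine
print(spark_23_supports("9.0.4"))      # False - too new for Spark 2.3
print(spark_23_supports("10.0.1"))     # False - too new for Spark 2.3
```

The version string to feed it is whatever `java -version` reports, or the JVM actually launched by PySpark via `sc._jvm.System.getProperty("java.version")`; the two can differ if `JAVA_HOME` and `PATH` point at different installs, which is worth checking after repeated install/uninstall cycles like the ones described above.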
繁星點(diǎn)點(diǎn)滴滴
2019-07-13 16:16:11