java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)Lscala/collection/mutable/ArrayOps;


I. Description

  After upgrading from the HDP Spark build 2.0.2.2.5.6.0-40 to 2.4.4, running spark-shell fails with the following error:

[Loaded java.util.IdentityHashMap$KeySet from /usr/lib/jvm/java-8-openjdk-amd64/jre/lib/rt.jar]
java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)Lscala/collection/mutable/ArrayOps;
	at org.apache.spark.deploy.SparkSubmitUtils$.parseSparkConfProperty(SparkSubmit.scala:1331)
	at org.apache.spark.deploy.SparkSubmitArguments.handle(SparkSubmitArguments.scala:448)
	at org.apache.spark.launcher.SparkSubmitOptionParser.parse(SparkSubmitOptionParser.java:163)
	at org.apache.spark.deploy.SparkSubmitArguments.<init>(SparkSubmitArguments.scala:109)
	at org.apache.spark.deploy.SparkSubmit$$anon$2$$anon$1.<init>(SparkSubmit.scala:907)
	at org.apache.spark.deploy.SparkSubmit$$anon$2.parseArguments(SparkSubmit.scala:907)
	at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:81)
	at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:920)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:929)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
[Loaded java.util.IdentityHashMap$IdentityHashMapIterator from /usr/lib/jvm/java-8-openjdk-amd64/jre/lib/rt.jar]
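
  The [Loaded ...] lines in the output above come from running the JVM with -verbose:class. The same flag is the quickest way to see which jar scala.Predef$ is actually loaded from, which is the first thing to check for this kind of NoSuchMethodError. A minimal sketch (assumes SPARK_SUBMIT_OPTS is picked up by the launcher in client mode, which is the usual behaviour):

SPARK_SUBMIT_OPTS="-verbose:class" spark-shell 2>&1 | grep "Loaded scala.Predef"
# prints something like: [Loaded scala.Predef$ from file:/.../jars/scala-library-x.y.z.jar]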

II. Analysis

  1. Go into the Spark 2.0.2.2.5.6.0-40 installation directory and grep for scala.collection.mutable.ArrayOps to find out which jar the class comes from:

sdev@n-adx-hadoop-client-3:/usr/hdp/2.5.6.0-40/spark2$ grep -rni "scala.collection.mutable.ArrayOps" .
Binary file ./jars/scala-library-2.11.8.jar matches
sdev@n-adx-hadoop-client-3:/usr/hdp/2.5.6.0-40/spark2$ 

  As shown above, the class comes from ./jars/scala-library-2.11.8.jar.
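
  To double-check that the grep hit really corresponds to class files inside that jar, the jar contents can be listed directly. A small sketch (jar tf works the same way if unzip is not installed):

unzip -l ./jars/scala-library-2.11.8.jar | grep "scala/collection/mutable/ArrayOps"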

  2. Go into the Spark 2.4.4 installation directory and grep for scala.collection.mutable.ArrayOps to check whether the class exists there as well:

sdev@n-adx-hadoop-client-3:~/liujichao/software/spark-2.4.4-bin-hadoop2.7$ grep -rni "scala.collection.mutable.ArrayOps" .
Binary file ./jars/scala-library-2.12.10.jar matches
sdev@n-adx-hadoop-client-3:~/liujichao/software/spark-2.4.4-bin-hadoop2.7$ 

  As shown above, the class exists there too, but the Scala version is 2.12.10.
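
  The class name being present is not the whole story; the NoSuchMethodError is about the exact descriptor of Predef.refArrayOps. A hedged sketch for comparing the two libraries with javap (-s prints method descriptors; if the 2.12 jar no longer exposes a refArrayOps variant returning scala/collection/mutable/ArrayOps, that mismatch is exactly what the error message is complaining about):

# Compare the descriptor of Predef.refArrayOps in the two scala-library jars (sketch)
javap -s -cp /usr/hdp/2.5.6.0-40/spark2/jars/scala-library-2.11.8.jar 'scala.Predef$' | grep -A1 refArrayOps
javap -s -cp ./jars/scala-library-2.12.10.jar 'scala.Predef$' | grep -A1 refArrayOps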

  3. Delete the following jars from spark2.4.4/jars (a command sketch follows the two lists below):

sdev@n-adx-hadoop-client-3:~/liujichao/software/spark-2.4.4-bin-hadoop2.7/jars$ ll | grep scala
-rw-r--r--  1 sdev sdev   515645 Mar  8 13:29 jackson-module-scala_2.11-2.6.7.1.jar
-rw-r--r--  1 sdev sdev   614765 Mar  8 13:29 json4s-scalap_2.11-3.5.3.jar
-rw-r--r--  1 sdev sdev 15612191 Mar  8 13:29 scala-compiler-2.11.12.jar
-rw-r--r--  1 sdev sdev  5749423 Mar  8 13:29 scala-library-2.11.12.jar
-rw-r--r--  1 sdev sdev   471925 Mar  8 13:29 scala-parser-combinators_2.11-1.1.0.jar
-rw-r--r--  1 sdev sdev  4623075 Mar  8 13:29 scala-reflect-2.11.12.jar
-rw-r--r--  1 sdev sdev   671138 Mar  8 13:29 scala-xml_2.11-1.0.5.jar

  Then copy these jars from /usr/hdp/2.5.6.0-40/spark2/jars into spark2.4.4/jars:

-rw-r--r--  1 sdev sdev   515604 Mar  8 13:22 jackson-module-scala_2.11-2.6.5.jar
-rw-r--r--  1 sdev sdev 15487351 Mar  8 13:25 scala-compiler-2.11.8.jar
-rw-r--r--  1 sdev sdev  5744974 Mar  8 13:25 scala-library-2.11.8.jar
-rw-r--r--  1 sdev sdev   802818 Mar  8 13:25 scalap-2.11.8.jar
-rw-r--r--  1 sdev sdev   423753 Mar  8 13:26 scala-parser-combinators_2.11-1.0.4.jar
-rw-r--r--  1 sdev sdev  4573750 Mar  8 13:26 scala-reflect-2.11.8.jar
-rw-r--r--  1 sdev sdev   648678 Mar  8 13:26 scala-xml_2.11-1.0.2.jar
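
  Expressed as commands, the swap looks roughly like this (a sketch for this environment; the old jars are moved into a backup directory, named jars.bak only in this sketch, instead of being deleted, which makes the restore in step 4 easy):

cd ~/liujichao/software/spark-2.4.4-bin-hadoop2.7/jars
mkdir -p ../jars.bak
# move the 2.4.4 Scala-related jars aside
mv jackson-module-scala_2.11-2.6.7.1.jar json4s-scalap_2.11-3.5.3.jar \
   scala-compiler-2.11.12.jar scala-library-2.11.12.jar \
   scala-parser-combinators_2.11-1.1.0.jar scala-reflect-2.11.12.jar \
   scala-xml_2.11-1.0.5.jar ../jars.bak/
# copy in the jars from the old HDP install
cp /usr/hdp/2.5.6.0-40/spark2/jars/{jackson-module-scala_2.11-2.6.5,scala-compiler-2.11.8,scala-library-2.11.8,scalap-2.11.8,scala-parser-combinators_2.11-1.0.4,scala-reflect-2.11.8,scala-xml_2.11-1.0.2}.jar .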

  Running spark-shell again now fails with the following error:

Exception in thread "main" java.lang.NoClassDefFoundError: scala/tools/nsc/interpreter/SplashReader$
	at org.apache.spark.repl.SparkILoop$$anonfun$process$1.org$apache$spark$repl$SparkILoop$$anonfun$$preLoop$1(SparkILoop.scala:188)
	at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$startup$1$1.apply(SparkILoop.scala:249)
	at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$startup$1$1.apply(SparkILoop.scala:247)
	at org.apache.spark.repl.SparkILoop$$anonfun$process$1.withSuppressedSettings$1(SparkILoop.scala:235)
	at org.apache.spark.repl.SparkILoop$$anonfun$process$1.startup$1(SparkILoop.scala:247)
	at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:282)
	at org.apache.spark.repl.SparkILoop.runClosure(SparkILoop.scala:159)
	at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:182)
	at org.apache.spark.repl.Main$.doMain(Main.scala:78)
	at org.apache.spark.repl.Main$.main(Main.scala:58)
	at org.apache.spark.repl.Main.main(Main.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
	at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:845)
	at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:161)
	at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:184)
	at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
	at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:920)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:929)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: scala.tools.nsc.interpreter.SplashReader$
	at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	... 23 more
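
  This NoClassDefFoundError means the Spark 2.4.4 REPL references scala.tools.nsc.interpreter.SplashReader, which the downgraded scala-compiler-2.11.8.jar evidently does not provide; the 2.4.4 REPL was built against a newer 2.11.x compiler. A quick check (sketch; jars.bak is the backup directory from the step 3 sketch):

unzip -l /usr/hdp/2.5.6.0-40/spark2/jars/scala-compiler-2.11.8.jar | grep -c SplashReader   # expected 0, given the error above
unzip -l ../jars.bak/scala-compiler-2.11.12.jar | grep -c SplashReader                      # expected non-zero, since 2.11.12 works in step 4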

  4. Restore the Scala jars in spark2.4.4/jars to version 2.11.12 and run spark-shell again. The Scala class and method errors are gone, but a different error appears:

sdev@n-adx-hadoop-client-3:~/liujichao$ spark-shell
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
20/03/08 13:36:16 ERROR SparkContext: Error initializing SparkContext.
java.lang.IllegalArgumentException: java.net.UnknownHostException: opera
	at org.apache.hadoop.security.SecurityUtil.buildTokenService(SecurityUtil.java:378)
	at org.apache.hadoop.hdfs.NameNodeProxies.createNonHAProxy(NameNodeProxies.java:310)
	at org.apache.hadoop.hdfs.NameNodeProxies.createProxy(NameNodeProxies.java:176)
	at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:678)
	at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:619)
	at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:149)
	at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2669)
	at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:94)
	at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2703)
	at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2685)
	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:373)
	at org.apache.spark.util.Utils$.getHadoopFileSystem(Utils.scala:1866)
	at org.apache.spark.scheduler.EventLoggingListener.<init>(EventLoggingListener.scala:71)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:521)
	at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2520)
	at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:935)
	at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:926)
	at scala.Option.getOrElse(Option.scala:121)
	at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:926)
	at org.apache.spark.repl.Main$.createSparkSession(Main.scala:106)
	at $line3.$read$$iw$$iw.<init>(<console>:15)
	at $line3.$read$$iw.<init>(<console>:43)
	at $line3.$read.<init>(<console>:45)
	at $line3.$read$.<init>(<console>:49)
	at $line3.$read$.<clinit>(<console>)
	at $line3.$eval$.$print$lzycompute(<console>:7)
	at $line3.$eval$.$print(<console>:6)
	at $line3.$eval.$print(<console>)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:793)
	at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1054)
	at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:645)
	at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:644)
	at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
	at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
	at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:644)
	at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:576)
	at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:572)
	at scala.tools.nsc.interpreter.IMain$$anonfun$quietRun$1.apply(IMain.scala:231)
	at scala.tools.nsc.interpreter.IMain$$anonfun$quietRun$1.apply(IMain.scala:231)
	at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:221)
	at scala.tools.nsc.interpreter.IMain.quietRun(IMain.scala:231)
	at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1$$anonfun$apply$mcV$sp$1.apply(SparkILoop.scala:109)
	at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1$$anonfun$apply$mcV$sp$1.apply(SparkILoop.scala:109)
	at scala.collection.immutable.List.foreach(List.scala:392)
	at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(SparkILoop.scala:109)
	at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:109)
	at org.apache.spark.repl.SparkILoop$$anonfun$initializeSpark$1.apply(SparkILoop.scala:109)
	at scala.tools.nsc.interpreter.ILoop.savingReplayStack(ILoop.scala:91)
	at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:108)
	at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$org$apache$spark$repl$SparkILoop$$anonfun$$loopPostInit$1$1.apply$mcV$sp(SparkILoop.scala:211)
	at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$org$apache$spark$repl$SparkILoop$$anonfun$$loopPostInit$1$1.apply(SparkILoop.scala:199)
	at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$org$apache$spark$repl$SparkILoop$$anonfun$$loopPostInit$1$1.apply(SparkILoop.scala:199)
	at scala.tools.nsc.interpreter.ILoop$$anonfun$mumly$1.apply(ILoop.scala:189)
	at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:221)
	at scala.tools.nsc.interpreter.ILoop.mumly(ILoop.scala:186)
	at org.apache.spark.repl.SparkILoop$$anonfun$process$1.org$apache$spark$repl$SparkILoop$$anonfun$$loopPostInit$1(SparkILoop.scala:199)
	at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$startup$1$1.apply(SparkILoop.scala:267)
	at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$startup$1$1.apply(SparkILoop.scala:247)
	at org.apache.spark.repl.SparkILoop$$anonfun$process$1.withSuppressedSettings$1(SparkILoop.scala:235)
	at org.apache.spark.repl.SparkILoop$$anonfun$process$1.startup$1(SparkILoop.scala:247)
	at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:282)
	at org.apache.spark.repl.SparkILoop.runClosure(SparkILoop.scala:159)
	at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:182)
	at org.apache.spark.repl.Main$.doMain(Main.scala:78)
	at org.apache.spark.repl.Main$.main(Main.scala:58)
	at org.apache.spark.repl.Main.main(Main.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
	at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:845)
	at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:161)
	at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:184)
	at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
	at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:920)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:929)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.net.UnknownHostException: opera
	... 82 more
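
  This last failure is no longer a Scala problem: SparkContext is setting up the event-logging listener and cannot resolve the filesystem host or nameservice "opera". A hedged sketch for finding where that name comes from (config locations assume a typical HDP client; adjust to the actual environment):

grep -rn "opera" $SPARK_HOME/conf /etc/hadoop/conf 2>/dev/null
# typical properties to inspect: spark.eventLog.dir, spark.history.fs.logDirectory,
# fs.defaultFS (core-site.xml), dfs.nameservices and dfs.ha.namenodes.* (hdfs-site.xml)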

  5. In summary: when you see Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)Lscala/collection/mutable/ArrayOps;, check the Scala version. The scala-* jars on the classpath must belong to the same Scala binary version that the Spark build was compiled against.
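
  A quick way to perform that check is to make sure every Scala artifact under jars/ belongs to one binary version, for example:

cd ~/liujichao/software/spark-2.4.4-bin-hadoop2.7/jars
ls scala-library-*.jar scala-compiler-*.jar scala-reflect-*.jar
ls | grep -oE '_2\.1[123]' | sort | uniq -c   # a mix of _2.11 and _2.12 counts points to the mismatch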


