CDAP-13768

StandaloneTestSuite fails when running WorkflowTest

    Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Duplicate
    • Affects Version/s: 5.0.0
    • Fix Version/s: 5.1.0
    • Component/s: None

      Description

      This test suite fails when run from the cdap-integration-test repo:

      mvn clean test -P standalone-test 

      The error is that the WikipediaPipelineWorkflow fails: its SparkWikipediaClustering-LDA Spark program throws a NoSuchMethodError:

      2018-07-16 14:37:21,697 - ERROR [SparkRunnerSparkWikipediaClustering-LDA:c.c.c.i.a.r.ProgramControllerServiceAdapter$1@97] - Spark Program 'SparkWikipediaClustering-LDA' failed.
      java.util.concurrent.ExecutionException: java.lang.NoSuchMethodError: scala.Predef$.$conforms()Lscala/Predef$$less$colon$less;
      	at com.google.common.util.concurrent.AbstractFuture$Sync.getValue(AbstractFuture.java:294) ~[guava-13.0.1.jar:na]
      	at com.google.common.util.concurrent.AbstractFuture$Sync.get(AbstractFuture.java:281) ~[guava-13.0.1.jar:na]
      	at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:116) ~[guava-13.0.1.jar:na]
      	at co.cask.cdap.app.runtime.spark.SparkRuntimeService.run(SparkRuntimeService.java:347) ~[cdap-spark-core2_2.11-5.0.0-SNAPSHOT.jar:na]
      	at com.google.common.util.concurrent.AbstractExecutionThreadService$1$1.run(AbstractExecutionThreadService.java:52) ~[guava-13.0.1.jar:na]
      	at co.cask.cdap.app.runtime.spark.SparkRuntimeService$5$1.run(SparkRuntimeService.java:405) [cdap-spark-core2_2.11-5.0.0-SNAPSHOT.jar:na]
      	at java.lang.Thread.run(Thread.java:748) [na:1.8.0_151]
      Caused by: java.lang.NoSuchMethodError: scala.Predef$.$conforms()Lscala/Predef$$less$colon$less;
      	at org.apache.spark.util.Utils$.getDefaultPropertiesFile(Utils.scala:2086) ~[spark-core_2.11-2.1.0.jar:2.1.0]
      	at org.apache.spark.deploy.SparkSubmitArguments$$anonfun$mergeDefaultSparkProperties$1.apply(SparkSubmitArguments.scala:118) ~[spark-core_2.11-2.1.0.jar:2.1.0]
      	at org.apache.spark.deploy.SparkSubmitArguments$$anonfun$mergeDefaultSparkProperties$1.apply(SparkSubmitArguments.scala:118) ~[spark-core_2.11-2.1.0.jar:2.1.0]
      	at scala.Option.getOrElse(Option.scala:120) ~[scala-library-2.10.4.jar:na]
      	at org.apache.spark.deploy.SparkSubmitArguments.mergeDefaultSparkProperties(SparkSubmitArguments.scala:118) ~[spark-core_2.11-2.1.0.jar:2.1.0]
      	at org.apache.spark.deploy.SparkSubmitArguments.<init>(SparkSubmitArguments.scala:104) ~[spark-core_2.11-2.1.0.jar:2.1.0]
      	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119) ~[spark-core_2.11-2.1.0.jar:2.1.0]
      	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) ~[spark-core_2.11-2.1.0.jar:2.1.0]
      	at co.cask.cdap.app.runtime.spark.submit.AbstractSparkSubmitter.submit(AbstractSparkSubmitter.java:172) ~[na:na]
      	at co.cask.cdap.app.runtime.spark.submit.AbstractSparkSubmitter.access$000(AbstractSparkSubmitter.java:54) ~[na:na]
      	at co.cask.cdap.app.runtime.spark.submit.AbstractSparkSubmitter$5.run(AbstractSparkSubmitter.java:111) ~[na:na]
      	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[na:1.8.0_151]
      	at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[na:1.8.0_151]
      	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[na:1.8.0_151]
      	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[na:1.8.0_151]
      	... 1 common frames omitted 
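      Note the jar versions in the trace: spark-core_2.11-2.1.0.jar (compiled against Scala 2.11) is resolving scala.Option from scala-library-2.10.4.jar. Predef.$conforms() was introduced in Scala 2.11, so Spark code built for 2.11 throws exactly this NoSuchMethodError when a 2.10 scala-library is on the classpath. As a minimal diagnostic sketch (hypothetical, not part of the suite), something like the following would confirm which Scala library the test JVM actually loads:

      // Hypothetical diagnostic, not part of StandaloneTestSuite: report which
      // jar supplies scala.Predef$ and whether it has the 2.11-only $conforms().
      public class ScalaBinaryCheck {
          public static void main(String[] args) throws Exception {
              Class<?> predef = Class.forName("scala.Predef$");
              // Location of the scala-library jar actually on the classpath.
              System.out.println(predef.getProtectionDomain()
                                       .getCodeSource()
                                       .getLocation());
              try {
                  // $conforms() exists in Scala 2.11+ but not in 2.10.x.
                  predef.getMethod("$conforms");
                  System.out.println("Scala 2.11+ library: $conforms present");
              } catch (NoSuchMethodException e) {
                  System.out.println("Scala 2.10 library: $conforms missing"
                          + " (same mismatch as the NoSuchMethodError above)");
              }
          }
      }

      If that prints a 2.10 jar, mvn dependency:tree -Dincludes=org.scala-lang in the failing module should show where the 2.10.4 artifact enters the dependency graph.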

      The result is a failing build: https://builds.cask.co/browse/IT-ITS-44

      People

      • Assignee: Ali Anwar (ali.anwar)
      • Reporter: Ali Anwar (ali.anwar)
      • Votes: 0
      • Watchers: 2
