CDAP / CDAP-2821

Spark native library linkage error causes CDAP standalone to stop


    Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 3.1.0
    • Component/s: CDAP Guides, Spark
    • Labels: None
    • Sprint: Workflow/Spark ending 07/20

      Description

      When trying the MovieRecommender app on CDAP Standalone running on Ubuntu, I ran into a linkage error:

      2015-06-19 22:18:34,422 - ERROR [Executor task launch worker-0:o.a.s.Logging$class@96] - Uncaught exception in thread Thread[Executor task launch worker-0,5,netty-executor-thread]
      java.lang.UnsatisfiedLinkError: org.jblas.NativeBlas.dposv(CII[DII[DII)I
              at org.jblas.NativeBlas.dposv(Native Method) ~[jblas-1.2.3.jar:na]
              at org.jblas.SimpleBlas.posv(SimpleBlas.java:369) ~[jblas-1.2.3.jar:na]
              at org.jblas.Solve.solvePositive(Solve.java:68) ~[jblas-1.2.3.jar:na]
              at org.apache.spark.mllib.recommendation.ALS.solveLeastSquares(ALS.scala:602) ~[spark-mllib_2.10-1.1.0.jar:1.1.0]
              at org.apache.spark.mllib.recommendation.ALS$$anonfun$org$apache$spark$mllib$recommendation$ALS$$updateBlock$2.apply(ALS.scala:590) ~[spark-mllib_2.10-1.1.0.jar:1.1.0]
              at org.apache.spark.mllib.recommendation.ALS$$anonfun$org$apache$spark$mllib$recommendation$ALS$$updateBlock$2.apply(ALS.scala:576) ~[spark-mllib_2.10-1.1.0.jar:1.1.0]
              at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244) ~[org.scala-lang.scala-library-2.10.4.jar:na]
              at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244) ~[org.scala-lang.scala-library-2.10.4.jar:na]
              at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33) ~[org.scala-lang.scala-library-2.10.4.jar:na]
              at scala.collection.mutable.ArrayOps$ofInt.foreach(ArrayOps.scala:156) ~[org.scala-lang.scala-library-2.10.4.jar:na]
              at scala.collection.TraversableLike$class.map(TraversableLike.scala:244) ~[org.scala-lang.scala-library-2.10.4.jar:na]
              at scala.collection.mutable.ArrayOps$ofInt.map(ArrayOps.scala:156) ~[org.scala-lang.scala-library-2.10.4.jar:na]
              at org.apache.spark.mllib.recommendation.ALS.org$apache$spark$mllib$recommendation$ALS$$updateBlock(ALS.scala:576) ~[spark-mllib_2.10-1.1.0.jar:1.1.0]
              at org.apache.spark.mllib.recommendation.ALS$$anonfun$org$apache$spark$mllib$recommendation$ALS$$updateFeatures$2.apply(ALS.scala:505) ~[spark-mllib_2.10-1.1.0.jar:1.1.0]
              at org.apache.spark.mllib.recommendation.ALS$$anonfun$org$apache$spark$mllib$recommendation$ALS$$updateFeatures$2.apply(ALS.scala:504) ~[spark-mllib_2.10-1.1.0.jar:1.1.0]
              at org.apache.spark.rdd.MappedValuesRDD$$anonfun$compute$1.apply(MappedValuesRDD.scala:31) ~[org.apache.spark.spark-core_2.10-1.2.0.jar:1.2.0]
              at org.apache.spark.rdd.MappedValuesRDD$$anonfun$compute$1.apply(MappedValuesRDD.scala:31) ~[org.apache.spark.spark-core_2.10-1.2.0.jar:1.2.0]
              at scala.collection.Iterator$$anon$11.next(Iterator.scala:328) ~[org.scala-lang.scala-library-2.10.4.jar:na]
              at scala.collection.Iterator$$anon$11.next(Iterator.scala:328) ~[org.scala-lang.scala-library-2.10.4.jar:na]
              at org.apache.spark.util.collection.ExternalAppendOnlyMap.insertAll(ExternalAppendOnlyMap.scala:126) ~[org.apache.spark.spark-core_2.10-1.2.0.jar:1.2.0]
              at org.apache.spark.rdd.CoGroupedRDD$$anonfun$compute$5.apply(CoGroupedRDD.scala:160) ~[org.apache.spark.spark-core_2.10-1.2.0.jar:1.2.0]
              at org.apache.spark.rdd.CoGroupedRDD$$anonfun$compute$5.apply(CoGroupedRDD.scala:159) ~[org.apache.spark.spark-core_2.10-1.2.0.jar:1.2.0]
              at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:772) ~[org.scala-lang.scala-library-2.10.4.jar:na]
              at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59) ~[org.scala-lang.scala-library-2.10.4.jar:na]
              at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47) ~[org.scala-lang.scala-library-2.10.4.jar:na]
              at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:771) ~[org.scala-lang.scala-library-2.10.4.jar:na]
              at org.apache.spark.rdd.CoGroupedRDD.compute(CoGroupedRDD.scala:159) ~[org.apache.spark.spark-core_2.10-1.2.0.jar:1.2.0]
              at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:263) ~[org.apache.spark.spark-core_2.10-1.2.0.jar:1.2.0]
              at org.apache.spark.rdd.RDD.iterator(RDD.scala:230) ~[org.apache.spark.spark-core_2.10-1.2.0.jar:1.2.0]
              at org.apache.spark.rdd.MappedValuesRDD.compute(MappedValuesRDD.scala:31) ~[org.apache.spark.spark-core_2.10-1.2.0.jar:1.2.0]
              at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:263) ~[org.apache.spark.spark-core_2.10-1.2.0.jar:1.2.0]
              at org.apache.spark.rdd.RDD.iterator(RDD.scala:230) ~[org.apache.spark.spark-core_2.10-1.2.0.jar:1.2.0]
              at org.apache.spark.rdd.FlatMappedValuesRDD.compute(FlatMappedValuesRDD.scala:31) ~[org.apache.spark.spark-core_2.10-1.2.0.jar:1.2.0]
              at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:263) ~[org.apache.spark.spark-core_2.10-1.2.0.jar:1.2.0]
              at org.apache.spark.rdd.RDD.iterator(RDD.scala:230) ~[org.apache.spark.spark-core_2.10-1.2.0.jar:1.2.0]
              at org.apache.spark.rdd.FlatMappedRDD.compute(FlatMappedRDD.scala:33) ~[org.apache.spark.spark-core_2.10-1.2.0.jar:1.2.0]
              at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:263) ~[org.apache.spark.spark-core_2.10-1.2.0.jar:1.2.0]
              at org.apache.spark.rdd.RDD.iterator(RDD.scala:230) ~[org.apache.spark.spark-core_2.10-1.2.0.jar:1.2.0]
              at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:68) ~[org.apache.spark.spark-core_2.10-1.2.0.jar:1.2.0]
              at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41) ~[org.apache.spark.spark-core_2.10-1.2.0.jar:1.2.0]
              at org.apache.spark.scheduler.Task.run(Task.scala:56) ~[org.apache.spark.spark-core_2.10-1.2.0.jar:1.2.0]
              at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:196) ~[org.apache.spark.spark-core_2.10-1.2.0.jar:1.2.0]
              at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) [na:1.7.0_75]
              at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) [na:1.7.0_75]
              at java.lang.Thread.run(Thread.java:745) [na:1.7.0_75]
      2015-06-19 22:18:34,425 - DEBUG [sparkDriver-akka.actor.default-dispatcher-14:o.a.s.Logging$class@63] - parentName: , name: TaskSet_12, runningTasks: 0
      2015-06-19 22:18:34,426 - INFO  [Thread-4:c.c.c.StandaloneMain@197] - Shutting down Standalone CDAP
      

      Noticed that this is due to a missing native library; I got past the issue after installing it:

      sudo apt-get install libgfortran3

      following the dependency instructions in the Spark MLlib guide:
      https://spark.apache.org/docs/0.9.0/mllib-guide.html#dependencies

      It might be useful to document this under known issues or recommended setup steps.
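
      A quick way to confirm the fix is a minimal check like the sketch below (not part of the app; the class name is hypothetical and it assumes jblas 1.2.3 is on the classpath). It exercises the same org.jblas.Solve.solvePositive call that fails in the trace above, so it throws java.lang.UnsatisfiedLinkError while the gfortran runtime is missing and prints a solution once libgfortran3 is installed:

      import org.jblas.DoubleMatrix;
      import org.jblas.Solve;

      // Hypothetical standalone check: exercises the same native path
      // (Solve.solvePositive -> NativeBlas.dposv) that the ALS job hits.
      // Throws java.lang.UnsatisfiedLinkError if the gfortran runtime
      // required by jblas is missing.
      public class JblasNativeCheck {
          public static void main(String[] args) {
              // Small symmetric positive-definite system A * x = b
              DoubleMatrix a = new DoubleMatrix(new double[][] { { 4.0, 1.0 }, { 1.0, 3.0 } });
              DoubleMatrix b = new DoubleMatrix(new double[] { 1.0, 2.0 });
              DoubleMatrix x = Solve.solvePositive(a, b);
              System.out.println("jblas native BLAS loaded OK; solution: " + x);
          }
      }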


            People

            • Assignee: John Jackson
            • Reporter: Shankar Selvam
            • Votes: 0
            • Watchers: 4
