CDAP / CDAP-13316

SparkTest is failing


    Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 5.0.0
    • Component/s: Spark
    • Labels: None

      Description

      SparkTest fails intermittently: the test times out waiting for results in a table because the Spark program fails with the following error:

      Caused by: java.util.concurrent.ExecutionException: org.apache.tephra.TransactionFailureException: Exception raised in transactional execution. Cause: Attempted to use closed dataset InMemoryTable(table = cdap_default.KeyValueTable.kv)
              at com.google.common.util.concurrent.AbstractFuture$Sync.getValue(AbstractFuture.java:294)
              at com.google.common.util.concurrent.AbstractFuture$Sync.get(AbstractFuture.java:281)
              at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:116)
              at co.cask.cdap.app.runtime.spark.SparkRuntimeService.run(SparkRuntimeService.java:342)
              at com.google.common.util.concurrent.AbstractExecutionThreadService$1$1.run(AbstractExecutionThreadService.java:52)
              ... 2 more
      Caused by: org.apache.tephra.TransactionFailureException: Exception raised in transactional execution. Cause: Attempted to use closed dataset InMemoryTable(table = cdap_default.KeyValueTable.kv)
              at co.cask.cdap.data2.transaction.Transactions.asTransactionFailure(Transactions.java:82)
              at co.cask.cdap.data2.transaction.Transactions.asTransactionFailure(Transactions.java:68)
              at co.cask.cdap.app.runtime.spark.SparkTransactional.execute(SparkTransactional.java:225)
              at co.cask.cdap.app.runtime.spark.AbstractSparkExecutionContext$$anon$5.apply(AbstractSparkExecutionContext.scala:440)
              at co.cask.cdap.app.runtime.spark.data.DatasetRDD.createDelegateRDD(DatasetRDD.scala:65)
              at co.cask.cdap.app.runtime.spark.data.DatasetRDD.delegateRDD$lzycompute(DatasetRDD.scala:54)
              at co.cask.cdap.app.runtime.spark.data.DatasetRDD.delegateRDD(DatasetRDD.scala:54)
              at co.cask.cdap.app.runtime.spark.data.DatasetRDD.getPartitions(DatasetRDD.scala:61)
              at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
              at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
              at scala.Option.getOrElse(Option.scala:120)
              at org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
              at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
              at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
              at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
              at scala.Option.getOrElse(Option.scala:120)
              at org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
              at org.apache.spark.Partitioner$.defaultPartitioner(Partitioner.scala:65)
              at org.apache.spark.rdd.PairRDDFunctions$$anonfun$reduceByKey$3.apply(PairRDDFunctions.scala:331)
              at org.apache.spark.rdd.PairRDDFunctions$$anonfun$reduceByKey$3.apply(PairRDDFunctions.scala:331)
              at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
              at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
              at org.apache.spark.rdd.RDD.withScope(RDD.scala:316)
              at org.apache.spark.rdd.PairRDDFunctions.reduceByKey(PairRDDFunctions.scala:330)
              at co.cask.cdap.spark.app.TransactionSpark.run(TransactionSpark.scala:69)
              at co.cask.cdap.app.runtime.spark.SparkMainWrapper$.main(SparkMainWrapper.scala:78)
              at co.cask.cdap.app.runtime.spark.SparkMainWrapper.main(SparkMainWrapper.scala)
              at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
              at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
              at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
              at java.lang.reflect.Method.invoke(Method.java:498)
              at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
              at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
              at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
              at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
              at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
              at co.cask.cdap.app.runtime.spark.submit.AbstractSparkSubmitter.submit(AbstractSparkSubmitter.java:172)
              at co.cask.cdap.app.runtime.spark.submit.AbstractSparkSubmitter.access$000(AbstractSparkSubmitter.java:54)
              at co.cask.cdap.app.runtime.spark.submit.AbstractSparkSubmitter$5.run(AbstractSparkSubmitter.java:111)
              at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
              at java.util.concurrent.FutureTask.run(FutureTask.java:266)
              at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
              at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
              ... 1 more
      Caused by: java.lang.IllegalStateException: Attempted to use closed dataset InMemoryTable(table = cdap_default.KeyValueTable.kv)
              at co.cask.cdap.data2.dataset2.lib.table.BufferingTable.startTx(BufferingTable.java:272)
              at co.cask.cdap.api.dataset.lib.AbstractDataset.startTx(AbstractDataset.java:76)
              at co.cask.cdap.app.runtime.spark.SparkTransactional$TransactionalDatasetContext.getDataset(SparkTransactional.java:375)
              at co.cask.cdap.app.runtime.spark.AbstractSparkExecutionContext$$anon$5$$anon$6.run(AbstractSparkExecutionContext.scala:442)
              at co.cask.cdap.app.runtime.spark.SparkTransactional.execute(SparkTransactional.java:206)
              ... 41 more
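
      The trace shows the failure mode: the driver closes the dataset while a transactional runnable still holds a reference to it, and the next startTx call hits the closed-dataset guard. Below is a minimal sketch of that race, using hypothetical names (Dataset, ClosedDatasetSketch); it is not CDAP's actual BufferingTable implementation, only an illustration of the guard that produces the IllegalStateException in the trace.

      ```java
      import java.util.concurrent.atomic.AtomicBoolean;

      class ClosedDatasetSketch {

        // Stand-in for a BufferingTable-like dataset that refuses new
        // transactions once it has been closed (hypothetical, for illustration).
        static final class Dataset {
          private final String name;
          private final AtomicBoolean closed = new AtomicBoolean(false);

          Dataset(String name) { this.name = name; }

          // Mirrors the guard that raises the IllegalStateException in the trace.
          void startTx() {
            if (closed.get()) {
              throw new IllegalStateException("Attempted to use closed dataset " + name);
            }
          }

          void close() { closed.set(true); }
        }

        public static void main(String[] args) {
          Dataset kv = new Dataset("InMemoryTable(table = cdap_default.KeyValueTable.kv)");
          kv.startTx();   // an earlier transaction succeeds
          kv.close();     // driver tears the dataset down, e.g. on program shutdown
          try {
            kv.startTx(); // a late transactional runnable hits the closed guard
          } catch (IllegalStateException e) {
            System.out.println(e.getMessage());
          }
        }
      }
      ```

      In the failing test the second startTx comes from the DatasetRDD partition computation, so the fix amounts to ensuring the dataset's lifetime outlasts any transaction that may still reference it.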
      

            People

            • Assignee: Terence Yim
            • Reporter: Albert Shau
            • Votes: 0
            • Watchers: 2
