CDAP-16760

Pipelines with Spark Program plugins ignore engine config

    Details

    • Release Notes:
      Fixed a bug where memory, CPU, and engine config properties were not being applied to Spark Program plugins.

      Description

      The ScalaSparkProgram plugin does not apply the resource requirements from the engine config when running in a pipeline. The executor memory stays fixed at 512m even when the engine config specifies a different value. The value also cannot be updated from within the Spark code itself; I attempted spark.conf.set("spark.executor.memory", "2g"), but it did not change anything.
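
      For illustration, a minimal sketch of that attempt as a plain Spark 2.x Scala program (the exact entry point required by the ScalaSparkProgram plugin may differ, and the object name is hypothetical). Executor memory is applied when executors are launched, so setting it after the session exists is silently ignored on Spark 2.x (Spark 3.x rejects the call outright):

          import org.apache.spark.sql.SparkSession

          object ExecutorMemoryAttempt {
            def main(args: Array[String]): Unit = {
              val spark = SparkSession.builder().appName("executor-memory-attempt").getOrCreate()

              // Attempted override: has no effect on already-running executors,
              // which keep the 512m they were launched with.
              spark.conf.set("spark.executor.memory", "2g")

              // Reading the value back can report "2g" even though the executors still have 512m.
              println(spark.conf.get("spark.executor.memory"))

              spark.stop()
            }
          }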

      The workaround was to set task.executor.system.resources.memory in the pipeline's runtime arguments, which appeared to override the default value.
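
      For example, the runtime argument can be supplied when starting the pipeline through CDAP's program lifecycle REST endpoint. The sketch below assumes a batch data pipeline named my-pipeline in the default namespace, the default router port 11015, and a memory value in MB; the host, pipeline name, and value are illustrative only:

          import java.net.URI
          import java.net.http.{HttpClient, HttpRequest, HttpResponse}

          object StartPipelineWithMemoryOverride {
            def main(args: Array[String]): Unit = {
              // Runtime arguments are passed as a JSON map in the request body.
              val runtimeArgs = """{"task.executor.system.resources.memory": "2048"}"""

              val request = HttpRequest.newBuilder()
                .uri(URI.create("http://cdap-host:11015/v3/namespaces/default/apps/" +
                  "my-pipeline/workflows/DataPipelineWorkflow/start"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(runtimeArgs))
                .build()

              val response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString())
              println(s"${response.statusCode()} ${response.body()}")
            }
          }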


            People

            • Assignee:
              ashau Albert Shau
            • Reporter:
              meseifan Mo Eseifan
            • Votes:
              0
            • Watchers:
              2
