CDAP-11882

There should be a way to configure Spark/MapReduce job configuration through preferences


    Details

    • Type: Improvement
    • Status: Open
    • Priority: Major
    • Resolution: Unresolved
    • Affects Version/s: 4.2.0
    • Fix Version/s: None
    • Component/s: API, MapReduce
    • Labels: None

      Description

      Currently, this must be done in application code, in the initialize() method of the MapReduce. It would be very useful if CDAP could automatically transfer certain runtime arguments (for example, those prefixed with org.apache.hadoop.mapreduce) into the job configuration.
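
      Today the application has to copy these settings itself. The following is a minimal sketch of that workaround, assuming the co.cask.cdap AbstractMapReduce API in which getContext().getHadoopJob() returns the underlying Hadoop Job; the class name ConfigurableMapReduce and the "mapreduce." prefix filter are illustrative only, not part of CDAP.

          import co.cask.cdap.api.mapreduce.AbstractMapReduce;
          import co.cask.cdap.api.mapreduce.MapReduceContext;
          import org.apache.hadoop.mapreduce.Job;

          import java.util.Map;

          /**
           * Current workaround: copy selected runtime arguments into the Hadoop job
           * configuration from initialize(). This is the boilerplate that the
           * proposed improvement would make unnecessary.
           */
          public class ConfigurableMapReduce extends AbstractMapReduce {

            @Override
            public void initialize() throws Exception {
              MapReduceContext context = getContext();
              Job job = context.getHadoopJob();

              // Copy every runtime argument that looks like a Hadoop/MapReduce setting,
              // e.g. "mapreduce.map.memory.mb=2048", into the job configuration.
              for (Map.Entry<String, String> arg : context.getRuntimeArguments().entrySet()) {
                if (arg.getKey().startsWith("mapreduce.")) {
                  job.getConfiguration().set(arg.getKey(), arg.getValue());
                }
              }
            }
          }

      With the requested improvement, CDAP would perform this prefix-based copying itself, so runtime arguments set as preferences (at the namespace, application, or program level) would reach the job conf without any per-application initialize() code.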


    People

    • Assignee: bhooshan (Bhooshan Mogal)
    • Reporter: andreas (Andreas Neumann)
    • Votes: 0
    • Watchers: 3
