CDAP-16297

Using "macro" for output schema makes job fail at next component

    Details

    • Type: Bug
    • Status: Open
    • Priority: Major
    • Resolution: Unresolved
    • Affects Version/s: None
    • Fix Version/s: None
    • Component/s: CDAP, ETL, Pipelines
    • Labels:

      Description

      I'm currently creating a pipeline that ingests a CSV, creates a BigQuery table with the same schema, and loads the data into that table.


      Within this job, I have a Spark component that looks at this CSV and constructs a JSON string that can be used as the "Output Schema" for the other components (I have macro'd the output schema in those components so this job can be used for any CSV).
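
      For illustration, here is a minimal sketch of what such a schema-building step could look like, outside of any Spark or CDAP API. The file path "input.csv", the record name "etlSchemaBody", and the choice to treat every column as a nullable string are assumptions made for the sketch, not details taken from the actual pipeline.

      {code:java}
      import java.io.BufferedReader;
      import java.io.FileReader;
      import java.io.IOException;
      import java.util.StringJoiner;

      // Builds an Avro-style record schema JSON (the format CDAP plugins accept in
      // their "Output Schema" / schema property) from a CSV header row.
      public class CsvToSchemaJson {

        public static String schemaFromHeader(String headerLine) {
          StringJoiner fields = new StringJoiner(",");
          for (String column : headerLine.split(",")) {
            // Every column is treated as a nullable string to keep the sketch simple.
            fields.add("{\"name\":\"" + column.trim() + "\",\"type\":[\"string\",\"null\"]}");
          }
          return "{\"type\":\"record\",\"name\":\"etlSchemaBody\",\"fields\":[" + fields + "]}";
        }

        public static void main(String[] args) throws IOException {
          // "input.csv" is a placeholder path; the real pipeline reads the CSV from its source.
          try (BufferedReader reader = new BufferedReader(new FileReader("input.csv"))) {
            System.out.println(schemaFromHeader(reader.readLine()));
          }
        }
      }
      {code}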


      The issue is that when I use the "macro" functionality in a component's output schema, the next component shows "Input Schema: No Schema Available", and the job fails at that component when executed (according to the logs).
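
      For context on how such a macro normally resolves: the downstream component's schema property contains something like ${output.schema}, and an earlier stage publishes the generated JSON as a runtime argument under that key before the stage runs. The following is a rough sketch of that publishing step written as a custom action; it assumes CDAP 6.x's io.cdap.cdap.etl.api.action API, and the plugin name, the argument key output.schema, and the hard-coded schema are all illustrative rather than taken from this pipeline.

      {code:java}
      import io.cdap.cdap.api.annotation.Name;
      import io.cdap.cdap.api.annotation.Plugin;
      import io.cdap.cdap.etl.api.PipelineConfigurer;
      import io.cdap.cdap.etl.api.action.Action;
      import io.cdap.cdap.etl.api.action.ActionContext;

      // Illustrative action that publishes a schema JSON string as a runtime argument,
      // so that a downstream plugin property set to ${output.schema} can pick it up.
      @Plugin(type = Action.PLUGIN_TYPE)
      @Name("SetOutputSchema")
      public class SetOutputSchemaAction extends Action {

        @Override
        public void configurePipeline(PipelineConfigurer pipelineConfigurer) {
          // No design-time validation in this sketch.
        }

        @Override
        public void run(ActionContext context) throws Exception {
          // In the real pipeline this JSON would be built from the CSV header
          // (see the sketch above); it is hard-coded here only for brevity.
          String schemaJson = "{\"type\":\"record\",\"name\":\"etlSchemaBody\","
              + "\"fields\":[{\"name\":\"id\",\"type\":\"string\"}]}";
          context.getArguments().set("output.schema", schemaJson);
        }
      }
      {code}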


      Hope someone can help me with this issue. Feel free to ask for more details if more clarity is needed. I have attached some screenshots as well.


        Attachments

        1. input schema.PNG (2 kB)
        2. logError.PNG (124 kB)
        3. outputSchema.PNG (3 kB)


        People

        • Assignee: Trishka (trishka)
        • Reporter: Sahib Singh (supersahib)
        • Votes: 1
        • Watchers: 3
