I'm currently building a pipeline that ingests a CSV, creates a BigQuery table with the same schema, and loads the data into that table.
Within this job, I have a Spark component that reads this CSV and constructs a JSON string to be used as the "Output Schema" for the downstream components (I have macro'd the output schema in those components so the job can be reused for any CSV).
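For illustration, here is a minimal sketch of the kind of thing my Spark component does, written in plain Python rather than Spark so it's self-contained. The function names, the row-sampling approach, and the simple INTEGER/FLOAT/STRING type inference are my own for this example, not from any particular tool:

```python
import csv
import io
import json

def infer_bq_type(values):
    """Guess a BigQuery-style column type from sample string values."""
    def is_int(v):
        try:
            int(v)
            return True
        except ValueError:
            return False

    def is_float(v):
        try:
            float(v)
            return True
        except ValueError:
            return False

    if values and all(is_int(v) for v in values):
        return "INTEGER"
    if values and all(is_float(v) for v in values):
        return "FLOAT"
    return "STRING"

def csv_to_schema_json(csv_text, sample_rows=100):
    """Build a JSON schema string (list of name/type objects) from CSV text.

    Only the first `sample_rows` data rows are inspected for type inference.
    """
    reader = csv.reader(io.StringIO(csv_text))
    header = next(reader)
    samples = [row for _, row in zip(range(sample_rows), reader)]
    # Transpose sampled rows into per-column value lists.
    columns = list(zip(*samples)) if samples else [() for _ in header]
    schema = [
        {"name": name, "type": infer_bq_type(col)}
        for name, col in zip(header, columns)
    ]
    return json.dumps(schema)
```

In the real job, a Spark equivalent could read the CSV with schema inference enabled and serialize the resulting schema, which is then passed into the macro'd "Output Schema" field.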
The issue is that when I use the macro functionality in a component's output schema, the next component shows "Input Schema: No Schema Available", and according to the logs the job fails at that point when executed.
Hope someone can help me with this issue. Feel free to ask for more details if more clarity is needed. I have attached some photos as well.