CDAP / CDAP-12571

Kafka source plugin skips, rather than fails, on messages that are too large

    Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Affects Version/s: None
    • Fix Version/s: 5.0.0
    • Component/s: Kafka, Pipeline Plugins

      Description

      The Kafka source plugin seems to skip, rather than fail on, messages that are too large.
      If messages larger than the default fetch size (1 MB) are sent, the pipeline still completes successfully.
      The Kafka batch source statistics show that no messages were read (0/0), but the offset is still updated in the HBase table, so the next run silently skips those messages.
      Instead, the pipeline should report an error and fail. Reading the same messages with a Flume agent throws an exception saying the message size is too large; after adding the property consumer.max.partition.fetch.bytes (e.g. 5 MB) to the consumer config, everything worked properly.
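
      For reference, a minimal sketch of the same workaround using the standard Kafka Java consumer API (the plugin may use a different consumer internally). The broker address, group id, and topic name below are assumptions for illustration only:

          import java.time.Duration;
          import java.util.Collections;
          import java.util.Properties;

          import org.apache.kafka.clients.consumer.ConsumerConfig;
          import org.apache.kafka.clients.consumer.ConsumerRecord;
          import org.apache.kafka.clients.consumer.ConsumerRecords;
          import org.apache.kafka.clients.consumer.KafkaConsumer;
          import org.apache.kafka.common.serialization.ByteArrayDeserializer;

          public class LargeMessageConsumer {
            public static void main(String[] args) {
              Properties props = new Properties();
              props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed broker address
              props.put(ConsumerConfig.GROUP_ID_CONFIG, "large-message-test");      // hypothetical group id
              props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, ByteArrayDeserializer.class.getName());
              props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, ByteArrayDeserializer.class.getName());
              // Raise the per-partition fetch limit above the 1 MB default so that
              // messages up to ~5 MB are actually returned by poll() rather than
              // being left unfetched (which is what makes the source read 0/0).
              props.put(ConsumerConfig.MAX_PARTITION_FETCH_BYTES_CONFIG, 5 * 1024 * 1024);

              try (KafkaConsumer<byte[], byte[]> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(Collections.singletonList("events")); // hypothetical topic
                ConsumerRecords<byte[], byte[]> records = consumer.poll(Duration.ofSeconds(5));
                for (ConsumerRecord<byte[], byte[]> record : records) {
                  System.out.printf("offset=%d size=%d bytes%n", record.offset(), record.value().length);
                }
              }
            }
          }

      Note that raising the fetch limit only hides the symptom; the fix for this issue is for the source to detect the unfetched-but-committed case and fail instead of advancing the offset.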

    People

    • Assignee: Yaojie Feng (yaojie)
    • Reporter: Ali Anwar (ali.anwar)
    • Votes: 0
    • Watchers: 2
