CDAP-2946: log.saver creates a lot of threads for HTablePool

    Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Duplicate
    • Affects Version/s: None
    • Fix Version/s: 3.1.0
    • Component/s: Log
    • Labels: None

      Description

      The number of HTablePool threads in log.saver is very high, possibly due to a thread leak.

      Noticed >1000 threads like the following in a jstack of the log.saver process:

      "htable-pool872826-t1" daemon prio=10 tid=0x0000000002118000 nid=0x6c36 waiting on condition [0x00007f95ce5a4000]
         java.lang.Thread.State: TIMED_WAITING (parking)
              at sun.misc.Unsafe.park(Native Method)
              - parking to wait for  <0x00000000d17cfa90> (a java.util.concurrent.SynchronousQueue$TransferStack)
              at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:226)
              at java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:460)
              at java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:359)
              at java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:942)
              at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1068)
              at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1130)
              at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
              at java.lang.Thread.run(Thread.java:745) 
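
      These "htable-pool*" threads come from executors created inside the HBase client: an HTable constructed without a caller-supplied ExecutorService spins up its own "htable-poolN-tM" pool. As a hedged illustration of one pattern that can produce this kind of buildup (a sketch only, not taken from the log.saver source; the class, method, and table names below are made up), consider tables obtained from an HTablePool and kept cached:

      import java.io.IOException;

      import org.apache.hadoop.conf.Configuration;
      import org.apache.hadoop.hbase.HBaseConfiguration;
      import org.apache.hadoop.hbase.client.HTableInterface;
      import org.apache.hadoop.hbase.client.HTablePool;
      import org.apache.hadoop.hbase.client.Put;
      import org.apache.hadoop.hbase.util.Bytes;

      // Hypothetical sketch of a leak-prone usage pattern; not CDAP code.
      public class HTablePoolLeakSketch {
        private final Configuration conf = HBaseConfiguration.create();

        // Each HTable handed out by the pool is built without an explicit
        // ExecutorService, so it creates its own "htable-pool" executor.
        private final HTablePool tablePool = new HTablePool(conf, Integer.MAX_VALUE);

        public void writeLogEntry(byte[] row, byte[] payload) throws IOException {
          HTableInterface table = tablePool.getTable("log.messages");  // table name is made up
          try {
            Put put = new Put(row);
            put.add(Bytes.toBytes("d"), Bytes.toBytes("e"), payload);
            table.put(put);
          } finally {
            // For a pooled table, close() returns it to the pool; the executor
            // behind it stays alive, parked like the "htable-pool*-t1" threads above.
            table.close();
          }
        }
      }

      Sharing one ExecutorService (or one table/connection instance) across writes, instead of letting each cached HTable create its own executor, is the usual way to keep this thread count bounded.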
      

              People

               • Assignee: Gary Helmling
               • Reporter: Shankar Selvam
               • Votes: 0
               • Watchers: 2

                Dates

                • Created:
                • Updated:
                • Resolved: