In my application, I use Kafka for logging user events. The user events are collected in XML format and sent to Kafka, from where they are consumed by a Flume agent.
In my API, we create a producer thread for each event: after the message is sent to Kafka, that producer is closed. This worked fine in development, but while testing these changes in our load environment we hit an issue: the Kafka server could not allocate resources for the new producer threads, and it also complained that there were too many open files.
To avoid this issue, we can either increase the open-file limit or change our API to create a single Kafka producer instance that is responsible for producing all the messages.
Since the Kafka producer instance is thread-safe, the second solution is the better fit for this issue.
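A minimal sketch of the shared-producer approach, assuming the Java Kafka client. The broker address, topic name ("user-events"), and class name are placeholders, not taken from my actual code; a real deployment would also need error handling and proper configuration:

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;

// One application-wide producer shared by all event threads,
// instead of one producer per event.
public final class EventProducer {
    private static final Producer<String, String> PRODUCER = createProducer();

    private static Producer<String, String> createProducer() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // placeholder broker
        props.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        return new KafkaProducer<>(props);
    }

    // Safe to call concurrently: KafkaProducer is documented as thread-safe,
    // and sharing one instance across threads is generally faster than
    // creating one per thread.
    public static void send(String eventXml) {
        PRODUCER.send(new ProducerRecord<>("user-events", eventXml));
    }

    // Called once on application shutdown, not after every message,
    // so sockets and file handles are opened only once.
    public static void close() {
        PRODUCER.close();
    }
}
```

With this shape, closing the producer moves from the per-event path to application shutdown, which is what keeps the open-file count flat regardless of event volume.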