Kafka – Multiple producer threads

In my application, I use Kafka to log user events. The user events are collected in XML format and sent to Kafka, from where they are consumed by a Flume agent.

In my API, we create a producer thread for each event, and once the message is sent to Kafka, that producer is closed. This worked fine in development, but while testing these changes in our load environment we hit an issue: the Kafka server could not allocate resources for the producer threads, and it also complained that there were too many open files.

To avoid this issue, we can either increase the open-file limit or change our API to create a single Kafka producer instance that is responsible for producing all the messages.
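For the first option, the commands below sketch how to inspect the file-descriptor limits on a Linux host (the `kafka` user name and the `65536` value are illustrative, not from the original setup):

```shell
# Show the current soft limit on open file descriptors for this shell
ulimit -n

# Show the hard limit (the ceiling the soft limit can be raised to)
ulimit -Hn

# To raise the limit persistently for the broker's user, add lines like
# these (illustrative values) to /etc/security/limits.conf and re-login:
#   kafka  soft  nofile  65536
#   kafka  hard  nofile  65536
```

Raising the limit only postpones the problem, though: the number of open files still grows with the rate of events.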

Since the Kafka producer instance is thread safe, the second solution is the correct fit for this issue.
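The shared-producer pattern can be sketched as below. This is a minimal, self-contained illustration, not the original API's code: `FakeProducer` stands in for `org.apache.kafka.clients.producer.KafkaProducer` (which the Kafka docs state is thread safe and faster when shared), and the class and method names are hypothetical.

```java
import java.util.concurrent.ConcurrentLinkedQueue;

public class SharedProducerSketch {
    // Stand-in for KafkaProducer so the sketch runs without a broker;
    // a real producer would hold sockets, hence the open-file pressure.
    static class FakeProducer {
        final ConcurrentLinkedQueue<String> sent = new ConcurrentLinkedQueue<>();
        void send(String topic, String xml) { sent.add(topic + ":" + xml); }
        void close() { /* real code closes the producer once, at shutdown */ }
    }

    // Initialization-on-demand holder: one producer, created lazily and
    // published safely by the JVM's class-loading guarantees.
    private static class Holder {
        static final FakeProducer INSTANCE = new FakeProducer();
    }

    static FakeProducer producer() { return Holder.INSTANCE; }

    public static void main(String[] args) throws InterruptedException {
        // Many event threads share the single producer instead of each
        // opening and closing its own, which is what exhausted descriptors.
        Thread[] workers = new Thread[10];
        for (int i = 0; i < workers.length; i++) {
            final int id = i;
            workers[i] = new Thread(() ->
                producer().send("user-events", "<event id=\"" + id + "\"/>"));
            workers[i].start();
        }
        for (Thread t : workers) t.join();
        System.out.println("messages sent: " + producer().sent.size());
    }
}
```

With this shape, the event threads only call `send()`; the producer is created once and closed once at application shutdown rather than per event.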
