Jersey REST service and Thread Local usage

In this post, I am going to provide an example showing how we can use a ThreadLocal with a Jersey REST service.

Consider a REST application that needs the user details in many places in the application flow. The user details are retrieved using the header information available in the HttpServletRequest.
One solution is to pass the HttpServletRequest object wherever it is required, but that is tedious.
Another way is to use a ContainerRequestFilter to process the request: extract the header, call the back end to get the user details, and put them into ThreadLocal storage so that they are accessible anywhere in that thread's flow.

Refer to the sample code below to see how to do that. We don't use the Spring Framework here.

DataHolder:
This class holds the static ThreadLocal variable and has a getter and setter for the User object. We will set the User object from the ContainerRequestFilter.


package com.utils;

public class DataHolder {

    private static final ThreadLocal<User> userThreadLocal = new ThreadLocal<>();

    public static void setUser(User user) {
        userThreadLocal.set(user);
    }

    public static User getUser() {
        return userThreadLocal.get();
    }

    public static void remove() {
        userThreadLocal.remove();
    }
}
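To illustrate the per-thread isolation this gives us, here is a small standalone sketch. The `User` class below is a hypothetical stand-in for the application's real user model:


```java
// Hypothetical stand-in for the application's User class.
class User {
    private final String name;
    User(String name) { this.name = name; }
    String getName() { return name; }
}

class DataHolder {
    private static final ThreadLocal<User> userThreadLocal = new ThreadLocal<>();
    static void setUser(User user) { userThreadLocal.set(user); }
    static User getUser() { return userThreadLocal.get(); }
    static void remove() { userThreadLocal.remove(); }
}

public class ThreadLocalDemo {
    public static void main(String[] args) throws InterruptedException {
        Runnable task = () -> {
            // Each thread stores its own User; neither thread sees the other's value.
            DataHolder.setUser(new User(Thread.currentThread().getName()));
            System.out.println(DataHolder.getUser().getName());
            DataHolder.remove();
        };
        Thread t1 = new Thread(task, "alice");
        Thread t2 = new Thread(task, "bob");
        t1.start(); t2.start();
        t1.join(); t2.join();
    }
}
```

Each thread prints the name it stored itself, which is exactly the behavior the filter relies on.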

UserDataCaptureRequestFilter
This filter intercepts the request, extracts the header, retrieves the user details, and puts them into the ThreadLocal.



package com.filters;

import javax.servlet.http.HttpServletRequest;
import javax.ws.rs.container.ContainerRequestContext;
import javax.ws.rs.container.ContainerRequestFilter;
import javax.ws.rs.core.Context;
import javax.ws.rs.ext.Provider;
import java.io.IOException;

@Provider
public class UserDataCaptureRequestFilter implements ContainerRequestFilter {
    @Context
    private HttpServletRequest httpRequest;

    @Override
    public void filter(ContainerRequestContext requestContext) throws IOException {
        // Get the user data from the back-end system using the request header/cookie.
        // The getUser() method is not shown here; assume it retrieves the user details.
        User user = getUser(httpRequest);
        DataHolder.setUser(user);
    }
}
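One caveat with this approach: servlet containers serve requests from a thread pool, so a stale User left in the ThreadLocal could be visible to a later request handled by the same thread. A hedged sketch of a matching ContainerResponseFilter that clears the holder after each request (the class name is my own, not from the original code):


package com.filters;

import javax.ws.rs.container.ContainerRequestContext;
import javax.ws.rs.container.ContainerResponseContext;
import javax.ws.rs.container.ContainerResponseFilter;
import javax.ws.rs.ext.Provider;
import java.io.IOException;

import com.utils.DataHolder;

@Provider
public class UserDataCleanupResponseFilter implements ContainerResponseFilter {

    @Override
    public void filter(ContainerRequestContext requestContext,
                       ContainerResponseContext responseContext) throws IOException {
        // Container threads are pooled, so clear the ThreadLocal after each
        // request to avoid leaking one user's data into the next request.
        DataHolder.remove();
    }
}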


Application
The ResourceConfig class, where we specify the packages to scan for the endpoints and filters.


import org.glassfish.jersey.server.ResourceConfig;

public class Application extends ResourceConfig {
    public Application() {
        packages(true, "com.resource");
        packages(true, "com.filters");
    }
}

 
UserResource
This service returns the user details in JSON format. We do not retrieve the user details again here, since we already retrieved them in the filter; the data is available in the thread's scope, so we can simply use it.


package com.resource;

import com.utils.DataHolder;
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.Produces;
import javax.ws.rs.core.MediaType;
import javax.ws.rs.core.Response;

@Path("user")
public class UserResource{

    @GET
    @Produces(MediaType.APPLICATION_JSON)
    public Response getUser() {
       
        return Response.ok().entity(DataHolder.getUser()).type(MediaType.APPLICATION_JSON).build();
    }
}

 

 

HK2 Dependency Injection – Using Iterable provider

HK2 is a lightweight, dynamic dependency injection framework. If you don't want to rely on the Spring Framework for managing dependencies, you can use it instead.

In this post, I am going to explain how to use the IterableProvider with an example.

Assume that you have a Jersey REST application with an endpoint that is responsible for sending out some details to different systems, such as Kafka and a database. There is a class called Emitter, which is responsible for emitting the data. There are two classes: KafkaProducer, which produces the data to Kafka, and DBProducer, which writes the data into the DB. Both of these classes implement the MessageProducer interface.

I am not going to give the exact production code for doing this; my main motive is to show how we can use the IterableProvider here.

MessageProducer.java


public interface MessageProducer {
    void produce(String message);
}

DBProducer.java
The DBProducer class sends the message to the DB.


public class DBProducer implements MessageProducer {

    @Override
    public void produce(String message) {
        //Write code to send out the message to DB
        System.out.println("Producing the message to DB");
    }
}

KafkaProducer.java
The KafkaProducer class sends the message to Kafka.



public class KafkaProducer implements MessageProducer {

    @Override
    public void produce(String message) {
        //Write code to send out the message to kafka
        System.out.println("Producing the message to Kafka");
    }
}

DependencyBinder.java
The DependencyBinder class is used for the binding. You need to pass this class as an init parameter to the Jersey servlet (parameter name: javax.ws.rs.Application).


import org.glassfish.hk2.utilities.binding.AbstractBinder;
import org.glassfish.jersey.server.ResourceConfig;

import javax.inject.Singleton;

public class DependencyBinder extends ResourceConfig {

    public DependencyBinder() {
        // The resource classes are available under the com.resource package
        packages(true, "com.resource");
        register(new AbstractBinder() {
            @Override
            protected void configure() {
                bind(Emitter.class).to(Emitter.class).in(Singleton.class);
                bind(KafkaProducer.class).to(MessageProducer.class).in(Singleton.class);
                bind(DBProducer.class).to(MessageProducer.class).in(Singleton.class);
            }
        });
    }
}
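Registering this class through the servlet init parameter mentioned above might look like the following web.xml fragment. The servlet name and URL pattern are illustrative, and the param-value should be the fully qualified name of the DependencyBinder class:


```xml
<servlet>
    <servlet-name>jersey</servlet-name>
    <servlet-class>org.glassfish.jersey.servlet.ServletContainer</servlet-class>
    <init-param>
        <param-name>javax.ws.rs.Application</param-name>
        <param-value>DependencyBinder</param-value>
    </init-param>
    <load-on-startup>1</load-on-startup>
</servlet>
<servlet-mapping>
    <servlet-name>jersey</servlet-name>
    <url-pattern>/api/*</url-pattern>
</servlet-mapping>
```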

 

Finally, the Emitter class. This class has to send the message to different systems, so we can't inject each implementation individually here. We can use the IterableProvider to achieve this: the HK2 framework collects all the classes bound to the MessageProducer interface and injects them into this class. In this example, it will return the DBProducer and KafkaProducer instances.


import org.glassfish.hk2.api.IterableProvider;

import javax.inject.Inject;

public class Emitter {

    @Inject
    private IterableProvider<MessageProducer> messageProducers;

    public void emitMessage() {

        // Write code to prepare the message
        String message = "It's a new message";
        for (MessageProducer messageProducer : messageProducers) {
            messageProducer.produce(message);
        }
    }
} 
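Conceptually, iterating an IterableProvider is like looping over a collection of every service bound to the contract. Stripped of HK2, the fan-out behaves like this runnable plain-Java sketch (the list is built by hand where HK2 would inject it):

```java
import java.util.Arrays;
import java.util.List;

interface MessageProducer {
    void produce(String message);
}

class DBProducer implements MessageProducer {
    @Override
    public void produce(String message) {
        System.out.println("Producing the message to DB");
    }
}

class KafkaProducer implements MessageProducer {
    @Override
    public void produce(String message) {
        System.out.println("Producing the message to Kafka");
    }
}

public class EmitterDemo {
    public static void main(String[] args) {
        // HK2 would supply these instances via IterableProvider<MessageProducer>;
        // here we build the collection by hand to show the fan-out.
        List<MessageProducer> producers =
                Arrays.asList(new KafkaProducer(), new DBProducer());
        for (MessageProducer producer : producers) {
            producer.produce("It's a new message");
        }
    }
}
```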

Testing the mail functionality with a fake SMTP server

In this post, I am going to show how we can integrate the FakeSMTP server with a Java application.

FakeSMTP is a dummy SMTP server that is mainly used for testing emails. It's useful for developers because it writes the email content into a local directory, so we can check the file content and do the validations.

The Dockerfile for FakeSMTP is given below:

mail-rest-docker/tree/master/fake-smtp


FROM java:8

RUN mkdir -p /opt && \
    wget -q http://nilhcem.github.com/FakeSMTP/downloads/fakeSMTP-latest.zip && \
    unzip fakeSMTP-latest.zip -d /opt && \
    rm fakeSMTP-latest.zip

VOLUME /output

RUN chmod +x /opt/fakeSMTP-2.0.jar

EXPOSE 25

CMD java -jar /opt/fakeSMTP-2.0.jar --start-server --background --output-dir /output --port 25

If you look at the last line of the above file, you can see that the email content is written under the /output folder. So we have to mount the local directory accordingly in the docker-compose file.

Next is the REST application code to send an email to the FakeSMTP server.

mail-rest-docker/blob/master/src/main/java/com/resource/MailResource.java


package com.resource;

import javax.mail.Message;
import javax.mail.Session;
import javax.mail.Transport;
import javax.mail.internet.InternetAddress;
import javax.mail.internet.MimeMessage;
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.Produces;
import javax.ws.rs.WebApplicationException;
import javax.ws.rs.core.MediaType;
import javax.ws.rs.core.Response;
import java.util.Properties;

@Path("/sendMail")
public class MailResource {

    Properties properties = System.getProperties();

    public MailResource() {
        properties.put("mail.smtp.host", System.getenv("SMTP_HOST"));
        properties.put("mail.smtp.port", "25");
    }

    @GET
    @Produces(MediaType.TEXT_HTML)
    public Response sendSimpleHelloMail() throws WebApplicationException {

        String to = "test123@gmail.com";
        String from = "test123@gmail.com";
        Session session = Session.getDefaultInstance(properties);
        try {
            MimeMessage message = new MimeMessage(session);
            message.setFrom(new InternetAddress(from));
            message.addRecipient(Message.RecipientType.TO, new InternetAddress(to));
            message.setSubject("Subject");
            message.setContent("<h1>Hello</h1>", "text/html");
            Transport.send(message);
            System.out.println("Sent message successfully....");
        }
        catch (Exception ex) {
            ex.printStackTrace();
            throw new WebApplicationException(ex.getMessage());
        }
        return Response.ok().entity("Mail has been sent successfully").build();
    }
}

The SMTP_HOST environment variable should hold the FakeSMTP server host. In this case, we have to link the FakeSMTP service with the REST container and use that service name. Refer to the docker-compose.yml file below to see how to do that.

mail-rest-docker/blob/master/docker-compose.yml


rest:
    image: mail-rest:latest
    ports:
        - 8080:8080
    environment:
       - SMTP_HOST=mail
    links:
      - mail:mail
mail:
    image: fake-smtp:latest
    volumes:
     - ./output:/output
    expose:
     - 25

Here we have mounted the local output directory to the container's /output folder, so the mail content will be available under the output folder. We have also linked the fake-smtp service and passed its name in the SMTP_HOST environment variable.

Follow the steps below to run this application:

1. Clone this repository
2. Package the project by running 'mvn package'
3. Run 'docker images' and confirm that the 'mail-rest' docker image is available
4. Go into the fake-smtp folder, build the image by running 'docker build -t fake-smtp:latest .', and confirm that the 'fake-smtp' docker image is available
5. Go to the project root folder and run 'docker-compose up -d'
6. Access the REST endpoint at http://localhost:8080/api/sendMail
7. Check the output folder and confirm that there is an .eml file created which has the email content

Here is the sample file content

mail-rest-docker/blob/master/output/220717072710377.eml


Date: Sat, 22 Jul 2017 19:27:10 +0000 (UTC)
From: test123@gmail.com
To: test123@gmail.com
Message-ID: <840194110.01500751630345.JavaMail.root@2c7d28e1a4a2>
Subject: Subject
MIME-Version: 1.0
Content-Type: text/html; charset=us-ascii
Content-Transfer-Encoding: 7bit

<h1>Hello</h1>

Refer to the code at https://github.com/dkbalachandar/mail-rest-docker

Test Secure REST services with Chrome Browser Plugin

Many of us want to test REST services via Advanced REST Client or Postman, or use them to debug an issue.

But if the REST services are secured and protected by Ping Access, SiteMinder, or any other tool, then we will get a login page, so we would have to hard-code the browser cookies into the request to bypass it.

There is another way to do that.

If you are using Advanced REST Client (https://advancedrestclient.com), you can use the ARC cookie exchange plugin. This plugin helps ARC retrieve the browser cookies and send them with the request.

If you are using Postman (https://www.getpostman.com), you can use Postman Interceptor. The Postman Interceptor plugin lets Postman use the browser cookies for each service call.

HTTP Caching in REST API

In this post, I am going to show an example that explains the implementation of HTTP caching in a REST API.

REST API responses (typically GET requests) can be cached by the browser, whereas SOAP-based services, which usually go over POST, cannot.

We have two kinds of cache control mechanisms:

1. Time-based cache headers
2. Conditional cache headers

Time-based cache headers

Assume that we have a web service and we want to cache its response in the client's browser for 5 minutes. To achieve this, we have to set the Cache-Control HTTP header appropriately:


Cache-Control:private, max-age=300

Here, "private" denotes that the response will be cached only on the client (mostly by the browser) and not by any intermediate proxy servers.

"max-age" denotes how long the resource is valid. The value is given in seconds.
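The freshness check a cache performs boils down to simple arithmetic: the cached copy is served as long as its age is below max-age. A small plain-Java sketch (not JAX-RS code) of that decision:

```java
public class FreshnessDemo {

    // A cached entry is fresh while (now - storedAt) < maxAgeSeconds.
    static boolean isFresh(long storedAtMillis, long nowMillis, long maxAgeSeconds) {
        long ageSeconds = (nowMillis - storedAtMillis) / 1000;
        return ageSeconds < maxAgeSeconds;
    }

    public static void main(String[] args) {
        long storedAt = 0L;
        // 2 minutes later: still within max-age=300 (5 minutes), served from cache
        System.out.println(isFresh(storedAt, 120_000L, 300));
        // 6 minutes later: expired, so the browser goes back to the server
        System.out.println(isFresh(storedAt, 360_000L, 300));
    }
}
```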

Conditional cache headers

With the conditional cache header approach, the browser asks the server whether the content has changed. The server returns an ETag (and/or Last-Modified) header with the response, and on subsequent requests the browser echoes these back in the If-None-Match and If-Modified-Since headers. On the server side, we use these headers in our logic and send the full response only if the content has changed or the client's cached copy has expired.

Note that for the first request we get HTTP status 200, and for subsequent requests we get HTTP status 304 Not Modified.

Please refer to the sample code below, which shows both approaches.



import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.Produces;
import javax.ws.rs.QueryParam;
import javax.ws.rs.core.CacheControl;
import javax.ws.rs.core.Context;
import javax.ws.rs.core.EntityTag;
import javax.ws.rs.core.MediaType;
import javax.ws.rs.core.Request;
import javax.ws.rs.core.Response;

@Path("/greeter")
public class GreetingResource {

    private static final String HELLO_MESSAGE= "Welcome %s";

    private static final String BIRTHDAY_MESSAGE= "Happy Birthday %s";

    @Context
    private Request request;

    @GET
    @Path("welcome")
    @Produces(MediaType.TEXT_HTML)
    public Response sayWelcomeMessage(@QueryParam("name") String name) {
        System.out.println("sayWelcomeMessage:::::"+ System.currentTimeMillis());
        CacheControl cacheControl = new CacheControl();
        cacheControl.setMaxAge(60);
        cacheControl.setPrivate(true);
        Response.ResponseBuilder builder = Response.ok(String.format(HELLO_MESSAGE, name));
        builder.cacheControl(cacheControl);
        return builder.build();
    }

    @GET
    @Path("bday")
    @Produces(MediaType.TEXT_HTML)
    public Response sayBdayWishesMessage(@QueryParam("name") String name) {

        System.out.println("sayGreetingMessage:::::"+ System.currentTimeMillis());
        CacheControl cacheControl = new CacheControl();
        //60 seconds
        cacheControl.setMaxAge(60);

        String message = String.format(BIRTHDAY_MESSAGE, name);
        EntityTag entityTag = new EntityTag(Integer.toString(message.hashCode()));
        // The browser echoes the previously returned ETag in the If-None-Match
        // header, so evaluatePreconditions can tell whether the client's cached
        // copy is still current
        Response.ResponseBuilder builder = request.evaluatePreconditions(entityTag);
        // If the builder is null, the ETags don't match, so the full response should be sent
        if (builder == null) {
            builder = Response.ok(message);
            builder.tag(entityTag);
        } else {
            builder.cacheControl(cacheControl).tag(entityTag);
        }
        return builder.build();
    }
}

Rest API to produce message to Kafka using Docker Maven Plugin

I have developed a simple REST API to send the incoming message to Apache Kafka.

I have used Docker Kafka (https://github.com/spotify/docker-kafka) and the Docker Maven Plugin(https://github.com/fabric8io/docker-maven-plugin) to do this.

Before going through this post, familiarize yourself with Docker and Docker Compose.

The Docker Maven Plugin provides us a nice way to specify multiple images in the pom.xml and link them as necessary. We could also use Docker Compose for this, but I have used the plugin here.

    1. Clone the project (https://github.com/dkbalachandar/kafka-message-sender)
    2. Then go into kafka-message-sender folder
    3. Then enter ‘mvn clean install’
    4. Then enter 'mvn docker:start'. Run 'docker ps' and make sure that two containers are running; their names are kafka and kafka-rest
    5. Then access http://localhost:8080/api/kafka/send/test (POST) and confirm that you see a 'message has been sent' response in the browser
    6. Then run the command below and make sure that whatever message you sent is available in Kafka [Kafka Command Line Consumer], or consume it via a Flume agent [Kafka Flume Agent Consumer]:
docker exec -it kafka /opt/kafka_2.11-0.8.2.1/bin/kafka-console-consumer.sh --zookeeper localhost:2181 --topic test --from-beginning
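The post does not show the producer setup inside the REST endpoint. The essential configuration an Apache Kafka producer client would need might look like the sketch below; the broker address "kafka:9092" and the serializer classes are assumptions based on the container name in this setup, not taken from the original code:

```java
import java.util.Properties;

public class ProducerConfigDemo {
    public static void main(String[] args) {
        // Minimal properties an org.apache.kafka.clients.producer.KafkaProducer
        // would need; "kafka:9092" matches the linked container name (assumed).
        Properties props = new Properties();
        props.put("bootstrap.servers", "kafka:9092");
        props.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        System.out.println(props.getProperty("bootstrap.servers"));
    }
}
```

These properties would then be passed to the Kafka producer's constructor, and the resource method would call its send() method with the incoming message.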