How to integrate JaCoCo Code coverage tool with Maven

I used the Cobertura code coverage tool for one of my recent projects, following the steps mentioned in this link: Cobertura Example.

When I tried to upgrade the Java version to 1.8, I ran into issues due to the use of lambda expressions, so I looked for another code coverage tool and chose JaCoCo.

Assume that I have a Maven Java project with some unit test cases. I have given the relevant snippet from my Maven pom.xml below. Copy and paste it into the build section and then run your build. After the build is done, go to target/site/jacoco-ut/ and open the index.html file to view the code coverage report.


<plugin>
                <groupId>org.jacoco</groupId>
                <artifactId>jacoco-maven-plugin</artifactId>
                <version>0.7.7.201606060606</version>
                <configuration>
                 <!-- Add exclusions here to leave them out of the code coverage analysis -->
                    <excludes>
                        <exclude>**/model/**</exclude>
                        <exclude>**/test/**</exclude>                        
                    </excludes>
                </configuration>
                <executions>
                    <execution>
                        <id>pre-unit-test</id>
                        <goals>
                            <goal>prepare-agent</goal>
                        </goals>
                        <configuration>
                            <destFile>${project.build.directory}/coverage-reports/jacoco-ut.exec</destFile>
                            <propertyName>surefireArgLine</propertyName>
                        </configuration>
                    </execution>
                    <execution>
                        <id>post-unit-test</id>
                        <phase>test</phase>
                        <goals>
                            <goal>report</goal>
                        </goals>
                        <configuration>
                            <dataFile>${project.build.directory}/coverage-reports/jacoco-ut.exec</dataFile>
                            <outputDirectory>${project.reporting.outputDirectory}/jacoco-ut</outputDirectory>
                        </configuration>
                    </execution>
                    <execution>
                        <id>pre-integration-test</id>
                        <phase>pre-integration-test</phase>
                        <goals>
                            <goal>prepare-agent</goal>
                        </goals>
                        <configuration>
                            <destFile>${project.build.directory}/coverage-reports/jacoco-it.exec</destFile>
                            <propertyName>failsafeArgLine</propertyName>
                        </configuration>
                    </execution>
                    <execution>
                        <id>post-integration-test</id>
                        <phase>post-integration-test</phase>
                        <goals>
                            <goal>report</goal>
                        </goals>
                        <configuration>
                            <dataFile>${project.build.directory}/coverage-reports/jacoco-it.exec</dataFile>
                            <outputDirectory>${project.reporting.outputDirectory}/jacoco-it</outputDirectory>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
            <plugin>
                <artifactId>maven-surefire-plugin</artifactId>
                <version>2.19.1</version>
                <configuration>
                    <argLine>${surefireArgLine}</argLine>
                    <excludes>
                        <exclude>**/*IntegrationTest*</exclude>
                    </excludes>
                </configuration>
            </plugin>
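The pom above wires the JaCoCo agent into integration tests through the failsafeArgLine property, but the Failsafe plugin itself is not shown. If you also run integration tests, a matching configuration would look roughly like the below sketch (the version number is only illustrative):

```xml
<plugin>
    <artifactId>maven-failsafe-plugin</artifactId>
    <version>2.19.1</version>
    <configuration>
        <!-- Pick up the JaCoCo agent set by the pre-integration-test execution -->
        <argLine>${failsafeArgLine}</argLine>
        <includes>
            <include>**/*IntegrationTest*</include>
        </includes>
    </configuration>
    <executions>
        <execution>
            <goals>
                <goal>integration-test</goal>
                <goal>verify</goal>
            </goals>
        </execution>
    </executions>
</plugin>
```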

Rest API to produce message to Kafka using Docker Maven Plugin

I have developed a simple REST API to send the incoming message to Apache Kafka.

I have used Docker Kafka (https://github.com/spotify/docker-kafka) and the Docker Maven Plugin(https://github.com/fabric8io/docker-maven-plugin) to do this.

Before going through this post, familiarize yourself with Docker and Docker Compose.

The Docker Maven Plugin[Docker Maven Plugin] provides a nice way to specify multiple images in the pom.xml and link them as necessary. We could also use Docker Compose for this, but I have used the plugin here.

    1. Clone the project (https://github.com/dkbalachandar/kafka-message-sender)
    2. Go into the kafka-message-sender folder
    3. Enter 'mvn clean install'
    4. Enter 'mvn docker:start', then enter 'docker ps' and make sure that two containers are running, named kafka and kafka-rest
    5. Access http://localhost:8080/api/kafka/send/test (POST) and confirm that you see 'message has been sent' in the browser
    6. Enter the below command and make sure that whatever message you sent is available in Kafka[Kafka Command Line Consumer]; you can also consume it via a Flume agent[Kafka Flume Agent Consumer]
docker exec -it kafka /opt/kafka_2.11-0.8.2.1/bin/kafka-console-consumer.sh --zookeeper localhost:2181 --topic test --from-beginning

How to check Kafka topic properties and node information in Zookeeper

We can verify the Kafka topic properties by running the below command from the Kafka installation directory:


  bin/kafka-topics.sh --zookeeper zookeeper.host.name:2181 --topic topic1 --describe

The output will look something like the below:


Topic:topic1         PartitionCount:1        ReplicationFactor:3     Configs:
        Topic: topic1       Partition: 0    Leader: 100     Replicas: 100,101,102   Isr:  100,101,102

From that output, we can see that the replication factor for this topic is 3 and that the replicas live on broker IDs 100, 101 and 102.

If we want to verify the same information in Zookeeper, follow the below steps.

Log in to the Zookeeper node, go to the installation directory, and then into the bin folder.

Then run zkCli as below:


 ./zkCli.sh

Then enter the below command to see all the broker node ID details:


  ls /brokers/ids

Then enter the below command to see the details of the topic:


 get /brokers/topics/topic1
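For a topic matching the describe output above (one partition, replicas 100, 101, 102), `ls /brokers/ids` would list the broker IDs (e.g. [100, 101, 102]), and the get command returns the topic's partition assignment as JSON roughly like the following. The exact layout varies by Kafka version, so treat this as an illustrative sketch:

```json
{"version":1,"partitions":{"0":[100,101,102]}}
```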

Please refer to the below link for more commands: https://cwiki.apache.org/confluence/display/KAFKA/Kafka+data+structures+in+Zookeeper

Spark Job for Removing Empty XML tags

The below Spark program reads a set of XML files, removes the empty tags, and writes the output as a sequence file.

package com.spark

import org.apache.spark.{SparkConf, SparkContext}

//Find empty tags and remove it and then write as sequence files
object TagHandler {
  def main(args: Array[String]) {
    if (args.length < 2) {
      println("Usage: <Source File or Directory> <Destination File or Directory>")
      // bin/spark-submit --class com.spark.TagHandler --master local tagHandler-assembly-1.0.jar /data/xml /data/output
      sys.exit(1)
    }

    val inFile = args(0)
    val outFile = args(1)

    // Empty (self-closing) tags to strip out
    val htmlTags = List("<sub/>", "<sup/>", "<i/>", "<b/>")

    val conf = new SparkConf().setAppName("TagHandler")
    val sc = new SparkContext(conf)
    val wholeFiles = sc.wholeTextFiles(inFile)

    // Remove every occurrence of each empty tag from each file's content
    val cleaned = wholeFiles.map { case (fileName, content) =>
      val newContent = htmlTags.foldLeft(content)((acc, tag) => acc.replace(tag, ""))
      (fileName, newContent)
    }
    // Save all (fileName, content) pairs once; saving inside a per-file loop
    // would fail after the first file because the output path already exists
    cleaned.saveAsSequenceFile(outFile)
    sc.stop()
  }
}
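The core tag-removal step can be checked in isolation, without Spark. A minimal sketch (the object and method names here are mine, not part of the job), assuming the same list of empty tags:

```scala
object TagRemovalDemo {
  // The same empty, self-closing tags the Spark job strips out
  val htmlTags = List("<sub/>", "<sup/>", "<i/>", "<b/>")

  // Remove every occurrence of each empty tag from the input string
  def stripEmptyTags(content: String): String =
    htmlTags.foldLeft(content)((acc, tag) => acc.replace(tag, ""))

  def main(args: Array[String]): Unit =
    println(stripEmptyTags("<p>H<sub/>2<sup/>O is water<i/><b/></p>"))  // prints <p>H2O is water</p>
}
```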


Java 8 + Cobertura maven plugin – net/sourceforge/cobertura/coveragedata/TouchCollector – No class def found error

I got the below stack trace while upgrading the Java version from 1.7 to 1.8 and using the Cobertura Maven plugin version 2.7.


java.lang.NoClassDefFoundError: net/sourceforge/cobertura/coveragedata/TouchCollector
    at [org.package.ClassName].__cobertura_init([ClassName].java)
    at [org.package.ClassName].([ClassName].java)
    at [org.package.ClassName]Test.[method]([ClassName]Test.java:113)
Caused by: java.lang.ClassNotFoundException: net.sourceforge.cobertura.coveragedata.TouchCollector


I tried various options, but nothing worked out. I finally solved it by following the workaround given in the below link:

http://www.befreeman.com/2014/09/getting-cobertura-code-coverage-with.html
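In case that link goes away: a commonly reported fix for this error (not necessarily identical to the linked workaround) is to add Cobertura itself as a test-scoped dependency, so that TouchCollector is on the test classpath. A sketch; the version number is only illustrative:

```xml
<dependency>
    <groupId>net.sourceforge.cobertura</groupId>
    <artifactId>cobertura</artifactId>
    <version>2.1.1</version>
    <scope>test</scope>
</dependency>
```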

If you use lambda expressions, it's better to use a code coverage tool other than Cobertura.

Refer to my other post to learn how to use the JaCoCo code coverage tool: How to integrate JaCoCo Code coverage tool with Maven

Spark XML – How to replace hyphen symbols found in XML elements

Using a regular expression:


import org.apache.spark.{SparkConf, SparkContext}

object TransFormXmlApp {
  def main(args: Array[String]) {
    if (args.length < 2) {
      println("Usage: <Source File> <Destination File>")
      sys.exit(1)
    }
    val inFile = args(0)
    val outFile = args(1)
    val conf = new SparkConf().setAppName("TransFormXmlApp")
    val sc = new SparkContext(conf)
    val wholeFiles = sc.wholeTextFiles(inFile)
    // Matches an opening or closing tag name, allowing hyphens in the name
    val htmlReg = "</?[a-zA-Z-]+".r
    // Replace hyphens with underscores inside tag names only; text content is untouched
    val transformed = wholeFiles.map { case (fileName, content) =>
      htmlReg.replaceAllIn(content, m => m.toString.replace("-", "_"))
    }
    // Save all transformed files once, rather than once per file
    transformed.saveAsTextFile(outFile)
    sc.stop()
  }
}
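The regex trick can be verified on a small string without Spark. A minimal sketch (object and method names are mine), with the tag-name pattern written as `</?[a-zA-Z-]+`:

```scala
object HyphenRegexDemo {
  // Matches an opening or closing tag name, allowing hyphens, e.g. "<my-tag" or "</my-tag"
  val htmlReg = "</?[a-zA-Z-]+".r

  // Replace hyphens with underscores inside tag names only; text content is left alone
  def hyphenToUnderscore(xml: String): String =
    htmlReg.replaceAllIn(xml, m => m.toString.replace("-", "_"))

  def main(args: Array[String]): Unit =
    println(hyphenToUnderscore("<my-tag>a-b</my-tag>"))  // prints <my_tag>a-b</my_tag>
}
```

Note that the hyphen in the text content a-b survives, because the pattern only matches runs that begin with `<` or `</`.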

Using scala-xml API:


import org.apache.spark.{SparkConf, SparkContext}

import scala.xml._
import scala.xml.transform._

object TransFormXmlApp {
  def main(args: Array[String]) {
    if (args.length < 2) {
      println("Usage: <Source File> <Destination File>")
      sys.exit(1)
    }
    val inFile = args(0)
    val outFile = args(1)
    val conf = new SparkConf().setAppName("TransFormXmlApp")
    val sc = new SparkContext(conf)
    val wholeFiles = sc.wholeTextFiles(inFile)

    // Replace hyphens in element names with underscores
    val hyphenAsUnderScoreRule = new RewriteRule {
      override def transform(nodes: Node): Seq[Node] = nodes match {
        case Elem(prefix, label, attribs, scope, children @ _*) =>
          Elem(prefix, label.replace("-", "_"), attribs, scope, false, children: _*)
        case _ => nodes
      }
    }
    // Remove hyphens from element names (alternative rule, not applied below)
    val removeHyphenRule = new RewriteRule {
      override def transform(nodes: Node): Seq[Node] = nodes match {
        case Elem(prefix, label, attribs, scope, children @ _*) =>
          Elem(prefix, label.replace("-", ""), attribs, scope, false, children: _*)
        case _ => nodes
      }
    }
    // Transform on the driver (RewriteRule is not serializable), then save once
    val transformed = wholeFiles.collect().map { case (fileName, content) =>
      new RuleTransformer(hyphenAsUnderScoreRule).transform(XML.loadString(content)).mkString
    }
    sc.parallelize(transformed.toSeq).saveAsTextFile(outFile)
    sc.stop()
  }
}
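The RewriteRule approach can also be checked on its own, without Spark. A minimal sketch (object and method names are mine), assuming scala-xml is on the classpath:

```scala
import scala.xml._
import scala.xml.transform._

object HyphenRuleDemo {
  // Rename every element by replacing hyphens in its label with underscores
  val hyphenAsUnderscoreRule = new RewriteRule {
    override def transform(n: Node): Seq[Node] = n match {
      case Elem(prefix, label, attribs, scope, children @ _*) =>
        Elem(prefix, label.replace("-", "_"), attribs, scope, false, children: _*)
      case other => other
    }
  }

  // Parse, transform the whole tree (RuleTransformer recurses into children), and stringify
  def rename(xml: String): String =
    new RuleTransformer(hyphenAsUnderscoreRule).transform(XML.loadString(xml)).mkString

  def main(args: Array[String]): Unit =
    println(rename("<a-b><c-d>x</c-d></a-b>"))
}
```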