uaiMockServer – A lot of new features to help you with your tests

Hello, how are you?

The newest release of uaiMockServer has a lot of new features that will have a big impact among the open source HTTP mock frameworks.

The first big feature is a GUI that will allow you to edit your requests:

Index

You will not need to edit those boring configuration files anymore; all you need to do is access the GUI through the default URL: http://localhost:1234/uaiGui/index

Another feature allows you to analyse everything in your request/response very easily. All you need to do is go to the Log tab and fire the request:

Request Log

And if you open the details of a successful request you will see:

Success Request

And if you open the details of a request with an error:

Error Detail 01

To help you find out what kind of error just happened, you will also be able to see the server log:

Error Detail 02

Last but not least, there is a JUnit runner that will control the server for you:

UaiMockServer JUnit Runner

Using it like the sample above, you will not need to start and stop uaiMockServer manually. Used this way, uaiMockServer will search for the default config file named uaiMockServer.json.

You can also set a specific config file:

UaiMockServer JUnit Runner Configuration
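
Since the screenshots above are only captions here, the snippet below gives a rough idea of how the runner is used. Be aware that the runner and annotation names in this sketch are my assumptions, not the confirmed API; check the project documentation for the real class names.

// WARNING: UaiMockServerRunner and @UaiMockServerConfig are assumed names, used only
// to illustrate the idea; the real class names may differ (check the documentation).
import org.junit.Test;
import org.junit.runner.RunWith;

@RunWith(UaiMockServerRunner.class)       // starts the server before the tests and stops it afterwards
// @UaiMockServerConfig("myConfig.json")  // hypothetical way of pointing to a specific config file
public class CustomerRestTest {

    @Test
    public void aTest() {
        // fire real HTTP requests against http://localhost:1234 here
    }
}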

And if you are a Spring user, you could also use the Spring Runner:

UaiMockServer JUnit SpringRunner Configuration

The last change in this new version was to move the config file format from HOCON to JSON. Unfortunately you will need to convert your config file from HOCON to JSON, sorry for that. I could not keep using the HOCON format anymore.

The config file will look like below:

New Configuration File as JSON

One advantage of having the config file as JSON is that any file editor will be able to give you better support.
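
The screenshot above is not reproduced here, so the sketch below gives a rough idea of how the settings of the old HOCON sample could look as JSON. The exact attribute names and nesting are my assumption, not the confirmed format; use the sample file from SourceForge as the real reference.

{
  "com.uaihebert.uaimockserver": {
    "port": 1234,
    "host": "localhost",
    "context": "/uaiMockServer",
    "defaultContentTypeResponse": "text/html;charset=UTF-8",
    "routes": [
      {
        "request": {
          "path": "/",
          "method": "GET"
        },
        "response": {
          "contentType": "application/json;charset=UTF-8",
          "statusCode": 200,
          "body": "{\"mockBody\":{\"title\":\"Hello World\"}}"
        }
      }
    ]
  }
}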

The project site: http://uaimockserver.com/

Source code: https://github.com/uaihebert/uaiMockServer

Standalone version and config example: https://sourceforge.net/projects/uaimockserver/

Maven import:

<dependency>
    <groupId>uaihebert.com</groupId>
    <artifactId>uaiMockServer</artifactId>
    <version>1.1.0</version>
    <scope>test</scope>
</dependency>

I hope you liked the news.

If you have any questions, doubts or suggestions, just post them below.

o_

uaiMockServer – Create a Mock REST Server with a single command line

Hello, how are you?

Have you ever needed to invoke a REST service that was not created yet?

Imagine that you are developing an app that needs to invoke a REST service that does not exist yet. You will need to write fake code that will be replaced once the real service is finished. No matter if you are working with Java, .NET, PHP, Ruby, Python, etc., you will always need to create a fake invocation method if the REST service is not ready yet.

When we write our unit tests we face the same problem: we need to create mock code that will be invoked during the tests. The problem with this approach is that the real code is never invoked.

How can we solve this problem? Is there a solution that works for Delphi, .NET, PHP, Java, Android, iOS, Windows Phone, etc.?

I present to you now my most recent creation: “uaiMockServer”.

With uaiMockServer you will be able to create a mock server using only a JAR file and a config file. With this server you will be able to execute real HTTP requests, but using your test data.

There are two ways of using uaiMockServer: Standalone and Unit Tests

Standalone

To use uaiMockServer as a standalone server you just type the command that starts it. Run the command below in a prompt:

java -jar uaiMockServer-{VERSION}.Standalone.jar

You will only need the config file, which we will talk about very soon.

In the sample config file there is a mapping using port 1234. If you are using Linux, type the command:

curl localhost:1234/uaiMockServer/

The HTTP response will contain a JSON body.
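
With the sample config file (shown in the Configuration section below), the root route is mapped to return the configured body, so the response should contain:

{"mockBody":{"title":"Hello World"}}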

Unit Test

The first step is to add the project to your pom:

<dependency>
   <groupId>uaihebert.com</groupId>
   <artifactId>uaiMockServer</artifactId>
   <version>1.0.1</version>
   <scope>test</scope>
</dependency>

Now you can create test code like this:

import static org.junit.Assert.assertTrue;

import org.junit.AfterClass;
import org.junit.BeforeClass;
import org.junit.Test;

// the import of the UaiMockServer class and your own Customer class are omitted here

public class YourTest {
    private static UaiMockServer uaiMockServer;
 
    @BeforeClass
    public static void beforeClass() {
        uaiMockServer = UaiMockServer.start();
    }
 
    @AfterClass
    public static void afterClass() {
        uaiMockServer.shutdown();
    }
 
    @Test
    public void aTest() {
        final Customer customer = // invoke the URL
        assertTrue(customer != null);
    }
}

You can invoke the URL using any kind of framework. Below you will see a code sample using the RESTEasy framework (through the JAX-RS client API) doing a request mapped in the sample config file (that we will see soon):

// imports needed: javax.ws.rs.client.Client, javax.ws.rs.client.ClientBuilder,
// javax.ws.rs.core.Response and the static org.junit.Assert.assertEquals
@Test
public void isGetRootReturning200() {
    final String url = "http://localhost:1234/uaiMockServer/";
 
    Client client = ClientBuilder.newClient();
    Response response = client.target(url).request().get();
 
    assertEquals(200, response.getStatus());
}

What do you gain in the test? You will not need mock code anymore; you can fire a real HTTP request from your JUnit test.

Configuration

To run the project you will need a config file. A simple sample would look like the one below:

com.uaihebert.uaimockserver {
    port = 1234
    host = localhost
    fileLog = false
    consoleLog = true
    context = /uaiMockServer
    defaultContentTypeResponse = "text/html;charset=UTF-8"
    routes = [
        {
            request {
                path = "/"
                method = "GET"
            }
            response {
                contentType = "application/json;charset=UTF-8"
                statusCode = 200
                body = """{"mockBody":{"title":"Hello World"}}"""
            }
        }
    ]
}

The file must have the .config extension and is not in JSON format; it uses HOCON, a superset of JSON used in configuration files (click here for more details).

With this file you can set up the port, the host, headers, query parameters and several other options that are described in the uaiMockServer documentation.

Notice that the configuration defines a request and a response; with these blocks you can mock every request that you want.
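
For example, to mock another endpoint you only need to add one more entry to the routes list. The sketch below reuses the same attributes from the sample above; the /customer path and its body are made up for illustration:

{
    request {
        path = "/customer"
        method = "GET"
    }
    response {
        contentType = "application/json;charset=UTF-8"
        statusCode = 200
        body = """{"name":"Mocked Customer"}"""
    }
}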

If you are going to use the standalone version, the config file must be in the same directory as the JAR file. You can set a different path with the command:

java -jar uaiMockServer-{VERSION}.Standalone.jar FULL_FILE_PATH

If you run the project in your unit tests (like JUnit) you must put the config file in the “src/test/resources” Maven folder, or you can pass the full path like:

uaiMockServer = UaiMockServer.start(FULL_FILE_PATH);

Is it free?

Yes, it is. Use it at will.

Is it open source? Yes: https://github.com/uaihebert/uaiMockServer

Where is the documentation? http://uaimockserver.com/

Where can I find the standalone version and the sample config file? http://sourceforge.net/projects/uaimockserver/files/

Does it have any automated tests? Yes, there are a lot of tests and the project has 100% code coverage.

What about performance?

In the Git repository there is a test that executes 300 requests in less than 2 seconds.
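
Just to give an idea of what such a test can look like, here is a sketch of mine (not the actual test from the repository) that assumes the server was already started as shown earlier and fires the requests in a loop with the same JAX-RS client API:

import static org.junit.Assert.assertEquals;
import static org.junit.Assert.assertTrue;

import javax.ws.rs.client.Client;
import javax.ws.rs.client.ClientBuilder;
import javax.ws.rs.core.Response;

import org.junit.Test;

public class SimpleLoadTest {

    @Test
    public void fires300RequestsQuickly() {
        final Client client = ClientBuilder.newClient();
        final long start = System.currentTimeMillis();

        for (int i = 0; i < 300; i++) {
            final Response response = client.target("http://localhost:1234/uaiMockServer/").request().get();
            assertEquals(200, response.getStatus());
            response.close(); // release the connection before the next iteration
        }

        final long elapsed = System.currentTimeMillis() - start;
        assertTrue("expected 300 requests in less than 2s, took " + elapsed + "ms", elapsed < 2000);
    }
}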


Covering your tests with Cobertura, JUnit, HSQLDB, JPA

Hello, how are you?

Let us talk today about a very useful tool named “Cobertura”. This framework has the same purpose as the Emma framework that we saw in another post.

The major difference between Cobertura and Emma is that Cobertura displays the summary page with charts.

If you want to see the other posts about this subject, check these links: Coverage of tests with JUnit, Ant and Emma // JUnit with HSQLDB, JPA and Hibernate // TDD – First Steps.

I will use the same code from this post (JUnit with HSQLDB, JPA and Hibernate); if you want to set up an environment to run the code you can follow the steps you will find there (the source code is available for download at the end of this article).

I am not a skilful Ant user, so you may look at some of the code and think “this script is not good”. Feel free to give me ideas on how to “upgrade” the Ant code. ;)

Let us download the Cobertura libraries – version 1.9.4.1.

After the download finishes, put the jar files (cobertura.jar and everything under /lib, which includes ASM, Jakarta and log4j) inside the lib folder of our project.

In the root of our project, create a file named “build.xml” with the following code (the file must be in the root of your project or the script will not work):

<project name="Cobertura Coverage" basedir=".">

    <!--  Project Source  Code -->
    <property name="src.dir" value="src" />
    <property name="build.dir" value="bin" />
    <property name="teste.dir" value="src/test" />
    <property name="lib.dir" value="lib" />
    <property name="report.dir" value="cobertura" />

    <!-- Project classpath -->
    <path id="project.classpath">
        <pathelement location="${bin.dir}" />
        <fileset dir="${lib.dir}">
            <include name="*.jar" />
        </fileset>
    </path>

    <!-- Tested Class -->
    <property name="DogFacadeTest" value="test.com.facade.DogFacadeTest" />

</project>

In the code above we are defining the paths to the source code and to the libraries. Let us now create a task to delete the files generated by Cobertura and the compiled Java classes.

<!-- Clears the paths -->
<target name="01-CleannUp" description="Remove all generated files.">
    <delete dir="${build.dir}" />
    <delete file="cobertura.ser" />
    <delete dir="${report.dir}" />

    <mkdir dir="${build.dir}" />
    <mkdir dir="${report.dir}" />
</target>

To compile our source code, add the code below and run the task (make sure the code compiles):

<!-- Compiles the Java code -->
<target name="02-Compile" depends="01-CleannUp" description="invoke compiler">
    <javac debug="true" debuglevel="vars,lines,source" srcdir="${src.dir}" destdir="${build.dir}">
        <classpath refid="project.classpath" />
    </javac>
    <copy file="${src.dir}/META-INF/persistence.xml" todir="${build.dir}/META-INF" />
</target>

Let us set up Cobertura so it can instrument the compiled classes and get the environment ready:

<!-- Cobertura configs -->
<property name="cobertura.instrumented-classes.dir" value="${report.dir}/instrumented-classes" />
<property name="cobertura.data.file" value="cobertura.ser" />
<path id="cobertura.classpath">
    <fileset dir="${lib.dir}" includes="/*.jar" />
</path>

<!-- Points to the cobertura jar -->
<taskdef classpath="${lib.dir}/cobertura.jar" resource="tasks.properties" classpathref="cobertura.classpath" />

<!-- Instruments the classes -->
<target name="03-Instrument" depends="02-Compile">
    <delete quiet="false" failonerror="false">
        <fileset dir="${cobertura.instrumented-classes.dir}" />
    </delete>
    <delete file="${cobertura.data.file}" />
    <cobertura-instrument todir="${cobertura.instrumented-classes.dir}">
        <fileset dir="${build.dir}">
            <include name="**/*.class" />
            <exclude name="**/*Test.class" />
        </fileset>
    </cobertura-instrument>
    <copy todir="${cobertura.instrumented-classes.dir}">
        <fileset dir="${src.dir}" casesensitive="yes">
            <patternset id="resources.ps" />
        </fileset>
    </copy>
</target>

Add the code below and you will be able to execute the tests with JUnit through Ant:

<!-- Set up the instrumented classes path -->
<path id="cover-test.classpath">
    <fileset dir="${lib.dir}" includes="**/*.jar" />
    <pathelement location="${cobertura.instrumented-classes.dir}" />
    <pathelement location="${build.dir}" />
</path>

<!-- Run the JUnit test -->
<target name="04-RunTest" depends="03-Instrument" >
    <junit printsummary="yes" haltonerror="no" haltonfailure="no"  fork="yes">
        <batchtest>
            <fileset dir="${build.dir}" includes="**/*Test.class" />
        </batchtest>
        <classpath refid="cover-test.classpath" />
    </junit>
    <delete file="transaction.log" />
</target>

As our last action, let us create the report using Cobertura. Add the code below to your “build.xml” and execute the task:

<!-- Creates the Cobertura report -->
<target name="00-CreateReport" depends="04-RunTest">
    <cobertura-report datafile="${cobertura.data.file}" destdir="${report.dir}">
        <fileset dir="${src.dir}">
            <include name="**/*.java" />
        </fileset>
    </cobertura-report>

    <delete dir="${report.dir}/instrumented-classes" />
    <delete file="cobertura.ser"  />
</target>

Refresh your Eclipse project (press F5 on the project) and you will see that the report was created successfully. Open the “index.html” file inside the “cobertura” folder.



Cobertura helps us see how many times each part of a method was executed by our tests. As homework, write the Dog equals method and create the report so you can see that your equals is not covered; then create your tests and run the report again.
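
As a hint for the homework, here is a sketch of what the equals method and a test for it could look like, based on the Dog entity (id, name and weight) used in that post. Treat it as a starting point, not the only possible answer:

// In Dog.java: a possible equals/hashCode pair based on the id field
@Override
public boolean equals(Object obj) {
    if (this == obj) {
        return true;
    }
    if (!(obj instanceof Dog)) {
        return false;
    }
    return this.id == ((Dog) obj).id;
}

@Override
public int hashCode() {
    return id;
}

// In a test class: a test that covers the new method, so Cobertura marks it as covered
@Test
public void isEqualsComparingById() {
    Dog dogA = new Dog();
    Dog dogB = new Dog();

    // both ids are still 0 (not persisted yet), so the dogs should be equal
    assertEquals(dogA, dogB);
}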

You can download the source code of our project here.

If you have any doubts or questions, just write them below.

See you soon. o_

TDD – Coverage of tests with JUnit, Ant and Emma

Hello, good morning.

When we write our TDD tests we always care about writing the right tests and about being sure that we are testing all the classes we have. Imagine that you test 4 methods but forget to test the catch of an exception. How sad is that?

To help us with this problem there are coverage tools; they will let you know which parts of the code your tests reached and which parts they missed.

I will talk about two test coverage tools: Emma (today’s post) and Cobertura (the next post about TDD).

I will use the same code you can find here (JUnit with HSQLDB, JPA and Hibernate). If you want to set up your environment, follow that how-to and everything will work for sure. Another post about TDD is: TDD – First Steps.

Today we will use the Emma framework with Ant. I do not have good knowledge of Ant yet, so the code I found on the internet may have some redundancies or something like that. Some day I will improve it.

Let us start downloading the libraries:

The JUnit library is only used by the Emma framework; it is not necessary to add it to your project “Build Path”.

Create a file named “build.xml” in the root folder of your project (pay attention: the file must be in the root of your project, or this script will fail):

Edit the file and add the code below:

<project name="Emma Reports" basedir=".">

    <!--  Project Source  Code -->
    <property name="src.dir" value="src" />
    <property name="bin.dir" value="bin" />
    <property name="teste.dir" value="src/test" />
    <property name="lib.dir" value="lib" />

    <!-- Emma source code -->
    <property name="emma.bin.dir" value="emma/bin" />
    <property name="emma.metadate.dir" value="emma/metadate" />
    <property name="emma.report.dir" value="emma/report" />

    <!-- Tested Class -->
    <property name="DogFacadeTest" value="test.com.facade.DogFacadeTest"/>

    <!-- Project classpath -->
    <path id="project.classpath">
        <pathelement location="${bin.dir}" />
        <fileset dir="${lib.dir}">
            <include name="*.jar" />
        </fileset>
    </path>

    <!-- Emma task definitions that you will find inside the jar -->
    <taskdef resource="emma_ant.properties">
        <classpath refid="project.classpath" />
    </taskdef>

    <!-- JUnit task definition -->
    <taskdef name="junit" classname="org.apache.tools.ant.taskdefs.optional.junit.JUnitTask" />

</project>

In the “build.xml” file we are defining the script variables, the file paths and the JUnit test to be executed.

Go to the menu Window > Show View > Other. Type Ant and press Ok:


Drag the “build.xml” file and drop it into the Ant view. Double-click the file to execute it. You will see this message:

If you see the message “Could not load definitions from resource emma_ant.properties. It could not be found.”, it is because your lib path is wrong. Check again that your build.xml file is in the root of your project.

Do not go forward without solving this path issue.

Let us edit our “build.xml” and add the code to compile our Java code. It has two actions (an action is defined as a target):

    <!-- Clean UP Your Code -->
    <target name="01-CleanUp">
        <delete dir="${bin.dir}" />
        <mkdir dir="${bin.dir}" />
    </target>

    <!-- Compile Your Code -->
    <target name="02-CompileSourceCode" depends="01-CleanUp">
        <javac debug="on" srcdir="${src.dir}" destdir="${bin.dir}">
            <classpath refid="project.classpath" />
        </javac>
        <copy file="${src.dir}/META-INF/persistence.xml" todir="${bin.dir}/META-INF" />
    </target>

Notice that we have two actions named “01-CleanUp” and “02-CompileSourceCode”; to compile our code the target “depends” on the action that cleans the bin path, so every time you compile the source code of your project, Ant will run the action that cleans the old code first.

Let us now create a task with a word that you will hear a lot: instrumentation. Just to comment on it, Emma works through bytecode instrumentation; it does not need adaptations or switches in the JVM to watch over your code. It analyzes your code through the bytecode only.

Edit your “build.xml” file and add the following task:

    <!-- Instruments the classes with Emma -->
    <target name="03-Instrumentation" depends="02-CompileSourceCode">
        <emma>
            <instr instrpath="${bin.dir}" destdir="${emma.bin.dir}" metadatafile="${emma.metadate.dir}/metadate.emma" merge="false" mode="fullcopy" />
        </emma>
    </target>

Let us now add the target that will allow us to run the JUnit tests:

    <!-- Runs JUnit Tests -->
    <target name="04-RunTests" depends="03-Instrumentation">
        <junit haltonfailure="false" haltonerror="false" fork="true">
            <classpath>
                <pathelement location="${emma.bin.dir}/classes" />
                <pathelement location="${emma.bin.dir}/lib" />
                <path refid="project.classpath" />
            </classpath>
            <formatter type="plain" usefile="false" />
            <test name="${DogFacadeTest}" />
            <jvmarg value="-Demma.coverage.out.file=${emma.metadado.dir}/cobertura.emma" />
            <jvmarg value="-Demma.coverage.out.merge=false" />
        </junit>
    </target>

Run the target and you will see that our code has no errors:

Let us create the task that will generate the report:

    <!-- Creates the report -->
    <target name="00-GenerateReport" depends="04-RunTests">
        <delete dir="${emma.report.dir}" />
        <emma enabled="true">
            <report sourcepath="${src.dir}" sort="+block,+name,+method,+class" metrics="method:70,block:80,line:80,class:100">
                <fileset dir="${emma.metadate.dir}">
                    <include name="*.emma" />
                </fileset>
                <html outfile="${emma.report.dir}/report.html" depth="method" columns="name,class,method,block,line" />
            </report>
        </emma>
    </target>

Execute the task and then open the report file that you will find in “/emma/report/report.html”.

As our last action, let us delete the files that were generated only to build the report and are not needed anymore. Edit the task that creates the report and add the line that calls the file-deleting task:

    <!-- Creates the report -->
    <target name="00-GenerateReport" depends="04-RunTests">
        <delete dir="${emma.report.dir}" />
        <emma enabled="true">
            <report sourcepath="${src.dir}" sort="+block,+name,+method,+class" metrics="method:70,block:80,line:80,class:100">
                <fileset dir="${emma.metadate.dir}">
                    <include name="*.emma" />
                </fileset>
                <html outfile="${emma.report.dir}/report.html" depth="method" columns="name,class,method,block,line" />
            </report>
        </emma>

        <antcall target="05-DeleteOldReportData" />
    </target>

    <!-- Delete Old Report Data -->
    <target name="05-DeleteOldReportData">
        <delete dir="${emma.bin.dir}" />
        <delete dir="${emma.metadate.dir}" />
    </target>

Click here to download the source code of this tutorial.

I hope this post might help you.

If you have any doubts or comments, just post them.

See you later! o_


JUnit with HSQLDB, JPA and Hibernate

Hello, good morning.

Today we will talk about integrating your database with your unit tests. One of the best solutions we can find is to run an in-memory database at test time.

The HSQLDB database will do the work of creating the table structures and the primary key relationships, and will allow JPA to work without problems.

In this How-To we will see how to create a Unit Test (TDD) with JPA and HSQLDB.

To see the other post about TDD you can check it here: TDD – First Steps.

You will need to download the HSQLDB jar that you will find here (http://hsqldb.org/ – version 2.25, the latest version so far).

I will use Hibernate as our JPA provider. In this tutorial (Tutorial Hibernate 3 with JPA 2) you will find the link to download it.

Create a Java project at File > New Project > Java Project.

Let us create a folder named lib and put all the libraries inside the folder. The files will be those inside the HSQLDB and Hibernate zips:

I will not get into the details of how to set up the application; I will give only a few hints. For a more detailed post about JPA/Hibernate you can check this link: (Tutorial Hibernate 3 with JPA 2).

Right-click the project and go to Properties. Then go to Java Build Path > Libraries tab > Add JARs. Select the files in the lib folder.

You will need to create a folder named “src/META-INF” and inside it create a file named persistence.xml:

<?xml version="1.0" encoding="UTF-8"?>

<persistence version="2.0"
    xmlns="http://java.sun.com/xml/ns/persistence" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://java.sun.com/xml/ns/persistence http://java.sun.com/xml/ns/persistence/persistence_2_0.xsd">

    <persistence-unit name="HsqldbWithTDD" transaction-type="RESOURCE_LOCAL">
        <provider>org.hibernate.ejb.HibernatePersistence</provider>

        <properties>
            <property name="javax.persistence.jdbc.driver" value="org.hsqldb.jdbcDriver" />
            <property name="javax.persistence.jdbc.url" value="jdbc:hsqldb:mem:." />
            <property name="javax.persistence.jdbc.user" value="sa" />
            <property name="javax.persistence.jdbc.password" value="" />
            <property name="hibernate.show_sql" value="true" />
            <property name="hibernate.dialect" value="org.hibernate.dialect.HSQLDialect"/>
            <property name="hibernate.connection.shutdown" value="true"/>
            <property name="hibernate.hbm2ddl.auto" value="create-drop"/>
        </properties>
    </persistence-unit>
</persistence>

Let us create an entity to be persisted, a Dog class:

package com.model;

import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.GenerationType;
import javax.persistence.Id;
import javax.persistence.NamedQuery;
import javax.persistence.Table;

@Entity
@Table(name = "dog")
@NamedQuery(name="listALL", query="select d from Dog d")
public class Dog {

    public static final String LIST_ALL = "listALL";

    @Id
    @GeneratedValue(strategy = GenerationType.AUTO)
    private int id;
    private String name;
    private double weight;

    // Getters and Setters
}

Let us create a DAO class that will have a basic CRUD for the Dog class (in the Dog class you will find the “NamedQuery” annotation; we have not studied this annotation here in the blog yet, but it “is like” a raw SQL query):

package com.dao;

import java.util.List;

import javax.persistence.EntityManager;
import javax.persistence.EntityManagerFactory;
import javax.persistence.Persistence;

import com.model.Dog;

public class DogDAO {

    private EntityManagerFactory emf;
    private EntityManager em;

    public void startConnection(){
        emf = Persistence.createEntityManagerFactory("HsqldbWithTDD");
        em = emf.createEntityManager();
        em.getTransaction().begin();
    }

    public void closeConnection(){
        em.getTransaction().commit();
        emf.close();
    }

    public void save(Dog dog){
        em.persist(dog);
    }

    public void edit(Dog dog){
        em.merge(dog);
    }

    public Dog find(int dogId){
        return em.find(Dog.class, dogId);
    }

    public void remove(Dog dog){
        em.remove(dog);
    }

    public List<Dog> listALL(){
        return em.createNamedQuery(Dog.LIST_ALL, Dog.class).getResultList();
    }
}

What about a test now? Let us see if the dog is being persisted. We will check this through a class that we will name “Main”; run this class and you will see a message like the picture below:

package com;

import com.dao.DogDAO;
import com.model.Dog;

public class Main {

    public static void main(String[] args) {
        DogDAO dogDAO = new DogDAO();

        dogDAO.startConnection();

        try {
            Dog dog = new Dog();
            dog.setName("Beethoven");
            dog.setWeight(45);

            dogDAO.save(dog);

            // It was the first saved dog, so its id is 1
            Dog persistedDog = dogDAO.find(1);

            System.out.println("Name: " + persistedDog.getName());
            System.out.println("Weight: " + persistedDog.getWeight());
        } catch (Exception e) {
            System.out.println("Ops, something happen: " + e.getMessage());
            e.printStackTrace();
        }finally{
            dogDAO.closeConnection();
        }
    }
}


Let us now apply the TDD methodology to create a method/class. We will create a method that gives us the average weight of all the dogs in our database.

I will create a DogFacade class that will do all the work, including dealing with the DAO.

We need to configure the JUnit library. We will start from the same screen where we added HSQLDB and Hibernate; click the Add Library button:

Select JUnit 4 and finish.

Following good TDD practice, let us create our test class before the DogFacade class.

package test.com.facade;

import static junit.framework.Assert.assertEquals;

import org.junit.Test;

import com.facade.DogFacade;
import com.model.Dog;

public class DogFacadeTest {

    @Test
    public void isWeightOfAllDogsCorrect(){
        DogFacade dogFacade = new DogFacade();
        dogFacade.startConnection();

        createData(dogFacade);
        assertEquals(30, dogFacade.getAverageDogsWeight(), 0.001);

        dogFacade.closeConnection();
    }

    private void createData(final DogFacade dogFacade){
        Dog dogA = new Dog();
        dogA.setName("Big Dog");
        dogA.setWeight(45);
        dogFacade.save(dogA);

        Dog dogB = new Dog();
        dogB.setName("Medium Dog");
        dogB.setWeight(30);
        dogFacade.save(dogB);

        Dog dogC = new Dog();
        dogC.setName("Small Dog");
        dogC.setWeight(15);
        dogFacade.save(dogC);
    }
}

After you create the test class you will see tons of errors, but that is expected when following the TDD principle of “Red, Green, and Refactor”. For more details about this concept, check this link: TDD – First Steps.

The code of this post is not complex, so the Façade itself handles the connection. When we work with web applications it is a good idea to use resource injection, so if you want to keep using that kind of resource, pick the pattern that works best for you (I will list below some ideas I had, but you can find tons of other ideas on the internet):

  • Pass the EntityManager in the constructors. Your test class would create the EntityManager and send it to the DAO through the Façade. It would be like this: new DogFacade(entityManager); and inside the DogFacade class you would do new DogDAO(entityManager);
  • You can create a DAO in your test class (with an active EntityManager) and pass it through overloaded constructors in the Façades.

You will need to adopt some strategy to pass the EntityManager from the test class to your DAO, or it will “rain” NullPointerExceptions in your console.

When you use any of the strategies above you have the downside that your Façade will know about the EntityManager or will have a constructor that receives a DAO. The overloaded constructor should be used only by test classes and should not interact with the production code.
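
Just to make the first idea concrete, here is a sketch of how the overloaded constructor could look. These constructors do not exist in the code of this post (which sticks to the simpler version shown next); they only illustrate how the test class could pass the EntityManager down to the DAO:

package com.facade;

import javax.persistence.EntityManager;

import com.dao.DogDAO;

// Illustration only: an extra, test-only constructor that receives an already opened
// EntityManager and passes it down to the DAO. DogDAO would need a matching
// constructor as well, which it does not have in this post.
public class DogFacade {

    private final DogDAO dogDAO;

    public DogFacade() {
        this.dogDAO = new DogDAO();
    }

    public DogFacade(EntityManager entityManager) {
        this.dogDAO = new DogDAO(entityManager);
    }
}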

Let us see what the DogFacade class of this post will look like:

package com.facade;

import java.util.List;

import com.dao.DogDAO;
import com.model.Dog;

public class DogFacade {

    private DogDAO dogDAO;

    public DogFacade() {
        dogDAO = new DogDAO();
    }

    public void startConnection() {
        dogDAO.startConnection();
    }

    public double getAverageDogsWeight() {
        double totalWeight = 0;

        List<Dog> dogs = dogDAO.listALL();

        for(Dog dog : dogs){
            totalWeight += dog.getWeight();
        }

        return totalWeight / dogs.size();
    }

    public void save(Dog dog){
        dogDAO.save(dog);
    }

    public void closeConnection() {
        dogDAO.closeConnection();
    }
}

Let us run our test again:


Final thoughts:

  • I think this approach is better than using mock objects. A mocked object will only act the way it was programmed to. In our case the DAO behaved exactly as it will behave with the production code, and not like a mocked object. The Façade was not mocked either; it behaved without any change at runtime in our tests.
  • I find the mock strategy very useful when you need to test classes that integrate with other systems; mocking is also a good strategy when you work with legacy code that would cost more to refactor than to mock.
  • With this approach you will be able to create tests for your view classes like JSF, Struts, Seam, etc. In a future post I will write about this.

I hope this post might help you.

If you have any doubt or want to share an opinion just post it.

Bye o_

TDD – First Steps

What is TDD? Where do we start? How do we do a test?

TDD is a methodology to validate classes/methods/attributes that will be created. For example, suppose you were asked to create a method that searches for the data of a house. With TDD, it is possible to validate all the situations that the developer might imagine, like: house without a number, null as a method return, owner deceased, and so on.

Where do we start? We need to configure JUnit, which will make our vision of TDD clearer. I’m sorry if you already know how to do this; it will be fast (we have to help the rookies!).

I will be using Eclipse as the IDE, but TDD should (and must) be applied in any IDE. In case you don’t use JUnit as your test framework, the concepts that you will see here are applicable to TDD with any framework.

PS: The little details, such as creating packages/classes or opening an Eclipse view, you can find here: Creating a WebServer.

I’ve created a project named TDD. Then, using the “Package Explorer” view, add the JUnit library:
Config 01
Config 02
Config 03

We are going to write a test for this use case: your project will have a class named User that will store all the information about each person’s logon/logoff in your software. As a requirement, this class will have a method that will block the user’s access to the software. Let’s create a package named “test.com” and a class named TestUser inside this package.

package test.com;

public class TestUser {
}

The TDD ideology is that we should start a test from its end. (WHAT?!) In software companies today it is common to start writing code from where it should start, but with TDD we start from the ending. Why? By coding using TDD we get an idea that leads us to where we are supposed to go, we avoid the creation of unnecessary parameters, the created methods are more cohesive, and so it goes…

When we write test cases we must have this sequence in mind: “Red, Green and Refactor”. What does this mean? When we write test cases using JUnit we have a bar (that we will see very soon) that shows red when our tests fail and green when all tests pass.

To verify the expected result we use methods that assert that result for us. We will be using assertEquals(expected value, returned value). In our use case, what do we want? Whenever the user has his access denied, our method should return true. Shall we write some code?

package test.com;
import static org.junit.Assert.*;
import org.junit.Test;

public class TestUser {
    @Test
    public void isUserWithDeniedAccess(){
        assertEquals(true, user.accessDenied);
    }
}

Let’s talk about this code. Observe that we have the “@Test” annotation; when we run the JUnit framework, it will locate all methods with that annotation and invoke them as test validations. We can see the assertEquals that we imported from JUnit through an “import static”. Notice that I am already “trying to access” an object that does not exist in the method or anywhere in the project. Do you remember when I said that we should start from the ending? Our ending (goal, target) is that the User really gets no access to the software. Even if the class/attribute/method is not in the code yet, as a first step we use it as if it already existed. Writing the code from its ending enables us to start thinking about class names, attributes and so on.

Even if the code does not compile, we are on the way… Small steps… Always small steps.

And to finish our test, the code will look like this:

@Test
public void isUserWithDeniedAccess(){
        User user = new User();
        assertEquals(true, user.accessDenied);
}

Observe that the User class was “instantiated” even though it does not exist in our project yet. Now we have a test case that does not compile.

Do you remember the TDD steps: “Red, Green and Refactor”? To achieve the red bar we first need to remove all compile errors. Let’s create our User class inside a package named “com” and create the needed attribute.

package com;
public class User {
    public boolean accessDenied;
}

To finish, add the import statement to our test class (TestUser):

import com.User;

We are ready to get the red bar. To run the test using JUnit, just right-click the test class and go to “Run as > JUnit Test”.
Running JUnit

The red bar tells us that we are ready to work on our method so that it returns the desired value. We start with an error to be sure that, when we get the green bar, our method is really working as we wished. In our situation we are doing a simple test, but if we were testing a complex method, it would give us the assurance that we reached our goal. Using a unit test also means we do not need to build and run the whole application every time we want to test a recently created method; we just execute the tests and check the result of our coding.

Our next step will be to get the green bar. Usually at this part of the process we write the code of our method. The simple use case that we are doing does not need a method to be tested yet; we just need to change the attribute value. Our test is the one that changes the class attribute, stimulating it to hold the value we want to test. That’s why we are not changing our User class: we are using our test method to stimulate it by changing its value so the User class behaves as we want. TDD works with small steps; if you already have good knowledge you might create the necessary methods instead of taking these small steps. But it is fine to access the attributes directly for now, in small steps, because we are already testing them. Later, we will do some refactoring to encapsulate those attributes.

@Test
public void isUserWithDeniedAccess(){
        User user = new User();
        user.accessDenied = true;
        assertEquals(true, user.accessDenied);
}

Run JUnit again and you will get a green bar.

And what do we have? The ugliest working code ever, with nothing to show if we start looking for OO/design patterns. What should we do? The method/attribute refactoring should be left for the end. Let’s add this refactoring to our “TO-DO” list; keeping this list will help us not to forget anything.

Using this approach we get a better view of the code and of where we can change and “upgrade” it. It only gets better: we are sure that our new code is working as we desired. But we need to change our code to follow OO patterns before we consider the task complete. We will create a method that checks if the user has his access denied, and we need to protect our attribute.

// TestUser.java
package test.com;

import static org.junit.Assert.assertEquals;
import org.junit.Test;
import com.User;

public class TestUser {
    @Test
    public void isUserReallyBlocked(){
        User user = new User();
        user.accessDenied = true;
        assertEquals(true, user.hasAccessDenied());
    }
}

//User.java
package com;

public class User {
    public boolean accessDenied;

    public boolean hasAccessDenied() {
        return accessDenied;
    }
}

We did not change the attribute access level yet because we still have one more step. Let’s create a “way” for our class to change its own attribute and protect it from direct access. We will use the Tell, don’t ask concept. Writing the code with this “pattern” we will have clean and bulletproof code, and our task will be complete. Our final code would be:

package test.com;

import static org.junit.Assert.assertEquals;
import org.junit.Test;
import com.User;

public class TestUser {
    @Test
    public void isUserReallyBlocked(){
        User user = new User();
        user.denyAccess();
        assertEquals(true, user.hasAccessDenied());
    }
}

package com;

public class User {
    private boolean accessDenied;

    public boolean hasAccessDenied() {
        return accessDenied;
    }

    public void denyAccess() {
        accessDenied = true;
    }
}

Observe our code and notice that our User class has clear, straightforward code without unnecessary methods. Starting from the test allows us to focus more on the behavior. Just run the test again to see the final green bar.

I hope this post helps you with your first steps. Soon I will post some good TDD practices here so we can make our test cases even better. Today’s post was more about the tool; in the next post we will look at TDD concepts.

Good night.