
Test-driven development with Spring Boot

Test-driven development is a buzzword in the software development community nowadays. However, many developers do not do a very good job in this area — but that's a separate topic. In this article, I will explain how we can test the different components of a Spring application.

There are basically three different components in a Spring Application.
  1. Controller Layer
  2. Client Layer
  3. Data Layer
The controller layer is the one that receives requests from clients.
The client layer is responsible for calling external services from our application.
The data layer is responsible for connecting to various databases and getting/persisting data for us.

There can be other layers as well, such as a business layer where some validations might be done, but writing unit tests for those components is pretty straightforward: test each method, mock any external dependencies with Mockito or PowerMock, and cover the scenarios.
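For instance, a business-layer unit test with Mockito might look roughly like the following. This is only a minimal sketch: OrderService and PriceClient are hypothetical names used for illustration, not classes from a real project.

import static org.assertj.core.api.Assertions.assertThat;
import static org.mockito.Mockito.verify;
import static org.mockito.Mockito.when;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.InjectMocks;
import org.mockito.Mock;
import org.mockito.junit.MockitoJUnitRunner; // org.mockito.runners.MockitoJUnitRunner on Mockito 1.x

@RunWith(MockitoJUnitRunner.class)
public class OrderServiceTest {

    @Mock
    private PriceClient priceClient;      // external dependency, mocked away

    @InjectMocks
    private OrderService orderService;    // class under test

    @Test
    public void calculateTotalShouldUseMockedPrice() {
        // stub the external call
        when(priceClient.getPrice("book")).thenReturn(10.0);

        double total = orderService.calculateTotal("book", 3);

        assertThat(total).isEqualTo(30.0);
        verify(priceClient).getPrice("book");
    }
}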

When you create a Spring Boot application, by default Spring creates a test class with the following annotations:

@RunWith(SpringRunner.class)
@SpringBootTest

Normally, developers use the @SpringBootTest annotation to test all the components. But this annotation initializes the whole application context, and if there is a large number of test classes, using it can be time consuming, because you are configuring the application context multiple times and this is an expensive operation.

Spring provides a number of annotations to test different layers of the application. When you use a layer-specific test annotation, the whole context is not created and the tests run much faster.

Annotations provided by Spring:

  • @SpringBootTest
  • @WebMvcTest
  • @RestClientTest
  • @DataJpaTest
  • @JsonTest
  • @DataMongoTest 
One thing to keep in mind is that when you use @WebMvcTest, regular @Component beans are not scanned. The @WebMvcTest annotation is normally used to test a single controller, and it is frequently used together with MockMvc, which offers a powerful way to test MVC controllers without a full HTTP server.

Sample code:

@RunWith(SpringRunner.class)
@WebMvcTest(MyController.class)
@ActiveProfiles("unittest")
@Category(UnitTest.class)
public class MySampleTest {

 @Autowired
 private MockMvc mockMvc;
 
 @MockBean
 private MyAsyncService mySvc;
 
 @MockBean
 private MyServiceClient myClient;
 
 @MockBean
 private RestTemplate restTemplate;
 
 @Test
 public void test_pageNotFound() throws Exception {
  MockHttpServletResponse response = 
    this.mockMvc.perform(post("/my-endpoint/1234"))
    .andReturn()
    .getResponse();
  assertThat(response.getStatus()).isNotEqualTo(404);
 }
...
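Other tests in the same class could assert the response status and body directly. A rough sketch (the endpoint and the JSON field are assumptions, using the usual MockMvc static imports for get, status and jsonPath):

 @Test
 public void test_getMyResource() throws Exception {
  // GET on a hypothetical endpoint, checking the status and a JSON field
  this.mockMvc.perform(get("/my-endpoint/1234"))
    .andExpect(status().isOk())
    .andExpect(jsonPath("$.id").value("1234"));
 }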

The next annotation is @RestClientTest, which can be used if you want to test REST clients. By default, it configures Jackson and Gson support, configures a RestTemplateBuilder, and adds support for MockRestServiceServer (which intercepts the HTTP call).

Sample code:

@RunWith(SpringRunner.class)
@RestClientTest(RemoteVehicleDetailsService.class)
public class ExampleRestClientTest {

    @Autowired
    private RemoteVehicleDetailsService service;

    @Autowired
    private MockRestServiceServer server;

    @Test
    public void getVehicleDetailsWhenResultIsSuccessShouldReturnDetails()
            throws Exception {
        this.server.expect(requestTo("/greet/details"))
                .andRespond(withSuccess("hello", MediaType.TEXT_PLAIN));
        String greeting = this.service.callRestService();
        assertThat(greeting).isEqualTo("hello");
    }

}
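For context, the service under test typically obtains its RestTemplate from the auto-configured RestTemplateBuilder. A rough sketch of what RemoteVehicleDetailsService might look like (the method body is an assumption based on the test above):

@Service
public class RemoteVehicleDetailsService {

    private final RestTemplate restTemplate;

    public RemoteVehicleDetailsService(RestTemplateBuilder restTemplateBuilder) {
        this.restTemplate = restTemplateBuilder.build();
    }

    public String callRestService() {
        // matches the request expected by the MockRestServiceServer in the test
        return this.restTemplate.getForObject("/greet/details", String.class);
    }
}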

The @DataJpaTest annotation is used to test JPA components. By default it configures an in-memory embedded database, scans for @Entity classes, and configures Spring Data JPA repositories. Data JPA tests are transactional and roll back at the end of each test by default; the sample below shows how to disable that behaviour with @Transactional(propagation = Propagation.NOT_SUPPORTED).

Sample code:

@RunWith(SpringRunner.class)
@DataJpaTest
@Transactional( propagation=Propagation.NOT_SUPPORTED)
public class ExampleRepositoryTests {

    @Autowired
    private TestEntityManager entityManager;

    @Autowired
    private UserRepository repository;

    @Test
    public void testExample() throws Exception {
        this.entityManager.persist(new User("sboot", "1234"));
        User user = this.repository.findByUsername("sboot");
        assertThat(user.getUsername()).isEqualTo("sboot");
        assertThat(user.getVin()).isEqualTo("1234");
    }

}
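For completeness, the repository used above could be as simple as the following sketch (the User entity with username and vin fields is assumed to exist elsewhere):

public interface UserRepository extends CrudRepository<User, Long> {

    User findByUsername(String username);
}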

There are also some other best practices that you might want to use.

When you need the whole application context to be initialized, you can configure Spring to start the embedded server on a random port.

@SpringBootTest(webEnvironment = WebEnvironment.RANDOM_PORT)


The random port can be accessed inside your test code as follows:

@LocalServerPort
private int port;
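Together with the auto-configured TestRestTemplate, a full-context test might look roughly like this. This is only a sketch: the /health endpoint is just an illustration, and the usual imports are omitted as in the samples above.

@RunWith(SpringRunner.class)
@SpringBootTest(webEnvironment = WebEnvironment.RANDOM_PORT)
public class RandomPortIntegrationTest {

 @LocalServerPort
 private int port;                    // injected with the random port

 @Autowired
 private TestRestTemplate restTemplate;   // auto-configured for RANDOM_PORT tests

 @Test
 public void healthEndpointShouldRespond() {
  ResponseEntity<String> response = this.restTemplate
    .getForEntity("http://localhost:" + port + "/health", String.class);
  assertThat(response.getStatusCode().is2xxSuccessful()).isTrue();
 }
}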

Enjoy Test Driven Development !!!


References:
https://docs.spring.io/spring-boot/docs/current/reference/html/boot-features-testing.html


Writing automated unit and integration test cases for a Spring Boot application with Gradle | Approach | case example

I want to write about my experience in a TDD environment where I had a unique problem, and how I solved it.

The problem was that I had to develop an application which provides some endpoints for another team to access our team's features. The application would receive a service request, call a bunch of other web services to do some analysis, and then return the status of the request (valid or invalid, with other details). I had to write unit tests as well as integration tests while developing the application so that, in the future, if someone else made a code change and broke something, the tests would fail. The application was being built and deployed to the servers using a continuous integration pipeline. The problem is that I cannot assume the other services my service depends on are always up, because if they are down, the tests would fail and the changes would not be deployed to the development servers even if there is nothing wrong with my code.

Solution.
The solution to this kind of problem is to separate your tests into unit tests and integration tests. The unit tests should run every time you build the application and deploy it to the servers, and there should be no external dependencies in them: external services, external databases, message queues or streams should not be called in the unit-test context. In the integration tests, you test the integration among different components, so they could be unit-integration tests or tests of integration among different applications/systems. In those automated tests, you can access the separate components and applications.

These concepts are described much better in the awesome article written by Martin Fowler on his blog. Do check it out.

To separate out the unit tests and the integration tests, I adopted the following approach.

First, create the categories into which you want to separate your tests:

public interface IntegrationTest {

}

public interface UnitTest {

}



Then, for every test class that you want to mark as a UnitTest or IntegrationTest, simply apply the category.

import io.restassured.http.ContentType;
import org.codehaus.jettison.json.JSONObject;
import org.junit.Test;
import org.junit.experimental.categories.Category;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.autoconfigure.web.servlet.WebMvcTest;
import org.springframework.test.context.ActiveProfiles;
import org.springframework.test.context.junit4.SpringRunner;
import org.springframework.test.web.servlet.MockMvc;

import com.rajan.IntegrationTest;

import java.io.IOException;
import java.util.HashMap;
import java.util.Map;

import static io.restassured.RestAssured.given;
import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.get;
import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.jsonPath;
import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.status;


@RunWith(SpringRunner.class)
@WebMvcTest
@Category(IntegrationTest.class)
public class ControllerTest {

Note that you will need JUnit 4 to use the @Category feature.
Once the tests are categorized, you can write a Gradle task to run only the tests of a particular category, as follows.

// add this in the build.gradle
task integrationtest(type: Test) {
    environment "context", "inttest"
    useJUnit {
        includeCategories 'com.rajan.IntegrationTest'
    }
}

// to execute this task from the command line, run the following
gradle clean integrationtest

The above command will run all the tests marked with the IntegrationTest marker interface.
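Similarly, a complementary task could run only the unit tests by excluding the integration category. A sketch along the same lines (adjust the package name to wherever your marker interfaces live):

// add this in the build.gradle as well
task unittest(type: Test) {
    useJUnit {
        excludeCategories 'com.rajan.IntegrationTest'
    }
}

// run it with: gradle clean unittest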

You will have to add the test dependencies in the build.gradle file for these tasks to work.

//testCompile('com.github.tomakehurst:wiremock:2.6.0')
testCompile('io.rest-assured:rest-assured:3.0.2')
testCompile('org.springframework.boot:spring-boot-starter-test')

testCompile('org.powermock:powermock-module-junit4:1.6.2')
//testCompile('org.powermock:powermock-api-mockito:1.10.8')
testCompile('org.mockito:mockito-core:1.10.8')
testCompile('org.easymock:easymock:3.4')
testCompile group: 'org.hamcrest', name: 'hamcrest-core', version: '1.3'
testCompile group: 'org.hamcrest', name: 'hamcrest-library', version: '1.3'
After this, I would also like to talk a little bit about a few popular libraries out there for writing tests and some of their popular features.

Customizing docker image | case example

We use Concourse for CI/CD pipelines.
Concourse internally uses Docker images for its jobs.
Jobs basically require some input and output folders and some environment variables to be set, and then the commands to be executed.

Basically, a Docker container is like an instance of a virtual machine with the required environment configured, so that the user can directly use the binaries to execute commands and do the job.
For example, to run an npm command in Docker, you need a Docker image with the npm binaries already installed and configured; when you start the container, you can just start executing those commands.
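As a rough illustration (the node:8 tag is just an example, and the mounted path is a placeholder):

# mounts the current directory into the container and runs npm there
docker run --rm -v "$PWD":/app -w /app node:8 npm install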

Sometimes you will not find a Docker image with all the dependencies installed and configured in it, so you will have to prepare the environment yourself once.
I encountered a similar situation where most of the things were configured in one of the Docker images, but some of the additional dependencies I required were not there.
I did some research and got the work done.

Basically, I had to build some old Tomcat grid applications in the pipeline and publish the results to a quality hub.

I found a Docker image with Java and Ant configured, but the Ivy binaries were missing from the image.

NOTE: Ivy is a dependency management tool which is normally used in Ant projects. It is similar to Maven in that you declare dependencies in an XML file and it downloads them for you.

So, here are the steps to get the work done.
1. Pull and run the first docker image.

docker pull docker.artifactory.rajan.com/java/ant-image:latest
docker run -itd --name ant-ivy docker.artifactory.rajan.com/java/ant-image sh

Note: if your Docker image is in a custom Artifactory registry and not on Docker Hub, you have to give the complete path to the registry, as shown above.

2. Attach your terminal to the Docker container.

docker attach ant-ivy

3. Clone the Ivy project inside the container and build it.

git clone https://git-wip-us.apache.org/repos/asf/ant-ivy.git
cd ant-ivy
ant jar

4. Copy the built ivy.jar to the /usr/share/ant/lib folder.
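For example (run inside the container; the exact output path of the jar produced by `ant jar` is an assumption, so adjust it accordingly):

# copy the freshly built jar into Ant's library folder
cp build/artifact/jars/ivy.jar /usr/share/ant/lib/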

5. If you have any other files to copy from your local machine to the running Docker container, use the following command.

docker cp fileToCopy ant-ivy:/folder_name

6. Once your configuration is complete, create a new image out of the running container.

docker commit ant-ivy docker.artifactory.rajan.com/rajan/ant_with_ivy:version_01

This command will create a new Docker image from the running ant-ivy container instance. (Run it from a different terminal, since typing exit in the attached terminal will stop the container.)

7. Log in to the custom Docker registry.

docker login docker.artifactory.rajan.com

8. Push the image to the registry.

docker push docker.artifactory.rajan.com/rajan/ant_with_ivy:version_01

Now if you run the following command from another machine, it will be able to pull the Docker image.

docker pull docker.artifactory.rajan.com/rajan/ant_with_ivy:version_01

That's all for today.
Hope this helps someone.

Playing with Docker

Docker is a tool for running applications in an isolated environment. It provides the same benefits as running the application on a virtual machine, but it is far more lightweight and fast, and it offers several other advantages.

Advantages:

1. Same environment everywhere
2. Sandboxed projects
3. Eliminates the configuration and setup phase of a virtual machine

Demo

There is a tutorial here. Go through the Docker for Beginners workshop and you will have a very good understanding of using docker.

I am going to list a few of the commands as a reference in this blog.

First complete these steps:

  • Install Docker on your machine
  • Start the Docker service
  • Run a standard container (e.g. hello-world or alpine)

Note: If you are a Linux user, add your user to the docker group to avoid typing sudo in front of every docker command
Basic commands
  • Check Docker info
    docker info
  • Check docker images in your machine
    docker images

  • Check docker processes (containers)
    docker ps
    docker ps -a

  • Stop a docker container
    docker stop <container-id or container-name>
  • Remove a docker container
    docker rm <container-id or container-name>
  • Remove a docker image
    docker rmi <image-id or image-name>
  • Run docker build to create an image
    docker build -t USERNAME/IMG_TAG_NAME .
  • Run an image
    docker run -p HOST_PORT:GUEST_PORT -v HOST_PATH:GUEST_PATH IMAGE_NAME
Run interactive command on a container
Let's say you have a container running (say ubuntu). Now you want to connect to the container and run some shell commands from your host machine. You can use this command to connect to the running container and test whether the scripts work or not.

docker exec -it <container name or id> sh  

Run mongodb in docker:

  1. docker run --name mongo -p 27017:27017 -v ~/mongodb/data/:/data/db -d mongo

Run another container and link the previous container (mongo) to it:
sudo docker run -itd -e NODE_ENV=production -e PEDA_HOST=peda.app.rajanu.com.np -e PEDA_PORT=3000 --link mongo --name=peda -p 3000:3000 rajanpupa/peda
The above command links the mongo container to the peda container; Docker creates some environment variables in the peda container which can be used to connect to the mongo container.

for example:
MONGO_PORT_27017_TCP_ADDR  = 127.0.0.10

which can be accessed from a nodejs app running in peda by
process.env.MONGO_PORT_27017_TCP_ADDR

If the container started successfully, you can attach to it from the host's command line using the following command.

docker attach peda

To push the image to Docker Hub:
docker push rajanpupa/peda
Before that, you need to have a Docker Hub account and you need to be logged in:
docker login

To see the logs of a container (for debugging):
docker logs <id>

Simple, right?

Pivotal Cloud Foundry Concepts


Cloud Foundry Concepts. 

Cloud Foundry is an open platform as a service, providing a choice of clouds, developer frameworks, and application services. ... It is an open source project and is available through a variety of private cloud distributions and public cloud instances.
Pivotal's implementation of Cloud Foundry is called Pivotal Cloud Foundry (PCF).

Like any cloud environment, PCF provides an easy way to deploy many kinds of applications developed in different languages/frameworks, scalability and availability for your applications, and pay-as-you-go pricing (shut down what you don't need) that is very competitive compared to other cloud providers. The PCF website gives you a starting credit of about $80, which you can use to play with the PCF environment. If your application is very basic, this credit can last for years, because an application with a single instance and a memory limit of 128MB costs roughly $3 per month. You can use Node.js as a language to play with PCF, since its memory footprint is pretty low (especially compared to Java). If you are using the Spring Boot Java framework, you need a minimum of about 512MB of memory per instance for it to function properly.

Getting started.

Aws Experience

I had been thinking about creating an Amazon developer account and experimenting with some EC2 stuff for about two years. Recently I created an account and began experimenting with the free tier.

Instances

Basically, Amazon offers a free micro instance for a year. You have the option to choose from different operating systems, from Windows to Ubuntu, Red Hat and many more. The name of the instance type (nano, micro, ...) reflects the specifications that instance will have. For example, the micro instance has 8GB of local disk space and 1GB of memory, with one processor core. Similarly, you can choose other, larger instances based on your need. The cost of the instances goes up with the specifications.

AMI

There are also many other images (AMIs) available which have different features already installed/configured. Some AMIs are already configured for a Java environment, while others have pre-installed databases and other languages/packages. You can even create your own AMI once you have installed and configured your instance. I think Amazon charges you for the storage of the AMI, but it's pretty low compared to keeping an instance alive. The benefit of an AMI is that you don't have to spend hours installing the required packages and configuring your machine every time you fire up a new instance.

Security Groups

You can attach various security groups to your instance. A security group is where you configure your security settings, such as:
  • whether incoming traffic should be allowed or not
  • which IP addresses/ports are open for incoming and outgoing traffic
By default, ports are not open and you can't connect to your remote instance via SSH or anything else. You have to manually allow the connections via the security settings.

Storage

In addition to the local storage of each instance, you have the option to choose other storage services for higher capacity and reliability. Local storage is not reliable for persistent data, as it may be lost when the instance restarts for some reason.

S3 is the most popular simple bulk object storage service provided by Amazon, at a reasonable cost that depends on storage capacity and data transfer. It's good for long-term storage of objects. With the API and key-based authentication, it's very easy to store and retrieve files and objects.

Load Balancing

Amazon has a load balancing service which is very handy for scaling and security. You can create a load balancer and point it to an instance group, and the load balancer automatically distributes the load among the servers in round-robin or another configured way. From a security standpoint, you can make the security group of your instances private so that they cannot be accessed publicly from the internet and can only be reached via internal components such as the load balancer. This definitely makes the job of a hacker very difficult, if not impossible.

Auto Scaling

There is an auto scaling feature available in AWS which can automatically increase or decrease the number of instances depending on resource utilization. For example, you can configure the group to add one instance every 5 minutes for the next 10 minutes if CPU utilization is above 70%, and similarly to remove instances if CPU utilization drops below 50%, with a minimum of one instance, and so on.

Services

AWS provides various built-in services that you can use directly. These include the following:
  • Computing services
  • Storage
  • Databases
  • Developer tools (CodeCommit, CodeBuild, CodePipeline)
  • Management tools (CloudWatch, Config, CloudFormation, ...)

AWS SDK

Amazon also provides an SDK which can be used to automate interaction with AWS. You basically add the AWS SDK dependency to your Maven or Gradle build file and use it to interact with AWS: automating work with S3, the creation and configuration of EC2 instances, and much more.
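For example, uploading a file to S3 with the Java SDK might look roughly like this. A sketch only: the bucket name, key and file are placeholders, and you would first add the aws-java-sdk-s3 dependency to your build.

import java.io.File;

import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;

public class S3UploadExample {

    public static void main(String[] args) {
        // uses credentials and region from the default provider chain
        // (environment variables, ~/.aws/credentials, etc.)
        AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();

        // "my-example-bucket" and report.txt are placeholders
        s3.putObject("my-example-bucket", "backups/report.txt", new File("report.txt"));

        System.out.println("Uploaded report.txt to S3");
    }
}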

Here is a very good YouTube video on using the SDK to interact with S3 for storage.

I will keep adding other information to this blog as I gain new information.

Adding jQuery animation to Blogger

I wanted to implement a collapse feature on my blog.

I am following the tutorial from http://www.stramaxon.com/2013/09/expandable-section-boxes.html

<div class='hidden-section-container'>
  <div class='sh-section-btn'><span>Show</span></div>
  <div class='h-section-cont shw-box'>
    <!-- All your text/html below this -->
     <p>All your content here</p>
    <!-- All your text/html above this -->
  </div>
</div>


(A live demo of the expandable section box, with a "Show" button, a sample title and collapsible content, is embedded at this point in the original blog post.)