Bytecode Hacking

So here was the problem. A customer was using a third-party library and ran into a bug in one particular method of that library. The method wrote data via a socket on a specific port to a remote computer. On occasion, for some reason, the socket was closed and the data could not be written. The customer and the library vendor went back and forth pointing fingers at each other, each claiming the problem lay either in the library or in the application that used it.

I inspected the questionable method by decompiling its code. See my earlier article on Java decompilation. It turned out the method was very small and simple:

public void sendData(String host, int port, byte[] data) throws IOException {
    InetAddress address = InetAddress.getByName(host);
    DatagramPacket packet = new DatagramPacket(data, 0, 
        data.length, address, port);
    datagramSocket.send(packet);
}

The inputs to the method were the host and port that were to receive the data, along with a byte array containing the actual data to be sent. The method converted the host to an InetAddress object, which was needed to create the DatagramPacket, a wrapper around the recipient address, port, and data. The last line called send(), which actually sent the data to the remote computer.

On occasion, for some reason, datagramSocket had been closed and so the call to send() failed. And we had no way of knowing the host, port, or data when this situation occurred. (Note that there are really a number of ways to address and debug this problem. I have created a simplified problem and solution to demonstrate the methodology I am presenting.)

At this point I want to introduce the reader to a very powerful Java toolkit called Javassist. Its documentation states:

Javassist (Java Programming Assistant) makes Java bytecode manipulation simple. It is a class library for editing bytecodes in Java; it enables Java programs to define a new class at runtime and to modify a class file when the JVM loads it. Unlike other similar bytecode editors, Javassist provides two levels of API: source level and bytecode level. If the users use the source-level API, they can edit a class file without knowledge of the specifications of the Java bytecode. The whole API is designed with only the vocabulary of the Java language. You can even specify inserted bytecode in the form of source text; Javassist compiles it on the fly. On the other hand, the bytecode-level API allows the users to directly edit a class file as other editors.

We are going to use Javassist to modify sendData() by adding a System.out.println() at the front of the method. The println will fire whenever the host name changes (this extra logic is added only to show the power of Javassist). In reality, anything within the method can be altered. In our example we will replace the entire method body with our own content. Here is what we want the method to look like when modified:

public String savedHost = null;

public void sendData(String host, int port, byte[] data) throws IOException {  
    if (savedHost == null || !savedHost.equals(host)) {   
        System.out.println("host=" + host + "; port=" + port); 
        savedHost = host;
    }
    InetAddress address = InetAddress.getByName(host);
    DatagramPacket packet = new DatagramPacket(data, 0, data.length,
        address, port);
    datagramSocket.send(packet);
}

So how do we use Javassist to “hack” this method and add our code?  The first thing we must do is access the class itself.  This is done via the following code snippet:

CtClass ctClass = 
    ClassPool.getDefault().get("com.dataform.MyClass");

The next step is to create our instance variable “savedHost” and add it to our class:

CtField savedHost = CtField.make("public String savedHost;", ctClass);
ctClass.addField(savedHost);

Next we want to replace the existing sendData() method with our updated version of the method. There are actually a number of ways to do this. The Javassist API has a rich set of methods for inserting code into an existing method (or even creating new methods). Here I will show how to replace the existing method code with our replacement code:

 
CtMethod method = ctClass.getDeclaredMethod("sendData");
String newMethod =
    "{" +
    "    if (savedHost == null || !savedHost.equals(host)) {" +
    "        System.out.println(\"host=\" + $1 + \"; port=\" + $2);" +
    "        savedHost = $1;" +
    "    }" +
    "    java.net.InetAddress address = java.net.InetAddress.getByName($1);" +
    "    java.net.DatagramPacket packet = new java.net.DatagramPacket($3, $3.length, address, $2);" +
    "    datagramSocket.send(packet);" + 
    "}";

method.setBody(newMethod); 

Looking at the code snippet above, there are a number of items of interest. First, note that the arguments to the method are not referenced by name but by number: the first argument is $1, the second is $2, and so on. Second, non-local classes such as InetAddress are fully qualified (i.e., java.net.InetAddress). Javassist has methods for adding import statements but I did not use them. The last item of note is that the method body must begin and end with braces ({}). Note the last line above: the body defined by the string “newMethod” is set as the replacement body of the method.
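
If you only want to prepend code rather than replace the whole body, CtMethod also offers insertBefore() (and insertAfter()), inherited from CtBehavior. Here is a minimal sketch, assuming the same ctClass and savedHost field created above:

CtMethod method = ctClass.getDeclaredMethod("sendData");
// Prepend the logging logic; the original body of sendData() stays untouched.
method.insertBefore(
    "{" +
    "    if (savedHost == null || !savedHost.equals($1)) {" +
    "        System.out.println(\"host=\" + $1 + \"; port=\" + $2);" +
    "        savedHost = $1;" +
    "    }" +
    "}");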

Our modified class needs to be saved as the original class is part of a library jar file.  To save the class, add the following line to the code:

 ctClass.writeFile(".");

In order to execute our modified class instead of the original class in the library, we need to add the following Java VM argument:

-Xbootclasspath/p:.
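
Putting the pieces together, here is a minimal sketch of the whole patch program. The class name com.dataform.MyClass is the placeholder used above, and error handling is simply declared away with throws Exception; treat this as a sketch rather than production code:

import javassist.ClassPool;
import javassist.CtClass;
import javassist.CtField;
import javassist.CtMethod;

public class PatchSendData {
    public static void main(String[] args) throws Exception {
        // Load the target class from the default class pool.
        CtClass ctClass = ClassPool.getDefault().get("com.dataform.MyClass");

        // Add the new instance field used to remember the last host seen.
        CtField savedHost = CtField.make("public String savedHost;", ctClass);
        ctClass.addField(savedHost);

        // Replace the body of sendData() with the instrumented version.
        CtMethod method = ctClass.getDeclaredMethod("sendData");
        method.setBody(
            "{" +
            "    if (savedHost == null || !savedHost.equals($1)) {" +
            "        System.out.println(\"host=\" + $1 + \"; port=\" + $2);" +
            "        savedHost = $1;" +
            "    }" +
            "    java.net.InetAddress address = java.net.InetAddress.getByName($1);" +
            "    java.net.DatagramPacket packet = new java.net.DatagramPacket($3, $3.length, address, $2);" +
            "    datagramSocket.send(packet);" +
            "}");

        // Write the patched class file to the current directory so it can be
        // placed ahead of the library jar when the JVM starts.
        ctClass.writeFile(".");
    }
}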

And that is all there is to it. You can download the Javassist library from http://www.javassist.org.  The downloaded zip file includes a directory with three tutorial HTML files that explain and describe all the available options.

Happy hacking!

 

Productivity Tools for Work and Home

Over the years, I have read many articles presenting the authors’ opinions of tools that have helped increase their productivity.  Some were good, some were debatable.  I too have developed my own way of doing things that is, at least in my opinion, a smart and efficient way of being more productive.  In this article, I will present the tools I use on a day-to-day basis both for work and home.  I will not include social apps, software development tools (those are worth at least one article of their own), or games.

Let’s start with work.  I’m a telecommuting computer consultant working currently for one client.  I charge by the hour.  I am aware of tools that actually monitor what you are doing on your computer and associate that time activity to whatever project you define.  I am using a far simpler tool for managing my work hours: MPS TimeLog Pro.  You can find it on the Google Play Store for $3.99.  It’s a simple app that allows you to define clients, projects, and activities.  You press “Start” when you start working on some activity and later press “Stop” when you are done or want to change activities.  A log file shows everything you’ve done.  Filtering, importing, exporting are all available including the ability to manually update and change start/stop times.  There’s nothing particularly fancy about this tool.  It performs one function, keeping track of my hours according to my activities, and it does that well.

People use Dropbox for keeping all sorts of things synchronized between computers and devices.  Until recently I would travel frequently between my two homes in Maryland and Maine, USA.  This entailed bringing my computers with me. I also used to bring a couple of computer reference books.  I quickly realized that bringing books with me was a lose-lose proposition: I couldn’t bring all my computer books with me, and I never knew what I needed until I needed it.  So I started buying all my work-related books as either PDF or EPUB files.  I place all my books in a “books” folder within Dropbox.  Computer books are arranged by subject: Java-related books go in a folder called “java”, Linux-related books in “linux”, and so on. That way, all my books are always accessible, whether via laptop, tablet, or smartphone, no matter where I happen to be.

So how do I read all the various PDF or EPUB computer books that I purchased?  I own a Samsung tablet, which is the perfect size for reading books and magazines, and for that tablet I purchased an Android app called Bookari Premium Pro (downloadable from the Google Play Store).  There are many ereader apps out there but I chose this one for its convenient features, such as reading both PDF and EPUB files, organizing my collection of books and magazines, and keeping track of where I last left off in each book.  One of the really nice features of this app is that it interfaces directly with Dropbox.  I can point it at my ‘books’ folder and download any and all of my books to my tablet for offline or simply more convenient reading.

Unless you’re one of those people who cannot forget anything, maintaining a task list is an important way to remember everything you need to do.  The tasks can be work-related or home-related.  I use a wonderful application called Wunderlist for managing my tasks. For work, I use it to remind me when I need to invoice my clients or that I have a computer-related article I had planned on reading. For home, I use Wunderlist to remind me when particular bills are due or what errands I need to run.  You can associate completion dates, reminders, and notes with each task.  Plus you can store the tasks inside different folders (e.g., work, bills, etc.).  Of course Wunderlist has a Windows app as well as an Android app, and all tasks are kept in sync via their server.

There is, however, one little problem with Wunderlist.  The German company that created Wunderlist was recently purchased by Microsoft.  And Microsoft has already announced that it will be folding Wunderlist into its new task management software.  Based on many comments I have read, Microsoft’s product is not nearly as good as Wunderlist (nor is it even ready for prime time).  I, and maybe millions of other Wunderlist users, will continue to use the existing product as long as it is available.

Besides keeping track of tasks, keeping track of meetings and appointments is very important.  Google’s Calendar app is the perfect solution.  As it is browser based, it works everywhere (there is a Google Calendar app for Android as well as competitor apps that use the Calendar API to access the calendar information).  A nice additional feature is the ability to share calendars amongst different people.  My wife and I share our calendars so we can keep track of each other’s activities.

For all my calendar needs I use Google Calendar.  It’s not what my client uses, so whenever I have a new meeting or conference call scheduled, I quickly jump to Google Calendar and mark that same date/time as a “Busy” item.  This reminds me that I have a work activity even when I’m not next to my work computer. It also lets my wife know that I am not to be disturbed.

For brainstorming activities I use a program called Freeplane.  This program is categorized as a “mind mapping” application.  Per Wikipedia, “A mind map is a diagram used to visually organize information. A mind map is hierarchical and shows relationships among pieces of the whole. It is often created around a single concept, drawn as an image in the center of a blank page, to which associated representations of ideas such as images, words and parts of words are added. Major ideas are connected directly to the central concept, and other ideas branch out from those.” In addition to brainstorming, I use Freeplane for taking notes and creating outlines.  (Full disclosure: I recently began contributing to the development of Freeplane and am now involved in the development of a future collaborative version of Freeplane.)

One of my absolute favorite programs is Evernote. I use Evernote as a repository for everything.  You can store text notes, PDF and DOC files, pictures, and virtually any other kind of file inside Evernote.  I organize my notes into different “notebooks”: one for my personal stuff, and one for each of my clients. Work-oriented notes include all my billing statements, invoices, and technical documentation.  I also save interesting technical documents, either as PDF files I have downloaded or as web pages captured directly from the browser using the Evernote Google Chrome extension.

On the personal side, I save all my monthly bank and credit card statements.  I use Evernote’s powerful tagging mechanism to separate each credit card and bank statement. When I buy a new gadget that requires downloading a PDF manual, I save that PDF file in Evernote.  I bought a new smartphone recently and took a picture of the serial number and other identifying numbers on the box; I added that photo to Evernote with a title noting that it is the phone’s serial number.

For Windows and Android, Evernote has standalone apps.  The product also works very well via a browser. There is no official Linux client available but I am currently using a third-party program called NixNote that seems to work quite well.  There are clients available for Apple’s macOS, too.  I personally have Evernote installed on my laptop, smartphone, and tablet. One last really cool feature for premium subscribers: Evernote indexes the text of saved PDF and DOC files so you can search for words inside them.

The client I work for has distributed developers all over the USA as well as in some “off-shore” countries.  Being able to communicate easily, consistently and with quality is very important.  We typically communicate via Skype for Business for both phone calls as well as for desktop sharing. It is a product that works quite well.

For the open source project I am working on we had the same requirements.  Unfortunately we didn’t have as much luck with the personal version of Skype.  So we switched to and are now using Google Hangouts.  We can send instant messages to each other, talk and view each other as well as share our desktops.  There are many, many different programs that offer somewhat similar services, mostly desktop sharing. One program I have successfully used in the past is TeamViewer.

Using a computer to manage finances is fairly ubiquitous.  I used to use Quicken, possibly the most well-known application for managing personal finances.  But over time the program became more and more expensive, and I finally gave up on it when they insisted I upgrade “or else”.  I particularly wanted a program that ran on both Windows and Linux without requiring separate licenses for each. I have now been using Moneydance for a number of years.  It’s fairly straightforward and easy to use and serves my needs. There’s even an Android version that synchronizes with the main application if you store your data in a Dropbox folder.   Data can be encrypted, so it is safe even in Dropbox.   The only part of the program that I do not like is budgeting; if budgeting is very important to you, I would suggest looking elsewhere first. It does work, in theory, but I’ve had issues using it.  One alternative might be GnuCash, which is actually free.  Caveat: I have not used this program.

There is one last program that I use on a daily basis: Feedly. This program is a news aggregator.  Per Wikipedia, a news aggregator “is client software or a web application which aggregates syndicated web content such as online newspapers, blogs, podcasts, and video blogs (vlogs) in one location for easy viewing.”  I personally use Feedly to read daily news on subjects of interest to me from a number of different sources and blogs by people I follow, as well as to monitor the release announcements of specific software.

Perhaps you have a favorite program or application you want to share with the world.  Let me know.

Too Many Types of Video Cables

There was a time when life was simple.  You wanted to connect a monitor to a computer so you took the VGA cable and connected it between your computer and monitor, female to male and male to female.  You couldn’t go wrong.

VGA cable

I just bought a new laptop, and it doesn’t have a VGA connector, so, I just realized, my KVM switch won’t work with it.  A KVM switch is a set of cables plus a switch that allows you to connect a number of different computers to one keyboard, one video display (monitor), and one mouse.  That way you don’t kill all the real estate on your desk with multiple monitors and keyboards.  Consumer KVM switches usually come with connectors for 2 or 4 computers.  Until recently, I used my KVM switch to connect my work laptop, my personal laptop, and my personal desktop.

But now I have a new somewhat high-end HP Pavilion laptop that only has an HDMI connector.  My work laptop has adapters for VGA and DisplayPort.  Oh, and my desktop computer, which is about 7 years old, only has VGA connectors.  One of my two 24-inch monitors has connections for VGA and DVI while the other monitor has connections for VGA, DisplayPort, and HDMI.  And I’m having problems trying to figure out the differences between the various connector types and whether I can connect everything together via any type of KVM switch.

This blog entry is really meant for me so I can easily look up and remember which type of cable is which.  So first, here are what the cable connectors look like:

HDMI cable (note the symmetric look of the connector)

DisplayPort cable (note the asymmetric shape of the connector)

DVI cable (reminiscent of the VGA cable but larger and rectangular)

Now that I have these pictures I hopefully will not forget which cable is which.  What’s somewhat nice is that there is much compatibility between these digital cables.  For example, there are DVI-HDMI cable adapters.  There are also DisplayPort-HDMI cable adapters.  And of course there are DisplayPort-DVI cable adapters such as this one which I own:

DisplayPort-DVI cable adapter

I need to integrate the computers and monitors so that they all use one type of adapter so that a KVM switch can be used to share the devices.  Note that I will only be sharing one monitor between the computers. The second monitor will only be connected to my work computer.

The DVI interface is the oldest of the three digital interfaces, so I’m choosing not to buy a DVI KVM switch.  Where I live, DisplayPort KVM switches aren’t available, so I will buy an HDMI KVM switch.  To get everything connected I will need to buy a couple of adapter cables: one DVI-to-HDMI cable (to connect my older monitor to the KVM switch) and one DisplayPort-to-HDMI cable (to connect my work computer to the KVM switch).

Now that was easy, wasn’t it?

 

 

The Value of Decompiling Java Code

The inspection of a stack trace is a crucial part of a developer’s methodology for determining the source of an error or bug. In some environments, the stack trace can be quite large. I was once working in a WebSphere environment and was asked to try and determine why some of the application related stack traces we were seeing did not include any application code references. Without a reference to the application code, we had no idea where the exception originated.

In a WebSphere environment, users are supposed to throw a ServiceBusinessException (an IBM-created exception class). This tells the WebSphere framework that a backend application error occurred. With our application, however, we frequently saw that this exception didn’t show where the error originated. By reviewing lots of log files, I noted that sometimes the stack trace did point to the source code line where the exception was thrown and sometimes it didn’t. It was most puzzling and I couldn’t figure out what was going on. At the same time, I did notice that whenever a Java IllegalArgumentException was thrown, a complete stack trace was presented.

Using SoapUI, I fed the application some bad data on purpose. I knew exactly which ServiceBusinessException was being thrown within the application. The line of code looked something like this:

throw new ServiceBusinessException("Some error message."); 

I had a wild idea. Let’s replace that line, throw an IllegalArgumentException instead, and see whether we would get a full stack trace or a stack trace that was missing the crucial lines. If we got a good stack trace, this would imply something amiss within ServiceBusinessException. If we still got a bad stack trace, this would imply something in our code was messing up the stack or the exception handler. So the new line of code looked like this:

throw new IllegalArgumentException("Some error message."); 

I ran the test. The stack trace that appeared in the log file was complete! So I now had some evidence that showed the culprit to be the ServiceBusinessException itself and not something in the application. So within WID (the WebSphere modified version of Eclipse) I placed the cursor on the word ‘IllegalArgumentException’ and pressed the F3 key (“Open Declaration”). Since I had installed the JAD (Java Decompiler) plugin, WID was able to decompile the IllegalArgumentException.class into real Java source code. What I saw was the following code:

public IllegalArgumentException(String s) {
    super(s);
}

Then I did the same thing for ServiceBusinessException:

public ServiceBusinessException(Object data)
{
    if (data instanceof DataObject)
        this.data = new XMLExternalizableDataObjectHolder((DataObject)data);
    else
        this.data = data;
}

The first thing I noticed by perusing the decompiled source code was that this ServiceBusinessException constructor did not call super(). That did not look right to me. Was that a bug, since, after all, both exception classes extend RuntimeException? On further inspection, I noticed that ServiceBusinessException had a second constructor which did include the call to super(). It looked like this:


public ServiceBusinessException(Throwable cause)
{
    super(toString(cause), cause(cause));
    if (cause != null && cause.getClass() ==  
        com/ibm/websphere/sca/ServiceBusinessException)
    {
        // a bunch of logic goes here
    }
}

So I modified my code to force invocation of this second constructor of ServiceBusinessException and test the theory that the missing call to super() was the source of the bug. My code ended up looking like this:

IllegalArgumentException exc = new IllegalArgumentException("Some error message."); 
throw new ServiceBusinessException(exc);

Suddenly my stack trace looked correct. This was the confirmation I needed to demonstrate that ServiceBusinessException had a bug in it (a missing call to super()). I don’t know how long it would have taken me to find this bug had I not had the ability to decompile the two exception classes and compare them. I am convinced, however, that it would have taken much longer than it did.

Adding JAD and the JAD plugin to Eclipse or WID is very easy. The JAD plugin can be found at http://sourceforge.net/projects/jadclipse/ . JAD can be found at http://www.varaneckas.com/jad/.

Place the JAD decompiler (jad.exe) in a directory of your choice. In a Unix environment, you may need to make JAD executable. Place the JAD plugin (the jar file) in the Eclipse plugins directory. Restart Eclipse and go to Window → Preferences → Java → JadClipse. Change the ‘Path to decompiler’ setting to fully reference the jad.exe file wherever you have placed it.

The ‘class’ file association must be updated (in case it wasn’t automatically done through the plugin installation). Go to Window → Preferences → General → Editors → File Associations. For file type “.class”, add “JadClipse Class file Viewer” as the default editor.

Unit Testing Spring MVC REST Controllers

Unit tests are supposed to run against individual methods.  The method under test is treated as a black box with inputs and outputs.  You define a set of inputs, both legal and illegal, and expect certain outputs or possible exceptions.  Ideally, you exercise all the different paths within the method.

With MVC controllers, there is automatic “wiring” that Spring sets up within the controller class.  The application server (such as Tomcat), through the Spring interface, calls the appropriate controller method based on the path specified in the HTTP request.  Furthermore, if the data payload is a JSON string, the data is automatically converted to an appropriate Java object.  How do we implement a unit test for this?  Complete documentation for the Spring MVC Test Framework can be found here.

The example presented below uses the following library versions:

Library   Version
Spring    4.3.0
Mockito   2.0.78-beta
Java      1.8
Jackson   2.7.2

The following pom.xml file is used to define the project:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
 xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
 <modelVersion>4.0.0</modelVersion>
 <groupId>com.topshot.mvcdemo</groupId>
 <packaging>war</packaging>
 <version>1.0-SNAPSHOT</version>
 <artifactId>MVCDemo</artifactId>
 <name>MVC Demo App</name>
 <url>http://maven.apache.org</url>

 <properties>
   <javax.servlet.version>3.1.0</javax.servlet.version>
   <jackson.version>2.7.2</jackson.version>
   <jdk.version>1.8</jdk.version>
   <maven-compiler-plugin.version>2.3.2</maven-compiler-plugin.version>
   <maven-eclipse-plugin.version>2.9</maven-eclipse-plugin.version>
   <mockito.version>2.0.78-beta</mockito.version>
   <spring.version>4.3.0.RELEASE</spring.version>
 </properties>

 <dependencies>
   <!-- Spring dependencies -->
   <dependency>
     <groupId>org.springframework</groupId>
     <artifactId>spring-core</artifactId>
     <version>${spring.version}</version>
   </dependency>

   <dependency>
     <groupId>org.springframework</groupId>
     <artifactId>spring-web</artifactId>
     <version>${spring.version}</version>
   </dependency>

   <dependency>
     <groupId>org.springframework</groupId>
     <artifactId>spring-webmvc</artifactId>
     <version>${spring.version}</version>
   </dependency>

   <!-- Jackson JSON Mapper -->
   <dependency>
     <groupId>com.fasterxml.jackson.core</groupId>
     <artifactId>jackson-core</artifactId>
     <version>${jackson.version}</version>
   </dependency>
 
   <dependency>
     <groupId>com.fasterxml.jackson.core</groupId>
     <artifactId>jackson-databind</artifactId>
     <version>${jackson.version}</version>
   </dependency>
 
   <dependency>
     <groupId>com.fasterxml.jackson.core</groupId>
     <artifactId>jackson-annotations</artifactId>
     <version>${jackson.version}</version>
   </dependency>

   <dependency>
     <groupId>javax.servlet</groupId>
     <artifactId>javax.servlet-api</artifactId>
     <version>${javax.servlet.version}</version>
   </dependency>

   <dependency>
     <groupId>org.mockito</groupId>
     <artifactId>mockito-core</artifactId>
     <version>${mockito.version}</version>
   </dependency>

   <dependency>
     <groupId>junit</groupId>
     <artifactId>junit</artifactId>
     <version>4.12</version>
   </dependency>

   <dependency>
     <groupId>org.springframework</groupId>
     <artifactId>spring-test</artifactId>
     <version>${spring.version}</version>
   </dependency>

 </dependencies>

 <build>
   <finalName>MVCDemo</finalName>
   <plugins>
     <plugin>
       <groupId>org.apache.maven.plugins</groupId>
       <artifactId>maven-eclipse-plugin</artifactId>
       <version>${maven-eclipse-plugin.version}</version>
       <configuration>
         <downloadSources>true</downloadSources>
         <downloadJavadocs>false</downloadJavadocs>
         <wtpversion>2.0</wtpversion>
       </configuration>
     </plugin>
     <plugin>
       <groupId>org.apache.maven.plugins</groupId>
       <artifactId>maven-compiler-plugin</artifactId>
       <version>${maven-compiler-plugin.version}</version>
       <configuration>
         <source>${jdk.version}</source>
         <target>${jdk.version}</target>
       </configuration>
     </plugin>
   </plugins>
 </build>

</project>

Let’s start by looking at our controller class and see what it does.  The request that will initiate the call to getUsers() is the path “/users”, as specified in the @RequestMapping annotation in MainController.java.  The data to be returned is an array of User objects in JSON format. The conversion of the ‘users’ variable to JSON is handled by Spring as a result of the two annotations, @RequestMapping and @ResponseBody.

package com.topshot.mvcdemo.controller;

import java.util.Arrays;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Controller;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;
import org.springframework.web.bind.annotation.ResponseBody;

import com.topshot.mvcdemo.model.User;
import com.topshot.mvcdemo.service.UserService;

@Controller
public class MainController {

   @Autowired
   private UserService userService;
 
   @RequestMapping(value = "/users", method = RequestMethod.GET,
                   produces="application/json")
   public @ResponseBody User[] getUsers() {
       User[] users = userService.getUsers();
       System.out.println("DEBUG users: " + Arrays.toString(users));
       return users;
   }

   @RequestMapping(value = "/user/{name}", method = RequestMethod.GET,  
                   produces="application/json")
   public @ResponseBody String getUser(@PathVariable String name) {
       return "The name is " + name;
   }
}

There are two ways we might want to unit test this code.  Much depends on the real implementation of the UserService.getUsers() call inside the controller’s getUsers() method.  If, as is likely, userService.getUsers() calls a real database or some other service that needs to be running, then this is not really a unit test; rather, it is an integration test that requires external setup to be functional. In that case we would want to mock the userService.getUsers() call.  On the other hand, if the method returns some kind of static information (perhaps from a flat file), then we can call the real method from our unit test.  I will show how both types of unit tests can be implemented.

In our first example, we will exercise MainController.getUsers() leaving the call to userService.getUsers() alone (i.e., we will make a real call to this method).  Here is the code for the unit test class:

package com.topshot.mvcdemo.controller;

import static org.junit.Assert.*;

import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.http.MediaType;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import org.springframework.test.context.web.WebAppConfiguration;
import org.springframework.test.web.servlet.MockMvc;
import org.springframework.test.web.servlet.RequestBuilder;
import org.springframework.test.web.servlet.ResultActions;
import org.springframework.test.web.servlet.request.MockMvcRequestBuilders;
import org.springframework.test.web.servlet.result.MockMvcResultMatchers;
import org.springframework.test.web.servlet.setup.MockMvcBuilders;
import org.springframework.web.context.WebApplicationContext;

 // 1
 @RunWith(SpringJUnit4ClassRunner.class)
 @ContextConfiguration(locations = {"classpath:/mvc-dispatcher-servlet.xml"})
 @WebAppConfiguration
 public class MainControllerTest {

   // 2
   @Autowired 
   WebApplicationContext wac;
 
   private MockMvc mockMvc;
 
   @Before
   public void setUp() throws Exception {
     // 3
     mockMvc = MockMvcBuilders.webAppContextSetup(wac).build();
   }

   @Test
   public void testGetUsers() throws Exception {
     // 4
     RequestBuilder requestBuilder = 
       MockMvcRequestBuilders.get("/users").accept(new 
          MediaType("application", "json"));
     // 5
     ResultActions result = mockMvc.perform(requestBuilder);
     result.andExpect(MockMvcResultMatchers.status().isOk());
     String contentType = 
       result.andReturn().getResponse().getContentType();
     // 6
     assertTrue("contentType returned is incorrect.", 
     contentType.contains("application/json"));
     // 7
     String jsonData = 
       result.andReturn().getResponse().getContentAsString();
      assertNotNull("No data was returned.", jsonData);
      assertTrue("Insufficient number of characters returned as data.", 
       jsonData.length() > 2);
     assertTrue("This JSON string is supposed to start with '['", 
       jsonData.startsWith("["));
     assertTrue("This JSON string is supposed to end with ']'", 
       jsonData.endsWith("]"));
   }

   @Test
   public void testBadPath() throws Exception {
     // 8 
     RequestBuilder requestBuilder = 
       MockMvcRequestBuilders.get("/Users").accept(new 
       MediaType("application", "json"));
     // 9
     ResultActions result = mockMvc.perform(requestBuilder);
     result.andExpect(
       MockMvcResultMatchers.status().is4xxClientError());
   }

   @Test
   public void testGetUser() throws Exception {
     RequestBuilder requestBuilder =    
       MockMvcRequestBuilders.get("/user/Dracula")
         .accept(new MediaType("application", "json"));

     ResultActions result = mockMvc.perform(requestBuilder);
     result.andExpect(MockMvcResultMatchers.status().isOk());
     String contentType = 
        result.andReturn().getResponse().getContentType();

     assertTrue("contentType returned is incorrect.",  
        contentType.contains("application/json"));

     String jsonData =   
        result.andReturn().getResponse().getContentAsString();
      assertNotNull("No data was returned.", jsonData);
     assertTrue("Incorrect response", 
        jsonData.equals("The name is Dracula"));
 }

}

The following are comments associated with the above unit test Java code:

Note 1: These annotations define the wiring harness required to integrate JUnit with Spring MVC.  In particular, class SpringJUnit4ClassRunner provides the interface to the Spring TestContext framework.  The file mvc-dispatcher-servlet.xml (its contents are shown below) defines the MainController bean and the UserService bean that is injected into MainController via the @Autowired annotation.
Note 2: The WebApplicationContext is used in conjunction with the @WebAppConfiguration annotation to define the controller context. Using variable wac, one can retrieve the servlet context as well as other controller related information. Variable mockMvc is the main entry point for the server-side Spring MVC test support framework.
Note 3: This is standard setup code for initializing the mockMvc variable.
Note 4: The request to the getUsers service is built (but not executed) at this point. The HTTP path specified is “/users” and the data to be returned must be JSON data.
Note 5: It is here that the request is executed with the results being returned in the ResultActions variable, result. 
Note 6: Be careful about using String.equals instead of String.contains.  In my own example, “application/json;charset=UTF-8” was actually returned in variable contentType.
Note 7: Here we verify the content returned from the server. This can be done in many ways; one alternative, using jsonPath(), is sketched just after this list.
Note 8: Note that the RESTful path we are using has an upper-case ‘U’ for /users.  We don’t have such a service defined.
Note 9: An HTTP 404 error should be returned here.
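
As the alternative mentioned in Note 7, Spring’s MockMvcResultMatchers.jsonPath() matcher can check individual JSON fields instead of raw string prefixes. A minimal sketch, assuming the json-path (com.jayway.jsonpath) library is added to the test classpath; the ‘name’ field comes from the User class shown further below, and these lines would go at the end of testGetUsers():

// Field-level checks on the returned JSON array (sketch only).
result.andExpect(MockMvcResultMatchers.jsonPath("$[0].name").exists());
result.andExpect(MockMvcResultMatchers.jsonPath("$[0].name").isString());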

This is the contents of file mvc-dispatcher-servlet.xml:

<beans xmlns="http://www.springframework.org/schema/beans"
 xmlns:context="http://www.springframework.org/schema/context"
 xmlns:mvc="http://www.springframework.org/schema/mvc" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
 xsi:schemaLocation="
 http://www.springframework.org/schema/beans 
 http://www.springframework.org/schema/beans/spring-beans-4.3.xsd
 http://www.springframework.org/schema/context 
 http://www.springframework.org/schema/context/spring-context-4.3.xsd
 http://www.springframework.org/schema/mvc
 http://www.springframework.org/schema/mvc/spring-mvc-4.3.xsd">

 <context:component-scan base-package=
       "com.topshot.proximity.controller" />

 <mvc:annotation-driven />

 <bean id="userService"   
       class="com.topshot.mvcdemo.service.UserService">
 </bean>
 
</beans>

Let’s modify our unit test somewhat. Instead of having MainController.getUsers() call the real UserService, we will create a mockup of that service using Mockito.  Our new unit test code will look like this:

 
package com.topshot.mvcdemo.controller;

import static org.junit.Assert.*;
import static org.mockito.Mockito.*;

import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.InjectMocks;
import org.mockito.Mock;
import org.mockito.MockitoAnnotations;
import org.springframework.http.MediaType;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import org.springframework.test.context.web.WebAppConfiguration;
import org.springframework.test.web.servlet.MockMvc;
import org.springframework.test.web.servlet.RequestBuilder;
import org.springframework.test.web.servlet.ResultActions;
import org.springframework.test.web.servlet.request.MockMvcRequestBuilders;
import org.springframework.test.web.servlet.result.MockMvcResultMatchers;
import org.springframework.test.web.servlet.setup.MockMvcBuilders;

import com.topshot.mvcdemo.model.User;
import com.topshot.mvcdemo.service.UserService;

 @RunWith(SpringJUnit4ClassRunner.class)
 @ContextConfiguration(
     locations = {"classpath:/mvc-dispatcher-servlet.xml"})
 @WebAppConfiguration
 public class MainControllerTest {

   // 1
   @InjectMocks
   private MainController mainController;
 
   @Mock
   private UserService userService;

   private MockMvc mockMvc;
 
   // dummy data
   public User [] getUsers() {
     User[] users = {
       new User("Bob Dylan"), 
       new User("Leonard Cohen")
     };
     return users;
   }
 
   @Before
   public void setUp() throws Exception {
     // 2
     MockitoAnnotations.initMocks(this);
     mockMvc = MockMvcBuilders.standaloneSetup(mainController).build();
   }

   @Test
   public void testGetUsers() throws Exception {
     // 3
     when(userService.getUsers()).thenReturn(getUsers());

     RequestBuilder requestBuilder =  
       MockMvcRequestBuilders.get("/users").accept(new 
       MediaType("application", "json"));

     ResultActions result = mockMvc.perform(requestBuilder);
     result.andExpect(MockMvcResultMatchers.status().isOk());
     String contentType = 
       result.andReturn().getResponse().getContentType();
 
     assertTrue("contentType returned is incorrect.", 
     contentType.contains("application/json"));
 
     String jsonData = 
       result.andReturn().getResponse().getContentAsString();
      assertNotNull("No data was returned.", jsonData);
      assertTrue("Insufficient number of characters returned as data.", 
       jsonData.length() > 2);
     assertTrue("This JSON string is supposed to start with '['", 
       jsonData.startsWith("["));
     assertTrue("This JSON string is supposed to end with ']'", 
       jsonData.endsWith("]"));
     }
 }

The following comments point to changes in this copy of the Java unit test code when compared to the original unit test code:

Note 1: Annotation @InjectMocks tells Mockito to inject the declared mocks (here, userService) into the MainController instance via constructor, setter, or field injection. Using this annotation reduces the amount of required test code. See the Mockito documentation for further information. 
Note 2: Method MockitoAnnotations.initMocks() does the actual initialization of the Mock objects.
Note 3: This is a standard Mockito when() stub.  Whenever a call is made to UserService.getUsers(), the stubbed data is returned instead of executing the real code.  (A related verification sketch follows this list.)
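
As the sketch referenced in Note 3, Mockito’s verify() can also confirm that the controller actually delegated to the mocked service. The following test method could be added to the mock-based test class above; it is a sketch under the same setup (the static org.mockito.Mockito.* import already covers verify() and times()), not part of the original example:

@Test
public void testGetUsersCallsService() throws Exception {
    // Stub the service with the dummy data helper defined above.
    when(userService.getUsers()).thenReturn(getUsers());

    RequestBuilder requestBuilder =
        MockMvcRequestBuilders.get("/users").accept(new
        MediaType("application", "json"));

    ResultActions result = mockMvc.perform(requestBuilder);
    result.andExpect(MockMvcResultMatchers.status().isOk());

    // Confirm the controller called the mocked UserService exactly once.
    verify(userService, times(1)).getUsers();
}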

File mvc-dispatcher-servlet.xml can also be simplified.  No bean needs to be specified.  For the sake of completeness, I’m including the code for User and UserService:

User.java:

package com.topshot.mvcdemo.model;

public class User {

 String name;

 public User(String name) {
   this.name = name;
 }
 
 public String getName() {
   return name;
 }

 public void setName(String name) {
   this.name = name;
 }
}

UserService.java:

package com.topshot.mvcdemo.service;

import com.topshot.mvcdemo.model.User;

public class UserService {

   User[] people = {
     new User("Barak Obama"), 
     new User("George Bush"),
     new User("Bill Clinton"),
     new User("Abraham Lincoln")
   }; 
 
   public User[] getUsers() {
     // Get data from DB or wherever
     return people;
 }
}

 

Trials and Tribulations of Installing an Epson GT-1500 Scanner

My scanner died.  Well, it didn’t completely die, but weird line streaks appeared on all photos that I scanned.  Plus, that scanner was slow.  So I had my excuses ready when I decided to buy a new scanner.  There were four requirements for my new scanner:  (1) it had to be faster than my old one; (2) it had to be able to scan documents via an automatic document feeder (ADF) as well as scan photos; (3) it had to be relatively inexpensive (to be defined when I saw it); and (4) it had to integrate with Evernote.

There are plenty of scanners available at all sorts of prices.  The vast majority of them are document scanners, which of course include ADFs but cannot scan photos.  Then there are the photo scanners, which can of course scan documents, but only by feeding them one page at a time, a rather painful process I was hoping to avoid. Strangely enough, I found only one scanner that met my requirements, the Epson GT-1500.  And the price was right, too.

gt1500

This is not a new scanner.  I believe it first came to market around 2008 so I was a bit worried about whether it was or was not supported by Windows 10.  I read a number of reviews about the scanner.  More than one person commented that the scanner did work with Windows 10 and that they had had no problems with it.  Some people griped about the software that came with the scanner but the complaints sounded more like people issues rather than technical issues.  So I bought it.  Actually, I saved almost 50% of the price of a new scanner by buying a reconditioned model.  Thank you, Amazon.

The scanner arrived and I installed the requisite software (this usually needs to be done before connecting the scanner to the computer; default drivers don’t recognize the hardware correctly).  The software included Epson Scan (the main workhorse program that interfaces to the scanner itself) and ABBYY FineReader (a program that converts the scanned PDF document into a searchable document – it’s an OCR conversion program).  It also included ScanSoft’s PaperPort 11 SE.

PaperPort is a document management program that works with different scanners, managing and storing the scanned documents (somewhat similar to Evernote).  As I saw it, this would be the program that would interface to Evernote and would allow me to scan multi-page, two-sided documents (the GT-1500 does not do two-sided scanning; you have to turn the set of pages over and scan again) and merge them into the proper order (it supposedly has that feature).

I first tried scanning with Epson Scan; if this didn’t work, nothing would work.  Scanning went well and very quickly. So far, I was happy.  I did have an issue getting the various side buttons of the scanner to work (such as scan to email, scan to print), but since I never used these utility buttons on my previous scanner, this did not bother me.  I decided to go to the Epson web site to see if there were any more up-to-date versions of the software that came with the scanner.  Indeed, there was a newer version of Epson Scan, so I downloaded and installed it.

Next I wanted to see if I could scan a 3-page, two-sided document and automatically shuffle the pages into the correct order.  I started PaperPort and looked around for a scan button.  It took a little while, but I finally saw where, on the left edge of the screen, you specify which hardware scanner PaperPort should use for scanning.  It had actually found the scanner by itself.  The only problem was that I couldn’t scan anything; the scan button was grayed out. I went through all sorts of hoops looking for the missing magic ingredient that would enable the scan button.  I couldn’t find it.

Was the problem due to Windows 10 and an old version of PaperPort? I searched the internet and, surprisingly, in a Brother forum (Brother, as in the company that also makes scanners), I found a note about upgrading PaperPort support for Windows 10. You have to download the “PaperPort Scanner Connection Tool”; the details can be found here.  Once I downloaded and installed the tool, the scan button worked and I could use PaperPort. But then came the next problem: PaperPort wouldn’t recognize the automatic document feeder on the scanner. I ran all sorts of tests in the program but it simply never recognized the ADF.

Maybe there was a newer version of PaperPort that supported Windows 10 as well as my scanner’s ADF? The program has a pulldown menu where you can check for updates. When I pressed it, a browser window popped open but nothing happened.  I tried this several times but still nothing happened.  I then opened a “view source” window to try to identify what the problem might be.  In the middle of the page source was a message saying that JavaScript was not enabled and that I must enable it for this window to work.  The only problem was that JavaScript was enabled (which I even verified).  I tried two additional browsers but got exactly the same error.

I went to the PaperPort web site thinking I could download a more up-to-date version of the software.  Big surprise: the latest version is 14 and I’m on 11 SE.  I spent a fair amount of time browsing their community forum and discovered that my version 11 SE is an OEM version, made specifically for scanner manufacturers.  I decided to write an email to PaperPort support.  This is what I sent them:

Hi. I just received my new Epson GT-1500 scanner with PaperPort 11 disk. I’m running Windows 10 and it appears that this version is very old. How do I upgrade to a supported version? And if it costs, how much? My software requirements are very simple: (1) scan documents and be able to easily and quickly upload to Evernote; (2) scan photos and upload to a specified folder. I did not see that PaperPort 11 even supports Evernote. Am I wrong? It can also be that PaperPort is overkill for my needs. Your input would be appreciated.

I received the following reply from support the following day:

Thank you for contacting Nuance Customer Service.
Regarding your inquiry, PaperPort program does not support Evernote and your version of PaperPort is not compatible with Windows 10. You might be interested upgrading to PaperPort 14 Standard for only $49.99 plus shipping and applicable tax. You may call us at 1-800-654-1187 from Monday to Friday 9:00am to 8:00pm EST to place the order.
Please let us know if we can be of further assistance.

The response was puzzling to me as I had also browsed the Evernote forums and found that quite a few people were saying that they were using PaperPort with Evernote.

I needed to take a break, have a beer, and step back and really think through what I wanted and/or needed in order to successfully scan. Having read through a number of entries in the Evernote forum related to PaperPort, I discovered that Evernote had a really cool feature that I could take advantage of (I’m an Evernote Premium subscriber but I realized that I should really spend more time learning about some of the more esoteric features of the software).  Evernote has an “import folders” feature: you tell Evernote that a certain folder is an import folder.  Then, whenever you place a document in that folder, Evernote will automatically consume it. I tested it with Epson Scan by specifying the scanned output folder.  Almost immediately after completing the scan, my PDF document ended up in Evernote. Nice.

As I mentioned earlier, the real reason I wanted PaperPort was to reshuffle multi-page, two-sided documents, and for that PaperPort was really overkill. I decided to solve the problem myself by writing a PDF shuffle script.  But first I needed a utility that would do two things:  (1) take a PDF document and break it into multiple PDF documents, one per page; (2) merge multiple PDF documents into one PDF document.

I found a very nice (and free) program called PDF Split and Merge that did just what I wanted.  While there is a nice GUI front-end to the program, I was specifically interested in its command line interface.  I’m not much of a Windows CMD shell programmer; I usually open a Cygwin bash window when I need to script something.  So my script is written in bash. It should not be difficult to convert it to either CMD shell or Windows PowerShell.

#!/usr/bin/bash

# Home of the utility
PDFSPLIT_PATH="/cygdrive/c/Program Files/PDF Split And Merge Basic"

# My home
HOME_DIR="C:/Users/jberry"

# It is assumed that the input file is located in this directory
INPUT_FILE="$HOME_DIR/Evernote/$1"

# When the input PDF file is split into multiple files, one per page,
# this is the directory where those files will be stored
OUTPUT_DIR="$HOME_DIR/Evernote/splits"
# This is the directory that Evernote uses for importing files
EVERNOTE_DIR=$HOME_DIR/EvernoteImportFolder

cd "$PDFSPLIT_PATH"/bin
export CLASSPATH=$PDFSPLIT_PATH

# Split the input PDF file into multiple files, one per page
./run-console.sh -f $INPUT_FILE -o $OUTPUT_DIR -s BURST split

#Rearrange the output files into the correct merged list
#For a 6 page scanned document, the original page order is
#1, 3, 5, 6, 4, 2
FILES=`dir $OUTPUT_DIR/*`
declare -a arr=($FILES)
NUM_PAGES=${#arr[@]}
BEGIN_PAGE_NO=0
END_PAGE_NO=$((${#arr[@]} - 1))
NUM_LOOPS=$(($NUM_PAGES/2))
FILE_LIST=""
I=0
while [ $I -lt $NUM_LOOPS ]
do
  ((I += 1))
  file=${arr[$BEGIN_PAGE_NO]}
  FILE_LIST=$FILE_LIST" -f "$file" "
  file=${arr[$END_PAGE_NO]}
  FILE_LIST=$FILE_LIST" -f "$file" "
  ((BEGIN_PAGE_NO += 1))
  ((END_PAGE_NO -= 1))
done

#Concatenate the files in the new order
./run-console.sh $FILE_LIST -o $OUTPUT_DIR/output.pdf concat

# copy output.pdf to Evernote folder
mv $OUTPUT_DIR/output.pdf $EVERNOTE_DIR/$1

#clean up
rm -f $OUTPUT_DIR/*

Program PDF Split and Merge includes two script files, a .bat file (run-console.bat) and a .sh file (run-console.sh).  They provide the command line interface to the various functions of PDF Split and Merge.  The above script should be self-explanatory given the embedded comments.  If there are any requests for it, I will be happy to create a Windows PowerShell version of the script.  Just let me know.

When I finish scanning a multi-page, two-sided document,  I simply call the above script with the file name of the file created by Epson Scan.  When the script completes, the file has been added to Evernote.

With this script, I am now a satisfied owner of the Epson GT-1500.  I also uninstalled PaperPort as I no longer need it.

 

Running Native Ubuntu Linux Under Windows 10

This is very new and very hot.  For many years now we have been able to run all sorts of Linux distros (Ubuntu, Debian, Red Hat, etc.) on Windows boxes by using “virtual machines”, either from VMware or from Oracle’s VirtualBox.  These programs virtualize the hardware and install and run the various distros within their frameworks.

This new product from Microsoft, while still in beta mode, runs Ubuntu natively, not via emulation. It’s called “Bash on Ubuntu on Windows.”  This development effort, while being implemented by Microsoft, is getting support from the Ubuntu community.

Note again that this is a beta product.  If you are running a normal Windows 10 operating system on your computer then this software is not yet available to you.  You need to sign up and become a member of the Microsoft Insider Program.  Being in this program will get you newer Windows software, newer features and, of course, possible bugs.  I have now been on the “fast track” option of the Insider Program (which is also required to get this beta software) for well over a month and have yet to experience any problems.  Microsoft will occasionally bug you with a message asking for your input on some feature or another, but I consider that to be quite acceptable and I respond with as much detail as I can. To sign up for the Insider Program, go to this link.

For those of you interested in understanding a little about the architecture surrounding Bash on Ubuntu on Windows, you might find this 21 minute video interesting: architecture link.

To get started with Bash on Ubuntu on Windows, you must be running Windows 10 version 14316 or later.  I am currently on version 14332.  To find your version number, go to Settings –> System –> About.  The version number is the “OS Build”.

Here is the procedure for installing Bash on Ubuntu on Windows.  I am assuming you are already signed up for the Insider Program and that you have version 14316 or later of Windows 10.  Go to Settings –> Update & Security –> For Developers.  You want to select “Developer mode” as shown in this picture:

ForDeveloper

Once done, close the dialog box.

Next, go to Windows –> search –> and enter “windows features on or off”.  In the dialog box that will appear, scroll down until you see “Windows Subsystem for Linux (Beta)” and select it, as shown in this picture:

WindowsFeatures2

After you click on the OK button above, Windows will initiate the download of the Ubuntu file, bash.exe.  Once done, the computer will need to be restarted.  After the computer has restarted, launch a command prompt (“cmd”) and enter the command “bash”, as shown in this picture:

bash1

Enter “y” to initiate the download of the Ubuntu operating system, as shown here:

bash2

Depending on the speed of your Internet connection, this might take some time to complete.  Once done, enter “exit” to close the cmd window, as shown here:

bash3

At this point, everything is done; Ubuntu for Windows has been installed.  Now simply open the Windows search, look for “Bash on Ubuntu on Windows”, and select it.  A real bash shell will now open, as shown here:

bash4

You are now running Ubuntu on Windows.  There are a few important points that you need to make note of:

  • Not all Ubuntu software will work; for example, ping does not yet work.
  • While it appears that you don’t have Internet connectivity (because ping does not work), this is not true; you can use apt-get to download and install additional packages just as in a normal Ubuntu environment.
    • If you do have connectivity issues, see this link.
  • sudo works fine; use it.
  • If you want to access your Windows home directory from within Ubuntu, do an “ls /mnt/c/Users/yourUserName” to see the list of files.
  • Read the FAQ to get a better understanding about this product, current status, etc.

Bash on Ubuntu on Windows (they need to come up with a shorter name) has only been out for a couple of months.  There is a lot of activity taking place with this software (see here), especially flushing out bugs.  But there’s a lot of potential, too, for developers who need to work in a heterogeneous environment.

I’ll have more articles on this subject as I continue to play with it.  Stay tuned.

 

Creating a JSON-based Web Server Eclipse Project

With the current work I am doing for our client as a consultant, I have not had any recent opportunity to work with, let alone configure, the Spring framework.  Realizing that I was getting a bit rusty and that Spring 4 had come out since I last worked with Spring, I thought it best to spend a little extra time reviewing Spring by putting together a simple server environment using Spring and JSON (a popular protocol for communicating with client software).  I thought this should take a couple of hours at most.  I figured the latest Eclipse would have everything built in and it would just be a matter of creating a few application classes.

Eclipse can indeed create a client/server project, but I saw no sample code for creating JSON controllers. I knew it was just a matter of defining a few annotations, but I had forgotten which ones I needed and their particular syntax.  Solution: go to the Internet and search for “json eclipse server example”.  I found a number of tutorial sites that included downloadable projects.  I tried three of them.  None of them worked out of the box.  I found this particularly surprising, as there were comments and thanks from users who had found the tutorials very useful and presumably had used the downloadable code.  I assumed the issue was versioning, so I set out to grab the latest and greatest of everything (at least as of the date that I am writing this article).  Building with these versions wasn’t as obvious as it should have been either; one of the library names had changed. But with a little perseverance, I finally hit on the correct versions of the libraries I needed to produce a functional JSON-based web server project.

The details are described below.  Note that this article is not meant to be a tutorial on JSON; it is simply meant to give you the information needed to get a JSON-based web server project up and running quickly.

Install Java 8

I really should just assume that you have the JDK version of Java 8 installed; how do you expect to develop any software without it?  Nevertheless, if you have Java 6 or Java 7 installed, realize that they are no longer supported and you really need to upgrade to Java 8.  Here’s a link to the latest Java download page:  http://www.oracle.com/technetwork/java/javase/downloads/jdk8-downloads-2133151.html.  You can have more than one version of Java installed at the same time; just set your PATH and JAVA_HOME variables to point at the appropriate version.
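
If you are ever unsure which installation your command prompt actually picks up, a quick throwaway class like the one below (my own sketch, not part of the project described later) prints what the running JVM resolves to:

public class CheckJava {
    public static void main(String[] args) {
        // Version and install location of the JVM that actually ran this class
        System.out.println("java.version = " + System.getProperty("java.version"));
        System.out.println("java.home    = " + System.getProperty("java.home"));
        // The environment variable that Tomcat and Maven scripts consult
        System.out.println("JAVA_HOME    = " + System.getenv("JAVA_HOME"));
    }
}

If java.version does not start with 1.8, fix your PATH before going any further.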

Install Tomcat 8

Tomcat is one of the easiest application servers to install, configure and run.  There are other similarly good app servers, but this is the one I chose to use.  The latest version of Tomcat is version 9, but it’s really very new (and I don’t know what features it has above and beyond version 8), so I chose to install version 8.  Download Tomcat from here: https://tomcat.apache.org/download-80.cgi.  I usually download the zip file onto my Windows box and install it in a local directory called util (this is where I drop all my Apache software after downloading the respective zip files).  So if my user name is ‘jberry’, I will end up with a directory called jberry/util/apache-tomcat-8.0.32.

Install Eclipse

Install the latest version of eclipse.  Go to the eclipse download page:  https://eclipse.org/downloads/.  I suggest installing “Eclipse IDE for Java EE Developers”.  This gives you all the tools you might need for developing your web-based application.  The download file will be a zip file.  I unzipped everything into folder jberry/eclipse.  To execute eclipse, go to directory jberry/eclipse/jee-mars/eclipse.  There you will find the executable file, eclipse.exe.

Workspace setup

Create a directory called jberry/workspace (or wherever you want).  Download the zip file that includes my source code for the JSON example.  The link is here: https://dl.dropboxusercontent.com/u/14719415/RecordCollection.zip.

Eclipse startup

Start eclipse.  It will ask you for the location of your workspace.  Enter the workspace path from the previous step.  After a brief moment, you will see an eclipse window entitled “Eclipse Java EE IDE for Web Developers”.  In the upper right corner, press the button “Workbench”.

From the top pulldown menu, click on Window –> Preferences.  On the left side of the window, find Java, expand the arrow and click on Installed JREs, as shown here:

InstallJava

Click on the “Add” button and find the path where your Java 8 JDK was installed.

jdk

Press the “Apply” button.

Setup tomcat within eclipse

In the same preferences window, at the left side of the window, find “Server”, open it up and click on Runtime Environments:

runtime

Press the “Add” button and you will see a list of servers.

apache

Select Apache Tomcat v8.0 and click on “Next>”.  Do not check the box that says “Create a new local server”.  Click on the “Browse” button, find where you installed Tomcat and enter that path.  Under JRE, click on “Workbench default JRE” and specify your JDK install.  Your window should look something like this:

tomcat

Press “OK” to close your window.

At the bottom of the eclipse window, click on the “Servers” tab, right-click and select “New Server”.  A window similar to the following should pop up.  Press “Finish” to create the server.

newserver

Import the JSON project

Click on “File” from the top pulldown menu and select “Import…”.  Open category General and select “Existing Projects into Workspace” and press “Next>”.  Press the “Browse…” button and select the workspace directory that was defined above.  You should see an Import Projects dialog box like this one:

importprojects

Click on the “Finish” button.

Select the “Markers” tab at the bottom of eclipse.  There should not be any Java errors. If there are errors, right-click on “RecordCollection” which should appear in the Project Explorer view (far left side of eclipse) and select “Maven” –> “Update Project…”.

Description of the project

The data objects Artist, Cd, and RegistrationInfo are straightforward and are defined in package com.topshot.recordcolleciton.model.  The two classes ArtistInfo and CdInfo are meant to simulate a simple database get or fetch operation but just return hard-coded values.

The interesting stuff takes place in the controller package.  Note how @RequestMapping, at both the class level and the method level, is used to define which method is to be executed (see the section called Testing below).  And most of all, note that there is no JSON conversion code in the application; conversion of the various output classes is handled invisibly by Spring and its support libraries.  Simply adding produces="application/json" to the mapping tells Spring to perform this magic.
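
To make the pattern concrete, here is a stripped-down sketch of what such a controller looks like.  The class name, nested data object and sample data are illustrative only, not copied from the RecordCollection source (the project may well use @Controller plus @ResponseBody rather than the equivalent Spring 4 shorthand @RestController); the real classes are in the project you downloaded.

import java.util.Arrays;
import java.util.List;

import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;
import org.springframework.web.bind.annotation.RestController;

// The class-level mapping supplies the common /Records prefix;
// each method-level mapping adds the rest of the path.
@RestController
@RequestMapping("/Records")
public class RecordsControllerSketch {

    // A plain data object; Spring's Jackson support turns its getters into JSON fields.
    public static class Artist {
        private final String name;
        public Artist(String name) { this.name = name; }
        public String getName() { return name; }
    }

    // GET /Records/artists returns a List, which Spring serializes to a JSON array
    @RequestMapping(value = "/artists", method = RequestMethod.GET,
            produces = "application/json")
    public List<Artist> getArtists() {
        return Arrays.asList(new Artist("Bob Dylan"), new Artist("John Prine"));
    }
}

With something like this in place, an HTTP GET on /Records/artists (relative to the web application’s context path) comes back as a JSON array, exactly like the output shown in the Testing section below.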

Run the project

Select the “Servers” tab at the bottom of eclipse.  Then right-click on the tomcat instance and select “Start”.  After a moment or so, tomcat will show that it is started and synchronized.  This can be verified by opening the console tab.  You should see something like this:

INFO: Server startup in 10395 ms

Go back to the server tab and right-click on the server.  Select “Add and Remove…”.  Select the RecordCollection resource and press the “Add>” button.  Your dialog window should look something like this:

addremove

Personally, I always uncheck the box that says “If server is started, publish changes immediately”. But that’s up to you.  The pluses and minuses of this choice are best saved for a more detailed discussion on software development and debugging.  Press “Finish”.

Testing

Open a browser and enter URL http://localhost:8080/RecordCollection/Records/artists.  You should get back the following line:

[{"name":"Bob Dylan"},{"name":"John Prine"},{"name":"Joan Baez"},{"name":"Leonard Cohen"}]

Now enter http://localhost:8080/RecordCollection/Records/cd/CD2 and  you should get back the following line:

{"title":"CD2","artist":[{"name":"John Prine"},{"name":"Joan Baez"}],"song":null}


In both of the above cases, you were executing HTTP GET commands.  In order to execute an HTTP PUT command, a little more setup work is needed.  There are many tools available that will generate a PUT command; one such tool is SoapUI.  Configuring SoapUI is beyond the scope of this article; see the file README.txt, which is part of the RecordCollection project, for a little more information on testing PUTs.  Alternatively, a few lines of plain Java, sketched below, can issue a PUT without any extra tooling.
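
This is only a rough sketch using the JDK’s built-in HttpURLConnection.  The path and JSON payload are placeholders of my own, not the project’s actual PUT endpoint, so substitute whatever README.txt says the controller expects:

import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class PutSketch {
    public static void main(String[] args) throws Exception {
        // Placeholder path and body -- replace with the endpoint and JSON
        // documented in the project's README.txt.
        URL url = new URL("http://localhost:8080/RecordCollection/Records/yourPutPath");
        String json = "{\"example\":\"value\"}";

        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("PUT");       // issue a PUT instead of the default GET
        conn.setDoOutput(true);             // we are sending a request body
        conn.setRequestProperty("Content-Type", "application/json");

        try (OutputStream out = conn.getOutputStream()) {
            out.write(json.getBytes(StandardCharsets.UTF_8));
        }

        // Any 2xx response code means the controller accepted the request
        System.out.println("HTTP response code: " + conn.getResponseCode());
        conn.disconnect();
    }
}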