About Me

My photo
I'm project manager of a software development team at www.researchspace.com. I've been developing bioinformatics software for the last 9 years or so, and I'm hoping to share some of the experience and knowledge I've gained with Eclipse RCP, Java web app development and software development in general on this blog.

Tuesday 4 November 2014

Running multithreaded JUnit tests with Apache Shiro

In this post I'm going to explain how to run JUnit tests that simulate multiple users logged in simultaneously in a Java web application, using the Apache Shiro security library. Although the motivation for this is based on a specific use case of a current project, the test framework should be useful for anyone wanting to perform multithreaded unit tests with users authenticated in concurrent sessions - perhaps to test resource contention, permissions, or locking.

Background

In a web project I'm working on just now, users can share and edit documents. Because of the potential for people overwriting other people's edits, or even deleting a document whilst someone else is working on it, we use a locking mechanism to ensure that only one person can edit the document at a time. The lock is acquired when someone starts editing, and is released when any of these conditions are true:
  • The user logs out.
  • The user session expires.
  • The user saves and closes the document.
This is a tricky scenario to develop automated tests for; the calculation of whether someone can edit or not depends on a whole range of factors, such as authorization permissions, the state of the document itself (documents can be signed, for example, which prevents further edits), and whether someone else is editing it or not.

Our project uses the Apache Shiro security library, a very versatile library that can be used in web and non-web projects, and has good support for testing. Up till now, though, all our tests ran in a single thread, with the result that only one user could be logged in at a time.
For our integration tests we needed to have:
  • Several users logged on simultaneously, simulating concurrent web sessions.
  • One user active at a time, whilst the other users wait.
  • Active session management, so that session expiration and logout listeners would be triggered after session lifecycle events.

Solution

In this solution, we build on a mechanism discussed in a StackOverflow post about sequencing threads. All the code discussed here is available on GitHub as a Maven project at https://github.com/otter606/shiro-multithread-junit. Just import it into your IDE and run:

 mvn test

Setting up Shiro

Shiro provides a TestUtils class in its documentation. Our unit test class can extend this, or include it as a helper class. In our example, for ease of explanation, we'll extend from it.
First of all, we'll just initialise a regular SecurityManager, using a configuration file, shiro.ini, at the root of the classpath - this is a standard approach to initialising Shiro.

 @BeforeClass
 public static void readShiroConfiguration() {
     // build a SecurityManager from shiro.ini and bind it for the whole JVM
     Factory<SecurityManager> factory = new IniSecurityManagerFactory("classpath:shiro.ini");
     SecurityManager securityManager = factory.getInstance();
     SecurityUtils.setSecurityManager(securityManager);
     log.setLevel(Level.INFO);
 }

The shiro.ini file is very simple; it just defines three users and their passwords:
 [users]  
 user1 = password1  
 user2 = password2  
 user3 = password3  

The code to log in a user and bind the subject to a particular thread is boilerplate that we can put in a utility method.

 private Subject doLogin(User user) {  
      Subject subjectUnderTest = new Subject.Builder(SecurityUtils.getSecurityManager())  
                     .buildSubject();  
      subjectUnderTest.login(new UsernamePasswordToken(user.getUsername(), user.getPassword()));  
      setSubject(subjectUnderTest);  
      return subjectUnderTest;  
 }  

So, at the moment we just have some code to log in a User - useful, but nothing new. Calling this method multiple times in the same thread is likely to lead to strange results, so now we need a mechanism to run multiple threads in JUnit, where each thread performs actions for a particular user in a sequenced manner, with the other threads pausing while one thread is active. That lets us test a use case like:
  1. User1 logs in and accesses a resource R for editing.
  2. User2 logs in and also requests access to R, but is rejected, as User1 holds a lock.
  3. User1 closes resource R (but remains logged in).
  4. User2 now successfully accesses R.
  5. User1 logs out.
  6. User2 logs out.
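The locking behaviour in the steps above could be modelled by a minimal lock manager. This is a hypothetical sketch (the class and method names are invented for illustration; they are not the project's actual code):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical sketch: grants an edit lock on a resource to one user at a time.
public class EditLockManager {
    private final Map<String, String> locks = new ConcurrentHashMap<>();

    // Returns true if 'username' acquired (or already holds) the lock on 'resourceId'.
    public boolean tryAcquire(String resourceId, String username) {
        String holder = locks.putIfAbsent(resourceId, username);
        return holder == null || holder.equals(username);
    }

    // Called on save-and-close, logout, or session expiry.
    public void release(String resourceId, String username) {
        locks.remove(resourceId, username); // only removes if 'username' holds the lock
    }
}
```

putIfAbsent gives an atomic check-and-set, and the two-argument remove ensures only the current holder can release the lock.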

Invokables and CountDownLatches

To run the test, we'll use a simple interface, Invokable, in which to define callback functions that perform a single step in the use case.
 public interface Invokable {  
   void invoke() throws Exception;  
 }  

And let's define an array of these - 6 long - to hold each action:
 Invokable[] invokables = new Invokable[6];
 invokables[0] = new Invokable() {
     public void invoke() throws Exception {
         log.info("logging in user1");
         doLogin(u1);
         log.info("user1 doing some actions..");
         log.info("user1 pausing but still logged in.");
     }
 };
 invokables[1] = new Invokable() {
     public void invoke() throws Exception {
         log.info("logging in user2");
         doLogin(u2);
         log.info("user2 doing some actions..");
         log.info("user2 pausing but still logged in.");
     }
 };
 // .. + 4 more Invokables defined for subsequent steps.

Now we'll set up a mechanism to sequence the execution of these Invokables in different threads, using CountDownLatches. Here's how we'll call it:
 Map<String, Integer[]> config = new TreeMap<>();
 // these are the array indices of the Invokable[]
 config.put("t1", new Integer[] { 0, 3 });
 config.put("t2", new Integer[] { 1, 5 });
 config.put("t3", new Integer[] { 2, 4 });  
 SequencedRunnableRunner runner = new SequencedRunnableRunner(config, invokables);  
 runner.runSequence();  

In the code above, we define that we want to run 3 threads, and specify the indices of the Invokable[] that will run in each thread: Invokable[0] runs in thread t1, then Invokable[1] in thread t2, and so on.

What happens under the hood in runSequence is as follows. We define an array of CountDownLatch objects. Each Invokable will wait on a CountDownLatch, and each latch will be counted down by the completion of its predecessor:

 public void runSequence() throws InterruptedException {
     CountDownLatch[] conditions = new CountDownLatch[actions.length];
     for (int i = 0; i < actions.length; i++) {
         // each latch will be counted down by the action of its predecessor
         conditions[i] = new CountDownLatch(1);
     }
     Thread[] threads = new Thread[nameToSequence.size()];
     int i = 0;
     for (String name : nameToSequence.keySet()) {
         threads[i] = new Thread(new SequencedRunnable(name, conditions, actions,
                 nameToSequence.get(name)));
         i++;
     }
     for (Thread t : threads) {
         t.start();
     }
     // tell the thread waiting for the first latch to wake up
     conditions[0].countDown();
     // wait for all threads to finish before leaving the test
     for (Thread t : threads) {
         t.join();
     }
 }

In SequencedRunnable, we run an Invokable and count down the next latch in the sequence:
 public void run() {
     try {
         for (int i = 0; i < sequence.length; i++) {
             int toWaitForIndx = sequence[i];
             try {
                 log.debug(name + ": waiting for event " + toWaitForIndx);
                 toWaitFor[toWaitForIndx].await();
             } catch (InterruptedException e) {
                 e.printStackTrace();
             }
             log.debug(name + ": invoking action " + toWaitForIndx);
             actions[toWaitForIndx].invoke();
             if (toWaitForIndx < toWaitFor.length - 1) {
                 log.debug(name + ": counting down for next latch " + (toWaitForIndx + 1));
                 toWaitFor[toWaitForIndx + 1].countDown();
             } else {
                 log.debug(name + " executed last invokable!");
             }
         }
     } catch (Exception e) {
         e.printStackTrace();
     }
 }
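The latch-chaining idea is independent of Shiro, so it can be illustrated with a small, self-contained sketch (a hypothetical class, using Java 8 lambdas for brevity): two threads are forced to execute their steps strictly in the order 0, 1, 2.

```java
import java.util.concurrent.CountDownLatch;

public class LatchSequenceDemo {

    // Demonstrates latch chaining: step i cannot start until step i-1
    // has counted down its latch, regardless of thread scheduling.
    public static String runDemo() {
        final StringBuilder log = new StringBuilder();
        final CountDownLatch[] latches = new CountDownLatch[3];
        for (int i = 0; i < latches.length; i++) {
            latches[i] = new CountDownLatch(1);
        }
        // thread A performs steps 0 and 2; thread B performs step 1
        Thread a = new Thread(() -> runSteps("A", new int[] { 0, 2 }, latches, log));
        Thread b = new Thread(() -> runSteps("B", new int[] { 1 }, latches, log));
        a.start();
        b.start();
        latches[0].countDown(); // release the first step
        try {
            a.join();
            b.join();
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return log.toString();
    }

    private static void runSteps(String name, int[] steps,
            CountDownLatch[] latches, StringBuilder log) {
        for (int step : steps) {
            try {
                latches[step].await(); // wait until this step is released
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
                return;
            }
            synchronized (log) {
                log.append(name).append(step);
            }
            if (step < latches.length - 1) {
                latches[step + 1].countDown(); // release the successor step
            }
        }
    }
}
```

Here thread B's single step is sandwiched between thread A's two steps - exactly the pattern that SequencedRunnableRunner generalises.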

That's it! Using this setup, we can log in multiple users simultaneously in concurrent sessions and perform actions for any user in a guaranteed order, letting us thoroughly test functionality that is affected by resource contention or locking.

Conclusion

In this post, I've described how you can combine Shiro and JUnit to develop realistic integration tests for functionality that is affected by concurrency. Thanks for reading!

Wednesday 22 October 2014

Apache Shiro and Spring Boot

Spring Boot is a great way to get a Spring web application up and running, with many default settings to make the configuration of standard functionality such as logging, view resolution, and database configuration as painless as possible.

It's also possible to add in Spring Security. However, since I've been using Apache Shiro for some time in other projects, and didn't particularly want to learn a new security library, I wanted to see if I could get it set up with a Spring Boot application.

Basic setup

My environment is Java 7, Spring 4.0.5 and Shiro 1.2, deploying to a Servlet 3 container.

Differences between the configuration described in the current Shiro 1.2 documentation and Spring Boot

 Boot encourages pure Java configuration, with no Spring XML files or even a web.xml file. So we need to declare all the beans needed for Shiro in a class annotated with Spring's @Configuration annotation.
Here is my class to set up Shiro:
 @Configuration  
 public class SecurityConfig {  
      @Bean()  
      public ShiroFilterFactoryBean shiroFilter (){  
           ShiroFilterFactoryBean factory = new ShiroFilterFactoryBean ();  
           factory.setSecurityManager(securityManager());  
           factory.setLoginUrl("/ui/login");  
           factory.setSuccessUrl("/ui/listView");  
           factory.setUnauthorizedUrl("/ui/login");   
           factory.setFilterChainDefinitions(  
                 "/assets/scripts/**=anon\n"+  
                 "/license/**=anon\n"+  
                 "/manage/health/=anon\n"+  
              "/assets/static/*=authc\n"+  
                 "/manage/metrics/**=authc\n"+  
                 "/manage/beans/**=authc\n"+  
                 "/manage/trace/**=authc\n"+  
                 "/manage/mappings/**=authc\n"+  
                 "/manage/dump/**=authc\n"+  
                 "/manage/autoconfig/**=authc\n"+  
                 "/manage/env/**=authc\n"+  
                 "/manage/info/**=authc");  
           return factory;  
      }  
      @Bean  
      public SecurityManager securityManager() {  
           DefaultWebSecurityManager rc = new DefaultWebSecurityManager();  
           rc.setRealm(realm());  
           return rc;  
      }  
     @Bean public AuthorizingRealm realm() {  
           AuthorizingRealm realm = new AuthorizingRealm() {  
                @Override  
                protected AuthenticationInfo doGetAuthenticationInfo(AuthenticationToken token)  
                          throws AuthenticationException {  
                     return new SimpleAuthenticationInfo("user", "password", "login");  
                }  
                @Override  
                protected AuthorizationInfo doGetAuthorizationInfo(PrincipalCollection principals) {  
                      // no authorization info needed in this trivial example  
                     return null;  
                }  
           };  
           realm.setName("login");  
           return realm;  
      }  
 }  

As you can see, you just need to set up three beans as a minimum: a ShiroFilterFactoryBean, a SecurityManager and a Realm. In this example I've created a trivial Realm implementation; in practice you'll probably want to connect to a backend database to verify credentials. If you're configuring a Realm that needs initialization, or want to add any Spring bean that implements the Initializable interface, you'll need to add one more definition, e.g.:

 @Bean
 public LifecycleBeanPostProcessor lifecycleBeanPostProcessor() {
     return new LifecycleBeanPostProcessor();
 }

 @Bean
 @DependsOn("lifecycleBeanPostProcessor")
 public TextConfigurationRealm realm() {
     IniRealm realm = new IniRealm();
     realm.setResourcePath("classpath:users.ini");
     return realm;
 }
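For the IniRealm above, a minimal users.ini on the classpath might look like the following (the usernames, passwords and role are placeholders, not the project's actual accounts):

```ini
[users]
# username = password[, role1, role2, ...]
alice = secret1, admin
bob = secret2
```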


Instead of XML Configuration, we can just use setters in the classes to set property values.

Spring Boot can expose a set of URLs for monitoring and health checking. By default these are '/health', '/info', etc., but by setting a property in application.properties:
 management.context-path=/manage  

we can give these URLs a common prefix, and configure them to be authenticated. This is important, as they give away a lot of sensitive information.

Customizing Shiro

For my application, I wanted to provide a custom filter that extended FormAuthenticationFilter.
So I added a new @Bean definition to create my filter. By giving it the name 'authc', it replaces the standard FormAuthenticationFilter with my subclass.
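For concreteness, here is a hedged sketch of what such a subclass might look like. The overridden hook is part of the real Shiro 1.2 API, but the method body is invented; the actual ShiroFormFilterExt in this project may do something quite different:

```java
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;

import org.apache.shiro.authc.AuthenticationToken;
import org.apache.shiro.subject.Subject;
import org.apache.shiro.web.filter.authc.FormAuthenticationFilter;

// Illustrative sketch of a FormAuthenticationFilter extension.
public class ShiroFormFilterExt extends FormAuthenticationFilter {
    @Override
    protected boolean onLoginSuccess(AuthenticationToken token, Subject subject,
            ServletRequest request, ServletResponse response) throws Exception {
        // example customisation: log successful logins
        System.out.println("User logged in: " + subject.getPrincipal());
        return super.onLoginSuccess(token, subject, request, response);
    }
}
```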

In order to get this to work, it was crucial to define the filter after the ShiroFilterFactoryBean in the code. If it was defined first, it seemed to prevent the FactoryBean from correctly producing SpringShiroFilter instances.
But, defined after the factory bean, everything works properly. The reason I'm stressing this point is that in the old standard XML configuration, the order didn't seem to matter.
 package com.researchspace.licenseserver;  
 import java.util.HashMap;  
 import java.util.Map;  
 import javax.servlet.Filter;  
 import org.apache.shiro.authc.AuthenticationException;  
 import org.apache.shiro.authc.AuthenticationInfo;  
 import org.apache.shiro.authc.AuthenticationToken;  
 import org.apache.shiro.authc.SimpleAuthenticationInfo;  
 import org.apache.shiro.authz.AuthorizationInfo;  
 import org.apache.shiro.mgt.SecurityManager;  
 import org.apache.shiro.realm.AuthorizingRealm;  
 import org.apache.shiro.spring.security.interceptor.AuthorizationAttributeSourceAdvisor;  
 import org.apache.shiro.spring.web.ShiroFilterFactoryBean;  
 import org.apache.shiro.subject.PrincipalCollection;  
 import org.apache.shiro.web.mgt.DefaultWebSecurityManager;  
 import org.springframework.context.annotation.Bean;  
 import org.springframework.context.annotation.Configuration;  
 import com.researchspace.licenseserver.controller.ShiroFormFilterExt;  
 @Configuration  
 public class SecurityConfig {  
      @Bean()  
      public ShiroFilterFactoryBean shiroFilter (){  
           ShiroFilterFactoryBean factory = new ShiroFilterFactoryBean ();  
           factory.setSecurityManager(securityManager());  
           factory.setLoginUrl("/ui/login");  
           factory.setSuccessUrl("/ui/listView");  
           factory.setUnauthorizedUrl("/ui/login");  
           // this is ordered, better to do like this.  
           factory.setFilterChainDefinitions(  
                 "/assets/scripts/**=anon\n"+  
                 "/license/**=anon\n"+  
                 "/manage/health/=anon\n"+  
              "/assets/static/*=authc\n"+  
                 "/manage/metrics/**=authc\n"+  
                 "/manage/beans/**=authc\n"+  
                 "/manage/trace/**=authc\n"+  
                 "/manage/mappings/**=authc\n"+  
                 "/manage/dump/**=authc\n"+  
                 "/manage/autoconfig/**=authc\n"+  
                 "/manage/env/**=authc\n"+  
                 "/manage/info/**=authc");  
           Map<String,Filter> filters= new HashMap<>();  
           filters.put("authc", authc());  
           factory.setFilters(filters);  
           return factory;  
      }  
      @Bean  
      public SecurityManager securityManager() {  
           DefaultWebSecurityManager rc = new DefaultWebSecurityManager();  
           rc.setRealm(realm());  
           return rc;  
      }  
      @Bean public AuthorizingRealm realm() {  
           AuthorizingRealm realm = new AuthorizingRealm() {  
                @Override  
                protected AuthenticationInfo doGetAuthenticationInfo(AuthenticationToken token)  
                          throws AuthenticationException {  
                     return new SimpleAuthenticationInfo("user", "password", "login");  
                }  
                @Override  
                protected AuthorizationInfo doGetAuthorizationInfo(PrincipalCollection principals) {  
                      // no authorization info needed in this trivial example  
                     return null;  
                }  
           };  
           realm.setName("login");  
           return realm;  
      }  
      @Bean(name="authc") // this must be AFTER the factory bean definition 
      public ShiroFormFilterExt authc(){  
           return new ShiroFormFilterExt();  
      }  
 }  

I hope other people trying to use Shiro with Spring Boot will find this useful.

Thursday 31 January 2013

Updating from Spring 3.0 to 3.2

We recently updated our Spring libraries from 3.0.5 to 3.2. It's the first time we have updated the Spring libraries in our webapp; we use Spring heavily and were a little nervous about whether everything would still work. But, encouraged by Spring's commitment to backwards compatibility, we went ahead... all unit and database tests continued to pass - great!
Running the application, however, had a few problems. We had been using a ContentNegotiatingViewResolver to return a PDF view of certain webpages when the URL ends in '.pdf'. After updating, all webpages were trying to return PDF content, even without the suffix, or we would get a ClassCastException ('String cannot be converted to Media Type').

Content negotiation is not trivial to sort out, and it seems that in 3.2 Spring has tried to make it more flexible, configurable and reusable by introducing ContentNegotiationStrategy implementations, which resolve views using different policies. This is documented in the Spring reference manual, but at least to me the documentation is a bit cryptic and hard to understand.
Here is how we fixed this:

[The XML bean configuration from the original post was lost in this archive; it defined a ContentNegotiatingViewResolver wired with a ContentNegotiationManager combining the two strategies described below.]
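The original post showed the XML configuration at this point. As a hedged reconstruction (not the author's exact config): in Spring 3.2 a ContentNegotiatingViewResolver can be wired with a ContentNegotiationManagerFactoryBean, which builds the path-extension and Accept-header strategies internally in that order. The media-type mapping and the fallback view resolver here are assumptions for illustration:

```xml
<!-- Hedged reconstruction, not the post's original XML -->
<bean id="contentNegotiationManager"
      class="org.springframework.web.accept.ContentNegotiationManagerFactoryBean">
    <!-- resolve by URL suffix first, e.g. '.pdf' -->
    <property name="favorPathExtension" value="true"/>
    <property name="mediaTypes">
        <props>
            <prop key="pdf">application/pdf</prop>
        </props>
    </property>
</bean>

<bean class="org.springframework.web.servlet.view.ContentNegotiatingViewResolver">
    <property name="contentNegotiationManager" ref="contentNegotiationManager"/>
    <property name="viewResolvers">
        <list>
            <!-- application-specific resolvers; a JSTL view is the default fallback -->
            <bean class="org.springframework.web.servlet.view.InternalResourceViewResolver">
                <property name="prefix" value="/WEB-INF/views/"/>
                <property name="suffix" value=".jsp"/>
            </bean>
        </list>
    </property>
</bean>
```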
Note that there are two strategies: PathExtensionContentNegotiationStrategy and HeaderContentNegotiationStrategy. These are evaluated in the order they appear in the list.
  1. First of all, if the request URL has a suffix, the path extension strategy will try to find a view resolver that can deal with that suffix, based on the media type mappings supplied to the constructor. If a match is found, no more strategies are evaluated.
  2. If there was no suffix, or there is no matching view, the HeaderContentNegotiationStrategy will examine the request headers and match up a view.
  3. Views are then evaluated to a 'best match', as in earlier versions of Spring.
 In this case, if there is no file suffix, the normal request header will result in a JSTL view being generated.
I'm sure people have different strategies for handling content negotiation; upgrading to 3.2 did break our web app, but hopefully this solution will help others.

Saturday 28 August 2010

SWTBot code coverage

Today I'm going to discuss the great boost to test code coverage provided by the SWTBot toolkit for Eclipse RCP applications.

We develop an Eclipse RCP app, SBSIVisual, that contains a mixture of 'core' domain objects and UI packages. Using just JUnit tests, we cover about 22% of the code base, with about 42000 test instructions for 71000 production code instructions. Considering the effort needed to write these tests, that's a lot of test code for not a huge coverage.

When we run our SWTBot functional tests, though, we get 35% code coverage from just 11000 test instructions. Not only do we get a more efficient test:source ratio (1:6.5 for SWTBot, 1:1.75 for JUnit), we also get better test coverage. Moreover, if we merge our JUnit and SWTBot results, we get 47% code coverage overall - indicating that both sets of tests are covering a substantial amount of non-redundant code.

Now, I don't know enough testing theory to know if these figures are 'typical' - the coverage might be a bit low for purists - but we are a small group of developers, with mixed enthusiasm for testing, working on a technical application with no dedicated testing or QA staff. I feel, though, that these figures could be quite reproducible in many other applications.

An additional advantage of SWTBot tests is their resilience to refactoring. We make quite severe refactorings from time to time, which usually requires a fair amount of work adapting the JUnit tests. However, so long as the UI remains relatively unchanged, the SWTBot tests often do not need to be altered at all, as the underlying object model is hidden from the user operations mimicked by SWTBot.

The key caveat in the above statement is 'the UI remains relatively unchanged'. How can we get round this problem? Our policy is not to write SWTBot tests immediately when we implement new functionality - we hold off until we are fairly sure that the UI is 'fit for purpose' with our users and at least the look and feel is acceptable for the next release. Then we write them, based on the use cases, to avoid regressions, and only have to make minor changes for incremental changes in functionality. The use of PageObjects is also very helpful, so that the details of the UI at the widget level can be hidden from the test author and kept in a single class.
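As an illustration of the PageObject idea with SWTBot, here is a hypothetical example (the dialog, widget labels and class name are all invented):

```java
import org.eclipse.swtbot.swt.finder.SWTBot;

// Hypothetical PageObject: owns all widget-level SWTBot calls for one dialog,
// so a UI change only requires edits here, not in every test.
public class LoginDialogPage {

    private final SWTBot bot;

    public LoginDialogPage(SWTBot bot) {
        this.bot = bot;
    }

    // One intention-revealing method per user action.
    public void loginAs(String user, String password) {
        bot.textWithLabel("Username:").setText(user);
        bot.textWithLabel("Password:").setText(password);
        bot.button("Login").click();
    }
}
```

A test then reads `new LoginDialogPage(bot).loginAs("alice", "secret")` rather than repeating widget lookups.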

Another great feature of SWTBot is that if you link your tests into your Ant build, you can be really confident your deployed app will run! This used to be a big problem for us - the unit tests would be fine, but a feature might be missing a dependency that would only be detected at runtime with the dreaded NoClassDefFoundError. Now we have virtually eliminated manual testing, other than a brief sanity check before each release.

All this enthusiasm for SWTBot may look like I'm knocking the JUnit tests here - far from it. We run the JUnit tests much more frequently; they take seconds rather than minutes to execute, and are resilient to other changes. For example, if we develop a new UI, the JUnit tests will still be relevant, but the SWTBot tests are only useful for our Eclipse RCP-based UI.

Monday 5 April 2010

Headless SWTBot testing for the Eclipse RCP mail example project

Introduction
Today I'm going to describe the steps needed to run SWTBot functional tests headlessly. Although there is some documentation on the SWTBot wiki pages, I've not found a complete example using a simple project such as the RCP Mail client example project.



The aim of this tutorial is to demonstrate how to set up headless SWTBot tests for the RCP mail application. I am no expert in this, but I thought it might be useful to provide a set of instructions using an example that everyone can access, at least to get a working example to begin with. To be able to follow this tutorial, it is best if:


  • You have some experience using SWTBot in the Eclipse IDE
  • You have some knowledge of Ant builds and Eclipse PDE builds.


This tutorial is based on input from several sources: Ralf Ebert's tutorial blog on p2 builds, Kai Toedter's mp3 client demo RCP app, Ketan Padegaonkar's SWTBot tutorial at EclipseCon 2010, and the SWTBot wiki pages.


The main steps in this tutorial are:

  1. Create an RCP mail application and feature.
  2. Develop a headless build for the RCP mail app.
  3. Develop an SWTBot-based test plugin to test the UI of the mail app.
  4. Develop a headless build and execution of the SWTBot tests.


Set up:


This tutorial is set up using Eclipse 3.5.2. You will need two copies of Eclipse: one as your IDE and one as a target to compile against. The target should contain the RCP delta pack and the full PDE feature set. Both Eclipses should have SWTBot installed (best done through the update site). This tutorial uses Galileo; I've not checked whether it works with Helios or Ganymede. The Galileo update site is available at http://download.eclipse.org/technology/swtbot/galileo/dev-build/update-site.

All the projects are available in zipped workspace here

Step 1 : Create RCP mail application:


This takes several stages: create the RCP app, create a product, create a feature, and do an export build using features.

a ) Create RCP app

Switch to a clean empty workspace

  • Set your target Eclipse to be the Eclipse into which you've installed the RCP delta pack.

  • Create a new plugin project called 'mail'. Make sure you have chosen that it is an RCP application.
  • Choose 'RCP Mail Template' as the example template. Otherwise accept all defaults.




CHECKPOINT: Select the newly created mail project, right-click Run As -> Eclipse Application - it should run!


b) Create a product file

Now we're going to create a product file for our mail application.

Select the mail project, right-click and choose 'New Product Configuration', and in the ensuing dialog call the product file 'mail.product'. Click Finish.





c) Now we're going to create a feature for the mail project. Click New -> Feature Project, call it 'mail.feature', and in the subsequent page add the 'mail' plugin. Click Finish.





d) Now go back to the product configuration file you created in step b) and open it in the Product editor. In the overview tab, change 'This product configuration is based on:' from plugins to features. Now click on the dependencies tab and add the following 2 features:

  • org.eclipse.rcp
  • mail.feature





CHECKPOINT:

Go back to the overview tab, click 'Synchronize with defining plug-in', then 'Launch an Eclipse application' - it should still run! At this stage you can try a product export from the overview page; just accept the defaults and you should get an exported, functional RCP app.

So, at this stage we have a product configuration that has all the required functionality to export a working RCP mail application.


Step 2 : Creating a headless Ant build for the mail client:


First of all, create a project called 'mail.build' and copy in these 2 files:

build.properties

build.xml.

To get the headless build to work on your machine, you will need to edit some file paths and platform specific settings in 'build.properties'. These are documented at the start of the file.

Build.xml can remain unchanged.


Checkpoint: you can now run the 'Build from workspace' Ant target and get a build generated into a folder called user.home/MailBuilds/Builds/I.RCPMail. Unzip the archive and check the application runs.


Step 3: Now we'll finally get round to working with SWTBot!!


Create a new, standard plugin project called 'testMail'. Add the following plugins to the list of required plugins in the 'Dependencies' section:

  • org.junit4
  • org.eclipse.swtbot.eclipse.finder
  • org.eclipse.swtbot.swt.finder
  • org.hamcrest

Now create a test case called MailApplicationTest in package 'test' with some SWTBot tests; here is a sample:



package test;

import static org.junit.Assert.assertEquals;
import static org.junit.Assert.assertNotNull;

import java.util.List;

import org.eclipse.swtbot.eclipse.finder.SWTWorkbenchBot;
import org.eclipse.swtbot.eclipse.finder.matchers.WidgetMatcherFactory;
import org.eclipse.swtbot.eclipse.finder.widgets.SWTBotView;
import org.eclipse.swtbot.swt.finder.SWTBot;
import org.eclipse.swtbot.swt.finder.widgets.SWTBotMenu;
import org.eclipse.swtbot.swt.finder.widgets.SWTBotShell;
import org.junit.Test;

public class MailApplicationTest {

    private SWTWorkbenchBot bot = new SWTWorkbenchBot();

    @Test
    public void testApplicationWindow() throws Exception {
        assertNotNull(bot.shell("RCP Product"));
    }

    @Test
    public void testOpenAnotherView() throws Exception {
        SWTBotMenu file = bot.menu("File").menu("Open Another Message View");
        file.click();
        List<SWTBotView> views = bot.views(WidgetMatcherFactory.withPartName("Message"));
        assertEquals(2, views.size());
        views.get(1).close();
    }

    @Test
    public void testOpenMessage() throws Exception {
        SWTBotMenu file = bot.menu("File").menu("Open Message");
        file.click();
        bot.shell("Open").bot().button("OK").click();
    }

    @Test
    public void testClickMessageLink() throws Exception {
        SWTBot viewBot = bot.viewByTitle("Message").bot();
        viewBot.link("nicole@mail.org").click();
        SWTBotShell shell = bot.shell("Not Implemented");
        shell.bot().button("OK").click();
    }

    @Test
    public void testNavigationView() throws Exception {
        SWTBot viewBot = bot.viewById("mail.navigationView").bot();
        String node = viewBot.tree().expandNode("me@this.com").getNodes().get(0);
        assertEquals("Inbox", node);
    }
}


Now check that the tests run by configuring a new SWTBot launch configuration:

Choose 'mail.product' as the product to launch. You may need to go into the 'plugins' tab and 'add required plugins' if the app fails to launch.


Below are screenshots needed to get the launch configuration to work:

CHECKPOINT: You can run the SWTBot tests successfully in the IDE; 5 tests should pass.


Step 4 Finally we're ready to run the tests from an Ant script!!

a) First of all create a new 'General' project called 'mail.test.build' and copy into it the build.xml and build.properties from the mail.build project. We'll come back to these later.


b) Now create a feature for your test plugin. Create a new Feature project called 'testMail.feature' and add the testMail SWTBot test plugin as its single component plugin.



c) In the testMail plugin, create a new product configuration, call the product file 'testMail.product', and make it a feature-based product. This product will contain the features needed for the mail app, as well as those needed for SWTBot. The end result is that, when exported, SWTBot and its dependencies will be 'embedded' in the RCP app. So add the following features to the product:

  • mail.feature
  • org.eclipse.rcp (these were needed for our app)
  • testMail.feature (our SWTBot test feature)
  • org.eclipse.swtbot
  • org.eclipse.swtbot.eclipse
  • org.eclipse.swtbot.eclipse.test.junit4 (the swtbot features)
  • org.eclipse.pde
  • org.eclipse.jdt
  • org.eclipse.platform ( dependencies for SWTBot)



CHECK: In the testMail.product configuration overview tab, after a 'synchronize with product's defining plug-in' and launching, the test product should launch. Also, an export of the product should proceed successfully (using the Eclipse product export wizard). At this point we don't need the tests to run; we just want to make sure that the app still runs OK.


d) Create a headless build of the SWTBot-enabled mail application. In the mail.test.build project we created in step a), we just need to make a few alterations so it will build our test project:

In build.properties, change the 'product' property to

 product=${buildDirectory}/plugins/testMail/testMail.product
In build.xml, add the lines:

<include name="testMail*/**" />
<exclude name="testMail*.feature*/**" />
<include name="testMail.feature*/**" />


into the copyProjectsFromFilesystem target, so that it looks like this:


<target name="copyProjectsFromFilesystem">
    <mkdir dir="${buildDirectory}" />
    <mkdir dir="${buildDirectory}/plugins" />
    <mkdir dir="${buildDirectory}/features" />
    <copy todir="${buildDirectory}/plugins">
        <fileset dir="${sourceDir}">
            <include name="mail*/**" />
            <include name="testMail*/**" />
            <exclude name="*mail*.feature*/**" />
            <exclude name="testMail*.feature*/**" />
        </fileset>
    </copy>
    <copy todir="${buildDirectory}/features">
        <fileset dir="${sourceDir}">
            <include name="mail.feature*/**" />
            <include name="testMail.feature*/**" />
        </fileset>
    </copy>
</target>


This just ensures we will include the new test features and plugins in the headless build as well.

The build should run in just the same way as for the standard RCP headless build we performed in stage 2.
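As a reminder, the headless build is kicked off via Eclipse's antRunner application. A sketch of the invocation follows; the ECLIPSE_HOME path and the launcher jar version are assumptions (check the plugins/ directory of your target Eclipse for the real version on your machine):

```shell
# Sketch of the PDE headless-build invocation. ECLIPSE_HOME and the
# equinox launcher jar version are placeholders - verify both against
# your own target Eclipse install before running.
ECLIPSE_HOME="${ECLIPSE_HOME:-/path/to/targetEclipse}"
LAUNCHER="$ECLIPSE_HOME/plugins/org.eclipse.equinox.launcher_1.0.201.R35x_v20090715.jar"
CMD="java -jar $LAUNCHER -application org.eclipse.ant.core.antRunner -buildfile build.xml"
# Printed rather than executed here, so the command can be inspected first:
echo "$CMD"
```

Run the resulting command from the build project directory, as in stage 2.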


CHECKPOINT: Unzip the build and verify that the exported 'Test' build starts properly.


e) Now invoke the tests using an Ant task provided by SWTBot, as described in the SWTBot wiki. To begin with you can create a file called 'SWTBottest.xml' in your Eclipse IDE and paste in the content below; in real usage you would probably want this to be merged in with your standard build and invoked automatically after the build has finished.


This task is provided verbatim here; you will need to edit the property 'eclipse-home' to point to your RCP app install. You may also need to alter some of the other properties, for example those concerning your OS, or the SWTBot build IDs. For the RCP mail application, if you have named the projects and artifacts the same as me, you won't have to alter the plugin-name, classname or testProduct properties.
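For reference, the 'classname' property points at an SWTBot JUnit 4 test class in the test plugin. A minimal sketch of what test.MailApplicationTest might contain follows; the menu label is an assumption based on the RCP Mail template, not code taken from the project:

```java
package test;

import static org.junit.Assert.assertNotNull;

import org.eclipse.swtbot.eclipse.finder.SWTWorkbenchBot;
import org.eclipse.swtbot.swt.finder.junit.SWTBotJunit4ClassRunner;
import org.junit.Test;
import org.junit.runner.RunWith;

// Hypothetical sketch of the test class referenced by the 'classname'
// property; the menu label is an assumption about the RCP Mail template.
@RunWith(SWTBotJunit4ClassRunner.class)
public class MailApplicationTest {

    private final SWTWorkbenchBot bot = new SWTWorkbenchBot();

    @Test
    public void canOpenAnotherMessageView() {
        // Drive the running app through its menus, as a real user would
        bot.menu("File").menu("Open Another Message View").click();
        assertNotNull(bot.activeView());
    }
}
```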


<project name="testsuite" default="run" basedir=".">

    <!-- Edit this to be the path to your exported RCP application -->
    <property name="eclipse-home" value="/Users/radams/MailBuilds/BUILDS/I.RCPMail/RCPMail" />

    <!-- The SWTBot build ID (look in plugins/ to see if this is different) -->
    <property name="all.buildId" value="2.0.0.512-dev-e35" />

    <!-- The OS running the tests -->
    <property name="os" value="macosx" />
    <property name="ws" value="cocoa" />
    <property name="arch" value="x86" />

    <!-- Edit this to be the name of your test plugin -->
    <property name="plugin-name" value="testMail" />
    <property name="classname" value="test.MailApplicationTest" />
    <property name="testProduct" value="mail.product" />
    <!-- path to library file (which should be included in your RCP app) -->
    <property name="library-file" value="${eclipse-home}/plugins/org.eclipse.swtbot.eclipse.junit4.headless_${all.buildId}/library.xml" />

    <!-- Don't need to edit below this point -->
    <target name="suite">

        <condition property="jvmOption" value="-XstartOnFirstThread -Dorg.eclipse.swt.internal.carbon.smallFonts">
            <os family="mac" />
        </condition>

        <property name="jvmOption" value="" />

        <property name="temp-workspace" value="workspace" />
        <delete dir="${temp-workspace}" quiet="true" />

        <!-- remove junit3 fragment -->
        <delete dir="${eclipse-home}/plugins/org.eclipse.swtbot.eclipse.junit3.headless_${all.buildId}" />
        <delete dir="${eclipse-home}/plugins" includes="org.eclipse.swtbot.ant.optional.junit3_${all.buildId}.jar" />

        <ant target="swtbot-test" antfile="${library-file}" dir="${eclipse-home}">
            <property name="data-dir" value="${temp-workspace}" />
            <property name="testProduct" value="${testProduct}" />
            <property name="plugin-name" value="${plugin-name}" />
            <property name="classname" value="${classname}" />
            <property name="vmargs" value=" -Xms128M -Xmx368M -XX:MaxPermSize=256M ${jvmOption}" />
        </ant>
    </target>

    <target name="cleanup" />

    <target name="run" depends="suite,cleanup">
        <ant target="collect" antfile="${library-file}" dir="${eclipse-home}">
            <property name="includes" value="*.xml" />
            <property name="output-file" value="${plugin-name}.xml" />
        </ant>
    </target>

</project>



Now invoke the build - you should see the app fire up and respond to the tests. You should then be able to follow the results as described in the Eclipse wiki page on SWTBot.


Summary

In this blog I've tried to give a complete run-through of all the steps needed to get headless SWTBot tests running for an Eclipse RCP application. A complete workspace of the projects is available here, which hopefully will give interested readers further clarification on the details.

Thanks very much for reading - I hope this is of some use.





Friday 2 April 2010

Eclipse RCP Headless build experiences

As Eclipse RCP developers for the past 4 years, the worst experiences we've had by far have always been with the headless build, especially when upgrading. While APIs for UI components remain wonderfully stable across the 3.x series, the headless build and update site change quite markedly from one release to the next, and we find it takes several days' work to get the build working for a new Eclipse release. In fact we've been avoiding upgrading from 3.4 to 3.5 until now, precisely because our small team has been reluctant to break our build whilst we've had several deadlines to meet.

So finally, emboldened by attendance at EclipseCon2010 (in particular, Kai Toedter's excellent tutorial), we've bitten the bullet and upgraded... and it wasn't too bad at all! So here is a high-level set of 'rules for dummies' to get a headless build working for a standard RCP app, along with some gotchas to watch out for.

Rule 1) Make sure your product is based on features, not plugins. This will enable updating, and greatly facilitate a headless build.

Rule 2) Don't even think about trying a headless build until you've got the standard product export from the PDE UI working!

Rule 3) ... and don't try the standard product export until 'Launch the product' from the product overview page works in the IDE. One problem we had with upgrading was that the set of Eclipse plugins changed somewhat. So we added (as dependent features in our product config) the RCP feature, the Help feature, plus our own application features. Then, to find out what plugins were missing, we did the following steps:
  1. Click 'synchronize with defining plugin' on the product overview page.
  2. Click 'Launch product', which typically fails to begin with.
  3. Now open the launch configuration for the launch you just did, and click 'Validate plugins'
  4. By scanning the list of missing dependencies you can add these to your product build. We have an 'Eclipse base' feature which contains all the plugins which our app needs but which are not defined in the RCP or Help features. For example, we define a search page that depends on the ltk.refactoring plugins, so we add these ltk plugins to our 'base' feature.
  5. Add these to your 'base feature' (or just your own feature) until the launch configuration works. You are now ready to try the export from the PDE UI with a reasonable chance of success.
Rule 4) Make sure the version of the RCP delta pack you use corresponds exactly with your Eclipse install! We carelessly used an older version of the delta pack, which broke the headless build even though the PDE build in the UI seemed to work fine.

Rule 5) If you are migrating from 3.4 to 3.5, remove all of the 'update' plugins from your feature - these are now replaced by p2 plugins. You may need to remove some menu contributions from your app, as this is now all accomplished by the p2.user.ui feature - just make sure you have a top-level 'Help' menu in your app.

Rule 6) When creating a build target of Eclipse plugins + RCP delta pack, we were tempted to remove certain features & plugins which we knew our app didn't need (e.g., jdt plugins) in order to reduce its size. Don't do this!! Just download a fresh eclipse, use update sites to install any other plugins you need, and copy in the delta pack features and plugins. This is then your 'target' Eclipse install against which you build your app.

Rule 7) Use a template build.properties file that you know works, if you can find one - Kai Toedter's excellent mp3 demo build.properties works pretty much out of the box for 3.5.2-based apps; you just need to change the file locations to those on your system and it just works!
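For anyone without a template to hand, these are the sorts of build.properties entries that typically need adjusting for a feature-based build; the feature id and paths below are illustrative placeholders, not values from Kai's file:

```properties
# Illustrative PDE build.properties fragment - ids and paths are placeholders
topLevelElementType=feature
topLevelElementId=my.app.feature
buildDirectory=/path/to/buildDirectory
base=/path/to
baseLocation=${base}/targetEclipse
configs=macosx,cocoa,x86
buildType=I
archivePrefix=MyApp
```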

Whilst this isn't a complete tutorial, hopefully it will be of use to someone who's been banging their head against their screen for the last few days. Indeed, perhaps the days of nightmarish headless builds are coming to an end now that p2 is working nicely... at least until e4 comes along!

Thanks for reading!

Friday 26 March 2010

EclipseCon2010 thoughts

Well, EclipseCon2010 is now over, and what an excellent meeting it was. It was my first attendance and a great way to find out more about the organisation and future aims of the project, put names to faces and meet people.

I was really impressed with the quality of the tutorials - I attended tutorials on SWTBot, serverside Eclipse, and advanced RCP, all of which worked flawlessly and were timed very well. (Finally I have some idea how p2 works!) Since we will be giving a tutorial on SBSI at the upcoming ICSB in October 2010, I got some great ideas, such as providing 'sync points' of downloads of the tutorial at various stages, so that people can catch up or just start from a point they're interested in, and using EtherPad to keep track of questions and issues arising.

As an Eclipse RCP developer, it was great to see a new SWT-native scientific charting library presented, called SWT XY Graph. Currently we have come across two choices - BIRT chart designer (which pulls in a lot of other plugins and can bloat a small application) or JFreeChart via an SWT-AWT bridge (which works quite well but can be a little unresponsive at times). So this new plugin could be a very welcome alternative.
Another new project which looks interesting is Graphiti, which aims to provide rapid development of GEF-based editors. The first release should be in June.

The HPC world is being included in the Eclipse community now - the Parallel Tools Platform aims to provide static analysis and debugging facilities, and a world-class IDE for developers writing applications for HPC.

It was also good to come across a smattering of life-sciences people at EclipseCon, who either attended my talk or congregated around Ola Spjuth's Bioclipse poster. It will be fun to try to collaborate to make Eclipse plugins for biology more interoperable and expand the potential scope for users.