Thursday, March 31, 2011

JUnit based integration testing with Simple JNDI



I regularly use TJWS, a lightweight Java web server built as a servlet container, to provide standard web server capabilities at build and test time.

The main advantage is that I don't have to deploy my WAR file to my production server (e.g. JBoss) to test my Java web application. My Maven build runs complex JUnit integration tests immediately after building the WAR file.

In my configuration, however, I have to use JNDI (Java Naming and Directory Interface) to connect to a specific data source (DB2 via JDBC) for some of my tests.

I used Simple JNDI for this, which also offers an easy and elegant way to configure data source access.

Here are the steps I follow to setup my JUnit tests using Simple JNDI:

    • Add dependencies in your Maven 2 POM file for Simple JNDI, the Java Persistence API and the DB2 JDBC driver:
        
        <dependency>
            <groupId>simple-jndi</groupId>
            <artifactId>simple-jndi</artifactId>
            <version>0.11.4.1</version>
            <scope>test</scope>
        </dependency>
    
        <dependency>
            <groupId>javax.persistence</groupId>
            <artifactId>persistence-api</artifactId>
            <version>1.0</version>
            <scope>test</scope>
        </dependency>
      
        <dependency>
            <groupId>com.ibm.db2</groupId>
            <artifactId>db2jcc4</artifactId>
            <version>9.7.0.2</version>
            <scope>test</scope>
        </dependency>
        

    • Create a small Java class for the JNDI setup
         import javax.naming.InitialContext;
         import javax.sql.DataSource;
    
         public class JndiSetup { 
             /**
              * Setup the Data Source
              */
             public static void doSetup(String ds_name) {
                 try {
                     InitialContext ctxt = new InitialContext();
                     DataSource ds = (DataSource) ctxt.lookup("jdbc."+ds_name);
                     // rebind for alias if needed
                     ctxt.rebind("jdbc/"+ds_name, ds);
                 } catch (Exception ex) {
                     ex.printStackTrace();
                 }
             }
         }

    In more complex situations, you may also have to create an EntityManagerFactory and obtain an EntityManager from it, as sketched below.
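
    For example, a minimal sketch might look like this (the persistence unit name "my-test-unit" is hypothetical and must match the one declared in your persistence.xml):

         import javax.persistence.EntityManager;
         import javax.persistence.EntityManagerFactory;
         import javax.persistence.Persistence;

         public class JpaSetup {

             private static EntityManagerFactory emf;

             /**
              * Create the factory once, then hand out an EntityManager per test.
              */
             public static EntityManager createEntityManager() {
                 if (emf == null) {
                     // "my-test-unit" is a placeholder persistence unit name
                     emf = Persistence.createEntityManagerFactory("my-test-unit");
                 }
                 return emf.createEntityManager();
             }
         }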

    • In your JUnit test class, set up the JNDI data source before running your tests (a fuller test sketch appears after the configuration steps below):
        @BeforeClass
         public static void setUpClass() throws Exception {
               JndiSetup.doSetup("<your-ds>"); // pass the data source name defined in jdbc.properties
         }


    • Create a jndi.properties file in the "\src\test\resources" path in your project 
     
         java.naming.factory.initial=org.osjava.sj.SimpleContextFactory
         org.osjava.sj.root=target/test-classes/config
         org.osjava.jndi.delimiter=/
         org.osjava.sj.jndi.shared=true

    • Create a jdbc.properties file in the "\src\test\resources\config" path in your project (I am using an IBM DB2 data source). Create the config directory if it doesn't exist. The "config" name comes from the org.osjava.sj.root parameter in the jndi.properties file; if you prefer a different folder name, say "foo", update the "org.osjava.sj.root" property accordingly and create a "foo" folder under "\src\test\resources". Make sure the /src/test/resources directory is on the Java build path and that its output folder is set to target/test-classes.
    
         <your-ds>.type=javax.sql.DataSource
         <your-ds>.driver=com.ibm.db2.jcc.DB2Driver
         <your-ds>.url=jdbc:db2://<your-database-host>:50000/<your-ds-or-ds-alias>
         <your-ds>.user=<your-db-login>
         <your-ds>.password=<your-db-password>
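
    Putting the pieces together, a minimal integration test might look like the following sketch (the data source name "MYDS", the table and the SQL are hypothetical; adjust them to your own schema):

         import java.sql.Connection;
         import java.sql.ResultSet;
         import java.sql.Statement;
         import javax.naming.InitialContext;
         import javax.sql.DataSource;
         import org.junit.BeforeClass;
         import org.junit.Test;
         import static org.junit.Assert.assertTrue;

         public class CustomerDaoIT {

             @BeforeClass
             public static void setUpClass() throws Exception {
                 JndiSetup.doSetup("MYDS"); // hypothetical data source name
             }

             @Test
             public void canReadCustomers() throws Exception {
                 // the alias bound by JndiSetup ("jdbc/" + data source name)
                 DataSource ds = (DataSource) new InitialContext().lookup("jdbc/MYDS");
                 Connection con = ds.getConnection();
                 try {
                     Statement stmt = con.createStatement();
                     ResultSet rs = stmt.executeQuery("SELECT COUNT(*) FROM TEST.CUSTOMER");
                     assertTrue(rs.next());
                 } finally {
                     con.close();
                 }
             }
         }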

    With all of this you should be ready to run your integration tests with mvn clean install.

    You may also have to run mvn eclipse:eclipse -DdownloadJavadocs=true if you are using Eclipse and your new dependencies and imports do not resolve properly.

    Monday, February 28, 2011

    REST-Style Architecture and the Development of Mobile Health Care Applications



    Mobile devices offer new ways for users to access health care data and services in a secure and user-friendly environment. These new applications must be easy to create, deploy, test and maintain, and they must rely on a scalable and easily integrated infrastructure.

    In the ambulatory health care environment, providers spend the majority of their time in an examination room with patients. Although some clinics have installed personal computers in the exam room for use at the point of care, many physician practices have yet to do so or have no such intention. Reasons for not installing PCs in the exam room include (among others) lack of space, security concerns, and cost. Often, clinics have PCs installed outside of the exam room to be used for encounter documentation or health history research (i.e., reviewing the patient's health records). This physical setup is often satisfactory for providers to complete their documentation needs. Providers often scratch rough notes on paper during an encounter, then dictate or type their notes after the visit has ended. The absence of computers in the exam room, however, is a disadvantage for research activities. Frequently, after listening to the patient's verbal health history, a provider wishes to read past records. If those records are in an electronic format, it is optimal to access those records at the point of care (i.e., in the exam room).

    Thus, computer devices that are smaller and more mobile than a PC (e.g., smart phones, PDAs, tablets) would be the optimal hardware choice to access these electronic records. Given that many physicians carry smart phones, such mobile devices would be the ultimate tools to look up patient records.

    Since the development of client applications on different mobile platforms requires more time than creating web applications for a handful of browsers, it is important to minimize the complexity of the integration with the back-end services and legacy systems and to try to decouple the development and maintenance of the client- and server-side components.

    The Representational State Transfer (REST) architectural style is an alternative to SOAP and offers clear advantages over it, including a lightweight architecture, extensibility, scalability, and ease of development, testing, deployment and maintenance.

    REST API prototypes can be created in a matter of days, and a fully functioning set of sophisticated clinical web services accessible by mobile client applications can be built within a few weeks.

    In addition, REST APIs are particularly suitable for fast and loosely coupled solution integration such as mobile applications, but they can also be used in health care for portal and mash-up applications.


    Reference:
    Andry F., Wan L., Nicholson D., A mobile application accessing patients' health records through a REST API, 4th International Conference on Health Informatics (HEALTHINF 2011), pp 27-32, Rome 2011.


    Saturday, January 29, 2011

    Increase your productivity on Windows platforms with Console



    A couple of years ago, one of my colleagues showed me Console, a very useful Windows console window enhancement tool. Since then I have been using it and have increased my productivity when it comes to command-line tasks on Windows platforms. It is open source software available on SourceForge.

    With Console, you have all your Windows consoles within a single application - you also get:
    • multiple tabs
    • text editor-like text selection
    • different background types
    • alpha and color-key transparency
    • configurable font, different window styles
    As a result, you can customize each console that appears in a different tab.

    In the example below, I have created a tab called "MY-MAVEN-BASED-PROJECT" that opens at a specific Windows path with a particular prompt, "PROJECT-ROOT:", to build my project.

    To configure your prompt, you need to do the following:

       In the console settings tab, go to the shell field and enter:

                cmd.exe /k "prompt <your-prompt>"

    and for the startup directory, just specify the directory where you want to open your customized tab.


    You can select your background color and style in the background customization tab:

    Enjoy!


    Tuesday, December 28, 2010

    LotusScript Connectors for DB2

     

    In my previous post I explained how to install the IBM Data Server Runtime Client, including the ODBC DB2 drivers, to access a DB2 database remotely. In this new post I explain the bare minimum to install in order to access a DB2 data source via the LotusScript Extension for Lotus Connectors (LSX-LC).

    By the way, the ODBC DataDirect Lotus-branded drivers might be available only to paying customers...

    A. Installing the DB2 ODBC CLI drivers

    First you need to install the DB2 client drivers on the system (these are different from those that come with the IBM Data Server Runtime Client). I am using a Windows server, so after downloading I just unzip either v9.7fp3a_nt32_odbc_cli.zip (32-bit) or v9.7fp3a_ntx64_odbc_cli.zip (64-bit) into a folder (e.g. C:\clidriver).

    Then I open a command prompt and navigate to my folder (e.g. C:\clidriver\bin) and type:

         db2oreg1.exe -i -setup

    Immediately following this you can set up the data sources in ODBC. Open Control Panel -> Administrative Tools -> Data Sources (ODBC).

    You will see a screen that looks like the following, click on the "System DSN" tab:
     
    Click the "Add..." button to get the following, and select "IBM Data Server Driver for ODBC - C:/clidriver":

    Enter a data source name; this is used directly in the LotusScript code (use the same name as the database itself):

    Enter the DB2 User ID and password that the agent uses to connect to DB2:

    Select the "Save Password" option and click OK on the warning popup for saving the password in db2cli.ini file:

    Click on the "Advanced Settings" tab:

    Click "Add" and select the "Database" CLI Parameter:

    Enter the Database name in the prompt and click OK:

    Continue to do this for these parameters:
    • Database: The database name
    • Hostname: The DB2 server/host name (IP address is not recommended)
    • Port: 50000 (the default DB2 TCP/IP listener port)

    B. Accessing the DB2 Database from LotusScript

    1. Get access to the Lotus Connector Extensions (this is always installed)

          Option Public
          Option Explicit

          UseLSX "*lsxlc"

     2. Create the LCSession object at the top of all functions or subroutines

          Dim session As New LCSession

     3. Enable Connection pooling

          session.ConnectionPooling = true

     4. Create the Connection, using the LCConnection class's constructor that takes a single argument (the name of the connector type). We're using ODBC, which has the Lotus Connector name of "odbc2".
          Dim conn As New LCConnection ("odbc2")
          conn.Server = "RLS" 'Using the ODBC DATA SOURCE name created previously.
          conn.Connect

     5. When done with the connection, disconnect. This will not actually close the connection if connection pooling is enabled.

          conn.Disconnect

    Queries

    Querying DB2 through an LCConnection object requires a couple of variables to hold the returned fields. There are multiple ways to issue a query:

    LCConnection Execute

    The Execute method takes a full SQL statement, which is useful for complex queries. Unfortunately, LSX LC (like LS:DO) does not support any kind of parameterized query syntax or method calls. This means that the parameter values sent to the database need to be encoded specifically for DB2. This kind of encoding may be difficult from LotusScript, and it is therefore recommended to use stored procedures (written in SQL or Java) for complex queries. For simple SELECT queries involving one table (or view) with "ANDed" WHERE clause predicates, you can use the Select method of the LCConnection class.

    Execute example code:
    Dim fldLst As New LCFieldList
    Dim fld As LCField
    Dim sName As String
    conn.Execute "SELECT * FROM TEST.CUSTOMER", fldLst ' fldLst receives the result set fields
    Set fld = fldLst.Lookup("CUST_NAME")
    While (conn.Fetch(fldLst) > 0)
        sName = fld.Text(0) ' do something with this column value
    Wend

    LCConnection Select

    The Select method is best described in the LC LSX manual, as there are many options. The example code there shows the user accessing a "count" of returned records; this is not accurate for the DB2 and ODBC setup we are using. Instead, as with the Execute method, the count can only be determined by the number of times we loop while fetching each row.

    When using Select you must set the Metadata property to the schema and table name you're selecting from. Always use the form "schemaname.tablename" to avoid runtime errors later.

    Select example code:
    Dim result As New LCFieldList
    Dim fld As LCField
    conn.Metadata = "TEST.CUSTOMER"
    conn.Select Nothing, 1, result ' equivalent to SELECT * FROM TEST.CUSTOMER
    Set fld = result.Lookup("CUST_NAME")
    While (conn.Fetch(result) > 0)
        MessageBox fld.Text(0) ' display the result
    Wend
     
    For more on LotusScript, see the IBM documentation on Lotus Domino.

    I would also like to thank my colleague Ravi L. for his step-by-step walkthrough on this topic!

    Wednesday, November 24, 2010

    Database Alias and DB2 ODBC Drivers

    One of my recent projects required using ODBC to access DB2 databases located on remote VMware LabManager images. I am using a Windows (Vista) laptop to develop and test my project (LotusScript Data Object code), a Lotus Notes/Domino DB2 integration using ODBC. The first step for me was to install the ODBC DB2 drivers, since I did not have DB2 installed on my laptop.

    Several installation options were offered to me for DB2 9.7. The installation of the IBM Data Server Runtime Client is very fast and straightforward. It installs the ODBC/CLI drivers and a small set of useful command-line setup tools:

    After this, we can create the Database Aliases using the Windows ODBC Data Source Administrator.
    When you look at the Drivers tab, you should now see your DB2 ODBC drivers.

    To add a DB2 Data Source Name (DSN):
    • select User or System and click on the Add... button.
    • select the DB2 ODBC/CLI driver
    • enter a data source name and add an alias if needed (click on the Add button next to your existing aliases)
    • enter the data source parameters (description, user ID, password) - check the "Save password" checkbox to save your login and password locally in your db2cli.ini file.
    • enter your TCP/IP connection settings (the port number is 50000 by default for DB2); the host name is the IP address of my DB2 server VMware image.
    • I did not have to change anything in the defaults of the Security options and Advanced Settings.

      From there, your ODBC DSN is ready to connect to your DB2 database.

      One issue you will encounter, though, is how to delete an existing database alias from the DB2 ODBC tab, either to modify it or to remove an old one.
      These aliases appear in the drop-down of the ODBC IBM DB2 Driver - Add popup window.

      The truth is that even though you are accessing a remote DB2 server machine, these aliases are stored locally in your DB2 installation.
      To remove a DB2 database alias, just start the IBM DB2 Command Line Processor and use the following command:

      UNCATALOG DATABASE <database_alias>

      In certain cases, you also need to refresh the directory cache. For this, just stop and restart the DB2 Management Service on your local Windows machine.

      Friday, October 29, 2010

      Healthcare REST APIs - JSON or XML?

      I have been working recently on a REST API which produces subsets of Continuity of Care Documents (CCD). This REST API is used by an iPhone application which is targeted to physicians and nurses. Since I wanted to minimize the amount of data exchange between the server and the client, I originally used JSON as my data exchange format. The motivation to use JSON was to have a compact format that offers better performance than a more complex XML representation.

      For example, the request to obtain lab results from a CCD is as follows:
      GET /users/<user-id>/patients/<patient-id>/lab-results?hl7v3=true&max=<max>&offset=<offset>
      

      The result of this request is a JSON object containing a list of lab results:
      {"lab-results":{
          "list":[{"lab-result":{"entry":"...",
                                 "facility":"...",
                                 "normalcy":"...",
                                 "orderedBy":"...",
                                 "status":"...",
                                 "subject":"...",
                                 "urgency":"..."}},
                  {"lab-result":{...}},...],
          "count":"...",
          "offset":"...",
          "remain":"..."}}
      
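
      On the client side (or in tests), this structure is easy to traverse with a generic JSON parser. Here is a minimal sketch assuming the Jackson databind library (package names below are from Jackson 2.x), using the field names shown above:

      import com.fasterxml.jackson.databind.JsonNode;
      import com.fasterxml.jackson.databind.ObjectMapper;

      public class LabResultsClient {

          /** Print the subject of each lab result in a payload shaped as above. */
          public static void printSubjects(String json) throws Exception {
              ObjectMapper mapper = new ObjectMapper();
              JsonNode list = mapper.readTree(json).path("lab-results").path("list");
              for (JsonNode item : list) {
                  // each list element wraps a single "lab-result" object
                  System.out.println(item.path("lab-result").path("subject").asText());
              }
          }
      }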

      A lab result HL7 V3 entry is returned as the following JSON object:
      {"entry":{
          "organizer":{
              "code":{"displayName":"..."},
              "components":[
                  {"component":{...}},
                  {"component":{...}},...],
              "notes":[...]}}}
      

      A lab-result component itself:
      {"component":{
          "observation":{
              "code":{"displayName":"..."},
              "effectiveTime":{"value":""},
              "value":...,
              "interpretationCode":{"code":"..."},
              "referenceRange":{
                  "observationRange":{...}},
              "notes":[...]}}}
      

      An observation value is returned as a JSON object containing either a string value, a unit and a type, or just some text.
      {"value":{"unit":"...","value":"...","type":"..."}}
      
      {"value":"..."}
      

      An observationRange is returned as a JSON value object containing a low and high value, or just some text.
      {"observationRange":{
          "value":{
              "low":{"value":"..."},
              "high":{"value":"..."}}}}
      
      {"observationRange":{"text":"..."}}
      

      All these JSON objects are marshalled from annotated Java POJOs using the JBoss RESTEasy framework and Jackson:
      @XmlRootElement(name = "high")
      public class HighValue {
      
       private String value = "";
      
       /**
        * Construct a new instance.
        */
       public HighValue() { }    // Empty constructor
      
       /**
        * Create a new {@code HighValue} during JAXB unmarshalling.
        * @param value
        *            String as value for the high value.
        */
       public HighValue(final String value) {
        if (value != null)
         this.value = value.trim();
       }
      
       /**
        * Get the {@code value} attribute.
        * @return {@code value} attribute value (may be {@code null}).
        */
       @XmlElement
       public String getValue() {
        return value;
       }
      
       /**
        * Set the {@code value} attribute.
        * @param value
        *            value to set.
        * @see #getValue()
        */
       public void setValue(final String value) {
        if (value != null)
         this.value = value.trim();
       }
      }
      
      This was fine initially since I was focusing on just lab results and I was using a specific back-end API that provided values to populate my POJOs. The solution started to become more complex when I was asked to generate a large set of CCD data types; as a result, the number of Java objects quickly grew much larger.

      The other option I had was to use another internal API that already generated full CCDs or CCD subsets. However, the resulting CCD format was XML:
      <component>
        <observation classCode="OBS" moodCode="EVN">
          <templateId root="2.16.840.1.113883.10.20.1.31"/>
          <templateId root="1.3.6.1.4.1.19376.1.5.3.1.4.13"/>
          <templateId root="2.16.840.1.113883.3.88.11.83.15"/>
          <id root="1"/>
            <code code="Remark" codeSystemName="L" displayName="Remark"/>
            <text>
              <reference value="#Observation_504ccbaf5ecea7b1096720"/>
            </text>
            <statusCode code="completed"/>
            <effectiveTime value="20091223231100"/>
              <value xsi:type="ST">Spec #106641063: 23 Dec 09  2311</value>
            <interpretationCode code="N" codeSystem="2.16.840.1.113883.5.83" codeSystemName="ObservationInterpretation" displayName="Normal"/>
        </observation>
      </component>
      

      I could of course just have used it as-is and had my REST API return CCD subsets in XML:
      GET /users/<user-id>/patients/<patient-id>/CCD?section=<section>
      

      There are several issues with this:
      • As you can see, XML is much more complex to understand, parse and debug than JSON
      • XML increases bandwidth consumption
      • Browsers and client applications (e.g. on mobile devices) can consume JSON much more efficiently than XML
      For me, the best solution was to have the internal API marshal the CCD into both XML and JSON, so I would not have to unmarshal the CCDs again into POJOs.

      The good news for all of us is that you can use Java tools such as JAXB, which has adapters to support formats other than XML, such as JSON. With Java annotations, this is very easy to implement.
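
      For example, with JAX-RS annotations the same JAXB-annotated POJO can be served as either JSON or XML depending on the Accept header. A minimal sketch (the resource path and the single "subject" field are hypothetical, and it assumes a JSON provider such as the RESTEasy Jackson provider is on the classpath):

      import javax.ws.rs.GET;
      import javax.ws.rs.Path;
      import javax.ws.rs.Produces;
      import javax.xml.bind.annotation.XmlElement;
      import javax.xml.bind.annotation.XmlRootElement;

      @Path("/lab-result")
      public class LabResultResource {

          /** JAXB-annotated payload with a single illustrative field. */
          @XmlRootElement(name = "lab-result")
          public static class LabResult {
              private String subject = "";
              @XmlElement
              public String getSubject() { return subject; }
              public void setSubject(String subject) { this.subject = subject; }
          }

          // The same annotated object is marshalled to XML or JSON
          // depending on the media type negotiated with the client.
          @GET
          @Produces({"application/json", "application/xml"})
          public LabResult getLabResult() {
              LabResult result = new LabResult(); // real code would build this from the CCD
              result.setSubject("...");
              return result;
          }
      }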

      Friday, September 10, 2010

      Spring Dependency Injection with JBOSS : the CLASSPATH issue

      When deploying web archives (WARs) that are configured through Spring dependency injection, you probably want generic applications that do not have to be recompiled every time you deploy them with a new configuration.

      In my current project I need to configure a REST API with various parameters (host name, database paths, maximum number of records per request). For this I use Spring dependency injection, where the parameters are injected at run time via a resource file located outside the WAR file, in a folder specified by the Windows CLASSPATH variable (my testing and production platforms are Windows machines).

      First I need to add a Windows CLASSPATH system variable (in the system properties/environment variables) if it does not already exist. Then I add the resources.xml file directly in the folder specified by CLASSPATH. You can also use a sub-folder, but you will then need to hard-code the folder name in your Spring config file - in my case applicationContext.xml, located in ./src/main/webapp/WEB-INF/

      <beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" 
       xmlns:context="http://www.springframework.org/schema/context"
       xsi:schemaLocation="
              http://www.springframework.org/schema/context 
              http://www.springframework.org/schema/context/spring-context-2.5.xsd
              http://www.springframework.org/schema/beans 
              http://www.springframework.org/schema/beans/spring-beans.xsd">
          <import resource="classpath:/resources.xml" />
      </beans>
      

      My resources.xml looks like this:

      <beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" 
       xmlns:context="http://www.springframework.org/schema/context"
       xsi:schemaLocation="
              http://www.springframework.org/schema/context 
              http://www.springframework.org/schema/context/spring-context-3.0.xsd
              http://www.springframework.org/schema/beans 
              http://www.springframework.org/schema/beans/spring-beans-3.0.xsd">
        <bean id="custService" 
                            scope="prototype"
                            class=".....">
         <property name="hostName" value="121.122.123.124"/>
         <property name="databasePath" value="..."/>
         <property name="maxRecordsPerRequest" value="1000"/>
        </bean>
      </beans>
      
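
      For reference, the injected bean is just a plain Java class whose setters match the property names in resources.xml; a minimal sketch (the class name and property types are assumptions, only the setter names matter to Spring):

      public class CustomerService {

          private String hostName;
          private String databasePath;
          private int maxRecordsPerRequest;

          // Spring calls these setters with the values injected from resources.xml
          public void setHostName(String hostName) { this.hostName = hostName; }
          public void setDatabasePath(String databasePath) { this.databasePath = databasePath; }
          public void setMaxRecordsPerRequest(int maxRecordsPerRequest) {
              this.maxRecordsPerRequest = maxRecordsPerRequest;
          }
      }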

      One issue you might encounter when you try to deploy your application on JBoss is that the application server does not take the CLASSPATH into account out-of-the-box (I am using Red Hat EAP 5.0.x - a production setting), but this might also be the case with the JBoss community edition.

      Your WAR file will probably fail to deploy and you will find a bunch of errors in your log file ./jboss-as/server/<setting>/log/server.log, including:

      org.springframework.beans.factory.parsing.BeanDefinitionParsingException: Configuration problem: 
      Failed to import bean definitions from URL location [classpath:/resources.xml]
      Offending resource: ServletContext resource [/WEB-INF/applicationContext.xml]; nested exception is org.springframework.beans.factory.BeanDefinitionStoreException:
      IOException parsing XML document from class path resource [resources.xml]; 
      nested exception is java.io.FileNotFoundException: 
      class path resource [resources.xml] cannot be opened because it does not exist
      

      What is missing is that you need to tell JBoss about your CLASSPATH variable.
      Just edit ./jboss-as/bin/run.bat, add the CLASSPATH variable as shown below, and you will be up and running in no time.

      :RESTART
      "%JAVA%" %JAVA_OPTS% ^
         -Djava.endorsed.dirs="%JBOSS_ENDORSED_DIRS%" ^
         -classpath "%JBOSS_CLASSPATH%;%CLASSPATH%" ^
         org.jboss.Main -b 0.0.0.0 -c production %*