Friday, March 23, 2012

X-HTTP header method override and REST APIs





In a previous article, I explained how to extract HTTP headers. In this one, I will explain why specific custom HTTP headers such as X-HTTP-Method-Override can be very handy when developing and promoting a REST API.

When deploying REST-based web services, you may encounter access limitations on both the server and client sides.







1) X-HTTP method override and the server side

On the server side, access to your REST-based web application (I am using JBoss) might go through a proxy web server that, for security reasons, does not support certain HTTP operations such as DELETE or PUT.

For example, if you are using Lotus Domino as a redirector for your HTTP requests, be aware that by default Lotus Domino does not allow PUT and DELETE requests to go through.

Such requests will result in an HTTP 405 error:

Http Status Code: 405

Reason: Request method is not allowed by the server

By the way, if you really intend to allow PUT and DELETE on your Domino proxy web server, you need to add the following line to your notes.ini file:

HTTPEnableMethods=PUT,DELETE

However, if you really need to have your proxy server filter out DELETE and PUT operations, then you will have to offer these operations via POST and ask your clients to add a specific X-HTTP method override field to the header of their requests.
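
To make this concrete, here is a minimal client-side sketch (using plain HttpURLConnection; the resource URL is the same example one used in the cURL commands at the end of this post) of what such a tunneled DELETE could look like:

import java.net.HttpURLConnection;
import java.net.URL;

public class MethodOverrideClient {

    public static void main(String[] args) throws Exception {
        // Example resource URL (same one used in the cURL examples below)
        URL url = new URL("http://restservice.acme.com/api/users/JJALGHPBIMA/order/e7687673-479d-4d37-8bc0-a3e718aad33f");
        HttpURLConnection connection = (HttpURLConnection) url.openConnection();

        // The proxy only lets POST through, so we tunnel the DELETE in a header
        connection.setRequestMethod("POST");
        connection.setRequestProperty("X-HTTP-Method-Override", "DELETE");

        System.out.println("Response code: " + connection.getResponseCode());
        connection.disconnect();
    }
}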



2) X-HTTP method override and the client side

The application consuming your web service might also have constraints, for example forms that are restricted to using only GET or POST.

For these reasons, web infrastructure and solution providers have proposed customized HTTP header fields to carry the intended method, such as X-HTTP-Method, X-HTTP-Method-Override and X-Method-Override.
If you want to support all of these variations, I would suggest implementing them as case insensitive in your REST API.

I personally use JBoss RESTEasy to develop a REST API for mobile health care applications, and it was quite straightforward to add this additional level of flexibility to my existing PUT and POST operations.

Note that if you want to use these override headers to trigger a PUT or a DELETE, be aware that RFC 2616 (HTTP 1.1), section 9.1.1, describes the GET operation as safe:

9.1.1 Safe Methods

   Implementors should be aware that the software represents the user in
   their interactions over the Internet, and should be careful to allow
   the user to be aware of any actions they might take which may have an
   unexpected significance to themselves or others.

   In particular, the convention has been established that the GET and
   HEAD methods SHOULD NOT have the significance of taking an action
   other than retrieval. These methods ought to be considered "safe".
   This allows user agents to represent other methods, such as POST, PUT
   and DELETE, in a special way, so that the user is made aware of the
   fact that a possibly unsafe action is being requested.


In other words, if you are tempted to use a GET to simulate a PUT or DELETE, don't do it.
Use a POST instead. Thanks to my colleague Ravi for pointing this out to me!


X-HTTP method override and PUT-based operations

If your PUT operation does not have a corresponding DELETE counterpart,
you could simply use the POST as a PUT without even adding an X-HTTP-Method override.

Both POST and PUT requests will end up being bound to the same method:


 @PUT
 @POST
 @Path("/users/{userId}/order/{orderNum}")
 @Produces("application/json")
 @GZIP
 public Response putPostUserOrder(@Context HttpServletRequest request, 
               @PathParam("userId") String userId,
               @PathParam("orderNum") String orderNum)

If you still need to use an X-HTTP-Method override, you need to filter those specific POST requests:

boolean bProceed = true;
// Only consider the override header when the incoming request is a POST
if (request.getMethod().compareToIgnoreCase("POST") == 0) {
    String methodOverride = getHttpMethodOverride(request);
    if ((methodOverride == null) || (methodOverride.compareToIgnoreCase("PUT") != 0))
        bProceed = false;
}

private String getHttpMethodOverride(HttpServletRequest request) {
    // Check the two custom override headers supported by this API
    String headerValue = request.getHeader("x-http-method-override");
    if (headerValue != null)
        return headerValue;
    else
        return request.getHeader("x-method-override");
}

In the example above, I have only implemented two of the custom override headers, but you can add all of them if necessary.
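
Putting the pieces together, a sketch of the combined method could look like the following (the response handling here is just one possible choice, not the exact implementation of my API):

 @PUT
 @POST
 @Path("/users/{userId}/order/{orderNum}")
 @Produces("application/json")
 @GZIP
 public Response putPostUserOrder(@Context HttpServletRequest request,
               @PathParam("userId") String userId,
               @PathParam("orderNum") String orderNum) {

     // Only let a POST through when it carries a PUT override header
     boolean bProceed = true;
     if (request.getMethod().compareToIgnoreCase("POST") == 0) {
         String methodOverride = getHttpMethodOverride(request);
         if ((methodOverride == null) || (methodOverride.compareToIgnoreCase("PUT") != 0))
             bProceed = false;
     }

     if (!bProceed) {
         // Reject the plain POST with a 405 Method Not Allowed
         return Response.status(405).build();
     }

     // ... create or update the order for this user and build the response
     return Response.ok().build();
 }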



X-HTTP method override and DELETE-based operations

For the DELETE operation, even if you don't have a PUT counterpart, it is preferable to require an X-HTTP method override with the value DELETE, to stay semantically consistent (PUT and POST are typically used to add or modify a resource, not to remove it).

 
 @POST
 @Path("/users/{userId}/order/{orderNum}")
 @Produces("application/json")
 @GZIP
 public Response postDeleteUserOrder(@Context HttpServletRequest request, 
               @PathParam("userId") String userId,
               @PathParam("orderNum") String orderNum)

boolean bDelete = false;
// Only treat the POST as a DELETE when the override header says so
if (request.getMethod().compareToIgnoreCase("POST") == 0) {
    String methodOverride = getHttpMethodOverride(request);
    if ((methodOverride != null) && (methodOverride.compareToIgnoreCase("DELETE") == 0))
        bDelete = true;
}

Of course, you most likely want to keep supporting the regular DELETE as well, so you also keep the original method bound to DELETE:

 @DELETE
 @Path("/users/{userId}/order/{orderNum}")
 @Produces("application/json")
 @GZIP
 public Response deleteUserOrder(@Context HttpServletRequest request, 
               @PathParam("userId") String userId,
               @PathParam("orderNum") String orderNum) 
 
 

Test harnessing X-HTTP method override operations


You can easily test your REST API with cURL as follows:

curl -H "X-HTTP-Method-Override: DELETE" -X POST http://restservice.acme.com/api/users/JJALGHPBIMA/order/e7687673-479d-4d37-8bc0-a3e718aad33f

This should have the same effect as:

curl -X DELETE http://restservice.acme.com/api/users/JJALGHPBIMA/order/e7687673-479d-4d37-8bc0-a3e718aad33f




Friday, February 24, 2012

Maven war plugin configuration section and web.xml




I am sure you have been in the following situation: you want to mavenize a web application project.

For this, you create a basic dynamic web application in Eclipse.


You then import the application you want to build.

After the import is done, you create a pom file.

You then update your Eclipse configuration files via 'mvn clean eclipse:eclipse'.

Finally, you try to build your web application (war file) via 'mvn clean install'.

However, you get a bad surprise. First, you get a BUILD ERROR:

 Error assembling WAR: webxml attribute is required (or pre-existing WEB-INF/web.xml if executing in update mode)

 ... and even though a war file seems to have been created, it only contains empty META-INF, WEB-INF and classes folders.

The problem is that your web.xml file is not picked up by your Maven build, because by default Maven expects to find your web.xml under src/main/webapp/WEB-INF.


One option is to refactor your project and move your web.xml under src/main/webapp/WEB-INF.

A better option is to modify your pom file and add a Maven war plugin configuration section to your build/plugins section.

In my case, I am specifying that my web.xml file is located under a WebContent folder, relative to the Pom file:
   
   <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-war-plugin</artifactId>
      <version>2.1.1</version>
      <!-- <configuration> section added to pick up the WEB-INF/web.xml inside WebContent -->
      <configuration>
         <webResources>
            <resource>
               <directory>WebContent</directory>
            </resource>
         </webResources>
      </configuration>
   </plugin>
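
As an alternative (untested on this particular project layout, so treat it as a sketch), the maven-war-plugin also lets you change the war source directory itself via its warSourceDirectory parameter, pointing the whole build at the Eclipse WebContent folder:

   <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-war-plugin</artifactId>
      <version>2.1.1</version>
      <configuration>
         <!-- Use the Eclipse WebContent folder as the war source directory -->
         <warSourceDirectory>WebContent</warSourceDirectory>
      </configuration>
   </plugin>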


When done, you just need to rebuild your web application (war file) via 'mvn clean install' and you should then obtain a BUILD SUCCESSFUL with the war file content you were expecting!

Friday, December 30, 2011

How to extract HTTP header fields in a REST API



Being able to extract HTTP header fields from incoming requests to a REST API is particularly useful when trying to log the transactions that go through the REST-based service.

In this example, I use the Java-based JBoss RESTEasy framework together with the @Context Java annotation to extract the header information.

One of the REST API operations is /version, which returns the current version of the API in JSON format. The interface operation is defined as follows:

     import javax.ws.rs.core.HttpHeaders;

     /**
     * Get the current version of the REST API.
     */
     @GET
     @Path("/version")
     @Produces("application/json")
     @GZIP
     public Response getVersion(@Context HttpHeaders headers);

The @Context annotation allows you to map request HTTP headers to the method invocation.

One way to safely extract all available header parameters is to loop through the headers. The getRequestHeaders method from javax.ws.rs.core.HttpHeaders returns the values of the HTTP request headers. The returned map is case-insensitive and read-only.

   if (headers != null) {
       for (String header : headers.getRequestHeaders().keySet()) {
          System.out.println("Header:"+header+
                             "Value:"+headers.getRequestHeader(header));
       }
   }

When querying the REST API via a browser:

  http://www.acme.com/api/version

you obtain a minimal set of header fields:

   Header:cache-control Value:[max-age=0]
   Header:connection Value:[keep-alive]
   Header:accept-language Value:[en-us,en;q=0.5]
   Header:host Value:[www.acme.com]
   Header:accept Value:[text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8]
   Header:user-agent Value:[Mozilla/5.0 (Windows NT 6.1; WOW64; rv:8.0)    
   Gecko/20100101 Firefox/8.0]
   Header:accept-encoding Value:[gzip, deflate]
   Header:session-id Value:[d636369c]
   Header:accept-charset Value:[ISO-8859-1,utf-8;q=0.7,*;q=0.7]

To extract a specific header field (e.g. Host), you can use the getRequestHeader method:

 public Response getVersion(@Context HttpHeaders headers) {
     if (headers != null) {
        List<String> hostHeader = headers.getRequestHeader("host");
        if (hostHeader != null) {
           for (String host : hostHeader) {
              LOG.debug("Host:"+host);
           }
        }
     } 
  
     // Get the version
     final Version current_version = new Version();
     final ObjectMapper mapper = new ObjectMapper();
     final JsonNode rootNode = mapper.createObjectNode();
     ((ObjectNode) rootNode).putPOJO(Version.XML_ROOT_ELEMENT, current_version);
     
     final ResponseBuilder builder = Response.ok(rootNode);
     return builder.build();
   }

If you are using a test harness tool like cURL, you can also easily add additional HTTP header fields
(e.g. the email address of the user making the request, or the referer) and test the logging functionality of your REST API:
curl -H "From: user@example.com" http://www.acme.com/api/version
curl -H "Referer: http://consumer.service.acme.com" http://www.acme.com/api/version
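
On the server side, a minimal sketch of the corresponding logging (reusing the getVersion signature and logger from the examples above) could look like this:

 public Response getVersion(@Context HttpHeaders headers) {
     if (headers != null) {
         // Log the From and Referer fields sent by the test client, if present
         List<String> fromHeader = headers.getRequestHeader("from");
         if (fromHeader != null) {
             for (String from : fromHeader) {
                 LOG.debug("From:" + from);
             }
         }
         List<String> refererHeader = headers.getRequestHeader("referer");
         if (refererHeader != null) {
             for (String referer : refererHeader) {
                 LOG.debug("Referer:" + referer);
             }
         }
     }
     // ... build and return the version response as shown above
     return Response.ok().build();
 }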

The headers can also be obtained through the HttpServletRequest object.
This can be done using the @Context HttpServletRequest annotation and can provide additional information about the incoming request:

public Response getVersion(@Context HttpServletRequest request) {
   LOG.debug("Host:"+request.getHeader("host"));
   LOG.debug("Request-URL:"+request.getRequestURL());
   ...
} 

Since HttpServletRequest extends ServletRequest, you also get useful methods providing information on the Internet Protocol (IP) address, host and port of the client or last proxy that initiated the request.

public Response getVersion(@Context HttpServletRequest request) {
   LOG.debug("Remote-IP:"+request.getRemoteAddr());
   LOG.debug("Remote-Host:"+request.getRemoteHost());
   LOG.debug("Remote-Port:"+request.getRemotePort());
   ...
} 
 

Monday, November 28, 2011

cURL setup and OpenSSL


In a previous post I explained how to use cURL to quickly test REST web services. If your web service has to be secure, you are probably using Transport Layer Security (TLS), the successor to the Secure Sockets Layer (SSL) cryptographic protocol.

To be able to use TLS/SSL with cURL, you will need to have some SSL libraries/DLLs such as OpenSSL installed. If you did not install OpenSSL, you will most likely get the following error when trying to use cURL:


"The program can't start because LIBEAY32.dll is missing from your computer. Try to reinstall the program to fix this problem."

To fix this, you can either install OpenSSL on your machine, or if you just want to install the minimum, simply copy the following DLLs from OpenSSL (I am using OpenSSL 1.0.0c) to your cURL folder:
  • libeay32.dll
  • ssleay32.dll
By the way, cURL can be downloaded for most operating systems and platforms, and SSL and non-SSL versions are usually available for each OS.


Friday, October 28, 2011

Combining IHE transactions : Orchestration or Choreography?



Integrating the Healthcare Enterprise (IHE) has gained tremendous momentum in the past few years. Started as a Healthcare Information and Management Systems Society (HIMSS) and Radiological Society of North America (RSNA) workshop in October 1998 with only 15 participants, including AGFA, Cerner, Fuji, GE, HP, Philips and Siemens, the IHE initiative now has more than 400 members worldwide. It is composed of healthcare professional associations, government agencies, Health Information Exchanges (HIE), healthcare providers, IT and consulting companies, and trade, educational, standards and research organizations. IHE provides a standards-based interoperable framework to share and exchange information between healthcare organizations across networks.

Combined with the latest technology and well-established standards (HL7, DICOM, ICD-9/10, LOINC, W3C), clinical data can then be securely and privately accessed and transmitted locally between network endpoints (e.g. within the same hospital, between the practices and a lab). IHE profiles can also be used across Health Information Exchanges (HIE) of Regional Health Information Organizations (RHIO), at a state level (e.g. an individual state in the US, Canada or Europe), or at the federal level (e.g. the US Nationwide Health Information Network, or NwHIN). As a result, there is a strong need to integrate and combine individual IHE profile end-points to form a “hub of hubs” or “network of networks” supporting health information exchange between the participating entities.


Because IHE transactions are most likely to be offered as web services, combining those transactions can be done following Service Oriented Architecture (SOA) and Service Oriented Computing (SOC) principles.

Conventional middleware distributed system infrastructures (e.g. JMS) are generally not sufficient or flexible enough to mediate, transform, federate and route messages from and to web services.

Enterprise Application Integration (EAI) goes one step further, trying to separate the applications from the web service end-points. EAI usually employs a centralized service broker for this, along with a set of connectors and an independent data model. Services can then publish messages to the broker and subscribe to receive messages from it. However, this very centralized approach requires a large amount of up-front development and business process design for the connectors, as well as high maintenance costs in general. An Enterprise Service Bus (ESB) is an infrastructure that leverages EAI principles.


Orchestration

Like EAI, orchestration uses a centralized approach. Web services orchestration is realized through the Business Process Execution Language (BPEL), which describes the collaboration and interaction between the web service participants.

Business workflows, states, actions, events, control flows and exception handling can be specified. Messages can be received and sent directly from and to WSDL ports. Results received asynchronously from web services can be combined to create new messages. Usually BPEL workflows are created and updated with visual design tools.

There is often an overlap between orchestration engines and Enterprise Services Buses (ESB). Vendors now also offer BPEL design on top of more generic EAI mechanisms enabling true orchestration for ESBs.


Choreography

Choreography is more distributed and collaborative in nature and uses the Web Service Choreography Interface (WSCI) specification and the WSDL description files to represent the flow of messages exchanged between the Web services involved.

Choreography seems more flexible than orchestration, since it does not rely on a central element that could become a bottleneck, and it seems to offer more complex interaction potential between web services. However, choreography has some drawbacks, including the necessity for all web services to be aware of the overall business process workflow. WSCI itself does not specify how the message exchange mechanism should be defined and implemented.

In addition to this, performance can be an issue if high-volume message transactions between the endpoint peers are not handled properly.

Moreover, there is no clear responsibility for the overall workflow, which can lead to legal issues related to monitoring and maintenance.


Orchestration vs Choreography

Orchestration also has the advantage of being a much more mature integration technology than choreography. For these reasons, most of the state-wide health information exchange networks in the US employ a Service-Oriented Architecture (SOA) model that is implemented through Enterprise Service Bus (ESB) orchestration.

In addition to this, web service orchestration offers much more than just technical benefits:
  • Organizational: standardization, narrow gap between business analysts and developers;
  • Managerial: risk reduction, lower costs, more flexibility;
  • Strategic: IT resilience, delivery time reduction, less technology lock-in;
  • Technical: portability, reuse, interoperability of tools, less complex code, better maintainability;
  • Operational: efficiency, automation, higher level tasks management.
When deployed on high performance platforms such as SOA software appliances, orchestration solutions are easy to test, extend and maintain.

  • More on this topic to be published as part of the following paper: Andry F., Wan L., HEALTH INFORMATION EXCHANGE NETWORK INTEROPERABILITY THROUGH IHE TRANSACTIONS ORCHESTRATION, 5th International Conference on Health Informatics (HEALTHINF 2012).

Friday, September 23, 2011

cURL tests harness and TLS



In a previous post, I explained how to use cURL to test harness REST-based web services. One thing I did not describe was how to add Transport Layer Security (TLS) to your tests. In other words, how do you successfully test REST web services over HTTPS?

The cURL manual describes a number of options that can be used. One of the most convenient options is -k or --insecure. It allows curl to perform "insecure" SSL connections and transfers; normally, all SSL connections are attempted to be made secure by using the CA certificate bundle installed by default. So if your SSL connection does not require client-side authentication, it is a very quick way to test your web service over SSL:

  curl -k https://www.acme.com/api/version


Another useful option can be used if, in conjunction with SSL, you need to compress your payload via GZIP, for example to optimize the transfer of large messages. In this case, you will use the -H or --header option, which lets you specify custom headers for your request:

  curl -H "Accept-Encoding: gzip,deflate" -k https://www.acme.com/api/users/list


Notice that in these examples the -k or --insecure option is placed just in front of the URL.

Of course, other headers such as the ones described previously can be combined:

  curl -c ./cookies.txt --data-binary @login_password.json -H "Content-Type: application/json" -H "Accept-Encoding: gzip,deflate" -k https://www.acme.com/api/users/token







Wednesday, August 31, 2011

Medication Adherence - How technology can help?



Looking for ideas to reduce the rise of health care costs? Easy! Ask patients to swallow their medication!

According to recent studies, the lack of prescription medication adherence costs between $250 and $300 billion annually, including $100 billion in hospitalizations. For example, it is estimated that in the US 89,000 deaths annually are due to non-adherence to anti-hypertensive treatments. This problem is particularly acute for patients with coexisting conditions who take a variety of medications, sometimes prescribed by different physicians.

Obviously, doctors and pharmacists have a critical role to play in encouraging patients and caregivers to administer medication correctly and rigorously.

Cost is not necessarily the main factor: even when drugs are free, the adherence rate is 60% in the US and only 50% in developed countries. In other words, improvements in co-payment structures will only partially improve adherence. Another possible area of improvement is the use of a fee-for-service model.

Another way to improve medication adherence is the proper use of specific technologies. Here are some suggestions:
  • the integration of practice EMR systems and pharmacies via HIE solutions (a lot of practices are still using paper);
  • online incentive programs (e.g. provided by employers) to promote prevention, quality of life and best practices in therapy. It is recognized that financial incentives and other rewards, when well designed and targeted, can improve adherence (smaller, more frequent rewards are usually more effective than big, sporadic ones);
  • advances in personalized medicine that can help tailor medication intake and increase medication adherence.

With the amount of money at stake, I am sure there are a large number of companies, including start-ups, working hard on this problem. So stay tuned!

Finally, we have to keep in mind that other factors, including lifestyle, psychological issues and health literacy, play an important role as well. And these are not negligible!