Friday, April 16, 2010

SOA and Health Care Meaningful Use requirements of the Recovery Act


The Health Information Technology for Economic and Clinical Health (HITECH) Act was passed by Congress in February 2009 as part of the Recovery Act, and the Department of Health and Human Services has since issued an Interim Final Rule to implement it.  Under this act, eligible providers will be given financial rewards if they demonstrate "meaningful use" of "certified" Electronic Health Record (EHR) technologies.

Therefore there is a big incentive for health care vendors to offer solutions that meet the criteria described in the law.  More precisely, the associated regulation provided by the Department of Health and Human Services describes the set of standards, implementation specifications, and certification criteria for Electronic Health Record (EHR) technology.


As a Software Architect, I was curious to see whether Service Oriented Architecture (SOA) or Web Services in general were mentioned in these documents.

The definition of an EHR Module includes an open list of services such as electronic health information exchange, clinical decision support, public health and health authorities information queries, quality measure reporting etc.

In the transport standards section, both SOAP and RESTful Web services protocols are described. However, Service Oriented Architecture (SOA) is never explicitly described or cited, and there is no reference to how these services might be discovered and orchestrated in a "meaningful" way. I would assume the reason is that lawmakers and regulators wanted to stay as vague as possible about the underlying technologies for an EHR and its components.

The technical aspect of "meaningful use" is specified more precisely when associated with interoperability, functionality, utility, data confidentiality and integrity, and the security of the health information system in general.

These characteristics are not necessarily specific to SOA, but to any good health care software and solution design.

Still, the following paragraph seems to describe a solution that could best be implemented using a Service Oriented Architecture: "As another example, a subscription to an application service provider (ASP) for electronic prescribing could be an EHR Module", where software is offered as a service (SaaS).  This looks more like the description of an emerging SOA than of a full grid-enabled SOA.

It will be up to the solution providers to come up with relevant products and tools to maximize the return on investment (ROI) for taxpayers and for the professionals and organizations eligible for ARRA/HITECH incentives.

SOA will definitely be part of the mix since it gives the ability to create, offer and maintain large numbers of complex EHR software solutions (SaaS) with a high level of modularization and interoperability.
 
Further developments toward a complete SOA stack such as offering a Platform as a Service (PaaS) and even the underlying Infrastructure as a Service (IaaS) in the cloud will face more resistance in a domain known for a lot of legacy systems and concerns about privacy and security.

The Object Management Group (OMG) is organizing a conference this summer on the topic of  "SOA in Healthcare: Improving Health through Technology: The role of SOA on the path to meaningful use". It will be interesting to see what healthcare providers, payers, public health organizations and solution providers from both the public and private sector will have to say on this topic.

Wednesday, March 31, 2010

Cloud Computing and Health Care Applications: a change in opinions?

I have designed and implemented health care applications for more than 3 years, and I have experienced a dramatic change in opinions toward the use of Cloud Computing for Health IT.

Several years ago, the idea of having on demand resources offered as a service, used to process or store Health Care related data, was out of the question.  The main concerns were the security, privacy and confidentiality of the data; the reliability and ease of use of the underlying systems and platforms.

Health care solution providers did not hesitate to require tens of thousands of dollars of hardware just to deploy a minimal configuration of a multi-tier, web-based EHR or PHR application. In fact, some players were barely starting to virtualize their platforms.

One of the requirements to comply with the Health Insurance Portability and Accountability Act of 1996 (HIPAA) regulation is that the transmission of patients' protected health information (PHI) over open networks must be encrypted.

These issues have recently been addressed: companies offering virtual infrastructure as a service, such as Amazon with EC2, support 256-bit AES encryption for files containing PHI, as well as token or key-based authentication and sophisticated firewall configurations for their virtual servers. Encryption is also available when storing the data on Amazon S3. Access to Amazon S3 from the internet or from EC2 is done via encrypted SSL endpoints, which ensures that PHI stays protected in transit. AWS indeed describes several cloud-based healthcare applications in its case studies, including MedCommons (a health records service provider that gives end users the ability to store, among other medical information, CCR and DICOM documents).
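To make the encryption requirement more concrete, here is a minimal sketch, using only the standard javax.crypto API, of how a file containing PHI could be encrypted with 256-bit AES before being pushed to a storage service such as S3. The class and method names are illustrative, not tied to any AWS SDK, and key management (arguably the hardest part) is deliberately left out.

```java
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import javax.crypto.spec.IvParameterSpec;
import java.security.SecureRandom;

// Minimal sketch: encrypt a PHI payload with 256-bit AES before uploading it.
// Where the key lives and who can read it is the real design problem and is not shown here.
public class PhiEncryptionSketch {

    public static byte[] encrypt(byte[] phi, SecretKey key, byte[] iv) throws Exception {
        Cipher cipher = Cipher.getInstance("AES/CBC/PKCS5Padding");
        cipher.init(Cipher.ENCRYPT_MODE, key, new IvParameterSpec(iv));
        return cipher.doFinal(phi);
    }

    public static void main(String[] args) throws Exception {
        KeyGenerator keyGen = KeyGenerator.getInstance("AES");
        keyGen.init(256); // 256-bit keys may require the JCE unlimited strength policy files on older JVMs
        SecretKey key = keyGen.generateKey();

        byte[] iv = new byte[16];
        new SecureRandom().nextBytes(iv); // random initialization vector, stored alongside the ciphertext

        byte[] cipherText = encrypt("CCR document containing PHI...".getBytes("UTF-8"), key, iv);
        System.out.println("Encrypted " + cipherText.length + " bytes, ready to upload");
    }
}
```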

Cloud infrastructure providers such as Amazon Web Services (AWS) ensure that their administrators or third-party partners cannot access the underlying PHI data. Strong security policies, access consent processes, as well as monitoring and audit capabilities are available to dramatically reduce the risk of unauthorized access. In addition, these providers offer highly available solutions for automated back-ups and disaster recovery, which makes them more attractive than traditional solutions. Some providers also ensure that the data in question stays within the borders of specific regions, states or countries to comply with the regulations in place.

In fact it is very interesting these days to see health care becoming a showcase of the benefits of Cloud computing. Last month, at the San Francisco Bay Area ACM chapter presentation on cloud computing, I was surprised to see that the first cloud application example mentioned was TC3. The numbers were indeed very convincing: when faced with a sudden increase in insurance claims processing (from 1 million to 100 million per day in a very short time), TC3 had the option of a traditional solution consisting of $750K of new hardware plus $30K of maintenance and hosting per month, or an Amazon Web Services cloud solution for $600 per month. The decision was easy, I suppose!

Friday, February 26, 2010

MapReduce an opportunity for Health and BioSciences Applications?

HealthCare and BioScience software products and solutions have embraced Database Management Systems (DBMS) for their back-end storage and processing for years, like most other domains where performance, scalability, security, extensibility, auditing capabilities and maintenance are critical.

In the past few years, alternative or complementary technologies such as MapReduce and Hive have emerged, originally created to meet the needs of extremely high volume web applications at companies such as Google, Facebook or LinkedIn. A lot of people, especially engineers, are now wondering whether these technologies could be used in HealthCare and BioSciences.

More and more job openings outside the Social Networks or SEO sphere now mention MapReduce and Hadoop in their required or "nice to have" skills, including at HealthCare and BioScience companies. In fact, recently at a talk of the Bay Area Chapter of the ACM on Hadoop and Hive, even though the talk was quite technical, there were a few venture capitalists in the crowd who were checking whether the topic was only hype or could potentially bring big ROI. Healthcare and biotechnologies were definitely on their minds.

Why then would the MapReduce paradigm be a good candidate to provide the "next quantum leap" for HealthCare and BioSciences?

In HealthCare, as more and more users, patients and professionals upload data to applications such as PHRs and EMRs, there is a need to parse, clean and reconcile extremely large amounts of data that might initially be stored in log files. Medical observations from patients with chronic diseases, such as blood pressure or blood glucose readings, are good candidates for this, especially when they are uploaded automatically from medical devices. Also, the aggregation of data coming from potentially large numbers of sources makes the problem more suitable for a map-and-reduce processing paradigm than for DBMS-based data mining tasks.
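As a rough illustration of the idea, here is a sketch of a Hadoop job (written against the 0.20 "mapreduce" API) that averages systolic blood pressure readings per patient from uploaded observation logs. The input format and field layout are assumptions made for the example, and the job driver and input/output configuration are omitted.

```java
import java.io.IOException;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;

// Sketch of a Hadoop job averaging systolic blood pressure readings per patient.
// Assumed input: one observation per log line, formatted as "patientId,timestamp,systolic".
public class BloodPressureAverage {

    public static class ObservationMapper
            extends Mapper<LongWritable, Text, Text, IntWritable> {
        @Override
        protected void map(LongWritable offset, Text line, Context context)
                throws IOException, InterruptedException {
            String[] fields = line.toString().split(",");
            if (fields.length == 3) {
                // emit (patientId, systolic reading)
                context.write(new Text(fields[0]), new IntWritable(Integer.parseInt(fields[2])));
            }
        }
    }

    public static class AverageReducer
            extends Reducer<Text, IntWritable, Text, Text> {
        @Override
        protected void reduce(Text patientId, Iterable<IntWritable> readings, Context context)
                throws IOException, InterruptedException {
            long sum = 0;
            long count = 0;
            for (IntWritable reading : readings) {
                sum += reading.get();
                count++;
            }
            // emit (patientId, average systolic pressure)
            context.write(patientId, new Text(String.valueOf(sum / (double) count)));
        }
    }
}
```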

HealthCare decision makers might be hesitant to use these new technologies as long as they have concerns related to security, confidentiality and certification against standards such as HL7 (see CCHIT and HITSP). However, with the overall reforms in progress in HealthCare, it will be interesting to see whether MapReduce becomes part of the technical package, for the benefit of not only patients and caregivers but of all healthcare actors, including payers and various service providers.

BioSciences (drug discovery, meta-genomics, bioassay activities ...) are also a good candidate for MapReduce. In addition to the fact that BioScience applications also deal with large amounts of data (e.g. biological sequences such as DNA, RNA and proteins), a lot of that data is semi-structured, semantically rich and most likely better represented with an RDF data model than with a set of database tables (e.g. see "Storage and Retrieval of Large RDF Graph Using Hadoop and MapReduce"). Even though databases have made progress in storing and processing XML, MapReduce is more suitable for very fast processing and aggregation of large numbers of key-value elements.

Another element is price and return on investment (ROI): especially for startups, the implementation of MapReduce over a cloud-based infrastructure using open source frameworks such as Hadoop and Hive can be an attractive economic proposition for a CTO.

Both fields can also take advantage of other applications of MapReduce in areas that are less hard-core technology, such as brand management, sales and supply chain optimization, which have been used with success in other domains.




Thursday, January 28, 2010

Cloudera & Facebook on Hadoop and Hive



This week I attended a very interesting meeting of the San Francisco Bay Area Chapter of the ACM on the topics of Hadoop and HIVE. I was not the only one interested in MapReduce-related projects, since the meeting, nicely hosted by LinkedIn at their Mountain View office, had more than 250 attendees.






Dr. Amr Awadallah from Cloudera gave a very good introduction to Hadoop, since a lot of attendees were not very familiar with this open source Java implementation of MapReduce. It is interesting to mention that the Desktop product offered by Cloudera is free. Amr explained that Cloudera's business model is to offer professional services, training and specific for-fee features outside the core of the main product.


Cloudera's web site has a lot of good training material on Hadoop and MapReduce. Amr mentioned, for example, that Hadoop is used at LinkedIn to create and store the "People you may know" recommendations on the fly, whereas the profile information is managed by a more traditional RDBMS data store.


There were a couple of questions related to the behavior of Hadoop on top of full virtualization products such as those offered by VMware. Amr's answer was first to compare platform virtualization with the parallelism involved in MapReduce/Hadoop. In a way, the goal of the former architecture is to have multiple virtual machines running on the same hardware (e.g. a large mainframe or blade boxes), whereas the goal of the latter is to have an initial processing and storage job done at the same time on multiple cheap, commodity two-rack-unit (2U) "pizza box" servers. So in a way these architectures are complete opposites. Of course, it is not fair to compare the full virtualization of complete operating systems such as Windows or Linux with the management of basic map and reduce operations, even though they have common characteristics (a file system and some processing capabilities).


However, some people do use clusters of VMware images to run Hadoop MapReduce tasks, and the question is: is it efficient? The answer lies in the way network performance, and I/O in general, is handled by both the images and the Hadoop jobs.


There was also an interesting question about the fact that Google has several patents on MapReduce and whether this might be an obstacle to the development of open source products on top of Hadoop. Amr did not seem really worried about this.

The second presentation was from Ashish Thusoo from Facebook. He shared some interesting numbers and statistics about the volume of data processed every day by Facebook (e.g. already 200GB/day in March 2008). Ashish pointed out that it was more interesting for Facebook to have simple algorithms running on large amounts of data than complex data mining algorithms running on small volumes. The benefits were greater, and the company was learning much more about its users' behaviors and profiles. It was back in 2008 that Facebook started to experiment with MapReduce and Hadoop as an alternative to very expensive existing data mining solutions. One of the issues with Hadoop was the complexity of development and the lack of those skills among its teams. This is why Facebook started to look at ways to wrap Hadoop in a friendlier, more SQL-like layer. The result is HIVE, which is now open source, although Facebook keeps some proprietary components, especially on the UI side.
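To give a flavor of what this SQL-like layer buys you, here is a minimal sketch of querying Hive through its JDBC driver instead of writing a raw MapReduce job. The host, port, table and column names are hypothetical, and the driver class name and URL form may differ between Hive versions.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

// Minimal sketch of querying Hive through JDBC so that a SQL-savvy analyst
// does not have to write raw MapReduce code. Table and column names are made up.
public class HiveQuerySketch {
    public static void main(String[] args) throws Exception {
        // Driver class and URL form depend on the Hive version in use
        Class.forName("org.apache.hadoop.hive.jdbc.HiveDriver");
        Connection conn = DriverManager.getConnection("jdbc:hive://localhost:10000/default", "", "");
        Statement stmt = conn.createStatement();

        // Hive compiles this SQL-like query into one or more MapReduce jobs behind the scenes
        ResultSet rs = stmt.executeQuery(
            "SELECT page, COUNT(1) AS views FROM page_view_log GROUP BY page");
        while (rs.next()) {
            System.out.println(rs.getString(1) + "\t" + rs.getLong(2));
        }
        conn.close();
    }
}
```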

There were some good questions about data skew issues with Hive and Hadoop, as well as comparisons between HIVE and ASTER. Like Amr did with virtualization and Hadoop, Ashish tried to contrast the two approaches in simple terms: in a way, ASTER is MapReduce applied on top of an RDBMS layer, whereas HIVE is an RDBMS layer running on top of MapReduce.

Both presentations:
  • Hadoop: Distributed Data Processing (Amr Awadallah)
  • Facebook’s Petabyte Scale Data Warehouse (Ashish Thusoo)
are available as PDF files on the ACM web site.


Wednesday, December 30, 2009

A Portal Framework for HealthCare


Portals are Web-based applications that give users a centralized point of access for information and applications of relevance. Therefore the portal paradigm is an attractive proposition for health care because it offers a solution to rapidly aggregate heterogeneous applications and services while offering a high level of customization and personalization to the users, patients, care givers and IT personnel.

The integration of healthcare systems and data is a major challenge. Business conditions that typically result in fragmented data stores and limited application functionality are prominent in the healthcare industry.

To meet these challenges, we have created a Portal framework architecture which makes the SOA concept less abstract by offering a concrete service aggregation infrastructure, including integration glue such as context and code mapping, transformations, a master patient index, single sign-on and standards-based interfaces. The framework facilitates the integration of various applications, so they need not be rewritten to be able to provide services to the portal. Our portal framework is compliant with industry standards such as JSR 168, JSR 286 and WSRP.
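For readers less familiar with the portlet standards, the sketch below shows roughly what a minimal JSR 168/286 portlet surfacing an aggregated service could look like. It is not code from the framework itself: the patient summary rendering and the patientId parameter are placeholders, and a real portlet would delegate to the framework's portlet proxies and shared context.

```java
import java.io.IOException;
import java.io.PrintWriter;
import javax.portlet.GenericPortlet;
import javax.portlet.PortletException;
import javax.portlet.RenderRequest;
import javax.portlet.RenderResponse;

// Minimal sketch of a portlet that could surface an aggregated service
// (e.g. a patient summary) inside the portal page.
public class PatientSummaryPortlet extends GenericPortlet {

    @Override
    protected void doView(RenderRequest request, RenderResponse response)
            throws PortletException, IOException {
        response.setContentType("text/html");
        PrintWriter out = response.getWriter();

        // Hypothetical: in the real framework the patient identifier comes from the shared context
        String patientId = request.getParameter("patientId");

        out.println("<div class='patient-summary'>");
        out.println("Summary for patient " + (patientId != null ? patientId : "(none selected)"));
        out.println("</div>");
    }
}
```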



In addition to the front-end aggregation layer, a context management layer which uses a subset of the concepts of the HL7 Clinical Context Object Workgroup (CCOW) standard (centralized scheme, robust push model, simplified context data representation) is used to solve user mapping and to facilitate the coordination and synchronization between visual components (portlets in our case). This context management layer connects to the Web services (SOAP or RESTful) that are exposed by the different systems.







Sessions and Contexts


A portal application, like any other web application, works with a session. All requests are executed in the context of such a session. The session is associated with an authentication context and a lot of other information that is accumulated while processing the requests executed within the session. A session can be understood as a temporary storage with a well-defined life cycle. A session is ended either explicitly (log out, connection closing) or by a time-out.



The basic relationship and mechanism between sessions, identity and context is described as follows: when accessing the web application for the first time, there is no session established yet. The user is forced to log in (providing his identity and the credentials to prove it). This establishes an authentication context which is kept within a dedicated session. During the requests executed in a session, information is accumulated and processed in the session.
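The following sketch illustrates that session / authentication-context relationship with a plain servlet. It is a simplification for the sake of the explanation: in the portal, authentication is handled by the container and by the Security Token Service described below, and the authenticate() helper here is purely hypothetical.

```java
import java.io.IOException;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import javax.servlet.http.HttpSession;

// Minimal sketch of establishing an authentication context inside a session.
public class LoginServlet extends HttpServlet {

    @Override
    protected void doPost(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        String user = request.getParameter("user");
        String password = request.getParameter("password");

        if (authenticate(user, password)) {
            // Creates the session if none exists yet and attaches the authentication context to it
            HttpSession session = request.getSession(true);
            session.setAttribute("authContext", user);
            session.setMaxInactiveInterval(30 * 60); // ended by time-out after 30 minutes of inactivity
            response.sendRedirect("portal");
        } else {
            response.sendError(HttpServletResponse.SC_UNAUTHORIZED);
        }
    }

    private boolean authenticate(String user, String password) {
        // Placeholder: a real implementation delegates to the identity store
        return user != null && password != null;
    }
}
```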





Connecting the Services
Both the portal application (A) and the remote system (B) may have their own identity management capabilities and their own credential storage. In order to integrate A and B, we have implemented an extended SAML-based token service. The resulting Security Token Service (STS) includes the token service module as well as an eHF-based context management module. This eHF Context Module stores the mapping information between user identifiers from A and the identities of B.
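The sketch below is an illustrative rendering (not eHF's actual API) of the mapping role played by that context module: given a user known to portal A, it resolves the identity that remote system B expects, so that the STS can issue a token B will accept.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Illustrative sketch of the identity mapping between the portal (A) and a remote system (B).
public class IdentityMappingStore {

    private final Map<String, String> portalToRemote = new ConcurrentHashMap<String, String>();

    public void register(String portalUserId, String remoteIdentity) {
        portalToRemote.put(portalUserId, remoteIdentity);
    }

    public String resolveRemoteIdentity(String portalUserId) {
        String remote = portalToRemote.get(portalUserId);
        if (remote == null) {
            throw new IllegalStateException("No identity mapping for portal user " + portalUserId);
        }
        return remote; // the STS issues a token for this identity
    }
}
```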








More complex scenarios


In reality, portal applications typically consist of multiple portlets that interact together, and each portlet can itself aggregate services from various sources. This is where the portlet proxy is very handy, because it shields the presentation layer from back-end service implementation details.

The integration of a new application exposing web services (SOAP or RESTful) is made easier because eHF provides a mediation and routing platform component (IPF), based on Apache Camel, that can wrap these services, apply transformations to the data and expose them to the portlet proxies. In addition to this, the current use of the Security Token Service for authentication can be complemented by the use of a Single Sign-On (SSO) mechanism.
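As an illustration of that mediation layer, here is a sketch of an Apache Camel route that exposes an endpoint to a portlet proxy, applies a transformation, and forwards the request to a back-end web service. The endpoint URIs and the transformer bean are hypothetical, and a real IPF route would add the healthcare-specific processing mentioned above.

```java
import org.apache.camel.builder.RouteBuilder;

// Sketch of a Camel route in the spirit of the mediation/routing layer described above.
public class PortalMediationRoute extends RouteBuilder {

    @Override
    public void configure() throws Exception {
        from("jetty:http://0.0.0.0:8181/portal/labResults")  // endpoint exposed to the portlet proxy
            .to("bean:labResultTransformer")                 // map the portal request to the back-end format
            .to("http://backend.example.org/lab/results");   // call the wrapped back-end web service
    }
}
```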

For this specific implementation we used Liferay 5.2 as the portal server container and a medicine cabinet as the healthcare-related topic and material.

More details can be found in the paper "A Portal Framework Architecture for Building Healthcare Web-Based Applications" published at the 3rd International Conference on Health Informatics (Health Inf 2010).

Wednesday, November 25, 2009

Context Management, CCOW & HealthCare


What is Context Management?

Context Management is a dynamic computer process that uses 'subjects' of data in one application, to point to data resident in a separate application also containing the same subject.
Context Management allows users to choose a subject once in one application, and have all other applications containing information on that same subject 'tune' to the data they contain, thus obviating the need to redundantly select the subject in the varying applications.
In the healthcare industry, where context management is widely used, multiple applications operating "in context" through the use of a context manager allow a user to select a patient (i.e., the subject) in one application; when the user enters another application, that patient's information is already pre-fetched and presented, obviating the need to re-select the patient in the second application.
In other words it enables clinicians to select a patient's name once in an application and have their screen automatically populate with links to that patient in other applications.
  • Context management is especially used in Patient Information Aggregation Platforms (PIAP) such as Portals.
  • Context Management can be utilized for both CCOW and non-CCOW compliant applications.

What is CCOW?

Context Management is gaining in prominence in healthcare thanks to the HL7 Clinical Context Object Workgroup (CCOW) standards committee, which has created a standardized protocol enabling applications to function in a 'context aware' state.
The CCOW standard exists to facilitate a more robust, and near "plug-and-play" interoperability across disparate applications.
The Health Level Seven Context Management Standard (CMS) defines a means for the automatic coordination and synchronization of disparate healthcare applications that co-reside on the same clinical desktop.

The clinical context is comprised of a set of clinical context subjects. Each subject represents a real-world entity, such as a particular patient, or concept, such as a specific encounter with a patient.
By sharing context, applications are able to work together to follow the user's thoughts and actions as they interact with a set of applications. These applications are said to be "clinically linked."



The CMS is extremely prescriptive, but as it is only a standard it can only go so far in terms of guiding how applications are actually designed and implemented. Variability among the decisions that application developers make can lead to various amounts of confusion for users of multiple independently-developed CCOW-compliant applications.



  • HL7 CCOW Context Management Specification acronyms:
    • CMA: Technology and Subject-Independent Component Architecture
    • SDD: Subject Data Definitions
    • UIS: User Interface (Microsoft Windows and Web Browsers)
    • ATM: Component Technology Mapping (ActiveX)
    • WTM: Component Technology Mapping (Web)
CCOW - Context Management Architecture (CMA)

At the most abstract level, the Context Management Architecture (CMA) provides a way for independent applications to share data that describe a common clinical context. To do so, the CMA must provide solutions for the following problems:
  • What is the general use model for a common context, from the user's perspective?
  • Where does the responsibility for context management reside?
  • How are changes to context data detected by applications?
  • How is context data organized and represented so that it can be uniformly understood by applications?
  • How is context data accessed by applications?
  • How is the meaning of context data consistently interpreted by applications?
  • CMA characteristics:
    • Centralized scheme: The responsibility for managing the common context is centralized in a common facility that is responsible for coordinating the sharing of the context among the applications.
      • The consequence of the service being a single point of failure is offset by the fact that the service and the applications it serves are typically co-resident on the same personal computer.
      • The consequence of the service being a performance bottleneck is offset by the fact that the applications are far more likely to become the performance bottlenecks
    • Robust push-model: This is a push model that deals with synchronization and partial failure issues.
    • Context Data Representation uses name-value pairs (see the sketch after this list):
      • A set of name-value pairs represent only key summary information about the common context (e.g., just the patient's name and medical record number).
      • The symbolic name for an item describes its meaning.
      • The data types for the items come from a set of simple primitive data types
    • CMA maintains a single authentic copy of the common context for each common context system.
      • Applications can choose to cache context data or they can simply access the authentic copy whenever they need to.
      • Applications can also selectively read or write specific context data name-value pairs.
      • When the context changes, an application is only informed about the change and is not provided with the data that has changed.
      • The application can selectively access this data when it needs to.
    • Context Data Interpretation
      • Standard HL7 CMA subjects and associated context data items includes the core subjects of patient, encounter, observation, user, and certificate, and their respective context data items.
      • Organizations, such as healthcare provider institutions and vendors, may define their own context subjects and data items. These items are in addition to the standard subjects and the standard items defined for the standard subjects.
      • Context item names are case insensitive.
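The sketch below (referenced in the Context Data Representation item above) shows what such a name-value context could look like in code. The item names follow the Subject.Role.Name pattern used by the standard, but the exact names and values are made up for the example.

```java
import java.util.Map;
import java.util.TreeMap;

// Illustrative sketch of the name-value representation of a shared clinical context.
public class ClinicalContextExample {
    public static void main(String[] args) {
        // Item names are case insensitive, as required by the specification
        Map<String, Object> context = new TreeMap<String, Object>(String.CASE_INSENSITIVE_ORDER);
        context.put("Patient.Id.MRN.GeneralHospital", "12345-67");   // hypothetical item names and values
        context.put("Patient.Co.PatientName", "DOE^JANE");
        context.put("User.Id.Logon.GeneralHospital", "jsmith");

        // A participating application selectively reads only the items it cares about
        System.out.println("Current patient: " + context.get("patient.co.patientname"));
    }
}
```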
Existing solutions using Context Management
  • Fusionfx from Carefx
    • A Context Manager function is responsible for establishing the links among the applications, which serve as Context Participants.
    • Context Participants synchronize after querying the Context Manager to determine the current context and when to update the context.
    • Context Management also supports Mapping Agents, which map equivalent identifiers when the context is updated so that all participating applications can interoperate.
    • Fusionfx includes JSR-168 based front-end viewers (java portlets) running on IBM Websphere portal solution.
  • Vergence from Sentillion
    • Vergence Wizard (April 2009), a tool to configure Single Sign-On (SSO) and Context Management Application Interfaces
      • Fast single sign-on and managing expired passwords
      • Graceful termination of applications at sign-off
      • Support for single sign-on and patient context management use cases
      • Point and click interface to select application controls and associated actions for Windows and Web-based applications
      • Pre-defined actions, which include common navigation, text entry and event monitoring tasks such as Click, Enter Text, Select Menu ..
      • Integrated playback mechanism for testing individual steps
      • Optional display of detailed logs during playback
      • Extensible interface simplifies insertion of custom actions and preserves customizations when regenerating the Bridge
      • Plug-in architecture enables extensibility to incorporate new actions and events and for accommodating new application development technologies
    Sentillion has a patent on context management (US 6,993,556)
Existing solution using CCOW as a participant

  • Centricity Framework from GE
    • Centricity Framework offers GE developers a way to integrate separate GE products, while consolidating sign-on and security to a single point of entry.
    • Developed by the Advanced Technologies Group (ATG), Centricity Framework provides a consistent presentation of login, navigation (menus), and patient banner.
    • Hosted products share context information and can offer cross-product workflows, regardless of their UI technology.
    • Centricity Framework 5.0 (CF 5.0) offers a choice of two client desktop solutions (both require .NET 2.0 on the client desktop):
      • Traditional browser-based Web client (as provided in 4.x)
      • Iris, a .NET-based client solution that does not require Internet Explorer (based on Microsoft smart client technology)
    • Centricity Framework 5.0 supports the Sentillion single sign-on (SSO) solution and Carefx's context manager. The Carefx library is loaded only if CCOW is enabled for the workstation/user.
    • Communications with the Web Framework (WF) server is done via XML-based service calls. Each instance of the CF exposes a certain URL as the handler for all service calls. This URL can be found in the DataURL tag in the ServerInfo.xml file which is located in the WF's main web folder:
      • servlet/IDXWFServlet (for Tomcat-based installs)
      • IDXWFData.asp (for IIS-based installs)
    • CF enables the use of a security plug-in to implement an alternate authentication mechanism in place of the standard Framework username/password check. If a security plug-in is used, the plug-in performs any server-side authentication of users or of authentication tokens generated on the client. The Framework currently provides plug-ins for
      • Kerberos (v4.01)
      • RSA SecurID (v5.0)
      • CCOW userlink (v4.0)
      • CCOW/LDAP integration (v4.03)

Alternatives to CCOW

Worth noting is the Global Session Manager (GSM) developed at Siemens which offers features almost equal to a CCOW environment but is much easier to implement.
The patented system includes:
  • A system that enables (Web) applications to be integrated into a process involving concurrent operation of applications
  • The system specifies the rules for conveying URL data and other data between applications
  • The system employs a managing application and services (session manager) to facilitate application session management
  • The system is employed by a first (parent) application to support concurrent operation with other (child) applications.
  • The system involves an entitlement processor for authorizing user access to the first (parent) application in response to validation of user identification information.
  • The system involves a communication processor for communicating a session initiation request to a managing application to initiate generation of a session identifier particular to a user initiated session.
  • The session manager is used by the managed applications to reference global data that is essential to a workflow. Such global data includes:
    • user identification information
    • a shared key used for the encryption of URL data
    • a common URL to be used for handling logoff and logon function.
  • The session manager is regularly notified of activities from the applications to prevent an inactivity timeout while a user is active in another concurrent application.
  • The session manager employs a system protocol for passing session context information between applications via URL query or form data.
    • The session context information comprises:
      • a session identifier (used by the managed applications to identify a user initiated session in communicating with manager)
      • a hash value (used by the managed applications to validate that a received URL has not been corrupted)
      • application specific data (can be encrypted)
  • The session manager uses a unique session identifier (SID) for each new session (to protect against corruption and replay of a URL)
  • In addition to this, to avoid redirection, the parent application needs to generate a URL link with an embedded hash value computed from the domain, port and file path of the URL (e.g. using RSA's MD5; see the sketch after this list)
  • The communication protocol between the client browser and the applications is HTTP
  • The communication protocol between the applications and the Session manager is TCP/IP.
Siemens has a patent on this technology (US 7,334,031).
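The sketch below (referenced in the hash item above) illustrates the kind of URL digest described in the patent: a hash computed over the domain, port and file path (plus a shared key) that the receiving application can recompute to detect a corrupted or tampered URL. It is not Siemens' actual implementation, and the values shown are made up.

```java
import java.math.BigInteger;
import java.security.MessageDigest;

// Illustrative sketch of embedding a URL hash for the managed application to validate.
public class UrlHashSketch {

    public static String hash(String domain, int port, String filePath, String sharedKey) throws Exception {
        MessageDigest md5 = MessageDigest.getInstance("MD5");
        md5.update((domain + ":" + port + filePath + sharedKey).getBytes("UTF-8"));
        return String.format("%032x", new BigInteger(1, md5.digest()));
    }

    public static void main(String[] args) throws Exception {
        // Hypothetical values for illustration only
        String h = hash("ehr.example.org", 443, "/patient/summary", "shared-session-key");
        System.out.println("https://ehr.example.org/patient/summary?sid=12345&hash=" + h);
    }
}
```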

Wednesday, October 7, 2009

Health 2.0 - 2009 (San Francisco) - Day 2

8:40am - 3 Health 2.0 CEOs & a President!

Adam Bosworth, CEO, Keas

The idea with Keas was to start small, trying to help a small number of people, and to scale from there. Chronic diseases are lifestyle diseases. Keas tries to address these issues.

Quest Diagnostics has been a great partner. We also have good partnerships with Microsoft HealthVault and Google Health.

Lessons learned: try to be modest, get your product out early, and learn from your customers!




Roy Schoenberg, CEO, American Well

Bottom line: try to make our customers (e.g. the Pentagon) happy. We seem to be in the right place at the right moment, especially now with the health care reforms that are taking place. For the patient, American Well means convenience (immediate gratification, cost, etc.); for the physicians and the payers, the solution brings efficiency, more revenue and ROI.

In Q1 2010 we will deploy our application to more PCP and specialists networks.


Christopher Schroeder, CEO, Health Central

We offer a more holistic experience than other web sites, a sense of connection and empathy. We are more focused on the consumer side of the issues.

Alexandra Drane, President, Eliza

We have a 43% click-through rate from email. We have the ability to look at the data that we store over time and improve our system. We had to acquire a company to quickly get the social network infrastructure and experience.

9:30am - The Consumer Aggregators
27% of Americans go online to seek health care related information and services.
  • Wayne Gattinella, WebMD - The market is becoming more mobile, more global and more personal! - Focus on continuity of care. Demo of WebMD on the iPhone. We need to demonstrate the utility of Health 2.0 applications. Adoption is key.
  • Roni Zeiger, Google Health - New launches: (e.g. Google flu trend) - Demo of MDLiveCare.
  • David Cerino, Microsoft Health Solutions - Demo of My health Info.
11:00am - Data Drives Decisions - Panel

Tools and platforms to support decision-making by doctors
  • Rex Jakobovits, McKesson - Uploads online versions of medical images / PACS systems. Demo of MyPacs.net - more than 100K medical images. Fine-grained search capabilities on images (CGI/Perl based web site). Helps radiologists look at challenging cases to learn and solve related cases.
  • Lance Hill, Within3 - Customers are hospitals, Pharma, Research centers. Demo of Within3 medical social network in the context of a medical conference (running on Amazon EC2). Issues with publishing abstracts of papers shared between peers before publication.
Tools and platforms to support decision-making by patients
  • Sanjay Koyani, FDA - Pushing information to consumers and providing not only services but also a platform. - Showed the FDA recalls database (4,000 products). Thedailygreen.com integrated an FDA widget (e.g. on the Salmonella peanut recall). Collaborates with the CDC as well. More than 20,000 sites are using FDA widgets.
  • John deSouza, MedHelp - Online and mobile applications to help patients deal with specific conditions (e.g. breast cancer). Data can be shared with other patients. Mobile applications available as well.
  • Hugo Stephenson, iGuard - Offers online and paper-based tools related to drug interactions - Launched in October 2007 - 2M US citizens registered - 25 times more registered on paper! Demographic and statistical numbers on these patients available online. Includes recall information. Business model based on revenue from clinical trials. Hopes to have 3M users in the coming months.
The impact of Health 2.0 tools and platforms on clinical research

Kristin Peck, Pfizer

Alexandra Carmichael, CureTogether
- Patient data sharing site - Lots of data on depression. Offers correlations (e.g. depression & fibromyalgia)

Jamie Heywood, PatientsLikeMe
- Very detailed statistics on patients with similar conditions. Compare data with clinical trials.

Amy DuRoss, Navigenics
- Comparison of personal and general population genetic information as well as risk factors.




12:30pm - IDEO design competition

  • Finalist winner of the IDEO competition is LabCheck Plus (Satellite Laboratory Services) - The fastest growing dialysis lab in the US. Users are Nurses, Physicians, Patient care technicians etc. The process involved ethnographic field studies, focus groups and usability testing.

1:00pm - Launch
  • Pathway Genomics - Genetic tests for $348 - Includes ancestry and health tests. Done via a saliva sample.
  • Remedy Systems - Web based Portal & Mobile Portal - ePrescribing application at the point of care
  • AccessDNA - A WebMD for genetic testing; compares DNA testing providers. Offers personalized reports.
  • CarePass - Online and Mobile solution
  • TrialReach.com - Multimedia online tool to find and understand clinical trials.
  • DNA Guide - Personal online genomic map (includes DNA Classifieds).
  • Unity Medical - Multimedia online content to help diagnosis. Technology to push content to the desktop (e.g. videos relevant to the user).
  • Livestrong - Help to determine how food can impact users/patients. Includes interactive graphs.
  • Healthline - Web site offering health related content
  • BiodiMojo - To help families who raise teenagers. Includes health and fitness tracking goals and a journal. Sends mojos via notifications and offers community tools related to teenagers.
  • RelateNow - To help families that have children with autism, including reducing the cost of the therapy. Offer collaboration tools for the family and the health care professionals.
2:00pm - Innovations in Health 2.0 Tools: Showcasing the Health 2.0 Accelerator

Combined demos by:

MediKeeper - PHR with a portal look and feel including a health risk management.

change:healthcare - Cost saving alerts and analysis

Kryptic - portal integration hub

Sage - EHR integration tools and platform. Supports CCR imports





MedSimple - Online questionnaires to prepare medical encounters (electronic patient history).

PharmaSURVEYOR - Medication safety survey (interaction and toxicity of individual drugs).

Polka - Mobile observation engine - manage prescriptions and observations.

ReliefInsite - Pain management system.

Keas - Launch of Keas Beta - Personalized care plan. Includes a todo list, history of tasks and report cards. Uses CCR. Imports data from Google Health and HealthVault.

Kinnexxus & MedSimple - Kiosk dashboard for the elderly and online application for caregivers.

3:30pm - Health 2.0 Around the world

See also: