In the past five years, the role of Master Data Management (MDM) and informatics has dramatically changed the strategic and operational objectives of life sciences organizations. As a result, the future of data and informatics in the life sciences industry is less about how much data companies can amass and more about the questions that can be asked and the insights that can be gleaned. When data is organized appropriately and the right questions are asked, life sciences companies benefit from better competitive intelligence and improved strategic and operational processes, resulting in increased success.
When discussing informatics, we are referring to the collection, transformation, normalization, storage, error identification, mapping, governance, and reporting capabilities of corporate data assets. Pharmaceutical informatics has many subsets, ranging from clinical trial management systems (CTMS) to mobile-enabled customer relationship management (CRM) systems used by sales forces around the world.
MDM as an offering is the combination of business needs with technology systems that allow for consistency, accuracy, universal identifiers, and shared attributes across corporate data assets. It is utilized to maximize the value of disparate databases and allow corporations to employ their data assets across the corporate hierarchy. MDM is an essential tool for making informatics a success in any organization.
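To make the core MDM operation concrete, the sketch below shows how records for the same physician, arriving from two systems with inconsistent formatting, can be normalized and consolidated into a single "golden record" under a universal identifier. This is a generic illustration, not any vendor's implementation; the field names, sample data, and survivorship rule are invented for the example.

```python
# Minimal MDM "golden record" sketch (illustrative only): normalize
# records from disparate systems, key them to one master ID, and keep
# the first non-empty value for each attribute.
def normalize(rec):
    """Lower-case and strip string fields so equivalent values compare equal."""
    return {k: v.strip().lower() if isinstance(v, str) else v
            for k, v in rec.items()}

def merge(records, master_id):
    """Survivorship rule (hypothetical): prefer the first non-empty value."""
    golden = {"master_id": master_id}
    for rec in records:
        for k, v in normalize(rec).items():
            if v and not golden.get(k):
                golden[k] = v
    return golden

# Two views of the same physician from a CRM and a claims feed (invented).
crm = {"name": "Jane Smith, MD", "specialty": "Oncology", "npi": ""}
claims = {"name": "JANE SMITH MD", "specialty": "", "npi": "1234567890"}
print(merge([crm, claims], master_id="HCP-0001"))
```

The result is one record carrying the CRM's specialty and the claims feed's NPI, keyed by a master identifier that downstream systems can share.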
Insights and trends
In examining the life sciences landscape, the industry is entering a new phase of informatics strategy that requires a new approach to MDM. The past five years have seen an explosion in the types of data available, ranging from longitudinal (Lx) data to patient outcomes driven by Electronic Health Records (EHRs). This “Big Data” comprises high-volume, high-velocity, high-variety assets that, while potentially valuable, ultimately require MDM to make sense of them.
Changing scope and role
Traditionally, life sciences data and information were used to detail the physician as effectively as possible and move him or her toward the Holy Grail: becoming a “high prescriber.” In this model, the breadth and depth of the data were centered on the physician and used to answer a simple question: did he or she prescribe a certain drug? Much of the new data available to pharmaceutical companies can now be used predictively rather than reactively. The existing model simply used past Rx data to influence the sales process in terms of detail aids, sales force resource allocation, and sales targets (Fig. 1).
Fig. 2 represents a vastly different data and informatics strategy. Not only has the breadth and depth of data increased exponentially, but its use across the organization has grown dramatically. Data has gone from a tool to a vital piece of corporate strategy and organizational infrastructure.
Not only has the scope of data changed, but the very purpose of analytics/informatics has fundamentally been altered. Data has moved from being reactive (tweaking sales force resources) to being proactive, from predicting which therapeutic categories are worth researching to assigning resources across the product portfolio (Fig. 3).
A path forward
The explosion in data sources and the evolution of data/informatics strategy have not happened in a vacuum. Market developments have created strategic trends that are driving enormous changes in the industry (Fig. 4). Four key drivers are providing the impetus for, and shaping the structure of, pharmaceutical MDM and the resulting informatics: more timely and accurate decision making, ROI-driven solutions, increased compliance, and the shift of data to the cloud.
More timely/accurate decision making: In every aspect of life sciences operations, from preclinical assay research to physician sales strategy, data is making it possible to glean insights and make decisions in a vastly different manner than just five years ago. Management demands that answers to key questions be obtained quickly and accurately. A single decision, such as one involving state enrollments, can mean millions spent on additional efforts or billions lost in exclusivity when a drug comes to market after a key competitor.
ROI-driven solutions: While life sciences has always spoken of ROI, only recently has real pressure emerged to measure data and informatics on their return. Executives have an enormous pool of products to choose from when it comes to data. The ability to demonstrate and explain a potential return (increased market share, time savings, better asset utilization, etc.) is now one of the most important selection criteria when evaluating the purchase of potential data assets. Another driver in the ROI model is the ability to share risk with trading partners and vendors.
Increased compliance: Life sciences faces increased scrutiny ranging from the Physician Payments Sunshine Act to Corporate Integrity Agreements (CIAs). The ability to produce accurate and timely data to demonstrate compliance with these new regulations means life science companies must have an approach to collecting, identifying, structuring, and reporting the necessary data. This requires a committed MDM solution.
Headed to the cloud: Another major shift has been the movement of data from on-site storage to the cloud. Traditionally, life sciences organizations have been extremely leery of hosting and/or sharing their data. The movement to the cloud has challenged these basic premises, yet it has also enabled the capture, storage, and use of Big Data without the usual issues of storage and access. Cloud-based MDM and Software as a Service (SaaS) have created entirely new tools to address the explosion of data being required of life sciences today.
The value of new data sources
In addition to the increase in the number of new data sources available, regulatory requirements such as the Sunshine Act and CIAs are creating the need for increased data collection and reporting. This presents a significant challenge for life sciences organizations, which must first choose which data is relevant and valuable for driving key operations and intelligence.
For instance, compliance needs drive new data sources such as HCP license data, NPI, and sanctions data. Once those data sources are identified, key decisions on infrastructure support are required. For example, LexisNexis’ High Performance Computing Cluster (HPCC) platform enables vital MDM functions to be executed via a cloud-based service architecture. This type of service is necessary as external data sources traditionally outside the life sciences business model, such as Health Economics and Outcomes Research (HEOR), Real World Data/Evidence (RWD/E), and patient outcomes/drug efficacy data, are added to the mix of legacy data assets.
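As a simple illustration of how compliance data sources come together, the sketch below joins internal HCP spend records to external license and sanctions feeds on NPI, the standard U.S. provider identifier, so that records needing review can be flagged for Sunshine Act-style reporting. The records, field names, and statuses are invented; real feeds and reporting rules are far more involved.

```python
# Hedged compliance-join sketch: all data below is invented sample data.
spend = [
    {"npi": "1111111111", "hcp": "Dr. A", "spend_usd": 250.0},
    {"npi": "2222222222", "hcp": "Dr. B", "spend_usd": 1200.0},
]
licenses = {"1111111111": "active", "2222222222": "expired"}  # NPI -> status
sanctions = {"2222222222"}  # NPIs appearing on an exclusion list

def compliance_report(spend, licenses, sanctions):
    """Annotate each spend record with license status and sanction flag."""
    report = []
    for row in spend:
        npi = row["npi"]
        report.append({
            **row,
            "license_status": licenses.get(npi, "unknown"),
            "sanctioned": npi in sanctions,
        })
    return report

for r in compliance_report(spend, licenses, sanctions):
    print(r)
```

Joining on a shared identifier like NPI is what makes the disparate compliance feeds usable together; without an MDM-managed key, each feed remains an isolated silo.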
Fig. 6 represents the explosion in new data required by these new business requirements. The columns on the right are the traditional transactional data used for sales efforts; the columns on the left represent new content, such as HEOR and RWD, that provides entirely new functionality and insights.
A working model — bringing it together
The future of MDM brings together the older Rx data model with a new, forward-looking integrated data model. In the following example (seen in Fig. 7), a pharmaceutical oncology group was looking to combine legacy (backward-looking) Rx data with a host of new forward-looking data assets, including HEOR data, anonymized patient-level data (APLD), and medical claims.
As life sciences moves into the 21st century, data and informatics will continue to transform the industry. From compliance with CIAs to working with ACOs to develop drugs required to meet CMS outcome measures, pharmaceutical data assets will provide the backbone of new corporate structures and operations. MDM will be the tool required to help these data assets tell their story, identify and create relations, and guide companies in their future strategy. Companies that understand the power of data and informatics and utilize the tools that organize and help make sense of this data will be the leaders in the future.
All of these trends are forcing life sciences executives to find workable solutions to meet their demand for better data and informatics. However, organizations are struggling to cope in an environment where their ability to collect information is outpacing their ability to use it effectively. In addition, most organizations realize that their information infrastructure cannot hold, link, analyze, and process the data collected. Industry reports show that organizations that cannot manage and analyze all their current and historical data are putting themselves at a competitive disadvantage and will lag behind their peers.
ABOUT THE AUTHORS
Josh Schoeller is Vice President, Chief Solutions Architect, at LexisNexis Risk Solutions. Thomas Macpherson is Managing Partner at Nintai Partners.
The LexisNexis approach
As a leader in informatics, LexisNexis has created an alternative to the traditional installed software and data steward driven processes seen across life sciences today. The company’s solution revolves around three core differentiators:
I. State-of-the-art computing platform. LexisNexis has developed a High Performance Computing Cluster (HPCC) platform that incorporates over 1,000 sources of data on all prescribers. This alone facilitates correct, current, and comprehensive profiles. The HPCC technology manages, sorts, links, and analyzes billions of records, and it has proven successful with enterprise customers in insurance and financial services who need to process large volumes of data in mission-critical 24/7 environments. In 2011, LexisNexis open-sourced its proprietary data-intensive supercomputing platform and launched it in the marketplace as HPCC Systems for Big Data analytics processing. HPCC Systems helps organizations gain competitive advantage by leveraging all of their data to scale for innovation and growth. The streamlined platform needs fewer resources to operate and eliminates the need for expensive legacy technology in Big Data solutions. The launch of HPCC Systems represents a paradigm shift: an enterprise company open-sourcing its proprietary and proven technology assets.
II. “Best in Class” matching capabilities enabling optimized automation of key data verification and augmentation. Big Data processing capabilities alone are not enough to ensure entity-resolution success; data must also be linked quickly and accurately. LexID℠ is the proprietary ingredient in our products that turns disparate information into meaningful insights. This technology enables customers to identify, link, and organize data associated with a record, so that identities and entities can be disambiguated quickly with a high degree of accuracy.
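The article does not describe LexID's internals, so as a generic illustration of the matching problem it addresses, the sketch below scores candidate record pairs with a fuzzy string similarity from Python's standard library and links those above a threshold. The names, threshold, and scoring choice are assumptions for the example, not LexisNexis's actual algorithm.

```python
# Generic entity-resolution sketch (not LexID's actual method): link
# records whose names are similar enough under a fuzzy string measure.
from difflib import SequenceMatcher

def similarity(a, b):
    """Similarity in [0, 1] via difflib's Ratcliff/Obershelp matcher."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def link(record, candidates, threshold=0.7):
    """Return candidates whose name matches `record` above the threshold."""
    return [c for c in candidates
            if similarity(record["name"], c["name"]) >= threshold]

# Invented sample records: two incoming rows, one of which is the same
# physician as the master record despite formatting differences.
master = {"name": "Dr. Jane A. Smith"}
incoming = [{"name": "Jane A Smith MD"}, {"name": "John Q. Public"}]
print(link(master, incoming))
```

Production entity resolution layers many more signals (addresses, identifiers, specialty, history) and probabilistic weighting on top of this basic idea, but the core step of scoring and thresholding candidate pairs is the same.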
III. Rapid deployment via a proven customer master data management methodology, including the development of business rules and data stewardship/governance, to produce continuous, actionable results for maintaining an accurate customer master and MDM initiative.