
Our industry is overflowing with data, and the deluge grows bigger every day. Roughly 2.5 billion gigabytes of data are created daily, and the body of healthcare data doubles every two years. The proliferation of healthcare data and increasing regulatory requirements are putting pressure on companies and their drug safety personnel to effectively monitor all of this data for potential adverse events and relevant safety information.

Between 2001 and 2010, almost one third of the novel therapeutic drugs approved by the Food and Drug Administration (FDA) had a safety issue detected in the years after approval, suggesting a need for proactive, real-time data mining. In 2015 alone, the FDA received 1.2 million adverse drug event reports, nearly five times the total received 10 years earlier. This is not because these drugs are less safe than prior treatments. Rather, it reflects information technology improvements and regulatory changes at the FDA that have added thousands of lower-priority reports to its Adverse Event Reporting System (FAERS) that previously were not accessible for analysis.

Automate it

The ability to access more safety data can be a huge benefit for the pharma industry, but only if we have the tools to rapidly analyze that data, extract meaningful insights and respond accordingly. With the deluge of data now available, the current manual processes most companies use are no longer sufficient or cost-effective.

Fortunately, technology has evolved to the point where we can augment many of the labor-intensive manual processes with automated tools that can analyze and process large amounts of data in a fraction of the time. At this year's DIA annual conference (June 18-22) in Chicago, I will chair a program on this topic entitled Pharmacovigilance 2.0: Redesigning for the Future.

The program will include a panel of experts sharing real-world examples of pharma companies that are adopting automation and artificial intelligence for pharmacovigilance. They will also offer advice on what pharma executives need to do to support this transformation in their own organizations.

In my session, we will explore the impact of automation and the promise of artificial intelligence in pharmacovigilance, including examples of automation technology in practice and how open-source solutions and cloud-based environments make automated safety monitoring more widely available and affordable.
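To make the idea of automated safety monitoring a bit more concrete, here is a minimal, purely illustrative sketch in Python. It computes a proportional reporting ratio (PRR), one of the standard disproportionality measures used to flag potential drug-event signals in spontaneous-report data. The counts and column names are hypothetical; a production system would work from the full FAERS dataset with far more rigorous statistics.

```python
# Minimal sketch: a proportional reporting ratio (PRR) over a small,
# hypothetical table of aggregated adverse event report counts.
import pandas as pd

# Hypothetical FAERS-style counts; drug/event labels are illustrative only.
reports = pd.DataFrame({
    "drug":  ["DrugX", "DrugX", "OtherDrugs", "OtherDrugs"],
    "event": ["nausea", "other", "nausea",    "other"],
    "count": [40,        960,     200,         98800],
})

def prr(df, drug, event):
    """PRR for (drug, event) relative to all other drugs in the dataset."""
    a = df.query("drug == @drug and event == @event")["count"].sum()
    b = df.query("drug == @drug and event != @event")["count"].sum()
    c = df.query("drug != @drug and event == @event")["count"].sum()
    d = df.query("drug != @drug and event != @event")["count"].sum()
    return (a / (a + b)) / (c / (c + d))

# A PRR well above 1 flags a potential signal for human review.
print(f"PRR for DrugX / nausea: {prr(reports, 'DrugX', 'nausea'):.2f}")
```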

Automation and artificial intelligence are already redefining aspects of the pharmaceutical world. They are helping drive time and cost savings in research and commercialization, and they have the potential to drive greater control, consistency, and cost savings across our pharmacovigilance practices. At QuintilesIMS, we estimate that if the current rate of growth in data volume continues, pharmacovigilance as currently practiced will cost companies approximately $100 billion by 2020. By introducing and adopting automation, we estimate that cost can be cut in half.

But industry leaders can realize these benefits only if they learn how these new technologies work, develop a robust adoption roadmap with performance measurements, and instill a clear sense of urgency in their teams.

The move to artificial intelligence for pharmacovigilance will take time and dedication. The sooner companies begin, the sooner we will all benefit from the results.