Anthony Philippakis, MD, stays busy — he is both chief data officer at the Broad Institute of MIT and Harvard, and a practicing cardiologist at Brigham and Women’s Hospital. Philippakis spoke with Connected Futures about precision medicine – treatment that takes into account each patient’s genes, environments, and lifestyles, as opposed to a "one size fits all" approach.
In addition to better results for patients, Cisco’s research has identified $1.8 trillion of digital value at stake by 2024 as healthcare organizations digitize and improve processes. Here, Philippakis notes the critical role of data sharing in delivering on precision medicine’s full promise, and the role CIOs must play.
Connected Futures: What is the current state of precision medicine and what are the next steps?
Anthony Philippakis: There’s no question that we’re in the first phase of precision medicine. Everyone sees the potential for what it could do to transform how we care for patients, yet at the same time there are some real challenges to overcome to make that a reality. One of the biggest things we must do is generate and organize the data sets necessary to build the data science toolkit that will make precision medicine a reality.
Can you offer some examples of how precision medicine is being used today?
One of the earliest areas where you’re starting to see adoption of precision medicine is oncology. There’s quite a bit of uptake of cancer genome sequencing. You’re beginning to see reimbursement come online, and with that comes the opportunity to use genomics to select therapies for patients with cancer. Even as a cardiologist, I would say that oncology is at the vanguard of where the field is going.
Precision medicine is not just about genomic data – there’s nothing sacred about genomics; it’s one data type among many we can use – but oncology and genomics are really moving things forward.
While we can assume precision medicine and data sharing lead to better patient outcomes, what are the benefits for other stakeholders such as providers and payers?
Every day healthcare professionals make big decisions about their patients’ lives, based on suboptimal tools.
For example, as a cardiologist I often must decide whether to put in a defibrillator, whether to do a stent or surgery, or whether to put a patient on a blood thinner. These are very big decisions with long-lasting impacts on people’s lives. And in the golden age of machine learning, the idea that we’re not using sophisticated data science to make these decisions is very troubling.
There’s a transformative impact on providers’ work that has yet to come, one that will make medicine a much more successful endeavor for humankind.
For payers, there’s something very similar to be said about their incentives. There’s a lot of waste in the medical system, but I don’t believe that’s because of corruption or people looking out for their own self-interest. I think we honestly don’t make the best decisions about which tests to order or which procedures to do, because the right choice can be unclear, so you err on the side of doing something, even if it doesn’t necessarily help much.
Precision medicine, and data science in general, hold the potential to lower costs, which is in everyone’s best interests and needs to happen in healthcare.
Data generation and access are critical to precision medicine. What progress is being made on data sharing between providers?
We’ve taken first steps, but we have a long way to go.
Unfortunately, it’s very much the case that when I transfer from one medical system to another, it isn’t very realistic that my medical records will follow me. That’s an obstacle, and the medical community can do better. It will take a combination of incentives to make this a reality, using both the carrot and the stick: requirements that push hospitals to adopt standardized APIs to promote interoperability, as well as economic incentives that reward them when they do.
Do you believe initiatives such as FHIR (Fast Healthcare Interoperability Resources), HL7’s next-generation standards framework, will enable easier data sharing?
It’s certainly a very good step in the right direction. Whether the strategy of incrementally raising the bar will ultimately get us there is anybody’s guess.
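To give a sense of what that standardization buys, here is a minimal sketch of a FHIR R4 Patient resource parsed in Python. The record itself is hypothetical illustrative data, but the shape – a `resourceType` field plus standardized elements like `name` and `birthDate` – is what lets any conforming system exchange records without per-hospital parsing logic:

```python
import json

# A minimal, hypothetical FHIR R4 Patient resource. Real records
# carry many more fields, but every conforming system exchanges
# them in this same standardized JSON shape.
patient_json = """
{
  "resourceType": "Patient",
  "id": "example-123",
  "name": [{"family": "Doe", "given": ["Jane"]}],
  "birthDate": "1980-05-14"
}
"""

patient = json.loads(patient_json)

# The resourceType field lets a FHIR-aware system route and
# validate the record before looking at any clinical content.
assert patient["resourceType"] == "Patient"

# Standardized name structure: a list of name entries, each with
# 'family' and a list of 'given' names.
name = patient["name"][0]
full_name = " ".join(name["given"]) + " " + name["family"]
print(full_name)  # Jane Doe
```

Because the structure is fixed by the specification rather than by each vendor, the same few lines of parsing code work against any FHIR-conformant server.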
One of the things I’m personally very excited about is the opportunity to empower patients to get their medical records themselves. For example, my group is part of the All of Us program, a federally funded initiative to recruit a million Americans to have their genomes sequenced. And one of the big goals of that effort is to allow patients to donate their data to science. That mindset of empowering patients to access their own medical records has a lot of merit to it.
Do doctors and/or researchers need new training and education to take full advantage of emerging tools for data sharing and analysis?
For providers, I don’t think it’s that big of a deal. I would love it if every time I saw a new patient I had access to their complete medical records and they showed up in my EMR [Electronic Medical Records system]. We should be able to build software that makes this no different from what clinicians do today in terms of their training and digital skill. It’s more about getting the policies and technologies in place than about physician skill itself.
What role do CIOs have to play in enabling data sharing and analysis?
There’s an interesting movement afoot, not just in healthcare, but across all branches of the economy. During the late ‘90s and early 2000s, we saw the rise of open source software, which had a transformative effect — if you look at the cost of starting a company it’s gone down several orders of magnitude, and one of the big reasons is the availability of open source software as a foundation. That really transformed the world for a lot of CIOs.
I think there’s a similar transformation that’s just getting going in the world of open data. Putting data sets out there for anyone to use could very easily follow a similar pattern, where open data becomes the fuel for training the next generation of machine learning models and algorithms.
Do providers face any unique challenges trying to integrate data sharing/analysis tools with older IT systems?
For sure. Few industries have to spend as much money as healthcare does whenever they want to switch technology stacks, so it’s a big problem for providers. What are the remedies? That’s more complicated, because it goes back to the interoperability question.