Reducing Readmissions: Can Predictive Analytics Help Lower Patient Returns?

Medicare reimbursement penalties for 30-day readmissions deemed unnecessary began in 2012 with three conditions: heart failure, heart attack and pneumonia. Patients admitted for elective hip and knee replacement and for chronic obstructive pulmonary disease (COPD) have since been added to the list. This year, 2,610 hospitals will see the penalty of lower reimbursement for the next year. Last year, almost 18 percent of Medicare patients who had been hospitalized were readmitted within a month, a decline from prior years but one that still represents 2 million patients and a cost of $26 billion. Reports estimate that $17 billion of that came from potentially avoidable readmissions.

Hospitals are now forced to track what happens to their patients after discharge, and many are using advances in technology to do so. “We’re all trying to figure out how to target interventions to the population that matters,” says Todd Pollock, quality manager of the University of Pittsburgh Medical Center’s (UPMC) Donald D. Wolff Jr. Center for Quality, Safety & Innovation.

“Many people admitted to the hospital do not need everything thrown at them to prevent them from coming back but there is a big group who does.” Making that distinction starts while patients are in the hospital bed, he says.

Make the distinction

UPMC pulls information from its EMR to calculate a risk score. That information includes clinical data, lab data, whether the patient has an oncology physician, length of stay, prior admissions, whether the stay is for an elective or non-elective procedure and more. Patients are then categorized, based on the risk score, as being at low, medium or high risk of readmission.
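
As a rough illustration of the kind of scoring UPMC describes, the sketch below (in Python) combines a few EMR-derived features into a score and buckets it into low, medium and high tiers. The features echo those listed above, but the weights and cut points are illustrative assumptions, not UPMC’s actual model.

```python
# Illustrative only: toy feature weights and thresholds, not UPMC's model.

def risk_score(patient: dict) -> float:
    """Combine a few EMR-derived features into a single readmission risk score."""
    score = 0.0
    score += 2.0 * patient["prior_admissions_12mo"]          # utilization history
    score += 0.5 * patient["length_of_stay_days"]            # longer stays weigh more
    score += 3.0 if patient["has_oncology_physician"] else 0.0
    score += 0.0 if patient["elective_admission"] else 2.5   # non-elective admissions weigh more
    return score

def risk_tier(score: float) -> str:
    """Bucket the score into the low/medium/high tiers used for targeting."""
    if score >= 10:
        return "high"
    if score >= 5:
        return "medium"
    return "low"

example = {
    "prior_admissions_12mo": 3,
    "length_of_stay_days": 6,
    "has_oncology_physician": False,
    "elective_admission": False,
}
score = risk_score(example)
print(score, risk_tier(score))  # 11.5 high
```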

To date, using the information in this way has been shown to be “a little bit better than the flip of a coin.” Adding in outpatient data from UPMC’s health plan, however, improves that performance. “Using both [sets of data] gives you better predictability of the patients most likely to come back,” Pollock says.

UPMC is an integrated delivery and finance system with a two million-member health plan, says Pamela Peele, PhD, chief analytics officer of UPMC’s Insurance Services Division. That means it can see, on the inpatient side, all of the care consumed within the UPMC environment and, on the plan side, all of the care its members consume.

This involves a conditional model in which the health plan predicts the probability that a member will be readmitted if he or she is admitted today. “We’re using past consumption of medical services to predict readmission. It’s a very unusual model within the industry,” she says.
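
A minimal sketch of that conditional idea, assuming claims-derived utilization features and a logistic regression, might look like the following; the synthetic data and feature choices are stand-ins, not the health plan’s actual model.

```python
# Sketch: estimate P(readmission | admitted today) from past medical-services use.
# Synthetic claims-derived features and labels; real training data would be
# the health plan's historical claims.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000
X = np.column_stack([
    rng.poisson(1.0, n),   # ED visits in the past year
    rng.poisson(2.0, n),   # inpatient days in the past year
    rng.poisson(5.0, n),   # distinct medications filled
])
# Synthetic outcome: readmitted within 30 days of a hypothetical index admission
logit = -3.0 + 0.5 * X[:, 0] + 0.3 * X[:, 1] + 0.1 * X[:, 2]
y = rng.random(n) < 1 / (1 + np.exp(-logit))

model = LogisticRegression().fit(X, y)

# Score a member "admitted today": 3 ED visits, 8 inpatient days, 12 medications
p = model.predict_proba([[3, 8, 12]])[0, 1]
print(f"P(readmit | admitted today) = {p:.2f}")
```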

When patients are on their way out of the hospital, the care management system now has information from two sources. The models can be very different, Peele says, but they perform almost the same. When they don’t produce the same results and the hospital model says the patient is at high risk, “we need to pay attention to those people and put our resources to work for them.”

While the health plan has lots of resources devoted to translating high risk scores into services, says Pollock, the hospital has started pilot projects that aim to enhance interventions. That includes discharge phone calls, enhancements to medication reconciliation, making sure patients can afford their medications and making sure patients understand the discharge instructions.

Pilots to determine root cause help UPMC mix up the “magic sauce of what these patients get,” says Pollock. “There is some definite learning needed so we can improve this transition of care. One-third of our patients come back in seven days. We’re getting away from ‘readmissions’ and using the term ‘failed transition.’”

They’ve already learned that follow-up is crucial, says Peele. “It’s clear that there’s a distinct difference between people who get follow-up shortly after discharge vs. people who don’t with respect to readmission rates.” UPMC is modeling around optimal timing for that follow-up. Results indicate that patients need to be seen within five days to impact readmission rates. “We pushed that information down on the providers so discharge planners can focus on getting these patients seen back in the clinic.” The health plan follows that as a performance metric.
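
The follow-up timing measure could be tracked with something as simple as the sketch below, which computes the share of discharges seen in clinic within five days. The column names and sample dates are hypothetical, not UPMC’s reporting schema.

```python
# Sketch of a follow-up timing metric: share of discharges with a clinic
# visit within five days. Column names and dates are made up for illustration.

import pandas as pd

discharges = pd.DataFrame({
    "patient_id": [1, 2, 3],
    "discharge_date": pd.to_datetime(["2024-03-01", "2024-03-02", "2024-03-03"]),
    "first_followup_date": pd.to_datetime(["2024-03-04", "2024-03-10", pd.NaT]),
})

days_to_followup = (discharges["first_followup_date"] - discharges["discharge_date"]).dt.days
seen_within_5_days = days_to_followup.le(5)   # NaT (no follow-up) counts as not seen

print(f"Follow-up within 5 days: {seen_within_5_days.mean():.0%}")  # 33%
```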

Pilot drives growth

NorthShore University HealthSystem in Evanston, Ill., also has conducted predictive analytics pilot projects to help curtail its readmission rates. Its clinical analytics team has gone from three or four people to 18, largely due to the success of a heart failure readmission project. “We feel that there’s a real return on investment to be had by doing a better job of using analytics,” says Ari Robicsek, MD, vice president of clinical and quality informatics and associate CMIO.

Case managers are an expensive, limited resource, so Robicsek says the health system would like to target its interventions in a more efficient way. “We can only do very comprehensive case management for a small percentage of our patients so we want to attach them to patients they can help the most.” Predictive analytics can be “extremely powerful in pointing us at the right individuals—high utilizers.”

NorthShore used to spend $1 million a year testing every patient for MRSA. Predictive modeling that identified which patients did not need testing cut that in half without any reduction in patient safety.

Building on that success, NorthShore created an initial model for heart failure patients using the EMRs of several thousand patients. “We identified hundreds of variables that affect heart failure outcomes and narrowed our choice to 35 in building our pilot predictive model using logistic regression techniques,” says Robicsek.

The test model segmented patients into quartiles, with those in the top quartile considered high risk. Validation of the model over several months, during which aggressive interventions were not used to change outcomes, showed that low-risk patients were readmitted at a rate of 12 percent; medium-risk, 16 percent; and high-risk, 33 percent.
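
A hedged sketch of that modeling step: fit a logistic regression on EMR-derived variables, rank patients by predicted risk and compare observed readmission rates across tiers. The three synthetic features stand in for NorthShore’s 35 variables, and mapping the top quartile to “high,” the bottom to “low” and the middle two to “medium” is an assumption.

```python
# Sketch: logistic regression risk model with quartile-based tiers.
# Synthetic data; not NorthShore's actual variables or coefficients.

import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 4000
X = np.column_stack([
    rng.poisson(1.5, n),          # prior admissions
    rng.poisson(8.0, n),          # number of medications
    rng.normal(1.2, 0.4, n),      # a lab value, e.g. creatinine
])
logit = -2.5 + 0.6 * X[:, 0] + 0.08 * X[:, 1] + 0.9 * (X[:, 2] - 1.2)
y = rng.random(n) < 1 / (1 + np.exp(-logit))   # synthetic 30-day readmissions

model = LogisticRegression().fit(X, y)
risk = model.predict_proba(X)[:, 1]

df = pd.DataFrame({"risk": risk, "readmitted": y})
quartile = pd.qcut(df["risk"], 4, labels=False)   # 0 = lowest risk, 3 = highest
df["tier"] = np.where(quartile == 3, "high", np.where(quartile == 0, "low", "medium"))

# Observed readmission rate by tier, analogous to the validation figures above
print(df.groupby("tier")["readmitted"].mean().round(2))
```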

During a pilot program to test the model, patient data entered into the EMR each day were fed into the predictive modeling tool for a daily risk calculation. NorthShore also implemented an alert system: an e-mail is sent each morning to staff in a variety of departments where the model is being piloted. The e-mail identifies each hospitalized heart failure patient as high, medium or low risk. The clinical staff then intervene with individual patients based on their risk on each day of hospitalization and determine appropriate post-discharge interventions.
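
The alert step might look like the sketch below, which groups that morning’s scored heart failure patients by tier and formats the e-mail body. Patient names, units and tiers are hypothetical.

```python
# Sketch of the morning alert: group scored patients by risk tier and build
# the e-mail body. Names, units and tiers are placeholders.

from collections import defaultdict

scored_patients = [
    {"name": "Patient A", "unit": "4 West", "tier": "high"},
    {"name": "Patient B", "unit": "4 West", "tier": "medium"},
    {"name": "Patient C", "unit": "5 East", "tier": "low"},
]

by_tier = defaultdict(list)
for p in scored_patients:
    by_tier[p["tier"]].append(p)

lines = ["Daily heart failure readmission risk report"]
for tier in ("high", "medium", "low"):
    lines.append(f"\n{tier.upper()} risk:")
    for p in by_tier[tier]:
        lines.append(f"  - {p['name']} ({p['unit']})")

email_body = "\n".join(lines)
print(email_body)  # in the pilot, this body would be e-mailed to unit staff each morning
```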

The most important variables in predicting risk in NorthShore’s patient population were the number of previous admissions, the number of medications and the type of medications, as well as several laboratory results. “Comorbidities did not end up on our final list of variables in this current model, but for another organization, that variable may be important,” he says.

The growth of NorthShore’s analytics effort stems not only from its experience with the heart failure readmissions project but also from an early finding that the model worked well in the general population. “We’re excited about this model. We’re able to send a message to providers every morning with a list of high-risk patients.” Providers are then expected to ensure that those patients get several interventions, including visits with a social worker, occupational therapist, physical therapist, pharmacist and cardiologist before discharge.

After tracking those interventions for a few months, they found that only 15 percent of high-risk patients were getting all the right interventions. The team again studied the process and learned that “the workflow integration was lousy,” Robicsek says.

For one thing, that list of high-risk patients was getting to providers at about 9 a.m. Rounds start at 7 a.m., so the lists weren’t being used. Providers wanted the information integrated into the EMR. “That led us to embark on a series of projects where we found ways to get data out of predictive models where the modeling engine was not inside our EHR. It sits in our enterprise data warehouse, which is much better suited for this kind of complex computation.”

Most important, he says, is feeding data via a bridge built from the data warehouse into the EMR. Every inpatient has a risk score attached to his or her chart, accessible through the patient list function. Any clinician involved in a patient’s care can see the risk score, which helps drive more care to those in the high-risk category.
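
Conceptually, that bridge pushes warehouse-computed scores into a table the EMR’s patient list can display. The sketch below uses SQLite as a stand-in for both systems; the table and column names are assumptions, not NorthShore’s actual schema.

```python
# Sketch of a warehouse-to-EMR bridge: upsert each inpatient's latest daily
# risk score into a table surfaced on the EMR's patient list.
# SQLite stands in for both systems; schema is hypothetical.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE emr_risk_scores (patient_id TEXT PRIMARY KEY, risk_score REAL, tier TEXT)"
)

# Scores produced by the warehouse-side model for today's inpatients
warehouse_scores = [("MRN001", 0.41, "high"), ("MRN002", 0.12, "medium"), ("MRN003", 0.03, "low")]

conn.executemany(
    "INSERT INTO emr_risk_scores (patient_id, risk_score, tier) VALUES (?, ?, ?) "
    "ON CONFLICT(patient_id) DO UPDATE SET risk_score = excluded.risk_score, tier = excluded.tier",
    warehouse_scores,
)
conn.commit()

for row in conn.execute("SELECT * FROM emr_risk_scores ORDER BY risk_score DESC"):
    print(row)  # what a clinician's patient-list column might surface
```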

As a result, the readmission rate for heart failure has dropped considerably, Robicsek says. One hospital was in the range where penalties would kick in but is now safely out of that range.

“Really, that project was the seminal project for a series of predictive modeling we’ve begun to do in many areas.”

NorthShore also built a different bridge to its reporting tool that allows users to search for patients meeting certain criteria. “It’s a pretty simple functionality but we’re able to create a report based on the complex predictive model we built outside of the EMR.” That kind of bridge has been implemented in their care manager workflow as well to help them reach out to patients at high risk of being hospitalized for any reason in the next year.

The health system’s IT infrastructure has numerous rules baked in, Robicsek says, all tied back to the EMR’s reporting workbench. “We created a loop where information coming out of the EMR is being used for analytical purposes and the results are fed back into the EMR so the EMR can provide more data for our models.”

Turning data into action

PinnacleHealth System, based in Harrisburg, Pa., began its predictive analytics journey in the middle of 2013, says George Beauregard, DO, senior vice president and chief clinical officer. “The demands of accountable care, as promulgated by the Affordable Care Act, really demand deeper mining and analysis of data. Really, I think it puts in an imperative to turn data into action.”

Because the organization’s COPD readmissions rate was “stubbornly high for several years,” PinnacleHealth made a commitment to make predictive analytics an organizational competency over time. With readmission penalties now in play, “there was a lot of incentive for us to become more proactive and investigate using a technology platform to help us,” says Beauregard.

The organization began its COPD readmission pilot in June with a narrow scope of implementation—one nursing unit—to “make sure we completely understood all the downstream effects on workflow.” Plus, they wanted to make sure the predictions resulted in action. “If you don’t do anything with it, there’s no value. You have to integrate the predictor and the intervention back into the system and the workflow where the trend originally started. The intervention has to tie back to the reason for the readmission in the first place.”

While the COPD pilot is still in the early stages, Beauregard says PinnacleHealth also is working on a heart failure model because that condition “is another major stubborn problem for us and for the nation.”  They’ve assembled a different clinical team although several members sit on the overarching steering committee. “We have seen that we can build and run several models simultaneously, each addressing different clinical conditions that are burdensome.”

Lots of variables go into building the model, including socioeconomic variables, clinical variables, visit history and number of admissions. Some variables prove more valuable than others as predictors, he says. The model receives all the clinical information from PinnacleHealth’s inpatient EMR for every patient who is admitted that day with an admitting diagnosis of COPD and is in a bed by midnight. It predicts whether each patient is at high, moderate or low risk of readmission, along with expected length of stay and time until the next exacerbation.
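
A hedged sketch of that multi-target setup: one model for the risk tier and separate models for length of stay and time to next exacerbation. The features, model choices and synthetic data are illustrative assumptions, not PinnacleHealth’s actual platform.

```python
# Sketch: three targets per COPD admission - risk tier, length of stay,
# days to next exacerbation. Synthetic data and features for illustration only.

import numpy as np
from sklearn.ensemble import GradientBoostingClassifier, GradientBoostingRegressor

rng = np.random.default_rng(2)
n = 800
X = np.column_stack([
    rng.poisson(2.0, n),        # admissions in prior year
    rng.integers(0, 2, n),      # lives alone (socioeconomic proxy)
    rng.normal(60, 12, n),      # FEV1 percent predicted
])

# Synthetic labels loosely tied to the features
signal = 0.8 * X[:, 0] + 1.5 * X[:, 1] - 0.05 * X[:, 2] + rng.normal(0, 1, n)
tier = np.where(signal > 1.0, "high", np.where(signal > -1.0, "moderate", "low"))
los_days = rng.gamma(2.0, 2.0, n)
days_to_exacerbation = rng.gamma(3.0, 20.0, n)

tier_model = GradientBoostingClassifier().fit(X, tier)
los_model = GradientBoostingRegressor().fit(X, los_days)
exac_model = GradientBoostingRegressor().fit(X, days_to_exacerbation)

new_patient = [[4, 1, 45.0]]   # 4 prior admissions, lives alone, FEV1 45%
print(tier_model.predict(new_patient)[0],
      round(float(los_model.predict(new_patient)[0]), 1),
      round(float(exac_model.predict(new_patient)[0]), 1))
```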

That is not yet possible, Beauregard says, but the goal is for this report to be generated and embedded into the EMR automatically. For the time being, a paper report goes into patient records.

Model predictions, patient demographics and an intervention checklist right in the record, along with automated reports and alerts, make up the ideal end state, he says, but “we’re not quite there yet.”

Clinical judgment around interventions remains important because the many available models have different strengths and weaknesses. Not all of them capture unstructured data, so information such as whether a patient has good support at home isn’t necessarily recorded in the EMR in a mineable data field.

Today’s tools

The health IT market includes many predictive analytics models with different strengths and weaknesses, says Beauregard. “Some have proprietary resources so you can’t understand how the model works, which we feel is an important thing for clinicians to understand. It helps get their buy-in.” Others rely on statistical methods or differential equations.

“It’s a rapidly evolving sector that’s been used in other sectors for a long time. We’re late to the game in healthcare.”

The Centers for Medicare & Medicaid Services (CMS), “to their credit, has used their bully pulpit to try to improve care,” says Peele. The agency’s mandates have helped grow the market for predictive models. The challenge, though, is that the models require data and “most organizations don’t have their data in good enough shape to develop a model off of.” Data used by health plans, for example, were created not to develop predictive models but to pay bills.

The secondary use of claims data is challenging, she says. “People will tell you that claims data are dirty. There’s nothing dirty about claims data. They do exactly what they’re supposed to do.” Using them for another purpose is difficult, and users must understand how the data were created and all the nuances, she says. “If you don’t have a concrete understanding, you will create models that are quite silly.”

Another concern about currently available tools is how the information is presented back to clinicians. If users have to go to the vendor’s website or somewhere else for the information, they won’t like it. “If we can’t integrate risk information into our EMR, we don’t even want to start the conversation,” says Pollock.

How to get started

Healthcare organizations don’t necessarily need to start that kind of conversation with a vendor. Pollock says organizations with an EMR in place can use it as a starting point. “Use what you already have. Don’t try to create a new field that you’re going to have to operationalize.” He suggests starting with structured information, noting that many tools are available to help pull both structured and unstructured data.

The mistake facilities make, says Peele, is thinking they have to buy a high-cost product that takes 18 to 24 months to install. “You don’t have to invest tens of millions of dollars. The model we’re using is published in the Journal of the American Medical Association. Take what you have and use it.”

Robicsek agrees. “You don’t need fancy predictive analytics. There are probably things you can do better and efficiencies you can realize without doing predictive analytics.” For example, how late in the day do your clinicians realize the services a patient will need at discharge? That can be identified outside of predictive analytics, he says.

He also advises organizations to organize and clean up data, perform the basics of data warehousing and then build up a team that understands the available data. “Then you’re able to start doing the modeling.”

With organizations across the country facing a similar challenge, readmission penalties are only likely to increase, says Beauregard. “All of them have to think about being more proactive about how to prevent readmissions. Pick an experienced partner and proceed slowly. If you don’t exercise the muscle, it’s not going to get any better.”

The readmissions number is never going to be zero, says Beauregard. The question is what rate is clinically reasonable and feasible. “The greatest value in predictive analytics stems more from disease prevention or retarding the progression of a disease. It’s really about improving patient outcomes. That’s the No. 1 goal of this project.”

As providers learn to re-engineer their workflows, “they are not going to stop doing this,” says Peele. “Penalties are the impetus to kick off this change but they will become unimportant over time. Hospitals are not going to go backward.”

The Research on Readmissions

The issue of readmissions is “definitely at the forefront of many physicians’ and hospital administrators’ minds,” says Jeremiah Brown, PhD, assistant professor of health policy and clinical practice at The Dartmouth Institute. “It is a growing area of research.”

Brown was lead author on one study that found that a broader look at a given provider organization’s intensity of care offers insight on potential readmission rates. The study found that some regions and their healthcare providers use hospitals more often than others for care of all types, including readmitting patients within 30 days of their discharge after treatment for a heart attack.

Published in the Journal of the American Heart Association, the study’s core message is that “when hospitals are looking at how to reduce readmissions, they also need to look at the issue through the lens of how intense is healthcare and how intense their hospital is in admitting patterns,” he says.

Several professional associations are working on predictive modeling to help reduce readmissions. Providers and vendors are looking to better leverage EMR data for risk prediction and to develop automated surveillance systems that use machine learning to automatically flag high-risk patients.

Most of the products on the market now, Brown says, are driven by claims data and “don’t actually perform very well.” The Dartmouth Atlas Project is “trying to find other patterns or signatures within the EMR or within patients’ biology by looking at novel biomarkers to try to identify factors that could improve readmission rates.” The ultimate care model, he says, would incorporate the computing framework side, molecular biology side and health system markers of intensity of care.

Beth Walsh, Editor

Beth earned a bachelor’s degree in journalism and a master’s in health communication. She has worked in hospital, academic and publishing settings over the past 20 years. Beth joined TriMed in 2005 as editor of CMIO and Clinical Innovation + Technology. When not covering all things related to health IT, she spends time with her husband and three children.
