So you've done due diligence, made deliberate purchasing decisions, and – thanks to smart planning and strategic teamwork – your rollout was near-flawless. Congratulations. You're the proud owner of an electronic health record system. But is your staff now using it to its fullest potential?
Too often, providers aren't making the most of their EHRs, according to Ed Simcox, healthcare practice leader for Logicalis Healthcare Solutions, a managed service provider.
"Many hospitals use only a portion of the capabilities available in an (EHR) today, and many applications are not configured in a way that promotes usage of their deep capabilities," said Simcox, in a press statement. "Hospitals and physicians have made significant investments in (EHRs); now it's time to improve and optimize their utilization."
The biggest missteps have to do with clinicians who have the wrong expectations of EHRs and administrators who don't fully leverage the problem-solving capabilities of the systems, he said. Earlier this month, Logicalis put the spotlight on eight areas where providers can do better when it comes to putting their EHRs to work.
Physicians are equating the use of health IT with reduced capacity to see more patients just as the Patient Protection and Affordable Care Act (ACA) is providing coverage for more Americans, a new survey finds.
As the number of health information technologies grew, physicians anticipated being significantly less likely to be able to accept new patients, according to the survey of Michigan primary care physicians, which was published in the American Journal of Managed Care.
In fact, the number of patients physicians could see decreased by 14 percent with each additional HIT system in use, according to the survey.
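Taken at face value, the survey's 14 percent figure compounds quickly. Here is a toy calculation, assuming purely for illustration that the reduction compounds multiplicatively and starting from a hypothetical baseline of 25 patients per day:

```python
# Illustrative only: anticipated patient capacity if each additional
# HIT system reduces capacity by 14 percent, per the Michigan survey.
# The multiplicative compounding and the baseline are assumptions here,
# not claims from the study itself.

BASELINE_PATIENTS_PER_DAY = 25  # hypothetical baseline
REDUCTION_PER_SYSTEM = 0.14

def anticipated_capacity(baseline, n_systems):
    """Capacity after n additional HIT systems, compounding the 14% drop."""
    return baseline * (1 - REDUCTION_PER_SYSTEM) ** n_systems

for n in range(4):
    print(n, round(anticipated_capacity(BASELINE_PATIENTS_PER_DAY, n), 1))
# → 0 25.0 / 1 21.5 / 2 18.5 / 3 15.9
```

Even three systems, on this rough arithmetic, would erase more than a third of a practice's anticipated capacity.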
Electronic health records and electronic access to admitting hospital records in particular were associated with anticipated lower capacity. Other technologies--electronic prescribing, state immunization registry, reminder systems, and Web portals to either request refills or schedule appointments--were not seen as significantly affecting capacity.
Providers with higher HIT use also said they were less likely to accept patients with private insurance, but not with Medicaid or Medicare. That finding surprised the researchers because private insurance generally has higher reimbursement rates, although Medicare and Medicaid might provide a steadier stream of patients, they reasoned.
The survey also found that the inverse association between HIT use and capacity may be more prevalent in small practices than in large ones.
In research published recently in JAMA Internal Medicine, family practice physicians reported that EHRs cost them 48 minutes of free time per clinic day. Close to 90 percent of respondents said that at least one data management function was slower after adoption of EHRs.
Separate surveys report high levels of dissatisfaction associated with EHRs among doctors and nurses. Among the many interoperability issues, a Frost & Sullivan report found that chief information officers are concerned about EHR systems' poor retrieval capabilities. Among the issues with the systems were "rudimentary search functionality" and "poor usability," according to the health IT leaders.
Last week we found out that the ICD-10 transition may not be as expensive for medical practices as previously estimated. How does that happen?
First, consider that the previous estimates are based on the 2013 Nachimson Advisors study of ICD-10 transition costs, which gave a range for small medical practices of $56,639 to $226,105. If that's not enough to scare a physician into vigorous opposition, I don't know what is.
To calm those fears, Thomas C. Kravis, M.D., Susan Belley, M.Ed., RHIA, Donna M. Smith, RHIA, and Richard F. Averill, M.S., counter that study with their own ICD-10 transition cost estimates in the Journal of AHIMA, arguing that the Nachimson study bases its costs on inpatient hospital activities.
The JOA study puts the ICD-10 price tag in a range from $1,960 to $5,900 for small medical practices.
Who are we to believe?
Neither. This is something that medical practice managers need to figure out for themselves.
Let's consider the framework created for the JOA study.

Training

This is going to call for a set of ICD-10 code books and as many as 32 hours of formal training for each staff member, and three hours for a physician. The JOA study caps the cost at $5,900. But I don't think that accounts for the time lost while staff members are in training.
AHIMA earlier estimated training time at:
But there is more than planning for the costs of training sessions and materials. The budget also needs to account for staff or temp workers who cover while your people are in training sessions. This may include outsourcing medical coding while staff coders are in training.
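One way to see why the coverage costs matter is a back-of-the-envelope budget that adds backfill to the direct training costs. All the hours and rates below are hypothetical placeholders, not figures from the JOA or Nachimson studies:

```python
# Back-of-the-envelope ICD-10 training budget for a small practice.
# Every rate and hour count here is a made-up placeholder for
# illustration; plug in your own numbers.

def training_budget(coders, coder_hours, physicians, physician_hours,
                    coder_wage, physician_wage, temp_rate, books_cost):
    """Return (direct training cost, cost of covering staff in training)."""
    direct = (coders * coder_hours * coder_wage
              + physicians * physician_hours * physician_wage
              + books_cost)
    coverage = coders * coder_hours * temp_rate  # temps backfilling coders
    return direct, coverage

direct, coverage = training_budget(
    coders=2, coder_hours=32, physicians=1, physician_hours=3,
    coder_wage=25, physician_wage=150, temp_rate=35, books_cost=400)
print(direct, coverage, direct + coverage)  # → 2450 2240 4690
```

Notice that with these placeholder numbers, the backfill cost is nearly as large as the direct training cost, which is exactly the line item the published estimates tend to leave out.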
Single-physician practices may be able to work around the training schedule with minimal impact to the budget or productivity.

System conversions
The JOA estimates depend on a small physician practice not having to pay anything to upgrade any systems:
Someone needs to start calling vendors. This is a huge part of the perceived costs of ICD-10 transitions. But maybe it's not.
Apparently this is a concern. The JOA study doesn't see the need to invest anything more than the time usually needed to update the superbill.
I'm not sure why this is counted as a zero-cost activity. Medical practices shouldn't have to pay for testing itself, but the time spent preparing ICD-10 claims that result in zero reimbursements is a cost, and there will be more time needed to review results and make changes. That is time not spent on something else.
Which leads to the next cost center.

Productivity

It doesn't look like the JOA study accounts for the cost of the time needed to convert processes. That may not be a problem if the conversion is spread out over a year or so, which makes procrastination more costly.
The JOA study focuses on the productivity losses from taking extra time to document individual encounters for ICD-10 specificity. That and the extra time medical coders will need to assign ICD-10 codes could be mitigated by automation.
This is where economies of scale come into play. A single physician probably doesn't take time to calculate minutes lost. (That would take time away from treating patients, I'm sure.) So time lost in the ICD-10 transition could be "absorbed" without costing anything.
But as healthcare providers grow in size, lost time has a cost. The larger the medical practice or hospital, the more likely it is that someone is calculating such costs.
So maybe these costs are too little to be worth concern in the very small medical practices. Even if the documentation effort has an effect, the JOA study argues that the increased documentation could increase reimbursement revenue previously left unclaimed.
And that's a valid argument today. Except the U.S. healthcare system is concerned with controlling costs, which means paying less for healthcare. Finding previously unclaimed revenue is going to become harder to do.

How much will this cost?

In February, when the Nachimson study was updated, I thought it was a good idea to have higher ICD-10 cost estimates: "Because how many complicated IT projects are done under budget?"
But maybe a small medical practice doesn't have a complicated ICD-10 transition.
(This is a full-text transcription of one slide from my November 5th 42-minute webinar on Ebola and EHR Workflow Engines, Editors, and Visibility. Please excuse occasional “typos” as I’ve not proofed every word. Consider watching the YouTube video or going to the first slide in this transcribed series. My original post announcing the webinar includes motivating context and an outline. Thank you!)
What we’ve connected up here is a series of steps to cost and benefit. If the cost and the benefit change, if the economics of the workflow change, you should be able to change the workflow. Unfortunately, many workflows in health IT and many EHRs are relatively frozen. The ability to change a workflow when contexts change is one of the sterling qualities that workflow technology brings to EHRs and health IT: it makes it easy to adapt the workflow to make users happier and more effective and efficient.
CMS highlights the future role of pediatric EHR use to improve the collection of pediatric quality measures.
In an annual report on care quality for beneficiaries of Medicaid and the Children’s Health Insurance Program (CHIP), the Centers for Medicare & Medicaid Services (CMS) highlights the future role of pediatric EHR use to improve the collection of children’s healthcare quality measures.
“As documented in the 2013 Secretary’s Report on the Quality of Care for Children in Medicaid and CHIP, CMS and states have made considerable progress in building a solid foundation for quality measurement and improvement,” the report states. “Working collaboratively with its many partners including states, health care providers, and program enrollees, CMS is now engaged in a number of efforts to use this information to drive improvements in care.”
Already, EHR technology is demonstrating the ability to help pediatric care providers treat childhood obesity, based on results from four managed care organizations (MCOs).
Modifications to the EHR system to prompt providers to collect body mass index (BMI), calculate BMI percentile, and lead to counseling in cases of concerning BMI values were one of three system-level interventions that "resulted in statistically significant improvements on improving BMI percentile documentation, nutrition counseling, and physical activity counseling."
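The prompting logic described is straightforward to sketch. The helper below is illustrative only; a real implementation would look up the BMI percentile from CDC growth-chart tables, which are omitted here. The 85th and 95th percentile cutoffs, however, are the standard CDC overweight and obesity thresholds for children:

```python
# Sketch of the kind of EHR prompt described above: compute BMI, then
# flag encounters where the BMI percentile warrants counseling.
# The percentile value is assumed to come from a CDC growth-chart
# lookup (age- and sex-specific), which is not implemented here.

def bmi(weight_kg, height_m):
    """Body mass index: weight (kg) divided by height (m) squared."""
    return weight_kg / (height_m ** 2)

def counseling_prompt(bmi_percentile):
    """Return a prompt string for concerning percentiles, else None."""
    if bmi_percentile >= 95:   # CDC threshold for obesity in children
        return ("Obese range: document BMI percentile; "
                "counsel on nutrition and physical activity.")
    if bmi_percentile >= 85:   # CDC threshold for overweight
        return "Overweight range: counsel on nutrition and physical activity."
    return None  # no prompt fires

print(round(bmi(40.0, 1.45), 1))  # → 19.0
print(counseling_prompt(92))
```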
To improve children’s access to high-quality, medically necessary care, CMS relies heavily on the Child Core Set, healthcare quality measures specific to pediatrics. According to the annual report, the federal agency has made significant progress toward improving the reporting of consistent data while reducing the burden on states and their providers, with at least 25 states voluntarily submitting data to CMS on 16 Child Core Set measures.
For the last federal fiscal year, CMS set the following goals for quality measurement and improvement:
To improve the Child Core Set, CMS is actively engaging other federal agencies such as the Agency for Healthcare Research and Quality (AHRQ). As far as EHR technology is concerned, the federal agency is collaborating with the Office of the National Coordinator for Health Information Technology “to develop pediatric measures in areas that address the gaps in the Child Core Set and that can be collected” by certified EHR technology (CEHRT).
Additionally, with funding from the Children’s Health Insurance Program Reauthorization Act of 2009 (CHIPRA), CMS is using multistate Quality Demonstration Grants to determine the effectiveness of provider-based models of care, pediatric EHR use, and integrating physical and behavioral health services to improve access to care for Medicaid and CHIP beneficiaries.
Poorly designed EHR alarms may be pushing physician dissatisfaction to the breaking point.
Too many EHR alarms and alerts may cause frustrated physicians to quit their jobs and seek employment elsewhere, especially if their interactions with the EHR system were monitored. Physicians have long complained that the overabundance of pop-ups, notifications, check-boxes, and emails is highly distracting to patient care and produces sensory overload that may increase negative perceptions of health IT while severely reducing job satisfaction. A new study in the American Journal of Managed Care adds evidence to the notion that poorly designed EHRs may be contributing to early retirements and higher turnover among fed-up physicians.
A research team asked 2,590 providers, including physicians and non-physician clinicians, within the Department of Veterans Affairs system about their perceptions of EHR alerts, their job satisfaction, and the likelihood of quitting their positions. The researchers found that providers who perceived EHRs and health IT as a positive development in healthcare were more likely to be satisfied with their work and less likely to quit. Providers at facilities with higher turnover rates had lower perceptions of the value of EHR alerts, and were more likely to view EHR notifications as yet another demand on their time instead of a benefit for care coordination. Providers reacted particularly negatively to being monitored and given feedback on their interactions with the system, the study found.
“[EHR alerts] likely represent one of the most frustrating components of EHRs for providers,” the study says. “Compared with paper communication systems, they are perceived to ‘increase the number of work items, inflate the time to process each, and divert work previously done by office staff to them.’ Other work has shown that providers perceive many of the alerts they receive to be unnecessary, and has documented variable physician acceptance of features like computerized reminders and electronic alerts.”
In 2013, a research letter in JAMA Internal Medicine noted that 87% of physicians believe the number and frequency of EHR alerts is “excessive” while using the same VA technology as the more recent study. Physicians were likely to gloss over important information due to poorly designed alerts that don’t distinguish between routine and critical data, and may put their patients at risk for adverse events when important information is missed. An incorrect paper-based workflow that bypassed electronic drug interaction alerts was blamed for three deaths at the Memphis VA Medical Center last year.
“EHR alert systems, and by extension EHRs, could become catalysts for turnover, unless providers clearly understand their value to delivering high-quality care effectively and efficiently,” the study concludes. “As evidenced by the non-significant relationships between monitoring/feedback and provider satisfaction, as well as the non-significant relationship between training and both satisfaction and intention to quit, our data suggest that the aforementioned facilitating conditions may be insufficient to accomplish this goal, though we have no specific details in our data about the quality of the feedback or training.”
“More importantly, when providers do not perceive the value of these electronic aids to their practice, they might become dissatisfied with their work environment, and potentially seek work elsewhere altogether.”
At the AMIA 2014 Annual Symposium, talk about Stage 3 of meaningful use reveals tensions among members about what the Office of the National Coordinator for Health IT should do.
One Nov. 17 session detailed the Institute of Medicine’s new recommendations for a panel of 12 social and behavioral measures that should be required in Stage 3, and the audience comments were largely enthusiastic. (More detail below.)
Yet less than an hour earlier, AMIA’s EHR 2020 Task Force, charged with reflecting on the current status of EHRs and needed future directions, said its early recommendation is for ONC to stop where it is. Instead of adding more requirements, ONC should give providers time to stabilize and focus on how to be more innovative in meeting Stage 2 functional measures, said Michael Zaroukian, M.D., chief medical information officer of Sparrow Health System in Lansing, Mich., and one of the task force members. “We have enough EHR functionality in place now to advance quality and value,” he said. But Zaroukian also detailed some of the problems with Stage 2. He said 500 out of 1,000 eligible providers in his community would have to drop out of the MU program if Stage 2 reporting requirements for the full year in 2015 don’t change. “There is a Stage 2 dropout crisis looming,” he said. Only 17 percent of eligible hospitals and 2 percent of EPs have attested, he noted.
The task force’s early recommendations are that ONC should focus on relieving data entry burdens for clinicians, advance interoperability, ease quality reporting burdens, simplify meaningful use, and promote safety and quality.
During the Q&A discussion, some providers suggested that AMIA promote the JASON Task Force recommendations to have Stage 3 focus tightly on interoperability and public APIs. Others said that because the FHIR standard is still in draft form, it is not far enough along to include in Stage 3, and the best ONC can do is hint that open APIs and FHIR are the direction it will move in later EHR certifications. Still others questioned whether the marketplace is really ready to innovate with the current systems in place if regulations from CMS are relaxed.
The IOM report, issued a week ago, recommends that 12 measures be included by the ONC and CMS in certification of EHRs and meaningful use objectives, and that the data be recorded for every patient. The first four are already widely collected: alcohol use, race/ethnicity, residential address, and tobacco use. The IOM calls for adding eight more to the panel:
• Educational attainment
• Financial resource strain
• Stress
• Depression
• Physical activity
• Social connection/isolation
• Intimate partner violence
• Neighborhood median household income
“This is not a set of independent measures. They are complementary and make up a true panel,” said George Hripcsak, M.D., assistant professor in the Department of Medical Informatics at Columbia University and one of the IOM committee members. Hripcsak said the committee recognized the importance of minimizing the burden to providers. “The technology is only part of getting it implemented,” he said. There are workflow and patient engagement impacts. We need to decide where and how to collect and review the data, he said. HIEs may be used to share the data, even though privacy also is critical, and the data can be de-identified for sharing with public health.
The committee noted that having this data could lead to more effective population health management for health systems and agencies.
So the ONC will have to balance the desires of those who want to add more elements, such as patient-generated data, now that EHRs are more ubiquitous against the loud calls for a halt to any new regulations.
A group of 15 organizations--including the College of Healthcare Information Management Executives, the American Health Information Management Association, the Healthcare Financial Management Association and America's Health Insurance Plans--is urging congressional leaders to ensure that no future delays to ICD-10 implementation take place.
In a recent letter to House Speaker John Boehner, Minority Leader Nancy Pelosi, and Senate leaders Harry Reid and Mitch McConnell, the organizations--which call themselves the Coalition for ICD-10--say that prior delays have been "disruptive and costly," as well as an impediment to innovations in care delivery and payment reform.
"Nearly three-quarters of the hospitals and health systems surveyed just before the current delay were confident in their ability to successfully implement ICD-10," the letter reads. "Retraining personnel and reconfiguring systems multiple times in anticipation of the implementation of ICD-10 is unnecessarily driving up the cost of healthcare."
The letter comes on the heels of a recently published AHIMA survey that revealed that while larger organizations, for the most part, are ready for the transition, smaller organizations anticipate difficulties. At the Workgroup for Electronic Data Interchange's annual fall conference last month, American Medical Association President-Elect Steven Stack said his organization's policy is to "kill" the transition entirely.
In a recent ICD10Watch post, author Carl Natale says there's always the possibility another ICD-10 provision gets slipped into future attempts to kill the sustainable growth rate and its 24 percent cut to Medicare payments to physicians. The real impetus behind physician opposition, he says, is cost.
"If Congress finds a way to fund implementation costs for small medical practices and independent physicians, we're going to be using ICD-10 codes this year," Natale says. "[I]f someone crafts an ICD-10 stimulus act that finds some real money to help the small healthcare providers, they will need to persuade Congress to take the money from somewhere else associated with weak lobbyists."
The eHealth Initiative, in its recently unveiled roadmap for transforming health IT, calls compliance with ICD-10 by next October mandatory in order to advance health IT efforts.
A recent analysis published in the Journal of AHIMA concludes that ICD-10 costs may not be as burdensome as previously thought for small physician practices. However, Stanley Nachimson, founder and principal of Nachimson Advisors--which published a report on behalf of the AMA last February that came to the opposite conclusion--calls the new analysis "misleading," in a new post to ICD10 Monitor.
"The AHIMA article omits and/or minimizes several critical tasks for practices, misstates the sources of the Nachimson Advisors estimates used for productivity losses and minimizes the efforts that practices must undertake to ensure a successful implementation," he says.
Quality improvement will rely on health IT interoperability and data standards, says a new roadmap from the ONC.
Interoperability of current and future health IT systems is at the heart of creating a healthcare system that delivers consistently high quality, safe, and effective care, says the Office of the National Coordinator in a paper entitled “Health IT Enabled Quality Improvement: A Vision to Achieve Better Health and Health Care.” The paper outlines a ten-year plan towards system-wide quality improvement (QI) that relies on an infrastructure of clinical decision support, big data analytics, and clinical quality measures to accurately assess performance and provide real-time, data-driven insights to providers at the point of care.
“Quality measurement has a long tradition in health care,” write Jacob Reider, MD, Deputy National Coordinator, and Capt. Alicia Morton, DNP, RN-BC, Director of the Health IT Certification Program, in a blog post for Health IT Buzz. “But measurement is only one part of an improvement program. We don’t improve drivers’ alignment with the speed limit (and therefore the safety of drivers and pedestrians) by giving tickets, or by mailing notices days, weeks or months later. Rather, real-time feedback loops work to improve the quality of the driving as the driving is being done! New technologies being applied to driving include notifications or even corrective action when the driver drifts out of the lane, comes too close to another car, or encounters an unanticipated object while driving in reverse.”
The QI plan establishes benchmarks for three, six, and ten years from now, charting the journey from establishing data standards to harnessing big data to enabling highly personalized care. With the majority of healthcare providers identifying as EHR users, the ONC is shifting its attention to establishing, developing, and promoting the use of data standards as a way to ensure health information can be exchanged along the care continuum in a meaningful way.
As data standards become more prominent, healthcare providers will be able to capture data once and reuse it in multiple ways, which will help them to meet a number of different quality measurement and improvement objectives. This will enable the next phase of the quality improvement ecosystem.
“Over the next six years, the standards and technology building blocks will enable quality improvement data sharing and, thus, ‘big data,’” the report says. “Quality and safety metrics will refocus from provider-centric to patient-centric. The data in health IT, including patient-generated and claims data, will be standardized, linked at the individual level to clinical data as appropriate, and optimized for interoperable sharing and aggregation.”
By ten years from now, big data will allow healthcare to transition to a preventative, predictive industry that relies on real-time information to provide personalized care to patients. “Data accessibility will enable knowledge sharing, shortening the QI life cycle and ending the knowledge latency that has been so globally pervasive,” the paper predicts.
“Providers and stakeholders will rely upon interactive dashboards with ‘live’ data measuring performance on standard indicators and ad hoc focus across settings of care and geographic locations will be considered standard practice. The learning health system will be in full operation, while rapid advancements in research, leveraging widespread data, will lead to powerful conclusions and almost immediate translation to health, health care practice, populations and the community.”
In order to achieve these long-term goals, providers must start by adopting universal technical standards while fostering a business culture that supports the creation and use of quality improvement tools and the exchange of health information that makes those tools truly effective. Establishing strong measures of privacy and security, as well as adhering to data governance principles, will help to ensure the continued development of a reformed healthcare system that drives quality improvement for better patient care.
“ONC cannot achieve the health IT enabled quality improvement vision alone,” Reider and Morton conclude. “We will work collaboratively and transparently with stakeholders to develop shared national goals for three, six and ten-year timeframes that will serve as milestones on our shared journey to achieve this vision as an integral part of the nation’s future.”
Vendors continue to learn valuable lessons from one global region to another as their customers have varying degrees of success with their EMR products. This and other findings are included in the latest KLAS report on global EMR performance.
“Providers in certain geographical areas want more support or vendor involvement than in other places across the globe,” said report author Chris Brown. “That seems to be one of the key drivers of why provider satisfaction varies.”
KLAS spoke to providers from around the world to find out how their EMRs are performing. Interviews were based on the core areas of sales and contracting, implementation and training, functionality and upgrades, and service and support. The vendors included in the report are Allscripts, Cerner, ChipSoft, CSC, Epic, InterSystems, MEDITECH, Philips and Siemens.
Protecting the integrity of the EHR Incentive Programs will continue to be a top management challenge for the Department of Health & Human Services (HHS) as the EHR adoption rate increases and meaningful use rolls on, according to an Office of Inspector General (OIG) review of the federal agency’s work in the previous year.
This year’s OIG annual reveal of the most significant management and performance challenges facing HHS comprises ten items, and embedded within the eighth — effectively using data and technology to protect program integrity — is the need to safeguard EHRs against inappropriate access and fraud.
“With the enactment of the Recovery Act and the HITECH Act, the Department has played a leading role in the nationwide adoption of EHRs and other health IT,” the OIG writes. “These innovations offer opportunities for improved patient care and more efficient practice management. However, as the volume of electronically-stored medical information grows, protecting the privacy, security, and integrity of EHRs has become more critical.”
Although the privacy and security of sensitive health information is the primary focus of the OIG’s analysis of this particular management challenge, the division within HHS also maintains that increased EHR adoption multiplies opportunities for committing healthcare fraud.
According to the OIG, the onus is on the Centers for Medicare & Medicaid Services (CMS), which is responsible for establishing meaningful use requirements, and the Office of the National Coordinator for Health Information Technology (ONC), which is responsible for developing EHR certification criteria, to see that the EHR Incentive Programs safeguard health data security, privacy, and integrity.
OIG is recommending that HHS and its departments maintain adequate oversight of these three areas:
As Inland Northwest Health Services Senior Director of Clinical Applications Mary Cheadle, RN, explained last month, eligible hospitals and professionals are performing differently in meaningful use audits — with the latter coming out on the losing end at a higher rate. Whereas hospitals have had a low failure rate of 4.9 percent for 613 completed audits, professionals have failed in 21.9 percent of approximately 8,000 completed audits.
It seems like every few days we get a message in the in-basket of our electronic health record (EHR) about a new type of message that we will be receiving in our in-basket.
They call these messages “system notices.”
OK, maybe that’s an exaggeration, maybe not every few days, but the different types of in-basket messages and all the information we are bombarded with are getting out of control. As users of electronic health records know, the in-basket has become both a lifesaver and the bane of our existence, where the continuous influx of work piles up and up throughout the day, a tide we continuously swim against and never seem to get ahead of.
Sometimes the messages are critically important; sometimes they are just more stuff.
Looking at my in-basket now I see the multitude of message types — folders piled full of information for me to look at.
CC’ed charts, appointment notifications, co-sign clinic orders, open charts, outside messages, overdue results, clinical letters, patient calls, patient advice requests, patient reports, patient prescription requests, patient unread message alerts, referral requests, results, prescription requests, staff messages, system notices.
I’m out of breath just reading through all of those, let alone doing the work that lies within.
The information presented to us through the electronic health record does serve an important purpose, helping us improve the quality of the care our patients receive, improving their access to their providers, improving communication between providers, and helping improve transitions from one setting to another for our patients. All noble goals, and essential pillars of the patient-centered medical home.
The newest of these to come out was recently announced via a “system notice” message. The new message type was “outside messages.”
Apparently we now get one of these when any outside system sends a report to our EHR about a patient whom we’ve even remotely been involved in the care of.
The first one of these I got was a hospital discharge file on a patient I didn’t recognize. It included multiple categories of data: patient demographics, a description of the hospital course, the patient’s vital signs throughout their stay, medications on admission and discharge, problem list, diagnoses, allergies, results, procedures, and more.
I am sure this was not a patient I knew, and at first I could not figure out why I was receiving this message.
Turns out I had been a supervising physician on an encounter the patient had at our practice several weeks earlier, and when they initially set up this system they decided to alert all the members of the “care team” about this information — including anyone who had touched the chart in the past 6 months.
It turns out this message could not be forwarded from within the EHR, so I contacted the programmers who help us maintain our EHR and told them that I shouldn’t have been receiving this message, and that I wanted to make sure that the right person got it.
Luckily, within a few weeks, an option to forward these messages appeared.
Interestingly, we also receive this information as a hospital discharge summary when our patient goes home, so it felt sort of redundant, like we were getting this information as a second huge packet, but ultimately it seems like the logic of passing this information on in this format made some sense.
This morning I got an outside message from a pharmacy sending me an update that my patient received an immunization at their facility the day before.
Wonderful, now I know my patient had his flu shot, and I have actual documentation that it has been given.
Unfortunately, this information did not flow into his immunization record in the EHR, so I have to now manually transcribe it there.
Time for another message to the programmers.
All of these message types, all of these folders in our in-basket, are ultimately part of helping us take better care of our patients: seeing what our fellow providers and consultants are doing, seeing the lab work as it happens, and allowing our patients to reach us and communicate with us.
We do need to help continually refine the system, to prevent redundancy, to improve the signal-to-noise ratio, so that messages go to the right people at the right time, so that information flows to the right places, so today’s work gets done today.
Front-line clinicians need to be able to tell the keepers of the code what really makes sense in an EHR, what gets in the way of care and what enhances it. We as caregivers need to know that the EHR cannot do everything we want it to, and the programmers need to know what does not work out in the “real world.”
Just got messages from the system about an updated billing summary within the EHR and one on Ebola preparedness.
The excitement never ends.
Automated reminders generated by EHR data within the Kaiser Permanente healthcare system helped to raise medication adherence rates in just twelve months, according to a new study published in the American Journal of Managed Care. The study of more than 21,700 patients produced a 2 percent bump in medication adherence for diabetes and heart disease drugs, said Bill Vollmer, PhD, lead author and senior investigator at the Kaiser Permanente Center for Health Research.
The study, funded by a grant from the Agency for Healthcare Research and Quality (AHRQ), mined EHR data to identify patients in need of refills on prescriptions including statins, ACE inhibitors or angiotensin receptor blockers. Patients were segmented into three groups: those receiving usual care, those who were given automated telephone calls reminding them to refill their medications, and those who were provided with enhanced reminders, including letters and live calls.
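The mining step described above can be sketched as a simple gap check against dispensing records. The record layout and grace period below are illustrative assumptions, not the study’s actual logic:

```python
from datetime import date, timedelta

# Hypothetical fill records as (patient_id, last_fill_date, days_supply),
# as they might be pulled from an EHR's pharmacy dispensing table.
fills = [
    ("pt-001", date(2014, 9, 1), 30),
    ("pt-002", date(2014, 10, 20), 90),
]

def patients_due_for_refill(fills, today, grace_days=7):
    """Flag patients whose last fill ran out more than grace_days ago."""
    due = []
    for patient_id, fill_date, days_supply in fills:
        runs_out = fill_date + timedelta(days=days_supply)
        if today > runs_out + timedelta(days=grace_days):
            due.append(patient_id)
    return due

print(patients_due_for_refill(fills, today=date(2014, 11, 15)))  # → ['pt-001']
```

Patients flagged this way would then be routed to the automated call queue or, in the enhanced arm, to letters and live outreach.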
“We were trying to see if we could improve adherence to medications that have a proven efficacy for secondary disease prevention for people who have diabetes and existing cardiovascular disease,” Vollmer told the Portland Business Journal. “Most people take the medications, but they don’t take them every day as they should. They take them roughly every other day. We are trying to do what we can to improve that.”
Medication non-adherence is an extremely costly problem for the healthcare industry. By some estimates, patients incur $290 billion per year in avoidable costs by not taking their prescriptions appropriately, and it’s not always easy to find effective interventions that stick with patients over the long term. While the Kaiser Permanente study only ran from 2010 to 2011, Vollmer found that a two- to three-minute automated phone call, which also connected patients to a refill hotline or a conversation with a pharmacist, helped raise adherence rates by 1.6 to 3.7 percent.
“This small jump might not mean a lot to an individual patient, but on a population level it could translate into fewer heart attacks, fewer deaths and fewer hospitalizations, which will ultimately have an important impact on public health,” Vollmer said.
Patients in the enhanced intervention group, who received reminder letters and calls from a human care coordinator if they failed to respond to other methods of contact, were able to achieve significant reductions in their cholesterol levels, as well. Patients who started the program with “uncontrolled” cholesterol levels of greater than 100 mg/dl were able to reduce their levels by an average of 3.6 mg/dl more than patients who only received usual care.
The study is similar to another Kaiser Permanente effort at using EHR data to improve chronic disease management and medication adherence that was conducted in conjunction with National Jewish Health and Eliza Corporation. In that project, the parents of pediatric asthma patients received automated telephone refill reminders ten days before their prescriptions were slated to run out, ensuring a continuous supply of medication for their children. Over the two-year pilot, adherence rates increased by 25 percent; a similar automated refill call line helped make the process simple for busy parents.
With the growing usage of EHRs, more and more doctors are bringing their computers and tablets with them into the exam room. But just because you’re using a computer in the exam room, it doesn’t mean that you’re using it properly. Computers can be one of the most beneficial tools you use in an exam room, or they can lead to deteriorating patient engagement. Make sure you don’t make any of these four mistakes when you use your computer in the exam room.
Focusing solely on the screen. Making eye contact with your patients is essential. It conveys that you’re interested in the conversation and are paying attention, building trust. It can be easy to focus on what you’re entering into your EHR without realizing that you’re avoiding eye contact with your patient. Be cognizant of when you choose to type and when you are actively engaging with your patient.
Here’s a quick look at the psychology of eye contact: “When a person looks directly into your eyes when having a conversation, it indicates that they are interested and paying attention. On the other hand, breaking eye contact and frequently looking away may indicate that the person is distracted, uncomfortable, or trying to conceal his or her real feelings.”
Assuming you heard the patient correctly. When people are excited or anxious, they can have a tendency to accelerate their rate of speech. It can be difficult to capture everything in your EHR in these situations. Repeat back to the patient what you entered. This does two things: It lets the patient know that you are actively listening and it also verifies that the information entered is correct.
Ignoring your body language. Good body language sends a signal that you are actively listening to your patient. If you are facing your computer and not your patient it can convey that you are paying more attention to it than you are to them. Consider angling your body towards your patient as you take notes on your computer.
Blocking your line of sight. If you’re using a large desktop computer in the exam room, it can act as a physical barrier between you and your patient. Additionally, a computer that is stationary in the corner of the exam room can make eye contact with your patient difficult. Consider using a small laptop or tablet together with a mobile workstation. This prevents your computer from blocking your field of vision and allows you to move to face your patient no matter where they are sitting.
Your computer is a tool just like any other in your exam room: There is a right and a wrong way to use it. However, since many of us have been using computers for so long, we have established habits in how we hold them, view them and use them. That’s why self-awareness is so important when you introduce a computer to the exam room, to make sure that you aren’t accidentally making one of the four mistakes above. Used properly, your computer can not only make your practice more efficient but also increase the engagement you have with your patient during an exam.
If you’re looking for additional ways to improve your clinical environment and improve patient engagement, check out this quick list of 7 simple exam room hacks that can be implemented today.
Despite the millions they've spent on digitizing medical records, many hospitals would be better off just scrapping their systems, according to an article at Hospitals & Health Networks.
"Poorly designed and poorly implemented information systems are worse than useless, worse than a waste of those millions and billions of dollars," writes Joe Flower, a healthcare futurist and CEO of The Change Project Inc., and its healthcare education arm, Imagine What If.
"As we go through rapid, serious changes in health care, poor information systems will strangle your every strategy, hobble your clinicians, kill patients and actually threaten the viability of your organization," Flower says.
Technology that promised to be a fast track to efficiency and effectiveness has instead been a drain, consuming time and money and seriously eroding one of the most important management tools: trust, he says.
He suggests organizations evaluate their systems by asking themselves some pointed questions.
He says interoperability, one of the biggest challenges to healthcare data-sharing, is a con--one perpetrated by vendors solely focused on market share.
"Imagine what the financial world would look like if their IT vendors had convinced each bank and brokerage to build software that would not talk to anybody else's. Interconnectivity is normal. The reason it's not normal in health care is that some or most of the vendors don't want it to be normal," he says.
Flower suggests it's time to stop digging ourselves into even deeper holes, toss technology that doesn't work and just start over.
Dissatisfaction with EHR systems is widespread among doctors and nurses. In a recent Black Book survey, 98 percent of 13,650 registered nurses polled said their institution never asked nurses for their input on the system design. At the same time, 79 percent ranked the reputation of the information system among the top three reasons they would want to work at--or avoid--a particular institution.
Family practice physicians reported that EHRs meant 48 more minutes of work a day, according to a study published in JAMA Internal Medicine.
A report from HHS' Office of Inspector General outlines the top challenges faced by the Department of Health and Human Services in FY 2014. Among them: meaningful use and interoperability. The office also highlighted several areas HHS continues to struggle with heading into 2015, including electronic health records.

First, HHS oversight of the EHR Incentive Programs has been significantly lacking and ultimately "vulnerable to inappropriate payments to participants that do not meet program requirements." The Centers for Medicare & Medicaid Services, for instance, has paid out more than $25.4 billion in incentive payments to eligible hospitals and providers that have demonstrated meaningful use, but has failed to implement adequate controls ensuring that those participants were actually entitled to the federal money.

OIG officials cited the case of the Louisiana Department of Health and Human Services, which just this September was found to have wrongly claimed $3.1 million in EHR incentive payments. OIG had examined the state's payouts in 2011 and subsequently found that LDHHS overpaid 13 hospitals $3.1 million and underpaid six hospitals $1.3 million. Overall, some 80 percent of the Louisiana hospitals analyzed in the audit failed to comply with federal regulations or guidance.

What's more, as the report outlined, CMS hasn't done enough prepayment audits, relying predominantly on post-payment audits of "high-risk participants," an approach that has proved insufficient in preventing fraud. In a report earlier this year, OIG called out CMS for its shortcomings in identifying and investigating EHR fraud. These deficiencies, the office pointed out, helped contribute to the estimated $75 billion to $250 billion in healthcare fraud.
In that report, OIG highlighted two of the most common EHR documentation practices used to commit fraud: copy and paste, by which a healthcare provider copies and pastes information from a patient's record multiple times, often failing to update the data or ensure accuracy; and over-documentation, which involves adding false or "irrelevant documentation to create the appearance of support for billing higher level services." What's more, many of CMS' audit contractors were unable to determine whether a provider had even used copy and paste.

Beyond the oversight and control issues HHS needs to address with meaningful use, OIG officials also pointed to serious challenges with interoperability. "The department must do more to ensure that systems are interoperable in order to realize these goals," they wrote in the report. This includes technical assistance, guidance and adopting policies that facilitate interoperability. The Office of the National Coordinator for Health IT has made some progress here, establishing its 10-year interoperability roadmap. Still, many stakeholders say it lacks teeth.

"Where are the teeth with interoperability?" asked Marc Probst, CIO of Intermountain Healthcare, at a press briefing Sept. 16 on Capitol Hill. "With meaningful use, we had teeth. We had something we could get out there. We had benefits, incentives, and we had penalties."
Probst, a member of the Health IT Policy Committee, has been one of the most outspoken voices on the topic of interoperability advancement. "It does all come down to these fundamental standards," he added. "We've got to sit down and say, 'What's the standard, and how are we gonna move it?'"
ReportsnReports.com has added new electronic medical record (EMR) market research reports to its catalog, including a report on the European EMR market. The report defines and segments the market by component, deployment, application, end-user, and country, with the U.K. among the countries contributing to demand in the region. It cites factors such as the implementation of EMR systems and increasing mergers, and describes the EMR as an innovative tool that assists in gathering clinical information.
Many of you probably remember that we helped promote an Epic Salary Survey. As promised, they’ve published the results of the survey and we thought that many readers would be interested in the Epic Salary survey results.
The survey had 753 responses. Not bad for an online survey that was promoted across various blogs and social media outlets. Although, as you can imagine, some states are better represented than others. It’s the challenge of having 50 states.
This is my favorite chart from the Epic salary survey results (you can download the full survey results and data by states here):
As I look at some of these salaries, I’m reminded of the doctor who said that they shouldn’t be spending time learning their EHR. The hospital CFO then told the doctor, “I’m sorry, but that Epic consultant costs a lot more than you.”
Now I’d like to see one from Meditech and Cerner.
In response to an article in the Journal of AHIMA questioning its estimates for ICD-10 implementation costs, Nachimson Advisors has issued a rebuttal with the backing of the American Medical Association.
“The new data suggests that the estimated costs, time and resources required by physician offices are dramatically lower than initially estimated as a result of readily available free and low cost solutions offered by coding, education and software vendors,” Kravis et al. wrote earlier this month.
According to Nachimson Advisors, this recent article “contains misstatements, shows a lack of understanding of the process for ICD-10 implementation, and directly contradicts ICD-10 implementation guidance from AHIMA and the Centers for Medicare & Medicaid Services (CMS), the enforcer of the ICD-10 regulatory requirements.”
Atop its list of repudiations, Nachimson Advisors faults Kravis et al. for omitting several ICD-10 implementation steps from their estimates.
Interestingly enough, Nachimson Advisors has found AHIMA resources that contradict the Journal of AHIMA article.
Second, Nachimson Advisors refutes the claim that its figures for productivity reductions were based solely on inpatient hospital experience.
The most extensive rebuttal centers on Kravis et al.’s assumption of zero cost for technology. The claim that software upgrades would come at no cost to physician practices does not account for survey findings by organizations such as the Medical Group Management Association (MGMA), argues Nachimson Advisors. “So, while the costs could be zero in some physician offices, there will be costs in those offices that need to upgrade either their EHR or practice management system (PMS),” they add.
Nachimson Advisors has also attempted to clarify the connection between meaningful use and ICD-10, which its findings linked together as part of estimated ICD-10 implementation costs and which Kravis et al. attempted to keep separate.
Lastly, Nachimson Advisors finds the claims of Kravis et al. concerning ICD-10 testing to be the “most misleading” component of the latter’s analysis.
As Nachimson Advisors notes, AHIMA’s own recommendation to providers is to test early and frequently for both the technical and non-technical aspects of ICD-10 implementation.
Despite many EHR vendors’ best efforts to tell you otherwise, an EHR requires every organization to reconsider its workflow. Sure, many systems can be customized to match your unique clinical needs, but the reality is that implementing an EHR requires change. All of us resist change to different degrees, but I have yet to see an EHR implementation that didn’t require it.
What many people don’t like to admit is that sometimes change can be great. As humans, we seem to focus too much on the down side to change and have a hard time recognizing when things are better too. A change in workflow in your office thanks to an EHR might be the best thing that can happen to you and your organization.
One problem I’ve seen with many EHR rollouts is that organizations do a one-off implementation and then stop there. While the implementation is an important one-time event, a quality EHR implementation requires you to reconsider your workflow and how you use your EHR on an ongoing basis. Sometimes this means implementing new features that came through an upgrade. Other times, your organization is just in a new place where it’s ready to accept a change that it wasn’t ready to accept before. This ongoing evaluation of your current EHR processes and workflow gives your organization a chance to see what it can do better. We’re all so busy that it’s amazing how valuable sitting down and talking about improvement can be.
I recently was talking with someone who’d been the EHR expert for her organization. However, her organization had just decided to switch EHR software vendors. Before the switch, she was regularly visited by her colleagues to ask her questions about the EHR software. With the new EHR, she wasn’t getting those calls anymore (might say something good about the new EHR or bad about the old EHR). She then confided in me that she was a little concerned about what this would mean for her career. She’d kind of moved up in the organization on the back of her EHR expertise and now she was afraid she wouldn’t be needed in that capacity.
While this was a somewhat unique position, I assured her that there would still be plenty of need for her, but that she’d have to approach it in a slightly different manner. Instead of being the EHR configuration guru, she should become the EHR optimization guru. This would mean that instead of fighting fires, her new task would be to understand the various EHR updates that came out and then communicate how those updates would impact the organization.
Last night I had dinner with an EHR vendor who told me that they thought that users generally only used about 50% of the features of their EHR. That other 50% of EHR features presents an opportunity for every organization to get more value out of their EHR software. Whether you tap into these and newly added EHR features through regular EHR workflow assessments, an in house EHR expert who’s constantly evaluating things, or hiring an outside EHR consultant, every organization needs to find a way to regularly evaluate and optimize their EHR workflow.
I recently shared my thoughts on the physician perspective of EHR optimization strategies and physician EHR use. Now I have decided to expand the conversation by getting the take of two individuals, both within large health systems: Philip Baney, MD, from Reading Health System and Trista Eidmann, Clinic Administrator with UnityPoint Clinic.
We discussed how the EHR adoption has affected their physicians’ productivity, the types of optimization activities they have conducted, how the EHR has changed physicians seeing patients, and what they have done beyond an application optimization to improve the quality of the patient visit.
They brought varying perspectives, as Reading Health System has been on their EHR for one and a half years while UnityPoint Clinic has been on their current system for over 10 years. UnityPoint is in the process of switching over to a new application and is in the process of implementing their new system across their multi-state organization.
Trading eye contact for clicks
When asked about the top complaints in their organization regarding physician productivity in the EHR, the concerns were not surprisingly similar. Dr. Baney indicated that there are “more clicks, more busy work” and Ms. Eidmann responded that with more being handed down to them, they are “documenting more and more” and reimbursement isn’t any higher. Patients do not understand why the physicians’ heads are in the chart.
Years after organizations have implemented their EHRs, I was not surprised that this continues to be a concern. Dr. Baney had a creative way to deal with it when he first started on the EHR: he kept a ruler in the exam room and told patients that when they were tired of looking at his ear, they were allowed to use it to move his head so that he was looking at them. Most physicians have found that balancing interaction with the patient against time spent on the workstation in the exam room takes some effort. Dr. Baney added that he tries “to do as little on the computer when I am in the room with the patient and incorporate them in the process.”
Additional complaints within the organizations are that the providers are not getting their charts done, managing their inbox, or getting the bang for their buck. With all the data going into the system, they are not able to easily extract that information without additional steps being added.
But you’ve optimized, right?
When I asked what types of optimization activities their organizations have conducted to date, Ms. Eidmann responded that their efforts these days are to identify gaps in their current system and prepare to move over to the new system. Dr. Baney indicated that “there have been some attempts to identify things, but unfortunately many of those optimizations seem to be more clicks for the providers” and that he felt there was much more that could be done. This confirms what many organizations have already realized: optimization and utilization of the EHR will be an ongoing process. This is especially true as pay-for-value reimbursement is on the horizon.
Healthcare providers do not want to worry about where or how they put the data in the system — their focus is getting it in there as efficiently as possible. As developers and builders of the application, the focus should be on making the data entry easier for the end-user with the ability to easily extract that data out of the system.
Going beyond optimization
I next posed the question I discussed in my first article: What can be done beyond application optimization? Dr. Baney shared an anecdote about how his group realized that around the time school let out, their system would begin to run slowly. They discovered that their internet service provider hadn’t given the practice any higher priority, and all the school-age kids coming home from school would suck up the bandwidth. What appeared to be an application issue was really a task for the internet provider. Another crucial recommendation was to maintain a triply redundant system. A final recommendation Dr. Baney made was training and evaluation — using pulse reports to identify strong providers who could mentor providers who are struggling. All of these are great recommendations for organizations that have not looked outside their application in their optimization efforts.
We must take the perspective of optimizing our systems so that end-users can work more efficiently while ensuring we can extract the data needed and also maintain a standard of quality patient visits. It will take time and creativity, but we have the tools needed to add value and increase enthusiasm to use an EHR. I thank Dr. Baney and Ms. Eidmann for their valuable time and input.
Physicians are more likely to prescribe cheaper generic drugs when their EHRs are set to choose the less expensive option by default, finds a study from the University of Pennsylvania and the Philadelphia VA Medical Center. As healthcare providers attempt to trim spending wherever possible, this simple programming technique may be an easy way to nudge clinicians toward more cost-effective choices without negatively impacting their workflow.
“Prescribing brand-name medications that have a generic equivalent is a prime example of unnecessary health care spending because in most cases, generic medications are less expensive, similar in quality and may actually lead to better outcomes than brand names because of higher rates of patient adherence to generics,” said lead study author Mitesh S. Patel, MD, MBA, MS, assistant professor of Medicine and Health Care Management at Penn who is a graduate of the RWJF Clinical Scholars Program. “The results of this study demonstrate that leveraging default options can be a very effective way to change behavior.”
The researchers looked at four clinics, including two family medicine and two internal medicine providers, in the University of Pennsylvania Health System between 2011 and 2012, examining physician prescribing habits for three common classes of drugs: beta-blockers, statins, and proton-pump inhibitors. During the study, the family physicians were given a choice of brand name and generic medications when prescribing through their EHR prescription portal, but the internal medicine practitioners were given the generic option more prominently with the ability to opt out and choose a different drug.
Providers given the generic drug as the default option increased their generic prescription rates by 5.4 percent, including 10.5 percent for beta-blockers and 4.0 percent for statins, the study found.
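The opt-out design described above amounts to little more than reordering the pick list so the generic is preselected. Here is a minimal sketch, with made-up drug entries and no claim to match any vendor’s actual implementation:

```python
# Each option is (drug_name, is_generic). In an opt-out design the EHR
# presents the generic first as the preselected default; the clinician
# can still scroll past it and choose the brand explicitly.
def order_options(options, generic_default=True):
    if not generic_default:
        return list(options)
    # Stable sort: generics float to the top, original order otherwise kept.
    return sorted(options, key=lambda opt: not opt[1])

options = [("Lipitor", False), ("atorvastatin", True)]
print(order_options(options)[0][0])  # prints "atorvastatin"
```

The point of the study is that this tiny change in presentation, not any restriction on choice, was enough to shift prescribing behavior.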
While generic drugs remain cheaper, overall, than their branded competitors, that doesn’t mean they’re as inexpensive as they used to be. A recent commentary in the New England Journal of Medicine by Dr. Aaron Kesselheim, Director of the Program on Regulation, Therapeutics and Law at Brigham and Women’s Hospital, notes that prices for common medications like doxycycline, a broad-spectrum antibiotic, have increased from 6.3 cents per pill to $3.36 per pill. The average cost of captopril, used to treat heart failure and hypertension, increased by more than 2,800 percent between 2012 and 2013.
While supply chain interruptions or shortages can be the cause of increases in price for non-patent-protected drugs, some pharmaceutical companies have come under federal scrutiny for possible collusion between competitors and the formation of monopolies that allow drug makers to charge more due to a lack of competition.
Despite the potential for unscrupulous practices, the use of generic drugs has saved billions for the healthcare industry over recent years, and the savings are escalating as providers turn to cheaper options, due in part to an increase in e-prescribing and EHR use. A recent report by analytics firm IMS Health found that generic drugs saved the industry $239 billion in 2013 alone, adding to $1.2 trillion in savings since 2004.
“Not only was changing the default options within the EHR medication prescriber effective at increasing generic medication prescribing, this simple intervention was cost-free and required no additional effort on the part of the physician,” Patel said of the UPenn study. “The lessons from this study can be applied to other clinical decision efforts to reduce unnecessary health care spending and improve value for patients.”
The first part of this article provided a view of the current data needs in health care and asked whether open source electronic health records could solve those needs. I’ll pick up here with a look at how some open source products deal with the two main requirements I identified: interoperability and analytics.
Interoperability, in health care as in other areas of software, is supported better by open source products than by proprietary ones. The problem with interoperability is that it takes two to tango, and as long as standards remain in a fuzzy state, no one can promise in isolation to be interoperable.
The established standard for exchanging data is the C-CDA, but a careful examination of real-life C-CDA documents showed numerous incompatibilities, some left open by the ambiguous definition of the standard and others introduced by flawed implementations. Blue Button, invented by the Department of Veterans Affairs, is a simpler standard with much promise, but is also imperfectly specified.
Deanne Clark, vxVistA Program Manager at DSS, Inc., told me that VistA supports the C-CDA. The open source Mirth HIE software, which I have covered before, is used by vxVistA, OpenVista (the MedSphere VistA offering), and Tolven. Proprietary health exchange products are also used by many VistA customers.
Things may get better if vendors adopt an emerging HL7 standard called FHIR, as I suggested in an earlier article, which may also enable the incorporation of patient-generated data into EHRs. OpenMRS is one open source EHR that has started work on FHIR support.
Tolven illustrates how open source enables interoperability. According to lead developer Tom Jones, Tolven was always designed around care coordination, which is not the focus of proprietary EHRs. He sees no distinction between electronic health records and health information exchange (HIE), which most of the health IT field views as separate functions and products.
From its very start in 2006, Tolven was designed around helping to form a caring community. This proved useful four years later with the release of the Meaningful Use requirements, which featured interoperability. APIs allow the easy development of third-party applications. Tolven was also designed with the patient’s right to control information flow in mind, although not all implementations respect this decision by putting data directly in the hands of the patient.
Interoperability requires not only formats that other EHRs can recognize, but also a mechanism for exchanging the data. One solution is an API such as FHIR. Another is a protocol for sending and receiving documents. Direct is the leading standard, and has been embraced by open source projects such as OpenEMR.
The second requirement I looked at, support for analytics, is best met by opening a platform to third parties. This assumes interoperability. To combine analytics from different organizations, a program must be able to access data through application programming interfaces (APIs). The open API is the natural complement of open source, handing power over data to outsiders who write programs accessing that data. (Normal access precautions can still be preserved through security keys.)
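As a concrete taste of what such an open API hands to a third-party program, here is a minimal FHIR-style Patient resource and the few lines needed to pull fields out of it. The JSON is hand-trimmed from the FHIR specification’s example patient; a real client would fetch it over HTTPS from the EHR’s endpoint with an access key rather than from a string:

```python
import json

# A hand-trimmed FHIR-style Patient resource, roughly what a RESTful
# endpoint such as GET /Patient/example might return (values made up
# per the spec's example; real resources carry many more fields).
raw = """
{
  "resourceType": "Patient",
  "id": "example",
  "name": [{"family": "Chalmers", "given": ["Peter", "James"]}],
  "birthDate": "1974-12-25"
}
"""

patient = json.loads(raw)
name = patient["name"][0]
full_name = " ".join(name["given"] + [name["family"]])
print(patient["resourceType"], full_name, patient["birthDate"])
# prints: Patient Peter James Chalmers 1974-12-25
```

Because the format is open and self-describing, an analytics program written by an outsider can consume the same resource the EHR itself uses, which is exactly the power an open API delegates.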
VistA appears to be the EHR with the most support for analytics, at least in the open source space. Edmund Billings, MD, CMO of MedSphere, pointed out that VistA’s internal interfaces (known as remote procedure calls, a slightly old-fashioned but common computer term for distributed programming) are totally exposed to other developers because the code is open source. VistA’s remote procedure calls are the basis for numerous current projects to create APIs for various languages. Some are RESTful, which supports the most popular current form of distributed programming, while others support older standards widely known as service-oriented architectures (SOA).
An example of the innovation enabled by this software evolution is the mobile apps being built by Agilex on VistA. Seong K. Mun, President and CEO of OSEHRA, says that VistA now supports hundreds of mobile apps.
MedSphere builds commercial applications that plug into its version of VistA. These include multidisciplinary treatment planning tools, flow sheets, and mobile rounding tools so doctors can access information on the floor. MedSphere is also working with analytics groups to access both structured and unstructured information from the EHR.
DSS also adds value to VistA. Clark said that VistA’s native tools are useful for basic statistics, such as how many progress notes have not been signed in a timely fashion. An SQL interface has been in VistA for a long time; DSS’s enhancements include a graphical interface, a hook into Jaspersoft, an open source business intelligence tool, and a real-time search tool that spiders through text data in all elements of a patient’s chart and surfaces conditions that might otherwise be overlooked.
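As a sketch of the kind of basic statistic such an SQL interface makes easy, here is a hypothetical query counting unsigned progress notes, run against an in-memory SQLite database. The schema and data are invented for illustration and do not reflect VistA's actual SQL projection:

```python
import sqlite3

# Toy schema standing in for an SQL view over EHR documents;
# table and column names are invented for this sketch.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE progress_notes (
        note_id INTEGER PRIMARY KEY,
        author  TEXT,
        created DATE,
        signed  DATE  -- NULL until the author signs the note
    )
""")
conn.executemany(
    "INSERT INTO progress_notes VALUES (?, ?, ?, ?)",
    [
        (1, "Dr. A", "2014-11-01", "2014-11-01"),
        (2, "Dr. B", "2014-11-02", None),
        (3, "Dr. B", "2014-11-05", None),
    ],
)

# Count still-unsigned notes per author -- the sort of basic
# statistic a reporting interface exposes with one query.
unsigned = conn.execute(
    "SELECT author, COUNT(*) FROM progress_notes "
    "WHERE signed IS NULL GROUP BY author"
).fetchall()
print(unsigned)  # [('Dr. B', 2)]
```

A graphical layer or a business intelligence tool like Jaspersoft would generate and chart queries of exactly this shape.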
MedSphere and DSS have also joined the historic OSEHRA effort to unify the code base across all VistA offerings, from both Veterans Affairs and commercial vendors. MedSphere has made major contributions to Fileman, a central part of VistA. DSS has contributed all its VistA changes to OSEHRA, including the search tool mentioned earlier.
I would suggest to the developers of open source health tools that they increase their emphasis on the information tools that industry observers predict are going to be central to healthcare. An open architecture can make it easy to solicit community contributions, and the advances made in these areas can be selling points along with the low cost and easy customizability of the software.
One would expect that in an era when smartphones are more powerful than our computers were five years ago, health care providers would have an arsenal of health care IT solutions not only to enhance patient care but also to optimize their own workflow.
Shockingly, in 2014 most health care IT solutions (such as EHR systems) are incapable of basic functions that we take for granted in other aspects of our digital lives, despite the hundreds of millions of dollars institutions have invested in them. We have made information electronic, but we continue to work with it as if all we had was an abacus. This is problematic because so many providers are now involved in caring for each patient (physicians, nurses, physician assistants, clinical pharmacists, therapists, and trainees), and there is greater turnover of team members due to the shift-work nature of inpatient care. Rather than having optimal information available, we have data chaos.
The situation is in stark contrast to my out-of-hospital life. In Gmail I can (usually) find a specific message easily using a combination of “has,” “to,” “from,” and “subject” operators. Most EHRs, on the other hand, have me scrolling through consultation and progress notes in size-10 text, stacked one on top of another, which are about as searchable as a Where’s Waldo scene.
As a hospital medicine doctor who needs to rapidly assimilate information, often in the middle of the night, the EHR is my last resort. When I receive an emergent page, I rely on my examination, the patient interview (if they’re able to talk), and then a quick glance at a rudimentary sign-out document handed to me by the day team. The EHR is neither optimized for mobile in most cases, nor can I find anything I am looking for when I actually need it. This is true whether I am working at a rural hospital in Maine with a 10-year-old EHR or at a major academic center with a cutting-edge EHR.
Despite heavy investment in health care IT from institutions, providers like myself still find ourselves carrying around printed patient lists with scribbled check boxes and to-do lists. The sign-out document, a purportedly succinct summary of each patient’s problems and suggested plan of action, obviously has pitfalls. One night last week I counted 42 patients I was covering overnight on five different sub-specialty services; some basic math revealed that in my 12-hour shift I was in charge of more than 120 active medical problems and, conservatively, at least 1,000 years of medical history.
All of this was summarized for me by other providers in a neatly typed Word document assembled from copy-pasted snippets of EHR notes, printed, stapled, and then folded, ready in my white coat pocket for me to peruse while I bolted through corridors or took the elevator to where a patient was crashing.
When discussing the shortfalls of modern health care IT solutions, I often hear colleagues state that change is coming soon (read: two to five years). Given that medical errors now kill more than 400,000 Americans a year and are the third leading cause of death in the United States, we need a greater sense of urgency about how broken IT systems are and how we can fix them immediately. Infinitely more thought, investment, and energy go into social networking endeavors than into how patient information is presented and made available to health care team members. (It is irrefutably easier for me to find out where you went to college, using LinkedIn, Google, or Facebook, than to find out who my patient’s primary care doctor is.)
It makes little sense that modern medicine offers us marvels like whole-genome sequencing while simultaneously providing ridiculous solutions such as mnemonics to reduce errors during change of shift. Surely the bar must be set higher, and we must harness the technology we carry in our pockets.
Believe it or not, the winter holiday season is nearly upon us, and ICD-10 tops the list of projects likely to be put aside in favor of office parties and family vacations. While the new code set will not take effect until October 1, 2015, assuming the current implementation date holds, there are plenty of ICD-10 tasks that providers should consider getting under way before the end-of-year slump.
Already have an impact assessment? You might need to do another one
If you’re one of the 27% of providers who have not completed a financial impact assessment at this point in the transition process, you know what to do. Whether you hire a consultant or scrape together some in-house resources, it is critical to have a thorough inventory of what health IT systems need to be upgraded, what staff members must receive education, and how much it will cost.
“With ICD-10, it’s anticipated that days in accounts receivable may go up by 20 to 40 percent,” warns Summer Scott Humphreys, Executive Consultant for Beacon Partners. “Denials may increase. I would suggest having a strong revenue cycle team in place that actually starts looking at denials now as problem areas now are just going to become larger with ICD-10. Focusing on those denials by provider, by coder, by payer and figuring out why they’re happening is going to help an organization prepare for ICD-10.”
If you have already completed an impact assessment, but the results are from before the first few delays in 2012 or 2013, you might wish to consider a do-over. Technology needs have changed drastically over the past few years, and some organizations may have implemented new systems since the latest delay was announced in April. “You may have already checked the box and said, ‘Hey, I’ve passed this gate,’ but the reality is with the shift in timeline you may very well have to repeat various aspects of your testing strategy and other aspects of your implementation plan,” says Erik Newlin, Vice President of EDI Platform & Compliance at Xerox and Co-chair ICD-10 Assessment Workgroup at WEDI.
It’s 2014: Do you know where your upgrades are?
Recent surveys show that the number of ICD-10-ready vendors has crept upwards, but many providers are still waiting for products that haven’t been released yet. While two-thirds of vendors have already made their products available, according to a WEDI poll in September, more than 25% won’t have their upgrades finalized until 2015 or don’t even know when they’ll be ready.
Providers who are waiting on software or hardware should continue to hound their vendors for a delivery date in order to move their implementation timelines along as quickly as possible. October of 2015 may seem far away, but large-scale upgrades can take more time than anticipated, and may also be subject to unforeseen setbacks or rescheduling.
If you’re not already testing, nail down a schedule
Ensuring that the right technology is in place is so crucial because it is a pre-requisite for adequate testing. While a third of providers had already started external testing as of September, more than half don’t anticipate beginning the process until 2015. That could put some organizations in danger of squeezing up against the deadline without the opportunity to address any problems that may arise.
“If providers are waiting until the last second, you’re doing your own harm,” Newlin says. “Waiting until the last second does not leave enough of a ramp for payers to rectify any problems they might run into. We’re not going to have a seamless transition if that happens.”
“Any time you go through a major change like this the more detail you can check off, the less chaos you’re going to endure as you go through your conversion,” agrees Ken Kilmer, ICD-10 Project Manager at Nash Health Care, which participated in a full-scale simulated go-live in order to thoroughly put the system through its paces. “If you’re a hospital out there and you’re not already thinking about doing this type of testing, then it’s something you should consider. People just tend to do this right before the event, and I don’t think it’ll give them time to react and prepare the way they could be otherwise.”
Even if you don’t begin testing before the end of 2014, sit down with your ICD-10 transition team, your payers, and your technology providers to see if you can’t hammer out a rough schedule or sign up for a CMS testing opportunity before offices start emptying for the holidays.
Make sure your coders and physicians are on board with education
Technology is only half the battle when it comes to ICD-10. It is primarily a process improvement project which affects the quality of documentation, both clinical and financial. For ICD-10 project leaders, this means training physicians to produce better notes on the front end and giving coders the best possible tools to turn documentation into dollars as claims head out the door.
Clinical documentation improvement (CDI) can be an uphill battle if physicians aren’t properly engaged in the ICD-10 project. Consider presenting CDI as a way for physicians to better express their existing clinical expertise, not as a mandatory requirement for an administrative process that doesn’t directly fit into their wheelhouse. “Doctors want something simple and something that works,” declared Dr. Richard Garcia, MD, MPP, MHA, Emergency Department Director at Beverly Hospital in Montebello, California. “Physicians and HIM people have to align together to make this whole process work, because if we’re not respectful of each other’s workflow, it’s not going to happen.”
The New Year is a perfect opportunity to re-commit to CDI if your program wilted a little with the 2015 delay. It’s also a great time to start giving coders the practice they need to become proficient in the complex new code set. Before shutting down for the holidays, consider planning, introducing, or ramping up dual-coding and advanced training for coders who will need time and encouragement to adjust to the new ICD-10 environment.
If you’re in good shape, consider lending a helping hand
If you went through this list and checked off every part in your head, there’s one more task you might want to add. Healthcare providers, especially smaller practices and solo physicians, continue to struggle with the very basic pieces of the transition process. In the latest AHIMA and eHealth Initiative survey, more than a third of organizations who haven’t planned testing yet have cited a lack of knowledge as the reason why. Forty-five percent of those providers were small clinics or physician practices. CMS is doing its best to provide resources and education, but if your organization is already on point, you might want to think about reaching out and offering guidance to local partners who might be less well-prepared.
ICD-10 is an endurance race for the entire industry, not an individual event with a gold medal at the end. Every member of the care continuum, from payers to vendors to patients, stands to be impacted by a failure to make the great leap, so it’s in each provider’s best interest to ensure that October 1, 2015, isn’t a date to be feared.