EHR and Health IT Consulting
Technical Doctor's insights and information collated from various sources on EHR selection, EHR implementation, EMR relevance for providers and decision makers
Future of EHRs: Interoperability, Population Health, and the Cloud


Ever since the Health Information Technology for Economic and Clinical Health (HITECH) Act was passed in 2009 and the Medicare and Medicaid EHR Incentive Programs were established, healthcare providers have been quickly implementing EHR systems and adopting health IT tools. The overall movement toward improved quality of care and greater access to healthcare information will likely stimulate the future of EHRs.


Before predictions regarding the future of EHRs and their designs can be considered, it is critical to examine the history and evolution of EHR technology over the last five decades. The American Medical Association Journal of Ethics discussed how the earliest developments in EHR design took place in the 1960s and 1970s.  Healthcare leaders began forming organizations as early as the 1980s to develop standards for the increased use of EHR systems across the sector.

The very first health IT platforms, called clinical information systems, were developed by Lockheed in the mid-1960s. Lockheed’s system, modified over the years, is now part of Allscripts’ platforms; its high processing speed allowed multiple users to work on it at once. During the same period, the University of Utah developed the Health Evaluation through Logical Processing (HELP) system, and Massachusetts General Hospital later created the Computer Stored Ambulatory Record (COSTAR).


The COSTAR platform was able to separate key administrative processes, such as accounting and billing, from clinical information. The federal government adopted an EHR system in the 1970s through the Department of Veterans Affairs’ Computerized Patient Record System.


Over the last several decades, there have been even more developments in EHR design and implementation, especially since the federal government established meaningful use objectives under the EHR Incentive Programs. In 1991, the Institute of Medicine (IOM) published a report analyzing the effects of paper health records and making a case for the use of EHR systems. The report also covered challenges to EHR adoption such as costs, privacy and security concerns, and a lack of national standards.


In 1999, the IOM published its landmark report To Err Is Human, which documented high rates of medical errors and pointed to health IT systems as a potential solution. This history will likely shape the future of EHRs, as the same goals of better quality of care, lower costs, and improved patient health outcomes remain at the forefront of EHR adoption.

EHRIntelligence.com spoke with three leaders in the healthcare IT industry to discuss the future of EHRs and the trends to expect over the coming years. Bob Robke, Vice President of Interoperability at Cerner Corporation, mentioned the importance of healthcare data sharing across multiple platforms.


“We’re moving out of the era of EHR implementation and adoption and into the era of interoperability,” Robke said. “Now that we’ve automated the health record, the next phase is connecting all of the information in the EHR. We need interoperability and open platforms to accomplish this.”


The functionalities of future EHR systems will also focus heavily on interoperability and Big Data. As telehealth spreads across the country, patient health outside the medical facility will receive far greater consideration.


“Interoperability has the potential to unlock a richer set of data that clinicians can use to help improve the care they provide to patients,” Robke explained. “More than ever, clinicians will need access to information about the patient’s care that happens outside of their four walls as healthcare moves from fee-for-service to value-based models.”

When asked what healthcare trends are affecting the design of EHR systems, Robke replied, “There is a lot of exciting work being done to advance open standards that enable information stored in one EHR to be accessed by other systems. A good example of this is the work being driven by the Argonaut Project to advance the development and adoption of the FHIR standard. We’re big supporters of the SMART on FHIR approach that allows information to be accessed from directly within the EHR workflow, and are enabling that within the Cerner EHR.”
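Robke’s mention of SMART on FHIR is easier to picture with a small sketch: FHIR resources are plain JSON, so even a minimal client can work with them. The Patient resource below is illustrative, not pulled from any real EHR, and `display_name` is a hypothetical helper:

```python
import json

# A minimal FHIR R4 Patient resource, similar to what a SMART on FHIR app
# would receive from an EHR's FHIR endpoint (GET [base]/Patient/{id}).
# All values here are illustrative.
patient_json = """
{
  "resourceType": "Patient",
  "id": "example-123",
  "name": [{"family": "Rivera", "given": ["Ana"]}],
  "birthDate": "1968-04-02"
}
"""

def display_name(patient: dict) -> str:
    """Build a human-readable name from the first HumanName entry."""
    name = patient["name"][0]
    return " ".join(name.get("given", []) + [name.get("family", "")]).strip()

patient = json.loads(patient_json)
print(display_name(patient))  # -> Ana Rivera
print(patient["birthDate"])   # -> 1968-04-02
```

In a real SMART on FHIR app the resource would be fetched over HTTPS with an OAuth2 access token obtained through the EHR’s launch sequence; the JSON shape, however, is the same.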

Health information exchange and EHR interoperability will continue to shape the future of EHRs over the coming decades as the healthcare industry strives toward meaningful use of health IT systems. Robke spoke on the benefits of health information exchange and the strategic actions of the CommonWell Health Alliance, which is geared toward nationwide healthcare data exchange.


“Interoperability is a critical next step in the EHR world. Interoperability can provide clinicians with the data they need to manage the health of their populations and truly put the patient at the center of care,” Robke explained. “For interoperability to succeed, it will require all of the different information system suppliers coming together to find ways to connect their platforms, like those vendors who have joined together in the CommonWell Health Alliance. The great thing about CommonWell is vendors representing 70 percent of the acute market share in the U.S. have joined together to make interoperability a reality.”

When discussing how telemedicine and population health measures will affect the future of EHRs and the development of health IT platforms, Robke stated: “Connecting different information sources are key to successful telehealth and population health management strategies. Health care organizations need to access a patient’s full health history regardless of where that care was provided or what information system houses that information.”


“And yet, when it comes to results, there is an alarming failure in the healthcare industry. Despite huge investments in enterprise systems, venerable healthcare organizations [are] failing even at the basics like exchanging information electronically, communicating amongst care teams, and engaging patients,” Athenahealth CEO Jonathan Bush elaborated. “Some are even going bankrupt! The shortcomings of software – the cost, the inability to share information at scale, the demands for onsite management and maintenance, and the sluggish pace of innovation – are chiefly responsible for this.”


The revenue cycle in the healthcare industry will also have a great impact on the future design of EHR systems and trends within the sector, Bush explained. The cost of investing in complex technologies will affect future adoption rates, while the financial incentives of the Medicare and Medicaid EHR Incentive Programs will continue to stimulate hospitals and physician practices.


“That’s why I believe that health care leaders are going to start thinking in terms of the total cost of driving results, not the total cost of ownership, when they contemplate the HIT of the future,” Jonathan Bush explained. “It’s crucial in the current landscape to adopt a cost calculation that accounts for labor and operational costs across several departments, as well as the opportunity costs of an underperforming system. As CIOs and health system boards are increasingly held to account for their investment decisions, I think we’ll start to see a new model for total cost of ownership emerge—and a fleet of next-generation services emerge to keep up.”


When asked what functionalities he thinks health IT systems will be able to obtain in the future, Bush replied: “Malleable IT strategies available from the cloud will reinvent what we ever thought HIT was capable of.  I agree with a recent IDC report and its vision for a future filled with ‘3rd Platform EHRs’ capable of functions we just don’t see in software today.”


“Those functionalities would include easy access to data; population-wide analytics; and network intelligence that crowd sources the wisdom of many to improve overall performance,” he continued. “These functionalities are already being built in to service value-based care organizations.  The promise is better healthcare in an accountable care environment.”


Next, the Athenahealth CEO discussed the importance of connectedness and interoperability when it comes to the design of EHR technology and future trends in health IT.


“Connectedness is a huge barrier to humanity in health care, as well as to the design of intelligent IT systems,” Bush said. “Achieving connectedness, or the meaningful use of health IT, isn’t reliant on getting all providers onto one system.”


“I believe that the one-size-fits all mantra is finally waning and that healthcare will continue to demand what I like to think of as the ultimate ‘backbone’ solution: lightweight technology that can unite data across multiple platforms and support advanced levels of care coordination and connectedness. That sort of infrastructure is not only more cost effective, nimble, and future-proof; it’s also best for patient choice and access and — ultimately — quality care.”


Some of the typical trends that are affecting the future of EHR technology include telehealth, population health management, accountable care, and health information exchange. Population health management in particular will affect the development of analytics software and statistical measurements vital for demonstrating healthcare quality improvements.


“The arrival of population health is, and will continue to be, huge. It’s trending in M&A, has wound its ways into vendors’ capability descriptions, and is on the required ‘must support’ list for healthcare organizations of all sizes,” Jonathan Bush explained.


“To do population health correctly, EHRs will need to gain insight into patient populations, translate that insight into meaningful knowledge for care teams, and enable a new standard of connectedness to manage and deliver care. To do such complex, hairy, and crucial processes, EHRs will have to leverage a combination of software, knowledge, and work.  Software alone simply isn’t cut out to do the job.”


EHRIntelligence.com also spoke with Practice Fusion Founder and Chief Executive Officer Ryan Howard about future trends in EHR design. Howard spoke about the importance of data sharing among health IT systems.


“The single biggest trend will be cloud-based EHRs. The biggest single problem in the space is not deployment of EHRs. It is sending data back and forth whether it’s for quality and accountable care or sharing data with a payer or a lab or other doctors,” said Howard. “In every spirit of this, data from EHR needs to be shared with another EHR system.”


“The challenges of that is to install software offsite. Most of the major competitors have enterprise solutions. The data is incredibly difficult to get out. A cloud-based model inherently has an exponential cognitive scale that allows it to do this easily,” Howard explained. “In our case, when we connected to Quest, every doctor on our platform has a connection to Quest now because they’re all the same multi-tenant cloud-based systems. I think the biggest problems in health IT will be solved by simple integration into the cloud.”


Howard shared the opinion of the other executives when it comes to the functionalities EHRs will need in the coming years. Interconnectedness and interoperability, the efficient sharing of health data between disparate systems, will become a necessity in the quest to improve patient care and health outcomes.


“The biggest single thing [that will affect the future of EHRs] is that systems need to seamlessly connect to each other,” the Practice Fusion CEO stated. “Most of the systems are pretty robust, but I think the major cloud-based systems will need to interoperate. I think the major cloud-based vendors in the marketplace will connect and all their doctors will be able to interoperate. I think all the doctors will migrate to cloud-based systems.”


“This is only possible in a web-based or cloud-based model where the population data is in one place,” Howard said. “There’s very little value in doing this in a solution that’s installed in the doctor’s office. In that situation, all the data isn’t in one place and, in a population health management program, you’re constantly rolling out new rules and tackling new chronic conditions.”


When asked what healthcare trends will affect the design of EHR systems, Howard replied: “Population health management in addition to the electronic health records role in enabling telemedicine will all be key in the marketplace. Unless you have the patient’s record which only exists in the EHR, then there will be very little value on the telemedicine platform.”


“However, if I’m using a telemedicine platform that’s connected to the EHR, I have all that data in real-time. Most EHRs that are certified do drug-drug and drug-allergy checking dynamically in the system. That’s a good example of the value that comes from the platform.”


In predicting the coming impacts in EHR developments, Howard said, “cloud-based systems, population health management, private care management, and big data” are the major catalysts in health IT design.

“I think most vendors don’t have a population health management solution. The challenges of that is that population health does not work unless all the data is in one place,” Howard stated. “For population health management to work, take a look at diabetes. What the system is doing in a population health management model is that it is constantly monitoring your patient on a day-to-day basis.”


If a patient hasn’t had a required test done, “the system should automatically be reaching out to that patient to drive awareness – get them to book an appointment – and the system should also be prompting the physician with the standard of care during the visit.”
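The monitoring loop Howard describes can be sketched as a simple population-level query. Everything below (field names, the 180-day testing interval, the sample records) is an illustrative assumption, not Practice Fusion’s actual logic:

```python
from datetime import date, timedelta

# Hypothetical rule: flag diabetic patients whose last HbA1c test is older
# than the recommended interval (assumed here to be 180 days) and queue an
# outreach task for each.
A1C_INTERVAL = timedelta(days=180)

patients = [
    {"id": "p1", "name": "J. Smith", "last_a1c": date(2014, 3, 1)},
    {"id": "p2", "name": "L. Chen",  "last_a1c": date(2014, 11, 20)},
    {"id": "p3", "name": "R. Patel", "last_a1c": None},  # never tested
]

def overdue_for_a1c(patient: dict, today: date) -> bool:
    """True if the patient has no test on record or the last one is stale."""
    last = patient["last_a1c"]
    return last is None or (today - last) > A1C_INTERVAL

def outreach_tasks(patients: list, today: date) -> list:
    """Generate one outreach task per overdue patient."""
    return [
        {"patient_id": p["id"], "action": "book HbA1c appointment"}
        for p in patients
        if overdue_for_a1c(p, today)
    ]

for task in outreach_tasks(patients, date(2014, 12, 15)):
    print(task)
# p1 (last test >180 days ago) and p3 (no test on record) are flagged;
# p2 tested recently and is not.
```

A production system would run this continuously against the full panel and feed the same rule into point-of-care prompts for the physician, as Howard describes.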



Vermont Gets More Robust With Data Exchange


Southwestern Vermont Medical Center (SVMC) and Vermont Information Technology Leaders (VITL) have just completed a project that developed five connections to transmit health data from the hospital to the Vermont Health Information Exchange (VHIE).

According to officials of the organizations, the five interfaces were built to:

  • Send immunization data from SVMC to the VHIE. The immunization data is then forwarded on to the Vermont Department of Health Immunization Registry.
  • Modernize the existing laboratory results interface from SVMC to the VHIE.
  • Send patient demographics, radiology reports, expanded laboratory results (pathology, microbiology and blood bank), and transcribed reports (information about procedures, admissions, discharges and consults) from SVMC to the VHIE.
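Hospital-to-HIE interfaces like these commonly exchange HL7 v2 messages; immunization feeds, for example, are typically VXU messages. The toy parser below works over an invented message, not an actual SVMC/VHIE feed:

```python
# A pared-down HL7 v2 VXU (immunization) message. Segments are separated by
# carriage returns and fields by pipes. The content is illustrative only.
raw = "\r".join([
    "MSH|^~\\&|SVMC|BENNINGTON|VHIE|VT|20150105120000||VXU^V04|MSG0001|P|2.5.1",
    "PID|1||12345^^^SVMC^MR||DOE^JANE||19800101|F",
    "RXA|0|1|20150104||141^Influenza^CVX|0.5|mL",
])

def parse_segments(message: str) -> dict:
    """Split an HL7 v2 message into {segment_id: field_list} (first occurrence)."""
    segments = {}
    for line in message.split("\r"):
        fields = line.split("|")
        segments.setdefault(fields[0], fields)
    return segments

msg = parse_segments(raw)
print(msg["MSH"][8])  # message type -> VXU^V04
print(msg["RXA"][5])  # vaccine code -> 141^Influenza^CVX
```

Real interface engines handle repeating segments, escape sequences, and acknowledgments, but the pipe-delimited structure above is what actually travels between a hospital and an HIE.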

The SVMC interfaces complete VITL's goal of connecting all 14 Vermont hospitals to the VHIE, the statewide health data network operated by VITL. Although SVMC has been contributing laboratory results to the VHIE for over eight years, the four new connections will increase the amount of clinical and demographic data available to providers involved in a patient’s care, better informing health care decisions, its officials say.

The final phases of the SVMC interface project included a “move-in” process in which engineers, analysts and project managers met face-to-face at the VITL office in Burlington. The interface teams held two in-person sessions, each lasting two weeks, which allowed them to focus completely on integration and quality assurance testing of the health data flowing from SVMC into the health information exchange, according to officials.

The new clinical interfaces allow SVMC data to be shared with any provider in Vermont. “Southwestern Vermont Medical Center has been a part of the VHIE for over eight years, and we have actively used the data network to distribute electronic lab results to primary care practices in the southwestern Vermont health care service area,” Rich Ogilvie, chief information officer at SVMC, said in a statement. “The additional connections deliver data and reporting abilities that will enhance the provider-patient care relationship in the Bennington service area and across the state.”



Many Say Meaningful Use Stage 2 Is Disastrous, but the Data Say Otherwise


The industry news is full of disparaging talk about the health of the EHR Incentive Programs (i.e., meaningful use), particularly the low number of Stage 2 attestations. While some statistics show that only 35% of the nation’s hospitals have met Stage 2 meaningful use requirements, further analysis reveals a different story.

Each month since July 2014, CMS and the Office of the National Coordinator for Health IT have updated the Health IT Policy Committee on the number of successful Stage 2 attestations. The following day, the same headlines appear, with industry analyses and strong reactions treating the low attestation volume as a sign that meaningful use is not viable in the long term. These critics noted that in November 2014 only 17% of the nation’s hospitals had successfully demonstrated Stage 2, and most recently that in December 2014 that figure was 35%.

These numbers are being used to demonstrate how difficult it is for the majority of the hospitals to meet Stage 2 requirements and even to make the case that most will not be capable of attesting due to overly stringent requirements. While these numbers are not technically wrong, a closer look reveals a different picture. This is not an attempt to be provocative, but rather we want to provide additional detail to those figures because they do not tell the whole truth about how well hospitals have fared in Stage 2.

Stage 2 Attestation Numbers Send Mixed Messages
First, the numbers cited were correct when the number of Stage 2 attestations were compared with the entire population of U.S. eligible hospitals (EHs). Of course, based on such data, it looks as if only about a third of the hospitals have been able to meet Stage 2 requirements through the end of November 2014. Some have interpreted this number to mean that meaningful use Stage 2 is a disastrous program, but the industry should not use these numbers to judge the success of Stage 2, or in fact, hospitals’ ability to meet the requirements. Why?

The EHs participating in the EHR Incentive Program are required to progress through a set meaningful use timeline. This means every meaningful use participant is scheduled to start at Stage 1 and remain in each stage for two years before moving to the next stage, unless the policy allows otherwise. For example, the early adopters who began in 2011 were in Stage 1 for three years instead of two, as CMS moved the Stage 2 start year to 2014. Therefore, not every EH in the nation is scheduled to attest to Stage 2 in 2014. Even if they wanted to attest to Stage 2, they would not be able to do so.

Instead, the industry should look at how many EHs are scheduled to be in Stage 2 in 2014, rather than looking at all EHs. Per the CMS data:

  • 809 hospitals attested to Stage 1 Year 1 in 2011;
  • 1,754 hospitals attested in 2012;
  • 1,389 attested in 2013; and
  • 83 attested in 2014 by Sept. 30.

Thus, only 2,563 hospitals (those that started in 2011 or 2012: 809 + 1,754) were scheduled to demonstrate Stage 2 in 2014. Among these hospitals, 65.58% (1,681) successfully attested to Stage 2 by Dec. 1, 2014. It is this number that tells an accurate story of Stage 2’s viability so far.

Admittedly, CMS includes only Medicare-only and dually-eligible EHs in the database cited above, and it did not clearly indicate whether the 1,681 figure includes all types of EHs. However, Medicaid-only EHs account for a small proportion here: based on CMS’ October 2014 report, fewer than 100 Medicaid-only EHs should be in Stage 2 in 2014. Even if we added 100 to the denominator to account for them, the percentage would still be more than 63%.
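The arithmetic behind these percentages is simple enough to check directly. A quick sketch using the CMS figures quoted above:

```python
# Stage 2 denominator: hospitals *scheduled* for Stage 2 in 2014, i.e. the
# 2011 and 2012 starters, not all eligible hospitals nationwide.
starters_2011 = 809
starters_2012 = 1754
scheduled = starters_2011 + starters_2012  # 2,563 hospitals
attested = 1681                            # Stage 2 attestations by Dec. 1, 2014

rate = attested / scheduled * 100
print(f"{rate:.1f}%")  # -> 65.6%

# Sensitivity check from the text: add ~100 Medicaid-only EHs to the denominator.
adjusted = attested / (scheduled + 100) * 100
print(f"{adjusted:.1f}%")  # -> 63.1%
```

Against the full population of roughly 4,900 eligible hospitals, the same 1,681 attestations would indeed look like only about a third, which is exactly the framing the article argues against.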

Attestations Are on the Rise
In addition, the number of successful Stage 2 attestations has grown exponentially since CMS first announced that 10 hospitals attested to Stage 2 by July 1, 2014. We find many organizations wait until the final 30 days or even closer to the attestation deadline to attest, so it is no surprise to see such growth — especially in the last few months when the number doubled between Nov. 1, 2014, and Dec. 1, 2014.

Additionally, most EHs had to wait until Oct. 1 to begin their reporting period if they chose the last fiscal quarter, as was likely the case for the majority of attestations. This approach was popular because it gave organizations the first three quarters of the fiscal year to implement the 2014 Edition CEHRT and make the required workflow adjustments. So the nearly 66% Stage 2 attestation rate will only rise from here, especially considering that CMS has extended the hospital attestation deadline to Dec. 31.

Where Hospitals Stand at the End of 2014
The College of Healthcare Information Management Executives recently estimated that about one-third of the hospitals scheduled to attest to Stage 2 in 2014 will use the flexibility rule, which allows them to attest to Stage 1 requirements in 2014 if their certified EHR upgrade was delayed or could not be implemented at all. If we combine those who successfully attested to Stage 2 with those who will rely on the flexibility rule, more than 95% of hospitals will be able to attest in 2014. Again, that percentage does not look like a disaster; it shows that the tremendous effort these hospitals put toward readying themselves for Stage 2 paid off for roughly two-thirds of them, and that CMS’ lifeline worked for the rest.

Taking the same approach for eligible professionals (EPs): 57,595 and 139,299 Medicare EPs attested to Stage 1 Year 1 in 2011 and 2012, respectively, so 196,894 EPs are scheduled to be in Stage 2 in 2014. Per CMS data, 16,455 EPs successfully attested to Stage 2 by Dec. 1, 2014, an 8.36% success rate for that group. The number appears low at this juncture, but based on the trend for EHs we expect it to grow tremendously, as the majority of EPs will also rely on the last calendar quarter as their reporting period (Oct. 1, 2014, to Dec. 31, 2014), and EPs can complete their 2014 attestation within the first two months of 2015. In short, it is too early to draw conclusions; the real story of EP Stage 2 attestation has yet to unfold.

Many have used this misleading data, and the message that meaningful use is a failure, as a reason to push CMS to reduce the 2015 reporting period from one full year to a single three-month quarter or 90 days. We agree that a shortened reporting period in 2015 would provide many benefits, and we offer an alternate rationale based on our analysis of the data.

First, so far about two-thirds of the EHs scheduled to be in Stage 2 in 2014 have successfully met the requirements. Based on research conducted among our members, we found that the shortened reporting period in 2014 played a critical role in their success: they would not have been able to attest, or would have found it significantly more challenging, had a reporting period longer than a three-month quarter been imposed in 2014. They would not have had sufficient time to completely implement and stabilize the 2014 Edition CEHRT and to adjust existing workflows or implement new ones. In addition, a longer reporting period would equate to a higher denominator, making it more difficult or nearly impossible for providers to achieve the required thresholds.

Stage 2 also introduced more complex objectives such as View, Download and Transmit, and Transitions of Care. These two objectives alone required many hospitals to deploy their IT capabilities in new territories of patient engagement and information exchange. As we’ve previously discussed, these two objectives are arguably the most challenging in Stage 2, and the majority of providers who attested showed marginal performance around the required thresholds. These two objectives are significant first steps toward something greater in health care, and it will take time to improve performance in these areas. CMS recognized these challenges and enacted the flexibility rule in 2014. It certainly would not hurt the forward momentum of the meaningful use programs to allow such an option in 2015.

Second, the meaningful use program is not just about what providers can or should do. It is about all of us. We all need to keep in mind that the ultimate goal of the meaningful use program is to promote better care and better health for consumers/patients, including ourselves.

Per a recent report, patients value providers’ use of EHRs, appreciate the ability to access their data in a timely manner, and seek even more robust EHR functionality. So far, one of the great accomplishments of the meaningful use program is the significant growth of EHR adoption among providers, which in turn raises recognition of its value among consumers. The meaningful use program should continue, but at a more measured pace, so we can all achieve the goal with few or no compromises.

We hope that these numbers and rationales provide a meaningful perspective as CMS and ONC continue to make data-driven decisions in setting the policy in 2015 and Stage 3. We think that when one asks for leniency, showing great results so far and good faith based on accurate data would trump defensive arguments.

Nevertheless, while there is no further change in the existing policy, providers should continue to keep up their efforts and push to achieve the higher goal of better care and better health.



Harnessing the Power of Big Data with Digital Health Partnerships


In today’s digital world, electronic patient data is growing exponentially and moving faster than healthcare organizations can imagine. At the same time, clinicians suffer from information overload and increasingly complex, high-volume patient loads, alongside dwindling time and resources.

Now more than ever, the pressure is building to harness the power of big data and digital technologies to help clinicians make faster, patient-centric decisions that increase quality of care and enhance health outcomes all while decreasing costs.

Sounds great, right? Especially to the critical care domain where data is extraordinarily dense, time is our greatest opponent, and fiscal concerns represent an annual cost to the U.S. economy in excess of $260 billion and approximately 40 percent of total inpatient costs.

But what if health care analytics and clinical decision support (CDS) could combine to deliver rapid bedside diagnostics or upstream health detection capabilities? That is to say, a tool that provides first responders, clinicians, hospital staff, home care providers, and patients with clinically relevant, patient-centric information, intelligently filtered and presented at appropriate times to transform care delivery.

Historically, CDS applications have operated as components of comprehensive electronic health record (EHR) systems—in other words, retrospective data repositories or order entry systems with limited data streams that are, at best, semi-real time.

However, the next generation of CDS tools seeks to incorporate advanced data processing systems capable of discovering and harnessing actionable insights from all varieties of medical data, and leveraging these insights for diagnostic, predictive and prescriptive capabilities.

In a nutshell, this next gen CDS tool will aggregate disparate patient health information—static and real-time—across care delivery touchpoints for analysis and optimization, enabling clinicians to make faster decisions and implement personalized, patient-centric treatment options at the point of care, whether that is the home, ambulance, hospital or battlefield.

Bear in mind, this description simplifies what is a highly sophisticated and complex health IT tool to a functional concept. Key challenges for implementation include the ability to:

  • Collect and aggregate health data, including that from monitors, throughout the patient care continuum into a single portfolio
  • Normalize, pre-process and de-identify data for analysis—not all data is created equal and not all data is useful in its raw form
  • Capture data at the point of care, stream for real-time computational analysis and combine with retrospective data
  • Present actionable insights in a format that end-users can easily consume for enhanced decision-making in the clinical workflow or home life-flow
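A minimal sketch of the streaming half of such a pipeline: normalize each raw reading, apply a rule, and emit an actionable message. The thresholds and field names below are hypothetical illustrations, not a validated clinical scoring system:

```python
from typing import Iterator

def normalize(reading: dict) -> dict:
    """Pre-process a raw reading: convert Fahrenheit temperatures to Celsius
    so downstream rules see uniform units (one small example of the
    'not all data is useful in its raw form' challenge above)."""
    r = dict(reading)
    if r.get("temp_unit") == "F":
        r["temp"] = round((r["temp"] - 32) * 5 / 9, 1)
        r["temp_unit"] = "C"
    return r

def alerts(stream) -> Iterator[str]:
    """Yield an actionable message when heart rate and temperature both cross
    (hypothetical) thresholds worth a clinician's attention."""
    for raw in stream:
        r = normalize(raw)
        if r["heart_rate"] > 100 and r["temp"] > 38.0:
            yield f"patient {r['patient_id']}: HR {r['heart_rate']}, temp {r['temp']}C - review"

# Illustrative readings, as they might arrive from bedside monitors.
readings = [
    {"patient_id": "A", "heart_rate": 88,  "temp": 37.1, "temp_unit": "C"},
    {"patient_id": "B", "heart_rate": 112, "temp": 101.3, "temp_unit": "F"},
]
for msg in alerts(readings):
    print(msg)  # only patient B crosses both thresholds
```

A real CDS system would replace the single if-statement with trained predictive models and fold in retrospective data, but the shape of the pipeline, ingest, normalize, analyze, present, is the one the bullet list describes.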

Ultimately, such a solution could have the power to save a life, elevate care delivery, reduce length of stay, improve quality of life or predict and avoid a critical health event altogether.

To many, this sounds almost like science fiction, but it is achievable with the help of a small village – or in our case, a team of digital health partners comprising world-class researchers like those at the University of Michigan, advanced analytic technology products, wearable and non-wearable sensors, and mobile and connected health solutions.

Healthcare has lagged behind the retail and financial sectors in the use of big data and digital technologies, but the gap is closing fast. The risks are high, but they are manageable through digital health partnerships and worth the high-impact payoff. Data is king, and the more hard evidence we have, the better decisions we can make as clinicians, patients, families, providers, payers and industry alike.




How Big Data Analytics Helps Accountable Care Organizations


In an effort to revolutionize the healthcare industry and improve the quality of care, the federal government began working with physician practices to implement accountable care organizations (ACOs). ACO development and the right health IT technologies can be key in improving patient outcomes.


ACOs focus on preventing disease and emphasizing wellness among a set group of patients in an effort to reduce healthcare costs, according to The Fayetteville Observer.


“We are constantly monitored for any intervention that might keep the patient well: preventive vaccinations, fall risks at home, keeping blood pressure, diabetes, cholesterol under control,” Dr. Martin DeGraw, a family physician at New Bern’s Coastal Carolina Alliance, told the source.


Most of the practices involved in ACO development are treating Medicare patients, which is a condition under the Patient Protection and Affordable Care Act. The news source discusses an example of a woman who kept ending up in the emergency room. The woman was in a wheelchair, had no transportation available, and suffered from diabetes. She also lacked a refrigerator to store her medication.

Whenever she needed to visit a doctor and get a prescription refill, she would call an ambulance to pick her up and drive her to the emergency room. With the help of a care manager working with an ACO, this disabled woman was able to connect with a transportation provider and community nonprofit groups that built her a ramp and donated a refrigerator.


ACO development often involves implementing new technologies and integrating care across a team of healthcare professionals. The exchange of health information is integral to the functioning of an accountable care organization.


EHR technology is vital for running a successful accountable care organization. With digitized patient data accessible through EHR systems, physicians, nurses, and other healthcare professionals can share information quickly and efficiently across an ACO care team.


The timely, effective exchange of big data in the healthcare industry is vital to improving patient health and integrating new models of care including ACO development. The Harvard Business Review commented on the importance of integrating data and translating data collection and analytics into practice.


One major endeavor in the healthcare industry is the need to standardize big data, as it comes in many forms, from images in CT scans and physician notes in EHRs to insurance claims and information from wearables and monitoring tools.


Standardization is also challenging due to the variety of entities that collect medical data – providers, payers, federal agencies, wellness institutions, genetic testing companies, public health agencies, disease-management companies, and patients themselves.

Currently, initiatives are under way in both the public and private sectors to better integrate and standardize big data. For example, the National Institutes of Health launched the Big Data to Knowledge (BD2K) initiative to develop better tools for collecting and analyzing healthcare information.
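The standardization challenge described above can be made concrete with a small sketch: records from three different sources are mapped onto one common observation schema. The schema, field names, and sample records here are all invented for illustration; real-world efforts build on shared standards such as HL7 FHIR rather than an ad-hoc private format like this.

```python
# Illustrative only: three invented source formats normalized into one
# ad-hoc common observation schema (not any real standard).

def from_ehr_note(note):
    # EHR note keyed by medical record number, carrying a vital sign
    return {"patient_id": note["mrn"], "kind": "bp_systolic",
            "value": note["bp_systolic"], "source": "ehr"}

def from_claim(claim):
    # Insurance claim carrying a diagnosis code, no measurement
    return {"patient_id": claim["member_id"], "kind": claim["dx_code"],
            "value": None, "source": "claims"}

def from_wearable(sample):
    # Wearable sample with a heart-rate reading
    return {"patient_id": sample["user"], "kind": "heart_rate",
            "value": sample["bpm"], "source": "wearable"}

observations = [
    from_ehr_note({"mrn": "p1", "bp_systolic": 128}),
    from_claim({"member_id": "p1", "dx_code": "E11.9"}),
    from_wearable({"user": "p1", "bpm": 72}),
]

# Once normalized, records from any source can be queried uniformly.
sources = sorted({o["source"] for o in observations})
print(sources)  # -> ['claims', 'ehr', 'wearable']
```

The point of the sketch is that analytics only become possible after every source is translated into one agreed shape, which is exactly why so many distinct collecting entities make standardization hard.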


Predictive analytics is a key area that has used big data to improve patient health outcomes. By evaluating a person’s surroundings, better predictions can be developed and interventions can be targeted to particular patients. By preventing disease and finding ways to keep patients healthier, healthcare costs will be lowered and patient health outcomes should improve. This is essentially the mission of ACO development projects.
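As a toy illustration of this kind of targeting, the sketch below scores patients on a few social-risk features (echoing the transportation and refrigeration barriers in the ACO example earlier) and flags those above a threshold for outreach. The features, weights, and threshold are invented for illustration and are not a validated clinical model.

```python
# Hypothetical social-risk features and weights, invented for this sketch.
WEIGHTS = {"prior_er_visits": 0.5, "chronic_conditions": 0.3,
           "no_transportation": 0.8, "no_refrigeration": 0.6}

def risk_score(patient):
    """Weighted sum of a patient's risk features."""
    return sum(WEIGHTS[f] * v for f, v in patient["features"].items())

def flag_for_outreach(patients, threshold=1.0):
    """Return IDs of patients whose score meets the outreach threshold."""
    return [p["id"] for p in patients if risk_score(p) >= threshold]

patients = [
    {"id": "a", "features": {"prior_er_visits": 3, "chronic_conditions": 1,
                             "no_transportation": 1, "no_refrigeration": 1}},
    {"id": "b", "features": {"prior_er_visits": 0, "chronic_conditions": 1,
                             "no_transportation": 0, "no_refrigeration": 0}},
]
print(flag_for_outreach(patients))  # -> ['a']
```

A real ACO would fit such weights from outcomes data rather than assert them, but the workflow is the same: score the population, then direct interventions to the patients most likely to have care setbacks.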


Practical application of new data collection systems and analytic tools is vital to the success of big data assessment. The future calls for physicians, policymakers, patients, and the entire medical team to be engaged in translating big data into practice.



Shift Your Focus Back To Patient Care By Outsourcing Data Entry Services


As a healthcare service provider, are you constantly busy with non-core data management activities? Is finding time to concentrate on improving patient care one of your main concerns? If so, it may be time to outsource. Direct your resources and attention toward improving patient care; non-core medical data entry activities can be handled by your outsourcing partners.

  • Free your resources and engage them in patient care and in managing your healthcare facility
  • Save the money you would otherwise invest in technology and infrastructure for an in-house data entry department
  • Maintain all your records, up-to-date and accurate, in an electronic, well-indexed format by outsourcing medical data entry
  • When the required information is available quickly, at the click of a mouse, it helps you deliver better healthcare services

Outsourcing to a reputed data entry services provider not only frees you of these mundane tasks and allows you to concentrate on patient care, but also ensures that you gain access to highly accurate data in the format you request.

Popular medical data entry work outsourced by hospitals and healthcare facilities includes:

  • Patient record data entry
  • SOAP notes
  • Health care records
  • Lab data
  • Medical billing data
  • Medical reports
  • Clinical records
  • OT reports
  • Patient files and documents
  • Discharge summaries
  • Medical insurance forms and details

Partner with your service provider for ongoing data entry work in your hospital and healthcare facility
Support activities like data entry, content management, data conversion, and document indexing are commonly outsourced by wellness centers, healthcare facilities, and hospitals. Many companies promise high-quality services at low cost and fast turnaround, along with immediate attention and the capacity to handle high data volumes or sudden data influxes. However, do not go ahead without due diligence: run pilot projects or work with an outsourcing service provider on a couple of data entry projects before you form a partnership and hand over all the ongoing work. Due diligence matters because, however mundane data entry may sound, your healthcare center's functioning and operations depend on the accuracy and availability of its data.

Hi-Tech Export offers comprehensive BPO services such as data entry, transcription, data conversion, web research, and data management and processing to the healthcare sector. The company has built a client base and partnered with some of the most reputed names in the healthcare industry by delivering high-quality services with every project it undertakes.




Big Data Underpins Five Health IT Predictions for 2015 | EHRintelligence.com

These five health IT predictions for the coming year pin their hopes on the potential of big data to support healthcare reform.
In 2014, the Affordable Care Act (ACA) planted a firm stake in the health IT landscape, shifting critical healthcare conversations from theory to reality. Today, all healthcare stakeholders — providers, payers and technology vendors — are getting serious about making healthcare delivery more efficient and more effective. Even the big tech players like Apple, Oracle, and IBM have joined the “fix healthcare parade,” all having made major moves in healthcare this year.
One area of health IT that really took off this past year is big data. There is a growing thirst for quality, fact-based information in healthcare. As providers begin to place more and more focus on aggregating and analyzing patient data from various sources (e.g., EMRs, administrative systems, claims processing systems) and determining how to make the best use of all this important information, clinicians are increasingly seeing value in evidence-based decision-making. As we move into 2015, big data will no doubt continue to play a key role in moving the needle on improving care delivery and patient outcomes.
Given the changing healthcare landscape, we predict the following for 2015:
Physicians will begin to embrace rather than abhor EMRs.
Hospitals and practices across the country have struggled over the years to get physicians on board with their EMR adoption initiatives. For the most part, physicians have been reluctant to embrace EMR technology as an aid and to make it part of their daily workflow.
However, as big data begins to contribute more meaningfully to patient care, we will see physicians increasingly realize the potential value of the EMR as guidance to improve clinical decisions rather than simply storage for patient information. Clinical decision support tools for “just-in-time” patient care will be valuable features in the next generation of EMRs. One day in the future, EMRs will even replace medical literature and clinical trials for evidence-based medical analysis and support.
We will see a rise in the democratization of health information.
Historically, there has always been an immense gap between provider and consumer access to health information. With healthcare increasingly becoming more personalized, this gap is slowly closing. Providers, payers, publishers, and even pharmaceutical companies have all shifted their focus to meeting consumers where they are and delivering the information they want, when they want it and in the format they want. This means we will see a big spike in innovations in new care delivery approaches, such as mobile health, telemedicine, and retail medicine solutions in the coming year.
Pharma will take more of a care provider or services role in healthcare.
Pharma marketers are beginning to understand the need to go beyond customer acquisition to truly improve treatment adherence and drug efficacy for consumers. They are quickly realizing that those who move in this direction are those who will ultimately succeed. As a result, pharma companies will start to look at themselves more as providers of services rather than drug developers in the traditional sense.
A massive data breach will force real action around health data privacy.
In 2015, a bigger-than-ever data breach will shake up the healthcare industry, forcing a reckoning around health data ownership and control. The industry will realize the need to build a truly reliable framework around health data privacy and security.
Quality will rise as the future king of the industry.
As the shift from fee-for-service care to population health management grows, providers will increasingly place emphasis on the quality of care rather than the number of procedures delivered. Consequently, payers and providers will invest more in health information technology that helps identify patients who are at high risk for care setbacks, hospital readmissions, and other poor outcomes. The EMR companies themselves will be further challenged to open up their closed system environments as doctors and health systems demand easier access to their patient data.




EHR + Geography = Population Health Management


Duke Medicine is combining EHR data with geographic information to create a program that can predict patient diagnoses.

Duke Medicine is using geographic information to turn electronic health records (EHRs) into population health predictors. By integrating its EHR data with its geographic information system, Duke can enable clinicians to predict patients' diagnoses.

According to Health Data Management, Sohayla Pruitt was hired by Duke to run this project; she has a master’s degree in geographic information systems, or GIS. “I thought, wow, if we could automate some of this, preselect some of the data, preprocess a lot and then sort of wait for an event to happen, we could pass it through our models, let them plow through thousands of geospatial variables and [let the system] tell us the actual statistical significance,” Pruitt says. “Then, once you know how geography is influencing events and what they have in common, you can project that to other places where you should be paying attention because they have similar probability.”

iHealth Beat explains that the system works by using an automated geocoding system to verify addresses with a U.S. Postal Service database. These addresses are then passed through a commercial mapping database to geocode them. Finally, the system imports all U.S. Census Bureau data with a block group ID. This results in an assessment of socioeconomic indicators for each group of patients.
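The three-stage pipeline described above — address validation, geocoding, census block-group lookup — can be sketched as follows. Every lookup table in this sketch is an invented in-memory stub; a real implementation would call a USPS address-validation service, a commercial geocoder, and Census Bureau block-group files, none of which are shown here.

```python
# Stub for USPS-style address standardization (hypothetical data)
USPS_STANDARDIZED = {
    "123 main street, durham nc": "123 MAIN ST, DURHAM, NC 27701",
}

# Stub for a commercial geocoder: standardized address -> (lat, lon)
GEOCODER = {
    "123 MAIN ST, DURHAM, NC 27701": (35.9940, -78.8986),
}

# Stub for Census Bureau data keyed by coordinates: block-group ID
# plus a couple of socioeconomic indicators (values invented)
CENSUS_BLOCK_GROUPS = {
    (35.9940, -78.8986): {
        "block_group": "370630015001",
        "median_income": 42000,
        "pct_no_vehicle": 0.18,
    },
}

def enrich_patient_record(record):
    """Attach census block-group indicators to an EHR record by address."""
    standardized = USPS_STANDARDIZED.get(record["address"].lower())
    if standardized is None:
        return {**record, "census": None}  # address failed validation
    latlon = GEOCODER.get(standardized)
    return {**record, "census": CENSUS_BLOCK_GROUPS.get(latlon)}

patient = {"id": "p1", "address": "123 Main Street, Durham NC"}
enriched = enrich_patient_record(patient)
print(enriched["census"]["block_group"])  # -> 370630015001
```

The design point Pruitt makes later in the article is that this relating of census data to every address is done ahead of time, so that when an event occurs the geospatial variables are already joined to each EHR and ready for statistical modeling.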

“When we visually map a population and a health issue, we want to give an understanding about why something is happening in a neighborhood,” says Pruitt. “Are there certain socioeconomic factors that are contributing? Do they not have access to certain things? Do they have too much access to certain things like fast food restaurants?”

Duke is working to develop a proof of concept and algorithms that would map locations and patients. They are also working on a system to track food-borne illnesses.

“It’s easy to visualize or just say, ‘Oh, this person lives in a low income neighborhood with lots of fast food restaurants.’ You could probably do that very quickly,” Pruitt says. “But the only way to really understand the statistical significance of what’s going on and where else it’s happening or going to happen is through infrastructure development, by pre-downloading that data, prepping and pre-relating that data to every address and every EHR.”



Via nrip