Navigating Healthcare – Patient Safety and Personal Healthcare Management

Calling Doctor Data

With a nod to Star Trek, Bones, Data and even the Holo Doctor

Much of medical practice is as much a mystery to doctors as it is to patients.

Human physiology is so complex, and the external variables so numerous, that we often have no sure knowledge of why one patient did well or another patient didn’t. Every physician longs for some way to really know what will work for each patient.

While we have come a long way, even in just the past five years, there is still so much left to be learned. The one thing that can help us reach greater knowledge faster is data and analytics.
That’s really the underlying value of electronic medical records: they represent a treasure trove of data waiting to be mined. With the right algorithms, we can use that data to find patterns that tell us what factors make a tangible difference in outcomes. It’s the wisdom of the ages waiting to be read.

Perhaps the most valuable medical team member of the future will be a data scientist. These are the experts who understand how to tag and mine data and how to construct algorithms that find patterns accurately and can help us be more effective in delivering the best possible care every time.

For example, there is a great study from the University of Iowa Medical Center, in which gastroenterology surgeons are using real-time patient data in the operating room, combined with past data from gastric surgery patients, to predict who is at risk for developing a surgical site infection. This helps guide decisions in the OR as well as post-surgical care. While doctors know that a variety of modalities can reduce infection risk and promote healing, resources are not endless. By identifying patients who need high-level care, they can ensure that resources are targeted where they are needed most. The project has reduced surgical site infections by more than 50 percent in the gastroenterology patients whose care was guided by the analytics.
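To make the idea concrete, here is a minimal sketch of how a risk model of this kind can work. Everything in it is hypothetical — the factor names and weights are invented for illustration and are not drawn from the Iowa project; a real model's coefficients would be fit to historical surgical-outcome data.

```python
import math

# Hypothetical coefficients, chosen only to illustrate the shape of
# a logistic risk model. A real model would learn these from data.
WEIGHTS = {
    "age_over_65": 0.8,
    "diabetes": 1.1,
    "smoker": 0.7,
    "operation_hours": 0.4,  # per hour of surgery
}
INTERCEPT = -3.5

def infection_risk(patient):
    """Return a logistic risk score in (0, 1) from patient factors."""
    z = INTERCEPT + sum(WEIGHTS[k] * patient.get(k, 0) for k in WEIGHTS)
    return 1 / (1 + math.exp(-z))

# A patient with several risk factors scores higher than one with few,
# which is what lets the care team target resources.
high_risk = infection_risk({"age_over_65": 1, "diabetes": 1,
                            "smoker": 1, "operation_hours": 4})
low_risk = infection_risk({"operation_hours": 1})
```

The score itself decides nothing; it simply ranks patients so that scarce high-level care can be directed where it is most likely to matter.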

So Dr. Data (as New York Times writer Steve Lohr calls one data scientist) is saving lives, even without a medical degree.

 

How do we know the predictions are accurate?

But here’s the catch: we’ve got to get the algorithms right. If we aren’t careful, we can draw conclusions that aren’t really there. To make giant leaps forward in understanding, we need a colleague on the case who really understands how to create algorithms that have practical value and accurate results.
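One basic discipline that guards against finding patterns that aren't really there is holding out data: a model should never be judged on the same patients it learned from. A minimal sketch of that split (the numbered records below are just stand-ins for de-identified patient data):

```python
import random

def holdout_split(records, test_frac=0.3, seed=42):
    """Shuffle records and split them so a model can be scored on
    patients it never saw during training."""
    rng = random.Random(seed)  # fixed seed makes the split reproducible
    shuffled = list(records)
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * (1 - test_frac))
    return shuffled[:cut], shuffled[cut:]

# Stand-ins for de-identified patient records.
patients = list(range(100))
train, test = holdout_split(patients)
```

If a model's accuracy collapses on the held-out patients, it memorized noise rather than learning a real pattern — exactly the kind of false conclusion we need our Dr. Data to catch before it reaches the bedside.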

Tom Hill, a colleague of mine at Dell, recently wrote a blog in which he noted the necessity of using a systematic, transparent approach to predictive analytics. “Harvesting big data carries with it the responsibility to do-the-right-thing with those data. Big or any data and predictive models in healthcare must be correct, access and tamper-proof (secure), must not discriminate, generally do-good, and not-do-any-harm.”

He goes on to talk about the need for transparency in analytics, so that those using the results understand what data is being used and how it is being analyzed. As changes or improvements are made, they must be documented, so that the transparency lives on.

I think this is a critical point for physicians who will be using the algorithms in the future. If our patients’ lives will depend on the quality of the analytics used to guide treatment decisions, we need to know that the algorithms are correct. We don’t want a black box that dispenses treatment prescriptions; instead, we want to know how the results are created, so that we can trust the advice offered and help guide future improvements.
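In code terms, the difference between a black box and a transparent model is whether the model can report its own reasoning. A hypothetical sketch of per-factor explanation (again, the weights here are invented for illustration, not clinically derived):

```python
import math

# Illustrative weights only. In a transparent system, a real model's
# coefficients would be documented, and re-documented with each revision.
WEIGHTS = {"diabetes": 1.1, "smoker": 0.7, "age_over_65": 0.8}
INTERCEPT = -3.0

def explain_risk(patient):
    """Return the overall risk plus each factor's contribution to it,
    so a clinician can see why the model flagged this patient."""
    contributions = {k: WEIGHTS[k] * patient.get(k, 0) for k in WEIGHTS}
    z = INTERCEPT + sum(contributions.values())
    risk = 1 / (1 + math.exp(-z))
    return risk, contributions

risk, why = explain_risk({"diabetes": 1, "smoker": 1})
```

Instead of a bare prescription, the physician sees which factors drove the score — the difference between being handed an answer and being handed an argument one can check, trust, and improve.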

Adopting analytics in ways that don’t risk lives

Dr. Hill’s point about “not doing harm” is well taken. As healthcare organizations add analytics to patient care, projects like the one at the University of Iowa are a good place to start. That project takes a body of existing knowledge about a large population of gastroenterology surgical patients and analyzes what factors were associated with certain outcomes. It then compares that analysis to a specific patient, providing insight into how that patient may do in post-surgical care.


The likelihood of a result that harms a patient is small. At worst, a patient might receive more care than is really necessary, or might not be recommended for care that would help. But that happens all the time without any analytics intervention, so using the insights from the analytics does not increase the risk to patients. And the care team can monitor the patient so that, if more extensive post-surgical care is needed, it can be ordered.
Other initial analytics projects in healthcare are looking at ways to predict surges in demand for care, based on environmental factors, and those projects also aren’t likely to put patients in harm’s way.

These kinds of projects allow an organization to use analytics for practical improvements while also learning how to use these new insights. As the organization’s expertise grows, the complexity of its analytics projects will likely grow, too. But starting with a project of limited scope and low risk of patient harm is a smart idea.
It’s also a way to help build trust. Physicians may be somewhat leery of trusting an analytics program to help them make treatment decisions, especially if a recommendation flies in the face of what that doctor has always done in the past. So institutions must be careful to build trust in analytics as they move forward. As physicians see the effectiveness of these tools, they’ll be more willing to engage in analytics themselves. So how, when and why you use analytics really matters. And making sure that you’re working with a really good Dr. Data is important, because at least for the foreseeable future, medical practitioners will be working very closely with Dr. Data to make analytics a powerful force for good.

 

This piece originally appeared in Becker’s Hospital Review: Calling Dr. Data: A new consultant is set to make medical care more effective

Calling Doctor Data was originally published on DrNic1


Virtual Assistants in your Future – Personal Healthcare Delivered

You can always rely on Hollywood to take concepts and extend them into the future – sometimes correctly (cloaking, holographic TV, forcefields and exoskeletons with mind control), sometimes incorrectly (aluminum dresses, a completely controlled atmosphere, suspension bridge apartment housing). We have had speech recognition and Spock’s request:

So it was no surprise to find that the latest Hollywood idea is “Her” – a lonely writer develops a relationship with a newly developed operating system.

The film intrigues and challenges our current concepts with its exploration of artificial intelligence, voice and natural language technologies. These new-style avatars listen to, understand and decipher what we say – something Nuance has been developing as it reinvents the relationship that people and technology can have. We can engage with our devices on our own terms, and we have shown these concepts in healthcare with our very own Florence – who is getting ready to launch in 2014.

Ambitious, you say? Maybe – but imagine an environment with intelligent personal assistants that hear you, understand you, and know your likes and preferences – and that, in our world, exist across your doctor’s office, the phone, surgery, the hospital, elderly care and hospice. Cool? Liberating? Impossible?

If you’re Nuance, the idea is not only brilliant – it’s our focus and drive as we reinvent the relationship between people and technology. It is the chance to connect with your devices on human terms and presents infinite possibilities for intuitive interfaces that adapt to you.

Liberating our clinicians to focus on the patient, and providing patients with someone they can talk to, interact with, and who does have time for them. That future is coming to a doctor’s office near you:

Science, Evidence and Clinical Practice

A recent article on the AMA site, The Difference between Science and Technology in Birth, demonstrates the challenges we still face in getting clinical practice influenced by science and data. Studies and data may show the path to best clinical practice, but as the authors note, there are multiple instances of the clinical community – in this case OB-GYNs – either knowingly or unknowingly failing to follow best practices.

For deliveries in the US, evidence tells us that continuous fetal monitoring in low-risk pregnancies has a deleterious effect – yet it remains standard practice in most settings to place internal scalp electrodes and intrauterine pressure catheters.

Although we still see external continuous fetal monitoring employed in many low-risk pregnancies, “as a routine practice [it] does not decrease neonatal morbidity or mortality compared with intermittent auscultation…. Despite an absence of clinical trial evidence, it is standard practice in most settings to place internal scalp electrodes and intrauterine pressure catheters when there is concern for fetal well-being demonstrated on external monitoring” [3].

 

They list several other standard practices including

  • routine episiotomy
  • use of doulas
  • challenges with epidurals

Reasons for these behaviors are varied, but as the authors state:

Many well-intentioned obstetricians still employ technological interventions that are scientifically unsupported or that run counter to the evidence of what is safest for mother and child. They do so not because a well-informed pregnant woman has indicated that her values contradict what is scientifically supported, a situation that might justify a failure to follow the evidence. They do so out of tradition, fear, and the (false) assumption that doing something is usually better than doing nothing

Until we fix these basic issues, there seems to be limited opportunity to implement intelligent medicine and real evidence- or science-based practices.

 

Why Even Radiologists Can Miss A Gorilla Hiding In Plain Sight

Posted in EHR, expert systems, HealthIT, Inattentional Blindness, radiology by drnic on February 28, 2013

Notice anything unusual about this lung scan? Harvard researchers found that 83 percent of radiologists didn’t notice the gorilla in the top right portion of this image.

Trafton Drew and Jeremy Wolfe


This story begins with a group of people who are expert at looking: the professional searchers known as radiologists.

“If you watch radiologists do what they do, [you’re] absolutely convinced that they are like superhuman,” says Trafton Drew, an attention researcher at Harvard Medical School.

About three years ago, Drew started visiting the dark, cavelike “reading rooms” where radiologists do their work. For hours he would stand watching them, in awe that they could so easily see in the images before them things that to Drew were simply invisible.

“These tiny little nodules that I can’t even see when people point to them — they’re just in a different world when it comes to finding this very, very hard-to-find thing,” Drew says.

 


In the Invisible Gorilla study, subjects have to count how many times the people in white shirts pass the basketball. By focusing their attention on the ball, they tend to not notice when a guy in a gorilla suit shows up.

But radiologists still sometimes fail to see important things, and Drew wanted to understand more. Because of his line of work, he was naturally familiar with one of the most famous studies in the field of attention research, the Invisible Gorilla study.

In that groundbreaking study, research subjects are shown a video of two teams of kids — one team wears white; the other wears black — passing two basketballs back and forth between players as they dodge and weave around each other. Before it begins, viewers are told their responsibility is to do one thing and one thing only: count how many times the players wearing white pass the ball to each other.

This task isn’t easy. Because the players are constantly moving around, viewers really have to concentrate to count the throws.

Then, about a half-minute into the video, a large man in a gorilla suit walks on screen, directly to the middle of the circle of kids. He stops momentarily in the center of the circle, looks straight ahead, beats his chest, and then casually strolls off the screen.

The kids keep playing, and then the video ends and a series of questions appear, including: “Did you see the gorilla?”

“Sounds ridiculous, right?” says Drew. “There’s a gorilla on the screen — of course you’re going to see it! But 50 percent of people miss the gorilla.”

This is because when you ask someone to perform a challenging task, without realizing it, their attention narrows and blocks out other things. So, often, they literally can’t see even a huge, hairy gorilla that appears directly in front of them.

That effect is called “inattentional blindness” — which brings us back to the expert lookers, the radiologists.

Drew wondered if somehow being so well-trained in searching would make them immune to missing large, hairy gorillas. “You might expect that because they’re experts, they would notice if something unusual was there,” he says.

He took a picture of a man in a gorilla suit shaking his fist, and he superimposed that image on a series of slides that radiologists typically look at when they’re searching for cancer. He then asked a bunch of radiologists to review the slides of lungs for cancerous nodules. He wanted to see if they would notice a gorilla the size of a matchbook glaring angrily at them from inside the slide.

But they didn’t: 83 percent of the radiologists missed it, Drew says.

This wasn’t because the eyes of the radiologists didn’t happen to fall on the large, angry gorilla. Instead, the problem was in the way their brains had framed what they were doing. They were looking for cancer nodules, not gorillas. “They look right at it, but because they’re not looking for a gorilla, they don’t see that it’s a gorilla,” Drew says.

In other words, what we’re thinking about — what we’re focused on — filters the world around us so aggressively that it literally shapes what we see. So, Drew says, we need to think carefully about the instructions we give to professional searchers like radiologists or people looking for terrorist activity, because what we tell them to look for will in part determine what they see and don’t see.

Drew and his co-author Jeremy Wolfe are doing more studies, looking at how to help radiologists see both visually and cognitively the things that hide, sometimes in plain sight.

This is a well-documented aspect of the human mind, and one we have all probably experienced in one form or another:
 

Inattentional Blindness
If you have not seen the gorilla video before, there is a whole web site dedicated to it here, along with their video.

And the original version of this (which I think is more compelling) can be seen here.

So important in so many areas – in airplane cockpits, many accidents can be traced to a failure to identify what may seem like clear indications of the fault or problem. The most recent example is Air France Flight 447. My favorite detailed report came in Popular Mechanics: What Really Happened Aboard Air France 447, which highlighted the fact that the aircraft was in a pilot-induced stall

during its entire 3 minute 30 second descent from 38,000 feet before it hit the ocean surface

Despite multiple warnings from the onboard systems (visual and audible)

In healthcare the same challenges exist, as this study by Drew aptly demonstrated:

He took a picture of a man in a gorilla suit shaking his fist, and he superimposed that image on a series of slides that radiologists typically look at when they’re searching for cancer. He then asked a bunch of radiologists to review the slides of lungs for cancerous nodules. He wanted to see if they would notice a gorilla the size of a matchbook glaring angrily at them from inside the slide.

83% of radiologists missed it.

This is a problem when we are asking our radiologists (and doctors) to speed through even more images – and more patients – in less time.

I suspect technology is going to have to help catch some of these instances and provide additional backup to the human mind. In fact, “Assure” is one example of some of the steps being taken toward this goal.