Under Pressure: AI FOMO in Healthcare

There is a human emotion nearly as universal as fear of the unknown: the fear of missing out (FOMO). Most of you have heard about, read about, observed, or experienced FOMO by now, most likely on Instagram, TikTok, or Facebook/Meta. The past HIMSS and SIIM conferences gave us first-hand evidence that FOMO has become an increasingly influential factor as the AI tsunami continues to flood healthcare, particularly the enterprise imaging sector. Companies and champions, dare I say "influencers," are manufacturing AI FOMO and confusion, just like the latest bitcoin craze, the GameStop madness, or NFTs. This is methodical, curated, purposely created FOMO, and healthcare is eating it up. For every successful Oxipit, there are dozens of companies lined up to offer AI solutions that may or may not provide breakthrough technology to improve workflows or patient outcomes. Compounded with burnout, short-staffed hospitals, and an ongoing onslaught of marketplaces, apps, and algorithms making egregious claims, this creates quite a dilemma for any practicing radiologist and healthcare organization.

AI FOMO Is Counterintuitive

The scary part about FOMO, or even peer pressure, is that all of us experience it, succumb to it, and yes, even make certain decisions (some good, some not so good) because of it.
In self-doubt, we observe what our friends, family, and colleagues do, and we engage in unproductive worry that we are somehow missing out on something important. Take, for example, the wildly popular travel destinations across the globe and the FOMO created when an influencer posts their reels, photos, or videos. What I find most interesting about these shots is that reality often doesn't measure up to what was portrayed, photoshopped, or otherwise significantly altered, yet the FOMO persists. Yes, I do want that photo of me standing in front of the Trevi Fountain in Italy. And yes, I want it without any of the hassle that comes with obtaining it. And, wait, am I now looking for flights to get said photo?

Instagram vs. Reality. Source: sajitha_travels

More recently, the pandemic exacerbated this problem, as many of us did in fact miss out on many of life's most basic joys during extended periods of uncertainty. In a collective planetary sigh of relief, our society came out of the pandemic with the sense that we needed to make up for lost time, and I've even caught myself overcommitting to too many events and gatherings.

This confluence of circumstances has given rise to a phenomenon, born of the social media era, seeping its way into healthcare: influencer culture. No, the influencing is not taking place on Instagram or TikTok, and it doesn't involve any of the Kardashians (phew!). Much of it is happening on LinkedIn, where self-declared AI gurus are popping up everywhere, actively branding themselves as luminaries who can guide healthcare organizations through this confusing new AI gold rush.


Prospector pans for gold. Source: Findgoldprospecting.com

In many ways this phenomenon reminds me of the irrational fears the world felt toward the Y2K bug, which turned out to be the biggest non-event of the last century. Many solid consulting firms did phenomenal work preparing the world's systems to handle those extra two digits; however, alongside the legitimate professionals who carried the majority of the workload, many less-than-scrupulous consultants enthusiastically fanned the flames of fear of the unknown, selling thousands of billable hours to naive customers who didn't fully understand the nature of the Y2K challenge. Those who approached the problem rationally and deliberately never skipped a beat. Those who let themselves be intimidated by fear of the unknown ended up okay too, but needlessly spent a lot of money in the process.

What Is Driving AI Adoption In Healthcare?

Make no mistake about it: when it comes to AI adoption in healthcare, everybody is learning on the job, consultants included. Many among us still don’t know what we don’t know about AI and its place in clinical workflows and Healthcare IT ecosystems.

Let’s take a look at the protagonists in this brave new AI world:

AI developers typically have undeniable expertise in data science, linear regression, and machine learning methodologies, but in enterprise imaging they are learning on the job, sometimes the hard way, that they must conform at the very least to DICOM and HL7, and preferably to DICOMweb and FHIR (RESTful APIs). Software developers are not traditionally trained in healthcare and sometimes have little understanding of clinical workflows, HIPAA requirements, or FDA regulation. This makes AI development challenging: the data scientists who train the algorithms often lack the practical experience to know where and how their creations fit into live clinical workflows.
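
To make the interoperability point concrete, here is a minimal sketch, in Python, of what standards-based queries look like in practice. The endpoint URLs and search parameters below are hypothetical placeholders; the patterns themselves, a QIDO-RS search from DICOMweb and a FHIR ImagingStudy search, are the actual standards an algorithm vendor would be expected to support instead of a proprietary API.

    # Minimal sketch: discovering imaging studies via DICOMweb (QIDO-RS) and FHIR.
    # The base URLs are hypothetical placeholders for a hospital's PACS and EHR;
    # the query patterns follow the published standards.
    import requests

    DICOMWEB_BASE = "https://pacs.example-hospital.org/dicomweb"  # hypothetical
    FHIR_BASE = "https://ehr.example-hospital.org/fhir"           # hypothetical

    # QIDO-RS: a standard RESTful search for CT studies, returned as DICOM JSON.
    studies = requests.get(
        f"{DICOMWEB_BASE}/studies",
        params={"ModalitiesInStudy": "CT"},
        headers={"Accept": "application/dicom+json"},
    ).json()

    # FHIR: fetch the matching ImagingStudy resources so AI results can be tied
    # back to the patient record in the EHR.
    bundle = requests.get(
        f"{FHIR_BASE}/ImagingStudy",
        params={"modality": "CT"},
        headers={"Accept": "application/fhir+json"},
    ).json()

    print(f"{len(studies)} candidate studies available for AI processing")

An algorithm that speaks these dialects natively can slot into an existing imaging ecosystem; one that doesn't forces the hospital to build and maintain custom plumbing around it.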

Hospital CIOs are still learning the true potential of AI and evaluating whether it will substantially improve their organization's clinical efficiency or do significant damage to an already overly complex and fragile IT ecosystem.

CISOs are trying to get ahead of the potential cybersecurity threats AI can present when implemented in production, especially in use cases that are deployed in the cloud.

Imaging IT professionals are torn between their duty to maintain mission-critical infrastructure and the increasing demand for AI integration. It's a bit like railroad workers in the 19th century, laying down new tracks with a steam locomotive breathing down their necks.

Transcontinental Railroad workers. Source: Weebly

Radiologists can see some of the potential for AI to help them find the needles in the haystacks, and thereby improve turnaround times. However, when they realize the AI they want doesn’t fit neatly within their expected reading routine, they tend to become dismissive because, after all, they need to keep busy reading cases. The flow of patient images does not slow down while the industry is trying to collectively wrap its arms around the problem of clinical AI integration.

The FDA, NIST, and MITA committees and subcommittees spend an immense number of collective hours discussing AI governance and refining new guidelines and rules to rein in AI in clinical environments for the sake of job number one: patient safety.

In public opinion polls, patients routinely express positive feelings toward AI in healthcare, but very few truly understand the actual impact, positive or negative, that AI could have on their health.

AI marketplaces continue trying to attract healthcare organizations to their shops, touting their ability to curate AI better than anyone else and to make it easy to integrate the chosen algorithms into existing clinical workflows.

Investors are pouring billions of dollars into AI ventures, much as they did in 2019, when 142 companies achieved unicorn status: venture-backed, privately held companies valued at $1 billion or more in a funding round. It was a stampede of unicorns, with notable names such as Uber, Zoom, Lyft, Pinterest, Airbnb, CrowdStrike, Instacart, Beyond Meat, Slack, and WeWork. The same is happening now, with investors hoping to land the next AI unicorn, if there is such a thing. We all know what happened to WeWork; investing in AI remains a cautionary tale. Still, these venture capitalists and PE firms are under diamond-generating pressure to produce an ROI from the AI gold rush, or at least to attach their names to an AI venture in some way, shape, or form. Again, FOMO on full display.

Source: Crunchbase

In many ways, it's the blind leading the blind. Except healthcare IT professionals can hardly be viewed as helpless. They have simply been made to feel intimidated by the relentless winds of change, fear of the unknown, and FOMO. This combination of circumstances has created perfect conditions for a whole new generation of consultants eager to feed on the industry's collective insecurities.

How To Avoid AI Pitfalls In Healthcare

The health IT community needs to wake up from the collective FOMO and go back to the basics of due diligence. Industry professionals know how to do this. They don't need outside experts (influencers) to tell them how to evaluate a technology or how to integrate it into their own domain, because nobody knows that domain better than they do.
When in a state of self-doubt, it's natural to seek outside help to compensate for our insecurities; it also provides a convenient outlet in case things don't go as planned (someone else to blame).
Clinical and IT professionals aren't shy teenagers seeking validation and self-confidence in the eyes of others on social media. They don't need influencers to figure this out. IT professionals have an important mission: to preserve the integrity and safety of their infrastructure above all. This internal compass, this guiding principle, is all the guidance they truly need to evaluate an algorithm and determine whether or not it can be elegantly integrated into existing clinical workflows.
If your organization is understaffed and you need outside help to deploy a vetted solution to production, by all means, bring in outside help. Before doing so, however, take the time to articulate your organization's AI fundamentals, protocols, and procedures for evaluating, testing, and implementing AI!

When the Founding Fathers of the United States sat down to write the first draft of the Constitution, they didn't outsource it. They created their own blueprint and set the terms for ratification rather than copying what others had done, because they trusted their internal compass to guide them in creating a unique set of fundamental principles for the country. Healthcare professionals can likewise leverage their own internal compass (patient safety), clinical experience, and workflow knowledge to shape AI protocols and procedures.

At the outset, form a committee of at least four individuals: a physician leader, a data scientist, an IT infrastructure professional, and a clinical workflow design expert. Each of these roles should already exist in your environment, or you wouldn't be able to function in the first place. Between these four leaders, the knowledge exists to establish a set of AI evaluation criteria, validation protocols, and implementation guidelines for your own organization.
In most situations, the argument to bring in outside counsel is predicated upon an organization’s lack of experience with AI implementation, or lack of knowledge about best practices. When it comes to AI implementation, chances are that you will be paying a consultant to learn on the job, simply because there isn’t enough collective experience with widespread, successful clinical AI implementation in our industry. These standards and best practices are being born as we speak, and they are the result of the collective work of the countless individuals who trust their own internal compass to guide them in the right direction.

10 Rules For Successful Clinical AI Adoption

If you are producing an AI app or algorithm and want to ensure it can be deployed in a clinical setting, follow the guardrails below:
  1. Be clinically relevant
  2. Know the clinician's perspective
  3. Respect the clinician's workflow preferences
  4. Be natively interoperable: use industry standards
  5. Neutralize bias in machine learning methodology
  6. Have a viable long-term business model
  7. Don't introduce latency into clinical workflows
  8. Be more accurate than a human
  9. Generate usable results
  10. Be equally deployable on-prem and in the cloud

Before embarking on a new AI adventure, healthcare organizations should go back to basics. Nothing beats deliberate due diligence and common sense. Trust your gut, the Force, your intuition, or otherwise. It always prevails.

For an entire playlist dedicated to the many feelings that accompany FOMO, check out our Spotify.