Healthcare AI: The Technologies, Tools, and Criteria That Matter

ODSC - Open Data Science
4 min read · Apr 26, 2021


Artificial Intelligence (AI) has the power to change the healthcare industry for the better, but we’re just now scratching the surface of its potential. In addition to improved, more cohesive care, AI can help with a range of activities, from vaccine development and accelerating clinical trials to automating once-mundane tasks and answering patient queries. That said, there’s still a lot to learn and many challenges to face in an industry as unique as healthcare.

To explore what those hurdles are and to get a better understanding of how new technologies are being applied, a new global Gradient Flow survey, sponsored by John Snow Labs, seeks to provide a temperature check on the state of healthcare AI today. By knowing which technologies, tools, and criteria healthcare organizations are focusing on to push AI initiatives forward, we can get a better handle on the promises and shortcomings that will launch us into, or keep us from entering, the next wave of AI adoption.

The Technologies

Healthcare leaders are getting serious about putting their valuable data to work, and the emerging technologies practitioners are focused on reflect that. Data integration, natural language processing (NLP), and business intelligence were each cited by close to half of respondents among the technologies they are currently using or plan to use by the end of the year. Additionally, more than one third of technical leaders indicated that their organizations are using or will soon be using data annotation tools and data science platforms. Electronic medical records (EMRs) have made this possible, but NLP lets users take it a step further by linking unstructured sources (free-text notes, social media posts, diagnostic images, and so on) to create a more holistic view of a patient.
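
As a side note, here is a minimal, hypothetical sketch of the kind of entity-extraction step such pipelines build on: pulling structured entities out of a free-text clinical note so they can be joined back to the EMR. spaCy and the en_core_web_sm model are illustrative assumptions, not tools named in the survey; a real clinical deployment would use a healthcare-specific model and an entity-linking step.

```python
# Illustrative only: extract entities from a free-text clinical note so they
# can be normalized and linked to structured EMR fields.
import spacy

# A general-purpose English model; a clinical model would recognize far more
# domain entities (drugs, conditions, dosages). Assumes the model is installed:
#   python -m spacy download en_core_web_sm
nlp = spacy.load("en_core_web_sm")

note = (
    "Patient reports shortness of breath and was prescribed albuterol "
    "after a follow-up visit on 2021-03-14."
)

doc = nlp(note)
for ent in doc.ents:
    # Each extracted entity (here, mostly dates and quantities) could be
    # normalized and joined to the patient's structured record.
    print(ent.text, ent.label_)
```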

Interestingly, patients are also at the forefront when it comes to who is using AI technologies. While data scientists and technical personnel were the primary practitioners just a short time ago, the advent of chatbots and other interactive, machine-powered processes has enabled clinicians and patients alike to get more acquainted with AI. When asked who the intended users are for AI tools and technologies, more than half of all respondents reported clinicians among their target users. Of mature organizations with extensive AI experience, 59% indicated that patients were also users of AI technologies. This will only grow as people get more comfortable with the accuracy and convenience machines can provide as a first means of contact.


The Tools

Even as cloud services proliferate, with ease of use and scalability as big selling points, cost and security can be a hindrance, especially in highly regulated industries like healthcare and pharma. Because of this, it’s not surprising that the most popular forms of software used to build AI solutions in the field are open-source (53%), followed by public cloud providers (42%), with data privacy emerging as a key challenge for cloud adoption. Patient privacy regulations and laws such as HIPAA often prohibit healthcare organizations from sharing data with third parties, which can make information-sharing difficult.

This is likely the main reason why companies with experience deploying models into production more often choose to rely on their own data and monitoring tools rather than on third-party partners or software vendors. In contrast, companies still in the early stages of exploring AI are more open to evaluation metrics provided by software vendors. This makes sense both from a regulatory perspective and given the unique needs of a complex, quickly evolving healthcare industry.

The Criteria

As the findings around AI tools suggest, selecting the right partners, solutions, and models is another important but difficult part of AI adoption. When evaluating a consulting company to work with, technical leaders rated expertise in healthcare AI data engineering, integration, and compliance (41% considered this very important) and no sharing or derivative rights to their data or code (45%) as most important. Considering medical jargon, terms and acronyms that carry different meanings at different hospitals, and strict regulatory requirements, it’s obvious why these criteria top the list in the search for a good AI partner.

These ideals stay fairly consistent when it comes to selecting an AI technology, too. Technical leaders value state-of-the-art accuracy (48%), no data sharing with software vendors (44%), and the ability to train their own models (42%) when evaluating machine learning, NLP, or computer vision solutions. Similarly, for locally installed software libraries or SaaS solutions, healthcare-specific models and algorithms (42%) and having a production-ready codebase (40%) are key.

It’s promising to see how AI is currently being used by healthcare organizations and the plans they have for just the coming year. While it’s clear mature organizations are well on their way to being AI-enabled, the tools and technologies that prevail will be the ones that focus on accuracy, offer healthcare-specific capabilities, and prioritize protecting sensitive information. Given the pace and promise of AI development, it will be interesting to see where we are a year from now.

Original post here.

Read more data science articles on OpenDataScience.com, including tutorials and guides from beginner to advanced levels! Subscribe to our weekly newsletter here and receive the latest news every Thursday. You can also get data science training on-demand wherever you are with our Ai+ Training platform.
