CEO: learning where to use AI is as important as learning how

The key to successful use of AI, says one leader, is corporate responsibility and education about how AI should be used, its capabilities and its limitations.
Jeff Rowe

To put it mildly, there’s been a little bit of buzz about AI lately. But while few stakeholders discount the potential of new and emerging AI technologies, even some enthusiasts are advising cool heads and caution.

For example, writing recently at Forbes, Dekel Gelbman, CEO of FDNA, an AI and digital health company that develops technologies and SaaS platforms in the clinical genomics space, readily acknowledges that what he refers to as “narrow AI,” which he defines as “computers performing a narrow task, such as facial recognition or sifting through huge volumes of data,” is an undertaking “that machines are better equipped to do than humans are.”

He’s quick to point out, however, that “while AI can far surpass humans at gathering and learning from mass quantities of data, there will always be the need for a person to interpret that information and reason with a combination of logic and emotional intelligence to decide how best to use it.”

Specifically, he says, “similar to how skill is the sum of our experience, AI is the sum of the data it’s trained on.” In other words, while descriptions of the potential of AI, both beneficial and malign, often gravitate towards scenarios of “robots taking over the world or creating a race of superhumans,” Gelbman suggests “it’s far more likely we’ll see AI trickle into the field of healthcare,” with focused use in particular disciplines.

As an example, he notes how complicated making a medical diagnosis can be. “Doctors go through medical school, medical training and years of practice to develop their cognitive abilities — not to store millions of reference images in their heads. AI in its most influential state should be harnessed as a tool to enhance human abilities, not try to replace them. For example, clinicians can think about a potential diagnosis they might have never previously considered by working with AI that highlights findings and makes information accessible.”

Looking at AI’s potential from a different angle, Gelbman points to cost savings opportunities, but here again the effect will be somewhat gradual. “While the costs of services such as genome sequencing are beginning to decrease,” he explains, “multiple lab workups can still be a costly undertaking; in the genetic disease space, this can sometimes feel like looking for a needle in a haystack. I believe integrating AI technologies into the clinical workflow can aid in reaching an earlier diagnosis, thus eliminating the need for multiple lengthy tests.”

In the end, says Gelbman, “if the healthcare industry wishes to play a continued leadership role in promoting AI, we need to demonstrate corporate responsibility and invest in education — specifically in how AI should be used, its capabilities and its limitations. In this way, we’ll be able to establish a shared meaning between people and machines — and, more importantly, among our peers as we embark on the next phase of technological evolution.”