That's one hell of a breakdown, Naveen. I read it twice, because there is a lot(!) to take in.
As to the AI claim, I was left with the impression that the device was merely a camera attached to a database - all old technology. Computer optical character recognition (and chess!) was big in the 1950s, facial recognition in the 1960s, and retinal imaging in the 1980s. What's new here is only big memory - and, consequently, big datasets. But I do understand the definition of AI is a rather loose one, and always has been.
The AI engine works behind the scenes, pretty much in real time: images are transferred to a remote server, where they are analyzed using AI.
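To picture that capture-and-upload flow, here is a minimal client-side sketch. The endpoint URL, file name, field name, and response format are all assumptions for illustration; the device's actual API isn't documented here.

```python
import requests

# Hypothetical endpoint and local image path - placeholders only,
# not the vendor's real API.
API_URL = "https://example-screening-server.com/api/v1/analyze"
IMAGE_PATH = "retinal_scan.jpg"

def analyze_image(image_path: str) -> dict:
    """Upload a retinal image to the remote server and return the AI analysis."""
    with open(image_path, "rb") as f:
        # The image travels to the server; the heavy AI work happens there.
        response = requests.post(API_URL, files={"image": f}, timeout=30)
    response.raise_for_status()
    # The server sends back its analysis, e.g. a screening result.
    return response.json()

if __name__ == "__main__":
    print(analyze_image(IMAGE_PATH))
```

The device itself stays thin; all the "intelligence" lives server-side, which is why it can feel like a camera bolted onto a database.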
Here is a recent article comparing the AI algorithms of two different screening tools:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8199438/