Instead, a physician working for the insurance company will use some sort of GPT to come up with an excuse to deny care, and then write that up in their own words.
How about just prohibiting them from denying coverage <i>at all</i>, except for bona fide fraud? So much of the rot of the healthcare system stems from this broken regulatory approach of thinking that corporate bureaucrats will somehow "manage" care better than patients and doctors can. These "insurance" companies should be relegated to a purely financial role, as a basic table stakes reform.
> In an example involving a decision to terminate post-acute care services, an algorithm or software tool can be used to assist providers or MA plans in predicting a potential length of stay, but that prediction alone cannot be used as the basis to terminate post-acute care services.<p>So, the Random Forest can still say "No coverage", just so long as a human hits "Accept computer recommendation".
I don’t see how we can use AI and not have it generate new biases.<p>I also feel like biases have been baked into insurance data in a discriminatory way for a long time — think about how much car insurance costs for a 16-year-old boy.