From the linked IMF article:

> Fintech resolves the dilemma by tapping various nonfinancial data: the type of browser and hardware used to access the internet, the history of online searches and purchases. Recent research documents that, once powered by artificial intelligence and machine learning, these alternative data sources are often superior than traditional credit assessment methods, and can advance financial inclusion, by, for example, enabling more credit to informal workers and households and firms in rural areas.

No research is linked, but assuming for the sake of the argument that this kind of statistical data is roughly accurate in the first place, feeding it into some black-box ML "AI" system is likely to produce a fair number of inaccurate results, and since it's all a magical system you can't really argue with it; it's just "computer says no". Even employees high up the food chain (if you get that far) won't really be able to help you either, because gaining insight into these systems is hard and they deal poorly with exceptional situations (and it turns out there are a lot of those; it's just that each one is different).

Without going into all the privacy aspects and other problems, it's a good example of the dehumanisation of our society. At the end of the day you just can't beat human judgement and "common sense", but the possibility for that is increasingly being taken away from us. Sure, humans are not perfect, but just because a few mistakes are made doesn't mean that removing the human aspect entirely is desirable.