It's interesting how people generally consider it acceptable for insurance companies to discriminate based on gender, when that discrimination would probably not be considered OK in other areas. You don't choose your gender, why should you be punished for it? Would people also be OK if insurance companies discriminated based on race? Surely there is some correlation between race and collision risk as well.
It’s a tough philosophical question as to whether it is fair to discriminate based on gender to determine insurance rates. In the U.S. we split the difference. For car insurance and life insurance, where men have significantly greater risk, men pay more, because they cost the insurance company more. For health insurance, where women cost the insurance company more, insurance companies are forced to charge the same rates, because it’s wrong to make someone pay more for something based on an inherent trait they have no control over.
I wonder... is there any good solution to this issue from either side? Outside of legitimate gender change (bear with me -- I know some of you don't think this is possible, but let's go with the assumption), can you really blame the guy? And as an insurance company, what can you do about it? I would say maybe we could have flat rates, but then that just incentivizes the "good" class to go to an insurer who values their "goodness".
British Columbia will soon support answering “not specified” for gender on identity documents. I wonder how insurance providers will cope with that. Will they demand you tell <i>them</i> a gender? Will they try to infer it? Will they use low-statistical-power actuarial data for the insurance usage rates of “not-specified people”?
So the simple solution to this type of manipulation is adding a part to the gender change request that says "I swear under threat of legal penalty that I identify as [gender]". Doesn't hurt trans people and makes abuses of the system like this clear cases of legal fraud.
Favorite quote:<p>> "If you're going to declare on any document, you need to be truthful," he said. "If not, you're making a fraudulent claim. This could impact you for any future insurance application that you make, or any other aspect of your life."<p>Unless the insurance commissioner is going to provide an objective definition of gender that can be externally and independently verified, I'm afraid that he really has no grounds on which to claim David is a man, rather than a gender-nonconforming woman who prefers male pronouns.
Is it gender discrimination to charge one gender more than another for the same service?<p>And if so, and more broadly, is there a problem with changing gender identification to gain a preferential price or service?
What if the insurance company used machine learning to calculate the premium, which resulted in correlations with gender, race, etc.? Is that also considered discrimination? Whose fault is that?
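A toy sketch of this problem, using entirely hypothetical data: even if the model never sees gender, a correlated proxy feature (here, vehicle class) can reproduce gendered pricing on its own.

```python
from collections import defaultdict

# Hypothetical historical policies: (gender, vehicle_class, had_claim).
# Gender is *excluded* from pricing; only vehicle_class is used.
history = [
    ("M", "sport", 1), ("M", "sport", 1), ("M", "sport", 0), ("M", "sedan", 0),
    ("F", "sedan", 0), ("F", "sedan", 1), ("F", "sedan", 0), ("F", "sport", 1),
]

# Claim rate per vehicle class, "learned" purely from the data.
claims, counts = defaultdict(int), defaultdict(int)
for _, vc, claim in history:
    claims[vc] += claim
    counts[vc] += 1
rate = {vc: claims[vc] / counts[vc] for vc in counts}

BASE = 100  # hypothetical base premium
premium = {vc: BASE * (1 + rate[vc]) for vc in rate}

# Average premium by gender: gender never entered the model,
# yet men pay more because they drive the riskier class more often.
by_gender = defaultdict(list)
for g, vc, _ in history:
    by_gender[g].append(premium[vc])
avg = {g: sum(v) / len(v) for g, v in by_gender.items()}
print(avg)
```

So "whose fault is that?" is genuinely hard: nobody fed the model a protected attribute, but the historical data encoded it anyway.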
In the Netherlands we had a similar thing: a car insurance policy marketed specifically to women, with lower rates, a nice pink website, and you even got a free purse as a welcome gift. But due to anti-discrimination laws you could simply apply as a man as well, if you could stand the pink website. Of course, you got the free purse as well.
Well, we've all thought of it; now someone's done it (well, lots of people have probably done it, this guy is just the first one to risk telling people he's done it). $91/mo adds up.
A man in Argentina changed his legal gender so he could retire earlier:
<a href="http://www.dailymail.co.uk/news/article-5544173/Argentinian-man-legally-changes-gender-retire-earlier.html" rel="nofollow">http://www.dailymail.co.uk/news/article-5544173/Argentinian-...</a>
> "I'm a man, 100 per cent. Legally, I'm a woman," he said.<p>> "I did it for cheaper car insurance."<p>Yeah, I see what you did.<p>"it's because of the insurance!! I swear!!"
This is a perfect example of "AI gone wrong" even though there was no AI involved.<p>The costs of insurance are based on actuarial tables, which are really just calculations based on large chunks of historical data, much like an AI. And much like an AI, the result essentially magnifies the biases that already exist in the data (biases that may be accurate or may not be).<p>Neither the tables nor the AI care about ethics or perception. They are simply the result of the inputs given.<p>Do men really have more tickets and accidents? Maybe. Or maybe they just get caught more.<p>It just highlights how careful we have to be about biases, real or accidental, as we rely more and more on mathematical models based on data.
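The "maybe they just get caught more" point can be made concrete with a minimal sketch (all numbers hypothetical): if two groups have identical underlying risk but one group's incidents are recorded more often, a table built from the recorded data faithfully passes that measurement bias straight into the premiums.

```python
# Hypothetical: true accident risk is identical for both groups,
# but group A's incidents get recorded more often ("caught more").
true_risk = {"A": 0.10, "B": 0.10}   # identical underlying risk
detection = {"A": 0.90, "B": 0.60}   # assumed recording/enforcement bias

# The actuarial table only ever sees the *recorded* incident rate.
recorded_rate = {g: true_risk[g] * detection[g] for g in true_risk}

BASE = 1000  # hypothetical annual base premium
premium = {g: BASE * (1 + recorded_rate[g]) for g in recorded_rate}
print(premium)  # group A pays more despite identical true risk
```

The table did its arithmetic correctly; the unfairness, if any, was already in the inputs.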