Having worked in the connected car/telematics industry for a while as a contractor, I can very well relate to this, and I can confirm that the security systems inside a car's telematics unit are not good enough. For example, in one OAuth flow for authenticating a car with the cloud, the VIN was passed around as the client secret and the MDN (the phone number of the modem) as the username! We recommended immediately stopping this practice, but the "IT" dept of the automaker said, "You know, we sell cars, not security software." There was no budget to rewrite the mechanism, and the telematics unit cannot be updated OTA; an upgrade requires customers bringing the car to a dealer for a USB-stick update.<p>I believe the frequent bursts of data from the car were given to insurance companies. Or they were trying to package an insurance deal along with the car sale or something.
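To make the anti-pattern concrete, here's a minimal sketch of what such a token request might look like. All field values and names here are invented for illustration; the point is that every "credential" in it is a value an attacker can simply look up.

```python
# Hypothetical sketch of the broken design described above: an OAuth
# password-style grant where the "secrets" are publicly observable.
# The VIN is printed on the dashboard/windshield and the MDN (the
# modem's phone number) can be recovered from the network, so neither
# is actually secret.

def build_token_request(vin: str, mdn: str) -> dict:
    """Build the (insecure) token request body used in this sketch."""
    return {
        "grant_type": "password",
        "username": mdn,        # modem phone number -- discoverable
        "client_id": "telematics-app",
        "client_secret": vin,   # readable through the windshield!
    }

req = build_token_request(vin="1HGBH41JXMN109186", mdn="+15555550123")
# Anyone who can read the VIN off the car can reproduce this request
# and impersonate the vehicle to the cloud backend.
```

A proper design would instead provision each telematics unit with a unique, randomly generated secret (ideally in a hardware-backed keystore) at manufacture time.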
No mention of the irony that someone who doesn't use Google Play Services because he only uses open-source software is willing to attach a device running closed-source software to his car, one that tracks everything he does in it?
Until there is some kind of law in place that makes companies financially responsible for this kind of blunder, it will proliferate. In the current state of affairs it's simply not economically justified to implement proper security.
So they had this vulnerability live for 3 years, didn't even pay a bounty, and they <i>still</i> don't get named or shamed? What incentive is there to do a better job if they can just do a shitty job and nobody finds out?<p>Name and shame, please!
I can't believe that anyone would voluntarily sign up for this. Frankly, insurance isn't that expensive.<p>Having a little third party controlled snitch hooked to your car is a security issue, period. The fact that the implementation is a shitshow is just icing on the cake.
Terrible, but they did fix it rather quickly once the flaws were disclosed. Given many other such stories, the almost expected outcome would be to deny the problem, have the discloser prosecuted or sued, and put out a fix six months later that made things worse.
What's the point of hiding the identity of the company here? The issue has apparently been fixed and I'd rather know which company had it so that I can avoid them.
The EU and its member countries are still interested in personal privacy. Do they regulate insurance providers? Could EU, or Italy, exact a penalty against this provider for failing to do the most elementary of penetration tests on this system? Perhaps some of the penalty should be a return of premium payments to customers whose information was potentially exposed.<p>The point is to make the business-risk managers in other provider companies say to their executives: "We cannot take the risk of skipping cybersecurity hardening. If we do skip it and we get caught, our business will be forced into bankruptcy."
Note that with the latest changes to Android, using mitmproxy to analyse the behaviour of apps has become much harder: apps refuse to accept user-installed certificates.<p>In the future, we'll see fewer revelations about this sort of thing, not because it has become rarer but because Google have chosen a course of action which obscures it.<p>(it also breaks things like personal or corporate CAs, but that's a different problem)
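For context, the behaviour described here comes from Android 7.0's Network Security Configuration: apps targeting API level 24 or higher ignore user-installed CAs by default, and only the app's developer can opt back in. A rough sketch of that opt-in (which the developer would reference via `android:networkSecurityConfig` in the manifest) looks like this:

```xml
<!-- res/xml/network_security_config.xml -->
<network-security-config>
    <base-config>
        <trust-anchors>
            <certificates src="system" />
            <!-- Re-enable trust in user-installed CAs (e.g. a mitmproxy cert) -->
            <certificates src="user" />
        </trust-anchors>
    </base-config>
</network-security-config>
```

Since almost no production app ships this opt-in, users who want to inspect an app's traffic are left with workarounds like rooted devices or repackaging the APK.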
When my insurance company offered a discount a few years back for using one of these devices, I smelled a rat. I figured they would use it to observe how fast I drive vs. the speed limit so they can decide how "safe" a driver I am, or whatever. But my insurance is also very inexpensive, so discounts on it are not a big motivator.<p>I guess location tracking would make sense too, so they can bust you if the car stays somewhere other than the address it's insured at. Or god knows what else. All of this shit is only going to get worse, a lot worse.
it's really sad how young online political activists have adopted privacy issues instead of adopting issues like workers rights, vacation time, pay, a strong welfare state, universal healthcare etc...