Automakers are sharing consumers' driving behavior with insurance companies (nytimes.com)
272 points by xnx on March 13, 2024 | 324 comments



When I got my last car, it had a sticker instructing me to press the SOS button to opt out of data collection. The gentleman that answered acted confused when I made the request, taking the stance that he wasn't aware of data collection from the car and maybe I should contact the manufacturer. It was only after I read him the sticker text verbatim that he went into a scripted response and confirmed the opt out. Shady.


The confusion was likely a scripted response too.

I had a friend who worked with call centers, taking orders over the phone. They had a script for everything, and after they quickly took your order, they started with a scripted upsell. If the person balked, they had responses for everything so they could continue. The only way out of the script was if the customer said "I will cancel my order". The call center person would be fired if they did not follow the script.


Xfinity similarly does this if they overcharge you on a contract. They try to upsell you on a new (lower-value) contract for an hour before finally just changing tone and saying "you're right, we messed up" and refunding you.


After all, you called them, which shows you're a ripe target. Every sales person knows that. Oh, you thought they were a support person? Maybe sales support.


When in doubt try the "God" opt-out. I have found they usually either do not have a scripted rebuttal to this or they are legally not allowed to go against one's religious views. This can also be useful when feeling pressured to make an immediate decision by asking for a moment to "pray".

Examples:

1- "God told me to..."

Before I finally canceled Time Warner Cable, I had tried a few times. I was always met with rebuttals or put on hold so they could "research the account." Either I waited on hold for 20 minutes or the call was disconnected. I tried again and gave "God told me to" as the reason for cancelling; the account was cancelled immediately. (I have also heard simply saying you're moving out of the service area works too.)

2- "Moment to pray" - Buying time or dissuade an aggressive salesperson.

My wife recently gave birth to our first child. When her water broke, she wanted to go to the nearest hospital, which was within walking distance, instead of getting a cab across town to the hospital where she was originally being seen. She remained adamant that she wanted a natural birth unless it would potentially endanger her health or our daughter's. She has two friends who were pushed into C-sections, both of whom ended up having complications from the procedure.

Six or seven hours after her water broke, an administrative nurse came in with a shot and said she was going to start inducing labor. My wife pushed back against this, as she was worried about complications from a C-section. The administrator had a rebuttal to everything, to the point that she even started crying. When I continued to hold my ground, they brought up concerns of spousal abuse.

This raised an immediate red flag, and luckily I defaulted to "God mode".

I asked if we could have a moment to "pray". This bought us enough time to speak with the doctor face-to-face who agreed to wait a few more hours before inducing labor to avoid a C-section. We had a successful natural birth with no side effects.

(Hospitals make 2-3x as much from a C-section as they do from a natural birth.)

(*When asked if it was medically necessary the administrative nurse would only say it was hospital policy and they do the same for everyone else who was on the floor.)


It’s scary they pulled the “spousal abuse” card.

It should be illegal to manipulate people by implying they’re breaking the law.


Modern medicine is great, but sadly they push C-sections a lot. My understanding is they can bill at least 50% more than a natural birth. This skews even higher if you don't have the best insurance.

A bit outdated but: https://www.theatlantic.com/ideas/archive/2019/10/c-section-...

https://archive.is/4jj0h#selection-761.59-761.224

Also, once you have one C-section, the chances of a second increase significantly. It is not recommended to have more than two C-sections, so if you want three kids you are out of luck.


My wife's first language is not English. (Though she has been a US citizen for well over 12 years.) The hospital brought in someone who spoke both languages to raise the concern while I was in the room.

Friends suggested bringing in a doula. They are really good at advocating for patients' rights.


"Are you implying that I'm implying you're breaking the law?"


Sounds like prompt injection


Explain?


It was a joke based on how people are manipulating LLMs to get around filters. People call it prompt injection, kind of like SQL injection.

https://learnprompting.org/docs/prompt_hacking/injection


I never saw one of those when buying my most recent car (Toyota). But, after creating an account on their website I was able to find this section:

  Consent Centre
  E-privacy: Your car data

  Help us improve our products by sharing car data and diagnostic info.

  We don’t currently have your consent to use this data. This may be for the following reasons:

    Your car doesn’t have Connected Services. Contact your dealer to check if they can be added.

    No (connected) car has been added to your account. Go to My Vehicle, add a car if needed, then connect your car via the Connected Services card.

    You have a connected car but have yet to consent. Go to My Vehicle, and open the chosen car. You will be requested to give your consent.

 
  Toyota thanks you for helping us to improve our products.

FWIW, their smartphone app contains a "consent" section with various toggles for the usual reasons (e.g. "sharing data to improve products") and these are all turned off.


I love the attention to detail here, like any customer ever has navigated to this page trying to find out how to give consent to having their data collected.


I was looking at the Prius Prime for a next car--thanks for the heads-up!


Prius Prime has a single fuse [labeled `DCS`] which, once removed, actually disables the computer module that connects to the two cell antennas.


This is excellent news. Thanks!


I test drove a Prius Prime, and ended up purchasing a Camry Hybrid (instead). Was less expensive, and is much better on the highway [gets 51+mpg].


If I get a new car I'm just taking the modem out.


If an insurance company can’t find data on you when they expect it, won’t they just charge you the high risk premium?


If you won’t tell them whether you have a car alarm or a secure garage they just assume you don’t.


One of the automakers in the article claims that voids your warranty. It may or may not but enjoy the legal battle should you ever need to make a claim.


It does not. There are laws against voiding a warranty based on aftermarket modifications made by the end user. This is the case in the USA at least.


Brand?


Toyota.

I still had the sticker- I peeled it off and put it in the manual. The exact text is:

VEHICLE DATA TRANSMISSION IS ON! Your vehicle wirelessly transmits location, driving and vehicle health data to deliver your services and for internal research and data analysis. See www.toyota[.]com/privacyvts. To disable, press vehicle's SOS button.


Prepare to be creeped out:

https://www.toyota.com/privacyvts/


Yeah, right at the top

> We may modify this Privacy Notice by posting a new version at this website effective on the date of publication.

No actual notification needed. Are they screaming to be regulated?


Isn't this how all ToS work though? They all contain some clause that says "actually we reserve the right to do whatever we want whenever we want and everything you read here is meaningless"


Yep, and that's why physically defeating the device is always better than relying on the other party to honor their word. Disconnect the wireless module/board and they can't send your data.


I've physically disconnected the modem in my Audi, though when it goes to the dealer for a service (still under extended warranty) I cannot help but imagine that they download the data.



Similar experience, but to be fair at least Toyota has a big sticker. Other cars are doing it too and just not telling you. Either way, my next car is going to be a '90s Land Rover.


I found this Facebook group called "Low Miles No Miles"

https://www.facebook.com/groups/2066782386864241/

I was very surprised how many people bought cars from the factory in the 1990s and immediately stored them. In some cases the owners did not even remove the factory stickers or plastic.

This group must be showing just the tip of the iceberg.


Was it a Toyota?


This kinda confirms, unfortunately and sadly, that ChatGPT answers are probably just as good as human answers. And the data collection for your phone call went to training, not into a database where you officially opted out.


> This kinda confirms, unfortunately and sadly, that ChatGPT answers are probably just as good as human answers.

These people are paid to follow scripts and strict protocols. At best, this may suggest that ChatGPT answers are as good as a call center representative's answers.


I think there are a few reasons to be mad about this, but some are better than others:

1) People don't understand they're being monitored. I think this is a good reason to be mad. People should have some understanding of the agreements they make. It's part of being a functional adult in the world. It's also annoying that the companies keep spinning this as a tool to improve your driving, when it's clearly an attempt to price insurance against a person's actual driving habits.

2) The system's assessments are opaque. I don't have a good sense of how accurate any of these measurements are, nor what system is in place to ensure that. If the information collected is consequential enough to double a person's insurance costs, there should be some effort expended to be confident that the collected metrics actually reflect reality. I didn't see anything like that in the article, maybe I missed it, but it shouldn't just be some random team in a private company doing their best.

3) People's driving habits shouldn't be shared with insurance companies. This one ... this one I think is not great. It looks like the shared data at least tries to be anonymous -- they share driving behavior and times, but not actual location data. Heck, I'd be fine with scrubbing the times and just sharing the hard start and stop and speeding numbers (assuming point 2 above is addressed). I get that a knee-jerk defensiveness about privacy would make Thomas Jefferson proud or whatever, but we strike balances on public welfare and private freedom all the time. If you're itching to manufacture 1 gram of ricin to put in a sealed glass vial above your mantle, too bad, you can't. Cars aren't ricin, but they are the shortest path between most humans and homicide. If this kind of intervention induces people to be more careful to keep their current insurance rates, I think that's reasonable. Driving like a maniac is not a human right or a protected characteristic.


I've avoided accidents by hard braking twice in the last two years, once when deer bounded into the road, and once when a deaf old cat walked into the street.

I haven't been cited for anything in decades, and have never been in an at-fault accident. I drive the speed limit and have a dashcam. With the deer, I was actually 10 MPH under the limit.

So should my rates go up for these incidents where I successfully avoided hitting something? Insurers are unscrupulous and would use any excuse.

No, thanks. I'll share nothing.


Amen. It's their job to calculate risk. Not my job to be "transparent". The ratchet only goes in one direction.

I won't be an Amazon driver in my own car.


But they have the tech to collect the data and make extra cash by selling it and making you an Amazon driver in your own car. So if they can, they will. Unless there’s something to stop them. Which in the absence of their goodwill would be legislation.

Unfortunately legislation representing anything other than big money interests is difficult and rare to pass.


The insurance company would argue that you drive in an area with wildlife crossings. That makes you a higher risk even if you managed to avoid this deer. You are more likely than average to encounter another one in the future and may not be as fortunate.


I did pest control for a while, and my truck was equipped with a monitoring device that would beep if it detected unsafe driving. The thing was inconsistent enough to be nearly indistinguishable from random. It sometimes nagged me while driving straight at normal speeds, or going over a pot hole, or just stopping like normal at a stop light. At other times, it wouldn't go off for what should have been obvious "offenses": hard stops, last-second swerves to avoid road debris, etc.

All in all, I think it was useless for actually policing driving behavior, but I did get identified (read: randomly selected) as the safest driver in the branch one month and got a bonus, so I guess that was nice?


I drove a newer Subaru for a couple days and it had a "feature" like this, with a camera pointed at the driver that would beep if it thought you weren't paying enough attention. Just like the pest control truck, it was inaccurate to the point of being totally useless and very annoying. The stupidest part was that it couldn't be disabled. I was a happy Subaru owner for many years, but the driver camera and a few other modern owner-hostile features totally turned me off to the company.


It seems that in at least some models it can be disabled.

https://www.youtube.com/watch?v=c1xLD_UKUEY


Did your rates go up? Or is this a straw man argument?


They didn't, since my insurer was unaware of these events.


Location can be relevant. There are both a quarter-mile drag strip and a lapping circuit near me that allow you to drive your own car on them.

Both styles of driving would be... Alarming from a telemetry perspective.

Afaik neither is covered by regular auto insurance anyways so it really shouldn't factor into rates. There's specific racing insurance, but it's quite pricey.

Not that I want them sharing location data, but pure acceleration/velocity data won't show areas like that.

I'm also not sure how well regionalized the data is. Though neither is good, there's a very big difference between going 15 over on the highway and going 15 over on back country roads with blind turns. Or between going 15 over on the highway vs in a shopping center parking lot.

Speeding is contextual.


It’s also difficult to determine if someone is speeding from data.

For example, according to the speed limits my car thinks apply, the road I live off of goes from 40 to 65 to 25 to 65 to 40 in about a 4-mile span. Spoiler: it does not. It is 40 the whole way. But according to the car, I am either going 25 under, 15 over, or exactly the right speed.

(And the 65 section in the middle? Blind corner. Idk where it’s getting its data but it is very very wrong)


Indeed, the usual garbage in garbage out issue.

Iirc, though, I think I read something about this and they were more interested in average speed (regardless of posted speed), and the rate/frequency of acceleration/deceleration (especially deceleration).

The idea being that speed increases accident severity, regardless of posted speeds. Rapid deceleration is indicative of reacting late to something you should have seen and responded to earlier (eg following too closely and having to slam the brakes, not seeing someone merging, not slowing down for a yellow light, etc).

Basically that a safe driver would have a fairly smooth acceleration/deceleration profile because they're aware of what's happening around them and plan ahead accordingly. If someone wants to merge in, give them room and then drop back enough that you can brake slowly if something happens.
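
To make that concrete, here is a minimal sketch of how metrics like average speed and hard deceleration could be derived from a sampled speed trace. The 3 m/s^2 threshold, field names, and one-second sampling are invented for illustration, not any insurer's actual model.

    # Illustrative only: rough trip metrics from a speed trace sampled once per second.
    # The 3 m/s^2 "hard brake" threshold is a made-up placeholder.
    def score_trip(speeds_mps, sample_dt_s=1.0, hard_brake_mps2=3.0):
        if len(speeds_mps) < 2:
            return {"avg_speed_mps": 0.0, "hard_brake_events": 0, "max_decel_mps2": 0.0}
        avg_speed = sum(speeds_mps) / len(speeds_mps)
        hard_brakes = 0
        max_decel = 0.0
        for prev, curr in zip(speeds_mps, speeds_mps[1:]):
            decel = (prev - curr) / sample_dt_s  # positive while slowing down
            max_decel = max(max_decel, decel)
            if decel >= hard_brake_mps2:
                hard_brakes += 1
        return {"avg_speed_mps": avg_speed,
                "hard_brake_events": hard_brakes,
                "max_decel_mps2": max_decel}

    # Cruising at ~25 m/s, then dropping to 5 m/s within two seconds:
    print(score_trip([25, 25, 26, 25, 14, 5, 5]))

A score like this only sees the kinematics; it has no idea whether a hard stop was tailgating or a deer.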

I still don't want to be tracked, but their metrics seemed sane at first pass.


As someone in the Northeast US: Many of our highways were designed 80+ years ago, and do not have appropriate acceleration/deceleration lanes.

The terrible drivers are often those with the most timid inputs, especially with regards to acceleration. It is perfectly normal here to need to merge into heavy, 60mph+ traffic from a dead stop, or to need to quickly match speed and identify an appropriate merge spot to not wind up stuck at the end of a ramp.

And it's not like they sit there for 15 minutes waiting for some exceptionally large gap to match their acceleration habits - that would be very annoying to other drivers, but theoretically "safe". They enter in the same length gap as someone that actually uses their gas pedal - but rely on oncoming traffic to hit their brakes/evade, as they fail to get up to speed quickly enough for the small gap they've entered in.

-----

Hard braking is something with fewer reasons it should happen regularly - but I'm still reminded of the usual adage about metrics. Do you really want people to be mentally reluctant to hit their brakes as hard because of the insurance hit? That seems like a recipe for increasing decision time and accidents.


Maybe if the only thing you're reacting to is other vehicles or the road. The number of times I have to slam on my brakes on that particular road because of animals running into the road is way too high. And no, not always deer. I've come around curves and just had someone's dog sitting in the middle of the road on multiple occasions because for some reason people think it's totally safe to just let their dogs roam.


That indicates real increased risk, though, compared to someone who drives in places where animals are less likely to be in the road.

Insurers aren't trying to determine how good of a driver you are (conditional probability of you being in a collision given conditions). They're trying to determine how likely it is that you're going to be involved in a collision that results in a claim (unconditional probability of you being in a collision). If you frequently drive through deer infested forests, it seems reasonable that your insurer is going to expect more claims compared to someone who doesn't do that.

It's similar to how driving late at night results in higher premiums. You can be the same good driver at night and during the day, but if you're frequently driving at 3 a.m., you're a higher risk.


Just like any other data collection and the services “tailored” to it, the only purpose is to justify a charge, not to actually work. Do targeted ads work better than traditional ones? Who knows, but how can you say that your service is better if it has nothing innovative? Just like “feature-rich” devices, it’s just a sales pitch.


>People should have some understanding of the agreements they make.

People do not engage in meetings of the minds on these types of things. Manufacturers/insurance companies enter into agreements (and leave stickers that are unlikely to be read) which is a clear violation (imo) of contract law.

It's one thing to be aware of agreements you make; it is another to navigate a corporate surveillance hellscape of on-by-default, consentless surveillance that a bunch of psychopathic corporate types greenlit.


> it's clearly an attempt to price insurance against a person's actual driving habits.

I think it's a combination of two strategies.

1) searching for a reason to not pay a claim.

2) searching for a reason to increase your pricing, while hiding average driver behavior from you to increase their bargaining power


> It looks like the shared data at least tries to be anonymous

One of the main points of the article is that insurance companies are using the data to raise drivers' rates. How can they do that if the data is anonymous?


The car company can share the details keyed to the chassis VIN rather than to driver details.

Then the insurance company grabs the vehicle registration number when you ask for a quote and looks up the VIN on their side via a security database meant to prevent resale of stolen cars or similar.

Anonymous data becomes identifiable data...
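
In data terms, the re-identification described above is just a join. A toy sketch, with all VINs, plates, and numbers invented:

    # "Anonymous" telematics keyed by VIN, joined against a plate-to-VIN lookup
    # that an insurer can do at quote time. Everything here is made up.
    telematics_by_vin = {
        "1HGFAKEVIN000001": {"hard_brakes_per_100mi": 9, "night_driving_pct": 31},
    }
    plate_to_vin = {"ABC-1234": "1HGFAKEVIN000001"}

    def enrich_quote(applicant_name, plate):
        vin = plate_to_vin.get(plate)
        return {"name": applicant_name, "vin": vin, **telematics_by_vin.get(vin, {})}

    print(enrich_quote("Jane Doe", "ABC-1234"))
    # The pseudonymous record is now attached to a named person.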


> The car company can share the details based on the chassis VIN number rather than driver details.

That's hardly anonymized data! It's more obscured.


Ah, anonymous was the wrong word. I meant instead that the shared data tries to restrict itself to information that doesn't obviously fall under a right to privacy. For example, trip times are shared, but locations are not.


The flow of traffic on the highways where I live is consistently 15-20 mph above the posted limit. I wish everyone would slow down, but that doesn't change the fact that the safest way to merge is to accelerate hard and match their speed. The last thing I need is a financial incentive to be oblivious to my surroundings.


The only speeding ticket I’ve gotten in the past 20 years was for speeding on the on-ramp to get up to the speed of the highway traffic. Holiday weekend, so it was stop, ticket, and release. Repeat. No warnings given.


When I worked in car insurance, besides our own telemetry, we got at least

- Willis Towers Watson (WTW) aggregated driving data

- Verisk (afaik this was mostly around vehicles, not people)

- Various reports directly from state governments

- LexisNexis (multiple different report types)

Really any mobile app that has accelerometer or gyroscope access (even without GPS) can estimate driving safety. Using phone movement and angle, you can estimate driver vs passenger.
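
For a sense of how little is needed, here is a minimal sketch of harsh-event detection from raw 3-axis accelerometer samples alone, with no GPS. The low-pass constant and the 0.4 g threshold are placeholders, not any telematics vendor's real algorithm.

    import math

    G = 9.81              # m/s^2
    ALPHA = 0.9           # low-pass factor used to estimate the gravity component
    THRESHOLD = 0.4 * G   # net linear acceleration treated as a "harsh event"

    def count_harsh_events(samples):
        """samples: list of (ax, ay, az) readings in m/s^2, gravity included."""
        if not samples:
            return 0
        gx, gy, gz = samples[0]  # seed the gravity estimate with the first reading
        events = 0
        for ax, ay, az in samples[1:]:
            # Low-pass filter isolates gravity; subtracting it leaves linear acceleration.
            gx = ALPHA * gx + (1 - ALPHA) * ax
            gy = ALPHA * gy + (1 - ALPHA) * ay
            gz = ALPHA * gz + (1 - ALPHA) * az
            lin = math.sqrt((ax - gx) ** 2 + (ay - gy) ** 2 + (az - gz) ** 2)
            if lin > THRESHOLD:
                events += 1
        return events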

Cambridge Mobile sells equipment a lot of insurers use and, afaik, also sells data.

The magic keyword to look for is "telematics"


(From the Toyota Connected Services Disclaimer)

Your Responsibilities

    Your responsibilities include: (1) informing passengers and drivers of your vehicle that data is collected and used by us, and (2) notifying us of a sale or transfer of your vehicle. If you do not notify us of a sale or transfer, we may continue to send data about the vehicle to the subscriber's Account Information currently on file, and we are not responsible for any privacy related damages you suffer.


This needs to be illegal.


Senator Wyden has been a champion of data privacy. If you are an Oregon resident, please reach out to his office.

https://www.wyden.senate.gov/


This is why I like paper contracts. I would simply cross out a stupid clause like this before signing.

99% of the time the rep doesn't care, and if the company can't be bothered to put someone on the other side of the table who is actually paying attention or has bargaining power then they deserve it.


I remember driving a Nissan Leaf. You had an opt-out prompt every time you drove the car.


For data or was it the standard “use responsibly” message?


It was the telematics data screen


The US doesn't just need laws about disclosure of these practices. It needs to mandate that this kind of corporate surveillance must be a clearly labeled opt-in and cannot be mandated by any contract.


Yes, but to me there's another important issue: with all the tracking tech in our vehicles, who actually owns them? We need to treat this similarly to the "right to repair" issue! I paid my money, so I own the product. If I own the vehicle, it should be my right to say what software runs in the background.

Of course the company can say, "If you don't like our product, don't buy it." But if I want to keep up with the latest safety upgrades to protect myself, and all car companies have the same tracking software, my only option is to look for a "dumb" vehicle. That is blatantly unsafe and irresponsible. So they're saying that my safety comes with a price other than the $40K I shelled out?


> Yes, but to me there's another important issue, with all the tracking tech in our vehicles, who actually owns them?

It's even worse. Your car acts as a blackbox against you. A 1990's car? I can do whatever the fuck I want to, I can drive it offroad, I can speed, I can even be near a bank robbery or whatever.

A modern car? Someone robs a bank a few hundred meters from where I am, and now the police will come knock on my door because the IMEI of my car was near the bank when the robbery happened. I speed a little to overtake some dumbass driving 20 km/h below the limit, the police make a dragnet subpoena against my insurance / the data processor from the manufacturer, and issue me a ticket.


> Of course the company can say, "If you don't like our product, don't buy it."

Honestly I'm pretty tired of that "our way or the highway" nonsense. Society needs to make it so they actually can't say that. Make respecting us a precondition for their continued existence. As in they literally get liquidated if they say that even once.

That's how we deal with sociopaths leveraging these non-negotiable "terms" against us. They have zero empathy, they view us like cattle to be marked and monitored and turned into cash flow. So there is no reason to empathize with their nonsense viewpoints either. Just make whatever they're doing illegal. Doesn't matter how much money they lose.


> who actually owns them? We need to treat this similar to the "right to repair" issue

Ownership is a bad framework for this issue—it’s too ambiguous. You can “own” a vehicle all you want, that doesn’t give you the right to fuck with its odometer or catalytic converter.


> You can “own” a vehicle all you want, that doesn’t give you the right to fuck with its odometer or catalytic converter.

Sure it does.

This entire attitude is what's scary about software, actually. See, back in ye olden days, no one disputed your right to remove the catalytic converter on a vehicle you purchased.

It was no longer legal to drive and if you were caught you could get fined. But you absolutely had the right to do it.

But now with software, there's enough control that 3rd party entities are dictating with 100% success what an owner can do with the vehicle. And you're defending it as right.


> there's enough control that 3rd party entities are dictating with 100% success what an owner can do with the vehicle

Your key term here is control. When discussing a new rule, that is what you focus on. Use the word ownership when selling the rule, sure. My point is rules drafted starting from ownership tend to be trivial to circumvent. Because they presume ownership is a natural state when it is a social construct.


or, and this is a crazy thought, when someone pays for something they expect to have the right to do what they want with it. When a 3rd party is able to exert absolute control in hampering that ability, it becomes a problem.

you purchase a video game from your religious friend and they decide you shouldn't be allowed to play the game between 8pm and 8am and they have the ability to ensure you can't.

their ability to limit you isn't a social construct, it's as strongly bound as physical violence, and that's the problem.


> when someone pays for something they expect to have the right to do what they want with it

Where? When?

Say you own private property and a car. Does that mean you are allowed to leak diesel all over it? Most jurisdictions say no, in part because that affects your neighbours’ property values.

Ownership is not, and has never meant, absolute sovereignty. It’s a package of rights defined in terms of control. When we’re discussing amending what ownership means, giving the owner more control, it’s circular to start with ownership: you can do it. But it’s much more meaningful (and powerful) to talk about control.


it's telling that all of your counterexamples involve the government when the entire discussion is around what non-government 3rd parties are allowed to dictate.

Tesla is not the government. Toyota is not the government.

stop it.


> all of your counterexamples involve the government

I’m trying to avoid the miasma of conflicting rights. Somewhere in this thread I referenced a car spilling diesel on private land. This impacts the value of neighbouring plots. On entirely private merits, the owner’s ability to operate their property, on their property, willy nilly, is curtailed.

Simpler, if more absurd example: someone’s pet or kid wanders on your property. This curtails what you can do, with your property on your property. You own both. But you don’t control everything which happens upon it.

Seizing on this distinction is immensely clarifying. It’s the difference between talking about computers in general and knowing the protocols.


In other words, you think it's ok to insist on a contract with a loan for 50% interest.

I, and many others, disagree and think that's a heinous abuse despite the argument that both parties willingly entered into the agreement.


> you think it's ok to insist on a contract with a loan for 50% interest

This is a total non sequitur. (And sure, if both participants are wealthy institutions and knowledgeable and uncoerced.)


It's not a non sequitur so much as it was a preparation to make the point I'm about to make.

There is no world in which two institutions with plenty of money, knowledge, and a lack of coercion are going to come to an agreement for a loan with 50% interest.

Theory vs practice. In theory what you're saying could happen, in practice it's only going to happen when one party has a severe imbalance against the other party (and you know this or you wouldn't have tried to head off that argument). Since contract law deals with practice, contract law disagrees with your assessment that it should be allowed.

Which goes back to the whole ownership thing.

Just because someone _can_ draw up a contract to muddy ownership to the point that the seller of a $30k+ USD vehicle can retain control and absolutely limit what the purchaser can do does not mean contract law should allow it.

And there's too much precedent for this sort of thing for you to have a leg to stand on (although I'm sure you'll try). Just because someone _can_ sign a non-compete with no expiration does not mean the law should allow it.

Ad nauseam.

Arguing that because contracts today muddy ownership so you can't act as if the purchaser has certain rights is missing the point.


The word "willingly" does a TON of heavy lifting here given the circumstances of both parties.


I disagree. You can mess with the odometer as much as you like. Trying to sell the car with a different number of miles than has actually been put on it is called fraud.

You should own your car and be able to do as you wish. You should also be able to turn on or off any tracking. There are just consequences for some of the things you might want to do.


> should own your car

Ownership is a legal concept. What it means, what that package of rights tied to a piece of property entails, is entirely dependent on the law. Using ownership as a guideline for rule-making is bad form because it’s tautology; I can justify and condemn anything on the basis of my or adjoining persons’ purported ownership rights.

The machine languages of ownership are control and possession. That’s what we’re delineating, and unfortunately it generally must be done piecemeal. In this case, the pieces are the data cars beam home. Currently, the manufacturer controls it. You and I agree—I think—that it should be the user, which we—by this conjecture—make its owner. The ownership flows from control, not the other way.

(The problem is trebled with cars given they’re typically driven on roads the driver doesn’t own nor control.)


Ownership has a legal definition, but the concept of ownership exists outside of the law.


> the concept of ownership exists outside of the law

Not really. The common definitions either fall back to control or invoke the term property, another legalistic word. What ownership means is incredibly fluid and context dependent; consider how ambiguous it is when it comes to its classic form, real estate.


we should label this movement you're describing.

how about "legal absolutism"? If it's not codified in law it doesn't exist and therefore cannot be a part of people's vernacular.

Once this takes over we can update all our dictionaries to stop marking specific definitions as being legal definitions as they'll all, by definition (heh) be the legal definition.

Or, to put it another way, this is the internet, where you're free to say whatever you want but that doesn't mean you'll be taken seriously.


> if it's not codified in law it doesn't exist and therefore cannot be a part of people's vernacular

Within the context of lawmaking, for social constructs like ownership, absolutely. It’s sort of like starting with legality when writing drug regulations; outside the lawmaking context, that makes sense, within it, it’s nonsense.

I’m not saying never use the word ownership in common parlance. But when discussing a new law, yes, it pays to be precise. Because starting from ownership will result in a law that is ineffective or misdirected.


the comment that started this

> You should own your car and be able to do as you wish.

no "new law" was being discussed, you yourself tried to limit the scope to the legal definition and now you're trying to argue that no one should be discussing anything but the legal definition (well, for the second time, just with different words).


> no "new law" was being discussed, you yourself tried to limit the scope to the legal definition

It's a normative statement. And I'm not solely talking legal definitions. But when we're debating the proper boundaries of ownership, it is tautological to invoke ownership in the definition.

The original phrase is stronger as "you should be able to do [with your car] as you wish." Which is not a commonly-held view even if we restrict ourselves to vehicles solely driven on private property--to the point of absurdity, you can't mow down pedestrians just because it's your car and land.


What's actually being discussed is the level of authoritarian control software has enabled to non-governmental entities in our lives and how it's so complete that it offers a level of control that not even violence can achieve.

For example, you offer up that just because I own a car doesn't give me the right to murder people with it (stupid, but you went for it so let's roll with it). The level of control being exerted by software is such that I couldn't _stop it_ from happening regardless of my ownership status if a 3rd party decided it wanted my vehicle to murder people.

the ownership thing is a red-herring from someone who is trying really hard to be smart but they're missing the point entirely.

Put another way, It's the tail wagging the dog. "You don't _really_ own it, therefore 3rd parties have the right to exert that level of control over you" when what's being protested is the level of control being afforded 3rd parties. ownership is just the mechanism.

you can't legally create a contract that allows you to charge 50% interest on a loan. You shouldn't be able to create a contract that allows a 3rd party to dictate what you can, and cannot do, with a vehicle they sold you. That should remain solely in the hands of the government (which is why your car murdering people analogy was stupid).


>I disagree. You can mess with odometer as much as you like. Trying to sell it off with a different amount of miles than have actually been put on it is called fraud.

For a long time this was just a fact of buying any car that lived long enough. I have bought several cars where the transaction went something like: "so the odometer has rolled over twice; so there's actually 376,000 miles on the frame... but only 118,000 of those are on this motor and I swapped the transmission with a reman 76,000 miles ago."

Of course we've added a few significant figures to odometers since then, and in the era of digital odometers I imagine "rolling over" behaves very differently. (Will the chassis survive 4 billion miles? Seems unlikely. Do the display and storage have different bit resolutions? Is it a saturating counter internally? Externally?)


You can do whatever you want to it, you just can't legally drive it on public roads anymore. You are forgetting about all the use cases which aren't general public transit.


>you just can't legally drive it on public roads anymore

Varies by state.


But.. it really does. As most states don't require annual inspection, you are free to smash up your odometer and sell your cat to the scrapyard. It hurts the resale value, but it's not illegal to do this at all.


It does however give you a right to download any data that is being collected. I wonder if that's covered by GDPR.


> In recent years, automakers [...] have started offering optional features in their connected-car apps that rate people’s driving.

At least the programs are (currently) opt-in.

This amusing anecdote is buried:

> One driver lamented having data collected during a “track day,” while testing out the Corvette’s limits on a professional racetrack.

> [...] he was denied auto insurance by seven companies [...]


There is another commenter further up that says they had to opt out on a Toyota and the rep acted like he didn't know until the opt out text was read verbatim.

https://news.ycombinator.com/item?id=39667268


I just purchased a Camry Hybrid from a Toyota dealership. The operator tried to tell me that "because I financed it they cannot turn off analytics." I had no financing; I paid cash.

Pressing the SOS button to cancel [as the sticker suggested] was met with so much difficulty that (while the operator was still on the line) I found the fuse panel and pulled out `DCS` to disconnect the call/tracking. This ended our transmission.


But are they really optional? I can’t imagine that the telematics link is going unused for the value it provides (i.e. crowd-sourcing for speed and road map data).

The worst part is the assumptions about who’s driving the vehicle.


I would be willing to bet even if you told the insurance companies it was totally legal on a professional race track -- they'd say "Nope, we still don't want to insure someone that takes his car on professional race tracks like that."


> At least the programs are (currently) opt-in.

The article makes clear that most people don't know what's happening with their data. They opt into something else and this data collection is included - that doesn't sound like much of an 'option'.


Your quote is misleading. The "he" is in the next paragraph and refers to someone else who owns a Cadillac, not a Corvette.

The track day thing probably was the funniest thing in the article, though.


My biggest concern is that rather than comparing difficult-to-identify behavior against claim rates, they will penalize behavior that is easy to identify. For example, yesterday I was traveling 15 MPH over the speed limit on a multi-lane highway where traffic is often traveling 10+ MPH over the limit (the limit is objectively wrong for a divided, grade-separated, access-controlled highway). I typically drive as far right as I can to make room for faster vehicles, but eventually got stuck behind someone camping in the left lane. When opportunity presented itself I went around them in the center lane. They expressed anger at this by encroaching into my lane to squeeze me against traffic in the right lane. There were four inches between their vehicle and my side mirror. Who is driving dangerously, and more likely to cause an accident? I would argue it's the driver who is obstructing traffic and behaving aggressively toward others on the road. But if GPS isn't accurate enough to show their lane deviation, it's a lot easier to ding me for my speed.


Making it easier to determine who's at fault in cases like you mentioned would involve more sensors, radar, cameras, etc. So we either 1984-ify everyone's car or we just don't do any monitoring at all (since half-assing it can lead to false positives). I have a feeling insurance companies (and therefore governments) will slide more towards the 1984 side to save a couple dollars.


Insurance companies don’t have to, people are voluntarily installing dash cams to show who was at fault (or at least show they were not). Chances are, someone is recording your collision, and you might as well have your evidence to fight against someone else’s.


I'm a huge fan of telemetry insurance. I have it personally and it saves me around 300€/year on my car insurance because I am a very defensive driver.

However, this being integrated into the vehicle in an absolutely opaque way is a huge step up and a really unsettling privacy violation.

For this to be ethically viable imho, there need to be a few prerequisites

- it’s transparent what has been transmitted

- you can always easily opt out, but you may lose the discount you earned

- your driving can’t make your premium go up beyond the base premium without the discount (sensors will never paint an entirely accurate picture)


>>I'm a huge fan of telemetry insurance. I have it personally and it saves me around 300€/year on my car insurance because I am a very defensive driver

My sister had it, and it was the biggest piece of crap imaginable. The system would send her emails warning her about "lack of smoothness" in her driving, because... the system would rate her down every time she went over a speed bump.

The biggest problem was that she would get emails saying "we've detected you were going 70mph in a 20mph zone, if this continues we will cancel your insurance", so we would call them and ask them to provide GPS logs, which they always would - and the logs would always show that she was going legal 70mph on the motorway, which at one point goes above a smaller 20mph road - and of course the system was stupid enough to just query the speed limit for every point, not realizing that this wasn't the road she was actually on. We would email them back explaining, and the warnings would go away until she went on that road again.

Absolute waste of time and money, I think the insurance company would need to pay me to have this fitted, the nerves it cost my sister to have that piece of crap in her car weren't worth whatever discount she got for it.


From the other side, it’s essentially a fine for people who respect their privacy. Insurance prices will adjust to the adoption of this discount, rising to the current normal, and only people who don’t opt in will be hit with the extortion fee, forcing them to opt in.


That last point is merely a way for you to get used to this system. Once enough people allow the spying, they'll increase the price if you don't allow the spying.

And after that they'll mandate it for everybody.

I already pay a premium for having more horses under the hood. I don't want to get dinged when I use my car's power.


>you can always easily opt out

No, that should definitely be opt-in, with explicit consent to data collection and process purposes.


I'd be a fan, too, if they couldn't use the information to raise rates. But even the best drivers brake hard to avoid accidents from time to time, and in the US, insurers are dirty.


In which country is that?


I don't know about OP, but in Poland https://yanosik.pl/ offered such deals ( https://payhowyudrive.pl/ ). It is probably a bit self-defeating - the app's main function is warning about speed traps, which means unsafe drivers make up a significant part of its users.


Sorry for the late answer. But this is for my insurance in Germany, which is extremely expensive because I’m a young driver


They're describing what should be, not what is.


No, I was asking about this statement which I assume wasn't hypothetical:

> I have I personally and it saves me around 300€/year on my cars insurance because I am a very defensive driver.


gotcha, I misunderstood.


The easiest way to disable this in a Chevrolet with OnStar is to pull the fuse (Fuse 38 under the dash for the 2024 Chevrolet Malibu). Other options are disconnecting the antenna (it can still connect if the signal is strong) or pulling out the box/microphone (disassembly required). At least for the 2024 model, CarPlay features seem to keep working, but I haven't tested Bluetooth yet.

There's somebody on YouTube describing the parts of the OnStar feature: https://youtu.be/TZILodhvjdw?feature=shared


> (can still connect if strong signal)

Wonder if that would still work if you additionally shunted the antenna with some kind of impedance matched load.


All the Hams scramble to grab a spare dummy load.


Many GM models used to have a bridge/jumper between the network daughter board and the rest of the car. Pretty easy and didn't affect anything else (sometimes the fuse for OnStar also covered your Bluetooth or voice commands).


I read that the Bluetooth microphone stops working (on the Chevy Bolt).


The problem with such issues of data misuse is that people only offer two solutions.

a) Go off grid. Don't use the tech that these cars come with.

The problem with this is that it is impractical for people who see a lot of value in using this tech.

b) Pass more regulation.

I am a Hayekian and I believe that regulation will not help against people who know the ins and outs of the regulation; it doesn't stop them. It just means corporations are willing to misbehave as long as they can play legal gymnastics and pay rudimentary fines.

Now, the third option, which I see as the best but which isn't talked about much, is the promotion and installation of homomorphic computing or homomorphic encryption.

I am not a cryptographer, so I don't fully understand its limitations. But adopting this would simply make all these data abuse issues vanish.

Cryptographers, why hasn't homomorphic computing or homomorphic encryption been massively adopted?


> isn't talked much about is the promotion, and installation of homomorphic computing or homomorphic encryption

Sure, the car company will homomorphically encrypt your driving data when it sends it to its own servers.

You’re trying to solve a social problem with technology. That doesn’t work.


>Sure, the car company will homomorphically encrypt your driving data when it sends it to its own servers.

You can encrypt the data such that the insurance companies cannot target any particular individual (which is my problem here), but they can still use the data to improve their insurance pricing models.

I have no problem with a health insurance company using population data to find out how many are susceptible to say cancer.

But I have a problem when they use this data to overprice a particular individual's insurance because their genes say that they are susceptible to cancer.
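
For what it's worth, here is a toy sketch of the additively homomorphic kind of scheme (Paillier) that would allow exactly that split: an aggregator can total encrypted per-driver values without being able to read any single one. The parameters are tiny and deliberately insecure, and a real deployment would still have to decide who holds the decryption key.

    import math, random

    # Toy Paillier cryptosystem (additively homomorphic). Tiny primes, insecure,
    # for illustration only.
    def keygen(p=101, q=103):
        n = p * q
        lam = math.lcm(p - 1, q - 1)
        mu = pow(lam, -1, n)   # works because g = n + 1
        return (n, n + 1), (lam, mu, n)

    def encrypt(pub, m):
        n, g = pub
        r = random.randrange(1, n)
        while math.gcd(r, n) != 1:
            r = random.randrange(1, n)
        return (pow(g, m, n * n) * pow(r, n, n * n)) % (n * n)

    def decrypt(priv, c):
        lam, mu, n = priv
        return (((pow(c, lam, n * n) - 1) // n) * mu) % n

    pub, priv = keygen()

    # Each driver encrypts their own hard-brake count; the aggregator can only
    # multiply ciphertexts, which adds the hidden plaintexts.
    counts = [3, 0, 7, 2]
    aggregate = 1
    for ct in (encrypt(pub, c) for c in counts):
        aggregate = (aggregate * ct) % (pub[0] ** 2)

    print(decrypt(priv, aggregate))  # 12: the total, with no individual count exposed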


> encrypt the data such that the insurance companies cannot target any particular individual (which is my problem her) but they can use the data to improve their insurance pricing models

We already have population claims statistics, a product of regulations that require reporting. What insurance companies want is discrimination within the variation.


Of the solutions:

a) Impractical because cars are needed for daily life and there’s no incentive for automakers not to sell your data, so all cars will, unless this becomes a compelling enough product difference to move the needle on profits.

b) Legislation/regulation that creates the right incentives isn’t easy, but certainly doable.

c) Impractical because homomorphic encryption is absurdly computationally expensive, is still not a fully solved problem, and.. in what universe do automotive companies implement this far-fetched and expensive means of privacy without some.. err.. regulation?

It doesn’t seem to be superior to option b)


Which specific regulation do you think has a history of not being impactful? I find that the devil is in the details in this argument, because most regulation is massively impactful and helpful, and I find that the talking point that we need to get rid of it is generally loudest from those who would profit the most from not following those rules anymore.


GDPR for example has done nothing to protect people from this particular case of data misuse.

The problem with English law is that you have to explicitly declare what is wrong ahead of time. So we just end up with an endless need for regulations.

If we had legal systems like the Code of Hammurabi, they would work way better.


You'd be surprised what the French data authority (CNIL) has to say about this[1]:

> Any use of personal data for an objective that is incompatible with the primary purpose of processing is a misuse that is subject to administrative or criminal sanctions.

> For example, a mechanic cannot sell the vehicle’s technical data to insurers to enable them to infer the driving profiles of their policyholders.

There may be a lack of enforcement, but it seems this type of data may be protected under GDPR.

[1] https://www.cnil.fr/sites/cnil/files/atoms/files/cnil_pack_v...


With good encryption we wouldn't need to spend a lot of time trying to enforce these laws.


As a corporation, would I use your encryption standards if I stand to make money legally by not using them? You'll need to enforce encryption usage to force me to use them, which currently requires these kinds of laws.

What do you have in mind to ensure standards that are good for end-users are put in place?


> GDPR for example has done nothing to protect people from this particular case of data misuse

You’re using one badly-written law to discard a category.

Why not look at the FDA? When was the last time you were poisoned?


> Why not look at the FDA? When was the last time you were poisoned?

How many deaths happened because of excessive regulation, extreme delays, and overall refusal to acknowledge other medical bodies' acceptance of treatment?

The CATO institute, a Republican think-tank, put a number on FDA drug law alone at 20,000-120,000 deaths per decade. (I was aiming at another, more impartial org, but sigh)

https://www.cato.org/commentary/end-fda-drug-monopoly-let-pa...


Side note: CATO isn’t a particularly credible source. (Like Greenpeace.)

That said, even though I agree with them in this case, that bolsters the case for regulation being effective. If the FDA were ineffective, pharmaceuticals could “play…legal gymnastics and pay rudimentary fines” to get around their power. In other words, the magnitude is undisputed; we’re debating the sign.


Poisoning people is accepted as wrong by most people. Monitoring devices so that you can "make them safer" or "save the children" or whichever other BS reason they give is easy to give them a pass on.


> Poisoning people is accepted as wrong by most people

Sure. It was still prevalent prior to the Pure Food and Drug Act of 1906.


Then why is it not more prevalent now that the FDA is owned by the food-producing cartels?


How is GDPR badly written?


> How is GDPR badly written?

Enforcement is fractured. It’s a mandatory-complaint driven model, which is both intensive (every complaint demands manpower on both the regulator and regulated’s sides) and prone to abuse (known tactic for quashing European competition: herding complaints). All that means it’s ambiguously burdensome, which means there is a fixed cost to compliance even if you aren’t doing anything wrong.


I mean, the one thing GDPR did was scare the ever-living daylights out of quite a few engineering teams and executives. Which honestly was what the industry really needed; people just needed to consider the data collection a bit more.

And fines have been levied and are levied constantly. It's mostly a man power problem as to how many, but the fines pay for more man power in some places so it all works out. It's just slow, which is why people always complain that nothing ever happens.


Since nobody answered the question, the reason is that it's terribly, absolutely, insanely slow. It's possible, it just requires hundreds of thousands or millions of times as much work as, say, a normal lookup in a database.


> I am a Hayekian and I believe that regulation will not help with people that know the ins & outs of the regulation, also it's doesn't stop them.

That is such a funny thing to say. The car industry is heavily regulated and car companies do work within the regulation. They are already regulated on safety, fuel standards, dimensions... Adding data protection into the mix makes sense.


The auto industry has fought tooth and nail against safety requirements[1] and still fights today against more stringent fuel standards[2][3].

Not only would they fight regulations like data safety that would open them to potential litigation when they lose the data or sell it to the wrong player, but they would win. Privacy isn't the political football that the environment is, and you can't point to death statistics like you can with safety issues.

[1] https://www.the-rheumatologist.org/article/revisionist-histo... [2] https://texasclimatenews.org/2022/03/19/decades-of-lobbying-... [3] https://www.cbtnews.com/auto-lobby-group-warns-fuel-efficien...


They fight it because it works and impacts their bottom line. I don't see how that's evidence that regulation is ineffective as a whole just because people can find loopholes.


I don't think regulation is ineffective as a whole, but I do think that regulation won't be able to curb the industry's hunger for data or its incompetence in how it collects it. I believe this is the case because the industry will fight just as vigorously to collect this data while regulators will be less invested in stopping it. I believe regulators will be less invested in stopping it because there has been a steady degradation in our expectation of privacy in this area since smart phones. Any auto industry lobbyist just has to point out that tracking your driving habits and selling that data to insurance companies is little different from what google and apple already do.


The fact that they will fight it does not mean we should not try it. At least in EU the GDPR gives quite a bit of power to regulate this.


If I am a corporation and I am willing to break regulations, how will you force me to use homomorphic encryption? Why should I pass on gathering data that I can resell?

The average buyer won't understand or care about it, so there is no direct pressure from consumers. I think regulation is not optional (and homomorphic encryption may be mandated if viable?). Breaching regulations is often a "cost of doing business", but some recent regulations (such as GDPR) can actually create very large fines in many countries. So it seems that what may be needed is good enforcement and measured penalties. Another deterrent would be having penalties that are not money.


> Breaching regulations is often a "cost of doing business", but some recent regulations (such as GDPR) can actually create very large fines in many countries.

This is the issue with so many laws. Stricter fines basically never deter would be offenders from committing the crime. What deters people is a high chance of getting caught.


Do companies ignore regulations? Sure, some do. But saying 'they will just pay the fines' ignores the fact that we could make the fines existential, or punish board members by kicking them out of the industry. The answer to 'the regulation we haven't even tried won't work if we do it improperly' is 'let's do it, and do it properly'. I have no idea what homomorphic encryption is, but rarely do 'let's add more tech to magic bullet a human problem of incentives' solutions work.


Homomorphic encryption simply means that the data is encrypted in a way that the party working with it cannot use it arbitrarily.

Here is an example: I could use Google Maps for navigation, but Google or any other third party would have no idea where I am going.

I used it in the first company I worked for and it works beautifully.

A) and B) work, but they are not as effective as homomorphic encryption.


Barring regulation, why would car manufacturers currently profiting off the sale of this data spend extra money voluntarily implementing something that cuts off their revenue stream?


Or why would one car manufacturer cut off a revenue stream that their competition has.


The keyword here is "use".

Homomorphic encryption reduces the breadth of computations that can be run on the gathered data by making it inaccessible outside of the specific homomorphic scheme that was chosen. So yes, in that sense it cannot be used arbitrarily.

However, the results, i.e. the knowledge derived from the chosen computations, can still be shared arbitrarily, which IMO is a much greater issue, as the need for result sharing will inform the computations that can be done within the scheme.

Who defines the computations? Surely not the users, and absent regulation, surely not regulatory bodies either.


> use Google Maps for Navigation but Google or any other third party would have no idea where I am going

You don’t need homomorphic encryption for this, just local route processing. In the case of car data, the auto companies aren’t doing any useful processing of the data for the user. Homomorphic encryption is irrelevant.


I think a problem in this area is that if one avenue of data collection is denied, another one will be implemented and it becomes a game of whack-a-mole.

For example the USG is forbidden from collecting communications from US citizens, but that does not keep it from buying this information from private domestic sources or from other governments.


We did not freeze the ability to pass legislation or have courts decide on the constitutionality of governmental processes. Have you given up on democracy?

Why is everyone so quick to say 'well, they are getting away with it, might as well let them' instead of trying to use our processes for the purposes which they were designed?


Because they tend to build in exceptions, and only the likes of R Paul and 1990s Sanders would object. At the state level you saw Newsom and co. argue for an increased minimum wage, except for restaurants serving bread, a la Panera. They are not, by and large, honest.


Strangely enough, I know the answer to that, if memory is serving.

Homomorphic encryption is where you can compute on the encrypted data without ever decrypting it.

Logically, it sounds like a pipe dream to me, but apparently it's a thing.


Why would it be a pipe dream? I know companies that use it, and it serves their purposes well.


I said it sounds like one, not that it is one. I don't know enough about the implementation of it to comment intelligently, but logically it seems to me that if you can compute on it, then it's likely to leak the data, or at least some metadata about the data.

The truth of the matter may be something else entirely. Life is not always logical.


> know companies that use it

We can only do a limited set of operations homomorphically. Moreover, it’s more power intensive than conventional computation. In most cases, local computation is the more effective (and secure) solution.


> I am a Hayekian and I believe that regulation will not help with people that know the ins & outs of the regulation, also it's doesn't stop them.

I work in the automotive industry. It is very heavily regulated. The majority of people have never heard of ISO 26262 but it's keeping billions of people safe every day. Data privacy can work in the same way.


> The problem with this is that it is impractical for people that use see alot of value in using this tech.

I would be happy to turn down the tech, but I wonder how long until I can't feasibly buy a car (or a car I want) without it...


> I am a Hayekian and I believe that regulation will not help with people that know the ins & outs of the regulation, also it's doesn't stop them. It just means corporations are willing to misbehave as long as they can play the legal gymnastics and pay rudimentary fines.

So you try nothing and are out of ideas. Amazing.

> homomorphic encryption

Let me get this straight, you think regulation is too hard because corporations don't want it, but you don't see any problem with homomorphic encryption, which is difficult to implement, poorly understood by consumers, AND provides privacy guarantees that corporations don't want?

Really?


It's pretty clear we've reached the point where technology has shifted to working against us, and not for us anymore.

I work in tech but as far as I am concerned, you can keep all your smart homes, cars and other gadgets and soul sucking (anti) "social" apps.

Somewhere along the way technology was hijacked to control us rather than empower us. And if you don't like it: shut up because "progress" is inevitable


> we've reached the point where technology has shifted to working against us

Everyone has always said this, since the dawn of farming. It's not a particularly useful insight: the question is how it happens and how it is to be banned or balanced.


So you think today's technology is comparable to farming?


> you think today's technology is comparable to farming?

That’s not what “since” means.

If a technology causes social change, it will create winners and losers. Those winners tend to autocorrelate (inversely to the magnitude of the shift). As a result, small technological revolutions tend to result in a shift against the broader “us” while broader ones disempower an elite that tries to gain sympathy by aligning itself with that broader “us”. If it doesn’t do either of those, it is—almost by definition—not a technological shift that resulted in social change.

As a result, complaining about technology working against a nebulous “us” is basically saying we had technology that caused social change. Which isn’t a novel point.


Technology is amoral. The power shift happened because we are no longer in control. It's these corporations who are the masters of the computers now. They're just allowing us to use their computers. Of course those computers work against us, they are treacherous by definition.


All newly manufactured cars sold in the US already have "black box" data recorders that can be dumped in the event of an accident. In many cases this can even be done without a warrant, as of a decade ago [1] - not sure whether that's changed. In any event it seems as though this is a natural evolution in concert with those voluntary OBD-II devices that insurers started using to record driving habits.

[1] https://www.edmunds.com/car-technology/car-black-box-recorde...


  > "More specifically, automakers are selling access to the data to Lexis Nexis, which is then crafting “risk scores” insurance companies then use to adjust rates. Usually upward"
In an ideal world, such data-harvesting might lead to cheaper prices / a more efficient insurance market - which would make the privacy loss worth considering from a trade-off standpoint, at least in theory.

Unfortunately it's instead likely to just lead to higher margins for insurance companies. And the only way to compete would be to harvest more data for better predictions.


> In an ideal world, such data-harvesting might lead to cheaper prices / a more efficient insurance market - which would make the privacy loss worth considering from a trade-off standpoint, at least in theory.

In an ideal world (read: perfect information), this would lead to insurance being a bad deal for every consumer of it. In the theoretical position where insurance companies can accurately price each individual customer based on their habits, they will charge them exactly what they cost _plus_ a margin.

This is only useful for a consumer if they cannot access cash or a credit line to pay for a sudden large expense. Instead, insurance effectively becomes paying the credit line ahead of time.


> This is only useful for a consumer if they cannot access cash or a credit line to pay for a sudden large expense.

Isn't that the main point of insurance?

Insurance can also socially redistribute bad outcomes. Fair enough, that is in practice a result of insurance, but I don't think that's what it was invented for. And indeed, the better the insurer's crystal ball, the smaller this effect is.

Although in practice I don't think there ever will be a crystal ball good enough to make insurance a bad deal for everyone like that. You always have to insure against another driver being bad or just plain bad luck.


>Isn't that the main point of insurance?

No; an insurance claim can be larger than the sum of lifetime premiums, unlike a credit line.


Again, that’s the point of insurance. Most of us will pay more in premiums than we ever get paid in claims. A few of us will avoid bankruptcy because insurance will pay out a claim we can’t otherwise afford either through cash or any line of credit we could get.

The trick is no one can know which side of the ledger we’ll fall on. You can be the world’s safest and best driver and still get t-boned by a driver with no assets and no insurance.


No, the point of buying insurance is to reduce your individual variance even though your average cost goes up. It's not an individual savings plan, but rather shared pooling of risk.


In a perfect information world, there is no shared pooling of risk. You are paying exactly what you will cost, plus margin. That is the point I'm trying to make.


By "perfect information" do you mean the ability to predict the future? Because that is what is necessary to make your statement true.

Because in general, no, people buying insurance do not pay in what they will individually cost (plus margin). Rather, most will pay in much more than they will ever cost, whereas some will cost much more than they have, or ever will have, paid in. One total property loss or serious at-fault auto collision (say ~$400k) will dwarf a lifetime of premiums ($2k/year for 50 years?).

The point is that such things are rare, so most people will have zero of them over their lifetime. But even perfectly knowing the a priori chance that each person may suffer a catastrophic loss, you can't know which specific people it will be.
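
A toy simulation of that point, reusing the rough numbers above ($2k/year in premiums over 50 years, a ~$400k catastrophic loss) plus an assumed 0.3% chance of such a loss in any given year:

  import random

  random.seed(0)
  PREMIUM, LOSS, P_LOSS, YEARS, DRIVERS = 2_000, 400_000, 0.003, 50, 100_000

  # Lifetime payouts minus lifetime premiums, per driver.
  net = []
  for _ in range(DRIVERS):
      payouts = sum(LOSS for _ in range(YEARS) if random.random() < P_LOSS)
      net.append(payouts - PREMIUM * YEARS)

  worse_off = sum(1 for n in net if n < 0) / DRIVERS
  print(f"{worse_off:.0%} of drivers paid in more than they ever got back")
  print(f"the unluckiest driver came out ${max(net):,} ahead of their premiums")

Most simulated drivers come out behind, which is the variance reduction they are buying; the few who suffer a loss come out hundreds of thousands ahead, and nobody knows in advance which group they will be in.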


Insurance companies don't have to make money from underwriting or insurance.


In an ideal world, such data harvesting would be illegal, with liability adhering to the executives pushing for and approving the initiative as well as any legal counsel involved. Acquiring the data should require explicit, truly informed, and revocable consent not buried in a bunch of BS and not required for the purchase of a vehicle or insurance.


I wholeheartedly agree that the dark patterns around consent are atrocious. But I also think hn is probably biased in its valuation of an individual's data.

If companies offered say a $50/month discount on car insurance premiums in exchange for gathering data, I imagine a large proportion of people would indeed opt in to that (setting aside issues of selection bias or trust in this ideal world)


People should be free to do that. My objection is to the fact that currently just existing in modern society (in the US) means you're being spied on by everyone from the manufacturer of your tv to the grocery store, and huge amounts of your personal data is sold to anyone who wants it.


Basically they keep the profits and socialize the risks.


> Unfortunately it's instead likely to just lead to higher margins for insurance companies.

Why? Insurance pricing is heavily regulated, and profit margins for insurers have always been very low.


After seeing this article, I did a bit of searching: you can get your LexisNexis report, opt out of data sharing, and delete the associated data.

I did it and recommend everyone else does as well.

https://consumer.risk.lexisnexis.com/consumer


That link only allows you to request the LexisNexis Risk report on you.

Do you have a link where one can opt-out of data sharing or deleting associated data? I cannot find anything on the LexisNexis site allowing for that.


It turns out only certain states, like California, allow for deletion.


For anyone looking

https://www.lexisnexis.com/global/privacy/en/privacy-center-...

This looks like the Canadian page; I made a request, so let's see where this leads.


Hello there - this form is from a different LexisNexis business, not their Risk business. It will not get you what you want.


Is there any hope for something like a Privacy Bill of Rights ever being passed? I feel like privacy is an inalienable right for all humans, and the passage of something like this would be a light-speed jump ahead for personal freedom in the new era we find ourselves in. Just because tech enables it doesn't make this any less creepy than someone trailing you through the woods, stalking you on your horse, 200 years ago.


[flagged]


Many things people do are extremely dangerous and detrimental to society. Not sure that's a great rationale for stripping someone of their privacy.

> You even sign away your rights to the privacy of your own blood when you get a license to drive.

I'm not sure what this is referring to. Is any random government agent allowed to take a DNA sample if you're behind the wheel of a car?


I assume that person is talking about blood tests for suspected DUI's.

>"All U.S. states have driver licensing laws which state that a licensed driver has given their implied consent to a certified breathalyzer or by a blood sample by their choice, or similar manner of determining blood alcohol concentration." -https://en.wikipedia.org/wiki/Implied_consent


Yeah, pretty much. It is called "implied consent".


Like it or not most of the US is oriented around driving and it's basically unavoidable for most adults. Using that as justification to erode everyone's rights feels deeply wrong to me.


If you actually believe this, then please reply here with the start and end GPS coordinates of your last driven commute to/from where you live.


Why would I drive? Like some kind of jerk.


Nobody? There are countries other than the USA. I've never heard of signing away rights in respect of blood as a condition of getting a licence. Is this a real thing in the USA?


If you get in a car crash and are suspected of a DUI, in most states you must submit to a breathalyzer or you are presumed guilty by default. Where you live, is that not the case? Or can you get out of a drunk-driving charge by just refusing to blow?


Getting a driving licence and being suspected of drunk driving and causing a crash are two different things. Where I'm from your blood would be drawn after you were arrested for a suspected criminal offence, not just because you had a license.


It is a condition of getting your license that you will consent in that event, at least in the US. [1] It would be interesting if someone who was driving illegally without a license could get away with not consenting to a breathalyzer.

[1]: https://www.baronedefensefirm.com/breathalyzer-refusal-and-i... (for instance)


Yes. If a US cop demands a blood or breath test, you basically have no choice if you were driving, or you are presumed guilty of a crime and then they pull a warrant for a blood test anyway. The right not to be tested is essentially given up when you sign for your driver's license.


And despite all of that, every lawyer will tell you to refuse a blood or breath test until they have that warrant and strap you down in the hospital to forcibly pull blood.


I have been convinced for several years now that insurance companies are likely buying up personal data from many different sources. They seem to be ideal customers for it, because the data leads to better outcomes for them when they can raise rates on those they identify as risky.


This isn't a secret. Go read one of the world's largest data broker's annual report to investors, ctrl-f for "insurance": https://www.experianplc.com/content/dam/marketing/global/plc...


Absolutely. Annual financial reports by public companies are a gold mine for this stuff, as they are literally required to talk about it.

You can also get a sense of the scale of the problem by the reported revenue and growth rates (which they're always eager to highlight).


I knew a guy who worked in Finance. Whenever he would buy alcohol, or cannabis (legal where I lived) he would only pay cash. His concern was that, if his credit card usage data were sold, it could increase his premiums.


That's why I buy my liquor at the gas station, on the same tx as the gas.


The credit card company could access subcategories of your purchase. It would make sense for them to do that to track you.


Itemized receipt data is not transmitted to the network or to the issuing bank.

Some merchants have multiple registers for the sale of different types of products, but generally if you receive only one receipt for your full purchase, it will be recorded under the category code for the merchant's primary business.


It can be, if the merchant wants it to be.

https://www.tidalcommerce.com/learn/what-is-level-3-data

On my American Express credit card statement, all the airline flights show the details of the flight and Staples.com transactions show the specific items that were purchased. And this has appeared for at least 6 to 8 years.
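
To make the distinction concrete, here is an illustrative comparison of a category-level record versus a Level 3 record with line items. The field names are invented for illustration, not an actual network schema:

  # Illustrative only: field names are made up, not a real payment-network schema.
  basic_record = {                # what the issuer normally sees
      "merchant": "SHELL OIL 5744",
      "mcc": "5541",              # merchant category code: service stations
      "amount": 63.17,
  }

  level_3_record = {              # merchant chose to pass item-level detail
      "merchant": "STAPLES.COM",
      "mcc": "5943",
      "amount": 63.17,
      "line_items": [
          {"sku": "HP-64XL", "description": "ink cartridge", "qty": 1, "price": 44.99},
          {"sku": "PPR-500", "description": "copy paper",    "qty": 2, "price": 9.09},
      ],
  }

Either way, the merchant controls whether the itemized detail gets transmitted at all.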


In Canada, at least, you have to go to specific stores that only sell either alcohol or cannabis. So just having a bill from there would be enough.


The whole point of an insurance business is to insure against unknown and unlikely risks.

If it is insuring known or likely risks, then it becomes a subsidy or wealth transfer (which should be the domain of governments).


> The whole point of an insurance business is to insure against unknown and unlikely risks.

Unknown to whom? To you, the insured? Or to them? Business thrives on customers with incomplete information.


It’s still unknown if someone engaging in risk will end up in costly collisions, or other events. Just because you engage in risk doesn’t mean it will bite you, only that it is more likely to bite you.

Besides why should less risky drivers subsidize riskier drivers?


If they have an ACTUAL measure of lower skilled and higher risk drivers, fine.

But when they use overly simplistic data (or use it in an oversimplified way) that makes the highest-skilled drivers appear in the same batch as low-skilled and high-risk drivers, that is not subsidy, it is unfair penalization by stupidity.

(see other comment on logging of g-forces)


"Besides why should less risky drivers subsidize riskier drivers?"

They essentially do. If the safe drivers are never at fault, those premiums went somewhere. If the risky, repeat-accident drivers aren't paying the full price of replacement vehicles, that money came from somewhere.


You are right but what matters is disclosure.

Here is a car that sells your driving data. Here is one that won't.

If you knew they were selling your data, you could objectively demand a discount from one of the two.


This has been true for several years. An insurance agent once told me that there are life insurance companies dropping the requirement for blood draws / medical exams and are just buying prescription records to correlate with financial, educational, and other behavioral data.

Edit: changed prescription “data” to “records”


Wouldn’t this violate HIPAA?


Depends who is selling that data. Some pharmacy delivery services or billing services may not be covered by HIPAA, since they are not necessarily "covered entities".


Is this true?

My understanding of HIPAA (possibly incorrect) is that it's attached to the data.

If a covered provider is leaking HIPAA covered data to a non-covered business associate entity... that's a big no-no and a fine.


There are criteria for which organizations are covered by HIPAA’s privacy protections. It is not attached to the data wherever the data goes.


Yes, those are covered entities. Their subcontractors who touch HIPAA data are business associates.

See https://www.hhs.gov/hipaa/for-professionals/covered-entities... and https://www.ecfr.gov/current/title-45/subtitle-A/subchapter-...

In my experience, covered entities are really serious about signing BAAs with any of their hosting vendors and partners, as afaik the liability falls on the covered entity if they didn't have an agreement in place and data leaked from a vendor/partner.


If you agree to the data being shared when signing up for insurance it wouldn’t be a violation.


Do you have any details on this?

I'm sure there are legal HIPAA data escape pathways (given the financial incentives for companies to find them), but I'm curious on the details.

Afaik, there's no way to make HIPAA-covered data non-HIPAA-covered, and absent that everyone in the custody chain is responsible for anywhere it eventually ends up.

That said, I expect the way this works in practice is more likely data that originates with non-HIPAA-covered entities, but can be massaged/combined into a similar product.


Not only that, don't insurers offer 'discounts' for installing tracking apps on your phones and devices?


As a safe driver, I like the idea of dangerous drivers paying more. There's no good reason the participants should not be aware they are under surveillance though.

Sidenote: I wonder if they've considered close follow distance or frequent lane changes as a risk factor.


Spot enforcement with appropriate training and vehicle improvements is more than appropriate for numerous reasons:

1) Regression to the mean will happen with 100% enforcement/over-enforcement. The new standard for 'safe' will collapse to an unobtainable level which will not benefit society in the long run.

2) Safety is not my #1 concern. The number one cause of death on roads is being born. I value getting to my destination without being tracked more than the potential safety gains of strict monitoring. I believe in rational safety measures but "It's safer so we must do it" is an argument I no longer accept. I want to live a good life, not just a safe one.

3) We have seen time and time again that personal information collected by companies rarely benefits consumers and instead is always used to benefit companies. This is no different. I have negative trust in industry handling my data for my benefit.


The idea of dangerous drivers paying more for insurance is fine. It's probably better than the idea of drivers with bad credit paying more for insurance.

The problem is in how dangerous driving is assessed. Simple-to-apply rules lack an understanding of conditions. Telematics will be low-bandwidth data, almost certainly without enough detail to form an understanding of conditions.


> It's probably better than the idea of drivers with bad credit paying more for insurance.

There must be some correlation between bad credit and likelihood to be in a collision.


There is a better correlation between dangerous driver and likelihood to be in a collision.


Probably. If so, that will be weighted more heavily.


The thing that’s somewhat ironic here is that the car companies could make cars safe by default. For example, they could make it not possible to accelerate faster than one needs to. They could put in speed limiters that are triggered by the speed limit on the road. They could stop marketing and selling over powered cars.

Instead they market cars as exciting race track like vehicles, things that let you do what you want, when you want. And now they will collect data on the people who actually do that.

Personally I would prefer a car that helps me be a safer driver by following the law, ensuring there are no pedestrians or cyclists in front of me, etc. But at the end of the day, automated enforcement is a good thing, so maybe this will help some people become safer drivers. The more likely reality, though, is that fewer and fewer people will be able to afford or get insurance, and because our country is so car dependent, they will just drive without it.


> For example, they could make it not possible to accelerate faster than one needs to.

I was in a rental car that had this once. Was on the highway, needed to get around another driver who was being unsafe. Was unable to do so because of the limiter. It was easily the most unsafe vehicle I've ever driven as a result. These mechanisms lack situational awareness and nuance, and thus are a direct threat to my personal safety. They very much need to be banned as a matter of course until such a time as humans aren't allowed to drive at all.


The problem though is that inevitably they will eventually automatically label anyone who does not "consent" to total surveillance as risky or dangerous.


It's telling how you phrase this.


If it's so telling then tell us, enough with the dark innuendos!


Appearing "safe" via slow speeds and slow turns does not equal "safe" in the real world.

Unless the car has cameras staring at you, it doesn't know if you're checking your mirrors before lane-changes, going 10-under the limit in the far left lane on a busy highway, etc.

Even then, cameras aren't a good answer: assuming people are telling the truth about their experience with test-taking software, there are false positives that have to be manually reviewed by proctors when the computer thinks you're looking in the wrong place.

(Edit: it also doesn't know if your mirrors are positioned properly so you do not have a "blind spot". In every modern vehicle I've driven, it is possible to set the mirrors so you can continuously see cars in the left or right lane next to you - from the rearview mirror to the sideview mirror to the side glass. Hint: if you can see the side of your own vehicle in your sideview mirror, it is set improperly.)

---

> frequent lane changes as a risk factor.

People not willing to change lanes is how you get more traffic and more dangerous driving overall. Traffic should stay right unless passing - which means once someone is done passing, they also need to move to the right. People cruising in the left lane is how backups happen and how you get people making more dangerous lane changes to pass on the right. Incentivising fewer lane changes would be a bad thing.

If more people used cruise control in general, that might help (I don't know if any studies have been done on that). As it is now, most people's speeds ebb and flow, and that causes traffic, especially when they consciously or subconsciously speed up to match someone trying to pass them, or when there are curves, hills, or narrower lanes due to automated tollbooths, bridges, etc.


> People not willing to change lanes is how you get more traffic and more dangerous driving overall.

This is false. Lane changing, causing others to slow down, is a leading cause of traffic waves.

https://www.sciencedirect.com/science/article/abs/pii/S01912...

"Frequent lane-changes in highway merging, diverging, and weaving areas could disrupt traffic flow and, even worse, lead to accidents."


The easiest way to disable this is by physically removing the cell modem from your vehicle, which is very straightforward. Without egress, the only way for data harvesting to occur is by physical access, typically at a dealership. However, virtually all automotive cell modems are either packaged on the same chip as the GNSS receiver, or colocated on the same daughter board. As such, choosing to retain control over your data typically comes at the cost of foregoing the built in navigation system and other features such as emergency calling.


Unless most people do it, their answer will be to put you in the high-risk bucket by default.


There are insurance companies that allow you to voluntarily submit to tracking in exchange for reduced premiums. What is happening here is that those savings are being passed on to auto makers as an extra revenue stream.


There needs to be a Pi-hole for cars.


GrapheneOS for Automotive


All these companies have been hoovering up our data; where did you think this was going to end?

Add in the fact that if you are not getting "growth" on the stock market, then you must be doing something wrong.


The Stasi didn't add anything to the GDP of the GDR, either. That wasn't the point then, and it isn't now.


My point is that companies start to collect data; either for debug purposes or to try and better understand the customer.

Then there is a lot of pressure to monetize the data. So from a consumer POV, it is better to not have anything collected.


Yes.

My quip was to make a rather negative comparison, while noting that the whole collect-and-monetize industry is a net negative for both the economy and human society.


Gift Link: https://www.nytimes.com/2024/03/11/technology/carmakers-driv...

US lawmakers could put a stop to this and every other privacy scandal of recent years at any time, you know, by passing a strong privacy law, but nahhhhhh, we can't do that!

It's yet another reason why people should buy older cars (preferably 2012 or older), since the automotive, insurance, and data broker industries don't care one bit about your privacy, and sadly the US isn't going to do jack about this either until we elect more people to office who do care and pass a strong privacy law.


More concerning is how we are not able to view and challenge this data. It's a one-way street.


The worst thing about this is that all of their conclusions about what data constitutes "bad driving" or "risky driving" are dead wrong.

The signs they consider to be "bad driving" are high-g braking and turning.

Yet these are EXACTLY the same signs produced by a highly skilled driver or racer operating at the limit, as they would to avoid an accident (thus costing the insurer $0), in a situation that would catch 90% of low-g drivers in a wreck that totals the vehicle and causes injuries. A core element of high-performance driving for accident avoidance and racing is to understand the limits of tyre traction, and how to operate the car up to those limits — but not over them — i.e., just under the limit of sliding (sliding friction is always less than static or rolling friction), and to choose lines that maximize available traction.

Distinguishing the signs to tell a high-skilled driver from a bad driver requires more than just "is that number high?". You must look at the circumstances, the frequency, the conditions, the rate of increase and decrease of pressure, the slip angle, the grip state of all 4 tires, and more. But of course, no one bothers to do this.

It is the same kind of institutional stupidity that causes a world-class weightlifter with 4% body fat to be classed as "obese" because s/he scores high on the stupidly simplistic BMI scale (a ratio of weight to height).

Except that with BMI, insurance companies are not allowed to re-rate people, and doctors can instantly adjust treatment when they see the person is obviously not obese but highly trained.

With auto insurance, they can secretly re-rate us on bogus numbers that actually down-rate the highly skilled.

Seems more attractive with every passing year to rebuild older nice cars than get into the new rolling spyware contraptions.


Well, if one is stupid enough to get a race car with telemetry then the spying is deserved. The skill level is irrelevant insurance-wise, as it doesn't last, varies within the day, and is of no use on open, shared streets.

Now the dream car will soon be an electrified Lada Niva: no electronics, speeding impossible.


Who said anything about racecar telemetry?

You do realize that wheel speed sensors and g-force sensors are already standard equipment in most cars, and that this is part of the data they are selling, right?

Electrified Lada Niva, eh? Depending on how it's electrified, it might go waaayy faster than would be sane... ;-)


I'm skeptical of this argument. I don't want to be on the same road as people who self-identify as expert drivers going at the limit.


I completely agree.

My example is NOT about "self identified" "experts", but REAL experts who ACTUALLY have the skills. They also are typically very safe on the roads and know that race-like on-the-limit driving on the streets is idiocy.

The point is that people who ACTUALLY have these skills have a far wider margin of safety than the ordinary driver, and far better capability to avoid accidents. But, they will also — with that far wider margin of safety — often turn or brake with higher than ordinary G-forces.

For example, ordinary street tires and suspensions on modern cars can handle 0.9G lateral or braking acceleration. Ordinary people get uncomfortable at 0.2G lateral acceleration.

An unskilled driver approaching 0.25G lateral acceleration does risk exceeding adhesion limits and losing control because they are insensitive to inputs and feedback. In contrast, a skilled driver can turn at 0.25G all day with virtually no risk, as they are accustomed to driving at 3-4 times those Gs, and are situationally aware, sensitive to inputs and feedback, and choose lines and inputs that avoid the limit.

They are far less of a risk than an unskilled driver at 0.1G. Yet, the skilled driver will get flagged as "bad".

With deeper understanding and analysis, they could make the distinction between actual expert drivers vs overconfident idiots. But I see no indication that this will happen.
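
To make that concrete, here is a toy sketch of "is that number high?" versus a context-aware score. The thresholds and weighting are invented for illustration and are not any insurer's actual model:

  # Toy illustration only: thresholds and weights are invented, not a real model.
  def naive_flag(lateral_gs):
      """Flag a driver if any lateral-g sample exceeds a fixed threshold."""
      return any(g > 0.25 for g in lateral_gs)

  def context_score(lateral_gs, wet_road=False):
      """Weight each event by how close it comes to the grip limit of ordinary
      street tires (~0.9 g dry), and by how often such events occur."""
      grip_limit = 0.6 if wet_road else 0.9
      events = [g / grip_limit for g in lateral_gs if g > 0.25]
      return sum(e ** 2 for e in events)

  smooth_expert = [0.30, 0.28, 0.05, 0.31]   # brisk, but far inside the limit
  overconfident = [0.82, 0.88, 0.79]         # repeatedly at the ragged edge

  print(naive_flag(smooth_expert), naive_flag(overconfident))          # True True
  print(context_score(smooth_expert) < context_score(overconfident))   # True

A single threshold flags both drivers identically; telling them apart requires exactly the frequency, conditions, and margin-to-limit context described above.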


Can this be opted out of at the dealer? Either the black box collection OR the wireless connectivity?

If not, are there guides on disabling the modem without damaging diagnostics or infotainment?

I want a car that does not transmit data. Which means I may need to get my 2010s crossover rebuilt and reupholstered instead of getting a new car.


I think it varies manufacturer to manufacturer, but even those that make it possible seemingly make you jump through hoops. I was researching potential replacements for my 16-year-old car and found a lot of discussion about this regarding Mazda models:

https://www.cx30talk.com/threads/thoughts-on-tcu-disable.374...

> Mazda CEC makes it quite difficult to actually request/disable your TCU. It can take many phone calls and escalations to get someone to understand the request and actually "push the button" to send the disable event to your car.

Honestly I’m just totally disinterested in just about every current new car model.


Yes, many models have guides out there for disabling wireless connections. On a previous vehicle of mine, it was as simple as disconnecting the bridge/jumper between the main board and the wireless board.


Being mad about this is like being mad at the thief who stole your belongings and then pawned them. The crime was spying on you in the first place. Automakers should not have any data to share, sell, or give to law enforcement with a subpoena.



> An employee familiar with G.M.’s Smart Driver said the company’s annual revenue from the program is in the low millions of dollars.

Is that a lot of money for GM? I would have guessed no; it doesn't seem like very much for selling out their customers like this. Either it matters more to GM's profits than I'd expect, or they really don't expect much PR blowback risk at all.

I don't know if they are right or wrong, but...

> Drivers who have realized what is happening are not happy. The Palm Beach Cadillac owner said he would never buy another car from G.M. He is planning to sell his Cadillac.


Is there a good source for which makes, models, and model years “phone home”? I would absolutely take it into account when shopping for a new or used car, but I’ve had no luck with Googling.


I’d imagine any car since 2019 can likely share such data.



I called to turn off the data in a Toyota, and the guy wanted my name, phone number, email address, physical address and even more I can't remember right now. I was like "why do you need this info?" He said, "We need a record of who made this request for our records." I told him "do you understand that I am calling your company specifically because I don't want you to have records?" This went round and round about three times before I just gave him fake info.


Were you able to obtain any records of your own where they agree to cease collection that you can hold against them if they continue? Do you have any means of verifying that the collection has ceased? I don't believe that their word means much without these.


I just wrapped my truck with several layers of copper mesh, so it should be fine.

In all seriousness though, no. I have no way of confirming the data transmission has stopped.


Surely these cars have an "offline-mode". Anyone know how to force it? (I almost said "airplane-mode".)


> almost said "airplane-mode"

My old Jetta’s door once fell off.


Disconnect the antenna or the whole modem itself.


Ladies and gentlemen, if we want a fair society we MUST:

- mandate FLOSS by law, starting from the first SLoC, meaning no company can suddenly publish software just to sell something with it; the software must be published from day zero of its development, or the hw/sw/service can't go on sale;

- mandate local-first for everything, so connected cars are OK, but they just offer a simple DynDNS mechanism the owner can use to add the car to their own domain name as a subdomain like car.mydomain.tld and reach a relevant set of APIs the car offers (see the sketch after this comment). All data collected by the OEM must pass through the car owner's systems, in an open, readable, and documented form.

If this is not mandated by popular demand, surveillance capitalism will stay, since it's the new tool to know and conform the masses. Surveilled people are known, and knowing they are surveilled, they try to behave according to "social norms", fearing judgment or a social score; as a result, people evolve toward slaves who obey those who establish and update the current norms. We all know cooperation is needed to do anything; those who compete need many who cooperate, obeying their orders, to craft anything. In the past it was religion, then money; now social scoring is the way to stiffen the masses. Such a powerful tool is not something anyone accepts losing without a desperate and limitless fight. Only a large public reaction can force a change.
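
As a purely hypothetical sketch of the local-first idea above (the hostname, endpoint, and token are invented; no such manufacturer API exists today):

  # Hypothetical sketch only: the owner reaches their own car through a
  # subdomain they control; nothing here is a real manufacturer API.
  import requests

  CAR_API = "https://car.mydomain.tld/api/v1"
  HEADERS = {"Authorization": "Bearer OWNER_ISSUED_TOKEN"}

  # The owner reads their own telemetry...
  trips = requests.get(f"{CAR_API}/trips", headers=HEADERS, timeout=10).json()

  # ...and only explicitly consented records ever leave the owner's systems.
  shared = [t for t in trips if t.get("share_with_oem")]
  print(f"{len(shared)} of {len(trips)} trips forwarded to the OEM")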



You can get your data from LexisNexis, or opt out and delete data if you're in a state that mandates the option (such as CA), here:

https://consumer.risk.lexisnexis.com/consumer


I hate how many companies don't give a flying shit about privacy and completely ignore user selections under the guise of incompetence.

I would like to see a judge with sympathy for this sentiment fine them out of existence.


There are a lot of jerk drivers who go way too fast and drive very dangerously. They should have to pay significantly more for it. People who drive correctly should be charged less as well. I don't see why this is an issue.


Fun new line of business idea: manufacturers could claim the Texas abortion bounties by reporting any motorist who travels to an out-of-state clinic.

The problem with allowing this kind of data usage is that other moral authoritarians will wish to use the data as well.


There are plenty of data brokers who will sell your personal location data, independent of your vehicle, obtained from the apps on your phone.


The cars are not capable of measuring how dangerous the driving is.


Sure they are. Speed is easy to detect, for instance. Someone driving 50 mph in a 25 mph school zone should see massive increases to their insurance, as they present a huge risk.


Cars don't know about school zones and the reported data lacks even the signal of "50 in a 25." It's just a boolean: speeding, true or false. I'm skeptical of the idea that 61 in a 60 zone should be considered as dangerous as 130 in a 25, for example.
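
A toy illustration of how much signal the boolean throws away (the example speeds come from the comment; the scoring itself is made up):

  def boolean_flag(speed, limit):
      # 61-in-a-60 and 130-in-a-25 look identical here
      return speed > limit

  def over_limit_ratio(speed, limit):
      return max(0.0, speed - limit) / limit

  print(boolean_flag(61, 60), boolean_flag(130, 25))          # True True
  print(over_limit_ratio(61, 60), over_limit_ratio(130, 25))  # ~0.017 vs 4.2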


Should the fine be the same regardless of whether it's five minutes after school let out or 3 AM on Christmas Day?


It would be up to the insurer and how their models reflect and manage risk. The point being, individual insurers would be able to better price insurance based on discrete customer behavior.


Should cell providers sell customer data/metadata to car insurance providers? It is, after all, in the best interest of the insurers.


If there's enough data to suggest, with a high degree of probability, that people using their cell phones in certain ways while operating a vehicle cause damages, then yeah, there should be a market for that. I'm not saying there shouldn't be legislation that limits which types of data can be shared, but certainly if there's useful information with a market that leads to better safety and lower insurance prices, then I'm all for it.


I'm curious how that would play out. Speeding is a risk, doing it in a school zone when classes are getting out is a bigger risk, but driving at 3AM is also a big risk.


Driving while tired is a risk, but that's not the same thing.


From the insurance company's point of view it's all just risk. Driving at 3AM isn't just about driving tired, it's about drunk drivers sharing the road. The risk is substantial.


> An employee familiar with G.M.’s Smart Driver said the company’s annual revenue from the program is in the low millions of dollars.

I would have guessed that this is making more money. Does anyone know why this is the case?


This topic was previously discussed (163 comments, 2 days ago), also off the NYT article, at: https://news.ycombinator.com/item?id=39666976


This happens with healthcare data too. Every prescription you fill is tracked and used as input data for many insurance models that make health insurance pricing decisions.


They share your data in order to help lower your insurance rates.

Imagine what your premium might be without this service.

For example, I drive less than 900 miles a year, have had no accidents, citations or thefts and keep my 10 year old car in a garage. Yet my payments are $1500 per year. And after getting estimates from several companies, this was the lowest we could find.

Even with this service, the inflation rate for auto insurance is higher than anything else in our family budget.

Thank the lord for data sharing.


Some insurance companies make you use an app if you want lower payments so this battle is mostly lost.


My insurance went up 10 percent out of the blue. Wonder if Tesla shares?


10% is below average, you should feel lucky. Car insurance premiums have been rising dramatically, especially for EVs.


Could've just been "inflation" (read: Opportunity to jack up prices) too. Although if one car company is going to be on the bleeding edge of data collection & sharing it'd probably be Tesla. They're the most Silicon Valley of all.


How do you know labor prices to fix cars and more complicated/costly car parts did not cause the increase?


Can someone educate me on why an insurer should not know one's driving habits? I'd imagine that risks calculated from one's driving habits would be more accurate than those derived from only past accidents, car color, user profile, etc.


For me it’s because of this dirty concept called "privacy" and it’s the reason why insurers don’t have access to the list of items that I buy at the grocery store (also health records, name of sex partners, what I do all day long, whether I walk enough every day, etc.)


Does anyone have a list of companies that do not do this?


> An employee familiar with G.M.’s Smart Driver said the company’s annual revenue from the program is in the low millions of dollars.

It doesn't seem in the car company's interests to take on the reputational risk for this kind of financial reward.


What sort of "reputational risk" do you think they are taking on?

Data sharing with third parties is ubiquitous in almost all industries. Every single company that deals with financial products reports account information to third parties (Experian, Equifax, TransUnion, Early Warning Services, ChexSystems). If you return an item at a retail store, it gets reported to fraud alert databases. Most medium to large employers report the contents of their employees' paychecks to The Work Number. Insurance claims are reported to LexisNexis. Oil change companies report mileage to CarFax, which insurance companies use to check whether you're reporting accurate mileage.

Data reporting and sharing is ubiquitous; it's standard operating procedure. Having a few "privacy nerds" complain about it on the Internet is not risking their reputation.


> What sort of "reputational risk" do you think they are taking on?

> a few "privacy nerds" complain about it on the Internet is not risking their reputation.

The news about GM's OnStar tattling (their words) on drivers is front page on several big news sites like CNN. This is not just some privacy nerds, this is a whole bunch of mainstream media outlets calling out GM by name.

I'm confident the PR team at GM is working overtime right now to try and find a mitigating spin.


> It doesn't seem in the car company's interests to take on the reputational risk for this kind of financial reward.

Tell that to Boeing; they're on course to tank the entire company because of the financial shenanigans they pulled after 1997.

As soon as a company goes publicly traded, the incentives change: there is no longer any priority on the long term, the only thing that matters is INVESTORS INVESTORS INVESTORS (read that one in your finest Steve Ballmer voice).


Short-term profit, long-term losses: that kind of thing happens a lot.

Also, companies seem to work against their own interests quite often. The spyware is probably on some separate budget with separate bonuses attached. So "locally" in the department it might make financial sense to spy on the users.


Hear hear! And why not? Fuck the consumer I say! The one thing we can all agree on is that human dignity must be paid for in cash. If normal people wanted to be treated with respect then they would be high earners like us.


Good luck collecting that from a 1992 Mitsubishi L200


Got any apps?


Well, yeah, other than that :-)


Gives off creepy vibes, but if you stop to think about it, this can stop good drivers from subsidizing bad drivers. Not that the insurance companies are going to lower premiums for good drivers; if you have a problem with that, talk to capitalism. But bad drivers getting higher premiums is good for everyone.


> But bad drivers getting higher premiums is good for everyone.

Not necessarily. In many parts of the United States, a car is the only viable mode of transport. If you price the bad drivers out of the insurance market, they will forgo insurance altogether. Then, if they cause a loss, they will be uninsured and the other driver's insurance will have to pay for the loss (or spend resources in costly suits) anyhow. So then good drivers' premiums will need to go up to compensate for the extra "bad drivers can't afford insurance" risk that good drivers carry. We end up in a similar situation in a roundabout manner, but with the added element that now all our data is stored on everyone's servers.


I mean, Tesla has their own insurance product that they claim is better and cheaper than alternatives because of the data they track. People cheered for this.

Personally, I'm not opposed to dangerous drivers paying higher rates, but the devil is in the details.


I mean, I knew…but… I didn’t know...


oh f**, we invented this shit in software, now it's coming back to bite us


Anyone know where I can find /etc/hosts on my Ford? /s


Behind the firewall.


I hope so. Get the animals off the road.


Imagine a scenario where a bunch of those animals brake-check you, and then your insurance company calls you up and says "Hey jgalt212, we're seeing that your forward collision avoidance system got activated too many times this year. We're going to flag you as a tailgater and up your premium by 80%. Have a lovely day."


Playing devil's advocate, the answer is that brake checking only works if you are tailgating. If you increase following distance such that you cannot be brake checked, the insurance company has succeeded in making your driving habits safer.


As opposed to the jackass who's pulling high lateral G's and going 90 on the freeway. I think I can explain my way out of the insurance hike more easily than the aforementioned jackass, who actually should be deemed uninsurable.



