• UndercoverUlrikHD@programming.dev

      Autopilot turns off because the car doesn’t know what to do and the driver is supposed to take control of the situation. The autopilot isn’t a true autopilot, it’s driving assistance, and you want it to turn off if it doesn’t know what it should do.

        • UndercoverUlrikHD@programming.dev

          Sure, but what I meant was that Tesla doesn’t have self-driving cars the way they try to market them. They are no different from what other car manufacturers have; they just use a more deceptive name.

      • Lemming6969@lemmy.world

        If an incident is imminent within the next two seconds or so, autopilot must take an action or assist in an action. Manual override can happen at any time, but over such a short duration it’s unlikely, and only the autopilot has any chance; therefore it cannot turn off and absolve itself of liability.
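
        A minimal sketch of the rule this comment proposes, assuming a simplified time-to-collision (TTC) estimate; the function, states, and ~2 s threshold here are illustrative, taken from the comment, not any manufacturer’s actual logic:

        ```python
        # Hypothetical sketch of the proposed rule: below ~2 s to collision,
        # the assistance system may not simply disengage; it must act, and
        # that final window remains the automation's responsibility.

        TTC_CUTOFF_S = 2.0  # illustrative threshold taken from the comment above

        def decide(ttc_s: float, driver_has_taken_over: bool) -> str:
            """Pick the system's action when it is unsure what to do."""
            if driver_has_taken_over:
                return "manual control"       # override is allowed at any time
            if ttc_s < TTC_CUTOFF_S:
                return "emergency maneuver"   # too late for a handover: act, don't quit
            return "request driver takeover"  # enough time for a human to respond
        ```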

    • Biyoo@lemmy.blahaj.zone

      Autopilot turns off before collision because physical damage can cause unpredictable effects that could cause another accident.

      Let’s say you run into a wall, autopilot is broken, and the car thinks it needs to go backwards. You’ve now killed three more people.

      I hate Elon Musk and Teslas are bad, but let’s not spread misinformation.

      • Programmer Belch@lemmy.dbzer0.com

        It seems reasonable for the autopilot to turn off just before a collision; my point was more along the lines of “You won’t get a penny from Elon”.

        People who rely on Full Self Driving, or whatever it’s called now, should be liable for letting a robot control their cars. And I also think that the company that develops and advertises said robot shouldn’t get off scot-free, but it’s easier to blame the shooter than the gun manufacturer.

        • Biyoo@lemmy.blahaj.zone

          Yeah I agree. Both parties should be liable. Tesla for their misleading and dangerous marketing, drivers for believing in the marketing.

  • Hux@lemmy.ml

    This reminds me of that Chinese law about being personally responsible for all medical debts of a person you run over—incentivizing killing the person, rather than injuring them.

    • Tankiedesantski [he/him]@hexbear.net

      That rumor is so stupid it doesn’t even begin to stack up. Paying medical bills sucks, but killing someone even unintentionally puts you at risk of jail time. Vanishingly few people are going to choose a decade or more of hard labor in jail over paying a debt.

      The only thing this whole rumor proves is that people will believe the most irrational things about China as long as it makes Chinese people look bad.

  • lugal@sopuli.xyz

    I hope this isn’t law anywhere. You’re liable for your car no matter what; you have to take control if necessary.

    • Uriel238 [all pronouns]@lemmy.blahaj.zone

      I saw a headline about Mercedes offering an autopilot that doesn’t require the driver to monitor it, so it’s going to be interesting to see how the laws play out. The Waymo taxi service in Phoenix seems to occasionally have run-ins with the law, and a remote service advisor has to field the call, advising the officer that the company is responsible for the car’s behavior, not the passenger.

      • Cyclist@lemmy.world

        So in theory the manufacturer takes responsibility because they trust their software. This puts the oness on them and their insurance, thereby reducing your insurance considerably. In actuality your insurance doesn’t go down because insurance companies.

        • conditional_soup@lemm.ee

          I’m not trying to be the grammar police, just thought you might like to know that it’s “onus”.

        • Baŝto@discuss.tchncs.de

          It’s the reason why they prefer to offer only assistance systems. Aside from warning, they can act, but they don’t drive on their own. The EU will even require some systems for new cars. They’ll especially annoy people who ignore speed limits and don’t use turn signals.

    • cm0002@lemmy.world

      You’re liable for your car no matter what

      Nope. It should be law that if an auto manufacturer sells an autonomous driving system that they advertise as usable while driving distracted, then they are liable if someone uses it as advertised and per instructions.

      What you wrote is probably an auto manufacturer executive’s wet dream.

      “You used our autonomous system to drive you home after drinking, completely within advertised use and per manufacturer instructions, and still got in an accident? Oh well, tough shit, the driver is liable for everything no matter what™️”

      • warm@kbin.earth

        When autonomous cars are good enough to just drive people around, then yeah, the companies should be liable; but right now they’re not, and drivers should be fully alert as if they were driving a regular vehicle.

        • monk@lemmy.unboiled.info

          When autonomous cars are good enough to just drive people around

          they become autonomous cars. It’s not autopilot if I’m liable, simple as that.

        • FlexibleToast@lemmy.world

          There are already fully autonomous taxis in some cities. Tesla is nowhere near fully autonomous, but others have accomplished it.

            • FlexibleToast@lemmy.world

              Fair, but when a company is given the authority to run fully autonomous taxis in cities, that’s a huge accomplishment. Granted, they are cities that don’t see things like snowstorms, and I’m sure there is a good reason for that.

        • cm0002@lemmy.world

          but right now they’re not, and drivers should be fully alert as if they were driving a regular vehicle.

          Which is what would be per manufacturer instructions, which still falls under my definition.

          • warm@kbin.earth

            Replies aren’t always in disagreement! I agree with what you are saying, just adding on my thoughts on information further up the thread too.

        • azertyfun@sh.itjust.works

          1. Then don’t call it autopilot.
          2. What’s the point of automated steering if you have to remain 100% attentive? To spare the driver the terrible burden of moving the wheel a couple of mm either way? It is well studied and observed that people are less attentive when they’re not actively driving, which, FUCKING DUH.

          Manufacturers provide this feature for the implicit purpose of enabling distracted driving. Yet they will not accept liability if someone drives distractedly.

          Next in We Are Not Liable For How Consumers Use Our Product, Elon will replace the speedometer with Candy Crush, with small text that says “pwease do not use while dwiving UwU”.

          • warm@kbin.earth

            You choose to activate that mode. While I understand your sentiment and do agree, it’s not as cut and dried as ‘company liable’ or ‘driver liable’; both can be at fault. Taking blame off drivers entirely could make people even less attentive, and the safety of lives is more important than some fines for a car manufacturer. The real problem is that the mode is allowed to exist at all. It’s clearly not ready for use on public roads, and companies are just abusing advertising to try and pass off their ‘autopilot’ as something it isn’t.

            Also note: Some manufacturers (Volvo & Mercedes, that I know of) have already said they will claim full responsibility for their cars in self-driving mode.

          • warm@kbin.earth

            It’s still in its infancy; eventually it will replace humans entirely and the roads will be much safer. Right now it’s just like improved cruise control, and kind of pointless.
            Some manufacturers have already said they will claim full responsibility for their cars in self-driving mode, which makes sense to do.

            • queermunist she/her@lemmy.ml

              I’ll clarify: what is the actual purpose of giving customers access to this infantile technology? It doesn’t make following traffic laws easier like cruise control does, it doesn’t make drivers better at driving or safer behind the wheel, and it merely encourages distracted driving.

              So why did they ship this product? Again, it just seems like a dangerous toy.

      • lugal@sopuli.xyz

        So I say it is law, as of the last time I checked (which was a while back, tbh), and you say “no, it should be law” in your opinion. You see the difference, right?

        Autonomous systems aren’t that trustworthy yet and you shouldn’t drive drunk with them. Are they really advertised that way?

  • Technoguyfication@sh.itjust.works

    I’m not aware of a single jurisdiction on the planet that makes Tesla liable for what the vehicle does when autopilot is enabled. In order to activate autopilot you have to accept about 3 different disclaimers on the car’s screen that state VERY clearly that you are still responsible for the vehicle and must intervene if it starts behaving dangerously.

    I’ve been driving with autopilot for over 2 years, and while it has done some stupid stuff before (taking wrong turns, getting in the wrong lane, etc.), it has NEVER come close to hitting another vehicle or person. Any time something out of the ordinary happens, I disengage autopilot and take over.

      • Zagorath@aussie.zone

        Bro bought a Tesla just 2 years ago. Long after it was very widely known just how much of an arsehole Musk was, and after many other excellent EVs were on the market.

        I’ll let you draw the conclusions from those facts.

        • Technoguyfication@sh.itjust.works

          When I bought my car, there were no widespread plans for other manufacturers to adopt NACS, you couldn’t get your hands on a Rivian for less than $100k, and I was commonly driving long distances for work, so I needed a vehicle with long range that I could charge quickly on trips. Tesla checked all the boxes.

          I haven’t experienced any of these super widespread quality or reliability issues people on the internet talk about. It was delivered with no issues, has needed very little maintenance (just tire rotations basically), and it’s not falling apart like some would lead you to believe. I don’t know what to say other than that my personal experience with the vehicle has been great, and that’s what I really care about in a vehicle. I don’t buy cars based off what the CEO says on Twitter.

        • jose1324@lemmy.world

          Hate Musk or not, the Tesla is still a very good car. In many markets it’s oftentimes still the better value.

          • pufferfisherpowder@lemmy.world

            Yeah and while Elon is the fucking worst I assume not everyone knows that he is the Tesla man. It’s incredible actually how much he’s intertwined with the brand. I would totally buy a Toyota or whatever and I couldn’t tell you the name of their CEO, nor of any other car manufacturer, nor would I look up who they are beforehand.

            Granted the poster above is on Lemmy so I assume he knows more about musky boy than he would like.

          • Zagorath@aussie.zone

            Everything I’ve heard says that Teslas have had huge reliability problems.

      • Technoguyfication@sh.itjust.works

        You can think whatever you want, but my experience driving it has been perfectly fine. Range is great, the car is not falling apart like some people claim, it was not delivered with any issues, and chargers are plentiful where I live. Those are the main things I (and many others) care about in a vehicle. I don’t care what the CEO does or says online. I have a Ford as well and couldn’t even tell you who the CEO of Ford is.

    • orcrist@lemm.ee

      If someone is injured or killed by a Tesla car, they can sue the company directly, regardless of any legal agreements you may have as the owner. Whether they win is a different question, but they might win if they could show that Tesla was negligent, and especially if Tesla was willfully negligent.

      Just because you think you’re responsible, even if you agreed in triplicate that you’re responsible, doesn’t necessarily make you legally responsible, depending on the circumstances. And that’s the way it should be.

  • Dr. Moose@lemmy.world

    Even with autopilot, I feel it’s unlikely that the driver would not be liable. We haven’t had a case yet, but once one happens and goes up to the higher courts, it’ll immediately establish a liability precedent.

    Some interesting headlines:

    So I’m pretty sure that autopilot drivers would be found liable very fast if this developed further.

    • dejected_warp_core@lemmy.world

      I am not a lawyer.

      I think an argument can be made that a moving vehicle is no different from a lethal weapon, and the autopilot nothing more than a safety mechanism on said weapon. Which is to say, the person in the driver’s seat is responsible for the safe operation of that device at all times, in all but the most compromised of circumstances (e.g. unconscious, heart attack, taken hostage, etc.).

      Ruling otherwise would open up a transportation hellscape where violent acts are simply passed off to insurance and manufacturer as a bill. No doubt those parties would rush to close that window, but it would be open for a time.

      Cynically, a corrupt government in bed with big monied interests would never allow the common man to have this much power to commit violence. Especially at their expense, fiscal or otherwise.

      So just or unjust, I think we can expect the gavel to swing in favor of pushing all liability to the driver.

      • Hagdos@lemmy.world

        Making that argument completely closes the door on fully autonomous cars though, which are sort of the holy grail of vehicle automation.

        Fully autonomous doesn’t really exist yet, aside from some pilot projects, but give it a decade or two and it will be there. Truly being a passenger in your own vehicle is a huge selling point: you’d be able to do something else while moving, like reading, working or sleeping.

        These systems can probably be better drivers than humans, because humans suck at multitasking and staying focused. But they will never be 100% perfect, because the world is sometimes wildly unpredictable and unavoidable accidents are a thing. There will be some interesting questions about liability though.

    • SkyezOpen@lemmy.world

      They’re most likely liable. “FSD” is not full self-driving, it’s still a test product, and I guarantee the conditions for using it include paying attention and keeping your hands on the wheel. The legal team at Tesla definitely made sure they weren’t on the hook.

      Now, where there might be a case for liability is Elon and his stupid Twitter posts and false claims about FSD. Many people have been misled, and it’s probably contributed to a few of the autopilot crashes.

    • SinJab0n@mujico.org

      It was possible to make Musk deal with his own mess before, but after the latest lawsuits over false advertising they changed the wording from “fully automated” to “assisted driving”, and now even the manuals say:

      "dude, this is some fucky woocky shit, and is gonna kill u and everyone involved if u let us in charge. So… Pls be always over the edge of ur seat ready to jump! We warned u (even if we did everything to be as misleading as possible), u can’t pass us the bill, nor sue us now.

      K, bye."

      So yeah, they ain’t liable anymore.

    • stom@lemmy.dbzer0.com

      You’re still in control of the vehicle, therefore you’re still liable. It’s like plopping a 5-year-old on your lap to drive while you nap: if they hit people, it’s still your fault for handing over control to someone incapable of driving safely while you were responsible for the vehicle.

      • Norodix@lemmy.world

        But a reasonable person would not consider a child capable of driving. An “extremely advanced algorithm that is better and safer than humans and everyone should use it” is very different in this case. After hearing all the stupid fluff, it is not unreasonable to think that self-driving is good.

        • stom@lemmy.dbzer0.com

          Tesla’s own warnings and guidance assert that drivers should remain ready to take control when using the features. They do not claim it is infallible. Oversight and judgement still need to be used, which is why this argument wouldn’t hold up at all.

          • LovesTha🥧@floss.social

            @stom @Norodix Pity Tesla hasn’t taken reasonable precautions to ensure the driver is driving.

            It isn’t unreasonable to have customers expect the thing they were sold to do the thing they were told it does.

  • Uriel238 [all pronouns]@lemmy.blahaj.zone

    The cover-your-ass scenario.

    In the Philosophy Crash Course there was a scenario like this. I’ll paraphrase:

    You’re a traveler exploring a semi-developed nation in South America. Coming out of the wilderness, you come across a squad of soldiers. They are forcing twenty villagers to dig a mass grave. The officer in charge of the soldiers tells you these villagers committed the state crime of supporting a rival to their leader, and are to be executed. But as you are a guest in their country, he will make you an offer: if you shoot one of them yourself, he will set all the rest free, and they can then hike to the border and beg for asylum. (A rough trek, but the neighboring country may take them.)

    Do you shoot one of the villagers?

    Actually killing someone is rather hard on the psyche, and most of us cannot bear the thought (and might suffer from trauma as a result). But then, perhaps this is a small price to pay for nineteen human lives.

    Thomas Aquinas and Kant would happily let the soldiers kill the villagers so as to avoid committing the sin of murder themselves. Aquinas and Kant would not even lie to the murderer at the door, or to Nazi Jew-hunters, to save the lives of fugitives hidden in their home, since lying was sin enough, and they would count on God to know His own. Both had contemporaries who disagreed, and felt it was proper to suffer the trauma and do what was necessary (assuming the officer of the soldiers seemed inclined to keep his word and actually spare the remaining villagers).

    So the cover-your-own-ass response has a long history of backers, including known philosophers.

    • Kwakigra@beehaw.org

      The soldiers are killing them in either case. Basically I’m being asked to choose whether they kill one or all of them, and they’ll make me shoot someone if I choose to spare most of them. Considering the situation, I’m not sure these soldiers won’t just kill me with everyone else anyway, since I’m a witness, after they’ve had their fun.

      • KevonLooney@lemm.ee

        This is the actual answer. Don’t look them in the face, and try to leave ASAP. Philosophical questions are usually very simplified to illustrate a point, and there are usually more than two choices.

        In a real life “trolley” scenario, you should wave to the driver and get them to stop. There should be zero people tied to the tracks. And that’s only what should happen if you think quickly enough. In real life you may actually just freeze and do nothing.

        • Uriel238 [all pronouns]@lemmy.blahaj.zone

          IRL we typically do what we feel and justify it later, but then IRL there is no right or wrong, except what we construct in the process of organizing with each other to cooperate against outward threats such as predators and the elements. We have agreed to poor conditions because our lords were kinder than the winter and the bears, but then we’ve also overthrown our lords when we worked out they need us more than we need them.

          But yes, if you want to pretend that moral philosophy is just cerebral masturbation, that’s valid. All of our philosophy is about the opinions of past thinkers on the perimeters of right and wrong. It will give you a clear answer about as well as religious philosophy might tell you which pantheon of gods is the true one.

          These scenarios are less about what is right or wrong, and more about how you, individually and personally, decide ad hoc what is right or wrong. You might distrust the soldiers, but then, if they were inclined to betray your trust with a lie, they might never have intended to let you go free either, and the whole story becomes irrelevant.

          Another trolley-like scenario features a stranger in town who is a perfect match for five transplant patients awaiting organs. The surgeon / hospital administrator has a friend in organized crime who can abduct the stranger and harvest his organs quietly and cleanly so that the authorities won’t notice he disappeared. Although IRL, needing a transplant is a mortal condition: having the organ buys more time than not having it. Also, this doesn’t get into the risks of other complications of transplant surgery that can occur even when an organ is a good match.

          These scenarios are not about real life, but about becoming more self-aware of how you’d consider them. And yes, this may mean looking for third options, hoping to find one better than the two obvious ones.

          • KevonLooney@lemm.ee

            There are almost always better options than the given ones. I remember an answer a friend gave to the “Ship of Theseus” problem; he recommended calling it a different ship once more than 50% was replaced. I asked why, and he said that all definitions are just made up, and you have to draw a line somewhere.

            That’s what people do in real life. They don’t just sit there perplexed by a “paradox”.
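
            A toy sketch of that made-up rule (the 0.5 cutoff is exactly the arbitrary line the friend chose, nothing canonical):

            ```python
            def is_same_ship(parts_replaced: int, parts_total: int) -> bool:
                """The friend's arbitrary rule for the Ship of Theseus: call it
                a 'different ship' once more than half its parts have been
                replaced. The 50% line is made up on purpose; you have to
                draw a line somewhere."""
                return parts_replaced <= parts_total / 2
            ```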

            • Uriel238 [all pronouns]@lemmy.blahaj.zone

              Firstly, if we’re talking about the Trolley Problem, that’s not a behavior paradox, that’s a morality paradox. Animals, including human beings, commonly act first and rationalize their behavior later. We can decide after the fact that it was ethical after all, decide that it wasn’t but was justified due to circumstances, or decide that it wasn’t and isn’t, and we’ll reconcile it after the fact. Examples like the Trolley Problem are not meant to reflect real life and how we behave; rather, they are contrived to contemplate how the logical mechanisms we use to determine ethical options can become problematic. (Utilitarianism has its own paradoxes.)

              Secondly, human beings are in fact susceptible to paradoxes that can cause decision paralysis, but they tend to involve either survival or high-stakes situations with incomplete information. A common one is when a green, low-ranking enlisted person is given a direct order that is illegal. In the US Army, our soldiers are educated as to the rules of war and what constitutes a war crime, and while they are legally obligated not to act on illegal orders, they also know well before they get out of boot camp that they’ll be jolly sorry if ever they do disobey an order. Command them to commit an atrocity on the field and they lock up by the dozens. Hence squad commanders know that if they issue an illegal command (even one based on incomplete information) it risks unit cohesion. Getting caught in a SNAFU like this is still common, and the enlisted seldom come out of it well, so it’s on the list of counter-recruitment bullet points.

              The same kind of thing also appears in game shows (where it’s contrived) and in the strategic nuclear chain of command, because a lot of officers do not ever want to be the guy who nuked two million people, even if they’re the enemy. And yet those officers routinely got to serve as key-turners to arm (or launch) our nuclear arsenal. (I don’t know what the situation has been since the new century, or if those stations are even manned at all times anymore.)

              In the end, we are animals, and typically when we’re confronted with moral choices, it’s a matter of survival or high stakes, in which case we often don’t have the time for measured contemplation on what we’re doing. Moral philosophy questions what behavior may be right or wrong according to a given standard, but it doesn’t get into how people actually behave. For that, consider psychology and sociology.

    • JohnDClay@sh.itjust.works

      How are you sure the soldiers will follow through with their end of the bargain? Once they give you the gun, can you try to shoot the soldiers? Could you bribe the soldiers to release all the prisoners?

      Thought experiments like this have two options, but real life never has only two options. Getting into that mindset can lead people to accept things “for the greater good” without exploring all the options.

    • mondoman712@lemmy.mlOP

      You just described an alternative version of the well-known trolley problem, which the post is referencing.

      The philosophers’ answers to the problem are interesting.

      • Uriel238 [all pronouns]@lemmy.blahaj.zone

        The trolley problem is a schoolbook example of the failure of creed-based philosophy (deontological ethics), but its various scenarios are also used to illustrate that circumstances which don’t affect the basic setup or outcome do affect our feelings and our response to it.

        It’s easier to pull a lever from a remote position than to actually assault someone or kill them by your own hand, for example.

        There are other scenarios that don’t necessarily involve trolleys, but involve the question of doing a wrongful act in order to produce a better outcome: Ozymandias in Watchmen killing millions of New Yorkers to prevent a nuclear exchange, thereby saving billions of people. (Alan Moore left it open-ended whether that was the right thing to do in the situation, but it did have the intended outcome.)

        We like the trolley problem because you can draw it easily on the blackboard, but other situations are much better at illustrating how subtle nuance can drastically change the emotions behind it.

        Try this one:

        The Queen of the land dies. On the day of her sister’s coronation, the new queen declares that Anglicanism is now the state faith and Catholicism is now unlawful (a reversal of the old order): Catholics are to report to a town or city hall to convert or be executed. You are Catholic. Do you obey the law or flee? And if you obey the law, do you convert, or perish at the hand of the state? Do you lie about your faith to state agents or on the national census?

        To a naturalist like myself, I’m glad to lie or convert to spare my own life; but to the devout, pretending to be of another faith, or converting under force, was a terrible sin, so it’s a very sober (and historically relevant) look at religious principle.

    • Sean@liberal.city

      @uriel238 @mondoman712
      In the days before the Wannsee Conference (the Nazis setting up death camps) but after the invasion of Poland, when most executions occurred by firing squad, there were German tourists who would travel to partake in the firing squads. So the trauma is not universal across the human experience, and there are some circumstances that would cause individuals to kill. Lynchings and massacres in the US are examples of this occurring without a war to give cover to the killings.

      • Uriel238 [all pronouns]@lemmy.blahaj.zone

        We’ve seen a similar phenomenon in some of the red states in the ideological conflict here in the US. There are people eager to kill someone just to have the experience, who volunteer to hunt targeted groups (trans folk, lately) or to participate in an execution by firing squad. I remember that in John Oliver’s first segment on the death penalty (he did a second one recently), executions were stalled due to difficulties obtaining the drugs used in lethal injections, and firing squads were brought up. The expert pointed out the difficulty of finding one executioner, let alone seven. The officials suggested recruiting volunteers from the gun-enthusiast citizenry, which the expert saw as naïve.

        I can’t speak to firing-squad executions during the German Reich and the early stages of the Holocaust, but I can speak to the Einsatzgruppen, who were tasked with evacuating (to mass graves) villages that harbored Jews, harbored enemies of Germany, or were otherwise deemed unworthy of life. The mass executions were hard on the troopers, and as a result Heydrich contended with high turnover rates.

        This figured largely into the movement towards the industrialized genocide machine that pivoted around the Auschwitz proof of concept. Earlier phases included wagons with an enclosed back into which the engine exhaust was piped. The process was found to be too slow, and it exposed too many service people to the execution process. The death camps were staffed so that no one had to both interact with the prisoners and process the bodies, so no one would have to confront the visceral reality of before and after. They were staffed so that anyone who engaged a mechanism was two steps away from the person authorizing (and taking responsibility for) the execution. The guy who flipped the switch was just following orders.

        Interestingly, we’d see a repeat of this during the international War on Terror, specifically the Disposition Matrix, which led to executions of persons of interest in the field by drone strike (a Hellfire missile launched from a Predator drone). During the CIA drone-strike programs in Afghanistan and Pakistan, the drone operation crews suffered from a high turnover rate, with operators suffering combat PTSD from having pulled the trigger on the missile launches. It didn’t help that they were also required to scan the damage to assess the carnage and identify the casualties.

        Interestingly, this also presented an inverted demonstration of how the human mind can tell the difference between violent video games and the real thing. Plenty of normies play Call of Duty without dealing with the mental after-effects of war, but even when we conduct war operations from continents away, our brains recognize that we are killing actual human beings, and we suffer trauma from the act. War continues to be Hell, and video games not so much.

  • bufalo1973@lemmy.ml

    The funny part will be once the car doesn’t have a driver and is fully autonomous. If the car kills someone, who’s to blame?

    • Glytch@lemmy.world
      6 months ago

      The company that rented it to you, because fully self-driving cars won’t be for private ownership; they’ll just replace rideshare drivers.

      • explodicle@sh.itjust.works

        Who’s to say that will be immediate? Many people won’t be quick to abandon their guaranteed-available vehicle, especially while every house and employer has parking.

                • Sizzler@slrpnk.net

                  Ok, so ten years then. In that time nearly all average family cars will be smart. They will have self-driving (they can come pick you up). They will have a few years of insurance claims and premiums showing they are not responsible for 99% of crashes, and insurance will react accordingly, pushing up the premiums of the last holdouts so far that it becomes uneconomical for the average person to drive “manual”.

    • Schadrach@lemmy.sdf.org

      You treat it like any other traffic accident, except that if a self-driving car is responsible, that responsibility lies with the vehicle’s owner.

      • Wogi@lemmy.world

        It would have to be the manufacturer.

        If someone steals your car and kills someone with it, then disappears without ever being identified, the car owner doesn’t assume liability. Liability falls on whoever was operating it at the time. If software was driving, then the software company assumes the liability.

        • Schadrach@lemmy.sdf.org

          Doubt it. I mean, any self driving car is going to make the driver agree to responsibility for what the car does and ensure the user has a manual override available just in case.

          No company is going to ship fully autonomous driving software (for example to have fully autonomous driverless taxis) without contractually making the fleet owner responsible for their fleet cars.

        • explodicle@sh.itjust.works

          But you bought the driverless car and turned it on. You never agreed to the thief’s joyride. Where do you draw the line for “operation” - like operating a steering-assist car, or operating a Roomba?

      • bufalo1973@lemmy.ml

        It’s not the same. When you have a dog you use a leash and, if needed, you can restrain the mouth.

        In this case you are not in control. And you can’t be. You are just a passenger. And you should have the same responsibility as a passenger in a train: none.

        • boatsnhos931@lemmy.world

          I didn’t know about your parameters. I would think your example pushes it home: no car should ever be fully autonomous, and every car should have a “leash” that a human could “restrain” it with if necessary. Is that no good?

    • supercriticalcheese@lemmy.world

      Whichever party was at fault; that’s my non-lawyer opinion.

      What kind of penalty you apply to a self-driving car found guilty of causing an accident is a good question, though.

      • bufalo1973@lemmy.ml

        I guess it would be the car maker’s responsibility if you are only a passenger in the car.

  • KillingTimeItself@lemmy.dbzer0.com

    this is funny and all, but it doesn’t matter what you’re doing here; you’re technically liable for all of them, so, uh.

    I’ll wait for a better version of this.

    • marcos@lemmy.world

      I think that’s the point. There’s a follow-up about killing the people tying others to the rails that fits.

  • DNOS@lemmy.ml

    Imagine having a car that doesn’t pretend to drive itself but is enjoyable to drive. A car that doesn’t pretend to be a fucking movie theater, because it’s just a car. A car without two thousand different policies to accept, in which you will never know what’s written, but a car that you will be able to drive even though you decided to wear a red shirt on a Thursday morning, which in your distorted future society is a political insult to some shithead CEO. A car that you own, not a subscription-based loan. A car that keeps very slowly polluting the environment instead of polluting it with heavy chemicals dug up by children, while still emitting roughly as much CO2 as the next 20 years of the slowly-polluting one, not to mention where the current comes from. A car that will run forever if you treat it well, with minor fixes and relatively minor environmental impact, and that doesn’t need periodic battery replacement, which btw is like building a new vehicle… These are not only critical thoughts about greenwashing but are meant to make you reflect on the different meanings of ownership in different time periods.

    And yes, I will always think that all environmentalists who absolutely need a car should drive a 1990s car: fix it, save it from the scrapyard, and drive it till it crashes into a wall…

    • SinJab0n@mujico.org

      Imagine not being forced to need a car at all.

      Imagine being able to just sit down, watch memes, read something, watch a movie, maybe take a nap, or even take advantage of the journey and get ahead on some tasks on ur way to work.

      Imagine being able to eat dinner on ur way home if ur daily commute is kinda long. Wouldn’t that be a dream?

      Brothers, sisters, let’s get some trains in our lives.

      • DNOS@lemmy.ml

        Totally agree…
        The dream would be to see them arriving on time, maybe clean (not from graffiti, I’m a huge fan, I mean from trash…). I don’t know about other places in the world, but we definitely need more, especially during peak hours, and the infrastructure should be in the state’s hands, not the monopoly of a single private low-paying dickhead… (We’ve regularly had a strike almost every Friday since my parents were born)…

        • SinJab0n@mujico.org

          Where r u from, my friend? Even ours in the 3rd world ain’t that bad; actually they r really reliable (and clean). Our usual demand is more lines.

          • DNOS@lemmy.ml

            Italy. We probably have the worst local train system… The long-distance ones are actually better… Maybe it’s my standards, you know, people keep wanting more…

    • SirQuackTheDuck@lemmy.world

      I would expect that a 90s car could eventually be converted to hydrogen combustion. That would save on pumping petrol (if the hydrogen is not generated from petrol), and it would not cost yet another car to be created.

  • Sibbo@sopuli.xyz

    Reminds me of the Chinese issue: you run over someone, but they are likely not dead. Will you save their life but accept having to pay whatever healthcare costs they have until they’ve recovered? Or will you run over them again, to make sure they die, so your punishment will be a lot lighter?