Autonomous Cars | Discussion Page
Me vs. Someone
If I do nothing – I will be saved
If I do nothing – someone will die
If I do nothing – I will be saved but many will die
One or many
If I do nothing – many will die
Leviticus 25:35 states that one must support one's brother and allow him to live alongside you if he has "waxen poor." Interestingly, it also commands, "take thou no interest of him or increase." Why state this? It seems almost obvious not to take from a destitute man. Perhaps it addresses the seemingly rare case in which this would occur, one in which a person could save himself only by taking from the destitute man. I think this relates to our discussion of the autonomous-car dilemmas because it gives us an understanding of our responsibilities toward another person. Just as we are required to love our neighbor as ourselves, perhaps we should be responsible for him in the same way we are responsible for ourselves. This portion in Leviticus complicates the possible answers to our dilemma: should we be more responsible for our own lives than for the lives of others (should we swerve the car to save ourselves, or let ourselves be killed), or should it be the other way around? This pasuk seems to say that we are responsible for our neighbors and have an obligation to protect them, not only out of love of the neighbor but also out of fear of God (which strengthens the stringency of this commandment). Does this mean that our neighbor's life should be placed before our own? According to these psukim, it seems so.
I watched the TED Talk "How Self-Driving Cars Will Transform Our Cities and Our Lives." In this talk, only positive things were brought up. For example, the speaker argues that with autonomous cars we will be reinventing our lives, our cities, and the time and money we spend, because currently 6.9 billion hours are wasted in traffic, there are over 1 billion cars in the world that sit parked 96% of the time, and parking lots make up 20% of most cities. Therefore, by owning fewer cars and not driving ourselves, we could fix or change these problems. However, the talk fails to discuss all the jobs that will be lost once public transportation drivers are no longer needed. The speaker talks about all the time we will gain, but not about everything it will cost us, such as what decision the car will make if there is an accident. This TED Talk did not make a strong enough argument for why this would be a beneficial change.
The speaker in the video we watched last week differentiated between the instinctive reaction (of a human) and the conscious decision (of a computer) in the event of a collision, and between invocations of human responsibility and of those transcending them. It was difficult for us to determine if a human driving a self-automated car would be responsible for a self-automated car's reaction in a collision, even if its reaction had been preprogrammed.
The Talmud considers passive euthanasia as an impediment to life. Rabbi Isserles, on the other hand, sees suffering as an impediment to death. He believes human interference catalyses –and does not impede– natural death. In the case of self-automated cars, the impediment to our own life/death is beyond human interference; but the impediment to our own decision-making affecting how we die is within the relative bounds of our control (in this case, over a computer's programming).
I watched the video about the cannibals at sea. Three men were stranded at sea, and two of them decided to kill and eat the youngest in order to save themselves from starvation. If they had not done so, they probably would have died. At their trial they were convicted of murder. From a moral standpoint, most people would say that the two men acted immorally because they took a life. Because law and morality are meant to be intertwined, the men were punished for acting immorally. From an economic standpoint, however, the men acted rationally because they maximized the length of their lives. The issue of autonomous cars we discussed last week is very similar. If we program our cars to crash into the cars next to us in order to save ourselves, it is essentially the same as killing to save oneself, as in the case of the cannibals. Most would find it immoral. The reason I think a lot of us have difficulty seeing it as such is that the decision would be predetermined, and we would not be the ones to physically steer the car into our neighbors. The lack of direct action disassociates us from the guilt. At the same time, however, I believe that most people would prefer to have the car save them. After all, what makes their blood redder than mine?
I found the case of the cannibals at sea extremely relevant to our discussions. It relates to the self-driving cars because it is a question of taking action to save oneself (e.g., swerving out of the way of the object in front of you, even if it means killing the person in the car next to you). It seems, though, that the captain's decision raises multiple ethical problems from a Jewish perspective. First, the captain made his decision against the wishes of the other men on the boat, violating an important rule in Judaism that the majority rules (I know they weren't Jewish, but for the sake of our argument we can examine it from that perspective). Second, the captain made an active decision to value his blood over the blood of someone else, and as we saw in last week's sources, this is not allowed. While some might say that this was a case in which they had to kill one of them for the others to survive, realistically they could have waited for one of them to die and then eaten that person, instead of actively murdering one of the crew members.