There is no ethical dilemma. A driver is expected not to kill. If a self-driving car (SDC) kills, it should not self-drive. A person who kills has liability; a car does not.
A driver should override an SDC before danger arises. An SDC should never be allowed to operate without a driver.
The programmer would otherwise be liable.
BTW, being released to outpatient tomorrow. Yay!