Sure, I'll address the rest...
"Did the one in Arizona kill the lady on her bike?"
Yes, although the system actually called for emergency braking - the Uber developers had TURNED OFF that capability while the car was under computer control. The autonomous software saw the cyclist about 6 seconds before impact. Uber coded the software to DELAY braking because, at that stage of development, they were still relying on the safety driver to take over and to flag incidents needing further review. That driver was watching her phone in her lap, and none of the drivers were allowed to use their phones while testing the vehicles. That's also why she is being prosecuted. Volvo later ran its own tests: the emergency braking the Uber developers disabled would have prevented the crash in 17 out of 20 scenarios, and in the other 3 the impact would have been at a non-fatal speed.
"Did the one in Cali drive into the K barriers and kill the driver"
It did. The lanes split there, and the painted lines led straight into the barrier; the driver, who was playing a game on his phone, didn't notice. That section of road was poorly marked and had caused many accidents at that same spot involving non-autonomous cars. The crash attenuator had even been crushed in an earlier crash shortly before the one in question and never repaired. Investigators also found that if the attenuator had been repaired, the driver would most likely have survived.
I don't defend any of these incidents - quite the contrary. No system on the road today should be considered better than Level II autonomy, meaning it is ultimately the driver's responsibility to stay in control should the car do anything dangerous. Some situations are beyond the control of even the best software or drivers, like red-light runners. Some accidents and deaths are going to happen no matter what; I think people forget that.
I would also note that the data collected by these 'computers on wheels' is extensive, so we have a lot of it. Tesla releases reports on how often their cars are involved in crashes.
For Q4 2022: using Autopilot, there was 1 crash every 4.85 million miles; not using Autopilot, there was 1 crash every 1.4 million miles.
The US average is 1 crash every 652,000 miles - so the average driver is 7.44 times MORE LIKELY to be in a crash than a Tesla on Autopilot.
https://www.tesla.com/VehicleSafetyReport
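If you want to sanity-check that 7.44x figure yourself, here's a quick back-of-the-envelope calculation in Python. The miles-per-crash numbers are the ones quoted above from Tesla's report; the ratio math is just mine:

```python
# Miles-per-crash figures quoted above (Tesla Q4 2022 report / US average)
MILES_PER_CRASH_AUTOPILOT = 4_850_000     # 1 crash every 4.85 million miles
MILES_PER_CRASH_NO_AUTOPILOT = 1_400_000  # 1 crash every 1.4 million miles
MILES_PER_CRASH_US_AVERAGE = 652_000      # 1 crash every 652,000 miles

# Crash rate is crashes per mile, so relative risk is simply the
# ratio of the miles-per-crash figures.
vs_autopilot = MILES_PER_CRASH_AUTOPILOT / MILES_PER_CRASH_US_AVERAGE
print(f"Average driver vs Autopilot: {vs_autopilot:.2f}x more likely to crash")
# -> 7.44x

vs_tesla_manual = MILES_PER_CRASH_NO_AUTOPILOT / MILES_PER_CRASH_US_AVERAGE
print(f"Average driver vs Tesla without Autopilot: {vs_tesla_manual:.2f}x")
# -> 2.15x
```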