You point out one of the toughest things about programming autonomous vehicles: deciding whether to hit the stroller in the crosswalk or the crowd on the sidewalk.

I remember talking to an engineer who came into the dealership out in CA about autopilot technology, and it's wild this correlation comes up now between a bus and a UPS truck. He mentioned the software he was working on and how they were trying to make it intelligent, essentially deciding which is worse to crash into. Basically, if the car is going to crash, it looks all around; say it sees 2 cars in the right lane and 1 bus in the left lane. 1 is less than 2, so it would crash into the larger object even though that could mean a harder impact. Now this was years ago and I don't know if that technology was actually implemented, but it's crazy to read an article about them hitting larger vehicles and trees.
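The heuristic the engineer described, as I understood it, could be sketched roughly like this. This is purely illustrative, all the names are made up, and it obviously has nothing to do with any real vendor's code:

```python
# Hypothetical sketch of the count-based crash-target heuristic described
# above: if a crash is unavoidable, pick the lane with the fewest detected
# objects, regardless of how large those objects are.

def pick_crash_lane(lanes: dict[str, list[str]]) -> str:
    """Return the lane containing the fewest detected objects."""
    return min(lanes, key=lambda lane: len(lanes[lane]))

lanes = {"right": ["car", "car"], "left": ["bus"]}
print(pick_crash_lane(lanes))  # → "left": 1 object beats 2, even though a bus is bigger
```

Which is exactly the problem: counting objects says "left", even though a single bus may be a far worse thing to hit than two cars.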
FAA and EASA have been getting harder to work with, I am hearing? I am currently working on an SAE committee that is developing the safety standard for certification of AI systems for aircraft. That standard is due out in late 2023.
Road vehicles are supposed to meet the ISO 26262 safety standard, but it does not currently address AI, and there are no independent auditors for that standard the way the FAA or EASA audit in the aerospace domain. NHTSA only gets involved after accidents or incidents occur.
Ya, I cannot believe what they have done with the 777X: a major flight control redo and recertification, years of delays and billions of dollars. This is likely an overcorrection due to the MAX problems. The FAA has gotten much tougher since the 737 MAX. One of my company's customers, Gulfstream, just got affected by this.
In the wake of the Boeing 737 MAX crashes, the FAA is making its aircraft certification process more rigorous, a move that could push the Gulfstream G700 timeline back.www.globalair.com