I have a different perspective on the situation.
Mobileye started as a graduate project in image recognition. It transformed into a company with a product that recognized road signs. They added the ability to recognize lane markers, including upcoming curves, along with a limited static recognition of some objects (vehicles, cycles, and pedestrians).
Mobileye's (and Tesla's) current systems do more, but Tesla started out years ago with a very basic product. At low speed, the Mobileye camera system output essentially a list of recognized road signs, the distance to the next lane curve, how sharp that curve was, and whether there was a recognized object in the path. Tesla took that output, combined it with other sensors, and built what is best described as a lane-keeping system for use only on already-scanned sections of highway.
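To make the shape of that output concrete, a hypothetical per-frame report from such a camera system might look like the sketch below. All the field and class names here are illustrative guesses based on the description above, not Mobileye's actual interface:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class DetectedObject:
    kind: str        # one of the recognized classes: "vehicle", "cycle", "pedestrian"
    in_path: bool    # whether the object lies in the vehicle's own lane

@dataclass
class CameraFrameReport:
    road_signs: list                        # recognized signs, e.g. ["speed_limit_65"]
    next_curve_distance_m: Optional[float]  # distance to the upcoming lane curve
    next_curve_radius_m: Optional[float]    # how sharp that curve is (smaller = sharper)
    objects: list = field(default_factory=list)

# A downstream lane-keeping layer would fuse a report like this with
# other sensors (radar, GPS against pre-scanned highway maps, etc.):
report = CameraFrameReport(
    road_signs=["speed_limit_65"],
    next_curve_distance_m=120.0,
    next_curve_radius_m=800.0,
    objects=[DetectedObject(kind="vehicle", in_path=True)],
)

# The key safety signal: is anything recognized in the path ahead?
obstruction = any(o.in_path for o in report.objects)
print(obstruction)
```

The point of the sketch is how thin this interface is: if the recognizer fails to classify something (say, a plain white trailer broadside across the lane), the `objects` list is simply empty, and the downstream system sees no obstruction at all.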
If you look at the documented Tesla accidents, you can readily see the limitations of the older approach. Tesla and Mobileye were already separating before the Florida accident, but that accident made it clear that the Mobileye approach was flawed. Their static image system lost track of the truck crossing the car's path. It presumably "saw" the truck as it began crossing, but once the plain white trailer spanned the highway, it reported only unobstructed lane markers out to a featureless horizon.
(There are additional faults, such as relying on radar that scanned under the trailer, but those are beyond the scope here.)