The idea of a car that will drive you to and from work while you take a nap, read a book, or catch up on a show you missed the night before is certainly on its way to reality. Autonomous driving technology sits at the top of nearly every automotive company's agenda, and teams of experts have been hired to make these systems work for us. So far no fully autonomous vehicle is on the market, but Tesla lays claim to being extremely close to perfecting such a system.
Tesla has already added an Autopilot system to its Model S and Model X vehicles, allowing owners to take their hands off the wheel for short periods while the vehicle does the driving. Autopilot only works on well-marked highways, and the vehicle is supposed to alert the driver when it's time to take back the controls. Recently we learned of two accidents in Tesla vehicles that occurred while the automated systems were engaged and the drivers were never alerted to take over; thankfully, in both cases, no one was hurt.
The first question is whether the Autopilot system was to blame. In one case it seems the Summon feature was engaged. Summon is meant to let the car drive to your location without you in the vehicle at all; so far it is only supposed to be used to bring the car out of the garage and into the driveway so you can get in before heading out for the day. The feature is not supposed to engage during normal driving, which leaves me wondering whether a software fault caused Summon to activate at the wrong time.
While one Google scientist calls the Autopilot system irresponsible, Elon Musk stands by the point that Tesla continually educates its customers that Autopilot is not an autonomous system. Musk reminds drivers that they need to stay alert and be prepared to take over the controls at any time. Still, the Google expert seems to have the scenario right: the system works many times over, lulls you into a false sense of security, and then fails to alert you to take the controls before you end up in a rear-end collision.
There have been only two accidents that we know of so far, but if the system can fail in this way, how can we trust it at all? There is an obvious generational gap in trust of autonomous systems, with younger drivers feeling confident and older ones leery of the technology, yet this is not something you can chalk up to human error. The whole point of a computerized, electronic system is to remove the possibility of failure, and in both of these cases it failed.
Maybe Tesla rushed Autopilot to its vehicles just to be first to market. Maybe there is a glitch in the software that engages the Summon feature, which would then fail to sense a driver behind the wheel. Whatever the issue is, Tesla needs to fix it, and we should all be a bit concerned that any form of autonomous driving is being offered already.