

If you have not yet heard the news about Google’s autonomous car being involved in an accident with a bus in California, here is what happened. According to multiple news reports, the autonomous SUV made a few incorrect calculations and collided with a city bus operated by a human driver. At that moment, it became abundantly clear why self-driving vehicles are nowhere near ready to be allowed on public roads – even in limited numbers.

The accident offers a very clear illustration of what happens when bus driver training and artificial intelligence produce different, competing outcomes. In this particular case, the Google SUV was travelling in the right lane at roughly 2 mph alongside a city bus travelling in the left lane at approximately 15 mph. The operator of the SUV noticed sandbags in the street that the car would have to steer around. Herein lies the first problem.

California law requires a human driver to be capable of taking control of an autonomous vehicle in order to prevent an accident. In this case, the operator assumed the bus driver would yield so that the Google SUV could get around the sandbags safely. That’s not what happened. The bus driver maintained his lane only to have his vehicle struck by the Google SUV moving left to avoid the sandbags.

First Accident of Its Kind

Although blame has not yet been officially assigned for the accident, all indications are that the fault lies entirely with the Google SUV. According to standard bus driver training, a city bus operator would not take the risk of causing a much larger accident by suddenly yielding in traffic in a non-emergency situation. It was entirely reasonable for the bus driver to assume that a vehicle travelling so slowly on his right side would be the one to yield for an obstacle. The driver did the right thing; the autonomous vehicle did not.

Assuming that the fault is eventually determined to rest with the Google car, it would be the first time an autonomous vehicle caused an accident on public roads. It is our opinion that it will not be the last time, either. As advanced as artificial intelligence technology is right now, it is still not capable of predicting what other vehicles will do. As such, autonomy is simply not safe.

Vehicles Need Drivers, Drivers Need Training

We are not at all worried about the threat of autonomous vehicles to the haulage and transport industries. From where we stand, the use of autonomous lorries and buses on a large scale is still decades away – if that day is ever reached at all. So for now, we will continue offering world-class lorry and bus driver training at more than four dozen facilities throughout the UK.

Vehicles need drivers and drivers need training. If you intend to drive lorries or buses for a living, we urge you to contact us to learn more about our training courses.


Source: Daily Mail – https://www.dailymail.co.uk/news/article-3469952/Self-driving-Google-SUV-crashes-bus-road-test-car-accident-kind.html

