RoadSafety

The legal issue of consent in autonomous driving

With autonomous and semi-autonomous systems gaining traction in today's automobile landscape, the issue of legal liability is becoming more relevant.

Human Factors research has shown time and again that driving assistance technology -- including more "archaic" systems like Adaptive Cruise Control and Lane Keeping Assistance -- is far from error-proof. Recent studies have demonstrated that a limited understanding (or mental model) of how these systems operate can in fact lead to system misuse.

A recent study published in Humanities and Social Sciences Communications tackles the issue of driver overtrust and system misuse from a legal viewpoint.

Every time we register for a new social media account, or install a new smartphone app, the always-present consent message pops up: BY REGISTERING FOR THIS SERVICE YOU ACCEPT ALL TERMS AND CONDITIONS.

Typically, very few people ever bother to skim over this information, let alone read it in its entirety. However, the issue of consent and its implications for liability will become more relevant as we entrust autonomous systems with our safety and the safety of all vehicle passengers.

The authors of the study suggest that automakers may use the already-existing in-vehicle digital interfaces as a way to obtain consent from the driver (and possibly all passengers). However, this approach is far from ideal, or even safe.

It is argued that using the car's touchscreen may not provide nearly enough information to the driver. The authors also suggest that "individuals may misunderstand the nature of the notices which grant permissions".

"Warning fatigue" and distracted driving are also causes of concern.

All in all, given the sizeable limitations of using digital interfaces to obtain consent, it is suggested that this won't shield automakers from legal liability should the system malfunction or an accident occur.

Similar to what I described in a recent article, training is seen as a potential aid in ensuring that drivers fully understand system capabilities and limitations.

Whatever the solution may be, this is yet another challenge that all autonomous vehicle stakeholders (including automakers and transportation agencies) need to address if they want to take a proactive (rather than reactive) stance on the issue.

Reference

Pattinson, J. A., Chen, H., & Basu, S. (2020). Legal issues in automated vehicles: critically considering the potential role of consent and interactive digital interfaces. Humanities and Social Sciences Communications, 7(1), 1–10. https://doi.org/10.1057/s41599-020-00644-2

A user's guide to self-driving cars

This article was originally published by the author on The Conversation, an independent and nonprofit source of news, analysis and commentary from academic experts.

You may remember the cute Google self-driving car. In 2014, the tech giant announced its brand-new prototype of what the future of transportation might one day look like. If you wish you could drive one today, you are out of luck: the design was unfortunately scrapped in 2017. But don’t worry, what happened didn’t make a dent in the plan to introduce the world to self-driving cars, or should I say autonomous cars, driverless cars, automated vehicles or robot cars?

Today’s cars offer a vast selection of driving aids. Relatively few models, however, come with advanced features like self- or assisted-parking technology and systems capable of taking over steering and acceleration in different driving situations. A recent report shows that despite an optimistic surge in the market penetration of these systems, the general public is still on the fence when it comes to fully relying on them.

Systems of classification

In 2016, Mercedes-Benz released an ad for its new 2017 E-Class car. Rather than the E-Class itself, the ad focused on the company’s futuristic self-driving F 015 concept car driving around with the front- and back-row passengers facing each other and using futuristic Minority Report-like displays. The ad came under attack by road safety advocates because it overstated “the capability of automated-driving functions available” on the E-Class. You may even spot the fine print: “Vehicle cannot drive itself, but has automated driving features.”

A similar controversy put Tesla at the centre of the debate in 2016, when it announced it would release self-driving capabilities to its vehicles over the air. Similar to what happened with Mercedes-Benz, the company was criticized for misleading advertising and “overstating the autonomy of its vehicles.”

Labelling expectations

When I buy a dishwasher, what I want is a machine that automates the manual task of washing dishes. All I need to do is push a button and the machine will do its thing with no additional command or intervention. Now, believe it or not, a similar logic applies to automated driving systems. If I am told, shown, or given even a hint that the car might in fact drive itself, what do you expect I, as a human, will do?

Leaving aside related technical or ethical issues, from the perspective of someone who teaches and researches cognitive ergonomics and human factors, I can tell you that providing inaccurate, or even wrongful, information on how automation works has direct safety consequences. These include using machines in unintended ways, reducing the level of monitoring or attention paid to their functions and fully ignoring possible warnings. Some of these safety consequences were touched upon in the official investigation report following the first fatality involving a car with an automated driving system.

Informing consumers

What, you may wonder, are today’s drivers left to do?

A few things: First, before you drive a car equipped with autonomous or self-driving features, you might want to find out more about its actual capabilities and limitations. You can ask your dealership or do some good old online research. A valuable resource for consumers is MyCarDoesWhat.org. This website, with helpful videos and links to manufacturers’ websites and user guides, presents the dos and don’ts of automated driving systems.

Finally, before using your car’s automated driving features in real traffic, you may want to familiarize yourself with how they work, how to engage them and so on. Do all of this while stationary, perhaps when parked in your driveway.

I know it may sound like a lot of work (and sometimes it may not even be sufficient), but as research and accident reconstructions have shown many times over, when you are at the wheel, the safest thing to do is to keep your mind and eyes on the road, instead of thinking about how a self-driving car might make your commute much simpler and much more enjoyable.

How to make vehicle tech less distracting

In a recent entry, I talked about the role of training for automated vehicle aids.

In a study published in 2017 in collaboration with the AAA Foundation for Traffic Safety and the University of Utah, I investigated driver interaction with in-vehicle infotainment systems, that is, the systems that allow drivers to make phone calls or send text messages, for example, without using their mobile devices.

One of the most striking findings from that study was that, although technologies like touchscreens and voice interaction systems have been around for many years, they are still challenging to use, at least for some groups of drivers.

Issues that we found with these touchscreens included relatively low responsiveness, cluttered menu designs, and long interaction times. In certain cases, for example, primary functions were buried deep in menus or the design of the menu made frequently-used features almost invisible to the driver.

For voice technology, certain systems were overly verbose and, as a result, imposed a large memory load and required long interaction times.

One possible solution to this problem is using off-the-shelf systems like Android Auto and Apple CarPlay, which, in later research, were shown to burden drivers’ attentional resources to a lesser degree.

Another possible solution is to encourage drivers to familiarize themselves with this technology when the vehicle is stationary, which may help them find and utilize frequently-used functions more quickly and efficiently.

References

https://aaafoundation.org/visual-cognitive-demands-using-vehicle-information-systems/

https://aaafoundation.org/visual-cognitive-demands-apples-carplay-googles-android-auto-oem-infotainment-systems/

How to reduce distraction

As we all know, driver distraction is among the top causes of road collisions. It is in fact estimated that one in every four crashes involves phone use. While reducing the use of personal or vehicle technology is needed and feasible in many cases, there are, however, some exceptions to this rule.

Emergency vehicle operators, like ambulance or police car drivers, are in fact exempt from many restrictions under the Highway Traffic Act. There are also professions, like commercial driving, where the use of portable dispatch devices is part of the job description.

This brings us to the question: how can we reduce distraction in these workplaces?

One way could be to provide better training. Although there is virtually no evidence that distraction can be fully trained away, cognitive research shows that extensive practice can reduce the attentional component of completing simple experimental tasks. Hypothetically, training programs could be developed that reduce the cognitive component of certain driving tasks.

A second possibility would be to design down the distracting effect of using communication technology. What I mean by this is designing technology that adopts modalities requiring lower cognitive, manual, or visual demand. In a recent study, we found that certain off-the-shelf infotainment systems were in fact "better" than the vehicles' native technology.

These are possible avenues that should be explored when attempting to tackle the disruptive effect of distraction on road safety.

References

https://aaafoundation.org/wp-content/uploads/2018/06/AAA-Phase-6-CarPlay-Android-Auto-FINAL.pdf

The Danger of Vehicle Automation

Incorrect or incomplete understanding of vehicle automation is detrimental to safety. Evidence shows that drivers with limited or flawed mental models are in fact more at risk of misusing vehicle automation and, in turn, of being involved in a road collision.

Watch Dr. Biondi’s talk to find out about the Human Factors issues of misusing vehicle automation.

Yet another case of vehicle automation misuse!

Unfortunately, this will not sound like news to many (me included). Yet another Tesla driver was caught napping behind the wheel of a semi-autonomous vehicle in Edmonton, Alberta.

How come this isn’t news, you ask? Well, there are now countless examples of erratic, unsafe drivers blatantly misusing (and abusing) vehicle automation.

The National Transportation Safety Board’s investigations following a handful of fatal and nonfatal collisions involving Tesla Autopilot reported that driver inattention and over-reliance on the system, coupled with the system’s operational design, contributed to these collisions.

Also, this summer, German regulators ruled that the Autopilot name misleads motorists into believing that Autopilot is in fact an autopilot, which it is not.

Efforts from the American Automobile Association and others have recently contributed to the development of a naming convention for semi-autonomous systems that aims to help consumers make educated decisions when purchasing a vehicle and reduce the likelihood of misusing its systems.

Much has been done thus far to promote the safe adoption of these systems. My research and others’ has contributed to a better understanding of how Human Factors affect drivers’ adoption of autonomous and semi-autonomous systems. Transportation agencies and road safety stakeholders, too, are pushing for safe regulations. But much more needs to be done.

References

Biondi et al. (2018). 80 MPH and out-of-the-loop: Effects of real-world semi-automated driving on driver workload and arousal. https://doi.org/10.1177/1541931218621427

CBC (2020). Speeding Tesla driver caught napping behind the wheel on Alberta highway. https://www.cbc.ca/news/canada/edmonton/tesla-driver-napping-alberta-speeding-1.5727828

The Danger of ADAS webinar series

On September 30th, 2020, I will be the guest speaker in iNAGO’s Intelligent Assistant webinar series on the topic of The Danger of ADAS.

The National Highway Traffic Safety Administration estimates that 94% of serious crashes are due to human error (NHTSA, n.d.). While advanced driver assistance systems are designed to minimize the impact of human error on safety, recent evidence suggests that a lacking understanding of these systems, and the over-trust that results from it, may contribute to drivers misusing ADAS and engaging in potentially dangerous behaviors (NTSB, 2020).

The webinar will cover:

  • Understanding ADAS and its role in driver safety

  • How connected vehicles can be safer by making drivers more knowledgeable

  • Demonstration of a conversational assistant-driven car feature information system

  • User Study results on the use of in-car knowledge assistants by Human Systems Lab

  • Live Q&A with Dr. Biondi and Ron DiCarlantonio

Reserve your virtual seat HERE

Distracted driving uptick since the COVID-19 lockdown

A recent study published by ZenDrive shows an uptick in distracted driving and speeding since the beginning of the COVID-19 lockdown in March.

While this is not surprising per se, there may be two important factors behind it.

First, with possibly fewer cars on the road, some motorists may feel they can take more risks and, perhaps convinced of a lower police presence, believe they are less at risk of being caught.

The second and frankly more disturbing contributor is remote working. As suggested in the ZenDrive report, the ‘mass migration’ to remote working and virtual conferencing has made us even more dependent on communication technology. This, possibly combined with the difficulty of distinguishing between work and leisure time while working remotely, may have made motorists more inclined to attend work meetings while driving.

Altogether, this evidence suggests that distracted driving may have gotten worse since the beginning of the COVID-19 lockdown in March.