Connected Vehicles & Automatic Decision Making – Transportation

Connected cars are the next step in bringing mobility to everyone. Automakers claim that more than half a dozen car models sold in 2021 are nearly self-driving. The Infrastructure Investment and Jobs Act of 2021 requires new cars built after 2026 to implement passive driver performance monitoring systems to detect drunk driving. H.R. 3684, § 24220. Some of the technologies on offer monitor drivers for signs of impaired driving through cameras built into the car's interior.

Eliminating drunk drivers from the road is good policy. However, monitoring drivers and allowing cars to ban driving raises privacy and autonomy issues. Should chronic speeding-ticket offenders have their driving automatically monitored for traffic safety purposes? As automated decisions are made from data collected from connected cars, this policy begins to look like the outline of a social credit system. This article focuses on the conflict the automotive industry will face between government policies requiring automated decision-making and expanding data privacy laws that protect consumers from certain data collection and from being subjected to automated decision-making.

Social Credit Scoring System

The Social Credit Scoring System links public and private data on the financial and social behavior of individuals and entities in order to track and assess their reliability. China originally developed the concept in the mid-2000s to combat financial fraud and non-compliance with civil court judgments. China first rolled out the system in 2014, assessing individuals for blacklisting or whitelisting based on their social credit scores.

The social credit score system is built on two parts: data collection, and rewards and punishments based on the resulting score.

Data is collected through a multitude of monitoring systems. Although the exact evaluation methods are kept secret, the collected data includes credit information, shopping behavior, criminal history, compliance with court or administrative orders, traffic violations, online behavior, and actions in public. Any information or behavior considered negative lowers an individual's score, while behavior considered positive raises it.

The social credit score system then applies punishments and rewards based on the individual's score. Chinese authorities have used social credit to bar individuals from buying flights, reserving seats on high-speed express trains, or staying in luxury hotels. It has also affected people's daily lives: for the alleged bad behavior of playing too many video games, authorities have throttled the internet speed of households with chronic gamers.

Punishment can also take more severe forms, such as denial of access to a university or a job. Conversely, individuals with high social credit scores may be offered benefits. For example, the highest-rated individuals receive discounts on energy bills, can book hotels without a deposit, or can "boost" their profiles on a Chinese dating site. In other words, the social credit score system uses a "carrot and stick" approach to induce desired behavior from individuals.
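To make the "data collection, then reward or punishment" structure described above concrete, the following is a minimal, purely illustrative sketch in Python. The event weights and score thresholds are hypothetical, since the actual criteria used by Chinese authorities are not publicly documented.

```python
# Purely illustrative sketch of a "carrot and stick" scoring pipeline.
# The event weights and thresholds below are hypothetical assumptions;
# the real scoring criteria are not publicly documented.

EVENT_WEIGHTS = {
    "traffic_violation": -50,      # negative behavior lowers the score
    "missed_court_order": -200,
    "volunteer_work": +30,         # positive behavior raises the score
    "on_time_bill_payment": +10,
}

BLACKLIST_THRESHOLD = 600   # below this: travel bans, throttled services
WHITELIST_THRESHOLD = 900   # above this: discounts, waived deposits


def update_score(score: int, events: list[str]) -> int:
    """Apply collected behavioral events to an individual's score."""
    for event in events:
        score += EVENT_WEIGHTS.get(event, 0)
    return score


def outcome(score: int) -> str:
    """Map the consolidated score to a reward or punishment tier."""
    if score < BLACKLIST_THRESHOLD:
        return "blacklist: deny flights, high-speed rail, luxury hotels"
    if score > WHITELIST_THRESHOLD:
        return "whitelist: energy-bill discounts, deposit-free bookings"
    return "neutral: no action"


if __name__ == "__main__":
    score = update_score(800, ["traffic_violation", "volunteer_work"])
    print(score, "->", outcome(score))
```

The point of the sketch is only that consolidated behavioral data feeds a single score, and the score alone triggers automated consequences.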

Data Collection and Implications for Decision-Making

Social credit score-type systems may seem wacky in the United States. But if connected vehicles can collect driver behavioral data for impairment purposes under the Infrastructure Act, this opens the door to rules and regulations that could penalize drivers for their consolidated misbehavior, much as social credit scores do. Data collected from a connected vehicle may show that the driver regularly exceeds the speed limit, and the car may decide to reduce its speed for safety reasons. This may seem like a reasonable algorithmic decision. However, when allowing algorithmic decision-making in cars, the automotive industry needs to be aware of two issues (illustrated in the sketch after this list):

  • Does the car (or car manufacturer) have a legitimate reason to collect and use car speed information, such as driver consent?

  • Should algorithmic decision-making be implemented to favor or punish driver behavior?
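A hypothetical sketch of the speed-limiting decision described above, framed around these two issues, might look like the following. The thresholds, field names, and consent check are assumptions made for illustration only, not an actual automaker implementation.

```python
# Hypothetical sketch of an algorithmic speed-limiting decision.
# Thresholds, field names, and the consent check are illustrative
# assumptions, not taken from any real in-vehicle system.

from dataclasses import dataclass


@dataclass
class DriverRecord:
    consented_to_speed_monitoring: bool  # issue 1: is there a legitimate basis?
    speeding_events_last_90_days: int    # consolidated behavioral data


def decide_speed_cap(record: DriverRecord, posted_limit_mph: float) -> float | None:
    """Return a speed cap in mph, or None if the car should not intervene."""
    # Issue 1: without a lawful basis such as consent, the data is not used.
    if not record.consented_to_speed_monitoring:
        return None
    # Issue 2: an automated penalty triggered by accumulated behavior --
    # the step at which the system starts to resemble a social credit score.
    if record.speeding_events_last_90_days >= 5:
        return posted_limit_mph  # cap the car at the posted limit
    return None


if __name__ == "__main__":
    chronic = DriverRecord(consented_to_speed_monitoring=True,
                           speeding_events_last_90_days=7)
    print(decide_speed_cap(chronic, posted_limit_mph=65))  # -> 65
```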

For connected vehicles to monitor drivers, some form of technology will collect data while a driver is inside the car. The car's interior is considered a place with a reasonable expectation of privacy, even under the Fourth Amendment to the US Constitution. New York v. Class, 475 U.S. 106, 114–15 (1986). And studies show that drivers and passengers are not comfortable with car systems monitoring them while they are in the car.

With the development of privacy laws globally, if the car's interior is monitored, drivers and passengers will have to be given notice of the data collection. For example, an interior video recording will likely capture racial or ethnic information because it records the likeness of the driver and passengers. If the vehicle monitors compliance with the road's speed limit, precise geolocation is also likely to be collected to determine the posted limit. These categories of information are considered sensitive information under the California Privacy Rights Act (CPRA) and the EU General Data Protection Regulation (GDPR). See Cal. Civ. Code § 1798.140(ae)(1)(A)–(F); see also 2016 OJ L 119/1, art. 9(1).

The CPRA gives California consumers the right to limit the use of sensitive personal information to what is necessary to provide the goods or services reasonably expected by an average consumer. Cal. Civ. Code § 1798.121(a). Similarly, the GDPR requires explicit consent to process sensitive personal information. 2016 OJ L 119/1, art. 9(2). Drivers in the United States may therefore opt out of such collection, and drivers may decline to consent to data collection while driving.

Car manufacturers may attempt to justify the collection, use, and processing of sensitive personal data on the basis of a legal obligation to comply with traffic laws. However, it could be argued that collecting sensitive information as a condition of driving is unnecessary and not something consumers reasonably expect, because drivers do not have to provide this data in order to drive today.

In addition, the collected data can be misinterpreted by the algorithm. Suppose the vehicle monitors driver behavior to identify drunk driving and stops the vehicle after recognizing drunk behavior. If it is later determined that the driver has diabetes and that low blood sugar made the driver appear drunk, the data collected may suddenly become medical information, and the purpose of the collection may be challenged by the driver.

One solution to avoid misinterpreting collected data may be to combine data collection systems. Using the Infrastructure Act as an example, the car could be equipped with both a breathalyzer and a video monitoring system to ensure that the data collected relates to drunk driving. This could reduce the risk that the driver's behavior is caused by something other than alcohol. However, the facts can get even more complicated if a passive breathalyzer picks up the alcohol content of a passenger instead of the driver. New technologies and ideas can provide a safe harbor to justify the collection and use of sensitive data, but these new methods still do not resolve whether consumers will be comfortable with and accepting of the monitoring of their driving behavior inside the car, especially when that monitoring could limit the driver's ability to drive freely.
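The corroboration idea above can be sketched in a few lines. This is illustrative only: the signal names and thresholds are assumptions, not taken from any actual in-vehicle system, though 0.08 g/dL is the common US per se blood-alcohol limit.

```python
# Illustrative sketch of combining two monitoring signals before the
# car restricts driving. Signal names and the combination rule are
# assumptions for illustration, not an actual in-vehicle system.


def should_prevent_driving(breath_alcohol_g_dl: float,
                           camera_flags_impairment: bool) -> bool:
    """Require both sensors to agree before disabling the vehicle.

    Corroboration reduces the risk of acting on a single misread signal,
    e.g. a diabetic episode that merely looks like intoxication, though it
    cannot rule out a passive sensor picking up a passenger's alcohol.
    """
    over_legal_limit = breath_alcohol_g_dl >= 0.08  # common US per se limit
    return over_legal_limit and camera_flags_impairment


if __name__ == "__main__":
    # Camera flags erratic behavior, but the breath reading is clean:
    print(should_prevent_driving(0.00, True))   # False -> do not intervene
    # Both signals indicate impairment:
    print(should_prevent_driving(0.10, True))   # True  -> intervene
```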

Reproduced with permission. Published March 2022. Copyright © 2022 The Bureau of National Affairs, Inc.

The content of this article is intended to provide a general guide on the subject. Specialist advice should be sought regarding your particular situation.