The Electronic Frontier Foundation (EFF) has joined 30 civil society and academic organizations in a letter to UK authorities raising concerns about the Data Use and Access Bill (DUA Bill). The letter is addressed to Home Secretary Yvette Cooper and Secretary of State for Science, Innovation and Technology Peter Kyle.
The dispute centres on Clause 80 of the DUA Bill, which critics say weakens existing protections against automated decision-making in law enforcement. Under the Data Protection Act 2018, fully automated decisions are prohibited unless explicitly required or authorized by law. The new clause reverses that presumption, allowing authorities to base automated decisions on a wide range of factors, from socioeconomic status to characteristics such as regional accent or even inferred emotions.
Opponents warn that this approach increases the risk of bias, discrimination, and opacity. The government's own Impact Assessment for the DUA Bill acknowledges that people with protected characteristics such as race, gender, and age are more likely to face discrimination as a result of automated decisions. Despite these warnings, UK lawmakers are pressing ahead with legislation that many consider potentially harmful.
The groups also stress that the lack of transparency around automated decision-making could leave affected individuals without effective means of redress, exposing them to unfair and opaque decisions. In light of this, the signatories urge Yvette Cooper and Peter Kyle to address the absence of safeguards on law enforcement's use of automated decision-making tools before the legislation's effects become irreversible.