Digital technology exacerbates racial discrimination in the United States

Author: Guangming Daily    Time: 2022.06.20

【Mingdi】

Digital technologies such as the Internet, big data, and artificial intelligence, while enhancing human well-being, can also have a negative impact on human rights, especially for groups already disadvantaged by race, gender, or age. The United States possesses the world's most advanced digital technologies, yet these technologies have not only failed to create opportunities to address its racial problems, but have instead replicated, reinforced, and even exacerbated its systemic and structural racial discrimination.

A digital divide and technological barriers drawn along racial lines

Although the United States occupies a dominant position in the global digital economy, ethnic minorities have had strikingly little opportunity to benefit from emerging digital technologies. According to data from the Federal Communications Commission, in 2016 about 41% of Native Americans living on tribal lands could not access Internet services adequate for activities such as video conferencing. A 2019 Pew Research Center survey shows clear racial disparities in computer and high-speed Internet access in the United States: about 82% of whites reported having a desktop or laptop computer, compared with only 58% of African Americans and 57% of Hispanics. Racial differences in broadband use are equally stark, with the share of white households with broadband connections 13 to 18 percentage points higher than that of African American and Hispanic households.

Widely used facial recognition technology itself also embeds factors of racial discrimination. Many algorithms that successfully identify white faces fail to correctly identify faces of people of color. In 2019, the National Institute of Standards and Technology released a report on the performance of 189 facial recognition algorithms, submitted by 99 developers worldwide, in identifying faces from different demographic groups. The tests showed that the algorithms were far less likely to accurately identify African American or Asian faces than white faces, and that when searching a database for a given face, error rates for African American women were significantly higher than for other populations.
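The kind of per-group comparison behind such findings can be illustrated with a minimal sketch. This is my own toy construction, not NIST's evaluation code; the dataset, field names, and numbers are all invented for illustration.

```python
# Illustrative sketch (not NIST's actual methodology): comparing the
# identification error rate of a face recognition system across
# demographic groups, given hypothetical labeled results.
from collections import defaultdict

def error_rate_by_group(records):
    """records: iterable of dicts with 'group', 'true_id', 'predicted_id'."""
    errors = defaultdict(int)
    totals = defaultdict(int)
    for r in records:
        totals[r["group"]] += 1
        if r["predicted_id"] != r["true_id"]:
            errors[r["group"]] += 1
    return {g: errors[g] / totals[g] for g in totals}

# Invented sample: one algorithm, very different error rates by group.
sample = [
    {"group": "white", "true_id": 1, "predicted_id": 1},
    {"group": "white", "true_id": 2, "predicted_id": 2},
    {"group": "black_female", "true_id": 3, "predicted_id": 9},
    {"group": "black_female", "true_id": 4, "predicted_id": 4},
]
print(error_rate_by_group(sample))  # {'white': 0.0, 'black_female': 0.5}
```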

Social media spreads hate speech and racism

Emerging digital technologies provide channels for spreading racism, hate speech, and incitement to discriminatory violence rapidly and at scale, and social media platforms play a key role in this. Since 2014, the number of hate groups in the United States has grown by 30%, including a 7% increase in 2018 alone. According to 2019 statistics from the Southern Poverty Law Center, there are 1,020 hate groups in the United States. A monitoring report by the Anti-Defamation League recorded 1,187 incidents of white supremacist propaganda in the United States in 2018, an increase of 182% over the 421 incidents recorded in 2017. Hate groups have carried out terrorist attacks and killings against ethnic minorities around the world in the name of "racial purity" and racial superiority, including the 2018 anti-Semitic attack in Pittsburgh. In 2018, neo-Nazi and other white supremacist extremist groups killed at least 40 people in Canada and the United States. These racial hate groups often use social media platforms to find like-minded individuals, support one another, and spread their extremist ideas. In addition, hate groups have increasingly infiltrated the "gaming" world: video games and game-related forums, chat rooms, and live-streaming sites have become major new venues for neo-Nazi recruitment.

Algorithmic systems replicate and reinforce racial prejudice

First, algorithmic systems can exacerbate racial discrimination in the right to work. A United Nations Special Rapporteur pointed out in a 2020 report that some algorithms used for recruitment in the United States have been criticized as discriminatory. Such systems rank candidates against a database of existing "successful" employees, a database that encodes gender, ethnic, or religious information. The decisions these systems make therefore reflect existing inequalities in employment, replicating and reinforcing prejudice based on race and gender.
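A minimal sketch can make this mechanism concrete. The scorer, the attribute names, and the "history" below are all hypothetical; the point is only that a model anchored to past "successful" hires inherits whatever demographic skew that history contains.

```python
# Hypothetical illustration of the mechanism described above: ranking
# candidates by similarity to past "successful" hires reproduces the
# skew of the hiring history. All data here is invented.
past_successful_hires = [
    {"gender": "male", "school": "A"},
    {"gender": "male", "school": "A"},
    {"gender": "male", "school": "B"},
    {"gender": "female", "school": "A"},
]

def similarity_score(candidate, history):
    # Score = fraction of past hires sharing each attribute, averaged.
    score = 0.0
    for attr in candidate:
        matches = sum(1 for h in history if h.get(attr) == candidate[attr])
        score += matches / len(history)
    return score / len(candidate)

# Two otherwise identical candidates; the skewed history ranks them apart.
print(similarity_score({"gender": "male", "school": "A"}, past_successful_hires))    # 0.75
print(similarity_score({"gender": "female", "school": "A"}, past_successful_hires))  # 0.50
```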

Second, emerging digital technologies have also had a discriminatory impact on the health rights of ethnic minorities. A study published in the journal Science found that a commercial algorithm used by the U.S. health care system to guide health decisions for more than 200 million people systematically discriminated against African American patients. Because the input data contained no "race" field, its developers believed the algorithm was "race-blind"; yet for African American patients just as ill as white patients, the algorithm consistently assigned lower risk scores, failing to identify nearly half of the African American patients who were as likely as white patients to develop complex medical needs, and thereby excluding them from enhanced health intervention programs.
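The underlying pattern is proxy bias: a formally "race-blind" input can still carry racial information. The sketch below is my own toy illustration under the assumption, consistent with the study's reported findings, that the score was driven by past spending; the patients and numbers are invented.

```python
# Hypothetical illustration of proxy bias: a risk score that predicts
# *cost* as a stand-in for *health need* under-scores a group that
# historically received less care (and so generated lower costs) at the
# same level of illness. All figures are invented.
patients = [
    # (group, true_illness_severity, historical_cost_usd)
    ("white", 8, 8000),
    ("white", 5, 5000),
    ("black", 8, 5000),   # equally ill, but lower past spending
    ("black", 5, 3000),
]

def cost_based_risk_score(cost, max_cost=10000):
    return cost / max_cost  # contains no "race" field, yet is not neutral

for group, severity, cost in patients:
    print(group, "severity:", severity, "risk score:", cost_based_risk_score(cost))
# Equally severe patients receive different scores because the proxy
# (cost) encodes unequal past access to care.
```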

Finally, racial discrimination in targeted advertising violates the housing rights of ethnic minorities. A social media site controlling some 22% of the U.S. digital advertising market long allowed advertisers to use its ad-targeting tools to exclude users in certain "demographic" categories and to "narrow the audience" by excluding users with particular "affinities". Such targeted advertising can be used to prevent African Americans, Asians, or Hispanics from ever seeing particular housing advertisements. These "whites only" advertisements recall the Jim Crow era of racial segregation, when newspapers offered advertisers the option of showing ads to white readers only.

Predictive policing technology leads to severe racial discrimination

On August 31, 2016, a coalition of 17 organizations issued a statement on the use of predictive policing tools by U.S. law enforcement agencies, pointing out that the technology lacks transparency, carries racial bias, and has other deep flaws that lead to injustice. Predictive policing refers to drawing on data about individuals, such as their age, gender, marital status, history of substance abuse, and criminal record, to predict their likelihood of involvement in criminal activity. The Los Angeles Police Department was the first to use algorithmic tools developed by a predictive policing technology company, and the New York and Chicago police departments followed, creating "heat lists" of "strategic subjects" based on demographic data, gun-crime records, and social media details. The way these predictive policing tools work lacks transparency, and police departments are generally reluctant to disclose how the algorithms operate, making them "black boxes" that no outside party can audit or evaluate. By 2019, the system had assigned "high risk" scores to more than 400,000 people and was regarded as a primary means of preventing violent crime.

Predictive policing tools produce serious discrimination and erroneous crime predictions. According to 2020 data from the U.S. Department of Justice, African Americans were five times as likely as white people to be stopped by the police without just cause, and twice as likely to be arrested. U.S. predictive policing tools use race as a predictive factor and, by dispatching officers to the very places they have over-policed before, replicate and amplify police prejudice, subjecting non-white communities to excessive surveillance. Because the underlying datasets reflect existing racial bias, these technologies, although presumed "objective" and even expected to reduce the bias of the human judgment they replace, in practice exacerbate racial discrimination. Police departments most often deploy predictive technologies in poor, minority neighborhoods. Predictive policing has thus created "enormous structural bias": by embedding racially discriminatory algorithms, the prejudices of individual officers are replaced by a data-driven structural prejudice.
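The feedback loop described above can be shown in a few lines. This is a toy simulation of my own construction, not any vendor's algorithm; the districts, seed counts, and rates are invented, and the only assumption is that patrols follow past recorded crime and recorded crime follows patrols.

```python
# Toy simulation of the predictive-policing feedback loop: patrols go
# where past recorded crime is highest, more patrols record more
# incidents, and the data "confirms" the original skew even though the
# true crime rates are identical. All numbers are invented.
recorded_crime = {"district_A": 60, "district_B": 40}  # seeded by biased history
TRUE_RATE = {"district_A": 0.5, "district_B": 0.5}     # identical actual rates

def patrol_share(records):
    total = sum(records.values())
    return {d: c / total for d, c in records.items()}

for year in range(5):
    shares = patrol_share(recorded_crime)
    for district, share in shares.items():
        patrols = int(100 * share)                     # patrols follow records
        detected = int(patrols * TRUE_RATE[district])  # detections follow patrols
        recorded_crime[district] += detected
    print(year, patrol_share(recorded_crime))
# The initial 60/40 split persists year after year, despite both
# districts having the same underlying crime rate.
```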

Digital technology deepens racial discrimination in the criminal justice system

Emerging digital technologies perpetuate and replicate the structures of racial discrimination in criminal justice. From predictive policing to recidivism prediction, law enforcement agencies and courts increasingly rely on algorithmic tools, further entrenching long-standing racial discrimination, weakening the human rights of ethnic minorities overall, and reinforcing the structural oppression they suffer in society. Several U.S. states use artificial intelligence risk assessment tools at every step of criminal justice proceedings. Their developers hope these systems will deliver objective, data-based judicial outcomes, but the algorithms typically depend on data generated by flawed, racially biased, and even unlawful policies. Because these algorithms influence sentencing, they violate minority defendants' rights to a fair trial and to freedom from arbitrary arrest and detention. The risk assessment factors typically include prior arrest and conviction records, parents' criminal records, postal codes, and so-called "community disorder". Such factors reflect patterns of broad socioeconomic disadvantage produced by the over-policing of communities of color and by structural racism, rather than the conduct of the individuals assessed. In other words, the data may indicate the racial disadvantage and police deployment of a defendant's community, not the defendant's personal behavior. African American defendants are far more likely than white defendants to be rated as posing a higher risk of future violent crime, and 45% more likely to be predicted to commit a crime of any type in the future.

The same is true of recidivism prediction tools. In the U.S. system, a suspect typically receives a score after arrest that is meant to predict the likelihood of future offending. Risk assessment scores from the most widely used such system, COMPAS, are used at every stage of criminal justice proceedings. An investigation showed that the system carries clear racial bias: it mislabeled African American defendants as future criminals at almost twice the rate of white defendants, while white defendants were often mislabeled as low risk. Given the inherent bias of these scoring standards, it is not uncommon for criminal suspects to plead guilty even when they are innocent.
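"Mislabeled as future criminals" is, in audit terms, a false positive: a defendant flagged high risk who did not in fact reoffend. The sketch below shows the kind of per-group comparison such an investigation performs; the defendants and numbers are invented, not drawn from the real COMPAS dataset.

```python
# Hedged sketch of a disparity audit: compare false positive rates, i.e.
# the share of non-reoffenders who were labeled "high risk", by group.
# All rows are invented toy data.
defendants = [
    # (group, labeled_high_risk, reoffended)
    ("black", True, False), ("black", True, True), ("black", True, False),
    ("black", False, False),
    ("white", True, True), ("white", False, False),
    ("white", False, True), ("white", False, False),
]

def false_positive_rate(rows, group):
    non_reoffenders = [r for r in rows if r[0] == group and not r[2]]
    flagged = [r for r in non_reoffenders if r[1]]
    return len(flagged) / len(non_reoffenders)

for g in ("black", "white"):
    print(g, round(false_positive_rate(defendants, g), 2))
# black 0.67 vs white 0.0 in this toy data: one score cutoff can produce
# sharply different error rates across groups.
```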

Professor Jessie Daniels of the City University of New York has pointed out that the language of digital technology is American English; network technologies are saturated with American norms, conferring unwarranted economic and political power on the American technology industry and giving it enormous influence over multinational corporations of every kind worldwide, a power that typically reflects American imperialist and white supremacist ideology. Professor Ruha Benjamin of Princeton University argues in "Race After Technology: Abolitionist Tools for the New Jim Code" that the algorithmic black box binds supposedly race-neutral technology to laws and policies, becoming a powerful tool of white supremacy. The United States should face up to the systemic and structural defects of its own racial discrimination, actively respond to the governance challenges posed by digital technology, take swift and effective action to prevent and reduce the risk of racial discrimination in the design and use of emerging digital technologies, and strengthen algorithmic accountability.

(Author: Tang Yingxia, Deputy Director of the Human Rights Research Center, Nankai University)

- END -
