A.I. & Social Networks – Has an Inequality Process Begun?

Intro

Over the past several years, artificial intelligence and machine learning have radically transformed society. Social networks are lauded for creating connections and bringing people together. However, a study published in the journal Scientific Reports indicates that algorithms worsen existing inequalities and discriminate against specific groups of people. Sociologists have always acknowledged the existence of inequalities in every domain of society, and they equally recognize that the benefits and harms of technology are not evenly distributed. The most pertinent questions have been directed at the developers of new algorithmic technologies.

Artificial intelligence and algorithmic systems are criticized for perpetuating biases, unjust discrimination, and worsening inequality. Today's dominant forms of AI and machine learning algorithms are trained on datasets that reflect human judgment, priorities, and conceptual categories. When a dataset is biased, its inequalities are encoded and reproduced in the algorithms trained on it. A growing number of social issues have been linked to AI, prompting evaluation of how undesirable characteristics creep into AI systems and how they can be removed.

What Concerns Sociologists

Sociologists and other experts are concerned that issues of bias are deeply rooted in pre-existing social inequalities. Data about patterns in society serves as the input on which these systems are trained, and the resulting automated decisions (the output) reflect and perpetuate social inequalities. Further proliferation of algorithmic systems would create unequal consequences in education, employment, government benefits, and criminal law. The inequalities reproduced and reshaped through algorithmic technologies can play out on a global scale, in areas such as international labor and the flow of capital through colonial and extractive processes.

Even algorithmic systems built to be objective and free of bias discriminate along the most familiar human lines, amplifying social differences and inequalities. Humans and societies exhibit tendencies that are often reproduced in automated systems, which is evident in today's dominant AI and machine learning algorithms.

The Study

The study sought to investigate how social mechanisms influence the rank distributions of two of the most popular algorithms. The algorithms chosen were PageRank, the algorithm on which Google's search engine is built, and Who-to-Follow, the algorithm Twitter uses to suggest people you may find interesting and want to follow. These ranking algorithms have been shown to increase the popularity of already popular users and may leave specific groups of people with fewer opportunities.
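To make the ranking mechanism concrete, here is a minimal sketch of PageRank computed by power iteration on a toy follower graph. It is purely illustrative: the node names, damping factor, and iteration count are assumptions for the example, not details taken from the study or from Google's or Twitter's production systems. Note how the node that is already well connected ends up with the highest rank.

```python
# Minimal PageRank via power iteration on a toy follower graph.
# Illustrative sketch only; parameters are assumptions, not the study's setup.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each node to the list of nodes it links to."""
    nodes = list(links)
    n = len(nodes)
    rank = {node: 1.0 / n for node in nodes}
    for _ in range(iterations):
        new_rank = {node: (1.0 - damping) / n for node in nodes}
        for node, targets in links.items():
            if targets:
                share = damping * rank[node] / len(targets)
                for target in targets:
                    new_rank[target] += share
            else:
                # Dangling node: spread its rank evenly over all nodes.
                for target in nodes:
                    new_rank[target] += damping * rank[node] / n
        rank = new_rank
    return rank

# Toy network: "a" already has the most incoming links, so it gains the top rank.
toy_links = {"a": ["b"], "b": ["a"], "c": ["a"], "d": ["a", "b"]}
print(sorted(pagerank(toy_links).items(), key=lambda kv: -kv[1]))
```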

The researchers also sought to understand how these algorithms go wrong depending on the structure and characteristics of the network. Simulating networks of 2,000 people, they adjusted the social mechanisms governing relationships between the individuals in each network. The variations included tweaking the size of the minority, how active users connected with other users, and how people generally associated within the network.
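As a rough illustration of this kind of simulation, the sketch below grows a small homophilic preferential-attachment network in which new users connect preferentially to nodes that are both popular and similar to them. The parameter names (minority_fraction, homophily, links_per_node) and the exact attachment rule are simplifying assumptions for the example, not the study's published model.

```python
import random

# Sketch of a homophilic preferential-attachment network, loosely inspired by
# the simulation described above. All parameters are illustrative assumptions.

def simulate_network(n_nodes=2000, minority_fraction=0.2, homophily=0.8,
                     links_per_node=2, seed=42):
    rng = random.Random(seed)
    groups = ["minority" if rng.random() < minority_fraction else "majority"
              for _ in range(n_nodes)]
    degree = [0] * n_nodes
    edges = []
    for new in range(links_per_node, n_nodes):
        candidates = list(range(new))
        for _ in range(links_per_node):
            # Attachment weight = popularity x similarity: homophily > 0.5
            # biases new ties toward same-group nodes.
            weights = []
            for c in candidates:
                similarity = homophily if groups[c] == groups[new] else 1.0 - homophily
                weights.append((degree[c] + 1) * similarity)
            target = rng.choices(candidates, weights=weights, k=1)[0]
            edges.append((new, target))
            degree[new] += 1
            degree[target] += 1
    return groups, edges, degree

groups, edges, degree = simulate_network()
overall = groups.count("minority") / len(groups)
top = sorted(range(len(degree)), key=lambda i: -degree[i])[:100]
top_share = sum(groups[i] == "minority" for i in top) / len(top)
print(f"Minority share overall: {overall:.0%}, in top 100 by degree: {top_share:.0%}")
```

Varying homophily and minority_fraction in this sketch shows how the minority's share of the top ranks can drift away from its share of the population.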

The Researchers

The researchers were keen to evaluate whether people associated more with already popular individuals, and whether people were more likely to link with individuals similar to themselves. The preference to connect with people similar to oneself is a principle referred to as homophily, which essentially means that birds of a feather flock together.

The researchers found that homophily, together with the relative size of the minority group, was the principal social mechanism distorting the visibility of minorities in rankings: members of the majority associate mainly with other members of the majority.

Minorities can overcome the challenges of under-representation by taking a strategic approach when connecting with famous people. These strategic connections help minorities achieve statistical parity in top rankings. Statistical parity means that if minorities make up 20% of a population, the same proportion should be reflected among the people in the network, especially in the top ranks. The onus is on minorities to create more connections with members of the majority and to become more active, increasing their visibility in the network. On the other hand, the majority can diversify their connections to minority groups to increase those groups' visibility.
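The statistical-parity criterion described here can be checked with a few lines of code. The sketch below is a hedged illustration: the function names, the tolerance, and the toy data are assumptions made for the example, not part of the study.

```python
# Illustrative check of statistical parity in the top-k ranks: the minority's
# share of the top k should roughly match its share of the whole population.

def top_k_minority_share(scores, groups, k):
    """Fraction of the k highest-scored individuals who belong to the minority."""
    top = sorted(range(len(scores)), key=lambda i: -scores[i])[:k]
    return sum(groups[i] == "minority" for i in top) / k

def satisfies_statistical_parity(scores, groups, k, tolerance=0.02):
    population_share = groups.count("minority") / len(groups)
    return abs(top_k_minority_share(scores, groups, k) - population_share) <= tolerance

# Toy example: 20% minority overall, but only 10% of the top 10 ranks.
groups = ["minority"] * 20 + ["majority"] * 80
scores = [1.0] + [0.0] * 19 + [2.0] * 9 + [0.5] * 71
print(top_k_minority_share(scores, groups, 10))          # 0.1
print(satisfies_statistical_parity(scores, groups, 10))  # False
```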

The Way Forward

By using realistic social network scenarios, the study makes it evident that ranking and social recommender algorithms on platforms such as Twitter can distort the visibility of minority groups in unexpected ways.

It is important to have algorithms and other AI systems working effectively and efficiently since we are becoming increasingly dependent on these systems. Social inequalities shouldn’t be amplified or entrenched any further, and public policy should guide and articulate the social dynamics of AI technologies. Sociologists are helping create positive visions for AI that all people can work towards with improved governance of algorithms. 

Solutions

The solution pursued by AI developers has been to find various ways to remove or reduce bias in datasets and algorithmic decisions. Algorithmic bias is recognized as a complex, multi-dimensional challenge that cannot be solved purely through technological means; it requires the input of experts in the social, data science, and technical fields. AI and algorithmic systems can be studied sociologically, asking how sociology and other disciplines can contribute to current debates about these technologies.

The three leading contributions that sociologists, AI developers, and other experts can advance through interdisciplinary collaboration and policy influence are:

  • Critique and the politics of refusal – sociological analysis can help unpack the politics of algorithmic technologies, drawing on existing social theories, skills, and methods. Where necessary, society can refuse algorithmic technologies in order to dismantle unjust systems and institutions.
  • New technologies unsettle established systems since they tend to be resistant to change.
  • Improve algorithmic governance – social inequality issues are matters of public interest and should therefore be addressed by institutions mandated to safeguard the public good. Governments have a role to play in facilitating policies and regulations that promote robust algorithmic systems, and we can already see governments worldwide showing a strong urge to rein in the tech giants.

Final Word

Algorithms and AI systems are widely used and play a role in achieving outcomes and distributing goods. However, AI systems are also central to the reproduction and perpetuation of bias and inequality. Combining the three measures above will help drive positive change in the face of disruption by algorithmic systems and the rampant reproduction of inequalities by algorithms.

Alessandro Civati – https://lutinx.com
Entrepreneur and IT enthusiast, he has been working with new technologies and innovation for over 20 years. With field experience alongside the largest companies in the IT and industrial sectors - such as Siemens, GE, and Honeywell - he has worked for years between Europe and Africa, today focusing his energies on certification and data traceability using blockchain and artificial intelligence. At the head of the LutinX project, he is now involved in supporting companies and public administration in the digital transition, drawing on activities carried out in Africa in the governmental sphere and, subsequently, as a consultant for the United Nations and the International Civil Protection. His profile is completed by voluntary work in various humanitarian missions in West Africa in support of the poorest populations: he has invested in the creation of centers for infancy and newborn clinics, in the construction of wells for drinking water, and in the creation of clinics for the fight against diabetes.
