
A.I. & Social Networks – Has an Inequality Process Begun?


Intro

Over the past several years, artificial intelligence and machine learning have radically transformed society. Social networks are lauded for creating connections and bringing people together. However, a study published in the journal Scientific Reports indicates that algorithms can worsen existing inequalities and discriminate against specific groups of people. Sociologists have long acknowledged that inequalities exist in every domain of society, and equally recognize that the benefits and harms of technology are not evenly distributed. The most pertinent questions have therefore been directed at the developers of new algorithmic technologies.

Artificial intelligence and algorithmic systems have been criticized for perpetuating biases, discriminating unjustly, and worsening inequality. Today's dominant forms of AI and machine learning are trained on datasets that reflect human judgment, priorities, and conceptual categories. When a dataset is biased, those inequalities are encoded and reproduced in any algorithm trained on it. A growing number of social issues have been linked to AI by evaluating how undesirable characteristics creep into AI systems and how they can be removed.

What Concerns Sociologists

Sociologists and other experts are concerned that issues of bias are deeply rooted in pre-existing social inequalities. Data about patterns in society serves as the input on which these systems are trained, and the resulting automated decisions (the output) reflect and perpetuate those inequalities. Further proliferation of algorithmic systems could therefore produce unequal outcomes in education, employment, government benefits, and criminal law. The inequalities reproduced and reshaped through algorithmic technologies can also play out on a global scale, in areas such as international labor and the flow of capital through colonial and extractive processes.

Even algorithmic systems built to be objective and free of bias can discriminate along the most familiar human lines, amplifying social differences and inequalities. Humans and societies exhibit tendencies that are often reproduced in automated systems, which is evident in today's dominant AI and machine learning algorithms.

The Study

The study sought to investigate how social mechanisms influence the rank distributions produced by two of the most popular algorithms. The algorithms chosen were PageRank, the algorithm on which Google's search engine is built, and Who-to-Follow, the algorithm Twitter uses to suggest people you may find interesting and want to follow. These ranking algorithms have been shown to increase the popularity of already popular users, which may limit opportunities for specific groups of people.
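To make the rich-get-richer mechanism concrete, here is a minimal pure-Python sketch of the PageRank idea (the toy graph and function are my own illustration, not the study's code): rank flows along links, so a node that many others already point to accumulates even more visibility.

```python
def pagerank(edges, n, damping=0.85, iters=50):
    """Power-iteration PageRank over a directed edge list (src, dst)."""
    out = [[] for _ in range(n)]
    for src, dst in edges:
        out[src].append(dst)
    rank = [1.0 / n] * n
    for _ in range(iters):
        new = [(1.0 - damping) / n] * n
        for src in range(n):
            if out[src]:
                # Distribute this node's rank equally among its targets.
                share = damping * rank[src] / len(out[src])
                for dst in out[src]:
                    new[dst] += share
            else:
                # Dangling node: spread its rank evenly over the network.
                for dst in range(n):
                    new[dst] += damping * rank[src] / n
        rank = new
    return rank

# Node 0 is already popular (everyone links to it); the ranking amplifies that.
edges = [(1, 0), (2, 0), (3, 0), (4, 0), (4, 1), (2, 1)]
rank = pagerank(edges, n=5)
top = max(range(5), key=rank.__getitem__)
print(top)  # 0: the already-popular node dominates the ranking
```

Even in this five-node toy, the node with the most incoming links ends up with by far the highest score, which is the feedback loop the study examines at network scale.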

The researchers also sought to understand how these algorithms go wrong, depending on their structure and the characteristics of the network. Using 2,000 simulated individuals, they generated different networks and adjusted the social mechanisms governing relationships between the individuals in each one. The variations included tweaking the size of the minority, how active users connected with other users, and how people generally associated within the network.
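The kind of simulation described above can be sketched as follows. This is a hedged approximation of a homophilic preferential-attachment network, not the authors' actual model: the parameter names (`minority_frac`, `homophily`) and the attachment rule are my own assumptions.

```python
import random

def simulate(n=500, minority_frac=0.2, homophily=0.8, m=2, seed=42):
    """Grow a network where each new node attaches to m existing nodes,
    weighted by degree (popularity) and group similarity (homophily).
    Illustrative only; the study simulated networks of 2,000 people."""
    rng = random.Random(seed)
    group = [rng.random() < minority_frac for _ in range(n)]  # True = minority
    degree = [0] * n
    edges = []
    for new in range(m, n):
        # Attachment weight: (degree + 1), biased toward same-group nodes.
        weights = [
            (degree[old] + 1) * (homophily if group[old] == group[new] else 1 - homophily)
            for old in range(new)
        ]
        targets = set()
        while len(targets) < m:
            targets.add(rng.choices(range(new), weights=weights)[0])
        for old in targets:
            edges.append((new, old))
            degree[new] += 1
            degree[old] += 1
    return group, degree, edges

group, degree, edges = simulate()
top = sorted(range(len(degree)), key=degree.__getitem__, reverse=True)[:50]
minority_share_top = sum(group[i] for i in top) / len(top)
print(f"minority overall: {sum(group)/len(group):.2f}, in top 50: {minority_share_top:.2f}")
```

Varying `minority_frac` and `homophily` and re-running is exactly the kind of sweep the researchers performed to see which mechanism distorts who ends up at the top.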

The Researchers' Findings

The researchers were keen to evaluate whether people associated more with already popular individuals, and whether people were more likely to link with individuals similar to themselves. The preference to connect with people similar to oneself is a principle known as homophily, which essentially means that birds of a feather flock together.

The researchers found that homophily, together with the relative size of the minority, was the principal social mechanism distorting the visibility of minorities in rankings: majority groups associate predominantly with other members of the majority.

Minorities can overcome under-representation by taking a strategic approach when connecting with popular people. Such strategic connections can help minorities achieve statistical parity in the top rankings. Statistical parity means that if minorities make up 20% of a population, the same proportion should be reflected among the people in a network, especially in its top ranks. The onus is on minorities to create more connections with members of the majority and to become more active in order to increase their visibility in the network. The majority, in turn, can diversify their connections toward minority groups to increase those groups' visibility.
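The parity criterion is easy to check in code. Below is an illustrative helper (the function name and the toy scores are assumptions, not from the study) that compares the minority's share of the top-k ranks against its population share:

```python
def top_k_share(is_minority, scores, k):
    """Fraction of minority members among the k highest-scored individuals."""
    order = sorted(range(len(scores)), key=scores.__getitem__, reverse=True)
    return sum(is_minority[i] for i in order[:k]) / k

# 20% minority in the population...
is_minority = [True] * 20 + [False] * 80
# ...but a ranking where minority members score uniformly lower.
scores = [0.3] * 20 + [0.6] * 80
share = top_k_share(is_minority, scores, k=10)
print(share)  # 0.0: far below the 0.2 parity target
```

Statistical parity holds when this value is close to the minority's overall population share (0.2 here); the gap between the two numbers is one simple way to quantify the distortion the study describes.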

Way Forward

Using realistic social network scenarios, the study makes it evident that ranking algorithms and social recommender algorithms on platforms such as Twitter can distort the visibility of minority groups in unexpected ways.

It is important that algorithms and other AI systems work effectively and efficiently, since we are becoming increasingly dependent on them. Social inequalities should not be amplified or entrenched any further, and public policy should guide and articulate the social dynamics of AI technologies. Sociologists are helping create positive visions for AI that everyone can work towards, alongside improved governance of algorithms.

Solutions

Among AI developers, the response to this problem has been to find ways to remove or reduce bias in datasets and algorithmic decisions. However, bias in algorithms is recognized as a complex and multi-dimensional challenge that cannot be solved through technological means alone. The problem requires input from experts in the social, data science, and technical fields. AI and algorithmic systems can be studied sociologically, asking how sociology and other disciplines can contribute to current debates about these technologies.

The three leading contributions that sociologists, AI developers, and other experts can push through interdisciplinary collaboration and policy influence are:

  • Critique and the politics of refusal – sociological analysis can help unpack the politics of algorithmic technologies, drawing on existing social theories, skills, and methods. Where necessary, society can exercise refusal of algorithmic technologies to dismantle unjust systems and institutions.
  • Unsettle established systems – new technologies can unsettle established systems and institutions, which tend to be resistant to change.
  • Improve algorithmic governance – issues of social inequality are matters of public interest and should therefore be addressed through institutions mandated to safeguard the public good. Governments have a role to play in enacting policies and regulations that promote robust algorithmic systems, and we can already see governments worldwide showing a strong urge to rein in the tech giants.

Final Word

Algorithms and AI systems are used widely and play a role in achieving outcomes and distributing goods. However, AI systems are also central to the reproduction and perpetuation of bias and inequality. Combining the three measures above will help drive positive change in the face of disruption by algorithmic systems and the rampant reproduction of inequalities by algorithms.

