
Is Toxicity Embedded in the DNA of Social Media?


Two young men were chatting at a city café. One was showing various Facebook posts on his phone, while the other exclaimed with an annoyed expression, “Just look, brother! Are these even people? So much quarrelling and toxic talk!” An elderly gentleman at the next table laughed and added, “Put it in people’s hands and we blame the people. Now it seems even the robots are doing the same thing!” The story sounds amusing, but hidden within it lies a troubling reality.

Researchers at the University of Amsterdam in the Netherlands recently conducted an experiment that forces us to rethink the future of social media. Petter Törnberg, assistant professor of AI and social media, together with his colleague Maik Larooij, built a simulated social network. Surprisingly, no humans were involved: the entire platform was populated with AI bots powered by GPT-4. The goal was simple: to see whether quarrels, division, and fake news could be prevented by technology alone, without any human participation.

The results were both astonishing and disappointing. Various strategies were tried: the feed was presented in chronological order, diverse viewpoints were highlighted, ‘likes’ and engagement counts were hidden, and the algorithms themselves were tweaked. Yet no matter what was done, the bots quickly fractured into factions, formed ‘echo chambers’, and began spreading fake news and toxic reactions.
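The experiment above can be illustrated with a toy agent-based sketch. Everything here is an illustrative assumption, not the researchers’ actual code: the `Post` class, the opinion scale, and the like-threshold are invented for the example. It shows one mechanism the study points at, namely how merely swapping the feed-ranking rule (chronological versus engagement-based) changes where the bots’ attention concentrates.

```python
import random
from dataclasses import dataclass

@dataclass
class Post:
    author: int
    opinion: float  # stance on a single axis, in [-1, 1] (toy assumption)
    time: int
    likes: int = 0

def chronological_feed(posts, k=5):
    """Newest-first feed: the 'chronological order' intervention."""
    return sorted(posts, key=lambda p: p.time, reverse=True)[:k]

def engagement_feed(posts, k=5):
    """Most-liked-first feed: a stand-in for engagement-optimised ranking."""
    return sorted(posts, key=lambda p: p.likes, reverse=True)[:k]

def concentration(posts, top_n=10):
    """Share of all likes captured by the top_n most-liked posts."""
    total = sum(p.likes for p in posts) or 1
    top = sorted(posts, key=lambda p: p.likes, reverse=True)[:top_n]
    return sum(p.likes for p in top) / total

def simulate(feed_fn, n_agents=50, rounds=30, seed=0):
    rng = random.Random(seed)
    opinions = [rng.uniform(-1, 1) for _ in range(n_agents)]
    posts, t = [], 0
    for _ in range(rounds):
        # Every agent posts its (fixed) opinion each round.
        for a in range(n_agents):
            posts.append(Post(author=a, opinion=opinions[a], time=t))
            t += 1
        # Every agent views its feed and likes posts close to its own stance.
        for a in range(n_agents):
            for p in feed_fn(posts):
                if abs(p.opinion - opinions[a]) < 0.3:
                    p.likes += 1
    return concentration(posts)
```

Under engagement ranking, posts that gain likes early keep getting shown, so attention collapses onto a handful of posts (a rich-get-richer loop); the chronological rule spreads likes across each round’s fresh posts. Comparing `simulate(engagement_feed)` with `simulate(chronological_feed)` makes that difference visible even in this crude model.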

Törnberg later commented, “If these platforms follow the same path even without humans, it shows the problem isn’t in human nature, but rather in the very design itself.”

A young Bangladeshi reader, upon hearing this news, wrote on social media, “All this time we thought people were to blame. Now it seems the very design of social media is a trap that turns us into quarrelsome beings.”

This isn’t just an academic curiosity. Sociologists say the research shows us clearly that toxicity seems to be embedded in the very DNA of the technology. One is reminded of a renowned philosopher who once said, “The machine you build will one day build you.”

Tech companies have long offered various excuses. They say the problem lies with users: someone spreads fake news, someone incites. But if robots themselves fall into the same patterns, the question arises—does the design itself encourage division? And if so, can something as simple as a ‘policy update’ or ‘content moderation’ fix it?

A young journalist in Dhaka commented, “It’s as if the road is built in such a way that people will fall into holes whether they want to or not. The blame is laid on the traveler, but in reality, it’s the road’s design at fault.”

The question is now serious. If algorithms and platform structures alone can dictate our thinking, then democracy, the flow of information, even social cohesion itself may be at risk. Petter Törnberg’s research may have started small, but its message is huge: the problem isn’t with us, but with the system that surrounds us.

This brings to light the responsibility of policymakers and tech companies. Reducing likes or shares isn’t enough; the core structure of the platforms needs to change. Otherwise, no matter how aware we become, the very design of the technology will push us back into the same pit of division.

Writer George Orwell once said, “In a society where words replace truth, lies become reality.” Today’s research seems to prove that true. If robots themselves get trapped in falsehood and division, how will humans be any different?

Eventually, the question remains—will we be able to change this design, or will we just sit back as spectators, watching humans and robots repeat the same toxic game over and over?



Through interviews with scientists at home and abroad, biggani.org brings their life and research stories to the young generation.

Contact:

biggani.org@gmail.com

Editor: Md. Manjurul Islam

Biggani.org connects young audiences with researchers' stories and insights, cultivating a deep interest in scientific exploration.


Copyright 2024 biggani.org