
Is Big Tech Failing Our Youth?

The Battle for Safety on Social Media.

As a keynote speaker on the future of work, technology and AI, it's nice to be asked about a household name like Facebook by another household name like the BBC.


But why should we care?

Today, technology touches every age group; children, too, are deeply immersed in it. Though this digital revolution opens up countless opportunities for learning and exploration, it also brings risks.

The role of tech firms in protecting the next generation from inappropriate or harmful content has been intensely scrutinized, yet we seem to have made little progress. In this blog post, we ponder the question - are tech firms fulfilling their responsibility to safeguard the younger generation, and if not, what should be done?

The Accountability of Tech Giants

Huge tech firms like Facebook have come under intense scrutiny. Not long ago, Facebook's CEO Mark Zuckerberg faced vigorous questioning in the US Senate, accused of failing to shield children from inappropriate and harmful material on the platform. For 20 years, Zuckerberg has been at the helm of a company accused of not adequately protecting children, leading to calls for greater accountability.

In fairness, social media serves as both a boon and a bane. It has played a significant role in democratizing several aspects of our lives, amplifying individual voices, and facilitating political discussions.

However, content moderation and child safety have often taken a back seat. Critics argue that Facebook, along with other such firms, has not invested enough in safety measures, even as its influence continues to expand unchecked.

The Impact of Lack of Regulation

While the burden of responsibility rests partly on these tech companies, the question also arises: why haven't governments increased regulation? Social media giants have become larger than many nations, wielding enormous power and earning staggering amounts of revenue. Their size and influence alone should warrant greater scrutiny and regulation from governing bodies.

And I know the irony of someone who has trained companies in how to use social media for over a decade now saying that social media companies have become too big. But these platforms simply no longer care what the rest of the world does, or cares about… They seem to care only about collecting data and selling advertising. Or worse…


The lack of regulation and oversight isn't only a concern for Facebook. It's a widespread issue that encompasses many tech companies headquartered in various countries. Devil's advocates might argue that this lack of regulation comes down to the substantial revenues these companies generate. Their financial impact on economies may disincentivize politicians from applying the regulatory brakes.

It's clear that tech giants hold a considerable share of responsibility when it comes to protecting the next generation from the potential harms lurking in the digital world. It's time we collectively reassess the roles and regulations placed on these companies. It can't just be me talking about it on the BBC. We need to lobby politicians, and governments need to step up their regulation efforts, creating effective policies to ensure child safety online.

But not ones designed to help them curb our freedoms. We live in a post-truth online world, where a wild, wild west is only about to get wilder. The irony is that AI is going to play both roles in this new world: cowboy and Indian, or more aptly, sheriff and outlaw.


As consumers, we must remain vigilant and demand greater accountability from these tech behemoths. After all, the mental health and well-being of the next generation are at stake. Remember, without us on their platforms, their platforms don't exist.

Right now, the power lies with us, not them… But it might not always.

 

