Three tips for how companies can protect their internet users



In today’s world, internet users are more susceptible to online threats and scams than ever before. People are consuming copious amounts of digital content, a trend accelerated by the pandemic. In 2020, the average time U.S. users spent on social media rose to 65 minutes daily, up from 54 and 56 minutes in the two preceding years — an increase of roughly 20%. This trend will continue, as people are inclined to turn to their phones first and can access content at the touch of a button.

Beyond social media, internet content encompasses a broad range of applications related to dating, gaming, retail, and marketplaces, to name a few. A majority of these applications require (or request) private data from the user, whether it be a generic user profile, location preferences, or credit card information. Larger companies typically require more information from the user, and this data needs to be protected and secured. Additionally, platforms with millions of pieces of user-generated content — whether streaming services or online dating profiles — have a pressing need for content moderation.

Today, companies face key challenges when it comes to protecting their internet users. But there are concrete steps companies can take to address them:

Hire your teams accordingly (and be transparent!). Often, companies do not have specialized in-house teams focused on Trust and Safety. This can be attributed to the size of the company, or to the available resources and bandwidth of employees. When hiring for Trust and Safety and moderation roles, ensure candidates understand the scope of the work, as well as the possible risks.

Balance growth and safety. One of the biggest challenges companies face is the friction between safety, security, and privacy on the one hand, and keeping their platforms user friendly on the other. It’s important that users feel their information is protected. At the same time, the stricter the security measures (e.g., identity verification), the harder it is to grow. Create a healthy balance between safety and company growth by ensuring Trust & Safety experts are part of the product development process.

Avoid a “one size fits all” model in order to scale. Content moderation laws differ depending on which country you operate in, or where your users are located; what holds in the United States will differ from what holds in Germany. Be mindful of where your users are and the best way to protect them. If you’re looking to scale your business, stay flexible across your locations.

Since the onset of the pandemic, content has increased exponentially. More people than ever are playing online games, listening to music, and streaming movies. As these platforms continue to evolve and expand their capabilities for users, they create a heightened level of visibility and risk. Ultimately, companies know that community and connection drive higher engagement, which is the core of the user experience. Where there is community, there need to be trust and safety measures in place to protect internet users.

Phil Tomlinson
Phil leads Trust & Safety at TaskUs, where their mission is to protect the digital frontline and make the internet a healthier place for everyone. He works at the intersection of content moderation and platform policy, helping to define the purpose of online safety in our broader society. He has over 20 years of experience building digital products and the teams to support them, and is a passionate advocate for better mental health in the workplace.

