Every few decades there is a drastic change in how we view important societal and scientific problems. What used to be a ridiculous idea yesterday can become the predominant opinion today, only to turn into an obsolete notion tomorrow.
In the history of human beliefs, change is the only constant. Most people living in 1921 shared views that today are considered quaint at best and dangerous at worst. The chances that our present-day convictions will still be relevant in 2121 are slim.
In fact, we won't even have to wait 100 years, as the speed of change is accelerating. A good example is how quickly humanity changed its mind about the origins of Covid.
Just a year ago, the idea that the virus originated from a Wuhan Lab was dismissed as a conspiracy theory. Facebook, Twitter and other social media platforms blocked posts promoting the lab leak theory. Today, however, this theory is on its way to becoming the mainstream scientific view of how the virus originated.
Such instances make combating fake news and misinformation particularly challenging. They can also fundamentally undermine people's trust in the neutrality of social media platforms, and jeopardize future efforts to fight misinformation.
Telegram never blocked posts discussing the lab leak theory, because we didn't think it was our role to decide for our users what they should believe. At the same time, we felt that our users had the right to be informed about Covid by official sources that reflected scientific consensus. That's why we worked with 19 governments to help them reach every Telegram user in their countries with up-to-date information on the pandemic.
In my 20 years of managing discussion platforms, I've noticed that conspiracy theories are only strengthened each time their content is removed by moderators. Instead of putting an end to wrong ideas, censorship often makes it harder to fight them. That's why spreading the truth will always be a more effective strategy than engaging in censorship.