Written by admin

March 24, 2025

Facebook used to focus on showing posts from friends and family, but between 2020 and 2025, that changed a lot. Now, the News Feed is filled with posts from random pages, suggested videos, and ads chosen by artificial intelligence (AI). This shift has made Facebook look more like TikTok, where an algorithm decides what users see.

This article explains why Facebook changed its algorithm, how it affects users, and whether Facebook is still moderating false ads, fake news, and scams.


From Friends and Family to AI-Generated Content

Facebook’s Old Algorithm (2020)

Back in 2020, Facebook’s algorithm prioritized posts from your friends, family, and groups. That prioritization dated back to a 2018 change, made after people complained about too many ads and business posts in their feeds. The idea was to make Facebook more personal and meaningful.

The TikTok Influence & Algorithm Shift

By 2021, Facebook noticed that TikTok was becoming super popular, especially among younger people. TikTok’s “For You” feed shows users videos based on what they watch and like, even if they don’t follow the creators. Facebook wanted to do something similar, so in 2022, it changed its News Feed:

  • The “Home” tab became a feed full of suggested posts, Reels, and AI-chosen content.
  • A new “Feeds” tab was added for people who wanted to see only posts from friends and pages they follow.

Even though the Feeds tab exists, Facebook automatically opens to the AI-powered Home tab, making suggested content the default experience.

Facebook’s Algorithm in 2025

By 2025, AI-generated recommendations dominate the News Feed:

  • 30% of Facebook’s feed now comes from recommended posts instead of friends and family.
  • 50% of content on Instagram (owned by Facebook) is from accounts people don’t follow.
  • Facebook’s video feature, Reels, saw a 30% increase in watch time after being pushed into more feeds.

CEO Mark Zuckerberg defended this shift, saying it still allows people to connect—just in a different way. Instead of seeing direct posts from friends, people are encouraged to share or discuss viral content with them.


What This Means for Users

1. Facebook Feels Less Personal

Many longtime Facebook users have complained that their feeds are full of irrelevant content. Instead of seeing life updates from family or posts from their favorite groups, they see viral videos, clickbait articles, and ads.

2. Misinformation & Fake Posts Are Spreading Faster

Because Facebook now prioritizes engagement, posts that get lots of reactions—even if they contain false information—get pushed into more feeds. This has led to:

  • A rise in AI-generated scam posts
  • Fake advertisements tricking users into buying non-existent products
  • False political claims spreading without fact-checking

3. Trust in Facebook Is Dropping

Surveys show that people trust Facebook less than they used to. Many users feel they can’t believe what they see in their feed. This problem has worsened now that Facebook has cut back on moderation of misleading content.


Is Facebook Still Moderating Fake Content?

Over the past few years, Facebook has reduced its efforts to fight misinformation. Here’s how:

  • Fewer Moderators – Facebook laid off thousands of employees in 2022–2023, including people who worked on stopping fake posts.
  • Weaker Fact-Checking – In 2025, Facebook officially stopped using third-party fact-checkers in the U.S.
  • More Scams & Fake Ads – Fraudulent ads and fake giveaways are more common because of fewer restrictions.

This means that misleading posts are spreading faster, and Facebook is not removing them like before. The platform has also stopped labeling many AI-generated images and videos, making it harder to tell what’s real.


Conclusion: The Cost of Engagement

Facebook’s goal from 2020 to 2025 was to increase engagement, and it succeeded—people are spending more time on the app. But this came at a cost:

  • Facebook feels less personal and more like an entertainment platform
  • Fake news and scams are spreading faster than ever
  • People trust Facebook less than they did before

Going forward, Facebook faces a major question: Will it fix these problems and restore user trust, or will it continue prioritizing engagement at all costs? The answer will shape the future of social media.

Unfortunately, we believe that Zuckerberg’s primary goal is to make money, and that profit will continue to drive Facebook’s decisions.

So, what can you do to have a better experience on Facebook and avoid getting trapped in its sensational exploitation? First, remember why you’re probably on Facebook: to see your friends and family. People over 50 use Facebook to connect with friends and family they would not otherwise feel connected to. With that in mind, stay true to why you’re there. Check in with friends and family, and refrain from scrolling. Look for posts from friends and family, comment and engage, and then log out. Don’t linger. If you scroll and click on negative clickbait, that’s what you will see more of. Journalism has a well-known mantra: “If it bleeds, it leads.” People are drawn to negative posts that create drama. You can only control yourself, not how Facebook’s algorithm works. Don’t get sucked in.


Sources

  1. Facebook Transparency Reports (2020–2025)
  2. Zuckerberg, M. (2022). Meta Company Announcements.
  3. Pew Research Center Social Media Trends (2023–2024)
  4. Free Press Study on Misinformation (2024)

