🎯 What to Expect When You're Chatting with AI

The Promise of 9 Months' Worth of Social Media in Training Data


AI chatbots don’t operate in a vacuum—they are shaped by the data they consume.

Over the past nine months (July 2024 – March 2025), global events have unfolded with competing narratives, influencing the information AI models learn from.
🔹 U.S. Presidential Election: Donald Trump secured a second term, marking a historic political comeback amid heated rhetoric and deep national divisions.
🔹 Ukraine-Russia War: Ukraine intensified counteroffensives while Russia escalated attacks.
🔹 Middle East Conflict: Renewed fighting in the Israel-Gaza war led to heavy casualties before a fragile ceasefire in early 2025.
🔹 German Political Turmoil: Chancellor Olaf Scholz’s coalition collapsed, triggering snap elections and uncertainty in European leadership.
🔹 European Military Buildup: Nations across Europe ramped up defense spending, signaling a shift toward militarization.
🔹 Paris Olympics Controversy: The 2024 Games sparked heated debate over gender-eligibility rules after two boxers previously disqualified by the IBA over disputed eligibility tests competed in the women’s tournament.
🔹 Stranded Astronauts Return: After their Boeing Starliner capsule was deemed unfit to bring them home, two astronauts spent months longer than planned aboard the ISS before returning on a SpaceX Crew Dragon in March 2025.

🔍 These events have been framed differently across media outlets, shaped by ideological biases, cultural perspectives, and political agendas. AI models trained on this data reflect these divergent viewpoints—sometimes enriching responses, sometimes reinforcing biases.
Meanwhile, AI itself is subject to grooming—subtle efforts to shape its outputs through curated training data, reinforcement learning, and moderation policies.

How AI Processes This Information:
1️⃣ Your AI isn’t learning from today—it’s trained on yesterday’s internet: LLMs don’t absorb daily news or social media as it happens. Unless retrained, they reflect past data.
2️⃣ Retraining means risk—who’s feeding the AI? Every new training cycle can shift an AI’s worldview. If dominant narratives emerge, they don’t just get included—they get reinforced.
3️⃣ RAG vs. Core Learning—how “live” is your AI? Some AI models use Retrieval-Augmented Generation (RAG) to pull fresh internet data. Others rely entirely on pre-trained knowledge, repeating what they learned months ago.
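The RAG distinction above can be sketched with a toy example. This is a deliberately simplified illustration, not a real model API: the dictionaries stand in for frozen training knowledge and a live retrieval index, and all names and "facts" here are hypothetical placeholders.

```python
# Toy sketch: a "pre-trained" model only knows facts frozen at its
# training cutoff; a RAG-style wrapper consults a fresh document
# store before answering. Data below is illustrative only.

PRETRAINED_FACTS = {   # knowledge frozen at the training cutoff
    "german chancellor": "Olaf Scholz (as of mid-2024)",
}

LIVE_DOCUMENTS = {     # stand-in for a retrieval index of fresh data
    "german chancellor": "Coalition collapsed; snap elections held in 2025.",
}

def answer_pretrained(query: str) -> str:
    """Answer only from frozen training knowledge."""
    return PRETRAINED_FACTS.get(
        query.lower(), "I don't know (not in training data)."
    )

def answer_with_rag(query: str) -> str:
    """Retrieve fresh context first, then fall back to frozen knowledge."""
    context = LIVE_DOCUMENTS.get(query.lower())
    if context:
        return f"{answer_pretrained(query)} | Retrieved update: {context}"
    return answer_pretrained(query)

print(answer_pretrained("german chancellor"))  # stale answer only
print(answer_with_rag("german chancellor"))    # stale answer + fresh context
```

The point of the sketch: the RAG wrapper does not change what the "model" learned; it only bolts fresh context onto a frozen knowledge base, which is why RAG-enabled assistants can cite recent events while purely pre-trained ones repeat last year's internet.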

💡 But AI doesn’t just reflect facts—it mirrors how we argue, cancel, and manipulate, how narratives spread, how dissent is treated, and what gets labeled “misinformation.”

📌So what can you expect?
AI interactions should always be approached critically. But can we really expect AI to be better than us, given the data it is trained on? Just like human discourse, AI absorbs not only perspectives but also the tactics used to push them: selective framing, emotional manipulation, and the quiet steering of narratives over time.

Expect AI to become a really good Mirror of Modern Society.

 

Disclaimer

The companies and organizations mentioned in this article are referenced for informational and analytical purposes only. All discussions of events, organizations, and their roles are based on publicly available information and do not imply any endorsement, partnership, or direct involvement unless explicitly stated. The opinions expressed are solely those of the author and do not reflect the official positions of the companies mentioned. All trademarks, logos, and company names are the property of their respective owners.

#AI #Ethics #Explainability #XAI #AIAlignment #ResponsibleAI #AITransformation