
Blog Bubble

Dive into BubbleBot and explore the world of AI and tech with us! Learn how to harness the power of technology while staying aware of its potential dangers. Join us in raising awareness of the adverse effects of social media, and of how to use it positively.
  • justsolomonmike

"Does the algorithm read my mind? A teenage dilemma"

As teenagers, we are frequently torn between wanting to fit in and being unique. Like our real-life selves, our online personas are a delicate balance between what we want to show the world and what we want to keep hidden. However, with the rise of algorithmic targeting and personalization, determining where that balance lies is becoming increasingly difficult.

We've all been there: we're looking for a new pair of shoes online, and suddenly our social media feeds are flooded with shoe store advertisements. It's as if the algorithm is reading our minds, anticipating our every move before we make it. But is that really what's happening?

According to Dutton et al. (2019), algorithmic targeting creates a "filter bubble" around us, displaying content and advertisements tailored to our interests and past behaviours. This may appear to be a harmless way to improve the convenience and enjoyment of our online experience, but it comes at a cost.

By showing us only content that aligns with our existing beliefs and preferences, filter bubbles create an "echo chamber" effect in which we are exposed to just one side of an issue or topic. This can result in polarisation, where our opinions become more extreme and we are less likely to consider opposing points of view.
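To see how an echo chamber can form, here is a toy simulation of that feedback loop. It is only a sketch, not how any real platform works: the topics, weights, and the rule "recommend in proportion to past clicks" are all invented for illustration.

```python
import random

# Toy filter-bubble feedback loop: the feed recommends topics in
# proportion to past clicks, and each click reinforces that topic,
# so an early lean on one topic tends to snowball over time.

def simulate_feed(steps, seed=0):
    rng = random.Random(seed)
    clicks = {"sports": 1, "news": 1, "music": 1}  # start perfectly balanced
    for _ in range(steps):
        topics = list(clicks)
        weights = [clicks[t] for t in topics]
        # The "algorithm": show a topic with probability proportional
        # to how often the user has clicked it before.
        shown = rng.choices(topics, weights=weights)[0]
        clicks[shown] += 1  # the user clicks, reinforcing the topic
    return clicks

print(simulate_feed(200))  # the counts usually drift far from balanced
```

Even though every topic starts with equal weight, the rich-get-richer dynamic means the feed rarely stays balanced, which is exactly the one-sided exposure described above.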

The 2022 mass shooting in Buffalo, New York, in which the attacker had been radicalised in part by online misinformation, is a tragic example of what can happen when we trust everything we see on social media without checking the facts.

This is only one of the risks of living in an algorithmic bubble. We are more likely to believe false information and less likely to question the validity of what we see when we are only exposed to content that confirms our biases. For teenagers, who are often more susceptible to peer influence and less experienced in critical thinking, these echo chambers and filter bubbles can be particularly dangerous. Without proper guidance and fact-checking, they may believe false information and become radicalized towards harmful beliefs and actions.

It's important for parents, educators, and social media platforms to work together to combat these echo chambers and filter bubbles. Parents and educators can teach critical thinking skills and encourage fact-checking, while social media platforms can adjust their algorithms to promote a more diverse range of viewpoints and information.

So, does the algorithm truly know us better than we know ourselves? The answer is both yes and no. While algorithms are extremely sophisticated and can, to some extent, predict our behaviours and preferences, they cannot truly read our minds. In our online lives, we still have agency and autonomy, and it is up to us to use that power responsibly.

"Algorithmic targeting refers to the use of algorithms to select the most relevant advertisements for a specific audience, based on factors such as their online behavior, search history, and demographics."
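That definition can be made concrete with a tiny sketch: rank each ad by how many keywords it shares with a user's recent browsing history. The shops, keywords, and scoring rule here are all made up for illustration; real ad systems are vastly more complex.

```python
# Toy sketch of algorithmic ad targeting: score each ad by how many
# keywords it shares with the user's recent browsing history, then
# show the best matches first. All data is invented for illustration.

def rank_ads(history, ads):
    """Return ad names sorted by keyword overlap with the history."""
    history_words = set(history)
    scored = []
    for ad, keywords in ads.items():
        score = len(history_words & set(keywords))
        scored.append((score, ad))
    # Highest overlap first; drop ads with no overlap at all.
    return [ad for score, ad in sorted(scored, reverse=True) if score > 0]

history = ["sneakers", "running", "playlist"]
ads = {
    "ShoeStore": ["sneakers", "running", "sale"],
    "BookShop": ["novels", "poetry"],
    "FitnessApp": ["running", "workout"],
}

print(rank_ads(history, ads))  # ['ShoeStore', 'FitnessApp']
```

One search for sneakers and the shoe ad jumps to the top of the feed, which is why it can feel like the algorithm is reading your mind when it is really just matching keywords.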

As we navigate the digital landscape as teenagers, we must remember that our online actions have real-world consequences. We can break out of our filter bubbles and create a more balanced online experience for ourselves and those around us by fact-checking information and seeking out diverse viewpoints.

Ultimately, the algorithmic targeting and personalization we see online can be a double-edged sword. While it can make our online experience more convenient and enjoyable, it can also result in the formation of filter bubbles and echo chambers, limiting our exposure to diverse points of view. It is our responsibility as teenagers to be responsible digital citizens, fact-checking information and seeking out diverse viewpoints in order to create a more balanced and informed online experience. Until next time, BubbleBot out!


