Reading the texts on Facebook for next week's session, I found myself highlighting a fair number of passages - though, truth be told, not so much for CSIV but rather for my minor, where we're preparing a study on Facebook's qualities as a source for (political) news.
Facebook and its qualification as a news outlet have really piqued the interest of people in my field - understandably. Last semester, we developed a survey trying to find out whether Facebook is actually seen and consciously used as a source of daily news - turns out, it isn't (which did not come as much of a surprise). Derek Thompson actually summed it up quite nicely: "It's an entertainment portal for stories that remind us of our lives and offer something like an emotional popper" - not at all intended to be a serious, reputable news source. But despite that, it simply cannot be denied that Facebook plays a big role in news - and vice versa. If you want people to see your content, you have to share it where the people are; it's as simple as that. At least in theory.
This semester we're working on a similar topic: after establishing that Facebook is not regarded as a real source for political news, we will take a look at how some of those political news posts are perceived - this time focusing not on how people consciously perceive them, but on what happens unconsciously (with the help of eye tracking) and on how the posts we show our test subjects alter their opinions.
At this point we're still working out what kind of stimulus material we want to use, whether we'll need a preliminary interview, etc. So we're pretty much still stuck in the organisational phase. But even now, I'm quite skeptical about whether all this will actually help us determine real, specific results and correlations. I was already ranting about this in last semester's research paper: while it is logical and important to research Facebook's influence in any way (the digital world has, after all, become a big part of our real life, like it or not), I think it's pretty much impossible to come to any lasting conclusions. Just take a look at Facebook's ever-changing algorithm - it determines what the user sees. A page might literally spam its fans with news articles, but if those fans are only shown 10% of them because of that algorithm, they will perceive the amount of news to be less than it is in "reality". That constantly changing filter will always alter the outcome of any given study - which is a real problem for the long-term comparability of such research.
Of course, the subject of every study ever done is subject to change; it's only natural. But in most cases there is at least some consistency in those developments, and you might be able to link changes to human nature (boredom, fear, you name it). In the case of social media, though, and especially Facebook, it's just so much harder, because some changes aren't due to the users at all but to algorithms and decisions made by Facebook's management.
Consequently, when looking into Facebook and its users' behaviour, you have to factor in not only the human psyche and behaviour but also the algorithms and their thousands of additional factors - all while struggling to base your findings on older research (which is scarce to begin with). Good luck with that, guys.