Tuesday, December 30, 2014

Facebook's year-in-review retrospective stirs controversy – BBC

Facebook invites its users to view, on a page titled “Year in Review”, their most memorable moments of the year. But memorable does not mean happy.

Facebook users have probably already noticed the “Year in Review” card that sits on their timeline. Mark Zuckerberg’s social network invites them to look back on their year 2014. With one click, an algorithm lines up, in a vertical strip, the most popular pictures and status updates published on their profile. Much to the chagrin of the unluckiest users, such as Eric Meyer, whose unfortunate experience has gone around the Web.

In his blog post entitled “Inadvertent Algorithmic Cruelty”, the writer and developer recounts how a photo of his six-year-old daughter, who had died a few months earlier, was pinned to the top of his wall. Captioned “Eric, here’s what your year looked like!”, the image shows, in the background, characters dancing with their arms in the air. “And there was no way to stop it,” he reflects.



Multiple Tribulations

Eric Meyer’s case is not an isolated one. Mishaps have multiplied since the feature launched. “Mine (her strip, Ed.) shows the announcement of my pregnancy and, a little further on, the loss of our 23-week-old baby,” says a woman posting anonymously on Reddit, where many Internet users have criticized the automated nature of the feature.

The experience is “the result of code that works in the overwhelming majority of cases, reminding people of how awesome their year was,” writes Eric Meyer. “But for those of us who have experienced the death of a loved one, or spent time in the hospital, or were hit by a divorce or lost a job (…), we might not want to look back on this past year,” he insists.



Facebook favors happiness

First comes the question of which photos and status updates appear in the strip. The content selected by “Year in Review” is the content that received the largest number of interactions – likes, shares and/or comments – Jonathan Gheller, the application’s product manager, told the Washington Post. The assumption underlying the code boils down to this: widely shared content is a memorable moment, and therefore a happy one. A blunder typical of Facebook, which favors happiness on its platform.
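Stripped of any sentiment analysis, such a heuristic can be sketched in a few lines. The snippet below is a hypothetical illustration of ranking posts by interaction count, not Facebook’s actual code; the Post fields and function names are assumptions made for the example.

```python
# Hypothetical sketch of the selection heuristic described above:
# rank a user's posts by total interaction count and keep the top ones.

from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int
    shares: int
    comments: int

def interactions(post: Post) -> int:
    # Underlying assumption: more interactions == a happier, more
    # memorable moment. That assumption is exactly what failed for
    # Eric Meyer, whose most-commented photo marked a bereavement.
    return post.likes + post.shares + post.comments

def year_in_review(posts: list[Post], top_n: int = 10) -> list[Post]:
    return sorted(posts, key=interactions, reverse=True)[:top_n]

# Example: the saddest post can easily be the most "popular" one.
posts = [
    Post("Holiday photos", likes=40, shares=2, comments=5),
    Post("Our daughter passed away today", likes=300, shares=10, comments=250),
    Post("New job!", likes=120, shares=4, comments=30),
]
print([p.text for p in year_in_review(posts, top_n=2)])
```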

“Actions on Facebook tend to focus on positive social interactions,” a company engineer had already pointed out in 2013. According to Facebook itself, users press the “Like” button 4.5 billion times a day. “Some people have asked for a ‘dislike’ button because they want to be able to say ‘this isn’t good.’ That’s not something we think is beneficial for the world,” the site’s number one added in mid-December 2014.

Facebook is not fond of dark thoughts, which it deems unprofitable. The social network, assisted by two American researchers, conducted a study on the contagion of emotions from 11 to 18 January 2012, a study revealed at the end of June 2014. The secret collection of data from 700,000 users had caused quite a stir. Facebook had manipulated their news feeds to observe the influence that the tone of messages could have on their behavior. Conclusion: moods are indeed contagious. “We were concerned that exposure to friends’ negativity might lead people to stop visiting Facebook,” explains Adam Kramer, who led the emotional-contagion investigation for Facebook. The platform therefore does everything it can to keep us blissful.



The algorithm that wants to understand emotions

The second problem is that Facebook’s algorithms reduce the world to three expressions: joy, “neutral” and sadness. To assess the level of emotional contagion, Facebook first had to identify emotions. The researchers used the software Linguistic Inquiry and Word Count (LIWC), yet another piece of code, designed to detect punctuation, word lists and even smileys, those icons we use to illustrate our feelings. But “understanding emotion is far more complex than a simple keyword search,” notes The New Yorker. And the Manichean happy/unhappy dichotomy is not enough to describe it.
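For illustration, here is a rough, hypothetical sketch of what a keyword-based classifier of this kind looks like; the word lists are invented for the example and are not the real LIWC dictionaries.

```python
# Hypothetical LIWC-style classifier: count matches against positive and
# negative lexicons (including smileys) and fall back to "neutral".

POSITIVE = {"happy", "awesome", "love", "great", ":)", ":-)"}
NEGATIVE = {"sad", "loss", "died", "hate", "terrible", ":(", ":-("}

def classify(status: str) -> str:
    words = [w.strip(".,!?") for w in status.lower().split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

# The keyword approach misreads anything subtler than its word lists:
print(classify("I am not happy about this awesome year"))  # -> "positive"
print(classify("Missing her every day"))                   # -> "neutral"
```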

The fact remains that an algorithm does exactly what it has been programmed to do. So, Eric Meyer argues, the blunder he fell victim to could have been avoided if there had been a way to rein in the “Year in Review” notifications. That is the last sore point: the lack of empathy of engineers who did not plan for a “worst-case scenario,” says the writer and developer. “We can do better – I am very grateful that, despite his grief, he took the time to write this blog post,” acknowledged the head of the application, Jonathan Gheller. Eric Meyer suggests that Facebook, for next year, add an option to hide or decline its repeated prompts. Just in case.
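As a purely illustrative sketch of the kind of safeguard Meyer describes (a hypothetical design, not an actual Facebook feature), a prompt that remembers a user’s refusal and stops re-surfacing the card might look like this:

```python
# Hypothetical opt-out gate for the "Year in Review" card: record an
# explicit decline and check it before ever showing the prompt again.

class ReviewPrompt:
    def __init__(self) -> None:
        self.opted_out: set[str] = set()

    def decline(self, user_id: str) -> None:
        # "Hide or decline": remember the choice instead of re-prompting.
        self.opted_out.add(user_id)

    def should_show(self, user_id: str) -> bool:
        return user_id not in self.opted_out

prompt = ReviewPrompt()
prompt.decline("eric")
print(prompt.should_show("eric"))          # False - no repeated prompts
print(prompt.should_show("someone_else"))  # True
```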

