Facebook Crisis: Whistleblower Shares How FB Chose Profit Over User's Mental Health

Moderator: admin1

Ko Chi Kit
Posts: 108
Joined: 2020-07-18, 22:30
Liked: 6 times

Post by Ko Chi Kit »

As capitalism grows stronger, multinational companies try to extract as much profit as possible. An earlier documentary, The Social Dilemma, showed the ironic side of many technology companies: thousands of engineers analyse customer data to devise ways of extending how long users stay on the platform and exposing them to more advertisements for profit.

Yesterday, on CBS's popular news show 60 Minutes, the whistleblower revealed her identity: Frances Haugen, a 37-year-old data scientist from Iowa who has worked at companies including Pinterest and Google.
Haugen said that despite her experience at tech giants like Google, what she saw at Facebook was substantially worse than anything she had ever seen.
What revelations did she make?
In the interview, she revealed that Facebook's algorithm -- the system responsible for choosing what content appears on an individual's News Feed -- was found to provoke anger more easily than other emotions.

Haugen also stated that during the 2020 US Presidential Election, Facebook understood the repercussions its algorithm could cause and, to limit the danger and the spread of hatred, turned on its safety systems.

However, as soon as the election was over, Facebook flipped the switch and the algorithm went back to its toxic self, prioritising growth over safety, which to Haugen felt a lot like a "betrayal of democracy." She added that no one at Facebook is malevolent; rather, their incentives are misaligned.

Haugen stated, "Facebook makes more money when you consume more content ... And the more anger that they get exposed to, the more they interact, the more they consume."

According to Haugen, Facebook has known about this through its own research. However, it also knows that toning the algorithm down to make it safer would lead people to spend less time on the site, click on less, and ultimately generate less money.

From https://www.indiatimes.com/technology/ ... 50852.html, Monit Khanna, updated on Oct 04, 2021: "Facebook Crisis: Whistleblower Shares How FB Chose Profit Over User's Mental Health"

