‘I blame Facebook’: Aaron Sorkin is writing a Social Network sequel for the post-Zuckerberg era
-
Use !friendica@lemmy.ca instead!
-
PieFed
-
Let's fucking go
The Facebook Files made – and provided evidence for – multiple allegations, including that Facebook was well aware of how toxic Instagram was for many teen girls; that Facebook has a "secret elite" list of people for whom Facebook's rules don't apply; that Facebook knew its revised algorithm was fueling rage; and that Facebook didn't do enough to stop anti-vax propaganda during Covid-19. Most damningly of all, The Facebook Files reported that all of these things were well known to senior executives, including Mark Zuckerberg.
It's clear which side Sorkin is taking. "I blame Facebook for January 6," he said last year. "Facebook has been, among other things, tuning its algorithm to promote the most divisive material possible. Because that is what will increase engagement ... There’s supposed to be a constant tension at Facebook between growth and integrity. There isn’t. It’s just growth."
-
Shame on them if they don't highlight the fediverse.
-
Post-Zuckerberg? I'm confused on the eras of Facebook I guess. He's still CEO isn't he? Wouldn't that make the whole history of the company the Zuckerberg era?
-
It’s probably referring to the era after Zuckerberg was an up-and-coming CEO still doing a bunch of new things, when almost all of social media's growth revolved around him. Now we’re in an era where he’s an establishment tech CEO; the stuff he does now is less about innovation and more about blindly chasing profit.
-
Your link is borked. Here's a fixed version: https://www.c-span.org/program/senate-committee/meta-whistleblower-testifies-on-facebook-practices/658354
-
> tuning its algorithm to promote the most divisive material possible. Because that is what will increase engagement
But at the same time, every time I described on Lemmy an experience that doesn't maximize engagement by maximizing conflict, I was downvoted to hell's basement. And that despite two of the three modern social media experience models being aimed at exactly that - the Facebook-like and the Reddit-like ones, as opposed to the Twitter-like one (which is unfortunately vulnerable to bots). I mean, there's less conflict on fucking imageboards, and those were at some point considered among the most toxic places on the interwebs.
(Something-something Usenet-like namespaces instead of communities tied to instances; something-something identities that likewise aren't tied to instances and are cryptographic; something-something subjective moderation (subscribing to moderation authorities of your choice would feel similar to joining a group, and the UI could even show a few combinations of the same namespace with different moderation authorities); something-something a bigger role for client-side moderation (ignoring, in the UI, the people you don't like). Ideally the only things that actually get removed and not propagated to anyone would be stuff like calls for mass murder, stolen credentials, gore, real rape and CP. The "posting to a namespace versus posting to an owned community" dichotomy is important: the latter triggers a "capture the field" reaction in humans.)
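A rough sketch of how that "subjective moderation" idea could look; everything here (Post, ModerationAuthority, visible_posts, the label and key strings) is hypothetical and invented purely for illustration, not any existing protocol:

```python
# A purely hypothetical sketch of "subjective moderation": posts live in global,
# Usenet-like namespaces, and each reader chooses which moderation authorities to trust.
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Post:
    post_id: str      # globally unique, not tied to any instance
    namespace: str    # e.g. "social.films", Usenet-style
    author_key: str   # cryptographic identity of the author
    body: str

@dataclass
class ModerationAuthority:
    name: str
    removed: set = field(default_factory=set)  # post_ids this authority hides

def visible_posts(posts, namespace, authorities, personal_blocklist=frozenset()):
    """One subjective view of a namespace: the same global data filtered through
    the authorities this reader subscribes to, plus their client-side blocklist."""
    hidden = set()
    for authority in authorities:
        hidden |= authority.removed
    return [
        p for p in posts
        if p.namespace == namespace
        and p.post_id not in hidden
        and p.author_key not in personal_blocklist
    ]

# The same namespace seen through two different moderation authorities.
posts = [
    Post("1", "social.films", "key:alice", "Let's fucking go"),
    Post("2", "social.films", "key:bob", "some flamebait"),
]
strict = ModerationAuthority("strict", removed={"2"})
lax = ModerationAuthority("lax")
print([p.post_id for p in visible_posts(posts, "social.films", [strict])])  # ['1']
print([p.post_id for p in visible_posts(posts, "social.films", [lax])])     # ['1', '2']
```

The point of the sketch is that the same namespace yields different views depending on which authorities a reader subscribes to, without anything being deleted globally.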
-
...And under the current model, the egos of mods get crazy big as they see their ~~community~~ army grow bigger and they can shape it how they want. Even stackoverflow suffered and developers left in droves long before LLMs took its place. I do miss the original imageboards though, which used `sage` and where moderation was a community-driven effort.
-
The mod ego problem will exist as long as there's moderation, unfortunately.
It was present on the web even before it was expelled from heaven.
But it's not necessary to remove all moderation; global identifiers for posts, plus many different "moderating projections" of the same collection of data, could be enough to change the climate for most users. It's not moderation itself that really matters; what matters is the ability to dominate, to shut someone's mouth. If the only way to see a post at all is with moderation switched off entirely, then maybe it's just too rude. If it's removed at the instance level on most instances, then maybe it's something really nasty that shouldn't be seen. But if it's visible in some projections and hidden in others, then we've solved this particular problem.
In such a hypothetical system.
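A minimal sketch of that "many projections over the same global data" idea, under the same caveat: GlobalStore, Projection and the label names are all made up for illustration, not any existing system:

```python
# Hypothetical sketch: one globally identified collection of posts. "Hard removal"
# (never stored, never propagated) is reserved for the genuinely nasty categories;
# everything else is only hidden per projection, never deleted globally.
HARD_REMOVE = {"mass-murder-call", "stolen-credentials", "gore", "csam"}

class GlobalStore:
    def __init__(self):
        self.posts = {}  # global post_id -> body

    def publish(self, post_id, body, labels=frozenset()):
        if HARD_REMOVE & set(labels):
            return False          # rejected outright, propagated to no one
        self.posts[post_id] = body
        return True

class Projection:
    """One moderated view over the same global data; it hides posts, never deletes them."""
    def __init__(self, name, hidden=frozenset()):
        self.name = name
        self.hidden = set(hidden)

    def view(self, store):
        return {pid: body for pid, body in store.posts.items() if pid not in self.hidden}

store = GlobalStore()
store.publish("a", "a heated but legal take")
store.publish("b", "a perfectly bland post")

polite = Projection("polite", hidden={"a"})  # one authority hides the heated post
open_view = Projection("open")               # another shows everything

# The same post is hidden in one projection and visible in another,
# so nobody's mouth is shut globally.
print(sorted(polite.view(store)))     # ['b']
print(sorted(open_view.view(store)))  # ['a', 'b']
```

Only the hard-removal categories ever disappear from the store; everything else stays addressable and merely drops out of particular projections.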
-
He should just adapt Careless People for film