From context collapse to content collapse
- Zuckerberg once said: “The days of you having a different image for your work friends or co-workers and for the other people you know are probably coming to an end pretty quickly.” He praised context collapse as a force for moral cleanliness: “Having two identities for yourself is an example of a lack of integrity.” Facebook forces us to be pure.
- Context collapse remains an important conceptual lens, but what’s becoming clear now is that a very different kind of collapse — content collapse — will be the more consequential legacy of social media.
- Content collapse, as I define it, is the tendency of social media to blur traditional distinctions among once distinct types of information — distinctions of form, register, sense, and importance.
TikTok offered details about how its most popular feed works. Experts seem unimpressed.
- Earlier this year, the Intercept obtained internal policy documents that encouraged content moderators to limit videos appearing in the “For You” feed that were deemed “undesirable,” including those featuring people with an “abnormal body shape” and “ugly facial looks.” TikTok also reportedly reached out to some high-profile users of its app to update them about changing rules, and the company censored political speech on its livestreaming feature.
- TikTok also announced this year that it would launch a transparency center in Los Angeles focused on “moderation and data practices.” The company’s recent blog post about the For You algorithm said that experts visiting that center will eventually be able to learn more about how its algorithms operate and review the company’s source code.
What's wrong with WhatsApp
- In those early months, WhatsApp – which hovers neatly between the space of email, Facebook and SMS, allowing text messages, links and photos to be shared between groups – was a prime conduit through which waves of news, memes and mass anxiety travelled.
- The ongoing rise of WhatsApp, and its challenge to both legacy institutions and open social media, poses a profound political question: how do public institutions and discussions retain legitimacy and trust once people are organised into closed and invisible communities?
- The features that would later allow WhatsApp to become a conduit for conspiracy theory and political conflict were ones never integral to SMS, and have more in common with email: the creation of groups and the ability to forward messages.
How to turn off political ads in your Facebook News Feed
- Facebook recently announced that it is now letting users turn off political ads on both their Facebook and Instagram feeds.
- The new move seems to be a compromise to critics who think Facebook shouldn’t let politicians lie in ads (Facebook largely allows this, arguing that moderating politicians’ speech would amount to censorship).
- Facebook has faced sustained criticism since the 2016 US presidential election that the company isn’t doing enough to limit political misinformation on its platform and, as a result, is hurting democracy.
- Facebook said it will be giving users more control over seeing political ads as part of a larger announcement defending how it handles politicians’ controversial posts and its initiative to launch a Voting Information Center that aims to register 4 million users to vote ahead of the 2020 US presidential election.
- Facebook’s new policy to let you ignore political ads is opt-in.
Civil rights organizations want advertisers to dump Facebook
- Amid ongoing protests against police brutality and racism, a new campaign from organizations including the NAACP and the Anti-Defamation League is urging advertisers to pull their spending on Facebook ads for July, emphasizing the platform’s repeated failure to curb hateful and false content.
- The Stop Hate for Profit campaign also calls for moderators affiliated with Facebook to be involved in any online group including more than 150 people and for the company to provide “an internal mechanism to automatically flag content in private groups associated with extremist ideologies for human review.” That comes following extensive reporting documenting how misinformation, including fake news about the coronavirus, can especially thrive in private groups.
CanaryTrap Method Identifies 16 Facebook Apps Guilty of Data Misuse
- Fortunately, a group of academics has created a method that can help identify Facebook app developers who share user data with third parties.
- The problem is that third-party apps with access to personal details of a large number of users have a high potential for misuse, the researchers point out in their white paper.
- The number of high-profile incidents of data misuse by third-party apps on online social networks is bigger than it should be.
- Neither users nor online social networks have any visibility on the use of data stored on the servers of third-party apps.
- We share the email address associated with a Facebook account as a honeytoken by installing a third-party app and then monitor the received emails to detect any unrecognized use of the shared email address.
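The honeytoken idea in the last bullet can be sketched in a few lines. This is a simplified illustration of the detection step only, not the researchers' actual CanaryTrap pipeline; the app names, domains, and the `classify_email` helper are all hypothetical.

```python
# Simplified sketch (assumed, not the paper's implementation): each
# third-party app is given a unique honeytoken email address. Mail that
# later arrives at that address from a sender outside the app's own
# domain suggests the app passed the address to an outside party.

# Hypothetical mapping: honeytoken address -> domains the app itself uses.
HONEYTOKENS = {
    "canary+quizapp@example.com": {"quizapp.example"},
    "canary+gameapp@example.com": {"gameapp.example"},
}

def classify_email(recipient: str, sender: str) -> str:
    """Label an incoming email as 'expected', 'unrecognized', or 'unknown-token'."""
    expected_domains = HONEYTOKENS.get(recipient)
    if expected_domains is None:
        # Mail to an address we never used as a honeytoken.
        return "unknown-token"
    sender_domain = sender.rsplit("@", 1)[-1].lower()
    # Mail from the app's own domain is expected; anything else hints
    # that the honeytoken address was shared with a third party.
    return "expected" if sender_domain in expected_domains else "unrecognized"
```

Because each honeytoken is shared with exactly one app, any “unrecognized” hit points back to a single responsible developer.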
Court Rules Facebook Widgets Can Be Considered Wiretaps
- At least, that’s according to the 9th Circuit Court of Appeals, which issued a short order earlier this week rejecting the company’s plea to reconsider whether it potentially violated multiple federal and state privacy laws by stuffing widgets across the web.
- There are a few reasons that the judge overseeing the case, Edward Davila, decided to throw things out of court, but the one that aged the worst—at least in my opinion—is that anyone who’s creeped out by a Facebook widget can just use something like “incognito” mode, or install an ad blocker: basic practices that we’ve seen time and again mean fuck all to Facebook’s data-driven machine.
- Or put another way, if you’re browsing around a site where there might be a little hidden widget quietly tracking who you are and the actions you take and then sending that data somewhere else, well, that sure does sound a lot like the wiretapping of yesteryear, just put under another name.
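The “wiretapping” analogy above can be made concrete with a toy model. When a browser fetches an embedded third-party widget, the request typically carries the embedding page’s URL (the `Referer` header) plus any cookie the widget’s domain previously set, letting the provider tie page visits to one persistent identity. The record shape below is an illustrative assumption, not Facebook’s actual logging.

```python
# Toy model (assumed, simplified) of what an embedded widget's provider
# can observe from a single widget fetch: which site and page the visitor
# was reading, linked to a persistent cookie identifier.

from typing import Optional
from urllib.parse import urlsplit

def log_widget_hit(referer: str, cookie_id: Optional[str]) -> dict:
    """Return the record a widget provider could store for one page view."""
    parts = urlsplit(referer)
    return {
        "visitor": cookie_id or "anonymous",  # persistent ID if a cookie exists
        "site": parts.netloc,                 # which site embedded the widget
        "page": parts.path,                   # which page was being read
    }

# Two hits carrying the same cookie on unrelated sites link into one profile:
hits = [
    log_widget_hit("https://health-forum.example/support-thread", "uid-123"),
    log_widget_hit("https://news.example/politics/article-42", "uid-123"),
]
```

The point of the sketch is the join key: the cookie, not the site, is what stitches browsing activity across the web into a single record.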
Facebook still won’t take down politicians’ misleading posts, but it’s trying to register 4 million new voters
- The company plans to show this Voting Information Center to 160 million people in total and aims to help 4 million Americans register to vote — that’s twice the number of voters the company says it helped register in the lead-up to the 2016 presidential election, using similar efforts.
- Instead, critics want Facebook to consider a “non-binary” action that would label those posts as misleading or contentious, similar to what Twitter did with Trump’s mail-in ballot statements in May. Facebook did not respond to a request to answer follow-up questions about the voter information center in time for publication.
- Considering Zuckerberg’s long-standing ethos that Facebook should not be an “arbiter of truth” on contentious political speech, his decision not to intervene with world leaders on controversial posts about voting but instead surface more seemingly objective information makes sense.
Tech billionaire Peter Thiel may ditch Trump because he thinks Trump will lose
- Thiel fears President Donald Trump will lose the race, according to a report from The Wall Street Journal.
- Thiel soured on Trump after COVID-19 left tens of millions of Americans unemployed; the billionaire believes that there will be a profound recession when November rolls around, making Trump vulnerable to challenge.
- Thiel was a vocal supporter of the president in 2016, speaking at the Republican National Convention and donating $1.25 million that year to his campaign and other adjacent political groups and causes.
- Thiel, who earned his fortune co-founding PayPal before becoming one of the earliest Facebook investors, has no plans to donate any money to Trump’s campaign this year, the report says.
- Instead of financially supporting Trump in November, Thiel now reportedly plans to focus his money on helping Republicans win Congressional races; apparently Thiel fears for the down-ballot races in the event of a Trump loss.
Facebook's first TikTok clone failed, so it tries for another
- Facebook this week revealed it’s shutting down two of its experimental apps, one of which is its first ever TikTok clone.
- Both were immediately called out for their similarities to other apps — even the positive reviews for Lasso compare it to TikTok. I guess I can’t really judge Facebook too harshly for the naked attempts to swipe other platforms’ gimmicks — it worked, and gloriously so, when Instagram copied Snapchat.
- While Facebook executives have dressed it up as fear for the safety and privacy of its users, Sandberg let slip in an interview what’s likely the real source of hostility: the meteoric rise of TikTok has sliced into Facebook’s supply of young people.