
Social Media News

Where Countries Are Tinderboxes and Facebook Is a Match

If you have a Facebook account and a shred of decency, you should delete it. If your business is on Facebook, get off and send the message. This company needs to be buried. They are out of control and have serious blood on their hands. Stop supporting them.

“A reconstruction of Sri Lanka’s descent into violence, based on interviews with officials, victims and ordinary users caught up in online anger, found that Facebook’s newsfeed played a central role in nearly every step from rumor to killing. Facebook officials, they say, ignored repeated warnings of the potential for violence, resisting pressure to hire moderators or establish emergency points of contact.”

Quote

False rumors set Buddhist against Muslim in Sri Lanka, the most recent in a global spate of violence fanned by social media.

MEDAMAHANUWARA, Sri Lanka — Past the end of a remote mountain road, down a rutted dirt track, in a concrete house that lacked running water but bristled with smartphones, 13 members of an extended family were glued to Facebook. And they were furious.

A family member, a truck driver, had died after a beating the month before. It was a traffic dispute that had turned violent, the authorities said. But on Facebook, rumors swirled that his assailants were part of a Muslim plot to wipe out the country’s Buddhist majority.

“We don’t want to look at it because it’s so painful,” H.M. Lal, a cousin of the victim, said as family members nodded. “But in our hearts there is a desire for revenge that has built.”

The rumors, they believed, were true. Still, the family, which is Buddhist, did not join in when Sinhalese-language Facebook groups, goaded on by extremists with wide followings on the platform, planned attacks on Muslims, burning a man to death.

But they had shared and could recite the viral Facebook memes constructing an alternate reality of nefarious Muslim plots. Mr. Lal called them “the embers beneath the ashes” of Sinhalese anger.

We came to this house to try to understand the forces of social disruption that have followed Facebook’s rapid expansion in the developing world, whose markets represent the company’s financial future. For months, we had been tracking riots and lynchings around the world linked to misinformation and hate speech on Facebook, which pushes whatever content keeps users on the site longest — a potentially damaging practice in countries with weak institutions.

Time and again, communal hatreds overrun the newsfeed — the primary portal for news and information for many users — unchecked as local media are displaced by Facebook and governments find themselves with little leverage over the company. Some users, energized by hate speech and misinformation, plot real-world attacks.

A reconstruction of Sri Lanka’s descent into violence, based on interviews with officials, victims and ordinary users caught up in online anger, found that Facebook’s newsfeed played a central role in nearly every step from rumor to killing. Facebook officials, they say, ignored repeated warnings of the potential for violence, resisting pressure to hire moderators or establish emergency points of contact.

Read the full article

Facebook severs ties to data brokers

Quote

The Social Network™ all-but-admits its previous legalese for developers was useless

Facebook has outlined a set of changes to its platform that impact developers and data brokers.

Facebook has a program called “Partner Categories” that it tells advertisers will let them “further refine your targeting based on information compiled by … partners, such as offline demographic and behavioural information like homeownership or purchase history.”

The partners Facebook uses are Acxiom, CCC Marketing, Epsilon, Experian, Oracle Data Cloud and Quantium.

Graham Mudd, a Facebook product marketing director, said that using such providers to refine ad targeting “is common industry practice” but that Facebook feels “this step, winding down over the next six months, will help improve people’s privacy on Facebook.”

On its own platform, Facebook has promised new fine print for business-to-business applications, complete with “rigorous policies and terms”. Which kind of admits some of Facebook’s past fine print was floppy. Perhaps floppy enough to let data flow to Cambridge Analytica and beyond?

Also notable is a change that means apps that provide access to lists of a user’s friends will now be reviewed by Facebook.

So there you have it. No real change. They can’t change. Facebook needs to sell data like Starbucks needs to sell coffee. It is their business and you are their product. They will continue to mine and map your information with their third-party partners to create highly targeted ads.

Want it to stop? Delete your Facebook Account now would be a good start.
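To make the “Partner Categories” matching described above concrete, here is a minimal, hypothetical sketch of how a platform’s own profiles can be joined to a broker’s offline records, typically keyed on hashed email addresses. All names, fields and records below are invented for illustration; this is not Facebook’s actual pipeline.

```python
# Hypothetical illustration of data onboarding / audience matching.
# Nothing here is Facebook code; the data and field names are made up.
import hashlib

def hash_email(email: str) -> str:
    """Normalize and hash an email, as matching pipelines commonly do."""
    return hashlib.sha256(email.strip().lower().encode()).hexdigest()

# What the platform already knows about its users.
platform_users = [
    {"user_id": 1, "email": "alice@example.com", "interests": ["gardening"]},
    {"user_id": 2, "email": "bob@example.com", "interests": ["cars"]},
]

# What a data broker supplies: offline attributes keyed by hashed email.
broker_records = {
    hash_email("alice@example.com"): {"homeowner": True, "recent_purchase": "lawn mower"},
}

# Join the two datasets and build an ad audience of homeowners.
audience = [
    u["user_id"]
    for u in platform_users
    if broker_records.get(hash_email(u["email"]), {}).get("homeowner")
]
print(audience)  # -> [1]
```

The point of the sketch is simply that once two datasets share a stable key, “offline” attributes like homeownership or purchase history become just another targeting filter.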

Facebook Inspired Killings

This is an article from October 2017. While I have excerpted it here, I think it is worth a complete read (see Quote). It is an excellent article that I feel shows the complexity and the human cost of Facebook.

Quote

… But while the focus on Russia is understandable, Facebook has been much less vocal about the abuse of its services in other parts of the world, where the stakes can be much higher than an election.
… the ethnic cleansing of Rohingya Muslims, an ethnic minority in Myanmar that has been subjected to brutal violence and mass displacement. Violence against the Rohingya has been fueled, in part, by misinformation and anti-Rohingya propaganda spread on Facebook, which is used as a primary news source by many people in the country. Doctored photos and unfounded rumors have gone viral on Facebook, including many shared by official government and military accounts. …

In Myanmar, the rise in anti-Rohingya sentiment coincided with a huge boom in social media use that was partly attributable to Facebook itself. In 2016, the company partnered with MPT, the state-run telecom company, to give subscribers access to its Free Basics program. Free Basics includes a limited suite of internet services, including Facebook, that can be used without counting toward a cellphone data plan. As a result, the number of Facebook users in Myanmar has skyrocketed to more than 30 million today from 2 million in 2014.

In India, where internet use has also surged in recent years, WhatsApp, the popular Facebook-owned messaging app, has been inundated with rumors, hoaxes and false stories. In May, the Jharkhand region in Eastern India was destabilized by a viral WhatsApp message that falsely claimed that gangs in the area were abducting children. The message incited widespread panic and led to a rash of retaliatory lynchings, in which at least seven people were beaten to death. A local filmmaker, Vinay Purty, told the Hindustan Times that many of the local villagers simply believed the abduction myth was real, since it came from WhatsApp. …

The company has made many attempts to educate users about the dangers of misinformation. In India and Malaysia, it has taken out newspaper ads with tips for spotting false news. In Myanmar, it has partnered with local organizations to distribute printed copies of its community standards, as well as created educational materials to teach citizens about proper online behavior.

But these efforts, as well-intentioned as they may be, have not stopped the violence, and Facebook does not appear to have made them a top priority. The company has no office in Myanmar, and neither Mr. Zuckerberg nor Ms. Sandberg has made any public statements about the Rohingya crisis.

Facebook has argued that the benefits of providing internet access to international users will ultimately outweigh the costs. Adam Mosseri, a Facebook vice president who oversees the News Feed, told a journalism gathering this month, “In the end, I don’t think we as a human race will regret the internet.” Mr. Zuckerberg echoed that sentiment in a 2013 manifesto titled “Is Connectivity a Human Right?,” in which he said that bringing the world’s population online would be “one of the most important things we all do in our lifetimes.”

That optimism may be cold comfort to people in places like South Sudan. Despite being one of the poorest and least-wired countries in the world, with only around 20 percent of its citizens connected to the internet, the African nation has become a hotbed of social media misinformation. As BuzzFeed News has reported, political operatives inside and outside the country have used Facebook posts to spread rumors and incite anger between rival factions, fostering violence that threatens to escalate into a civil war. A United Nations report last year determined that in South Sudan, “social media has been used by partisans on all sides, including some senior government officials, to exaggerate incidents, spread falsehoods and veiled threats, or post outright messages of incitement.”

Peter Thiel Employee Helped Cambridge Analytica Before It Harvested Data

I think this story shows that Facebook’s data mining is just the tip of the iceberg. It will drag in Google and others.

Quote

As a start-up called Cambridge Analytica sought to harvest the Facebook data of tens of millions of Americans in summer 2014, the company received help from at least one employee at Palantir Technologies, a top Silicon Valley contractor to American spy agencies and the Pentagon.

It was a Palantir employee in London, working closely with the data scientists building Cambridge’s psychological profiling technology, who suggested the scientists create their own app — a mobile-phone-based personality quiz — to gain access to Facebook users’ friend networks, according to documents obtained by The New York Times.

Cambridge ultimately took a similar approach. By early summer, the company found a university researcher to harvest data using a personality questionnaire and Facebook app. The researcher scraped private data from over 50 million Facebook users — and Cambridge Analytica went into business selling so-called psychometric profiles of American voters, setting itself on a collision course with regulators and lawmakers in the United States and Britain.

The revelations pulled Palantir — co-founded by the wealthy libertarian Peter Thiel — into the furor surrounding Cambridge, which improperly obtained Facebook data to build analytical tools it deployed on behalf of Donald J. Trump and other Republican candidates in 2016. Mr. Thiel, a supporter of President Trump, serves on the board at Facebook.

The connections between Palantir and Cambridge Analytica were thrust into the spotlight by Mr. Wylie’s testimony on Tuesday. Both companies are linked to tech-driven billionaires who backed Mr. Trump’s campaign: Cambridge is chiefly owned by Robert Mercer, the computer scientist and hedge fund magnate, while Palantir was co-founded in 2003 by Mr. Thiel, who was an initial investor in Facebook.

Google Link?

A former intern at SCL — Sophie Schmidt, the daughter of Eric Schmidt, then Google’s executive chairman — urged the company to link up with Palantir, according to Mr. Wylie’s testimony and a June 2013 email viewed by The Times.

“Ever come across Palantir. Amusingly Eric Schmidt’s daughter was an intern with us and is trying to push us towards them?” one SCL employee wrote to a colleague in the email.

Ms. Schmidt did not respond to requests for comment, nor did a spokesman for Cambridge Analytica.

In an interview this month with The Times, Mr. Wylie said that Palantir employees were eager to learn more about using Facebook data and psychographics. Those discussions continued through spring 2014, according to Mr. Wylie.

Mr. Wylie said that he and Mr. Nix visited Palantir’s London office on Soho Square. One side was set up like a high-security office, Mr. Wylie said, with separate rooms that could be entered only with particular codes. The other side, he said, was like a tech start-up — “weird inspirational quotes and stuff on the wall and free beer, and there’s a Ping-Pong table.”

Mr. Chmieliauskas continued to communicate with Mr. Wylie’s team in 2014, as the Cambridge employees were locked in protracted negotiations with a researcher at Cambridge University, Michal Kosinski, to obtain Facebook data through an app Mr. Kosinski had built. The data was crucial to efficiently scale up Cambridge’s psychometrics products so they could be used in elections and for corporate clients.

“I had left field idea,” Mr. Chmieliauskas wrote in May 2014. “What about replicating the work of the cambridge prof as a mobile app that connects to facebook?” Reproducing the app, Mr. Chmieliauskas wrote, “could be a valuable leverage negotiating with the guy.”

Those negotiations failed. But Mr. Wylie struck gold with another Cambridge researcher, the Russian-American psychologist Aleksandr Kogan, who built his own personality quiz app for Facebook. Over subsequent months, Dr. Kogan’s work helped Cambridge develop psychological profiles of millions of American voters.

One can only hope this will broaden the understanding of what “you are the product” means when it comes to the free services peddled by big tech. Then again…
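For anyone wondering how a simple personality quiz could expose entire friend networks, the sketch below shows the general shape of the pre-2015 mechanism under Graph API v1.0, when an app could request friends_* permissions and read data about a user’s friends as well as the user. It is illustrative only, not the actual Kogan or Cambridge app; the app ID, redirect URI and token are placeholders, and Facebook removed these friend-data permissions with Graph API v2.0, so these calls no longer work.

```python
# Illustrative sketch of quiz-style friend harvesting under Graph API v1.0.
# Not a working exploit: friends_* permissions were removed in v2.0,
# and APP_ID / REDIRECT_URI / the token below are placeholders.
import requests

APP_ID = "YOUR_APP_ID"                               # hypothetical
REDIRECT_URI = "https://quiz.example.com/callback"   # hypothetical

# Step 1: the quiz sends the user to Facebook's OAuth dialog, asking for
# broad scopes that most people click through without reading.
oauth_url = (
    "https://www.facebook.com/dialog/oauth"
    f"?client_id={APP_ID}&redirect_uri={REDIRECT_URI}"
    "&scope=user_likes,friends_likes"  # friends_likes existed only in v1.x
)

# Step 2: with the user's access token, walk their friend list and pull
# each friend's page likes, the raw material for psychometric profiling.
def harvest(user_token: str) -> list:
    friends = requests.get(
        "https://graph.facebook.com/v1.0/me/friends",
        params={"access_token": user_token},
    ).json().get("data", [])

    profiles = []
    for friend in friends:
        likes = requests.get(
            f"https://graph.facebook.com/v1.0/{friend['id']}/likes",
            params={"access_token": user_token},
        ).json().get("data", [])
        profiles.append({"id": friend["id"],
                         "likes": [l.get("name") for l in likes]})
    return profiles  # one quiz-taker can yield data on hundreds of friends
```

The asymmetry is the point: one consenting quiz-taker could expose data on hundreds of non-consenting friends, which is how a comparatively small number of app installs ballooned into the tens of millions of profiles described above.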

Facebook’s Mark Zuckerberg Vows to Bolster Privacy Amid Cambridge Analytica Scandal

Sounds like bullshit to me. And how could he even do this? His business is harvesting and selling his users’ personal data by offering a honeypot of a free service to clueless (and some not-so-clueless) users.

After several days of silence, amid a growing chorus of criticism, Facebook chief executive Mark Zuckerberg publicly addressed the misuse of data belonging to 50 million users of the social network.

“We have a responsibility to protect your data,” Mr. Zuckerberg said Wednesday in a Facebook post, his preferred means of communication, “and if we can’t then we don’t deserve to serve you.”

Wait – you don’t serve your users… they serve you, Zucky.

Read more

WhatsApp co-founder joins call to #DeleteFacebook as fallout intensifies

Quote

Facebook’s troubles entered a fourth day with a rising chorus of people – including the co-founder of WhatsApp – joining the #DeleteFacebook movement as the Federal Trade Commission was reported to be investigating the company’s handling of personal data.

Momentum gathered behind the #DeleteFacebook campaign, with several media outlets publishing guides to permanently deleting your Facebook account. One surprising voice to emerge was that of Brian Acton, the co-founder of WhatsApp, which was bought by Facebook for $19bn in 2014.

Acton, who left WhatsApp in late 2017, posted to Twitter: “It is time. #deletefacebook.”

Meanwhile, in the United States, the FTC will examine whether the social networking site violated a 2011 agreement with the agency over data privacy, after reports that a firm called Global Science Research harvested information relating to 50 million Facebook profiles and provided the data to Cambridge Analytica.

Of course this raises the question: why on earth were you ignorant enough to use Facebook and WhatsApp (after Facebook bought it) in the first place?

Facebook Leak, or OMG – you mean Facebook has my data?

Well, unless you live under a rock, you know that Facebook has been caught once again with its pants down. Let’s see…

LONDON — As the upstart voter-profiling company Cambridge Analytica prepared to wade into the 2014 American midterm elections, it had a problem.

The firm had secured a $15 million investment from Robert Mercer, the wealthy Republican donor, and wooed his political adviser, Stephen K. Bannon, with the promise of tools that could identify the personalities of American voters and influence their behavior. But it did not have the data to make its new products work.

So the firm harvested private information from the Facebook profiles of more than 50 million users without their permission, according to former Cambridge employees, associates and documents, making it one of the largest data leaks in the social network’s history. The breach allowed the company to exploit the private social media activity of a huge swath of the American electorate, developing techniques that underpinned its work on President Trump’s campaign in 2016.


But the full scale of the data leak involving Americans has not been previously disclosed — and Facebook, until now, has not acknowledged it. Interviews with a half-dozen former employees and contractors, and a review of the firm’s emails and documents, have revealed that Cambridge not only relied on the private Facebook data but still possesses most or all of the trove.

Read more

Oh I am so shocked, SHOCKED I Say

And today I learned that Cambridge Analytica has suspended its C.E.O. amid the Facebook data scandal.

Cambridge Analytica, the political data firm with ties to President Trump’s 2016 campaign, suspended its chief executive, Alexander Nix, on Tuesday, amid the furor over the access it gained to private information on more than 50 million Facebook users.

The decision came after a television broadcast in which Mr. Nix was recorded suggesting that the company had used seduction and bribery to entrap politicians and influence foreign elections.

The suspension marked a new low point for the fortunes of Cambridge Analytica and for Mr. Nix, who spent much of the past year making bold claims about the role his outfit played in the election of Mr. Trump. The company, founded by Stephen K. Bannon and Robert Mercer, a wealthy Republican donor who has put at least $15 million into it, offered tools that it claimed could identify the personalities of American voters and influence their behavior.

So-called psychographic modeling techniques, which were built in part with the data harvested from Facebook, underpinned Cambridge Analytica’s work for the Trump campaign in 2016. Mr. Nix once called the practice “our secret sauce,” though some have questioned its effectiveness.

But in recent days, the firm has found itself under increased scrutiny from lawmakers, regulators and prosecutors in the United States and Britain following reports in The New York Times and The Observer of London that the firm had harvested the Facebook data, and that it still had a copy of the information.

Read more

As I said before, anyone who uses Facebook, has an Alexa, owns a smart TV (for the clueless) and so forth really needs to get educated on privacy and IT security. I will copy and post the excellent comment from “Whips” on this article:

Let’s be clear here: Facebook doesn’t steal our data; we give it to them, one Like at a time.

For decades, Europe has had a Data Protection Directive that runs circles around the U.S.’s, such as it is–and it’s about to get even stronger with the GDPR, which will improve user control over our own data.

Instead of Americans spewing moral outrage at the weekly corporate affront (last week Experian, this week Facebook, next week who knows), why not grow up and demand a national approach to data protection?

Facebook needs to be regulated more tightly, or broken up so that no single entity controls all of its data

Quote

The message was clear: The company just wanted negative stories to stop. It didn’t really care how the data was used.

Facebook knows what you look like, your location, who your friends are, your interests, if you’re in a relationship or not, and what other pages you look at on the web. This data allows advertisers to target the more than one billion Facebook visitors a day. It’s no wonder the company has ballooned in size to a $500 billion behemoth in the five years since its I.P.O.

The more data it has on offer, the more value it creates for advertisers. That means it has no incentive to police the collection or use of that data — except when negative press or regulators are involved. Facebook is free to do almost whatever it wants with your personal information, and has no reason to put safeguards in place.

For a few years, Facebook’s developer platform hosted a thriving ecosystem of popular social games. Remember the age of Farmville and Candy Crush? The premise was simple: Users agreed to give game developers access to their data in exchange for free use of addictive games.

…

In one instance, a developer appeared to be using Facebook data to automatically generate profiles of children, without their consent. When I called the company responsible for the app, it claimed that Facebook’s policies on data use were not being violated, but we had no way to confirm whether that was true. Once data passed from the platform to a developer, Facebook had no view of the data or control over it. In other cases, developers asked for permission to get user data that their apps obviously didn’t need — such as a social game asking for all of your photos and messages. People rarely read permissions request forms carefully, so they often authorize access to sensitive information without realizing it.

At a company that was deeply concerned about protecting its users, this situation would have been met with a robust effort to cut off developers who were making questionable use of data. But when I was at Facebook, the typical reaction I recall looked like this: try to put any negative press coverage to bed as quickly as possible, with no sincere efforts to put safeguards in place or to identify and stop abusive developers. When I proposed a deeper audit of developers’ use of Facebook’s data, one executive asked me, “Do you really want to see what you’ll find?”

The message was clear: The company just wanted negative stories to stop. It didn’t really care how the data was used.

This makes for a dangerous mix: a company that reaches most of the country every day and has the most detailed set of personal data ever assembled, but has no incentive to prevent abuse. Facebook needs to be regulated more tightly, or broken up so that no single entity controls all of its data. The company won’t protect us by itself, and nothing less than our democracy is at stake.

Indeed. And users, including businesses, need to get serious about privacy and the damage the likes of Facebook are doing, and flee Facebook and its ilk in droves. Will this happen? I doubt it. As long as it is free, they will come. As the growing popularity of Alexa and other personal assistants that listen in shows, people continue to invite these modern forms of Big Brother into their private lives.

Facebook, Twitter, Skype & the rest of Garbage

I have to love this quote

… when it comes to technology and social media is of course the man behind the entire investigation, Robert Mueller, who doesn’t give one iota of a shit about the swirling social media yelling that passes for modern political debate.

Love it! Social media is supermarket checkout tabloid fodder, not political debate.

Now here’s a shock (not) from the same article

126 million Facebook users, rather than just 10 million, may have seen content produced and circulated by Russian operatives during the 2016 White House race, it emerged on Monday. And Twitter now says 2,752 accounts were run by Russian agents, rather than the 201 figure it previously estimated.

Disconnect from social media and join the real world.

Source: Quote

Facebook and Google promoted false news about Las Vegas

“Social media: The internet version of the supermarket tabloid. Written by the mindless for the mindless.” Unfortunately it gets picked up by mainstream media and is swallowed and regurgitated by a good percentage of the 65% of Americans who get their “news” from social media. The article also points to a failure in the machine learning (AI) algorithms used by Facebook, Google and their ilk.
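As a toy illustration of that algorithmic failure mode (purely hypothetical, not Facebook’s or Google’s actual ranking code), consider a trending ranker whose only signal is engagement velocity. Nothing in the objective asks whether a story is true, so a fast-spreading hoax outranks slower, verified reporting:

```python
# Toy model of an engagement-only "trending" ranker. All numbers invented.
from dataclasses import dataclass

@dataclass
class Story:
    title: str
    shares_last_hour: int
    comments_last_hour: int
    source_verified: bool

def engagement_score(s: Story) -> float:
    # The objective rewards raw engagement; truthfulness never enters into it.
    return s.shares_last_hour + 2.0 * s.comments_last_hour

stories = [
    Story("Unverified 4chan claim naming the 'shooter'", 9000, 4000, False),
    Story("Wire-service report, details still being confirmed", 1200, 300, True),
]

for s in sorted(stories, key=engagement_score, reverse=True):
    print(f"{engagement_score(s):>8.0f}  {s.title}")
# The hoax wins. Any fix has to fold source credibility into the score,
# e.g. multiplying by a penalty such as 0.2 when source_verified is False.
```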

Quote

Facebook and Google promoted false news stories claiming that the shooter who killed more than 50 people in Las Vegas was a Democrat who opposed Donald Trump. The misidentification spread rapidly from dark corners of the internet to mainstream platforms just hours after hundreds were injured at a festival near the Mandalay Bay casino, the latest example of fake news polluting social media amid a breaking news story.

The flow of misinformation on Monday illustrated a particularly grim trend that has increasingly dominated viral online propaganda during US mass shootings – hyper-partisan trolls battling to blame the tragedy on opposing political ideologies. …

Despite the fact that the claims were unproven and coming from non-credible sources, Facebook’s “Safety Check” page, which is supposed to help people connect with loved ones during the crisis, ended up briefly promoting a story that said the shooter had “Trump-hating” views, along with links to a number of other hoaxes and scams, according to screenshots. At the same time, Google users who searched Geary Danley’s name were at one point directed to the 4chan thread filled with false claims.
…
False content can quickly move from social media to legitimate news sources, she added: “People are putting out crap information on purpose … It’s really easy to get shit into the news cycle by being on Twitter.”

A YouTube user also pushed an unsubstantiated rumor that the suspect was a Hillary Clinton supporter.