Just Say NO to Facebook

Facebook uploads users’ address books (contacts) without their permission

Really – any business that does business with Facebook is sending a great big message to its customers that it does not care about their privacy. “Like us on Facebook” means “let us rape your privacy.” JUST SAY NO TO FACEBOOK.

Quote

Facebook has admitted to “unintentionally” uploading the address books of 1.5 million users without consent, and says it will delete the collected data and notify those affected.

The discovery follows criticism of Facebook by security experts for a feature that asked new users for their email password as part of the sign-up process. As well as exposing users to potential security breaches, those who provided passwords found that, immediately after their email was verified, the site began “importing” contacts without asking for permission.

Facebook has now admitted it was wrong to do so, and said the upload was inadvertent. “Last month we stopped offering email password verification as an option for people verifying their account when signing up for Facebook for the first time,” the company said.

“When we looked into the steps people were going through to verify their accounts we found that in some cases people’s email contacts were also unintentionally uploaded to Facebook when they created their account,” a spokesperson said. “We estimate that up to 1.5 million people’s email contacts may have been uploaded. These contacts were not shared with anyone and we’re deleting them. We’ve fixed the underlying issue and are notifying people whose contacts were imported. People can also review and manage the contacts they share with Facebook in their settings.”

The issue was first noticed in early April, when the Daily Beast reported on Facebook’s practice of asking for email passwords to verify new users. The feature, which allows Facebook to automatically log in to a webmail account to effectively click the link on an email verification itself, was apparently intended to smooth the workflow for signing up for a new account.

But security experts said the practice was “beyond sketchy”, noting that it gave Facebook access to a large amount of personal data and may have led to users adopting unsafe practices around password confidentiality. The company was “practically fishing for passwords you are not supposed to know”, according to cybersecurity tweeter e-sushi, who first raised concern about the feature, which Facebook says has existed since 2016.

At the time, Facebook insisted it did not store email passwords but said nothing about other information gathered in the process. Shortly after, Business Insider reported that, for users who entered their passwords, Facebook was also harvesting contact details – apparently a hangover from an earlier feature that Facebook had built expressly to take contacts with permission – except in this new implementation, users had not given consent.

The company said those contacts were used as part of its People You May Know feature, as well as to improve ad targeting systems. While it has committed to deleting the uploaded contacts, it is not immediately clear whether it will delete the information it inferred from those uploaded contacts – or even whether it is able to do so. Facebook did not immediately reply to a query from the Guardian.

Facebook Is Stealing Your Family’s Joy

Before you post that baby bump or college acceptance letter online, remember how much fun it used to be to share in person.

My kids have had some good news lately. Academic triumphs, hockey tournament wins, even a little college admissions excitement. They’ve had rough moments too, and bittersweet ones. There have been last games and disappointments and unwashed dishes galore. If you’re a friend, or even somebody who knows my mom and struck up a friendly conversation in line at the grocery store, I’d love to talk to you about any of it. I might even show you pictures.

But I’m not going to post them on social media. Because I tried that for a while, and I came to a simple conclusion about getting the reactions of friends, family and acquaintances via emojis and exclamation points rather than hugs and actual exclamations.

It’s no fun. And I don’t want to do it any more.

I’m not the only one pulling back from social media. While around two-thirds of American adults use Facebook, the way many of us use it has shifted in recent years. About 40 percent of adult users report taking a break from checking Facebook for several weeks or more, and 26 percent tell researchers they’ve deleted the app from their phone at some point in the past year.

Some have changed their behavior because of Facebook’s lax record on protecting user data: More than half of adult users have adjusted their privacy settings in the past year. Others seem more concerned with how the platform makes them act and feel. Either way, pulling back on social media is a way to embrace your family’s privacy.

“I have definitely seen an evolution toward sharing less,” said Julianna Miner, an adjunct professor of global and community health at George Mason University and the author of the forthcoming “Raising a Screen-Smart Kid: Embrace the Good and Avoid the Bad in the Digital Age.” She added, “It’s hard to tell if the changes are a response to the security breaches, or a result of people just getting tired of it.”

Even Mark Zuckerberg, the chief executive of Facebook, seems to suspect it’s at least in part the latter — that after experimenting with living our lives in a larger online sphere for over a decade, many of us are ready to return to the more intimate groups where humans have long thrived. In a recent blog post, Mr. Zuckerberg announced plans to emphasize private conversations and smaller communities on the platform. Interacting on Facebook, he wrote, “will become a fundamentally more private experience” — less “town square,” more “living room.”

That’s a shift I’ve already made for myself, and since doing so, I find myself asking why I embraced my personal soapbox in that online square in the first place. The more I reserve both good news and personal challenges for sharing directly with friends, the more I see that the digital world never offered the same satisfaction or support. Instead, I lost out on moments of seeing friends’ faces light up at joyful news, and frequently found myself wishing that not everyone within my network had been privy to a rant or disappointment.

“There’s plenty of evidence that interpersonal, face-to-face interactions yield a stronger neural response than anything you can do online,” said Ms. Miner. “Online empathy is worth something to us, but not as much. It takes something like six virtual hugs to equal one real hug.”

Time spent seeking those virtual hugs can take us outside the world we’re living in, and draw us back to our phones (which, of course, is the reason many networks offer those bursts of feedback in the first place).

“Ultimately, you’re not just giving social media the time it takes you to post,” said Stacey Steinberg, the associate director of the Center on Children and Families at the University of Florida Levin College of Law and the author of a paper on the topic called “Sharenting: Children’s Privacy in the Age of Social Media.”

“The interaction doesn’t end the minute you press share,” she said. “Some part of your mind is waiting for responses, and that amounts to a small distraction that takes us away from whatever else we would be engaged in.” Once we post that image of our toddler flossing, we’re no longer entirely watching him dance. Some part of us is in the digital realm, waiting to have our delight validated.

That validation can be satisfying, but the emotion is fleeting, like the sugar rush that comes from replacing a real breakfast with a Pop-Tart. Watching your mother’s reaction to the same video, though, brings a different kind of pleasure. “I see parents sharing differently than I did five years ago,” said Ms. Steinberg. “We’re looking for smaller audiences and ways to share just with close friends.”

She also warned that even seemingly innocuous public updates have long shadows. “You could have a child who was a star baseball player and later decides to make a change, still being asked by relative strangers about his batting average,” she said. “Or one who decides on a college, and then changes her mind. Decisions are complex. Lives are complex. Marie Kondo-ing your Facebook page is not so easy.”

There are exceptions. Facebook shines as an arena for professional connection and promotion, of course. For those of us with children who have special needs, it can offer an invaluable community of support. And for the very worst of bad news — for calamities or illnesses or deaths — Facebook can help users speedily share updates, ask for help and share obituaries and memories.

Cal Newport, the author of “Digital Minimalism: Choosing a Focused Life in a Noisy World,” suggests that when we evaluate the ways we use the social media tools available to us, we ask ourselves if those tools are the best ways to achieve our goals. In those cases, the answer is yes.

But for sharing personal moments, for venting, for getting good advice on parenting challenges while feeling supported in our tougher moments? I’ve found that real life, face-to-face, hug-to-hug contact offers more bang for my buck than anything on a screen ever could. Why cheat yourself out of those pleasures for the momentary high of a pile of “likes”?

Recently, I ran into an acquaintance while waiting for my order at a local restaurant. “Congratulations,” she said, warmly. I racked my brain. I’d sold a book that week, but the information wasn’t public. I wasn’t pregnant, didn’t have a new job, had not won the lottery. My takeout ordering skills didn’t really seem worthy of note, and in fact I probably had asked for too much food, as I usually do. I wanted to talk more about this happy news, but what were we talking about? Fortunately, she went on, “Your son must be so thrilled.”

Right. My oldest — admitted to college. He was thrilled, and so were we, and I said so. But how did she know?

My son told her daughter, one of his classmates, and her daughter told her.

Perfect.

Dear tech companies, I don’t want to see pregnancy ads after my child was stillborn

Dear Tech Companies:

I know you knew I was pregnant. It’s my fault, I just couldn’t resist those Instagram hashtags — #30weekspregnant, #babybump. And, silly me! I even clicked once or twice on the maternity-wear ads Facebook served up. What can I say, I am your ideal “engaged” user.

You surely saw my heartfelt thank-you post to all the girlfriends who came to my baby shower, and the sister-in-law who flew in from Arizona for said shower tagging me in her photos. You probably saw me googling “holiday dress maternity plaid” and “babysafe crib paint.” And I bet Amazon.com even told you my due date, Jan. 24, when I created that Prime registry.

But didn’t you also see me googling “braxton hicks vs. preterm labor” and “baby not moving”? Did you not see my three days of social media silence, uncommon for a high-frequency user like me? And then the announcement post with keywords like “heartbroken” and “problem” and “stillborn” and the 200 teardrop emoticons from my friends? Is that not something you could track?

You see, there are 24,000 stillbirths in the United States every year, and millions more among your worldwide users. And let me tell you what social media is like when you finally come home from the hospital with the emptiest arms in the world, after you and your husband have spent days sobbing in bed, and you pick up your phone for a few minutes of distraction before the next wail. It’s exactly, crushingly, the same as it was when your baby was still alive. A Pea in the Pod. Motherhood Maternity. Latched Mama. Every damn Etsy tchotchke I was considering for the nursery.

And when we millions of brokenhearted people helpfully click “I don’t want to see this ad,” and even answer your “Why?” with the cruel-but-true “It’s not relevant to me,” do you know what your algorithm decides, Tech Companies? It decides you’ve given birth, assumes a happy result and deluges you with ads for the best nursing bras (I have cabbage leaves on my breasts because that is the best medical science has to offer to turn off your milk), DVDs about getting your baby to sleep through the night (I would give anything to have heard him cry at all), and the best strollers to grow with your baby (mine will forever be 4 pounds 1 ounce).

And then, after all that, Experian swoops in with the lowest tracking blow of them all: a spam email encouraging me to “finish registering your baby” with them (I never “started,” but sure) to track his credit throughout the life he will never lead.

Please, Tech Companies, I implore you: If your algorithms are smart enough to realize that I was pregnant, or that I’ve given birth, then surely they can be smart enough to realize that my baby died, and advertise to me accordingly — or maybe, just maybe, not at all.

Regards,

Gillian

Addendum:

Rob Goldman, VP of advertising at Facebook, responded to an earlier version of my letter, saying:

“I am so sorry for your loss and your painful experience with our products. We have a setting available that can block ads about some topics people may find painful – including parenting. It still needs improvement, but please know that we’re working on it & welcome your feedback.”

In fact, I knew there was a way to change my Facebook ad settings and attempted to find it a few days ago, without success. Anyone who has experienced the blur, panic and confusion of grief can understand why. I’ve also been deluged with deeply personal messages from others who have experienced stillbirth, infant death and miscarriage who felt the same way I do. We never asked for the pregnancy or parenting ads to be turned on; these tech companies triggered that on their own, based on information we shared. So what I’m asking is that there be similar triggers to turn this stuff off on its own, based on information we’ve shared.

But for anyone who wants to turn off parenting ads on Facebook, it’s under: Settings>Ad Preferences>Hide ad topics>Parenting.

I have a better idea – just say no to Facebook and related “look at me, look at me” social media posts, and get on with growing up and living your own life.

Hundreds of millions of Facebook records exposed on public servers – report

Wait, wait – I thought old Zuck said he was changing things. I guess his users (including business users) are really the Zuckers.

Note to Businesses: DROP FACEBOOK – it will hurt you in the long run.
Note to Users: Time to seek addiction counseling, because if you still use Facebook in spite of all the news, you are either mentally challenged or have a serious addiction problem (or are simply too apathetic to give a damn).

Editorial — actually, my ire is not with the users; it is with the businesses that still patronize Facebook. Customers of these businesses really need to ask: “Why is this business still using Facebook?” The answer is clear — they also want your private data and prefer to track and monetize you rather than protect your privacy. And yes, their bedfellows include media like the Washington Post, the New York Times, The Guardian and Bloomberg, all of which we often quote here. Shame on them and all the others. If companies left Facebook, this menace would be history. But that will not happen, as they see $$$$$$$$$$$.

What to do? Simple: 1) delete your Facebook account, and 2) contact those businesses and urge them to drop Facebook.

Quote

Material discovered on Amazon cloud servers in latest example of Facebook letting third parties extract user data

More than 540m Facebook records were left exposed on public internet servers, cybersecurity researchers said on Wednesday, in just the latest security black eye for the company.

Researchers for the firm UpGuard discovered two separate sets of Facebook user data on public Amazon cloud servers, the company detailed in a blogpost.

One dataset, linked to the Mexican media company Cultura Colectiva, contained more than 540m records, including comments, likes, reactions, account names, Facebook IDs and more. The other set, linked to a defunct Facebook app called At the Pool, was significantly smaller, but contained plaintext passwords for 22,000 users.

The large dataset was secured on Wednesday after Bloomberg, which first reported the leak (see article here), contacted Facebook. The smaller dataset was taken offline during UpGuard’s investigation.

The data exposure is not the result of a breach of Facebook’s systems. Rather, it is another example, akin to the Cambridge Analytica case, of Facebook allowing third parties to extract large amounts of user data without controls on how that data is then used or secured.

“The data exposed in each of these sets would not exist without Facebook, yet these data sets are no longer under Facebook’s control,” the UpGuard researchers wrote in their blogpost. “In each case, the Facebook platform facilitated the collection of data about individuals and its transfer to third parties, who became responsible for its security.”

Facebook said that it was investigating the incident and did not yet know the nature of the data, how it was collected or why it was stored on public servers. The company said it will inform users if it finds evidence that the data was misused.

“Facebook’s policies prohibit storing Facebook information in a public database,” a spokeswoman said in a statement. “Once alerted to the issue, we worked with Amazon to take down the databases. We are committed to working with the developers on our platform to protect people’s data.”

Cultura Colectiva did not immediately respond to a request for comment.

The data exposure is just the latest example of how Facebook’s efforts to be perceived as a “privacy-focused” platform are hampered by its own past practices and what UpGuard researchers called “the long tail” of user data. For years, Facebook allowed third-party app developers substantial access to users’ information.

“As these exposures show, the data genie cannot be put back in the bottle,” the UpGuard researchers wrote. “Data about Facebook users has been spread far beyond the bounds of what Facebook can control today.”

Facebook’s Data Deals Are Under Criminal Investigation

Throw the book at ’em, and wind down this house of despicable spies and greedy exploiters of their (arguably gullible) flock.

Quote

Federal prosecutors are conducting a criminal investigation into data deals Facebook struck with some of the world’s largest technology companies, intensifying scrutiny of the social media giant’s business practices as it seeks to rebound from a year of scandal and setbacks.

A grand jury in New York has subpoenaed records from at least two prominent makers of smartphones and other devices, according to two people who were familiar with the requests and who insisted on anonymity to discuss confidential legal matters. Both companies had entered into partnerships with Facebook, gaining broad access to the personal information of hundreds of millions of its users.

The companies were among more than 150, including Amazon, Apple, Microsoft and Sony, that had cut sharing deals with the world’s dominant social media platform. The agreements, previously reported in The New York Times, let the companies see users’ friends, contact information and other data, sometimes without consent. Facebook has phased out most of the partnerships over the past two years.

Yep, no surprise here. The invasion of privacy extends much further, to the oligopolists and, in many cases, outright monopolies among the mobile phone carriers, ISPs and beyond. When will the U.S. get serious about antitrust enforcement in the tech industry?

“We are cooperating with investigators and take those probes seriously,” a Facebook spokesman said in a statement. “We’ve provided public testimony, answered questions and pledged that we will continue to do so.”

It is not clear when the grand jury inquiry, overseen by prosecutors with the United States attorney’s office for the Eastern District of New York, began or exactly what it is focusing on. Facebook was already facing scrutiny by the Federal Trade Commission and the Securities and Exchange Commission. And the Justice Department’s securities fraud unit began investigating it after reports that Cambridge Analytica, a political consulting firm, had improperly obtained the Facebook data of 87 million people and used it to build tools that helped President Trump’s election campaign.

The Justice Department and the Eastern District declined to comment for this article.

The Cambridge investigation, still active, is being run by prosecutors from the Northern District of California. One former Cambridge employee said investigators questioned him as recently as late February. He and three other witnesses in the case, speaking on the condition of anonymity so they would not anger prosecutors, said a significant line of inquiry involved Facebook’s claims that it was misled by Cambridge.

In public statements, Facebook executives had said that Cambridge told the company it was gathering data only for academic purposes. But the fine print accompanying a quiz app that collected the information said it could also be used commercially. Selling user data would have violated Facebook’s rules at the time, yet the social network does not appear to have regularly checked that apps were complying. Facebook deleted the quiz app in December 2015.

The disclosures about Cambridge last year thrust Facebook into the worst crisis of its history. Then came news reports last June and December that Facebook had given business partners — including makers of smartphones, tablets and other devices — deep access to users’ personal information, letting some companies effectively override users’ privacy settings.

The sharing deals empowered Microsoft’s Bing search engine to map out the friends of virtually all Facebook users without their explicit consent, and allowed Amazon to obtain users’ names and contact information through their friends. Apple was able to hide from Facebook users all indicators that its devices were even asking for data.

Privacy advocates said the partnerships seemed to violate a 2011 consent agreement between Facebook and the F.T.C., stemming from allegations that the company had shared data in ways that deceived consumers. The deals also appeared to contradict statements by Mark Zuckerberg and other executives that Facebook had clamped down several years ago on sharing the data of users’ friends with outside developers.

F.T.C. officials, who spent the past year investigating whether Facebook violated the 2011 agreement, are now weighing the sharing deals as they negotiate a possible multibillion-dollar fine. That would be the largest such penalty ever imposed by the trade regulator.

Facebook has aggressively defended the partnerships, saying they were permitted under a provision in the F.T.C. agreement that covered service providers — companies that acted as extensions of the social network.

The company has taken steps in the past year to tackle data misuse and misinformation. Last week, Mr. Zuckerberg unveiled a plan that would begin to pivot Facebook away from being a platform for public sharing and put more emphasis on private communications.

Over a Dozen Children’s and Consumer Advocacy Organizations Request Federal Trade Commission to Investigate Facebook for Deceptive Practices

It is not just me tilting at windmills, as some have suggested. The Facebook and related social media threats are real – especially to our children.

Contact:
David Monahan, CCFC: david@commercialfreechildhood.org; (617) 896-9397
Lisa Cohen, Common Sense: lcohen@commonsense.org; (310) 395-2544

Over a Dozen Children’s and Consumer Advocacy Organizations Request Federal Trade Commission to Investigate Facebook for Deceptive Practices

SAN FRANCISCO, CA — February 21, 2019 — Earlier today, Common Sense Media, Campaign for a Commercial-Free Childhood, Center for Digital Democracy, and over a dozen organizations called upon the Federal Trade Commission (FTC) to investigate whether Facebook has engaged in unfair or deceptive practices in violation of Section 5 of the Federal Trade Commission Act and the Children’s Online Privacy Protection Act (COPPA).

“Facebook’s practice of ‘friendly fraud’ and referring to kids as ‘whales’ shows an ongoing pattern of the company putting profits over people. Kids, under any circumstances, should not be the target of irresponsible and unethical marketing tactics,” said Jim Steyer, CEO of Common Sense Media. “Facebook has a moral obligation to change its culture toward practices that foster the well-being of kids and families, and the FTC should ensure Facebook is acting responsibly.”

The FTC complaint is in response to unsealed documents from a 2012 class action lawsuit that Facebook settled in 2016. Upon a Freedom of Information Act request filed by the Center for Investigative Reporting, internal documents at Facebook revealed the company knowingly duped children into making in-game purchases and made refunds almost impossible to obtain. Facebook employees called the practice “friendly fraud” and referred to kids who spent large amounts of money as “whales,” a casino-industry term for super high rollers.

Advocates are concerned that Facebook employed unfair practices by charging children for purchases made without parental consent and often without parental awareness. According to Section 5 of the Federal Trade Commission Act, “unfair” practices are defined as those that “cause or [are] likely to cause substantial injury to consumers which is not reasonably avoidable by consumers themselves and not outweighed by countervailing benefits to consumers or to competition” (15 U.S.C. Sec. 45(n)). Advocates point to court documents to demonstrate substantial injury to consumers, including one teenager who incurred $6,500 of charges in just a few weeks, and refund request rates 20 times higher than the usual rate.

“Facebook’s scamming of children is not only unethical and reprehensible – it’s likely a violation of consumer protection laws,” said Josh Golin, Executive Director of Campaign for Commercial-Free Childhood. “Time and time again, we see that Facebook plays by its own rules regardless of the cost to children, families and society. We urge the FTC to hold Facebook accountable.”

Additionally, the complaint asks the FTC to investigate whether Facebook violated COPPA. Unsealed documents show that Facebook was aware that many of the games it offered were popular with children under age 13 and were in fact being played by children under 13. COPPA makes it unlawful for an “operator of a Web site or online service directed to children, or any operator that has actual knowledge that it is collecting or maintaining personal information from a child, to collect personal information from a child” unless it has obtained verifiable parental consent and provided appropriate disclosures.

Advocates are calling for the Commission to recognize the particular vulnerability of young people and investigate whether Facebook is complying with Section 5 and COPPA.

Groups signing on to the complaint include Common Sense Media, Center for Digital Democracy, Campaign for a Commercial-Free Childhood, Consumer Action, Electronic Privacy Information Center, Consumer Federation of America, Children and Screens, Badass Teachers Association, Inc., Media Education Foundation, New Dream, Parents Television Council, Peace Educators Allied for Children Everywhere (P.E.A.C.E.), Parent Coalition for Student Privacy, Public Citizen, Story of Stuff, TRUCE, and Defending the Early Years.

The full complaint can be read here.

It’s time to hold Facebook accountable

From the Campaign for a Commercial-Free Childhood – CCFC educates the public about commercialism’s impact on kids’ well-being and advocates for the end of child-targeted marketing.

Quote

In January, it was revealed that Facebook knowingly defrauded children and their families out of millions of dollars by intentionally misleading children into making in-app purchases. The company referred to children who unintentionally spent thousands of dollars as “whales,” a casino industry term for high-rollers, and refused to refund unauthorized purchases. Not only did the company not refund these unauthorized charges, they encouraged them.

As we wrote at the time, these policies and attitudes toward kids show that Facebook is unfit to make products for children. Now, we’re joining our allies at Common Sense Media, Center for Digital Democracy, and 14 other organizations, asking the FTC to investigate these clearly fraudulent and deceptive practices. Facebook has proven again and again that it will stop at nothing to increase profits, even at the expense of children.

Read our press release here, and the full text of our FTC complaint here.

Zuck’s asleep at the wheel (or ZZZZing in his wallet) – this time, Brexit

Note to Zuckerberg: if you cannot identify and add accountability to your advertisers, then just no! You are the real zucker here.

Britain’s Future has spent £340,000 promoting hard exit – but no one knows who’s funding it

The single biggest known British political advertiser on Facebook is a mysterious pro-Brexit campaign group pushing for a no-deal exit from the EU. The revelation about Britain’s Future, which has never disclosed the source of its funding or organisational structure, has raised concerns about the influence of “dark money” in British politics.

Hmmmm…smells like a wind blowing from the east.

The little-known campaign group has spent more than £340,000 on Facebook adverts backing a hard Brexit since the social network began publishing lists of political advertisers last October, making it a bigger spender than every UK political party and the government combined.

However, there is no information available about who is ultimately paying for the adverts, highlighting a key flaw in Facebook’s new political transparency tools.

The sophisticated campaign includes thousands of individual pro-Brexit adverts, targeted at voters in the constituencies of selected MPs. The adverts urge voters to email their local representative and create the impression of a grassroots uprising for a no-deal Brexit. The MPs then receive emails, signed by a “concerned constituent”, demanding a hard Brexit. The emails do not mention the involvement of an organised campaign group.

Britain’s Future’s public presence contains links to just two individuals: an ex-BBC Three sitcom writer turned journalist, and, indirectly, a former BNP candidate who lives on a farm called Rorke’s Drift in the Yorkshire Dales.

The site’s public face is Tim Dawson, who created the sitcom Coming of Age while still in his teens before going on to contribute to Two Pints of Lager and a Packet of Crisps. In recent years he has stood for election to Manchester city council as a Conservative candidate before last year taking control of Britain’s Future.

Under Facebook’s transparency rules, a representative of Britain’s Future would have been required to provide a valid UK postal address before placing political adverts, but this information was not made public. There are no checks on the ultimate source of any funds.

Facebook said it was only thanks to its new political ad transparency tools, introduced after the EU referendum and soon to be rolled out across the UK, that it was possible to see the extent of political advertising placed by Britain’s Future. There is no equivalent database for Google, Twitter or other online advertisers.

(Good point, Facebook; in all fairness, the same rules need to apply across all social media!)

Dawson’s pro-Brexit campaign group has spent more than a third of a million pounds on targeted Facebook and Instagram adverts in just a few months, including more than £50,000 last week alone, urging voters to email their local MP and tell them to get Britain out of the EU. A further unknown sum has also been spent buying up adverts alongside Google search results related to Brexit, suggesting that the total amount spent by his organisation on online campaigning could be much higher.

Throughout all this, Dawson, who these days makes a living from writing occasional pieces for the Daily Telegraph and the Spiked website, has declined to comment on the source of his funds, other than to tell the BBC that he was “raising small donations from friends and fellow Brexiteers”. There was no answer at his flat in Manchester and he has repeatedly declined to answer questions on how he has access to levels of funding that dwarf many high-profile campaigns.

According to its Facebook page, there are at least five individuals involved in the administration of Britain’s Future, although there are few clues as to who they are. Its “About Us” page contains a map centred on a remote building in the Yorkshire Dales north of Harrogate. This is Rorke’s Drift farm, named after the 1879 battle in South Africa where a small group of British soldiers made a successful last stand against thousands of Zulu warriors, an incident later depicted in the Michael Caine film Zulu.

The farm is home to Colin Banner, a former British National Party candidate. When contacted by the Guardian, he insisted that he had no knowledge of Dawson, was not aware of Britain’s Future, and was not involved in placing the adverts.

In a rare statement, Dawson declined to answer questions on funding or who was behind Britain’s Future. He said it was pure coincidence that his website was pointing to the remote home of a one-time BNP candidate and thanked the Guardian for bringing it to his attention.

“Britain’s Future has never associated with, nor would it ever associate with Colin Banner, or any BNP member. I have never met with, spoken to, or associated with Colin Banner, or any BNP member, nor would I want to. To state otherwise would be untrue.

“Designing the website required selecting a point on the map of the UK. The coordinates were randomly selected so the map of the UK would display centrally on the webpage. It was solely a design decision.

“The purpose of Britain’s Future is to represent the views of 17.4 million people who voted to leave the European Union – regardless of background. This is about delivering on the result of the referendum.”

No law is being broken by Britain’s Future’s campaigning. Outside of an election period, it is legal for any individual or campaign group to pay to promote political material without declaring where the funds come from. Britain’s Future is not a political party and does not appear to have any intention of putting forward candidates in elections, so is not regulated by laws requiring large political donations to be publicly declared.

Even the anti-Brexit People’s Vote campaign for a second referendum, backed with financing from the billionaire George Soros, has spent less on Facebook than Britain’s Future, whose website is essentially a personal blog of arguments for Brexit, with a discreet PayPal button soliciting donations.

Dawson previously stood as the Conservative council candidate in Manchester’s Hulme ward last year and finished a distant sixth. He gave an interview to Country Squire Magazine, explaining that he had recently embraced politics after becoming exasperated with the leftwing bias of the BBC: “There are lots and lots of Conservatives in this country and they deserve to be represented in our cultural landscape.”

Last month, a report from the Department for Digital, Culture, Media and Sport warned that electoral law was out of date and vulnerable to manipulation by hostile forces, and that the need to update it was urgent.

Mark Zuckerberg Says He’ll Shift Focus to Private Sharing

Bullshit!

Facebook’s business model is selling ads and massive sharing of data to profile users. When I go to AccuWeather, for one example, guess who they link to – you guessed it, Facebook. Don’t believe this lowlife lying excuse for a person, i.e. Zuckerberg. Just say no to Facebook, cure your addiction, and get on with your life.

Quote

SAN FRANCISCO — Social networking has long been predicated on people sharing their status updates, photos and messages with the world. Now Mark Zuckerberg, chief executive of Facebook, plans to start shifting people toward private conversations and away from public broadcasting.

Mr. Zuckerberg, who runs Facebook, Instagram, WhatsApp and Messenger, on Wednesday expressed his intentions to change the essential nature of social media. Instead of encouraging public posts, he said he would focus on private and encrypted communications, in which users message mostly smaller groups of people they know. Unlike publicly shared posts that are kept as users’ permanent records, the communications could also be deleted after a certain period of time.

He said Facebook would achieve the shift partly by integrating Instagram, WhatsApp and Messenger so that users worldwide could easily message one another across the networks. In effect, he said, Facebook would change from being a digital town square to creating a type of “digital living room,” where people could expect their discussions to be intimate, ephemeral and secure from outsiders.

“We’re building a foundation for social communication aligned with the direction people increasingly care about: messaging each other privately,” Mr. Zuckerberg said in an interview on Wednesday. In a blog post, he added that as he thought about the future of the internet, “I believe a privacy-focused communications platform will become even more important than today’s open platforms.”

Facebook’s plan — in which the company is playing catch-up to how people are already communicating digitally — raises new questions, not the least of which is whether it can realistically pull off a privacy-focused platform. The Silicon Valley giant, valued at $490 billion, depends on people openly sharing posts to be able to target advertising to them. While the company will not eradicate public sharing, a proliferation of private and secure communications could potentially hurt its business model.

Facebook also faces concerns about what the change means for people’s data and whether it was being anti-competitive by knitting together WhatsApp, Instagram and Messenger, which historically have been separate and operated autonomously.

Mr. Zuckerberg was vague on many details of the shift, including how long it would take to enact and whether that meant Instagram, WhatsApp and Messenger would share user information and other contact details with one another. He did not address how private, encrypted communications would affect Facebook’s bottom line.

But Mr. Zuckerberg did acknowledge the skepticism that Facebook would be able to change. “Frankly we don’t currently have a strong reputation for building privacy protective services, and we’ve historically focused on tools for more open sharing,” he wrote in his blog post. “But we’ve repeatedly shown that we can evolve to build the services that people really want, including in private messaging and stories.”

Facebook’s move is set to redefine how people use social media and how they will connect with one another. That has societal, political and national security implications given the grip that the company’s services have on more than 2.7 billion users around the world. In some countries, Facebook and its other apps are often considered to be the internet.

Mr. Zuckerberg’s decision follows years of scandal for the social network, much of it originating from public sharing of posts. Foreign agents from countries like Russia have used Facebook to publish disinformation, in an attempt to sway elections. Some communities have used Facebook Groups to strengthen ideologies around issues such as anti-vaccination. And firms have harvested the material that people openly shared for all manner of purposes, including targeting advertising and creating voter profiles.

Even WhatsApp, which has long been encrypted, has grappled with the distribution of misinformation through its service, sometimes with deadly consequences.

All of that has put Facebook in the spotlight, which in turn has badly damaged the company’s reputation and created mistrust with users. Regulators have intensified scrutiny of Facebook’s privacy practices, with the Federal Trade Commission considering a multibillion-dollar fine against the company for violating a 2011 privacy consent decree. Last week, the agency said it would create a task force to monitor big tech companies and potential anti-competitive conduct.

Mr. Zuckerberg has repeatedly tried to rid Facebook of toxic content, disinformation and other problems. At one point, he emphasized prioritizing what friends and family shared on Facebook and de-emphasizing content from publishers and brands. He has also said that the company will hire more people to comb through and remove abusive or dangerous posts, and that it is working on artificial intelligence tools to do that job.

But none of those moves addressed the issue of public sharing. And in many ways, consumers were already moving en masse toward more private methods of digital communications.

Snap, the maker of the Snapchat app, has built a young, loyal audience by allowing people to share messages and stories for a finite period of time, for example. Other companies, like the local social networking company Nextdoor, focus on the power of group and community communications. And closed, private messaging services like Signal and Telegram have also become more prominent.

Evan Spiegel, chief executive of Snap, hinted at the problems that Facebook’s News Feed had created last week at a New York Times conference. Because of the way social networks had been constructed for people to publicly share content, he said, “things that are negative actually spread faster and further than things that are positive.” He later added, “You know, I certainly think there’s a lot of opportunity to sort of course-correct here.”

In many ways, Mr. Zuckerberg is now emulating a strategy popularized by Tencent, the Chinese internet company that makes the messaging app WeChat. WeChat has become the de facto portal to the rest of the internet for Chinese citizens because through the app, users can perform a multitude of tasks, like pay for items, communicate with friends and order takeout.

“Facebook is focused on mobile and messaging as the key conduit for people to communicate online, and thereby to communicate with Facebook,” said Ashkan Soltani, an independent privacy and security researcher who was a former chief technologist at the F.T.C. “The chat app essentially becomes your browser.”

Mr. Zuckerberg said that even though he would focus on private and secure conversations, the public forums for communication popularized by Facebook would continue. In addition, WhatsApp, Instagram and Messenger will remain stand-alone apps, even as their underlying messaging infrastructures are woven together, The Times previously reported. The work, which will include adding end-to-end encryption across all the apps, is in the early stages.

Mr. Zuckerberg said this overall shift would ultimately create new opportunities for Facebook.

“We’re thinking about private messaging in a way that we can build the tools to make that better,” he said in the interview. “There’s all kinds of different commerce opportunities, especially in developing countries. There’s more private tools to be built around people’s location. There’s just a whole set of broader utilities we can build that fit this more intimate mode of sharing.”