
Tracking Iowa caucus-goers via their phones

Quote

On Thursday morning, I listened to an interview with the CEO of “a big data intelligence company” called Dstillery; it “demystifies consumers’ online footprints” to target them with ads. The CEO told public radio program Marketplace something astounding: his company had sucked up the mobile device IDs from the phones of Iowa caucus-goers to match them with their online profiles.

Via Marketplace:

“We watched each of the caucus locations for each party and we collected mobile device ID’s,” Dstillery CEO Tom Phillips said. “It’s a combination of data from the phone and data from other digital devices.”

Dstillery found some interesting things about voters. For one, people who loved to grill or work on their lawns overwhelmingly voted for Trump in Iowa, according to Phillips.

….

What really happened is that Dstillery gets information from people’s phones via ad networks. When you open an app or look at a browser page, there’s a very fast auction that happens where different advertisers bid to get to show you an ad. Their bid is based on how valuable they think you are, and to decide that, your phone sends them information about you, including, in many cases, an identifying code (that they’ve built a profile around) and your location information, down to your latitude and longitude.

Yes, for the vast majority of people, ad networks are doing far more information collection about them than the NSA, but they don’t explicitly link it to their names.

So on the night of the Iowa caucus, Dstillery flagged all the auctions that took place on phones in latitudes and longitudes near caucus locations. It wound up spotting 16,000 devices on caucus night, as those people had granted location privileges to the apps or devices that served them ads. It captured those mobile IDs and then looked up the characteristics associated with those IDs in order to make observations about the kind of people that went to Republican caucus locations (young parents) versus Democrat caucus locations. It drilled down farther (e.g., ‘people who like NASCAR voted for Trump and Clinton’) by looking at which candidate won at a particular caucus location….

For most ads you see on web browsers and mobile devices, there is an auction among various programmatic advertising firms for the chance to show you an ad. We are one of those buyers, and we are sent a variety of anonymous data, including what kind of phone you have, what app you are using, what operating system version you’re running, and sometimes – crucially for this study – your latitude and longitude (lat/long).
We identified the caucusing locations prior [to] the Iowa caucus and told our system to be on the lookout for devices that report a lat/long at those locations during the caucus.

So when we received an ad bid request that our system recognized as being at one of the caucus sites, our system flagged that request and captured that device ID so we could use it for this.
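The flagging step described above can be sketched in a few lines. Everything here is a hypothetical stand-in — the coordinates, geofence radius, and bid-request field names are illustrative, not Dstillery’s actual system:

```python
import math

# Hypothetical caucus sites (lat, lon) and geofence radius -- illustrative only.
CAUCUS_SITES = [(41.5868, -93.6250), (42.4963, -96.4049)]
RADIUS_METERS = 150

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in meters."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def flag_bid_request(bid):
    """Return the device ID if the bid request's lat/long falls inside a geofence."""
    lat, lon = bid.get("lat"), bid.get("lon")
    if lat is None or lon is None:
        return None  # user never granted location access -> nothing to capture
    for site_lat, site_lon in CAUCUS_SITES:
        if haversine_m(lat, lon, site_lat, site_lon) <= RADIUS_METERS:
            return bid["device_id"]
    return None
```

Note the early return when no lat/long is present: that is exactly why the advice below about location permissions matters — a phone that never shares coordinates never gets flagged.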

This is roughly equivalent to exit polling for the smartphone age.

Turn off GPS unless you are using it, turn on ad blockers, and use a VPN.

Amazon Quietly Removes Encryption Support from its Gadgets

Quote

While Apple is fighting the FBI in court over encryption, Amazon quietly disabled the option to use encryption to protect data on its Android-powered devices.

The tech giant has recently deprecated support for device encryption on the latest version of Fire OS, Amazon’s custom Android operating system, which powers its tablets and phones. In the past, privacy-minded users could protect data stored inside their devices, such as their emails, by scrambling it with a password, which made it unreadable in case the device got lost or stolen. With this change, users who had encryption on in their Fire devices are left with two bad choices: either decline to install the update, leaving their devices with outdated software, or give up and keep their data unencrypted. …“This is a terrible move as it compromises the safety of Kindle Fire owners by making their data vulnerable to all manner of bad actors, including crackers and repressive governments,” Aral Balkan, a coder, human rights activist, and owner of a Kindle Fire, told Motherboard. “It’s clear with this move that Amazon does not respect the safety of its customers.”

Balkan also highlighted the hypocrisy of Amazon using encryption to protect its copyright with digital rights management or DRM technology.

Some Amazon Fire customers complained about the change in its support forums.

“How can we keep using these devices if we can’t actually secure the large amount of personal data that ends up on them?” asked a user rhetorically.
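What Fire owners are losing can be illustrated with the standard building block behind device encryption: a key stretched from the user’s passphrase, without which the stored data is just noise. This is a generic sketch (the salt size and iteration count are arbitrary choices for illustration), not Fire OS’s actual implementation:

```python
import hashlib
import os

def derive_key(password: str, salt: bytes, iterations: int = 100_000) -> bytes:
    """Stretch a passphrase into a 256-bit key via PBKDF2-HMAC-SHA256."""
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)

# A fresh random salt is stored alongside the encrypted data.
salt = os.urandom(16)
k1 = derive_key("correct horse", salt)
k2 = derive_key("wrong guess", salt)

# Different passphrases yield unrelated keys, so data encrypted under k1
# stays unreadable to a thief who doesn't know the passphrase.
assert k1 != k2
```

Removing the encryption option removes this entire layer: whoever picks up the tablet reads the flash storage directly.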

Hijack Your Wireless Mice to Hack Computers from Afar


Quote

A flaw in the way several popular models of wireless mice and their corresponding receivers, the sticks or “dongles” that plug into a USB port and transmit data between the mouse and the computer, handle encryption could leave “billions” of computers vulnerable to hackers, security firm Bastille warned on Tuesday.

In short, a hacker standing within 100 yards of the victim’s computer and using a $30 long-range radio dongle and a few lines of code could intercept the radio signal between the victim’s mouse and the dongle plugged into the victim’s computer. Then this hacker could replace the signal with her own, and use her own keyboard to control the victim’s computer.

….

For Rouland, these vulnerabilities, which affect non-Bluetooth mice produced by Logitech, Dell, Lenovo and other brands, are a harbinger of the near future of the Internet of Things when both companies and regular consumers will have hackable radio-enabled devices in their offices or homes. It’s worth noting that Bastille specializes in Internet of Things (IoT) security, and sells a product for corporations that promises to “detect and mitigate” threats from IoT devices across all the radio spectrum. That obviously means the firm has a vested interest in highlighting ways companies could get hacked.

This attack in particular, which Bastille has branded with the hashtag-friendly word “MouseJack,” builds on previous research done on hacking wireless keyboards. But in this case, the issue is that manufacturers don’t properly encrypt data transmitted between the mouse and the dongle, according to Bastille’s white paper.
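The underlying flaw is that vulnerable dongles accept mouse and keyboard frames that carry no cryptographic authentication, so a forged keystroke frame is indistinguishable from a genuine one. Here is a minimal sketch of what a receiver could do instead (an illustration with an assumed pairing key and frame layout, not any vendor’s actual protocol):

```python
import hashlib
import hmac
import os

PAIRING_KEY = os.urandom(16)  # shared secret established at pairing time (illustrative)

def make_frame(payload: bytes, counter: int, key: bytes = PAIRING_KEY) -> bytes:
    """Mouse side: append a truncated HMAC tag binding payload + counter to the key."""
    msg = counter.to_bytes(4, "big") + payload
    tag = hmac.new(key, msg, hashlib.sha256).digest()[:8]
    return msg + tag

def accept_frame(frame: bytes, key: bytes = PAIRING_KEY) -> bool:
    """Dongle side: recompute the tag and drop any frame that doesn't verify."""
    msg, tag = frame[:-8], frame[-8:]
    expected = hmac.new(key, msg, hashlib.sha256).digest()[:8]
    return hmac.compare_digest(tag, expected)

genuine = make_frame(b"mouse-move dx=3 dy=-1", counter=42)
forged = b"\x00\x00\x00\x2a" + b"attacker keystrokes" + b"\x00" * 8
assert accept_frame(genuine) and not accept_frame(forged)
```

The counter also blocks simple replay of a captured frame; vulnerable dongles, per Bastille’s white paper, do neither check for mouse traffic.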

Bill Gates Is Backing the FBI in Its Case Against Apple

Or is he?
Quote

Microsoft co-founder and billionaire philanthropist Bill Gates is backing the Federal Bureau of Investigation in its legal battle against Apple over encryption in an iPhone used by one of the shooters in December’s San Bernardino attacks.

In an interview with the Financial Times published late Monday night, Gates dismissed the idea that granting the FBI access would set a meaningful legal precedent, arguing that the FBI is “not asking for some general thing, [it is] asking for a particular case.”

Gates goes on:

“It is no different than [the question of] should anybody ever have been able to tell the phone company to get information, should anybody be able to get at bank records. Let’s say the bank had tied a ribbon round the disk drive and said ‘don’t make me cut this ribbon, because you’ll make me cut it many times.’”

….

[BUT] In an interview with Bloomberg’s TV network this morning, Gates takes issue with the FT story, but it’s not entirely clear whether he is walking back his comments, or simply doesn’t like the headline and other packaging around them. After a Bloomberg anchor suggests that Gates was “blindsided” by the FT headline, Gates says the following:

“I was disappointed, because that doesn’t state my view on this. I do believe that with the right safeguards, there are cases where the government, on our behalf — like stopping terrorism, which could get worse in the future — that that is valuable. But striking that balance — clearly the government [has] taken information, historically, and used it in ways that we didn’t expect, going all the way back, say to the FBI under J. Edgar Hoover. So I’m hoping now we can have the discussion. I do believe there are sets of safeguards where the government shouldn’t have to be completely blind.”

And in a response to a follow-up question about the specifics of the FBI/Apple dispute, Gates offered this: “The courts are going to decide this. … In the meantime, that gives us this opportunity to get the discussion. And these issues will be decided in Congress.”

I never trust anything Bill Gates says given his legacy.

The former head of the NSA has a surprising stance on Apple’s battle with the FBI

Quote

Apple has found an unlikely ally in its fight against iPhone backdoors: the former head of the office responsible for spying.

Michael Hayden, who at different times was the head of the NSA and CIA, told USA Today’s Susan Page that he’s against legislation that would require tech companies to create so-called “backdoors” that would make it easier for law enforcement to access devices like smartphones and computers.


This is Why People Fear the ‘Internet of Things’


Quote

Imagine buying an internet-enabled surveillance camera, network attached storage device, or home automation gizmo, only to find that it secretly and constantly phones home to a vast peer-to-peer (P2P) network run by the Chinese manufacturer of the hardware. Now imagine that the geek gear you bought doesn’t actually let you block this P2P communication without some serious networking expertise or hardware surgery that few users would attempt.

This is the nightmare “Internet of Things” (IoT) scenario for any system administrator: The IP cameras that you bought to secure your physical space suddenly turn into a vast cloud network designed to share your pictures and videos far and wide. The best part? It’s all plug-and-play, no configuration necessary!

I first became aware of this bizarre experiment in how not to do IoT last week when a reader sent a link to a lengthy discussion thread on the support forum for Foscam, a Chinese firm that makes and sells security cameras. The thread was started by a Foscam user who noticed his IP camera was noisily and incessantly calling out to more than a dozen online hosts in almost as many countries.

Turns out, this Foscam camera was one of several newer models the company makes that come with peer-to-peer networking capabilities baked in. This fact is not exactly spelled out for the user (although some of the models listed do say “P2P” in the product name, others do not).

But the bigger issue with these P2P-based cameras is that while the user interface for the camera has a setting to disable P2P traffic (it is enabled by default), Foscam admits that disabling the P2P option doesn’t actually do anything to stop the device from seeking out other P2P hosts online (see screenshot below).

This is a concern because the P2P function built into Foscam P2P cameras is designed to punch through firewalls and can’t be switched off without applying a firmware update plus an additional patch that the company only released after repeated pleas from users on its support forum.
Yeah, this setting doesn’t work. P2P is still enabled even after you uncheck the box.

One of the many hosts that Foscam users reported seeing in their firewall logs was iotcplatform.com, a domain registered to Chinese communications firm ThroughTek Co., Ltd. Turns out, this domain has shown up in firewall logs for a number of other curious tinkerers who cared to take a closer look at what their network attached storage and home automation toys were doing on their network.
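If you want to check your own gear, a few lines over an exported firewall or router log will do. The domain list and log format below are illustrative — substitute whatever your firewall actually emits:

```python
# Hosts worth flagging; extend with whatever call-home domains you care about.
SUSPECT_HOSTS = {"iotcplatform.com"}

def flag_lines(log_lines):
    """Yield (line_number, line) for every log entry mentioning a suspect host."""
    for n, line in enumerate(log_lines, start=1):
        if any(host in line for host in SUSPECT_HOSTS):
            yield n, line

# Illustrative log excerpt, not a real export.
sample = [
    "OUT 192.168.1.23 -> 54.0.0.9:32100 (iotcplatform.com)",
    "OUT 192.168.1.10 -> 93.184.216.34:443 (example.com)",
]
hits = list(flag_lines(sample))
assert hits == [(1, sample[0])]
```

For logs that record only IP addresses, you would first resolve or reverse-map them; matching on the raw domain string only works if your firewall logs hostnames.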

In January 2015, a contributing writer for the threat-tracking SANS Internet Storm Center wrote in IoT: The Rise of the Machines that he found the same iotcplatform.com domain called out in network traffic generated by a Maginon SmartPlug he’d purchased (smart plugs are power receptacles into which you plug lights and other appliances you may wish to control remotely).

….

“The details about how P2P feature works which will be helpful for you understand why the camera need communicate with P2P servers,” Qu explained. “Our company deploy many servers in some regions of global world.” Qu further explained:

1. When the camera is powered on and connected to the internet, the camera will log in our main P2P server with fastest response and get the IP address of other server with low load and log in it. Then the camera will not connect the main P2P server.

2. When log in the camera via P2P with Foscam App, the app will also log in our main P2P server with fastest response and get the IP address of the server the camera connect to.

3. The App will ask the server create an independent tunnel between the app and the camera. The data and video will transfers directly between them and will not pass through the server. If the server fail to create the tunnel, the data and video will be forwarded by the server and all of them are encrypted.

4. Finally the camera will keep hearbeat connection with our P2P server in order to check the connection status with the servers so that the app can visit the camera directly via the server. Only when the camera power off/on or change another network, it will replicate the steps above.”

As I noted in a recent column, “IoT Reality: Smart Devices, Dumb Defaults,” the problem with so many IoT devices is not necessarily that they’re ill-conceived, it’s that their default settings often ignore security and/or privacy concerns. I’m baffled as to why such a well-known brand as Foscam would enable P2P communications on a product that is primarily used to monitor and secure homes and offices.

Apparently I’m not alone in my bafflement. Nicholas Weaver, a senior researcher in networking and security for the International Computer Science Institute (ICSI), called the embedded P2P feature “an insanely bad idea” all around.

“It opens up all Foscam users not only to attacks on their cameras themselves (which may be very sensitive), but an exploit of the camera also enables further intrusions into the home network,” Weaver said.

Windows 10 forced update KB 3135173 changes browser and other defaults

Quote

“The cumulative update not only knocks out PCs’ default settings, it prevents users from resetting them”

If you have Chrome as the default browser on your Windows 10 computer, you’d better check to make sure Microsoft didn’t hijack it last week and set Edge as your new default. The same goes for any PDF viewer: A forced cumulative update also reset PDF viewing to Edge on many PCs.

Do you use IrfanView, Acdsee, Photoshop Express, or Elements? The default photo app may have been reset to — you guessed it — the Windows Photos app. Music? Video? Microsoft may have swooped down and changed you over to Microsoft Party apps, all in the course of last week’s forced cumulative update KB 3135173.
….
How many times does this have to happen before Microsoft separates security and non-security patches, and gives us tools to block or delay patches? As long as Microsoft’s patching bugs are relatively minor, there’s little incentive to give us the tools we need. The day we get a really bad, crippling patch, there’ll be tar and feathers.

Better IDEA: Just say NO to Windows 10.

Why the FBI’s request to Apple will affect civil rights for a generation


“No legal case applies in a vacuum, and in this case the FBI needs the precedent more than the evidence.”

Before posting the full article, I want to state I fully support Apple in this matter. As a security professional, I agree with the author, Rich Mogull.


“What matters is if we have a right to the security and privacy of our devices, and of our communications, which are also under assault. If we have the right to tools to defend ourselves from the government and criminals alike. Yes, these tools will be sometimes used for the worst of crimes, but they’re also fundamental to our civil rights, freedom of discourse, and our ability to protect our digital lives from the less impactful, but far more frequent criminal attacks.”


Quote

On Tuesday, the United States District Court of California issued an order requiring Apple to assist the FBI in accessing a locked iPhone (PDF)—and not just any iPhone, but the iPhone 5c used by one of the San Bernardino shooters. The order is very clear: Build new firmware to enable the FBI to perform an unlimited, high speed brute force attack, and place that firmware on the device.

Apple is not only fighting the request, but posted a public letter signed by Tim Cook and linked on Apple’s front page.

Make no mistake: This is unprecedented, and the situation was deliberately engineered by the FBI and Department of Justice to force a showdown that could define the limits of our civil rights for generations to come. This is an issue with far-reaching implications well beyond a single phone, a single case, or even Apple itself.

As a career security professional, I find this case has chilling implications.

Why now?

I’ve been writing about Apple’s role in our digital civil rights since 2014, and specifically addressed why Apple is at the center of the battle over encryption last month on TidBITS. The short version is that Apple is one of the only companies with the technologies, high profile, and business model to both find themselves in the cross hairs, and take a strong position.

Make no mistake, Apple has a long history of complying with court orders and assisting law enforcement. Previous to iOS 8, they could extract data off devices. Even today, data in most of their online services (iCloud, excluding iMessage and FaceTime) can be provided upon legal request.

This case is different for multiple reasons:

  • Apple is being asked to specifically create new software to circumvent their security controls. They aren’t being asked to use existing capabilities, since those no longer work. The FBI wants a new version of the operating system designed to allow the FBI to brute force attack the phone.
  • The FBI is using a highly emotional, nationally infamous terrorism case as justification for the request.
  • The request refers to the All Writs Act, which is itself under scrutiny in a case in New York involving Apple. Federal Magistrate Judge James Orenstein of the Eastern District of New York is currently evaluating if the Act applies in these cases.

That’s why this is about far more than a single phone. Apple does not have the existing capability to assist the FBI. The FBI engineered a case where the perpetrators are already dead, but emotions are charged. And the law cited is under active legal debate within the federal courts.

The crux of the issue is should companies be required to build security circumvention technologies to expose their own customers? Not “assist law enforcement with existing tools,” but “build new tools.”

The FBI Director has been clear that the government wants back doors into our devices, even though the former head of the NSA disagrees and supports strong consumer encryption. One reason Apple is likely fighting this case so publicly is that it is a small legal step from requiring new circumvention technology, to building such access into devices. The FBI wants the precedent far more than they need the evidence, and this particular case is incredibly high profile and emotional.

The results will, without question, establish a precedent beyond one killer’s iPhone.

The technical details

The court order is quite specific. It applies only to one iPhone, and requests Apple create a new version of the firmware that eliminates the existing feature that erases the iPhone after 10 failed attempts at entering the passcode. It further asks Apple to allow passcode attempts to be performed as rapidly as possible.

Apple has been prompting users to choose longer and more complicated—and harder to crack—iPhone passcodes.

Beginning with iOS 8, devices are encrypted using a key derived from your passcode. This is combined with a hardware key specific to the device. Apple has no way of knowing or circumventing that key. On newer devices, the hardware key is embedded in the device and is not recoverable. Thus the passcode must be combined with the device key in a chip on the phone, and that chip rate-limits passcode attempts to make a brute force attack slower.
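In rough terms, that entanglement works like the sketch below. Real devices run the derivation inside hardware with a tuned per-attempt delay; PBKDF2 keyed by an illustrative UID is only a stand-in for that construction, and the UID value here is made up:

```python
import hashlib

DEVICE_UID = bytes.fromhex("00112233445566778899aabbccddeeff")  # illustrative UID

def derive_data_key(passcode: str, device_uid: bytes = DEVICE_UID) -> bytes:
    """Entangle the passcode with a device-bound secret, as iOS does in principle.
    The hardware enforces a per-attempt cost; this PBKDF2 call is a stand-in."""
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), device_uid, 10_000)

# Because the UID never leaves the chip, the same passcode on different
# hardware yields a different key -- the attack cannot be moved off-device.
other_uid = bytes(16)
assert derive_data_key("123456") != derive_data_key("123456", other_uid)
```

This is why the FBI cannot simply image the flash and brute-force the passcode on a datacenter full of machines: every guess must be run through the one chip holding the UID.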

Reading through the order, it seems the FBI thinks that a modified version of the operating system would allow them to engage in high-speed attacks, if the 10-tries limit was removed. The request indicates they likely can’t image the device and perform all the attacks on their own super-fast computers, due to that hardware key. With a four-character passcode the device could probably be cracked in hours. A six-character code might take days or weeks, and anything longer could take months or years.
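Those estimates are easy to sanity-check, assuming roughly 80 ms per on-device attempt (an assumed figure; the real per-attempt cost varies by device model):

```python
# Back-of-the-envelope worst-case brute-force times at ~80 ms per attempt.
SECONDS_PER_ATTEMPT = 0.08  # assumed on-device key-derivation cost

def worst_case_hours(alphabet_size: int, length: int) -> float:
    """Hours to exhaust every passcode of the given alphabet and length."""
    return (alphabet_size ** length) * SECONDS_PER_ATTEMPT / 3600

print(f"4-digit PIN:         {worst_case_hours(10, 4):10.2f} h")  # minutes
print(f"6-digit PIN:         {worst_case_hours(10, 6):10.2f} h")  # about a day
print(f"6-char alphanumeric: {worst_case_hours(36, 6):10.0f} h")  # years
```

The jump from digits to a longer alphanumeric code is the whole story: keyspace grows exponentially, which is exactly why Apple has been nudging users toward longer passcodes.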

Dan Guido over at Trail of Bits posted a great explanation:

As many jailbreakers are familiar, firmware can be loaded via Device Firmware Upgrade (DFU) Mode. Once an iPhone enters DFU mode, it will accept a new firmware image over a USB cable. Before any firmware image is loaded by an iPhone, the device first checks whether the firmware has a valid signature from Apple. This signature check is why the FBI cannot load new software onto an iPhone on their own—the FBI does not have the secret keys that Apple uses to sign firmware.

This opens up a few questions. Could this work on newer devices with the enhanced encryption of the Secure Enclave? How can Apple pair the device and replace the firmware in the first place? Would they be using the shooter’s computer? An over-the-air update? Apple says that all devices (with or without the Secure Enclave) are vulnerable to this kind of attack, but declined to comment on the specific technical methods, a position I initially disagreed with, but on reflection is probably the right move for reasons we will get to in a moment.

Thus the FBI wants a new version of iOS, signed by Apple and installed on the device, that removes limitations on their attempts to brute-force the password.

Why this matters

Legal precedent is like a glacier, slowly building over time until it becomes nigh unstoppable. Major issues like this are first, and sometimes ultimately, decided on a series of small steps that build on each other. It’s the reason the NRA fights any attempts at gun control, since they fear a slow build, not a single small law.

The crux of this round of the encryption debate is if companies should be forced to build tools to circumvent their customers’ security. If the answer is “yes,” it could be a small step to “should they just build these tools into the OS from the start?”

I have no doubt the FBI deliberately chose the highest-profile domestic terrorism case in possibly a decade. We, average citizens, want the FBI to stop this sort of evil. We don’t necessarily see this one case as applying to our lives and our rights. Why the big deal? What if the FBI could find the terrorists’ contacts and stop other attacks?

What matters is if we have a right to the security and privacy of our devices and communications.

But the truth is, no legal case applies in a vacuum. If this goes through, if Apple is forced to assist, it will open a floodgate of law enforcement requests. Then what about civil cases? Opening a phone to support a messy divorce and child custody battle? Or what about requests from other nations, especially places like China and the UAE that already forced BlackBerry and others to compromise the security of their customers?

And once the scale of these requests increases, as a security professional I guarantee the tools will leak, the techniques will be exploited by criminals, and our collective security will decline. It really doesn’t matter if it’s the iPhone 5c or 6s. It really doesn’t matter if this is about dead terrorists or a drug dealer. It doesn’t matter what specific circumvention Apple is being asked to create.

What matters is if we have a right to the security and privacy of our devices, and of our communications, which are also under assault. If we have the right to tools to defend ourselves from the government and criminals alike. Yes, these tools will be sometimes used for the worst of crimes, but they’re also fundamental to our civil rights, freedom of discourse, and our ability to protect our digital lives from the less impactful, but far more frequent criminal attacks.

This situation was engineered by the FBI and Department of Justice for the maximum impact and chances of success. Apple is fighting, and as a security professional it’s my obligation to support their position, and stronger security.

France attacks Facebook data tracking, opening new front in privacy battles

Quote

French data regulators have given Facebook three months to stop transferring data on French users to the US and to refrain from tracking nonusers.

PARIS — In yet another fissure between the US and Europe over digital privacy practices, French regulators ordered Facebook to curtail its online data collection practices.

The country’s data protection authority, known by its French acronym CNIL, ruled this week to give Facebook three months to stop transferring data on French users to the states and to refrain from collecting information about nonusers, or else face hefty fines.

—–
There is an easier solution: just stop using it. These slime balls track you whether you are a user or not. That said, anyone who disrespects their own privacy deserves what they get. Word of the day: insouciant, “marked by blithe unconcern; nonchalant.” And it is not just users of Facebook and other social media; it is what we witness every day in businesses when it comes to their IT security and their employees’ and customers’ privacy.

The courts confirm that French tribunals can try Facebook

Quote (French) / Quote (English)

Paris court rules against Facebook in French nudity case


The Paris appeal court has upheld a ruling that Facebook can be sued under French – not Californian – law.

A French teacher won in the Paris high court last year, arguing that Facebook should not have suspended his account because of an erotic image on his page.

Facebook appealed against that ruling – but the appeal court has now upheld the criticism of Facebook’s user terms.

US-based Facebook says users can only sue in California. It removed a close-up of a nude woman, painted by Courbet.

The teacher, Frederic Durand-Baissas, argued that he had a right to post a link on Facebook with the image of the famous Gustave Courbet painting. The original 19th-Century work hangs in the Musee d’Orsay in Paris.

The teacher accused Facebook of censorship and said the social network should reinstate his account and pay him €20,000 (£15,521; $22,567) in damages. He sued the company in 2011.

It is seen as a test case, potentially paving the way for other lawsuits against Facebook outside US jurisdiction.

Facebook users have to agree to the tech giant’s terms of service, which state that its jurisdiction is California. About 22 million French people are on Facebook.

The Paris high court decided that the company’s argument was “abusive” and violated French consumer law, by making it difficult for people in France to sue.

The Facebook community standards say “we restrict the display of nudity because some audiences within our global community may be sensitive to this type of content – particularly because of their cultural background or age”.

———
Good work, Frederic Durand-Baissas!