
Monthly Archives: February 2016

Hijack Your Wireless Mice to Hack Computers from Afar

wireless keyboard mice hacked

Quote

A flaw in the way several popular models of wireless mice and their corresponding receivers, the sticks or “dongles” that plug into a USB port and transmit data between the mouse and the computer, handle encryption could leave “billions” of computers vulnerable to hackers, security firm Bastille warned on Tuesday.

In short, a hacker standing within 100 yards of the victim’s computer and using a $30 long-range radio dongle and a few lines of code could intercept the radio signal between the victim’s mouse and the dongle plugged into the victim’s computer. Then this hacker could replace the signal with her own, and use her own keyboard to control the victim’s computer.

….

For Rouland, these vulnerabilities, which affect non-Bluetooth mice produced by Logitech, Dell, Lenovo and other brands, are a harbinger of the near future of the Internet of Things when both companies and regular consumers will have hackable radio-enabled devices in their offices or homes. It’s worth noting that Bastille specializes in Internet of Things (IoT) security, and sells a product for corporations that promises to “detect and mitigate” threats from IoT devices across all the radio spectrum. That obviously means the firm has a vested interest in highlighting ways companies could get hacked.

This attack in particular, which Bastille has branded with the hashtag-friendly word “MouseJack,” builds on previous research done on hacking wireless keyboards. But in this case, the issue is that manufacturers don’t properly encrypt data transmitted between the mouse and the dongle, according to Bastille’s white paper.
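To make the reported flaw concrete, here is a minimal, purely illustrative Python sketch of the logic Bastille describes. The packet format, address, and dongle class below are invented for illustration (this is not Bastille’s code or any vendor’s real protocol); the point is that a vulnerable receiver checks only the radio address, so unencrypted, attacker-forged keystroke packets get accepted right alongside legitimate mouse traffic.

```python
# Purely illustrative sketch of the MouseJack class of flaw described above.
# The packet format and dongle logic here are hypothetical -- NOT Bastille's
# code or any vendor's real protocol -- but they capture the bug: the receiver
# trusts unencrypted packets as long as the radio address matches.

from dataclasses import dataclass

@dataclass
class Packet:
    address: bytes      # radio address the dongle is paired with
    device_type: str    # "mouse" or "keyboard"
    encrypted: bool     # whether the payload is encrypted
    payload: str        # mouse movement or keystrokes (plaintext here)

class VulnerableDongle:
    def __init__(self, paired_address: bytes):
        self.paired_address = paired_address

    def accept(self, pkt: Packet) -> bool:
        # Address check only: mouse packets were never encrypted by design,
        # and the flawed firmware applies the same lax rule to keystrokes.
        if pkt.address != self.paired_address:
            return False
        return True  # no check that keyboard payloads are encrypted/authenticated

# The victim's dongle, paired with their wireless mouse.
dongle = VulnerableDongle(paired_address=b"\xa1\xb2\xc3\xd4\xe5")

# Legitimate mouse traffic (unencrypted, as shipped).
print(dongle.accept(Packet(b"\xa1\xb2\xc3\xd4\xe5", "mouse", False, "dx=+3,dy=-1")))

# Attacker within radio range sniffs the address, then injects keystrokes.
spoofed = Packet(b"\xa1\xb2\xc3\xd4\xe5", "keyboard", False, "cmd /c start evil.exe")
print(dongle.accept(spoofed))  # True -- the injected keystrokes are accepted
```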

Bill Gates Is Backing the FBI in Its Case Against Apple

Or is he?
Quote

Microsoft co-founder and billionaire philanthropist Bill Gates is backing the Federal Bureau of Investigation in its legal battle against Apple over encryption in an iPhone used by one of the shooters in December’s San Bernardino attacks.

In an interview with the Financial Times published late Monday night, Gates dismissed the idea that granting the FBI access would set a meaningful legal precedent, arguing that the FBI is “not asking for some general thing, [it is] asking for a particular case.”

Gates goes on:

“It is no different than [the question of] should anybody ever have been able to tell the phone company to get information, should anybody be able to get at bank records. Let’s say the bank had tied a ribbon round the disk drive and said ‘don’t make me cut this ribbon, because you’ll make me cut it many times.’”

….

[BUT] In an interview with Bloomberg’s TV network this morning, Gates takes issue with the FT story, but it’s not entirely clear whether he is walking back his comments or simply doesn’t like the headline and other packaging around them. After a Bloomberg anchor suggests that Gates was “blindsided” by the FT headline, Gates says the following:

“I was disappointed, because that doesn’t state my view on this. I do believe that with the right safeguards, there are cases where the government, on our behalf — like stopping terrorism, which could get worse in the future — that that is valuable. But striking that balance — clearly the government [has] taken information, historically, and used it in ways that we didn’t expect, going all the way back, say to the FBI under J. Edgar Hoover. So I’m hoping now we can have the discussion. I do believe there are sets of safeguards where the government shouldn’t have to be completely blind.”

And in a response to a follow-up question about the specifics of the FBI/Apple dispute, Gates offered this: “The courts are going to decide this. … In the meantime, that gives us this opportunity to get the discussion. And these issues will be decided in Congress.”

I never trust anything Bill Gates says given his legacy.

The former head of the NSA has a surprising stance on Apple’s battle with the FBI

Quote

Apple has found an unlikely ally in its fight against iPhone backdoors: the former head of the office responsible for spying.

Michael Hayden, who at different times was the head of the NSA and CIA, told USA Today’s Susan Page that he’s against legislation that would require tech companies to create so-called “backdoors” that would make it easier for law enforcement to access devices like smartphones and computers.

This is Why People Fear the ‘Internet of Things’

IoT Spy

Quote

Imagine buying an internet-enabled surveillance camera, network attached storage device, or home automation gizmo, only to find that it secretly and constantly phones home to a vast peer-to-peer (P2P) network run by the Chinese manufacturer of the hardware. Now imagine that the geek gear you bought doesn’t actually let you block this P2P communication without some serious networking expertise or hardware surgery that few users would attempt.

This is the nightmare “Internet of Things” (IoT) scenario for any system administrator: The IP cameras that you bought to secure your physical space suddenly turn into a vast cloud network designed to share your pictures and videos far and wide. The best part? It’s all plug-and-play, no configuration necessary!

I first became aware of this bizarre experiment in how not to do IoT last week when a reader sent a link to a lengthy discussion thread on the support forum for Foscam, a Chinese firm that makes and sells security cameras. The thread was started by a Foscam user who noticed his IP camera was noisily and incessantly calling out to more than a dozen online hosts in almost as many countries.

Turns out, this Foscam camera was one of several newer models the company makes that come with peer-to-peer networking capabilities baked in. This fact is not exactly spelled out for the user (although some of the models listed do say “P2P” in the product name, others do not).

But the bigger issue with these P2P-based cameras is that while the user interface for the camera has a setting to disable P2P traffic (it is enabled by default), Foscam admits that disabling the P2P option doesn’t actually do anything to stop the device from seeking out other P2P hosts online (see screenshot below).

This is a concern because the P2P function built into Foscam P2P cameras is designed to punch through firewalls and can’t be switched off without applying a firmware update plus an additional patch that the company only released after repeated pleas from users on its support forum.
(Screenshot caption: “Yeah, this setting doesn’t work. P2P is still enabled even after you uncheck the box.”)

One of the many hosts that Foscam users reported seeing in their firewall logs was iotcplatform.com, a domain registered to Chinese communications firm ThroughTek Co., Ltd. Turns out, this domain has shown up in firewall logs for a number of other curious tinkerers who cared to take a closer look at what their network attached storage and home automation toys were doing on their network.

In January 2015, a contributing writer for the threat-tracking SANS Internet Storm Center wrote in IoT: The Rise of the Machines that he found the same iotcplatform.com domain called out in network traffic generated by a Maginon SmartPlug he’d purchased (smart plugs are power receptacles into which you plug lights and other appliances you may wish to control remotely).
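For the curious tinkerer, a rough way to spot this kind of phone-home behaviour is to watch your own DNS or firewall logs for rendezvous domains such as iotcplatform.com. The sketch below is a generic example, not Foscam’s or SANS’s tooling; the log format and anything on the watch list beyond iotcplatform.com are assumptions you would adapt to your own gear.

```python
# Rough sketch: flag outbound lookups of known IoT P2P rendezvous domains in a
# DNS/firewall log. The log format below is a made-up example -- adapt the
# parsing to whatever your router or firewall actually produces.

import re

WATCHLIST = {
    "iotcplatform.com",   # ThroughTek P2P platform seen in the Foscam thread
    # add other domains you observe your devices calling out to
}

LOG_LINE = re.compile(r"^(?P<ts>\S+ \S+)\s+(?P<src>\S+)\s+query:\s+(?P<domain>\S+)")

def suspicious(lines):
    for line in lines:
        m = LOG_LINE.match(line)
        if not m:
            continue
        domain = m.group("domain").rstrip(".").lower()
        # match the domain itself or any subdomain of a watched domain
        if any(domain == d or domain.endswith("." + d) for d in WATCHLIST):
            yield m.group("ts"), m.group("src"), domain

sample = [
    "2016-02-20 09:14:02 192.168.1.23 query: p2p.iotcplatform.com.",
    "2016-02-20 09:14:05 192.168.1.10 query: www.example.org.",
]

for ts, src, domain in suspicious(sample):
    print(f"{ts}  {src} -> {domain}")
```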

….

“The details about how P2P feature works which will be helpful for you understand why the camera need communicate with P2P servers,” Qu explained. “Our company deploy many servers in some regions of global world.” Qu further explained:

1. When the camera is powered on and connected to the internet, the camera will log in our main P2P server with fastest response and get the IP address of other server with low load and log in it. Then the camera will not connect the main P2P server.

2. When log in the camera via P2P with Foscam App, the app will also log in our main P2P server with fastest response and get the IP address of the server the camera connect to.

3. The App will ask the server create an independent tunnel between the app and the camera. The data and video will transfers directly between them and will not pass through the server. If the server fail to create the tunnel, the data and video will be forwarded by the server and all of them are encrypted.

4. Finally the camera will keep hearbeat connection with our P2P server in order to check the connection status with the servers so that the app can visit the camera directly via the server. Only when the camera power off/on or change another network, it will replicate the steps above.”

As I noted in a recent column, “IoT Reality: Smart Devices, Dumb Defaults,” the problem with so many IoT devices is not necessarily that they’re ill-conceived, it’s that their default settings often ignore security and/or privacy concerns. I’m baffled as to why such a well-known brand as Foscam would enable P2P communications on a product that is primarily used to monitor and secure homes and offices.

Apparently I’m not alone in my bafflement. Nicholas Weaver, a senior researcher in networking and security for the International Computer Science Institute (ICSI), called the embedded P2P feature “an insanely bad idea” all around.

“It opens up all Foscam users not only to attacks on their cameras themselves (which may be very sensitive), but an exploit of the camera also enables further intrusions into the home network,” Weaver said.

Windows 10 forced update KB 3135173 changes browser and other defaults

Quote

“The cumulative update not only knocks out PCs’ default settings, it prevents users from resetting them”

If you have Chrome as the default browser on your Windows 10 computer, you’d better check to make sure Microsoft didn’t hijack it last week and set Edge as your new default. The same goes for any PDF viewer: A forced cumulative update also reset PDF viewing to Edge on many PCs.

Do you use IrfanView, Acdsee, Photoshop Express, or Elements? The default photo app may have been reset to — you guessed it — the Windows Photos app. Music? Video? Microsoft may have swooped down and changed you over to Microsoft Party apps, all in the course of last week’s forced cumulative update KB 3135173.
….
How many times does this have to happen before Microsoft separates security and non-security patches, and gives us tools to block or delay patches? As long as Microsoft’s patching bugs are relatively minor, there’s little incentive to give us the tools we need. The day we get a really bad, crippling patch, there’ll be tar and feathers.
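If you want to verify whether the update quietly switched your default browser, the per-user association for the http protocol is recorded in the registry. A minimal, Windows-only sketch using only the standard library (the ProgId names in the comment are just the common ones):

```python
# Quick check of the current default browser on Windows 10: read the ProgId
# recorded for the "http" protocol in the per-user registry. Typical values:
# "ChromeHTML" (Chrome), "FirefoxURL" (Firefox), "AppX..." strings for Edge.
# Windows-only; read-only, so safe to run.

import winreg

KEY = r"Software\Microsoft\Windows\Shell\Associations\UrlAssociations\http\UserChoice"

with winreg.OpenKey(winreg.HKEY_CURRENT_USER, KEY) as k:
    prog_id, _ = winreg.QueryValueEx(k, "ProgId")

print("Default handler for http:", prog_id)
```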

Better IDEA: Just say NO to Windows 10.

Why the FBI’s request to Apple will affect civil rights for a generation

fbi-cracked-iphone

“No legal case applies in a vacuum, and in this case the FBI needs the precedent more than the evidence.”

Before posting the full article, I want to state that I fully support Apple in this matter. As a security professional, I agree with the author, Rich Mogull.

 

“What matters is if we have a right to the security and privacy of our devices, and of our communications, which are also under assault. If we have the right to tools to defend ourselves from the government and criminals alike. Yes, these tools will be sometimes used for the worst of crimes, but they’re also fundamental to our civil rights, freedom of discourse, and our ability to protect our digital lives from the less impactful, but far more frequent criminal attacks.”

 

Quote

On Tuesday, the United States District Court of California issued an order requiring Apple to assist the FBI in accessing a locked iPhone (PDF)—and not just any iPhone, but the iPhone 5c used by one of the San Bernardino shooters. The order is very clear: Build new firmware to enable the FBI to perform an unlimited, high speed brute force attack, and place that firmware on the device.

Apple is not only fighting the request, but posted a public letter signed by Tim Cook and linked on Apple’s front page.

Make no mistake: This is unprecedented, and the situation was deliberately engineered by the FBI and Department of Justice to force a showdown that could define the limits of our civil rights for generations to come. This is an issue with far-reaching implications well beyond a single phone, a single case, or even Apple itself.

As a career security professional, I find the implications of this case chilling.

Why now?

I’ve been writing about Apple’s role in our digital civil rights since 2014, and specifically addressed why Apple is at the center of the battle over encryption last month on TidBITS. The short version is that Apple is one of the only companies with the technologies, high profile, and business model to both find themselves in the cross hairs, and take a strong position.

Make no mistake, Apple has a long history of complying with court orders and assisting law enforcement. Prior to iOS 8, they could extract data off devices. Even today, data in most of their online services (iCloud, excluding iMessage and FaceTime) can be provided upon legal request.

This case is different for multiple reasons:

  • Apple is being asked to specifically create new software to circumvent their security controls. They aren’t being asked to use existing capabilities, since those no longer work. The FBI wants a new version of the operating system designed to allow the FBI to brute force attack the phone.
  • The FBI is using a highly emotional, nationally infamous terrorism case as justification for the request.
  • The request refers to the All Writs Act, which is itself under scrutiny in a case in New York involving Apple. Federal Magistrate Judge James Orenstein of the Eastern District of New York is currently evaluating whether the Act applies in these cases.

That’s why this is about far more than a single phone. Apple does not have the existing capability to assist the FBI. The FBI engineered a case where the perpetrators are already dead, but emotions are charged. And the law cited is under active legal debate within the federal courts.

The crux of the issue is whether companies should be required to build security circumvention technologies to expose their own customers: not “assist law enforcement with existing tools,” but “build new tools.”

The FBI Director has been clear that the government wants back doors into our devices, even though the former head of the NSA disagrees and supports strong consumer encryption. One reason Apple is likely fighting this case so publicly is that it is a small legal step from requiring new circumvention technology, to building such access into devices. The FBI wants the precedent far more than they need the evidence, and this particular case is incredibly high profile and emotional.

The results will, without question, establish precedent beyond one killer’s iPhone.

The technical details

The court order is quite specific. It applies only to one iPhone, and requests Apple create a new version of the firmware that eliminates the existing feature that erases the iPhone after 10 failed attempts at entering the passcode. It further asks Apple to allow passcode attempts to be performed as rapidly as possible.

Apple has been prompting users to choose longer and more complicated—and harder to crack—iPhone passcodes.

Beginning with iOS 8, devices are encrypted using a key derived from your passcode. This is combined with a hardware key specific to the device. Apple has no way of knowing or circumventing that key. On newer devices, the hardware key is embedded in the device and is not recoverable. Thus the passcode must be combined with the device key in a chip on the phone, and that chip rate-limits passcode attempts to make a brute force attack slower.
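Conceptually, the entanglement works like the sketch below: the data-protection key is derived from the passcode together with a per-device secret that never leaves the silicon, so copying the encrypted storage to a fast computer buys an attacker nothing. This is only an illustration of the principle; Apple’s actual construction differs in its details.

```python
# Illustration only: why the passcode by itself cannot decrypt the data.
# Apple's real construction differs in detail, but the principle is the same:
# the key is derived from the passcode *and* a device-unique secret (the UID
# key) that is fused into the silicon and cannot be read out or copied.

import hashlib, os

DEVICE_UID = os.urandom(32)   # stand-in for the secret burned into the chip

def derive_key(passcode: str, uid: bytes) -> bytes:
    # Slow, salted derivation that entangles the passcode with the device secret.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), uid, 100_000)

key_on_device = derive_key("1234", DEVICE_UID)

# An attacker who copies the encrypted flash but not DEVICE_UID cannot
# reproduce the key offline, no matter how fast their hardware is.
attacker_guess = derive_key("1234", os.urandom(32))   # wrong / unknown UID
print(key_on_device == attacker_guess)                # False
```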

Reading through the order, it seems the FBI thinks that a modified version of the operating system would allow them to engage in high-speed attacks, if the 10-tries limit was removed. The request indicates they likely can’t image the device and perform all the attacks on their own super-fast computers, due to that hardware key. With a four-character passcode the device could probably be cracked in hours. A six-character code might take days or weeks, and anything longer could take months or years.
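The arithmetic behind those estimates is simple once you fix a cost per attempt. The figure usually cited for the on-device key derivation is roughly 80 ms per guess; that number, and the character sets below, are assumptions for the sake of a back-of-the-envelope calculation, not figures from the court order.

```python
# Back-of-the-envelope brute-force times, assuming each on-device attempt costs
# roughly 80 ms (the commonly cited key-derivation delay) and that the ten-try
# erase limit has been removed, as the order demands. Worst-case figures; the
# "hours / days / months" ranges quoted above depend on which character set
# and passcode length you assume.

SECONDS_PER_TRY = 0.08   # assumption, not a figure from the court order

def worst_case_seconds(alphabet_size: int, length: int) -> float:
    return (alphabet_size ** length) * SECONDS_PER_TRY

for label, alphabet, length in [
    ("4-digit PIN",                 10, 4),
    ("6-digit PIN",                 10, 6),
    ("4 chars, lowercase + digits", 36, 4),
    ("6 chars, lowercase + digits", 36, 6),
]:
    secs = worst_case_seconds(alphabet, length)
    print(f"{label:30s} {secs / 3600:10.1f} hours ({secs / 86400:8.1f} days)")
```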

Dan Guido over at Trail of Bits posted a great explanation:

As many jailbreakers are familiar, firmware can be loaded via Device Firmware Upgrade (DFU) Mode. Once an iPhone enters DFU mode, it will accept a new firmware image over a USB cable. Before any firmware image is loaded by an iPhone, the device first checks whether the firmware has a valid signature from Apple. This signature check is why the FBI cannot load new software onto an iPhone on their own—the FBI does not have the secret keys that Apple uses to sign firmware.

This opens up a few questions. Could this work on newer devices with the enhanced encryption of the Secure Enclave? How can Apple pair the device and replace the firmware in the first place? Would they be using the shooter’s computer? An over-the-air update? Apple says that all devices (with or without the Secure Enclave) are vulnerable to this kind of attack, but declined to comment on the specific technical methods, a position I initially disagreed with but which, on reflection, is probably the right move for reasons we will get to in a moment.

Thus the FBI wants a new version of iOS, signed by Apple and installed on the device, that removes limitations on their attempts to brute-force the password.
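The signature check Guido describes is easier to see with a toy model. The sketch below uses a generic Ed25519 signature via the third-party cryptography package; Apple’s real scheme (personalized firmware images verified against Apple’s keys by the boot ROM) is more elaborate, but the consequence is the same: without the vendor’s private signing key, a modified image will not be accepted in DFU mode.

```python
# Generic illustration of the gatekeeping step: the device will only run
# firmware whose signature verifies against the vendor's public key. Apple's
# real scheme is more involved; this just shows why possession of the device
# is not enough -- you also need the vendor's private signing key.
# Requires the third-party "cryptography" package.

from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

vendor_key = Ed25519PrivateKey.generate()          # held only by the vendor
public_key = vendor_key.public_key()               # baked into the device's ROM

official_fw = b"official firmware image"
signature = vendor_key.sign(official_fw)           # done at the factory

def device_will_boot(image: bytes, sig: bytes) -> bool:
    try:
        public_key.verify(sig, image)              # checked before loading in DFU mode
        return True
    except InvalidSignature:
        return False

print(device_will_boot(official_fw, signature))                 # True
print(device_will_boot(b"FBI-modified firmware", signature))    # False -- wrong image
print(device_will_boot(official_fw, b"\x00" * 64))              # False -- bogus signature
```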

Why this matters

Legal precedent is like a glacier, slowly building over time until it becomes nigh unstoppable. Major issues like this are first, and sometimes ultimately, decided on a series of small steps that build on each other. It’s the reason the NRA fights any attempts at gun control, since they fear a slow build, not a single small law.

The crux of this round of the encryption debate is whether companies should be forced to build tools to circumvent their customers’ security. If the answer is “yes,” it could be a small step to “should they just build these tools into the OS from the start?”

I have no doubt the FBI deliberately chose the highest-profile domestic terrorism case in possibly a decade. We, average citizens, want the FBI to stop this sort of evil. We don’t necessarily see this one case as applying to our lives and our rights. Why the big deal? What if the FBI could find the terrorists’ contacts and stop other attacks?

What matters is if we have a right to the security and privacy of our devices and communications.

But the truth is, no legal case applies in a vacuum. If this goes through, if Apple is forced to assist, it will open a floodgate of law enforcement requests. Then what about civil cases? Opening a phone to support a messy divorce and child custody battle? Or what about requests from other nations, especially places like China and the UAE that already forced BlackBerry and others to compromise the security of their customers?

And once the scale of these requests increases, as a security professional I guarantee the tools will leak, the techniques will be exploited by criminals, and our collective security will decline. It really doesn’t matter if it’s the iPhone 5c or 6s. It really doesn’t matter if this is about dead terrorists or a drug dealer. It doesn’t matter what specific circumvention Apple is being asked to create.

What matters is if we have a right to the security and privacy of our devices, and of our communications, which are also under assault. If we have the right to tools to defend ourselves from the government and criminals alike. Yes, these tools will be sometimes used for the worst of crimes, but they’re also fundamental to our civil rights, freedom of discourse, and our ability to protect our digital lives from the less impactful, but far more frequent criminal attacks.

This situation was engineered by the FBI and Department of Justice for the maximum impact and chances of success. Apple is fighting, and as a security professional it’s my obligation to support their position, and stronger security.

France attacks Facebook data tracking, opening new front in privacy battles

facebook big brother
Quote

French data regulators have given Facebook three months to stop transferring data on French users to the US and to refrain from tracking nonusers.

PARIS — In yet another fissure between the US and Europe over digital privacy practices, French regulators ordered Facebook to curtail its online data collection practices.

The country’s data protection authority, known by its French acronym CNIL, ruled this week to give Facebook three months to stop transferring data on French users to the states and to refrain from collecting information about nonusers, or else face hefty fines.

—–
There is an easier solution: just stop using it. These slimeballs track you whether you are a user or not. That said, anyone who disrespects their own privacy deserves what they get. Word of the day: insouciant, “marked by blithe unconcern; nonchalant.” And it is not just users of Facebook and other social media; it is what we witness every day in businesses when it comes to their IT security and their employees’ and customers’ privacy.

La justice confirme que les tribunaux français peuvent juger Facebook (the court confirms that French courts can try Facebook)

Quote (French) / Quote (English)

Paris court rules against Facebook in French nudity case

facebook censorship

The Paris appeal court has upheld a ruling that Facebook can be sued under French – not Californian – law.

A French teacher won in the Paris high court last year, arguing that Facebook should not have suspended his account because of an erotic image on his page.

Facebook appealed against that ruling – but the appeal court has now upheld the criticism of Facebook’s user terms.

US-based Facebook says users can only sue in California. It removed a close-up of a nude woman, painted by Courbet.

The teacher, Frederic Durand-Baissas, argued that he had a right to post a link on Facebook with the image of the famous Gustave Courbet painting. The original 19th-Century work hangs in the Musee d’Orsay in Paris.

The teacher accused Facebook of censorship and said the social network should reinstate his account and pay him €20,000 (£15,521; $22,567) in damages. He sued the company in 2011.

It is seen as a test case, potentially paving the way for other lawsuits against Facebook outside US jurisdiction.

Facebook users have to agree to the tech giant’s terms of service, which state that its jurisdiction is California. About 22 million French people are on Facebook.

The Paris high court decided that the company’s argument was “abusive” and violated French consumer law, by making it difficult for people in France to sue.

The Facebook community standards say “we restrict the display of nudity because some audiences within our global community may be sensitive to this type of content – particularly because of their cultural background or age”.

———
Good work Frederic Durand-Baissas!

‘Error 53’ fury mounts as Apple software update threatens to kill your iPhone 6

iphone bricked

Quote

It’s the message that spells doom and will render your handset worthless if it’s been repaired by a third party. But there’s no warning and no fix

Thousands of iPhone 6 users claim they have been left holding almost worthless phones because Apple’s latest operating system permanently disables the handset if it detects that a repair has been carried out by a non-Apple technician.

Relatively few people outside the tech world are aware of the so-called “error 53” problem, but if it happens to you you’ll know about it. And according to one specialist journalist, it “will kill your iPhone”.

The issue appears to affect handsets where the home button, which has touch ID fingerprint recognition built-in, has been repaired by a “non-official” company or individual. It has also reportedly affected customers whose phone has been damaged but who have been able to carry on using it without the need for a repair.

But the problem only comes to light when the latest version of Apple’s iPhone software, iOS 9, is installed. Indeed, the phone may have been working perfectly for weeks or months since a repair or being damaged.

After installation a growing number of people have watched in horror as their phone, which may well have cost them £500-plus, is rendered useless. Any photos or other data held on the handset is lost – and irretrievable.

Tech experts claim Apple knows all about the problem but has done nothing to warn users that their phone will be “bricked” (ie, rendered as technologically useful as a brick) if they install the iOS upgrade.

Freelance photographer and self-confessed Apple addict Antonio Olmos says this happened to his phone a few weeks ago after he upgraded his software. Olmos had previously had his handset repaired while on an assignment for the Guardian in Macedonia. “I was in the Balkans covering the refugee crisis in September when I dropped my phone. Because I desperately needed it for work I got it fixed at a local shop, as there are no Apple stores in Macedonia. They repaired the screen and home button, and it worked perfectly.”

He says he thought no more about it, until he was sent the standard notification by Apple inviting him to install the latest software. He accepted the upgrade, but within seconds the phone was displaying “error 53” and was, in effect, dead.

When Olmos, who says he has spent thousands of pounds on Apple products over the years, took it to an Apple store in London, staff told him there was nothing they could do, and that his phone was now junk. He had to pay £270 for a replacement and is furious.

“The whole thing is extraordinary. How can a company deliberately make their own products useless with an upgrade and not warn their own customers about it? Outside of the big industrialised nations, Apple stores are few and far between, and damaged phones can only be brought back to life by small third-party repairers.”

It is all about the money, isn’t it, Apple?!

Microsoft Admits Windows 10 Automatic Spying Cannot Be Stopped

Windows10-Spy
Quote

…Speaking to PC World, Microsoft Corporate Vice President Joe Belfiore explained that Windows 10 is constantly tracking how it operates and how you are using it and sending that information back to Microsoft by default. More importantly he also confirmed that, despite offering some options to turn elements of tracking off, core data collection simply cannot be stopped:

“In the cases where we’ve not provided options, we feel that those things have to do with the health of the system,” he said. “In the case of knowing that our system that we’ve created is crashing, or is having serious performance problems, we view that as so helpful to the ecosystem and so not an issue of personal privacy, that today we collect that data so that we make that experience better for everyone.”

To his credit, Belfiore does recognise the controversial nature of this decision and stresses that:

“We’re going to continue to listen to what the broad public says about these decisions, and ultimately our goal is to balance the right thing happening for the most people – really, for everyone – with complexity that comes with putting in a whole lot of control.”

B.S.!


Interestingly, Belfiore himself won’t be around to oversee this, as he is about to take a year-long sabbatical. When he comes back, however, I suspect this issue will still be raging, as Windows and Devices Group head Terry Myerson recently confirmed Windows 10 Enterprise users will be able to disable every single aspect of Microsoft data collection.

This comes in combination with Windows 10 Pro and Enterprise users’ ability to permanently disable automatic updates which are forced upon consumers and shows the growing divide between how Microsoft is treating consumers versus corporations.

So how concerned should users be about Windows 10’s default data collection policies? I would say very.

By default Windows 10 Home is allowed to control your bandwidth usage, install any software it wants whenever it wants (without providing detailed information on what these updates do), display ads in the Start Menu (currently it has been limited to app advertisements), send your hardware details and any changes you make to Microsoft and even log your browser history and keystrokes which the Windows End User Licence Agreement (EULA) states you allow Microsoft to use for analysis.

The good news: even if Belfiore states you cannot switch off everything, editing your privacy settings will disable the worst of these. To find them open the Start menu > Settings > Privacy.

The bad news: despite Belfiore’s pledge “to continue to listen”, Microsoft’s actions (including the impending Windows 7 and Windows 8 upgrade pressure) suggest the company’s recent love for Big Brother tactics is only going to get worse before it gets better…
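For what it is worth, the telemetry level can also be lowered through the DataCollection policy key rather than the Settings app. A hedged sketch follows (Windows-only, run from an elevated prompt); note that the lowest value, 0 (“Security”), is only honored on Enterprise and Education editions, which lines up with the point above that Home and Pro users cannot switch core collection off entirely.

```python
# Sketch: lower the Windows 10 telemetry level via the DataCollection policy.
# 0 = Security (honored only on Enterprise/Education), 1 = Basic, 2 = Enhanced,
# 3 = Full. On Home/Pro a value of 0 is treated as Basic, matching the point
# that core collection cannot be fully switched off there.
# Windows-only; must be run from an elevated (administrator) prompt.

import winreg

KEY = r"SOFTWARE\Policies\Microsoft\Windows\DataCollection"

with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, KEY, 0, winreg.KEY_SET_VALUE) as k:
    winreg.SetValueEx(k, "AllowTelemetry", 0, winreg.REG_DWORD, 0)

print("AllowTelemetry policy set to 0 (Security / lowest available).")
```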

Answer? Stay on Windows 7 Pro or switch to a Linux distro. It is time that users stand up and say, “Stop spying or I will stop using your products.” Remember, Windows 10 is not free: you pay for the privilege of getting raped by their ilk!