If the encryption debate is already a house of cards, the ground keeps moving. Facebook, with stated plans to connect all its platforms across one fully encrypted backend, just rolled out a limited version of Instagram for web—with only tentative plans to encrypt. The problem may stretch beyond the difficulty of storing encrypted information in JavaScript. And at a time when protection of minors is top of mind (for everyone from Attorney General Barr to YouTube’s new content rules), a baby photo app was somehow left “grossly insecure”—we look at how exposed API keys left an open Elasticsearch database that shared it all. Finally, we take a second look at some of the most widely used arguments in favor of lawful access, and why they may, or may not, hold up.
Why we don’t want Apple to break encryption
"We have always maintained there is no such thing as a backdoor just for the good guys," Apple said in a statement sent Monday. "Backdoors can also be exploited by those who threaten our national security and the data security of our customers."
As the E2EE debate might be treading water post-holidays, we may yet be gearing up for another fight in 2020. Here are some points to stay sharp.
Providing the key only to secure government agencies
First, the obvious: keys can be copied, found, or stolen. Once a key exists, it will eventually be exploited.
Second, giving keys to one government while denying them to another could be tricky for business. Are tech companies prepared to take sides politically by handing out keys to some foreign governments and not others? Should backdoor access sit “safely” in the hands of every world power with citizens on an iPhone, regardless of censorship, minority subjugation or human rights violations? Will mega tech companies deny those countries backdoor access and lose business, or will the neutrality of business law (and a bottom line) win out?
Interestingly, Silicon Valley companies might already be up against a very similar scenario, with the banned UAE spy app ToTok recently being reinstated into the Google Play Store. Apple’s App Store has yet to decide its next move. As Jamf researcher Patrick Wardle shared with WIRED, "But if [Apple does] reinstate it, that also sets a crazy precedent! Basically it green-lights any government surveillance app, as long as the app doesn't violate App Store policies.” It’s a slippery slope.
The FBI and DoD need backdoor access for national defense
Not necessarily. The 2015 terrorist-linked San Bernardino shooting surfaced some curious facts. A later report from the Inspector General of the Justice Department reads, in part, that those investigating the case “should have checked with ... trusted vendors for possible solutions before advising ... that there was no other technical alternative and that compelling Apple's assistance was necessary to search the Farook iPhone.” This calls into question the testimony of FBI official Amy Hess and former FBI Director James Comey before Congress about the “capabilities available to the national security programs” regarding the unlocking of the suspect’s iPhone. One such capability available was the Remote Operations Unit, an internal team specifically organized to devise device break-ins.
Accept and continue
What Apple said is not wrong. Backdoors would weaken the security of consumers whose information weaves the threads of our national security. In an ironic twist, the damage might even rival that of criminally linked iPhones running loose. Considering recent trust breaches (ToTok, Cambridge Analytica), however, we know that backdoors aren’t the only way third parties can obtain our information. Still, while the issue stands, we’ll take as much privacy as we can get, and weigh the points above to ensure no one gets access to our encrypted information. Unless, of course, you ask us to hit Accept and Continue.
Related posts
- Unfinished Business: Why Apple’s Decision Left Facebook Holding the Ball [Encryption Digest 23]
- Backdoors and Federal Cybersecurity Posture
- Overheard in the Press: Backdoor Debate Rages On
Peekaboo Moments breach: exposed API keys
They’ve got more than your nose.
A recently discovered Elasticsearch database, left open to the internet, exposed thousands of baby photos and videos hosted by the popular Bithouse Inc. app Peekaboo Moments.
Security expert Troy Hunt describes the exposure as a “garden variety” data breach, but concerning nonetheless given the subject matter: children’s photographs and video.
Twelve Security’s Dan Ehrlich confides, "I've never seen a server so blatantly open. Everything about the server, the company's website and the iOS/Android app was both bizarrely done and grossly insecure."
Apparently, the app also left its own endpoint exposed, leaving it open to malicious code uploads or the exfiltration of all data passing through the API. In terms of an easy-find payload, “you usually don't get that lucky,” says Ehrlich.
In the grab bag of interesting treasures were Peekaboo’s Facebook API keys, also left exposed in the open database. This could give access to a user’s personal Facebook page content, on which presumably more baby photos were posted by the app.
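Hardcoded credentials like these are straightforward to recover once an app bundle or server response is in hand. As a minimal sketch (the regex rules and the sample "bundle" string are entirely hypothetical; real secret scanners such as truffleHog or gitleaks use far larger rule sets plus entropy analysis), the kind of scan a researcher might run looks like:

```python
import re

# Hypothetical patterns for common credential formats -- illustrative only.
KEY_PATTERNS = {
    "hex_app_secret": re.compile(r"\b[0-9a-f]{32}\b"),
    "generic_api_key": re.compile(
        r"(?i)api[_-]?key['\"]?\s*[:=]\s*['\"]([A-Za-z0-9_\-]{16,})"
    ),
}

def scan_for_keys(text: str) -> list:
    """Return (rule_name, match) pairs for anything that looks like a credential."""
    hits = []
    for name, pattern in KEY_PATTERNS.items():
        for match in pattern.findall(text):
            hits.append((name, match))
    return hits

# Simulated decompiled app source with an embedded key (made up for this sketch).
bundle = 'const config = { apiKey: "AKx9f2_secretvalue_123", region: "us-east-1" };'
print(scan_for_keys(bundle))  # flags the embedded apiKey value
```

The deeper point: anything shipped to the client—an app binary, a JavaScript bundle, an open database—should be assumed readable by an attacker, so secrets belong server-side behind authenticated endpoints.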
According to Winston Bond, EMEA Technical Director at Arxan, “This breach is a great example of extracting a web API from a mobile app and then using it to extract data. It shows exactly why app developers should harden their apps against reverse engineering and use integrity checks to make sure that the app is what it is supposed to be.”
This leads to an overall trending point. With the rapid deployment of applications in the cloud, there might be a secret cost to success. As Offensive Security Manager Hugo Van den Toorn (Outpost24) sums up, “With the countless possibilities of ‘quickly deploying a system in the cloud’, security is—still—often overlooked by organisations... Even after vendors make statements such as ‘we take your security and privacy serious’, we often see security ending-up somewhere on the bottom of the priority list… Assuming it made the priority list at all.”
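The Peekaboo server is a textbook case: a node bound to a public interface with no authentication. A simple pre-deployment lint can catch exactly this. In the sketch below, the setting names (`network.host`, `xpack.security.enabled`) follow real Elasticsearch configuration conventions, but the lint rules themselves are an illustrative assumption, not a complete audit:

```python
# Illustrative config lint for an Elasticsearch-style deployment.
# The two checks below catch the "blatantly open server" pattern:
# bound to all interfaces, with security/authentication disabled.
def lint_config(config: dict) -> list:
    findings = []
    if config.get("network.host") in ("0.0.0.0", "::"):
        findings.append("node is bound to all interfaces; restrict network.host")
    if not config.get("xpack.security.enabled", False):
        findings.append("authentication is disabled; enable xpack.security.enabled")
    return findings

# Roughly what an open, unauthenticated server's config looks like.
deployed = {"network.host": "0.0.0.0"}
for finding in lint_config(deployed):
    print("WARNING:", finding)
```

Running a check like this in CI costs minutes; finding your users’ data indexed by a search engine costs considerably more.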
Related posts
- Why Banks Need to Protect the Machine Identities of APIs
- Secure APIs: Safety Measurements for your External Interfaces
- HashiCorp Managing Machine Identities in the Cloud
Instagram to follow WhatsApp into encryption. Potentially.
Ready or not, here it comes.
Instagram goes—desktop. And it may or may not be encrypted. We’ll see. They’re working on that.
While for many of us it represents the moment we didn’t know we were waiting for, it arrives as a welcome boon for a host of thumb-weary influencers, media mavens and status stalkers. In terms of marketing, it’s a great tracking tool. And no more hassling with a different device when an Insta DM becomes priority one.
Right now, this version is only available to a "small percentage” of global users, on a test basis.
The only problem, as stated, is the irksome issue of encryption for a web-based communications app. According to former Facebook security chief Alex Stamos, “Nobody has ever built a trustworthy web-based E2EE messenger.” And why not?
We know that WhatsApp and iMessage have web compatibility, and web-based Messenger never went anywhere. So how do they do it?
Well, for WhatsApp, JavaScript generates a public key in the browser, which is embedded into a QR code; your phone scans the code, and the two devices establish an encrypted tunnel between the browser session and your phone. How’s that for two-factor authentication?
The issues with creating secure web-based communications rest primarily on storing cryptographic material in JavaScript, and on the ease with which a server could quietly serve backdoored code to specific users. The first concern is being addressed by researchers, the second by policy makers. No definitive answer yet on either one.
It’s an odd step back in what seemed like a full Facebook-led charge towards complete E2EE. After flying in the face of Five Eyes and defiantly encrypting Messenger, rolling out a family-owned product with unmaterialized plans to encrypt just seems, as Alex Stamos put it, to “[cut] directly against the announced goal of E2E encrypted compatibility between FB/IG/WA.”
In other news, Facebook recently declined to comply with California’s new Consumer Privacy Act, on the assertion that what it does with user data can be defined as a “data transfer,” not sale.
Related posts