JUDGE ORDERS APPLE TO HELP UNLOCK SAN BERNARDINO SHOOTER'S PHONE

Marko

TCG Elite Member
Feb 19, 2005
18,799
2,456
Judge Orders Apple to Help Unlock San Bernardino Shooter's Phone | abc7chicago.com



Investigators hope to gain insight on who Farook and his wife, Tashfeen Malik, may have contacted in plotting the attack. They are also interested to learn where the couple may have traveled to before and after the shooting, along with any other "pertinent" information.

The phone is owned by Farook's employer, the San Bernardino County Department of Public Health. The department has given authorities consent to search the phone, but it's locked with a numeric password.

The FBI's attempts to crack the passcode have failed because Apple builds its phones with a feature that automatically erases the access key and renders the phone "permanently inaccessible" after 10 failed attempts.
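For anyone wondering what that 10-attempt limit actually does, here's a rough Python sketch of that kind of policy: count the failures, add escalating delays, and throw away the data-protection key on the tenth miss. The function names and the delay schedule are made up for illustration; this is not Apple's actual code.

[CODE=python]
import time

# Escalating delays (in seconds) after repeated failures; the schedule
# here is an assumption for illustration, not Apple's published numbers.
DELAYS = {5: 60, 6: 300, 7: 3600, 8: 3600, 9: 3600}
MAX_ATTEMPTS = 10

def wipe_data_protection_key():
    # Stand-in for erasing the key that encrypts user data, which is
    # what renders the phone "permanently inaccessible".
    raise RuntimeError("device wiped: data protection key erased")

def check_passcode(guess, secret):
    # Stand-in for the device's real passcode verification.
    return guess == secret

def attempt_unlock(guesses, secret):
    failures = 0
    for guess in guesses:
        if check_passcode(guess, secret):
            return True
        failures += 1
        if failures >= MAX_ATTEMPTS:
            wipe_data_protection_key()
        time.sleep(DELAYS.get(failures, 0))
    return False
[/CODE]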
 

LikeABauce302

TCG Elite Member
Aug 27, 2013
5,874
16,317
South suburbs
Real Name
Matt
I wonder if they tried the asshole's fingerprint to unlock it

That wouldn't work, even if the guy did have the fingerprint reader programmed. My iPhone makes me put in the password if I go more than 48 hours without using it. It's my work phone, so I usually have to re-enter the password every Sunday night or Monday morning after it's sat untouched all weekend.
 

Fish

From the quiet street
TCG Premium
Aug 3, 2007
40,518
7,871
Hanover Park
Real Name
Fish
That wouldn't work, even if the guy did have the fingerprint reader programmed. My iPhone makes me put in the password if I go more than 48 hours without using it. It's my work phone, so I usually have to re-enter the password every Sunday night or Monday morning after it's sat untouched all weekend.

This. After a certain amount of time, or a restart, you need the passcode.
 

Burtonrider10022

TCG Elite Member
Feb 25, 2008
13,052
30
Milwaukee, WI
Real Name
Yes
Apple has released a statement:



[URL=http://www.apple.com/customer-letter/]A Message to Our Customers[/URL] said:
The United States government has demanded that Apple take an unprecedented step which threatens the security of our customers. We oppose this order, which has implications far beyond the legal case at hand.

This moment calls for public discussion, and we want our customers and people around the country to understand what is at stake.

The Need for Encryption
Smartphones, led by iPhone, have become an essential part of our lives. People use them to store an incredible amount of personal information, from our private conversations to our photos, our music, our notes, our calendars and contacts, our financial information and health data, even where we have been and where we are going.

All that information needs to be protected from hackers and criminals who want to access it, steal it, and use it without our knowledge or permission. Customers expect Apple and other technology companies to do everything in our power to protect their personal information, and at Apple we are deeply committed to safeguarding their data.

Compromising the security of our personal information can ultimately put our personal safety at risk. That is why encryption has become so important to all of us.

For many years, we have used encryption to protect our customers’ personal data because we believe it’s the only way to keep their information safe. We have even put that data out of our own reach, because we believe the contents of your iPhone are none of our business.

The San Bernardino Case
We were shocked and outraged by the deadly act of terrorism in San Bernardino last December. We mourn the loss of life and want justice for all those whose lives were affected. The FBI asked us for help in the days following the attack, and we have worked hard to support the government’s efforts to solve this horrible crime. We have no sympathy for terrorists.

When the FBI has requested data that’s in our possession, we have provided it. Apple complies with valid subpoenas and search warrants, as we have in the San Bernardino case. We have also made Apple engineers available to advise the FBI, and we’ve offered our best ideas on a number of investigative options at their disposal.

We have great respect for the professionals at the FBI, and we believe their intentions are good. Up to this point, we have done everything that is both within our power and within the law to help them. But now the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone.

Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.

The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.

The Threat to Data Security
Some would argue that building a backdoor for just one iPhone is a simple, clean-cut solution. But it ignores both the basics of digital security and the significance of what the government is demanding in this case.

In today’s digital world, the “key” to an encrypted system is a piece of information that unlocks the data, and it is only as secure as the protections around it. Once the information is known, or a way to bypass the code is revealed, the encryption can be defeated by anyone with that knowledge.

The government suggests this tool could only be used once, on one phone. But that’s simply not true. Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable.

The government is asking Apple to hack our own users and undermine decades of security advancements that protect our customers — including tens of millions of American citizens — from sophisticated hackers and cybercriminals. The same engineers who built strong encryption into the iPhone to protect our users would, ironically, be ordered to weaken those protections and make our users less safe.

We can find no precedent for an American company being forced to expose its customers to a greater risk of attack. For years, cryptologists and national security experts have been warning against weakening encryption. Doing so would hurt only the well-meaning and law-abiding citizens who rely on companies like Apple to protect their data. Criminals and bad actors will still encrypt, using tools that are readily available to them.

A Dangerous Precedent
Rather than asking for legislative action through Congress, the FBI is proposing an unprecedented use of the All Writs Act of 1789 to justify an expansion of its authority.

The government would have us remove security features and add new capabilities to the operating system, allowing a passcode to be input electronically. This would make it easier to unlock an iPhone by “brute force,” trying thousands or millions of combinations with the speed of a modern computer.

The implications of the government’s demands are chilling. If the government can use the All Writs Act to make it easier to unlock your iPhone, it would have the power to reach into anyone’s device to capture their data. The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone’s microphone or camera without your knowledge.

Opposing this order is not something we take lightly. We feel we must speak up in the face of what we see as an overreach by the U.S. government.

We are challenging the FBI’s demands with the deepest respect for American democracy and a love of our country. We believe it would be in the best interest of everyone to step back and consider the implications.

While we believe the FBI’s intentions are good, it would be wrong for the government to force us to build a backdoor into our products. And ultimately, we fear that this demand would undermine the very freedoms and liberty our government is meant to protect.

Tim Cook
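To make the letter's "brute force" point concrete: if passcodes could be entered electronically and the retry limit were gone, cracking a numeric passcode is nothing more than a loop. A minimal Python sketch, where submit_passcode() is a purely hypothetical stand-in for whatever interface the modified firmware would expose:

[CODE=python]
from itertools import product

def submit_passcode(code):
    # Hypothetical stand-in for electronically submitting a guess to a
    # device whose retry limit and inter-attempt delays are disabled.
    return False

def brute_force(length=4):
    # Enumerate every numeric passcode of the given length:
    # 10,000 candidates for 4 digits, 1,000,000 for 6 digits.
    for digits in product("0123456789", repeat=length):
        code = "".join(digits)
        if submit_passcode(code):
            return code
    return None
[/CODE]

The security of a 4- or 6-digit passcode rests almost entirely on the attempt counter and delays the order asks Apple to strip out, not on the passcode itself.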
 

willizm

Very Nice, Very Evil
May 13, 2009
12,829
10,150
The Woodlands, TX
NSA already has the info.... this is just a feel-good exercise to make sure the sheeple think that we're being looked out for...

I'd bet they find bite marks on the pillow they killed him with... 'cause Apple

The NSA only collects information that has been transmitted to or from the device, but things stored on the device itself, like pictures, notes, videos, etc., are probably what they want to get at.
 

BrianG

Big Dick Team Octane
Oct 5, 2008
5,715
74
Streamwood
Real Name
Brian G
Somehow I feel like Apple already has to have that ability, but doesn't want to let it out. At the same time tho, the US Government really doesn't have the resources to dump the data from the memory of the phone and work on it outside of the device??? That, I find even harder to believe.
 

Lord Tin Foilhat

TCG Conspiracy Lead Investigator
TCG Premium
Jul 8, 2007
60,686
56,744
Privy Chamber
That's the whole point of encryption. The govt could dedicate their supercomputers to trying to break the encryption 24/7, but even then it could take YEARS. That's encryption when it's built right. This is why a backdoor defeats the whole purpose of encryption.

Encryption = Math

That is why this is so important. People think the government has all the keys, when really they just monitor unencrypted information and react to it. Yes, they have broken some very common and old encryption techniques that are still in use... but there are plenty of encryption types they won't ever break, or at least not until quantum computers come online.
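Rough numbers behind the "it could take YEARS" claim, assuming a hypothetical attacker who can somehow test a trillion keys per second:

[CODE=python]
# Back-of-the-envelope: exhaustively searching an AES-256 keyspace
# versus a 4-digit PIN, at an (absurdly generous) 10^12 guesses/second.
GUESSES_PER_SECOND = 10**12
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

aes256_keyspace = 2**256   # ~1.2e77 possible keys
pin_keyspace = 10**4       # every 4-digit passcode

print(aes256_keyspace / GUESSES_PER_SECOND / SECONDS_PER_YEAR)  # ~3.7e57 years
print(pin_keyspace / GUESSES_PER_SECOND)                        # ~1e-8 seconds
[/CODE]

That asymmetry is the whole point: nobody brute-forces the cipher itself, so the fight ends up being over the passcode limits and the keys, i.e. the backdoor.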
 

Bob Kazamakis

I’m the f-ing lizard king
TCG Premium
Oct 24, 2007
85,195
44,840
Denver
Real Name
JK
That's the whole point of encryption. The govt could dedicate their supercomputers to trying to break the encryption 24/7, but even then it could take YEARS. That's encryption when it's built right. This is why a backdoor defeats the whole purpose of encryption.

Encryption = Math

That is why this is so important. People think the government has all the keys, when really they just monitor unencrypted information and react to it. Yes, they have broken some very common and old encryption techniques that are still in use... but there are plenty of encryption types they won't ever break, or at least not until quantum computers come online.
So you're saying Apple's security and encryption is good? :noes:
 

Lord Tin Foilhat

TCG Conspiracy Lead Investigator
TCG Premium
Jul 8, 2007
60,686
56,744
Privy Chamber
So you're saying Apple's security and encryption is good? :noes:

There was never a doubt regarding Apple's security. The whole iCloud leaked-photos thing was due to weak passwords, social engineering, and dumb users :rofl: Nothing to do with Apple's systems.

but iPhones still suck dick :hsughlol: and they are still overpriced and super proprietary
 

TCG Member 5219

TCG Elite Member
Mar 22, 2005
12,447
18
Somehow I feel like Apple already has to have that ability, but doesn't want to let it out. At the same time tho, the US Government really doesn't have the resources to dump the data from the memory of the phone and work on it outside of the device??? That, I find even harder to believe.

They do, 100%. And they have given the ability to other companies too. I can clear passcodes and activation locks on any iPad in my environment, provided it's been enrolled in my mobile device management system. It's doable, and I do it several times a week.
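For context on what that kind of clearing looks like under the hood: Apple's MDM protocol includes a ClearPasscode command that a management server can push to an enrolled device, using an unlock token the device escrowed at enrollment. A rough Python sketch of building that command payload (the field names are from memory of the MDM protocol docs, so treat the details as an assumption):

[CODE=python]
import plistlib
import uuid

def build_clear_passcode_command(unlock_token):
    # The UnlockToken (bytes) is escrowed by the MDM server when the
    # device enrolls; without prior enrollment there is no token and
    # therefore no way to issue this command.
    command = {
        "CommandUUID": str(uuid.uuid4()),
        "Command": {
            "RequestType": "ClearPasscode",
            "UnlockToken": unlock_token,
        },
    }
    return plistlib.dumps(command)  # plist the server pushes to the device
[/CODE]

Which is also why this only works for devices that opted in to management beforehand; it doesn't retrofit access to a phone that was never enrolled.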
 

Flyn

Go ahead. I'll catch up.
Moderator
TCG Premium
Mar 1, 2004
68,052
27,984
Selling homes on the Gulf Coast of Florida
This will be an interesting case. Safety vs. safety.

The FBI does not even know if there's anything useful on the phone (anybody heard of burner phones?).

The risk to iPhone user safety is real.

I lean towards Apple telling the FBI no backdoor.

Can't the FBI find a hacker who is not connected to Apple?
 

BrianG

Big Dick Team Octane
Oct 5, 2008
5,715
74
Streamwood
Real Name
Brian G
Somehow I feel like Apple already has to have that ability, but doesn't want to let it out. At the same time tho, the US Government really doesn't have the resources to dump the data from the memory of the phone and work on it outside of the device??? That, I find even harder to believe.

They do, 100%. And they have given the ability to other companies too. I can clear passcodes and activation locks on any iPad in my environment, provided it's been enrolled in my mobile device management system. It's doable, and I do it several times a week.


Interesting. While I hate Apple's products, I definitely don't agree with the government forcing Apple to develop their software in any particular way, but I also don't agree with Apple's denials, lies, and refusal when it comes to cracking the phone.
Apple Unlocked iPhones for the Feds 70 Times Before - The Daily Beast
Apple Unlocked iPhones for the Feds 70 Times Before

A 2015 court case shows that the tech giant has been willing to play ball with the government before—and is only stopping now because it might ‘tarnish the Apple brand.’

Apple CEO Tim Cook declared on Wednesday that his company wouldn’t comply with a government search warrant to unlock an iPhone used by one of the San Bernardino killers, a significant escalation in a long-running debate between technology companies and the government over access to people’s electronically-stored private information.

But in a similar case in New York last year, Apple acknowledged that it could extract such data if it wanted to. And according to prosecutors in that case, Apple has unlocked phones for authorities at least 70 times since 2008. (Apple doesn’t dispute this figure.)

In other words, Apple’s stance in the San Bernardino case may not be quite the principled defense that Cook claims it is. In fact, it may have as much to do with public relations as it does with warding off what Cook called “an unprecedented step which threatens the security of our customers.”

For its part, the government’s public position isn’t clear cut, either. U.S. officials insist that they cannot get past a security feature on the shooter’s iPhone that locks out anyone who doesn’t know its unique password—which even Apple doesn’t have. But in that New York case, a government attorney acknowledged that one U.S. law enforcement agency has already developed the technology to crack at least some iPhones, without the assistance from Apple that officials are demanding now.

The facts in the New York case, which involve a self-confessed methamphetamine dealer and not a notorious terrorist, tend to undermine some of the core claims being made by both Apple and the government in a dispute with profound implications for privacy and criminal investigations beyond the San Bernardino case.

In New York, as in California, Apple is refusing to bypass the passcode feature now found on many iPhones.

But in a legal brief, Apple acknowledged that the phone in the meth case was running version 7 of the iPhone operating system, which means the company can access it. “For these devices, Apple has the technical ability to extract certain categories of unencrypted data from a passcode locked iOS device,” the company said in a court brief.

Whether the extraction would be successful depended on whether the phone was “in good working order,” Apple said, noting that the company hadn’t inspected the phone yet. But as a general matter, yes, Apple could crack the iPhone for the government. And, two technical experts told The Daily Beast, the company could do so with the phone used by deceased San Bernardino shooter, Syed Rizwan Farook, a model 5C. It was running version 9 of the operating system.

Still, Apple argued in the New York case, it shouldn’t have to, because “forcing Apple to extract data… absent clear legal authority to do so, could threaten the trust between Apple and its customers and substantially tarnish the Apple brand,” the company said, putting forth an argument that didn’t explain why it was willing to comply with court orders in other cases.

“This reputational harm could have a longer term economic impact beyond the mere cost of performing the single extraction at issue,” Apple said.

Apple’s argument in New York struck one former NSA lawyer as a telling admission: that its business reputation is now an essential factor in deciding whether to hand over customer information.

“I think Apple did itself a huge disservice,” Susan Hennessey, who was an attorney in the Office of the General Counsel at the NSA, told The Daily Beast. The company acknowledged that it had the technical capacity to unlock the phone, but “objected anyway on reputational grounds,” Hennessey said. Its arguments were at odds with each other, especially in light of Apple’s previous compliance with so many court orders.

It wasn’t until after the revelations of former NSA contractor Edward Snowden that Apple began to position itself so forcefully as a guardian of privacy protection in the face of a vast government surveillance apparatus. Perhaps Apple was taken aback by the scale of NSA spying that Snowden revealed. Or perhaps it was embarrassed by its own role in it. The company, since 2012, had been providing its customers’ information to the FBI and the NSA via the so-called PRISM program, which operated pursuant to court orders.

Apple has also argued, then and now, that the government is overstepping the authority of the All Writs Act, an 18th century statute that it claims forces Apple to conduct court-ordered iPhone searches. That’s where the “clear legal authority” question comes into play.

But that, too, is a subjective question which will have to be decided by higher courts. For now, Apple is resisting the government on multiple grounds, and putting its reputation as a bastion of consumer protection front and center in the fight.

None of this has stopped the government from trying to crack the iPhone, a fact that emerged unexpectedly in the New York case. In a brief exchange with attorneys during a hearing in October, Judge James Orenstein said he’d found testimony in another case that the Homeland Security Department “is in possession of technology that would allow its forensic technicians to override the pass codes security feature on the subject iPhone and obtain the data contained therein.”

That revelation, which went unreported in the press at the time, seemed to undercut the government’s central argument that it needed Apple to unlock a protected iPhone.

“Even if [Homeland Security] agents did not have the defendant’s pass code, they would nevertheless have been able to obtain the records stored in the subject iPhone using specialized software,” the judge said. “Once the device is unlocked, all records in it can be accessed and copied.”

A government attorney affirmed that he was aware of the tool. However, it applied only to one update of version 8 of the iPhone operating system—specifically, 8.1.2. The government couldn’t unlock all iPhones, but just phones with that software running.

Still, it made the judge question whether other government agencies weren’t also trying to break the iPhone’s supposedly unbreakable protections. And if so, why should he order the company to help?

There was, the judge told the government lawyer, “the possibility that on the intel side, the government has this capability. I would be surprised if you would say it in open court one way or the other.”

Orenstein was referring to the intelligence agencies, such as the NSA, which develop tools and techniques to hack popular operating systems, and have been particularly interested for years in trying to get into Apple products, according to documents leaked by Snowden.

There was no further explanation of how Homeland Security developed the tool, and whether it was widely used. A department spokesperson declined to comment “on specific law enforcement techniques.” But the case had nevertheless demonstrated that, at least in some cases, the government can, and has, managed to get around the very wall that it now claims impedes lawful criminal investigations.

The showdown between Apple and the FBI will almost certainly not be settled soon. The company is expected to file new legal briefs within days. And the question of whether the All Writs Act applies in such cases is destined for an appeals court decision, legal experts have said.

But for the moment, it appears that the only thing certainly standing in the way of Apple complying with the government is its decision not to. And for its part, the government must be presumed to be searching for new ways to get the information it wants.

Technically, Apple probably can find a way to extract the information that the government wants from the San Bernardino shooter’s phone, Christopher Soghoian, the principal technologist for the American Civil Liberties Union, told The Daily Beast.

“The question is does the law give the government the ability to force Apple to create new code?” he said. “Engineers have to sit down and create something that doesn’t exist” in order to meet the government’s demands. Soghoian noted that this would only be possible in the San Bernardino case because the shooter was using an iPhone model 5C, and that newer hardware versions would be much harder for Apple to bypass.

But even that’s in dispute, according to another expert’s analysis. Dan Guido, a self-described hacker and CEO of the cybersecurity company Trail of Bits, said that Apple can, in fact, eliminate the protections that keep law enforcement authorities from trying to break into the iPhone with a so-called brute force attack, using a computer to make millions of password guesses in a short period of time. New iPhones have a feature that stops users from making repeated incorrect guesses and can trigger a kind of self-destruct mechanism, erasing all the phone’s contents, after too many failed attempts.

In a detailed blog post, Guido described how Apple could work around its own protections and effectively disarm them. It wouldn’t be trivial. But it’s feasible, he said, even for the newest versions of the iPhone, which, unlike the ones in the New York and San Bernardino cases, Apple swears it cannot crack.

“The burden placed on Apple will be greater… but it will not be impossible,” Guido told The Daily Beast.
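To put numbers on Guido's claim: Apple's own iOS security documentation at the time put the hardware-bound cost of each passcode try at roughly 80 milliseconds (treat the exact figure as an assumption), so with the retry limit and delays disabled the arithmetic is short:

[CODE=python]
# Worst-case time to walk the whole numeric passcode space at ~80 ms
# per attempt, with the auto-erase and escalating delays removed.
SECONDS_PER_ATTEMPT = 0.080

for digits in (4, 6):
    worst_case_hours = 10**digits * SECONDS_PER_ATTEMPT / 3600
    print(f"{digits}-digit passcode: ~{worst_case_hours:.1f} hours worst case")
    # 4 digits -> ~0.2 hours (about 13 minutes)
    # 6 digits -> ~22.2 hours
[/CODE]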
 

wombat

TCG Elite Member
TCG Premium
Sep 29, 2007
14,097
2,964
WI
They do, 100%. And they have given the ability to other companies too. I can clear passcodes and activation locks on any iPad in my environment, provided it's been enrolled in my mobile device management system. It's doable, and I do it several times a week.

That's completely different. You've authorized a security delegate on that system, allowing you to control it via MDM... That's not the same as what they're asking them to do for this phone.

Edit: I misread your comment, I take back my picard, but I still want to make it known for the less tech-savvy people in this thread that this is apples and oranges to what the FBI is asking. The FBI is basically asking for a backdoor that could affect millions of users.
 