Bruce Schneier
A blog covering security and security technology.

This idea, by Stuart Schechter at Microsoft Research, is -- I think -- clever: It certainly is a new way to look at the security threat model.

Posted on April 15, 2010 at 6:43 AM • 4 Comments

Fifteen years ago, Matt Blaze wrote an Afterword to my book Applied Cryptography. Here are his current thoughts on that piece of writing.

Posted on April 14, 2010 at 1:30 PM • 17 Comments

Chris Hoofnagle has a new paper: "Internalizing Identity Theft." Basically, he shows that one of the problems is that lenders extend credit even when credit applications are sketchy. From an article on the work:

Of 16 applications presented by imposters to obtain credit or medical services, almost all were rife with errors that should have suggested fraud. Yet in all 16 cases, credit or services were granted anyway. In the various cases described in the paper, which was published on Wednesday in The U.C.L.A. Journal of Law and Technology, one victim found four of six fraudulent applications submitted in her name contained the wrong address; two contained the wrong phone number and one the wrong date of birth. Another victim discovered that his imposter was 70 pounds heavier, yet successfully masqueraded as him using what appeared to be his stolen driver's license, and in one case submitted an incorrect Social Security number.

This is a textbook example of an economic externality. Because most of the cost of identity theft is borne by the victim -- even when the lender reimburses the victim if pushed to do so -- lenders make the trade-off that's best for their business, and that means issuing credit even in marginal situations. They make more money that way. If we want to reduce identity theft, the only solution is to internalize that externality: either give victims the ability to sue lenders who issue credit in their names to identity thieves, or pass a law that penalizes lenders who do.

Posted on April 14, 2010 at 6:57 AM • 50 Comments

John Adams argues that our irrationality about comparative risks depends on the type of risk:

Cycling from A to B (I write as a London cyclist) is done with a diminished sense of control over one's fate. This sense is supported by statistics that show that, per kilometre travelled, a cyclist is 14 times more likely to die than someone in a car. This is a good example of the importance of distinguishing between relative and absolute risk. Although 14 times greater, the absolute risk of cycling is still small -- 1 fatality in 25 million kilometres cycled; not even Lance Armstrong can begin to cover that distance in a lifetime of cycling. And numerous studies have demonstrated that the extra relative risk is more than offset by the health benefits of regular cycling; regular cyclists live longer.

While people may voluntarily board planes, buses and trains, the popular reaction to crashes in which passengers are passive victims suggests that the public demand a higher standard of safety in circumstances in which people voluntarily hand over control of their safety to pilots, or to bus or train drivers.

Risks imposed by nature -- such as those endured by people living on the San Andreas Fault or the slopes of Mount Etna -- or by impersonal economic forces -- such as the vicissitudes of the global economy -- are placed in the middle of the scale. Reactions vary widely. They are usually seen as motiveless and are responded to fatalistically -- unless or until the threat appears imminent.

Imposed risks are less tolerated.
Consider mobile phones. The risk associated with the handsets is either non-existent or very small. The risk associated with the base stations, measured by radiation dose -- unless one is up the mast with an ear to the transmitter -- is orders of magnitude less. Yet all round the world billions are queuing up to take the voluntary risk, and almost all the opposition is focussed on the base stations, which are seen by objectors as impositions. Because the radiation dose received from the handset increases with distance from the base station, to the extent that campaigns against the base stations are successful, they will increase the distance from the base station to the average handset, and thus the radiation dose. The base station risk, if it exists, might be labelled a benignly imposed risk; no one supposes that the phone company wishes to murder all those in the neighbourhood.

Less tolerated are risks whose imposers are perceived as motivated by profit or greed. In Europe, big biotech companies such as Monsanto are routinely denounced by environmentalist opponents for being more concerned with profits than with the welfare of the environment or the consumers of their products.

Less tolerated still are malignly imposed risks -- crimes ranging from mugging to rape and murder. In most countries in the world the number of deaths on the road far exceeds the number of murders, but far more people are sent to jail for murder than for causing death by dangerous driving. In the United States in 2002, 16,000 people were murdered -- a statistic that evoked far more popular concern than the 42,000 killed on the road -- but far less than the 25 killed by terrorists.

This isn't a new result, but it's vital to understand how people react to different risks.

Posted on April 13, 2010 at 1:18 PM • 11 Comments

Nice analysis by John Mueller and Mark G. Stewart:

These established considerations are designed to provide a viable, if somewhat rough, guideline for public policy. In all cases, measures and regulations intended to reduce risk must satisfy essential cost-benefit considerations. Clearly, hazards that fall in the unacceptable range should command the most attention and resources. Those in the tolerable range may also warrant consideration -- but since they are less urgent, they should be combated with relatively inexpensive measures. Those hazards in the acceptable range are of little, or even negligible, concern, so precautions to reduce their risks even further would scarcely be worth pursuing unless they are remarkably inexpensive.

[...]

As can be seen, annual terrorism fatality risks, particularly for areas outside of war zones, are less than one in one million and therefore generally lie within the range regulators deem safe or acceptable, requiring no further regulations, particularly those likely to be expensive. They are similar to the risks of using home appliances (200 deaths per year in the United States) or of commercial aviation (103 deaths per year). Compared with dying at the hands of a terrorist, Americans are twice as likely to perish in a natural disaster and nearly a thousand times more likely to be killed in some type of accident. The same general conclusion holds when the full damage inflicted by terrorists -- not only the loss of life but direct and indirect economic costs -- is aggregated. As a hazard, terrorism, at least outside of war zones, does not inflict enough damage to justify substantially increasing expenditures to deal with it.

[...]
To border on becoming unacceptable by established risk conventions -- that is, to reach an annual fatality risk of 1 in 100,000 -- the number of fatalities from terrorist attacks in the United States and Canada would have to increase 35-fold; in Great Britain (excluding Northern Ireland), more than 50-fold; and in Australia, more than 70-fold. For the United States, this would mean experiencing attacks on the scale of 9/11 at least once a year, or 18 Oklahoma City bombings every year.

Posted on April 13, 2010 at 6:07 AM • 28 Comments

Says Matt Blaze:

Scary research by Christopher Soghoian and Sid Stamm:

Even more scary, Soghoian and Stamm found that hardware to perform this attack is being produced and sold:

[...] The company in question is known as Packet Forensics.... According to the flyer: "Users have the ability to import a copy of any legitimate key they obtain (potentially by court order) or they can generate 'look-alike' keys designed to give the subject a false sense of confidence in its authenticity." The product is recommended to government investigators, saying "IP communication dictates the need to examine encrypted traffic at will." And, "Your investigative staff will collect its best evidence while users are lulled into a false sense of security afforded by web, e-mail or VOIP encryption."

Matt Blaze has the best analysis. Read his whole commentary; this is just the ending:

Also, it's not clear how web interception would be particularly useful for many of the most common law enforcement investigative scenarios. If a suspect is buying books or making hotel reservations online, it's usually a simple (and legally relatively uncomplicated) matter to just ask the vendor about the transaction, no wiretapping required. This suggests that these products may be aimed less at law enforcement than at national intelligence agencies, who might be reluctant (or unable) to obtain overt cooperation from web site operators (who may be located abroad).

Posted on April 12, 2010 at 1:32 PM • 56 Comments

An NYU student has been reverse-engineering facial recognition algorithms to devise makeup patterns to confuse face recognition software.

Posted on April 12, 2010 at 6:08 AM • 34 Comments

Cute.

Posted on April 9, 2010 at 4:21 PM • 10 Comments

CRN Magazine named me as one of its security superstars of 2010.

Posted on April 9, 2010 at 1:58 PM • 18 Comments

Last month at the RSA Conference, I gave a talk titled "Security, Privacy, and the Generation Gap." It was pretty good, but it was the first time I gave that talk in front of a large audience -- and its newness showed. Last week, I gave the same talk again, at the CACR Higher Education Security Summit at Indiana University. It was much, much better the second time around, and there's a video available.

Posted on April 9, 2010 at 12:55 PM • 12 Comments

Dueling has a rational economic basis.

Posted on April 9, 2010 at 6:49 AM • 27 Comments

New cryptanalysis of the proprietary encryption algorithm used in the Digital Enhanced Cordless Telecommunications (DECT) standard for cordless phones. News.

http://www.schneier.com/
Schneier on Security
Storing Cryptographic Keys with Invisible Tattoos
Abstract: Implantable medical devices, such as implantable cardiac defibrillators and pacemakers, now use wireless communication protocols vulnerable to attacks that can physically harm patients. Security measures that impede emergency access by physicians could be equally devastating. We propose that access keys be written into patients' skin using ultraviolet-ink micropigmentation (invisible tattoos).
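The paper is about where to store the emergency-access key, not about a particular wireless protocol, but a key read from an ultraviolet tattoo would most naturally feed something like a challenge-response handshake between a clinician's programmer and the implant. The sketch below is purely illustrative: the HMAC-based exchange and every name in it are assumptions for the example, not details from the paper.

```python
import hashlib
import hmac
import os

# Hypothetical illustration only: a tattooed access key used for an
# HMAC-based challenge-response between a clinician's programmer and an
# implanted device. This protocol is NOT described in the paper.

def implant_challenge() -> bytes:
    """Implant generates a fresh random challenge."""
    return os.urandom(16)

def programmer_response(tattoo_key: bytes, challenge: bytes) -> bytes:
    """Programmer proves knowledge of the key read from the patient's skin."""
    return hmac.new(tattoo_key, challenge, hashlib.sha256).digest()

def implant_verify(stored_key: bytes, challenge: bytes, response: bytes) -> bool:
    """Implant recomputes the MAC and compares in constant time."""
    expected = hmac.new(stored_key, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

# Example run: the same key is encoded in the tattoo and stored in the device.
key = bytes.fromhex("00112233445566778899aabbccddeeff")
challenge = implant_challenge()
response = programmer_response(key, challenge)
assert implant_verify(key, challenge, response)
```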
Matt Blaze Comments on his 15-Year-Old "Afterword"
Externalities and Identity Theft
Using a 2003 amendment to the Fair Credit Reporting Act that allows victims of ID theft to ask creditors for the fraudulent applications submitted in their names, Mr. Hoofnagle worked with a small sample of six ID theft victims and delved into how they were defrauded.
Among the ways to move the cost of the crime back to issuers of credit, Mr. Hoofnagle suggests that lenders contribute to a fund that will compensate victims for the loss of their time in resolving their ID theft problems.
Terrorist Attacks and Comparable Risks, Part 2
With "pure" voluntary risks, the risk itself, with its associated challenge and rush of adrenaline, is the reward. Most climbers on Mount Everest know that it is dangerous and willingly take the risk. With a voluntary, self-controlled, applied risk, such as driving, the reward is getting expeditiously from A to B. But the sense of control that drivers have over their fates appears to encourage a high level of tolerance of the risks involved.
Terrorist Attacks and Comparable Risks, Part 1
There is a general agreement about risk, then, in the established regulatory practices of several developed countries: risks are deemed unacceptable if the annual fatality risk is higher than 1 in 10,000 or perhaps higher than 1 in 100,000 and acceptable if the figure is lower than 1 in 1 million or 1 in 2 million. Between these two ranges is an area in which risk might be considered "tolerable."
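To see where familiar numbers fall in those bands, here is a rough back-of-the-envelope calculation using round figures (a US population of about 300 million, roughly 3,000 deaths in the 9/11 attacks, and the road-death and murder counts quoted above); the helper function and scenario labels are just for the example.

```python
# Rough illustration of the regulatory thresholds using round numbers:
# US population ~300 million; ~3,000 deaths in the 9/11 attacks.

def annual_fatality_risk(deaths_per_year: float, population: float) -> float:
    """Annual fatality risk as a fraction of the population."""
    return deaths_per_year / population

US_POPULATION = 300e6

scenarios = {
    "9/11-scale attack every year": 3000,
    "US road deaths (circa 2002)": 42000,
    "US murders (2002)": 16000,
}

for label, deaths in scenarios.items():
    risk = annual_fatality_risk(deaths, US_POPULATION)
    print(f"{label}: about 1 in {1 / risk:,.0f}")

# A 9/11 every year works out to roughly 1 in 100,000 -- the bottom of the
# "unacceptable" band Mueller and Stewart cite -- while actual terrorism
# fatality rates outside war zones sit below the 1-in-1,000,000 line.
```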
Man-in-the-Middle Attacks Against SSL
A decade ago, I observed that commercial certificate authorities protect you from anyone from whom they are unwilling to take money. That turns out to be wrong; they don't even do that much.
Abstract: This paper introduces a new attack, the compelled certificate creation attack, in which government agencies compel a certificate authority to issue false SSL certificates that are then used by intelligence agencies to covertly intercept and hijack individuals' secure Web-based communications. We reveal alarming evidence that suggests that this attack is in active use. Finally, we introduce a lightweight browser add-on that detects and thwarts such attacks.
At a recent wiretapping convention, however, security researcher Chris Soghoian discovered that a small company was marketing internet spying boxes to the feds. The boxes were designed to intercept those communications -- without breaking the encryption -- by using forged security certificates, instead of the real ones that websites use to verify secure connections. To use the appliance, the government would need to acquire a forged certificate from any one of more than 100 trusted Certificate Authorities.
It's worth pointing out that, from the perspective of a law enforcement or intelligence agency, this sort of surveillance is far from ideal. A central requirement for most government wiretapping (mandated, for example, in the CALEA standards for telephone interception) is that surveillance be undetectable. But issuing a bogus web certificate carries with it the risk of detection by the target, either in real-time or after the fact, especially if it's for a web site already visited. Although current browsers don't ordinarily detect unusual or suspiciously changed certificates, there's no fundamental reason they couldn't (and the Soghoian/Stamm paper proposes a Firefox plugin to do just that). In any case, there's no reliable way for the wiretapper to know in advance whether the target will be alerted by a browser that scrutinizes new certificates.
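The Soghoian/Stamm paper proposes a browser add-on (Certlock) that flags suspicious certificate changes -- as I read it, chiefly a change in the issuing CA's country between visits. The sketch below illustrates the simpler, related detection idea of pinning the certificate fingerprint a site presented before and warning when it changes; it is a minimal sketch of that idea, not their code, and the pin-file name is invented for the example.

```python
import hashlib
import json
import socket
import ssl

# Minimal certificate-pinning sketch: remember the fingerprint of the
# certificate a site presented before, and warn if it changes on a later
# visit. Illustrative only; the Soghoian/Stamm Certlock add-on uses a
# different heuristic (tracking the issuing CA across visits).

PIN_FILE = "cert_pins.json"  # hypothetical local store of known fingerprints

def fetch_fingerprint(host: str, port: int = 443) -> str:
    """Connect over TLS and return the SHA-256 fingerprint of the leaf cert."""
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            der = tls.getpeercert(binary_form=True)
    return hashlib.sha256(der).hexdigest()

def check(host: str) -> None:
    try:
        with open(PIN_FILE) as f:
            pins = json.load(f)
    except FileNotFoundError:
        pins = {}

    fingerprint = fetch_fingerprint(host)
    known = pins.get(host)
    if known is None:
        print(f"{host}: first visit, pinning {fingerprint[:16]}...")
    elif known != fingerprint:
        print(f"{host}: WARNING -- certificate changed "
              f"(possible interception, or a benign re-issue)")
    else:
        print(f"{host}: certificate matches the pinned fingerprint")

    pins[host] = fingerprint
    with open(PIN_FILE, "w") as f:
        json.dump(pins, f)

check("www.schneier.com")
```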
Makeup to Fool Face Recognition Software
Friday Squid Blogging: Another Squid T-Shirt
Me in CRN
Schneier on "Security, Privacy, and the Generation Gap"
The Economics of Dueling
Cryptanalysis of the DECT
Abstract. The DECT Standard Cipher (DSC) is a proprietary 64-bit stream cipher based on irregularly clocked LFSRs and a non-linear output combiner. The cipher is meant to provide confidentiality for cordless telephony. This paper illustrates how the DSC was reverse-engineered from a hardware implementation using custom firmware and information on the structure of the cipher gathered from a patent. Beyond disclosing the DSC, the paper proposes a practical attack against DSC that recovers the secret key from 2^15 keystreams on a standard PC with a success rate of 50% within hours; somewhat faster when a CUDA graphics adapter is available.
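The DSC itself isn't reproduced here, but as a rough illustration of the building blocks the abstract names -- irregularly clocked LFSRs feeding a non-linear output combiner -- here is a toy keystream generator in that general style. The register lengths, taps, clocking rule, and combiner are all invented for the example; this is not the DECT Standard Cipher.

```python
# Toy "irregularly clocked LFSRs + non-linear combiner" generator.
# All parameters are made up for illustration; this is NOT the DSC.

class LFSR:
    def __init__(self, length, taps, state, clock_bit):
        self.length = length        # register length in bits
        self.taps = taps            # feedback tap positions
        self.state = state          # initial (key-dependent) state
        self.clock_bit = clock_bit  # bit consulted by the clocking rule

    def bit(self, i):
        return (self.state >> i) & 1

    def step(self):
        feedback = 0
        for t in self.taps:
            feedback ^= self.bit(t)
        self.state = ((self.state << 1) | feedback) & ((1 << self.length) - 1)

def keystream(regs, nbits):
    out = []
    for _ in range(nbits):
        # Majority-vote clocking: a register steps only when its clocking
        # bit agrees with the majority -- the "irregular clocking" part.
        clocks = [r.bit(r.clock_bit) for r in regs]
        majority = 1 if sum(clocks) >= 2 else 0
        for r, c in zip(regs, clocks):
            if c == majority:
                r.step()
        # Simple non-linear combiner: XOR of the output bits plus an AND term.
        a, b, c3 = (r.bit(r.length - 1) for r in regs)
        out.append(a ^ b ^ c3 ^ (a & b))
    return out

regs = [
    LFSR(17, (0, 3), 0x1ACE5, 8),
    LFSR(19, (0, 1, 2, 5), 0x2B7E3, 10),
    LFSR(21, (0, 2), 0x0F00D, 11),
]
print("".join(str(bit) for bit in keystream(regs, 32)))
```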