Guest blog by David R. Johnson
The Cross Community Working Group on ICANN Accountability (CCWG) spent part of its second meeting discussing a proposal to convert ICANN into an organization with “members,” potentially including the heads of Supporting Organizations and Advisory Committees (even the GAC!) who might be given the power to overrule ICANN Board decisions. Many of the other proposals on the list for discussion would require amendment of ICANN’s Bylaws. One surreal aspect of this discussion is that it is not clear whether the ICANN Board, which must approve any amendment of the Bylaws, would ever agree to such changes.
It is time for those with effective veto power over any resolution of the accountability issues, and the IANA transition, to show their hands.
ICANN appears to want the CCWG to show its hand first, and then react to whatever is proposed. That is a mistake, because it may lead the group to propose things the Board won’t want to accept, or devise oversight mechanisms that can be repealed by later ICANN Boards. And it may lead to proposals for untested structural changes that create new accountability issues of their own. CCWG can and should ask ICANN to indicate, right now, whether it will or will not agree to specific contractual provisions. The answers, whatever they are, would help immensely to guide the CCWG deliberations.
Bylaws are the way a corporation (including a non-profit) governs itself. Contracts, in contrast, are what allow it to govern others, and others to govern it. It would be much simpler to talk about ICANN’s accountability in terms of contractual provisions — enforceable promises ICANN is willing to make to other parties, now, regarding what it will and will not do. The other side of the contract could be registries and registrars, but a third-party beneficiary provision could allow anyone adversely affected by the breach of such promises to bring an arbitration before a panel that can issue binding decrees.
If this were viewed as a simple contract negotiation, between ICANN stakeholders (at least the registries, registrars and registrants who are regulated by ICANN contracts) and the ICANN corporation (Board), then the accountability issues would look very different depending on whether ICANN agrees to contract terms that limit its ability to impose rules on others. For example, ICANN might find it relatively easy to agree not to impose rules that bind registries, registrars and registrants (1) in the absence of consensus, (2) on topics unrelated to sound operation of the DNS, or (3) regarding content or online behavior that doesn’t itself threaten the operation of the DNS. In contrast, if ICANN decided to reserve the right to use its monopoly over entry into the root zone to impose contracts that regulate online behavior, the nature of the “accountability” issues would change dramatically. A global consumer protection regulatory authority, with the power to take down domain names in order to protect the “global public interest”, would have to be constructed with accountability mechanisms very different from those of an organization that makes only technical operational rules subject to consensus support.
The original deal to create ICANN was, at its core, an agreement by a registry to abide by future (unknown!) “consensus policies” that fit within a very limited range of topics. That limited range of topics, known as the “picket fence,” involved issues whose global resolution was needed to assure sound operation of the DNS. The requirement for consensus support meant that these new rules would be more like operational standards than regulations or laws — because they would have to compete for general approval among those affected. But because ICANN, in possession of the IANA contract, had the power to impose contracts of adhesion on new gTLDs as a condition of entry into the root, it has since decided to use that power to serve its view of the “global public interest.” Now the CCWG seems at times to assume that ICANN should have the power to define and impose rules in relation to some view of the “global public interest” — but it wants to create a new entity, drawn from ICANN’s internal structures, that can second-guess Board decisions by being treated as “members” who can select an independent review panel, approve a budget, dismiss Board members, and even overturn Board decisions.
Membership for all netizens in the internet community sounds like a great idea. But what will keep a new entity composed of a limited number of privileged members from itself developing an unrestrained view of its ability to impose rules on others in pursuit of some view of the “global public interest” — and to discipline a board that doesn’t share its views? In contrast, a clear contractual requirement for demonstrated consensus among affected parties before ICANN can make rules that bind others would more clearly achieve the limitation on unchecked power that any accountability regime must be designed to achieve. To make these checks and balances effective, all we need is a judicial branch (an independent arbitration panel) that can enforce the contract terms. And ICANN could, right now, take fears of abuse by a global “public interest” regulator off the table by making clear that it will agree to such contract terms.
In short, ICANN would be well advised to tell the CCWG at the outset of the process what it will contractually agree to, as a constraint on its power to tell other people what to do. It is one thing for a do-gooder non-profit deploying its own staff and resources to have some feedback from those it might affect. It is quite another thing for the IANA monopoly over entry into the root zone file to be used to impose “mandatory public interest commitments” and registrar accreditation requirements that exploit the power to revoke domain names (and registrar accreditation) to police online conduct and content — all without any clear provision for due process or the normal limitations on jurisdiction. If ICANN took the “public interest regulator” option off the table, the “accountability” discussion could be greatly simplified.
Unless CCWG knows at the outset what the ICANN board will agree to by contract, the process of discussing “accountability” is likely to create a proposal to establish another, even less accountable entity (the privileged “members”), who might be even more inclined than the current board to abuse the power inherent in the IANA monopoly. Should we have such a coup — or just negotiate a contract that puts back inside the original box ICANN’s powers to impose rules on others?
My Twitter feed has exploded with lots of theorizing about whether or not North Korea really hacked Sony. Most commentators are saying "no", pointing to the rather flimsy public evidence. They may be right — but they may not be. Worse yet, we may never know the truth.
One thing is quite certain, though: the "leaks" to the press about the NSA having concluded it was North Korea were not unauthorized leaks; rather, they were an official statement released without a name attached. Too many major news organizations released their stories more or less simultaneously. To me, that sounds like an embargoed press release. (One is tempted to imagine multiple simultaneous brush passes from covert operatives to journalists, but I suspect that emails and/or phone calls from individuals known to the reporters are much more likely.)
Before going further, let me add a disclaimer: I have no idea if North Korea is actually involved. I also have no idea how the intelligence community actually did come to its conclusions. What follows is speculation, not fact.
Nick Weaver has given a good explanation of how the NSA could have made the determination, just based on SIGINT. However, it wasn't necessarily done by SIGINT alone. Suppose, for example, that the CIA (or perhaps the South Koreans) had an agent in North Korea's Unit 121. In an era when the head of foreign operations for Hezbollah was supposedly a double agent for the Mossad and the CIA had a mole in Cuban intelligence, one can't rule out such scenarios.
There are many more possible ways to do attribution (I like this one), but most are based on sensitive sources and methods. Translation: they're not going to tell us, and they're right not to do so.
It's also very possible that their attribution is simply wrong:
In the words of a former Justice Department official involved with critical infrastructure protection, "I have seen too many situations where government officials claimed a high degree of confidence as to the source, intent, and scope of an attack, and it turned out they were wrong on every aspect of it. That is, they were often wrong, but never in doubt."
People can jump to conclusions. Worse yet, in intelligence (and unlike the criminal justice system), you never get proof beyond a reasonable doubt, and that's even if you're being honest. If someone doesn't like your answers and wants better ones — well, think Iraqi WMDs. Besides, there's always the chance that the government is lying.
Let me sum up.
- Drawing positive conclusions from the public evidence is incorrect. The NSA and the CIA may (or may not) have many other details they'll never disclose. The much-ballyhooed language setting, for example, is completely useless. Externally observable behavior and behavioral or code similarities to other attacks can be more useful. (See Kim Zetter's wonderful book on Stuxnet for a description of how some of the forensic analysis was done, e.g., don't rely on compilation dates, but do look for when a file was uploaded to a virus company's database.)
- Similarities (and especially reuse) of code, infrastructure, and techniques to other attacks can be a very strong indicator. The FBI did cite exactly these aspects in their overt press release blaming North Korea.
- There are many other information sources that intelligence agencies use. We don't know what they are, and they won't tell us.
- They could still be wrong — but we probably won't know why.
Bottom line: it's plausible, but not publicly provable.
Written by Steven Bellovin, Professor of Computer Science at Columbia University
Follow CircleID on Twitter
Losing your monopoly must be hard.
True, few companies ever experience that particular breed of angst, but if Verisign's reply to even modest success in the new gTLD marketplace is any indication, it must be very hard to say goodbye.
We understand why they're worried:
- New .COM quality is abysmal. The quality of newly registered .COM names is dropping and has been for years. And there is nothing Verisign can do about it. So welcome to the fire sale.
- New .COM is as cheap as .XYZ. The average retail price of such .COM names has dropped to about $2.50 (one-tenth the average retail price of new TLDs, and that price factors in the free new TLD names).
- New .COM renewal below 50%. The 2012 cohort of .COM new registrations renewed at just above 50%, the 2013 cohort at 50%, and the 2014 names will renew at below 50%.
- .COM is affinity-less. New TLDs such as .CHURCH and .PHOTOGRAPHY appeal to specific communities. Verisign cannot change the meaning of .COM to appeal to a community.
- Here come Google and Amazon. Meanwhile, Google applied for 101 TLDs and Amazon for 78, almost none of which launched in 2014 — not to mention all the other large corporations launching brand and generic TLDs in 2015. (Johnson & Johnson paid over $3 million yesterday for the brand spanking new .BABY TLD.)
- Low consumer awareness... today, but not tomorrow. Consumers are not broadly aware of new TLDs, yet still there are 3.5 million names registered in them.
- The registrar channel is improving regarding new TLDs. Registrars are developing their search and purchase paths. These tools are improving as new TLD awareness and demand increase.
- It's only just started! Only a quarter of new TLDs are out, for an average of just six months, yet they have done 3.5 million new registrations at an average retail price of $25 (counting the free ones). In fact, I signed the .SCHOOL, .NEWS, .FOOTBALL and .GOLF registry contracts yesterday. You'll see these launched along with hundreds of others in 2015.
- The best are yet to come. That is 3.5 million names on mostly single-applicant TLDs. What will the highly contended TLDs that launch in 2015 do?
- Their own clients have jumped into competition. Verisign's clients applied for 220 new TLDs. Those clients think new TLDs will succeed even if their back-end provider, Verisign, does not.
- Usage of names in new TLDs is going up. We are monitoring usage on a month-to-month basis in Donuts TLDs and every month, usage increases.
- Registrars can do math. $25 retail with greater than 80% renewal for new TLDs vs. $2.50 retail with less than 50% renewal for new .COM: ancient .COM renewals are on autopilot and easy money for registrars, but for new names, registrars will figure out that the long-term customer value of new TLDs is greater than that of new .COMs.
- Check out .NET. If you don't believe what's going to happen in .COM, look at what's already happening in .NET.
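The registrar arithmetic in that "Registrars can do math" point can be sketched with a simple geometric-series model. This is illustrative only: it assumes a flat renewal rate every year, and it ignores discounting and wholesale margins.

```python
def lifetime_revenue(price: float, renewal_rate: float) -> float:
    """Expected retail revenue from one registration over its lifetime.

    With a flat annual renewal rate r, expected years paid is the
    geometric series 1 + r + r^2 + ... = 1 / (1 - r).
    """
    if not 0.0 <= renewal_rate < 1.0:
        raise ValueError("renewal rate must be in [0, 1)")
    return price / (1.0 - renewal_rate)

# Figures from the post: $25 at 80% renewal vs. $2.50 at 50% renewal.
new_tld = lifetime_revenue(25.00, 0.80)  # roughly $125 per name
new_com = lifetime_revenue(2.50, 0.50)   # roughly $5 per name
print(f"new TLD: ${new_tld:.2f}, new .COM: ${new_com:.2f}")
```

Even under these rough assumptions, the expected lifetime retail revenue differs by more than an order of magnitude, which is the point being made about registrar incentives.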
Maybe they don't see their own predicament because they're focused on generating fear, uncertainty and doubt, like the following:
1) "We've always had alternatives to .COM"
What is the fantastic alternative TLD that photographers always had before .PHOTOGRAPHY? Verisign says .BIZ, .INFO, .TV, .ORG, .NET, etc., are as attractive to photographers as .PHOTOGRAPHY. That .BIZ, .INFO, .TV, .ORG, .NET, etc., have the same affinity for florists as does .FLORIST. That .BIZ, .INFO, .TV, .ORG, .NET, etc., work for churches as well as does .CHURCH. They're really saying that??
It's understandable, since Verisign's pre-new gTLD marketing tag line could have been: ".COM: Because What Choice Do You Have?"
2) "The number of UDRP/URS cases in new TLDs is more than 15x higher than all other domains."
Apples, meet oranges. Of course there are more URS cases in new TLDs than .COM — the URS is not available in .COM. But it should be.
3) "Plurals and synonyms of new gTLDs [are]… making it difficult and confusing for registrants"
Verisign's registrant customers disagree. They use millions of synonyms and plurals in the .COM namespace apparently without much confusion. Verisign does not ban their usage there. People have been able to differentiate these words in English and other languages for centuries. I think they can cope with them in the Internet namespace.
We actually thought it would take a little longer to catch the attention of the 800-pound gorilla in the naming industry. The tired old thinking embodied in Verisign's response to our early success is one of the best validations yet that Donuts and other new registries are on the right track.
Written by Paul Stahura, Founder and CEO Donuts Inc.
Two weeks ago I blogged about ICANN's astonishingly lucrative domain auctions. At that time, they'd raised $26.7 million. Now, two auctions later, they're up to about $33 million.
Yesterday's two auctions were for .MLS and .BABY. The former, for those who aren't deep into the real estate biz, stands for Multiple Listing Service, the system that lets you list a house with one broker so that all the other brokers can sell it. The Canadian MLS association filed two applications, one community-supported and one not. ICANN rejected the community one on the (not unreasonable) grounds that MLS means a lot of things other than Canadian real estate, from multiple listing services in other places to Major League Soccer. The Canadians went up against Afilias, and after six rounds paid $3,359,000 for their TLD.
The other was .BABY, with six applicants. It also went six rounds, with healthcare giant Johnson & Johnson, maker of Johnson's Baby Oil (which, despite its name, is not made from actual babies), winning for $3,088,888.
Add those to the previous haul and ICANN's now got about $33 million. There are still five more auction dates running through May 2015. The current schedule has 30 names set to be auctioned, and 20 more with auctions scheduled but on hold at the request of the applicants, presumably to see if they can work something out. Another 28 are not yet eligible for auction for various reasons, mostly unresolved third party objections.
At this rate, $50 million seems conservative. What would you do with $100 million?
Written by John Levine, Author, Consultant & Speaker
A fledgling attempt to create a new global Internet governance clearinghouse has run into trouble as leading business and civil society organizations said they are not yet prepared to participate in the NETmundial Initiative (NMI) championed by ICANN President Fadi Chehade.
In highlighting that there remain several unanswered questions, the Internet Society (ISOC), Internet Architecture Board (IAB), and International Chamber of Commerce (ICC-BASIS) raised serious concerns about whether NMI, which sought to empanel a council to direct global Internet governance initiatives, was consistent with its core principles of openness and accountability among multiple stakeholders.
Many of us in the multistakeholder community share the concerns of those organizations and appreciate them for prompting a second look at how we approach the next phase of global multistakeholder policy development for the Internet.
The challenges set forth in the NETmundial Outcomes Document amount to a blueprint for continued improvement of the multistakeholder model; they should be addressed, discussed, and advanced by the broadest possible cross-section of the global Internet community.
Even with all Internet stakeholders working collectively toward solutions, it will take quite some time for the world to implement the Outcomes Document's "Roadmap for the Future Evolution of the Internet Governance". But the responsibility for implementation should not be centralized in an elite and exclusive leadership group.
The NMI's perceived exclusivity threatens to alienate Internet stakeholders who will be critical to the success of any multistakeholder outcomes. Business groups, governments, and civil society outside of NMI's limited leadership structure aren't likely to buy in to outcomes that they didn't help to produce, and may stay away from the new body altogether.
ISOC's request for more information regarding the scope and purpose of the NMI approach may have been the first and most visible, but it likely won't be the last, given that many sectors still feel there are remaining questions. And without truly representative buy-in from stakeholders, NMI will not have the legitimacy it needs to tackle such an ambitious agenda.
NMI organizers, including ICANN, should give serious consideration to the questions raised by ICC-BASIS in its letter to the NMI Transitional Committee. Questions related to industry participation and the apparently open-ended nature of NMI are particularly germane. Even the Internet Governance Forum, which now plays a critical role in the global Internet governance landscape, was initially launched with only a five-year charter.
If anything, the issues surrounding the NMI approach highlight the limitations of ICANN as the global leader on Internet issues that fall outside of its essential but limited remit. Now that ICANN is seeking to strengthen its accountability as it assumes the IANA functions without U.S. oversight, it is critical that ICANN demonstrate it will focus on its core technical mission.
While ICANN's strong organizational structure and regular meetings make it an attractive focal point for convening stakeholders, the real success of the multistakeholder process in the wake of NETmundial will be measured by how effectively other organizations and governments initiate discussions and implement solutions in the forums where they are most appropriately addressed.
When you have so many global stakeholders addressing such complex Internet issues, a single point of focus can become the single point of failure.
Written by Steve DelBianco, Executive Director at NetChoice
In an announcement on Tuesday, ICANN reported that it is investigating a recent intrusion into its systems. The organization believes a "spear phishing" attack was initiated in late November 2014, involving email messages crafted to appear to come from its own domain being sent to members of its staff. The attack resulted in the compromise of the email credentials of several ICANN staff members, according to the report. ICANN has also disclosed that the compromised credentials were used to access other ICANN systems besides email, including the Centralized Zone Data System and the ICANN GAC wiki.
Many with financial interests in new gTLDs, such as Donuts, have painted a rosy picture of how new gTLDs create greater availability of meaningful domain name options that the global masses have been waiting for. Their message seems to be: FINALLY, there is an alternative to .com in new domain extensions like .guru, .photography, .blackfriday and .tips. But the reality is that we have always had options other than .com to choose from when registering a domain name. The challenge isn't choice, it's relevance and credibility.
While all of this bravado is par for the course in marketing new products, the real shame is that the registrants whom these new gTLDs were supposedly intended to serve may be the ones who suffer in the end, when they invest their time and money into branding or rebranding their businesses with new gTLDs without all of the facts.
It's been well under a year since the first new gTLDs became available for registration; however, we are already seeing several troubling trends that are being glossed over, including:
- Data reported by ntldstats.com shows a considerable drop-off in new registrations for the top five new gTLDs. Even with an assumed renewal rate upwards of 80 percent, as recently forecast by Donuts, if there is no solid growth to offset even a seemingly small loss, at the end of the day — or financial year — most new gTLDs will see their customer base shrink.
- The number of UDRP/URS cases filed in new gTLDs is more than 15x higher than all other domains, indicating that brands are not going to play the defensive registration game (For more info and a list of UDRP cases filed, read The Domains' article).
- Plurals and synonyms of new gTLDs continue to be deployed, making it difficult and confusing for registrants.
- About a fourth of the new gTLDs registered to date were given away for free or registered by the registry or a related party, according to The Domains, raising questions about the behavior and motives of those who cite registration numbers as a sign of the popularity of a new TLD. DomainNameWire recently published a relevant analysis of new gTLDs titled, "Lies, damned lies, and new TLD statistics."
- Noted DNS expert and ICANN Security Advisor Paul Vixie warned that some new gTLDs will be blocked by many because of collision security issues. If new gTLDs don't resolve everywhere, what will that do to their value?
These are but a few of the realities that you won't hear about from some of the new gTLD sellers. This is why new gTLDs do themselves a disservice by comparing themselves to .com, which has a record of growth and stability. Domainers, Internet experts and business owners alike agree.
All one has to do is take a look at the "data" being used to tout the benefits of new gTLDs to see that the argument to invest should be met with strong skepticism. For example, a recent CircleID post from the CEO of Donuts states, with regard to search, "Internet addresses registered in new gTLDs are holding their own against — and in some cases outperforming — comparable addresses registered in legacy domains like .COM." However, the examples cited clearly show the comparable .com domains performing better. This is equally true for .com domains that don't contain any of the keywords of the new gTLDs referenced. That's because the most important factor for search is the quality of the content on the site, while the most important factor in domain registration is choosing a globally recognized, used and trusted TLD like .com, a point echoed recently by Google's John Mueller. Mueller, whose title is Webmaster Trends Analyst at Google, felt the need to set the record straight and wrote the following on his Google+ page on Dec. 11, 2014:
"It feels like it's time to re-share this again. There still is no inherent ranking advantage to using the new TLDs… If you see posts claiming that early data suggests they're doing well, keep in mind that this is not due to any artificial advantage in search: you can make a fantastic website that performs well in search on any TLD."
There's no question that there is room for additional gTLDs that make sense, but that's the key — making sense. It's not about shorter, more keyword rich names — as some with vested financial interests in the new gTLDs keep saying. If that were the case, new gTLDs like .xyz would never have been delegated.
There has been a flood of options and a land rush to secure the best property in this new online real estate, but that is where it seems to have ended for so many of these new gTLDs that don't make sense. The CircleID post cited above says, "Even as new gTLDs grow exponentially in popularity, we are many years away from any scenario in which registrants have difficulty finding available, keyword-rich names in the new gTLD space." I guess the question is: Do they want to find space there?
As the expansion of gTLDs brings about a massive range of new — and often similar — domain extensions, it increases the likelihood that consumers will be unsure which new gTLD extensions provide a secure and appropriate experience. Moreover, the process (or lack thereof) to secure one of these new gTLDs is often difficult and confusing, as noted in this GeekWire article titled, "Buying a new gTLD domain name? The process .sucks."
For example, in this new world of hundreds of domain registration options, if I'm a photographer trying to decide on a domain name, I can choose .photography, .photo, .pics, .photos, .camera or .pictures. I may even consider .exposed or .digital, or perhaps I am the photography.guru. In this scenario, the average user could become overwhelmed and confused; not only in trying to choose an appropriate extension, but also in trying to figure out which registrar offers that extension. Similarly, their customers will now need to remember which similar sounding extension they need to type in to find their vendor. Or, perhaps they'll just go with whoever resolves to .com because that's what they know and trust. That's why I elected to secure about two dozen personal domains on .com, where I had no problem registering domains to suit my needs and I know that .com will be here for the long haul.
Encouraging small business owners to put their online presence in the hands of unknown, untrusted and unestablished TLDs, an unknown number of which likely won't be around in the next year or two, is simply bad business advice in my opinion. In addition to paying to rebrand themselves, they will need a high SEM budget to rank decently on a new gTLD, and then, if that gTLD fails, they will have to pay to do it all over again on a different TLD. This is a real risk that is being totally glossed over. The real costs that should be discussed are those of building and marketing an online presence, and the natural conclusion of those without a vested financial interest in seeing new gTLDs succeed is that it behooves small businesses to do that on a TLD with staying power, like .com.
Established companies like those cited in this NetworkWorld article have passed by the hundreds of new options to .com because they know that .com is a smart and secure investment. Similarly, The Domains reported recently that the week of Nov. 15 — only two weeks before Black Friday — 559 domain names ending in .com were registered containing the term "blackfriday." During that same period, there were only 16 domain name registrations in the .blackfriday new gTLD extension. If there is such enthusiasm for new gTLDs and such a lack of availability in .com, as some have suggested, this wouldn't happen.
There has been a lot of hype about new gTLDs and there's no doubt that some people will find new gTLDs that work for them. Registrants should make the choice that works best for them, but they're entitled to know all of the facts before making that decision. The better informed they are, the better decisions they will make.
The good news for those who make the decision to entrust their online business to a new gTLD and find it didn't work out as expected is that .com will still be there for them when they need to make the switch, just as it has been for the last 30 years, and just as it has been for so many others who have already made it.
Written by Jeannie McPherson, Senior Manager of Corporate Communications, Verisign
Kieren McCarthy reporting in the Register: "The US government has posted a step-by-step guide to how it authorizes changes to the internet's root zone — the heart of the world's domain-name system. The 16-page slide deck published by the Department of Commerce's National Telecommunications and Information Administration (NTIA) sheds light on what has been a contentious and largely secret process for the past 15 years."
The recent huge security breach at Sony caps a bad year for big companies, with breaches at Target, Apple, Home Depot, P.F. Chang's, Neiman Marcus, and no doubt other companies that haven't admitted it yet. Is this the new normal? Is there any hope for our private data? I'm not sure, but here are three observations.
Systems are so complex that nobody understands them
This week Brian Krebs reported on several thousand Hypercom credit card terminals that all stopped working last Sunday. Had they all been hacked? No, they were doing exactly what they'd been programmed to do.
When these terminals send customer transaction data to the bank, the session between the terminal and the bank is protected by a cryptographic certificate similar to the ones used in secure Web sessions (the ones with the lock or the green address bar). Each certificate includes an expiration date. Those terminals are pretty old, and their certificate was issued in 2004 with an expiration date ten years in the future — a date that arrived last Sunday. Expired certificates are no longer valid, either for web sessions or for credit card terminals. Oops. Setting an expiration date is a reasonable thing to do for a variety of technical reasons, but someone does have to remember to renew it.
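The check itself is trivial; what failed was the organizational memory around it. A minimal sketch in Python, using the `notAfter` timestamp format that Python's `ssl` module reports for peer certificates (the date below is hypothetical, standing in for a 2004-issued certificate with a ten-year lifetime):

```python
from datetime import datetime, timezone

# Timestamp format used by Python's ssl module for a certificate's
# "notAfter" field, e.g. "Dec 7 00:00:00 2014 GMT".
NOT_AFTER_FMT = "%b %d %H:%M:%S %Y %Z"

def days_until_expiry(not_after: str, now: datetime) -> int:
    """Days until the certificate expires; negative once it already has."""
    expires = datetime.strptime(not_after, NOT_AFTER_FMT).replace(
        tzinfo=timezone.utc
    )
    return (expires - now).days

# A certificate that expired three days before the check was run:
now = datetime(2014, 12, 10, tzinfo=timezone.utc)
print(days_until_expiry("Dec 7 00:00:00 2014 GMT", now))  # negative: expired
```

A cron job running something like this against every deployed certificate, alerting months in advance, is cheap insurance — the hard part, as the rest of this story shows, is knowing where all the certificates are.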
Back in 2004, the people who designed the terminals either assumed that the terminals would be replaced by now, or that well in advance of the expiration date the vendor would update the terminals with new certificates. If they made the first assumption, they came pretty close, since those terminals will have to be replaced by October 2015 to handle the long delayed switch to cards with chips.
The second assumption no doubt seemed reasonable, but during the past decade Hypercom was sold, merged, spun off and re-merged, so I'd be surprised if there were many (any?) people around now at the current company, Equinox, who remembered that the certificates would expire.
Even if they had remembered, it's not clear how easy it would have been to find all the terminals that needed reprogramming. Hypercom probably sold the terminals to specialist "acquirers" such as Elavon, Vantiv, and First Data, who then sold or leased them to the merchants. There's been plenty of M&A in that business, too, and their records about exactly which merchant has exactly which kind of terminal are unlikely to be in perfect order.
It wouldn't be technically hard to have the server software with which the terminals communicate look for signatures using certificates that would expire soon, but the acquirers are often just sales agents for banks, so the servers belong to someone else, and the signature-checking software is security-critical, so it's not something they will change quickly. Again, in principle this could all be made to work, but it's a lot of moving parts at a lot of different organizations to deal with a problem that sounds extremely obscure and hypothetical until it happens.
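To give a sense of how modest that server-side check would be: here is a sketch in Python, using the `notAfter` string format that the standard library's `ssl.SSLSocket.getpeercert()` reports. The warning window and the example dates are my own illustrative choices, not anything from the actual payment systems:

```python
import calendar
import ssl

WARN_WINDOW = 90 * 24 * 3600  # flag certificates within 90 days of expiry

def cert_expiring_soon(not_after: str, now: float) -> bool:
    """Check a peer certificate's notAfter date against a warning window.

    `not_after` uses the format getpeercert() reports,
    e.g. "Dec  7 00:00:00 2014 GMT"; `now` is seconds since the epoch.
    """
    expires = ssl.cert_time_to_seconds(not_after)  # parses the GMT date string
    return expires - now <= WARN_WINDOW

# Hypothetical terminal certificate expiring in December 2014, seen in October:
now = calendar.timegm((2014, 10, 1, 0, 0, 0, 0, 0, 0))
print(cert_expiring_soon("Dec  7 00:00:00 2014 GMT", now))  # True: 67 days left
```

A dozen lines of logic; the organizational problem of getting it deployed on someone else's security-critical servers is the real obstacle.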
Multiply this kind of chaos by several thousand, and you get the state of corporate computer security. The priority is always to make things work NOW, not to keep systems simple for long-term stability. With all of the different parts, companies frequently make what are in retrospect obviously foolish mistakes, such as encrypting data in some but not all of their network traffic.
The incentives are wrong
The Sony breach is different from the others because Sony itself (and its employees) is suffering the pain. In all the other cases, what was stolen was primarily customer data, so the burden falls on the customers to notice bogus transactions, on the customers' banks to deal with the bogus transactions and issue new cards, and possibly on the merchants whose transactions got charged back. Target, Home Depot, and so forth certainly got bad publicity, and in some cases they may be the target of lawsuits, but no money went out the door of Target et al. to the crooks.
Lacking direct financial losses, the incentives for internal security people to find breaches like these are not compelling. More than once (as discussed in the next section) I've given a company direct evidence that they have a security breach, and they've just denied that there could be a problem. Investigating my report involves work, and if they find out I was right, they look bad since they failed to prevent it. I'm not sure how to realign the incentives, but it's got to happen.
Banks are just as bad, since their goal is generally more to avoid being sued or sanctioned by regulators than to minimize fraud. If a bank can show that it was doing the same thing as every other bank (an approach usually called "best practices"), it's generally off the hook even if those practices are obviously ineffective. This leads to nonsense like those questions asking you about your pet's favorite color, which have somehow been blessed as a substitute for actual two-factor authentication, which would cost money. (There are banks that do two-factor authentication, but not many in the U.S.)
Again, this good-enough-herd mentality is hard to fix. We clearly don't want a bank to do worse than its peers, but we also don't want banks all to use security measures that don't work.
Some companies are a lot better than others
I don't have any direct data about credit card security, but I have a lot about e-mail address security. Whenever I do business with a company online or sign up for their mailings, I use a unique e-mail address. If I start getting mail to that address from someone else, I know it's leaked.
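Anyone can replicate this tracking trick. If you don't control your own mail domain, plus-addressing (supported by Gmail and many other providers) is the easiest variant; a minimal sketch, with names that are purely illustrative:

```python
def tagged_address(local: str, company: str, domain: str) -> str:
    """Build a unique per-company address using plus-addressing.

    Mail to alice+examplenews@example.com still lands in alice's mailbox,
    but the tag reveals which company's copy of the address leaked.
    """
    tag = "".join(c for c in company.lower() if c.isalnum())
    return f"{local}+{tag}@{domain}"

print(tagged_address("alice", "Example News", "example.com"))
# alice+examplenews@example.com
```

One caveat: spammers know about plus-addressing and sometimes strip the tag, so unique addresses at a domain you control are more reliable.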
What I have found is that some companies leak and some don't. The Wall Street Journal has repeatedly leaked my address to spammers, but the New York Times hasn't. TD Ameritrade blew off repeated reports (including several from me) that e-mail addresses were leaking, until they found malware running on one of their internal servers. Vanguard hasn't leaked any of my information. The Economist and Forbes (more than once) have leaked, the Atlantic Monthly hasn't. There are a lot more, but you get the idea.
Maybe the non-leakers are just lucky, but I think it's more likely to be a better security culture. Companies are very skittish about talking about their security practices, usually for good reason since the questions tend to come when they've screwed up. Perhaps highlighting companies that are succeeding would work better.
Written by John Levine, Author, Consultant & Speaker
DMARC is extremely useful, yet I've heard some vendors are putting their implementations on hold because of the IETF DMARC working group. You really shouldn't wait though — it's been in wide use for nearly three years, enterprises are looking at DMARC for B2B traffic, and the working group charter is limited in its scope for changes.
Let's compare this to a similar situation in the past. When I was on a panel at the INBOX Conference in 2006, I told the audience — which included email appliance makers, software vendors, and email service providers (ESPs) — that they should already be offering SPF and be preparing to support DKIM, which would be finalized Real Soon Now (in fact it took another year). But I did not tell them they should be implementing DomainKeys…
DomainKeys had been announced two years earlier and was an effective technology, but it was not widely deployed — a number of senders were using it, but few mailbox providers or companies were using it to filter inbound messages. Meanwhile the IETF DKIM working group was creating something different from and incompatible with DomainKeys. So the decision to wait for the DKIM working group to finish was reasonable.
Today, the circumstances around DMARC are very different. DMARC has been filtering the email sent to over 80% of consumer mailboxes in the US alone for almost three years now, and over 80,000 active domains have published DMARC records. It's already popular with savvy email senders and domain owners for the light it sheds on where email using their domains is coming from. And just as enterprises became enthusiastic users of TLS for B2B email, DMARC is being evaluated as an additional protection for sensitive B2B email channels.
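For readers who haven't looked at one, a published DMARC record is just a DNS TXT record of semicolon-separated tag=value pairs, so consuming one is straightforward. A minimal sketch (the record and report address below are illustrative, not any real domain's policy):

```python
def parse_dmarc(record: str) -> dict:
    """Split a DMARC TXT record into its tag=value pairs."""
    tags = {}
    for part in record.split(";"):
        part = part.strip()
        if part:
            key, _, value = part.partition("=")
            tags[key.strip()] = value.strip()
    return tags

# Illustrative record: quarantine failing mail, send aggregate reports.
record = "v=DMARC1; p=quarantine; rua=mailto:dmarc-reports@example.com"
print(parse_dmarc(record)["p"])  # quarantine
```

The `p` tag is the domain owner's requested policy for mail that fails authentication, and `rua` is where receivers send the aggregate reports that give senders that visibility into who is using their domains.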
The IETF DMARC working group is chartered to fix some important interoperability issues with forwarded email, mailing lists, and other "indirect" mailflows. However, the working group is not chartered to make major changes to the protocol, the way the DKIM working group was — maintaining interoperability with existing implementations is a key objective.
Several vendors and services have already integrated DMARC filtering into their products (list at dmarc.org), and you can bet more have it in the pipeline. So if you're planning the next release of your email appliance, MTA, or related software, you should make sure you've got DMARC support in the works too.
Written by Steve Jones, Consultant - Programmer - Strategist