Feed aggregator

ICANN vs. the Federal Reserve

CircleID - Sun, 2016-05-01 15:28

The Internet is about to go independent.

After years of support and supervision, the U.S. Government is about to irrevocably relinquish its control over the Internet by transferring its authority to an independent corporation named ICANN.

As part of this push, the current chairman of ICANN published this article in the Wall Street Journal (excerpted below under fair use): Broadening the Oversight of a Free and Open Internet

What's truly amazing about this piece is how well it's done.

I found myself nodding in agreement with just about everything in it, especially the reasons why the U.S. Government should do the transfer, and the vision for the organization that takes over this role.

It was only after digesting this piece that I remembered we are talking about ICANN here. You see, the Internet community has been deceived once before ...

On July 2nd, 1997, the NTIA started the process to transfer its authority over the Internet via a Request for Comments. Then, after thousands of comments were submitted by a wide range of Internet stakeholders, they were compiled into a "Green Paper" which stated:

Principles for a New System. The Green Paper set out four principles to guide the evolution of the domain name system: stability, competition, private bottom-up coordination, and representation.

As the Green Paper devolved into the White Paper, and the White Paper devolved into ICANN, the "representation" principle was the one that was constantly at issue.

Initially, ICANN fought against any representation for Internet users. But, in order to get the contract, ICANN begrudgingly allowed user representation on their board. For North America, that person was Karl Auerbach, one of the first elected representatives in cyberspace.

Unfortunately for everyone, this was a token gesture. Karl was excluded from all important decision making. Then when he complained and tried to change the system, he was removed from the ICANN board, along with all user representation.

Today, we don't know how ICANN came about, we don't know who is behind it, and we don't know how decisions are made in ICANN.

What's really going on here is that the powers that be are about to grant a perpetual franchise of control over who is who, and what is what, on the Internet.

It's similar to the transfer of authority over the money supply by the U.S. Congress to the Federal Reserve. Except in ICANN's case, once the transfer is done, there will be no way to undo it.

So while it may be true that many people and organizations support the transfer of authority from the U.S. government to an independent entity, many of the names mentioned in the article below have also expressed concerns with ICANN (in its present form) being that entity.

Today, ICANN continues to refuse all attempts to put in place some form of user representation, and continues to operate in secret with no transparency or sunshine.

If this transfer is allowed to go forward as is, I predict that, years from now, we will be watching with bated breath for decisions from ICANN, just like we watch for announcements from Janet Yellen today:

Will the Fed raise interest rates by 0.25%?

Will ICANN limit what transgender people can say in cyberspace?

Either way, we'll watch these decisions from afar, and wonder how those people came to have so much power and control over our lives.

Jay Fenello,
Phone: 770-516-6922

* * *

P.S. Here's the referenced article:

Broadening the Oversight of a Free and Open Internet
Stewardship by the global community will guard against 'capture' by one group or government.
By Stephen D. Crocker
April 19, 2016 6:31 p.m. ET

Today the global Internet connects three billion of us. While it has grown, the world has shrunk. Geographic distance has become less relevant as we can more easily access information, communicate and reach new customers.

The Internet has matured because it is free and open, led by the private economy and based on voluntary standards. It is built on the principles that define America: free enterprise and limited government.

It is those same ideals of privatization that frame a proposal recently sent to the National Telecommunications and Information Administration that would transition stewardship of some key Internet technical functions away from the U.S. to a diverse and accountable global Internet community.

You can read the rest here:
Broadening the Oversight of a Free and Open Internet

Written by Jay Fenello

More under: ICANN, Internet Governance, Law, Policy & Regulation

Categories: External commentary

Internet Governance in Transition: The ITU as a Battleground for Rival Visions

CircleID - Fri, 2016-04-29 16:30

This article was co-authored by Ambassador Gross (chair of Wiley Rein's International & Internet Practice), Carl R. Frank, Umair Javed, and Sara M. Baxenberg (members of Wiley Rein's Telecom, Media & Technology Practice).

* * *

During the past few years, the International Telecommunication Union (ITU) has been a battleground where governments promote rival visions of how the Internet should be governed. Although there has been a recent cease-fire as Internet governance debates have focused more on the role of ICANN, those skirmishes may soon restart at the ITU. Indeed, Internet-related issues already are moving from the periphery of discussions in the ITU's Telecommunication Standardization Sector (ITU-T) to the top of the agenda at many ITU-T study group meetings. These discussions likely will culminate at the upcoming World Telecommunication Standardization Assembly (WTSA-16) — another significant meeting that will probably help to shape the ITU's future role and activities regarding Internet-related public policy.

As a result, businesses and others in the communications, Internet, and related industries would be wise to monitor carefully the domestic and international preparations by governments leading up to WTSA-16. In essence, developments at WTSA-16 could have important and potentially harmful consequences for companies and others that might find themselves subject to new ITU oversight or even regulatory burdens.

What's at Stake

There has been considerable controversy in recent years over the ITU's role in Internet governance. Debates have been dominated by two factions with fundamentally different views. Some governments, such as the United States, those in Europe, Japan, and others, support a role for all stakeholders in Internet governance and have pressed for a multistakeholder approach that enjoins national governments to participate in Internet governance issues on equal footing with the private sector, civil society, and academia. Other governments, including China, Russia, and many from the Middle East, support a more robust role for governments in Internet governance and have favored multilateral or intergovernmental arrangements, where states are the primary actors in policy discussions administered by the ITU. Indeed, at a recent meeting in April, foreign ministers of Russia, China, and India agreed on "the need to internationalize Internet governance and to enhance in this regard the role of [the ITU]."

A few years ago, these debates came to a head at the World Conference on International Telecommunications (WCIT-12), a treaty conference that reviewed an important 1988 international telecommunications treaty, the International Telecommunication Regulations (ITRs). WCIT-12 saw a number of proposals from governments favoring multilateral mechanisms and expanded legal authority of the ITU regarding a variety of Internet-related matters. Because of fundamental disputes over the appropriate role of the ITU regarding the Internet, for the first time in the ITU's 150-year history, a significant number of countries (including the United States and most of Europe) affirmatively declined to sign the revised treaty.

More recently, however, at the 2014 Plenipotentiary Conference held in Busan, South Korea (PP-14), governments decided to avoid fundamental changes to the ITU's jurisdiction and instead appeared to embrace a more multistakeholder approach to Internet policymaking. Plenipotentiary Conferences, held every four years, are treaty conferences that set the ITU's general policies and revise key legal texts of the ITU, including the Constitution and Convention. Despite calls from some governments to incorporate new ITU provisions to oversee Internet issues related to domain name governance, cybersecurity, privacy, data protection, and content, governments ultimately decided not to make such changes at that meeting. In fact, governments agreed to withdraw proposals, previously endorsed by Russia, China, Saudi Arabia, and others, aimed at providing the ITU with legal authority to coordinate global policies related to Internet governance.

Importantly, and little noticed at the time, decisions at PP-14 nevertheless subtly but materially broadened Internet-related work at the ITU in other, potentially significant ways. These changes were accomplished through several Resolutions adopted at Busan, reflecting a strategic shift by some governments toward making significant changes merely through the adoption of Resolutions (which drive the ITU's agenda for a four-year cycle and beyond) rather than through the more controversial process of changing the ITU's jurisdiction by amending the Constitution and Convention. Notably, many of the new or amended PP-14 Resolutions refocused the ITU's work beyond telecommunications and into more problematic areas such as Internet content and applications, cybersecurity, and Internet policy, among others. The impact of this series of Internet-related Resolutions is now reflected in ITU-T study groups and in the preparatory process for WTSA-16.

The Expanding Role of ITU-T Study Groups

ITU-T is one of three sectors of the ITU, the others being the Radiocommunication Sector (ITU-R) and the Development Sector (ITU-D). The ITU-T's primary function is to develop and coordinate voluntary international standards, known as ITU-T Recommendations, covering international telecommunications. The ITU-T's work primarily is carried out by technical study groups. These study groups address a wide variety of Internet-related technical and economic issues, including transmission protocols, cybersecurity, cloud computing, and the terms of interconnection agreements.

Technical decisions in these areas can have far-reaching economic and social consequences, altering the balance of power between competing businesses or countries and potentially constraining the freedom of users. What is more, the Internet-related PP-14 Resolutions illustrate how standards can be, in essence, politics and policymaking by other means. The increased attention paid by governments to the work of ITU-T study groups should trouble affected businesses as well as others and encourage them to understand the deeper meaning beneath the technical nuts and bolts at the ITU.

Three study groups, discussed below, are particularly notable for their increased focus on Internet regulation and Internet governance. Led by governments that prefer to address such issues via multilateral public policymaking, the activities of these groups are moving further into what many think more properly is the arena of multistakeholder governance.

  • Study Group 3 – Economic and Policy Issues. Historically, SG3 focused on traditional telecommunications economic issues such as international tariffing, roaming, and resale. More recently, SG3 refocused on a series of topics related to the Internet, particularly over-the-top (OTT) services, "charging and accounting/settlement mechanisms," and "relevant aspects of IP peering." For example, newly adopted text that could become a Recommendation encourages governments to develop measures to strike an "effective balance" between OTT communications services and traditional communications services, in order to ensure a "level playing field" (e.g., with respect to licensing, pricing and charging, universal service, quality of service, security and data protection, interconnection and interoperability, legal interception, taxation, and consumer protection). Plainly, SG3 is pushing the ITU even more into Internet-related policy and technical matters.
  • Study Group 17 – Security. SG17 coordinates security-related work; cybersecurity and spam are high on its agenda. Cybersecurity is proving to be a dominant issue throughout the ITU, and related, contentious debates are finding a new home in SG17. Some of the work of SG17 arguably involves fundamental and important foreign policy and national security issues that appear to fall outside the ITU's remit, such as cybercrime. In addition, SG17 increasingly is focusing on the security of applications and services for Internet of Things (IoT), smart grid, cloud computing, and the protection of personally identifiable information. Each of these activities potentially sets a precedent for an expanded ITU role in these issues going forward.
  • Study Group 20 – IoT and Its Applications Including Smart Cities and Communities. SG20 was created in 2015 over the objections of some governments, including the United States. Those objections were based on the concern that the focus of SG20 — namely, the development of international standards for the coordinated development of IoT technologies, including M2M communications and ubiquitous sensor networks — either was unnecessary or was already allotted elsewhere. Nevertheless, SG20's activities seemingly have centered on the attempted standardization of end-to-end architectures for IoT and mechanisms for the interoperability of IoT applications and datasets. Some governments and many others have expressed concern that the recent expansion of the ITU's work agenda on IoT runs parallel to, and potentially impedes the effectiveness of, existing global standardization efforts primarily driven by the private sector through a variety of other standards development organizations.

Businesses and others may find it worthwhile to monitor the activities of these various ITU-T study groups — they effectively may set the international regulatory environment for many aspects of the Internet and new technologies. Indeed, although study group outcomes theoretically are voluntary, the ITU-T study groups' work often is converted directly into domestic law in many countries, or could become international "norms," or even treaties, and thus mandatory standards.

WTSA-16 and the Future of ITU-T

WTSA is a once-every-four-years ITU conference that sets the mission of each ITU-T study group until the next conference. WTSA-16 is scheduled to be held in Tunisia from October 25 to November 3, 2016. WTSA-16 decisions will be important because, among other things, they will determine the scope of the ITU's impact on the Internet-related issues discussed above.

Governments, along with the private sector companies and others that participate, are expected to address a number of Internet public policy-related issues. These include the OTT, cybersecurity, and IoT issues now being discussed in study groups 3, 17, and 20, respectively. In fact, governments may submit proposals for new work on these issues, further solidifying an expanded role for the ITU going forward. Other governments are expected to offer proposals to restructure or even eliminate some of the study groups. Governments and others also likely will discuss other Internet-related issues at WTSA-16, including:

  • ITRs. One of the outcome Resolutions from PP-14 calls for review of the ITRs every eight years. That Resolution requires the formation of an Expert Group on the ITRs in early 2017, composed of governments and private sector members of the ITU, to initiate the review. WTSA-16 could become one of the first testing grounds for another WCIT.
  • Internet Resolutions. Many existing ITU resolutions regarding Internet Protocol-based networks and the ITU's role regarding international Internet public policy issues — especially pertaining to the management of Internet resources — will be important topics of discussion at WTSA-16. In addition, WTSA-16 likely will address issues associated with strengthening the role of the ITU in building confidence and security in the use of ICTs, and the role of governments in the management of internationalized domain names.

The preparatory process for WTSA-16 has already begun. Over the next few months, study groups will hold their final meetings and draft new questions for the next four years, as well as specific recommendations for approval, modification, or deletion by governments. Regional telecommunication organizations, including APT, the Arab States, CEPT, CIS, and CITEL, are holding preparatory meetings to develop common regional positions and proposals on the issues that will be discussed at WTSA. Although formal decision-making at WTSA-16 will be limited to governments, the private sector and others can have a material impact, both directly and through their national delegations.

Written by David A. Gross, Chair of Wiley Rein’s International & Internet Practice

More under: Internet Governance, Internet of Things, Policy & Regulation, Security

Categories: External commentary

WSIS, Development, and Internet Governance: A Plea for 'Star Trek' over 'Mad Max'

CircleID - Fri, 2016-04-29 15:02

Humanity continues to find itself at a crossroads. Ahead of us lies an uncertain future filled with predictions of imminent doom and ominous prospects along with the wonders of science and technology. Behind us lies a century marked paradoxically by both devastating global conflicts and unparalleled global collaboration. As societies continue to globalize, we are increasingly becoming more connected — to the point where it is difficult, if not impossible, to divorce ourselves from the interconnectivity in contemporary systems of commerce, economics, politics, and culture. Keeping this reality in mind, I view ensuring broad development and advancement across a host of social indicators, including class, nationality, and geographic location, as one of humanity's primary responsibilities for itself. Such responsibility stems from a deeply rooted, if not cringeworthily idealistic, motivating belief I hold that if the basic needs of more individuals around the world are ensured — especially those in low- and middle-income countries — we can collectively focus more time, energy, and human capacity toward a universal goal of progress to protect, in the words of Elon Musk, the "light of consciousness."

The fact that it is much more difficult for someone to ponder the universe when their stomach is empty is self-evident. The same is true in environments when women and girls must travel for two hours one-way to collect water (that is often contaminated), or when individuals in positions of power siphon aid money that is meant to educate children or provide healthcare to communities. Indeed, much work needs to be done across many developmental indicators to continue to inspire hope for a better future so that we can foster a world similar to "Star Trek" and avoid one resembling "Mad Max." Yet, when it comes to the role of technology vis-à-vis development, it is also undeniable that the internet and information and communication technologies (ICTs) can help empower people with the wealth of humanity's accumulated knowledge.

Development will continue to take center stage at various internet governance (IG) fora, discussions, and processes this year in light of the United Nations adopting 17 Sustainable Development Goals (SDGs) in September 2015, including a target regarding access to ICTs, as well as the Internet Governance Forum (IGF) Multistakeholder Advisory Group (MAG) choosing the theme "Enabling Inclusive and Sustainable Growth" for IGF 11 and the overwhelming consensus to support the World Summit for the Information Society (WSIS) process. And as access becomes more ubiquitous, the internet and ICTs can provide a key tool for development. In particular, as one author suggests, it can engage youth and include them in the development process; identify resources and map patterns for better decision making or public action (such as with Ushahidi); quickly gather information to aid in the investigation or dissemination of information or instructions, e.g., with disease outbreaks such as the Ebola crisis or natural disaster relief management; support accountability, transparency, anti-corruption efforts, and human rights; and improve municipal services and information management.

Although ICTs are not a panacea for the world's problems, consider the story of the young, famed Malawian inventor William Kamkwamba, who built an electricity-producing windmill out of scrap, thanks in part to access to his local library. His example shows how the ability to locate pertinent information, access existing patents, research scientific literature, or connect with a knowledgeable person or resource can make a significant difference in the lives of underprivileged people in particular. Yet ensuring that such knowledge and information is open and accessible, so that it can be used as a resource to help both individuals and organizations solve the developmental challenges of the 21st century and fulfill the 17 SDGs, is imperative to progress. While access to knowledge and instant communication technologies will not by themselves feed the hungry or resolve civil conflict, they can provide access to solutions, new ideas and perspectives, or communities and nongovernmental organizations (NGOs) that can support innovation and creative problem solving.

The role of internet governance

Thus, it is the responsibility of the internet governance community and all involved in expanding accessibility such as governments and international NGOs to empower sustainable development by ensuring the needs of developing societies and the underprivileged are met. This should incorporate multiple courses of action, including but not limited to protecting openness and accessibility; upholding internet access as a human (or civil) right; expanding network interoperability and security; supporting technological innovation and infrastructural development; providing platforms and fora where the multiple and often competing interests of stakeholders can be addressed and discussed; promoting collaboration and cooperation among stakeholders; and engaging in capacity building and education, specifically technical skill building and digital media literacy education. Four areas in particular require concerted attention as well as a multistakeholder approach to devise effective solutions and bridge the digital divide(s):

1. More widespread internet infrastructure, which also includes wider availability of Internet eXchange Points (IXPs), IPv6, and Internationalized Domain Names (IDNs)

2. More easy-to-use and affordable services

3. Relevant local digital content and local language support

4. Higher digital literacy skills

Ultimately, the relationship between sustainable development and internet governance relates to the future of the internet as a whole, which affects all countries regardless of income level. If accessible information and instant communication can continue to be a staple of the contemporary paradigm of existence, this robust system must be safeguarded from fragmentation or other threats to internet infrastructure and operability.

Moreover, driving the future of the internet forward in a way that emphasizes the inclusivity of stakeholders and focuses on dialogue, negotiation, consensus building, and collaborative problem solving is vital to ensuring more investments in the internet and ICTs can be made. Such investments include upgrading infrastructure, expanding mobile access or Creative Commons-licensed content, or incentivizing ideas such as the OneWeb satellite constellation, Mozilla's zero-rating initiatives, Facebook's Project Aquila, or Google's (Alphabet's) Project Loon. Engaging governments and other stakeholders to promote cooperation and foster inclusive, consensus-based sustainable outcomes is just as important as solutions that involve technical or tangible means.

Policymakers, activists, educators, organizations, and a host of other stakeholders in the internet governance community certainly have a hands-on role to play in terms of short- and medium-term sustainable development. Internet governance is, however, inextricably tied to long-term developmental goals and outcomes, especially since ICTs can be a global driver of the progress sustainable development is meant to facilitate. Thus, while technology is changing and being adopted quickly, it is critical that we address how governance structures and institutions can maintain a constructive pace as well. Given that such structures and institutions practically and historically have taken long periods of time to adopt and adapt to significant changes, moving forward also brings much ambiguity, which can be mitigated in part by more robust collaboration, more constructive dialogue, more transparent and accountable processes, and inclusive strategic planning.

Current trends may suggest more conflict, especially political conflict, as well as diverging long-term interests between stakeholders, in particular in emerging economies where sustainable development outcomes are often most needed. Such conflict challenges the entire notion of democratizing the internet, especially when there are so many interests in play and stakeholders invested. While it is advantageous to model potential outcomes, refining current processes will help ensure a smoother transition going forward and continue to create better mechanisms to address any future complications or challenges that arise. Genuine interest in collaborative decision making through dialogue and sincere engagement with stakeholders is key to strengthening confidence that internet governance and internet-related policies are robust, inclusive, engaging, bottom-up, and consensus-based.

Additionally, the internet governance community must use the opportunity that the various fora allow to build consensus on difficult and contentious topics. Does the community take a clear stand on access to information, for instance, by promoting censorship-circumventing virtual private networks (VPNs) when information is blocked by an authoritarian government? Would that alienate certain stakeholders, erode the multistakeholder model, or become too politicized? What boundaries and limitations should the multistakeholder internet governance community respect in terms of its operation? And are there clear values that need to be agreed upon and defended? Is reaching consensus on and adopting such values achievable given today's political climate? These are only a few of the questions that the internet governance and larger development community must continue to work through in terms of its purview and responsibilities as the internet further grows and develops as an ecosystem. These are not rhetorical questions, either; for instance, the work of the IGF's Dynamic Coalition on Core Internet Values, the Internet Assigned Numbers Authority (IANA) transition process, the NETmundial outcome document, and the continuation of the WSIS process as well as the 2003 WSIS Declaration of Principles seem to suggest that consensus on certain internet functions and values can be achieved. Answering such questions is vital to preserving the potential the information society provides so we can more effectively incorporate the next billion internet users and ensure a more promising future for all.

Written by Michael Oghia, 2015 ISOC IGF Ambassador

More under: Access Providers, Broadband, Internet Governance, Policy & Regulation, Telecom, Web

Categories: External commentary

Why Broadband Speed Tests Suck

CircleID - Thu, 2016-04-28 21:51

Everyone is familiar with broadband 'speed test' applications. When you have an ISP service quality problem, it is common to grab one of these tools to see if you are getting the service you feel you are entitled to get.

Whenever you upgrade your broadband, the first thing you might do is to run a speed test. Then you can show off to your mates how superior your blazing fast service is compared to their pitiful product.

The whole broadband industry is based on hawking 'lines' to its 'speed'-addicted 'users'. The trouble with broadband is that the performance can be 'cut' from over-sharing the network. Buyers are naturally concerned with the 'purity' of the resulting product versus its claims of potency. These 'speed test' tools purport to fill that quality assurance role.

The resulting situation is not good for the wellbeing of users. Selling 'speed' is also not a particularly ethical way for ISPs to make a living. So why do speed tests suck so badly? Let me tell you…

* * *

Misrepresents the service to users

Speed test applications implicitly aim to tell users something about the fitness-for-purpose of the service on offer. However, they are only a weak proxy for the experience of any application other than speed testing itself.

By its nature, a speed test tells you some (peak) rate at which data was transferred when the network is (over-)saturated. It is nearly always expressed as a 'per second' measure. That means it doesn't capture the instantaneous effects of the service, as a lot of packets can pass by in a second. Yet the user experience solely comprises the continuous passing of those instantaneous moments of service delivery.

It is like having a report on a pizza home delivery service that tells you pizzas arrived, on average over a year, within 10 minutes of coming out of the oven. That averaging could hide an hour-long wait on Saturday evenings which makes the service unfit for purpose. Merely reporting on average quantity misses essential information about the quality of the service.
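
To see how an average hides the tail, here is a minimal Python sketch with invented delivery times; the same arithmetic applies to any 'per second' throughput figure:

```python
import statistics

# Invented data: most deliveries are quick, but Saturday evenings
# see hour-long waits that the yearly average conveniently hides.
weekday = [8, 9, 10, 7, 9] * 100   # 500 quick deliveries (minutes)
saturday = [60, 75, 90] * 10       # 30 very slow ones

deliveries = weekday + saturday

print(f"mean: {statistics.mean(deliveries):.1f} min")                   # ~12 min, looks fine
print(f"p99:  {statistics.quantiles(deliveries, n=100)[98]:.1f} min")   # ~90 min, unfit for purpose
```

The mean says the service is healthy; the 99th percentile says Saturday customers wait an hour and a half.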

Confuses capacity with other factors

The 'speed' figure reported is the result of the interaction of many different technological elements, all of which contribute to the number presented to the user. It is a long list!

These factors include: the speed tester client code (e.g. JavaScript), the browser (and its execution environment), the OS and its TCP stack implementation, any virtualisation layers of the OS and connectivity (e.g. VPNs, VLANs), the customer LAN or WiFi (and any local activity, like backups to an Apple Time Capsule), the router or home gateway, security filters and firewalls, the physical access line (which may be wholesaled), the retail ISP network, IP interconnect, and the hosted speed tester service (server hardware, OS and any virtualization, and network stack). Apologies if I have missed anything.

What happens is that we pretend that the final 'speed' presented is the result of a single one of those factors, namely the 'capacity' of the ISP service (including the access link, where vertically integrated). Even that falsely treats all the internal complexity of that ISP as if it were a circuit-like 'pipe'.

More subtly, the ISP 'speed' being measured is an emergent phenomenon, arising as all the platforms, protocols and packets interact within the network. The ISP doesn't even have control over the result it is being held responsible for!

Speed tests are costly

What we are doing when we run a 'speed test' is a small-scale denial-of-service attack on the network. That's not a big deal when individual users do it on rare occasions. But when ISPs themselves deploy thousands of testing boxes running lots of tests all the time, speed testing becomes a significant proportion of the load on the network.

This has a cost in terms of the user experience of other applications running at the same time. One person's speed test is another person's failed Skype call. Speed testing is a pretty antisocial way of going about using a shared resource. This is especially true for physically shared access media like wireless and coax cable. For FTTx users it doesn't take many simultaneous speed testers to create local performance problems.

The high load of speed testing applications also directly drives the need for capacity upgrades. My colleagues have seen networks where there is an uncanny correlation between where the ISP's own speed testing boxes are placed, and where the capacity planning rules are triggered to drive capital spending!

Wrongly optimises networks

By focusing marketing claims on peak data transfer rates, speed testing encourages network operators and ISPs to optimise their services for this purpose. The difficulty is that we live in a finite world with finite resources, so this must involve trade-offs. When we optimise for long-lived data transfers, we may pessimise for real-time and interactive services.

We have seen much concern and engineering effort expended over the phenomenon called 'bufferbloat'. This is when we have large buffers that create spikes of delay as queues build up. As ISPs have tuned their services to meet their marketing promises for speed, they have taken their eye off other matters (i.e. packet scheduling) that are of equal or more importance to the overall user experience.
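
The scale of the effect is easy to estimate: a first-in-first-out buffer drains at the link rate, so a full buffer adds buffer size divided by link rate of queueing delay to every packet behind it. A back-of-the-envelope sketch with hypothetical numbers:

```python
# Worst-case queueing delay added by an over-sized buffer: the queue
# drains at the link rate, so delay = buffer_size / link_rate.
BUFFER_BYTES = 1_000_000   # hypothetical 1 MB device buffer
LINK_RATE = 10e6           # hypothetical 10 Mbit/s uplink

delay_ms = BUFFER_BYTES * 8 / LINK_RATE * 1000
print(f"worst-case queueing delay: {delay_ms:.0f} ms")   # 800 ms
```

Nearly a second of added delay is invisible to a speed test but fatal to a VoIP call.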

Doesn't even tell the truth

You might get the impression that we're a bit down on speed tests as a measurement method for networks. Well, it gets worse. They don't even accurately report the peak transfer rate!

What happens is that packets can arrive out of order, for instance when they take different routes. The network protocol stack then 'holds back' a bunch of packets from the application until the missing one arrives, and releases them all in one go when it turns up. They suddenly appear at once to the speed testing application, which then reports an amazing burst in speed. This reported number may even greatly exceed the maximum line rate, and be physically impossible.
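
Here is a small self-contained simulation of that effect, with invented numbers: packets arrive on the wire at a steady 100 Mbit/s, one packet is retransmitted and arrives 50 ms late, and the stack releases everything held behind it in a single burst:

```python
# Invented scenario: 1500-byte packets arrive at a steady 100 Mbit/s,
# but one packet is lost and retransmitted, arriving 50 ms late. The
# stack delivers it, and everything queued behind it, in one burst.
PACKET_BITS = 1500 * 8
LINE_RATE = 100e6   # bits per second

# Wire arrival times: one packet every 120 microseconds.
wire_times = [i * PACKET_BITS / LINE_RATE for i in range(10)]

# In-order delivery: packet index 2 arrives 50 ms late, so packets
# 2..9 all reach the application at the same instant.
late = wire_times[2] + 0.05
app_times = wire_times[:2] + [late] * 8

# A naive tester sampling over a 100-microsecond window around the
# burst sees eight packets "arrive" at once.
window = 100e-6
bits = sum(PACKET_BITS for t in app_times if abs(t - late) <= window / 2)
print(f"sampled rate: {bits / window / 1e6:.0f} Mbit/s")   # 960 Mbit/s on a 100 Mbit/s line
```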

So the peak transfer rate the speed test application reports can include artefacts of the protocols and control processes involved. The number you get can be a total fib, and you have no way of knowing.

Drives false marketing claims

As Seth Godin notes, "All marketers are liars". The act of marketing is to present the product to the user in the most favourable light when compared to your competition. As ISPs compete over marketing claims for peak speed, they naturally like to report the peak of all peaks on their advertising hoardings.

This sets up a false expectation with users that they are entitled to achieve these best of all best case data rates. Furthermore, the users feel that 'speed test' applications are an accurate representation of the peak data rate of their line and the performance of all applications. This then leads to a blame game in which the ISP is accused of not fulfilling its service obligation promise. The ISP has no means of isolating the cause of the poor reported speed test results, or the poor performance of any other application. This drives huge dissatisfaction and churn.

The ISP industry is already battling technically incompetent 'net neutrality' regulation. Adding inept claims about service capability, and encouraging the use of misleading reporting tools to measure it, doesn't help the cause of rallying public opinion to your side.

What's the alternative?

If unthinking pursuit of speed tests is unwise, what could be a better approach?

To answer this, we need to go back to fundamentals. The application performance on offer is a result of packet loss and delay (and nothing else). So why not measure that? The good news is that we now have non-intrusive, scalable and cheap-to-deploy methods of doing just this.
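
As an illustration only, and not any particular product, here is a minimal sketch of measuring delay and loss directly: it sends gentle UDP probes to an echo service and records per-packet round-trip time and timeouts. The server address is a placeholder (an RFC 5737 documentation address); you would point it at an echo server you control:

```python
import socket
import time

ECHO_SERVER = ("192.0.2.10", 7)   # placeholder; use your own echo server
PROBES = 50

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.settimeout(1.0)

rtts, lost = [], 0
for seq in range(PROBES):
    payload = seq.to_bytes(4, "big")
    start = time.monotonic()
    sock.sendto(payload, ECHO_SERVER)
    try:
        data, _ = sock.recvfrom(1024)
        if data == payload:
            rtts.append((time.monotonic() - start) * 1000)   # milliseconds
    except socket.timeout:
        lost += 1
    time.sleep(0.02)   # 50 probes/s: gentle, unlike a saturating speed test

if rtts:
    print(f"delay: min {min(rtts):.1f} ms, max {max(rtts):.1f} ms")
print(f"loss: {lost}/{PROBES}")
```

Unlike a speed test, this probe barely loads the network, and the loss and delay distributions it produces map directly onto application experience.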

Written by Martin Geddes, Founder, Martin Geddes Consulting Ltd

More under: Access Providers, Broadband

Categories: External commentary

On the Way to the G7 ICT Ministers' Meeting in Japan

CircleID - Thu, 2016-04-28 20:36

This week in Japan I have been invited to address the Multi-Stakeholder Conference that will officially open the G7 ICT Ministerial summit in Takamatsu. The focus of the ICT Ministerial will be on four distinct areas:

  1. Innovation and economic growth;
  2. Unrestricted flow of information, and ensuring the safety and security in cyberspace;
  3. Contributing to the resolution of global issues, including digital connectivity;
  4. International understanding and international cooperation in the future.

In December 2015, we were encouraged to see the nations of the world endorsing the WSIS agreement made 10 years ago in Tunis. The updated WSIS+10 outcome document is unequivocal that the Internet should be "governed" through bottom-up, collaborative processes that include all those with a stake in the outcome.

This continued commitment is another milestone in Internet governance that we must build upon and deepen. We believe strongly that issues of Internet safety and security cannot be addressed by one stakeholder alone — be it the industry or the government. Indeed, we voiced a concern in New York that it would be a mistake to think "that cooperation ONLY among governments is sufficient to solve issues that require the expertise and commitment of all of us".

This week in Japan, I heard concerns that some governments and commentators continue to assert that matters of security are exclusively within the purview of governments.

The Internet Society believes that this is not and should not be the case. Indeed, because of the transnational and distributed nature of the Internet, security issues are best addressed by collaborative and coordinated efforts of all those with a stake in a trusted Internet, including businesses, civil society groups and governments. We refer to this as Collaborative Security.

The Internet exists because of the creative energy and ideas from individuals across the world, working together to figure out how to connect networks, how to send information across those networks, and how to enable billions of users to benefit from a digitally connected world. We need to apply this same energy to issues of security. Just as networking technology is complex, so is Internet safety and security.

There is no single technical solution or regulation or international agreement or business practice that is magically going to bring about a trusted Internet. The reality is that we must harness the necessary expertise to come together to solve hard problems.

In recent years, there have been countless political debates about whether this collaborative approach to problem solving, often called the Multistakeholder Model, is valid, particularly for complex matters of public policy.

We believe that this debate is settled and that it is now more useful to focus on the particular outcomes we want to achieve for a particular problem when making decisions in the Internet age.

In our view, Internet public policies, regardless of the issue, should:

  • maintain the global, interconnected nature of the Internet,
  • enable permissionless innovation and free expression,
  • strengthen the security, stability and resiliency of the Internet; and,
  • allow the Internet to flourish as a platform for limitless opportunity and innovation around the world.

Crafting sound Internet policies and making decisions that address the challenges of today, while upholding the core elements of the Internet, requires that we bring all the relevant expertise to the table.

With an issue as complex and sensitive as security, it is even more crucial that we do so.

At the G7 meeting this week, I will emphasize this point. The G7 ICT Ministers are among the most influential in the world. It is incumbent upon them to set an ICT policy agenda that rises above the typical politics and that draws upon all the expertise available to get to solutions. And, I am most encouraged by the G7 Foreign Ministers' Joint Communique earlier this week. In that Communique, the Foreign Ministers wrote:

"We reaffirm our commitment to a multi-stakeholder approach to Internet governance, which includes full and active participation by governments, private sector, civil society, the technical community, and international organizations, among others".

Collaboration is key. The Internet is the outcome of the cooperative efforts of different actors. This is as true for the Internet's technical issues as it is for its more complex governance issues. The multistakeholder governance framework is widely accepted as the optimal way to make policy decisions that are accountable, sustainable and, above all, effective.

Today, the Internet Society released a paper that discusses in further detail why the multistakeholder approach works and must be embraced to ensure the continuing economic, social and human rights benefits of a global, open and secure Internet.

On Friday, I will join others from the Internet community to illustrate that the only way forward is through continued multistakeholder collaboration and coordination.

The Multi-Stakeholder Conference will be streamed live on YouTube on Friday, April 29, 2016, starting at 09:00 Japan Standard Time (UTC+9).

An earlier version of this post appeared on the Internet Society blog.

Written by Kathy Brown, President and CEO, Internet Society

More under: Internet Governance, Security

Categories: External commentary

Call for Participation - DNSSEC Workshop at ICANN 56 in Helsinki, Finland on 27 June 2016

CircleID - Thu, 2016-04-28 19:28

Do you have an idea for an innovative use of DNSSEC or DANE? Have you recently deployed DNSSEC or DANE and have some "lessons learned" that you could share? Did you develop a new tool or service that works with DNSSEC? Have you enabled DNSSEC by default in your products? (And why or why not?) Do you have ideas about how to accelerate usage of new encryption algorithms in DNSSEC?

We are seeking presenters on all these topics for the DNSSEC Workshop on June 27, 2016, at ICANN 56 in Helsinki, Finland. The full "Call for Participation" is found below.

If you have an idea and will be at ICANN 56 (or can get there), please send a brief email to dnssec-helsinki@isoc.org by Wednesday, May 18.

Thank you!

* * *

Call for Participation
ICANN DNSSEC Workshop at ICANN 56 in Helsinki, Finland

The DNSSEC Deployment Initiative and the Internet Society Deploy360 Programme, in cooperation with the ICANN Security and Stability Advisory Committee (SSAC), are planning a DNSSEC Workshop at the ICANN 56 meeting on 27 June 2016 in Helsinki, Finland. The DNSSEC Workshop has been a part of ICANN meetings for several years and has provided a forum for both experienced and new people to meet, present and discuss current and future DNSSEC deployments. For reference, the most recent session was held at the ICANN meeting in Marrakech, Morocco on 09 March 2016. The presentations and transcripts are available here.

Examples of the types of topics we are seeking include:

1. DNSSEC Deployment Challenges

The program committee is seeking input from those who are interested in implementing DNSSEC but have general or particular concerns about it. In particular, we are seeking individuals who would be willing to participate in a panel discussing questions such as:

  • What are your most significant concerns with DNSSEC, e.g., implementation, operation or something else?
  • What do you expect DNSSEC to do for you and what doesn't it do?
  • What do you see as the most important trade-offs with respect to doing or not doing DNSSEC?

We are interested in presentations related to any aspect of DNSSEC, such as zone signing, DNS response validation, applications' use of DNSSEC, registry/registrar DNSSEC activities, etc.

2. DNSSEC by Default

Although more and more applications and systems now support DNSSEC, the vast majority do not enable it by default. Are we ready to enable DNSSEC by default in all applications and services? We are interested in presentations by implementers on the reasoning that led them to enable DNSSEC by default in their product or service. We are equally interested in hearing from those who elected not to enable DNSSEC by default: why, and what are their plans?
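
As one concrete angle on defaults, an application can at least observe whether its upstream resolver validated an answer. A minimal sketch, assuming the dnspython library and a validating resolver: send a query with the DNSSEC-OK bit set and check the AD (Authenticated Data) flag in the response:

```python
# Minimal sketch using dnspython (pip install dnspython): query a
# validating resolver with DNSSEC-OK set and check whether the AD
# (Authenticated Data) flag indicates the answer was validated.
import dns.flags
import dns.message
import dns.query

RESOLVER = "8.8.8.8"   # assumed to be a DNSSEC-validating resolver
query = dns.message.make_query("isoc.org", "A", want_dnssec=True)
response = dns.query.udp(query, RESOLVER, timeout=5)

validated = bool(response.flags & dns.flags.AD)
print(f"AD flag set (answer validated upstream): {validated}")
```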

3. DNSSEC Encryption Algorithms

How do we make DNSSEC even more secure through the use of elliptic curve cryptography? What are the advantages of algorithms based on elliptic curves? And what steps need to happen to make this a reality? What challenges lie in the way? Over the past few months there have been discussions within the DNSSEC community about how we start down the path toward adding support for new cryptographic algorithms such as Ed25519 and Ed448. At ICANN 55 in Marrakech we had a panel session that explored why elliptic curve cryptography is interesting, along with some high-level views on what needs to happen. At ICANN 56 we are interested in presentations that dive into greater detail about what needs to be done and how we start the process. More background information can be found in this document.
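
For context on where a given zone stands, here is a short sketch, again assuming dnspython (2.x; older versions use dns.resolver.query), that inventories the algorithms in a zone's DNSKEY RRset using the IANA algorithm numbers (8 = RSA/SHA-256, 13 = ECDSA P-256, 15 = Ed25519, 16 = Ed448):

```python
# Sketch: list the DNSSEC signing algorithms in a zone's DNSKEY RRset.
import dns.resolver

ALGOS = {8: "RSASHA256", 10: "RSASHA512", 13: "ECDSAP256SHA256",
         14: "ECDSAP384SHA384", 15: "ED25519", 16: "ED448"}

answer = dns.resolver.resolve("isoc.org", "DNSKEY")   # example zone
for key in answer:
    algo = ALGOS.get(int(key.algorithm), f"algorithm {int(key.algorithm)}")
    print(f"flags={key.flags} algorithm={algo}")
```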

In addition, we welcome suggestions for additional topics.

If you are interested in participating, please send a brief (1-2 sentence) description of your proposed presentation to dnssec-helsinki@isoc.org by Wednesday, 18 May 2016.

We hope that you can join us.

Thank you,
Julie Hedlund

On behalf of the DNSSEC Workshop Program Committee:

Mark Elkins, DNS/ZACR
Cath Goulding, Nominet UK
Jean Robert Hountomey, AfricaCERT
Jacques Latour, .CA
Xiaodong Lee, CNNIC
Luciano Minuchin, NIC.AR
Russ Mundy, Parsons
Ondřej Surý, CZ.NIC
Yoshiro Yoneya, JPRS
Dan York, Internet Society

Written by Dan York, Author and Speaker on Internet technologies

More under: DNS, DNS Security, ICANN, Security

Categories: External commentary