Feed aggregator

'The Global Village' Idiot

CircleID - Tue, 2015-09-01 21:49

I recall from some years back, when we were debating in Australia some national Internet censorship proposal du jour, that if the Internet represented a new Global Village then Australia was trying very hard to position itself as the Global Village Idiot. And the current situation with Australia's new Data Retention laws may well support a case for reviving that sentiment. Between the various government agencies who pressed for this legislation, the lawyers who drafted it, the politicians who advocated its adoption and the bureaucrats who are overseeing its implementation, as far as I can tell none of them get it. They just don't understand the Internet and how it works, and they are acting on a somewhat misguided assumption that the Internet is nothing more than the telephone network for computers. And nothing could be further from the truth.

The intended aim of this legislation was to assist various law enforcement agencies to undertake forensic analysis of network transactions. As the government claims: "telecommunications companies are retaining less data and keeping it for a shorter time. This is degrading the investigative capabilities of law enforcement and security agencies and, in some cases, has prevented serious criminals from being brought to justice." So what the agencies wanted was a regulation to compel ISPs to hold a record of their address assignment details so that the question "who was using this IP address at this time" had a definitive answer based on the retention of so-called meta-data records of who had what IP address when.

In the world of traditional telephony, this makes some sense. Telephone numbers were synonymous with endpoint identifiers, so that a telephone number was uniquely associated with a subscriber, and this association was stable and long-lived. Asking phone companies to hang on to the association of telephone numbers to subscriber names and addresses, or in other words a phone directory, was hardly an onerous imposition on the industry, and considering that the phone directory was public, it was hardly a dramatic incursion into untested areas of personal privacy. In the context of these Data Retention laws, no one is asking service providers to record and retain conversations. No one is even asking them to keep records of what numbers were called by subscribers, although my telephone bill clearly demonstrates that my phone company collects and stores all such individual call records. The data retention measures are explicitly limited to the association of telephone numbers to subscriber details. In the world of black Bakelite telephones of the 1950s I'm sure this was a fine idea, and it was, in fact, little more than a formal codification of existing practice among telephone operators, then and now.

But that was then and this is now, and today the telephone system is heading toward the role of quaint historical artifact, while the Internet continues to take on the role of the global communications platform. So data retention for telephones alone is hardly useful. Something has to be done about the Internet. Doubtless someone had the bright idea that if they took this concept of the association of telephone number to subscriber, and used a text editor to globally change "telephone number" to "IP address" in the text of a piece of data retention regulation, then they would have a bright shiny piece of regulation that would make them all set for this brave new Internet world. After all, the Internet is just a telephone service for computers, isn't it?

But that is not the case in today's Internet. It has been a constantly changing environment that has responded and adapted to various pressures over time. One of the more critical long-term pressures on the Internet's architecture has been the prospect of address exhaustion, which has been a pervasive influence for over two decades now. The result of this prospect has been to change the semantics of an IP address. Because addresses were considered to be a scarce resource, the response was to use them sparingly, and the way to achieve that was to share an address across multiple devices. This sharing has increased in intensity over the years. The initial model was to place address sharing units, or Network Address Translators (NATs), at the "edge" of the network, where the carriage network connects to the customer's network. In this model the IP address is shared by all the devices located on the customer's network. As a result, an IP address is still, in some sense, an endpoint identifier, but the endpoint is now a home network, not a single device. But even so, that was then and this is now, and the address sharing picture has changed further.

We are now seeing these address sharing units being pulled further back into the service provider network. This started with mobile networks, but is now occurring on wired access networks as well. The inexorable pressures of address exhaustion are driving many service providers into these address sharing approaches for their networks. What does an IP address mean when it's shared in this manner? It is no longer synonymous with an endpoint identifier, as a number of endpoints may be sharing a single public IP address. Equally, a single endpoint may use a number of public addresses, even at the same time in some situations.

So what is an "IP address" if it's not an endpoint identifier? It is now an ephemeral shared token whose contextual lifetime in the public network is that of single network transactions, and it's use is never assuredly unique, even within such limited contexts.

So if IP addresses are losing their role as stable endpoint identifiers, what has taken their place? What should we be storing in some data retention framework that relates a network transaction to an endpoint? If storing IP addresses makes no sense as a form of endpoint identification, what should we use instead? The hard answer is that we don't have such a concept any more, and it's OK that we don't. We've managed to convince ourselves that the Internet does not need one. And that's a big statement.

Today's Internet has no strict requirement for universal, stable, fixed endpoint identities. And things work just fine. What we have found is that in a client/server service model there is no need to assign fixed endpoint identities to the clients. They can get away with pulling ephemeral tokens from some shared pool and everything still just works. And these days we are also experimenting with Content Distribution Network (CDN) service solutions that allow the servers to use IP addresses in the same ephemeral manner, relying solely on the DNS as the service point identification space. So addresses in the Internet don't mean all that much any more, and increasingly they don't map to endpoint identifiers.
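The server side of this can be seen by simply resolving a service name a few times: the DNS name is the stable identifier of the service, while the addresses handed back are transient and can vary with time, resolver and vantage point. A minimal sketch, assuming network access and using www.example.com purely as a placeholder (a genuinely CDN-hosted name shows the variation more clearly):

```python
import socket

# Resolve the same service name repeatedly. For CDN-hosted services the
# answers are short-lived and can differ between resolvers, locations and
# points in time, so the name, not the address, identifies the service.
hostname = "www.example.com"   # placeholder; substitute any CDN-hosted name

for attempt in range(3):
    answers = sorted({info[4][0] for info in socket.getaddrinfo(hostname, 443)})
    print(f"lookup {attempt + 1}: {answers}")
```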

But the Australian Data Retention laws say something has to be stored, and the bureaucrats running the Attorney General's Office of Data Retention say something has to be stored, and the industry players are trying to understand what exactly should be stored, because in these address-sharing networks there is nothing that meets the intended requirements of this law. If the intended requirement is to retain the association of protocol-specific endpoint identifiers with the customer's details, and the network has now managed to eschew the very concept of stable endpoint identifiers, then where have we got to?

It is unlikely that operators of these address sharing networks will refuse to comply with the provisions of the Data Retention laws. It's more likely that they will simply retain the address sharing logs. But this is where it starts to get interesting, because in order to retain something, the temptation will be to retain the complete log from the network address sharing unit. What is in this log? In this world of Carrier Grade NATs (CGNs) every transaction generates a new NAT binding, and that NAT binding generates a log entry. So every DNS query, every part of every web page, every individual email collected by your device - in short, each and every individual network transaction - will generate a CGN log entry. This is no less than your entire Web browsing history, your DNS query history, and the history of everything else you do on the net.
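As a hedged sketch of why that is so, consider what a per-session CGN record has to contain for the "who had this address?" question to remain answerable. The field names below are invented for illustration; real vendor log formats differ, and some operators use bulk port-block allocation precisely to cut this logging volume, but the essential shape is one record per binding, including the destination the customer contacted.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class CgnSessionRecord:
    """One NAT binding, i.e. one network transaction, as a CGN might log it.
    Field names are illustrative only; real vendor formats differ."""
    timestamp: datetime      # when the binding was created
    subscriber_id: str       # internal customer reference
    private_ip: str          # address on the customer side of the CGN
    private_port: int
    public_ip: str           # shared public address used for this flow
    public_port: int
    dest_ip: str             # the site or service the customer contacted
    dest_port: int

# Every web fetch, DNS query and mail poll creates another record like this,
# so retaining the complete log retains a per-transaction activity history.
example = CgnSessionRecord(datetime(2015, 9, 1, 10, 0, 12), "subscriber-A",
                           "10.0.0.5", 51514, "203.0.113.10", 40001,
                           "198.51.100.7", 443)
print(example)
```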

So when the bureaucrats claim that "The Australian Government is not requiring industry to retain a person's web-browsing history or any data that may amount to a person's web-browsing history", are they just lying, or do they really mean that operators of CGNs do not need to retain any of this data, making CGN-based networks truly opaque and anonymous? I strongly suspect the former - they are indeed lying, and everything you do on the net will be logged and retained. However, it's not intentional duplicity. They just don't get it.

Written by Geoff Huston, Author & Chief Scientist at APNIC

More under: Access Providers, IP Addressing, Policy & Regulation, Privacy

Categories: External commentary

ICT Sectors Are Merging Into a New Wholesale Platform for the Networked Economy

CircleID - Tue, 2015-09-01 21:44

There certainly is a lot of interest in the IoT (personal devices) and M2M (industrial applications) market. But what we see is only the surface; most of the IoT and M2M activity is taking place unseen. For example, all new electronic devices (smartphones, tablets, set-top boxes, game consoles) are now IoT devices. Wearable technology has also become a thriving part of the IoT industry, with an ever-broadening range of possible uses and devices, including smart watches, glasses, clothing items, skin patches, and even implants for health monitoring.

Tens of millions of smart meters have already been deployed by the electricity industry, with hundreds of millions more in the pipeline. Healthcare is another key industry. All new hospitals now operate large-scale M2M operations, tracking their equipment with real-time information. Most local governments have invested massively in mapping their assets; this is now being followed up by adding connectivity to those assets — whether streetlamps, drainage, sewerage or trees, all are in the process of becoming part of a smart city. (The number of connected M2M devices in Australia, for instance, will grow to somewhere between 25 million and 50 million by 2020.) Progress is still hampered by a lack of standards, interoperability and effective government and industry collaboration.

The intelligence that can be extracted from these various new technologies is what is known as big data. It can only be achieved through connected information management and data collaboration, so open data systems are critical to its success. Governments are increasing the number of data sets they make available to the public, and data collaboration between businesses is also starting to happen.

These intelligent transactions are mostly taking place in the cloud, with data centres forming the intelligent hubs between the clouds. Cloud computing has become one of the fastest-growing areas of the IT sector, and cloud computing solutions are being adopted by enterprises, government and consumers alike. In 2015 cloud computing has become more mainstream, with the majority of large enterprises adopting various solutions. Small and medium-sized businesses have, for the most part, yet to start on the road to cloud computing, while close to 90% of larger businesses in developed economies have already embraced it. Few people realise the enormous impact that cloud computing is already making.

The other critical element for the future of these ICT developments is the network quality needed for those billions of intelligent transactions between all of the IoT and M2M devices. This data needs to be collected and processed in order to deliver executable outcomes and real-time analyses to the IoT and M2M devices and their users, be they consumers, businesses, government organisations, utilities, traffic authorities and so on.

In order to successfully implement the emerging networked economy, far more robust infrastructure is required than is currently available. The NBN and 4G LTE-Advanced — a halfway house on the way to full 5G — are going to provide the robust infrastructure necessary for high-speed information processing, distributed computing and many other applications that can be processed, analysed and managed, all in real time, over a cloud-based IT platform. Ubiquitous access, enormous capacity, low latency, robustness and symmetric access, as well as very high levels of reliability, quality and security, are all critical to the success of such a new communications environment.

The importance of access to infrastructure in these ICT developments is leading to convergence of what are still largely separate sectors (big data, IoT, M2M, cloud computing, data centres and telecoms wholesale). This will lead to mergers and acquisitions between the various companies involved in these activities, with winners and losers emerging from the process; it will be a very dynamic and rapidly changing market over the next few years.

Social and economic developments are further accelerating, and as more organisations tap into this merged ICT space and more investments are made we will see further astonishing innovations emerge over the next few years.

Over time this will have a major impact on the economy. The emerging networked economy will become more decentralised, with innovative new jobs and business opportunities more widely shared. Smart cities are going to play a key role in this new economy.

Given the current social, economic and political turbulence, it seems clear that we have reached a ceiling in the way we currently use our intellectual ability to address the complex issues that society is facing.

The need for increased intelligence will lead to a merging of human activities and machines, something that is becoming increasingly possible and is heading towards the broader concept of artificial intelligence (AI). Some of the predictions and scenarios discussed in this context are clearly wrong, and AI as described by the popular media is, if it ever happens, at least a century away. Nevertheless, we are pushing the boundaries of our current intelligence capacity, and while the outcomes will differ greatly from most current predictions, one thing is certain — things will change.

In the end it is all about people — smart people in charge of all of these processes. What is needed is a vision from the top and smart communities working from the bottom upwards.

Written by Paul Budde, Managing Director of Paul Budde Communication

More under: Data Center, Mobile, Web

Categories: External commentary

U.S. Preparing Sanctions Against Chinese Firms and Individuals over Cyberespionage

CircleID - Tue, 2015-09-01 02:53

Ellen Nakashima reporting in the Washington Post: The Obama administration is developing a package of unprecedented economic sanctions against Chinese companies and individuals who have benefited from their government’s cybertheft of valuable U.S. trade secrets. The U.S. government has not yet decided whether to issue these sanctions, but a final call is expected soon, perhaps even within the next two weeks.

More under: Cyberattack, Cybercrime, Policy & Regulation

Categories: External commentary

The Mobile App Trap

CircleID - Mon, 2015-08-31 17:43

The Apple App Store was seven years old as of Friday, 10 July, marking a key — and possibly critical — evolution in how we use the Internet. First, the numbers, which are truly astounding — there are now more than 1.4 million apps available, which have been downloaded more than 100 billion times. And that's just Apple. Add in Android and the other platforms, and we start talking about a new app economy, generating billions in revenues for developers from around the world.

For us, the users, apps are our current path to the increasing number of features in our smartphones and tablets, helping us from morning until night. Apps can help wake us in the morning based on our sleep cycle, pay for our coffee, navigate our way to a meeting, time our exercise, remind us to pick up some bread, and find a constellation in the night sky. Based on one survey in the US, more than 80% of online time on a mobile device is spent in apps, as opposed to a browser. Even including desktop browsing, these users spend more than 50% of their overall online time using mobile apps.

That's a big shift in how we use the Internet. It has generated significant benefits, but they come at a cost. Today, most apps are native to a particular mobile platform, meaning that they are developed specifically for the proprietary technical specifications of that platform. This means, in turn, that they can only be used on other platforms if they are rewritten to match those particular technical specifications.

As a result, consider the time and expense that would be required to switch mobile platforms. Every app must be downloaded again on the new platform, and some must be paid for again. That assumes that all apps are available on every platform, which they are not. Developers face an expense in customizing their apps for each platform, and many target only one or two, limiting their addressable market.

This can, in turn, limit competition between platforms. A new platform needs to offer apps to attract users, but users look at the number of apps available before choosing a platform. In economics we call this a two-sided market — but everyone else calls it a chicken-and-egg problem.

This phenomenon is new to the Internet. Before the mobile Internet, switching computer platforms simply required installing a new or familiar browser, and web sites were developed to be accessed by all browsers regardless of platform. It turns out, however, that the old way could also be the new way, using web apps.

A web app enables developers to create a website with advanced features that can be installed on a mobile device, with its own icon, much like a native app. Developers can create one web app for all platforms — consumers can easily move between platforms the way they switch browsers today — and new platforms can enter and compete on more even ground.

The work leading to this new app environment, known as the Open Web Platform, is led by the World Wide Web Consortium. It requires Application Programming Interfaces (APIs) that match the rich functionality available to native apps, but that enable web apps to be interoperable between platforms. Examples of this new environment are already emerging and could be a new milestone in how we use the mobile Internet.

The development of web apps, and the resulting interoperability, parallels a key feature of the Internet as a whole. During the early years of the Internet's commercialization, vendors and operators joined the open standards movement and helped unleash an unprecedented era of growth and innovation. They found value in adopting standards that promoted interoperability between products across the industry.

Under an open environment, web apps can still be sold, and companies can still create platforms, as today. Providing interoperability would reduce switching costs for consumers and lower the development cost of apps that could reach all platforms. This would increase competition between platforms, and create a larger market for app developers to target at lower cost.

As a result, interoperable web apps would help to preserve the permissionless innovation that has been a hallmark of the Internet since its creation. They would help anyone, be they in Silicon Valley or Swaziland, to turn inspiration into innovation and innovation into income, and so succeed in the new Internet-enabled global marketplace. This would mark another important milestone in the development and evolution of the mobile Internet.

Written by Michael Kende, Chief Economist, Internet Society

More under: Mobile, Web

Categories: External commentary

Online Reviews Provide Insights into gTLD Customers' Needs

CircleID - Mon, 2015-08-31 17:36

There is a high degree of uncertainty when launching a product, and new gTLDs are no exception. However, registries that have introduced them — or will introduce them in the second round of new gTLDs — can improve their odds. A key move: go to established domain name forums and check out their specialized sections for new gTLDs. What users say there can help companies refine tactics and strategy.

In a recent study, Simon Li, Kamrun Nahar, and Benjamin C.M. Fung of Concordia University suggest that analyzing online reviews is a valuable way to gauge and understand the best positioning of gTLDs, and the refinements customers need, once a novel product has been launched. Unlike traditional consumer surveys or registration and sales numbers, online reviews provide immediate information on the opinions of early adopters and highly motivated users.

The study analyzed online reviews posted on popular sites such as Amazon and CNET between April 2010 and May 2011, a year after the rollout of the Apple iPad tablet, and about the time that competitors (Asus, Lenovo, Samsung, and Toshiba) started offering their own versions of the tablet. The competitors needed insight into consumers' needs for various hardware and functionality features.

The authors applied data mining techniques to online reviews to extract keywords and patterns that revealed customers' critical or favorable comments on 27 different tablet attributes. For domain name attributes, reviews should be segmented by the desired role and the availability of the domain name. This can yield insight into customers' choices, as the selection decision is not a simple choice between .com and an appropriate gTLD.
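The study's own techniques are considerably more sophisticated, but a toy sketch conveys the basic idea of mining reviews for attribute-level praise and criticism. Everything below (the mini-corpus, the attribute list and the sentiment cue words) is invented purely for illustration and is not taken from the paper:

```python
import re
from collections import Counter

# Toy review corpus and word lists, invented purely to illustrate the idea
# of extracting attribute-level praise and criticism from review text.
reviews = [
    "The screen is gorgeous but the battery drains far too quickly.",
    "Battery life is excellent; the keyboard feels cramped though.",
    "Great screen, poor keyboard.",
]
attributes = {"screen", "battery", "keyboard"}
positive = {"gorgeous", "excellent", "great"}
negative = {"poor", "cramped", "drains"}

praise, criticism = Counter(), Counter()
for review in reviews:
    words = re.findall(r"[a-z']+", review.lower())
    for i, word in enumerate(words):
        attr = word.rstrip("s")
        if attr in attributes:                          # an attribute was mentioned
            window = set(words[max(0, i - 4): i + 5])   # cue words near the mention
            if window & positive:
                praise[attr] += 1
            if window & negative:
                criticism[attr] += 1

print("praised attributes:   ", praise.most_common())
print("criticised attributes:", criticism.most_common())
```

Real systems would use part-of-speech tagging, dependency parsing or supervised classifiers rather than a fixed window of cue words, but the output, a ranked list of attributes with associated sentiment, is the kind of signal a registry could act on.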

Do not confuse analyzing online reviews with following product mentions on social media (Facebook and Twitter). That sort of feedback is more hindrance than help, as studies have shown. Online reviews are different: they represent the opinions of people who have actually used the new device, experienced its ups and downs, and have questions about desirable features. They aren't speculating about potential features or customers' adoption rates.

It is no easy task to develop data mining techniques for nonstandard forum formats. However, building such a competency for gTLDs will provide a competitive advantage that may lead to a concentration of gTLD registry ownership and/or the emergence of new consulting businesses providing such services.

Written by Alex Tajirian, CEO at DomainMart

More under: Top-Level Domains

Categories: External commentary

GAC as “first among equals:” the danger in the accountability plan

IGP Blog - Sat, 2015-08-29 22:29

How will ICANN’s Governmental Advisory Committee (GAC) handle the dilemma it has been put into by the community empowerment mechanism?

The dilemma I am referring to is this: does the GAC want to participate in the community empowerment mechanism as an organization, with voting seats, or does it want to have a privileged advisory role like it has today with respect to the ICANN board? Or does it want both?

I’ve been reviewing the transcripts of the ICANN 53 (Buenos Aires) meeting in an attempt to understand how GAC might be approaching this question.

Some people defend making the GAC part of the community empowerment mechanism by invoking the concept of “equal footing”  for stakeholders. While reading the transcript today I was hit over the head with reminders of how wrong this argument is. Here are the simple facts:

  1. The GAC does not have equal footing now, it has superior footing.
  2. The GAC knows this.
  3. The GAC does not want equal footing in the future, it wants to retain its superior footing.
  4. Some members want to further strengthen their 'footing' to make it even more superior.

It’s all written down in black and white, in multiple languages, from the June 21st, June 23rd and especially the June 24th transcripts. The GAC members realize that the privileged status of their advice under the ICANN bylaws gives them an extraordinary amount of influence over ICANN’s policy making process.

Here we see the UK representative to the GAC worrying about whether empowering the community will make GAC lose its superior status.

UK: “we have our first amongst equals status under the current arrangements, and we are grappling with how to ensure that that is retained against proposals which are going to empower other parts of the community in the successive regime.”

Portugal chimed in with an even stronger claim. The power of governments within ICANN to define policy should be unlimited:

PORTUGAL: “our position is that the public-policy issues obviously have to be dealt with by the government. Obviously governments [should] not be limited in [any] respect. So any wording in the proposal or in the text referring to what the governments do or don’t do within a certain boundary, provided that there is a limitation to the role of governments, I think that this would be unacceptable, because you cannot limit the role of governments in these kind of issues. At the same time we need to be first among equals.”

How about America, land of the free?

USA: So I couldn’t agree more with Portugal. We are first among equals. We wish to see that continue.

The quotation above ought to be a very nice antidote to what I call L. Gordon Crovitz disease, the idea that America is exceptional and it’s only other governments, not governments per se, that we need to worry about in the transition. (But Crovitz never let facts get in the way of his arguments.)

Spain made a ringing endorsement of the unequal footing argument:

SPAIN: From our point of view, we have to maintain the status quo of GAC as first among equals within the ICANN ecosystem. This is key.

Indeed, Spain wins my prize for the GAC’s most ardent statist. Their representative worried about the possibility that an empowered community might “defy ICANN board decision that are based on GAC advice.” Something needed to be done about that, Spain suggested:

maybe we can think of reasons or instances in which the community could not exert their powers against government decision — board decisions based on government advice.

Brazil, showing that little has changed there since 2005 (or 1895), had this to say:

And that’s why, therefore, [we] support in the new ICANN accountability framework a more significant involvement of governments that goes beyond the merely advisory role that the GAC holds today.

China sounded a similar theme:

We cannot make the conclusion right now to say the GAC will forever just be advisory body,

Solidarity among the BRICS: here’s RUSSIA:

…we totally agree with Brazil that this level of decision-making and the role of governments cannot be lower than it is today at this level.

Very nice to see the harmonious relationship between the US, Russia, Brazil, China and Europe here. The world’s states are united in their determination to preserve and possibly expand their power at the expense of the Internet community! Out of fairness, let me note that Denmark offered a word of caution:

from a Danish point of view, it’s very important that nobody, no individual, no organization even government can capture the organization in the future. That’s why we think it’s important to keep the rule that the Board only have to take into account the GAC advice when it’s in consensus.”

These sobering facts need to be reflected in our assessment of the CCWG proposal for enhancing ICANN accountability. The CCWG proposal would make the GAC a voting member in the community empowerment mechanism. This concept is usually rationalized in the name of “equal footing” for different stakeholders, but the GAC transcript blows their cover: They are first among equals and they revel in it. Governments would only use participation in the community mechanism to protect that special status and would probably attempt to expand it.

This blog has criticized including Advisory Committees in the community mechanism, but I, at least, am reconsidering. If we give GAC equal status in the community mechanism and TAKE AWAY their special advice status, we are closer to "equal footing" than we are now. Indeed, it may be that the privileged advisory power is more dangerous and distortive of 'equal footing' than their participation in the community mechanism would be. While many of us assume that GAC's "Advisory" status limits its power, in fact when it is used strategically it elevates governments above all other stakeholders in the policy making process. If one reviews the history of the new TLD program, for example, one can see how GAC advice was repeatedly used to provide the "last word" that could delay, hijack and change the policy developed by the GNSO. So maybe we should force the GAC to choose between privileged advice and participation in the community mechanism. Or maybe we should actually prefer their inclusion in the community mechanism to the continued special status of their advice.

Two things to avoid like the plague:

  1. Giving GAC BOTH privileged advisory status AND participation in the community mechanism.
  2. Giving GAC a similar privileged advisory status over the community mechanism (e.g., GAC would not participate directly but would "advise" the empowered community, which would translate into an effective veto, delay or dilution of the community's powers).

It seems unlikely that the special advisory power will be taken away. So if we don't oppose their inclusion in the community mechanism, there is a risk that they will get both. Indeed, it seems highly likely to me that many members of GAC will respond to the CCWG dilemma by demanding option 1 or option 2 above. The Internet community needs to be alert to this threat. Reassuring aphorisms about "equal footing" can obscure a rather Orwellian reality: all animals are supposed to be equal, but some are more equal than others.

 

Categories: External commentary