The Internet Corporation for Assigned Names and Numbers (ICANN) has released new guidance concerning the reporting and disclosure of bugs that affect the Domain Name System, including information on how ICANN itself will behave in response to vulnerabilities.
Until recently, ICANN, which is responsible for maintaining the root domain servers at the heart of the DNS, had no specific guidelines for the reporting of vulnerabilities, leaving responsible disclosure protocols up to the researchers who discovered the bugs. With the release of the Coordinated Vulnerability Disclosure Reporting [PDF] document, it hopes to instigate a more unified and consistent process for disclosure.
The guidelines are intended to:
"define the role ICANN will perform in circumstances where vulnerabilities are reported and ICANN determines that the security, stability or resiliency of the DNS is exploited or threatened. The guidelines also explain how a party, described as a reporter, should disclose information on a vulnerability discovered in a system or network operated by ICANN."
The document outlines procedures that ICANN will follow in various roles, including as an affected party, where the vulnerability directly impacts ICANN's operations; as a reporter, when ICANN researchers discover vulnerabilities; and as a coordinating party.
Security vulnerability reporting is a controversial topic, with some researchers advocating immediate full disclosure and others opting for responsible disclosure, where vendors and stakeholders are notified privately and full details are released only after the relevant software has been patched. There is also a thriving black market for security vulnerabilities, where the information is disclosed only to the highest bidder for use in hacking attacks.
As an essential and ubiquitous part of the Internet's infrastructure, the security of the Domain Name System is of particular interest to hackers and those engaged in industrial or state-sponsored espionage. ICANN is advocating a system of responsible disclosure, with ICANN itself acting as a coordinator in some cases. Bugs that impact DNS can be reported directly to ICANN, which will then inform affected vendors or service providers.
Public disclosure is strongly discouraged until vendors have been informed of the vulnerability and have fixes in place. However, the methodology recommended by ICANN makes it clear that in the case of vendors who fail to respond to attempts at coordination, researchers may choose to disclose vulnerabilities.
None of these recommendations is binding, and researchers are still free to choose how to react to discovered vulnerabilities. However, the creation of these guidelines is a positive move towards a unified and coordinated system for handling security vulnerabilities in the DNS.
Written by Evan Daniels
Follow CircleID on Twitter
Mandarin is a tricky language, but ICANN may want to learn the expression chóngfù before leaving the Beijing meeting. Chóngfù means "do-over" and that's what ICANN needs to forestall an entirely preventable disaster in the delegation of new top-level domains (TLDs).
The issue of "string similarity" seems straightforward. Nobody inside ICANN or out there in the real world wants Internet users to be confused by new TLDs that are confusingly similar. Imagine hearing an ad offering low rates at car.loans, only to encounter something completely different at car.loan instead. And what would stop somebody from launching a new TLD by just tacking an "s" onto popular domains like .com or .org?
The Governmental Advisory Committee (GAC) is catching a lot of flak for its Beijing Communiqué, but one thing the GAC got right was its advice that singular/plural strings are confusingly similar.
So how did we get to a point where ICANN inexplicably failed to find confusing similarity for 24 pairs of singular and plural forms of the same words, including .web/.webs, .game/.games, and .hotel/.hotels? More importantly, how do we fix this?
Chóngfù is hard for westerners to say and will be even harder for ICANN to do.
For starters, a little transparency is probably in order. The string-similarity review process was opaque by design. But many in the community want to know how ICANN's experts either failed to recognize the plurality issue — which would be troubling — or decided that single and plural gTLD strings can successfully coexist — which would be ludicrous.
Thankfully, the World Intellectual Property Organization (WIPO) has basic guidance on similarity: "words used in the singular include the plural and vice versa, as the context may require." That's the kind of common sense ICANN could use to correct the Guidebook and do a quick do-over on those 24 pairs of singular/plural TLDs.
ICANN may get a convenient backdoor out of this dilemma from the International Centre for Dispute Resolution, which is reviewing string confusion objections on seven of the single/plural pairs. If ICDR makes the right ruling, ICANN should apply that rule to all 24 single/plural pairs.
And if all else fails, there's always ICANN's "reconsideration" process for a formal chóngfù.
ICANN's critics at the United Nations and within many governments are waiting for a highly visible misstep in the ambitious expansion of top-level domains. That could be used to justify having governments displace the private sector in its leadership role on growing and governing the Internet.
Better that ICANN find a way to do over singular/plurals than to risk having governments impose a bigger do-over on ICANN itself.
Written by Steve DelBianco, Executive Director at NetChoice
Over the last few months one of the areas of attention in the new TLD project has been "closed generics". I've written about this several times in the past and I've also raised the issue in as many fora as possible.
Yesterday ICANN published a letter they'd received from Google with respect to several of their new TLD applications.
Google had previously made it clear that it intended to operate domain extensions such as .blog, .cloud, .search and .app in a closed fashion, or "walled garden", but this is no longer the case, as outlined in its submissions on the topic of closed generics last month.
The letter, which runs to 41 pages, includes a fairly concise explanation of Google's planned changes as well as the full text of the requested changes to their applications.
So what are they planning to do? Bear in mind that Google faces competition on several of these applications, so there is no guarantee that the strings will even be granted to Google.
.search is planned to be a "dotless" domain:
Our goal for .search is to provide an easily-identifiable namespace for firms that provide search functionality and to allow Internet users a unique and simple mechanism to access the search functionality of their choice. Google intends to operate a redirect service on the "dotless" .search domain [search] that, combined with a simple technical standard will allow a consistent query interface across firms that provide search functionality, and will enable users to easily conduct searches with firms that provide the search functionality that they designate as their preference.
I'm not sure how that will look, but it sounds kind of funky.
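Google hasn't published the technical standard it alludes to, but as a thought experiment, the core of such a redirect service could be very small: a registry of search providers and a rule that maps a user's stated preference plus their query onto a redirect URL. Everything below (the provider registry, the labels, the URL templates) is my own invention, not anything Google has described:

```python
# A toy sketch of how a "dotless" .search redirect service might work.
# Hypothetical throughout: the provider registry, the labels and the
# URL templates are invented for illustration, not a published standard.
from urllib.parse import quote_plus

SEARCH_PROVIDERS = {
    # provider label -> search URL template (illustrative only)
    "exampleengine": "https://www.exampleengine.search/?q={query}",
    "othersearch": "https://othersearch.search/find?q={query}",
}

def redirect_target(preferred_provider: str, query: str) -> str:
    """Return the URL that a request to http://search/?q=... would be
    302-redirected to, given the user's stated provider preference."""
    template = SEARCH_PROVIDERS.get(preferred_provider)
    if template is None:
        raise ValueError(f"unknown search provider: {preferred_provider}")
    return template.format(query=quote_plus(query))
```

The interesting design question is where the "preference" lives (a cookie, a browser setting, a registry-side profile?), and that is exactly the part Google's letter leaves open.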
.app will be for developers of apps:
We intend for .app to be a TLD dedicated to application developers. The term "app" is used in a variety of contexts, including mobile applications, browser-based applications and even desktop applications. We intend for the .app TLD to be restricted for use by relevant developer communities, but to be inclusive of the full range of application development communities and not to restrict registration to developers on a particular platform
So "app" will have the widest meaning possible, though how they'll actually "police" that isn't clear. Intent? Use?
.blog is one of the "closed generics" that bugged me the most. I blog. The string describes the content you are expecting to find on the domain. Being forced to use a specific blogging platform in order to access a .blog domain name was not how I'd like to see that extension used.
So Google's latest proposal for .blog is a lot more palatable to me:
We have two principal goals for the .blog TLD. First, users navigating to domains within the TLD should reasonably expect to reach a blog when they access a .blog domain name. Second, it should be simple and easy for .blog registrants to associate their second-level domain with their blog on the blogging platform of their choice. To this end, we are working with others in the blogging community to develop a simple set of technical standards that will allow users to automatically link their domain name to their blog at the time of registration. Registrations within the TLD will be limited to those with blogs adhering to these technical standards.
I'm not sure how this "standard" is going to look or how registrars and hosting providers are going to be able to implement it, but I like the concept.
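What the standard will look like is anyone's guess. Purely as an assumption on my part, one plausible shape is a DNS-TXT-style record in the spirit of SPF that names the registrant's platform and blog URL, which a registrar could verify at registration time. The version tag and field names below are invented:

```python
# A purely hypothetical sketch of a .blog "linking" record, modelled
# loosely on existing TXT-record conventions such as SPF (v=spf1 ...).
# The version tag, field names and values are invented for illustration.

def parse_blog_record(txt_value: str) -> dict:
    """Parse a record like 'v=blog1; platform=...; url=...' into a dict,
    rejecting anything without the expected version tag."""
    fields = {}
    for part in txt_value.split(";"):
        part = part.strip()
        if not part:
            continue
        key, _, value = part.partition("=")
        fields[key.strip()] = value.strip()
    if fields.get("v") != "blog1":
        raise ValueError("not a recognised blog-linking record")
    return fields

record = "v=blog1; platform=exampleblogs; url=https://example.exampleblogs.com"
info = parse_blog_record(record)
```

The hard part for registrars and hosts wouldn't be parsing records like this, but agreeing on who publishes them and when they are checked.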
The .cloud application is the fourth one that Google is planning to tweak:
As with .blog, our goal for .cloud is to create a clear association between .cloud names and projects hosted in cloud platforms, while simultaneously allowing registrants to more easily link domain names with the cloud offering of their choice. We are in the earlier stages of discussions with others in the cloud community, but intend to develop similar technical standards as with .blog
So with Google changing at least some of their applications to be more open and inclusive, will other new TLD applicants see the light and tweak theirs? What about Amazon? Symantec? L'Oreal?
And what about ICANN's board? Will they be able to find a way of dealing with the issue in a fair, transparent and equitable manner?
Written by Michele Neylon, MD of Blacknight Solutions
Information and Communication Technologies (ICT) Industry Soon to Be Largest Source of CO2 Emissions
There has been a lot of discussion lately on the environmental impact of the proposed Keystone-XL pipeline that is intended to carry heavy oil from the tar sands in Alberta to refineries on the US Gulf Coast.
I suspect at the end of the day the US government will approve the pipeline as GDP growth and potential job losses will always trump concerns over the environment.
However, the US government has been putting a lot of pressure on Alberta to improve its environmental standards as a quid pro quo for approving the pipeline. In response, Alberta is exploring expanding its current CO2 emissions program to a $40/tonne carbon levy. In the past, all of the funds raised by Alberta's carbon emissions program were returned to industry to invest in dubious energy efficiency programs. But Alberta could have a much more meaningful impact in terms of reducing CO2 emissions, more than compensating for the emissions from the oil carried in the Keystone XL pipeline, if it invested some of this money into its local universities and R&E network, Cybera.
Although on the production side the tar sands are one of the biggest sources of CO2 emissions, the Information and Communication Technologies (ICT) industry is, globally, the fastest growing and will soon be the largest source of CO2 emissions on the consumption side of the equation. ICT emissions are produced indirectly from the coal-generated electricity that is used to power all of our devices. Currently it is estimated that ICT consumes around 10% of all electrical power, growing at about 6-10% per year. According to the OECD and other studies, ICT equipment in our homes now consumes more energy than traditional appliances.
New studies suggest that the growth in wireless networks could be the single largest component of that growth in CO2 emissions from the ICT sector. A recent report by the Centre for Energy-Efficient Communications, a University of Melbourne-based research centre, claimed that by 2015 the energy used to run data centres will be a "drop in the ocean" compared to the wireless networks used to access cloud services. The report predicts that by 2015 energy consumption associated with the 'wireless cloud' will reach 43 terawatt-hours, compared to 9.2 terawatt-hours in 2012. This is an increase in carbon footprint from 6 megatonnes of CO2 in 2012 to 30 megatonnes of CO2 in 2015, the equivalent of an additional 4.9 million cars on the road, the report states.
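The report's figures can be sanity-checked with a few lines of arithmetic. The inputs below are the numbers quoted above; the per-car value is derived here, not taken from the report:

```python
# Back-of-the-envelope check of the wireless-cloud figures quoted above.
wireless_2012_twh, wireless_2015_twh = 9.2, 43.0
co2_2012_mt, co2_2015_mt = 6.0, 30.0

growth_factor = wireless_2015_twh / wireless_2012_twh   # roughly 4.7x in three years
added_co2_mt = co2_2015_mt - co2_2012_mt                # 24 Mt of CO2 added
cars = 4.9e6                                            # cars equivalent, per the report
tonnes_per_car = added_co2_mt * 1e6 / cars              # about 4.9 t CO2 per car per year
```

The implied figure of roughly 4.9 tonnes of CO2 per car per year is in line with typical annual passenger-car emissions, so the report's car-equivalence at least passes a plausibility check.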
More worrisome is another report, from Sweden's KTH, which predicts that the density of wireless base stations will need to increase 1,000-fold to meet the insatiable demand for the "wireless cloud". If this came to fruition, it would be an incredibly large jump in the electricity demand of the ICT sector.
The wireless industry in particular is an ideal sector to be powered by local renewable energy sources such as solar panels and windmills. Already many wireless towers in the developing world are powered by renewable energy (though unfortunately often with diesel backup). Because of its inherently distributed, lower-power architecture, the wireless industry is ideally suited to be powered by local renewable energy.
I have long advocated that universities and R&E networks are the ideal environment for deploying wireless networks that are powered solely by local renewable power sources. By integrating WiFi and 4G networks with multiple overlapping cells, it would be possible to provide seamless, zero-carbon wireless services.
Alberta could be a world leader in deploying such zero-carbon networks, starting first at universities in partnership with Cybera. The global CO2 impact of developing such technology, in terms of removing an additional 4.9 million cars from the road, would be much greater than the expected emissions from the oil to be carried in the proposed Keystone XL pipeline.
For more details see:
Thousand times greater density of base stations: J. Zander, P. Mähönen, "Riding the Data Tsunami in the Cloud: Myths and Challenges in Future Wireless Access", IEEE Communications Magazine, Vol. 51, Issue 3 (March 2013), pages 145-151 [theunwiredpeople.com]
Written by Bill St. Arnaud, Green IT Networking Consultant
One of the staggering numbers introduced during the opening remarks at ICANN 46 here in Beijing by multiple speakers, including ICANN CEO Fadi Chehade and speakers from the Chinese government, was this:
China now has over 564 million Internet users!
Think about that for a minute.
Most estimates these days are that there are around 2 billion people around the world using the Internet. We have no real way of knowing exactly how many people are online, but the estimate most of us use is "2 billion".
So if we go with that estimate, these latest numbers out of China would mean that China represents more than a quarter (roughly 28%) of all Internet users. Rather amazing growth, given that the ICANN 46 welcoming remarks also indicated that in 2002 China had only 59 million Internet users.
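For what it's worth, the arithmetic behind these figures is easy to check. The 2-billion world estimate and the 2002 figure are as quoted above; I treat the 2002-to-now span as roughly a decade:

```python
# Checking the scale of the numbers quoted in the opening remarks.
china_users = 564e6
world_users = 2e9                      # the common "2 billion" estimate
share = china_users / world_users      # just over 0.28

# Implied average annual growth from 59 million users in 2002,
# treating the span as roughly a decade.
users_2002, years = 59e6, 10
cagr = (china_users / users_2002) ** (1 / years) - 1   # roughly 25% per year
```

The raw share works out to just over 28%, and the implied growth rate of roughly 25% per year, sustained over a decade, is remarkable by any measure.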
Less surprising to me was the stated fact that 75% of Chinese users are mobile Internet users. I think most of us can clearly see both in industry trends and in our own personal usage that Internet usage is increasingly moving to a mobile-centric world.
Still, let's think about the scale of that percentage: 75% of 564 million represents 423 million mobile Internet users — about the size of the entire population of the USA and Mexico combined.
A rather huge number of people.
I sat there thinking about those numbers and my mind immediately turned to all of those of us who are publishing content on the Internet. This is yet another sign that mobile consumption of content is increasingly dominant — how well does your website work for mobile users? And while English may be the primary language many of us may use for our websites, how well do those sites work for viewers for whom English is not their main language? And what multi-lingual capabilities does your website have? Or what are you planning to add?
Truly an amazing number of users… and it will only continue to grow!
Written by Dan York, Author and Speaker on Internet technologies
ICANN's Nominating Committee (NomCom) is both a strange animal and a precious resource. Having a committee charged with first recruiting, then selecting suitable candidates to hold key positions within ICANN is something that is often little understood, or even misunderstood, even within the ICANN community itself.
By the very nature of its recruitment role, the NomCom has to remain secretive. About who the candidates are, at any rate. But that doesn't mean the rest of the NomCom's processes must remain so.
The feeling that the NomCom has at times lacked transparency became very evident last year, when the 2012 NomCom Chair Elect — the person chosen by the ICANN Board to be the NomCom Chair for the following year — refused to take up that position.
The ensuing debate, and sometimes stinging criticism, has clearly energised this year's NomCom to execute significant changes. Under the auspices of the 2013 NomCom Chair Yrjö Länsipuro, blessed with both information sharing and people skills (he was a journalist and a diplomat), the NomCom has significantly changed its approach.
A general 2-stage transition has been initiated. Stage 1 is becoming more transparent. Stage 2 should be looking at the actual recruitment processes used by the NomCom to ensure that high-level candidates do not baulk at the complexities of filling in online application forms and dealing with the application system.
Since the start of the 2013 NomCom's tenure, the committee has been putting out a Report Card after each of its official meetings. This is the first time ICANN's NomCom has produced written accounts of its meetings.
History was also made at the ICANN Beijing meeting this week, where the NomCom has scheduled several open meetings, including its main planning meeting. This is the first time that the NomCom's deliberations have ever been held in public to such an extent.
These are important steps forward for what is a crucial committee for ICANN, because it is designed to help bring new blood into the ICANN universe, which otherwise might be in serious danger of sclerosis.
Written by Stéphane Van Gelder, Chairman, STEPHANE VAN GELDER CONSULTING
The headlines out of ICANN's meeting in Beijing may be all about new domains, but it is the quiet, systemic evolution of ICANN itself that holds the greatest promise for Internet users globally.
ICANN President Fadi Chehadé opened the meeting by announcing that it was ICANN's "season to evolve," and setting forth a series of programs, restructuring efforts and policy initiatives intended to make ICANN more responsive to the needs of its stakeholders, and by extension, to the needs of all Internet users, everywhere in the world.
Mr. Chehadé's ambitious agenda provides a unique opportunity for ICANN to holistically review and strengthen its role in upholding the safety of Internet users.
Historically, ICANN's focus has been on Internet security almost to the exclusion of Internet safety. During the early stages of ICANN's evolution this narrow focus on security was both natural and likely necessary, given the organization's resources and scope.
The threats against the Internet's core technical infrastructure are significant, and ICANN's work in mitigating them is critical. But as ICANN's scope and resources expand, so too does its obligation to address the more granular threats to Internet users that arise from systemic abuse and exploitation of the Domain Name System.
Global cybercrime is at an all-time high, and shows no signs of abating. An independent study conducted by eight researchers from the U.S., UK, Germany, and the Netherlands, presented at the Workshop on the Economics of Information Security (WEIS) 2012, placed the global cost of cybercrime at just over $225 billion per year. And it could get much worse: a 2012 survey by the National Cyber Security Alliance (NCSA) and digital security firm Symantec showed that 83 percent of U.S.-based small businesses have no formal cybersecurity plan, even though the 2011 NCSA/Symantec survey showed that cyberattacks cost small and medium-sized businesses an average of $188,242. Almost two-thirds of the victims were shut down within six months of the attack.
The vast majority of the fraud and scams conducted by international cyber-syndicates shares a common characteristic of gaming the openness and accessibility of the Internet's addressing system to exploit the most vulnerable users.
Within its existing technical scope, ICANN has a tremendous platform to address these significant safety challenges. Simply enforcing existing contract terms with registrars and registries could have a dramatic global impact on cybercrime. Strengthening those contracts, and their enforcement mechanisms, would only magnify that effect.
ICANN is already making significant strides in the right direction. The new registrar accreditation agreement seems to hold great promise for Internet users globally, as does the registrants' "bill of rights and responsibilities" that Chehadé discussed in his speech.
But part of ICANN's evolution should be systematizing these efforts so that Internet safety is not addressed piecemeal, but as part of a broader effort to address the safety needs of Internet users, including the millions who lack the wherewithal to participate in ICANN's policymaking process.
When the ICANN community sets its will to something, history demonstrates that it can be remarkably effective at accomplishing it. We've seen that in its strides on Internet security, and will likely have another demonstration soon in the form of new gTLDs.
If the community can embrace the Internet safety challenge with the same vigor with which they approached new gTLDs, we will look back years from now and mark the critical importance of ICANN's "season to evolve."
Written by Tom Galvin, Executive Director at Digital Citizens Alliance
More than six million domain names were registered in the fourth quarter of 2012, bringing the total number to more than 252 million domain names worldwide across all top-level domains (TLDs) as of Dec. 31, 2012, according to the latest Domain Name Industry Brief from Verisign. The increase of 6.1 million domain names globally equates to a growth rate of 2.5 percent over the third quarter of 2012, and marks the eighth straight quarter with greater than 2 percent growth. Worldwide registrations have grown by 26.6 million, or 11.8 percent, year over year.
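As a quick sanity check, the quoted growth rates are consistent with the raw counts (a back-of-the-envelope sketch; the totals are rounded in the brief, so the results match to the precision quoted):

```python
# Sanity-checking the growth rates against the raw counts in the brief.
total_q4_2012 = 252e6            # domain names worldwide at end of Q4 2012
added_in_q4 = 6.1e6              # net additions during Q4

# Growth over the Q3 base: ~2.5%
quarterly_growth = added_in_q4 / (total_q4_2012 - added_in_q4)

added_year_over_year = 26.6e6    # net additions over the full year
# Growth over the year-earlier base: ~11.8%
yoy_growth = added_year_over_year / (total_q4_2012 - added_year_over_year)
```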
This week, bank customers in The Netherlands were shocked when they realised that online banking may not be as safe as they thought. Perhaps some were surprised to hear that what they think of as money is nothing but digits, something that does not exist. Their money only exists because we all act as if it exists and accept transactions between each other, aided by software run by banks (if they haven't outsourced that function). These good people found out the hard way that by changing a few digits, in this case involuntarily, their money just disappeared (and some became millionaires without being able to access this money).
The next day, new malfunctions of banks' websites were reported. For the first time it was openly admitted that the websites of all our banks and of the payment intermediary iDeal were down due to a DDoS attack, making the websites of the respective banks unreachable for regular traffic. The assailants also tried to log in.
This resulted in headlines, tweets, blogs and opening news items; the one on the 8 o'clock news on the public channel ended with: "in the USA this happens nearly every day". In the following, I'd like to take a look at a few related comments and a tweet by a politician, before coming to some questions. The main one reflects the title most: "Who's responsible for cyber security?"
If anything, the chaos, or perceived chaos, in banking transactions led to angry or confused people, famously short fuses and loads of attention from the media. The cyber security world has been waiting for years for a major cyber incident, one causing great damage, in the hope that governments and companies will start moving in the right direction. Some experts are even totally resigned to this way of thinking. This is not that incident. Sure, it shocked end users and led to some reactions from politicians, but in the end nobody seems to have lost money and there are so many other issues calling out for attention.
In the past week, high-level tax evasion by multinationals, top executives, politicians, etc. (let's say the top of societies) was prominent in the news. A column in NRC Handelsblad concluded that this problem requires decisions at the world level. (If I'm cynical: look at the list at the start of this section and ask yourself the following question: who decides on worldwide solutions?) What also struck me is that this is the exact same conclusion that is arrived at when talking about Internet governance, international cooperation against cyber crime, spam and malware enforcement, etc. In short, what I recently heard someone call "the glass ceiling of Internet governance". Most discussions stop here. Another variant of this discussion is: "we need to break down our silos!" Okay, but who is "we"? Is someone made responsible for breaking down these silos or ceilings? What are the right questions to ask here? Questions that lead to answers that could take the discussion forward and actually change the outcome? A topic for the upcoming IGF in Bali, I'd say.
The near future
The comment in the 8 o'clock news cited above caught my attention most: "This happens nearly every day in the US". I read somewhere that on 267 out of 365 days there were problems accessing major banks' websites. In other words, is this something we should come to expect as well? Are there contingency plans? Do governments accept that payments can't be made on (parts of) 267 days in the year? The economic impact is gigantic. Does it matter, then, whether the attacks stem from criminals, free speech advocates, "fun hackers" or state-to-state activities? I'd say not.
How can banks ever guarantee the safety of our money?
...is the question Dutch parliamentarian Kees Verhoeven (D66) asked on Twitter. (The tweet, in Dutch: "Heftig. De storing blijkt nu een #DDoS aanval! De vraag is hoe banken de veiligheid van ons geld kunnen blijven garanderen. #cybersecurity", which translates roughly as: "Intense. The malfunction now turns out to be a #DDoS attack! The question is how banks can continue to guarantee the safety of our money. #cybersecurity".) I responded to him that this was totally the wrong question to ask. There is nothing banks can do against DDoS attacks beyond preventive measures. The attackers, the tools they use, the infected PCs and other devices used, the command and control servers hosted anywhere in the world, are all far beyond the control of banks. As long as banks run state-of-the-art security measures (and even if they don't), they are victims, not culprits. Perhaps the banks need support from other entities on and around the Internet to solve this problem.
The tools used are infected PCs of end users, companies, governments, industry, etc., and other devices like smart phones, smart TVs, up to a hacked chip in your cat's collar (and this is no joke). There are a million reasons why these devices are infected: from irresponsible use by end users, flawed software, a lack of security by design in anything with "i" in front of it, negative incentives to deal with botnet mitigation or notice-and-take-down requests, and a lack of understanding in general, right up to a lack of government regulation, enforcement or incentives. All these measures, or rather the lack of them, are things banks have no influence over at all. They have influence over the quality of the products they buy themselves in the future, over internal policy and security measures, and perhaps they can reach out more to discuss Internet governance actively, which I advise them to do, but it stops there.
So, taking this all in, can banks guarantee the safety of our money? Answer this question yourself, and continue to ask yourself who is responsible for cyber security. There is a veritable plethora of parties involved, so where to start? What I have to conclude is that almost every single decision is to be made in the private sphere, in a competitive world. Where does that leave governments? Where does this leave decisions consciously made with the common good in mind?
So, who's responsible?
I'm not going to answer this question here. Those who follow my blog, read me here on CircleID or read my articles in Virus Bulletin know my points of view. What I'd like to ask you is to think about this question for one minute and share your thoughts with me, here or in any other context. It may just get a discussion going.
Written by Wout de Natris, Consultant international cooperation cyber crime + trainer spam enforcement
Mary Iqbal writes to report that ICANN has released the third round of initial evaluation results, bringing the total number of applicants to pass Initial Evaluation to 93. ICANN has now completed the initial evaluation of all but 13 IDN Top Level Domains. To learn more, see http://www.getnewtlds.com/news/Third-Round-of-Initial-Evaluations.aspx.
Matthias C. Kettemann's International Law and the Internet: Can you prove somebody is an idiot? Defamation between freedom of expression and protection of reputation
[Image caption: Defamatory statements on the Internet, too. (c) Kettemann 2013]
Just like the fama in Virgil's Aeneid (the etymological root of defamation), negative rumors harmful to someone's reputation prosper on the Internet.
"Fama, malum qua non aliud velocius ullum: mobilitate viget virisque adquirit eundo, parva metu primo, mox sese attollit in auras ingrediturque solo et caput inter nubila condit. [...] progenuit pedibus celerem et pernicibus alis, monstrum horrendum, ingens, cui quot sunt corpore plumae, tot vigiles oculi subter [...] tot linguae, totidem ora sonant, tot subrigit auris."
"[Fama] flourishes by speed, and gains strength as she goes: first limited by fear, she soon reaches into the sky, walks on the ground, and hides her head in the clouds. [...] fleet-winged and swift-footed, [...] who for every feather on her body has as many watchful eyes below [...], as many tongues speaking, as many listening ears."
Many tongues speaking indeed, many listening ears, many writing fingers on keyboards and watchful eyes for YouTube videos. For a study on freedom of expression on the Internet, to be published by the Council of Europe, I've looked at the issue in some more depth. What follows are a few important markers. But for an overview of the jurisprudence towards a "right to reputation", I encourage you to have a look at Stijn Smet's excellent article on Freedom of Expression and the Right to Reputation: Human Rights in Conflict, American University International Law Review 26 (2011) 1, 183-236.
But let's get back to fama and her wings and feet.
One thing is clear: Internet platform providers, site moderators and bloggers have to take care not to engage in defamation, and journalists reporting on events and news have to be careful not to publish content that is objectively defamatory. They have to avoid giving fama wings and feet, a forum and a multiplication vector.
As I have argued in two previous postings (here and here), liability may ensue, and the liability regime established by some national courts is problematic; a definitive answer by the European Court of Human Rights is still out, though there are some positive indications in its previous case-law.
Now, why is defamation such a problem?
While truth is an absolute defense against a claim of defamation, very often it can be difficult to establish, or very costly to do so. A customer on a travelling forum, for instance, might say that a specific hotel was a bad choice because of the small rooms and the broken appliances. This may be their opinion ('bad choice'), but it also contains a statement of facts ('broken appliances'). Once the hotel identified in the review published on the site asks the website owner to take down the post (on the argument that it is defamatory), the owner has a clear choice: either delete the post, thus arguably infringing upon the freedom of expression of its users, or keep the post and thus, having 'owned up to it', risk a defamation-based suit by the hotel.
The risk in a defamation suit lies in proving the veracity of the statement. Unfortunately for the owner of the travelling website, that duty now falls upon them. Though the original poster may help, they can hardly be legally forced to do so. The website owner, by themselves, will usually have a very hard time indeed proving that at a certain date in, say, 2011, the appliances in one specific hotel room in a small village somewhere in, say, California were faulty.
Voicing opinions (value judgments) online cannot amount to defamation, only statements of fact can be defamatory. As the European Court of Human Rights ruled in Lingens, “[t]he existence of facts can be demonstrated, whereas the truth of value judgments is not susceptible of proof”.
However, the Court will look at the context of a statement to determine whether it is a true opinion or rather a statement of facts disguised as a value judgment.
Freedom of expression and the right to reputation as a weapon against defamation often conflict. The European Convention on Human Rights only mentions reputation in Article 10 (2) as a legitimate aim that would allow a restriction of freedom of expression: "for the protection of the reputation or the rights of others". In a number of cases, centrally Pfeifer v. Austria, however, the Court has developed from this basis a right to reputation as part of a person's right to respect for private life under Article 8.
More recently, in Karakó v. Hungary the Court seemed to qualify its strong position in Pfeifer, arguing that only "factual allegations [of a] seriously offensive [with an] inevitable direct effect on the applicant's private life" warrant protection, a position it largely upheld in Polanco Torres and Movilla Polanco v. Spain.
In Polanco Torres (regarding an article alleging unlawful dealings and dirty money published first in the El Mundo newspaper) the Court ruled that the journalist had sufficiently verified the veracity of the allegations contained in the article. Their right to impart information that was in the general interest was given more weight than the right to reputation.
What makes this case especially interesting for freedom of expression online is that the article under review was republished by another newspaper, Alerta, which was also charged with defamation but, unlike El Mundo, was convicted of it in the national courts, because the journalists at Alerta had simply copied the article from El Mundo without checking the veracity of the allegations. The lesson: merely republishing defamatory allegations without ensuring their veracity is highly problematic.
In the 2011 case of Editorial Board of Pravoye Delo and Shtekel v. Ukraine the Court had another opportunity to assess the limits of defamation. The Court ruled that Article 10 must be interpreted as imposing on states an obligation to create an appropriate regulatory framework to ensure effective protection of journalists' freedom of expression on the Internet. Pravoye Delo is therefore to journalistic freedom online what K.U. v. Finland is to the protection of minors on the Internet.
The editorial board of the Ukrainian newspaper had been fined for publishing defamatory statements taken from the Internet accompanied by an editorial in which they distanced themselves from them. The Court found fault with the reluctance of the local courts to apply protections regarding offline media to online surroundings. The Court agreed that
"[the] risk of harm posed by content and communications on the Internet to the exercise and enjoyment of human rights and freedoms, particularly the right to respect for private life, is certainly higher than that posed by the press. Therefore, the policies governing reproduction of material from the printed media and the Internet may differ. The latter undeniably have to be adjusted according to the technology's specific features in order to secure the protection and promotion of the rights and freedoms concerned."
Even though the legal treatment of offline and online publications may differ, not applying safeguards at all is a violation of Article 10. This does not mean, however, that newspapers have to make individuals aware of potentially defamatory information. In the 2011 case Mosley v. the United Kingdom the Court ruled that the United Kingdom could not be faulted for not giving a public figure, whose sexual activities had been recorded and published in the form of images and videos on a newspaper's website, the possibility of an injunction to prevent publication, even if the publication violated his right to private life.
Taken together, the case law of the European Court of Human Rights contains important markers for navigating between the right of freedom of expression and the right to private life, between legitimate publications in the public interest and defamatory comments. A key lesson, however, is, again, that states need to apply offline free expression protection guarantees to online situations, even if these have to be developed in recognition of the special impact Internet publications can have.
Much ado about nothing; why the Uniregistry request for antitrust immunity is meaningless and its conclusions misleading
With much fanfare last month, Uniregistry announced that proposals for dispute resolution between New TLD applicants in lieu of ICANN's so-called "Auction of Last Resort" posed significant antitrust risks. Their claim of concern was not based on any critical antitrust analysis, but rather on the fact that they had sought a "Business Review" letter from the Antitrust Division of the U.S. Department of Justice (DOJ) and, according to Uniregistry, the DOJ discussed the issue with them but failed to provide a positive response.
I am a former trial attorney in the DOJ Antitrust Division and the former Policy Director of the Federal Trade Commission (FTC). At the FTC, I was in charge of the business review letter process and authored several of these letters. The specter of concern raised by Uniregistry is based on a misinterpretation of the business review process and not sound antitrust analysis.
Uniregistry suggests that the mere fact that they failed to receive a positive response from the DOJ means that enforcement action is likely. That is hardly the case. The DOJ has very high standards for issuing business review letters. Review letters are typically only issued where the facts and the law are fairly clear cut and demonstrate that there are no potential competitive concerns raised by the proposed conduct. Because of these very high standards, the DOJ typically receives numerous review letter requests, but issues only two or three business review letters a year. The fact that it did not grant Uniregistry's request does not mean the conduct raised substantial competitive concerns. In my experience, it simply means that the DOJ lacked the unambiguous, compelling facts to say that there were no competitive issues.
If the DOJ saw some potential competitive problems it would have responded with a letter articulating those concerns. In fact, one week after the Uniregistry announcement, the DOJ did exactly that, turning down a business review request on a patent exchange system because of potential competitive concerns. See [www.justice.gov]. The DOJ's failure to respond formally to Uniregistry certainly does not support the allegation that they have competitive concerns over the dispute resolution system.
Contrary to Uniregistry's suggestion, the DOJ's refusal to issue a positive letter does not suggest the conduct at issue is likely to lead to antitrust enforcement. If the DOJ thought there were competitive concerns sufficient to bring enforcement action, its procedures instruct that they would respond clearly in that fashion. Rather, according to Uniregistry, they simply responded that the conduct is not wholly immune from scrutiny. Stated another way, the failure to secure a business review letter does not mean the DOJ is likely to bring a law enforcement action. Indeed, in over 40 years there has never been a case where a rejected business review letter request led to an enforcement action, even when the DOJ has suggested that the conduct at issue could potentially present antitrust issues.
Moreover, the key to any analysis of proposed conduct from the perspective of the antitrust laws is whether consumers or other parties may be harmed by the conduct at issue. In this case, it seems fairly unambiguous that ICANN will not be harmed by the dispute resolution system. In fact, ICANN designed the dispute resolution system and encourages applicants to engage in dispute resolution in order to avoid the ICANN auctions. Indeed, there has never been a successful antitrust case where the plaintiff was the party that actually designed the restraints at issue.
Uniregistry's request was unusual in another important respect. Typically business review letters are requested by the parties proposing the conduct or those that have created the arrangement, but in this case ICANN did not go to the DOJ. A critical part of any analysis of a proposed arrangement is the "purpose and intent," but Uniregistry was in no position to answer those critical questions.
In any case, regardless of how Uniregistry might want to interpret DOJ's non-action, there's little antitrust risk posed by anticipated private auctions or the registry dispute resolution system as a whole. First, as suggested earlier, the only entity that could be harmed by the system is ICANN, which designed the system. ICANN effectively cannot be harmed by this system (and this is key) as it is deliberately forgoing any revenue from the auctions of these new registries. Second, the dispute resolution system cannot harm consumers. There is no fashion in which the method of dispute resolution ultimately would lead to higher prices or less innovation or output. Without some clear-cut harm to consumers, it is difficult to fathom any antitrust violation. Third, the dispute resolution system is akin to many types of joint ventures that have been approved by the DOJ in which competitors have collaborated in order to improve how the market works. The ultimate question asked by the DOJ is whether a system helps to make markets function more effectively, and certainly the ICANN dispute resolution system, including private auctions, would meet that requirement.
Finally, although Uniregistry or others might be able to envision some other form of dispute resolution system, it is not the DOJ's role to engage in economic policy engineering and suggest how ICANN should restructure those rules. They simply are obligated to stop conduct that will harm consumers through higher prices or less innovation. The current ICANN dispute resolution system does not pose these risks; that is why antitrust enforcement would be highly unlikely. Any suggestion otherwise is most likely just in Uniregistry's business interests.
Written by David Balto, Antitrust Lawyer
Follow CircleID on Twitter
The 46th meeting of the Internet Corporation for Assigned Names and Numbers (ICANN) takes place this week in Beijing, China, and will bring together leaders from all over the world to discuss and debate a wide range of issues related to domain names and the surrounding industry. One can expect that the new gTLDs, a topic frequently discussed here on CircleID, will naturally consume a great deal of the discussion at ICANN 46. The main site for the event can be found at:
and the full schedule of events can be found at:
A great aspect of ICANN meetings is that most of the meetings have some mechanism for you to view the meeting remotely. If you go into any of the sessions on the schedule, you will see remote participation links — often for both high and low bandwidth connections. In my experience, many sessions are also recorded for later viewing.
Do keep in mind that all times are local to Beijing, which is UTC+8, and may not work with your viewing schedule. For instance, there is a 12-hour difference from the eastern US where I live, and as a result a session that starts Monday at 9am in Beijing will be starting Sunday night at 9pm for people in the eastern US.
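If you want to double-check a session time rather than do the offset arithmetic in your head, Python's standard library handles the conversion. A minimal sketch (the specific session date here is illustrative, picked from the ICANN 46 week in April 2013):

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+

# A session starting Monday, 8 April 2013, 09:00 Beijing time (UTC+8)
session = datetime(2013, 4, 8, 9, 0, tzinfo=ZoneInfo("Asia/Shanghai"))

# Convert to US Eastern time (EDT in April, UTC-4, hence the 12-hour gap)
local = session.astimezone(ZoneInfo("America/New_York"))
print(local)  # 2013-04-07 21:00:00-04:00, i.e. Sunday 9pm
```

Using IANA zone names rather than fixed offsets means daylight saving shifts on the US side are handled automatically.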
In the midst of all the more business-focused discussions around domain names and governance questions, there are also some excellent technical tracks. I will be in Beijing specifically for the excellent DNSSEC Workshop and related sessions, as well as attending the IPv6 workshop.
I'm looking forward to the ICANN 46 event — if you will be there, too, please do feel free to say hello. You can pretty much expect to find me in any sessions related to DNS security.
P.S. If you are interested in the views of my employer, the Internet Society, on the events happening at ICANN 46, a few of my colleagues prepared the "Internet Society's Rough Guide to ICANN 46's Hot Topics” that outlines what the organization will be watching and participating in over the next week.
Written by Dan York, Author and Speaker on Internet technologies
Matthias C. Kettemann's International Law and the Internet: An African Spring? Human Rights and Security in Africa in Times of Change (Call for Papers)
Human rights and human security in Africa (the image shows an architecturally interesting window ('nozzle') in the Graz Museum of Modern Art ('Kunsthaus'); (c) Kettemann 2012)
On 10-11 June 2013 the University of Graz will host the 6th Graz Workshop on the Future of Security, a series of annual workshops I co-founded six years ago, dedicated to new security challenges and international responses to insecurity, focusing on the individual.
Human security has been a focus of my research for some time. At the Institute of International Law and International Relations, I help run the Human Security Focus Group.
The workshop takes on the important task of analyzing whether the political dynamic of the Arab Spring can be scaled up.
Please find below the call for papers. The deadline for contributions has been extended to 12 April 2013. The organizational committee welcomes your submission.
6th Graz Workshop on the Future of Security
An African Spring? Human Rights and Security in Africa in Times of Change
10-11 June 2013 | University of Graz, Austria
The Institute of International Law and International Relations of the University of Graz, Austria, the European Training and Research Centre for Human Rights and Democracy (ETC) Graz, and their Human Security Focus Group, in cooperation with the Austrian National Defence Academy invite contributions to the 6th Graz Workshop on the Future of Security on 10-11 June 2013, dedicated to the topic: ‘An African Spring? Human Rights and Security in Africa in Times of Change’.
The sixth workshop in a series of academic events devoted to furthering our understanding of today's and tomorrow's security challenges is meant to bring together both emerging and established researchers at the pre- and postdoctoral level active in the field of human rights and security studies to exchange views on contemporary challenges facing the African continent. The interdisciplinary workshop is dedicated to studying the consequences of the tremendous political shifts that occurred during the revolutionary changes in African countries over the last two years, their reasons and their implications for international, regional and human security.
Presentations could evaluate, for example, the impacts of the Northern Africa uprisings for peace and security in the region and abroad, the human rights situation during and after the revolutions, as well as the role of the international community and international organizations. Analyses of the events in light of international law and African normative instruments and the role of civil society and networks as agents of social change are also most welcome.
The presentations are selected on the basis of academic merit and may be submitted independently or under one or more of the following streams: Stream 1: Security Studies; Stream 2: Human Rights; Stream 3: International Law and International Politics; Stream 4: Social Sciences; Stream 5: Interdisciplinary Approaches.
Submissions of no more than 300 words describing your presentation should be sent together with a short bio no later than 12 April 2013 to email@example.com. Decisions of acceptance will be notified by 20 April 2013. A camera-ready version of the paper is due on 15 May 2013. Selected excellent contributions will be published in a special edition of the peer-reviewed journal Human Security Perspectives.
Organizing Committee: Wolfgang Benedek | Vanda A. Dias | Lisa M. Heschl | Matthias C. Kettemann
Reinmar Nindler | Kalkidan N. Obse | Stefan Salomon
We regularly check the status of IPv6 deployment in the RIPE NCC service region, and in other service regions as well. One way to measure IPv6 deployment is to look at the percentage of networks announcing IPv6 prefixes and follow the developments over time.
The RIPE NCC's IPv6-ASN graph shows the percentage of networks that announce one or more IPv6 prefixes in the global routing system. Having an IPv6 prefix visible in the global routing system is a required step for a network to actually start exchanging IPv6 traffic with other networks. The interactive graph allows you to specify the countries or service regions you are interested in, which can make for some interesting comparisons.
The graph below shows the percentage of networks announcing IPv6 prefixes in each Regional Internet Registry's (RIR) service region over the last few years.
It is interesting to see that the percentage of networks announcing IPv6 address space in the APNIC and the RIPE NCC service regions continues to increase steadily. Both of these RIRs have reached IPv4 exhaustion (in 2011 and 2012 respectively) and are currently allocating from their last /8 block of addresses.
It is also encouraging to see that the percentage of IPv6-enabled networks in the ARIN service region, which is projected to be the third RIR to reach its last /8 of IPv4 addresses, is also increasing. On the other hand, the percentage of IPv6-enabled networks in the LACNIC and the AFRINIC service regions appears to have stopped growing. For the LACNIC service region this number even fell a little over the last few months: although the absolute number of IPv6-announcing networks grew from 388 to 399 since the beginning of 2013, this growth was outpaced by the total growth of networks in the service region that are visible in the global routing system, which resulted in a decrease from 15.5% to 15.0% over this period. Even though this might not be a surprise, it's reassuring to see that in regions where IPv4 exhaustion has occurred, there is a steady growth in the percentage of networks announcing IPv6 address space.
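The LACNIC dip illustrates a general point: an absolute count can rise while its share falls, as long as the total grows faster. A quick sketch using the figures above (the implied totals are derived from those figures, not separately published numbers):

```python
# LACNIC region: IPv6-announcing networks and their share of all
# networks visible in the global routing system.
v6_start, share_start = 388, 15.5   # beginning of 2013, percent
v6_end, share_end = 399, 15.0       # a few months later, percent

# Implied total number of visible networks at each point in time
total_start = v6_start / (share_start / 100)   # ~2503 networks
total_end = v6_end / (share_end / 100)         # ~2660 networks

# IPv6-announcing networks grew ~2.8%, but the total grew ~6.3%,
# so the IPv6 share fell despite the absolute increase.
v6_growth = (v6_end - v6_start) / v6_start * 100
total_growth = (total_end - total_start) / total_start * 100
print(f"IPv6 growth: {v6_growth:.1f}%, total growth: {total_growth:.1f}%")
```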
If you find other interesting comparisons between countries or regions, please comment below! You can find more information and statistics on RIPE Labs.
Note that this article is based on work done by Emile Aben, System Architect at the RIPE NCC.
Written by Mirjam Kuehne
As part of the new domain initiative launched by the Internet Corporation for Assigned Names and Numbers, established businesses and speculators have filed applications for a wide range of top-level domains — from .amazon to .garden. While some applications would make new web domains open to any qualified applicants, others propose a "single registrant" model that would allow only one company to use the new top-level domain.
Before the experiment has gotten off the ground, some critics have expressed concern about applications to operate domains referring to a "generic" product or service, like .car, .book, or .app. News reports indicate that Microsoft and other Google competitors have filed complaints about Google's applications, while authors' organizations have raised questions about some of Amazon's applications. These complaints assert that giving these applicants the right to operate these new domains would provide an unfair competitive advantage.
ICANN shouldn't worry, however. The sky isn't falling.
Granting Google, Amazon or any other company "single registrant" gTLDs does not threaten the competitive online ecosystem.
First, the "competitive advantage" (or value) any company can achieve from these gTLDs is uncertain. Previous TLD offerings like ".biz," ".mobi," or ".info" failed to draw large numbers of websites despite extensive promotional efforts. In fact, repurposed country code TLDs — including .ly (Libya), .me (Montenegro) and .co (Colombia) — earned their popularity unexpectedly.
To put a finer point on it — most alternative domains have flopped. Because of the highly uncertain value of new gTLDs, many of the concerns levied against bidding companies like Google and Amazon, which have applied to manage dozens of gTLDs, are completely speculative. Companies are bidding because they think there might be opportunities in new domains — but history suggests they will have an uphill battle. There is no evidence to suggest a genuine likelihood of harm to Internet users or the online ecosystem.
Second, the existence of alternative web domains will not disturb the fundamental openness of the Internet. Amazon's use of the .book domain to market the latest bestsellers would in no way block any other bookseller from using a different domain to do the same. In fact, the use of .book does not seem to provide a company any kind of competitive advantage against its business rivals.
Despite linguistic confusion, there is no relation between an exclusive right to a domain and a "monopoly" over a specific economic market. Users can easily navigate to any site based on its quality, whatever its domain name. Sites that grow popular do so because of how well they meet their users' needs, not because of their domain name.
Moreover, users today often rely on search engines to get where they want to go, rather than typing URLs out. There is even a term for such searches: "navigational searches." Terms relating to Facebook including "Facebook.com" or "Facebook login," for example, represented 5.62% of all searches conducted online in the United States, according to the information analytics firm Experian. If Facebook is on .com, .facebook, or .socialnetwork, people will be able to find it.
Finally, many of the worries about Google's control over certain gTLDs have already been addressed. Google changed its applications for the .search, .app, .blog, and .cloud gTLDs so that the domains would be open to qualified sites, not just Google products. Other Google applications, including those for .map and .fly, were already drafted to be open to qualified sites. This means that if MapQuest wants to use mapquest.map, WordPress wants to use wordpress.blog, or Yahoo! wants to operate yahoo.search, all will be free to do so.
Google's competitors also contend that Google has the incentive to tweak its search algorithm to favor any site on a Google domain. Google has already pledged not to do this. Further, Google has little financial incentive to make its results less relevant to users, because some users would switch to other search engines.
If ICANN's experiment is successful, it has the potential to generate tremendous value for companies and offer users a better online experience. Existing companies will be able to develop domains centered on their brands to draw more customers and enhance their business performance. Operators crafting new business models for these domains may also improve how users interact on the web. As ICANN's At Large Advisory Committee observed, "there may be innovative business models that might allow a closed TLD to be in the public interest."
While the benefits remain uncertain, the harms are clearly exaggerated and should find a home at a new domain called .premature.
Written by Marvin Ammori, Fellow at the New America Foundation, Lawyer at The Ammori Group
The numbers are big. Official figures quoted at the recent 21st annual China Content and Broadcasting Network (CCBN) conference indicate that China has 400 million TV households, of which 210 million subscribe to cable TV (CATV). Of these cable subscribers, 140 million receive digital service while the rest are still on analog systems. This means that the country's CATV network is still largely a one-way network, limiting the growth of on-demand and interactive services. Compared to broadband offered by the dominant telecom operators — China Telecom and China Unicom — the country's CATV high-speed Internet service is tiny at a mere 5.64 million subscribers in total.
Theoretically, China's unique CATV industry is organized in a four-layer hierarchical structure. First, there's the nationwide network. Secondly, each of the country's thirty-odd provinces runs its own CATV network. Then each municipality owns a cable network, and finally, each county below the municipality level runs its own network. In reality, this structure is not always so fixed, as some government levels merely perform administrative functions while others actually own a physical network of services. Even so, there are still thousands of CATV operators in China and almost all of them are owned or partly owned by some level of government.
The country is currently undergoing a major effort to consolidate CATV networks. The first step is to consolidate all networks up to the provincial level, so that each province will run a connected cable network by merging and unifying the networks within its provincial territory. The aim of this is to provide a foundation of operational scale and reach. Leading the effort is the State Administration of Radio, Film, and Television (SARFT), the government regulator that sets state policies and regulations for these industries. Each CATV operator is owned by the respective administrative branch of SARFT, so in essence, the regulator is the operator.
This consolidation is part of China's Next Generation Broadband (NGB) initiative. It involves an upgrade of the country's CATV systems to two-way transmission and the deployment of a distributed conditional access system to deliver high-definition TV, 3D TV, Ultra HDTV, and multimedia. The NGB will enable China to move towards an all-digital, all-IP world. By the end of this year, the aim is to turn 50% of all networks above the municipal level into all digital and IP services, and by 2015, for 80% of all networks to feature two-way services. China's CATV industry is also expected to grow from the current 28 high-definition channels and one 3D channel to at least 100 HD channels and 10 3D channels by 2015.
There is still a proliferation of Ethernet over cable (EoC), but DOCSIS has gained ground recently through what is known as "C-DOCSIS". This localized version of the DOCSIS architecture pushes traditional CMTS functionality further to the edge in the form of a Converged Media Converter (CMC), delivering bandwidth to some 300 homes more cost-effectively than a CMTS.
All in all, the country is gearing up for delivering the 4As: anywhere, anytime, any device, and any content. Multi-screen access to content is a priority. Although the market is big, it can be confusing for equipment vendors and revenues can be elusive. Layers of bureaucracies, shifting priorities and timelines, and intricate distribution channels have contributed to market inefficiencies that hinder the growth of this industry. Cable in China is caught between the need to provide a commercial service and adhere to its function as a governmental branch that has to carry out state goals and priorities.
Written by Will Yan, Senior VP, Worldwide Sales at Incognito Software