Author Archives: kc


Note on Internet Organisations

Digital Feudalism: Enclosures and Erasures from Digital Rights Management to the Digital Divide

Background Note on Science Commons

One of the key determinants of today’s world is the speed with which innovation takes place and is brought within the sphere of production. The growth of technology is a continuous driver of the economy. While much has been said about the monopoly created over the “reproduction” of an innovation via patents, far less attention has been paid to the way innovation takes place and to the structures within which it is either facilitated or retarded. Does the networked world of today carry new possibilities for alternate structures of creating knowledge and innovation that are currently being held back? Is it possible to expand the notion of the “commons” to help such processes develop?

A number of recent patent cases in the United States Supreme Court1 and in the US federal courts have shown that companies investing heavily in advanced technologies are moving away from the patent model. The major exception is the big pharmaceutical sector.

The technology model of generating innovation was conceived as “private” from the beginning. The patenting system originated in the days of the lone inventor and the need to protect his/her invention. Historically, the lone inventor gave way to large corporate or state-funded research laboratories in the early twentieth century. Increasingly, science institutions have also been looking at producing knowledge in profit-oriented ways similar to those used by global corporations in creating new technologies. With the Bayh-Dole legislation2 in the US, this model has come to dominate publicly funded science there. In India, as elsewhere, the belief that whatever direction the US has taken must be the right one is gaining ground.

Interestingly, this is also a time in which alternate models of generating knowledge and innovation have gained ground. The Free Software Movement has shown that networked and open collaborations of “hackers” can produce software of far better quality than what the best of well-heeled corporations working in isolation can manage. The power of open, collaborative structures, working without so-called material incentives, is visible in this model. The Free Software Movement has thus resurrected older models that have played key roles in successful technological innovation, such as steam engine development in the Cornish mines3 and blast furnace development4 in Great Britain and the US.

Proponents of patents may argue that even if they did not work in the past, patents today are great for promoting innovation. The argument against the current patent regime for companies involved in innovation is that the increasingly networked character of producing scientific and technological knowledge runs up against the requirements of a patenting system: the bang is not worth the buck involved in patenting.

Let us look at more recent data. In a forthcoming book, the researchers Bessen and Meurer5 have analysed the revenues generated from patents against the costs of filing, maintaining and defending them in court. In their view, the data show that, except in the case of pharmaceuticals, patents generate far more litigation costs than revenue. The numbers are clear: domestic litigation costs, 16 billion dollars in 1999 alone, were about twice the revenue from patents. Even of this revenue, almost two thirds came from pharmaceuticals and chemicals. Worse, the more innovative a company, the more likely it was to be sued. Software and business method patents fared the worst, with costs far outstripping the benefits of patenting. Even if we examine not the broader question of whether societies benefit through greater innovation, but the much narrower one of whether innovative companies benefit from patenting, the answer is that they do not. The answer Bessen and Meurer come to is no different from what others have found in the past: if patents did not already exist, inventing them would be a poor way of rewarding innovation.

The research of Bessen and Meurer, and of Boldrin and Levine, also shows that patents do not promote innovation in societies either. Most of the historical data from countries with different forms of patent protection do not show significantly different rates of innovation. Current data are no different.

A Historical Look at Patents: Cornish Mines and Blast Furnaces in the Cleveland Area

The need for patents has always been articulated as that of a necessary social evil. The US Constitution allows Congress “To promote the progress of science and useful arts, by securing for limited times to authors and inventors the exclusive right to their respective writings and discoveries.” Thus even in the US, this exclusive or monopoly right is given not because the inventor somehow owns the idea embodied in the patent, but in order to promote science and technology and, thereby, larger societal goals.

A patent, as an incentive, gives the inventor a monopoly for a certain period, in return for which he/she makes the invention public. In economic terms, this monopoly allows the patent holder to extract rent from all users of the patent: the state is allowing the patent holder the right to levy a private tax. The question therefore arises: are patents (or monopolies) the best form of providing such incentives?

Even if we accept that material incentives need to be given to inventors, patent monopolies are not the only possible form. One alternative is a royalty paid to the inventor by any producer who wants to work the patent, without a monopoly over all reproduction of the invention; this is what the patent literature calls an automatic licence of right. Another is the state offering prizes from its kitty for socially useful inventions, a policy a number of states have followed in the past to encourage inventors.

The question is whether the monopoly patent regime has helped promote innovation. Let us start with the most celebrated innovation, stated in all textbooks to be one of the key elements of the Industrial Revolution: the steam engine. James Watt perfected his version of the steam engine and secured a patent on it in 1769. In 1775, using the influence of Matthew Boulton, his rich and influential business partner, he succeeded in getting Parliament to pass an Act extending his patent till 1800. This gives us an opportunity to examine developments in steam engines and to decide whether Watt’s patent helped promote innovation or actually stifled development.

The major beneficiary of advances in steam engines would have been the mining industry in Cornwall. Watt spent his entire time suing the Cornish miners whenever they tried to improve on his design. The firm of Boulton and Watt did not even manufacture steam engines then; they only allowed others to construct engines based on Watt’s designs, for which they claimed huge royalties. If we plot the efficiency of steam engines against time, we find that after Watt’s initial breakthrough, all further improvement virtually stopped during the period of his monopoly, starting again only after his patents expired. During the period of Watt’s patents, the UK added about 750 horsepower of steam engines per year. “In the thirty years following Watt’s patents, additional horsepower was added at a rate of more than 4,000 per year. Moreover, the fuel efficiency of steam engines changed little during the period of Watt’s patent; while between 1810 and 1835 it is estimated to have increased by a factor of five”6. The major advance in steam engine efficiency took place not because of Watt’s invention but after it.

Interestingly, those who made further advances, such as Trevithick, did not file patents. Instead, they worked on a collaborative model in which all advances were published in a journal maintained collectively by the mine engineers, called Lean’s Engine Reporter. This journal published best practices as well as all the advances being made. This was the period that saw the fastest growth in engine efficiency.

If we look at the research on whether increased patent protection helps innovation, very little concrete evidence has ever been found for this thesis. In fact, the evidence, not only from the Cornish mines but also from blast furnaces in the UK and the US in the 19th century, shows that collective invention settings7 lead to faster diffusion of technology and more innovation than closed, patent-based monopolies. Thus the advances in two key elements of the Industrial Revolution, steam engines and steel, both came out of a non-patented, open sharing environment. The recent advances of Free and Open Source Software are not an anomaly but merely a reflection of the fact that an open model of developing knowledge is a faster and surer way to innovation than conferring state monopolies.

Science and Open Models

In science, while results have been open and shared publicly, discoveries have been made within a competitive model. The current contours of the scientific enterprise are defined by these competitive notions of exclusive discovery. They have been consistent with reductionist paradigms, in which small problems could be examined in isolation. Such models are unlikely to be adequate today. Cooperation in the scientific community on a far wider scale than has been the case so far is critical if major advances are to take place.

Today, the information technology sector8 has shown that new technologies and methodologies can be developed by cooperative communities. It may be argued that this sector is unique in that the “reproduction costs” of its “artefacts”, the software, are relatively low. However, the question needs to be posed whether it is possible to design such approaches for other areas such as, say, the life sciences. Is it possible to have similar cooperative communities that work together to produce new products? Is it possible to envisage ways by which artefacts can be reproduced and reach the community without high costs of “reproduction”? Are there spaces in which new, more intimately cooperative modes of scientific enquiry can be initiated?

What is needed is to explore new ways of establishing a ‘creative commons’ in which new technologies and methodologies are developed by cooperative communities. It is in this context that we thought to bring together a set of practitioners from different disciplines to focus on the production of knowledge and innovation, and to examine which structures of knowledge production are in consonance with the needs of producing new knowledge and innovation. Some examples are given below.

Agribiotechnology:

There is little doubt that genetically engineered plants are going to have an enormous impact on agriculture in the future. That they have not done so to date is due to various reasons. One, of course, is that genetically modified organisms are in their infancy. The second, and perhaps even more important, is that unlike the Green Revolution, which came out of public domain science, the Gene Revolution is coming from private domain science. The prospect of any country’s agriculture passing into the hands of a few multinational companies is not a reassuring one. It is compounded by the fact that the successful biotech seed companies are either chemical companies, such as Monsanto and DuPont, or pharmaceutical companies, such as Novartis and Bayer. The track record of both regarding the public good has been rather poor. The discomfort people have about their countries’ agriculture passing into multinational hands is therefore not unjustified.

Greg Traxler, in his paper for FAO shows the rapid increase of transgenic crops in some countries and for specific crops. “In 1996, approximately 2.8 million hectares were planted to transgenic crops or genetically modified organisms (GMO) in six countries (James, 1998). Adoption has been rapid in those areas where the crops address important production problems, and by 2003 global area had risen to 67.7 million hectares in 18 countries (James, 2003)… Six countries (the USA, Argentina, Canada, Brazil, China and South Africa), four crops (soybean, cotton, maize and canola) and two traits (herbicide tolerance and insect resistance) account for more than 99 percent of global transgenic area.” 9

To explore such possibilities, one example would be the development of useful crop varieties in the agribiotech sector. The bulk of ‘innovative technology’ in this arena currently appears focused on making genetically modified crops (GMOs), a technology that is patent-protected by the MNC sector. An interesting step away from this corporate model of agribiotech development has been the establishment of an ‘open source biology’10 platform, centred on new microbes useful for making transgenic plants. The most advanced initiative of this kind is the Australia-based CAMBIA/BIOS. The first acronym refers to the broader scope of promoting biological innovation for agriculture (Centre for the Application of Molecular Biology to International Agriculture); the second refers to Biological Innovation for Open Society, the specific arm of CAMBIA dedicated to open-source biology. It particularly focuses on freeing the basic technological tools of biotech for general use, so that innovation at the application level is not restricted, particularly by the biggest multinationals in the biotech sector. It promotes a protected-commons licence for this purpose. It also operates a web portal, BioForge, similar to the SourceForge of the open-source software movement. While the BIOS initiative is not identical to the free-software idea, it appears to be the most developed initiative of its kind so far.11

However, such a knowledge commons approach may still depend on the conventional manufacturing sector for delivery of the products, for example the seeds, to the market. Also, it still involves making transgenic crops, a technology replete with implementation difficulties of both the political and the environmental kind.

One alternate possibility being discussed globally is to take advantage of the growing ability to sequence the entire genome of individual organisms at steadily declining expense. Incorporating such a step into traditional plant breeding for advantageous traits would allow breeding programmes to overcome some of the major obstacles to creating crop varieties that breed true, so that seeds can be re-used. It would allow the identification of combinations of genes that confer a particular trait, and thus reliable selection of varieties combining many advantageous traits; it would even allow the creation of carefully engineered crops in which the gene form providing the advantage comes not from some other species but from the host crop itself. Such a programme would be of little interest to the for-profit sector, since farmers can re-use seed. It would require little by way of a manufacturing intermediary, since experimentally generated seed can simply be handed out to be bred by farmers themselves. And it is a programme that would demand a large-scale cooperative global effort between breeders and scientists. Breeders would need to collect and maintain source varieties and carry out careful breeding. Scientists, on the other hand, would need to generate new ways of handling and interpreting the large mass of data that sequencing-assisted breeding would yield; essentially, cutting-edge science would result from the enterprise as well.
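As a rough illustration of how sequence data could drive such selection, consider the following minimal Python sketch. The variety names and marker identifiers are invented for illustration; a real programme would work with genome-wide variant calls, not a handful of flags.

```python
# Minimal sketch of sequencing-assisted selection in a breeding programme.
# Variety names and marker IDs are invented for illustration only.

# Each variety is mapped to the set of sequence markers it carries,
# as determined by sequencing individual plants.
variety_markers = {
    "landrace_A": {"drought_tol_1", "rust_res_2"},
    "landrace_B": {"drought_tol_1", "high_yield_3"},
    "cross_AB_7": {"drought_tol_1", "rust_res_2", "high_yield_3"},
}

# The combination of traits the breeder wants fixed in a true-breeding line.
target = {"drought_tol_1", "rust_res_2", "high_yield_3"}

def select_candidates(varieties, target_markers):
    """Return varieties carrying every marker in the target combination."""
    return [name for name, markers in varieties.items()
            if target_markers <= markers]

print(select_candidates(variety_markers, target))  # ['cross_AB_7']
```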

Open Source Drug Discovery:

A similar possibility exists in the area of drug discovery. In 1995, the TRIPS agreement introduced a uniform and higher level of patent protection across the globe. The promise that this would lead to higher levels of innovation remains a mirage. Globally, the number of New Chemical Entities (NCEs) has progressively gone down over the past decade. Further, of the NCEs approved for marketing, a very small fraction, less than 3%, constitute a significant advance over prevailing therapies. An overwhelming majority of new products address the needs of wealthy populations in the global North, while the disease burden lies largely in the global South. While the industry researches drugs for lifestyle conditions of the affluent (obesity, erectile dysfunction, baldness), conditions such as tuberculosis, kala azar and sleeping sickness have to make do with decades-old therapies. The last drug developed specifically for tuberculosis was introduced some three decades back.

Clearly, the IPR-based model for innovation is just not working. Strong IP protection encourages protectionism and harms the way science is done. Many more patents are taken out to stop others from working than to protect one’s own research. The model is premised on very high development costs, which are sought to be recovered through high monopoly pricing of products, thereby closing the door on research that targets conditions of the global poor, whose pockets are not deep enough to afford the high prices.

Can open-source drug research and development, using principles pioneered by the highly successful free software movement, help revive the industry? As the cost of genome sequencing drops and the speed of sequencing increases exponentially, it is possible to harness this power to solve the problems of health in radically different ways.

An open source model to promote innovation is not a new one; it is used extensively in the software sector today. It organises research around researchers across the globe, who draw from a pooled source of information to which they contribute, and into which they pledge to plough back the new developments that accrue. A decade back, such a model might have appeared a utopia. Not so today, when very powerful tools are available that can create virtual models, sequence human genetic codes, and identify potential targets for intervention in the genetic code. It is possible to process genomic information on a much larger scale: to create public databases of genomic information and protein structures, identify promising protein targets, and deliver candidate compounds for clinical trials. Such an effort would be based on a collaborative, transparent process of biomedical development, taking on health challenges that big pharmaceutical corporations have neglected in favour of what they perceive as “blockbuster drugs”. A number of interesting initiatives are currently under way, from tuberculosis to malaria.
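A toy sketch of the pooling idea follows: everyone can read the shared pool, and anything derived from it is pledged back. The class and names are purely illustrative and do not correspond to any real project’s systems.

```python
# Toy sketch of an open, pooled research commons: every contributor can
# read the shared pool, and anything derived from it must be deposited back.
# Class and field names are illustrative, not any real project's API.

class OpenResearchPool:
    def __init__(self):
        self.entries = []  # shared findings: targets, compounds, assay data

    def deposit(self, contributor, finding):
        """Plough a result back into the commons, attributed but unrestricted."""
        self.entries.append({"by": contributor, "finding": finding})

    def draw(self):
        """Anyone may read the whole pool; no licence fee, no patent gate."""
        return list(self.entries)

pool = OpenResearchPool()
pool.deposit("lab_delhi", "protein target X binds candidate compound C-17")
pool.deposit("lab_sao_paulo", "C-17 analogue shows activity in a TB assay")
for entry in pool.draw():
    print(entry["by"], "->", entry["finding"])
```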

Such a model can identify new candidates at a fraction of the cost that Big Pharma claims to spend. It has been argued that the major cost in drug development relates to the clinical trials needed to satisfy drug regulatory agencies. Today, Big Pharma outsources clinical trials to a dispersed set of Contract Research Organisations. A collaborative open source model could use the same route, with the difference that the entire endeavour, from the selection of promising candidates to marketing approval, would be organised and overseen by a publicly funded entity or group that pledges to place the research in the public domain, without insisting on patent monopolies. It is an idea whose time has come, with the potential to revolutionise the way research is done.

This is not to say that there are no difficulties with the approach. Rather, it is to suggest a possible example of how the framework of present-day science and technology can be recast and used in innovative ways for the cooperative generation of useful knowledge. Obviously, each of these areas would have its own specificities and would demand new structures to protect the knowledge commons. Some advances have taken place with the Free Software community’s creation of the GNU General Public License. However, many more questions will need to be addressed, not only to protect the knowledge commons being created but also to create these open and cooperative communities.

1 One of the important cases is KSR v. Teleflex. In this case, a number of hi-tech companies regarded as innovative sided against the easy granting of patents. The exception, of course, was the pharma companies, who were on the other side. The judgement raised the bar on patents. “We build and create by bringing to the tangible and palpable reality around us new works based on instinct, simple logic, ordinary inferences, extraordinary ideas, and sometimes even genius. These advances, once part of our shared knowledge, define a new threshold from which innovation starts once more. And as progress beginning from higher levels of achievement is expected in the normal course, the results of ordinary innovation are not the subject of exclusive rights under the patent laws. Were it otherwise patents might stifle, rather than promote, the progress of useful arts. See U. S. Const., Art. I, §8, cl. 8. These premises led to the bar on patents claiming obvious subject matter established in Hotchkiss and codified in §103. Application of the bar must not be confined within a test or formulation too constrained to serve its purpose.” KSR International Co. v. Teleflex Inc., US Supreme Court.

2 A good critique of the Bayh-Dole Act is Clifton Leaf, “The Law of Unintended Consequences”, Fortune, September 19, 2005.

3 Alessandro Nuvolari, Collective Invention during the British Industrial Revolution: The Case of the Cornish Pumping Engine, Eindhoven Centre for Innovation Studies, The Netherlands, Working Paper 01.04, May 2001.

4 Robert C. Allen, Collective Invention, Journal of Economic Behavior and Organization 4, 1983

5 James Bessen and Michael J. Meurer, Innovation at Risk, Princeton University Press, http://researchoninnovation.org/dopatentswork/

6 Michele Boldrin and David K. Levine, Against Intellectual Monopoly, http://www.dklevine.com/general/intellectual/againstnew.htm

7 Robert C. Allen, op. cit.

8 John Willinsky, 2005. “The unacknowledged convergence of open source, open access, and open science,” First Monday, volume 10, number 8, at http://www.firstmonday.org/issues/issue10_8/willinsky/

9 Greg Traxler, The Economic Impacts of Biotechnology-Based Technological Innovations, ESA Working Paper No. 04-08, Food and Agriculture Organization, May 2004.

10 W. Broothaerts et al., “Gene transfer to plants by diverse species of bacteria,” Nature 433: 583-4, February 10, 2005.

11 T Jayaraman, Note on Promotion of Open-Source Biology in India, Private Circulation.

Ensuring Last Mile Connectivity: Enforcement of Existing Licensing Terms is a Must

The Government of India recently announced a slew of new programs to help create a knowledge economy under the broad rubric of the ‘Digital India’ program. Possibly the most important aspect of this program is ensuring appropriate telecom network access to all parts of India (through projects such as the National Optical Fibre Network). Providing network access to citizens is quite clearly the first and most vital step towards ensuring equitable access to service delivery and to the many secondary projects that form part of the Digital India concept.

 

However, given existing models of service provision, particularly in the last mile, it is debatable whether the Government’s attempts to ensure access for all will be successful. Two of the most problematic aspects of last mile connectivity have been cartelization by private sector service providers (who often divide up a circle and lay physical infrastructure only in certain dedicated portions, thereby ensuring a captive consumer group and lowered costs) and the denial of services to consumers (on grounds such as unavailability of connections, etc.). These practices are clearly against the prevalent legal regime, and it is essential that the relevant regulatory authorities step in to take corrective action to ensure that the digital divide is reduced.

 

The existing licensing framework for telecom services treats the provision of fixed line and other services under the licence as a matter of right for the consumer/subscriber. Once a request has been made for a connection (which in itself cannot be refused), a service provider must provide the connection within a reasonable time (subject only to conditions such as credit checks).

 

According to Part V (pertaining to Operating Conditions) of the licences issued by the Department of Telecommunications, Government of India, to all providers of broadband and fixed line services:

 

“30.1. The Licensee shall register demand / request for telephone connection without any discrimination from any applicant, at any place in the licensed service area and provide the Service, unless otherwise directed by the Licensor. The Licensee shall not in any manner discriminate between subscribers and provide service on the same commercial principle and shall be required to maintain a transparent, open to inspection, waiting list…The Licensee shall launch the Service on commercial basis only after commencement of registration in the manner prescribed. Before commencement of Service in an area, the Licensee shall notify and publicize the address where any subscriber can register demand /request for telephone connection”; and,

 

“30.2. The Licensee shall widely publicize provision of service and shall not refuse registration of demand in the licensed service area. In case the provision of telephone connection to an applicant is not feasible for technical or other reasons beyond the control of Licensee, then the Licensee shall endeavour to make arrangement for providing connections in such cases within a reasonable time.”

 

From these provisions it is clear that all service providers are mandated to:

 

(a) register requests for provision of relevant telecom services from all prospective subscribers, within the relevant service area, without discrimination;

 

(b) maintain a publicly available register, containing a list of pending applications / requests for provision of services;

 

(c) notify to the public, the place and form in which a subscriber can register demand for a particular telecom service.

 

It is crucial to note that service providers may not refuse registration of demand for connections, including in situations where there is no immediate (physical/technical) availability of a connection. In such circumstances, the service provider must act to provide the requested service within a reasonable time (but, in any case, the request must be taken by the service provider).
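To make the obligation concrete, here is a minimal Python sketch of the kind of transparent demand register that clause 30.1 describes. The class and field names are our own illustrative assumptions, not anything prescribed by the licence.

```python
# Minimal sketch of the transparent demand register required by clause 30.1:
# every request is recorded without discrimination, and the waiting list
# is open to inspection. Field names are illustrative assumptions.

from datetime import datetime

class DemandRegister:
    def __init__(self):
        self.waiting_list = []

    def register_request(self, applicant, address, service):
        """Registration itself can never be refused (clause 30.1)."""
        entry = {
            "applicant": applicant,
            "address": address,
            "service": service,
            "registered_on": datetime.now().isoformat(),
            "status": "pending",  # to be served within a reasonable time
        }
        self.waiting_list.append(entry)
        return entry

    def public_view(self):
        """The waiting list must be open to inspection by anyone."""
        return list(self.waiting_list)

register = DemandRegister()
register.register_request("A. Kumar", "Ward 12, Ranchi", "fixed-line broadband")
print(register.public_view())
```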

 

Many telecom companies are acting in gross violation of the aforementioned terms of the telecom licenses issued by the Government of India. Market practice has evolved so as to enable service providers to act with complete impunity and avoid the license terms detailed above, thereby resulting in clear and direct harm to the consumers of telecom services in India.

 

Most private sector service providers do not provide prospective subscribers with any Customer Application Form (CAF), either online or in physical form (notably, the only telecom providers that actually make a CAF available to prospective consumers appear to be the publicly owned companies MTNL and BSNL). A CAF should be the basis for any customer to register demand for a service, and this is currently not possible through any private sector telecom service provider.

 

Instead, virtually every private sector service provider first requires a prospective consumer to ‘unofficially’ register demand (including by providing the physical address where the service is to be provided), and then conducts a feasibility test for provision of the service. Only if the feasibility test (usually conducted on a completely non-transparent basis) shows that the requisite service can be provided at the location requested is the CAF given to the consumer.

 

So if a consumer desires a service that is not immediately available at a particular location (even within a circle where the service provider operates), say on account of unavailability of physical connections, there is no way demand can be officially registered, and therefore no duty is cast on the service provider to provide the service within a ‘reasonable time’ as mandated by the licensing terms.

 

This is clearly a case of defrauding Indian citizens on a large scale, particularly those from weaker and more economically backward classes of society. Such a system permits service providers to pick and choose which parts of a telecom circle they will serve and to ignore the parts where they do not wish to provide a service. Typically, this means service providers can avoid laying last mile infrastructure in less well-off areas within their telecom circles (and then disregard all requests for services in such areas by claiming no availability of connections and relevant infrastructure).

 

This practice, which appears to be ubiquitous among private sector telecom service providers, is in utter disregard of the terms of the licences granted to them and clearly acts against the Government of India’s aim of ensuring maximum connectivity to the public.

 

One must also note the complete absence of any public register (containing the list of connections demanded / applications for provision of services). Such a register is not available, in either physical or online form, from any of the private sector service providers. In fact, it is surprising that despite most private sector telecom companies carrying out online billing and taking orders/requests for connections on their websites, none of them provides any publicly available register of requests for connection.

 

It is incumbent on TRAI to take the necessary action, including setting up a mechanism for consistent oversight of the process of registration and provision of telecom services to consumers, to ensure that the Government of India’s plans for maximum connectivity to all Indian citizens are realised at the earliest. TRAI must frame and implement guidelines mandating a uniform and transparent process for service providers to take and register requests for provision of services, including by prescribing an appropriate and standardised Customer Application Form format.

 

Without such urgent measures, the Digital India program will remain a mere pipe dream and the government’s plans to bridge the digital divide will be doomed to failure.

 

Disclaimer: The views expressed here are the author’s personal views, and do not necessarily represent the views of Newsclick

DoT fails to fill all the dashes

Data services and telephone services use different technologies. Charging a service on the basis of function and not technology creates major licensing anomalies.

 

The Department of Telecom (DoT) Report on Net Neutrality is better written than the shoddy consultation paper that the Telecom Regulatory Authority of India (TRAI) produced last March. But while the DoT report pays the usual lip-service to net neutrality, it refers the most contentious section — on the Zero Rating services such as Airtel Zero and Facebook-Reliance Internet.org — back to TRAI. The report concedes that the domestic voice revenues of telecom companies are increasing but accepts their claims that if Voice over Internet Protocol (VoIP) services are allowed to compete, there might be significant revenue loss in future. DoT’s treatment of VoIP is also inconsistent: it proposes that domestic VoIP calls be licensed, but not international calls.

 

The report does accept that net neutrality principles are basic to the Internet and need to be incorporated in the licences of Internet and telecom service providers. This is a step forward. It shifts the debate from whether India needs net neutrality to what constitutes a violation of it. The report also states that only those OTT services that compete directly with existing communication services, such as voice, may need to be brought under some form of regulation. TRAI’s absurd claim that all websites and Internet services are communication services, and therefore potentially subject to licensing, will mercifully receive the quiet burial it richly deserves.

 

When the debate erupted, Airtel and Facebook suddenly fell in love with net neutrality. They gushed about how they would never violate it. Since both companies offer variants of Zero Rating Services, their definition of net neutrality is obviously different from that of others who regard such services as a violation of net neutrality. The DoT report lists all the reasons why net neutrality should be a core principle, but it virtually negates it by suggesting that various Zero Rating plans should be submitted to TRAI for evaluation. Worse, the report talks about “guidelines” and “criteria” to evaluate net neutrality. Why did the committee not evaluate the Zero Rating plans against such guidelines and criteria instead of passing the buck to TRAI?

 

Simply put, net neutrality means that those who control or own the physical network, wired or wireless, over which the Internet runs shall not discriminate between different kinds of services or websites. It ensures that telcos cannot use their control of access to extract a monopoly rent or termination fee from websites or web service providers.

 

Net neutrality ensures that the wealthiest website and one with virtually no resources have equal access to users. Today, there are about 1 billion websites, of which about 175 million are active. The bulk are small organisations, movements, non-commercial news organisations, or new service providers. These websites, and the innovative services springing up on the Internet every day, would be badly hurt without net neutrality. The Internet would then be very much like cable TV, where select TV channels are carried by networks to viewers.

 

The DoT report underlines non-discriminatory access and states that traffic speeds cannot be manipulated, and that Internet monopolies, ISPs or TSPs should not act as gatekeepers. But it fails to take the next step of calling out Zero Rating services, which bundle a few websites to be provided free to subscribers, or exempt them from the data caps imposed on other traffic once the download limit is exceeded. Offering a few sites as a free bundle means blocking all others, either from the beginning or after the data cap is exceeded. If slowing down traffic from a particular source is a violation of net neutrality, why is blocking some sources not a violation?
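The blocking effect is easy to state precisely. The minimal sketch below, with placeholder site names, shows how a zero-rated bundle behaves once a subscriber’s data cap is exhausted: everything outside the bundle becomes unreachable.

```python
# Sketch of why a zero-rated bundle amounts to blocking: once the data
# cap is exhausted, only whitelisted sites remain reachable. Site names
# are placeholders, not any operator's actual bundle.

ZERO_RATED = {"bundled-social.example", "bundled-news.example"}

def is_reachable(host, data_used_mb, data_cap_mb):
    if data_used_mb < data_cap_mb:
        return True                # within cap: whole Internet reachable
    return host in ZERO_RATED      # cap exceeded: everything else is blocked

print(is_reachable("small-startup.example", 600, 500))   # False: blocked
print(is_reachable("bundled-social.example", 600, 500))  # True: still free
```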

 

The report’s second problem is the argument that VoIP services are essentially voice services even if offered over the Internet, and should thus have the same licensing terms as those for telecom services.

 

Historically, data services and telephone services have used different technologies, with completely different principles built into them. Traditional voice services create a physical circuit through telephone switches connecting the two sides. In data services, including VoIP, all voice communication is converted into small data packets that are sent over the Internet and re-assembled at the other end. This is identical to all other communication online, whether voice, text or image. Treating a service on the basis of function rather than technology creates major licensing anomalies. It also creates a long-term problem, as any web-conferencing software today has very similar capabilities. It is also not clear why domestic and international Skype calls need to be treated differently.
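For readers unfamiliar with the packet model, the following toy Python sketch shows the basic idea: voice is chopped into small sequence-numbered packets and reassembled at the far end. Real VoIP involves codecs, RTP timestamps and jitter buffers; none of that is modelled here.

```python
# Toy illustration of the packet model behind VoIP: audio is chopped into
# small sequence-numbered packets and reassembled at the far end, exactly
# like any other data traffic.

def packetize(audio_bytes, size=160):  # ~20 ms of 8 kHz 8-bit audio per packet
    return [(seq, audio_bytes[i:i + size])
            for seq, i in enumerate(range(0, len(audio_bytes), size))]

def reassemble(packets):
    # Packets may arrive out of order; sequence numbers restore the stream.
    return b"".join(chunk for _, chunk in sorted(packets))

audio = bytes(range(256)) * 3          # stand-in for captured voice samples
packets = packetize(audio)
packets.reverse()                      # simulate out-of-order arrival
assert reassemble(packets) == audio    # the far end rebuilds the voice stream
```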

 

Strangely, DoT does not ask why Indian telcos have not offered VoIP services in India, when their Unified Access Service Licences have included internet telephony since 2006. Should regulatory arbitrage be considered as between the VoIP services of OTT players and existing voice services? Or between the VoIP services of OTT players and the internet telephony services that the telecom companies have chosen not to offer? Instead of offering the telecom licensees protection in the guise of preventing regulatory arbitrage, should not TRAI and DoT have pulled up the telcos for not offering a service that has been in their licences for the last nine years?

 


The Tallinn Manual on the International Law Applicable to Cyber Warfare

Reasonable Analysis, Questionable Conclusions: Comments on the DoT Report on Net Neutrality

The report on net neutrality recently released by a committee of the Department of Telecommunications (DoT) has a rather mixed bag of recommendations. The report is fairly comprehensive in detailing the several contentious issues raised by the subject of net neutrality. However, the conclusions and recommendations in the report are occasionally inconsistent with the arguments and rationale presented in its text. A reasoned position taken in the analysis is often contradicted by the final recommendation.

The DoT report is not as unbalanced as the TRAI consultation paper released earlier this year, possibly because the DoT actually consulted constituencies other than the major telecom companies. It is encouraging that DoT has, to some extent, taken a pro-consumer/user position in making its recommendations. Rather than debating the need for net neutrality regulation, it has recognized that certain core principles must be followed in the public interest. The focus is now on how to contour the proposed regulations to ensure an appropriate balancing of all interests.

 

The report is unequivocal in stating that the ‘core principles’ of net neutrality must be adhered to. However, nowhere does it specifically state what these core principles are. (We can deduce, of course, that non-discriminatory behavior, encouraging competition, and openness form the basic principles of net neutrality.) While its general support for the principles of net neutrality is a positive step, much depends on how the regulations will be fleshed out. The guidelines provided by the report are minimalistic and merely indicative in nature.

 

Perhaps the biggest problem of the DoT report is its failure to make tough calls. On the regulation of OTTs (Over The Top applications and services), it has used questionable logic to provide the incumbent telcos with exactly what they want, while not making the sell-out as blatant as TRAI did. On the issue of zero-rating, the report has essentially thrown the ball back into TRAI’s court. Keeping in mind the common perception that TRAI is merely an industry body in the guise of a regulator, this basically means that consumer interest is unlikely to be protected.

 

Regulation of OTTs:

 

One of the key questions of the net neutrality debate, and quite possibly the issue that sparked off this series of consultations by TRAI and the DoT, is whether to license OTT service providers.

 

In dealing with this question, the report indulges in a rather adroit juggling act to justify giving the big telecom companies exactly what they want. It classifies all services, applications and content on the Internet into two categories: ‘Application OTTs’ and ‘Communication OTTs’. The latter is defined as any service that provides real-time communication (and so competes with existing telecom players who provide ‘similar’ communication services). The former includes all other types of content, services and applications: everything but real-time communication apps and services.

 

The DoT report essentially states that all OTTs (all services, content and applications on the Internet) can be licensed or regulated should the DoT so desire: “A view arising from legal considerations is that all OTTs fall under the Telegraph Act and require a license to be granted for service provision.” But as there is no possibility of arbitrage between OTT application providers and existing telcos, there is, in the report’s view, no need to regulate Application OTTs; there is no economic or other rationale for doing so.

 

Communication OTTs, however, would be brought under a licensing regime to prevent regulatory arbitrage vis-à-vis existing communication services (voice calls, for instance, which require a government licence). Based on the possibility of disruption of the incumbent telcos’ revenue models and the pricing arbitrage in VoIP OTT communications (compared to traditional voice telephony), the report recommends that only those Communication OTTs that provide domestic voice communications should be regulated.

 


 

The logic used by the report in reaching this conclusion is questionable. It does not even adequately examine whether the DoT can indeed regulate all OTTs under the existing provisions of the Telegraph Act.

 

First, the classification of applications, content and services into ‘Communication’ and ‘Application’ OTTs is vague and will prove incredibly difficult to implement in practice. Given increasing convergence on platforms, many of the services that fall within the report’s Application OTT category can be used for real-time communications between users. For instance, the report mentions Facebook and YouTube as Application OTTs, but both can be used for real-time person-to-person communication. We are likely to see more converged services in the future. Deciding what does or does not require a licence will be an impossible job, and will restrict innovation in, and growth of, the OTT market.

 

Second, one of the points of differentiation mentioned in the report is that Communication OTTs consume large amounts of bandwidth and may restrict the usage of other applications. This is not strictly true. BEREC, the EU body dealing with net neutrality, has found that traffic management of VoIP services is not required specifically to prevent congestion on a network.1 In any case, the logic does not hold up when you compare the bandwidth consumption and other requirements of video streaming (say, on YouTube) and other heavy or jitter-sensitive applications (such as online gaming).

 

Third, the only consideration for mandating the licensing of OTT domestic voice services appears to be the possible revenue losses caused to TSPs by pricing arbitrage. But the report mentions several times that the revenues of TSPs are growing at a more than sufficient rate (approximately 10% per annum), so the logic does not appear to hold.2 Either the telcos are doing badly and voice is being cannibalized, or they are not. The DoT’s logic in singling out this particular class of traffic (domestic voice) does not appear to be very solid. Given the low cost of domestic calling compared to international calling, it is far more common for an Indian to make VoIP calls abroad than locally. It is strange, then, that international VoIP, and even messaging, do not get the same treatment as domestic VoIP.

 

Zero Rating:

 

The report dedicates a fair bit of space to examining the issue of zero rating. It says all the right things, but then fails to take the logical step of applying its analysis to the specific cases at hand, notably Internet.org and Airtel Zero. Both cases are discussed in the report, but no concrete position is taken on such agreements.

 

The DoT begins by (rightly) cautioning against the dangers of ‘gatekeeping’ and states that all platforms should be open. It then leaves it to TRAI to make a case-by-case determination on all such arrangements, using certain broad tests laid down in the report. The discussion therefore shifts to whether the conditions laid out by the TSP/content provider for a specific zero-rated service are sufficiently open in the way they operate.

 

While the report states that exclusive agreements should not be permitted, it fails to recognize that drawing a distinction between exclusive and non-exclusive agreements may not be practically workable. For instance, there may be an agreement between a content provider and a telco for preferential access based on a very high payment that no other content provider can match. While not explicitly exclusive, the high cost could act as a practical deterrent to other content providers entering into similar deals. Similarly, technical conditions could be imposed so that a notionally ‘non-exclusive’ agreement is, in effect, exclusive. Given that all zero rating agreements tend towards the creation and maintenance of a walled garden, permitting a case-by-case analysis of the issue, with the exception of totally free public Internet provision, is pointless.

In fact, the only examples of ‘good’ zero rating agreements in the report are free Internet coupons / free wi-fi and government programs. Internet coupons and wi-fi do not constitute a zero-rating issue, since access is free to the entire Internet and not to specific websites or content. And clearly, zero rating must be permitted in the case of government programs to enable equitable and subsidized access for all. But using this rationale to allow commercial zero rating makes little sense, given the importance the report attaches to openness and promoting competition. The government usually has a monopoly on the services it provides, so similar competition considerations do not apply when the government zero rates its own content.3

 

Some Positives:

 

Despite the disappointing stances taken on two of the most critical issues, the report does have plenty of positives as well.

 

Its commitment to maintain core principles of net neutrality is unambiguous. The report specifically recommends that all (legal) content on the Internet should be equally accessible. This means no blocking or throttling or paid prioritization.

 

One of the most noteworthy aspects of the report is its focus on user/consumer rights. The report repeatedly returns to the need for a rights-based perspective on net neutrality and, specifically, the need to protect consumer privacy. A critical recommendation in this respect is the bar on deep packet inspection (DPI). It is a pity, though, that the report does not deal with how certain types of ‘legitimate’ traffic management would be conducted if DPI is barred. If all traffic management practices that use DPI are banned, this may, arguably, affect a TSP’s ability to ‘legitimately’ differentiate between different types of traffic.
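The distinction at issue can be sketched in a few lines of Python: header-based management inspects only addressing information, while DPI reads the payload itself. The packet representation here is a deliberately simplified illustration.

```python
# Sketch of the distinction at stake: 'legitimate' traffic management can
# look only at packet headers (ports, protocol), while DPI reads the
# payload itself. The packet representation is deliberately simplified.

def classify_by_header(packet):
    """Header-only classification: no inspection of user content."""
    if packet["protocol"] == "udp" and packet["dst_port"] in (5060, 5061):
        return "voip-signalling"   # SIP's standard ports, headers alone
    return "default"

def classify_by_dpi(packet):
    """DPI: reads the payload, which is what the report would bar."""
    if b"INVITE sip:" in packet["payload"]:
        return "voip-signalling"
    return "default"

pkt = {"protocol": "udp", "dst_port": 5060, "payload": b"INVITE sip:alice@host"}
print(classify_by_header(pkt))  # classified without touching the payload
print(classify_by_dpi(pkt))     # same verdict, but only by reading content
```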

 

The recommendation of an open, publicly accessible complaints/redressal procedure is significant. The DoT report must also be commended for mandating complete transparency by telcos/ISPs so that consumers can make informed choices. However, a necessary corollary is missing: ensuring that consumers are in a position to exercise that choice.

 

Cartelization between Internet service providers to ensure that consumers on the ground have no real choice between providers is all too common.

 

To sum up, the DoT Report certainly says many positive things and deals with the issue of net neutrality in a fairly balanced way — at least on the face of it. The problem is that the report’s final recommendations do not always reflect its own discussion and analysis.

 

1 See BEREC, ‘Differentiation Practices and related competition issues in the scope of Net Neutrality’, BoR (12) 31, 29 May 2012, p. 49; cf. ‘Net Neutrality: a regulatory perspective’, GSR 2012 discussion paper, International Telecommunication Union.

 

2 The Report mentions that ‘The Committee is of the view that the statement of TSPs that they are under financial stress due to rapidly falling voice revenues and insufficient growth in data revenues is not borne out by evaluation of financial data. There is a healthy growth revenue particularly from data – which has compensated to some extent the expected shortfall in voice revenue growth.’

 

3 Though even in the case of government services being zero rated, it is possible for the government to skew the market if it is allowed to enter into exclusive agreements with specific telcos.

Disclaimer: The views expressed here are the author’s personal views, and do not necessarily represent the views of Newsclick.

 


Digital Access to Scholarship at HARVARD

The Future of the Internet and How to Stop It

Cyberwar or Cyberpeace?

Cyber weapons are no longer the stuff of science fiction. They are all too real, and so is their threat to our interconnected world. This threat is bound to grow in the coming days with the Internet of Things, when all our devices will have intelligence and be connected to the internet. If we want to stop the Internet from being weaponised, we have to start talking about what nation states should or should not do. And that means an international compact on a par with what the world did with biological and chemical weapons, and what it failed to do with nuclear weapons.

 

These are the two interconnected questions we face: will we recognise the danger posed by weaponising cyberspace and confront it squarely? Or will we allow the continued building of a world in which a few countries, by virtue of their offensive power, reach a state of mutual deterrence, as we have done with nuclear weapons, always at the edge of spinning out of control? Non-proliferation is not disarmament, as we are finding out to our cost.

 

Amit Yoran, in his keynote speech to this year’s RSA Conference, a premier computer security event held last month, warned that while computer technology has advanced at near-lightning speed, cyber security is still in the dark ages. Sophisticated attacks cannot be prevented by our virus scanners and existing threat detection tools, because they are built for yesterday’s attacks. According to Yoran, the only way to beat such threats is to make visible what your computer and your network are doing: whom they are communicating with, what is being transmitted, and at what speeds. Seems simple, does it not? Except that it goes against what the US has been doing, and against the basic business model of the internet.
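The visibility Yoran argues for is not exotic. As a minimal illustration, the following Python sketch (using the third-party psutil library; on some systems this call needs elevated privileges) lists the remote endpoints a machine is currently talking to.

```python
# A small sketch of the visibility Yoran argues for: enumerate what the
# machine is actually talking to right now. Requires the third-party
# psutil package; may need elevated privileges on some systems.

import psutil

for conn in psutil.net_connections(kind="inet"):
    if conn.raddr:  # only connections with a remote endpoint
        print(f"pid={conn.pid} "
              f"local={conn.laddr.ip}:{conn.laddr.port} "
              f"remote={conn.raddr.ip}:{conn.raddr.port} "
              f"status={conn.status}")
```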

 

There are two reasons. The first is that the US has been systematically working to weaken security. As we now know from the Snowden revelations, it has weakened encryption standards and worked with various vendors to create backdoors in hardware and software, in the process creating gaping security holes in the networks and systems we all rely on. The second is the business model of the internet, which depends entirely on mining users’ data and selling it to advertisers. Unfortunately, the advertising business model is identical to the mass surveillance model: both need to siphon off users’ data. This is why the big internet companies are very much a part of the US surveillance state.

 

According to Yoran, the world has reached an inflection point. The barbarians are not at the gates; they are within the gates. And to drive home the point, he projected on the screen a North Korean figure in military fatigues. Presumably, the hacking of networks and computers is no longer the exclusive preserve of the US. Invading Roman armies conquering barbarians and creating empires is normal; it becomes an inflection point only when the barbarians enter Rome. Having systematically weakened security, the US and its giant internet companies have put the entire cyber resources of the world, including those of the US itself, at risk. The sophisticated entities that Yoran talks about are nation states, which have the ability and the resources to mount dangerous attacks that cannot be stopped by the cyber defences we have today.

 

The danger to vital infrastructure

 

A nation state today has the ability to target the computers that control the vital infrastructure of a country and cause catastrophic failures. Consider the case of a nuclear reactor. Its core is controlled by embedded computers that form part of the plant control system. If the control system is known, it is possible to “infect” it in a way that causes it to malfunction, even to the point of a core meltdown. After Fukushima, can anybody doubt that this would be an act of war, on a par with a physical attack on the reactor?

 

The power grid, the control of hazardous plants, telecommunication networks, air traffic control, even aircraft in flight, are all handled by computers and software. With the Internet of things, even the lowly washing machine will have embedded computers and be connected to the Internet. If countries play games with such software and computers, it opens a whole new arena of war, a war with untold consequences.

 

In the nuclear fuel enrichment plant at Natanz, Iran, the US and Israel deployed the Stuxnet virus to attack the Siemens controllers of the centrifuges, causing physical damage to the equipment. Stuxnet has shown that even when specific equipment or a specific country is targeted, such viruses can escape into the wild and threaten other equipment and other countries. The Stuxnet virus infected thousands of computers in Indonesia, India and elsewhere, and could easily have affected the Siemens controllers in the vital equipment of those countries. The attack on Iran – codenamed Olympic Games – targeted not only its centrifuges but also computers handling oil industry data.

 

There have been attacks, attributed by US sources to Iran, that wiped the data from two-thirds of Aramco’s computers in Saudi Arabia; there have also been similar attacks on the US banking system. The NSA considers such attacks to be Iran’s response – its own version of Olympic Games – to the attacks on Natanz and on its oil information infrastructure.

 

Stuxnet is the first known use of a computer virus to destroy or damage physical equipment. For those who follow such matters, this was a threshold no country had crossed before: the crossing of the Rubicon in cyber-attacks.

 

In the context of the use of Stuxnet against Iran, many western experts have argued that using a computer virus to cripple a nuclear fuel enrichment facility is better than bombing it outright. The issue here is not which course of action is better (and of course for whom), but whether this is an act of war. Is there a difference between bombing a facility and physically damaging it with a virus?

 

The US and its 5-Eyes partners have inserted 50,000 pieces of malware – or Computer Network Exploitations (CNEs) – in the networks of almost all countries in the world. These are “logic bombs”: on activation, they can bring down those networks. They have also weaponised the internet backbone.

 

What is cyberwar?

 

As the Iran example shows, we are already in the early stages of cyberwar. Bruce Schneier, the doyen of cyber security, has said, “We’re in the early years of a cyberwar arms race. It’s expensive, it’s destabilizing, and it threatens the very fabric of the Internet we use every day. Cyberwar treaties, as imperfect as they might be, are the only way to contain the threat.”

 

The key problem in de-weaponising the internet is the US conviction that it is far ahead of its rivals, and that any compact not to weaponise the internet would amount to its unilateral disarmament. As a result, the US has rejected Russian and Chinese proposals in the UN and other platforms for demilitarising the internet, or has watered them down until they are virtually useless. While some concessions have recently been made – as exemplified by the Report of the Group of Governmental Experts to the 68th Session of the General Assembly – very little has been agreed in terms of concrete action.

 

Cyberwar consists of attacks on computer networks or computer-controlled resources that cross a certain threshold. One approach would be to define cyberwar in terms of the physical damage a cyber attack causes in the real world: an attack, by one state actor against another, using software or code intended to prevent the functioning of an essential computer network (or to misuse it), and thereby to damage critical infrastructure, or to cause physical damage to property or people, including loss of life, or both. In this definition, cyberwar always involves a state actor; it is not the work of a group or an individual.

 

This approach has the merit of placing cyberwar on the same basis as an act of war as defined in international law. To constitute cyberwar, the actions must be on such a scale as to constitute a use of force (or the threat of a use of force) as required by Article 2(4) of the UN Charter. Other approaches seek also to include damage to information systems and to information itself as cyberwar, and these would require a widening of the current definition of war. There is, too, the problem of defining the threshold: at what point do we describe information loss on systems as an act of war? After all, information loss takes place for a variety of reasons, and only some of them are malicious.

 

We can define what constitutes war in cyberspace, and have an international agreement that holds cyberwar – or any attack that leads to physical damage or loss of life – to be henceforth illegal. It is important to note that current international law does not consider all acts of war to be illegal; it restricts the legal basis for war to a relatively narrow set of grounds, either a country’s self-defence or a resolution of the United Nations Security Council. Removing cyberwar as a “permissible form of war” in international law would be a big step forward.

 

The other option would be to ban cyber weapons themselves, and to pledge, through an international agreement, that such weapons will not be developed or used by any country. Banning cyber weapons would be akin to banning biological and chemical weapons. Given our rapid movement toward a more interconnected world, we need to go beyond outlawing cyberwar and ban cyber weapons as well. The development of such weapons is a threat to our future. As long as cyber weapons are not illegal, there will be an incentive to develop them as a kind of deterrent; moreover, there will be a perverse incentive to weaken the security of networks and devices, as the US has been doing.

 

Of course, offensive capabilities are much easier to build than defensive ones: for offence to succeed, you need to succeed only once; for defence, you need to succeed every time. Hence defence needs global collaboration. This is the point of difference with the Olympic Games: in cyber defence there are no individual winners or losers. You win only when everyone else wins too.
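
A back-of-the-envelope calculation, with purely illustrative numbers, shows how steep this asymmetry is. If a defence independently blocks each attack with probability $p$, the chance of withstanding $n$ attacks is

\[ P(\text{survive}) = p^{n}, \qquad \text{e.g. } 0.99^{1000} \approx e^{-10} \approx 4 \times 10^{-5}. \]

Even a defence that stops 99 per cent of attacks survives a thousand attempts with a probability of less than one in twenty thousand, while the attacker needs only one of those thousand attempts to get through.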

 

We need a change in mind-set: we have to engineer our devices and networks for defence. We have to build security into the DNA of all communications. This means changing the outlook of all the players, including the most dominant one, the US. We need to build strong defences, not weaken them, if we are to achieve cyberpeace instead of cyberwar.
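
What building security into the DNA of communications means in practice is encryption and authentication that are on by default and cannot be quietly switched off. As a minimal sketch, using only Python’s standard library (the host example.org is an illustrative stand-in), here is a TLS client that refuses to talk unless the server’s certificate verifies:

    import socket
    import ssl

    # create_default_context() already enables certificate and hostname
    # verification; we make the secure settings explicit and also refuse
    # protocol versions older than TLS 1.2.
    ctx = ssl.create_default_context()
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2
    ctx.check_hostname = True
    ctx.verify_mode = ssl.CERT_REQUIRED

    with socket.create_connection(("example.org", 443)) as raw:
        with ctx.wrap_socket(raw, server_hostname="example.org") as tls:
            print("negotiated:", tls.version())
            print("peer certificate subject:", tls.getpeercert()["subject"])

The defensive posture lies in the defaults: a connection that cannot be verified simply fails, rather than silently falling back to something weaker.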

 

(An earlier version of this article was published in Latin America in Movement 503, ALAI, April 2015. http://www.alainet.org/en/revistas/169787)

Disclaimer: The views expressed here are the author’s personal views, and do not necessarily represent the views of Newsclick.