Intellectual Property Litigation and Reverse Obsolescence: Out with the New and In with the Old
A changing landscape can cause newer technologies to be rendered obsolete by the older approaches they were supposed to replace. This phenomenon is particularly relevant given the growth in intellectual property litigation related to digital media technologies.
October 2015 | by David Delaney
On May 25, 1926, the French city of Lyon inaugurated commercial air service to Paris – a symbolically important link that demonstrated the ease and speed with which the airplane could provide access to the capital. In the 1980s, however, the French high-speed train system began operation on the Paris–Lyon route, cutting city-center-to-city-center travel time to well below anything air travel could offer. By 2004, over 85 percent of travel between the two cities was by train. Today, door-to-door travel by train is faster than travel by plane for a long and quickly growing list of city pairs worldwide – something that would have been unthinkable when mass-market air travel was coming of age in the 1950s.
Reverse obsolescence – when a changing landscape makes older technologies more compelling than the newer ones that were supposed to replace them – is a very rare exception to the general rule that dominant technologies, once displaced, almost never regain the lost ground. Black-and-white televisions, vinyl records, film cameras, floppy disk drives, and communication by handwritten letter will not be making a mass-market comeback. And today's trains are of course very different from those of the 1920s. But the same could be said for airplanes.
The dynamics of when, how, and why a “legacy” technology can capture market share from its projected replacement involve a complex interaction of rapidly changing market and technology factors. In today’s world, these interactions are particularly relevant given the exponential growth in bandwidth and storage capacity on the one hand, and the rise in technology-related intellectual property litigation on the other.
Largely because of the potential for huge settlements, intellectual property litigation is a booming industry. Patent infringement cases have increased at an average of almost 5 percent per year for the past two decades, with about 2,500 new cases now filed each year. The potential settlement and judgment amounts are enormous. For example, in March 2006, BlackBerry maker Research In Motion agreed to pay over $600 million to NTP, a small Virginia patent holding company, to license a set of patents related to wireless e-mail services. In 2009 and 2010, Kodak obtained settlements totaling nearly a billion dollars from Samsung and LG in litigation over two patents. And on June 9, 2011, the Supreme Court upheld a nearly $300 million verdict against Microsoft in a patent infringement lawsuit brought by Toronto-based content solutions provider i4i.
Many active patents related to communications technology date from the 1990s or early 2000s – an era when the communications infrastructure was dramatically less mature and capable than it is today. In cellular networks, residential wired Internet, WiFi, long-haul fiber optic cables, and a long list of other environments, today’s data rates have increased by several orders of magnitude over those of the 1990s. We are rapidly approaching the point where consumers have essentially ubiquitous access to multi-megabit-per-second delivery.
So how might this help with intellectual property issues? Patents have finite lifetimes. Most currently active patents in the United States will expire 20 years after they were filed, so methods first developed more than 20 years ago are typically no longer covered by active patents. Of course, sidestepping intellectual property claims by reviving solutions from the 1980s can mean forgoing two decades of technology advances, and in some cases the cure would be far worse than the disease.
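The 20-year arithmetic above can be made concrete with a short sketch. This is an illustration with simplified assumptions, not legal advice: the `term_has_run` helper and the example years are hypothetical, and real expiry also depends on patent term adjustments, maintenance-fee lapses, and different rules for pre-June-1995 filings, all ignored here.

```python
# Illustrative sketch of the nominal US patent term calculation.
# Simplifying assumption: term is exactly 20 years from the filing year.

PATENT_TERM_YEARS = 20  # standard term for US utility patents filed after June 1995

def term_has_run(filing_year: int, current_year: int) -> bool:
    """True if the nominal 20-year term has elapsed by current_year."""
    return current_year - filing_year >= PATENT_TERM_YEARS

# A method patented in the late 1980s is out of term today;
# one filed in the late 1990s may still be enforceable.
print(term_has_run(1988, 2015))  # True
print(term_has_run(1998, 2015))  # False
```

This is why, as the article argues, techniques from the 1980s are attractive from a litigation-exposure standpoint even when they are technically unglamorous.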
But the tradeoff is different for the large and commercially critical set of technologies designed to address limited bandwidth. Take the example of video compression, which aims to deliver the highest possible video quality under restricted bandwidth. Over the past several decades there has been an intensive and sustained effort to improve video compression efficiency, resulting in an alphabet soup of standards.
Much of this effort was motivated by the then-important goal of maximizing video quality over bandwidths that are low by current standards. However, very little video today is actually delivered over links this slow. Today’s most “advanced” standards, because of their sheer number of bells and whistles, often create wider exposure to intellectual property claims while consuming precious battery power implementing what are now superfluous techniques.
For example, the video coding standard H.264 was designed to deliver moderate-quality video at very low rates in the range of hundreds of kilobits per second. It is certainly up to the task of encoding high-quality video at a much higher 3 megabits per second, but the capability it was really designed to excel at – high efficiency to compensate for low bandwidth – goes largely unused at such rates. In fact, at these higher (and now common) rates, a well-designed approach based on 1980s methods can perform just as well as H.264 in terms of compression efficiency, while minimizing modern intellectual property challenges. This has been recognized by the MPEG standards group, which, motivated largely by the intellectual property concerns surrounding the newer standards, decided in January 2011 to develop a “new” compression standard based on legacy methods.
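To put the bitrate gap in perspective, a back-of-envelope calculation helps. The figures below are illustrative assumptions drawn from the rates mentioned above (hundreds of kilobits per second versus 3 megabits per second), not measurements, and the `hourly_megabytes` helper is a hypothetical name for this sketch.

```python
# Back-of-envelope comparison: data volume of one hour of video at the
# low rates H.264 was tuned for versus a rate that is routine today.

def hourly_megabytes(bitrate_kbps: float) -> float:
    """Data volume of one hour of video at the given bitrate, in megabytes."""
    bits = bitrate_kbps * 1000 * 3600  # kilobits/s -> bits over one hour
    return bits / 8 / 1e6              # bits -> bytes -> megabytes

low = hourly_megabytes(300)    # ~ the low end H.264 was designed around
high = hourly_megabytes(3000)  # ~ 3 Mbps, now a common delivery rate

print(f"1 hour at 300 kbps: {low:.0f} MB")   # 135 MB
print(f"1 hour at 3 Mbps:   {high:.0f} MB")  # 1350 MB
```

At a tenfold-higher delivery rate, the hard bandwidth ceiling that motivated H.264's most intricate (and most patent-encumbered) tools simply isn't binding anymore.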
Another example can be found in mobile phone voice transmission technologies. Today’s most advanced 3G and 4G network standards support data exchange at multiple megabits per second – several orders of magnitude higher than the 2G standards of the 1990s. Yet the quality of mobile voice calls has barely improved. The voice codecs – that is, the chip-based methods that turn your voice into strings of ones and zeros – were designed to operate under the bandwidth constraints of the original 2G digital cellular networks. They achieve this goal thanks to a complex set of computations that squeezes voice into approximately 10 to 20 kilobits per second.
With today’s wireless technologies and improved throughput, cell providers could devote a little more bandwidth to voice – say another 20 or 30 kilobits per second – improving quality immensely with almost no overall capacity sacrifice. There are plenty of legacy alternative voice codecs that can easily do the job. Ironically, these codecs have gone unused not because they don’t work, but because they were viewed as obsolete next to the now-ubiquitous hyper-efficient voice codecs that were designed for a now-obsolete set of bandwidth constraints.
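The "almost no overall capacity sacrifice" claim can be sketched in rough numbers. The figures here are illustrative assumptions, not from the article: a 12.2 kbps rate stands in for a typical 2G-era codec, and 5 Mbps stands in for per-user throughput on a modern network; the `share_of_link` helper is hypothetical.

```python
# Hedged estimate: what fraction of a modern wireless link does one voice
# stream occupy at a legacy codec rate versus a more generous one?

def share_of_link(voice_kbps: float, link_mbps: float) -> float:
    """Fraction of link capacity one voice stream occupies."""
    return (voice_kbps * 1000) / (link_mbps * 1e6)

LINK_MBPS = 5.0  # assumed per-user throughput on a 3G/4G network

lean = share_of_link(12.2, LINK_MBPS)         # 2G-era codec rate
generous = share_of_link(12.2 + 30, LINK_MBPS)  # add ~30 kbps for quality

print(f"legacy codec:  {lean:.2%} of the link")     # well under 1%
print(f"with +30 kbps: {generous:.2%} of the link")  # still under 1%
```

Even after more than tripling the voice bitrate, a call consumes under one percent of the assumed link – which is the arithmetic behind the argument that the hyper-efficient codecs are solving a constraint that no longer exists.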
In combination, the growth in bandwidth and in technology intellectual property litigation creates a compelling case to reexamine the many “modern” solutions that were designed with bandwidth efficiency as the primary goal. More broadly, other resources such as storage and processing capability have seen similarly dramatic advances over the last two decades, and merit the same reexamination. A clean-slate look at how we deliver and manage digital media offers the opportunity to create a new set of solutions that are better matched to today’s computing and communications systems while also reducing exposure to intellectual property claims.