
Monday, 10 May 2010

Refusing Proprietary Technology

The IPKat is (still) no expert when it comes to the software industry but, like many IP enthusiasts, he tries to keep up with the major issues and thought-trends as and when they crop up. One of the issues he has pondered from time to time is the sometimes hostile, sometimes constructive tension between the business models that drive proprietary software and open source, as well as the continuing dialogue between the supporters of each.


Bearing this in mind, the Kat is running a series of four short pieces by his friend Keith Braithwaite, which he hopes will do two things. One is to seek to pinpoint the most significant issues as viewed from the industry side rather than from that of the academics and lawyers in whose company this Kat is most comfortable. The other is to place these issues within some sort of temporal continuum where that is possible. Here's the second piece in the series:

Refusing Proprietary Technology

Prologue

The first post in this four-part series looked at the effect on users of removing a patent-encumbered technology from an existing product, rather than licensing it: while with much ingenuity there is usually a way of substituting a non-encumbered implementation for each patented technology, the resulting impact on users is usually not negligible and sometimes unexpected.

We also began to look at the kinds of technology that might be missing from some solutions because of IP issues. This time we will look at the potential impact on users of active refusal to use patented technology in new products. Avoiding patented new or improved technologies, either to avoid a licence fee or on ideological grounds, can impose a hidden cost on users.

Journaling file systems

Since the earliest days of computing, “secondary storage” media such as magnetic drums, tapes and disks have been used to make data persistent. More recent developments have added memory sticks and cards and writeable optical media such as CDs and DVDs to the repertoire of storage technologies. All these media require some way of organising their contents, and that organising principle is called the file system. File systems have evolved to include something like a map showing where the free space is and where the files are, along with control information such as timestamps, ownership and permissions. This arrangement can be fragile: after any system crash, it can take many minutes to recover potentially corrupted data, sometimes without success, leaving the user with lost or unreadable files.

One solution to this problem is the journaling file system. There are many variants, but the basic idea is to record every change to a file in two widely separated places on the disk or card: the actual content of the file in one place, and a record of the change somewhere else, in a “journal”. In the past decade, journaling file systems have come into widespread use, first in the business market and now also in the consumer market, where the most common operating systems have a journaling file system.
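The write-ahead idea can be sketched in a few lines of Python. This is only a toy illustration, not how a real file system works (real journals operate on raw block devices and record metadata in compact binary form); the file names `fs.journal` and `fs.data` and the JSON-lines journal format are invented for the example:

```python
import json
import os

# Hypothetical stand-ins for the two "widely separated places" on disk.
JOURNAL = "fs.journal"
DATA = "fs.data"

def journaled_write(offset, payload):
    """Record the intended change durably in the journal, then apply it."""
    entry = {"offset": offset, "payload": payload}
    with open(JOURNAL, "a") as j:
        j.write(json.dumps(entry) + "\n")
        j.flush()
        os.fsync(j.fileno())  # the journal entry must hit the disk first
    mode = "r+b" if os.path.exists(DATA) else "w+b"
    with open(DATA, mode) as d:
        d.seek(offset)
        d.write(payload.encode())

def replay_journal():
    """After a crash, re-apply every journalled change instead of
    scanning the whole disk looking for corruption."""
    if not os.path.exists(JOURNAL):
        return 0
    count = 0
    mode = "r+b" if os.path.exists(DATA) else "w+b"
    with open(JOURNAL) as j, open(DATA, mode) as d:
        for line in j:
            entry = json.loads(line)
            d.seek(entry["offset"])
            d.write(entry["payload"].encode())
            count += 1
    return count
```

Because the journal is written and synced before the data area is touched, a crash at any point leaves enough information to replay the change, which is why recovery takes seconds rather than a full-disk scan.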

With such a file system, backing up (which too few users do) also becomes easier. With a “journal-based backup”, the system need only read the journal to discover what has changed, rather than scanning the entire file system to work it out. This is much faster.
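The speed-up is easy to see in a sketch: the backup tool reads only the tail of the journal rather than comparing every file on disk against the previous backup. The one-JSON-object-per-line journal format here is hypothetical:

```python
import json

def changed_since(journal_path, last_entry_seen):
    """Return the paths changed since the last backup by reading journal
    entries (one JSON object per line, a made-up format), instead of
    walking the whole file tree."""
    changed = set()
    with open(journal_path) as j:
        for n, line in enumerate(j):
            if n >= last_entry_seen:
                changed.add(json.loads(line)["path"])
    return sorted(changed)
```

The backup tool only needs to remember how far into the journal it read last time; everything before that point is already safely backed up.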

If our OS provider were to decide to remove journaling because it is based on proprietary technology, we would in effect be forced to downgrade to a ten-year-old OS in which every hardware problem, software glitch or power failure brings back the old nightmare of lost files and long boot times spent trying to recover from file system corruption.
The loss of this feature would also seriously degrade service in the server room, where downtime to recover from file system corruption would increase and daily maintenance operations, such as backup, would take longer and become more expensive.

Storage virtualization

As the amount of disk space attached to computer systems grows, it becomes increasingly hard to manage.

Various ingenious solutions have been devised over the years, many of which are proprietary and subject to patent protection. The term “storage virtualization” comes from a relatively recent attempt to identify the common features and differences of these approaches. It allows much improved utilization and capacity forecasting: storage can be held in reserve and granted to users as needed. The virtual storage pool can easily be expanded by adding more disk drives managed in a central location, and old drives can be removed without interrupting availability of the data on them. All these features make life easier for users: fewer rude emails from your IT centre asking you to archive your old messages, no data lost through disk crashes, easier access to really large data sets.
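The pooling idea can be illustrated with a minimal sketch. The class and method names here are invented for the example; real systems such as LVM do this at the block-device level, not in application code:

```python
class StoragePool:
    """Toy storage-virtualization sketch: several backing devices
    presented as one expandable pool, with space granted on demand."""

    def __init__(self):
        self.devices = {}  # device name -> capacity in GB
        self.grants = {}   # user -> GB allocated so far

    def add_device(self, name, capacity_gb):
        # Expanding the pool is just registering another device;
        # existing grants are untouched, so there is no downtime.
        self.devices[name] = capacity_gb

    def free_space(self):
        return sum(self.devices.values()) - sum(self.grants.values())

    def grant(self, user, gb):
        if gb > self.free_space():
            raise RuntimeError("pool exhausted: add another device")
        self.grants[user] = self.grants.get(user, 0) + gb
```

The key property is that users see one pool of space: the administrator can add or retire devices behind the scenes without reconfiguring every user's allocation.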

All those benefits would be lost if the OS vendors decided not to include logical volume management (LVM) technologies because they are proprietary. The main result would be that the IT departments of every organization using that OS would be forced to adopt more restrictive policies regarding, for example, user quotas, because configuring and provisioning new hard disk space would become much more difficult.

Fixation on IP issues distorts engineering judgment

In an ideal world, engineers developing software solutions would be free to choose the technically “best” option to resolve every issue. In reality commercial and legal considerations often supervene.

Technology encumbered by intellectual property rights might be rejected because the licence fee would increase the price of the finished product by too much. This is a prime consideration for free open source software (FOSS) developed by teams of volunteers and given away to anyone who wants it: volunteer teams typically have no budget to buy in commercial technology. Not all open source developers are in this category, however; some high-profile open source products are well funded by commercial backers. Nevertheless, software patents are abhorred by most of the open source community with a quasi-religious fervour.

Although FOSS solutions are zero-cost, the choice to incorporate open source software in a solution incurs the risk that the technology in question already infringes someone’s intellectual property rights, even inadvertently. By using that technology, an organization could become party to the infringement. Even if open source technology is provided free of charge, it is still very much a case of “caveat emptor”.

Since 2005 the Open Invention Network (OIN) has been trying to reconcile these opposing forces by buying software patents on the open market and making them available royalty-free to licensees who promise in return not to bring patent lawsuits against the Linux environment. This strategy to defend Linux developers against the risk of being taken to court, although backed by big names such as IBM, Novell, Oracle, Google, Red Hat and NEC, has not yet succeeded entirely in shielding the Linux environment against patent infringement suits. A study carried out by Dan Ravicher for Open Source Risk Management in 2004 identified 283 patents that had been granted but not yet validated in court, any of which could potentially be used to support a patent claim against the Linux kernel. Many, but not all, of these patents have since been acquired by OIN or are held by one of its licensees.

All the intellectual activity in a development project ought ideally to be expended on solving the problem at hand, not on guarding against the risk of intellectual property infringement. Huge investments of money, time and energy have been diverted from the creation of software features that benefit users and towards these “displacement activities”. If users are to derive maximum benefit from the creative efforts of software architects, designers and developers, it may sometimes be more cost-effective to acquire the rights to proprietary software technology than to expend the energy to reinvent or “code around” it.

6 comments:

Gentoo said...

"Although FOSS solutions are zero-cost the choice to incorporate OS software in a solution incurs the risk that the technology in question has already infringed someone’s intellectual property rights, even inadvertently. By using that technology, an organization could become party to the infringement. Even if open source technology is provided free of charge, it is still very much a case of “caveat emptor”."

Yes, but the point is that this "caveat emptor", unlike "market overt", where at least you can rely on daylight for title to pass, has the potential to catch everybody. One of the growth industries in the shark-infested custard that is software patents is the so-called submarine patent, which is only revealed when there's someone to sue (cf. the $4 million table stake to defend). It's never about product or market protection.

And, of course, as the vast majority of the readers of this blog will know, given the "triple damages for willful infringement" one is wiser to code in ignorance rather than honesty.

This is most recently evidenced (IMHO) by Apple's plans to present a patent pool to assert there is no such thing as a royalty free video codec. (No evidence presented, mind you, are they learning from others?)

The BBC developed Dirac and Schrodinger which they assert were developed specifically to avoid the existing IP minefield, so perhaps Apple will either sue them (and they did so rush to put iPlayer on the iPhone when it had a market share of 0.001% + BBC Execs) or, joy, Dirac/Schrodinger will become the web standards. (My Dirac enabled video playback software is just waiting for some content...)

While at one level the fact that that fantastic Linux exponent, Nokia, is slapping some back causes my heart to skip a beat, at another level will someone tell me what's in it for anybody not in the legal profession?

PS Hopefully this post will help refute the canard often put about that FOSSers only complain about one proprietary organisation.

PPS It's just an unnecessary flame to assert "quasi-religious"; it adds nothing to the debate.

Gentoo said...

Sorry for the second post, however...

"A study carried out by Dan Ravicher for Open Source Risk Management in 2004 identified 283 patents that had been granted but not yet validated in court any of which could potentially be used to support a patent claim against the Linux kernel."

...is just lazy.

Now consider reading these...

http://www.theregister.co.uk/2004/11/18/ballmer_linux_lawsuits/

http://www.groklaw.net/article.php?story=20070517083516872#c572488

Gentoo said...

And thirdly...

http://www.enterpriseirregulars.com/17600/the-problem-with-software-patents/

Anonymous said...

Unwisely I just wasted 10 minutes reading the stories behind those links.

Gentoo said...

While I hope "Anonymous" doesn't waste any more precious seconds reading links, others might care to add this one to their reading list....

http://www.networkworld.com/community/node/60912?source=NWWNLE_nlt_microsoft_2010-05-11

...and for this idiot, identify the non-obvious inventive step.

For example, Knoppix has been a live CD since 2000 (when 16MB USB sticks were rare and expensive) and many Linux distros (to name one set of applications) have been bootable from a USB stick for years.

The idea seems to have been proposed by IBM

http://www.ibm.com/developerworks/linux/library/l-fireboot.html

and then there is

http://www.research.ibm.com/WearableComputing/SoulPad/soulpad.html

keithb said...

So, I read the Register article.

The first thing I notice is that it's from 2004 and I don't recall hearing of any government (state, city or otherwise) being sued by anyone as a result of using Linux since then. Inference: Ballmer was overstating the case. Second inference: people get way too excited about what he says about IP issues.

Part of my goal for these articles (however imperfectly met) was to try and get away from that excitement about who is or is not right/wrong/telling the truth/talking smack about the likelihood of litigation over IP and how ethically righteous/reprehensible that may or may not be. I want to think about the impact on users of having, or not having, or having and then losing (by choice or otherwise) proprietary technology in products.

I don't have a personal stake in either camp: as a user I use and enjoy FOSS systems, I use and enjoy paid-for closed-source systems. As a programmer I very much prefer to have access to the source of the tools and libraries I use, for practical reasons, but I don't feel that my values have been outraged if I don't.

I also looked at the "portable applications" article. That seems like a very, very weird claim to me, and one with clear prior art. It's a fine example of the sort of thing that brings technology patents into disrepute. I'm not sure I understand what it has to do with my article. My stance is that the law is what it is, and is being interpreted how it is, and technologists are responding how they do. I want to understand, however much sense that does or does not make, what the implications are for users.
