The Horrors of Software Patents

US Patent Number 4701722 A is a perfect example of everything software patent opponents hate about software patents:

  • It implements mathematical functions that are pretty well known.
  • It covers a process of changing information, not changing any physical object one could touch.
  • It is owned by a company with a business model depending on licensing fees, not manufacturing things: what many people would call a “patent troll.”

US 4701722 A is one of the basic patents of Dolby B noise reduction, awarded to Ray Dolby back in 1987. It’s not a software patent; it is a circuit patent. The usual arguments about why software should not be subject to patenting make little sense. If a circuit can be patented, then it should be possible to patent a program. The beginning of the study of computation as a mathematical subject was the discovery that a simple general-purpose engine reading instructions from some sort of store can emulate any circuit. The early computers were “programmed” by physically moving wires around. The photo below shows ENIAC being programmed. A “program” then was obviously a circuit design. Technology advanced so that these circuits could be stored in memory and on disk drives. But that did not change the basic process – writing software is conceptually the same as wiring up an electrical circuit.

[Photo: ENIAC being programmed by rewiring]
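To make that point concrete, here is a toy sketch in Python (my choice of example, nothing taken from the patent): a half-adder, the kind of building block one could just as easily wire up from physical XOR and AND gates. Whether the logic lives in copper or in code, the design is the same.

    # A half-adder "circuit" expressed as a program. The gate-level design is
    # identical whether you solder it or type it.
    def half_adder(a, b):
        """Emulate a half-adder: returns (sum_bit, carry_bit) for input bits a, b."""
        sum_bit = a ^ b   # XOR gate
        carry = a & b     # AND gate
        return sum_bit, carry

    for a in (0, 1):
        for b in (0, 1):
            print(a, b, half_adder(a, b))  # prints the circuit's truth table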

And around the same time, Nyquist and Shannon showed that analog and digital signals can be equivalent to each other. Ray Dolby knew, and noted in this patent, that the analog signals transformed by his invention could be converted into digital information and back – as convenient. If there is an intrinsic flaw in the concept of software patents themselves, something fundamentally distinct about software that makes software inventions unpatentable in principle, then the critics of software patents have failed to explain what that could be. See also Martin Goetz, whom I found via the anti-software-patent arguments (1 and 2) of Tim B. Lee (who is not Tim Berners-Lee).
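For readers who want the Nyquist/Shannon point made concrete, here is a small sketch in Python with NumPy. The 3 Hz tone and 10 Hz sample rate are my own illustrative choices, not anything from the Dolby patent; the idea is just that a bandlimited signal sampled above twice its bandwidth can be rebuilt from its samples.

    import numpy as np

    f_signal = 3.0                   # a single 3 Hz tone (bandwidth 3 Hz)
    f_sample = 10.0                  # sample rate, comfortably above 2 * f_signal
    T = 1.0 / f_sample

    n = np.arange(200)               # 200 samples = 20 seconds of "analog" signal
    samples = np.sin(2 * np.pi * f_signal * n * T)   # the digital representation

    def reconstruct(t):
        """Whittaker-Shannon interpolation: rebuild the analog value at time t."""
        return np.sum(samples * np.sinc((t - n * T) / T))

    t = 10.123                       # an instant that falls between samples
    print(reconstruct(t))                     # ~0.733, up to a tiny truncation error
    print(np.sin(2 * np.pi * f_signal * t))   # ~0.733, the original signal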

 

The technology disconnect

“What it did was reinforce a point about the sociology of management: From cars to space shuttles, from offshore oil wells to nuclear reactors, the people who make the decisions are often out of step with the mechanical details.” – Matthew Wald, New York Times, 2014/06/09.

We sell pretty complex software that synchronizes the clocks of computers, down to nanoseconds in many networks. Our customers are often firms where IT staff, let alone higher management, are consumed by pressing issues, none of which have to do with the nuts and bolts of how clock algorithms and time protocols interact with network equipment, operating systems, and application servers. Our easiest sales are to companies where some person is able to get perspective on the technical issue and relate it to business priorities. Otherwise, things get lost between IT staff running hard to keep up with day-to-day and big-picture matters, and higher management who are “out of step with the mechanical details.” This is why the most successful big technology companies have built up corps of “technology fellows,” in practice if not formally. These are experts who can take a sufficiently deep dive into technology to appreciate the problem, who also understand business priorities, and who have management confidence. If someone with those skills had been involved in GM’s ignition-switch discussions, the company could have made far better decisions. As more and more firms become dependent on critical technology infrastructure, they will need to acquire people with such skills, in-house or not.

also on LinkedIn

The story we were told at a bank that I cannot name is that all their time synchronization was the domain of an engineer tucked away for 30 years in the home office, a fellow known as Professor Time. The system he had built was remarkable in its complexity and fragility. Nobody seemed to understand how it worked. Accuracy was highly variable. There was no management, no documentation. We never got to meet the Professor, but I always thought of him as something like this.

[Image: Mattheus van Hellemont, The Alchemist]

In the past, the enterprise time synchronization market was composed of vendors selling boxes to IT teams that were then responsible for architecting a solution from GPS clocks and free software that was not at all designed for the job. Time synchronization is a specialized field that is a lot more complex than it may appear. And quality has become more of an issue as trading speeds and volumes have increased so much. Some firms have responded to the change by scaling up their synchronization staffing. Some have relied on luck. Some hope that heavily marketed new technology in PTP-aware routers will solve their problems.
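As a hint of why the field is more complex than it may appear, here is a rough sketch in Python, with made-up numbers, of the two-way time transfer calculation that both NTP and PTP rest on. The offset estimate is only as good as the assumption that network delay is symmetric, which is exactly where network equipment, operating systems, and load get in the way.

    def offset_and_delay(t1, t2, t3, t4):
        """Classic two-way exchange: t1 = client send, t2 = server receive,
        t3 = server send, t4 = client receive. Returns (estimated clock offset,
        round-trip delay), assuming the forward and return paths are symmetric."""
        offset = ((t2 - t1) + (t3 - t4)) / 2.0
        delay = (t4 - t1) - (t3 - t2)
        return offset, delay

    # Illustration: the server is really 5 microseconds ahead, but the outbound
    # path takes 30 us and the return path only 10 us. The asymmetry silently
    # biases the estimate.
    t1 = 0.0
    t2 = t1 + 30e-6 + 5e-6   # outbound delay plus the true offset
    t3 = t2 + 2e-6           # server processing time
    t4 = t3 + 10e-6 - 5e-6   # return delay minus the true offset
    print(offset_and_delay(t1, t2, t3, t4))  # offset comes out ~15e-6, not the true 5e-6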

Over the last couple of years, we have been building out technology to provide financial trading firms and other organizations that need precise timing with an alternative to the boxes-plus-custom-in-house approach. We have built client and server software that is fault-tolerant and cross-checking, with sophisticated alarms, easy configuration, and graphical web management and data-analysis tools. And we’ve put that software in powerful server computers that can serve time directly at 10 Gbps (and better) and that have all the standard enterprise features (like lots of storage for archival records and dual power supplies). All the parts are designed to work with each other and to connect in a flexible, resilient time distribution network. Our new partnership with Spectracom brings their extensive hardware expertise, distribution, and support infrastructure into the mix.