The Problem of Obviousness

By Neal Solomon
August 3, 2017

EDITORIAL NOTE: This is part 4 of a multi-part series by author Neal Solomon. If you have missed previous installments please see part 1: The Myth of Patent Quality; part 2: Patent Quality Relies on a Fictitious Narrative; part 3: The Problem of Reducing Patentability to Novelty

Like the 1793 Act and the 1836 Act, the 1952 Act required an inventor to show an invention’s originality as set against prior art. Whereas before 1952, this originality was proved by showing inventiveness as novelty, the 1952 Act introduced in addition to the concept of novelty the notion of obviousness. “Obviousness” (or “non-obviousness”) arose to challenge subjective ideas of inventiveness by seeking to link inventiveness only to prior art, rather than, say, to a flash of genius. In the case of novelty (35 U.S.C. § 102), a patent must be shown to be original as against a single piece of prior art, with the prior art “anticipating” the invention. In the case of obviousness (35 U.S.C. § 103), an invention must be shown to be original against a combination of prior art references. In either case, anticipation or obviousness, an analysis of prior art becomes crucial. Most interpretations of patent quality mistakenly conflate quality with patent validity, and thus most quality efforts depend on analyses of prior art, with a more intensive analysis mistakenly taken to determine the “quality” of a patent.

The main aim of introducing the idea of obviousness is to prevent a patent on a basic scientific advance that may embody standard knowledge of a field. The challenge for an inventor, however, is to get beyond these combinations of prior art by identifying the range of this prior knowledge and narrowing the art to the field of the invention. One main test established to narrow the field of an invention was the teaching-suggestion-motivation (TSM) test, in which an expert in a field would narrow the prior art to those references that are relevant to the invention. Only if the prior art is clearly delineated and precise can multiple relevant references be combined to attack an invention. If, by contrast, the prior art is interpreted broadly, with multiple references drawn from different fields of art, combinations of prior art can be found to exclude most inventions.

The main problem with obviousness began with the Supreme Court opinion in KSR. [Note, however, that KSR draws on Graham v. John Deere (1966) and Hotchkiss v. Greenwood (1850), the latter of which articulates the first notion of obviousness, embedded in the patent statute in 1952.] KSR removed the TSM test as the exclusive test of obviousness, thereby lifting the restriction on broad interpretations of combinations of prior art. Virtually any art in any field could be introduced as prior art if it had a tangential relationship to the proposed invention. With much more prior art included in an obviousness determination, the combination of many more prior art references tended to limit the inventiveness of many proposed inventions. This is particularly the case with evolutionary inventions that require only a small inventive step, which is the main domain of large corporations that tend to patent hundreds or thousands of small inventions in a large portfolio.

In the internet era, it is straightforward to run a database search of prior art, but with a broad swath of prior art and minimal limitations beyond the localized field of an invention, many prior art references are swept into a search. The database has taken the place of the expert from the TSM test, with problematic results. The search for prior art has become the main function of the patent examiner. However, the crucial work of interpreting prior art, and of excluding irrelevant prior art, remains somewhat subjective. The main challenge of the patent examiner, and thus the main role of the patent office examination system, is to show how two or more prior art references may be combined to read on a patent claim. In some cases, the identification of a novel solution to a technical problem is a substitute for the problem of combining prior art references, since there may be multiple solution options for the same technical problem. In any event, it is crucial to cabin obviousness challenges in order to provide more meaningful prior art analyses of inventions.

Logically, the more prior art references that are included in a patent examination, the more combinations of prior art can be applied to kill the patent application. The number of possible combinations grows rapidly with each added reference: two prior art references form only a single pair, but dozens of references may be combined in hundreds of ways, leaving the opportunity to kill an invention with hundreds of attempts. In addition to patents and published applications, the inclusion of other published writings further enlarges the pool of prior art references available to combine against a patent application. The net effect of a broad view of obviousness has been to destabilize the patent system by providing tools to kill patents.
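The arithmetic behind this point can be made concrete. As a rough sketch (the function names below are illustrative, not from any statute or case), one can count both the distinct two-reference pairings and the combinations of any size available from a pool of references:

```python
from math import comb

def pairings(n):
    """Distinct two-reference combinations among n prior art references."""
    return comb(n, 2)

def multi_combinations(n):
    """Distinct combinations of two or more references: all subsets of the
    n references except the empty set and the n single-reference subsets."""
    return 2**n - n - 1

# Two references form a single pair, but two dozen references form
# hundreds of pairs and millions of larger combinations.
for n in (2, 12, 24):
    print(n, pairings(n), multi_combinations(n))
```

Pairwise combinations grow quadratically, and combinations of arbitrary size grow exponentially, consistent with the point that each added reference multiplies the avenues of attack on a claim.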

After KSR, the horse got out of the barn. A broad view of obviousness has become problematic because many inventions are ruled invalid based on a loose interpretation of a combination of unrelated prior art references. The very liberal interpretation of obviousness after KSR has been a core problem of the patent system. The Federal Circuit recognizes this problem and has recently begun to bring the TSM test back into vogue as a central tenet of obviousness. As the TSM test shows, the first challenge is to narrow the scope of the field of the invention. Once that scope is constrained, the inclusion of prior art references must be confined to it.

Increasingly, the Federal Circuit is applying secondary considerations (or objective indicia) in assessing non-obviousness. These secondary considerations, the “objective indicia of non-obviousness,” include: (a) the invention’s commercial success; (b) long-felt but unresolved needs; (c) the failure of others; (d) skepticism by experts; (e) praise by others; (f) teaching away by others; (g) recognition of a problem; and (h) copying of the invention by competitors. Interestingly, the Federal Circuit recently reiterated these factors in determining the validity of an invention involving a smartphone against a rival.

The overly inclusive nature of obviousness interpretations has led to problems. First, with an overly broad view of obviousness, patent applicants are encouraged to flood patent examiners with prior art references in order to immunize prosecution from future surprises of prior art, even though many of these references are irrelevant. This flood of prior art burdens examiners and encumbers the patent prosecution process. Second, PTO examiners, PTAB judges and the federal district courts apply different standards for determining obviousness, with the courts maintaining a clear and convincing standard for challenging the validity of an issued patent. For example, examiners may tend to narrow prior art to the field of an invention, thereby allowing applications that are then retested in IPRs under broader (higher bandwidth) standards, thereby explaining discrepancies in IPR claim kill rates.

Obviousness requires examiners and judges to carefully interpret prior art references. So far, analyses of prior art have been inconsistent and subjective; what is needed is to narrow the broad range of prior art references, to interpret each reference carefully, and to connect references with each other and with the patent claims. Interpretation theory (hermeneutics) may be helpful for these patent validity analyses.

The problem of obviousness is critical in determining patent validity. Since patent quality is typically used as a proxy for patent validity, and since obviousness is central to validity yet remains a critical problem demanding ever more subjectivity and interpretation, the state of the patent system rests on an unstable foundation.

Strictly speaking, if obviousness is applied this broadly, only a few pioneer patents would withstand scrutiny. The gold-plated patents would be these pioneer patents that represent fundamental or industry-creating technologies. All others, according to this narrative, would be invalid or useless.

The Author

Neal Solomon

Neal Solomon is CEO of Advanced System Technologies Inc. A prolific inventor of technologies involving semiconductors, communications, data management, imaging, robotics and healthcare, he holds degrees from Reed College and the University of Chicago.

Warning & Disclaimer: The pages, articles and comments on IPWatchdog.com do not constitute legal advice, nor do they create any attorney-client relationship. The articles published express the personal opinion and views of the author and should not be attributed to the author’s employer, clients or the sponsors of IPWatchdog.com.

Discuss this


  1. Anon August 3, 2017 1:10 pm

    As a matter of law, the statement of “Whereas before 1952, this originality was proved by showing inventiveness as novelty, the 1952 Act introduced in addition to the concept of novelty the notion of obviousness.” is simply not correct.

    Prior to 1952, Congress had shared its authority to write patent law with the judicial branch – specifically on the term “invention” – and such was NOT what the pre-1952 portion of the single paragraph that became 102 was about.

  2. Paul Cole August 3, 2017 3:45 pm

    The problem of Section 103 is inherent in the statute and is the same as the problem presented under UK law. The EPO adopted the technical problem test also inherent in the PCT implementing regulations, but the UK specifically did not. Both the Graham/KSR test in the US and the Windsurfer test in the UK provide for a number of mandatory enquiries, and when it comes to “make your mind up” time provide no further guidance, leaving it to the court or other decision maker to decide according to the evidence.

    Every court-house has somewhere the scales of justice motif. In deciding obviousness it is apt to regard the various evidential topics including circumstantial evidence as equivalent to boxes of weights in each evidential category from which individual ones of greater or lesser value are placed in one scale pan or another according to the evidence which each party can and wishes to adduce. The decision maker then weighs the totality of the evidence and reaches a conclusion.

    It is implicit in this process that a weight in one or the other evidential category is critical and tilts the scales one way or the other. But the category of the critical evidential weight varies randomly from one case to another, and it cannot be otherwise because every case is individual. Any attempt to systemise analysis by progressive study of case law runs into the problems created by this randomness.

    For example, circumstantial evidence used not to be so important. But in the recent and instructive en banc Federal Circuit decision in Apple v Samsung circumstantial evidence and in particular peer reaction on the day of product launch proved decisive.

    Anyone finding a way out of these difficulties would be deserving of much appreciation. But they have been a matter of debate and controversy for 150 years. The more systematic EPO approach has much to commend it, but is unlikely to be adopted either in the UK or in the US any time soon.

  3. A Rational Person August 3, 2017 5:26 pm

    The teaching-suggestion-motivation test was a good operational rule that in my experience produced much better rejections on average than KSR has, because to do a TSM rejection required an examiner to find in the art why the references should be combined.

    Also, given the well-known and overwhelming pervasiveness of hindsight bias, any rule for combining references that does not require objective evidence for reasons to combine references should be presumed to allow for impermissible hindsight, which has certainly been the case in obviousness rejections since the KSR decision:

    https://en.wikipedia.org/wiki/Hindsight_bias

  4. Tesia Thomas August 3, 2017 6:44 pm

    @A Rational Person,

    Agreed.
    Again, I’ll pull from my personal anecdotes.
    Inventors of tech of SBIRs will sue the govt for infringement. First thing the government will say is the tech doesn’t work (well, why are you using it then?), the inventor isn’t qualified to make a workable version (ok, so why is he the inventor?), and last but not least that the invention is obvious (ok, why the SBIR then?).

    The hypocrisy and hindsight bias go hand-in-hand.

  5. Paul Cole August 4, 2017 1:39 am

    Has anyone considered in the obviousness analysis the entropy of the prior art and the reduction in entropy which results from an ordered search based on hindsight knowledge of the invention? Prior to the invention the prior art is in a disordered state with multiple possibilities for development, and it is important to consider the 50 or so signposts pointing the wrong way as well as the one or two signposts pointing in the right direction. Examiners often forget this.

  6. Edward Heller August 4, 2017 7:12 am

    As Paul said, the central problem with obviousness is that it is entirely subjective. It might be better to simply ask whether the claimed invention is new and whether it brings some new utility.

    Actually, that is how the statute read in 1793, which was a virtual copy of the Statute of Monopolies. Who wrote the latter?

    Edward Coke.

    The former?

    Thomas Jefferson.

  7. Anon August 4, 2017 7:58 am

    Mr. Heller,

    The problem with what was written (and not by Jefferson, as you suggest), was that the early Congress punted on actually defining what “invention” itself meant, providing for the judicial branch to set the meaning of that term through the power of common law evolution.

    That was, until the Supreme Court of the 1930s and 1940s had grown so virulently anti-patent – and made “invention,” “gist of the invention” and dozens of like terms to be like a nose of wax in their pursuit of “the only valid invention is one that has not yet appeared before us” – that Congress was instigated to act, and act they did in 1952, stripping the common law writing authority from the Court and in its place substituting section 103.

    Your “love” of history shows fatal bias when you consistently refuse to acknowledge ALL of the history, including history that you happen not to like because of your predilections of placing the Supreme Court on pedestals.

  8. A Rational Person August 4, 2017 12:42 pm

    Paul@5,

    Since KSR, my experience is that Examiners, with the pressure exerted on them from above to increase production, don’t “forget” what you think they forget; they don’t have the time to care. The type of rejection I’ve seen a lot of since KSR:

    A rejection based on 2, 3 or more, often unrelated, references found by text searching the art for words in the claims which are then combined using some hindsight reason based on using the inventor’s application as a roadmap to combine the references.

  9. Anon August 4, 2017 1:40 pm

    A Rational Person,

    I have seen an even more lazy approach.

    Combinations of unrelated references piecing together key words (unrelated except for a key word) with a proffered reason for combination being a sole reason listed in each respective individual reference.

    Reference “A” teaches an improved screwdriver by using a particular new processed concoction of citric acid.

    Reference “B” teaches an improved screwdriver by using a selectively magnetized shaft that may pick up a dropped metal screw.

    Invention “C” on an improved mixed drink (perhaps having a method step of processing using magnetism) is obvious in view of the combination of Reference “B” with reference “A” because a magnetized shaft may pick up dropped metal screws.

    I often see “motivations to combine” that say nothing about the act of combining itself. The fact that some items MAY be combined is not something “made obvious” by the mere possibility of two references being combined.

    It appears that the prior art world of 103 is desired to be collapsed (expanded?) into being the same prior art world of 102.

    Often, an element of Reference “B” may be great for the way that that element is being used in Reference “B,” replete with caveats and other elements that must also be in place for the benefit of what Reference “B” teaches. This however does not mean that taking that singular element and sticking that singular element into any other combination carries with it the same benefit as Reference “B” teaches for its own combination.

    Good luck explaining that to a person that thinks that key word searching without reading or understanding a specification is “good enough.”

    Another (related) point: I often see examiners quote the Graham Factors, and then blow right by them, obviously not taking the time to actually apply those factors, and certainly not spelling out how those factors were applied.

    This is related to the first item, as often, a response that at first baffles, then consternates the examiner is a response pointing out that the (key word search driven) examination is NOT ENOUGH to satisfy the Graham factors. Text searching is rarely enough – without more actual work by the examiner.

    It is worth taking a quick moment to see that more than mere lip service has been given by the examiner as to how that examiner has actually examined the claims.

  10. Paul F. Morgan August 4, 2017 7:59 pm

    Re “examiners may tend to narrow prior art to the field of an invention, thereby allowing applications that are then retested in IPRs under broader (higher bandwidth) standards, thereby explaining discrepancies in IPR claim kill rates.”
    But that is not the main reason for the difference between application claim allowance rates and IPR claim rejection rates. The main reason is the very few hours application examiners have to conduct prior art searches versus the orders of magnitude more hours spent in prior art searches for IPR petitioners because they or their customers are being expensively sued. Also, application prosecution is entirely ex parte whereas an IPR petitioner gets to file expert declarations on obviousness along with the better prior art found in the much better prior art search. Also, IPRs are decided by APJs – patent attorneys well aware of KSR and BRI case law.

  11. Anon August 4, 2017 8:43 pm

    Mr. Morgan,

    Your arguments sound as if the Office is simply not examining as they should.

    They are charged to examine under the law – NOT to some internal metric.

    The job is not the metric of the job.

  12. Ternary August 7, 2017 12:33 pm

    More in particular, if you can find other combinations but not the one you claim, then clearly the invention is not obvious. Especially because many alleged “obvious” combinations are very tenuous, not to say beyond fictitious: scientific fantasy.

  13. test August 7, 2017 3:47 pm

    Paul @5 and EH@6,
    Paul makes an excellent point, which is not that an obviousness rejection is subjective, but that it is inherently biased because it only considers prior art against the invention. That is, an Examiner uses the claims in hindsight to find prior art to reject.

    Prior art that teaches away from the invention is not searched for, or considered. It is like selectively using experimental data to extract a clean formula. The Examiner is in fact acting as Maxwell’s demon (hence the entropy, I assume). It generates a desired result, but it does not reflect reality.

    If you consider all prior art as the “guideline” to an invention (that is rejected as being obvious), then there is likely not a net guideline to the claimed invention. It seems reasonable that citing other prior art that teaches away from the claimed invention destroys an allegation of obviousness.


  14. Edward Heller August 7, 2017 4:02 pm

    One has to consider that Hotchkiss v. Greenwood only announced the principle that if it was “known” to the art that X could be substituted for Y in application Z, then the claimed invention would be the work of the ordinary mechanic.

    The Supreme Court has held that section 103 was intended to codify Hotchkiss v. Greenwood. We should take that as a given. At a minimum, therefore, the statute requires that it be proven that the combination was effectively taught by the prior art.

    TSM as an exclusive rule may have been overruled. But the Supreme Court did not overrule Hotchkiss. Certainly some common sense can be employed to infer facts. But the overall conclusion has to be that the art itself taught the combination.
