The Northern District of California, in Finjan, Inc. v. Blue Coat Systems, Inc., Case No. 13-cv-03999-BLF (Judge Beth Labson Freeman) (July 14, 2015), ruled on a host of Daubert motions, including by both plaintiff and defendant to exclude the other side’s damages expert. The court addressed several apportionment techniques, allowing some and rejecting others.
“Real estate approach” based on lines of software code (royalty base)
In what may be a first in a patent case, the court squarely addressed an apportionment technique relating to the royalty base that this author calls the “real estate approach,” finding it not inherently unreliable. This technique has different flavors, e.g., the footprint of circuitry on an integrated circuit or the lines of software code. Here, the question was how much respective “real estate” was occupied by the lines of code making up the accused feature versus the total lines of code in the accused product. The court held that the use by defendant’s expert, Julie Davis, of a percentage of source code directed to the accused feature was not unreliable and could be presented to the jury. (Note that this issue has arisen in the trade secret context: UniRAM Tech., Inc. v. Taiwan Semiconductor Mfg. Co., No. C 04-1268 VRW (N.D. Cal. April 17, 2008) (allowing the defendant’s expert’s apportionment opinion that the accused misappropriated trade secret constituted 25% of the chip). It was also addressed in passing by the Federal Circuit in Lucent; see quoted language below.)
Plaintiff’s first argument was that the method is unreliable because it would vary depending on the competency of the programmer. Judge Freeman rejected this argument (slip op. at 8):
In passing, and without any citation to authority, Plaintiff suggests that the method itself is unreliable because it varies depending on the competency of the programmer. Pl.’s Mot. at 11 (“an accused infringer who has inefficient programmers would pay less in damages because the overall code base would be larger”). This argument has little appeal because an incompetent programmer is likely to be equally incompetent in programming all of an accused product’s code, just as an efficient programmer would efficiently program an entire product’s code; the percentage of code attributable to a feature would not change. In any case, although the Federal Circuit has indicated that the portion of an accused product’s realizable profit attributable to the patentee’s technology, “cannot be reduced to a mere counting of lines of code,” the court acknowledged that “the glaring imbalance between infringing and non-infringing features must impact the analysis of how much profit can properly be attributed to the use of the [accused feature] compared to non-patented elements and other features of [the accused product].” Lucent Techs., 580 F.3d at 1332-33 (analyzing Georgia-Pacific Factor 13). As such, this apportionment method is neither inherently unreliable nor absolutely barred by Federal Circuit precedent.
Next, plaintiff argued against the source code percentages that Ms. Davis used in her analysis. Plaintiff contended that Ms. Davis originally received the relevant data from defendant’s counsel, data that defendant’s technical experts subsequently confirmed. The court reasoned that this was more a lack-of-disclosure argument under Rule 26 than a reliability argument. After ordering the parties to submit excerpts from the technical experts’ depositions, the court concluded that plaintiff had the opportunity to examine the experts on their basis for the accused lines of code (which originally came from employees of defendant), and “that Plaintiff simply did not ask those questions.” Slip op. at 9. The court thus rejected this argument, especially considering that the defendant’s source code computers were available during the experts’ depositions. Slip op. at 10.
The last argument advanced by plaintiff was that the identified lines of code did not account for all claim elements nor for other ways in which plaintiff accused the defendant’s products of infringement. The court found this to be a factual dispute and a “determination better left to the jury.” Slip op. at 11.
The court therefore denied the motion concerning lines of code apportionment.
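For readers who want the arithmetic made concrete, the lines-of-code “real estate” apportionment of the royalty base can be sketched as follows. This is a minimal illustration only; all revenue and line-count figures below are invented, as the opinion does not disclose Ms. Davis’s actual numbers.

```python
# Hypothetical sketch of a lines-of-code ("real estate") apportionment of
# the royalty base: the share of accused-product revenue attributed to the
# accused feature equals the feature's lines of code divided by the
# product's total lines of code. All figures are invented for illustration.

def loc_apportioned_base(total_revenue: float,
                         feature_loc: int,
                         total_loc: int) -> float:
    """Royalty base = revenue x (accused-feature LOC / total LOC)."""
    if total_loc <= 0 or not (0 <= feature_loc <= total_loc):
        raise ValueError("LOC counts must satisfy 0 <= feature <= total")
    return total_revenue * (feature_loc / total_loc)

# e.g., $10M in accused-product revenue, 5,000 of 100,000 lines accused:
base = loc_apportioned_base(10_000_000, 5_000, 100_000)
print(base)  # 500000.0
```

Note that this simple ratio is exactly what Judge Freeman’s competent-programmer reasoning addresses: an inefficient programmer would inflate both the numerator and the denominator, leaving the percentage roughly unchanged.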
Apportionment based on plaintiff’s patent portfolio (royalty rate)
Ms. Davis attempted to apportion the royalty rate by dividing plaintiff’s proposed rate by the number of patents that plaintiff had asserted in litigation to date. The total number of patents was 20, but the number asserted in the instant litigation was just 6. The court found this apportionment analysis to have improperly used the “book of wisdom,” because “the additional 14 patents that Ms. Davis folded into Plaintiff’s portfolio include patents asserted in separate litigation against third parties, ‘largely after the dates of the hypothetical negotiations.’” Slip op. at 12 (quoting plaintiff’s brief). The court reasoned that plaintiff’s future litigation activity (relative to the hypothetical negotiation date) was not probative of the asserted patents’ value on that date.
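The excluded per-patent division can be illustrated with a short sketch. The portfolio rate here is hypothetical (the opinion does not disclose Ms. Davis’s actual rate); only the patent counts of 20 and 6 come from the opinion.

```python
# Hedged sketch of the per-patent rate division the court excluded as an
# improper use of the "book of wisdom": dividing a proposed portfolio-wide
# royalty rate evenly across a patent count. The 16% rate is hypothetical.

def per_patent_rate(portfolio_rate: float, patent_count: int) -> float:
    """Split a portfolio-wide royalty rate evenly per patent."""
    if patent_count <= 0:
        raise ValueError("patent_count must be positive")
    return portfolio_rate / patent_count

proposed_rate = 0.16  # hypothetical portfolio rate

# Dividing by all 20 patents ever asserted (the excluded approach) versus
# the 6 patents asserted in this case yields very different per-patent rates:
print(round(per_patent_rate(proposed_rate, 20), 4))  # 0.008
print(round(per_patent_rate(proposed_rate, 6), 4))   # 0.0267
```

The arithmetic shows why the choice of denominator mattered: folding in the 14 later-asserted patents cut the per-patent rate by more than two-thirds.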
Apportionment using “forward citation analysis”
This time the defendant sought to exclude an apportionment technique employed by plaintiff’s expert, Dr. Layne-Farrar, and succeeded. This technique posits that a patent’s value is strongly correlated with the number of times the patent is cited as prior art by later patents. The court noted that this technique may be probative of a reasonable royalty in some circumstances, but that it was improperly applied here. Slip op. at 13.
The court rejected the application of this technique on several levels:
- Layne-Farrar had failed to explain why this technique was an appropriate measure for the asserted patents. The court noted that some of the patents are related and reference one another. “Surely a patent’s objective quality cannot be based on the number of times an inventor cites himself in prosecuting related patents.” Slip op. at 13-14. And the court noted that the patent with the highest number of citations was, not surprisingly, the oldest patent. “Dr. Layne-Farrar’s straightforward application of a forward citation analysis without taking into consideration these potential problems renders the method unreliable for failure to specifically tie the methodology to the facts of this case.” Slip op. at 14 (citing Oracle Am., Inc. v. Google Inc., No. C 10-03561 WHA, 2012 WL 877125, at *2 (N.D. Cal. Mar. 15, 2012) (rejecting forward citation methodology used to rank reexamined patent in a portfolio because expert did not count citations to predecessor patents)).
- Layne-Farrar’s application of the technique rested solely on the 6-patent portfolio, without accounting for the value of the accused features as a portion of the accused products; it thus demonstrated only the value of the asserted patents relative to one another, not their value relative to other patents that cover or potentially cover the accused and non-accused features.
- The reliance on other cases was misplaced. The court distinguished two other cases cited by plaintiff: one related to SEPs (GPNE Corp. v. Apple, Inc., No. 12-CV-02885-LHK, 2014 WL 1494247 (N.D. Cal. Apr. 16, 2014)), and the other involving a situation in which Dr. Layne-Farrar had compared the patents-in-suit to other patents “within the same technology market.” Slip op. at 14-15.
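To make the rejected methodology concrete, a naive forward-citation apportionment can be sketched as below. The citation counts and patent labels are invented; and, as the court’s first criticism highlights, this simple version ignores self-citations among related patents and the head start that older patents have in accumulating citations.

```python
# Illustrative sketch of a naive forward-citation apportionment: each
# patent's share of portfolio value is its count of forward citations
# divided by the group total. Counts and labels below are hypothetical.

def citation_shares(forward_citations: dict[str, int]) -> dict[str, float]:
    """Allocate value shares in proportion to forward-citation counts."""
    total = sum(forward_citations.values())
    if total <= 0:
        raise ValueError("need at least one forward citation")
    return {pat: n / total for pat, n in forward_citations.items()}

counts = {"patent A": 120, "patent B": 60, "patent C": 20}  # hypothetical
shares = citation_shares(counts)
print(shares["patent A"])  # 0.6
```

As the court observed, if patent A is simply the oldest of a family that cites itself, its 60% share says little about its objective quality.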
Apportionment using totality of features in the accused products
Here, the defendant’s challenge was provisionally rejected; the court allowed Dr. Layne-Farrar to testify about apportionment based on an internal presentation by the defendant that identified 24 functions “that cover ‘all features in the full suite of [defendant’s] security products.’” Slip op. at 15 (quoting defendant’s document). The court explained (slip op. at 15):
Relying on Dr. Medvidovic’s report, Dr. Layne-Farrar concluded that the evidence “suggests a per-feature apportionment of sales revenue” and thus apportioned accused product revenue according to the number of functions out of 24 that each patent-in-suit drives. Id. ¶ 158. Dr. Layne-Farrar notes that this apportionment approach is “highly conservative because not every accused product has all 24 features, and yet I apply only 1/24th for each feature to each accused product.” Id. ¶ 159. As with Ms. Davis’s apportionment based upon lines of infringing code, Dr. Layne-Farrar’s second apportionment method may not be perfect, but it reasonably ties the value that Defendant places on product features to the accused products in this case. Any factual challenges to Dr. Layne-Farrar’s analysis are better presented to the jury.
The rejection of the challenge was only provisional because of the court’s concern that it was unclear what factual basis supported valuing each of the 24 functions equally. The court thus decided to allow the defendant to renew its objection to this apportionment approach if there was insufficient factual foundation supporting a 1/24 apportionment for each function. Slip op. at 16.
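The 1/24-per-feature approach described in the quoted passage reduces to simple arithmetic, sketched below. The revenue figure and the number of functions a given patent drives are hypothetical; only the 24-function total comes from the defendant’s internal presentation as described in the opinion.

```python
# Minimal sketch of the 1/24-per-feature apportionment the court allowed
# provisionally: each patent is credited with accused-product revenue in
# proportion to how many of the 24 identified functions it drives, with
# each function weighted equally. Revenue and counts are hypothetical.

TOTAL_FUNCTIONS = 24  # from defendant's internal presentation

def feature_apportioned_revenue(product_revenue: float,
                                functions_driven: int) -> float:
    """Credit 1/24th of revenue for each function a patent drives."""
    if not (0 <= functions_driven <= TOTAL_FUNCTIONS):
        raise ValueError("functions_driven must be between 0 and 24")
    return product_revenue * functions_driven / TOTAL_FUNCTIONS

# e.g., a patent said to drive 3 of the 24 functions, $24M in revenue:
print(feature_apportioned_revenue(24_000_000, 3))  # 3000000.0
```

The equal 1/24 weighting is precisely the assumption the court flagged: if the factual record does not support valuing each function equally, the defendant may renew its objection.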