By Thom Kobayashi ‑ September 18, 2017
This is the third in a series of three blogs. In the first, I discussed how a patent has extrinsic versus intrinsic value, and how that value really depends on just a few things. I then proposed a simple ranking framework with seven elements. In the second blog, I discussed how frame of reference affects the possible rankings, and how different experts with different perspectives can all provide different, even conflicting, rankings that are nonetheless correct. Layered on top of that was a temporal effect: the arc of technology changes the importance of competing technologies relative to each other, as can circumstance. So on the one hand it's simple, and on the other it's hopelessly complex. How does one make use of that?
Patent valuation is built on a foundation of shifting sand. You always have to keep in mind that the rater's point of view, the context in which the rater worked, and the timing of the review all affect the outcome. Historically important technology can suddenly shift higher or lower in importance in an area. I covered a lot of this in the last patent value blog. So on the one hand, there's a simple framework to use. On the other hand, there can be almost infinite layers of complexity and conflict among informed opinions from legal, business, and technical perspectives. In addition, the information has a shelf life...
Because the information "goes stale", you have to carefully manage the effort you put into getting ratings and look hard at the cost-benefit ratio. While this will probably vary by industry, I generally found it best to stick to a "minimalist approach". I would ask for a very straightforward analysis that included only the essential data. I asked the analysts to score each of the seven criteria (see the first blog in this series) and to give me a composite score as well. I essentially chose to break up the analysis, so every asset got a cursory screening, and I only invested in a full analysis for the more promising assets.
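The two-stage screening described above can be sketched in a few lines of Python. The criterion names, scores, and cutoff value here are my own illustrative assumptions, not from the original framework; the point is only the shape of the process, where every asset gets a cheap composite check and only survivors earn a full analysis:

```python
# Hypothetical sketch of the two-stage screening process.
# Criterion names and the cutoff are illustrative assumptions.

CRITERIA = ["use", "provability", "claim_breadth", "remaining_life",
            "market_size", "design_around", "prior_art_risk"]

def screen(assessments, cutoff=3.5):
    """Cursory pass: keep only assets whose composite score clears the bar."""
    return [a for a in assessments if a["composite"] >= cutoff]

portfolio = [
    {"asset": "US1234567", "composite": 4.2,
     "scores": {c: 4 for c in CRITERIA}},
    {"asset": "US7654321", "composite": 2.1,
     "scores": {c: 2 for c in CRITERIA}},
]

promising = screen(portfolio)
print([a["asset"] for a in promising])  # only the high scorer survives
```

Keeping the per-criterion scores alongside the composite is what makes the conflict checks in the next section possible.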
Let me offer a few practical notes here. First, having individual scores as well as a composite score would sometimes surface interesting conflicts: high individual scores but a low composite, or vice versa. This would sometimes reveal cases where the analyst was willing to share something not addressed by the individual questions because it did not fit the framework. He "knew" something that he felt was important.
Second, make sure everyone answers every question every time by including a selection for "I don't know". This serves two purposes. It keeps the analysts from becoming lazy, and it gives you a check on whether you sent the work to the right person. If an analyst is answering "I don't know" all the time, you can shift work away from him. Remember that you are collecting the minimum amount of data needed to make a quick analysis here. If you collect less than that, it's not going to be enough to support your decision process.
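That routing check is easy to automate once "I don't know" is a recorded answer rather than a blank. A minimal sketch, assuming each analyst's answers are stored as a list and using an arbitrary 50% threshold of my own choosing:

```python
# Hypothetical sketch: flag analysts whose "I don't know" rate suggests
# the work is landing outside their expertise. Threshold is illustrative.
from collections import Counter

IDK = "I don't know"

def idk_rate(answers):
    """Fraction of an analyst's answers that are 'I don't know'."""
    if not answers:
        return 0.0
    return Counter(answers)[IDK] / len(answers)

def misrouted(analyst_answers, threshold=0.5):
    """Analysts above the threshold may need different work routed to them."""
    return [name for name, answers in analyst_answers.items()
            if idk_rate(answers) > threshold]

answers = {
    "analyst_a": [4, 3, IDK, 5, 4, 2, 3],
    "analyst_b": [IDK, IDK, 2, IDK, IDK, IDK, 3],
}
print(misrouted(answers))  # ['analyst_b']
```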
Third, because of reason two, you need access to a stable of good analysts. No one person is smart enough to rank across a number of disciplines or industries. Don't be afraid to spread the work out. If the process is simple enough, you should not need much of a relationship to get a good result in aggregate. It is also a low-risk way to try out analysts for higher-value work.
This seemed to provide a "Goldilocks" solution: thorough enough to be useful without being onerous to complete, while still surfacing some unknowns through apparent conflicts in the answers.
Store your data with a date and an analyst name to help your future self put the findings into context. As you go forward from the screening assessment, there will eventually be a big investment of time and effort to develop the assets into an infringement case. You and all the different experts will look at each asset a lot. Keep in mind that we're talking about the very front end of a process that unfolds over months or years. The ability to see who supplied the information and when it was acquired can be really helpful. Just being able to get on the phone a month later and ask a couple of questions can help move your process along. Additionally, if an asset doesn't make the first cut, it can be beneficial to get more insight from the person familiar with it when assembling a second group. An asset that's only helpful against "one of many" is just fine if you're specifically talking to "the one".
If you’re not an attorney, here’s where you have to be careful about discovery and such. Since I’m not an attorney, I’ll just say you should get some help with setting up your process and you need to document that process and follow it slavishly. If it was not clear before now, I’ll just say analysis work is a team sport.
Also, keep in mind that you're actually asking for a prediction of the future, and nobody's crystal ball is 100% clear, so you are working with flawed data. The stored date was a good check on the age of the information, so that if we did decide to go with it, we at least knew the risk. In managing projects like this, you have to be really careful about using 20/20 hindsight when judging how good or bad an analysis was.
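With a date stored on every assessment, the age check becomes mechanical. A minimal sketch, assuming a record shape and a 180-day shelf life that are purely illustrative (the real shelf life would vary by industry, as noted earlier):

```python
# Hypothetical sketch: flag assessments whose stored date makes them stale.
# The record fields and 180-day shelf life are illustrative assumptions.
from datetime import date, timedelta

def is_stale(assessed_on, as_of, shelf_life_days=180):
    """An assessment past its shelf life needs a refresh or a risk note."""
    return as_of - assessed_on > timedelta(days=shelf_life_days)

record = {"asset": "US1234567", "analyst": "analyst_a",
          "assessed_on": date(2017, 1, 10)}

print(is_stale(record["assessed_on"], as_of=date(2017, 9, 18)))  # True
```

The analyst name in the record is what lets you make that follow-up phone call a month later instead of redoing the work.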
Patent value is largely extrinsic. That is, it relies on outside factors, the primary one being "Is it used?", because "use" is necessary to prove infringement. If you can get a good answer to that one question, you're 90% of the way to your value. If I were talking only to technical experts, I might ask just two questions: "Is it used?" and "Can I prove it?" As an example, I came out of semiconductor manufacturing. There, some of the most valuable IP covers defect control or reduction. However, all you typically have access to are working parts, parts that you can purchase. The defective parts are all sorted out before sale. So, how would you "prove" that a part was built with a particular defect-reducing method? This know-how is extremely valuable, but it can be very hard to prove and is therefore less valuable in a licensing situation.
Invariably, the best assets need the right person looking at them to find the value outright. Typically only a very specific and small group of people can tell you definitively whether a patent is actually being used. A slightly larger group can make a pretty good guess, and from everyone else you're just getting an opinion: informed, but informed by things like reading rather than implementing or experiencing. If you're rating a whole portfolio, it's going to be very difficult to find that very specific person and get the patent into his hands to assess. Even if you do, as we discussed in blog 2 of this series, he may not know anything about the legal or business aspects that need to be considered. However, these conflicts between experts are almost invariably where the interesting assets are. Keep in mind that this is NOT a democratic process. Typically only one of your experts will know for sure, and his is the opinion you want to listen to. A single positive opinion should not be discounted out of hand.
Patent valuation is a complex, long, and involved process. It typically involves a lot of information not present in the document itself, followed by a careful balancing of that information. Gathering and vetting are just that: a process. Anyone in this line of work would be well served to start populating their own "village", as it really does take a team of people to do valuation well. The bottom line is that a patent takes on value based on the context in which it sits. The better you can place that patent in context, the better your estimates of value will be.