- Product Indicators include metrics on the correctness and clarity of our work products. We formulate these metrics using data from reviews conducted by the Office of Patent Quality Assurance using the Master Review Form.
- Process Indicators assist in tracking the efficiency and consistency of internal processes. Our current focus is on analyzing reopening of prosecution and rework of Office actions as well as improving consistency of decision making.
- Perception Indicators use both internal and external stakeholder surveys to solicit information that can be used for root cause analysis and to validate/verify the other metrics.
Product indicators include metrics on the correctness and clarity of our work products. We formulate these metrics using data from reviews on randomly-selected Office actions conducted by the Office of Patent Quality Assurance (OPQA) using the Master Review Form (MRF).
Correctness: We consider a quality patent to be one that is correctly issued in compliance with all the requirements of Title 35 as well as the relevant case law at the time of issuance. A statutorily compliant Office action, the framework of the USPTO’s patent prosecution, includes all applicable rejections and provides, for every asserted rejection, sufficient evidence to support a conclusion of unpatentability. Visit Correctness Measures for definitions of statutory compliance by statute and our FY17 Compliance Rates along with our FY18 Compliance Targets.
Clarity: We are continuing to develop our clarity measures. Although clarity data is currently captured as part of every Office action review using the MRF, we have found that such assessments can be highly subjective. We are therefore working to ensure that the data captured through the MRF is as reliable as possible. As we improve our consistent use of the clarity measurements in the MRF, we will formulate and share clarity measures with our stakeholders.
Our process indicators assist us in tracking the efficiency and consistency of internal processes to enhance our efficiency without sacrificing quality. We focus on analyzing reopening of prosecution and rework of Office actions as well as improving consistency of decision making. To do this, we evaluate certain types of transactions in our Quality Index Report (QIR) to identify trends and examiner behaviors indicative of either best practices or potential quality concerns.
Rather than setting targets for the particular transactions, we conduct a root-cause analysis on the trends and behaviors to either capture identified best practices or correct issues, as appropriate. It is sometimes desirable for an examiner to reopen prosecution or issue a second non-final rejection, such as when adjusting a rejection in view of changes to the law resulting from a new court decision. By conducting a root cause analysis that focuses on the underlying reasons for the given trends and behaviors, we allow for re-openings and rework where appropriate while providing training to ensure examiners have the necessary skills and resources to be as efficient as possible.
To assist with the root-cause analysis, we plot particular QIR transactions to more easily identify examiners who either may be performing best practices or may need additional assistance. View Process Indicator graphical results of this analysis below.
We have conducted both internal and external stakeholder perception surveys semi-annually since 2006. The results of these surveys are a vital quality indicator inasmuch as they are useful for validating our other quality metrics. For example, the results of the perception surveys confirm that the data underlying our metrics align with our stakeholders’ perceptions and that the quality metrics we report are useful to our stakeholders.
View some of the results of the External Stakeholder Perception Survey below. For sample surveys, see the Tools section on the OPQA web page.
- Frequency of Sound Rejections by Statute chart shows the percentage of respondents reporting that the rejections they experienced were technically, legally, and logically sound, with data since FY13.
- Prior Art Search Quality chart shows the percentage of respondents rating the prior art searches performed by the examiner as “good” or “excellent” as well as “poor” or “very poor,” with current data by technology field.
- Consistency chart shows how much inconsistency respondents identified in the examinations they had received over the past 3 months.
- Overall Examination Quality chart shows the percentage of respondents reporting “good” or “excellent” overall examination quality as well as “poor” or “very poor” overall examination quality.