
Quality Metrics

We are committed to self-improvement, developing and implementing new measures for understanding, evaluating, and reporting the correctness and clarity of examiners’ work products.


The Office continues to identify new metrics to provide a more thorough understanding of the quality of the Office’s work products and processes.  Resulting from stakeholder feedback, in fiscal year 2015, the Office launched the Enhanced Patent Quality Initiative (EPQI) Quality Metrics Program, and in fiscal year 2016 transitioned to a new quality metrics approach.  This new approach includes new metrics assessing the correctness of the Office’s work products, which the Office formulates using data from reviews conducted by the Office of Patent Quality Assurance (OPQA) using the Master Review Form (MRF).

The Office considers a quality patent to be one that is correctly issued in compliance with all the requirements of Title 35 as well as the relevant case law at the time of issuance.  A statutorily compliant Office action must include all applicable rejections, omitting none, and any asserted rejection must be correct in that the decision to reject is based on sufficient evidence to support the conclusion of unpatentability.  Thus, the Office has moved to reporting statutory compliance for its metrics, where the metrics are based on reviews conducted on every Office action type in every technological discipline.  Learn more about the calculation of statutory compliance.

 


Correctness Measures

Our measures of correctness, in combination with our measures of clarity, define the product indicators of the quality of our work products, which are the Office actions written by examiners during the prosecution of patent applications.  While our measurement of clarity is still being developed for metrics purposes, correctness is readily measured in our work products today.

We consider a quality patent to be one that is correctly issued in compliance with all the requirements of Title 35 as well as the relevant case law at the time of issuance.  It follows that a statutorily compliant Office action, the framework of the USPTO’s patent prosecution, includes all applicable rejections and provides, for every asserted rejection, sufficient evidence to support a conclusion of unpatentability.

Our correctness measures provide statutory compliance rates for 35 U.S.C. 101, 35 U.S.C. 112, 35 U.S.C. 102, and 35 U.S.C. 103.  We quantify these measures using data from reviews of randomly selected Office actions conducted by the Office of Patent Quality Assurance (OPQA) using the Master Review Form (MRF).  Office actions of various types (non-final rejections, final rejections, and allowances) are considered from all technologies.
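
As a concrete illustration of how a statutory compliance rate could be derived from such reviews, the Python sketch below tallies the share of reviewed Office actions found compliant with each statute.  The record layout and field names are hypothetical; the actual MRF data model is not described on this page.

# Minimal sketch: statutory compliance rates from hypothetical review records.
# Each review maps a statute key (e.g. "101") to True if the reviewed Office
# action was found compliant with that statute, False otherwise.
from typing import Iterable, Mapping

STATUTES = ("101", "112", "102", "103")

def compliance_rates(reviews: Iterable[Mapping[str, bool]]) -> dict[str, float]:
    """Return the share of reviewed Office actions found compliant per statute."""
    reviewed = {s: 0 for s in STATUTES}
    compliant = {s: 0 for s in STATUTES}
    for review in reviews:
        for statute in STATUTES:
            if statute in review:          # statute was assessed in this review
                reviewed[statute] += 1
                compliant[statute] += review[statute]
    return {s: compliant[s] / reviewed[s] for s in STATUTES if reviewed[s]}

# Illustrative data only: three reviewed Office actions.
sample = [
    {"101": True, "102": True,  "103": True,  "112": True},
    {"101": True, "102": False, "103": True,  "112": True},
    {"101": True, "102": True,  "103": False, "112": True},
]
print(compliance_rates(sample))  # e.g. 102 compliance = 2 of 3 reviewed actions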

Because these measures were defined late in FY16, we set target ranges for FY17, and based on our FY17 accomplishments, we have set FY18 statutory compliance targets.

 

Statute          FY17 Statutory         FY17 Statutory       FY18 Statutory
                 Compliance Targets     Compliance Rates     Compliance Targets
35 U.S.C. 101    93-98%                 96.5%                >97%
35 U.S.C. 112    87-92%                 92.6%                >93%
35 U.S.C. 102    90-95%                 94.5%                >95%
35 U.S.C. 103    88-93%                 92.4%                >93%

Process Indicators

The Office’s process indicators assist the Office in tracking the efficiency and consistency of internal processes.  The Office’s current focus is on preventing reopening of prosecution, reducing rework, and ensuring consistency of decision-making.  To do this, the Office is evaluating certain types of transactions in the Patent Application Location and Monitoring (PALM) system to identify trends and examiner behaviors indicative of either best practices or potential quality concerns.

Rather than setting targets for the particular transactions, the Office is conducting a root-cause analysis on the trends and behaviors to either capture identified best practices or correct issues, as appropriate.

View the Process Indicator graphical chart results, defined below, to learn more about the process indicator metrics:

  • Consistency of Decision Making: This chart shows the differences in allowance rates for similarly situated primary examiners by comparing the allowance rate for each primary examiner to the average of the other primary examiners in a Technology Center or Class (e.g., (examiner rate – average rate)/average rate).  The chart also shows the percentage of primary examiners falling between 0 and a given point on the x-axis.  A minimal computational sketch of this comparison follows this list.
  • Rework: This chart shows the number of examiners having a particular number of instances of rework (e.g., 2nd+ non-finals, consecutive finals, consecutive restrictions).  The chart also shows the percentage of examiners falling between no instances of rework and a given number of instances of rework.
  • Reopens:  This chart shows the number of examiners having a particular number of re-openings (e.g., re-open after an appeal brief, re-open or allow after pre-appeal, re-open after final).  The chart also shows the percentage of examiners falling between no re-openings and a given number of re-openings.
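
The sketch below illustrates the consistency comparison described above: each primary examiner's allowance rate is compared with the average of the other primary examiners in the same grouping using (examiner rate - average rate)/average rate, and a cumulative share is computed as on the chart.  The data structures and example rates are hypothetical; actual PALM data is not reproduced here.

# Minimal sketch: relative allowance-rate deviation per primary examiner,
# plus the cumulative share of examiners within a given deviation of zero.
def relative_allowance_deviation(rates: dict[str, float]) -> dict[str, float]:
    """Map examiner id -> (examiner rate - peer average) / peer average."""
    deviations = {}
    for examiner, rate in rates.items():
        peers = [r for e, r in rates.items() if e != examiner]
        peer_avg = sum(peers) / len(peers)
        deviations[examiner] = (rate - peer_avg) / peer_avg
    return deviations

def cumulative_share_within(deviations: dict[str, float], threshold: float) -> float:
    """Share of examiners whose absolute deviation falls between 0 and threshold."""
    within = sum(1 for d in deviations.values() if abs(d) <= threshold)
    return within / len(deviations)

# Illustrative allowance rates for four primary examiners in one grouping.
rates = {"A": 0.62, "B": 0.70, "C": 0.55, "D": 0.66}
devs = relative_allowance_deviation(rates)
print(devs)
print(cumulative_share_within(devs, 0.10))  # fraction within 10% of the peer average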

Perception Indicators

The Office has conducted both internal and external stakeholder perception surveys semi-annually since 2006.  The results of these surveys are a vital quality indicator, and they are useful for validating our other quality metrics.  For example, the results of the perception surveys help ensure that the data underlying our metrics align with our stakeholders’ perceptions and that the quality metrics we report are useful to our stakeholders.

Internal Stakeholder Perception Survey: The internal survey is sent to 750 randomly selected patent examiners on a semi-annual basis.

External Stakeholder Perception Survey: The external survey is sent to 3,000 of our frequent-filing customers on a semi-annual basis.

View the Perception Indicator External Survey graphical chart results, defined below, to learn more about the perception indicator metrics:

  • Frequency of Technically, Legally, and Logically Sound Rejections: Across all survey periods, customers who experienced 102 rejections and 112(b) rejections were most likely to report that those rejections were sound “most” or “all” of the time. Those who experienced 101 rejections were least likely to report that the rejections were sound “most” or “all” of the time.  The percentage of customers who experienced 112(a) or 112(b) rejections and who reported that these rejections were sound most or all of the time increased significantly between FY16-Q1 and FY16-Q3.
  • Percent Positive and Negative Ratings: Between FY16-Q1 and FY16-Q3, the percentage of customers reporting that overall examination quality was “poor” or “very poor” increased slightly from 9% to 10%, and the percentage of customers reporting that overall examination quality was “good” or “excellent” decreased from 54% to 50%. Neither of these changes was statistically significant.  In FY16-Q3, five in ten customers reported that the overall examination quality was “good” or “excellent”, and only one in ten customers reported that overall examination quality was “poor” or “very poor”.
  • Percent Reporting “Good” or “Excellent” Quality of Prior Art: These results show that in FY16-Q3, customers in the Chemical, Electrical, and Mechanical technology fields were more likely to report that the quality of prior art was “good” or “excellent” than “poor” or “very poor”. Customers in the Electrical field (44%) were less likely to report “good” or “excellent” than customers in the Chemical (59%)* or Mechanical (56%)** fields (*p=0.001; **p<0.05).  An illustrative sketch of this kind of significance test follows this list.
  • Consistency: Between FY16-Q1 and FY16-Q3, the percentage of customers who reported experiencing problems with the consistency of examination quality to “a large degree” increased slightly, but this change was not statistically significant. In both periods, approximately 8 in 10 customers experienced inconsistency in examination quality to some degree, whether small or large.
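
The significance notes above (e.g., p=0.001 for the Electrical vs. Chemical comparison) reflect comparisons of proportions between respondent groups.  The sketch below shows one common way such a comparison can be made, a two-proportion z-test; the respondent counts used here are hypothetical, since the survey’s sample sizes are not given on this page.

# Minimal sketch: two-sided two-proportion z-test for a difference in survey percentages.
import math

def two_proportion_z_test(x1: int, n1: int, x2: int, n2: int) -> tuple[float, float]:
    """Return (z statistic, two-sided p-value) for the difference between two proportions."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))  # 2 * P(Z > |z|)
    return z, p_value

# Hypothetical counts: 44% of 250 Electrical respondents vs. 59% of 250 Chemical respondents.
z, p = two_proportion_z_test(110, 250, 148, 250)
print(f"z = {z:.2f}, p = {p:.4f}")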

We Want Your Feedback

If you have questions or comments about Quality Metrics at the USPTO, please send an email to QualityMetrics@uspto.gov.  For general inquiries about patent quality, email PatentQuality@uspto.gov.

Return to Quality Metrics at USPTO.

Learn about USPTO efforts to continuously increase patent quality.
