By Joshua Greenlee, MBA, HT/HTL(ASCP)CM, Product Manager and Workflow & Productivity Optimization Consultant, Sakura Finetek USA (www.sakuraus.com)
An efficient tissue processing protocol is crucial for multiple reasons. Because tissue processing occupies an upstream position in the histopathology workflow, inefficiencies there have negative ramifications for every downstream step, from embedding to staining, pathology review, and imaging. Processing inefficiencies may compromise the diagnosis as well as laboratories’ turnaround times (TAT).
Proper tissue processing quality is imperative for an accurate diagnosis. Without proper dehydration, clearing, and infiltration, tissue morphology or antigenicity may be negatively affected, sectioning may be difficult at microtomy, and the section may not stain properly for H&E, special stains, or advanced methods such as IHC and molecular testing. At best, improper processing requires additional time, cost, or rework; at worst, it may result in an incomplete, inaccurate, or impossible diagnosis.
I have worked in the laboratory for several years, and I have also been privileged to visit and talk with histology professionals all over the world. Most histology laboratories face the same challenges: workloads are growing, turnaround times must decrease, and finding qualified staff has become a burden. On top of that, conventional tissue processing is often the most time-consuming part of the entire histopathology workflow.
Laboratories are under increasing pressure to reduce TAT, and may feel compelled to seek a “quick fix” for processing-related TAT issues by acquiring new equipment that promises rapid processing capabilities. Indeed, some processors do provide true, rapid, and continuous tissue processing. Unfortunately, though, many processors on the market that claim rapid tissue processing are simply another form of conventional processor, using the same four traditional reagents in traditional multi-step setups that may still take significant time to process tissue effectively.
Laboratories do not always have the resources to purchase new equipment, either, which is why it is not uncommon to find tissue processors 20 years old or older still in daily service. For this reason, it is important to review processing protocols established long ago, to prevent inefficiencies that result in longer protocol times or rework that delays the laboratory’s TAT.
Despite the need for quality and efficiency, laboratories may settle for less-than-ideal tissue processing quality and productivity simply because they have grown to accept it as “normal” over time. “This is how we have always done it” is a phrase I often hear. It is not uncommon for laboratories to continue using protocols that have not been reviewed or updated over long periods, in some cases for decades, without considering the continuous progress made by science and industry. Conventional processing protocols involve a host of variables, and deciding where and how to begin making adjustments is difficult when this knowledge and guidance are not available. Fear of making changes that could lead to worse processing, or even non-diagnosable tissue, may rightly prevent laboratories from acting. These perceived risks, coupled with the time and energy required to validate a possibly suspect protocol, may prove too intimidating.
Understanding these challenges, I felt it was increasingly important to find a way to help laboratories review and update their processing protocols. With this goal in mind, I started by collecting conventional tissue processing protocols from laboratories of many different types and sizes, both those that consistently generated high-quality processed tissue and those that did not. These included hospital, reference, university, research, and specialty laboratories, with volumes ranging from fewer than 100 to more than 2,000 cassettes per day. In addition, published protocols from various sources were collected and analyzed, then incorporated into a new, detail-rich library of 276 processing protocols that continues to grow monthly. Thank you to all the laboratories who have provided their protocols along with detailed discussions about their processes.
It was essential to gather fundamental information for each protocol: the type and thickness of the tissues processed, the fixation regimen prior to processing, and, equally important, the quality of the results.
All collected protocols were grouped into categories based on their characteristics and classified as Fast (STAT), Biopsy, Routine, or Fat. Within these categories, the shortest, longest, and average protocol times, not including fixation, were determined, with the shortest being 0 hr 50 min and the longest 12 hr 30 min (Figure A).
The protocols were then evaluated for commonalities and capabilities in terms of quality and speed. Six categories were established based on tissue thickness and type, and each was associated with a general range of protocol times (Figure B).
It was discovered that some laboratories experienced significantly different levels of quality than others, despite using protocols of similar length for tissues with similar characteristics. The dehydration, clearing, and infiltration times of the protocols were then compared against the quality comments to develop a set of general ratios of total protocol time (Figure C).
The evaluation of the full protocol library enabled the establishment of the Greenlee Ratio to Estimate Average Time method, better known as the GREAT method, which determines an initial overall protocol length based on tissue type and thickness, along with the ratios of time to spend in dehydration, clearing, and infiltration. The method was tested in several laboratories that volunteered for protocol reviews and external assessment, and in most cases it yielded more efficient protocols with better quality and even faster processing.
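To make the ratio idea concrete, the sketch below shows how a GREAT-style calculation could be structured: take an estimated total protocol time for a tissue category and split it into dehydration, clearing, and infiltration phases using fixed ratios. The specific ratio values and the function name here are hypothetical placeholders for illustration only; the actual GREAT time ranges and ratios appear in Figures B and C of the full article.

```python
# Illustrative GREAT-style split of a total protocol time into phases.
# NOTE: these ratios are hypothetical placeholders, not the published
# GREAT values, which are given in Figures B and C of the article.
HYPOTHETICAL_RATIOS = {"dehydration": 0.50, "clearing": 0.25, "infiltration": 0.25}

def split_protocol(total_minutes, ratios=HYPOTHETICAL_RATIOS):
    """Allocate a total protocol time (minutes) across processing phases.

    Returns a dict of phase -> minutes whose values sum exactly to
    total_minutes (rounding differences are absorbed by infiltration).
    """
    phases = {phase: round(total_minutes * r) for phase, r in ratios.items()}
    # Absorb any rounding drift so the phase times sum to the total.
    phases["infiltration"] += total_minutes - sum(phases.values())
    return phases

# Example: a 3-hour (180 min) routine protocol.
print(split_protocol(180))
```

Keeping the ratios in a plain dictionary makes it easy to maintain one entry per tissue category and to revalidate a single category at a time when a protocol is updated.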
The GREAT method has since proved to be a useful tool for guiding laboratories in making protocol adjustments. Its simple, low-risk set of guidelines empowers laboratories to review and update their protocols to enhance processing efficiency, increase quality, and reduce turnaround time, with confidence that doing so will bring long-term value well beyond the temporary work of revalidation.
Interested in learning more about how to update your tissue processing protocols? Joshua will be presenting the first webinar of the 2021 Laboratory Webinar Series, Understanding and Updating Tissue Processing Protocols Using the GREAT Method, on January 27th.
Unlocking the mystery of an effective tissue processing protocol: Using the Greenlee Ratio to Estimate Average Time (GREAT) method to determine estimated protocol length and reagent time ratio (https://www.sakuraus.com/getattachment/bd5b8dbd-66fd-41c5-8ec8-cdc7f0bc4a77/Unlocking-the-mystery-of-an-effective-tissue-proce)