We evaluated the new Beckman Coulter DxH 800 hematology analyzer (Beckman Coulter, Miami, FL) vs the Abbott Diagnostics Cell-Dyn Sapphire (Abbott Diagnostics, Santa Clara, CA) and Beckman Coulter LH 780 hematology analyzers using 430 adult specimens. The DxH 800 provided a CBC and differential that correlated well with those of the Sapphire and LH 780, with most parameters showing correlation coefficients (r) of more than 0.97. In the instrument vs 400-cell manual differential comparison, all 3 instruments showed similar and acceptable accuracy to the reference method except for nucleated RBC (NRBC) enumeration, in which the DxH 800 and Sapphire outperformed the LH 780. We also compared clinical efficiency by determining whether flagged specimens showed abnormalities on a peripheral blood smear as defined by International Council for Standardization in Haematology criteria. The efficiency, sensitivity, and specificity of the DxH 800 were 77.0%, 87.1%, and 73.0%, respectively, compared with the Sapphire at 75.8%, 93.5%, and 68.8%, respectively, and LH 780 at 66.1%, 93.5%, and 55.3%, respectively.
Automated hematology analyzers are some of the most important instruments in today’s clinical laboratory, able to perform thousands of CBCs per day in a completely automated manner. However, some specimens require operator intervention or confirmatory studies, such as a smear review and manual differential cell count, and these additional steps impact laboratory turnaround time, efficiency, and labor costs. Manufacturers design instruments intending to minimize interventions while maintaining sensitivity, and given the myriad technologies used by different manufacturers, each analyzer has particular performance characteristics that can be highly specimen-dependent. Therefore, the most informative manner to select the optimum analyzer is to perform an in-house comparison of candidates using specimens that reflect their intended use.
In this report, we describe a 3-way comparison among the newly released UniCel DxH 800 (Beckman Coulter, Miami, FL), the LH 780 (Beckman Coulter), and the Cell-Dyn Sapphire (Abbott Diagnostics, Santa Clara, CA). This study represents one of the few early reports on the DxH 800, which is a recently released, high-volume hematology analyzer intended to supplant the previous generation, Beckman Coulter LH 780.1–3 The DxH 800 represents a new design from the manufacturer with less tubing and a smaller footprint than the LH 780. The Cell-Dyn Sapphire represents a comparable high-volume analyzer from Abbott Diagnostics that has been studied extensively.4–6
By using guidelines established by the Clinical and Laboratory Standards Institute (CLSI; formerly National Committee for Clinical Laboratory Standards) for evaluation of hematology analyzers, we ran a total of 430 adult specimens on all 3 instruments, side by side and under the same laboratory conditions. We also performed a similar study on 156 pediatric specimens in Microtainer tubes (Becton Dickinson, Franklin Lakes, NJ) in parallel, the results of which are reported separately in an accompanying article.7 For both studies, we compared the numeric results from each analyzer and compared each instrument’s WBC differential and nucleated RBC (NRBC) count with a reference 400-cell manual differential. We also compared clinical efficiencies by determining whether flagged specimens showed abnormalities on a peripheral blood smear as defined by criteria established by the International Council for Standardization in Haematology (ICSH).8 Our goal was to compare the performance of the DxH 800, LH 780, and Sapphire in an unbiased, systematic manner on adult specimens from a tertiary care hospital.
We performed these studies at the core laboratory of the Stanford Medical Center, Stanford, CA, using adult specimens obtained from Stanford Hospital and Clinics, which includes a 465-bed tertiary care hospital with a large number of hematology, oncology, and bone marrow transplant patients. The nature of this patient population, in our experience, leads to a high proportion of specimens that require a peripheral blood smear review and manual differential.
Materials and Methods
The UniCel DxH 800 is a scalable, fully automated hematology analyzer system capable of analyzing up to 100 samples per hour and providing a CBC, 5-part WBC differential, NRBC count, and reticulocyte count from 165 μL of sample. The analyzer uses new volume, conductivity, and scatter parameters with controlled flow cytometric analysis to generate the WBC differential, reticulocyte count, and NRBC count. Improvements over previous Beckman Coulter analyzers (including the LH 780) include a proprietary blood detection design to minimize sample volume requirements, a solid-state diluter design that contains no pinch valves, and new modular construction with a front-loading sample track to facilitate linkage of multiple DxH 800 instruments in a linear arrangement. A single DxH 800 was placed in the Stanford Clinical Laboratory during the period from July 22, 2009, to August 7, 2009.
The LH 780 is a fully automated hematology analyzer system capable of analyzing up to 110 samples per hour and providing a CBC, 5-part WBC differential, NRBC count, and reticulocyte count from 300 μL of sample. One particular feature of the LH 780 is the available LH slidemaker/slidestainer that connects and combines with the LH 780 to offer rule-based, automated peripheral blood smear production from a single workstation. The LH 780 has been in use at the Stanford Clinical Laboratory since 2008.
The Cell-Dyn Sapphire is a fully automated hematology analyzer capable of analyzing up to 106 samples per hour to provide a CBC, WBC differential, and NRBC count from 117 μL of sample or 69 samples per hour with the addition of a reticulocyte count. The instrument uses multiangle polarized scatter separation and 3-color fluorescence technologies to provide analysis of the WBC count and differential. The Sapphire also offers unique extended features, including immunologic-based testing for platelets using antibodies against CD61 and automated counting of CD3/CD4/CD8 T-cell subsets. The Sapphire has been in use at the Stanford Clinical Laboratory since March 2007.
Each instrument was calibrated according to the manufacturer’s protocols, and all interval maintenance activities were performed according to the manufacturer’s recommendations. Only reagents produced by the respective manufacturer were used during the evaluation, and 2 levels of manufacturer’s controls were run once per 8-hour shift.
The evaluation was conducted during a 2-week period from July 27 to August 7, 2009, during which time all instruments were placed in the Stanford Clinical Laboratory, which performs testing for the Lucile Packard Children’s Hospital, Palo Alto, CA, and Stanford Hospital and Clinics (the adult hospital of Stanford University Medical Center). The evaluation followed guidelines published in CLSI document H20-A2 and by the International Council for Standardization in Haematology.8,9 All samples were tested within 4 hours of collection and run on all 3 instruments within a 2-hour window. All samples were run on the DxH 800 and LH 780 in “auto” mode using the default test type “CDR” and on the Sapphire in “open” mode by 2 licensed clinical laboratory scientists. For every sample, 3 blood smears were prepared within 2 hours of running on the first instrument, and 2 experienced licensed technologists each performed a 200-cell manual differential according to CLSI guidelines.9 The NRBC count was expressed as number per 100 WBCs.
Specimens were obtained from residual, fresh (<4 hours from collection) dipotassium EDTA–anticoagulated peripheral blood samples collected in Vacutainer tubes (Becton Dickinson) and stored at room temperature. A total of 430 adult samples were selected consecutively from the Stanford Clinical Laboratory workflow. Use of these specimens was approved by the Stanford University Panel on Medical Human Subjects (protocol ID 12532; institutional review board No. 6208 [Panel 8]).
For numeric values of WBC count, 5-part WBC differential, NRBC count, RBC count, hemoglobin concentration, mean corpuscular volume, platelet count, and mean platelet volume (MPV), results from the instruments and the 400-cell manual differential were entered into MedCalc, version 11.3.6 (MedCalc Software, Mariakerke, Belgium) for method comparison calculations and receiver operating characteristic (ROC) curve analysis. For numeric values, with the exception of NRBC, method comparison was performed using Passing-Bablok regression.10 NRBC comparison was performed using linear regression. For all comparisons, the correlation coefficient was calculated based on linear regression. For illustrative purposes, method comparison figures were produced using Microsoft Excel 2007 (Microsoft, Redmond, WA) using the same data and calculated parameters from MedCalc, and all ROC curves were produced using MedCalc.
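The Passing-Bablok method used for the numeric comparisons is a nonparametric fit whose slope is the median of all pairwise slopes and whose intercept is the median residual. As a minimal sketch only: the simplified version below ignores tie handling and the offset correction applied when pairwise slopes fall at or below −1, and the toy data are invented, not the study data or the MedCalc implementation.

```python
from statistics import median

def passing_bablok(x, y):
    """Simplified Passing-Bablok fit: slope = median of all pairwise
    slopes, intercept = median of the residuals y - slope * x.
    Illustrative only: ties and the slope-offset correction are omitted."""
    n = len(x)
    slopes = [(y[j] - y[i]) / (x[j] - x[i])
              for i in range(n) for j in range(i + 1, n)
              if x[j] != x[i]]
    slope = median(slopes)
    intercept = median(yi - slope * xi for xi, yi in zip(x, y))
    return slope, intercept

# Invented, exactly linear toy data: y = 0.9x + 2
# (a slope below 1 mimics a proportional negative bias between methods)
x = [10.0, 20.0, 30.0, 40.0, 50.0, 60.0]
y = [0.9 * v + 2.0 for v in x]
b, a = passing_bablok(x, y)
print(b, a)  # 0.9 2.0
```

Because the fit uses medians rather than least squares, it is robust to the occasional discordant sample that method-comparison data sets typically contain.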
Clinical sensitivity and specificity were determined by comparing whether flagged samples were abnormal on the reference 400-cell manual differential. An abnormal manual differential result was defined by criteria established by the ICSH with the addition that any sample with an NRBC count greater than or equal to 1.0 NRBC/100 WBCs would be considered abnormal.8 The sensitivity, specificity, positive predictive value, negative predictive value (NPV), and overall efficiency were calculated from the number of true-positives (TPs), true-negatives (TNs), false-positives (FPs), and false-negatives (FNs) as follows: Sensitivity (% True-Positives) = TP/(TP + FN) × 100; Specificity (% True-Negatives) = TN/(TN + FP) × 100; Positive Predictive Value = TP/(TP + FP) × 100; NPV = TN/(TN + FN) × 100; Efficiency = (TP + TN)/Total × 100.
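The formulas above can be checked numerically. The truth-table counts in this sketch are back-calculated for illustration from the DxH 800 figures reported in the Results (330 specimens in the efficiency study; 64 false-positives; 12 false-negatives); only the formulas themselves are taken directly from the text.

```python
def flag_metrics(tp, tn, fp, fn):
    """Flagging performance measures exactly as defined above, in percent."""
    return {
        "sensitivity": 100 * tp / (tp + fn),
        "specificity": 100 * tn / (tn + fp),
        "ppv": 100 * tp / (tp + fp),
        "npv": 100 * tn / (tn + fn),
        "efficiency": 100 * (tp + tn) / (tp + tn + fp + fn),
    }

# Illustrative counts back-calculated from the reported DxH 800 results:
m = flag_metrics(tp=81, tn=173, fp=64, fn=12)
print({k: round(v, 1) for k, v in m.items()})
# sensitivity 87.1, specificity 73.0, efficiency 77.0
```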
To evaluate the sensitivity and specificity of blast-specific flags, the same specimens and manual differential data used in the clinical efficiency study were analyzed using the following criteria: For the DxH 800 and LH 780, the instrument result was considered positive if 1 or any combination of the neutrophilic blast, lymphocytic blast, and monocytic blast flags were triggered. For the Sapphire, the instrument result was considered positive if the blast flag was triggered. These results were compared with the reference 400-cell manual differential, and any nonzero value for blast enumeration was considered a positive manual result.
In the interinstrument comparison, the DxH 800 provided a CBC that was comparable to those from the Sapphire and the LH 780 (Figure 1, Figure 2, and Table 1). Nearly all CBC parameters showed correlation coefficients (r) greater than 0.97 except for MPV. For most CBC parameters, the slope obtained from regression analysis was close to 1, and the y-intercept was close to 0. The exceptions were platelet count and MPV, for which the DxH 800 values were similar to the LH 780 values (slope = 0.959 and r = 0.998 for platelets; slope = 0.909 and r = 0.963 for MPV) but proportionately less than the Sapphire values (slope = 0.820 and r = 0.998 for platelets; slope = 0.850 and r = 0.734 for MPV). The WBC differential of the DxH 800 was also comparable to those of the Sapphire and the LH 780 (Figure 3, Figure 4, and Table 1), and the r values were greater than 0.97 for all WBC types with the exceptions of monocytes (r = 0.774 for DxH 800 vs Sapphire; r = 0.791 for DxH 800 vs LH 780) and basophils (r = 0.275 for DxH 800 vs Sapphire; r = 0.613 for DxH 800 vs LH 780).
In the instrument vs manual WBC differential study, all 3 instruments showed a high degree of correlation with the manual differential (Figure 5, Figure 6, Figure 7, and Table 2). Among the differential parameters, the neutrophil, lymphocyte, and eosinophil counts showed r values greater than 0.95 and a slope close to 1 for all instruments. The monocyte count consistently showed lower r values (DxH 800, r = 0.750; Sapphire, r = 0.671; LH 780, r = 0.729), and visual inspection of the instrument vs manual monocyte counts (Figures 5C, 6C, and 7C) shows more frequent outliers for monocytes than for other WBC types. It is interesting that outliers on one instrument were often outliers on the other 2 instruments. For example, sample 104 was read as 23.5% monocytes by the manual count vs 13.7%, 7.06%, and 3.1% by the DxH 800, Sapphire, and LH 780, respectively, and sample 337 was read as 15% by the manual vs 6.3%, 5.08%, and 6.1% by the DxH 800, Sapphire, and LH 780, respectively. Both of these specimens, as well as other outliers in the lower right quadrant, were typified by immature and often small monocytes on the peripheral blood smear that were undercounted by the instruments. Finally, the instrument basophil count showed relatively poor correlation with the manual differential (DxH 800, r = 0.581; Sapphire, r = 0.478; LH 780, r = 0.148).
In contrast with the WBC differential, the NRBC count showed much greater differences between instruments. In comparing the instrument count with the manual differential, the Sapphire exhibited the best correlation coefficient and regression parameters (slope = 0.811; y-intercept = −0.036, and r = 0.995). The Sapphire absolute NRBC count was closest to that of the reference manual method, with a slope that was closest to 1 and greater than that of the DxH 800 (slope = 0.666) and less than that of the LH 780 (slope = 1.238).
We also performed ROC curve analysis to evaluate the instrument NRBC count vs the manual method using a threshold of 1.0 NRBC/100 WBCs to define a positive NRBC result by manual differential. The sensitivity, specificity, and area under the curve (AUC) plots are shown in Figure 8. The DxH 800 had the largest AUC, at 0.907, vs the Sapphire at 0.754 and the LH 780 at 0.671 (P = .0003 for DxH 800 vs Sapphire; P < .0001 for DxH 800 vs LH 780; P = .1038 for Sapphire vs LH 780).
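The AUC values reported here come from MedCalc, but the statistic itself has a simple interpretation: it equals the Mann-Whitney probability that a randomly chosen positive specimen receives a higher instrument NRBC count than a randomly chosen negative one. The sketch below uses invented toy values, not the study data.

```python
def roc_auc(scores_pos, scores_neg):
    """AUC via the Mann-Whitney statistic: the fraction of
    positive/negative pairs in which the positive specimen scores
    higher (ties count as half a win)."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Toy instrument NRBC counts (invented) for specimens that were
# positive (>=1.0 NRBC/100 WBCs) vs negative on the manual differential:
pos = [1.2, 0.8, 2.5, 0.3]
neg = [0.0, 0.1, 0.0, 0.4, 0.2]
print(roc_auc(pos, neg))  # 0.95
```

An AUC of 1.0 would mean every positive specimen outscored every negative one; 0.5 would mean the instrument count carried no information about manual NRBC positivity.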
For the DxH 800, highest efficiency was attained at an instrument cutoff of 0.2 NRBCs/100 WBCs, at which point the sensitivity and specificity were 77.3% and 85.8%, respectively. Adjusting the cutoff to 0.5 NRBCs/100 WBCs decreases the sensitivity to 47.8% but increases the specificity to 94.5%, and further increasing the cutoff to 0.9 NRBCs/100 WBCs decreases the sensitivity to 39.1% and increases the specificity to 100%.
For the Sapphire, optimum efficiency was achieved at an instrument cutoff of 0.8 NRBCs/100 WBCs, yielding a sensitivity of 52.2% and specificity of 98.7%. Adjusting the instrument cutoff to 0 NRBCs/100 WBCs yields the same sensitivity of 52.2% and lower specificity, at 97.9%, and changing the instrument cutoff to 1.5 NRBCs/100 WBCs yields a sensitivity of 43.5% and specificity of 99.6%. For the LH 780, the optimum instrument threshold of 0 NRBCs/100 WBCs yielded a sensitivity of 34.8% and a specificity of 99.2%.
In the clinical efficiency study, we compared whether flagged and unflagged specimens were truly normal or abnormal by applying ICSH criteria to the results of the reference 400-cell manual differential.8 The results of the truth table analysis are listed in Table 3, along with the sensitivity, specificity, and efficiency for each instrument. Overall, the DxH 800 had the highest efficiency at 77.0%, compared with the Sapphire at 75.8% and the LH 780 at 66.1%. The DxH 800 attained the highest efficiency by generating the fewest false-positive flags, with 64 false-positive flags for the DxH 800 vs 74 for the Sapphire and 106 for the LH 780. For each instrument, the number of times that a specific false-positive flag was encountered is listed in Table 4, Table 5, and Table 6, along with the number of times it was encountered as the only flag or along with other flags in a given case. It is interesting that all 3 instruments showed a similar false-positive flagging pattern. For the DxH 800, the most common false-positive flags were left shift and immature granulocytes, seen in 33 and 20 of 64 total falsely flagged specimens, respectively (Table 4). The most common false-positive flags on the Sapphire were band and immature granulocytes, detected in 44 and 35 of 74 total falsely flagged specimens, respectively (Table 5), and the most common false-positive flags on the LH 780 were immature neutrophils 1 and immature neutrophils 2, encountered in 57 and 33 of 106 total falsely flagged specimens, respectively (Table 6).
Although the DxH 800 was the most specific instrument in terms of flagging, it was also the least sensitive (sensitivity = 87.1% for DxH 800, 93.5% for Sapphire, 93.5% for LH 780). A total of 12 cases with abnormal peripheral blood smears were undetected by the DxH 800, including 7 samples with 1% to 3% myelocytes, 4 samples with 1 NRBC/100 WBCs, and 1 sample with 3% atypical lymphoid cells, consistent with lymphoma cells (sample 564). By contrast, 6 cases were undetected by the Sapphire, including 3 samples with 1% to 3% myelocytes, 2 samples with 1 NRBC/100 WBCs, and 1 sample with 3% atypical lymphoid cells (sample 564), and 6 cases were undetected by the LH 780, including 3 samples with 1% to 3% myelocytes, 1 sample with 1 NRBC/100 WBCs, 1 sample with 1% prolymphocytes, and 1 sample with 3% atypical lymphoid cells (sample 564).
It is interesting that many of the false-negative samples were missed by more than 1 instrument. For example, samples 56, 69, 130, and 144 each contained 1% to 3% myelocytes by manual differential that were missed by various combinations of 2 or all 3 analyzers; samples 182 and 454 contained 1 NRBC/100 WBCs missed by 2 or all 3 analyzers; and sample 564 contained 1% atypical lymphoid cells that were missed by all 3 analyzers.
In the clinical efficiency study, none of the false-negative cases involved missed blasts: although 22 of the 330 specimens contained blasts by manual differential, each of these specimens was flagged by all 3 instruments. However, not all 22 cases with blasts were flagged with a blast-specific flag, and not all cases flagged with a blast flag showed blasts by manual differential. We evaluated the sensitivity and specificity of the blast flag for each instrument, and the results are shown in Table 7. For the DxH 800, 17 (77%) of 22 cases with blasts were flagged by a blast flag, but the remaining 5 cases were not detected by a blast flag and, instead, were detected by the left shift and immature granulocyte flags. For the Sapphire, 13 (59%) of 22 cases were detected by a blast flag, and the remaining 9 cases were detected by the band and immature granulocyte flags. For the LH 780, 9 (41%) of 22 cases were detected by a blast flag, and the remaining 13 cases were detected with the immature neutrophils flag. In this study, the majority of cases with blasts were accompanied by left-shifted granulocytes, which likely accounts for why cases missed by the blast flag were detected by flags designed to detect immature granulocytes.
Our main objectives were to evaluate the newly released Beckman Coulter UniCel DxH 800 and to assess its suitability to replace the Beckman Coulter LH 780 as the primary high-volume hematology analyzer in the Stanford Clinical Laboratory. We performed a 3-way comparison among the DxH 800, LH 780, and Cell-Dyn Sapphire according to CLSI guidelines.9 In this study, we used 430 adult specimens collected from our busy tertiary care service and performed this study in parallel with a similar study on 156 pediatric Microtainer specimens (reported separately in the accompanying article7). We compared the numeric results from the DxH 800, LH 780, and Sapphire and each instrument’s differential count vs a reference 400-cell manual differential. We also compared the clinical efficiency by determining whether flagged specimens were truly normal or abnormal by using ICSH criteria applied to the manual differential results.8
In the interinstrument comparison, the numeric CBC results from the DxH 800 showed results comparable to those of the Sapphire and LH 780. With the exception of MPV, the r values were greater than 0.97 and comparable to those in a multi-instrument study by Muller et al6 and a recent evaluation of the DxH 800 vs the LH 780.1 In addition to the r value, which indicates how well the data conform to the best-fit line, additional statistical parameters include the slope and y-intercept, which indicate the degree of proportional and absolute bias, respectively. All parameters for the CBC exhibited slopes close to 1 and y-intercepts close to 0, with the exceptions of the MPV and platelet count, for which the DxH 800 values were similar to those of the LH 780 but proportionately less than those of the Sapphire (slope = 0.820 for platelets; slope = 0.850 for MPV).
The cause for this difference is likely related to method because the DxH 800 and LH 780 use impedance technology to enumerate and measure the size of platelets, whereas the Sapphire uses optical and impedance methods. For each instrument, curve fitting is applied to a histogram of platelet volume vs count to separate platelets from interfering particles, such as microcytic RBCs, and algorithms subsequently derive the platelet count and MPV. Although higher concordance between the 2 Beckman Coulter instruments is expected, the degree of difference between the DxH 800 and Sapphire platelet count is surprising, given the clinical significance of this parameter. This difference was also seen in our accompanying pediatric study, in which comparison between the DxH 800 and Sapphire yielded a slope of 0.857. Although there are no published studies directly comparing the platelet count of the Sapphire with that of the DxH 800 or LH 780, Muller et al6 showed near equivalence in comparing the Sapphire with the Beckman Coulter GenS (slope = 1.00, intercept = 3.93, r2 = 0.99), and Segal et al11 observed minimal differences between the optical count of the Abbott Diagnostics Cell-Dyn 4000 and the Beckman Coulter LH 750. However, the study by Segal et al11 did not directly compare the 2 instruments using the same samples, but rather compared one instrument with the flow cytometric international reference method (IRM)12 at one site and the other instrument with the IRM at an alternative site. Nevertheless, the cause of the proportional difference between the Beckman Coulter instruments and Sapphire that is observed in our adult and pediatric studies will require further investigation and comparison with a reference method such as the IRM.
The MPV is a parameter not reported at our institution, given the known increase in platelet volume over time for EDTA-anticoagulated specimens.13–15 Nevertheless, we were intrigued by the lower MPV reported by the Beckman Coulter instruments vs the Sapphire. It is interesting that visual inspection of the DxH 800 vs LH 780 plot shows a linear relationship (Figure 2F), whereas the DxH 800 vs Sapphire plot (Figure 1F) shows a bimodal distribution with the majority of data points close to the regression line and a set of outliers forming a second line with a lower slope. Further investigation revealed that most of the outliers had platelet counts less than 100 × 10³/μL (100 × 10⁹/L). We hypothesize that differences between the Beckman Coulter and Abbott MPV algorithms are more pronounced when the platelet count is low or that platelets produced in the setting of thrombocytopenia have physical properties that cause larger differences between impedance and optical MPV detection. We were surprised that the bimodal distribution of the DxH 800 vs the Sapphire MPV was not readily apparent in our accompanying pediatric study.7
The DxH 800 WBC differential also showed good correlation with those of the Sapphire and the LH 780. The neutrophil, lymphocyte, and eosinophil counts displayed the highest degrees of correlation, with r values of 0.97 or greater. In comparison, the monocyte count showed lower but acceptable correlation (DxH 800 vs Sapphire, r = 0.774; DxH 800 vs LH 780, r = 0.791), whereas the basophil count showed very poor correlation (DxH 800 vs Sapphire, r = 0.275; DxH 800 vs LH 780, r = 0.613). This pattern—in which the correlation for neutrophils and lymphocytes is high, that for monocytes is moderate, and that for basophils is low—is commonly observed in interinstrument comparisons.6,16–18 In this study, minor outliers in the monocyte data (Figures 3C and 4C) contributed to the lower correlation coefficients. For basophils, others have shown poor correlation as a result of statistical uncertainty in counting rare events,19,20 and in this study, all samples contained fewer than 5% basophils.
In terms of WBC differential accuracy, all 3 instruments showed similar comparability to the reference 400-cell manual differential. The vast majority of counts were within the 95% statistical confidence intervals for a 400-cell manual count, and the slopes and intercepts were well within the 95% confidence intervals of one another.
In contrast with the CBC and WBC differential, the NRBC data showed very different performance characteristics among the 3 instruments. In terms of NRBC enumeration, the Sapphire showed the best linear regression parameters (DxH 800, slope = 0.666, y-intercept = 0.037, r = 0.993; Sapphire, slope = 0.811, y-intercept = −0.036, r = 0.995; LH 780, slope = 1.238, intercept = −0.209, r = 0.989). However, the majority of NRBC counts were 0, and few samples had detectable NRBCs. For example, 222 of the 262 specimens had no NRBCs by manual differential, 17 specimens had 0.5 NRBCs/100 WBCs, 22 specimens had between 0.5 and 12.5 NRBCs/100 WBCs, and specimen 431 had 85.5 NRBCs/100 WBCs (not shown in Figures 3F, 4F, 5F, 6F, and 7F owing to scale but included in regression calculations). This distribution reflects what is commonly seen for NRBC enumeration for hospital laboratories and is not well suited for regression analysis. In fact, our statistical software was not able to apply Passing-Bablok regression to the NRBC data, which is our preferred method, and instead we used linear regression for these data. Nevertheless, although the regression parameters are not robust in this setting, the instrument vs manual scatter plots (Figures 5F, 6F, and 7F) clearly indicate that the Sapphire and DxH 800 show a more linear, predictable relationship to the manual count than the LH 780 and that the Sapphire shows the closest absolute NRBC count to the manual differential.
Given the issues with linear regression, we also compared each instrument’s NRBC count with the manual differential using ROC curve analysis, which measures the instrument’s ability to detect the presence of NRBCs (defined as ≥1 NRBCs/100 WBCs on the manual differential) rather than enumerating an absolute count. Hedley et al1 used ROC curve analysis to compare the DxH 800 and LH 780 NRBC count with a reference flow cytometric method, and others have compared instrument NRBC detection with manual differential by truth table analysis,5,6,21 which is similar in principle to ROC curve analysis. For evaluating NRBC detection, ROC curve analysis is applicable to laboratory workflows such as ours, in which the presence of instrument-detected NRBCs elicits a manual smear review, and a manual differential is performed if significant numbers of NRBCs are seen.
In our studies, ROC curve analysis demonstrated that the DxH 800 had an AUC of 0.907, vs the Sapphire at 0.754 and the LH 780 at 0.671 (P = .0003 for DxH 800 vs Sapphire; P < .0001 for DxH 800 vs LH 780; P = .1038 for Sapphire vs LH 780). The DxH 800 offered the most sensitive NRBC detection, albeit at a low instrument cutoff of 0.2 NRBCs/100 WBCs, which reflects the fact that the DxH 800 NRBC count is proportionately lower than the manual method (slope = 0.666). Changing the instrument cutoff resulted in large changes in the sensitivity and specificity, which parallels the ability of the DxH 800 to discriminate differences in NRBC count at the low range (near 1.0 NRBC/100 WBCs), which is visibly demonstrable on the DxH 800 vs manual NRBC comparison plot (Figure 5F). By contrast, the Sapphire achieved optimal efficiency at an instrument cutoff of 0.8 NRBCs/100 WBCs, which reflects the fact that the Sapphire NRBC count is proportionately similar to the manual method (slope = 0.811). At its optimal efficiency, the Sapphire offered less sensitive but more specific NRBC detection than the DxH 800. Overall, the Sapphire was less sensitive in detecting small numbers of NRBCs but superior in terms of providing an absolute count that matched that of a manual differential.
Determining whether the DxH 800 or Sapphire provides the more desirable NRBC performance characteristics will depend on the needs and workflow of the laboratory. For laboratories that primarily report the manual rather than instrument NRBC count, the DxH 800 offers higher sensitivity in identifying specimens requiring a manual differential. On the other hand, for laboratories that report an instrument NRBC count that must be equivalent to a manual count, the Sapphire offers better accuracy.
The clinical efficiency study also highlighted differences in performance among the 3 analyzers. The DxH 800 and Sapphire offered similar flagging efficiencies at 77.0% and 75.8%, respectively, exceeding that of the LH 780 at 66.1%. In the most comparable multianalyzer study, by Kang et al,5 the Sapphire was similar in efficiency to the LH 750 (Sapphire, 80.9%; LH 750, 79.5%), which is the predecessor to the LH 780, but these figures were obtained by using criteria different from the ICSH criteria. In our study, although the DxH 800 and Sapphire achieved similar overall efficiency, the Sapphire had higher sensitivity (DxH 800, 87.1%; Sapphire, 93.5%), whereas the DxH 800 had higher specificity (DxH 800, 73.0%; Sapphire, 68.8%). For both instruments, the most common false-positives were flags associated with immature granulocytes or platelet enumeration, and the most common false-negatives were cases in which small numbers of myelocytes or NRBCs were missed.
Although low numbers of myelocytes were missed by all 3 instruments, none of the instruments missed flagging a case in which blasts were enumerated by the manual differential. However, cases with blasts were not always flagged with a blast-specific flag and, instead, were identified by other flags. When specifically evaluating the ability of the blast flag to identify cases with blasts, the DxH 800 was the most sensitive, identifying 77.3% of cases with blasts, vs the Sapphire at 59.1% and the LH 780 at 40.9%. In this study, the NPVs of the blast flag were 98.2%, 97.0%, and 95.8% for the DxH 800, Sapphire, and LH 780, respectively. However, NPV is largely a function of the prevalence of the finding. Because only 22 (6.7%) of the 330 cases contained blasts by manual differential, the NPV, which equals TN/(FN + TN), becomes dominated by the large number of true-negatives (DxH 800 = 273; Sapphire = 289; LH 780 = 295), and there is little dependence on the low number of false-negatives (DxH 800 = 5; Sapphire = 9; LH 780 = 13). The calculated sensitivity of the blast flag is hardly reassuring, given that the DxH 800, Sapphire, and LH 780 missed flagging 5, 9, and 13 of the 22 cases with blasts, respectively. For this reason, nearly all laboratory workflows, including ours, require an investigation for blasts with a slide review or manual differential for any WBC-related flag and not just a blast-specific flag.
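The prevalence dependence of NPV described above is easy to verify from the reported counts; the final line also shows the hypothetical floor: a flag that fired on none of the 330 cases would still post an NPV above 93%.

```python
# NPV = TN / (TN + FN), using the blast-flag counts reported above
for name, tn, fn in [("DxH 800", 273, 5), ("Sapphire", 289, 9), ("LH 780", 295, 13)]:
    print(name, round(100 * tn / (tn + fn), 1))  # 98.2, 97.0, 95.8

# Hypothetical flag that never fires: all 308 blast-free cases become
# true-negatives and all 22 blast cases become false-negatives
print(round(100 * 308 / 330, 1))  # 93.3
```

This is why the modest blast-flag sensitivities, not the uniformly high NPVs, are the figures that should drive workflow decisions.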
Deciding which analyzer provides the more optimal flagging characteristics will depend on the laboratory workflow and specimen composition. For example, at Stanford University Medical Center, we receive a high proportion of abnormal specimens, and high specificity is essential to minimize the review rate to a manageable level. In this study, the DxH 800 offered the highest specificity, and the decreased sensitivity in detecting small numbers of myelocytes or NRBCs represents an acceptable compromise. In our tertiary care setting, which includes a high number of oncology, hematology, and bone marrow transplant patients, a small number of myelocytes or NRBCs in the peripheral blood is not a specific finding because these cells are commonly seen during or following therapy. On the other hand, for a laboratory that primarily tests healthy outpatients, the presence of immature granulocytes or NRBCs may reflect an unexpected physiologic process that warrants further investigation.
We demonstrated that the newly released Beckman Coulter DxH 800 provides numeric results that are comparable to those of the Abbott Diagnostics Sapphire and the Beckman Coulter LH 780 and flagging efficiency that matches the Sapphire and exceeds that of the LH 780. Although the DxH 800 and Sapphire offer similar efficiency, flagging on the DxH 800 is more specific, whereas that of the Sapphire is more sensitive. Deciding on the optimal analyzer will depend on the setting and type of specimens encountered, and we and others have demonstrated that the most effective approach is to evaluate candidate instruments using specimens that reflect their intended use.22,23 For example, in our accompanying pediatric study,7 we observed much greater differences in flagging efficiency between the DxH 800 and Sapphire because of a trend that was seen in pediatric but not adult specimens. Although comparative evaluations are costly in terms of time and labor, the resulting benefits can be realized over the entire instrument lifespan.
We thank Dawn Robertson, Beckman Coulter, for technical assistance.