K232879 · Ventana Medical Systems, Inc. · PSY · Jun 14, 2024 · Pathology
Device Facts
Record ID
K232879
Device Name
Roche Digital Pathology Dx (VENTANA DP 200)
Applicant
Ventana Medical Systems, Inc.
Product Code
PSY · Pathology
Decision Date
Jun 14, 2024
Decision
SESE (Substantially Equivalent)
Submission Type
Traditional
Regulation
21 CFR 864.3700
Device Class
Class II
Indications for Use
Roche Digital Pathology Dx (VENTANA DP 200) is an automated digital slide creation, viewing and management system. Roche Digital Pathology Dx (VENTANA DP 200) is intended for in vitro diagnostic use as an aid to the pathologist to review and interpret digital images of scanned pathology slides prepared from formalin-fixed paraffin-embedded (FFPE) tissue. Roche Digital Pathology Dx (VENTANA DP 200) is not intended for use with frozen section, cytology, or non-FFPE hematopathology specimens. Roche Digital Pathology Dx (VENTANA DP 200) is for creation and viewing of digital images of scanned glass slides that would otherwise be appropriate for manual visualization by conventional light microscopy. Roche Digital Pathology Dx (VENTANA DP 200) is composed of VENTANA DP 200 slide scanner, Roche uPath enterprise software, and the ASUS PA248QV display. It is the responsibility of a qualified pathologist to employ appropriate procedures and safeguards to assure the validity of the interpretation of images obtained using Roche Digital Pathology Dx (VENTANA DP 200).
Device Story
System creates, views, and manages digital pathology slides; aids pathologists in primary diagnosis. Input: FFPE tissue slides. Scanner captures high-resolution bright-field images using 20x objective (20x/40x scan capability); features auto-tissue detection, barcode reading, and selectable Z-axis focus layers. Images stored in proprietary BIF format. Software (uPath) provides web-based interface for image management, review, and reporting on standard workstations. Pathologists use system to perform digital reads equivalent to conventional light microscopy. Benefits: enables remote/digital workflow, maintains diagnostic accuracy comparable to manual microscopy, and supports efficient case management in clinical pathology labs.
Clinical Evidence
Retrospective multi-center study (4 sites, 2047 cases, 3259 slides) compared digital WSI review (DR) to manual light microscopy (MR). 16 pathologists reviewed cases in both modalities. Primary endpoint: non-inferiority of DR to MR based on major discordance rates relative to original sign-out diagnosis. Observed agreement: 92.00% (DR) vs 92.61% (MR); difference -0.61% (95% CI: -1.20%, 0.00%). GLIMMIX model confirmed non-inferiority (difference -0.62%, 95% CI: -1.60%, 0.20%), meeting the -4% margin. Precision studies (between-system, between-day, between-reader, within-reader) showed OPA > 85% for all endpoints.
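The non-inferiority comparison above can be sketched numerically. This is a minimal illustration with hypothetical read counts chosen to reproduce the reported point estimates (92.00% DR vs 92.61% MR); the actual analysis used a GLIMMIX model on clustered per-case data, so the confidence interval below will not match the study's.

```python
# Illustrative sketch of the non-inferiority check (hypothetical counts;
# the submission used a GLIMMIX model, not this simple Wald comparison).
import math

def agreement_diff_ci(agree_dr, n_dr, agree_mr, n_mr, z=1.96):
    """Wald 95% CI for the difference in agreement proportions (DR - MR)."""
    p1, p2 = agree_dr / n_dr, agree_mr / n_mr
    se = math.sqrt(p1 * (1 - p1) / n_dr + p2 * (1 - p2) / n_mr)
    d = p1 - p2
    return d, (d - z * se, d + z * se)

# Hypothetical read counts chosen to reproduce the reported point estimates.
diff, (lo, hi) = agreement_diff_ci(9200, 10000, 9261, 10000)
non_inferior = lo > -0.04   # -4% non-inferiority margin
print(f"difference = {diff:+.2%}, 95% CI ({lo:+.2%}, {hi:+.2%}), "
      f"non-inferior: {non_inferior}")
```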
Indicated for use by pathologists as an aid in the review and interpretation of digital images of FFPE tissue slides. Not for use with frozen section, cytology, or non-FFPE hematopathology specimens.
Regulatory Classification
Identification
The whole slide imaging system is an automated digital slide creation, viewing, and management system intended as an aid to the pathologist to review and interpret digital images of surgical pathology slides. The system generates digital images that would otherwise be appropriate for manual visualization by conventional light microscopy.
Special Controls
*Classification.* Class II (special controls). The special controls for this device are: (1) Premarket notification submissions must include the following information:
(i) The indications for use must specify the tissue specimen that is intended to be used with the whole slide imaging system and the components of the system.
(ii) A detailed description of the device and bench testing results at the component level, including for the following, as appropriate:
(A) Slide feeder;
(B) Light source;
(C) Imaging optics;
(D) Mechanical scanner movement;
(E) Digital imaging sensor;
(F) Image processing software;
(G) Image composition techniques;
(H) Image file formats;
(I) Image review manipulation software;
(J) Computer environment; and
(K) Display system.
(iii) Detailed bench testing and results at the system level, including for the following, as appropriate:
(A) Color reproducibility;
(B) Spatial resolution;
(C) Focusing test;
(D) Whole slide tissue coverage;
(E) Stitching error; and
(F) Turnaround time.
(iv) Detailed information demonstrating the performance characteristics of the device, including, as appropriate:
(A) Precision to evaluate intra-system and inter-system precision using a comprehensive set of clinical specimens with defined, clinically relevant histologic features from various organ systems and diseases. Multiple whole slide imaging systems, multiple sites, and multiple readers must be included.
(B) Reproducibility data to evaluate inter-site variability using a comprehensive set of clinical specimens with defined, clinically relevant histologic features from various organ systems and diseases. Multiple whole slide imaging systems, multiple sites, and multiple readers must be included.
(C) Data from a clinical study to demonstrate that viewing, reviewing, and diagnosing digital images of surgical pathology slides prepared from tissue slides using the whole slide imaging system is non-inferior to using an optical microscope. The study should evaluate the difference in major discordance rates between manual digital (MD) and manual optical (MO) modalities when compared to the reference (*e.g.,* main sign-out diagnosis).
(D) A detailed human factors engineering process must be used to evaluate the whole slide imaging system user interface(s).
(2) Labeling compliant with 21 CFR 809.10(b) must include the following:
(i) The intended use statement must include the information described in paragraph (b)(1)(i) of this section, as applicable, and a statement that reads, “It is the responsibility of a qualified pathologist to employ appropriate procedures and safeguards to assure the validity of the interpretation of images obtained using this device.”
(ii) A description of the technical studies and the summary of results, including those that relate to paragraphs (b)(1)(ii) and (iii) of this section, as appropriate.
(iii) A description of the performance studies and the summary of results, including those that relate to paragraph (b)(1)(iv) of this section, as appropriate.
(iv) A limiting statement that specifies that pathologists should exercise professional judgment in each clinical situation and examine the glass slides by conventional microscopy if there is doubt about the ability to accurately render an interpretation using this device alone.
U.S. FOOD & DRUG ADMINISTRATION (FDA)
# 510(k) SUBSTANTIAL EQUIVALENCE DETERMINATION DECISION SUMMARY
## I Background Information:
A 510(k) Number
K232879
B Applicant
Ventana Medical Systems, Inc.
C Proprietary and Established Names
Roche Digital Pathology Dx (VENTANA DP 200)
D Regulatory Information
| Product Code(s) | Classification | Regulation Section | Panel |
| --- | --- | --- | --- |
| PSY | Class II | 21 CFR 864.3700 | 88-Pathology |
## II Submission/Device Overview:
A Purpose for Submission:
New device
B Type of Test:
Digital pathology whole slide imaging
## III Intended Use/Indications for Use:
A Intended Use(s):
Roche Digital Pathology Dx (VENTANA DP 200) is an automated digital slide creation, viewing and management system. Roche Digital Pathology Dx (VENTANA DP 200) is intended for in vitro diagnostic use as an aid to the pathologist to review and interpret digital images of scanned pathology slides prepared from formalin-fixed paraffin-embedded (FFPE) tissue. Roche
Digital Pathology Dx (VENTANA DP 200) is not intended for use with frozen section, cytology, or non-FFPE hematopathology specimens. Roche Digital Pathology Dx (VENTANA DP 200) is for creation and viewing of digital images of scanned glass slides that would otherwise be appropriate for manual visualization by conventional light microscopy.
Roche Digital Pathology Dx (VENTANA DP 200) is composed of VENTANA DP 200 slide scanner, Roche uPath enterprise software, and ASUS PA248QV display. It is the responsibility of a qualified pathologist to employ appropriate procedures and safeguards to assure the validity of the interpretation of images obtained using Roche Digital Pathology Dx (VENTANA DP 200).
## B Indication(s) for Use:
Same as Intended Use.
## C Special Conditions for Use Statement(s):
Rx - For Prescription Use Only
For in vitro diagnostic (IVD) use only
## IV Device/System Characteristics:
### A Device Description:
The Roche Digital Pathology Dx (VENTANA DP 200) is a Whole Slide Imaging (WSI) system that includes the following components:
- VENTANA DP 200 slide scanner
- Roche uPath enterprise software
- Display (ASUS PA248QV)
The VENTANA DP 200 slide scanner is a bright-field digital pathology scanner that accommodates loading and scanning of up to 6 standard-sized (1 x 3 inch) glass slides. The scanner uses a high-resolution 20x objective and can scan at both the 20x and 40x magnification levels. The scanner features automatic detection of the tissue specimen on the slide, automated 1D and 2D barcode reading, and selectable volume scanning. It also integrates color profiling to ensure that images produced from scanned slides are generated with a color-managed International Color Consortium (ICC) profile.

After initial quality control of the slides, a batch of up to 6 slides is loaded into the slide tray of the scanner and scanning is initiated. For each slide, the scanner generates a macro image that includes the barcode label and a low-magnification image of tissue on the slide, and locates the stained tissue within the slide scan area. Scanning is performed at high resolution (0.465 µm/pixel for 20x and 0.25 µm/pixel for 40x). The resulting image files are compressed, saved in the BIF file format, and transferred into either the uPath IMS (Image Management System) server or the uPath server by the iScan API (a part of the DP 200 Scan Application software).
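A back-of-envelope sketch (not from the submission) illustrates why the image files are compressed: at the stated pixel pitches, even a modest tissue area produces multi-gigabyte raw images. The 15 mm x 15 mm scan area below is a hypothetical example.

```python
# Pixel dimensions and raw RGB size of a scan at the stated pixel pitches,
# for a hypothetical 15 mm x 15 mm tissue area (illustrative only).
def raw_scan_size(width_mm, height_mm, um_per_pixel, bytes_per_pixel=3):
    px_w = round(width_mm * 1000 / um_per_pixel)   # mm -> um -> pixels
    px_h = round(height_mm * 1000 / um_per_pixel)
    return px_w, px_h, px_w * px_h * bytes_per_pixel / 1e9  # size in GB

for mag, pitch in (("20x", 0.465), ("40x", 0.25)):
    w, h, gb = raw_scan_size(15, 15, pitch)
    print(f"{mag}: {w} x {h} px, ~{gb:.1f} GB uncompressed")
```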
The Roche uPath enterprise software (uPath) component of the Roche Digital Pathology system,
which contains the image viewer, is a web-based image management and workflow software application. The uPath software interface enables the user to review the digital image on the display screen and report results.
The ASUS PA248QV display allows the slide images to be viewed. The ASUS PA248QV features an IPS LCD color display with a display area of 518.4 mm × 324.0 mm, a resolution of 1,920 × 1,200 pixels and an aspect ratio of 16:10, illuminated by an edge LED backlight.
## Instrument Description Information:
1. Instrument Name:
Roche Digital Pathology Dx (VENTANA DP 200)
2. Specimen Identification:
Glass slides and scanned images are identified based on previously assigned specimen identifiers such as patient identifiers, barcodes, etc. The specimens are digital images of surgical pathology slides prepared from FFPE tissue.
3. Specimen Sampling and Handling:
Specimen sampling and handling are performed upstream of, and independent of, the use of the subject device. Specimens include surgical pathology specimens, such as biopsy or resection specimens, which are processed using standard histology techniques. The FFPE tissue sections are stained using the hematoxylin and eosin (H&E) staining procedure. Digital images are then obtained from these glass slides using the VENTANA DP 200 scanner. Glass slides are identified by automatic reading of the 1D and 2D barcodes on the slide label.
4. Calibration:
The VENTANA DP 200 scanner is calibrated and verified at the factory before shipment and confirmed by authorized Roche technicians as part of the installation process. Additionally, the scanner requires regular (monthly) diagnostic tests, which run automatically after user initiation, and performs separate self-calibration activities upon each power-up and hourly during use. The scanner also performs automatic calibration and verification every time a slide is scanned. The display provided with Roche Digital Pathology Dx (VENTANA DP 200) has been calibrated at the factory; the user should not attempt to change the monitor settings, as doing so may interfere with the monitor calibration. The uPath software application does not require calibration.
5. Quality Control:
Quality control (QC) activities are performed by the user per laboratory standards and professional guidelines (e.g., staining, cover-slipping, barcode placement) prior to loading the slides into the VENTANA DP 200. After completing a scan, the lab technician checks image data and image quality as per the instructions for use. Before diagnostic review, the pathologist verifies the quality of the WSI images.
V Substantial Equivalence Information:
A Predicate Device Name(s):
Aperio AT2 DX System
B Predicate 510(k) Number(s):
K190332
C Comparison with Predicate(s):
| Device & Predicate Device(s): | K232879 Roche Digital Pathology Dx (VENTANA DP 200) | K190332 Aperio AT2 DX System |
| --- | --- | --- |
| General Device Characteristics: Similarities | | |
| Intended Use | Roche Digital Pathology Dx (VENTANA DP 200) is an automated digital slide creation, viewing and management system. Roche Digital Pathology Dx (VENTANA DP 200) is intended for in vitro diagnostic use as an aid to the pathologist to review and interpret digital images of scanned pathology slides prepared from formalin-fixed paraffin-embedded (FFPE) tissue. Roche Digital Pathology Dx (VENTANA DP 200) is not intended for use with frozen section, cytology, or non-FFPE hematopathology specimens. Roche Digital Pathology Dx (VENTANA DP 200) is for creation and viewing of digital images of scanned glass slides that would otherwise be appropriate for manual visualization by conventional light microscopy. Roche Digital Pathology Dx (VENTANA DP 200) is composed of VENTANA DP 200 slide scanner, Roche uPath enterprise software, and ASUS PA248QV display. It is the responsibility of a qualified pathologist to employ appropriate procedures and safeguards to assure the validity of the interpretation of images obtained using Roche Digital Pathology Dx (VENTANA DP 200). | The Aperio AT2 DX System is an automated digital slide creation and viewing system. The Aperio AT2 DX System is intended for in vitro diagnostic use as an aid to the pathologist to review and interpret digital images of surgical pathology slides prepared from formalin-fixed paraffin embedded (FFPE) tissue. The Aperio AT2 DX System is not intended for use with frozen section, cytology, or non-FFPE hematopathology specimens. The Aperio AT2 DX System is composed of the Aperio AT2 DX scanner, the ImageScope DX review application and Display. The Aperio AT2 DX System is for creation and viewing of digital images of scanned glass slides that would otherwise be appropriate for manual visualization by conventional light microscopy. It is the responsibility of a qualified pathologist to employ appropriate procedures and safeguards to assure the validity of the interpretation of images obtained using the Aperio AT2 DX System. |
| Principle of Operation | After conducting Quality Control (QC) on the glass slides per laboratory standards (e.g., staining, coverslipping, barcode placement, etc.), the technician loads the slides into the VENTANA DP 200 slide scanner. The scanner scans the slides and generates a whole slide image for each slide. The technician performs QC on scanned WSI images by checking image data and image quality. When QC fails, the slide will be re-scanned. The acquired WSI images are stored in an end user provided image storage attached to the local network. During review, the pathologist opens WSI images from the image storage in Roche uPath enterprise software, performs further QC to ensure image quality, and reads the WSI images of the slides to make a diagnosis. | Same |
| Device Components | WSI scanner (VENTANA DP 200 slide scanner), Image Management System (Roche uPath enterprise software), and color monitor display | WSI scanner (Aperio AT2 DX scanner), Image Management System (ImageScope DX application) and color monitor display |
| Scanning Magnification | 40x, 20x | Same |
| General Device Characteristic: Differences | | |
| Whole Slide Imaging Scanner/slide capacity | VENTANA DP 200/ 6 slides | Aperio AT2 DX/ 400 slides |
| Scan Output Image Format | BIF | SVS |
| Review software | Roche uPath | ImageScope DX |
| Compatible Display (Monitor) | ASUS PA248QV | Dell MR2416 |
VI Standards/Guidance Documents Referenced:
1. Technical Performance Assessment of Digital Pathology Whole Slide Imaging Devices. Guidance for Industry and Food and Drug Administration Staff, April 20, 2016.
2. Applying Human Factors and Usability Engineering to Medical Devices: Guidance for Industry and Food and Drug Administration Staff. February 3, 2016.
3. Guidance for the Content of Premarket Submissions for Software Contained in Medical Devices. Guidance for Industry and Food and Drug Administration Staff, May 2005.
4. IEC/EN 61010-1:2010/AMD1: 2016, Safety requirements for electrical equipment for measurement, control, and laboratory use – Part 1: General requirements.
5. IEC/EN 61010-2-101: 2018, Safety requirements for electrical equipment for measurement, control, and laboratory use – Part 2-101: Particular requirements for in vitro diagnostic (IVD) medical equipment.
6. Electromagnetic Compatibility (EMC) of Medical Devices. Guidance for Industry and Food and Drug Administration Staff (June 6, 2022).
## VII Performance Characteristics (if/when applicable):
### A. Analytical Performance:
#### 1. Precision/Reproducibility:
The objective of this study was to evaluate the repeatability (within- and between-system) and reproducibility (between-site) of the Roche Digital Pathology Dx (VENTANA DP 200 scanner) device.
The precision of the device was evaluated based on the review and identification of specific histopathologic "features" observed in FFPE H&E-stained slides by 2 pathologists (readers) at each of 3 external pathology laboratories (study sites) across multiple scanning days. Twenty-three (23) primary features were selected for inclusion in the studies and were evaluated at their relevant magnifications: 12 primary features at the 20x magnification level and 11 primary features at the 40x magnification level. The reading pathologists independently identified specific histological primary features in multiple Regions of Interest (ROIs) pre-selected on WSI scans generated by the VENTANA DP 200 scanner at each of the 3 study sites.
For each of the 23 feature types, 3 unique study cases, generally from different organ systems or tissue types (see Table 1 below), were enrolled and included in the study analyses, for a total analysis cohort of 69 cases (slides). H&E-stained slides from 12 additional unique cases were included in the study as "wild card" slides to reduce recall bias but were excluded from the statistical analyses. Each of the wild card slides also contained 3 ROIs selected by the Screening Pathologist, but the 3 ROIs for a given wild card slide did not have to contain the same type of primary feature. Since each slide enrolled in the study contained 3 ROIs, a total of 207 "study" ROIs (69 study cases x 3 ROIs/study case) were included in the analyses, and an additional 36 wild card ROIs (12 wild card cases x 3 ROIs/wild card case) were included in the study.
Table 1. Primary Histologic Study Features in Precision Study
| Feature | Organ #1 | Organ #2 | Organ #3 |
| --- | --- | --- | --- |
| 20x Magnification | | | |
| Chondrocytes | Bone (left proximal humerus) | Bone (right scapula) | Soft tissue (chest, xiphoid) |
| Fat cells (adipocytes) | Lymph node (anterior prostatic) | Lymph node (pelvis, right) | Omentum |
| Foreign body giant cells | Breast, lower outer | Liver #1 | Soft tissue (sixth intercostal muscle) |
| Goblet cells | Colon (ascending) | Duodenum (second portion) | Lung (left upper lobe) |
| Granulomas | Lung (left lower lobe) | Lymph node (inguinal) | Lymph node (right axillary) |
| Infiltrating or metastatic lobular carcinoma | Breast (left) | Breast (right) | Chest wall (right) |
| Intraglandular necrosis | Breast (left) | Breast (right, lower outer quadrant) | Oral cavity (left buccal mucosa) |
| Osteoclasts | Bone (fibula, proximal, lesion, right) | Bone (left temporal bone) | Bone (tibia, left) |
| Osteocytes | Bone (frontal, right) | Bone (right tibia) | Pelvis (left, acetabular lesion) |
| Pleomorphic nucleus of malignant cell | Brain (right temporal mass) | Soft tissue (left thigh) | Soft tissue (right thigh) |
| Serrated intestinal epithelium (e.g., sessile serrated polyp) | Colon (ascending, polyp) | Colon (ascending, polyp) | Colon (ascending, polyp) |
| Skeletal muscle fibers | Breast (left) | Left thigh | Thyroid gland |
| 40x Magnification | | | |
| Asteroid bodies | Knee, right, synovium #3 | Lung (upper lobe) | Lymph node |
| Clear cells | Aorta, inter aorta caval lymph node | Left kidney | Ovary and fallopian tube |
| Foreign bodies (e.g., plant material or foreign debris) | Aorta, ascending, pseudoaneurysm wall | Small intestine and colon | Soft tissue (abdomen) |
| Hemosiderin (pigment) | Breast (left chest wall nodule) | Breast (right) | Breast (right) |
| Megakaryocytes | Bone (distal sternum and right ribs) | Bone (rib, right) | Buttock (left, lesion) |
| Necrosis | Lung (right lower lobe) | Lung (right upper lobe) | Right great toe |
| Nerve cell bodies (e.g., ganglion cells) | Colon (cecum, polyp x2) | Esophagus | Soft tissue (left paraspinal) |
| Nuclear grooves | Bone (left acetabulum) | Left fallopian tube and left ovary | Thyroid (right) |
| Osteoid matrix | Bone (left acetabulum) | Bone (right tibia) | Bone (right ulna) |
| Psammoma bodies | Brain (posterior fossa tumor) | Brain (right frontal tumor) | Thyroid (lobe, left) |
| Reed-Sternberg cell | Lymph node (cervical right, level IV) | Neck mass (right) | Thymus |
In each of their reading sessions, each reader accessed the designated ROI images from their site in uPath and evaluated them to identify any primary features present, using a checklist of the 23 protocol-specified primary features. During their evaluations, readers were provided with the scanning magnification level and organ system/tissue type for the ROI image and were able to move about freely on the ROI image, varying the viewing magnification as desired; however, they were blinded to case ID, patient clinical information, sign-out diagnoses, and all previous screening or study results. Each primary feature assessment for a study case ROI image was then compared to the reference primary feature for that case, and agreement with the reference feature was evaluated between readers, between sites, and between scanning days. If a reader identified a primary feature other than the reference feature as present in a given ROI image (in addition to or instead of the reference feature), that non-reference identification had no effect on the study endpoints. Only the results from the study ROIs were used in the statistical analyses; those from wild card ROIs were excluded. The precision of the system was to be considered acceptable if the lower bounds of the 2-sided 95% confidence intervals (CIs) for all co-primary endpoints (i.e., the overall percent agreement (OPA) point estimates for between-system/between-site, between-day/within-system, and between-reader/within-reader precision) were at least 85%.
The precision of the Roche Digital Pathology Dx was assessed in 3 sub-studies:
- Between system (scanner)/Between Site precision using 3 independent systems.
- Within system (scanner)/Between Day precision using 3 independent systems at 3 different sites.
- Between reader (pathologist)/ Within reader precision.
Between System/Between Site Precision
The pairwise agreement of reference feature status between sites was analyzed for each site pair separately and then for all pairwise comparisons combined. For the analysis of OPA between sites, the 2 readers at each site and the 3 scanning days were considered. The comparison included all possible permutations between the 6 observations (2 readers × 3 days) per ROI within a site, for a total of 36 (6 × 6) possible pairwise permutations between 2 sites per ROI. Agreements between systems at Site A versus Site B, systems at Site A versus Site C, and systems at Site B versus Site C were analyzed, then aggregated across readers and days. The overall between-site/system precision is based on the pooled data from all site-to-site comparisons. OPA for the analysis pooled across all site pairs was 89.3% (95% CI: 85.8, 92.4), satisfying the acceptance criterion. Study results are presented in the Table below.
Table 2. Between System/Between Site Agreement Rate
| System | Number of Pairwise Agreements | Number of Comparison Pairs | Agreement Rate and 95% CI | |
| --- | --- | --- | --- | --- |
| | | | % Agreement | 95% CI |
| Site A vs. Site B | 6277 | 7210 | 87.1 | (83.2, 90.7) |
| Site A vs. Site C | 6572 | 7284 | 90.2 | (86.8, 93.3) |
| Site B vs. Site C | 6661 | 7345 | 90.7 | (87.2, 93.8) |
| Overall | 19510 | 21839 | 89.3 | (85.8, 92.4) |
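The pairwise bookkeeping described above can be sketched in a few lines: for one ROI, each site contributes 6 reads (2 readers x 3 days), and every cross-site pair of reads is compared. The reads below are hypothetical; the study pooled such counts across all ROIs and site pairs.

```python
# Illustrative sketch of cross-site pairwise agreement for a single ROI
# (hypothetical reads; 6 x 6 = 36 pairwise permutations between 2 sites).
from itertools import product

def pairwise_opa(reads_site1, reads_site2):
    """Count cross-site read pairs that agree on reference-feature status."""
    pairs = list(product(reads_site1, reads_site2))
    agreements = sum(a == b for a, b in pairs)
    return agreements, len(pairs)

# One ROI: True = reference feature identified, False = missed.
site_a = [True, True, True, False, True, True]   # 2 readers x 3 days
site_b = [True, True, True, True, True, False]
agree, total = pairwise_opa(site_a, site_b)
print(f"{agree}/{total} pairs agree ({agree/total:.1%})")
```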
Within System/Between Day Precision
This study evaluated the agreement between days for each scanner. Within-system/between-day precision was analyzed by first performing all possible pairwise comparisons between days for each reader separately (i.e., for each reader, their Day 1 results were compared to their Day 2 results, their Day 2 results were compared to their Day 3 results, and their Day 1 results were compared to their Day 3 results) and then pooling the 3 day-pair results together. These individual reader results were then aggregated across all readers at all sites to determine overall between-day/within-system precision. OPA for the analysis pooled across all readers and sites was 90.3% (95% CI: 87.1, 93.2), satisfying the acceptance criterion for this endpoint (95% CI lower bound ≥ 85%). Study results are presented in the Table below.
Table 3. Within System/Between Day Agreement Rates
| Site, Reader | Number of Day to Day Pairwise Agreements | Number of Comparison Pairs | Agreement Rate and 95% CI | |
| --- | --- | --- | --- | --- |
| | | | % Agreement | 95% CI |
| Site A, Reader 1 | 495 | 592 | 83.6 | (78.7, 88.2) |
| Site A, Reader 2 | 549 | 607 | 90.4 | (86.0, 94.6) |
| Site B, Reader 1 | 540 | 613 | 88.1 | (83.7, 92.3) |
| Site B, Reader 2 | 541 | 604 | 89.6 | (84.9, 93.7) |
| Site C, Reader 1 | 607 | 621 | 97.7 | (94.8, 100.0) |
| Site C, Reader 2 | 570 | 619 | 92.1 | (88.4, 95.5) |
| Overall | 3302 | 3656 | 90.3 | (87.1, 93.2) |
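The day-pairing scheme described above can be sketched as follows, for one reader and a handful of hypothetical ROIs: Day 1 vs Day 2, Day 2 vs Day 3, and Day 1 vs Day 3 results are compared per ROI and then pooled.

```python
# Illustrative sketch of between-day pairing for one reader
# (hypothetical reads; the study pooled these counts across readers/sites).
from itertools import combinations

def between_day_agreement(reads_by_day):
    """reads_by_day: {day: [per-ROI feature calls]}; returns (agree, total)."""
    agree = total = 0
    for d1, d2 in combinations(sorted(reads_by_day), 2):
        for a, b in zip(reads_by_day[d1], reads_by_day[d2]):
            agree += (a == b)
            total += 1
    return agree, total

reader = {1: [True, True, False, True],   # 4 hypothetical ROIs per day
          2: [True, False, False, True],
          3: [True, True, True, True]}
a, t = between_day_agreement(reader)
print(f"{a}/{t} day-pair comparisons agree")
```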
## Between Reader Precision
In the between-reader precision analysis, the pairwise agreement between the 2 readers within a site (i.e., between the 2 readers at Site A, between the 2 readers at Site B and between the 2 readers at Site C) was analyzed separately for each site, and these pairwise results were then pooled across all sites. OPA for the analysis pooled across all sites was 90.1% (95% CI: 86.6, 93.0), satisfying the acceptance criterion for this endpoint (95% CI lower bound ≥ 85%). Study results are presented in the Table below.
Table 4. Between Reader Agreement Rates
| Reader | Number of Pairwise Agreements | Number of Comparison Pairs | Agreement Rate and 95% CI | |
| --- | --- | --- | --- | --- |
| | | | % Agreement | 95% CI |
| Reader A1 vs A2 | 528 | 603 | 87.6 | (83.3, 91.4) |
| Reader B1 vs B2 | 536 | 609 | 88.0 | (83.6, 92.2) |
| Reader C1 vs C2 | 586 | 620 | 94.5 | (91.5, 97.3) |
| Overall | 1650 | 1832 | 90.1 | (86.6, 93.0) |
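The acceptance check applied to each precision endpoint (95% CI lower bound for OPA at least 85%) can be sketched with a plain Wilson score interval. This is illustrative only: the study's reported CIs are wider than a simple binomial interval because they account for correlation between reads of the same ROI.

```python
# Wilson score 95% CI for an agreement proportion, applied to the overall
# between-reader counts above (1650/1832); illustrative only -- the study's
# CIs account for clustered reads and are wider.
import math

def wilson_ci(successes, n, z=1.96):
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return center - half, center + half

lo, hi = wilson_ci(1650, 1832)
print(f"OPA 95% CI ({lo:.1%}, {hi:.1%}); meets 85% criterion: {lo >= 0.85}")
```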
## Within Reader Precision
The within-reader precision analysis compared each reader's first assessment of their site's Day 1 ROI images (performed in their first reading session) with the same reader's second assessment of the same ROI images (performed in their fourth reading session), with the images presented in a different random order in each session. Because of the minimum 2-week washout period between reading sessions 1, 2, 3, and 4, the assessments used in the within-reader precision analyses were performed at least 6 weeks apart. Pairwise agreement between reads was assessed for each reader separately, and the results were aggregated across all readers at all sites to determine the overall OPA for within-reader precision. Study results are presented in the Table below.
Table 5. Within Reader Agreement Rates
| Reader | Number of Pairwise Agreements | Number of Comparison Pairs | % Agreement | 95% CI |
| --- | --- | --- | --- | --- |
| Site A, Reader 1 | 160 | 200 | 80.0 | (73.4, 86.1) |
| Site A, Reader 2 | 183 | 203 | 90.1 | (85.7, 94.5) |
| Site B, Reader 1 | 172 | 206 | 83.5 | (78.0, 88.9) |
| Site B, Reader 2 | 175 | 201 | 87.1 | (81.4, 92.0) |
| Site C, Reader 1 | 202 | 207 | 97.6 | (94.7, 100.0) |
| Site C, Reader 2 | 186 | 206 | 90.3 | (86.0, 94.2) |
| Overall | 1078 | 1223 | 88.1 | (84.8, 91.3) |
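As in the between-reader analysis, the overall within-reader rate is obtained by pooling raw counts across readers rather than averaging percentages. A minimal check against Table 5 (Python; all counts are taken from the table above):

```python
# Per-reader (agreements, comparison pairs) from Table 5
within_reader = {
    "Site A, Reader 1": (160, 200),
    "Site A, Reader 2": (183, 203),
    "Site B, Reader 1": (172, 206),
    "Site B, Reader 2": (175, 201),
    "Site C, Reader 1": (202, 207),
    "Site C, Reader 2": (186, 206),
}

total_agree = sum(a for a, _ in within_reader.values())   # 1078
total_pairs = sum(n for _, n in within_reader.values())   # 1223
overall_pct = 100 * total_agree / total_pairs             # 88.1%

# Pooling counts weights each reader by workload; a plain average of the
# six percentages would give a slightly different (unweighted) figure.
print(f"Overall within-reader agreement: {total_agree}/{total_pairs} = {overall_pct:.1f}%")
```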
2. Linearity:
Not applicable
3. Analytical Specificity/Interference:
Not applicable
4. Accuracy (Instrument):
Not applicable
5. Carry-Over:
Not applicable
B. Technical Studies:
Multiple studies were conducted to evaluate the performance of the Roche Digital Pathology Dx (VENTANA DP 200 scanner) as recommended in the FDA guidance titled “Technical Performance Assessment of Digital Pathology Whole Slide Imaging Devices”.
a. Slide feeder
Information was provided on the configuration of the slide feed mechanism, including a physical description of the slide, the number of slides in the queue (carrier), and the class of automation. Information was provided on the user interaction with the slide feeder, including hardware, software, feedback mechanisms, and Failure Mode and Effects Analysis (FMEA).
b. Light source
Descriptive information on the lamp and the condenser was provided. Testing information was provided to verify the spectral distribution of the light source as part of the color reproduction capability of the VENTANA DP 200 scanner.
c. Imaging optics
An optical schematic identifying all optical elements from the slide (object plane) to the digital image sensor (image plane) was provided, along with descriptive information regarding the microscope objective, the auxiliary lenses, and the magnification of the imaging optics. Testing information regarding relative irradiance, optical distortions, and lateral chromatic aberrations was provided.
d. Mechanical scanner movement
Information and specifications on the configuration of the stage, the method of movement, the control of stage movement, and FMEA were provided. Test data were provided to verify that the repeatability of the stage movement mechanism stays within limits during operation.
e. Digital imaging sensor
Information and specifications on the sensor type, pixel information, responsivity specifications, noise specifications, readout rate, and digital output format were provided. Test data were provided to confirm correct functioning of the digital image sensor, which converts the optical signal of the slide into digital signals consisting of numerical values corresponding to the brightness and color at each point in the optical image.
f. Image processing software
Information and specifications on exposure control, white balance, color correction, subsampling, pixel-offset correction, pixel-gain or flat-field correction, and pixel-defect correction were provided.
g. Image composition
Information and specifications on the scanning method, the scanning speed, and the number of Z-axis planes to be digitized were provided. Test data were provided to analyze image composition performance.
h. Image file format
Information and specifications on the compression method, compression ratio, file format, and file organization were provided.
i. Image review and manipulation software
Information and specifications on continuous panning and pre-fetching, continuous zooming, discrete Z-axis displacement, the ability to compare multiple slides simultaneously in multiple windows, image enhancement and sharpening functions, color manipulation, annotation tools, and digital bookmarks were provided.
j. Computer environment
Information and specifications on the computer hardware, operating system, graphics card, graphics card driver, color management settings, color profile, and display interface were provided.
k. Display
Information and specifications on the technological characteristics of the display, such as pixel density, aspect ratio, display viewing area, display surface, backlight type, panel type, viewing angle, pixel pitch, resolution, color space, maximum brightness, contrast ratio, response time, maximum refresh rate, color accuracy, color adjustment, gamma adjustment, other adjustments, display interface, and certification, were provided. Test data were provided to verify display performance for user controls, spatial resolution, pixel defects (count and map), artifacts, temporal response, maximum and minimum luminance (achievable and recommended), grayscale, luminance uniformity, bidirectional reflection distribution function, gray tracking, color scale, and color gamut volume.
l. Color reproducibility
Test data were provided to evaluate the color reproducibility of the system.
m. Spatial resolution
Test data were provided to evaluate the composite optical performance of all components in the image acquisition phase.
n. Focusing test
Test data were provided to evaluate the technical focus quality of the system.
o. Whole slide tissue coverage
Test data were provided to demonstrate that the entire tissue specimen on the glass slide is detected by the tissue detection algorithms and that all tissue specimens are included in the digital image file.
p. Stitching error
Test data were provided to evaluate stitching errors and artifacts in the reconstructed image.
q. Turnaround time
Requirements specifying the turnaround time of the system were provided.
C. Clinical Studies
A retrospective multi-center study was conducted to demonstrate that viewing, reviewing, and diagnosing digital images of surgical pathology FFPE tissue slides using the Roche Digital Pathology Dx (VENTANA DP 200) system is non-inferior to using traditional optical light microscopy. The primary endpoint was the difference in agreement rates between diagnoses rendered using the digital WSI review modality (digital read [DR]) and the manual microscopy slide review modality (manual read [MR]) when each was compared to the reference diagnosis, i.e., the original sign-out pathologic diagnosis rendered at the study sites using an optical (light) microscope. The study consisted of reviewing archived, de-identified, previously signed-out slides representing the main organ systems within surgical pathology. Cases included retrospective H&E-stained FFPE tissue and, where applicable, special stains and/or immunohistochemical (IHC) stains from the pathology practice, but did not include frozen sections or cytological and hematological cases.
Four sites were used in the study. Two Screening Pathologists at each study site pre-screened cases from that site for possible inclusion by reviewing their clinical database of archived specimens. Each site selected cases sequentially (chronologically or reverse chronologically), with a minimum of 1 year between the date of sign-out diagnosis and the beginning of the study. The first Screening Pathologist reviewed all available H&E and ancillary-stain slides for each case using manual microscopy to determine whether the case met the study inclusion/exclusion criteria, and confirmed the diagnosis for that case by microscopically evaluating the H&E and ancillary-stained slides along with the relevant clinical information extracted from the sign-out report. For biopsy cases, the Screening Pathologist selected the representative H&E slide(s) (and IHC and special stain slides, if available) that were critical for a pathologist to reproduce the primary (reference) diagnosis. For non-biopsy cases, slides from surgical specimen margins that were reported in the primary diagnosis were included in addition to the representative H&E slide(s) (and IHC and special stain slides, if available). For malignant cases, slide(s) that provided the grading and the positive and/or negative lymph node status reported in the primary diagnosis were also included. For a multi-part case, such as simultaneously obtained tissue biopsies from different areas of an organ, the first sequential case slide matching a diagnostic category on the pre-specified list of diagnoses was included in the study. Once the case slides were reviewed and selected by the first Screening Pathologist, the sign-out diagnosis report data captured on the screening case report forms (CRFs) were verified by a second Screening Pathologist to confirm the diagnostic accuracy of the case and whether the case met the study inclusion/exclusion criteria.
A total of 2047 cases (3259 slides in total), approximately 500 per site and consisting of multiple organ and tissue types, were enrolled. At each site, all four Reading Pathologists read all the cases enrolled and scanned at that site using both the MR and DR modalities, in an alternating fashion and randomized order, with a washout period of at least 30 days between the MR and DR diagnoses. The 16 Reading Pathologists were provided with all representative slide(s) for each case at the same time, mimicking a practice setting.
Study inclusion/exclusion criteria are as follows:
**Inclusion Criteria:**
- Study slides had to match one of the 20 pre-specified organ/tissue type categories listed in the study protocol and had to have a recorded sign-out diagnosis applicable to one of the pre-specified diagnoses listed in the protocol.
- Surgical procedures for each case had to be as specified in the pathology report for each organ.
- Tissue had to be FFPE and stained with H&E using a documented, validated staining protocol.
- If applicable, slides of ancillary stains such as special histochemical and immunohistochemical (IHC) stains utilized for assessment of primary diagnosis had to be available and evaluable.
- When viewed by the Screening Pathologists, the slide(s) included had to match the sign-out diagnosis used as the study’s reference diagnosis.
**Exclusion Criteria:**
- Archived slides for the respective case were in poor condition. Indicators of poor condition included, but were not limited to, fading of the H&E stain, irremovable markings, air bubbles under the cover slip, cracks, and any issues that would affect the ability of the slide(s) to be scanned.
- The organ/procedure subtype counts exceeded the number allocated for the study.
- The H&E-stained slides that were used for the original sign-out diagnosis are not available at the site, and re-cuts were not available.
- If applicable, ancillary stain slides (IHC or special stain slides) that were used for the original sign-out diagnosis, or the control slides for the IHC or special stain, are not available.
- The slides needed to support the original sign-out diagnosis require either a special light source (e.g., a mercury lamp for fluorescence microscopy) or special filters (e.g., for polarized light).
- Only frozen sections or gross specimens are available for the case. Note: Cases with frozen sections could be included if FFPE slides with non-frozen sections demonstrated one of the intended diagnoses; only the frozen section slides were excluded.
- Significant clinical and ancillary information that was used to establish the original diagnosis is missing.
- The signed-out date was less than 1 year prior to the study start date.
- More than 1 specimen per patient was selected.
After the Reading Pathologists at each site completed the review of the study cases and the primary diagnosis case report forms (CRFs), a panel of three independent adjudication pathologists was used: two adjudicators reviewed each study reader's diagnosis and compared it with the original signed-out (reference) diagnosis to determine concordance, minor discordance, or major discordance between the study diagnosis (rendered by the WSI [DR] and glass [MR] methods) and the original signed-out diagnosis. The original signed-out diagnosis is the sign-out pathologic diagnosis rendered at the institution using an optical (light) microscope. A major discordance was defined as a difference in diagnosis that would be associated with a clinically important difference in patient management; a minor discordance was defined as a difference in diagnosis that would not. If the first two adjudicators disagreed on the classification of a discordance, the third adjudication pathologist reviewed the case to achieve a majority consensus.
The study acceptance criterion was as follows: the upper bound of the two-sided 95% CI of the difference between the overall major discordance rates of WSI-based diagnoses and glass slide-based diagnoses must be ≤ 4%.
## Study Results:
A total of 7562 DR diagnoses paired with 7562 MR diagnoses adjudicated by the adjudication panel had consensus scores and were included in the statistical analyses. The observed overall agreement rate, i.e., over all sites, Reading Pathologists and organs was 92.00% for DR modality and 92.61% for MR modality. The DR-MR difference in agreement rate was -0.61% (95% CI: -1.20%, 0.00%).
In addition to the observed analysis, a Generalized Linear Mixed Model (GLIMMIX) logistic regression was conducted on the study population to demonstrate the non-inferiority of the DR agreement rate as compared to the MR agreement rate. For each reading result and reading mode, the dependent variable was the agreement with sign-out diagnosis status. The model accounted for fixed study effects (i.e., reading modality and organ type) and random study effects (i.e., site and reader nested within site). The agreement rates as estimated by the GLIMMIX logistic model ("modeled") resulted in similar proportions as the study point estimates, i.e., 91.54% for DR modality and 92.16% for MR modality. The DR-MR difference in agreement rate was -0.62%, with derived 2-sided 95% CI of [-1.60%, 0.20%]. These model results fail to show any statistically significant difference between the 2 reading modalities.
The lower limit of the 95% confidence interval of DR-MR was greater than the pre-specified non-inferiority margin of -4%, and therefore, the DR modality using Roche Digital Pathology Dx was demonstrated to be non-inferior to the MR modality using light microscopy. Thus, the study met the primary objective.
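The non-inferiority logic can be stated compactly in two equivalent framings: in agreement rates, the lower 95% CI bound of the DR − MR difference must exceed −4%; in major discordance rates (the acceptance criterion above), the upper 95% CI bound must not exceed +4%. A minimal sketch using the observed figures reported above (the helper function names are illustrative, not from the study's statistical analysis plan):

```python
def noninferior_agreement(ci_lower_pct: float, margin_pct: float = -4.0) -> bool:
    """DR - MR agreement-rate difference: lower 95% CI bound must exceed the margin."""
    return ci_lower_pct > margin_pct

def acceptable_discordance(ci_upper_pct: float, margin_pct: float = 4.0) -> bool:
    """DR - MR major-discordance difference: upper 95% CI bound must be <= margin."""
    return ci_upper_pct <= margin_pct

# Observed results reported in the study (percentage points)
agreement_diff_ci = (-1.20, 0.00)    # DR - MR agreement rate difference, 95% CI
discordance_diff_ci = (-0.35, 1.59)  # DR - MR major discordance difference, 95% CI

passed_agreement = noninferior_agreement(agreement_diff_ci[0])     # -1.20 > -4
passed_discordance = acceptable_discordance(discordance_diff_ci[1])  # 1.59 <= 4
print(passed_agreement, passed_discordance)
```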
Table 6. Clinical Study Results Based on Major Discordance Rates
| Analysis | DR Total Reads | DR % Discordant | DR 95% CI | MR Total Reads | MR % Discordant | MR 95% CI | Difference (DR − MR) % Discordant | Difference 95% CI |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Observed | 7562 | 8.00 | 6.73, 9.27 | 7562 | 7.39 | 6.11, 8.78 | 0.61 | -0.35, 1.59 |
| Model | 7725 | 8.46 | 7.35, 9.71 | 7744 | 7.84 | 6.80, 9.12 | 0.62 | -0.26, 1.50 |

DR = whole slide imaging review (digital read); MR = light microscope slide review (manual read).
The agreement rates by organ type for the full cohort, and the differences between the WSI review (DR) and manual slide review (MR) modalities, are shown in the table below.
Table 7. Agreement Rates by Organ
| Organ Type | Digital Read (DR) | Manual Read (MR) | Difference in Agreement (DR-MR) |
| --- | --- | --- | --- |
| Anus/Perianal | 93.0% | 95.7% | -2.7% |
| Appendix | 98.4% | 100.0% | -1.6% |
| Bladder | 85.9% | 87.8% | -1.8% |
| Brain/Neurological | 94.3% | 92.4% | 1.9% |
| Breast | 91.0% | 93.2% | -2.1% |
| Colorectal | 93.2% | 93.0% | 0.2% |
| Endocrine | 91.3% | 92.1% | -0.8% |
| GE Junction | 90.7% | 91.5% | -0.9% |
| Gallbladder | 100.0% | 100.0% | 0.0% |
| Gynecological | 89.6% | 89.6% | 0.0% |
| Hernial/Peritoneal | 100.0% | 100.0% | 0.0% |
| Kidney, Neoplastic | 96.2% | 94.9% | 1.3% |
| Liver/Bile Duct, Neoplastic | 97.0% | 98.5% | -1.5% |
| Lung/Bronchus/Larynx/Oral Cavity/Nasopharynx | 89.4% | 92.3% | -2.9% |
| Lymph Node | 97.1% | 97.8% | -0.7% |
| Prostate | 93.4% | 92.9% | 0.5% |
| Salivary Gland | 95.3% | 94.8% | 0.5% |
| Skin | 89.6% | 89.4% | 0.2% |
| Soft Tissue Tumors | 96.6% | 93.1% | 3.4% |
| Stomach | 92.4% | 93.6% | -1.2% |
| Overall | 92.0% | 92.6% | -0.6% |

Note: differences appear to be computed from unrounded rates and may not exactly equal the difference of the rounded values shown.
The clinical study was not powered to analyze results by individual organ site or diagnosis. The difference in modality agreement (DR-MR) ranged from -2.9% for lung to 3.4% for soft tissue tumors. Three organ types (gallbladder, gynecological, and hernial/peritoneal) showed no difference between reading modalities (0.0%). Across all organ types, the overall agreement rate was 92.0% for DR and 92.6% for MR, a difference in agreement (DR-MR) of -0.6%. The lowest agreement in both modalities was observed for bladder cases, with agreement rates of 85.9% for DR and 87.8% for MR. A subset of cases in the study was more difficult to adjudicate under both the MR and DR reading modalities, which may have contributed to the higher discordance rates.
When considering those cases for which all four readers at a site provided successfully adjudicated diagnoses for both DR and MR, there were a total of 9552 reader pairs for the agreement rate calculation. Overall, across all reader comparisons at all sites, the between-reader agreement rate was 91.4% (95% CI: 90.8, 91.9) for MR and 90.6% (95% CI: 90.0, 91.1) for DR. The between-reader agreement rate ranged from 86.9% to 95.2% for MR and from 85.1% to 94.5% for DR.
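With four Reading Pathologists per site, each fully adjudicated case contributes C(4,2) = 6 between-reader pairs per modality, so 9552 pairs is arithmetically consistent with 1592 fully evaluable cases (an inference from the reported pair count, not a figure stated in the summary). A quick check:

```python
from itertools import combinations
from math import comb

readers = ["R1", "R2", "R3", "R4"]          # four readers at one site
pairs_per_case = len(list(combinations(readers, 2)))
assert pairs_per_case == comb(4, 2) == 6    # C(4,2) unordered reader pairs

total_pairs = 9552                          # pair count reported in the study
implied_cases = total_pairs // pairs_per_case
print(implied_cases)  # 1592
```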
D. **Human Factor Study:**
Human factors studies designed to assess the performance of critical user tasks and use scenarios by representative users, including anatomic pathology lab technicians and pathologists, were conducted. The information provided included a list of all critical user tasks and a description of the process that was followed. A systematic evaluation of simulated use by representative participants (17 technicians and 18 pathologists) performing all tasks required for operation of the system (including critical tasks), together with a subjective assessment of potential failure modes, was provided. All participants were able to perform all tasks, including the critical tasks, and no critical task failures were observed. Some occasional difficulties occurred; however, all user difficulties observed in the studies had minimal impact on perceived usability, and no difficulties or failures were observed in tasks that could lead to permanent or serious patient harm. In all instances, both pathologists and histopathology technicians were able to identify cases and ensure that all information needed to perform a primary diagnosis was available and accessible.
E. **Other Supportive Instrument Performance Characteristics Data:**
Not applicable
VIII Proposed Labeling:
The labeling is sufficient, and it satisfies the requirements of 21 CFR Parts 801 and 809, as applicable, and the special controls for this device type under 21 CFR 864.3700.
IX Conclusion:
The submitted information in this premarket notification is complete and supports a substantial equivalence decision.