Philips IntelliSite Pathology Solution 5.1

K233204 · Philips Medical Systems Nederland B.V. · PSY · Jun 24, 2024 · Pathology

Device Facts

Record ID: K233204
Device Name: Philips IntelliSite Pathology Solution 5.1
Applicant: Philips Medical Systems Nederland B.V.
Product Code: PSY · Pathology
Decision Date: Jun 24, 2024
Decision: SESE (Substantially Equivalent)
Submission Type: Traditional
Regulation: 21 CFR 864.3700
Device Class: Class 2

Indications for Use

The Philips IntelliSite Pathology Solution (PIPS) 5.1 is an automated digital slide creation, viewing, and management system. The PIPS 5.1 is intended for in vitro diagnostic use as an aid to the pathologist to review and interpret digital images of surgical pathology slides prepared from formalin-fixed paraffin embedded (FFPE) tissue. The PIPS 5.1 is not intended for use with frozen section, cytology, or non-FFPE hematopathology specimens. The PIPS 5.1 comprises the Image Management System (IMS) 4.2, Ultra Fast Scanner (UFS), Pathology Scanner SG20, Pathology Scanner SG60, Pathology Scanner SG300 and PP27QHD Display. The PIPS 5.1 is for creation and viewing of digital images of scanned glass slides that would otherwise be appropriate for manual visualization by conventional light microscopy. It is the responsibility of a qualified pathologist to employ appropriate procedures and safeguards to assure the validity of the interpretation of images obtained using PIPS 5.1.

Device Story

PIPS 5.1 is an automated digital pathology system for slide creation, viewing, and management. It consists of a scanner (UFS or Pathology Scanner SG series), Image Management System (IMS) 4.2, and PP27QHD display. Technicians load FFPE slides into the scanner; the system generates digital images (iSyntax format). Pathologists review these images on the display to render diagnoses, replacing or supplementing conventional light microscopy. The system supports panning, zooming, and image enhancement. It is used in clinical pathology settings. By providing high-resolution digital images, the device enables remote or digital review, potentially improving workflow efficiency and diagnostic consistency compared to manual microscopy.
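
The viewer workflow above depends on multi-resolution ("pyramid") images so that panning and zooming stay responsive at any magnification. iSyntax internals are proprietary, so the sketch below is only a generic illustration of how a WSI viewer might map a requested magnification onto a pre-rendered pyramid level; all names and the level count are hypothetical:

```python
import math

def pyramid_level(target_magnification, base_magnification=40, levels=8):
    """Pick the pre-rendered pyramid level for a requested zoom.
    WSI viewers commonly store power-of-two downsampled levels;
    level 0 is full resolution (the base scan magnification)."""
    downsample = base_magnification / target_magnification
    # Clamp to the available levels; requests above base resolution use level 0.
    return min(levels - 1, max(0, int(math.log2(max(downsample, 1)))))

print(pyramid_level(40))  # full-resolution level 0
print(pyramid_level(10))  # 4x downsample -> level 2
```

A request for 10x against a 40x base scan needs a 4x downsample, so the viewer fetches tiles from level 2 rather than decoding full-resolution data.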

Clinical Evidence

A clinical validation study (n=952 cases; 1,735 slides; 6,988 total readings) compared manual digital (MD) diagnosis using PIPS 5.1 against manual optical (MO) microscopy. Primary endpoint: non-inferiority of MD to MO based on major discordance rates. The observed difference in major discordance was 0.1% (95% CI: -1.01%, 1.18%). Since the upper bound was below the prespecified 4% margin, non-inferiority was met. Analytical precision/reproducibility studies (within-system, between-system, and between-site) showed overall agreement rates of 88.3%, 95.4%, and 90.7%, respectively. Pixelwise comparison confirmed identical image reproduction between PIPS 5.1 and the predicate.
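
The non-inferiority decision rule can be illustrated numerically. The summary reports only the final difference and its CI, not the underlying rates, so the inputs below are hypothetical stand-ins; the sketch also uses a simple unpaired Wald interval, whereas the study used a paired design:

```python
from math import sqrt

def wald_ci_diff(p1, p2, n, z=1.96):
    """Unpaired Wald 95% CI for a difference of two proportions.
    Simplified illustration only; the study's paired analysis differs."""
    d = p1 - p2
    se = sqrt(p1 * (1 - p1) / n + p2 * (1 - p2) / n)
    return d - z * se, d + z * se

# Hypothetical discordance rates (NOT the study's raw counts, which the
# summary does not report): MD 3.0% vs MO 2.9% over n=952 cases.
lo, hi = wald_ci_diff(0.030, 0.029, 952)
print(f"difference CI: ({lo:.2%}, {hi:.2%})")
print("non-inferior" if hi < 0.04 else "non-inferiority not demonstrated")
```

The decision hinges entirely on the upper bound: any CI whose upper limit falls below the 4% margin satisfies the endpoint, regardless of where the point estimate sits.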

Technological Characteristics

System includes WSI scanners (UFS, SG20/60/300) and IMS 4.2 software. Scanners use optical imaging, mechanical stage movement, and digital sensors to create WSIs. Image processing includes exposure control, white balance, color correction, and flat-field correction. Data stored in proprietary iSyntax format. Display: PP27QHD with built-in calibration sensor. Connectivity: Networked server-client architecture. Software: HTML-based web application for review. Sterilization: Not applicable (device is for slide imaging).
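
Of the image-processing steps listed, shading (flat-field) correction is the most standard. The following is a generic NumPy sketch of the textbook dark/flat correction, not Philips' proprietary pipeline, which the summary does not describe in detail:

```python
import numpy as np

def flat_field_correct(raw, dark, flat):
    """Classic shading (flat-field) correction: divide out fixed
    illumination/optics non-uniformity captured in a blank 'flat' frame,
    after subtracting the sensor's dark frame."""
    signal = raw.astype(np.float64) - dark
    field = np.clip(flat.astype(np.float64) - dark, 1e-6, None)
    gain = field.mean() / field  # per-pixel gain map, normalized to the mean
    return np.clip(np.rint(signal * gain), 0, 255).astype(np.uint8)

# A vignetted blank frame and a raw frame at half its brightness:
flat = np.linspace(100.0, 200.0, 16).reshape(4, 4)
dark = np.zeros((4, 4))
raw = 0.5 * flat
print(flat_field_correct(raw, dark, flat))  # uniform field of 75s
```

Because the raw frame is an exact scaled copy of the flat frame, the correction recovers a perfectly uniform image, which is the property such calibration aims for.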

Indications for Use

Indicated for use by pathologists as an aid in the review and interpretation of digital images of FFPE surgical pathology slides. Not for use with frozen section, cytology, or non-FFPE hematopathology specimens.

Regulatory Classification

Identification

The whole slide imaging system is an automated digital slide creation, viewing, and management system intended as an aid to the pathologist to review and interpret digital images of surgical pathology slides. The system generates digital images that would otherwise be appropriate for manual visualization by conventional light microscopy.

Special Controls

A whole slide imaging system must comply with the following special controls: (1) Premarket notification submissions must include the following information: (i) The indications for use must specify the tissue specimen that is intended to be used with the whole slide imaging system and the components of the system. (ii) A detailed description of the device and bench testing results at the component level, including for the following, as appropriate: (A) Slide feeder; (B) Light source; (C) Imaging optics; (D) Mechanical scanner movement; (E) Digital imaging sensor; (F) Image processing software; (G) Image composition techniques; (H) Image file formats; (I) Image review manipulation software; (J) Computer environment; (K) Display system. (iii) Detailed bench testing and results at the system level, including for the following, as appropriate: (A) Color reproducibility; (B) Spatial resolution; (C) Focusing test; (D) Whole slide tissue coverage; (E) Stitching error; (F) Turnaround time. (iv) Detailed information demonstrating the performance characteristics of the device, including, as appropriate: (A) Precision to evaluate intra-system and inter-system precision using a comprehensive set of clinical specimens with defined, clinically relevant histologic features from various organ systems and diseases. Multiple whole slide imaging systems, multiple sites, and multiple readers must be included. (B) Reproducibility data to evaluate inter-site variability using a comprehensive set of clinical specimens with defined, clinically relevant histologic features from various organ systems and diseases. Multiple whole slide imaging systems, multiple sites, and multiple readers must be included. (C) Data from a clinical study to demonstrate that viewing, reviewing, and diagnosing digital images of surgical pathology slides prepared from tissue slides using the whole slide imaging system is non-inferior to using an optical microscope.
The study should evaluate the difference in major discordance rates between manual digital (MD) and manual optical (MO) modalities when compared to the reference (e.g., main sign-out diagnosis). (D) A detailed human factors engineering process must be used to evaluate the whole slide imaging system user interface(s). (2) Labeling compliant with 21 CFR 809.10(b) must include the following: (i) The intended use statement must include the information described in paragraph (1)(i) of this section, as applicable, and a statement that reads, "It is the responsibility of a qualified pathologist to employ appropriate procedures and safeguards to assure the validity of the interpretation of images obtained using this device." (ii) A description of the technical studies and the summary of results, including those that relate to paragraphs (1)(ii) and (1)(iii) of this section, as appropriate. (iii) A description of the performance studies and the summary of results, including those that relate to paragraph (1)(iv) of this section, as appropriate. (iv) A limiting statement that specifies that pathologists should exercise professional judgment in each clinical situation and examine the glass slides by conventional microscopy if there is doubt about the ability to accurately render an interpretation using this device alone.


Predicate Devices

Philips IntelliSite Pathology Solution (K203845)
Related Devices

Submission Summary (Full Text)

U.S. FOOD & DRUG ADMINISTRATION

# 510(k) SUBSTANTIAL EQUIVALENCE DETERMINATION DECISION SUMMARY

## I Background Information:

A. 510(k) Number: K233204

B. Applicant: Philips Medical Systems Nederland B.V.

C. Proprietary and Established Names: Philips IntelliSite Pathology Solution 5.1

D. Regulatory Information

| Product Code(s) | Classification | Regulation Section | Panel |
| --- | --- | --- | --- |
| PSY | Class II | 21 CFR 864.3700 - Whole Slide Imaging System | PA - Pathology |

## II Submission/Device Overview:

### A Purpose for Submission:

1. New device: Philips IntelliSite Pathology Solution 5.1 comprising the Pathology Scanner SG20, Pathology Scanner SG60, Pathology Scanner SG300, Image Management System 4.2 and PP27QHD Display.
2. Update of Image Management System 4.1 (IMS 4.1) to IMS 4.2 to support the previously cleared Ultra Fast Scanner (UFS) in K203854.

### B Type of Test: Digital pathology whole slide imaging

## III Intended Use/Indications for Use:

### A Intended Use(s): See Indications for Use below.

### B Indication(s) for Use:

The Philips IntelliSite Pathology Solution (PIPS) 5.1 is an automated digital slide creation, viewing, and management system. The PIPS 5.1 is intended for in vitro diagnostic use as an aid to the pathologist to review and interpret digital images of surgical pathology slides prepared from formalin-fixed paraffin embedded (FFPE) tissue. The PIPS 5.1 is not intended for use with frozen section, cytology, or non-FFPE hematopathology specimens. The PIPS 5.1 comprises the Image Management System (IMS) 4.2, Ultra Fast Scanner (UFS), Pathology Scanner SG20, Pathology Scanner SG60, Pathology Scanner SG300 and PP27QHD Display. The PIPS 5.1 is for creation and viewing of digital images of scanned glass slides that would otherwise be appropriate for manual visualization by conventional light microscopy.
It is the responsibility of a qualified pathologist to employ appropriate procedures and safeguards to assure the validity of the interpretation of images obtained using PIPS 5.1.

### C Special Conditions for Use Statement(s):

For in vitro diagnostic (IVD) use only. Rx - For Prescription Use Only.

## IV Device/System Characteristics:

### A Device Description:

The Philips IntelliSite Pathology Solution 5.1 (PIPS 5.1) is an automated digital slide creation, viewing, and management system. PIPS 5.1 consists of two subsystems and a display component:

1. Subsystems:
   a. Whole slide imaging scanners:
      i. Ultra Fast Scanner (UFS)
      ii. Pathology Scanner SG, with different versions for varying slide capacity: Pathology Scanner SG20, Pathology Scanner SG60, Pathology Scanner SG300
   b. Image Management System (IMS) 4.2
2. PP27QHD Display

The three Pathology Scanner SG models (SG20, SG60, and SG300) share the same imaging pipeline, which is implemented in the scan engine (Second Generation Scanner Engine, SGSE). Based on the System Design Specification documents, the differences between the three models are mainly the slide feeders and housing. The Pathology Scanner SG20 contains 1 slide rack with a maximum capacity of 20 slides; the Pathology Scanner SG60 contains 3 slide racks with a maximum of 20 slides per rack, for a total capacity of 60 slides; and the Pathology Scanner SG300 contains 15 slide racks with a maximum of 20 slides per rack, for a total capacity of 300 slides. The three models are considered identical and interchangeable in terms of the imaging pipeline.

The slides are manually loaded in racks, which are then placed in rack slots in the scanners. The Pathology Scanner SG20/SG60/SG300 consists of optical, mechanical, electronic, and software components to scan FFPE tissue mounted on glass slides at a resolution equivalent to a 40x objective to create digital whole slide images (WSI). The scanner takes macro images (snapshot images of the entire glass slide) as well as an image of the glass slide label, and automatically decodes the barcode, as applicable. Based on the macro image of the slide, the scanner determines which region(s) on the slide contain tissue and scans these region(s). All images, including the WSI, together with the decoded barcode information, are sent to the IMS server. The scanning progress is shown on the touchscreen display of the scanner. Proprietary software is used for image processing during acquisition. Philips' proprietary format, iSyntax, is used to store and transmit the images between the scanners and the IMS. There are no changes to the UFS since the previous clearance in K203845.

The IMS 4.2 is a software-only subsystem of PIPS 5.1 and consists of the IMS Review Application Software and the IMS Application Server & Storage software installed on compatible server hardware. The IMS Review Application Software is an HTML-based web application accessed via a web browser on the user workstation. The IMS 4.2 organizes the WSIs into cases using a connection to the laboratory information system (LIS). The pathologist uses the IMS Review Application Software to view and manage WSIs and the corresponding cases. It allows the pathologist to pan, zoom, make measurements, create annotations, and bookmark WSIs. IMS 4.2 was upgraded from the previous version cleared for use with the UFS in K203854 to support both the UFS and the Pathology Scanner SG20, SG60, and SG300.

There are no changes to the PP27QHD display from the previous clearance K203845. The PP27QHD display allows the slide images to be viewed and is connected to a client computer with access to the IMS Review Application Software. The display is calibrated using the built-in calibration sensor and software tool, which automatically performs a periodic calibration.

### B Instrument Description Information:

1. Instrument Name: Philips IntelliSite Pathology Solution 5.1

2. Specimen Identification: Glass slides and scanned images are identified based on previously assigned specimen identifiers such as patient identifiers, barcodes, etc. Digital images are of surgical pathology slides prepared from formalin-fixed paraffin embedded (FFPE) tissue.

3. Specimen Sampling and Handling: Specimen sampling and handling are performed upstream and independent of the use of the subject device. Specimen sampling includes surgical pathology specimens such as biopsy or resection specimens, which are processed using standard histology techniques. The FFPE tissue sections are H&E stained. Digital images are then obtained from these glass slides using the UFS or Pathology Scanner SG20, SG60, or SG300.

4. Calibration: The SG scanner performs a check on the image quality parameters for each scan. If the check fails, the SG scanner automatically starts to calibrate. When a calibration aspect that affects the image quality is outside the pre-specified tolerances, the user interface shows a "calibration failed" message and the scanner aborts the scanning process. The user is then prompted to perform calibration via the maintenance tab in the user interface. In addition to the automated calibration within the Pathology Scanner SG20/SG60/SG300, routine manual calibration is performed during field service visits. The display is calibrated using a built-in front sensor; calibrations are initiated by the QAWeb agent and performed as a background activity. The IMS subsystem does not require calibration.

5. Quality Control: It is the responsibility of the laboratory staff to conduct and maintain quality control of the slides per their laboratory standards (e.g., staining, cover-slipping, barcode placement) prior to loading the slides into the Pathology Scanner SG20/SG60/SG300.
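
The per-scan quality check and calibration escalation described under Calibration can be summarized as a small decision function. This is a sketch with hypothetical names; the actual scanner firmware logic is not published:

```python
from enum import Enum

class ScanAction(Enum):
    CONTINUE = "continue scan"
    AUTO_CALIBRATE = "auto-calibrate, then retry the check"
    ABORT = "abort scan; prompt user to calibrate via maintenance tab"

def calibration_decision(quality_ok: bool, within_tolerance: bool) -> ScanAction:
    """Mirror the described flow: pass -> scan; fail but recoverable ->
    automatic calibration; out-of-tolerance -> 'calibration failed' abort."""
    if quality_ok:
        return ScanAction.CONTINUE
    if within_tolerance:
        return ScanAction.AUTO_CALIBRATE  # self-recoverable drift
    return ScanAction.ABORT               # user intervention required
```

The key design point is the middle branch: automatic recalibration handles routine drift without operator involvement, and only out-of-tolerance conditions surface to the user.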
After completing a scan, the operator checks the image data and quality using the IMS Viewer per the instructions for use, and the slide is rescanned as needed. In addition to calibrations, quality checks for the display are initiated by the QAWeb agent and performed as a background activity.

## V Substantial Equivalence Information:

A. Predicate Device Name(s): Philips IntelliSite Pathology Solution

B. Predicate 510(k) Number(s): K203845

C. Comparison with Predicate(s):

Table 1: Comparison of Technological Characteristics

| Device & Predicate Device(s) | K203845 Philips IntelliSite Pathology Solution (PIPS) | K233204 Philips IntelliSite Pathology Solution 5.1 (PIPS 5.1) |
| --- | --- | --- |
| **General Device Characteristic: Similarities** | | |
| Intended Use / Indications for Use | The Philips IntelliSite Pathology Solution (PIPS) is an automated digital slide creation, viewing, and management system. The PIPS is intended for in vitro diagnostic use as an aid to the pathologist to review and interpret digital images of surgical pathology slides prepared from formalin-fixed paraffin embedded (FFPE) tissue. The PIPS is not intended for use with frozen section, cytology, or non-FFPE hematopathology specimens. The PIPS comprises the Image Management System (IMS), the Ultra Fast Scanner (UFS), and Display. The PIPS is for creation and viewing of digital images of scanned glass slides that would otherwise be appropriate for manual visualization by conventional light microscopy. It is the responsibility of a qualified pathologist to employ appropriate procedures and safeguards to assure the validity of the interpretation of images obtained using PIPS. | The Philips IntelliSite Pathology Solution (PIPS) 5.1 is an automated digital slide creation, viewing, and management system. The PIPS 5.1 is intended for in vitro diagnostic use as an aid to the pathologist to review and interpret digital images of surgical pathology slides prepared from formalin-fixed paraffin embedded (FFPE) tissue. The PIPS 5.1 is not intended for use with frozen section, cytology, or non-FFPE hematopathology specimens. The PIPS 5.1 comprises the Image Management System (IMS) 4.2, the Ultra Fast Scanner (UFS), Pathology Scanner SG20, Pathology Scanner SG60, Pathology Scanner SG300 and PP27QHD Display. The PIPS 5.1 is for creation and viewing of digital images of scanned glass slides that would otherwise be appropriate for manual visualization by conventional light microscopy. It is the responsibility of a qualified pathologist to employ appropriate procedures and safeguards to assure the validity of the interpretation of images obtained using PIPS 5.1. |
| Principle of Operation | The technician loads the slides into the WSI scanner. The scanner scans the slides and generates a WSI image for each slide. The technician performs Quality Control (QC) on scanned WSI images and rescans the slide when QC fails. The acquired WSI images are stored in an end-user-provided image storage attached to the local network. During review, the pathologist opens WSI images acquired with the WSI scanner from the image storage, performs further QC, and reads WSI images of the slides to make a diagnosis. | Same |
| Device Components | UFS, IMS and PP27QHD display. | UFS, Pathology Scanner SG20/SG60/SG300, IMS 4.2 and PP27QHD display. |
| Image File Format | iSyntax | Same |
| End User's Interface | The IMS Image Processing Pipeline 4.1 is designed to process UFS iSyntax data. | The IMS Image Processing Pipeline 4.2 is designed to process both UFS and Pathology Scanner SG20/SG60/SG300 iSyntax data. |
| **General Device Characteristic: Differences** | | |
| Slide Feeder | Storage with a capacity of 15 racks (300 slides). | Three slide storage capacities: SG300: 15 racks (300 slides); SG60: 3 racks (60 slides); SG20: 1 rack (20 slides). |
| Imaging Optics | Three RGB high-resolution cameras and one monochrome focus camera (4 digital image sensors in total). | Single camera with a single digital image sensor. |

## VI Standards/Guidance Documents Referenced:

1. Technical Performance Assessment of Digital Pathology Whole Slide Imaging Devices. Guidance for Industry and Food and Drug Administration Staff. April 20, 2016.
2. Guidance for the Content of Premarket Submissions for Software Contained in Medical Devices. Guidance for Industry and Food and Drug Administration Staff. June 14, 2023.
3. Cybersecurity in Medical Devices: Quality System Considerations and Content of Premarket Submissions. Guidance for Industry and Food and Drug Administration Staff. September 27, 2023.
4. Off-The-Shelf Software Use in Medical Devices. Guidance for Industry and Food and Drug Administration Staff. August 11, 2023.
5. IEC 62304 Medical device software - Software life cycle processes (Edition 1.1, 2015-06).

## VII Performance Characteristics:

### A Analytical Performance:

#### 1. Precision/Reproducibility:

The objective of this study was to evaluate the repeatability (within- and between-system) and reproducibility (between-site) of PIPS 5.1. The precision of the device was based on three reading pathologists' assessment and identification of 21 specific clinically relevant histopathologic "features" that are observed in FFPE hematoxylin and eosin (H&E) stained slides. The slides with the selected study features were scanned using 3 magnifications: 10x, 20x, and 40x. As shown in Table 2 below, there were seven study features per magnification. In total, 21 features were selected, with each feature selected from three different organs, to ensure that multiple tissue types were evaluated.
For the reading part of the study, three non-study features per magnification were added to the reading checklists as "wild card" slides to reduce recall bias, but were excluded from the statistical analyses.

Table 2: Histologic Study Features

| | 40x | 20x | 10x |
| --- | --- | --- | --- |
| Study features | Nucleolus | Reed-Sternberg cell | Small artery |
| | Eosinophil granules | Polymorph neutrophil | Psammoma body |
| | Mitosis | Plasma cell | Keratin pearl |
| | Nuclear membrane | Goblet cell | Granuloma |
| | Cilia | Tingible body macrophage | Adipose cell |
| | Prickles (desmosomes) | Foreign body giant cell | Glandular formation |
| | Pigment-laden macrophages | Lymphocyte | Necrosis |
| Non-study features | Nuclear grooves | Mast cells | Peripheral nerves |
| | Asteroid bodies | Chondrocytes | Germinal center |
| | Nuclear inclusions | Apocrine cells | Secretions |

The study slides were selected from previously accessioned cases from the laboratory archives. All slides were specimens from subjects who had already received their diagnosis. Slides containing the pre-specified study features were selected from consecutive cases using the laboratory information system (LIS) by an enrollment pathologist (EP), confirmed by a validating enrollment pathologist (VEP), and scanned by a study technician. The EP reviewed the WSI and defined an area (bookmark) containing the selected feature(s) at the appropriate magnification. The VEP confirmed the presence of the feature(s) in the field of view (FOV) before enrolling the image in the study. Three independent study reading pathologists reviewed each FOV as a snapshot, with no panning or zooming capabilities. The reading pathologists assessed the presence of each of the features based on a feature checklist appropriate for that particular magnification. One slide set (n=399) was used for all three sub-studies. Reading pathologists were different for each sub-study and were blinded to the study features.
From the study slide set, 399 FOVs were extracted containing 420 selected features. In addition, a total of 210 wildcard FOVs was selected; the results from the wildcard FOVs were not analyzed. The number of wildcard FOVs added per reading session was 70 (approximately 17.5% of the total number of FOVs in the study). The acceptance criterion for precision was that the lower limit of the 95% confidence interval for the overall agreement rate be 85.0% or above. Precision was assessed in three sub-studies: within-system, between-system, and between-site.

- In the within-system study, precision within a system was evaluated on 3 systems by 3 pathologists at 1 site.
- In the between-system study, precision between systems was evaluated on 3 systems by 3 pathologists at 1 site.
- In the between-site study, precision between systems located at different sites was evaluated at 3 sites on 3 systems by 3 pathologists.

Within-System Precision:

The study slide set was equally (n=133 slides per system) and randomly divided over 3 systems at 1 site. On each system the slides were scanned 3 times, with at least 6 hours of system downtime (ensuring full cool-down) between scanning iterations. Three separate reading sessions were performed by each of 3 reading pathologists, with a washout period of at least 2 weeks between sessions. All FOVs from all systems were read by all pathologists, and all 3 iterations of all 3 systems were read by each pathologist in a randomized order. Within-system precision results (overall and by system) are presented in Table 3 below.
Table 3: Within-System Agreement Rates

| System | Number of Comparison Pairs | Number of Pairwise Agreements | Agreement Rate (%) | 95% CI |
| --- | --- | --- | --- | --- |
| Overall | 3600 | 3178 | 88.3 | (86.7; 89.9) |
| Scanner 1 | 1210 | 1054 | 87.1 | (84.1; 90.0) |
| Scanner 2 | 1190 | 1049 | 88.2 | (85.6; 90.7) |
| Scanner 3 | 1200 | 1075 | 89.6 | (86.6; 92.4) |

Between-System Precision:

The complete study slide set (n=399) was scanned once on each of the 3 systems at one site. The same 210 wildcard FOVs from the within-system study were used. Three separate reading sessions were performed by each pathologist in a randomized order, with a washout period of at least 2 weeks between sessions. All FOVs from all systems were read by all pathologists. Between-system precision results (overall and by system pair) are presented in Table 4 below.

Table 4: Between-System Agreement Rates

| System Pair | Number of Comparison Pairs | Number of Pairwise Agreements | Agreement Rate (%) | 95% CI |
| --- | --- | --- | --- | --- |
| Overall | 3610 | 3445 | 95.4 | (94.4; 96.5) |
| Scanner 1 vs. Scanner 2 | 1200 | 1144 | 95.3 | (94.0; 96.6) |
| Scanner 1 vs. Scanner 3 | 1207 | 1152 | 95.4 | (94.1; 96.7) |
| Scanner 2 vs. Scanner 3 | 1203 | 1149 | 95.5 | (94.2; 96.7) |

Between-Site Reproducibility Study:

A different system and a different study technician were involved at each site. The same slide set (n=399) was scanned once at each site, resulting in 3 WSI sets. There were 3 readings (R1, R2, and R3) from 3 pathologists on 3 different systems, each at a different site, for each selected feature on an FOV. There were 3 pairwise comparisons among the 3 readings, i.e., R1-R2, R1-R3, and R2-R3. A pair of readings was considered in agreement when both readings were "present" or when both readings were "absent".
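
The pairwise agreement statistic behind Tables 3-5 is straightforward to reproduce. The sketch below uses a naive Wald confidence interval purely for illustration; the study's CI method is not stated, and its reported intervals are wider than the naive ones, consistent with an analysis that accounts for correlated readings:

```python
from itertools import combinations
from math import sqrt

def pairwise_agreement(readings):
    """All pairwise comparisons among repeated 'present'/'absent' calls for
    one feature (e.g., R1-R2, R1-R3, R2-R3 across three sites or scans)."""
    pairs = list(combinations(readings, 2))
    return len(pairs), sum(a == b for a, b in pairs)

def agreement_ci(n_pairs, n_agree, z=1.96):
    """Overall agreement rate with a simple Wald 95% CI (illustrative only;
    it ignores the clustering of readings by reader and feature)."""
    p = n_agree / n_pairs
    half = z * sqrt(p * (1 - p) / n_pairs)
    return p, p - half, p + half

print(pairwise_agreement(["present", "present", "absent"]))  # (3, 1)
# Overall row of Table 3: 3178 agreements over 3600 pairs.
p, lo, hi = agreement_ci(3600, 3178)
print(f"{p:.1%} ({lo:.1%}, {hi:.1%})")
```

Running the overall Table 3 counts through this sketch reproduces the 88.3% point estimate; the naive interval is narrower than the reported (86.7; 89.9), which is the expected effect of ignoring correlation.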
The primary endpoint, the overall between-site agreement rate (i.e., the selected feature marked as "present" in both readings or in neither reading), was calculated using all available pairwise comparison results over all enrolled features. Between-site reproducibility results (overall and by site pair) are presented in Table 5 below.

Table 5: Between-Site Agreement Rates

| Sites | Number of Comparison Pairs | Number of Pairwise Agreements | Agreement Rate (%) | 95% CI |
| --- | --- | --- | --- | --- |
| Overall | 1228 | 1114 | 90.7 | (88.4; 92.9) |
| Site 1 vs Site 2 | 407 | 368 | 90.4 | (87.5; 93.1) |
| Site 1 vs Site 3 | 406 | 371 | 91.4 | (88.5; 94.1) |
| Site 2 vs Site 3 | 415 | 375 | 90.4 | (87.5; 93.1) |

#### 2. Linearity: Not applicable

#### 3. Analytical Specificity/Interference: Not applicable

#### 4. Accuracy (Instrument): Not applicable

#### 5. Carry-Over: Not applicable

### B Technical Studies:

Studies were conducted to evaluate the performance of the new Pathology Scanner SG20/SG60/SG300 and IMS 4.2, as recommended in the FDA guidance titled "Technical Performance Assessment of Digital Pathology Whole Slide Imaging Devices", as shown in the table below.

| TPA item | Testing Data Description |
| --- | --- |
| Slide Feeder | Information was provided on the configurations of the slide feed mechanism, including a physical description of the slide, the number of slides in carriers, and the class of automation. Information was provided on the user interaction with the slide feeder, including hardware, software, feedback mechanisms, and Failure Mode and Effects Analysis (FMEA). |
| Light Source | Descriptive information associated with the light source was provided. Technical information was provided to verify that the spectral distribution of the light incident on the slide keeps the color reproducibility of the Pathology Scanner SG20/SG60/SG300 within predefined limits. The tests passed their acceptance criteria. |
| Imaging Optics | An optical schematic with all optical elements identified from slide (object plane) to image sensor (micro camera) was provided. Descriptive information regarding the microscope objective, auxiliary lens(es), and the magnification of the imaging optics was provided. Testing information regarding the magnification, relative irradiance, optical distortions, and chromatic aberrations was provided. The tests passed their acceptance criteria. |
| Mechanical Scanner Movement | Information and specifications on the configuration of the stage, the method of movement, the control of movement of the stage, and FMEA were provided. Test data to determine positioning accuracy and repeatability for the X-Y and Z stages were provided. The tests passed their acceptance criteria. |
| Digital Imaging Sensor | Information and specifications on the sensor type, pixel information, responsivity specifications, noise specifications, readout rate, and digital output format were provided. Testing to determine the correct functioning of the digital image sensor was provided. The tests passed their acceptance criteria. |
| Image Processing Software | Information and specifications on the exposure control, white balance, color correction, sub-sampling, pixel-offset correction, shading (flat-field) correction, and pixel-defect correction were provided. |
| Image Composition | Information and specifications on the scanning method and focus were provided. Test data to analyze the image composition performance were provided. The tests passed their acceptance criteria. |
| Image File Formats | Information and specifications on the compression method and compression ratio were provided. |
| Image Review Manipulation Software | Information and specifications for continuous panning, continuous zooming, discrete Z-axis displacement, comparison of slides in multiple windows, annotation tools, image enhancement, color correction, tracking of visited areas, digital bookmarks, and shared viewing sessions were provided. Test data to analyze the image review manipulation software were provided. The tests passed their acceptance criteria. |
| Display | No changes have been introduced for the PP27QHD display. Test results for this display in the previous 510(k) remain valid. |
| Computer Environment | Information and specifications on the computer hardware, operating system, memory, hard disk, graphics card, graphics card driver, color management settings, color profile, display interface, and network were provided. |
| Color Reproducibility | Test data to quantify the accuracy and precision of the color transformation from the slide to the display monitor were provided. The tests passed their acceptance criteria. |
| Spatial Resolution | Test data to evaluate the spatial resolution of the image acquisition phase were provided. The tests passed their acceptance criteria. |
| Focusing Test | Test data to demonstrate that the focus quality is clinically acceptable for a variety of histologic preparations, including different tissue types, stain intensities, specimen thicknesses, and stain types, were provided. The tests passed their acceptance criteria. |
| Whole Slide Tissue Coverage | Test data to demonstrate that the entire tissue specimen on the clinical slide is detected by the device were provided. The tests passed their acceptance criteria. |
| Stitching Error | Test data to evaluate the stitching quality of the high-resolution image of the Pathology Scanner SG20/SG60/SG300 were provided. The tests passed their acceptance criteria. |
| Turnaround Time | Test data to evaluate the average time required to execute zooming and panning operations, and to refresh the display in response to user input, were provided. The tests passed their acceptance criteria. |

### C Clinical Performance:

A clinical validation study was conducted to demonstrate that viewing, reviewing, and diagnosing surgical pathology tissue slides using PIPS 5.1 in the manual digital (MD) modality is non-inferior to using the manual optical (light) microscope (MO) modality. The device configuration used in this study was the Pathology Scanner SG300, IMS 4.2, and PP27QHD display. The primary endpoint was the paired MD-MO difference in the major discordance rates, derived from readings of the same cases using both MO and MD by the same reader. The MD modality is declared non-inferior to the MO modality if the upper bound of the 95% two-sided confidence interval for the overall MD - MO difference in major discordance rate (compared to the main diagnosis) is less than 4%. The study was conducted at 3 sites. Eleven reading pathologists participated in the study (4 reading pathologists at each of two sites, and 3 reading pathologists at the third site). There was no subject enrollment or subject participation in this study; it involved a review of previously accessioned cases from the laboratory archives. All slides were from subjects who had already received their diagnosis. The study consisted of 2 sets of cases:

- Set 1: This study set consisted of a subset of cases from the previous clinical study that was part of DEN160056. That clinical study was based on the reading of surgical pathology slides, which included hematoxylin & eosin (H&E) stained slides, obtained from consecutive cases at least one year old and for which a sign-out diagnosis (the final diagnosis assigned to the case at the local institution using a light microscope) was available. The main diagnosis and clinical information from the original study were imported and used in this study.
  Cases and corresponding original representative slides were selected from the clinical study enrollment set randomly per organ/diagnosis subtype.

- Set 2: This set consisted of new cases, which were used in 2 of the 4 sites participating in the clinical study. As in the DEN160056 clinical study, each site had a site-specific list of organs to include, and Set 2 was added to ensure a total set that adequately represented routine clinical practice.

Both sets were read with both the MO modality and the MD modality. An adjudication panel independently reviewed and compared the reader diagnosis against the main diagnosis to determine whether the diagnoses were concordant, minor discordant, or major discordant according to pre-specified criteria. The study inclusion and exclusion criteria were as follows.

**Inclusion Criteria:**

- All H&E, IHC, and special stains glass cover-slipped slide or slides, with human tissue obtained via surgical pathology of the original case, are available.
- The original sign-out diagnosis is available.
- Relevant clinical information that was available to the sign-out pathologist in the pathology request form is available.
- Cases are at least one year since accessioning.

Both Set 1 and Set 2 had similar exclusion criteria, which are as follows:

**Exclusion Criteria:**

- Cases, including sent-out cases, for which any H&E, IHC, or special stains slide used for the original sign-out diagnosis is no longer available at the site.
- Cases for which the control slides for IHC and special stains are not available.
- The selected slide or slides for the main diagnosis do not match any subtype of the organ for which the case was selected.
- Relevant clinical information that was available to the sign-out pathologist in the pathology request form cannot be obtained.
- The selected slide or slides for the main diagnosis and the control slide(s) do not fulfill the quality checks according to general clinical practice.
- Selected slides contain indelible markings.
- Selected slides contain damaged tissue.
- More than one case was selected for a patient (only one case may be enrolled per patient).
- Case consists of frozen section(s) only.
- Case consists of gross specimens only.

The study consisted of five phases:

- **Study preparation:** Each case was randomized and assigned different case IDs for the Manual Optical Read and the Manual Digital Read.
- **Enrollment:** The enrollment pathologist performed a check of the glass slides against the exclusion criteria.
- **Manual Optical Read:** All slides were randomized by the study coordinator. All reading pathologists read all cases from their own site. Using a light microscope, the reading pathologists reviewed each case and documented the main diagnosis obtained from the representative slide or slides in the electronic data capture (EDC) system. There was a washout period of four weeks between the manual optical reading and the manual digital reading sessions.
- **Manual Digital Read:** The study technician scanned all the randomized slides and, upon completion of scanning, performed quality control (QC) on all images in accordance with the QC procedures. All reading pathologists read all cases from their own site. The reading pathologists were provided with the representative WSI(s) of the case at once, without any bookmarks or annotations from other reading pathologists. The reading pathologist documented the main diagnosis obtained from the slide or slides in the EDC.
- **Adjudication of Study Diagnoses:** Case-by-case adjudication was conducted for each case to determine whether the diagnosis was concordant, minor discordant, or major discordant. A major discordance was defined as a difference in diagnosis that would be associated with a clinically important difference in patient management.
A minor discordance was defined as a difference in diagnosis that would not be associated with a clinically important difference in patient management. Two adjudication pathologists (from a panel of three) independently reviewed and compared the reading pathologist's diagnosis against the main diagnosis. The adjudication pathologists reviewed only the diagnoses and determined whether the comparison was concordant, minor discordant, or major discordant. If the concordance reviews from the two adjudicators differed, the third adjudication pathologist was assigned to independently perform a third concordance review. The ultimate score was determined by majority vote. If all three adjudication pathologists disagreed after their initial review, the ultimate score for analysis was determined in an adjudication panel meeting where all three pathologists together reviewed only the two diagnoses and came to a final adjudication score by majority vote.

There were a total of 1735 slides from 952 cases (Site 1: 314 cases, 635 slides; Site 2: 345 cases, 627 slides; Site 3: 293 cases, 473 slides). At each site, pathologists read all the cases assigned to the site using both the MO and the MD modalities, with a washout period of four weeks in between, resulting in a total of 6988 readings.

Table 6: Clinical Study Results Based on Major Discordance Rates

| | MD Total Reads | MD Discordance Rate (%) | MD 95% CI | MO Total Reads | MO Discordance Rate (%) | MO 95% CI | Difference MD - MO (%) | Difference 95% CI |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Observed | 3494 | 5.8 | | 3494 | 5.7 | | 0.1 | |
| Model | 3494 | 5.8 | (3.93; 8.53) | 3494 | 5.7 | (3.87; 8.41) | 0.1 | (-1.01; 1.18) |

As shown in Table 6 above, the overall difference in major discordance rate (compared to the main diagnosis) for digital versus optical was 0.1%, with a derived two-sided 95% CI of (-1.01%; 1.18%).
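The non-inferiority decision rule reduces to simple arithmetic on the figures above. A minimal sketch (Python; the function name is illustrative, and the model-derived confidence interval is treated as a given input rather than recomputed, since the study's CI came from a statistical model not described here):

```python
def non_inferior(md_rate: float, mo_rate: float,
                 ci_upper: float, margin: float = 4.0) -> bool:
    """Non-inferiority check for the paired MD - MO major discordance rates.

    md_rate, mo_rate : major discordance rates, in percent
    ci_upper         : upper bound of the two-sided 95% CI for MD - MO, in percent
    margin           : pre-specified non-inferiority margin, in percent
    """
    difference = md_rate - mo_rate
    # MD is declared non-inferior only when the entire CI for the
    # difference sits below the margin, i.e. its upper bound is less
    # than the margin. The point estimate must lie inside the CI.
    return ci_upper < margin and difference <= ci_upper

# Reported values: MD 5.8%, MO 5.7%, 95% CI for the difference (-1.01; 1.18)
print(non_inferior(5.8, 5.7, ci_upper=1.18))  # True -> MD non-inferior to MO
```

With the reported upper bound of 1.18% well below the 4% margin, the rule returns True, matching the study's conclusion.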
As the upper limit of this confidence interval was less than the pre-specified non-inferiority margin of 4%, manual digital diagnosing by a pathologist using PIPS 5.1 is non-inferior to diagnosing by a pathologist using the optical microscope.

The differences in major discordance rates by organ type for the full analysis set between MD and MO are shown in Table 7 below. The clinical study was not powered to analyze the results by individual organ site or diagnosis.

Table 7: Major Discordance Rate by Organ and Modality

| Organ | Manual Digital (MD) | Manual Optical (MO) | Difference (MD-MO) |
| --- | --- | --- | --- |
| Bladder | 10% | 16% | -6% |
| Brain/Neuro | 26% | 20% | 6% |
| Breast | 6% | 4% | 2% |
| Colorectal | 6% | 5% | 1% |
| Endocrine | 10% | 12% | -2% |
| Gynecology | 6% | 7% | -1% |
| Kidney | 7% | 7% | 0% |
| Liver/BD, Neo | 7% | 11% | -4% |
| Lung | 4% | 7% | -3% |
| Lymph Node | 5% | 5% | 0% |
| Prostate | 18% | 17% | 1% |
| Salivary | 14% | 7% | 7% |
| Skin | 12% | 12% | 0% |
| Stomach | 0% | 4% | -4% |

D. Other Supportive Instrument Performance Characteristics Data:

The use of the IMS component (IMS 4.2) of PIPS 5.1 with the previously cleared UFS was supported by bench testing data using a pixelwise comparison approach. A pixelwise comparison study was performed to compare images reproduced by the predicate device (PIPS 4.1) and the subject device (PIPS 5.1) for the same iSyntax files to demonstrate identical image reproduction. Devices used in this study included:

- Comparator (predicate) device: PIPS 4.1, consisting of UFS 1.8, IMS 4.1, and PP27QHD Display
- Test device: the IMS 4.2 component of PIPS 5.1, UFS 1.8, and PP27QHD Display

A total of 42 glass slides from a diverse set of stain types and human anatomical sites were used as the test data.
Three representative features from each of the 42 slides were selected and marked with a bookmark by a pathologist, and each location was captured at the 20x and 40x magnification levels on both the predicate and subject devices, resulting in a total of 252 regions of interest (ROIs) (42 slides x 3 ROIs x 2 magnification levels). The client workstation and the PP27QHD display were used to capture the ROIs and to perform the pixelwise comparison.

The pixelwise comparison test compared the screenshots of all the ROIs captured on the predicate device to those captured on the subject device, checking for an exact match in the RGB values of each pixel. The test results showed that the differences were zero for all pixels in all ROIs, indicating that all corresponding pixels are identical. These results demonstrated that the subject device PIPS 5.1 and the predicate device PIPS 4.1 generate identical images for the same iSyntax files generated by the UFS device.

F. Human Factor Study:

This study was performed to obtain objective evidence that the user interface can be used safely and to assess the effectiveness of the usability-related risk management measures. The human factors studies were designed around user tasks and use scenarios performed by the intended users. No user errors were identified that would result in serious harm to the patient or user. Overall, the results of the human factors testing were acceptable.

VIII Proposed Labeling:

The labeling supports the finding of substantial equivalence for this device.

IX Conclusion:

The submitted information in this premarket notification is complete and supports a substantial equivalence decision.
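The exact-match criterion used in the pixelwise comparison of Section D (zero difference in the RGB values of every corresponding pixel) can be sketched as follows. This is a minimal illustration, not the device software; the function name and array shapes are illustrative:

```python
import numpy as np

def pixelwise_identical(roi_a: np.ndarray, roi_b: np.ndarray) -> bool:
    """Exact-match pixelwise comparison of two ROI screenshots.

    Each array is H x W x 3 (RGB, uint8). The ROIs match only if every
    corresponding pixel has identical RGB values, i.e. the per-pixel
    difference is zero everywhere.
    """
    if roi_a.shape != roi_b.shape:
        return False
    # Cast to a signed type before subtracting so uint8 values cannot wrap.
    diff = roi_a.astype(np.int16) - roi_b.astype(np.int16)
    return bool(np.count_nonzero(diff) == 0)

# Illustrative use: identical captures pass; one changed channel fails.
ref = np.zeros((4, 4, 3), dtype=np.uint8)
test = ref.copy()
print(pixelwise_identical(ref, test))   # identical -> True
test[0, 0, 0] = 1                       # perturb one channel of one pixel
print(pixelwise_identical(ref, test))   # any nonzero difference -> False
```

A binary pass/fail on exact RGB equality is a deliberately strict criterion: unlike perceptual metrics, it leaves no tolerance for rounding or rendering differences, which is what supports the claim of identical image reproduction.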