Updated Date: 2024.03.27
Applicable latest versions:
- CXR 3.1.5.X
- CXR 3.1.7.X (Linux version only)
- Refer to this for the Windows version
- MMG 1.1.9.X
Components of Lunit INSIGHT CXR & MMG
Three components are required to run CXR or MMG. Go to the Lunit INSIGHT Workflow page for a more detailed description of each component.
- Lunit INSIGHT Gateway (GW)
- Lunit INSIGHT Backend (BE)
- Lunit INSIGHT Inference Server (IS)
Lunit INSIGHT CXR & MMG Installation Type
The hardware requirements depend on the type of Lunit INSIGHT installation.
Installation types are usually categorized as follows:
- Cloud
  - AI analysis is done in the cloud.
  - Private data is anonymized before being sent to the cloud.
  - Only the GW needs to be installed on a server within the user's network.
  - The BE & IS need to be installed in the cloud.
- On-premise
  - AI analysis is done on a server within the user's network.
  - Private data is anonymized before being sent to the AI model.
  - All the components (GW, BE, and IS) need to be installed in order to run CXR or MMG.
Hardware Requirements
Each column in the table below represents the components that need to be installed on the server.
- Hybrid Cloud: Only Lunit INSIGHT Gateway (GW) will be installed on the hardware.
  - If you prepare the hardware for the GW, Lunit can host the cloud for the BE & IS.
  - If you want to host the cloud for the BE & IS yourself, refer to the "On-premise" column for the requirements of the VM in the cloud. If you plan to use a public cloud such as AWS or Azure, Lunit can provide guidance on the appropriate VM size.
- On-premise: A single product, either CXR or MMG, will be installed on the hardware.
  - For commercial use requiring high stability, desktop workstations or rack-mounted servers are recommended over laptops.
| Type | Hybrid Cloud (Only GW) | On-premise |
| --- | --- | --- |
| OS | Ubuntu 22.04 Server | Ubuntu 22.04 Server |
| CPU | ≥ Intel Core i3 | ≥ Intel Core i3 that supports OpenVINO™ |
| RAM | ≥ 8 GB | ≥ 16 GB |
| Storage | ≥ 240 GB | ≥ 240 GB |
| Network speed | ≥ 100 Mbps | ≥ 100 Mbps |
* RAM and storage requirements are doubled when installing both CXR and MMG together on a single piece of hardware.
Hardware Selection Guideline
- CPU
  - Lunit INSIGHT Inference Server uses OpenVINO™ for AI prediction.
  - The devices below support OpenVINO. Go to the Supported CPU for Lunit INSIGHT Inference Server page to find out if your CPU can support AI prediction; a quick local check is also sketched after this list.
    - Intel Xeon with Intel AVX2
    - Intel Core Processors with Intel AVX2
- RAM
  - Memory should not be less than what is specified in the table above; otherwise, the product may not run at an optimal level.
  - ECC RAM is recommended because of its durability and reliability.
- OS
  - Ubuntu Server (the version listed in the table above) is used, as it can be installed with a simple installation package and can receive seamless technical support.
- Storage
  - SSD storage is recommended because of its durability and reliability.
  - Take into account that each image (chest X-ray or mammography) will be saved at three times its original size and will be kept for 12 hours by default.
  - 50 GB is reserved for the OS, the product, and basic tools.
  - Below is an example of how to calculate the storage that will be required in practice (a calculation sketch is also shown after this list). The image size and the number of cases are hypothetical values.
    - Chest X-ray image: 10 MB
    - Number of images: 5,000 cases (at once)
    - Total image size: 10 MB × 3 × 5,000 cases = 150 GB
    - Reserved for the OS, the product, and basic tools: 50 GB
    - Total needed storage: 150 GB (images) + 50 GB (reserved) = 200 GB
- Installation of both CXR and MMG on one piece of hardware
  - It is recommended to use one piece of hardware per product, because a hardware issue could otherwise leave both products unusable at the same time.
  - RAM must be doubled when installing multiple products (e.g., CXR & MMG) on a single piece of hardware.
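As a quick, unofficial check of the CPU requirement above, you can see whether a Linux CPU advertises the AVX2 instruction set by inspecting /proc/cpuinfo. This is only a minimal sketch; the Supported CPU for Lunit INSIGHT Inference Server page remains the authoritative reference, and the function name below is illustrative.

```python
# Minimal sketch: check whether this Linux CPU reports the AVX2 flag.
# This only reads /proc/cpuinfo and is not an official Lunit or
# OpenVINO compatibility check.
from pathlib import Path

def has_avx2(cpuinfo_path: str = "/proc/cpuinfo") -> bool:
    for line in Path(cpuinfo_path).read_text().splitlines():
        if line.startswith("flags"):
            return "avx2" in line.split()
    return False

if __name__ == "__main__":
    print("AVX2 supported:", has_avx2())
```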
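The storage example above can also be expressed as a small calculation. The sketch below uses the hypothetical values from this guide (10 MB images, 5,000 cases) together with the 3x storage factor and the 50 GB reservation; the function name is illustrative only.

```python
# Minimal sketch of the storage calculation described above.
STORAGE_FACTOR = 3   # each image is saved at three times its original size
RESERVED_GB = 50     # reserved for the OS, the product, and basic tools

def required_storage_gb(image_size_mb: float, num_cases: int) -> float:
    """Return the total storage (GB) needed for the given workload."""
    images_gb = image_size_mb * STORAGE_FACTOR * num_cases / 1000
    return images_gb + RESERVED_GB

# Hypothetical example from this guide: 10 MB chest X-rays, 5,000 cases at once
print(required_storage_gb(10, 5000))  # -> 200.0 GB
```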
GPU Reduces Analysis Time
Having a GPU reduces analysis time, but CXR and MMG can still run on hardware without a GPU.
The table below shows the GPU requirements for IS analysis.
| Installation Type | GPU VRAM | GPU Compute Capability |
| --- | --- | --- |
| Cloud | (GPU is not required) | (GPU is not required) |
| On-premise | ≥ 4 GB | ≥ 6.1 |
* Only NVIDIA GPUs are supported; other brands such as AMD are not.
* Compute Capability Check Website : https://developer.nvidia.com/cuda-gpus
* VRAM requirements are doubled when installing both CXR and MMG together on a single piece of hardware.
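If an NVIDIA driver is already installed, a quick way to compare the installed GPU against the VRAM requirement above is to query nvidia-smi. The sketch below is a convenience only, with an illustrative helper name; compute capability should still be verified on the NVIDIA page linked above.

```python
# Minimal sketch: query NVIDIA GPU name and total VRAM via nvidia-smi
# (requires the NVIDIA driver; nvidia-smi must be on the PATH).
import subprocess

def list_gpus() -> list[tuple[str, int]]:
    """Return (name, VRAM in MiB) for each detected NVIDIA GPU."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=name,memory.total",
         "--format=csv,noheader,nounits"],
        check=True, capture_output=True, text=True,
    ).stdout
    gpus = []
    for line in out.strip().splitlines():
        name, mem = (v.strip() for v in line.split(","))
        gpus.append((name, int(mem)))
    return gpus

if __name__ == "__main__":
    for name, mem_mib in list_gpus():
        print(f"{name}: {mem_mib} MiB VRAM (meets 4 GB minimum: {mem_mib >= 4096})")
```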
Time to Obtain the AI Prediction Results
Here are some example comparison charts to give you an idea of how the time differs between servers. The CPU model and whether or not a GPU is installed are the main reasons for the differences. Note that even on the same piece of hardware, times may vary depending on the network connection speed, the size of the image file to be analyzed, and the inference content. Thus, we cannot guarantee that results will always fall within the time ranges shown in the charts below.
The times in the charts are estimates of the time from when Lunit INSIGHT receives the original DICOM to when the AI prediction results reach the PACS. (The time spent uploading DICOM images from the PACS to Lunit INSIGHT is not included.)
1. CPU model comparison
- i3 OpenVINO: Intel Core i3 processor, 3.6 GHz, 4 cores
- i7 OpenVINO: Intel Core i7 processor, 3.0 GHz, 8 cores
2. OpenVINO (without GPU) vs GPU
- OpenVINO: Intel Core i3 processor, 3.6 GHz, 4 cores
- GPU: NVIDIA T400 (Compute Capability: 7.5), 4 GB