Medical Image Segmentation
Medical image segmentation is a computer vision task that divides a medical image into multiple segments, each representing a different object or anatomical structure of interest. The goal is a precise and accurate delineation of these structures, typically for diagnosis, treatment planning, and quantitative analysis.
(Image credit: IVD-Net)
Papers
Showing 1–10 of 2089 papers
Datasets covered: Kvasir-SEG, CVC-ClinicDB, CVC-ColonDB, ETIS-LARIBPOLYPDB, Synapse multi-organ CT, Automatic Cardiac Diagnosis Challenge (ACDC), MoNuSeg, 2018 Data Science Bowl, GlaS, BKAI-IGH NeoPolyp-Small, MICCAI 2015 Multi-Atlas Abdomen Labeling Challenge
Benchmark Results
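Most of the leaderboards below rank models by mean Dice, also reported as Avg DSC (Dice similarity coefficient), which measures the overlap between a predicted segmentation mask and the ground truth. A minimal sketch of the metric for binary masks (illustrative only, not code from any of the listed papers):

```python
import numpy as np

def dice_score(pred: np.ndarray, target: np.ndarray, eps: float = 1e-7) -> float:
    """Dice similarity coefficient (DSC) between two binary masks.

    DSC = 2 * |A ∩ B| / (|A| + |B|); ranges from 0 (no overlap) to 1 (perfect).
    The small eps keeps the result defined when both masks are empty.
    """
    pred = pred.astype(bool)
    target = target.astype(bool)
    intersection = np.logical_and(pred, target).sum()
    return (2.0 * intersection + eps) / (pred.sum() + target.sum() + eps)

# Toy example: two 4x4 masks, each with 3 foreground pixels, 2 of which overlap.
pred = np.zeros((4, 4), dtype=np.uint8)
target = np.zeros((4, 4), dtype=np.uint8)
pred[0, 0:3] = 1    # predicted foreground: columns 0-2
target[0, 1:4] = 1  # ground-truth foreground: columns 1-3
print(round(dice_score(pred, target), 3))  # 2*2 / (3+3) ≈ 0.667
```

A "mean Dice" of 0.95 on a polyp benchmark such as Kvasir-SEG is this score averaged over all test images.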

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | DUCK-Net | mean Dice | 0.95 | — | Unverified |
| 2 | EffiSegNet-B5 | mean Dice | 0.95 | — | Unverified |
| 3 | EffiSegNet-B4 | mean Dice | 0.95 | — | Unverified |
| 4 | SegMed | mean Dice | 0.95 | — | Unverified |
| 5 | FCB Former | mean Dice | 0.94 | — | Unverified |
| 6 | FCB-SwinV2 Transformer | mean Dice | 0.94 | — | Unverified |
| 7 | SEP | mean Dice | 0.94 | — | Unverified |
| 8 | LM-Net | mean Dice | 0.94 | — | Unverified |
| 9 | RAPUNet | mean Dice | 0.94 | — | Unverified |
| 10 | FCBFormer | mean Dice | 0.94 | — | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | DUCK-Net | mean Dice | 0.97 | — | Unverified |
| 2 | RAPUNet | mean Dice | 0.96 | — | Unverified |
| 3 | EMCAD | mean Dice | 0.95 | — | Unverified |
| 4 | Yolo-SAM 2 | mean Dice | 0.95 | — | Unverified |
| 5 | RaBiT | mean Dice | 0.95 | — | Unverified |
| 6 | UGCANet | mean Dice | 0.95 | — | Unverified |
| 7 | ESFPNet-L | mean Dice | 0.95 | — | Unverified |
| 8 | FCBFormer | mean Dice | 0.95 | — | Unverified |
| 9 | DuAT | mean Dice | 0.95 | — | Unverified |
| 10 | SegMed | mean Dice | 0.95 | — | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | RAPUNet | mean Dice | 0.95 | — | Unverified |
| 2 | DUCK-Net | mean Dice | 0.94 | — | Unverified |
| 3 | EMCAD | mean Dice | 0.92 | — | Unverified |
| 4 | SegMed | mean Dice | 0.92 | — | Unverified |
| 5 | UniNet | mean Dice | 0.92 | — | Unverified |
| 6 | ProMISe | mean Dice | 0.87 | — | Unverified |
| 7 | Meta-Polyp | mean Dice | 0.87 | — | Unverified |
| 8 | ResUNet++ + TTA | mean Dice | 0.85 | — | Unverified |
| 9 | PVT-GCASCADE | mean Dice | 0.83 | — | Unverified |
| 10 | PVT-CASCADE | mean Dice | 0.83 | — | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | RAPUNet | mean Dice | 0.96 | — | Unverified |
| 2 | SegMed | mean Dice | 0.94 | — | Unverified |
| 3 | DUCK-Net | mean Dice | 0.94 | — | Unverified |
| 4 | EMCAD | mean Dice | 0.92 | — | Unverified |
| 5 | ProMISe | mean Dice | 0.84 | — | Unverified |
| 6 | RSAFormer | mean Dice | 0.84 | — | Unverified |
| 7 | ESFPNet-L | mean Dice | 0.82 | — | Unverified |
| 8 | DuAT | mean Dice | 0.82 | — | Unverified |
| 9 | PVT-CASCADE | mean Dice | 0.8 | — | Unverified |
| 10 | SSFormer-L | mean Dice | 0.8 | — | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | Interactive AI-SAM gt box | Avg DSC | 90.66 | — | Unverified |
| 2 | Medical SAM Adapter | Avg DSC | 89.8 | — | Unverified |
| 3 | MedSegDiff-v2 | Avg DSC | 89.5 | — | Unverified |
| 4 | nnUNet | Avg DSC | 88.8 | — | Unverified |
| 5 | MedNeXt-L (5x5x5) | Avg DSC | 88.76 | — | Unverified |
| 6 | MIST | Avg DSC | 86.92 | — | Unverified |
| 7 | nnFormer | Avg DSC | 86.57 | — | Unverified |
| 8 | AgileFormer | Avg DSC | 86.11 | — | Unverified |
| 9 | MERIT | Avg DSC | 84.9 | — | Unverified |
| 10 | Automatic AI-SAM | Avg DSC | 84.21 | — | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | FCT | Avg DSC | 94.26 | — | Unverified |
| 2 | Interactive AI-SAM gt box | Avg DSC | 93.89 | — | Unverified |
| 3 | FCT | Avg DSC | 93.02 | — | Unverified |
| 4 | LHU-Net | Avg DSC | 92.65 | — | Unverified |
| 5 | MIST | Avg DSC | 92.56 | — | Unverified |
| 6 | MERIT | Avg DSC | 92.32 | — | Unverified |
| 7 | MERIT-GCASCADE | Avg DSC | 92.23 | — | Unverified |
| 8 | EMCAD | Avg DSC | 92.12 | — | Unverified |
| 9 | nnFormer | Avg DSC | 92.06 | — | Unverified |
| 10 | Automatic AI-SAM | Avg DSC | 92.06 | — | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | Stardist | F1 | 84.6 | — | Unverified |
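The last leaderboard reports F1 rather than Dice. From true-positive, false-positive, and false-negative counts, F1 = 2TP / (2TP + FP + FN), which is algebraically identical to Dice over binary pixel counts; for nuclei instance segmentation (e.g. Stardist), the counts typically refer instead to predicted instances matched to ground-truth instances at an IoU threshold. A hedged sketch with made-up counts for illustration:

```python
def f1_from_counts(tp: int, fp: int, fn: int) -> float:
    """F1 = 2*TP / (2*TP + FP + FN).

    Over binary pixel counts this equals the Dice coefficient; for
    instance segmentation the counts refer to matched objects.
    """
    denom = 2 * tp + fp + fn
    return 2 * tp / denom if denom else 0.0

# Illustrative counts only (not taken from the leaderboard):
print(round(f1_from_counts(tp=80, fp=20, fn=10), 4))  # 160/190 ≈ 0.8421
```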