Fast and Low-resource semi-supervised Abdominal oRgan sEgmentation in CT (FLARE 2022)¶
Abdominal organ segmentation has many important clinical applications, such as organ quantification, surgical planning, and disease diagnosis. However, manually annotating organs from CT scans is time-consuming and labor-intensive, so a large number of labeled cases usually cannot be obtained. As a potential alternative, semi-supervised learning can exploit useful information from unlabeled cases.
We extend the FLARE 2021 Challenge from a fully supervised setting to a semi-supervised setting that focuses on how to use unlabeled data. Specifically, we provide a small number of labeled cases (50) and a large number of unlabeled cases (2000) in the training set, 50 visible cases for validation, and 200 hidden cases for testing. The segmentation targets include 13 organs: liver, spleen, pancreas, right kidney, left kidney, stomach, gallbladder, esophagus, aorta, inferior vena cava, right adrenal gland, left adrenal gland, and duodenum. In addition to the typical Dice Similarity Coefficient (DSC) and Normalized Surface Dice (NSD), our evaluation metrics also focus on inference speed and resource (GPU, CPU) consumption. Compared to the FLARE 2021 challenge, the dataset is 4x larger and the segmentation targets are increased to 13 organs. Moreover, the resource-related metrics are changed to the area under the GPU memory-time curve and the area under the CPU utilization-time curve rather than maximum GPU memory consumption.
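As context for the accuracy metrics, the per-organ DSC can be computed from two label maps as in the following minimal NumPy sketch (illustrative only, not the official evaluation code):

```python
import numpy as np

def dice_coefficient(pred: np.ndarray, gt: np.ndarray, label: int) -> float:
    """Dice Similarity Coefficient for one organ label in two segmentation masks."""
    p = (pred == label)
    g = (gt == label)
    denom = p.sum() + g.sum()
    if denom == 0:
        return 1.0  # both masks empty: count as perfect agreement by convention
    return 2.0 * np.logical_and(p, g).sum() / denom

# toy 1D example: the masks overlap on 2 voxels; sizes are 3 and 2
pred = np.array([0, 1, 1, 1, 0])
gt = np.array([0, 1, 1, 0, 0])
print(dice_coefficient(pred, gt, 1))  # 2*2/(3+2) = 0.8
```

In the challenge this is averaged over the 13 organ labels per case; NSD additionally measures agreement of the segmentation surfaces within a tolerance.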
The FLARE 2022 challenge has three main features:
- Task: we use a semi-supervised setting that focuses on how to use unlabeled data.
- Dataset: we curate a large-scale and diverse abdomen CT dataset, including 2300 CT scans from 20+ medical groups.
- Evaluation measures: we not only focus on segmentation accuracy but also on segmentation efficiency and resource consumption.
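The resource-related measures above reduce to areas under sampled monitoring curves. A minimal sketch of such a computation, assuming (time, value) samples collected during inference (the official evaluation scripts may differ in sampling and units):

```python
import numpy as np

def area_under_curve(times_s, values):
    """Trapezoidal area under a sampled resource curve, e.g. GPU memory (MB)
    or CPU utilization (%) recorded over time (s).
    Units: MB*s for memory, %*s for CPU utilization."""
    return float(np.trapz(values, times_s))

# toy example: 2000 MB of GPU memory held for 10 s, sampled every 5 s
t = [0, 5, 10]
mem_mb = [2000, 2000, 2000]
print(area_under_curve(t, mem_mb))  # 20000.0 MB*s
```

Unlike a peak-memory metric, this area rewards methods that both use little memory and finish quickly.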
Current: Proceeding Publishing¶
News: The challenge summary is available on The Lancet Digital Health https://www.sciencedirect.com/science/article/pii/S2589750024001547
News: The challenge proceeding has been published: https://link.springer.com/book/10.1007/978-3-031-23911-3
We will publish a proceedings volume in the Lecture Notes in Computer Science (LNCS) series to present the participants' methods: https://openreview.net/group?id=MICCAI.org/2022/Challenge/FLARE
The code and papers of the top teams are available on the awards page.
Timeline¶
- 15 March 2022 (12:00 AM EST): Launch of challenge and release of training data.
- 31 March 2022 (12:00 AM EST): Release of validation data. Docker and short-paper submission for the validation set opens.
- 15 April 2022 (12:00 AM EST): Deadline for the 1st validation submission.
- 15 May 2022 (12:00 AM EST): Deadline for the 2nd validation submission.
- 15 June 2022 (12:00 AM EST): Deadline for the 3rd validation submission and new registration. Docker and short-paper submission for the testing set opens.
- 30 June 2022 (12:00 AM EST): Deadline for the 4th validation submission and new registration.
- 7 July 2022 (12:00 AM EST): Deadline for the 5th validation submission and new registration.
- 15 July 2022 (12:00 AM EST): Deadline for testing submission.
- 15 August 2022 (12:00 AM EST): Invite top teams to prepare presentations and participate in the MICCAI 2022 satellite event.
- 22 September 2022: Final results and awards made publicly available.
- Open submission
How to participate¶
Stage 0. Get the qualification¶
Challenge submissions are Docker-based, so participants should demonstrate basic segmentation skills and the ability to encapsulate their methods in Docker. We provide a playground for participants to practice.
Participants should:
- Develop any segmentation method (e.g., U-Net) on the playground training dataset and encapsulate it in a Docker container.
- Use the Docker container to predict the testing set and record 5-10 minutes of the prediction process as a video (mp4 format).
- Submit the segmentation results here and upload your Docker image to DockerHub. Send (1) the Docker Hub link, (2) a download link to the recorded inference mp4 video, and (3) a screenshot of your playground leaderboard results (mean DSC > 0.3) to MICCAI.FLARE@aliyun.com. After reviewing your submission, we will reply with an Entry Number, and you can then join the FLARE22 Challenge.
If you have won an award in the MICCAI FLARE21 Challenge, you are exempt from this step and can go directly to Stage 1. Your Entry Number is the Certificate Number on your award certificate.
If you have made a successful submission in another Docker-based MICCAI challenge (e.g., KiTS 2021, BraTS 2021...), you can also be exempt from Stage 0. Please send supporting materials to MICCAI.FLARE@aliyun.com, and we will get back to you with an Entry Number.
Each team only needs one Entry Number.
Stage 1. Join the challenge and download the data¶
- Register on the website and verify your account.
- Click the green 'Join' button to participate in the FLARE22 Challenge. Your application will be automatically approved.
Note: Please only use lowercase letters and numbers in your team name! No spacing or special characters.
Stage 2. Develop your model and make validation submissions¶
- We offer three official submission opportunities on the validation set. To avoid submission jams, the three opportunities are scheduled in three different months (see Timeline).
- To make official submissions, please send us (MICCAI.FLARE@aliyun.com): (1) a download link to your Docker container (teamname.tar.gz); (2) a sanity-test video (download example: Google Drive, Baidu Netdisk): test your Docker on the validation cases FLARETs_0001_0000.nii.gz, FLARETs_0016_0000.nii.gz, FLARETs_0010_0000.nii.gz, FLARETs_0030_0000.nii.gz, and FLARETs_0050_0000.nii.gz, and record the prediction process; (3) a methodology paper. When the evaluation is finished, we will return all metrics via email.
- Meanwhile, all teams have three chances per day to get DSC scores on the validation set by directly submitting segmentation results on the Submit page. (All participants must submit as teams, including teams with a single member; personal submissions are not allowed.) Teams that violate this rule (e.g., making multiple submissions in a day via different team members' accounts) will be banned from submitting for a week!
Stage 3. Make testing submissions¶
- To avoid overfitting the testing set, we only offer one successful submission opportunity on the testing set.
- The submission should include (1) a download link to your Docker container (teamname.tar.gz); (2) a sanity-test video (download example: Google Drive, Baidu Netdisk): test your Docker on the validation cases FLARETs_0001_0000.nii.gz, FLARETs_0016_0000.nii.gz, FLARETs_0010_0000.nii.gz, FLARETs_0030_0000.nii.gz, and FLARETs_0050_0000.nii.gz, and record the prediction process; (3) a methodology paper.
- The submitted Docker container will be evaluated with the following commands. If the Docker container does not work or the paper does not include all the necessary information to reproduce the method, we will return the error information and review comments to the participants.
docker load -i teamname.tar.gz
docker container run --gpus "device=1" --name teamname --rm -v $PWD/FLARE22_Test/:/workspace/inputs/ -v $PWD/teamname_outputs/:/workspace/outputs/ teamname:latest /bin/bash -c "sh predict.sh"
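Conversely, the expected `teamname.tar.gz` archive is typically produced as follows (a sketch only; the image name is the team's own, and the exact build steps depend on each method's Dockerfile):

```shell
# build the image from the team's Dockerfile (image name is illustrative)
docker build -t teamname:latest .

# export the image in the gzipped archive format loaded by the commands above;
# docker load -i accepts gzip-compressed archives directly
docker save teamname:latest | gzip -c > teamname.tar.gz
```

The container's `predict.sh` should read input CT scans from /workspace/inputs/ and write segmentation masks to /workspace/outputs/, matching the mounted volumes in the evaluation command.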
Sponsor¶