Anatomical Prior-Inspired Label Refinement for Weakly Supervised Liver Tumor Segmentation with Volume-Level Labels

Fei Lyu, Andy Jinhua Ma, Pong Chi Yuen

Research output: Chapter in book/report/conference proceeding › Conference proceeding › peer-review

Abstract

Automatic liver tumor segmentation is important for assisting doctors in the accurate diagnosis of liver cancer. Existing models for liver tumor segmentation usually require accurate pixel-level labels, but acquiring such dense labels is laborious and costly. In this paper, we propose a weakly supervised method for liver tumor segmentation using volume-level labels, which can significantly reduce the manual annotation effort. Volume-level labels are propagated to image-level labels, so that all slices in one CT volume share the same label, and pixel-level pseudo labels can then be estimated from the image-level labels. However, this propagation introduces label noise because not all slices in a tumor-positive volume actually contain tumors. To address this issue, we propose two label refinement strategies based on anatomical priors to reduce the training noise and improve performance. Evaluation experiments on two public datasets demonstrate that our proposed method achieves competitive results compared to other methods with stronger supervision.
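The propagation step described in the abstract can be sketched in a few lines. This is an illustrative toy example, not the authors' implementation: the function name and inputs are hypothetical, and it only shows how copying a volume-level label to every slice produces the label noise the paper's refinement strategies target.

```python
def propagate_volume_labels(volume_labels, slices_per_volume):
    """Copy each volume-level label to every slice of that volume.

    volume_labels: one 0/1 tumor label per CT volume (hypothetical input)
    slices_per_volume: number of axial slices in each volume
    Returns a flat list with one label per slice.
    """
    slice_labels = []
    for label, n_slices in zip(volume_labels, slices_per_volume):
        # Every slice inherits its volume's label, even though a
        # tumor-positive volume usually also contains tumor-free
        # slices -- this mislabeling is the noise to be refined away.
        slice_labels.extend([label] * n_slices)
    return slice_labels

# Two volumes: the first tumor-positive (3 slices), the second negative (2 slices).
labels = propagate_volume_labels([1, 0], [3, 2])
# -> [1, 1, 1, 0, 0]; the three positive slice labels include any
#    tumor-free slices of volume 1, i.e. noisy image-level labels.
```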
Original language: English
Title of host publication: The 33rd British Machine Vision Conference Proceedings, BMVC 2022
Publisher: British Machine Vision Association
Pages: 1-12
Number of pages: 12
Publication status: Published - Nov 2022
Event: 33rd British Machine Vision Conference, BMVC 2022 - London, United Kingdom
Duration: 21 Nov 2022 - 24 Nov 2022
https://bmvc2022.org/
https://bmvc2022.mpi-inf.mpg.de/

Conference

Conference: 33rd British Machine Vision Conference, BMVC 2022
Country/Territory: United Kingdom
City: London
Period: 21/11/22 - 24/11/22