Difference between revisions of "Publications/movn.21.bmvc"

From LRDE

 
Latest revision as of 18:08, 7 April 2023

Abstract

Most contemporary supervised image segmentation methods do not preserve the initial topology of the given input (like the closeness of the contours). One can generally remark that edge points have been inserted or removed when the binary prediction and the ground truth are compared. This can be critical when accurate localization of multiple interconnected objects is required. In this paper, we present a new loss function, called Boundary-Aware loss (BALoss), based on the Minimum Barrier Distance (MBD) cut algorithm. It is able to locate what we call the leakage pixels and to encode the boundary information coming from the given ground truth. Thanks to this adapted loss, we are able to significantly refine the quality of the predicted boundaries during the learning procedure. Furthermore, our loss function is differentiable and can be applied to any kind of neural network used in image processing. We apply this loss function to the standard U-Net and DC U-Net on Electron Microscopy datasets, which are well known to be challenging due to their high noise level and to the close or even connected objects covering the image space. Our segmentation performance, in terms of Variation of Information (VOI) and Adapted Rand Index (ARI), is very promising and leads to roughly 15% better VOI scores and roughly 5% better ARI scores than the state-of-the-art. The code of the boundary-aware loss is freely available at https://github.com/onvungocminh/MBD_BAL
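The loss described above builds on the Minimum Barrier Distance: the barrier cost of a path is the difference between the maximum and minimum pixel values along it, and the MBD of a pixel is the smallest barrier cost over all paths from a seed set. As a rough illustration of that distance only (not the paper's BALoss implementation, which lives in the linked repository), here is a Dijkstra-style sketch in plain Python; the function name `mbd_approx` and its seed convention are made up for this example, and the sweep is a common approximation rather than an exact MBD algorithm:

```python
import heapq

def mbd_approx(image, seeds):
    """Approximate Minimum Barrier Distance on a 2D grid.

    Barrier cost of a path = max(values) - min(values) along it;
    MBD(p) = minimum barrier cost over paths from any seed to p.
    This Dijkstra-style sweep keeps the best (barrier, max, min)
    state per pixel; it is an illustrative approximation.
    """
    h, w = len(image), len(image[0])
    dist = [[float("inf")] * w for _ in range(h)]
    heap = []
    for (r, c) in seeds:
        v = image[r][c]
        dist[r][c] = 0.0
        # entries are (barrier, path_max, path_min, row, col)
        heapq.heappush(heap, (0.0, v, v, r, c))
    while heap:
        d, hi, lo, r, c = heapq.heappop(heap)
        if d > dist[r][c]:
            continue  # stale entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < h and 0 <= nc < w:
                v = image[nr][nc]
                nhi, nlo = max(hi, v), min(lo, v)
                nd = nhi - nlo  # barrier cost of the extended path
                if nd < dist[nr][nc]:
                    dist[nr][nc] = nd
                    heapq.heappush(heap, (nd, nhi, nlo, nr, nc))
    return dist
```

On a flat image with one bright pixel, the MBD to that pixel equals its contrast with the background, while pixels reachable around it keep distance zero; this is the property that lets an MBD cut localize where a predicted boundary "leaks".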

Documents

Bibtex (lrde.bib)

@InProceedings{	  movn.21.bmvc,
  author	= {Minh \^On V\~{u} Ng\d{o}c and Yizi Chen and Nicolas Boutry
		  and Joseph Chazalon and Edwin Carlinet and Jonathan
		  Fabrizio and Cl\'ement Mallet and Thierry G\'eraud},
  title		= {Introducing the Boundary-Aware Loss for Deep Image
		  Segmentation},
  booktitle	= {Proceedings of the 32nd British Machine Vision Conference
		  (BMVC)},
  year		= 2021,
  address	= {Online},
  abstract	= {Most contemporary supervised image segmentation methods do
		  not preserve the initial topology of the given input (like
		  the closeness of the contours). One can generally remark
		  that edge points have been inserted or removed when the
		  binary prediction and the ground truth are compared. This
		  can be critical when accurate localization of multiple
		  interconnected objects is required. In this paper, we
		  present a new loss function, called Boundary-Aware loss
		  (BALoss), based on the Minimum Barrier Distance (MBD) cut
		  algorithm. It is able to locate what we call the {\it
		  leakage pixels} and to encode the boundary information
		  coming from the given ground truth. Thanks to this adapted
		  loss, we are able to significantly refine the quality of
		  the predicted boundaries during the learning procedure.
		  Furthermore, our loss function is differentiable and can be
		  applied to any kind of neural network used in image
		  processing. We apply this loss function on the standard
		  U-Net and DC U-Net on Electron Microscopy datasets. They
		  are well-known to be challenging due to their high noise
		  level and to the close or even connected objects covering
		  the image space. Our segmentation performance, in terms of
		  Variation of Information (VOI) and Adapted Rand Index
		  (ARI), is very promising and leads to $\approx{}15\%$
		  better scores of VOI and $\approx{}5\%$ better scores of
		  ARI than the state-of-the-art. The code of
		  boundary-awareness loss is freely available at
		  \url{https://github.com/onvungocminh/MBD_BAL}},
  note		= {https://www.bmvc2021-virtualconference.com/assets/papers/1546.pdf},
  nodoi		= {}
}