Going beyond p-convolutions to learn grayscale morphological operators

Abstract

Integrating mathematical morphology operations within deep neural networks has been the subject of increasing attention lately. However, replacing standard convolution layers with erosions or dilations is particularly challenging because the min and max operations are not differentiable. Relying on the asymptotic behavior of the counter-harmonic mean, p-convolutional layers were proposed as a workaround to this issue, since they can perform pseudo-dilation or pseudo-erosion operations (depending on the value of their inner parameter p), and very promising results were reported. In this work, we present two new morphological layers based on the same principle as the p-convolutional layer while circumventing its principal drawbacks, and demonstrate their potential for further integration within deep convolutional neural network architectures.
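
As a rough illustration of the counter-harmonic mean idea the abstract refers to, here is a minimal PyTorch sketch of a p-convolution. The function name p_convolution, the flat 3x3 kernel, the value range of the toy image and the clamping constant eps are illustrative assumptions, not the layer parameterization used in the paper.

# Minimal sketch (not the paper's code): p-convolution via the counter-harmonic mean.
import torch
import torch.nn.functional as F

def p_convolution(f, w, p, eps=1e-6):
    """Counter-harmonic mean filter: (f^(p+1) * w) / (f^p * w).

    p = 0 reduces to an ordinary weighted-average convolution; large positive p
    approaches a grayscale pseudo-dilation of f over the support of w, and
    large negative p approaches a pseudo-erosion.
    Inputs are assumed strictly positive (and not too close to 0) so that the
    powers stay finite for negative p.
    """
    f = f.clamp(min=eps)                # keep powers well defined
    pad = w.shape[-1] // 2              # "same" padding for an odd kernel size
    num = F.conv2d(f ** (p + 1), w, padding=pad)
    den = F.conv2d(f ** p, w, padding=pad)
    return num / (den + eps)

# Toy usage: a flat 3x3 structuring element on a random grayscale image.
f = 0.1 + 0.9 * torch.rand(1, 1, 64, 64)        # strictly positive values
w = torch.ones(1, 1, 3, 3)
pseudo_dilation = p_convolution(f, w, p=10.0)   # close to a 3x3 max filter
pseudo_erosion = p_convolution(f, w, p=-10.0)   # close to a 3x3 min filter

Because the largest (respectively smallest) pixel values dominate the ratio for large positive (respectively negative) p, the output interpolates smoothly between an average, a pseudo-dilation and a pseudo-erosion, which is what makes p trainable by gradient descent.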

Documents

Bibtex (lrde.bib)

@InProceedings{kirszenberg.21.dgmm,
  author    = {Alexandre Kirszenberg and Guillaume Tochon and \'{E}lodie
               Puybareau and Jesus Angulo},
  title     = {Going beyond p-convolutions to learn grayscale
               morphological operators},
  booktitle = {IAPR International Conference on Discrete Geometry and
               Mathematical Morphology (DGMM)},
  year      = {2021},
  series    = {Lecture Notes in Computer Science},
  month     = may,
  address   = {Uppsala, Sweden},
  publisher = {Springer},
  abstract  = {Integrating mathematical morphology operations within deep
               neural networks has been subject to increasing attention
               lately. However, replacing standard convolution layers with
               erosions or dilations is particularly challenging because
               the min and max operations are not differentiable. Relying
               on the asymptotic behavior of the counter-harmonic mean,
               p-convolutional layers were proposed as a possible
               workaround to this issue since they can perform
               pseudo-dilation or pseudo-erosion operations (depending on
               the value of their inner parameter p), and very promising
               results were reported. In this work, we present two new
               morphological layers based on the same principle as the
               p-convolutional layer while circumventing its principal
               drawbacks, and demonstrate their potential interest in
               further implementations within deep convolutional neural
               network architectures.},
  note      = {Accepted}
}