Difference between revisions of "Publications/kirszenberg.21.dgmm"

From LRDE

{{Publication
| published = true
| date = 2021-02-16
| authors = Alexandre Kirszenberg, Guillaume Tochon, Élodie Puybareau, Jesus Angulo
| title = Going beyond p-convolutions to learn grayscale morphological operators
| booktitle = Proceedings of the IAPR International Conference on Discrete Geometry and Mathematical Morphology (DGMM)
| series = Lecture Notes in Computer Science
| volume = 12708
| address = Uppsala, Sweden
| publisher = Springer
| pages = 470 to 482
| abstract = Integrating mathematical morphology operations within deep neural networks has been subject to increasing attention lately. However, replacing standard convolution layers with erosions or dilations is particularly challenging because the min and max operations are not differentiable. Relying on the asymptotic behavior of the counter-harmonic mean, p-convolutional layers were proposed as a possible workaround to this issue since they can perform pseudo-dilation or pseudo-erosion operations (depending on the value of their inner parameter p), and very promising results were reported. In this work, we present two new morphological layers based on the same principle as the p-convolutional layer while circumventing its principal drawbacks, and demonstrate their potential interest in further implementations within deep convolutional neural network architectures.
| lrdepaper = http://www.lrde.epita.fr/dload/papers/kirszie.2021.dgmm.pdf
| lrdekeywords = Image
| lrdenewsdate = 2021-02-16
| type = inproceedings
| id = kirszenberg.21.dgmm
| identifier = doi:10.1007/978-3-030-76657-3_34
| bibtex =
@InProceedings<nowiki>{</nowiki> kirszenberg.21.dgmm,
  author = <nowiki>{</nowiki>Alexandre Kirszenberg and Guillaume Tochon and \'<nowiki>{</nowiki>E<nowiki>}</nowiki>lodie Puybareau and Jesus Angulo<nowiki>}</nowiki>,
  title = <nowiki>{</nowiki>Going beyond p-convolutions to learn grayscale morphological operators<nowiki>}</nowiki>,
  booktitle = <nowiki>{</nowiki>Proceedings of the IAPR International Conference on Discrete Geometry and Mathematical Morphology (DGMM)<nowiki>}</nowiki>,
  year = <nowiki>{</nowiki>2021<nowiki>}</nowiki>,
  series = <nowiki>{</nowiki>Lecture Notes in Computer Science<nowiki>}</nowiki>,
  volume = <nowiki>{</nowiki>12708<nowiki>}</nowiki>,
  month = may,
  address = <nowiki>{</nowiki>Uppsala, Sweden<nowiki>}</nowiki>,
  publisher = <nowiki>{</nowiki>Springer<nowiki>}</nowiki>,
  pages = <nowiki>{</nowiki>470--482<nowiki>}</nowiki>,
  abstract = <nowiki>{</nowiki>Integrating mathematical morphology operations within deep neural networks has been subject to increasing attention lately. However, replacing standard convolution layers with erosions or dilations is particularly challenging because the min and max operations are not differentiable. Relying on the asymptotic behavior of the counter-harmonic mean, p-convolutional layers were proposed as a possible workaround to this issue since they can perform pseudo-dilation or pseudo-erosion operations (depending on the value of their inner parameter p), and very promising results were reported. In this work, we present two new morphological layers based on the same principle as the p-convolutional layer while circumventing its principal drawbacks, and demonstrate their potential interest in further implementations within deep convolutional neural network architectures.<nowiki>}</nowiki>,
  doi = <nowiki>{</nowiki>10.1007/978-3-030-76657-3_34<nowiki>}</nowiki>
<nowiki>}</nowiki>
}}

Latest revision as of 10:56, 8 September 2021

Abstract

Integrating mathematical morphology operations within deep neural networks has been subject to increasing attention lately. However, replacing standard convolution layers with erosions or dilations is particularly challenging because the min and max operations are not differentiable. Relying on the asymptotic behavior of the counter-harmonic mean, p-convolutional layers were proposed as a possible workaround to this issue since they can perform pseudo-dilation or pseudo-erosion operations (depending on the value of their inner parameter p), and very promising results were reported. In this work, we present two new morphological layers based on the same principle as the p-convolutional layer while circumventing its principal drawbacks, and demonstrate their potential interest in further implementations within deep convolutional neural network architectures.
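As background to the abstract: the p-convolution it refers to is a counter-harmonic mean filter, (f^(p+1) ∗ w) / (f^p ∗ w), which behaves like a weighted maximum (pseudo-dilation) as p grows large and positive, like a weighted minimum (pseudo-erosion) as p grows large and negative, and like an ordinary convolution at p = 0. A minimal 1-D NumPy sketch of this idea (illustrative only — the function name and setting are ours, not the paper's implementation, which operates on 2-D images inside a network):

```python
import numpy as np

def p_conv(f, w, p):
    """Counter-harmonic mean 'p-convolution' of a 1-D signal f by kernel w.

    PConv_p(f, w) = (f**(p+1) * w) / (f**p * w), where '*' is correlation
    over the kernel support. Requires strictly positive f so that the
    powers are well defined for negative p.
    """
    pad = len(w) // 2
    fp = np.pad(f, pad, mode='edge')  # replicate border values
    num = np.array([np.sum(fp[i:i + len(w)] ** (p + 1) * w)
                    for i in range(len(f))])
    den = np.array([np.sum(fp[i:i + len(w)] ** p * w)
                    for i in range(len(f))])
    return num / den

f = np.array([0.2, 0.9, 0.1, 0.5, 0.7])
w = np.ones(3) / 3.0          # flat size-3 structuring element

dil = p_conv(f, w, 20.0)      # p >> 0: approaches the local max (dilation)
ero = p_conv(f, w, -20.0)     # p << 0: approaches the local min (erosion)
avg = p_conv(f, w, 0.0)       # p = 0: ordinary (mean) convolution
```

Because min and max are only reached asymptotically, a finite p yields a smooth, differentiable approximation — which is what makes the layer trainable by backpropagation, and also the source of the drawbacks the paper sets out to circumvent.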

Documents

Paper: http://www.lrde.epita.fr/dload/papers/kirszie.2021.dgmm.pdf

Bibtex (lrde.bib)

@InProceedings{	  kirszenberg.21.dgmm,
  author	= {Alexandre Kirszenberg and Guillaume Tochon and \'{E}lodie
		  Puybareau and Jesus Angulo},
  title		= {Going beyond p-convolutions to learn grayscale
		  morphological operators},
  booktitle	= {Proceedings of the IAPR International Conference on
		  Discrete Geometry and Mathematical Morphology (DGMM)},
  year		= {2021},
  series	= {Lecture Notes in Computer Science},
  volume	= {12708},
  month		= may,
  address	= {Uppsala, Sweden},
  publisher	= {Springer},
  pages		= {470--482},
  abstract	= {Integrating mathematical morphology operations within deep
		  neural networks has been subject to increasing attention
		  lately. However, replacing standard convolution layers with
		  erosions or dilations is particularly challenging because
		  the min and max operations are not differentiable. Relying
		  on the asymptotic behavior of the counter-harmonic mean,
		  p-convolutional layers were proposed as a possible
		  workaround to this issue since they can perform
		  pseudo-dilation or pseudo-erosion operations (depending on
		  the value of their inner parameter p), and very promising
		  results were reported. In this work, we present two new
		  morphological layers based on the same principle as the
		  p-convolutional layer while circumventing its principal
		  drawbacks, and demonstrate their potential interest in
		  further implementations within deep convolutional neural
		  network architectures.},
  doi		= {10.1007/978-3-030-76657-3_34}
}