Publications/dubuisson.15.visapp

From LRDE

 

Latest revision as of 17:00, 27 May 2021

Abstract

The particle filter is known to be efficient for visual tracking. However, its parameters are empirically fixed, depending on the target application, the video sequences and the context. In this paper, we introduce a new algorithm which automatically adjusts “on-line” two major ones: the correction and the propagation parameters. Our purpose is to determine, for each frame of a video, the optimal value of the correction parameter and to adjust the propagation one to improve the tracking performance. On the one hand, our experimental results show that the common settings of the particle filter are sub-optimal. On the other hand, we prove that our approach achieves a lower tracking error without needing to tune these parameters. Our adaptive method makes it possible to track objects in complex conditions (illumination changes, cluttered background, etc.) without adding any computational cost compared to the common usage with fixed parameters.
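For context, the two parameters the abstract refers to appear in the propagation (diffusion noise) and correction (likelihood sharpness) steps of a bootstrap particle filter. The sketch below is a minimal, generic 1D bootstrap filter with these two parameters held fixed, i.e. the "common usage with fixed parameters" baseline the paper improves upon; it does not reproduce the paper's self-adaptive scheme, and all names (`sigma_prop`, `sigma_corr`, `particle_filter_step`) are illustrative, not taken from the paper.

```python
import numpy as np

def particle_filter_step(particles, weights, observation, sigma_prop, sigma_corr, rng):
    """One predict/correct/resample cycle of a bootstrap particle filter.

    sigma_prop is the propagation parameter (diffusion noise) and
    sigma_corr the correction parameter (likelihood sharpness); here
    both are fixed, which is the baseline setting the paper adapts
    on-line per frame.
    """
    # Propagation: diffuse particles with Gaussian noise.
    particles = particles + rng.normal(0.0, sigma_prop, size=particles.shape)
    # Correction: reweight each particle by a Gaussian likelihood
    # centered on the observation.
    weights = weights * np.exp(-0.5 * ((particles - observation) / sigma_corr) ** 2)
    weights = weights / weights.sum()
    # Resampling: redraw the particle set proportionally to the weights.
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], np.full(len(particles), 1.0 / len(particles))

rng = np.random.default_rng(0)
particles = rng.normal(0.0, 1.0, 200)
weights = np.full(200, 1.0 / 200)
true_state = 0.0
for t in range(50):
    true_state += 0.1                        # hidden state drifts slowly
    obs = true_state + rng.normal(0.0, 0.2)  # noisy observation of it
    particles, weights = particle_filter_step(particles, weights, obs, 0.3, 0.5, rng)
estimate = float(particles.mean())           # posterior mean estimate
```

With fixed values, a poor choice of `sigma_prop` or `sigma_corr` degrades tracking under illumination changes or clutter, which is exactly the sensitivity the paper's per-frame adaptation is designed to remove.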

Documents

Bibtex (lrde.bib)

@InProceedings{	  dubuisson.15.visapp,
  author	= {S\'everine Dubuisson and Myriam Robert-Seidowsky and
		  Jonathan Fabrizio},
  title		= {A Self-Adaptive Likelihood Function for Tracking with
		  Particle Filter},
  booktitle	= {Proceedings of the 10th International Conference on
		  Computer Vision Theory and Applications (VISAPP)},
  month		= mar,
  year		= 2015,
  pages		= {446--453},
  abstract	= {The particle filter is known to be efficient for visual
		  tracking. However, its parameters are empirically fixed,
		  depending on the target application, the video sequences
		  and the context. In this paper, we introduce a new
		  algorithm which automatically adjusts ``on-line'' two major
		  ones: the correction and the propagation parameters. Our
		  purpose is to determine, for each frame of a video, the
		  optimal value of the correction parameter and to adjust the
		  propagation one to improve the tracking performance. On the
		  one hand, our experimental results show that the common
		  settings of the particle filter are sub-optimal. On the
		  other hand, we prove that our approach achieves a lower
		  tracking error without needing to tune these parameters.
		  Our adaptive method makes it possible to track objects in
		  complex conditions (illumination changes, cluttered
		  background, etc.) without adding any computational cost
		  compared to the common usage with fixed parameters.},
  doi		= {10.5220/0005260004460453}
}