- Research Article
- Open Access

# Integrating the Projective Transform with Particle Filtering for Visual Tracking

P. L. M. Bouttefroy^{1}, A. Bouzerdoum^{1}, S. L. Phung^{1}, and A. Beghdadi^{2}

**2011**:839412

https://doi.org/10.1155/2011/839412

© P. L. M. Bouttefroy et al. 2011

**Received:** 9 April 2010 · **Accepted:** 26 October 2010 · **Published:** 3 November 2010

## Abstract

This paper presents the projective particle filter, a Bayesian filtering technique integrating the projective transform, which describes the distortion of vehicle trajectories on the camera plane. The characteristics inherent to traffic monitoring, and in particular the projective transform, are integrated in the particle filtering framework in order to improve the tracking robustness and accuracy. It is shown that the projective transform can be fully described by three parameters, namely, the angle of view, the height of the camera, and the ground distance to the first point of capture. This information is integrated in the importance density so as to explore the feature space more accurately. By providing a fine distribution of the samples in the feature space, the projective particle filter outperforms the standard particle filter on different tracking measures. First, the resampling frequency is reduced due to a better fit of the importance density for the estimation of the posterior density. Second, the mean squared error between the feature vector estimate and the true state is reduced compared to the estimate provided by the standard particle filter. Third, the tracking rate is improved for the projective particle filter, hence decreasing track loss.

## Keywords

- Mean Square Error
- Video Sequence
- Particle Filter
- Object Tracking
- Average Mean Square Error

## 1. Introduction and Motivations

Vehicle tracking has been an active field of research over the past decade due to the increase in computational power and the development of video surveillance infrastructure. The area of Intelligent Transportation Systems (ITSs) is in need of robust tracking algorithms to ensure that top-end decisions such as automatic traffic control and regulation, automatic video surveillance, and abnormal event detection are made with a high level of confidence. Accurate trajectory extraction provides essential statistics for traffic control, such as speed monitoring, vehicle count, and average vehicle flow. Therefore, as a low-level task at the bottom end of an ITS, vehicle tracking must provide accurate and robust information to higher-level modules making intelligent decisions. In this sense, intelligent transportation systems are a major breakthrough since they alleviate the need for devices that can be prohibitively costly or simply impractical to install. For instance, the installation of inductive loop sensors generates traffic perturbations that cannot always be afforded in dense traffic areas. Robust video tracking also enables new applications, such as vehicle identification and customized statistics, that are not available with current technologies, for example, suspect vehicle tracking or differentiated vehicle speed limits. At the top end of the system are high-level tasks such as event detection (e.g., accident and animal crossing) or traffic regulation (e.g., dynamic adaptation and lane allocation). Robust vehicle tracking is therefore necessary to ensure effective performance.

Several techniques have been developed for vehicle tracking over the past two decades. The most common ones rely on Bayesian filtering, Kalman and particle filters in particular. Kalman filter-based tracking usually relies on background subtraction followed by segmentation [1, 2], although some techniques implement spatial features such as corners and edges [3, 4] or use Bayesian energy minimization [5]. Exhaustive search techniques involving template matching [6] or occlusion reasoning [7] have also been used for tracking vehicles. Particle filtering is preferred when the hypothesis of multimodality is necessary, for example, in case of severe occlusion [8, 9]. Particle filters offer the advantage of relaxing the Gaussian and linearity constraints imposed upon the Kalman filter. On the downside, particle filters only provide a suboptimal solution, which converges in a statistical sense to the optimal solution. The convergence is of the order O(1/√N), where N is the number of particles; consequently, they are computation-intensive algorithms. For this reason, particle filtering techniques for visual tracking have been developed only recently, with the widespread availability of powerful computers. Particle filters for visual object tracking were first introduced by Isard and Blake as part of the CONDENSATION algorithm [10, 11], and by Doucet [12]. Arulampalam et al. provide a more general introduction to Bayesian filtering, encompassing particle filter implementations [13]. Within the last decade, the interest in particle filters has been growing exponentially. Early contributions were based on Kalman filter models; for instance, Van Der Merwe et al. discussed an extended particle filter (EPF) and proposed an unscented particle filter (UPF), using the unscented transform to capture second-order nonlinearities [14]. Later, a Gaussian sum particle filter was introduced to reduce the computational complexity [15].
There has also been a plethora of theoretical improvements to the original algorithm, such as the kernel particle filter [16, 17], the iterated extended Kalman particle filter [18], the adaptive sample size particle filter [19, 20], and the augmented particle filter [21]. As far as applications are concerned, particle filters are widely used in a variety of tracking tasks: head tracking via active contours [22, 23], edge and color histogram tracking [24, 25], and sonar [26] and phase [27] tracking, to name a few. Particle filters have also been used for object detection and segmentation [28, 29], and for audiovisual fusion [30].
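As a concrete illustration of the framework these works build on, the sketch below implements a minimal bootstrap (SIR) particle filter for a one-dimensional state. The random-walk process model, the Gaussian likelihood, and all parameter values are our assumptions for illustration only, not the models used in the cited trackers.

```python
import numpy as np

rng = np.random.default_rng(0)

def sir_particle_filter(observations, n_particles=500,
                        process_std=1.0, obs_std=2.0):
    """Minimal bootstrap (SIR) particle filter for a 1-D random-walk
    state observed in Gaussian noise. Illustrative sketch only."""
    particles = rng.normal(0.0, 5.0, n_particles)       # initial spread
    weights = np.full(n_particles, 1.0 / n_particles)
    estimates = []
    for z in observations:
        # predict: propagate each particle through the process model
        particles = particles + rng.normal(0.0, process_std, n_particles)
        # update: reweight by the Gaussian likelihood of the observation
        weights *= np.exp(-0.5 * ((z - particles) / obs_std) ** 2)
        weights /= weights.sum()
        estimates.append(np.sum(weights * particles))   # posterior mean
        # resample when the effective sample size degenerates
        if 1.0 / np.sum(weights ** 2) < n_particles / 2:
            idx = rng.choice(n_particles, n_particles, p=weights)
            particles = particles[idx]
            weights = np.full(n_particles, 1.0 / n_particles)
    return np.array(estimates)
```

Note the resampling step in the loop: it is exactly the computational burden that a better importance density, as proposed in this paper, makes less frequent.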

Many vehicle tracking systems have been proposed that integrate features of the object, such as the traditional kinematic model parameters [2, 7, 31–33] or scale [1], in the tracking model. However, these techniques seldom integrate information specific to the vehicle tracking problem, which is key to the improvement of track extraction; rather, they are general estimators disregarding the particular traffic surveillance context. Since particle filters require a large number of samples in order to achieve accurate and robust tracking, information pertaining to the behavior of the vehicle is instrumental in drawing samples from the importance density. To this end, the projective fractional transform is used to map the vehicle position in the real world to its position on the camera plane. In [35], Bouttefroy et al. proposed the projective Kalman filter (PKF), which integrates the projective transform into the Kalman tracker to improve its performance. However, the PKF tracker differs from the proposed particle filter tracker in that the former relies on background subtraction to extract the objects, whereas the latter uses color information to track the objects.

The aim of this paper is to study the performance of a particle filter integrating vehicle characteristics in order to decrease the size of the particle set for a given error rate. In this framework, the task of vehicle tracking can be approached as a specific application of object tracking in a constrained environment. Indeed, vehicles do not evolve freely in their environment but follow particular trajectories. The most notable constraints imposed upon vehicle trajectories in traffic video surveillance are summarized below.

Low Definition and Highly Compressed Videos

Traffic monitoring video sequences are often of poor quality because of inadequate acquisition and transmission infrastructure. Therefore, the sample set necessary for vehicle tracking must be large to ensure robust and accurate estimates.

Slowly-Varying Vehicle Speed

A common assumption in vehicle tracking is the uniformity of the vehicle speed. The narrow angle of view of the scene and the short period of time a vehicle is in the field of view justify this assumption, especially when tracking vehicles on a highway.

Constrained Real-World Vehicle Trajectory

Projection of Vehicle Trajectory on the Camera Plane

The trajectory of a vehicle on the camera plane undergoes severe distortion due to the low elevation of the traffic surveillance camera. The curve described by the position of the vehicle converges asymptotically to the vanishing point.

We propose here to integrate these characteristics to obtain a finer estimate of the vehicle feature vector. More specifically, the mapping of the real-world vehicle trajectory through a fractional transform enables a better estimate of the posterior density. A particle filter is thus implemented, which integrates cues of the projection into the importance density, resulting in a better exploration of the state space and a reduction of the variance in the trajectory estimation. Preliminary results of this work have been presented in [34]; this paper develops the work further. Its main contributions are: (i) a complete description of the homographic projection problem for vehicle tracking and a review of the solutions proposed to date; (ii) an evaluation of the projective particle filter tracking rate on a comprehensive dataset comprising around 2,600 vehicles; (iii) an evaluation of the resampling accuracy for the projective particle filter; (iv) a comparison of the performance of the projective particle filter and the standard particle filter using three different measures, namely, the resampling frequency, the mean squared error, and the tracking drift. The rest of the paper is organized as follows. Section 2 introduces the general particle filtering framework. Section 3 develops the proposed Projective Particle Filter (PPF). An analysis of the PPF performance versus the standard particle filter is presented in Section 4 before concluding in Section 5.

## 2. Bayesian and Particle Filtering

The aim of Bayesian filtering is to estimate recursively the *posterior* probability density function (pdf) of the state x_k given the observations z_{1:k}, denoted p(x_k | z_{1:k}). The probability density function is estimated in two steps: prediction and update. First, let us denote by p(x_{k−1} | z_{1:k−1}) the *posterior* pdf at time k−1, and let us assume it is known. The prediction stage relies on the Chapman-Kolmogorov equation to estimate the prior pdf p(x_k | z_{1:k−1}):

p(x_k | z_{1:k−1}) = ∫ p(x_k | x_{k−1}) p(x_{k−1} | z_{1:k−1}) dx_{k−1}.

The update stage then incorporates the new observation z_k through Bayes' rule:

p(x_k | z_{1:k}) = p(z_k | x_k) p(x_k | z_{1:k−1}) / p(z_k | z_{1:k−1}),

where p(z_k | x_k) is the likelihood function and p(z_k | z_{1:k−1}) is a normalizing constant, p(z_k | z_{1:k−1}) = ∫ p(z_k | x_k) p(x_k | z_{1:k−1}) dx_k. As the posterior probability density function is recursively estimated through (3) and (4), only the initial density p(x_0) needs to be known.

The choice of the importance density is crucial in order to obtain a good estimate of the posterior pdf. It has been shown that the set of particles and associated weights will eventually degenerate, that is, most of the weight will be carried by a small number of samples while a large number of samples have negligible weight [38]. Because samples are not drawn from the true posterior, the degeneracy problem cannot be avoided, and resampling of the set needs to be performed. Nevertheless, the closer the importance density is to the true posterior density, the slower the set will degenerate; a good choice of importance density therefore reduces the need for resampling. In this paper, we propose to model the fractional transform mapping the real-world space onto the camera plane and to integrate the projection into the particle filter through the importance density q(x_k | x_{k−1}, z_k).
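The degeneracy measure and a resampling step can be sketched as follows; the function names are ours, and the systematic scheme shown is one common choice rather than necessarily the resampler used in the paper.

```python
import numpy as np

def effective_sample_size(weights):
    """N_eff = 1 / sum(w_i^2): N_eff close to N means well-spread
    weights; N_eff close to 1 means the set has degenerated onto a
    few particles and resampling is needed."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    return 1.0 / np.sum(w ** 2)

def systematic_resample(weights, rng=np.random.default_rng(0)):
    """Systematic resampling: one uniform draw, then N evenly spaced
    pointers over the cumulative weights. Returns particle indices."""
    n = len(weights)
    positions = (rng.random() + np.arange(n)) / n
    cumulative = np.cumsum(weights)
    cumulative[-1] = 1.0                 # guard against round-off
    return np.searchsorted(cumulative, positions)
```

A filter typically triggers `systematic_resample` only when `effective_sample_size` drops below a fixed fraction of N, which is why a well-chosen importance density directly lowers the resampling frequency.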

## 3. Projective Particle Filter

The particle filter developed is named Projective Particle Filter (PPF) because the vehicle position is projected onto the camera plane and used as an inference to diffuse the particles in the feature space. One of the particularities of the PPF is that it differentiates between the importance density and the transition prior pdf, whereas the SIR (Sampling Importance Resampling) filter, also called the standard particle filter, does not. Therefore, we need to define the importance density from the fractional transform, as well as the transition prior and the likelihood, in order to update the weights in (6).

### 3.1. Linear Fractional Transformation

The fractional transform is used to estimate the position of the object on the camera plane from its position on the road. The physical trajectory is projected onto the camera plane as shown in Figure 2. The distortion of the object trajectory occurs along the direction tangential to the road. The projection axis is parallel to the camera plane; the projection of the vehicle position onto this axis is thus proportional to the position of the vehicle on the camera plane. This value is scaled by the projection of the vanishing point to obtain the position of the vehicle in terms of pixels. For practical implementation, it is useful to express the projection along the tangential direction onto this axis in terms of video footage parameters that are easily accessible, namely:

(i) the angle of view,

(ii) the height of the camera,

(iii) the ground distance between the camera and the first location captured by the camera.

Under this assumption, *β* is approximately constant and, consequently, (11) simplifies to (12). Note that this result can be verified using the triangle proportionality theorem. Finally, we scale with the position of the vanishing point in the image to find the position of the vehicle in terms of pixel location, which yields (13).
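To make the geometry concrete, the sketch below maps a ground distance to a vertical image coordinate for a pinhole camera. It reproduces the linear fractional (Möbius) form described in the text, but the placement of the optical axis and the focal length `f` are our assumptions, not the paper's exact equations (11)-(13).

```python
import math

def ground_to_image(x, h, alpha, d0, f=1.0):
    """Map ground distance x (from the foot of the camera) to a
    vertical image coordinate, for a pinhole camera at height h whose
    field of view starts at ground distance d0 and spans angle alpha.
    Illustrative model: the optical axis is assumed to bisect the
    field of view, and f is an assumed focal length in pixel units."""
    t = math.tan(math.atan2(d0, h) + alpha / 2.0)   # optical-axis slope
    # tan(phi - phi_c) with tan(phi) = x/h reduces to a Moebius form
    # (a*x + b) / (c*x + d) in x -- the linear fractional transform:
    return f * (x - h * t) / (h + x * t)
```

As x grows, the image position saturates towards the asymptote f/t, which is the vanishing point to which vehicle trajectories converge on the camera plane.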

Table 1: Video sequences used for the evaluation of the algorithm performance, along with the duration, the number of vehicles, and the setting parameters, namely, the camera height, the angle of view, and the ground distance to the field of view.

Video sequence | Duration | No. of vehicles | Camera height | Angle of view | Distance to FOV
---|---|---|---|---|---
Video_001 | 199 s | 74 | 6 m | deg | 48 m
Video_002 | 360 s | 115 | 5.5 m | deg | 75 m
Video_003 | 480 s | 252 | 5.5 m | deg | 75 m
Video_004 | 367 s | 132 | 6 m | deg | 29 m
Video_005 | 140 s | 33 | 5.5 m | deg | 80 m
Video_006 | 312 s | 83 | 5.5 m | deg | 57 m
Video_007 | 302 s | 84 | 5.5 m | deg | 57 m
Video_008 | 310 s | 89 | 5.5 m | deg | 57 m
Video_009 | 80 s | 42 | 5.5 m | deg | 57 m
Video_010 | 495 s | 503 | 7.5 m | deg | 135 m
Video_011 | 297 s | 286 | 7.5 m | deg | 80 m
Video_012 | 358 s | 183 | 8 m | deg | 43 m
Video_013 | 377 s | 188 | 8 m | deg | 43 m
Video_014 | 278 s | 264 | 6 m | deg | 64 m
Video_015 | 269 s | 267 | 6 m | deg | 64 m

### 3.2. Importance Density and Transition Prior

It is important to note that, since the fractional transform acts along the axis of the projected trajectory, it provides a better estimate than a simple kinematic model taking into account the speed of the vehicle. The distortion along the orthogonal axis is much weaker, and such an estimation is not necessary there. One novel aspect of this paper is the estimation, through the projective transform, of the vehicle position along this axis and of its size. It is worthwhile noting that the standard kinematic model of the vehicle is recovered when the projective correction vanishes. The vector-valued function f denotes the standard kinematic model in the sequel. The samples of the PPF are drawn from the importance density, and the standard kinematic model is used in the transition prior, modeled as a normal distribution centered on the kinematic prediction f(x_{k−1}). The distributions are considered Gaussian and isotropic to evenly spread the samples around the estimated state vector at time step k−1.
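The diffusion of samples around a model prediction might be sketched as follows; the state layout, the standard deviations, and the function names are our assumptions. The point of the sketch is that the PPF only changes the density's centre, from the kinematic prediction to the projective prediction.

```python
import numpy as np

rng = np.random.default_rng(1)

def kinematic_predict(state, dt=1.0):
    """Standard constant-velocity model f: position += velocity * dt.
    The state layout [y, vy] is an assumption for this sketch."""
    y, vy = state
    return np.array([y + vy * dt, vy])

def draw_particles(state, predict, n=100, std=(2.0, 0.5)):
    """Draw n particles from an isotropic Gaussian importance density
    centred on the model prediction. Swapping `predict` between the
    kinematic model and a projective model is all that distinguishes
    the standard and projective filters in this sketch."""
    centre = predict(state)
    return centre + rng.normal(0.0, std, size=(n, len(centre)))
```

With `predict=kinematic_predict` this reproduces the prior density of the standard particle filter; with a projective prediction function, samples concentrate where the fractional transform expects the vehicle to appear.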

### 3.3. Likelihood Estimation

### 3.4. Projective Particle Filter Implementation

Because most approaches to tracking take the prior density as importance density, the samples are directly drawn from the standard kinematic model. In this paper, we differentiate between the prior and the importance density to obtain a better distribution of the samples. The initial state is built from the initial coordinates of the object. The parameters are selected to cater for the majority of vehicles. The position of the vehicles is estimated either manually or with an automatic procedure (see Section 4.2). The speed along the road axis corresponds to the average pixel displacement for a speed of 90 km·h^{−1}, and the apparent size is set so that the elliptical region for histogram tracking encompasses at least the vehicle. The size is overestimated to fit all cars and most standard trucks at initialization; it is then adjusted through tracking by the particle filters. The initial state is used to draw the set of samples. The transition prior and the importance density are both modeled with normal distributions, whose covariance matrices and means are initialized to represent the physical constraints on the vehicle speed.

The implementation of the projective particle filter algorithm is summarized in Algorithm 1.

**Algorithm 1:** Projective Particle Filter Algorithm.

**Require**: the initial sample set and associated weights

**for** i = 1 **to** N **do**

  Compute the projected position from (16)

  Draw the sample from the importance density

  Compute the ratio of the transition prior to the importance density

  Update the weight of the sample

**end for**

Normalize the weights

**if** the number of effective samples falls below the resampling threshold **then**

  **for** i = 1 **to** N **do**

    **while** the cumulative weight remains below the i-th resampling position **do**

      Move to the next particle

    **end while**

    Replicate the current particle

  **end for**

**end if**
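In code, one update of a filter whose importance density differs from its transition prior might look like the following sketch. The five callables are placeholders standing in for the paper's projective importance density, kinematic prior, and color-histogram likelihood, which are not reproduced here.

```python
import numpy as np

def ppf_step(particles, weights, likelihood, prior_pdf, importance_pdf,
             importance_sample, rng=np.random.default_rng(2)):
    """One update of a particle filter whose importance density differs
    from the transition prior. All callables are assumptions for this
    sketch; the weight update multiplies the likelihood by the ratio
    of the transition prior to the importance density, as in (6)."""
    n = len(particles)
    new = np.array([importance_sample(p, rng) for p in particles])
    for i in range(n):
        ratio = (prior_pdf(new[i], particles[i])
                 / importance_pdf(new[i], particles[i]))
        weights[i] *= likelihood(new[i]) * ratio
    weights /= weights.sum()            # normalize the weights
    return new, weights
```

When the importance density equals the prior, the ratio is identically 1 and the update reduces to the standard SIR weight update, which is the limit case noted in the text.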

## 4. Experiments and Results

The experiments address three aspects:

- (1) the improvement in sample distribution with the implementation of the projective transform,

- (2) the improvement in the position error of the vehicle by the projective particle filter,

- (3) the robustness of vehicle tracking (in terms of an increase in tracking rate) due to the fine distribution of the particles in the feature space.

The algorithm is tested on 15 traffic monitoring video sequences, labeled Video_001 to Video_015. The number of vehicles and the duration of the video sequences, as well as the parameters of the projective transform, are summarized in Table 1. Around 2,600 moving vehicles are recorded in the set of video sequences. The videos range from clear weather to cloudy with weak illumination conditions. The camera was positioned above highways at a height ranging from 5.5 m to 8 m. Although the camera was placed at the center of the highways, a shift in the position has no effect on the performance, except for an earlier detection of vehicles and a change in the length of the vehicle path. On the other hand, a rotation of the camera would affect the angle of view and the position of the vanishing point. The video sequences are low-definition to comply with the characteristics of traffic monitoring sequences. The video sequences are footage of vehicles traveling on a highway. Although the roads are straight in the dataset, the algorithm can be applied to curved roads with approximation of the parameters over short distances, because the projection tends to linearize the curves in the image plane.

### 4.1. Distribution of Samples

An evaluation of the importance density can be performed by comparing the distribution of the samples in the feature space for the standard and the projective particle filters. Since the degeneracy of the particle set indicates the degree of fitting of the importance density through the number of effective samples (see (20)), the frequency of particle resampling is an indicator of the similarity between the posterior and the importance density. Ideally, the importance density should be the posterior. This is not possible in practice because the posterior is unknown; if the posterior were known, tracking would not be required.

For the problem of vehicle tracking, the importance density used in the projective particle filter is therefore more suitable for drawing samples, compared to the prior density used in the standard particle filter. An accurate importance density is beneficial not only from a computational perspective since the resampling procedure is less frequently called, but also for tracking performance, as the particles provide a better fit to the true posterior density. Subsequently, the tracker is less prone to distraction in case of occlusion or similarity between vehicles.

### 4.2. Trajectory Error Evaluation

Table 2: Average MSE for the standard and the projective particle filters with 100 samples.

Video sequence | Video_005 | Video_006 | Video_008
---|---|---|---
Avg. MSE Std PF | 2.26 | 0.99 | 1.07
Avg. MSE Proj. PF | 1.89 | 0.83 | 1.02

### 4.3. Tracking Rate Evaluation

### 4.4. Discussion

The experiments show that the projective particle filter performs better than the standard particle filter in terms of sample distribution, tracking error, and tracking rate. The improvement is due to the integration of the projective transform in the importance density. Furthermore, the implementation of the projective transform requires very simple calculations under the simplifying assumptions leading to (12). Overall, since the projective particle filter requires fewer samples than the standard particle filter to achieve better tracking performance, the increase in computation due to the projective transform is offset by the reduction in sample set size. More specifically, the projective particle filter requires the computation of the vector-valued process function and of the prior-to-importance ratio for each sample. For the process function, (13) and (14) must be computed. The computational burden is low, assuming that constant terms can be precomputed. On the other hand, the projective particle filter yields a gain in the sample set size since fewer particles are required for a given error, and the resampling is 30% more efficient.

The projective particle filter performs better on the three different measures. The projective transform leads to a reduction in resampling frequency since the distribution of the particles accurately represents the posterior and, consequently, the degeneracy of the particle set is slower. The mean squared error is reduced since the particles concentrate around the actual position and size of the vehicle. The drift rate benefits from the projective transform since the tracker is less distracted by similar objects or by occlusion. The improvement is beneficial for applications that require vehicle "locking", such as vehicle counting, or other applications for which performance is not based on the MSE. It is worthwhile noting here that the MSE and the tracking rate are independent: it can be observed from Figure 9 that the tracking rate is almost the same for Video_005, Video_006, and Video_008, but there is a factor of 2 between the MSEs of Video_005 and Video_008 (see Table 2).

## 5. Conclusion

A plethora of algorithms for object tracking based on Bayesian filtering are available. However, these systems fail to take advantage of traffic monitoring characteristics, in particular the slowly-varying vehicle speed, the constrained real-world vehicle trajectory, and the projective transform of vehicles onto the camera plane. This paper proposed a new particle filter, namely, the projective particle filter, which integrates these characteristics into the importance density. The projective fractional transform, which maps the real-world position of a vehicle onto the camera plane, provides a better distribution of the samples in the feature space. However, since the prior is not used for sampling, the weights of the projective particle filter have to be readjusted. The standard and the projective particle filters have been evaluated on traffic surveillance videos using three different measures representing robust and accurate vehicle tracking: (i) the degeneracy of the sample set is reduced when the fractional transform is integrated within the importance density; (ii) the tracking rate, measured through drift evaluation, shows an improvement in robustness of the tracker; (iii) the MSE on the vehicle trajectory is reduced with the projective particle filter. Furthermore, the proposed technique outperforms the standard particle filter in terms of MSE even with fewer particles.

## References

1. Jung YK, Ho YOS: Traffic parameter extraction using video-based vehicle tracking. *Proceedings of IEEE/IEEJ/JSAI International Conference on Intelligent Transportation Systems, October 1999*, 764-769.
2. Mélo J, Naftel A, Bernardino A, Santos-Victor J: Detection and classification of highway lanes using vehicle motion trajectories. *IEEE Transactions on Intelligent Transportation Systems* 2006, 7(2):188-200. doi:10.1109/TITS.2006.874706
3. Lin CP, Tai JC, Song KT: Traffic monitoring based on real-time image tracking. *Proceedings of IEEE International Conference on Robotics and Automation, September 2003*, 2091-2096.
4. Qiu Z, An D, Yao D, Zhou D, Ran B: An adaptive Kalman predictor applied to tracking vehicles in the traffic monitoring system. *Proceedings of IEEE Intelligent Vehicles Symposium, June 2005*, 230-235.
5. Dellaert F, Pomerleau D, Thorpe C: Model-based car tracking integrated with a road-follower. *Proceedings of IEEE International Conference on Robotics and Automation, 1998*, 3:1889-1894.
6. Choi JY, Sung KS, Yang YK: Multiple vehicles detection and tracking based on scale-invariant feature transform. *Proceedings of the 10th International IEEE Conference on Intelligent Transportation Systems (ITSC '07), October 2007*, 528-533.
7. Koller D, Weber J, Malik J: Towards realtime visual based tracking in cluttered traffic scenes. *Proceedings of the Intelligent Vehicles Symposium, October 1994*, 201-206.
8. Meier EB, Ade F: Tracking cars in range images using the condensation algorithm. *Proceedings of IEEE/IEEJ/JSAI International Conference on Intelligent Transportation Systems, October 1999*, 129-134.
9. Xiong T, Debrunner C: Stochastic car tracking with line- and color-based features. *Proceedings of IEEE Conference on Intelligent Transportation Systems, 2003*, 2:999-1003.
10. Isard MA: *Visual Motion Analysis by Probabilistic Propagation of Conditional Density*. Ph.D. thesis, University of Oxford, Oxford, UK; 1998.
11. Isard M, Blake A: Condensation—conditional density propagation for visual tracking. *International Journal of Computer Vision* 1998, 29(1):5-28. doi:10.1023/A:1008078328650
12. Doucet A: *On Sequential Simulation-Based Methods for Bayesian Filtering*. University of Cambridge, Cambridge, UK; 1998.
13. Arulampalam MS, Maskell S, Gordon N, Clapp T: A tutorial on particle filters for online nonlinear/non-Gaussian Bayesian tracking. *IEEE Transactions on Signal Processing* 2002, 50(2):174-188. doi:10.1109/78.978374
14. Van Der Merwe R, Doucet A, De Freitas N, Wan E: *The Unscented Particle Filter*. Cambridge University Engineering Department; 2000.
15. Kotecha JH, Djurić PM: Gaussian sum particle filtering. *IEEE Transactions on Signal Processing* 2003, 51(10):2602-2612. doi:10.1109/TSP.2003.816754
16. Chang C, Ansari R: Kernel particle filter for visual tracking. *IEEE Signal Processing Letters* 2005, 12(3):242-245.
17. Chang C, Ansari R, Khokhar A: Multiple object tracking with kernel particle filter. *Proceedings of IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR '05), June 2005*, 568-573.
18. Liang-qun L, Hong-bing J, Jun-hui L: The iterated extended Kalman particle filter. *Proceedings of the International Symposium on Communications and Information Technologies (ISCIT '05), 2005*, 1213-1216.
19. Kwok C, Fox D, Meilǎ M: Adaptive real-time particle filters for robot localization. *Proceedings of IEEE International Conference on Robotics and Automation, September 2003*, 2836-2841.
20. Kwok C, Fox D, Meilǎ M: Real-time particle filters. *Proceedings of the IEEE* 2004, 92(3):469-484. doi:10.1109/JPROC.2003.823144
21. Shen C, Brooks MJ, van Hengel AD: Augmented particle filtering for efficient visual tracking. *Proceedings of IEEE International Conference on Image Processing (ICIP '05), September 2005*, 856-859.
22. Zeng Z, Ma S: Head tracking by active particle filtering. *Proceedings of IEEE Conference on Automatic Face and Gesture Recognition, 2002*, 82-87.
23. Feghali R, Mitiche A: Spatiotemporal motion boundary detection and motion boundary velocity estimation for tracking moving objects with a moving camera: a level sets PDEs approach with concurrent camera motion compensation. *IEEE Transactions on Image Processing* 2004, 13(11):1473-1490. doi:10.1109/TIP.2004.836158
24. Yang C, Duraiswami R, Davis L: Fast multiple object tracking via a hierarchical particle filter. *Proceedings of the 10th IEEE International Conference on Computer Vision (ICCV '05), October 2005*, 212-219.
25. Li J, Chua C-S: Transductive inference for color-based particle filter tracking. *Proceedings of IEEE International Conference on Image Processing, September 2003*, 3:949-952.
26. Vadakkepat P, Jing L: Improved particle filter in sensor fusion for tracking randomly moving object. *IEEE Transactions on Instrumentation and Measurement* 2006, 55(5):1823-1832. doi:10.1109/TIM.2006.881569
27. Ziadi A, Salut G: Non-overlapping deterministic Gaussian particles in maximum likelihood non-linear filtering: phase tracking application. *Proceedings of the International Symposium on Intelligent Signal Processing and Communication Systems (ISPACS '05), December 2005*, 645-648.
28. Czyz J: Object detection in video via particle filters. *Proceedings of the 18th International Conference on Pattern Recognition (ICPR '06), August 2006*, 820-823.
29. Woolrich MW, Behrens TE: Variational Bayes inference of spatial mixture models for segmentation. *IEEE Transactions on Medical Imaging* 2006, 25(10):1380-1391.
30. Nickel K, Gehrig T, Stiefelhagen R, McDonough J: A joint particle filter for audio-visual speaker tracking. *Proceedings of the 7th International Conference on Multimodal Interfaces (ICMI '05), October 2005*, 61-68.
31. Baş E, Tekalp AM, Salman FS: Automatic vehicle counting from video for traffic flow analysis. *Proceedings of IEEE Intelligent Vehicles Symposium (IV '07), June 2007*, 392-397.
32. Gloyer B, Aghajan HK, Siu K-Y, Kailath T: Vehicle detection and tracking for freeway traffic monitoring. *Proceedings of the Asilomar Conference on Signals, Systems and Computers, 1994*, 2:970-974.
33. Zhao L, Thorpe C: Qualitative and quantitative car tracking from a range image sequence. *Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, June 1998*, 496-501.
34. Bouttefroy PLM, Bouzerdoum A, Phung SL, Beghdadi A: Vehicle tracking using projective particle filter. *Proceedings of the 6th IEEE International Conference on Advanced Video and Signal Based Surveillance (AVSS '09), September 2009*, 7-12.
35. Bouttefroy PLM, Bouzerdoum A, Phung SL, Beghdadi A: Vehicle tracking by non-drifting mean-shift using projective Kalman filter. *Proceedings of the 11th International IEEE Conference on Intelligent Transportation Systems (ITSC '08), December 2008*, 61-66.
36. Gustafsson F, Gunnarsson F, Bergman N, Forssell U, Jansson J, Karlsson R, Nordlund PJ: Particle filters for positioning, navigation, and tracking. *IEEE Transactions on Signal Processing* 2002, 50(2):425-437. doi:10.1109/78.978396
37. Rathi Y, Vaswani N, Tannenbaum A: A generic framework for tracking using particle filter with dynamic shape prior. *IEEE Transactions on Image Processing* 2007, 16(5):1370-1382.
38. Kong A, Liu JS, Wong WH: Sequential imputations and Bayesian missing data problems. *Journal of the American Statistical Association* 1994, 89(425):278-288. doi:10.2307/2291224
39. Schreiber D, Alefs B, Clabian M: Single camera lane detection and tracking. *Proceedings of the 8th International IEEE Conference on Intelligent Transportation Systems, September 2005*, 302-307.
40. Comaniciu D, Ramesh V, Meer P: Kernel-based object tracking. *IEEE Transactions on Pattern Analysis and Machine Intelligence* 2003, 25(5):564-577. doi:10.1109/TPAMI.2003.1195991
41. Kitagawa G: Monte Carlo filter and smoother for non-Gaussian nonlinear state space models. *Journal of Computational and Graphical Statistics* 1996, 5(1):1-25. doi:10.2307/1390750

## Copyright

This article is published under license to BioMed Central Ltd. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.