 Research Article
 Open access
Integrating the Projective Transform with Particle Filtering for Visual Tracking
EURASIP Journal on Image and Video Processing volume 2011, Article number: 839412 (2011)
Abstract
This paper presents the projective particle filter, a Bayesian filtering technique integrating the projective transform, which describes the distortion of vehicle trajectories on the camera plane. The characteristics inherent to traffic monitoring, and in particular the projective transform, are integrated in the particle filtering framework in order to improve the tracking robustness and accuracy. It is shown that the projective transform can be fully described by three parameters, namely, the angle of view, the height of the camera, and the ground distance to the first point of capture. This information is integrated in the importance density so as to explore the feature space more accurately. By providing a fine distribution of the samples in the feature space, the projective particle filter outperforms the standard particle filter on different tracking measures. First, the resampling frequency is reduced due to a better fit of the importance density for the estimation of the posterior density. Second, the mean squared error between the feature vector estimate and the true state is reduced compared to the estimate provided by the standard particle filter. Third, the tracking rate is improved for the projective particle filter, hence decreasing track loss.
1. Introduction and Motivations
Vehicle tracking has been an active field of research within the past decade due to the increase in computational power and the development of video surveillance infrastructure. The area of Intelligent Transportation Systems (ITSs) is in need of robust tracking algorithms to ensure that top-end decisions, such as automatic traffic control and regulation, automatic video surveillance, and abnormal event detection, are made with a high level of confidence. Accurate trajectory extraction provides essential statistics for traffic control, such as speed monitoring, vehicle count, and average vehicle flow. Therefore, as a low-level task at the bottom end of ITS, vehicle tracking must provide accurate and robust information to higher-level modules making intelligent decisions. In this sense, intelligent transportation systems are a major breakthrough since they alleviate the need for devices that can be prohibitively costly or simply impractical to implement. For instance, the installation of inductive loop sensors generates traffic perturbations that cannot always be afforded in dense traffic areas. Also, robust video tracking enables new applications, such as vehicle identification and customized statistics, that are not available with current technologies, for example, suspect vehicle tracking or differentiated vehicle speed limits. At the top end of the system are high-level tasks such as event detection (e.g., accident and animal crossing) or traffic regulation (e.g., dynamic adaptation and lane allocation). Robust vehicle tracking is therefore necessary to ensure effective performance.
Several techniques have been developed for vehicle tracking over the past two decades. The most common ones rely on Bayesian filtering, and on Kalman and particle filters in particular. Kalman filter-based tracking usually relies on background subtraction followed by segmentation [1, 2], although some techniques implement spatial features such as corners and edges [3, 4] or use Bayesian energy minimization [5]. Exhaustive search techniques involving template matching [6] or occlusion reasoning [7] have also been used for tracking vehicles. Particle filtering is preferred when the hypothesis of multimodality is necessary, for example, in case of severe occlusion [8, 9]. Particle filters offer the advantage of relaxing the Gaussian and linearity constraints imposed upon the Kalman filter. On the downside, particle filters only provide a suboptimal solution, which converges in a statistical sense to the optimal solution. The convergence is of the order of 1/√N, where N is the number of particles; consequently, they are computation-intensive algorithms. For this reason, particle filtering techniques for visual tracking have been developed only recently, with the widespread availability of powerful computers. Particle filters for visual object tracking were first introduced by Isard and Blake as part of the CONDENSATION algorithm [10, 11], and by Doucet [12]. Arulampalam et al. provide a more general introduction to Bayesian filtering, encompassing particle filter implementations [13]. Within the last decade, the interest in particle filters has been growing exponentially. Early contributions were based on Kalman filter models; for instance, Van Der Merwe et al. discussed an extended particle filter (EPF) and proposed an unscented particle filter (UPF), using the unscented transform to capture second-order nonlinearities [14]. Later, a Gaussian sum particle filter was introduced to reduce the computational complexity [15].
There has also been a plethora of theoretical improvements to the original algorithm, such as the kernel particle filter [16, 17], the iterated extended Kalman particle filter [18], the adaptive sample size particle filter [19, 20], and the augmented particle filter [21]. As far as applications are concerned, particle filters are widely used in a variety of tracking tasks: head tracking via active contours [22, 23], edge and color histogram tracking [24, 25], and sonar [26] and phase [27] tracking, to name a few. Particle filters have also been used for object detection and segmentation [28, 29], and for audiovisual fusion [30].
Many vehicle tracking systems have been proposed that integrate features of the object, such as the traditional kinematic model parameters [2, 7, 31–33] or scale [1], in the tracking model. However, these techniques seldom integrate information specific to the vehicle tracking problem, which is key to the improvement of track extraction; rather, they are general estimators disregarding the particular traffic surveillance context. Since particle filters require a large number of samples in order to achieve accurate and robust tracking, information pertaining to the behavior of the vehicle is instrumental in drawing samples from the importance density. To this end, the projective fractional transform is used to map the vehicle position in the real world to its position on the camera plane. In [35], Bouttefroy et al. proposed the projective Kalman filter (PKF), which integrates the projective transform into the Kalman tracker to improve its performance. However, the PKF tracker differs from the proposed particle filter tracker in that the former relies on background subtraction to extract the objects, whereas the latter uses color information to track the objects.
The aim of this paper is to study the performance of a particle filter integrating vehicle characteristics in order to decrease the size of the particle set for a given error rate. In this framework, the task of vehicle tracking can be approached as a specific application of object tracking in a constrained environment. Indeed, vehicles do not evolve freely in their environment but follow particular trajectories. The most notable constraints imposed upon vehicle trajectories in traffic video surveillance are summarized below.
Low-Definition and Highly Compressed Videos
Traffic monitoring video sequences are often of poor quality because of the inadequate infrastructure of the acquisition and transport system. Therefore, the size N of the sample set necessary for vehicle tracking must be large to ensure robust and accurate estimates.
Slowly-Varying Vehicle Speed
A common assumption in vehicle tracking is the uniformity of the vehicle speed. The narrow angle of view of the scene and the short period of time a vehicle is in the field of view justify this assumption, especially when tracking vehicles on a highway.
Constrained Real-World Vehicle Trajectory
Normal driving rules impose a particular trajectory on the vehicle. Indeed, the curvature of the road and the different lanes constrain the position of the vehicle. Figure 1 illustrates the pattern of vehicle trajectories resulting from projective constraints that can be exploited in vehicle tracking.
Projection of Vehicle Trajectory on the Camera Plane
The trajectory of a vehicle on the camera plane undergoes severe distortion due to the low elevation of the traffic surveillance camera. The curve described by the position of the vehicle converges asymptotically to the vanishing point.
We propose here to integrate these characteristics to obtain a finer estimate of the vehicle feature vector. More specifically, the mapping of the real-world vehicle trajectory through a fractional transform enables a better estimate of the posterior density. A particle filter is thus implemented, which integrates cues of the projection in the importance density, resulting in a better exploration of the state space and a reduction of the variance in the trajectory estimation. Preliminary results of this work were presented in [34]; this paper develops the work further. Its main contributions are: (i) a complete description of the homographic projection problem for vehicle tracking and a review of the solutions proposed to date; (ii) an evaluation of the projective particle filter tracking rate on a comprehensive dataset comprising around 2,600 vehicles; (iii) an evaluation of the resampling accuracy for the projective particle filter; (iv) a comparison of the performance of the projective particle filter and the standard particle filter using three different measures, namely, the resampling frequency, the mean squared error, and the tracking drift. The rest of the paper is organized as follows. Section 2 introduces the general particle filtering framework. Section 3 develops the proposed Projective Particle Filter (PPF). An analysis of the PPF performance versus the standard particle filter is presented in Section 4 before concluding in Section 5.
2. Bayesian and Particle Filtering
This section presents a brief review of Bayesian and particle filtering. Bayesian filtering provides a convenient framework for object tracking due to the weak assumptions on the state space model and the first-order Markov chain recursive properties. Without loss of generality, let us consider a system with state x_k of dimension n and observation z_k of dimension m. Let x_{0:k} and z_{1:k} denote, respectively, the set of states and the set of observations prior to and including time instant k. The state space model can be expressed as

x_k = f(x_{k-1}) + v_{k-1},    (1)
z_k = h(x_k) + n_k,    (2)
when the process and observation noises, v_{k-1} and n_k, respectively, are assumed to be additive. The vector-valued functions f and h are the process and observation functions, respectively. Bayesian filtering aims to estimate the posterior probability density function (pdf) p(x_k | z_{1:k}) of the state given the observations. The probability density function is estimated recursively, in two steps: prediction and update. First, let us denote by p(x_{k-1} | z_{1:k-1}) the posterior pdf at time k-1, and let us assume it is known. The prediction stage relies on the Chapman-Kolmogorov equation to estimate the prior pdf p(x_k | z_{1:k-1}):

p(x_k | z_{1:k-1}) = ∫ p(x_k | x_{k-1}) p(x_{k-1} | z_{1:k-1}) dx_{k-1}.    (3)
When a new observation z_k becomes available, the prior is updated as follows:

p(x_k | z_{1:k}) = p(z_k | x_k) p(x_k | z_{1:k-1}) / p(z_k | z_{1:k-1}),    (4)

where p(z_k | x_k) is the likelihood function and p(z_k | z_{1:k-1}) is a normalizing constant. As the posterior probability density function is recursively estimated through (3) and (4), only the initial density p(x_0) needs to be known.
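The predict/update recursion of (3) and (4) can be illustrated numerically on a discrete state grid, where the Chapman-Kolmogorov integral becomes a matrix-vector product. This is a generic sketch, not tied to the paper's tracking model:

```python
import numpy as np

def bayes_recursion(posterior, transition, likelihood_z):
    """One predict/update cycle of the Bayesian filter on a discrete grid.

    posterior:    p(x_{k-1} | z_{1:k-1}), shape (n,)
    transition:   p(x_k | x_{k-1}), shape (n, n); column j holds the
                  distribution of x_k given x_{k-1} = j
    likelihood_z: p(z_k | x_k), shape (n,)
    """
    prior = transition @ posterior   # Chapman-Kolmogorov prediction, cf. (3)
    post = likelihood_z * prior      # Bayes update, cf. (4)
    return post / post.sum()         # normalize by p(z_k | z_{1:k-1})
```

With an identity transition and a flat likelihood the posterior is unchanged, while a concentrated likelihood collapses it onto the observed state, as expected from (4).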
Monte Carlo methods, and more specifically particle filters, have been extensively employed to tackle the Bayesian problem represented by (3) and (4) [36, 37]. Multimodality enables the system to evolve in time with several hypotheses on the state in parallel. This property is practical for corroborating or rejecting an eventual track after several frames. However, the Bayesian problem then cannot be solved in closed form, as in the Kalman filter, due to the complex density shapes involved. Particle filters rely on Sequential Monte Carlo (SMC) simulations, as a numerical method, to circumvent the direct evaluation of the Chapman-Kolmogorov equation (3). Let us assume that a large number N of samples x_k^i, i = 1, ..., N, are drawn from the posterior distribution p(x_k | z_{1:k}). It follows from the law of large numbers that

p(x_k | z_{1:k}) ≈ Σ_{i=1}^{N} w_k^i δ(x_k − x_k^i),    (5)
where the w_k^i are positive weights satisfying Σ_i w_k^i = 1, and δ is the Kronecker delta function. However, because it is often difficult to draw samples from the posterior pdf, an importance density q(x_k | x_{k-1}, z_k) is used to generate the samples x_k^i. It can then be shown that the recursive estimate of the posterior density via (3) and (4) can be carried out by the set of particles, provided that the weights are updated as follows [13]:

w_k^i ∝ w_{k-1}^i · p(z_k | x_k^i) p(x_k^i | x_{k-1}^i) / q(x_k^i | x_{k-1}^i, z_k).    (6)
The choice of the importance density is crucial in order to obtain a good estimate of the posterior pdf. It has been shown that the set of particles and associated weights will eventually degenerate, that is, most of the weight will be carried by a small number of samples while a large number of samples have negligible weight [38]. In such a case, and because samples are not drawn from the true posterior, the degeneracy problem cannot be avoided and resampling of the set needs to be performed. Nevertheless, the closer the importance density is to the true posterior density, the slower the set will degenerate; a good choice of importance density reduces the need for resampling. In this paper, we propose to model the fractional transform mapping the real-world space onto the camera plane and to integrate the projection in the particle filter through the importance density q(x_k | x_{k-1}, z_k).
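To make the weight update (6) concrete, here is a minimal, self-contained sketch of one SIR-style step on a 1D toy model. It is not the authors' implementation: the Gaussian importance density centered between the previous particle and the observation is a hypothetical choice standing in for the projective importance density, and all densities and parameters are illustrative.

```python
import numpy as np

def sir_step(particles, weights, z, rng, q_std=1.0, p_std=1.0, obs_std=0.5):
    """One particle-filter step on a 1D constant-position toy model.

    Samples from an importance density q centered between the previous
    particle and the observation, then reweights by
    likelihood * prior / importance, cf. equation (6).
    """
    centers = 0.5 * (particles + z)
    new_particles = centers + q_std * rng.standard_normal(particles.shape)

    # Unnormalized Gaussian densities for the three factors of (6).
    lik = np.exp(-0.5 * ((z - new_particles) / obs_std) ** 2)
    prior = np.exp(-0.5 * ((new_particles - particles) / p_std) ** 2)
    qdens = np.exp(-0.5 * ((new_particles - centers) / q_std) ** 2)

    weights = weights * lik * prior / qdens
    weights /= weights.sum()
    return new_particles, weights

rng = np.random.default_rng(0)
n = 500
particles = rng.standard_normal(n)        # initial cloud around 0
weights = np.full(n, 1.0 / n)
particles, weights = sir_step(particles, weights, z=1.0, rng=rng)

n_eff = 1.0 / np.sum(weights ** 2)        # effective sample size
estimate = np.sum(weights * particles)    # weighted posterior mean
```

After one step the weighted mean is pulled from the prior cloud toward the observation, and n_eff quantifies how unevenly the weights are spread, which is exactly the degeneracy indicator discussed above.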
3. Projective Particle Filter
The particle filter developed here is named Projective Particle Filter (PPF) because the vehicle position is projected onto the camera plane and used as an inference to diffuse the particles in the feature space. One of the particularities of the PPF is that it differentiates between the importance density and the transition prior pdf, whereas the SIR (Sampling Importance Resampling) filter, also called the standard particle filter, does not. Therefore, we need to define the importance density derived from the fractional transform, as well as the transition prior and the likelihood, in order to update the weights in (6).
3.1. Linear Fractional Transformation
The fractional transform is used to estimate the position of the object on the camera plane from its position on the road. The physical trajectory is projected onto the camera plane as shown in Figure 2. The distortion of the object trajectory occurs along the direction tangential to the road. The projection axis is parallel to the camera plane; the projection of the vehicle position on this axis is thus proportional to the position of the vehicle on the camera plane. This value is scaled by the projection of the vanishing point on the axis to obtain the position of the vehicle in terms of pixels. For practical implementation, it is useful to express the projection along the tangential direction in terms of video footage parameters that are easily accessible, namely:
(i) the angle of view,
(ii) the height of the camera,
(iii) the ground distance between the camera and the first location captured by the camera.
It can be inferred from Figure 2, after applying the law of cosines, that the projected position satisfies the relation in (7). Squaring, substituting in (7), and grouping the terms into a quadratic form, then discarding the non-physically acceptable root, yields the solution in (11).
However, because the camera height is small compared with the ground distances involved in practice (see Table 1), (11) simplifies to a linear fractional form. Note that this result can be verified using the triangle proportionality theorem. Finally, we scale by the position of the vanishing point in the image to find the position of the vehicle in terms of pixel location, which yields the expression in (12).
(The position of the vanishing point can either be approximated manually or estimated automatically [39]. In our experiments, the position of the vanishing point is estimated manually.) The projected speed and the observed size of the object on the camera plane are also important variables for the tracking problem, and hence it is necessary to derive them. Differentiating (12) with respect to time and eliminating the ground distance yields the observed speed of the vehicle on the camera plane, given in (13).
The observed size of the vehicle can also be derived from the position if the real size of the vehicle is known. If the vehicle is centered at a given point, its extremities are located half a vehicle length before and after that point along the road. Applying the fractional transformation to both extremities yields the apparent size given in (14).
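As an illustration, the following sketch implements a simplified linear fractional map of the form y = y_v · d / (d + l), together with the projected speed and apparent size obtained from it. The specific form of the map, the helper names, and the parameter values are assumptions for illustration, since the exact expressions (12)-(14) are not reproduced here:

```python
def pixel_position(d, l, y_v):
    """Map ground distance d (from the first captured point) to a pixel
    coordinate via an assumed linear fractional form y = y_v * d / (d + l),
    which converges asymptotically to the vanishing point y_v."""
    return y_v * d / (d + l)

def pixel_speed(d, v, l, y_v):
    """Observed on-screen speed: derivative of the fractional map with
    respect to d, times the real-world speed v (chain rule, cf. (13))."""
    return y_v * l / (d + l) ** 2 * v

def pixel_size(d, size, l, y_v):
    """Apparent size: projected separation of the vehicle extremities
    located at d - size/2 and d + size/2 (cf. (14))."""
    return (pixel_position(d + size / 2, l, y_v)
            - pixel_position(d - size / 2, l, y_v))
```

With, say, l = 20 m and a vanishing point at pixel 240, positions increase monotonically toward 240 while the apparent speed and size shrink with distance, reproducing the trajectory distortion described above.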
3.2. Importance Density and Transition Prior
The projective particle filter integrates the fractional transform into the importance density q(x_k | x_{k-1}, z_k). The state vector is modeled with the position, the speed, and the size of the vehicle in the image:

x_k = (x, y, v_x, v_y, r)^T,    (15)

where x and y are the Cartesian coordinates of the vehicle, v_x and v_y are the respective speeds, and r is the apparent size of the vehicle; more precisely, r is the radius of the circle best fitting the vehicle shape. Object tracking is traditionally performed using a standard kinematic model (Newton's laws), taking into account the position, the speed, and the size of the object (the size is essentially maintained for the purpose of likelihood estimation). In this paper, the kinematic model is refined with the estimation of the speed and the object size through the fractional transform along the distorted direction. Therefore, the process function f, defined in (1), is given by (16), which combines the kinematic update along the undistorted axis with the projected position, speed, and size of (12)-(14) along the distorted axis.
It is important to note that since the fractional transform acts along the tangential axis, the process function provides a better estimate than a simple kinematic model taking into account only the speed of the vehicle. On the other hand, the distortion along the horizontal axis is much weaker, and such an estimation is not necessary there. One novel aspect of this paper is the estimation of the vehicle position along the distorted axis and of its size through the projected speed (13) and apparent size (14), respectively. It is worthwhile noting that the standard kinematic model of the vehicle is recovered when these projective components are replaced by their constant-speed and constant-size counterparts. The samples of the PPF are drawn from the importance density, modeled as a normal distribution centered on the state predicted by the process function in (16), while the standard kinematic model is used in the prior density, likewise modeled as a normal distribution centered on the kinematic prediction. The distributions are considered Gaussian and isotropic to evenly spread the samples around the estimated state vector at time step k-1.
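A rough sketch of how such a process function might advance the state along the distorted axis is given below. The fractional map, its inversion, and all parameter values are illustrative assumptions standing in for the paper's equation (16), not the authors' implementation:

```python
def projective_process(state, l=20.0, y_v=240.0, v_real=25.0, size_real=4.0):
    """Sketch of a PPF-style process step restricted to the distorted axis.

    The assumed map y = y_v * d / (d + l) stands in for equation (12);
    l (ground distance), y_v (vanishing point), v_real (real-world speed
    per frame) and size_real (vehicle length) are illustrative values.

    state = (y, v_y, r): pixel position, projected speed, apparent size.
    """
    def project(d):
        return y_v * d / (d + l)

    y, v_y, r = state
    d = l * y / (y_v - y)            # invert the fractional map
    d_next = d + v_real              # slowly-varying real-world speed
    y_next = project(d_next)         # re-projected position
    v_next = y_next - y              # projected per-frame displacement
    r_next = (project(d_next + size_real / 2)
              - project(d_next - size_real / 2))
    return (y_next, v_next, r_next)
```

Iterating this step drives the position toward the vanishing point while the projected speed and apparent size decay, which is the behavior a purely kinematic constant-speed model cannot capture.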
3.3. Likelihood Estimation
The estimation of the likelihood is based on the distance between color histograms, as in [40]. Let us define an N_b-bin histogram h = {h_1, ..., h_{N_b}}, representing the distribution of color pixel values, as follows:

h_i = (1/|S|) Σ_{u ∈ S} δ(b(u) − i),    (17)
where S is the set of pixels selected from a circle of radius r centered on the estimated vehicle position, the N_b bins are regularly spaced on the color interval, b(·) is a linear binning function providing the bin index of pixel value u, and δ is the Kronecker delta function. Indeed, after projection on the camera plane, the circle is the standard shape that best delineates the vehicle. Let us denote the target and the candidate histograms by h_t and h_c, respectively. The Bhattacharyya distance between the two histograms is defined as

D(h_t, h_c) = (1 − Σ_{i=1}^{N_b} √(h_t,i h_c,i))^{1/2}.    (18)
Finally, the likelihood is calculated as

p(z_k | x_k) ∝ exp(−λ D²(h_t, h_c)),    (19)

where λ is a constant controlling the selectivity of the likelihood.
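A minimal sketch of this histogram-based likelihood follows, assuming grayscale pixels, a linear binning function, and the common exponential-of-squared-Bhattacharyya form; the bandwidth lam is an illustrative value, not one from the paper:

```python
import numpy as np

def color_histogram(pixels, n_bins=8):
    """Normalized n_bins histogram of 8-bit pixel values in [0, 255],
    using a linear binning function (a simplified grayscale stand-in
    for the paper's color histogram over a circular region)."""
    bins = np.clip((pixels.astype(int) * n_bins) // 256, 0, n_bins - 1)
    hist = np.bincount(bins.ravel(), minlength=n_bins).astype(float)
    return hist / hist.sum()

def bhattacharyya_distance(h1, h2):
    """D(h1, h2) = sqrt(1 - sum_i sqrt(h1_i * h2_i)), cf. (18)."""
    bc = np.sum(np.sqrt(h1 * h2))
    return np.sqrt(max(0.0, 1.0 - bc))

def likelihood(h_target, h_candidate, lam=20.0):
    """Likelihood shaped as exp(-lam * D^2), cf. (19); lam is illustrative."""
    return np.exp(-lam * bhattacharyya_distance(h_target, h_candidate) ** 2)
```

Identical histograms give distance 0 and likelihood 1, while histograms with disjoint support give distance 1 and a near-zero likelihood, so particles landing on dissimilar regions receive negligible weight in (6).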
3.4. Projective Particle Filter Implementation
Because most approaches to tracking take the prior density as the importance density, the samples are usually drawn directly from the standard kinematic model. In this paper, we differentiate between the prior and the importance density to obtain a better distribution of the samples. The initial state is centered on the initial coordinates of the object. The parameters are selected to cater for the majority of vehicles. The position of the vehicles is estimated either manually or with an automatic procedure (see Section 4.2). The initial speed along the tangential axis corresponds to the average pixel displacement for a speed of 90 km·h^{−1}, and the apparent size is set so that the elliptical region for histogram tracking encompasses at least the vehicle. The size is overestimated to fit all cars and most standard trucks at initialization; the size is then adjusted through tracking by the particle filters. The initial state is used to draw the first set of samples. The transition prior and the importance density are both modeled with normal distributions, whose covariance matrices and means are initialized to represent the physical constraints on the vehicle speed.
A resampling scheme is necessary to avoid the degeneracy of the particle set. Systematic resampling [41] is performed when the variance of the weight set is too large, that is, when the number of effective samples falls below a given threshold, set arbitrarily in the implementation. The number of effective samples is evaluated as

N_eff = 1 / Σ_{i=1}^{N} (w_k^i)².    (20)
The implementation of the projective particle filter algorithm is summarized in Algorithm 1.
Algorithm 1: Projective Particle Filter Algorithm.
Require: particle set {x_{k-1}^i, w_{k-1}^i, i = 1, ..., N} and observation z_k
for i = 1 to N do
  Compute the projected prediction from (16)
  Draw x_k^i from the importance density q(x_k | x_{k-1}^i, z_k)
  Compute the ratio of the transition prior to the importance density
  Update the weight w_k^i according to (6)
end for
Normalize the weights
if N_eff < threshold then
  Resample the particle set by systematic resampling [41]
end if
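The resampling test (20) and the systematic resampling step of Algorithm 1 can be sketched as follows; this is a generic implementation of systematic resampling, not the authors' code:

```python
import numpy as np

def effective_sample_size(weights):
    """N_eff = 1 / sum(w_i^2) for normalized weights, cf. equation (20)."""
    return 1.0 / np.sum(np.asarray(weights) ** 2)

def systematic_resample(particles, weights, rng):
    """Systematic resampling: a single uniform offset generates N evenly
    spaced pointers into the cumulative weight distribution, so heavy
    particles are duplicated and light ones discarded."""
    n = len(weights)
    positions = (rng.random() + np.arange(n)) / n
    cumulative = np.cumsum(weights)
    cumulative[-1] = 1.0  # guard against floating-point round-off
    indices = np.searchsorted(cumulative, positions)
    return particles[indices], np.full(n, 1.0 / n)
```

Uniform weights give N_eff = N (no resampling needed), whereas a fully degenerate weight set gives N_eff = 1 and resampling concentrates the new set on the surviving particle, with the weights reset to 1/N.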
4. Experiments and Results
In this section, the performances of the standard and the projective particle filters are evaluated on traffic surveillance data. Since the two vehicle tracking algorithms possess the same architecture, the difference in performance can be attributed to the distribution of particles through the importance density integrating the projective transform. The experimental results presented in this section aim to evaluate:
(1) the improvement in sample distribution with the implementation of the projective transform,
(2) the improvement in the position error of the vehicle by the projective particle filter,
(3) the robustness of vehicle tracking (in terms of an increase in tracking rate) due to the fine distribution of the particles in the feature space.
The algorithm is tested on 15 traffic monitoring video sequences, labeled Video_001 to Video_015. The number of vehicles and the duration of the video sequences, as well as the parameters of the projective transform, are summarized in Table 1. Around 2,600 moving vehicles are recorded in the set of video sequences. The videos range from clear weather to cloudy conditions with weak illumination. The camera was positioned above highways at a height ranging from 5.5 m to 8 m. Although the camera was placed at the center of the highways, a shift in its position has no effect on the performance, apart from the earlier detection of vehicles and the length of the vehicle path. On the other hand, a rotation of the camera would affect the projection parameters and the position of the vanishing point. The video sequences are low-definition to comply with the characteristics of traffic monitoring sequences, and show footage of vehicles traveling on a highway. Although the roads are straight in the dataset, the algorithm can be applied to curved roads with approximation of the parameters over short distances, because the projection tends to linearize the curves in the image plane.
4.1. Distribution of Samples
An evaluation of the importance density can be performed by comparing the distribution of the samples in the feature space for the standard and the projective particle filters. Since the degeneracy of the particle set reflects, through the number of effective samples (see (20)), how well the importance density fits the posterior, the frequency of particle resampling is an indicator of the similarity between the posterior and the importance density. Ideally, the importance density would be the posterior itself. This is not possible in practice because the posterior is unknown; if the posterior were known, tracking would not be required.
First, the mean squared error (MSE) between the true state of the feature vector and the set of particles is presented without resampling, in order to compare the tracking accuracy of the projective and standard particle filters based solely on the performance of the importance and prior densities, respectively. Consequently, the fit of the importance density to the vehicle tracking problem is evaluated. Furthermore, computing the MSE provides a quantitative estimate of the error. Since there is no resampling, a large number of particles is required in this experiment: we chose N = 300. Figure 3 shows the position MSE for the standard and the projective particle filters for 80 trajectories in the Video_008 sequence; the average MSEs are 1.10 and 0.58, respectively.
Second, the resampling frequencies for the projective and the standard particle filters are evaluated on the entire dataset. A decrease in the resampling frequency is the result of a better (i.e., closer to the posterior density) modeling of the density from which the samples are drawn. The resampling frequencies are expressed as the percentage of time steps at which resampling is triggered. Figure 4 displays the resampling frequencies across the entire dataset for each particle filter. On average, the projective particle filter resamples 14.9% of the time and the standard particle filter 19.4% of the time; that is, the standard particle filter resamples roughly 30% more often.
For the problem of vehicle tracking, the importance density used in the projective particle filter is therefore more suitable for drawing samples than the prior density used in the standard particle filter. An accurate importance density is beneficial not only from a computational perspective, since the resampling procedure is called less frequently, but also for tracking performance, as the particles provide a better fit to the true posterior density. Consequently, the tracker is less prone to distraction in case of occlusion or similarity between vehicles.
4.2. Trajectory Error Evaluation
An important measure in vehicle tracking is the variance of the trajectory. Indeed, high-level tasks, such as abnormal behavior or DUI (driving under the influence) detection, require accurate tracking of the vehicle and, in particular, a low MSE for the position. Figure 5 displays a track estimated with the projective particle filter and the standard particle filter. It can be inferred qualitatively that the PPF achieves better results than the standard particle filter. Two experiments are conducted to evaluate the performance in terms of position variance: one with semi-automatic variance estimation, and the other with ground truth labeling to evaluate the influence of the number of particles.
In the first experiment, the performance of each tracker is evaluated in terms of MSE. In order to avoid the tedious task of manually extracting the ground truth of every track, a synthetic track is generated automatically based on the parameters of the real-world projection of the vehicle trajectory on the camera plane. Figure 6 shows that the theoretical and the manually extracted tracks match almost perfectly. The initialization of the tracks is performed as in [35]. However, because the initial position of the vehicle when tracking starts may differ from one track to another, it is necessary to align the theoretical and the extracted tracks in order to cancel the bias in the estimation of the MSE. Furthermore, the variance estimation is semi-automatic, since the match between the generated and the extracted tracks is visually assessed. It was found that the Video_005, Video_006, and Video_008 sequences provide the best matches overall. The 205 vehicle tracks contained in these 3 sequences are matched against their respective generated tracks and visually inspected to ensure adequate correspondence. The average MSEs for each video sequence are presented in Table 2 for a sample set size of 100. It can be inferred from Table 2 that the PPF consistently outperforms the standard particle filter. It is also worth noting that the higher MSE in this experiment, compared to the one presented in Figure 3 for Video_008, is due to the smaller number of particles: even with resampling, the particle filters do not reach the accuracy achieved with 300 particles.
In the second experiment, we evaluate the performance of the two tracking algorithms with respect to the number of particles. Here, the ground truth is manually labeled in the video sequence. This experiment serves as a validation of the semi-automatic procedure described above, as well as an evaluation of the effect of the particle set size on the performance of both the PPF and the standard particle filter. To ensure the impartiality of the evaluation, we arbitrarily decided to extract the ground truth for the first 5 trajectories in the Video_001 sequence. Figure 7 displays the average MSE over 10 epochs for the first trajectory and for different values of N. Figure 8 presents the average MSE over 10 epochs on the 5 ground truth tracks for two values of N. The experiments are run over several epochs to increase the confidence in the results, owing to the stochastic nature of particle filters. It is clear that the projective particle filter outperforms the standard particle filter in terms of MSE. The higher accuracy of the PPF, with all parameters being identical in the comparison, is due to the finer estimation of the sample distribution by the importance density and the consequent adjustment of the weights.
4.3. Tracking Rate Evaluation
An important problem encountered in vehicle tracking is the phenomenon of tracker drift. We propose here to estimate the robustness of the tracking by introducing a tracking rate based on a drift measure, that is, by estimating the percentage of vehicles tracked without severe drift, for which the track is not lost. The tracking rate primarily aims to detect the loss of vehicle track and, therefore, evaluates the robustness of the tracker. Robustness is differentiated from accuracy in that the former is a qualitative measure of tracking performance while the latter is a quantitative measure, based on an error measure as in Section 4.2, for instance. The drift measure for vehicle tracking is based on the observation that vehicles converge to the vanishing point; the trajectory of the vehicle along the tangential axis is therefore monotonically decreasing. As a consequence, we propose to count the number of steps where the vehicle position decreases and the number of steps where the vehicle position increases or remains constant, the latter being characteristic of tracker drift. Note that horizontal drift is seldom observed since the distortion along this axis is weak. The rate of vehicles tracked without severe drift is then calculated from the ratio of these two counts.
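The drift counting described above can be sketched as follows; the strict-majority criterion used here to declare a track drift-free is an illustrative stand-in for the paper's exact rate formula:

```python
def drift_free(track_y):
    """A track (pixel positions along the tangential axis, which should
    decrease monotonically toward the vanishing point) is counted as
    drift-free when decreasing steps strictly outnumber non-decreasing
    ones. This majority criterion is an illustrative assumption."""
    dec = sum(1 for a, b in zip(track_y, track_y[1:]) if b < a)
    inc = len(track_y) - 1 - dec
    return dec > inc

def tracking_rate(tracks):
    """Fraction of tracks without severe drift."""
    return sum(drift_free(t) for t in tracks) / len(tracks)
```

For instance, a monotonically decreasing track counts as drift-free, while a track that moves away from the vanishing point for most of its steps counts as lost.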
The tracking rate is evaluated for the projective and standard particle filters. Figure 9 displays the results for the entire traffic surveillance dataset. It shows that the projective particle filter yields a better tracking rate than the standard particle filter across the entire dataset. Figure 9 also shows that the difference between the tracking rates is not as pronounced as the difference in MSE, because the standard particle filter already performs well on vehicle tracking by this measure. Nevertheless, the projective particle filter still yields a reduction in the drift of the tracker.
4.4. Discussion
The experiments show that the projective particle filter performs better than the standard particle filter in terms of sample distribution, tracking error, and tracking rate. The improvement is due to the integration of the projective transform in the importance density. Furthermore, the implementation of the projective transform requires very simple calculations under the simplifying assumptions leading to (12). Overall, since the projective particle filter requires fewer samples than the standard particle filter to achieve better tracking performance, the increase in computation due to the projective transform is offset by the reduction in sample set size. More specifically, the projective particle filter requires the computation of the vector-valued process function and of the ratio of the prior to the importance density for each sample. For the process function, the projected speed and size, given by (13) and (14), respectively, must be computed. The computational burden is low, assuming that constant terms can be precomputed. On the other hand, the projective particle filter yields a gain in sample set size, since fewer particles are required for a given error, and resampling occurs roughly 30% less often.
The projective particle filter performs better on the three measures. The projective transform leads to a reduction in resampling frequency: the distribution of the particles represents the posterior more accurately, so the degeneracy of the particle set is slower. The mean squared error is reduced since the particles concentrate around the actual position and size of the vehicle. The drift rate benefits from the projective transform since the tracker is less distracted by similar objects or by occlusion. This improvement is valuable for applications that require vehicle "locking," such as vehicle counting, or other applications whose performance is not based on the MSE. It is worth noting that the MSE and the tracking rate are independent: Figure 9 shows that the tracking rate is almost the same for Video_005, Video_006, and Video_008, yet there is a factor of 2 between the MSEs of Video_005 and Video_008 (see Table 2).
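The degeneracy of the particle set is conventionally monitored through the effective sample size, N_eff = 1 / Σ_i w_i², with resampling triggered when N_eff falls below a fraction of the particle count. A minimal sketch of this standard criterion follows; the 0.5 threshold is a common default, not a value taken from the paper:

```python
import numpy as np

def effective_sample_size(weights):
    """N_eff = 1 / sum(w_i^2) for normalized importance weights."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    return 1.0 / np.sum(w ** 2)

def needs_resampling(weights, threshold_frac=0.5):
    """Trigger resampling when N_eff drops below a fraction of N."""
    return effective_sample_size(weights) < threshold_frac * len(weights)
```

With uniform weights, N_eff equals the number of particles and no resampling is needed; as the weight mass collapses onto a few particles, N_eff approaches 1 and resampling is triggered.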
5. Conclusion
A plethora of algorithms for object tracking based on Bayesian filtering is available. However, these systems fail to take advantage of traffic monitoring characteristics, in particular slowly varying vehicle speed, constrained real-world vehicle trajectories, and the projective transform of vehicles onto the camera plane. This paper proposed a new particle filter, namely the projective particle filter, which integrates these characteristics into the importance density. The projective fractional transform, which maps the real-world position of a vehicle onto the camera plane, provides a better distribution of the samples in the feature space. However, since the prior is not used for sampling, the weights of the projective particle filter have to be readjusted. The standard and projective particle filters have been evaluated on traffic surveillance videos using three measures of robust and accurate vehicle tracking: (i) the degeneracy of the sample set is reduced when the fractional transform is integrated within the importance density; (ii) the tracking rate, measured through drift evaluation, shows an improvement in the robustness of the tracker; and (iii) the MSE on the vehicle trajectory is reduced with the projective particle filter. Furthermore, the proposed technique outperforms the standard particle filter in terms of MSE even with fewer particles.
Rights and permissions
Open Access This article is distributed under the terms of the Creative Commons Attribution 2.0 International License (https://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Cite this article
Bouttefroy, P., Bouzerdoum, A., Phung, S. et al. Integrating the Projective Transform with Particle Filtering for Visual Tracking. J Image Video Proc. 2011, 839412 (2011). https://doi.org/10.1155/2011/839412