Automatic Extraction of Pedestrian Trajectories from Video Recordings
To understand and subsequently model pedestrian dynamics, reliable empirical data on pedestrian movement are necessary for analysis and verification. The existing database is small, inaccurate and highly contradictory. Manual procedures for collecting these data are very time-consuming and usually do not provide sufficient accuracy in space and time.
For this reason we are developing the tool PeTrack (Pedestrian Tracking) to automatically extract accurate pedestrian trajectories from video recordings. The combined trajectories of all pedestrians provide quantities such as velocity, flow, density and individual distances at any time and position. With such a tool, extensive experimental series with a large number of persons can be analyzed.
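As a rough illustration of how such quantities can be derived from trajectories, the following sketch computes an individual speed by finite differences and a density in a rectangular measurement area. The data layout (per-pedestrian lists of frame/position samples) and all parameter values are assumptions for illustration, not PeTrack's actual output format.

```python
from typing import Dict, List, Tuple

# Assumed layout: one trajectory per pedestrian id, each a list of
# (frame, x, y) samples with x, y in metres on the ground plane.
Trajectory = List[Tuple[int, float, float]]

def speed(traj: Trajectory, frame: int, fps: float) -> float:
    """Speed in m/s at `frame`, via a central finite difference."""
    pts = {f: (x, y) for f, x, y in traj}
    (x0, y0), (x1, y1) = pts[frame - 1], pts[frame + 1]
    dt = 2.0 / fps  # time between the two neighbouring frames
    return ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 / dt

def density(trajs: Dict[int, Trajectory], frame: int,
            area: Tuple[float, float, float, float]) -> float:
    """Pedestrians per m^2 inside the rectangle (xmin, ymin, xmax, ymax)."""
    xmin, ymin, xmax, ymax = area
    n = sum(1 for traj in trajs.values()
            for f, x, y in traj
            if f == frame and xmin <= x <= xmax and ymin <= y <= ymax)
    return n / ((xmax - xmin) * (ymax - ymin))
```

For example, a pedestrian advancing 0.05 m per frame at 25 fps yields a speed of 1.25 m/s; two pedestrians inside a 2 m x 2 m area give a density of 0.5 m^-2.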
The program has to cope with wide-angle lenses and a high density of pedestrians. Lens distortion and the perspective view are taken into account. The procedure comprises calibration, recognition, tracking and height detection.
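To give an idea of the lens-distortion part of the calibration, the sketch below applies a simple radial (Brown) distortion model to normalised image coordinates and inverts it by fixed-point iteration. The model and the coefficients k1, k2 are a common textbook choice, assumed here for illustration; they are not taken from PeTrack.

```python
def distort(x: float, y: float, k1: float, k2: float) -> tuple:
    """Apply a radial distortion model to a normalised image point:
    scale the point by 1 + k1*r^2 + k2*r^4, where r is its radius."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale

def undistort(xd: float, yd: float, k1: float, k2: float,
              iterations: int = 20) -> tuple:
    """Invert the radial model by fixed-point iteration: repeatedly
    divide the distorted point by the scale factor evaluated at the
    current estimate of the undistorted point."""
    x, y = xd, yd
    for _ in range(iterations):
        r2 = x * x + y * y
        scale = 1.0 + k1 * r2 + k2 * r2 * r2
        x, y = xd / scale, yd / scale
    return x, y
```

For moderate distortion coefficients the iteration converges quickly, so a round trip distort/undistort recovers the original point to high accuracy.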
At the moment we are extending PeTrack to use stereo recordings, so that pedestrians can be tracked without markers.
An early version of PeTrack is available here. The brief documentation cannot answer every question about using PeTrack, so please contact the author before setting up experiments and automatic extraction with PeTrack.