Camera-based object tracking systems have become an essential requirement in today's society. Inexpensive, high-quality video cameras and the growing demand for automated video analysis have generated considerable interest across numerous fields. Most conventional algorithms are built on background subtraction, frame differencing, and the assumption of a static background; they fail to track under illumination variation, cluttered backgrounds, and occlusion. Image-segmentation-based object tracking algorithms, in turn, cannot track in real time. Feature extraction is an indispensable first step in object tracking applications. In this paper, a novel real-time object tracking algorithm based on position and feature vectors is developed. The proposed algorithm involves two phases. In the first phase, features are extracted for the region-of-interest object in the first frame and for nine candidate positions in the second frame of the video. In the second phase, the similarity between the extracted feature vectors of the two frames is estimated using Euclidean distance: the nearest match is the candidate whose feature vector has the minimum distance from the first-frame feature vector. The proposed algorithm is compared with existing object tracking algorithms using different feature extraction techniques. The method is simulated and evaluated with statistical features, the discrete wavelet transform, the Radon transform, the scale-invariant feature transform (SIFT), and features from accelerated segment test (FAST). The performance evaluation shows that the proposed algorithm can be applied with any feature extraction technique, and its tracking accuracy in video depends on the technique chosen. © Springer Nature Singapore Pte Ltd. 2018.
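The two-phase matching described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes a simple statistical feature vector (mean and standard deviation of patch intensities) as a stand-in for the richer extractors the paper evaluates (DWT, Radon, SIFT, FAST), and it interprets the "nine positions" as a 3x3 grid of offsets around the previous object location; the `step` parameter and all function names are hypothetical.

```python
import math

def extract_features(patch):
    # Placeholder statistical features (mean and standard deviation of
    # pixel intensities); any of the paper's extractors (DWT, Radon,
    # SIFT, FAST) could be substituted here.
    vals = [v for row in patch for v in row]
    mean = sum(vals) / len(vals)
    var = sum((v - mean) ** 2 for v in vals) / len(vals)
    return (mean, math.sqrt(var))

def crop(frame, top, left, h, w):
    # Extract an h-by-w sub-image with its top-left corner at (top, left).
    return [row[left:left + w] for row in frame[top:top + h]]

def track_step(prev_frame, next_frame, top, left, h, w, step=1):
    # Phase 1: feature vector of the region of interest in the first
    # frame, plus feature vectors at nine candidate positions in the
    # second frame (assumed here to be a 3x3 grid of offsets around the
    # previous location).
    ref = extract_features(crop(prev_frame, top, left, h, w))
    rows, cols = len(next_frame), len(next_frame[0])
    best, best_dist = (top, left), float("inf")
    for dy in (-step, 0, step):
        for dx in (-step, 0, step):
            y, x = top + dy, left + dx
            if y < 0 or x < 0 or y + h > rows or x + w > cols:
                continue  # candidate window falls outside the frame
            cand = extract_features(crop(next_frame, y, x, h, w))
            # Phase 2: Euclidean distance between feature vectors.
            dist = math.dist(ref, cand)
            if dist < best_dist:
                best, best_dist = (y, x), dist
    return best  # candidate with minimum distance is the nearest match
```

Running this step frame-by-frame, with the returned position fed back in as the new (top, left), yields the tracking loop; the feature extractor is the only component that changes across the evaluated techniques.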