Human–Computer Interaction (HCI) has become an important focus of both computer science research and industrial applications, and on-screen gaze estimation is one of the most active topics in this rapidly growing field. Eye-gaze direction estimation is a sub-area of on-screen gaze estimation, yet the number of studies focused on estimating on-screen gaze direction is limited. For this reason, various appearance-based video-oculography methods are investigated in this work. First, a new dataset is created from user images captured by daylight (visible-spectrum) cameras mounted on the computer screen. Then, Local Binary Pattern Histograms (LBPH), used in this work for the first time to obtain on-screen gaze direction information, and Principal Component Analysis (PCA) are employed to extract image features. Parameter-optimized Support Vector Machines (SVM), Artificial Neural Networks (ANNs), and k-Nearest Neighbors (k-NN) are then adopted to estimate on-screen gaze direction. Finally, the ability of these methods to correctly estimate on-screen gaze direction is compared using the resulting classification accuracies and the results of previous works. The best classification accuracy, 96.67%, is obtained with the LBPH and SVM pair, which outperforms previous works. The results also show that appearance-based methods are well suited to estimating on-screen gaze direction.
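To make the LBPH feature-extraction step concrete, the following is a minimal, self-contained sketch (not the paper's actual implementation) of computing a basic 8-neighbor LBP code histogram over a grayscale image; in the pipeline described above, such histograms would then be fed to a classifier such as scikit-learn's `sklearn.svm.SVC`. The image representation (a plain 2D list of intensities) and the neighbor ordering are illustrative assumptions.

```python
def lbp_histogram(img):
    """Return a normalized 256-bin histogram of basic 8-neighbor LBP codes.

    img: 2D list (rows x cols) of grayscale intensity values.
    Border pixels are skipped, since they lack a full 8-neighborhood.
    """
    h, w = len(img), len(img[0])
    hist = [0] * 256
    # 8-neighbor offsets, clockwise starting from the top-left pixel
    # (the bit ordering is an illustrative convention, not from the paper)
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            center = img[y][x]
            code = 0
            for bit, (dy, dx) in enumerate(offsets):
                # set the bit when the neighbor is at least as bright
                if img[y + dy][x + dx] >= center:
                    code |= 1 << bit
            hist[code] += 1
    # normalize so the histogram sums to 1 (guard against empty interiors)
    total = sum(hist) or 1
    return [v / total for v in hist]
```

In practice the image is often divided into a grid of cells, with one histogram per cell concatenated into the final feature vector; that refinement is omitted here for brevity.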