In recent years, the processing of electrophysiological signals has become one of the popular approaches to building human-computer interface (HCI) systems. In this project, an HCI system has been developed for people who cannot speak or use their hands (e.g., due to hemiparesis, quadriplegia, or ALS), based on the electrophysiological signals generated by voluntary contractions of the muscles around the eye. Eye movements are responsible for moving the cursor, while the signals generated by successive blinks of the eye are used to simulate decision-making actions such as mouse clicks. As part of the project, a text editor has been developed that is operated using these EOG signals.
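To illustrate the mapping described above, the following Python sketch shows one way classified EOG events could drive a cursor and simulate clicks. It is a minimal sketch, not the project's actual implementation: the event labels, step size, and double-blink click rule are illustrative assumptions.

```python
# Minimal sketch (not the authors' implementation) of how classified EOG events
# could be mapped to cursor and click actions. Event labels, the step size, and
# the blink-based click rule are illustrative assumptions.

CURSOR_STEP = 20  # pixels moved per detected eye-movement event (assumed value)

# Assumed event labels produced by an upstream EOG classifier.
MOVES = {
    "LEFT":  (-CURSOR_STEP, 0),
    "RIGHT": ( CURSOR_STEP, 0),
    "UP":    (0, -CURSOR_STEP),
    "DOWN":  (0,  CURSOR_STEP),
}

def run_interface(events, start=(0, 0)):
    """Translate a stream of EOG event labels into cursor moves and clicks."""
    x, y = start
    blink_count = 0
    for event in events:
        if event in MOVES:
            dx, dy = MOVES[event]
            x, y = x + dx, y + dy
            blink_count = 0  # a movement interrupts a blink sequence
            print(f"cursor -> ({x}, {y})")
        elif event == "BLINK":
            blink_count += 1
            if blink_count == 2:  # two successive blinks simulate a mouse click
                print(f"click at ({x}, {y})")
                blink_count = 0
    return x, y

if __name__ == "__main__":
    # Example event stream: move right twice, then double-blink to click.
    run_interface(["RIGHT", "RIGHT", "BLINK", "BLINK"])
```

In a complete system, the printed actions would instead be forwarded to the operating system's cursor and click APIs, with the event stream supplied by the EOG signal-processing stage.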