In this paper, we develop methods to assist people with disabilities in controlling robots through a brain-computer interface. Using a new technique based on an action grammar, the robot can carry out tasks such as opening a door simply by recognizing the user's intention. To implement this concept, we have developed techniques that make an articulated robot spatially aware. We provide a set of primitive actions that can be combined using an action grammar, which we model as a stationary Markov decision process. We demonstrate our methodology on two tasks: (i) screw insertion and (ii) door opening.
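The idea of chaining primitive actions through a grammar with stationary (state-independent-in-time) transitions can be sketched as follows. This is a minimal illustration, not the paper's implementation: the state names, action names, and the deterministic transition table are hypothetical placeholders standing in for the door-opening task.

```python
# Hypothetical sketch: an "action grammar" as a stationary Markov decision
# process with deterministic transitions. All state and action names below
# are illustrative, not taken from the paper.

states = ["approach", "grasp", "turn", "pull", "done"]   # door-opening sub-goals
actions = ["Reach", "Close-gripper", "Rotate-wrist", "Retract"]

# Transition table (state, action) -> next state. A general stationary MDP
# would assign probabilities; a deterministic table keeps the grammar visible.
grammar = {
    ("approach", "Reach"): "grasp",
    ("grasp", "Close-gripper"): "turn",
    ("turn", "Rotate-wrist"): "pull",
    ("pull", "Retract"): "done",
}

def rollout(start="approach"):
    """Follow the grammar from `start`, returning the composed action plan."""
    state, plan = start, []
    while state != "done":
        # select the (here unique) action the grammar permits in this state
        action = next(a for (s, a) in grammar if s == state)
        plan.append(action)
        state = grammar[(state, action)]
    return plan
```

In this toy version, a recognized user intention (e.g. "open the door") would simply select which grammar to roll out; the transitions do not depend on the time step, which is what makes the process stationary.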