Although Markov models are widely used and studied, guaranteeing optimal performance in real-world processes requires accurate state inference amid non-stationarity. This paper develops a novel estimation technique that captures non-stationarity in Markov sequences induced by switching transition probability matrices (TPMs). We introduce the concept of a likelihood rate to establish the existence of non-stationarity and to detect and estimate multiple TPMs. We then layer another Markov chain to model switches between the estimated TPMs, yielding the Layered Non-stationary Markov Model (LNMM). We present a novel non-parametric estimation procedure that evaluates multiple priors and performs a Bayesian update of the prior with the highest likelihood rate. Our experiments on synthetic data and on a honey bee dance dataset show that inference using LNMM is twice as accurate as existing unsupervised learning methods while remaining computationally efficient, validating it as a highly expressive model for non-stationary Markov sequences.
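The multi-prior selection step described above can be sketched as follows. This is an illustrative, assumption-laden sketch rather than the paper's actual algorithm: it assumes Dirichlet priors over TPM rows, uses the prior-mean TPM for scoring, and uses the per-transition average log-likelihood as a stand-in for the paper's likelihood rate.

```python
import numpy as np

def transition_counts(seq, n_states):
    """Count observed state-to-state transitions in a sequence."""
    C = np.zeros((n_states, n_states))
    for a, b in zip(seq[:-1], seq[1:]):
        C[a, b] += 1
    return C

def log_likelihood_rate(counts, alpha):
    """Per-transition log-likelihood of the observed counts under the TPM
    implied by a Dirichlet prior's mean (a simple proxy for the paper's
    likelihood rate)."""
    tpm = alpha / alpha.sum(axis=1, keepdims=True)  # prior-mean TPM
    n = counts.sum()
    return (counts * np.log(tpm)).sum() / max(n, 1)

def select_and_update(seq, priors, n_states):
    """Score each Dirichlet prior by its likelihood rate on a new segment,
    then Bayesian-update only the best-scoring prior (conjugate count add)."""
    C = transition_counts(seq, n_states)
    rates = [log_likelihood_rate(C, a) for a in priors]
    best = int(np.argmax(rates))
    priors[best] = priors[best] + C  # Dirichlet posterior update
    return best, rates
```

For example, given one prior biased toward self-transitions and one biased toward alternation, a "sticky" segment such as `[0, 0, 0, 0, 1, 1, 1, 1]` selects and updates the self-transition prior, leaving the other prior untouched for later segments governed by a different TPM.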