This paper presents a new method for activity analysis of construction workers using inexpensive RGB+depth sensors. This task is important because no current workforce assessment method provides the detailed, continuous information that project managers need to identify bottlenecks affecting labor productivity. Previous work using RGB-D images focuses on action recognition from short video sequences in which each video contains only one action. Automating this analysis for long sequences of RGB-D images is challenging because the start and end of each action are unknown, recognizing even single actions remains difficult, and no data sets or validation metrics exist for evaluating algorithms. Given an input sequence of RGB-D images, our algorithm divides it into temporal segments and automatically classifies the observed actions. To do so, the algorithm first detects body postures in real time. A kernel density estimation (KDE) model is then trained on the classification scores of discriminatively trained bag-of-poses action classifiers. Finally, a hidden Markov model (HMM) infers the most discriminative sequence of action labels. The performance of our model is tested on unprecedented data sets of actual drywall construction operations. Experimental results, along with the perceived benefits and limitations of the proposed method, are discussed in detail.
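The score-to-label stage of the pipeline summarized above (KDE likelihoods over classifier scores, decoded by an HMM) can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the action names ("hang", "tape"), bandwidth, and transition probabilities are assumed values chosen only to make the example concrete.

```python
import math

def gaussian_kde(samples, bandwidth=0.1):
    """Kernel density estimate over 1-D classifier scores (Gaussian kernel).

    Stands in for a KDE model of bag-of-poses classifier scores; the
    bandwidth here is an illustrative guess, not a reported value.
    """
    norm = len(samples) * bandwidth * math.sqrt(2 * math.pi)
    return lambda x: sum(
        math.exp(-0.5 * ((x - s) / bandwidth) ** 2) for s in samples
    ) / norm

def viterbi(likelihoods, trans, init):
    """Most likely action-label sequence given per-segment likelihoods.

    likelihoods: list (one entry per temporal segment) of dicts mapping
                 action -> p(observed classifier scores | action),
                 e.g. produced by per-action KDEs as above.
    trans:       trans[a][b] = p(next action is b | current action is a).
    init:        initial action probabilities.
    """
    states = list(init)
    # Log-domain dynamic programming table and backpointers.
    V = [{s: math.log(init[s]) + math.log(likelihoods[0][s] + 1e-12)
          for s in states}]
    back = []
    for obs in likelihoods[1:]:
        row, ptr = {}, {}
        for s in states:
            prev = max(states, key=lambda p: V[-1][p] + math.log(trans[p][s]))
            row[s] = (V[-1][prev] + math.log(trans[prev][s])
                      + math.log(obs[s] + 1e-12))
            ptr[s] = prev
        V.append(row)
        back.append(ptr)
    # Backtrack from the best final state.
    state = max(states, key=lambda s: V[-1][s])
    path = [state]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return path[::-1]
```

With two hypothetical drywall actions and "sticky" self-transition probabilities, the decoder returns a temporally smoothed label sequence rather than flickering frame-by-frame decisions, which reflects the role the HMM plays after the KDE stage.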