It is argued that for a computer to interact naturally with humans, it must possess human-like communication skills. One of these skills is the ability to understand the emotional state of the person it is interacting with. The most expressive way humans display emotions is through facial expressions. In most existing facial expression systems and databases, the emotion data were collected by asking subjects to perform a series of facial expressions. However, such directed or deliberate facial actions typically differ in appearance and timing from authentic facial expressions induced by events in the subject's normal environment. In this paper, we present our effort to create an authentic facial expression database based on spontaneous emotions elicited by the environment. Furthermore, we test and compare a wide range of classifiers from the machine learning literature that can be used for facial expression classification.