This study explores the electroencephalographic (EEG) correlates of emotions during music listening. Principal component analysis (PCA) is used to correlate EEG features with the appreciation of complex music. Machine-learning algorithms are then applied to classify EEG dynamics into four subjectively reported emotional states. The high classification accuracy (81.58 +/- 3.74%) demonstrates the feasibility of using EEG features to assess the emotional states of human subjects. Moreover, the spatial and spectral EEG patterns most relevant to emotion appear reproducible across subjects.
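The PCA-plus-classification pipeline described above can be sketched as follows. This is a minimal illustration, not the authors' actual method: the feature dimensions, classifier choice (an SVM here), and the synthetic stand-in data are all assumptions made for the example.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic stand-in for EEG spectral features:
# 200 trials x 120 features (e.g. 30 channels x 4 frequency bands),
# each trial labeled with one of four emotional states (0-3).
n_trials, n_features = 200, 120
labels = rng.integers(0, 4, size=n_trials)
# Shift each class's mean so the toy problem is learnable.
features = rng.normal(size=(n_trials, n_features)) + labels[:, None] * 0.5

# PCA reduces the feature dimensionality before classification.
clf = make_pipeline(StandardScaler(),
                    PCA(n_components=20),
                    SVC(kernel="rbf"))
scores = cross_val_score(clf, features, labels, cv=5)
print(f"mean cross-validated accuracy: {scores.mean():.2f}")
```

In a real study the features would be band-power estimates extracted from the EEG, and the labels would come from the subjects' emotion self-reports.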