EEG signals attract particular attention in emotion recognition owing to their high temporal resolution and the rich information they carry about brain activity. Different brain regions work together to process information, and brain activity changes over time. Therefore, investigating the connections between brain areas and their temporal patterns plays an important role in neuroscience. In this study, we investigate emotion classification performance using functional connectivity features in different frequency bands and compare it with the performance obtained using differential entropy (DE) features, which have previously been used for this task. Moreover, we investigate the effect of using different time periods on classification performance. Our results on the publicly available SEED dataset show that, as time goes on, emotions become more stable and classification accuracy increases. Among the time periods considered, we achieve the highest classification accuracy using the interval from 140 s to the end of each trial, where accuracy improves by 4 to 6% compared to using the entire signal. A mean accuracy of about 88% is obtained using any of the Pearson correlation coefficient, coherence, or phase locking value features with an SVM classifier. Therefore, functional connectivity features lead to better classification accuracy than DE features (mean accuracy of 84.89%) within the proposed framework. Finally, in a relatively fair comparison, we show that using the best time interval and an SVM yields higher accuracy than Recurrent Neural Networks, which require large amounts of data and have high computational cost.
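To make the three connectivity features concrete, the sketch below shows one way to compute Pearson correlation, coherence, and phase locking value matrices for a single band-pass-filtered EEG segment. This is a minimal illustration under stated assumptions, not the study's actual pipeline: the sampling rate, band limits, and the helper names `bandpass` and `connectivity_features` are hypothetical choices for the example.

```python
import numpy as np
from scipy.signal import butter, filtfilt, coherence, hilbert


def bandpass(x, low, high, fs, order=4):
    """Zero-phase band-pass filter applied along the last (time) axis."""
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, x, axis=-1)


def connectivity_features(eeg, fs=200.0, band=(14.0, 30.0)):
    """Compute PCC, coherence, and PLV connectivity for one EEG segment.

    eeg  : array of shape (n_channels, n_samples)
    fs   : sampling rate in Hz (assumed value; SEED is often downsampled to 200 Hz)
    band : (low, high) frequency band in Hz (here an assumed beta-band range)
    """
    x = bandpass(eeg, band[0], band[1], fs)
    n = x.shape[0]

    # Pearson correlation coefficient between every pair of channels.
    pcc = np.corrcoef(x)

    # Instantaneous phase from the Hilbert transform, used for the PLV.
    phase = np.angle(hilbert(x, axis=-1))

    coh = np.eye(n)
    plv = np.eye(n)
    for i in range(n):
        for j in range(i + 1, n):
            # Magnitude-squared coherence, averaged within the band of interest.
            f, cxy = coherence(x[i], x[j], fs=fs, nperseg=int(fs))
            mask = (f >= band[0]) & (f <= band[1])
            coh[i, j] = coh[j, i] = cxy[mask].mean()
            # Phase locking value: consistency of the phase difference over time.
            dphi = phase[i] - phase[j]
            plv[i, j] = plv[j, i] = np.abs(np.mean(np.exp(1j * dphi)))

    # The upper-triangular entries form a feature vector, e.g. for an SVM.
    iu = np.triu_indices(n, k=1)
    return np.concatenate([pcc[iu], coh[iu], plv[iu]])
```

In such a setup, the returned feature vectors for all segments could then be fed to an off-the-shelf SVM (e.g. scikit-learn's `SVC`), mirroring the classifier choice described above.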