Factorization Machine (FM) is a supervised learning approach with a powerful capability for feature engineering, and it yields state-of-the-art performance in various batch learning tasks where all the training data is available before training begins. However, in real-world applications where data arrive sequentially in a streaming manner, the high cost of re-training with batch learning algorithms poses formidable challenges for online learning. The first challenge is that no prior formulation of FM fulfills the requirements of Online Convex Optimization (OCO), the predominant framework for designing online learning algorithms. To address this challenge, we devise a new convexification scheme that yields a Compact Convexified FM (CCFM), which seamlessly meets the requirements of OCO. A subsequent challenge arises when learning CCFM in the online setting: most existing online algorithms suffer from expensive projection operations. To address it, we follow the general projection-free algorithmic framework of Online Conditional Gradient and propose an Online Compact Convex Factorization Machine (OCCFM) algorithm that replaces the projection operation with efficient linear optimization steps. On the theoretical side, we prove that OCCFM achieves a sub-linear regret bound. To evaluate its empirical performance, we conduct extensive experiments on six real-world datasets for online recommendation and binary classification tasks. The experimental results show that OCCFM outperforms state-of-the-art online learning algorithms.
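For orientation, the projection-free update mentioned above follows the spirit of a standard conditional gradient (Frank-Wolfe) step: a linear optimization over the feasible set replaces the Euclidean projection, and the iterate moves by a convex combination. The sketch below is illustrative only; the notation (feasible set $\mathcal{K}$, per-round objective $f_t$, step size $\gamma_t$) is generic and not taken from this abstract, and the OCO variant typically linearizes a regularized surrogate of the accumulated losses rather than the instantaneous loss.

% Schematic conditional-gradient step (illustrative notation, not the paper's exact algorithm)
\begin{align}
  v_t &\in \arg\min_{v \in \mathcal{K}} \big\langle \nabla f_t(x_t),\, v \big\rangle
      && \text{(linear optimization over } \mathcal{K}\text{, no projection)} \\
  x_{t+1} &= (1 - \gamma_t)\, x_t + \gamma_t\, v_t
      && \text{(convex combination keeps } x_{t+1} \in \mathcal{K}\text{)}
\end{align}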