Task-oriented communication is an emerging paradigm for next-generation communication networks, in which task-relevant information, rather than raw data, is extracted and transmitted for downstream applications. Most existing deep learning (DL)-based task-oriented communication systems are designed for a closed-world scenario, assuming either that the training and test data share the same distribution or that the system has access to a large out-of-distribution (OoD) dataset for retraining. In practical open-world scenarios, however, task-oriented communication systems must handle unknown OoD data. Under such circumstances, the powerful approximation ability of learning-based methods may cause the system to overfit the training (i.e., in-distribution) data and produce overconfident judgments when encountering OoD data. In this paper, building on the information bottleneck (IB) framework, we propose a class-conditional IB (CCIB) approach to address this problem, supported by information-theoretic insights. The key idea is to extract features from in-distribution data that are distinguishable across classes while remaining compact and informative. This is achieved by imposing a class-conditional latent prior distribution and enforcing the latent representations of different classes to be far apart from each other. Simulation results demonstrate that the proposed approach detects OoD data more efficiently than the baselines and state-of-the-art approaches, without compromising the rate-distortion tradeoff.
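To make this idea concrete, one plausible instantiation is a variational IB objective augmented with a class-conditional prior and an explicit separation penalty; the expression below is an illustrative sketch rather than the paper's exact objective, with the weights $\beta$, $\gamma$ and the divergence $d(\cdot,\cdot)$ introduced here purely for exposition:
\[
\min_{q(z \mid x)} \ \mathbb{E}_{p(x,y)}\!\left[ -\,\mathbb{E}_{q(z \mid x)}\big[\log p(y \mid z)\big] \;+\; \beta\, D_{\mathrm{KL}}\!\big(q(z \mid x) \,\|\, r(z \mid y)\big) \right] \;-\; \gamma \sum_{y \neq y'} d\big(r(z \mid y),\, r(z \mid y')\big),
\]
where $q(z \mid x)$ is the transmitter-side feature encoder and $r(z \mid y)$ is the class-conditional latent prior. The first two terms form the usual variational rate-distortion (IB) tradeoff, keeping the latents compact yet informative, while the last term drives the priors, and hence the latent representations, of different classes apart, so that an OoD sample, whose latent lies far from every class prior, can be detected.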