Algorithm portfolios represent a strategy of composing multiple heuristic algorithms, each suited to a different class of problems, within a single general solver that chooses the best-suited algorithm for each input. This approach has recently gained popularity, especially for solving combinatorial problems, but applications in continuous optimization are still emerging. The COCO platform of the BBOB workshop series is the current standard for measuring the performance of continuous black-box optimization algorithms. As an extension to the COCO platform, we present the Python-based COCOpf framework, which allows composing portfolios of optimization algorithms and running experiments with different selection strategies. In our framework, we focus on black-box algorithm portfolios and online adaptive selection. As a demonstration, we measure the performance of stock SciPy optimization algorithms and the popular CMA algorithm, both alone and in a portfolio with two simple selection strategies. We confirm that even a naive selection strategy can provide improved performance across problem classes.
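
For illustration only, the sketch below shows the general idea of a portfolio with a naive selection strategy over stock SciPy minimizers; it is not the COCOpf API, and the function, method list, and round-robin rule are our own assumptions rather than anything defined in this paper.

```python
# Minimal sketch (assumed example, not the COCOpf API): a portfolio of
# SciPy local-search methods with a naive round-robin selection strategy.
import numpy as np
from scipy.optimize import minimize


def rosenbrock(x):
    """Example black-box objective (stand-in for a BBOB benchmark function)."""
    return sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2 + (1.0 - x[:-1]) ** 2)


# Portfolio members: named minimizers shipped with scipy.optimize.
PORTFOLIO = ["Nelder-Mead", "Powell", "BFGS"]


def portfolio_solve(f, dim, restarts=9, seed=0):
    """Round-robin selection: each restart hands its budget to the next
    algorithm in the portfolio; the best solution found is kept."""
    rng = np.random.default_rng(seed)
    best = None
    for i in range(restarts):
        method = PORTFOLIO[i % len(PORTFOLIO)]   # naive selection step
        x0 = rng.uniform(-5.0, 5.0, dim)         # random restart point
        res = minimize(f, x0, method=method, options={"maxiter": 200})
        if best is None or res.fun < best.fun:
            best = res
    return best


if __name__ == "__main__":
    best = portfolio_solve(rosenbrock, dim=5)
    print(best.fun, best.x)
```

An adaptive online strategy would replace the round-robin step with a rule that allocates further restarts based on the progress each algorithm has made so far, which is the kind of selection strategy the framework is meant to let users experiment with.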