We introduce new combinatorial quantities for concept classes, and prove lower and upper bounds on learning complexity in several models of query learning in terms of various combinatorial quantities. Our approach is flexible and powerful enough to give new and very short proofs of the efficient learnability of several prominent examples (e.g., regular languages and regular $\omega$-languages), in some cases also producing new bounds on the number of queries. In the setting of equivalence plus membership queries, we give an algorithm which learns a class with polynomially many queries whenever any such algorithm exists. We also study equivalence query learning in a randomized model, producing new bounds on the expected number of queries required to learn an arbitrary concept. Many of the techniques and notions of dimension draw inspiration from, or are related to, notions from model theory, and we explain these connections. We also use techniques from query learning to mildly improve a result of Laskowski regarding compression schemes.