Abstract: Traditional statistical learning theory relies on the assumption that data are independently and identically distributed (i.i.d.) according to a given distribution. However, the independence assumption fails to hold in many real-world applications. In this survey, we consider learning settings in which the examples are dependent and their dependence relationship can be characterized by a graph. We collect various graph-dependent concentration bounds, which are then used to derive Rademacher-complexity and algorithmic-stability generalization bounds for learning from graph-dependent data. We illustrate this paradigm with three learning tasks and suggest several research directions for future work. To the best of our knowledge, this is the first survey on this subject.