Recurrent neural networks (RNNs) are a class of nonlinear dynamical systems often used to model sequence-to-sequence maps. RNNs have excellent expressive power but lack the stability and robustness guarantees that are necessary for safety-critical applications. In this paper we formulate convex sets of RNNs with guaranteed stability and robustness properties. The guarantees are derived using differential IQC methods and can ensure contraction (global exponential stability of all solutions) and bounds on the incremental ℓ2 gain (the Lipschitz constant of the learnt sequence-to-sequence mapping). An implicit model structure is employed to construct a jointly convex representation of an RNN and its certificate of stability or robustness. We prove that the proposed model structure includes all previously proposed convex sets of contracting RNNs as special cases, and also includes all stable linear dynamical systems. We demonstrate the utility of the proposed model class in the context of nonlinear system identification.
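For concreteness, the two properties named in the abstract can be stated in a standard discrete-time form. This is a sketch using textbook definitions; the paper's precise formulation (norms, weighting matrices, initial-condition terms) may differ in detail. Here x_t denotes the RNN state, u_t the input, y_t the output, and superscripts a, b distinguish two trajectories.

```latex
% Contraction: for a common input sequence, any two state trajectories
% converge to each other exponentially (rate \alpha, overshoot K).
\exists\, K \ge 1,\ \alpha \in (0,1):\quad
  \| x_t^a - x_t^b \| \;\le\; K \alpha^t \, \| x_0^a - x_0^b \|
  \quad \forall t \ge 0.

% Incremental \ell_2 gain bound \gamma: a Lipschitz bound on the learnt
% sequence-to-sequence map, for trajectories from a common initial state.
\sum_{t=0}^{T} \| y_t^a - y_t^b \|^2
  \;\le\; \gamma^2 \sum_{t=0}^{T} \| u_t^a - u_t^b \|^2
  \quad \forall T \ge 0.
```

Note that an incremental ℓ2 gain bound is strictly stronger than the (non-incremental) ℓ2 gain bounds often used in robust control: it constrains the difference between any two output sequences, not just the response to a single input, which is what makes it a Lipschitz certificate for the learnt map.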