We introduce a new model for sequential data based on sets of probabilities. We name the model GAMMT, which stands for Generative Ambiguity Models using Multiple Transformers. We suppose that the data-generating process of a sequence is ambiguous and determined by a set of probabilities rather than a single probability, as in conventional models. We use multiple parallel transformers, connected by a selection mechanism, to approximate these ambiguous probabilities. GAMMT enables generative modeling of ambiguity as well as multiple representations of the input tokens and the input sequence. This work explores the combination of the attention mechanism with ambiguity via deep neural networks. We expect this framework to facilitate new research in machine learning and to improve our understanding of the attention-ambiguity mechanism.
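To make the architecture concrete, below is a minimal PyTorch sketch of the idea as described above: K parallel transformers each produce a next-token distribution for the same input sequence, yielding a set of probabilities, and a selection mechanism chooses among them. All names here (`GAMMTSketch`, `select`, `num_models`) are hypothetical, and the likelihood-based selector is a placeholder assumption, since the abstract does not specify how selection works; this is a reading aid, not the authors' implementation.

```python
import torch
import torch.nn as nn


class GAMMTSketch(nn.Module):
    """Hypothetical sketch of GAMMT: K parallel transformers, each
    approximating one member of the set of sequence probabilities."""

    def __init__(self, vocab_size=1000, d_model=128, nhead=4,
                 num_layers=2, num_models=3, max_len=512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.pos = nn.Embedding(max_len, d_model)
        # One independent transformer per probability in the set.
        self.encoders = nn.ModuleList([
            nn.TransformerEncoder(
                nn.TransformerEncoderLayer(d_model, nhead, batch_first=True),
                num_layers)
            for _ in range(num_models)])
        self.heads = nn.ModuleList([
            nn.Linear(d_model, vocab_size) for _ in range(num_models)])

    def forward(self, tokens):
        # tokens: (batch, seq_len) integer token ids.
        seq_len = tokens.size(1)
        pos = torch.arange(seq_len, device=tokens.device)
        x = self.embed(tokens) + self.pos(pos)
        # Causal mask so each position attends only to earlier ones.
        mask = torch.triu(torch.full((seq_len, seq_len), float("-inf"),
                                     device=tokens.device), diagonal=1)
        # Each transformer produces its own next-token logits for the
        # same input, i.e. a set of probabilities for the sequence.
        logits = [head(enc(x, mask=mask))
                  for enc, head in zip(self.encoders, self.heads)]
        return torch.stack(logits, dim=0)  # (K, batch, seq, vocab)


def select(logits, tokens):
    """Placeholder selection mechanism (an assumption, not the paper's):
    per sequence, pick the transformer whose distribution assigns the
    highest likelihood to the observed tokens."""
    log_probs = logits[:, :, :-1, :].log_softmax(-1)   # position t predicts t+1
    tgt = tokens[:, 1:].unsqueeze(0).unsqueeze(-1)     # (1, B, T-1, 1)
    tgt = tgt.expand(logits.size(0), -1, -1, -1)       # (K, B, T-1, 1)
    token_ll = log_probs.gather(-1, tgt).squeeze(-1)   # (K, B, T-1)
    return token_ll.sum(-1).argmax(0)                  # (B,) chosen index
```

A short usage example under the same assumptions:

```python
model = GAMMTSketch()
tokens = torch.randint(0, 1000, (2, 16))
logits = model(tokens)           # (3, 2, 16, 1000): one distribution per transformer
chosen = select(logits, tokens)  # which transformer each sequence selects
```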