Transfer learning to extremely low-resource languages is challenging: such languages have neither large-scale monolingual corpora for pre-training nor sufficient annotated data for fine-tuning. We build on MetaXL, which applies meta-learning to transfer from a single source language to an extremely low-resource target language. We propose an enhanced approach that uses multiple source languages chosen in a data-driven manner. In addition, we introduce a sample selection strategy that decides which source language to draw training examples from using a multi-armed bandit algorithm. With both improvements, we achieve state-of-the-art results on NER for extremely low-resource languages while using the same amount of data as MetaXL, yielding representations that generalize better. Moreover, because the method can draw on multiple source languages, it allows the framework to exploit much larger amounts of data, while still outperforming the original MetaXL method even when restricted to the same amount of data.
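To make the bandit-based sample selection concrete, the following is a minimal sketch of how a multi-armed bandit could choose which source language to sample a training batch from at each step. It assumes an EXP3-style bandit and a reward derived from target-language dev improvement; the specific bandit variant, language pool, and reward signal are illustrative assumptions, not the exact formulation used in this work.

```python
import math
import random


class Exp3LanguageSelector:
    """EXP3 bandit over candidate source languages.

    At each training step, sample a source language to draw a batch from,
    then feed back a reward in [0, 1] (e.g., a clipped reduction in
    target-language dev loss). Language codes and reward are illustrative.
    """

    def __init__(self, languages, gamma=0.1):
        self.languages = languages
        self.gamma = gamma
        self.weights = [1.0] * len(languages)

    def _probs(self):
        # Mix the exponential-weights distribution with uniform exploration.
        total = sum(self.weights)
        k = len(self.weights)
        return [(1 - self.gamma) * w / total + self.gamma / k for w in self.weights]

    def select(self):
        """Sample a source language index from the EXP3 distribution."""
        probs = self._probs()
        idx = random.choices(range(len(self.languages)), weights=probs, k=1)[0]
        return idx, self.languages[idx]

    def update(self, idx, reward):
        """Update the chosen arm with an importance-weighted reward."""
        probs = self._probs()
        estimated = reward / probs[idx]
        k = len(self.weights)
        self.weights[idx] *= math.exp(self.gamma * estimated / k)


# Usage sketch: pick a source language per step, reward it by dev improvement.
selector = Exp3LanguageSelector(["tr", "fa", "hi", "ar"])  # hypothetical source pool
for step in range(3):
    idx, lang = selector.select()
    reward = random.random()  # stand-in for a real signal, e.g. clipped loss drop
    selector.update(idx, reward)
```

Over training, such a selector shifts probability mass toward source languages whose batches most help the target language, which is the intuition behind the data-driven language utilization described above.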