Low-resource languages such as Filipino suffer from data scarcity, which makes it challenging to develop NLP applications for the language. Transfer Learning (TL) techniques help alleviate this problem in low-resource settings. In recent years, transformer-based models have proven effective on low-resource tasks, but their high compute and memory requirements limit their accessibility, creating a need for a cheaper yet effective alternative. This paper makes three contributions. First, we release a pre-trained AWD-LSTM language model for Filipino to serve as a foundation for building NLP applications in the language. Second, we benchmark AWD-LSTM on the Hate Speech classification task and show that it performs on par with transformer-based models. Third, we analyze the performance of AWD-LSTM in the low-resource setting using degradation tests and compare it with transformer-based models.