Hiligaynon – Cebuano Sentence Translator using Recurrent Neural Networks


Authors
  • Sean Michael A. Cadigal
  • Christine F. Peña
  • Christian P. Gelbolingo
  • Anthonette D. Cantara
  • University of San Carlos
Published in
  • Innovatus, vol. 1, no. 1, pp. 9-13, 2018

Abstract
  • Cebuano and Hiligaynon are two of the most widely spoken languages in the Philippines. The two are closely related: they share many words whose meanings differ according to context. This study aimed to create a translation model from Hiligaynon to Cebuano by applying recurrent neural networks with long short-term memory (LSTM). Two neural networks were developed, one for encoding source sentences and one for decoding target sentences, following the sequence-to-sequence learning approach. The highest accuracy was achieved when the model was trained for 150 epochs on 4,000 sentences, yielding a BLEU score of 0.265579454. It can be concluded that a neural machine translation model can be built for this language pair given sufficient training data.
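The BLEU metric reported above can be sketched in plain Python. The following is an illustrative single-reference implementation with uniform n-gram weights, a brevity penalty, and simple smoothing; it is not the evaluation script used in the study, and the example sentences are hypothetical.

```python
import math
from collections import Counter


def ngrams(tokens, n):
    """Return the list of n-grams (as tuples) in a token sequence."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]


def bleu(candidate, reference, max_n=4):
    """Sentence-level BLEU against a single reference translation.

    Uses uniform weights over 1..max_n-gram precisions, clipped
    (modified) n-gram counts, and a brevity penalty for short outputs.
    """
    precisions = []
    for n in range(1, max_n + 1):
        cand_counts = Counter(ngrams(candidate, n))
        ref_counts = Counter(ngrams(reference, n))
        overlap = sum((cand_counts & ref_counts).values())  # clipped matches
        total = max(sum(cand_counts.values()), 1)
        # Tiny additive smoothing so one empty n-gram order does not
        # zero out the geometric mean entirely.
        precisions.append((overlap + 1e-9) / total)
    geo_mean = math.exp(sum(math.log(p) for p in precisions) / max_n)
    # Brevity penalty: penalize candidates shorter than the reference.
    if len(candidate) > len(reference):
        bp = 1.0
    else:
        bp = math.exp(1 - len(reference) / max(len(candidate), 1))
    return bp * geo_mean
```

A perfect match scores 1.0, while truncated or divergent outputs are pulled down by both the clipped precisions and the brevity penalty, which is why scores such as the 0.2656 reported here are typical of small training corpora.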


Keywords
  • Machine Translation, Natural Language Processing, Cebuano-Hiligaynon, Long Short-Term Memory, Recurrent Neural Networks




Cite As
  • APA 7th Edition:
    Cadigal, S., Peña, C., Gelbolingo, C., & Cantara, A. (2018). Hiligaynon – Cebuano Sentence Translator using Recurrent Neural Networks. Innovatus, 1(1), 9-13.
  • Harvard:
    Cadigal, S., Peña, C., Gelbolingo, C. and Cantara, A., 2018. Hiligaynon – Cebuano Sentence Translator using Recurrent Neural Networks. Innovatus, 1(1), pp.9-13.
  • IEEE:
    [1] S. Cadigal, C. Peña, C. Gelbolingo and A. Cantara, "Hiligaynon – Cebuano Sentence Translator using Recurrent Neural Networks", Innovatus, vol. 1, no. 1, pp. 9-13, 2018.


References
  • C. Stergiou and D. Siganos, "Neural Networks," [Online]. Available: https://www.doc.ic.ac.uk/~nd/surprise_96/journal/vol4/cs11/report.html.
  • T. Mikolov, M. Karafiát, L. Burget, J. Černocký and S. Khudanpur, "Recurrent Neural Network Based Language Model," Brno University of Technology, Johns Hopkins University, 2010.
  • S. Hochreiter and J. Schmidhuber, "Long Short-Term Memory," Neural Computation, vol. 9, no. 8, pp. 1735-1780, 1997.
  • "Lilt Labs," 2 August 2017. [Online]. Available: https://labs.lilt.com/2017-machine-translation-quality-evaluation-603ff3ec3c36.
  • C. Callison-Burch, M. Osborne and P. Koehn, "Re-evaluating the Role of BLEU in Machine Translation Research," in 11th Conference of the European Chapter of the Association for Computational Linguistics (EACL), pp. 249-256, 2006.
  • K. Papineni, S. Roukos, T. Ward and W.-J. Zhu, "BLEU: A Method for Automatic Evaluation of Machine Translation," in ACL, 2002.
  • B. Kaur et al., "Machine Translation: An Analytical Study," International Journal of Engineering Research and Applications, pp. 168-175, 2014.
  • Y. Al-Onaizan et al., "Statistical Machine Translation," 1999.
  • K. Cho et al., "Learning Phrase Representations using RNN Encoder–Decoder for Statistical Machine Translation," 2014.
  • D. Bahdanau, K. Cho and Y. Bengio, "Neural Machine Translation by Jointly Learning to Align and Translate," 2016.
  • R. Dale, H. Moisl and H. Somers, Handbook of Natural Language Processing, New York: Marcel Dekker, 2000.
  • L. Schubert, The Stanford Encyclopedia of Philosophy, Metaphysics Research Lab, Stanford University, 2015.
  • E. Greenstein and D. Penner, Japanese-to-English Machine Translation Using Recurrent Neural Networks, Stanford University, 2015.

