Research Papers - Dept of Computer Systems Engineering

Permanent URI for this collection https://rda.sliit.lk/handle/123456789/1253

  • Publication (Embargo)
    Recognition and translation of Ancient Brahmi Letters using deep learning and NLP
    (IEEE, 2019-12) Wijerathna, K. A. S. A. N; Sepalitha, R; Thuiyadura, I; Athauda, H; Suranjini, P. D; Silva, J. A. D. C; Jayakodi, A
    Inscriptions are a major resource for studying the ancient history and culture of any country's civilization. Analyzing, recognizing, and translating the ancient (Brahmi) letters on inscriptions is very difficult for the present generation, and no automatic system exists for translating Brahmi letters into the Sinhala language. Instead, epigraphers rely on a manual method that takes a long time to decipher, analyze, and translate the inscribed text. This research focuses mainly on the recognition of ancient Brahmi characters written between the 3rd century B.C. and the 1st century A.D. First, we remove noise, segment the letters from the inscription image, and convert it into a binary image using image-processing techniques. Secondly, we recognize the correct Brahmi letters, including broken letters, and identify the time period of the inscription using Convolutional Neural Networks in deep learning. Finally, the Brahmi letters are translated into modern Sinhala letters, and the meaning of the inscription is provided using Natural Language Processing. The proposed system offers a solution to these existing problems in epigraphy.
  • Publication (Open Access)
    Bidirectional LSTM-CRF for Named Entity Recognition
    (32nd Pacific Asia Conference on Language, Information and Computation, 2018-12-01) Panchendrarajan, R; Amaresan, A
    Named Entity Recognition (NER) is a challenging sequence-labeling task that requires a deep understanding of the orthographic and distributional representation of words. In this paper, we propose a novel neural architecture that benefits from word- and character-level information and from dependencies across adjacent labels. The model combines a bidirectional LSTM (BI-LSTM) with a bidirectional Conditional Random Field (BI-CRF) layer. Our work is the first to experiment with a BI-CRF in neural architectures for sequence labeling. We show that a CRF can be extended to capture dependencies between labels in both the left and right directions of the sequence; we refer to this variation of the CRF as BI-CRF. Our results show that BI-CRF improves the performance of the NER model compared to a unidirectional CRF, and that the backward CRF captures the most difficult entities better than the forward CRF. Our system is competitive on the CoNLL-2003 English dataset and outperforms most existing approaches that do not use any external labeled data.
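The first abstract's preprocessing stage (noise removal and conversion to a binary image) is not detailed in the paper; as a minimal illustrative sketch, the binarization step can be done with a global threshold in NumPy. Otsu's threshold-selection method is used here as an assumption, since the abstract does not name a specific technique:

```python
import numpy as np

def binarize(gray, threshold=None):
    """Binarize a grayscale inscription image.

    Pixels darker than the threshold (the inscribed strokes) become 1,
    the lighter background becomes 0. When no threshold is given, an
    Otsu-style search picks the one maximising between-class variance.
    """
    gray = np.asarray(gray, dtype=np.float64)
    if threshold is None:
        best_t, best_var = 0.0, -1.0
        for t in np.unique(gray):
            fg = gray[gray <= t]          # candidate foreground (dark)
            bg = gray[gray > t]           # candidate background (light)
            if fg.size == 0 or bg.size == 0:
                continue
            w0, w1 = fg.size / gray.size, bg.size / gray.size
            var = w0 * w1 * (fg.mean() - bg.mean()) ** 2
            if var > best_var:
                best_var, best_t = var, t
        threshold = best_t
    return (gray <= threshold).astype(np.uint8)
```

In a full pipeline the binary image would then be segmented into individual letters before being fed to the CNN classifier; that step, and the CNN itself, are beyond this sketch.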
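The CRF layer in the second abstract assigns a label sequence by combining per-token emission scores (from the BiLSTM) with label-transition scores, decoded with the Viterbi algorithm. The following is a minimal sketch of decoding for a unidirectional (forward) linear-chain CRF, the baseline the paper compares against; the paper's BI-CRF would add a second, right-to-left transition matrix. All names here are illustrative, not the authors' code:

```python
import numpy as np

def viterbi(emissions, transitions):
    """Viterbi decoding for a forward linear-chain CRF layer.

    emissions:   (T, K) per-token label scores, e.g. from a BiLSTM.
    transitions: (K, K) score of moving from label i to label j
                 (left-to-right only in this unidirectional sketch).
    Returns the highest-scoring label sequence as a list of indices.
    """
    T, K = emissions.shape
    score = emissions[0].copy()           # best score ending in each label
    back = np.zeros((T, K), dtype=int)    # backpointers
    for t in range(1, T):
        # cand[i, j]: best path ending in label i, then moving to j
        cand = score[:, None] + transitions + emissions[t][None, :]
        back[t] = cand.argmax(axis=0)
        score = cand.max(axis=0)
    # follow backpointers from the best final label
    path = [int(score.argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]
```

A strongly negative off-diagonal transition score, for example, makes the decoder prefer staying in one label even when emissions disagree, which is how the CRF layer enforces label-sequence consistency on top of the BiLSTM's per-token predictions.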