{"id":34,"date":"2017-01-19T12:22:40","date_gmt":"2017-01-19T12:22:40","guid":{"rendered":"http:\/\/achrafothman.net\/site\/?page_id=34"},"modified":"2021-04-15T03:18:21","modified_gmt":"2021-04-15T03:18:21","slug":"asl-smt","status":"publish","type":"page","link":"https:\/\/achrafothman.net\/site\/asl-smt\/","title":{"rendered":"Statistical Machine Translation for Sign Language (ASL-SMT)"},"content":{"rendered":"<p>ASL-SML deal with machine translation to sign language. It&nbsp;starts with studying existing systems and issues in order to propose a new model for statistical machine translation from written English text to American Sign Language (English\/ASL). The study covers specificity of Sign Language from different communities and a scope of existing tools and solutions. According to the state of the art, the aims of this paper is to propose a new approach aiming to build artificial corpus using grammatical dependencies rules due to the lack of resources for Sign Language. The parallel corpus was the input of our machine translation that has been used to create the statistical memory translation based on the IBM alignment algorithms. These algorithms have been improved and optimized by integrating Jaro-Winkler distances. Then, based on the constructed translation memory, we have implemented a decoder to translate an English text to the American Sign Language using a new transcription system based on gloss annotation. 
Results were evaluated using the BLEU metric.<\/p>\n<blockquote><p>If you are interested in deploying your own machine translation system, check <a href=\"http:\/\/achrafothman.net\/site\/how-to-install-moses-statistical-machine-translation-in-ubuntu\/\" target=\"_blank\" rel=\"noopener noreferrer\">my tutorial here<\/a> or <a href=\"https:\/\/www.youtube.com\/channel\/UCqXbfAw1pIq0e3Damnf7-dA\" target=\"_blank\" rel=\"noopener noreferrer\">my YouTube channel (TechCarrot)<\/a>.<\/p><\/blockquote>\n<h3 style=\"background-color:#000080;color:white;\"><a href=\"http:\/\/ec2-54-201-65-71.us-west-2.compute.amazonaws.com\/ui.php\" target=\"_blank\" rel=\"noopener\">Link to the live demo<\/a><\/h3>\n<hr>\n<h3><strong>Corpus Sample Data:<\/strong><\/h3>\n<ul>\n<li><a href=\"http:\/\/www.achrafothman.net\/aslsmt\/corpus\/sample-corpus-asl-en.asl\" target=\"_blank\" rel=\"noopener noreferrer\">American Sign Language Data<\/a> (right-click and choose Save As)<\/li>\n<li><a href=\"http:\/\/www.achrafothman.net\/aslsmt\/corpus\/sample-corpus-asl-en.en\" target=\"_blank\" rel=\"noopener noreferrer\">English Data<\/a> (right-click and choose Save As)<\/li>\n<\/ul>\n<p> >> <a href=\"https:\/\/achrafothman.net\/site\/english-asl-gloss-parallel-corpus-2012-aslg-pc12\/\" target=\"_blank\" rel=\"noopener\">Click here to download the full corpus<\/a><\/p>\n<p>If you&#8217;re writing about or working with the corpus, please cite this paper:<\/p>\n<blockquote style=\"font-size: 15px;\"><p>Achraf Othman and Zouhour Tmar. \u201c<a href=\"https:\/\/www.achrafothman.net\/aslsmt\/English-ASL-Gloss-Parallel-Corpus-2012-ASLG-PC12.pdf\" target=\"_blank\" rel=\"noopener\">English-ASL Gloss Parallel Corpus 2012: ASLG-PC12, The Second Release<\/a>\u201d. 
Fourth International Conference On Information and Communication Technology and Accessibility ICTA\u201913, Hammamet, Tunisia, October 24-26, 2013.<\/p><\/blockquote>\n<p style=\"font-size: 13px;background-color: beige;\" xmlns:cc=\"http:\/\/creativecommons.org\/ns#\" xmlns:dct=\"http:\/\/purl.org\/dc\/terms\/\"><a property=\"dct:title\" rel=\"cc:attributionURL\" href=\"https:\/\/achrafothman.net\/site\/english-asl-gloss-parallel-corpus-2012-aslg-pc12\/\">English-ASL Gloss Parallel Corpus 2012: ASLG-PC12<\/a> by <a rel=\"cc:attributionURL dct:creator\" property=\"cc:attributionName\" href=\"https:\/\/achrafothman.net\/\">Dr. Achraf Othman<\/a> is licensed under <a href=\"http:\/\/creativecommons.org\/licenses\/by-nc\/4.0\/?ref=chooser-v1\" target=\"_blank\" rel=\"license noopener noreferrer\" style=\"display:inline-block;\">Attribution-NonCommercial 4.0 International<img decoding=\"async\" style=\"height:22px!important;margin-left:3px;vertical-align:text-bottom;\" src=\"https:\/\/mirrors.creativecommons.org\/presskit\/icons\/cc.svg?ref=chooser-v1\"><img decoding=\"async\" style=\"height:22px!important;margin-left:3px;vertical-align:text-bottom;\" src=\"https:\/\/mirrors.creativecommons.org\/presskit\/icons\/by.svg?ref=chooser-v1\"><img decoding=\"async\" style=\"height:22px!important;margin-left:3px;vertical-align:text-bottom;\" src=\"https:\/\/mirrors.creativecommons.org\/presskit\/icons\/nc.svg?ref=chooser-v1\"><\/a><\/p>\n<hr>\n<h3><strong>Schema Resources for the Gloss Annotation System (XML-Gloss):<\/strong><\/h3>\n<ul>\n<li><a href=\"http:\/\/www.achrafothman.net\/aslsmt\/schemaXML.xsd\" target=\"_blank\" rel=\"noopener noreferrer\">XSD Schema<\/a> (right-click and choose Save As)<\/li>\n<li><a href=\"http:\/\/www.achrafothman.net\/aslsmt\/formexsl.xsl\">XSL Schema<\/a> (right-click and choose Save As)<\/li>\n<\/ul>\n<p>If you&#8217;re using the XML-Gloss annotation system, please cite this paper:<\/p>\n<blockquote style=\"font-size: 15px;\"><p>Achraf Othman, 
Mohamed Jemni, \u201c<a href=\"https:\/\/ieeexplore.ieee.org\/document\/8336054\" target=\"_blank\" rel=\"noopener noreferrer\">An XML-Gloss Annotation System for Sign Language Processing<\/a>\u201d, 6th International Conference On Information and Communication Technology and Accessibility ICTA\u201917, Muscat, Oman, December 19-21, 2017.<\/p><\/blockquote>\n<p>If you\u2019re just writing about this work, please cite this paper as follows:<\/p>\n<blockquote style=\"font-size: 15px;\"><p>Achraf Othman, Mohamed Jemni, \u201c<a href=\"http:\/\/www.achrafothman.net\/aslsmt\/Designing-High-Accuracy-Statistical-Machine-Translation-for-Sign-Language-Using-Parallel-Corpus_-Case-Study-English-and-American-Sign-Language.pdf\" target=\"_blank\" rel=\"noopener noreferrer\">Designing High Accuracy Statistical Machine Translation for Sign Language Using Parallel Corpus\u2014Case study English and American Sign Language<\/a>\u201d, Journal of Information Technology Research, Volume 12, Issue 2, 2019.<\/p><\/blockquote>\n","protected":false},"excerpt":{"rendered":"<p>ASL-SMT deals with machine translation into sign language. It&nbsp;starts by studying existing systems and issues in order to propose a new model for statistical machine translation from written English text to American Sign Language (English\/ASL). The study covers the specificities of Sign Language across different communities and surveys existing tools and solutions. 
According to<\/p>\n","protected":false},"author":1,"featured_media":0,"parent":0,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":{"_uag_custom_page_level_css":""},"aioseo_notices":[],"uagb_featured_image_src":{"full":false,"thumbnail":false,"medium":false,"medium_large":false,"large":false,"1536x1536":false,"2048x2048":false,"post-thumbnail":false,"contentberg-main":false,"contentberg-main-full":false,"contentberg-slider-stylish":false,"contentberg-slider-carousel":false,"contentberg-slider-grid-b":false,"contentberg-slider-grid-b-sm":false,"contentberg-slider-bold-sm":false,"contentberg-grid":false,"contentberg-list":false,"contentberg-list-b":false,"contentberg-thumb":false,"contentberg-thumb-alt":false},"uagb_author_info":{"display_name":"Achraf Othman","author_link":"https:\/\/achrafothman.net\/site\/author\/achraf-othman\/"},"uagb_comment_info":0,"uagb_excerpt":"ASL-SMT deals with machine translation into sign language. It&nbsp;starts by studying existing systems and issues in order to propose a new model for statistical machine translation from written English text to American Sign Language (English\/ASL). The study covers the specificities of Sign Language across different communities and surveys existing tools and solutions. According to","jetpack_shortlink":"https:\/\/wp.me\/P8KjJN-y","jetpack-related-posts":[{"id":718,"url":"https:\/\/achrafothman.net\/site\/english-asl-gloss-parallel-corpus-2012-aslg-pc12\/","url_meta":{"origin":34,"position":0},"title":"English-ASL Gloss Parallel Corpus 2012: ASLG-PC12","date":"April 15, 2021","format":false,"excerpt":"A serious problem facing the community of researchers in the field of sign language is the absence of a large parallel corpus for sign language. The ASLG-PC12 project proposes a rule-based approach for building a big parallel corpus of English written texts and American Sign Language glosses. 
We present a\u2026","rel":"","context":"Similar post","img":{"alt_text":"","src":"","width":0,"height":0},"classes":[]},{"id":40,"url":"https:\/\/achrafothman.net\/site\/list-publications\/","url_meta":{"origin":34,"position":1},"title":"List of Publications","date":"January 19, 2017","format":false,"excerpt":"2021 Achraf Othman, Oussama El Ghoul, \u201cSyntactic and semantic annotation tool for Qatari Sign Language Corpus\u201d, 8th International Conference on Information and Communication Technology and Accessibility ICTA\u201921, December 8-10, 2021 [online]. Mohamed Koutheair Khribi, Achraf Othman, Aljazi Nasser Al Jabor, \u201cFostering ICT accessibility proficiency through Mada ICT-AID Competency Framework\u201d, 8th\u2026","rel":"","context":"Similar post","img":{"alt_text":"","src":"","width":0,"height":0},"classes":[]},{"id":37,"url":"https:\/\/achrafothman.net\/site\/about-me\/","url_meta":{"origin":34,"position":2},"title":"About me","date":"January 19, 2017","format":false,"excerpt":"IEEE Senior Member. Currently, I am the Head of ICT Accessibility Innovation and Research Section at Mada Center, Doha, Qatar. Working on research projects to enable persons with disabilities using innovative technologies and Artificial Intelligence (AI). 
I have more than five years of technical leadership and people management experience and\u2026","rel":"","context":"Similar post","img":{"alt_text":"","src":"https:\/\/i0.wp.com\/achrafothman.net\/site\/wp-content\/uploads\/cropped-achrafothman-sq-300x300.jpg?resize=350%2C200&ssl=1","width":350,"height":200},"classes":[]}],"_links":{"self":[{"href":"https:\/\/achrafothman.net\/site\/wp-json\/wp\/v2\/pages\/34"}],"collection":[{"href":"https:\/\/achrafothman.net\/site\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/achrafothman.net\/site\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/achrafothman.net\/site\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/achrafothman.net\/site\/wp-json\/wp\/v2\/comments?post=34"}],"version-history":[{"count":40,"href":"https:\/\/achrafothman.net\/site\/wp-json\/wp\/v2\/pages\/34\/revisions"}],"predecessor-version":[{"id":733,"href":"https:\/\/achrafothman.net\/site\/wp-json\/wp\/v2\/pages\/34\/revisions\/733"}],"wp:attachment":[{"href":"https:\/\/achrafothman.net\/site\/wp-json\/wp\/v2\/media?parent=34"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}