{"id":175,"date":"2017-07-16T08:12:27","date_gmt":"2017-07-16T08:12:27","guid":{"rendered":"http:\/\/achrafothman.net\/site\/?p=175"},"modified":"2021-10-19T07:50:25","modified_gmt":"2021-10-19T07:50:25","slug":"virtual-conversation-agent-avatar-for-sign-language","status":"publish","type":"post","link":"https:\/\/achrafothman.net\/site\/virtual-conversation-agent-avatar-for-sign-language\/","title":{"rendered":"Virtual Conversational Agent (avatar) for Sign Language"},"content":{"rendered":"<p>Sign language (SL) was first acknowledged as a separate language only in the 1960s. Similar to spoken language, it evolved from different cultural backgrounds. Every country has its own sign language with various dialects, which are based on different rules than the spoken language.<\/p>\n<p><!--more--><\/p>\n<p>Although several websites provide video clips in which sign language interpreters translate the text, much Internet content remains cryptic for the deaf community. To inform deaf people quickly in cases where there is no interpreter on hand, researchers are working on a novel approach to provide content. Their idea: avatars. These animated characters could be used in the context of announcements at train stations, or on websites.<\/p>\n<p>Virtual Conversation Agent (avatar) are a technology for displaying signed conversation without the necessity of displaying a video of a human signer. Instead, the systems use 3D animated models, which can be stored more efficiently than video. The avatar can produce movement of the fingers, hands, facial gestures for facial expressions (happiness, surprise etc.), body movements, and co-signs, in which two different words or ideas are signed at the same time. The avatar can be programmed to communicate in either a Sign Language (for example American Sign Language (ASL) or French Sign Language (LSF)). 
Advances in computer graphics mean that personal computers and smartphones can now render this animation with much greater clarity than in the past, when transitions between signs were rough and jerky and the hands had to return to a central position between each sign.<\/p>\n<p>To capture the motions of deaf signers, researchers use affordable cameras and sensors of the kind typically used for computer games. A computational method transfers the movements of the entire body onto the avatar. In the long term, the researchers want to build a collection of short sign language sequences that deaf people can use to interact on the web. Alternatively, animations can be created manually using 3D modeling software or a dedicated editor.<\/p>\n<p><u>#1 &#8211; The American Sign Language Avatar Project at DePaul University<\/u><\/p>\n<p>The main goal of this project is to enable automatic translation of English into American Sign Language, the language of the Deaf in North America.<\/p>\n<p>The project\u2019s avatar, named &#8220;Paula&#8221; after DePaul University, can portray all linguistic parameters of ASL and has earned high marks for clarity and naturalness from users fluent in ASL.<\/p>\n<p>In addition to producing signed language, Paula has been used in several related projects, from Deaf education and interpreter training to tutoring caregivers of people who are both deaf and mentally challenged.<\/p>\n<p><u>#2 &#8211; WebSign Project<\/u><\/p>\n<p>WebSign aims to improve deaf people\u2019s access to ICT and to provide tools that increase their autonomy without requiring hearing people to acquire specific skills to communicate with them. The objective of the project is to develop a Web-based interpreter of Sign Language (SL). This tool would enable people who do not know SL to communicate with deaf individuals. 
It would thereby help reduce the language barrier between deaf and hearing people. The secondary objective is to distribute the tool on a non-profit basis to educators, students, users, and researchers, and to issue a call for contributions to support the project, mainly in its deployment phase, and to encourage its wide use by different communities.<\/p>\n<p>WebSign is based on avatar technology (animation in a virtual world). The input of the system is a text in natural language; the output is a real-time, online interpretation in sign language, constructed using a dictionary of words and signs.<\/p>\n<p><strong>If your project is not featured in this article, feel free to contact me.<\/strong><\/p>\n<p><strong><u>Links<\/u><\/strong><\/p>\n<p><a href=\"http:\/\/www.hearingreview.com\/2014\/09\/researchers-create-avatars-use-sign-language\/\">http:\/\/www.hearingreview.com\/2014\/09\/researchers-create-avatars-use-sign-language\/<\/a><\/p>\n<p><a href=\"http:\/\/www.snow.idrc.ocad.ca\/content\/animated-signing-characters-signing-avatars\">http:\/\/www.snow.idrc.ocad.ca\/content\/animated-signing-characters-signing-avatars<\/a><\/p>\n<p><a href=\"http:\/\/asl.cs.depaul.edu\/project_info.html\">http:\/\/asl.cs.depaul.edu\/project_info.html<\/a><\/p>\n<p><a href=\"http:\/\/www.latice.rnu.tn\/websign\">http:\/\/www.latice.rnu.tn\/websign<\/a><\/p>\n<p><strong><u>Bibliography<\/u><\/strong><\/p>\n<p>Wolfe, R., Efthimiou, E., Glauert, J., Hanke, T., McDonald, J., &amp; Schnepp, J., eds. <a href=\"https:\/\/link.springer.com\/article\/10.1007\/s10209-015-0412-5\">Special issue: recent advances in sign language translation and avatar technology<\/a>. Springer International Publishing, 2016.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Sign language (SL) was first acknowledged as a separate language only in the 1960s. 
Similar to spoken language, it evolved from different cultural backgrounds. Every country has its own sign language with various dialects, which are based on different rules than the spoken language.<\/p>\n","protected":false},"author":1,"featured_media":176,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"image","meta":{"_uag_custom_page_level_css":""},"categories":[72],"tags":[60,71,70,69,38],"aioseo_notices":[],"jetpack_featured_media_url":"https:\/\/i0.wp.com\/achrafothman.net\/site\/wp-content\/uploads\/Capture.png?fit=848%2C514&ssl=1","uagb_featured_image_src":{"full":["https:\/\/i0.wp.com\/achrafothman.net\/site\/wp-content\/uploads\/Capture.png?fit=848%2C514&ssl=1",848,514,false],"thumbnail":["https:\/\/i0.wp.com\/achrafothman.net\/site\/wp-content\/uploads\/Capture.png?resize=150%2C150&ssl=1",150,150,true],"medium":["https:\/\/i0.wp.com\/achrafothman.net\/site\/wp-content\/uploads\/Capture.png?fit=300%2C182&ssl=1",300,182,true],"medium_large":["https:\/\/i0.wp.com\/achrafothman.net\/site\/wp-content\/uploads\/Capture.png?fit=768%2C466&ssl=1",768,466,true],"large":["https:\/\/i0.wp.com\/achrafothman.net\/site\/wp-content\/uploads\/Capture.png?fit=770%2C467&ssl=1",770,467,true],"1536x1536":["https:\/\/i0.wp.com\/achrafothman.net\/site\/wp-content\/uploads\/Capture.png?fit=848%2C514&ssl=1",848,514,true],"2048x2048":["https:\/\/i0.wp.com\/achrafothman.net\/site\/wp-content\/uploads\/Capture.png?fit=848%2C514&ssl=1",848,514,true],"post-thumbnail":["https:\/\/i0.wp.com\/achrafothman.net\/site\/wp-content\/uploads\/Capture.png?resize=270%2C164&ssl=1",270,164,true],"contentberg-main":["https:\/\/i0.wp.com\/achrafothman.net\/site\/wp-content\/uploads\/Capture.png?resize=770%2C514&ssl=1",770,514,true],"contentberg-main-full":["https:\/\/i0.wp.com\/achrafothman.net\/site\/wp-content\/uploads\/Capture.png?resize=848%2C508&ssl=1",848,508,true],"contentberg-slider-stylish":["https:\/\/i0.wp.com\/achrafothman.net\/site\/wp-content\/upl
oads\/Capture.png?resize=848%2C514&ssl=1",848,514,true],"contentberg-slider-carousel":["https:\/\/i0.wp.com\/achrafothman.net\/site\/wp-content\/uploads\/Capture.png?resize=370%2C370&ssl=1",370,370,true],"contentberg-slider-grid-b":["https:\/\/i0.wp.com\/achrafothman.net\/site\/wp-content\/uploads\/Capture.png?resize=554%2C466&ssl=1",554,466,true],"contentberg-slider-grid-b-sm":["https:\/\/i0.wp.com\/achrafothman.net\/site\/wp-content\/uploads\/Capture.png?resize=306%2C466&ssl=1",306,466,true],"contentberg-slider-bold-sm":["https:\/\/i0.wp.com\/achrafothman.net\/site\/wp-content\/uploads\/Capture.png?resize=150%2C150&ssl=1",150,150,true],"contentberg-grid":["https:\/\/i0.wp.com\/achrafothman.net\/site\/wp-content\/uploads\/Capture.png?resize=370%2C245&ssl=1",370,245,true],"contentberg-list":["https:\/\/i0.wp.com\/achrafothman.net\/site\/wp-content\/uploads\/Capture.png?resize=260%2C200&ssl=1",260,200,true],"contentberg-list-b":["https:\/\/i0.wp.com\/achrafothman.net\/site\/wp-content\/uploads\/Capture.png?resize=370%2C305&ssl=1",370,305,true],"contentberg-thumb":["https:\/\/i0.wp.com\/achrafothman.net\/site\/wp-content\/uploads\/Capture.png?resize=87%2C67&ssl=1",87,67,true],"contentberg-thumb-alt":["https:\/\/i0.wp.com\/achrafothman.net\/site\/wp-content\/uploads\/Capture.png?resize=150%2C150&ssl=1",150,150,true]},"uagb_author_info":{"display_name":"Achraf Othman","author_link":"https:\/\/achrafothman.net\/site\/author\/achraf-othman\/"},"uagb_comment_info":0,"uagb_excerpt":"Sign language (SL) was first acknowledged as a separate language only in the 1960s. Similar to spoken language, it evolved from different cultural backgrounds. 
Every country has its own sign language with various dialects, which are based on different rules than the spoken language.","jetpack_shortlink":"https:\/\/wp.me\/p8KjJN-2P","jetpack-related-posts":[{"id":2006,"url":"https:\/\/achrafothman.net\/site\/qatar-sign-language-avatar-arabic\/","url_meta":{"origin":175,"position":0},"title":"Meet the First Qatari Sign Language Avatar: A 3D Realistic Virtual Conversational Agent","date":"September 28, 2021","format":false,"excerpt":"Authors: Achraf Othman Research and Innovation Letters \u2022 Volume 1 \u2022 Issue 2 \u2022 September 2021 \u2022 Published: September 28, 2021 \u2022 PDF Abstract- When it comes to inventions and technological assistance for the hearing impaired, science has come a long way, there is absolutely no doubt about that. However,\u2026","rel":"","context":"In &quot;Research and Innovation Letters&quot;","img":{"alt_text":"BuHamad Qatari Sign Language Avatar","src":"https:\/\/i0.wp.com\/achrafothman.net\/site\/wp-content\/uploads\/BuHamad-Gif.gif?fit=644%2C480&ssl=1&resize=350%2C200","width":350,"height":200},"classes":[]},{"id":570,"url":"https:\/\/achrafothman.net\/site\/avatar-3d-interpretation-and-its-direct-relation-to-the-deaf-community\/","url_meta":{"origin":175,"position":1},"title":"Avatar (3D interpretation) and Its Direct Relation to the Deaf Community","date":"June 8, 2020","format":false,"excerpt":"When it comes to inventions and technological assistance for the hearing impaired, science has come a long way, there is absolutely no doubt about that. 
However, we live in a day and age where humans, by default, are genetically engineered to want more and to crave more, which is completely\u2026","rel":"","context":"In &quot;Blog&quot;","img":{"alt_text":"avatar sign language","src":"https:\/\/i0.wp.com\/achrafothman.net\/site\/wp-content\/uploads\/Capture-1.png?fit=803%2C577&ssl=1&resize=350%2C200","width":350,"height":200},"classes":[]},{"id":31,"url":"https:\/\/achrafothman.net\/site\/statistical-sign-language-machine-translation-from-english-written-text-to-american-sign-language-gloss\/","url_meta":{"origin":175,"position":2},"title":"Statistical sign language machine translation: from english written text to american sign language gloss","date":"September 19, 2011","format":"image","excerpt":"This works aims to design a statistical machine translation from English text to American Sign Language (ASL). The system is based on Moses tool with some modifications and the results are synthesized through a 3D avatar for interpretation. First, we translate the input text to gloss, a written form of\u2026","rel":"","context":"In &quot;Blog&quot;","img":{"alt_text":"Statistical Machine Translation for sign language","src":"https:\/\/i0.wp.com\/achrafothman.net\/site\/wp-content\/uploads\/smt.png?fit=768%2C425&ssl=1&resize=350%2C200","width":350,"height":200},"classes":[]},{"id":942,"url":"https:\/\/achrafothman.net\/site\/overview-of-text-to-gloss-in-computational-sign-language-processing-slp\/","url_meta":{"origin":175,"position":3},"title":"Overview of Text-to-Gloss in Computational Sign Language Processing (SLP)","date":"August 9, 2021","format":false,"excerpt":"Authors: Achraf Othman Research and Innovation Letters \u2022 Volume 1 \u2022 Issue 1 \u2022 August 2021 \u2022 Published: August 9, 2021 \u2022 PDF Abstract- Digital Accessibility to the content in web environments for people with hearing disabilities and with hearing impairment with a low level of literacy is becoming 
increasingly\u2026","rel":"","context":"In &quot;Research and Innovation Letters&quot;","img":{"alt_text":"Using Gloss in Sign Language","src":"https:\/\/i0.wp.com\/achrafothman.net\/site\/wp-content\/uploads\/signlanguagegloss.png?fit=1024%2C683&ssl=1&resize=350%2C200","width":350,"height":200},"classes":[]},{"id":201,"url":"https:\/\/achrafothman.net\/site\/leap-motion-sign-language\/","url_meta":{"origin":175,"position":4},"title":"Leap Motion is the Next Step Forward in Sign Language","date":"November 1, 2017","format":"image","excerpt":"When it comes to improved technology and gadgets to help people with various disabilities, science has come a long way and made it possible for over hundreds of thousands of people to lead a better and easier life. From home robots that help you to make a doctor\u2019s appointment to\u2026","rel":"","context":"In &quot;Blog&quot;","img":{"alt_text":"Leap Motion is the Next Step Forward in Sign Language","src":"https:\/\/i0.wp.com\/achrafothman.net\/site\/wp-content\/uploads\/JxZG9xL.jpg?fit=1200%2C675&ssl=1&resize=350%2C200","width":350,"height":200},"classes":[]},{"id":3705,"url":"https:\/\/achrafothman.net\/site\/unveiling-my-latest-book-sign-language-processing-from-gesture-to-meaning\/","url_meta":{"origin":175,"position":5},"title":"Unveiling My Latest Book: Sign Language Processing\u2014From Gesture to Meaning","date":"October 1, 2024","format":false,"excerpt":"It is with great excitement that I announce the release of my latest book, Sign Language Processing: From Gesture to Meaning. 
This work is the culmination of years of dedication, research, and a deep commitment to understanding the intricacies of sign languages, an area that bridges language, culture, and technology.\u2026","rel":"","context":"In &quot;Blog&quot;","img":{"alt_text":"Sign Language Processing Springer Book","src":"https:\/\/i0.wp.com\/achrafothman.net\/site\/wp-content\/uploads\/WhatsApp-Image-2024-09-15-at-10.55.50-AM-e1727749850619.jpeg?fit=1183%2C1044&ssl=1&resize=350%2C200","width":350,"height":200},"classes":[]}],"_links":{"self":[{"href":"https:\/\/achrafothman.net\/site\/wp-json\/wp\/v2\/posts\/175"}],"collection":[{"href":"https:\/\/achrafothman.net\/site\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/achrafothman.net\/site\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/achrafothman.net\/site\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/achrafothman.net\/site\/wp-json\/wp\/v2\/comments?post=175"}],"version-history":[{"count":5,"href":"https:\/\/achrafothman.net\/site\/wp-json\/wp\/v2\/posts\/175\/revisions"}],"predecessor-version":[{"id":181,"href":"https:\/\/achrafothman.net\/site\/wp-json\/wp\/v2\/posts\/175\/revisions\/181"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/achrafothman.net\/site\/wp-json\/wp\/v2\/media\/176"}],"wp:attachment":[{"href":"https:\/\/achrafothman.net\/site\/wp-json\/wp\/v2\/media?parent=175"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/achrafothman.net\/site\/wp-json\/wp\/v2\/categories?post=175"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/achrafothman.net\/site\/wp-json\/wp\/v2\/tags?post=175"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}