{"id":3356,"date":"2023-02-20T02:39:38","date_gmt":"2023-02-20T02:39:38","guid":{"rendered":"https:\/\/achrafothman.net\/site\/?p=3356"},"modified":"2023-02-22T02:42:17","modified_gmt":"2023-02-22T02:42:17","slug":"new-dataset-jumla-qsl-22-a-dataset-of-qatari-sign-language-sentences","status":"publish","type":"post","link":"https:\/\/achrafothman.net\/site\/new-dataset-jumla-qsl-22-a-dataset-of-qatari-sign-language-sentences\/","title":{"rendered":"New dataset: JUMLA-QSL-22: A DATASET OF QATARI SIGN LANGUAGE SENTENCES"},"content":{"rendered":"<p><strong>ABSTRACT<\/strong><br \/>\nSign languages are the most common mode of communication with and between hearing-impaired individuals. In the Arab world, Arabic sign language is used in different dialects, each with its own set of rules governing the gestures used. As research on natural language processing advances, models have been developed to translate sign language to spoken language and vice versa. However, Arabic sign language has rarely been studied, largely because of the scarcity of available datasets.<\/p>\n<p>The aim of this project is to improve accessibility for hearing-impaired individuals by bridging the communication gap using the Jumla dataset. The dataset supplies a large sample of Arabic sign language in the Qatari dialect, comprising 6300 records collected over a period of five months. Seven participants took part in the study: five hearing-impaired individuals and two sign language interpreters. Each participant was given one sentence at a time from a list of 900 sentences, and videos of them signing it in Qatari sign language were recorded. 
The videos were recorded from four angles (front, left side, right side, and top view) using four true-depth cameras.<\/p>\n<p><a href=\"https:\/\/ieee-dataport.org\/open-access\/jumla-qsl-22-dataset-qatari-sign-language-sentences\">IEEE Dataport link to download<\/a><\/p>\n<p><strong>Instructions:<\/strong><br \/>\nThe dataset has 7 folders, one for each participant, with each folder containing 900 sub-folders and a *.csv file. Each sub-folder stores the output for one signed sentence: four videos from four different angles. The details of the dataset content are described below:<\/p>\n<p>i. Participant_YY folder: There are 7 such folders, one for each participant.<\/p>\n<p>ii. JUMLA-QSL-22-Participant_YY.csv: Each participant has a .csv file that contains the Arabic sentences they signed and their respective codes.<\/p>\n<p>iii. f_YYxxx: Within each folder, there are 900 sub-folders labelled with the sentence code. For example, the folder \u2018f_AT374\u2019 corresponds to the code \u2018374\u2019 in JUMLA-QSL-22-Participant_AT.csv, which refers to the sentence \u2018\u064a\u0648\u0645 \u0627\u0644\u062b\u0644\u0627\u062b\u0627\u0621\u2019 (\u2018Tuesday\u2019); hence, this folder contains the videos of the participant signing that sentence. Each sub-folder contains four files representing the four views recorded of the participant signing the sentence.<\/p>\n<p>1. recF.svo: This video file contains the front angle of the participant signing. The RGB video was recorded with a frame rate of 60 fps and a resolution of 2560 x 720 pixels.<\/p>\n<p>2. recT.svo: This video file contains the top angle of the participant signing. The RGB video was recorded with a frame rate of 60 fps and a resolution of 2560 x 720 pixels.<\/p>\n<p>3. recL.bag: This video file contains the left angle of the participant signing. 
The RGB video was recorded with a frame rate of 60 fps and a resolution of 640 x 480 pixels.<\/p>\n<p>4. recR.bag: This video file contains the right angle of the participant signing. The RGB video was recorded with a frame rate of 60 fps and a resolution of 640 x 480 pixels.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>ABSTRACT Sign languages are the most common mode of communication with and between hearing-impaired individuals. In the Arab world, Arabic sign language is used with different dialects supporting a distinct set of rules for the gestures used. With research on natural language processing advancing, models have been developed to translate sign language to spoken language<\/p>\n","protected":false},"author":1,"featured_media":3357,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_uag_custom_page_level_css":""},"categories":[174,6],"tags":[185,182,184,183,38],"aioseo_notices":[],"jetpack_featured_media_url":"https:\/\/i0.wp.com\/achrafothman.net\/site\/wp-content\/uploads\/dataset.jpg?fit=1288%2C969&ssl=1","uagb_featured_image_src":{"full":["https:\/\/i0.wp.com\/achrafothman.net\/site\/wp-content\/uploads\/dataset.jpg?fit=1288%2C969&ssl=1",1288,969,false],"thumbnail":["https:\/\/i0.wp.com\/achrafothman.net\/site\/wp-content\/uploads\/dataset.jpg?resize=150%2C150&ssl=1",150,150,true],"medium":["https:\/\/i0.wp.com\/achrafothman.net\/site\/wp-content\/uploads\/dataset.jpg?fit=300%2C226&ssl=1",300,226,true],"medium_large":["https:\/\/i0.wp.com\/achrafothman.net\/site\/wp-content\/uploads\/dataset.jpg?fit=768%2C578&ssl=1",768,578,true],"large":["https:\/\/i0.wp.com\/achrafothman.net\/site\/wp-content\/uploads\/dataset.jpg?fit=770%2C579&ssl=1",770,579,true],"1536x1536":["https:\/\/i0.wp.com\/achrafothman.net\/site\/wp-content\/uploads\/dataset.jpg?fit=1288%2C969&ssl=1",1288,969,true],"2048x2048":["https:\/\/i0.wp.com\/achrafothman.net\/site\/wp-content\/uploads\/dataset.jpg?fit=1288%2C969&ssl=1",12
88,969,true],"post-thumbnail":["https:\/\/i0.wp.com\/achrafothman.net\/site\/wp-content\/uploads\/dataset.jpg?resize=270%2C180&ssl=1",270,180,true],"contentberg-main":["https:\/\/i0.wp.com\/achrafothman.net\/site\/wp-content\/uploads\/dataset.jpg?resize=770%2C515&ssl=1",770,515,true],"contentberg-main-full":["https:\/\/i0.wp.com\/achrafothman.net\/site\/wp-content\/uploads\/dataset.jpg?resize=1170%2C508&ssl=1",1170,508,true],"contentberg-slider-stylish":["https:\/\/i0.wp.com\/achrafothman.net\/site\/wp-content\/uploads\/dataset.jpg?resize=900%2C515&ssl=1",900,515,true],"contentberg-slider-carousel":["https:\/\/i0.wp.com\/achrafothman.net\/site\/wp-content\/uploads\/dataset.jpg?resize=370%2C370&ssl=1",370,370,true],"contentberg-slider-grid-b":["https:\/\/i0.wp.com\/achrafothman.net\/site\/wp-content\/uploads\/dataset.jpg?resize=554%2C466&ssl=1",554,466,true],"contentberg-slider-grid-b-sm":["https:\/\/i0.wp.com\/achrafothman.net\/site\/wp-content\/uploads\/dataset.jpg?resize=306%2C466&ssl=1",306,466,true],"contentberg-slider-bold-sm":["https:\/\/i0.wp.com\/achrafothman.net\/site\/wp-content\/uploads\/dataset.jpg?resize=150%2C150&ssl=1",150,150,true],"contentberg-grid":["https:\/\/i0.wp.com\/achrafothman.net\/site\/wp-content\/uploads\/dataset.jpg?resize=370%2C245&ssl=1",370,245,true],"contentberg-list":["https:\/\/i0.wp.com\/achrafothman.net\/site\/wp-content\/uploads\/dataset.jpg?resize=260%2C200&ssl=1",260,200,true],"contentberg-list-b":["https:\/\/i0.wp.com\/achrafothman.net\/site\/wp-content\/uploads\/dataset.jpg?resize=370%2C305&ssl=1",370,305,true],"contentberg-thumb":["https:\/\/i0.wp.com\/achrafothman.net\/site\/wp-content\/uploads\/dataset.jpg?resize=87%2C67&ssl=1",87,67,true],"contentberg-thumb-alt":["https:\/\/i0.wp.com\/achrafothman.net\/site\/wp-content\/uploads\/dataset.jpg?resize=150%2C150&ssl=1",150,150,true]},"uagb_author_info":{"display_name":"Achraf 
Othman","author_link":"https:\/\/achrafothman.net\/site\/author\/achraf-othman\/"},"uagb_comment_info":0,"uagb_excerpt":"ABSTRACT Sign languages are the most common mode of communication with and between hearing-impaired individuals. In the Arab world, Arabic sign language is used with different dialects supporting a distinct set of rules for the gestures used. With research on natural language processing advancing, models have been developed to translate sign language to spoken language","jetpack_shortlink":"https:\/\/wp.me\/p8KjJN-S8","jetpack-related-posts":[{"id":2006,"url":"https:\/\/achrafothman.net\/site\/qatar-sign-language-avatar-arabic\/","url_meta":{"origin":3356,"position":0},"title":"Meet the First Qatari Sign Language Avatar: A 3D Realistic Virtual Conversational Agent","date":"September 28, 2021","format":false,"excerpt":"Authors: Achraf Othman Research and Innovation Letters \u2022 Volume 1 \u2022 Issue 2 \u2022 September 2021 \u2022 Published: September 28, 2021 \u2022 PDF Abstract- When it comes to inventions and technological assistance for the hearing impaired, science has come a long way, there is absolutely no doubt about that. 
However,\u2026","rel":"","context":"In &quot;Research and Innovation Letters&quot;","img":{"alt_text":"BuHamad Qatari Sign Language Avatar","src":"https:\/\/i0.wp.com\/achrafothman.net\/site\/wp-content\/uploads\/BuHamad-Gif.gif?fit=644%2C480&ssl=1&resize=350%2C200","width":350,"height":200},"classes":[]},{"id":311,"url":"https:\/\/achrafothman.net\/site\/after-great-2018-conference-organized-by-mada-assistive-technology-center-doha-qatar\/","url_meta":{"origin":3356,"position":1},"title":"After GREAT 2018 Conference organized by MADA Assistive Technology Center, Doha, Qatar","date":"May 1, 2018","format":"image","excerpt":"The theme of this year\u2019s conference GREAT'2018 is Inclusive Computer Technology (ICT) and Assistive Technology (AT) with a focus on 4 domains: Education, Independent Living, e-Accessibility and Innovation. Many high-level speakers, attracting delegates representing their institutions are invited to present the latest technologies for people with disabilities. This year\u2019s conference\u2026","rel":"","context":"In &quot;Blog&quot;","img":{"alt_text":"GREAT 2018 CONFERENCE MADA ASSISTIVE TECHNOLOGY CENTER DOHA QATAr","src":"https:\/\/i0.wp.com\/achrafothman.net\/site\/wp-content\/uploads\/20180425_074250_264.jpg?fit=1200%2C800&ssl=1&resize=350%2C200","width":350,"height":200},"classes":[]},{"id":570,"url":"https:\/\/achrafothman.net\/site\/avatar-3d-interpretation-and-its-direct-relation-to-the-deaf-community\/","url_meta":{"origin":3356,"position":2},"title":"Avatar (3D interpretation) and Its Direct Relation to the Deaf Community","date":"June 8, 2020","format":false,"excerpt":"When it comes to inventions and technological assistance for the hearing impaired, science has come a long way, there is absolutely no doubt about that. 
However, we live in a day and age where humans, by default, are genetically engineered to want more and to crave more, which is completely\u2026","rel":"","context":"In &quot;Blog&quot;","img":{"alt_text":"avatar sign language","src":"https:\/\/i0.wp.com\/achrafothman.net\/site\/wp-content\/uploads\/Capture-1.png?fit=803%2C577&ssl=1&resize=350%2C200","width":350,"height":200},"classes":[]},{"id":942,"url":"https:\/\/achrafothman.net\/site\/overview-of-text-to-gloss-in-computational-sign-language-processing-slp\/","url_meta":{"origin":3356,"position":3},"title":"Overview of Text-to-Gloss in Computational Sign Language Processing (SLP)","date":"August 9, 2021","format":false,"excerpt":"Authors: Achraf Othman Research and Innovation Letters \u2022 Volume 1 \u2022 Issue 1 \u2022 August 2021 \u2022 Published: August 9, 2021 \u2022 PDF Abstract- Digital Accessibility to the content in web environments for people with hearing disabilities and with hearing impairment with a low level of literacy is becoming increasingly\u2026","rel":"","context":"In &quot;Research and Innovation Letters&quot;","img":{"alt_text":"Using Gloss in Sign Language","src":"https:\/\/i0.wp.com\/achrafothman.net\/site\/wp-content\/uploads\/signlanguagegloss.png?fit=1024%2C683&ssl=1&resize=350%2C200","width":350,"height":200},"classes":[]},{"id":3705,"url":"https:\/\/achrafothman.net\/site\/unveiling-my-latest-book-sign-language-processing-from-gesture-to-meaning\/","url_meta":{"origin":3356,"position":4},"title":"Unveiling My Latest Book: Sign Language Processing\u2014From Gesture to Meaning","date":"October 1, 2024","format":false,"excerpt":"It is with great excitement that I announce the release of my latest book, Sign Language Processing: From Gesture to Meaning. 
This work is the culmination of years of dedication, research, and a deep commitment to understanding the intricacies of sign languages, an area that bridges language, culture, and technology.\u2026","rel":"","context":"In &quot;Blog&quot;","img":{"alt_text":"Sign Language Processing Springer Book","src":"https:\/\/i0.wp.com\/achrafothman.net\/site\/wp-content\/uploads\/WhatsApp-Image-2024-09-15-at-10.55.50-AM-e1727749850619.jpeg?fit=1183%2C1044&ssl=1&resize=350%2C200","width":350,"height":200},"classes":[]},{"id":175,"url":"https:\/\/achrafothman.net\/site\/virtual-conversation-agent-avatar-for-sign-language\/","url_meta":{"origin":3356,"position":5},"title":"Virtual Conversational Agent (avatar) for Sign Language","date":"July 16, 2017","format":"image","excerpt":"Sign language (SL) was first acknowledged as a separate language only in the 1960s. Similar to spoken language, it evolved from different cultural backgrounds. Every country has its own sign language with various dialects, which are based on different rules than the spoken language. 
Although several websites provide video clips\u2026","rel":"","context":"In &quot;Blog&quot;","img":{"alt_text":"Virtual Conversation Agent (avatar) for Sign Language","src":"https:\/\/i0.wp.com\/achrafothman.net\/site\/wp-content\/uploads\/Capture.png?fit=848%2C514&ssl=1&resize=350%2C200","width":350,"height":200},"classes":[]}],"_links":{"self":[{"href":"https:\/\/achrafothman.net\/site\/wp-json\/wp\/v2\/posts\/3356"}],"collection":[{"href":"https:\/\/achrafothman.net\/site\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/achrafothman.net\/site\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/achrafothman.net\/site\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/achrafothman.net\/site\/wp-json\/wp\/v2\/comments?post=3356"}],"version-history":[{"count":1,"href":"https:\/\/achrafothman.net\/site\/wp-json\/wp\/v2\/posts\/3356\/revisions"}],"predecessor-version":[{"id":3358,"href":"https:\/\/achrafothman.net\/site\/wp-json\/wp\/v2\/posts\/3356\/revisions\/3358"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/achrafothman.net\/site\/wp-json\/wp\/v2\/media\/3357"}],"wp:attachment":[{"href":"https:\/\/achrafothman.net\/site\/wp-json\/wp\/v2\/media?parent=3356"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/achrafothman.net\/site\/wp-json\/wp\/v2\/categories?post=3356"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/achrafothman.net\/site\/wp-json\/wp\/v2\/tags?post=3356"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}