Show simple item record

dc.contributor.author: Prashant Katiyar, 19SCSE1180072
dc.contributor.author: Kumar Skand Kartik, 19SCSE1180062
dc.date.accessioned: 2024-09-18T06:11:12Z
dc.date.available: 2024-09-18T06:11:12Z
dc.date.issued: 2022-12
dc.identifier.uri: http://10.10.11.6/handle/1/18089
dc.description: School of Computing Science and Engineering, Department of Computer Science and Engineering, Galgotias University, Greater Noida, India
dc.description.abstract: People with speech disabilities communicate in sign language and therefore have difficulty interacting with those who do not sign. There is a need for an interpretation system that can act as a bridge between them and people unfamiliar with their sign language. A functional, unobtrusive Indian Sign Language recognition system was implemented and tested on real-world data. A vocabulary of 26 symbols was collected, consisting mostly of two-handed signs drawn from a wide repertoire of technical and everyday words. The project aims to create a computer application and train a model that, when shown real-time video of Indian Sign Language hand gestures, displays the output for each sign as text on the screen.
dc.language.iso: en_US
dc.publisher: Galgotias University
dc.subject: Human Action Recognition
dc.subject: Using Machine Learning
dc.title: Human Action Recognition Using Machine Learning
dc.type: Technical Report
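The abstract describes a pipeline that maps a video frame of a hand gesture to one of 26 sign labels and prints it as text. The report itself does not include code, so the sketch below is a hypothetical illustration: the 64x64 frame size, the flatten-and-score preprocessing, and the linear classifier standing in for the trained model are all assumptions, not the authors' implementation.

```python
# Hypothetical sketch of frame -> sign-label classification; the model,
# frame size, and preprocessing are illustrative assumptions only.
LABELS = [chr(ord("A") + i) for i in range(26)]  # the 26-symbol vocabulary


def preprocess(frame):
    """Scale an 8-bit grayscale frame (2-D list of 0-255 ints) to [0, 1] and flatten it."""
    return [px / 255.0 for row in frame for px in row]


def classify(frame, weights):
    """Stand-in for the trained model: linear scoring, argmax over 26 logits.

    weights: one row of 26 coefficients per flattened pixel.
    """
    x = preprocess(frame)
    logits = [sum(xi * w[j] for xi, w in zip(x, weights)) for j in range(26)]
    return LABELS[max(range(26), key=lambda j: logits[j])]


# Toy demo: a 4x4 frame and weights rigged so the top logit is index 0 ("A").
frame = [[0] * 4 for _ in range(4)]
frame[0][0] = 255
weights = [[0.0] * 26 for _ in range(16)]
weights[0][0] = 1.0
print(classify(frame, weights))  # prints "A"
```

In a real-time application, the frame would come from a webcam capture loop and the scoring step would be replaced by the trained recognition model; the surrounding structure (capture, preprocess, classify, display text) stays the same.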

