14-08-2011, 04:15 PM
i want more ideas about this topic and circuits and all resources for this topic
16-08-2011, 11:20 AM
To get more information about the topic "speaking microcontroller for deaf and dumb", please refer to the link below:
https://seminarproject.net/Thread-speaki...f-and-dumb
22-05-2012, 06:02 PM
Please send the full details about the speaking microcontroller for deaf and dumb.
I want to implement this in my college, so please also send the list of components used for this project as soon as possible.
09-01-2017, 06:49 PM
Please send the PPT and seminar report of the speaking microcontroller for deaf and dumb.
This is Susheela.
07-02-2017, 04:17 PM
The microcontroller-based speech device for the deaf and dumb is designed to announce messages that are pre-loaded into it. It is built around a microcontroller and produces alert sounds in response to a hand-gesture sensor; each gesture maps to a predefined message, such as asking for water or the washroom. When the person makes a predefined gesture (for example, the sign for water), the device plays the matching voice message at a set output volume.

The microcontroller is the heart of the device: it stores the messages corresponding to the person's needs and retrieves the stored data each time the device is used. The device thus helps deaf and dumb users announce their needs so that people nearby can understand and assist them, saving time and easing communication. Its main advantage is voice-based announcement: the user gets a voice that pronounces their need as and when required.

"Speech" and "gestures" are expressions used primarily in communication between humans, and learning to use them begins in the first years of life; in human communication, speech and gestures are completely coordinated. Machine gesture and sign-language recognition is concerned with recognising gestures and sign language, for example using gloves. Various hardware techniques are used to gather information about body positioning: typically image-based (using cameras, moving lights, etc.) or device-based (using instrumented gloves, position trackers, etc.), although hybrid approaches are beginning to appear. However, acquiring the data is only the first step. The second step, recognising the sign or gesture once it has been captured, is much more challenging, especially in a continuous stream; this is currently the focus of research. The first commercial gesture-technology product for general users was launched in 2003.
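The gesture-to-message mapping described above can be sketched as a simple lookup: the microcontroller reads a gesture code from the sensor and selects the corresponding pre-loaded message for playback. This is a minimal illustrative sketch only; the gesture codes, message texts, and function names are assumptions, and a real device would read the sensor hardware and drive a voice-playback IC instead of printing.

```python
# Illustrative sketch (not the project's actual firmware): map assumed
# gesture codes from a hand-gesture sensor to predefined voice messages.
GESTURE_MESSAGES = {
    1: "I need water",
    2: "I need to use the washroom",
    3: "I need food",
}

def gesture_to_message(code):
    """Return the predefined message for a gesture code, or None if the
    gesture is not one of the pre-loaded signs."""
    return GESTURE_MESSAGES.get(code)

if __name__ == "__main__":
    # Simulate the device loop: a recognised gesture plays its message,
    # an unknown gesture is ignored with a fallback notice.
    for code in (1, 2, 99):
        msg = gesture_to_message(code)
        print(msg if msg is not None else "Unrecognised gesture")
```

The key design point mirrored here is that the device only handles the predefined gestures stored in memory; anything outside that set produces no valid message.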