The different styles of human-computer interaction (HCI) are: Visual-Based HCI, Audio-Based HCI, and Sensor-Based HCI.
i. Visual-Based HCI:
a) Body-movement tracking and gesture recognition are usually the main focus of this area; they are mostly used for direct human-computer interaction in command-and-action scenarios (a minimal detection sketch follows this list).
b) Gaze detection is mostly used to better understand a user's attention, intent, or focus in context-sensitive situations.
c) Lip reading is known to be an important aid for correcting speech-recognition errors.
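As a concrete illustration of this category, the following is a minimal sketch, assuming the OpenCV Python bindings (opencv-python) and a webcam at index 0, that detects a face in each camera frame using the library's bundled Haar cascade; gesture and gaze pipelines typically start from this kind of per-frame visual detection:

```python
import cv2

# Load OpenCV's bundled frontal-face Haar cascade as a stand-in for the
# visual tracking step that gesture/gaze systems build on.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(0)  # default webcam (assumed index)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Detect faces in the current frame and draw a box around each one.
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("visual-based HCI demo", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press 'q' to quit
        break
cap.release()
cv2.destroyAllWindows()
```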
ii. Audio-Based HCI:
a) Speech recognition and speaker recognition help in the analysis of emotions carried in audio signals.
b) Besides the tone and pitch of speech data, typical human auditory signs such as sighs and gasps help emotion analysis in designing more intelligent HCI systems (a minimal pitch-estimation sketch follows this list).
c) Music generation and interaction can be applied in the art industry.
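For the pitch feature mentioned in (b), the sketch below shows one simple, self-contained way to estimate the fundamental frequency of a short speech frame via autocorrelation; it uses only NumPy, and the 50-400 Hz search range and the synthetic 220 Hz test tone are assumptions made purely for illustration:

```python
import numpy as np

def estimate_pitch(signal, sample_rate, fmin=50.0, fmax=400.0):
    """Rough fundamental-frequency (pitch) estimate via autocorrelation."""
    signal = signal - np.mean(signal)            # remove DC offset
    corr = np.correlate(signal, signal, mode="full")
    corr = corr[len(corr) // 2:]                 # keep non-negative lags only
    lag_min = int(sample_rate / fmax)            # shortest plausible period
    lag_max = int(sample_rate / fmin)            # longest plausible period
    lag = lag_min + np.argmax(corr[lag_min:lag_max])
    return sample_rate / lag

# Synthetic example: a 220 Hz tone standing in for a voiced speech frame.
sr = 16000
t = np.arange(0, 0.05, 1.0 / sr)
frame = np.sin(2 * np.pi * 220 * t)
print(f"estimated pitch: {estimate_pitch(frame, sr):.1f} Hz")
```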
iii. Sensor-Based HCI:
a) A physical sensor is placed between the user and the machine to provide the interaction.
b) The following sensors can be used: Pen-Based Interaction, Mouse & Keyboard, Joysticks, Motion Tracking Sensors and Digitizers, Haptic Sensors, Pressure Sensors, and Taste/Smell Sensors (a minimal mouse-and-keyboard event sketch follows this list).
c) Pen-based sensors are of specific interest in mobile devices and are related to the areas of pen-gesture and handwriting recognition.
d) Motion-tracking sensors/digitizers are state-of-the-art technologies that revolutionized the movie, animation, art, and video-game industries. They come in the form of wearable cloth or joint sensors, and they have made computers far better able to interact with the physical world and humans able to create their worlds virtually.
e) Haptic and pressure sensors are used for applications in robotics, virtual reality, and medical surgery.
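As a minimal sketch of sensor-based interaction through the mouse and keyboard listed in (b), the following uses Python's standard tkinter toolkit to react to pointer-motion and key-press events delivered by the operating system; the window layout and label text are illustrative assumptions only:

```python
import tkinter as tk

# The OS delivers mouse and keyboard sensor events; the application reacts.
root = tk.Tk()
root.title("sensor-based HCI demo")
label = tk.Label(root, text="Move the mouse or press a key", width=40, height=5)
label.pack()

def on_motion(event):
    # Pointer position reported by the mouse.
    label.config(text=f"mouse at ({event.x}, {event.y})")

def on_key(event):
    # Key press reported by the keyboard.
    label.config(text=f"key pressed: {event.keysym}")

root.bind("<Motion>", on_motion)
root.bind("<Key>", on_key)
root.mainloop()
```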