A Cross-Case Analysis of Possible Facial Emotion Extraction Methods that Could Be Used in Second Life - Pre-Experimental Work
This research-in-brief compares, based on documentation and website information, findings on three different facial emotion extraction methods and puts forward possibilities for implementing them in Second Life. The motivation for the research stemmed from a literature review indicating that current virtual communication tools did not satisfy users. The review showed that people preferred real-life-like communication in virtual environments because of its higher immersion and better user experience. The research identified three methods for creating avatar facial expressiveness through facial emotion extraction:
- Extracting emotion from users' text and applying it to avatar facial features in real time
- Using Microsoft's Kinect technology to capture users' facial motion and applying it to their avatars in real time
- Extracting emotion from video of users' facial expressions captured via webcam in real time
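To illustrate the first method, a minimal sketch of text-based emotion extraction is shown below. It uses a tiny keyword lexicon mapped to basic emotion labels; the lexicon, emotion set, and function name are illustrative assumptions for this sketch, not the actual system analyzed in the study.

```python
# Illustrative sketch (not the studied system): map keywords in user chat
# text to basic emotion labels that could drive avatar facial features.

# Assumed toy lexicon; a real system would use a far larger resource.
EMOTION_LEXICON = {
    "happy": "joy", "glad": "joy", "great": "joy",
    "sad": "sadness", "sorry": "sadness",
    "angry": "anger", "hate": "anger",
    "scared": "fear", "afraid": "fear",
}

def extract_emotion(text: str) -> str:
    """Return the emotion whose keywords appear most often, or 'neutral'."""
    counts: dict[str, int] = {}
    for word in text.lower().split():
        word = word.strip(".,!?")  # drop trailing punctuation
        emotion = EMOTION_LEXICON.get(word)
        if emotion:
            counts[emotion] = counts.get(emotion, 0) + 1
    return max(counts, key=counts.get) if counts else "neutral"
```

In a chat-driven virtual world, a label produced this way could be sent alongside each message to trigger the corresponding avatar facial animation.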
This research analyzed the three methods in terms of implementation, integration, and feasibility in Second Life.
This work is licensed under a Creative Commons Attribution 3.0 License.
The full website for the Journal of Virtual Worlds Research can be found at: http://jvwresearch.org