Research Publication
A Crowdsourceable QoE Evaluation Framework for Multimedia Content
(NOTE: Sheng-Wei Chen is also known as Kuan-Ta Chen.)

Abstract
Until recently, QoE (Quality of Experience) experiments had to be conducted in academic laboratories; however, with the advent of ubiquitous Internet access, it is now possible to ask an Internet crowd to conduct experiments on their personal computers. Since such a crowd can be quite large, crowdsourcing enables researchers to conduct experiments with a more diverse set of participants at a lower economic cost than would be possible under laboratory conditions. However, because participants carry out experiments without supervision, they may give erroneous feedback perfunctorily, carelessly, or dishonestly, even if they receive a reward for each experiment.

In this paper, we propose a crowdsourceable framework to quantify the QoE of multimedia content. The advantages of our framework over traditional MOS ratings are: 1) it enables crowdsourcing because it supports systematic verification of participants' inputs; 2) the rating procedure is simpler than that of MOS, so there is less burden on participants; and 3) it derives interval-scale scores that enable subsequent quantitative analysis and QoE provisioning. We conducted four case studies, which demonstrated that, with our framework, researchers can outsource their QoE evaluation experiments to an Internet crowd without risking the quality of the results; and at the same time, obtain a higher level of participant diversity at a lower monetary cost.
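The abstract states that the framework derives interval-scale scores from participants' inputs, but does not spell out the conversion in this excerpt. As an illustration only, the sketch below assumes the common approach of collecting pairwise preference votes and fitting a Bradley-Terry model, whose log-strengths form an interval scale (differences between scores are meaningful, the zero point is arbitrary); the function name and data are hypothetical.

```python
import math

def bradley_terry(wins, n_items, iters=200):
    """Estimate Bradley-Terry strengths from a pairwise win matrix.

    wins[i][j] = number of times stimulus i was preferred over stimulus j.
    Returns log-strengths, which form an interval scale. Fitted with the
    standard minorization-maximization (MM) update.
    """
    p = [1.0] * n_items
    for _ in range(iters):
        new_p = []
        for i in range(n_items):
            # total wins of i
            num = sum(wins[i][j] for j in range(n_items) if j != i)
            # comparisons involving i, weighted by current strengths
            den = sum(
                (wins[i][j] + wins[j][i]) / (p[i] + p[j])
                for j in range(n_items) if j != i
            )
            new_p.append(num / den if den > 0 else p[i])
        s = sum(new_p)
        p = [v / s for v in new_p]  # normalize away the scale ambiguity
    return [math.log(v) for v in p]

# Hypothetical votes for three stimuli: 0 is preferred most, 2 least.
wins = [
    [0, 8, 9],
    [2, 0, 6],
    [1, 4, 0],
]
scores = bradley_terry(wins, 3)
print(scores)  # ordering: scores[0] > scores[1] > scores[2]
```

Because the scores are on an interval scale, score differences between pairs of stimuli can be compared directly, which is what enables the quantitative analysis and QoE provisioning mentioned above.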


Materials
Citation
Kuan-Ta Chen, Chen-Chi Wu, Yu-Chun Chang, and Chin-Laung Lei, "A Crowdsourceable QoE Evaluation Framework for Multimedia Content," in Proceedings of ACM Multimedia 2009.

BibTeX
@INPROCEEDINGS{chen09:crowdsourcing,
  TITLE      = {A Crowdsourceable QoE Evaluation Framework for Multimedia Content},
  AUTHOR     = {Kuan-Ta Chen and Chen-Chi Wu and Yu-Chun Chang and Chin-Laung Lei},
  BOOKTITLE  = {Proceedings of ACM Multimedia 2009},
  YEAR       = {2009}
}