Crowdsourcing 2.0: Enhancing Execution Speed and Reliability of Web-based QoE Testing

IEEE International Conference on Communications (ICC), Sydney, Australia, June 2014

Abstract

Since its introduction a few years ago, the concept of 'Crowdsourcing' has been heralded as a highly attractive alternative approach towards evaluating the Quality of Experience (QoE) of networked multimedia services. The main reason is that, in comparison to traditional laboratory-based subjective quality testing, crowd-based QoE assessment over the Internet promises to be not only much more cost-effective (no lab facilities required, lower cost per subject) but also much faster in terms of shorter campaign setup and turnaround times. However, the reliability of remote test subjects, and consequently the trustworthiness of study results, is still an issue that prevents the widespread adoption of crowd-based QoE testing. Various ideas for improving user rating reliability and test efficiency have been proposed, the majority of them relying on a posteriori analysis of results. However, such methods introduce a major lag that significantly reduces the efficiency of campaign execution. In this paper, we address these shortcomings by introducing in momento methods for crowdsourced video QoE assessment, which improve result reliability by a factor of two and campaign execution efficiency by a factor of ten. The proposed in momento methods are applicable to existing crowd-based QoE testing approaches and suitable for a variety of service scenarios.
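To illustrate the contrast the abstract draws between a posteriori filtering and in momento checks, the following is a minimal sketch (not the paper's actual implementation; the function name, threshold, and gold-question mechanism are assumptions): a worker's answers to embedded gold-standard questions are checked while the test is still running, so an unreliable worker can be rejected mid-campaign rather than discarded in post-hoc analysis.

```python
# Hypothetical sketch of an "in momento" reliability gate: screen out
# unreliable crowd workers during the test, instead of filtering their
# ratings after the campaign has finished.

def in_momento_gate(answers, gold, max_errors=1):
    """Abort early once a worker exceeds the allowed gold-question errors.

    answers: mapping question_id -> worker's answer (in submission order)
    gold:    mapping question_id -> known correct answer
    Returns (accepted, questions_evaluated).
    """
    errors = 0
    evaluated = 0
    for qid, answer in answers.items():
        evaluated += 1
        if qid in gold and answer != gold[qid]:
            errors += 1
            if errors > max_errors:
                # Reject mid-test: no need to wait for campaign end.
                return False, evaluated
    return True, evaluated

# Example: the worker misses two gold questions ("g2", "g3") and is
# rejected after the fifth submitted answer, before finishing the test.
gold = {"g1": "A", "g2": "B", "g3": "C"}
answers = {"q1": "D", "g1": "A", "q2": "E", "g2": "X", "g3": "Y", "q3": "F"}
print(in_momento_gate(answers, gold))  # (False, 5)
```

The efficiency gain comes from the early return: a rejected worker stops consuming campaign budget immediately, whereas a posteriori filtering pays for the full test before discarding it.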
