Continuous subjective quality assessment methods for cloud gaming

In recent years, owing to fast and reliable core networks and the increasing prevalence of cloud infrastructure, more and more services are moving from end-user devices to cloud servers. Cloud gaming is one of these rapidly growing services. In Cloud Gaming (CG), powerful servers render the game scenes and stream the resulting encoded video to thin clients, which decode the incoming video stream and display it to the players. CG offers many advantages for both end users and game developers: gamers no longer need to purchase high-end graphics hardware, and developers no longer have to fear software piracy, since the game engine never leaves the cloud. However, cloud gaming also has its challenges [1]. The two most important are the high bandwidth requirement, which has been addressed by video compression, and network delay [2]. Both inherently impact players' quality of experience (QoE).
The aim of the cloud gaming service, like any other service, is to improve its users' experience. This improvement, however, is not possible without a good measurement system: if it cannot be measured, it cannot be improved. Quality assessment can be performed either subjectively or objectively; this thesis focuses on subjective quality assessment methods. At present, the most common instruments for assessing gaming QoE are questionnaires [3], which suffer from recency and primacy effects. Other methods exist, such as SSCQE [4] and OneClick [5], but they require the subjects' hands, which are occupied during gaming. Although the measurement method proposed in [6] solves the problem of the occupied hands, it distracts gamers from both the rating and the gaming task. In this thesis, we will develop a new subjective measurement method that a) does not suffer from primacy, recency, peak, and forgiveness effects, because gamers rate continuously, b) does not require the gamers' hands for rating, and c) does not distract participants from either the playing or the rating task.
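To make the memory effects above concrete, the following minimal sketch (illustrative only, not the method proposed in this thesis) contrasts a continuous rating, aggregated as the mean of per-second samples, with a hypothetical recency-biased retrospective rating that over-weights the final seconds of a session. The weighting scheme and all numbers are assumptions chosen purely for illustration.

```python
# Illustration of the recency effect in retrospective ratings
# (hypothetical model, not part of the proposed method).

def continuous_score(samples):
    """Mean of per-second continuous ratings on a 1-5 scale."""
    return sum(samples) / len(samples)

def retrospective_score(samples, recency_weight=3.0):
    """Hypothetical recency-biased single rating: later samples weigh more."""
    n = len(samples)
    weights = [1.0 + recency_weight * i / (n - 1) for i in range(n)]
    return sum(w * s for w, s in zip(weights, samples)) / sum(weights)

# 60-second session: good quality throughout, then a stall in the last 10 s.
session = [5.0] * 50 + [2.0] * 10

print(round(continuous_score(session), 2))     # reflects the whole session
print(round(retrospective_score(session), 2))  # pulled down by the ending
```

Under this toy model the continuous score stays close to the true session average, while the retrospective score is noticeably lower because the stall at the end dominates the weighted rating, which is exactly the distortion a continuous method aims to avoid.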


Providing a high quality of experience and meeting the needs of gamers is one of the main promises of the cloud gaming service. Therefore, CG providers, researchers who develop new compression methods for CG, and those who improve network protocols for CG need measurement methods to evaluate user experience. The objective of this thesis is to design a continuous subjective quality assessment method for gaming QoE.


  • Background in Human-Computer Interaction
  • Background in statistics


  • Carsten Griwodz


[1] K. Chen, C. Huang, and C. Hsu, “Cloud gaming onward: research opportunities and outlook,” in 2014 IEEE International Conference on Multimedia and Expo Workshops (ICMEW), 2014, pp. 1–4.
[2] M. Claypool, D. Finkel, A. Grant, and M. Solano, “Thin to win? Network performance analysis of the OnLive thin client game system,” Annu. Work. Netw. Syst. Support Games, 2012.
[3] ITU-T Recommendation P.809, "Subjective evaluation methods for gaming quality," International Telecommunication Union, 2018.
[4] ITU-R Recommendation BT.500-13, "Methodology for the subjective assessment of the quality of television pictures," International Telecommunication Union, 2012.
[5] K.-T. Chen, C.-C. Tu, and W.-C. Xiao, "OneClick: A framework for measuring network quality of experience," in Proc. IEEE INFOCOM, 2009, pp. 702–710.
[6] S. S. Sabet, M. R. Hashemi, and M. Ghanbari, “A testing apparatus for faster and more accurate subjective assessment of quality of experience in cloud gaming,” Proc. - 2016 IEEE Int. Symp. Multimedia, ISM 2016, pp. 463–466, 2017.