Research and Publications


  • Are lab-based audiovisual quality tests reflecting what users experience at home?
  • Feb 1, 2016
    • This study aims to understand the influence of the environment on the perception of audiovisual quality in IPTV services. The test participants were divided into two groups. One group performed the test in a living-room lab, a laboratory with the typical characteristics of a regular living room. The second group performed the test in a room whose characteristics were based on ITU-R Recommendation BT.500. To ensure comparable conditions between the two rooms, extreme care was taken with the environmental conditions: factors such as ambient light, ambient noise, temperature, and the relation between TV size and viewing distance were replicated in both rooms. Preliminary results reveal discrepancies between the subjective quality evaluations performed in the two rooms.
  • Annoyance and acceptability of video service responsiveness
  • May 1, 2015
    • This paper presents an evaluation of four general functionalities of IP-based TV or video services as a function of their responsiveness. The study explores the effect of different response times on annoyance and acceptability judgments. Test participants were asked to interact with a web-technology-based mock-up of an IPTV service interface, thereby realistically mimicking real-life service interaction. Both the applied mock-up and the study of delay as a degradation type make this research applicable to cloud-based video services. The addressed service functionalities are zapping, direct program access via remote control, browsing the electronic program guide, and video rental. The paper presents the relation between annoyance and acceptability and the respective acceptability thresholds for delays in the different tasks, and discusses the results in light of two types of user groups.
  • Chamber QoE: a multi-instrumental approach to explore affective aspects in relation to quality of experience
  • Jan 1, 2014
    • Evaluating (audio)visual quality and Quality of Experience (QoE) from the user’s perspective has become a key element in optimizing users’ experiences and their quality. Traditionally, the focus lies on how multi-level quality features are perceived by a human user. The interest has, however, gradually expanded towards human cognitive, affective, and behavioral processes that may impact on, be an element of, or be influenced by QoE, and which have been under-investigated so far. In addition, there is a major discrepancy between the new, broadly supported, and more holistic conceptualization of QoE proposed by Le Callet et al. (2012) and traditional, standardized QoE assessment. This paper explores ways to tackle this discrepancy by means of a multi-instrumental approach. More concretely, it presents results from a lab study on video quality (N=27) aimed at going beyond the dominant QoE assessment paradigm and at exploring affective aspects in relation to QoE and to perceived overall quality. Four types of data were collected: ‘traditional’ QoE self-report measures were complemented with ‘alternative’, emotional-state- and user-engagement-related self-report measures to evaluate QoE. In addition, we collected EEG (physiological) data, gaze-tracking data, and facial expression (behavioral) data. The video samples used in the test were longer in duration than is common in standard tests, allowing us to study, e.g., more realistic experiences and deeper user engagement. Our findings support the claim that the traditional QoE measures need to be reconsidered and extended with additional, affective-state-related measures.
  • Chapter in the book Quality of Experience
  • Springer · Jan 1, 2014
    • This chapter addresses QoE in the context of video streaming services. Both reliable and unreliable transport mechanisms are covered. An overview of video quality models is provided for each case, with a focus on standardized models. The degradations typically occurring in video streaming services, and which should be covered by the models, are also described. In addition, the chapter presents the results of various studies conducted to fill the gap between the existing video quality models and the estimation of QoE in the context of video streaming services. These studies include work on audiovisual quality modeling, field testing, and the impact on the user. The chapter finishes with a discussion of the open issues related to QoE.
  • Evaluating QoE by means of traditional and alternative subjective measures: an exploratory ‘living room lab’ study on IPTV
  • Jan 1, 2013
    • In this paper, we explore the use of a set of potential ‘alternative’, emotional-state- and engagement-related measures of QoE and investigate how they relate to traditional QoE measures. To this end, we present results from a living-room lab study (N=28) in which the impact of slicing errors on QoE was investigated. Our findings indicate significant differences in QoE between the three error profiles used, both when considering the traditional and the alternative measures. Furthermore, we found that the link between the traditional and alternative measures is rather weak, indicating that the former may need to be reconsidered and extended with alternative measures, which make it possible to measure QoE in terms of ‘delight’ or ‘annoyance’.
  • Is taking into account the subjects’ degree of knowledge and expertise enough when rating quality?
  • Jul 7, 2012
    • This paper provides the results of a speech quality listening test on wideband (WB, 50-7000 Hz) and narrowband (NB, 300-3400 Hz) codecs for new wideband-based, so-called “high-definition” (HD Voice) telephony services. Aside from error-free coding, the experiment also evaluated the effects of packet loss and external noise. Unlike the “expert vs. non-expert” differentiation typical of speech quality research, users are classified into six groups according to demographic characteristics, their attitude towards adopting new technologies, and socio-economic information. With this experiment, we show how factors beyond the users’ level of prior knowledge affect their perception of quality.
  • Towards assigning value to multimedia QoE
  • Jul 9, 2011
    • The development of multimedia services with a high level of acceptance has created the need to migrate from traditional concepts like quality of service (QoS) to the more user-centric concept of quality of experience (QoE), where service performance is assessed in terms of user-perceived quality. However, due to the specific nature of current laboratory tests, it is difficult to extrapolate the results obtained in the laboratory to the real world. In this paper, we take up the initiative of various authors to assess the ecosystem of multimedia services from an interdisciplinary perspective, and present the results of a series of studies conducted to analyze the overall perception of VoIP-based speech communication service quality from a user perspective. The aim is to link QoS-related technical characteristics of IP-based multimedia services with QoE, not only in terms of lab QoE but in terms of the value users associate with it.