Ideas

  1. QoE in this work means buffering events, not packet loss (loss is already well handled by PLC, FEC, and coding).
  2. Stability (statistical stationarity) of QoE over time under varying network capacity (and delays and losses) with VBR streams.

Overall, it is interesting to find the connection between caching strategies and their influence on video QoE. One way to approach this is time-aligned series correlation: which events at the network level lead to “events” at the user level (even after the application)? Quantify these via user tests (Acreo experience) and look back at the network trace, as far back in the network as possible; in a lab experiment, back to the source stream (or Streamingkollen (Maria)).
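The time-aligned correlation idea could be sketched as a simple lag search over two event series. This is a minimal illustration, not a method from the notes' references; the function name, the synthetic data, and the lag range are all assumptions:

```python
import numpy as np

def event_lag(network_events, user_events, max_lag):
    """Estimate the lag (in samples) at which network-level events best
    align with user-level events, via an unnormalised cross-correlation.
    Illustrative sketch; names and parameters are assumptions."""
    best_lag, best_score = 0, -np.inf
    x = network_events - network_events.mean()
    y = user_events - user_events.mean()
    for lag in range(0, max_lag + 1):
        # Align network event at t with user event at t + lag.
        score = np.dot(x[:len(x) - lag], y[lag:])
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag

# Synthetic example: user-level "events" follow network events 3 samples later.
rng = np.random.default_rng(0)
net = (rng.random(200) < 0.1).astype(float)
usr = np.concatenate([np.zeros(3), net[:-3]])
print(event_lag(net, usr, max_lag=10))  # -> 3
```

With real traces the two series would be the network-level event indicator (e.g. cache misses per interval) and the user-level buffering indicator on a common timeline.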

The influence of cache hits and misses on QoE: hits result in shorter delays and misses in longer ones, which makes buffering at the receiver harder to predict (Ian has a paper outlining the problem and one solution, but there are others). Feedback from the QoE to the cache controller would be nice, but time-dependent control is hard.
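To illustrate why mixed hit/miss delays make receiver-side buffering hard to predict, here is a toy playout-buffer model. All names and parameter values are illustrative assumptions, not taken from the paper mentioned above:

```python
import random

def simulate_stalls(n_segments, seg_duration, hit_delay, miss_delay,
                    hit_ratio, startup_buffer, seed=0):
    """Toy playout-buffer model: each segment download takes hit_delay or
    miss_delay seconds depending on whether the cache hits; a stall
    (buffering event) occurs whenever the buffer runs dry."""
    rng = random.Random(seed)
    buffer = startup_buffer            # seconds of video currently buffered
    stalls = 0
    for _ in range(n_segments):
        fetch = hit_delay if rng.random() < hit_ratio else miss_delay
        buffer -= fetch                # playout drains during the download
        if buffer < 0:
            stalls += 1
            buffer = 0                 # playback stalls until the segment arrives
        buffer += seg_duration         # downloaded segment joins the buffer
    return stalls

# Same segment timing, different hit ratios: more misses -> more stalls,
# even though the mean fetch delay stays below the segment duration.
print(simulate_stalls(100, 2.0, 0.5, 4.0, 0.9, 4.0))
print(simulate_stalls(100, 2.0, 0.5, 4.0, 0.6, 4.0))
```

The point of the sketch is that the stall count depends on the *sequence* of hits and misses, not just their ratio, which is what makes open-loop prediction hard.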

Suggestion: tie the cache output to an objective measure, the “probability of a buffering event”; match this against the state of the playout buffer and, of course, against subjective measures (easy to see). Can we predict it from the number of hits and misses at the caches? Possibly we can add QoE to the metadata of a cache entry (as well as the access count).

  3. Visualisation of network characteristics (JavaScript) alongside audio/video playout. Video distribution for streaming is nowadays HTTP-based, which means the user never experiences losses directly; they may instead show up as variable quality or rebuffering events. Yes, this is possible with WebRTC.
  4. Show the correlation visually or audibly/visually (predicted against real measurements), that is, the probability of buffering events estimated from latency and delay variation (jitter), leading to early notification of losses.
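The early-notification idea could be sketched as a sliding-window jitter monitor that flags likely buffering trouble before the buffer actually runs dry. The function name, window size, and threshold below are illustrative assumptions:

```python
import statistics

def jitter_alarm(delays, window, threshold):
    """Flag indices where a sliding-window estimate of delay variation
    (jitter) exceeds a threshold -- a toy early-warning signal for
    likely buffering events. Parameters are illustrative."""
    alarms = []
    for i in range(window, len(delays) + 1):
        if statistics.pstdev(delays[i - window:i]) > threshold:
            alarms.append(i - 1)   # index of the newest sample in the window
    return alarms

# Stable delays (ms), then a jittery burst such as a run of cache misses.
trace = [50] * 20 + [50, 200, 60, 220, 55, 210] * 3 + [50] * 10
print(jitter_alarm(trace, window=5, threshold=40))
# The first alarm fires at index 21, right after the burst begins.
```

A dashboard version of this (item 3 above) could run the same computation in JavaScript over WebRTC statistics and render the alarm next to the video playout.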