A new tool for assessing video quality


Watching videos can bring quite a few benefits to viewers. They may find a video informative, relaxing, exciting, or maybe just useful as background noise.

Whatever the reason for watching them, videos should be presented at a quality that is sufficient for achieving these benefits. Before we can assess the quality required for a specific user's intent (work that happens in parallel, but not in this thesis), we need to create objective tools that estimate users' ability to notice problems introduced when transferring, decoding, and displaying videos.

Historically, when users are asked directly to rate the quality of a degraded video, the result is called a subjective quality rating; tools built to estimate what users would say are called objective quality metrics.
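As a minimal illustration of what a full-reference objective metric computes (a simple sketch using PSNR, which is far cruder than any of the standardized metrics mentioned below), the idea is always to compare a distorted frame against the pristine original and map the difference to a number:

```python
import numpy as np

def psnr(reference: np.ndarray, distorted: np.ndarray, peak: float = 255.0) -> float:
    """Peak signal-to-noise ratio between two equally sized frames, in dB."""
    mse = np.mean((reference.astype(np.float64) - distorted.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # identical frames
    return 10.0 * np.log10(peak ** 2 / mse)

# Toy example: the "distortion" adds a little uniform noise to the reference.
rng = np.random.default_rng(0)
ref = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
noise = rng.integers(-5, 6, size=ref.shape)
dist = np.clip(ref.astype(np.int16) + noise, 0, 255).astype(np.uint8)
print(f"PSNR: {psnr(ref, dist):.1f} dB")
```

Metrics like PEVQ replace this purely mathematical comparison with models of human perception, which is what makes them both better predictors of subjective ratings and much harder to build.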

Research on objective quality metrics is progressing quite slowly, and the metrics that have been developed are rarely used. There are several reasons: developing them requires high-quality content, real humans who give subjective ratings, and rather complicated formulas; in addition, many bits and pieces of the existing objective quality metrics are patent-protected and only available in commercial products.

But there are standards that we can use as inspiration for developing open source tools for the entire research community. They are developed by the Video Quality Experts Group (VQEG) and standardized by the International Telecommunication Union (ITU), which has released the standards J.144, J.247, J.341, and J.343.

An open source implementation of J.144 exists as Matlab code and is known as VQM (www.its.bldrdoc.gov/resources/video-quality-research/vqm-faq.aspx).

We have developed an implementation of a subset of J.247 Annex B (known as PEVQ) that omits the patented parts. We call it OPVQ (mlab.no/blog/2015/06/openvq-an-objective-quality-assessment-tools/).

Now, we would like to go for J.341 and J.343, and participate in the JEG working group of VQEG, which wants to collect open source tools for the benefit of all.


Thesis idea 1: Create a formal definition of J.341 Annex A

The company that created the only successful competitor in the HD video quality assessment competition made a very interesting choice: they achieved standardization of their method while still making it impossible for competitors to copy it, by putting the essential parts of their source code into the standard itself. That code is protected by copyright, so we cannot make an open source implementation of the standard directly.

Still, an implementation is very desirable, because the standard captures a lot of knowledge that was not available when J.144 and J.247 were made.

We need to take a detour: in this thesis, you will look very closely at the standard and reformulate every piece of its code, first by describing in text what it means and what it aims to achieve, and second by expressing it as an equivalent formula. Once this is done, somebody else will be able to implement your specification without violating copyright.
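To sketch the intended workflow with a made-up example (the code fragment and symbols here are hypothetical illustrations, not taken from J.341): suppose the standard contained a loop that accumulates the absolute luminance difference over all pixels of a frame pair and divides by the frame size. In text: the feature is the mean absolute luminance difference between the reference frame and the processed frame. As a formula:

```latex
d = \frac{1}{W \cdot H} \sum_{x=1}^{W} \sum_{y=1}^{H}
    \bigl| Y_{\mathrm{ref}}(x, y) - Y_{\mathrm{proc}}(x, y) \bigr|
```

where $W$ and $H$ are the frame width and height, and $Y_{\mathrm{ref}}$ and $Y_{\mathrm{proc}}$ are the luminance planes of the reference and processed frames. The thesis would produce such a text-plus-formula pair for every fragment of code in the standard.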

Thesis idea 2: Extend OPVQ for adaptive streaming over HTTP

OPVQ works. It works quite well for comparing videos at different quality levels, and we have even found that it works quite well for videos whose quality changes due to camera movement. But J.343 has shown that it is possible to estimate the quality of DASH videos better. DASH stands for Dynamic Adaptive Streaming over HTTP and is used by YouTube, Netflix, NRK, HBO, … basically everybody, so it is pretty important.
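For context, the core mechanism of DASH can be sketched in a few lines (a deliberate simplification; the bitrate ladder and the selection rule below are illustrative assumptions, not any real player's algorithm): the client measures its throughput and, for each segment, fetches the highest-bitrate representation it believes it can sustain. Quality therefore varies over time, which is exactly the behaviour a metric for DASH video must handle:

```python
# Hypothetical ladder of representation bitrates in kbit/s (illustrative values).
LADDER = [400, 1000, 2500, 5000]

def pick_representation(measured_throughput_kbps: float, ladder=LADDER) -> int:
    """Pick the highest bitrate not exceeding the measured throughput.
    A common simple heuristic; real players use smarter, buffer-aware logic."""
    candidates = [b for b in ladder if b <= measured_throughput_kbps]
    return max(candidates) if candidates else min(ladder)

# Simulated per-segment throughput: the chosen quality drops when the network degrades.
throughputs = [6000, 3000, 800, 300, 4500]
chosen = [pick_representation(t) for t in throughputs]
print(chosen)  # → [5000, 2500, 400, 400, 2500]
```

A full-reference metric applied naively to such a stream would score each segment in isolation; the point of the extension is to also model how viewers perceive the quality switches themselves.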

In this thesis, OPVQ will be extended to produce quality estimates comparable to those of the best contenders in the J.343 competition.


  • Pål Halvorsen
  • Carsten Griwodz