Research

Local NL-means, a first experiment

This is (finally) a follow-up to my previous post!

Breaking news: NL-means is local!

The Non-Local Means algorithm 1 is a very popular image denoising algorithm. Besides its denoising efficiency, it is also one of the main causes of the non-locality trend that has arisen in Image Processing conferences ever since. Yet, there is this recent paper on arXiv 2: Non-Local Means is a local image denoising algorithm by Postec, Froment and Vedel. Some background on NL-means: NL-means is a very simple, yet highly efficient image denoising algorithm that preserves image textures well, even under severe noise.

Online lectures: Computational Photography (Stanford CS478)

With a rainy weekend in the forecast, why not spend it learning something new?

My PhD thesis is online!

It took a little while1, but my thesis manuscript has been freely available online for a few days now! Obviously, it is a must-read if you are interested in non-local variational methods :-p Joking aside, the manuscript (in English) is available here. Over the coming weeks, I will add a few notes roughly describing the key ideas of my work. So don't hesitate to come back!

A typical implementation of NL-means

Semi-nonlocal implementations: as can be inferred from the description, NL-means is a computationally greedy algorithm: to denoise each pixel, the averaging step may explore the whole image. Thus, it is common practice to restrict the nonlocal exploration to a limited window around each pixel1. However, since nonlocality is achieved only inside a subwindow around each pixel, I refer to these implementations as semi-nonlocal. It is common for an implementation of NL-means to include an additional parameter for the size of this learning window.
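To make this concrete, here is a minimal Python sketch of such a semi-nonlocal implementation (my own illustration, not the code discussed in the post); `patch_radius`, `search_radius` and `h` are hypothetical parameter names, and the input is assumed to be a 2-D float image with values in [0, 1]:

```python
import numpy as np

def nlmeans_windowed(image, patch_radius=3, search_radius=10, h=0.1):
    """Semi-nonlocal NL-means sketch: each pixel is restored by a weighted
    average of pixels taken inside a limited search window, with weights
    driven by patch similarity (parameter names are illustrative)."""
    pad = patch_radius
    padded = np.pad(image, pad, mode='reflect')
    out = np.zeros(image.shape)
    rows, cols = image.shape
    for i in range(rows):
        for j in range(cols):
            # Reference patch around the pixel being denoised
            ref = padded[i:i + 2 * pad + 1, j:j + 2 * pad + 1]
            acc, norm = 0.0, 0.0
            # Explore only a limited window instead of the whole image
            for k in range(max(0, i - search_radius), min(rows, i + search_radius + 1)):
                for l in range(max(0, j - search_radius), min(cols, j + search_radius + 1)):
                    cand = padded[k:k + 2 * pad + 1, l:l + 2 * pad + 1]
                    d2 = np.mean((ref - cand) ** 2)
                    w = np.exp(-d2 / (h * h))
                    acc += w * image[k, l]
                    norm += w
            out[i, j] = acc / norm
    return out
```

Setting `search_radius` large enough to cover the whole image recovers the fully nonlocal formulation; in practice, a 21x21 search window (`search_radius = 10`) is a commonly quoted default.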

NL-means: some math

In the previous episode, we saw the main idea of NL-means: a good denoising algorithm can be obtained by a simple weighted average of noisy pixels, provided that the weights depend on the content of the patches around the pixels instead of their spatial distance. In this post, we will put proper mathematical definitions behind those patches and that weighted average. I will proceed in successive steps:
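For reference, the usual weighted-average formulation of NL-means can be written as follows (the notation is mine and may differ from the one used in the post):

```latex
% Denoised value at pixel x: a weighted average of the noisy values v(y),
% with weights driven by the similarity of the patches centered at x and y.
\[
  \widehat{u}(x) = \frac{1}{C(x)} \sum_{y \in \Omega} w(x, y)\, v(y),
  \qquad
  w(x, y) = \exp\!\left( -\frac{\lVert P_v(x) - P_v(y) \rVert_2^2}{h^2} \right),
  \qquad
  C(x) = \sum_{y \in \Omega} w(x, y)
\]
% Here v is the noisy image, P_v(x) is the patch of v centered at x,
% Omega is the image domain, and h is a filtering parameter tied to the noise level.
```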

One half

Achievement (almost) unlocked: a quick note to say that I've fulfilled the first requirement towards graduation: my doctoral dissertation was accepted by the jury1 during a private oral exam. Wow! And I have to say that I am really grateful to the jury, both for their kind comments and for their criticism of my work, which will help me deliver a better manuscript. What's coming next? I still have to prepare a public presentation to officially disclose the content of the thesis and graduate, which should happen in 2 or 3 months.

Introducing NL-means

Non-locality and NL-means: Non-locality is a powerful recent paradigm in Image Processing that was introduced around 2005. Put simply, it consists in considering an image as a collection of highly redundant patches. In this context, pixels are described by their surrounding patch instead of their sole value, and patch relationships are deduced only from their visual similarity (thus ignoring their spatial relationship). Non-locality comes in two flavors: exemplar-based 1 or sparsity-based.
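As a tiny, hypothetical illustration of the patch-as-descriptor idea (function and parameter names are mine), two pixels can be compared through the distance between their surrounding patches rather than through their values or positions:

```python
import numpy as np

def patch(image, x, y, radius=3):
    """Extract the (2*radius+1)^2 patch describing pixel (x, y).
    Assumes (x, y) lies far enough from the border for simplicity."""
    return image[x - radius:x + radius + 1, y - radius:y + radius + 1]

def patch_similarity(image, p, q, radius=3, h=0.1):
    """Similarity weight between pixels p and q based on the visual content
    of their patches, regardless of their spatial distance."""
    d2 = np.mean((patch(image, p[0], p[1], radius) -
                  patch(image, q[0], q[1], radius)) ** 2)
    return np.exp(-d2 / (h * h))
```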

[Paper] Inverting and Visualizing Features for Object Detection

Here are some short reading notes on a paper that came out on arXiv this week. I keep a few RSS feeds pointed there, and I was immediately caught by the title: Inverting and Visualizing Features for Object Detection by Carl Vondrick, Aditya Khosla, Tomasz Malisiewicz and Antonio Torralba (MIT/CSAIL). The paper: what is it about? As the title says, it's about feature inversion and visualization. Yes, but not just any feature: the now ubiquitous HOG feature.
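For context only, and unrelated to the paper's own inversion method, here is a minimal sketch of computing and rendering HOG descriptors with scikit-image (assuming it is installed; the parameter values are just common defaults):

```python
from skimage import data, color
from skimage.feature import hog

# Grayscale test image shipped with scikit-image
image = color.rgb2gray(data.astronaut())

# Standard HOG descriptor; visualize=True also returns a rendering of the
# dominant gradient orientation per cell, handy for visual inspection.
features, hog_image = hog(image,
                          orientations=9,
                          pixels_per_cell=(8, 8),
                          cells_per_block=(2, 2),
                          visualize=True)
print(features.shape, hog_image.shape)
```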

Back from ICPR'12

In mid-November, I attended the ICPR 2012 conference in Tsukuba (Japan). It was my first time at a more “pattern recognition”-oriented conference, with a slightly different community of attendees than ICIP or ICASSP, and also a different organization. I was a bit surprised to feel that the French community was bigger and more homogeneous than at those two conferences. ICPR'12 facts: about the conference, I liked that there were fewer parallel sessions than at ICIP or ICASSP1, so it was actually a bit easier to attend various sessions.