On the Merits of Indolence

What was never clear to me about James Bond was why he was dubbed ‘agent 007.’ I once asked him whether there really were at least six other super-agents comparable to him in skill, intelligence, and experience.

He responded, ‘This nickname actually has another origin. My colleagues jest that I possess zero motivation, zero concentration, and seven romantic adventures per mission.’

James Bond on a mission.

‘What unfairness!’ I cried. ‘To judge so superficially… They should consider your excellent results…’

‘I must confess,’ remarked Bond, ‘they were not entirely incorrect. I’ve never been particularly industrious, but I’ve endeavored to compensate for any deficiencies in acquired information through advanced analysis.’

‘How thrilling!’ I exclaimed. ‘How do you manage that?’

‘Ah, now we tread upon my most closely guarded professional secrets,’ he replied, glancing about for prying eyes, listening devices, and surveillance cameras before hastily scribbling something on a small piece of paper.

Confidential keyword.

‘Top secret!’ he cautioned as he slipped the paper into my pocket.

I could scarcely resist the urge to unfurl the paper immediately, yet I restrained myself until I reached home and secured all the locks. Upon unfolding it, I read the words: ‘Gaussian Process.’

‘Of course!’ I thought to myself. ‘I should have deduced it on my own.’ The Gaussian Process is a Bayesian prediction method ideally suited to filling in the missing pieces of information.

For instance, when tasked with retrieving an unknown 1D functional dependence, the initial inclination may be to sample it at equal intervals. However, this approach proves to be both costly and inefficient. Instead, a much more effective method involves random sampling, followed by probabilistic filling of the missing points using the Gaussian Process.

A few random samples allow accurate reconstruction of the unknown function with the Gaussian Process. The blue area around the predicted curve shows the confidence interval.
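As an illustration, here is a minimal sketch of such a 1D reconstruction with scikit-learn’s GaussianProcessRegressor. The test function, the number of random samples, and the kernel settings are my own illustrative choices, not those of the original code in the PDF.

```python
# Minimal sketch (not the original code): reconstruct a 1D function
# from a few random samples with a Gaussian Process.
import numpy as np
import matplotlib.pyplot as plt
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

# "Unknown" 1D function that we probe only at a few random points.
def f(x):
    return np.sin(3 * x) * np.exp(-0.3 * x)

x_plot = np.linspace(0, 5, 400).reshape(-1, 1)
x_train = rng.uniform(0, 5, size=12).reshape(-1, 1)   # a few random samples
y_train = f(x_train).ravel()

gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), normalize_y=True)
gp.fit(x_train, y_train)

# Predicted mean and standard deviation define the confidence band.
y_pred, y_std = gp.predict(x_plot, return_std=True)

plt.plot(x_plot, f(x_plot), "k--", label="true function")
plt.plot(x_plot, y_pred, "b", label="GP prediction")
plt.fill_between(x_plot.ravel(), y_pred - 2 * y_std, y_pred + 2 * y_std,
                 alpha=0.2, label="confidence interval")
plt.scatter(x_train, y_train, c="r", zorder=3, label="random samples")
plt.legend()
plt.show()
```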

This strategy proves particularly efficacious for retrieving 2D features with a limited sampling budget. I recently conducted an experiment wherein I generated a cosine blob at the center of an image and sampled it with only 100 randomly chosen points. The Gaussian Process then reconstructed the true feature with remarkable accuracy. To provide a point of comparison, I also plotted (on the rightmost side) the reconstruction obtained from standard regular sampling, which appeared significantly less impressive despite employing 100×100 (=10,000) sampling points.

A two-dimensional feature randomly sampled and then reconstructed with the Gaussian Process. This is very economical and more efficient than the reconstruction from regular sampling.
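A minimal sketch of the 2D experiment follows. The image size, blob width, and kernel length scale are assumptions of mine for illustration; only the overall idea (a centred cosine blob, 100 random samples, GP reconstruction on the full grid) comes from the text above.

```python
# Minimal sketch (illustrative parameters): a cosine blob sampled at 100
# random pixels and reconstructed on the full grid with a Gaussian Process.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(1)
n = 100  # the image is n x n pixels

# True feature: a cosine blob centred in the image.
y, x = np.meshgrid(np.linspace(-1, 1, n), np.linspace(-1, 1, n), indexing="ij")
r = np.sqrt(x**2 + y**2)
truth = np.where(r < 0.5, 0.5 * np.cos(np.pi * r / 0.5) + 0.5, 0.0)

# Sample only 100 randomly chosen pixels.
idx = rng.choice(n * n, size=100, replace=False)
coords = np.column_stack([x.ravel()[idx], y.ravel()[idx]])
values = truth.ravel()[idx]

gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2),
                              alpha=1e-6,          # small jitter for stability
                              normalize_y=True)
gp.fit(coords, values)

# Predict on the full grid and reshape back into an image.
grid = np.column_stack([x.ravel(), y.ravel()])
reconstruction = gp.predict(grid).reshape(n, n)

print("max reconstruction error:", np.abs(reconstruction - truth).max())
```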

Furthermore, it is intriguing to observe how the reconstruction evolves with the sequential addition of sampling points. The resulting image consistently maintains a smooth appearance but remains rather inaccurate when only a small number of points is taken. However, accuracy improves rapidly with the accumulation of more sampling points, approaching the original image closely even with just 50 samples.

Evolution of the Gaussian Process reconstruction as the number of sampling points increases.
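Continuing the sketch above, one can mimic this evolution by refitting the GP as random samples accumulate and tracking the reconstruction error; the sample counts below are illustrative, not the ones used in the original figures.

```python
# Refit the GP with a growing number of random samples and watch the
# reconstruction error shrink (reuses n, rng, grid, truth from the sketch above).
for n_samples in (10, 25, 50, 100):
    idx = rng.choice(n * n, size=n_samples, replace=False)
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2),
                                  alpha=1e-6, normalize_y=True)
    gp.fit(grid[idx], truth.ravel()[idx])
    recon = gp.predict(grid).reshape(n, n)
    rmse = np.sqrt(np.mean((recon - truth) ** 2))
    print(f"{n_samples:4d} samples -> RMSE {rmse:.4f}")
```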

The Python code can be found in the PDF version of this document: Full Text with Codes.

Comments

2 responses to “On the Merits of Indolence”

  1. Michael

    Did you hear about compressed sensing?

    1. pavel.temdm

      Yes, I did. Actually, this story is exactly about compressed sensing, although the reconstruction algorithms may differ. L1 sparsity algorithms are more common in compressed sensing, e.g. https://arxiv.org/abs/1211.5231. However, I think the Gaussian Process is more elegant.
