Art of prophecy

One of the most exhilarating scenes in spy movies is the daring escape of a secret agent amidst a hailstorm of enemy gunfire. A cunning agent avoids a direct path, instead weaving through a serpentine course that makes targeting him a daunting task for his adversaries, who struggle to anticipate his next move.

It is hard to predict the moves of a cunning secret agent.

James interjected, “It’s not very common, although I once found myself in exactly the situation you described.”

“Was it a sniper from a foreign agency?” I inquired.

A shadow crossed Bond’s face.

“Actually, it was a jealous husband. That was a case where I failed to foresee the circumstances accurately enough…” Bond confessed, before firmly pushing aside the unpleasant memories and continuing,

Unforeseen situation.

“But it’s not important. In most instances, our aim is to predict not the trajectory of bullets, but rather the intentions and even the mental states of our adversaries. We employ various models and artificial neural networks.”

“And do they accurately predict the future?”

James maintained an optimistic facade, but with hints of doubt on his face.

“You know, it is very difficult to make predictions, especially about the future…”

The powerful MI-6 artificial neural networks can predict the result of a coin toss experiment with an accuracy of up to 50%.

Can one become a sort of oracle by following James Bond’s hints? I pondered this question as I embarked on a simple experiment. Constructing a basic function, say a sinusoidal wave, I posed the question:

“If we observe such an oscillating time series for an extended time, can we forecast its future behavior?”

To increase the challenge, I introduced noise into the equation, ensuring that extrapolation alone could not decipher the pattern.

Sinusoidal time series with added noise.
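As a rough sketch, such a noisy sinusoid might be generated like this. The amplitude, noise level, series length, and seed are my own illustrative assumptions; the actual code is in the linked PDF.

```python
import numpy as np

# Illustrative parameters (assumed, not taken from the post)
rng = np.random.default_rng(42)
t = np.linspace(0, 20 * np.pi, 2000)          # time axis
clean = np.sin(t)                              # underlying sinusoid
noise = rng.normal(0.0, 0.2, size=t.shape)     # additive Gaussian noise
series = clean + noise                         # noisy time series

# Split as in the post: train on the first 70%, hold out the last 30%
split = int(0.7 * len(series))
train, test = series[:split], series[split:]
```

The noise amplitude of 0.2 is just large enough that a naive fit of the last few points cannot recover the pattern, which is the point of the exercise.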

I devised a rudimentary neural network with a couple of LSTM layers, augmented by a dropout layer to prevent overfitting. Configuring the model to utilize the preceding 50 measurements to predict the subsequent value, I discovered that a network trained on 70% of the complete time series could reasonably extrapolate the remaining 30% of data.
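A minimal version of such a network, sketched in Keras, could look like the following. The layer sizes and the dropout rate are assumptions on my part; only the 50-measurement window, the stacked LSTM layers, and the dropout layer come from the description above.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

WINDOW = 50  # the preceding 50 measurements predict the next value

def make_windows(series, window=WINDOW):
    """Slice a 1-D series into (samples, window, 1) inputs
    and the next-value targets."""
    X = np.stack([series[i:i + window] for i in range(len(series) - window)])
    y = series[window:]
    return X[..., np.newaxis], y

# Hypothetical architecture: unit counts and dropout rate are assumed
model = keras.Sequential([
    layers.Input(shape=(WINDOW, 1)),
    layers.LSTM(64, return_sequences=True),
    layers.LSTM(32),
    layers.Dropout(0.2),   # guards against overfitting
    layers.Dense(1),       # single next-value prediction
])
model.compile(optimizer="adam", loss="mse")
```

Training would then be a single `model.fit(X_train, y_train, ...)` call on the windows built from the first 70% of the series.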

Prediction of the network vs actual data.

However, admittedly, there is a kind of cheating in this figure. The network did not forecast the entire curve at once; rather, it predicted only the one next measurement based on knowledge of the preceding 50 actual measurements.

Let’s make it fairer. Suppose we have a starting point for predictions and no access to real measurements beyond that point. Using the previous 50 measurements, we make one initial prediction. We then treat this prediction as a fictitious measurement, appending it to the 49 preceding actual measurements (giving the required 50) to make the next guess. The process continues iteratively for the requested number of steps.

Of course, such predictions based on predictions will fail sooner or later, but looking, say, 10 steps into the future is quite possible.
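The iterative scheme can be sketched as a small helper that wraps any one-step predictor. Here `model_predict` is a placeholder (an assumption of mine) for a call to the trained network’s predict method:

```python
import numpy as np

def roll_forward(model_predict, history, n_steps=10, window=50):
    """Predict n_steps values iteratively, feeding each prediction
    back in as if it were a real measurement."""
    buf = list(history[-window:])   # start from the last real measurements
    preds = []
    for _ in range(n_steps):
        # shape the window as (1, window, 1) for a typical LSTM input
        x = np.asarray(buf[-window:], dtype=float)[np.newaxis, :, np.newaxis]
        nxt = float(model_predict(x))   # one-step-ahead guess
        preds.append(nxt)
        buf.append(nxt)                 # the guess joins the "measurements"
    return preds
```

Because each step consumes its own earlier guesses, errors compound, which is exactly why such rollouts degrade beyond a modest horizon.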

A 10-step prediction starting from an arbitrarily chosen point.

Why was Bond a bit skeptical about the predictions? Maybe he meant forecasting something more complicated than a sine function?

The Python codes can be found in the pdf version of this document: Full Text with Codes.

