AI-driven particles      (work in progress)

In this experimental project I create particles that learn to change their trajectory and color. They learn this by analyzing real videos, such as fire or a waterfall. After that, a cluster of particles "tries" to recreate the real video.

For me it's a kind of "style transfer", but not for photos, where it is widely used today, but for animation.

Take a video, compute its optical flow, and train "virtual particles" to change their acceleration and color depending on the colors and velocities of the other particles.
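The optical-flow step could be sketched like this. This is a minimal illustration, not the project's actual code: a basic Lucas-Kanade least-squares solve in numpy (in practice a library routine such as OpenCV's would be used). The flow sampled at particle positions would supply the target velocities for training.

```python
import numpy as np

def lucas_kanade_flow(prev, curr, win=5):
    """Dense optical flow between two grayscale frames via a basic
    Lucas-Kanade least-squares solve over a small window."""
    Ix = np.gradient(prev, axis=1)   # spatial gradients
    Iy = np.gradient(prev, axis=0)
    It = curr - prev                 # temporal gradient
    h, w = prev.shape
    flow = np.zeros((h, w, 2))       # (u, v) motion per pixel
    r = win // 2
    for y in range(r, h - r):
        for x in range(r, w - r):
            ix = Ix[y-r:y+r+1, x-r:x+r+1].ravel()
            iy = Iy[y-r:y+r+1, x-r:x+r+1].ravel()
            it = It[y-r:y+r+1, x-r:x+r+1].ravel()
            A = np.stack([ix, iy], axis=1)
            ATA = A.T @ A
            if abs(np.linalg.det(ATA)) > 1e-6:  # skip textureless windows
                flow[y, x] = -np.linalg.solve(ATA, A.T @ it)
    return flow

# Sanity check: a plaid pattern shifted one pixel to the right
x = np.arange(64); y = np.arange(64)
prev = np.sin(2*np.pi*x[None, :]/16) + np.sin(2*np.pi*y[:, None]/16)
curr = np.sin(2*np.pi*(x[None, :]-1)/16) + np.sin(2*np.pi*y[:, None]/16)
flow = lucas_kanade_flow(prev, curr)
u = flow[8:-8, 8:-8, 0].mean()   # recovered horizontal motion
```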

Next, run the particles and drive them with the learned data to see how such AI-driven particles try to recreate the video.
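A simulation step driven by such a model might look like the following sketch. Everything here is hypothetical: the tiny random-weight MLP stands in for the trained network, and the feature layout (own velocity plus mean neighbor color and velocity) is an assumption about what the particles observe.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for the trained network: a tiny 2-layer MLP with random
# weights. Input: own velocity (2) + mean neighbor color (3) +
# mean neighbor velocity (2). Output: acceleration (2) + color delta (3).
W1 = rng.normal(0, 0.1, (7, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.1, (16, 5)); b2 = np.zeros(5)

def model(features):
    h = np.tanh(features @ W1 + b1)
    return h @ W2 + b2

def step(pos, vel, col, dt=0.1, radius=0.2):
    """One step: each particle observes its neighbors, and the model
    returns its acceleration and color change."""
    acc = np.zeros_like(vel)
    dcol = np.zeros_like(col)
    for i in range(len(pos)):
        d = np.linalg.norm(pos - pos[i], axis=1)
        mask = (d < radius) & (d > 0)
        mean_col = col[mask].mean(axis=0) if mask.any() else np.zeros(3)
        mean_vel = vel[mask].mean(axis=0) if mask.any() else np.zeros(2)
        out = model(np.concatenate([vel[i], mean_col, mean_vel]))
        acc[i], dcol[i] = out[:2], out[2:]
    vel = vel + dt * acc
    pos = pos + dt * vel
    col = np.clip(col + dt * dcol, 0.0, 1.0)
    return pos, vel, col

# Run a small cluster for a few steps
pos = rng.uniform(0, 1, (30, 2))
vel = np.zeros((30, 2))
col = rng.uniform(0, 1, (30, 3))
for _ in range(10):
    pos, vel, col = step(pos, vel, col)
```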

Training data collection and rendering: openFrameworks.
ML: pytorch/libtorch.
Stochastic learning:

Note on 3D
Using networks, we can recreate 3D from 2D, and so train particles to fly in 3D.
Final rendering is proposed to be done in Houdini.

AI-driven light ray system
Einstein said that he arrived at his theory by imagining himself flying on the end of a light ray. But he only observed; he didn't change the ray in any way.
So let's go further and imagine that the light ray possesses intelligence, having the right to change its direction, thickness and brightness depending on its state and its observation of the nearby space.
In this way, we obtain an AI-driven light ray system.
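The idea above can be sketched as a simple agent loop. This is only an illustration of the concept, not the project's implementation: the scalar `field` standing in for "the nearby space" and the `policy` signature are both hypothetical.

```python
import numpy as np

def trace_ray(policy, field, origin, direction, steps=50, dt=0.1):
    """Trace an 'intelligent' 2D ray: at each step the policy observes
    the field near the ray head and may turn the ray and adjust its
    thickness and brightness (a hypothetical illustration)."""
    pos = np.array(origin, float)
    ang = float(direction)            # heading angle, radians
    thickness, brightness = 1.0, 1.0
    path = [pos.copy()]
    for _ in range(steps):
        obs = field(pos)              # observation of the nearby space
        turn, d_thick, d_bright = policy(obs, thickness, brightness)
        ang += turn
        thickness = max(0.1, thickness + d_thick)
        brightness = float(np.clip(brightness + d_bright, 0.0, 1.0))
        pos = pos + dt * np.array([np.cos(ang), np.sin(ang)])
        path.append(pos.copy())
    return np.array(path), thickness, brightness

# Example: the ray bends toward bright regions and dims in them
field = lambda p: np.sin(p[0]) * np.cos(p[1])
policy = lambda obs, t, b: (0.3 * obs, 0.0, -0.01 * obs)
path, thickness, brightness = trace_ray(policy, field, (0.0, 0.0), 0.0)
```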

The project is made with the help of Olga Annenkova.
Thanks to Frederick De Wilde for his great suggestions.