Animating Image Algorithm Concepts

Role

Design Engineer, Chan Zuckerberg Initiative

Language

Python

Design Tools

Figma, Canva

Timeline

3 months

PROJECT 01

Gradient Consensus

Context

The mission of the Biohub at the Chan Zuckerberg Initiative is to empower scientists to cure disease.

In support of that mission, I collaborated with a microscope inventor to develop animations explaining Gradient Consensus, a novel algorithm for reconstructing true images from noisy scientific and medical data. For example, microscopes capture images that contain noise and blur, so Gradient Consensus reconstructs a sharper, more accurate image of the true object. Together, the animations and a written document I helped develop explained the algorithm's core concepts.

To ground the animations in real behavior, I ran the algorithm on synthetic test images in Python and captured how the reconstruction evolved at each iteration. Starting with a simulated cell image, I generated outputs showing how the algorithm progressively converges toward the underlying signal, distinguishes noise through gradient disagreement, and performs neighborhood-based updates when gradients agree. These computational results formed the foundation of the animations, ensuring they reflected the algorithm’s actual mechanics rather than illustrating the concept abstractly.
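A minimal sketch of that setup in NumPy, assuming a disk-shaped "cell," a 3×3 box blur as a stand-in for the microscope's point-spread function, and Poisson photon noise (the real test images and optics model were more involved; every name here is illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical synthetic "cell": a bright disk on a dark background.
size = 64
yy, xx = np.mgrid[:size, :size]
truth = 100.0 * ((yy - 32) ** 2 + (xx - 32) ** 2 < 10 ** 2)

# Blur: average each pixel with its 3x3 neighborhood, a simple stand-in
# for the microscope's point-spread function.
blurred = np.zeros_like(truth)
for dy in (-1, 0, 1):
    for dx in (-1, 0, 1):
        blurred += np.roll(np.roll(truth, dy, axis=0), dx, axis=1) / 9.0

# Photon shot noise: each pixel's photon count is Poisson-distributed
# around the blurred signal.
measurement = rng.poisson(blurred).astype(float)
```

Feeding images like `measurement` through the algorithm and saving the estimate at each iteration produced the frames the animations were built from.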


Visuals and animations

Microscopes capture a blurry, noisy version of the true object

A measurement (the image captured by the microscope) is signal plus noise.

Gradient Consensus splits the measurement into two versions

The two versions share the same underlying signal but carry independent noise

To form them, each photon in the measurement is randomly assigned to version A or version B
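The photon-splitting step can be sketched in a few lines of NumPy: assigning each photon to A or B with equal probability is equivalent to drawing A's count from a binomial distribution and giving the remainder to B (a simplified stand-in, not the production code):

```python
import numpy as np

rng = np.random.default_rng(0)

# A hypothetical measurement: photon counts per pixel.
measurement = rng.poisson(50.0, size=(64, 64))

# Each photon is independently assigned to version A or B with equal
# probability -- equivalent to a binomial draw per pixel for A, with
# the leftover photons going to B.
version_a = rng.binomial(measurement, 0.5)
version_b = measurement - version_a

# The two versions sum back to the original measurement (same signal),
# but their noise realizations are independent.
assert np.array_equal(version_a + version_b, measurement)
```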

Gradient Consensus then checks a neighborhood of pixels. If both versions agree on a change, it updates the inferred truth; if they disagree, it leaves it alone.
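The agree-or-skip idea can be illustrated with a simplified per-pixel update (the real algorithm works over pixel neighborhoods with a proper loss gradient; the function name, sign test, and step rule here are placeholders of my own):

```python
import numpy as np

def consensus_step(estimate, grad_a, grad_b, step=0.1):
    """One illustrative consensus update.

    grad_a and grad_b are per-pixel gradients computed from versions A
    and B. Where their signs agree, the proposed change is likely driven
    by shared signal, so it is applied; where they disagree, it is
    likely noise, so the pixel is left alone.
    """
    agree = np.sign(grad_a) == np.sign(grad_b)
    update = np.where(agree, 0.5 * (grad_a + grad_b), 0.0)
    return estimate - step * update
```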

Blur spreads each pixel's brightness into its neighbors, so corrections have to account for a neighborhood of pixels

After 256 iterations, Gradient Consensus holds steady while Richardson-Lucy (a traditional algorithm) drifts further from the truth

When a traditional algorithm iterates too long, it starts overfitting the noise, creating wormy artifacts that are not part of the true object

PROJECT 02

WaveOrder

Context

I worked with a team of AI/ML engineers to design and animate a visual explainer for waveOrder, an image reconstruction and deconvolution algorithm for microscope images. It works by modeling microscope parameters like tilt and defocus to emulate how light behaves through the optics, then uses that model to reconstruct sharper, more detailed images from raw data. The video helps the team explain the algorithm's core concepts to other scientists.
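The core forward-model idea can be sketched as convolving a candidate object with a point-spread function derived from the optics. waveOrder's real model is far richer (tilt, defocus, phase, and more), so this is only a conceptual stand-in with hypothetical names:

```python
import numpy as np

def forward_model(obj, psf):
    """Illustrative forward model: convolve the object with a
    point-spread function that encodes the microscope's optical
    parameters, emulating how light propagates through the optics.
    Convolution is done via FFT (circular boundary) for brevity."""
    return np.real(np.fft.ifft2(np.fft.fft2(obj) * np.fft.fft2(psf)))
```

Reconstruction then amounts to finding the object whose simulated image best matches the raw data, which is the relationship the animation walks viewers through.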


Animation