Agreeable-Run-9152

Agreeable-Run-9152 t1_j33xcmm wrote

Note that this argument really isn't about diffusion or generative models but about optimization. I know my fair share of generative modelling, but this idea is a lot more general and may have popped up somewhere else in optimization/inverse problems?

1

Agreeable-Run-9152 t1_j33wlnt wrote

Let's think about a dataset consisting of only one image x, and assume the optimization process is known and deterministic.

Then, given the weights of the diffusion model and the optimization procedure P(theta_0, t, x), which maps the initial weights theta_0 to the weights theta_t after t training steps on image x, the problem would be:

Find x such that ||theta_t - P(theta_0, t, x)|| = 0 for all times t.

I would IMAGINE (I am not sure) that for sufficiently many times t, we get a unique solution x.

This argument should even hold for datasets consisting of more than one image.
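A minimal toy sketch of this inversion, shrinking the "image" to a single scalar so everything stays transparent (the training map `P`, the learning rate, and the grid search are all illustrative assumptions, not part of the original argument):

```python
import numpy as np

def P(theta0, t, x, lr=0.1):
    """Deterministic training map: t steps of gradient descent on the
    loss 0.5 * (theta - x)^2, whose gradient w.r.t. theta is (theta - x)."""
    theta = theta0
    for _ in range(t):
        theta = theta - lr * (theta - x)
    return theta

# The "unknown" training image, reduced to a scalar for this sketch.
x_true, theta0, T = 0.7, 0.0, 20

# Observed weight trajectory theta_t for t = 1..T (assumed known, as in the comment).
observed = np.array([P(theta0, t, x_true) for t in range(1, T + 1)])

# Invert: find x minimizing sum_t |theta_t - P(theta0, t, x)|^2, here by brute-force grid search.
candidates = np.linspace(-1.0, 2.0, 3001)
residuals = [
    sum((observed[t - 1] - P(theta0, t, x)) ** 2 for t in range(1, T + 1))
    for x in candidates
]
x_hat = candidates[int(np.argmin(residuals))]
print(x_hat)  # recovers a value close to x_true = 0.7
```

In this 1-D case even a single time t pins down x uniquely (the map is affine in x); matching the whole trajectory is what one would hope makes the solution unique in the higher-dimensional, nonlinear setting the comment speculates about.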

2