Professional-Ebb4970

Professional-Ebb4970 t1_j6knmmz wrote

Indigenous peoples being murdered by the illegal miners and deforesters that he supported, and 700k covid deaths, many of which were direct consequences of his covid denial ("it's just a flu"), his refusal to buy vaccines in time ("the hurry for a vaccine isn't justified", "the covid vaccine might give you AIDS"), and his outright spreading of misinformation about fake medicines ("I took chloroquine when I had covid and recovered just fine")

17

Professional-Ebb4970 t1_iux570q wrote

There are reversible neural networks where this is possible. They're used for things such as normalizing flows, and even for very large NNs that don't fit in memory: if the layers are invertible, you don't need to save intermediate activations during the forward pass, since you can just recompute them from the outputs in the backward pass.
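A minimal sketch of the idea, using an additive coupling layer in the style of RevNets/NICE (the sub-networks `f` and `g` here are hypothetical stand-ins, not any particular architecture): because the layer is exactly invertible, the inputs can be reconstructed from the outputs instead of being stored.

```python
import numpy as np

# Stand-ins for small sub-networks; any functions work, since
# invertibility comes from the coupling structure, not from f or g.
def f(x):
    return np.tanh(x)

def g(x):
    return 0.5 * x

def forward(x1, x2):
    # Additive coupling: y1 = x1 + f(x2), y2 = x2 + g(y1)
    y1 = x1 + f(x2)
    y2 = x2 + g(y1)
    return y1, y2

def inverse(y1, y2):
    # Recover the inputs exactly from the outputs --
    # no intermediate activations need to be stored.
    x2 = y2 - g(y1)
    x1 = y1 - f(x2)
    return x1, x2

x1 = np.array([0.3, -1.2])
x2 = np.array([2.0, 0.1])
y1, y2 = forward(x1, x2)
r1, r2 = inverse(y1, y2)
assert np.allclose(r1, x1) and np.allclose(r2, x2)
```

During backpropagation, a framework can run `inverse` layer by layer to regenerate activations on the fly, trading a little extra compute for a large memory saving.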

3

Professional-Ebb4970 t1_isgm99j wrote

I strongly disagree with your first paragraph. There is still a lot of work to be done on lossless compression, and I don't believe we are as close to the Shannon Bound as you seem to imply.

For instance, there are recent methods that use neural networks for lossless compression through a combination of ANS, bits-back coding and VAEs, and they can often achieve much better compression rates than traditional methods. For an example, check this paper: https://arxiv.org/abs/1901.04866
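One way to see why there's still headroom: the Shannon bound is only defined relative to a source model, and a simple order-0 (per-symbol) entropy estimate is a much weaker target than what a good model of the data admits. A quick sketch, comparing the order-0 bound of a byte string against what a standard compressor achieves (the data string here is just an illustrative example):

```python
import math
import zlib
from collections import Counter

def order0_shannon_bound_bits(data: bytes) -> float:
    # Empirical order-0 Shannon bound: n * H(X) bits, where H(X)
    # is the entropy of the single-byte frequency distribution.
    counts = Counter(data)
    n = len(data)
    return -n * sum((c / n) * math.log2(c / n) for c in counts.values())

data = b"abracadabra " * 200
bound_bytes = order0_shannon_bound_bits(data) / 8
compressed = zlib.compress(data, 9)

print(f"order-0 Shannon bound: {bound_bytes:.0f} bytes")
print(f"zlib (DEFLATE):        {len(compressed)} bytes")
```

Because the data is highly repetitive, DEFLATE's match-based modeling lands far below the order-0 bound, which shows that "the Shannon bound" moves as the source model improves; learned models (e.g. VAEs with bits-back coding) push that frontier further on complex data.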

3