Professional-Ebb4970
Professional-Ebb4970 t1_j6knqc2 wrote
Reply to comment by snyckers in Brazil's Ex-President Bolsonaro Seeks 6-Month Visa To Remain In US: Lawyer by loggiews
As a Brazilian, I'd take that tbh. Trump won't win any elections here and Bolsonaro won't win any elections there
Professional-Ebb4970 t1_j6knmmz wrote
Reply to comment by DownvoteEvangelist in Brazil's Ex-President Bolsonaro Seeks 6-Month Visa To Remain In US: Lawyer by loggiews
Indigenous peoples being murdered by the illegal miners and deforesters he supported, and 700k covid deaths, many of which were direct consequences of his covid denial ("it's just a flu"), his refusal to buy vaccines in time ("the hurry for a vaccine isn't justified", "the covid vaccine might give you AIDS") and his outright spreading of misinformation about fake medicines ("I took chloroquine when I had covid and recovered just fine")
Professional-Ebb4970 t1_j6g3lf2 wrote
Reply to comment by Sofi_LoFi in [D] Remote PhD by TheRealMrMatt
Being a full-time commitment and being remote aren't mutually exclusive though
Professional-Ebb4970 t1_j1zvd1b wrote
Reply to [D] Protecting your model in a place where models are not intellectual property? by nexflatline
Did you train using a public dataset and public ML techniques? If so, the model is not your intellectual property regardless of what any country may say
Professional-Ebb4970 t1_ivr476b wrote
Reply to comment by [deleted] in [Discussion] Could someone explain the math behind the number of distinct images that can be generated with a latent diffusion model? by [deleted]
You don't need to use a single seed for the noise patch; you can use fresh random numbers for each sample and it will work just fine
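To illustrate (a minimal sketch, assuming a standard-Gaussian latent as in typical latent diffusion setups; the latent shape here is a made-up example):

```python
import numpy as np

rng = np.random.default_rng()  # no fixed seed needed

def sample_latent(shape=(1, 4, 64, 64)):
    # Each call draws an independent standard-Gaussian "noise patch";
    # every distinct draw leads to a different generated image
    return rng.standard_normal(shape)

z1 = sample_latent()
z2 = sample_latent()
print(z1.shape)                # each latent has the requested shape
print(np.allclose(z1, z2))     # independent draws differ
```

Fixing a seed is only useful when you want to reproduce one specific image; for counting or exploring distinct outputs, independent draws are the normal case.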
Professional-Ebb4970 t1_iux570q wrote
There are Reversible Neural Networks where this is possible. They're used for things such as Normalizing Flows, and even for very large NNs that don't fit in memory: if the layers are invertible, you don't need to save intermediate activations during the forward pass, you can just recompute them in the backward pass.
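As a toy illustration (a hypothetical sketch of an additive coupling layer, the kind of invertible building block used in RealNVP-style flows; `net` here stands in for an arbitrary small network):

```python
import numpy as np

def net(h):
    # Stand-in for a learned sub-network; invertibility of the coupling
    # layer does NOT require this function itself to be invertible
    return 2.0 * np.tanh(h)

def forward(x):
    # Split the input in half; transform one half using the other
    x1, x2 = np.split(x, 2)
    y2 = x2 + net(x1)          # additive coupling
    return np.concatenate([x1, y2])

def inverse(y):
    # Exact inverse: subtract the same term, no stored activations needed
    y1, y2 = np.split(y, 2)
    x2 = y2 - net(y1)
    return np.concatenate([y1, x2])

x = np.random.randn(8)
x_rec = inverse(forward(x))    # recovers x exactly (up to float error)
print(np.allclose(x, x_rec))
```

Because the inverse is exact, a memory-saving backward pass can regenerate each layer's input from its output instead of caching it.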
Professional-Ebb4970 t1_itcyf1z wrote
Reply to comment by mediocregradstudent in [D] What things did you learn in ML theory that are, in practice, different? by 4bedoe
MLP Mixer would like to speak to you
Professional-Ebb4970 t1_isgm99j wrote
I strongly disagree with your first paragraph. There is still a lot of work to be done on lossless compression, and I don't believe we are as close to the Shannon Bound as you seem to imply.
For instance, there are recent methods that use neural networks for lossless compression by combining ANS, bits-back coding and VAEs, and they often achieve much better compression rates than traditional methods. For an example, see this paper: https://arxiv.org/abs/1901.04866
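The key point is that the Shannon bound depends on your model of the source: a simple per-byte (first-order) model gives one bound, but sources with structure have much lower true entropy, which better probabilistic models can exploit. A toy sketch (not the paper's method; just comparing zlib against the first-order entropy of some highly repetitive data):

```python
import math
import zlib
from collections import Counter

data = b"the quick brown fox jumps over the lazy dog " * 200

# First-order Shannon bound: entropy of the empirical byte distribution,
# i.e. the best any compressor can do under an i.i.d.-bytes model
n = len(data)
entropy_bits = -sum(c / n * math.log2(c / n) for c in Counter(data).values())
first_order_bound = entropy_bits * n / 8  # in bytes

compressed = zlib.compress(data, 9)
print(f"raw size:                  {n} bytes")
print(f"first-order entropy bound: {first_order_bound:.0f} bytes")
print(f"zlib output:               {len(compressed)} bytes")
```

On repetitive input like this, zlib lands far *below* the first-order bound, because its match-based model captures structure the i.i.d. model cannot. Richer learned models (e.g. VAEs with bits-back coding) push the same idea further on complex data like images.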
Professional-Ebb4970 t1_irpktsf wrote
Reply to comment by Affectionate_Log999 in [P] Youtube channel for ML - initial feedback and suggestions by Fun_Wolverine8333
Depends on the person; there are probably many people who prefer the general theoretical aspects too
Professional-Ebb4970 t1_j6ld9un wrote
Reply to comment by academiac in Canadian universities have been conducting joint research with Chinese military scientists for years by No-Drawing-6975
Nah, joint research with the military is always 100% evil, regardless of the country of origin.