An informative latent space (with clusters) is antithetical to a generator that samples from N(0,1), which needs a structureless latent space.
sigma should be chosen so that the reconstruction term is of order 1 (and therefore balanced with the rest of the terms). A simple way to do it is: sigma2 = (x - x.mean()).pow(2).mean()
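A plain-NumPy sketch of this normalization (the snippet above is PyTorch; same idea). The useful property: reconstructing with the data mean gives a loss of exactly 1, which is the "order 1" scale being targeted.

```python
import numpy as np

def reconstruction_loss(x, x_rec):
    # Normalize the MSE by the data variance so the term is O(1)
    # regardless of the intensity scale of x.
    sigma2 = ((x - x.mean()) ** 2).mean()
    return ((x - x_rec) ** 2).mean() / sigma2

x = np.array([1.0, 2.0, 3.0, 4.0])
# Predicting the mean everywhere gives a normalized loss of exactly 1.
baseline = reconstruction_loss(x, np.full_like(x, x.mean()))
```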
At that point, all lambda terms can be between 0 and 5
RECONSTRUCTION IS ALWAYS ON. If in range, do not change lambda. If out of range, change lambda up or down. Lambda is clamped to [0.1, 10].
SPARSITY IS ALWAYS ON. If in range, do nothing. If out of range, change lambda up or down. Lambda is clamped to [-10, 10]. The negative part is there to escape the empty solution if necessary.
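A minimal sketch of such a lambda controller, covering both cases above (the names `lo`, `hi`, `step` and the fixed step size are assumptions, not from the notes):

```python
def update_lambda(lam, value, lo, hi, step=0.1, clamp=(0.1, 10.0)):
    # In range: leave lambda unchanged.
    # Out of range: push lambda up or down, then clamp to the allowed interval.
    if value > hi:
        lam += step   # loss term too large -> increase pressure
    elif value < lo:
        lam -= step   # loss term too small -> relax
    return min(max(lam, clamp[0]), clamp[1])
```

For reconstruction use `clamp=(0.1, 10.0)`; for sparsity use `clamp=(-10.0, 10.0)` so lambda can go negative and pull the model out of the empty solution.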
The user should provide a fg_mask, which can be easily obtained by Otsu or other thresholding methods.
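For reference, `skimage.filters.threshold_otsu` does this directly; a self-contained NumPy version of Otsu's method for building the fg_mask looks like this (a sketch; `img` is a hypothetical intensity image):

```python
import numpy as np

def otsu_threshold(img, nbins=256):
    # Choose the threshold that maximizes the between-class variance
    # of the intensity histogram.
    hist, edges = np.histogram(img.ravel(), bins=nbins)
    centers = 0.5 * (edges[:-1] + edges[1:])
    p = hist.astype(float) / hist.sum()
    omega = np.cumsum(p)            # weight of the background class
    mu = np.cumsum(p * centers)     # cumulative mean of the background class
    mu_t = mu[-1]                   # global mean
    denom = omega * (1.0 - omega)
    denom[denom == 0] = np.nan      # ignore degenerate splits at the ends
    sigma_b2 = (mu_t * omega - mu) ** 2 / denom
    return centers[np.nanargmax(sigma_b2)]

# Toy bimodal image: 5 background pixels, 5 foreground pixels.
img = np.array([0.0, 0.0, 0.0, 1.0, 1.0, 10.0, 10.0, 11.0, 11.0, 12.0])
thr = otsu_threshold(img)
fg_mask = img > thr
```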
Overlap immediately pushes the fg_fraction to zero. That makes sense: at the beginning y_k < 0.5, and minimizing y_k(1-y_k) pushes all y_k to zero. Is there any incentive to learn non-overlapping instances (via KL) if there is no overlap term?
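The collapse claim in a two-line check: for y < 0.5 the penalty y(1-y) has derivative 1 - 2y > 0, so gradient descent on it drives y to zero (a sketch with y clamped to [0, 1] and an assumed learning rate):

```python
def overlap_penalty_grad(y):
    # d/dy [y * (1 - y)] = 1 - 2y, which is positive for y < 0.5.
    return 1.0 - 2.0 * y

# Early in training y_k < 0.5 everywhere, so each gradient step
# pushes y_k down until it hits zero: the foreground collapses.
y = 0.2
for _ in range(100):
    y = max(0.0, y - 0.1 * overlap_penalty_grad(y))
```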
I should reintroduce overlap computed in terms of no self-interaction.
READ PAPERS ABOUT HOW THEY DO DYNAMICAL REGULARIZATION
I COULD CROP THE FEATURE MAP AT THE LEVEL OF THE PGRID B/C THE POINT IS THAT THE INTERACTION IS DISCOVERED AT THE COARSER LEVEL (SIMILAR TO MASK R-CNN)
The power of the method would come from:
--> combining dots (like Baysor)
--> graph consensus
BACKGROUND LATENT CODE CAN BE 5x5. That way I can probably describe the spreading I see in MERFISH.
If reconstruction is in range, do nothing. When parameters are in range I should not change them, i.e. change g = min(x - x_min, x_max - x) to g = min(x - x_min, x_max - x).clamp(max=0)
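The proposed fix side by side, as a plain-Python equivalent of the PyTorch expression above (a sketch; `x_min`, `x_max` bound the allowed range). With the clamp, g is exactly zero, and hence gradient-free, anywhere inside the range, instead of rewarding the parameter for sitting at the center:

```python
def constraint_old(x, x_min, x_max):
    # Positive inside the range: the optimizer keeps pushing x
    # toward the center even when x is already acceptable.
    return min(x - x_min, x_max - x)

def constraint_new(x, x_min, x_max):
    # Equivalent of min(x - x_min, x_max - x).clamp(max=0):
    # zero inside the range (no gradient), negative only out of range.
    return min(min(x - x_min, x_max - x), 0.0)
```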
4. sparsity should always be on.
OLD:
3. If reconstruction is high it overcomes the sparsity term and the overlap term -> therefore lambda_rec needs to multiply everything.