
WHAT I AM LEARNING #62

dalessioluca opened this issue Nov 19, 2020 · 0 comments

dalessioluca commented Nov 19, 2020

  1. An informative latent space (with clusters) is antithetical to a generator that consumes N(0,1) samples and therefore needs a structureless latent space.

  2. sigma should be chosen so that the reconstruction term is of order 1 (and therefore balanced against the other terms). A simple way to do it is: sigma2 = (x - x.mean()).pow(2).mean()

  3. At that point, all lambda terms can lie between 0 and 5.

  4. RECONSTRUCTION IS ALWAYS ON. If in range, do not change lambda. If out of range, change lambda up or down. Lambda is clamped to [0.1, 10].

  5. SPARSITY IS ALWAYS ON. If in range, do nothing. If out of range, change lambda up or down. Lambda is clamped to [-10, 10]. The negative part is there to escape the empty solution if necessary.

  6. The user should provide a fg_mask, which can easily be obtained by Otsu or other thresholding methods.
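Points 2, 4, and 5 can be sketched together in plain Python. This is only a sketch of the update rule described above; the function name, target ranges, and step size are hypothetical, not taken from the actual code (which works on PyTorch tensors, with sigma2 = (x - x.mean()).pow(2).mean() keeping the reconstruction term of order 1):

```python
def update_lambda(lam, value, lo, hi, lam_min, lam_max, step=0.1):
    """Constrained update from points 4-5: leave lambda alone while
    `value` sits inside [lo, hi]; otherwise nudge it up (value too
    high) or down (value too low), then clamp to [lam_min, lam_max]."""
    if value > hi:
        lam += step
    elif value < lo:
        lam -= step
    return max(lam_min, min(lam_max, lam))

# Reconstruction lambda (always on), clamped to [0.1, 10]:
lam_rec = update_lambda(1.0, value=1.7, lo=0.5, hi=1.5,
                        lam_min=0.1, lam_max=10.0)

# Sparsity lambda (always on), clamped to [-10, 10]; the negative
# range lets the term push *against* sparsity to escape the empty
# solution:
lam_sparse = update_lambda(-9.99, value=0.05, lo=0.1, hi=0.3,
                           lam_min=-10.0, lam_max=10.0)
```

The clamp on lambda is what keeps either term from silencing the others even when its target stays out of range for many steps.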

Overlap immediately pushes the fg_fraction to zero. That makes sense: at the beginning y_k < 0.5, and minimizing y_k(1-y_k) pushes all y_k to zero. Is there any incentive to learn non-overlapping instances (via the KL) if there is no overlap?
I should reintroduce overlap computed in terms of no self-interaction.
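The collapse described above can be checked directly: for y_k < 0.5 the penalty y(1-y) has positive slope, so a gradient-descent step shrinks y_k toward zero. A minimal sketch in plain Python (function names are hypothetical):

```python
def overlap_penalty(y):
    """The overlap surrogate y*(1-y) that the loss minimizes."""
    return y * (1.0 - y)

def d_penalty(y):
    """Derivative d/dy [y(1-y)] = 1 - 2y: positive for y < 0.5."""
    return 1.0 - 2.0 * y

# Early in training all y_k < 0.5, so the slope is positive and a
# descent step y <- y - lr * d_penalty(y) moves y toward zero:
y, lr = 0.3, 0.1
y_next = y - lr * d_penalty(y)
```

Since every step decreases y while y < 0.5, the empty (all-zero) configuration is a fixed point the model falls into unless another term pushes back.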

  • READ PAPERS ABOUT HOW THEY DO DYNAMICAL REGULARIZATION
  • I COULD CROP THE FEATURE MAP AT THE LEVEL OF THE PGRID, B/C THE POINT IS THAT THE INTERACTION IS DISCOVERED AT THE COARSER LEVEL (SIMILAR TO MASK R-CNN)
  • the power of the method would come from:
    --> combining dots (like Baysor)
    --> graph consensus
  • THE BACKGROUND LATENT CODE CAN BE 5x5. That way I can probably describe the spreading I see in MERFISH.

If reconstruction is in range, do nothing. When parameters are in range I should not change them, i.e. change g = min(x - x_min, x_max - x) to g = min(x - x_min, x_max - x).clamp(max=0).
4. Sparsity should always be on.
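The effect of the clamp can be sketched in plain Python, with scalar min standing in for the elementwise tensor op (function names hypothetical):

```python
def gap(x, x_min, x_max):
    """Signed distance to the nearest bound: positive inside
    [x_min, x_max], negative outside."""
    return min(x - x_min, x_max - x)

def clamped_gap(x, x_min, x_max):
    """gap(...).clamp(max=0): zero when x is in range (so the
    parameter is left untouched) and negative only when x is out
    of range (so the update pushes x back toward the range)."""
    return min(gap(x, x_min, x_max), 0.0)

g_in = clamped_gap(0.5, 0.0, 1.0)    # in range  -> no update signal
g_out = clamped_gap(1.25, 0.0, 1.0)  # out of range -> negative signal
```

Without the clamp, g is positive in range, so the loss would keep pulling in-range parameters toward the center of the interval.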

OLD:
3. If reconstruction is high, it overwhelms the sparsity term and the overlap term -> therefore lambda_rec needs to multiply everything.
