Isodose model generation fails due to non-default FeatureAngle value #227

Open
cpinter opened this issue Feb 13, 2023 · 9 comments


cpinter commented Feb 13, 2023

There is a thermal dose volume (with value range 0-300) for which the Isodose module generates a very incorrect isodose surface (for level value 240).

This is a correct-looking result:
[screenshot: correct-looking isodose surface]

And here is the result yielded by the Isodose module:
[screenshot: incorrect isodose surface from the Isodose module]

The only difference between the two results is that the top one was computed with the default FeatureAngle value (15), and the bottom one with the value set by the Isodose logic (60); see

decimate->SetFeatureAngle(60);

Does anyone (@lassoan @gregsharp) remember why we chose this non-default value? There are no comments in the code. I found the commit, however, where this value was set (by Kevin Wang):
dc07b61
The commit message was: "Fixed the isodose color in 2d viewer. It probably has something to do with the scalar and normals computation. Once I modified the pipeline to generate the polydata, it works fine now", so it was not clear to him either, but there were 2D display issues before this change. That said, it was more than ten years ago...

If there is no good reason, I'd remove this non-default feature angle setting. Thank you.
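For reference, here is a minimal sketch of the marching cubes + decimation stage being discussed (not the actual SlicerRT code; the function and variable names are hypothetical, and only the 0.6 and 60 values come from this thread):

```cpp
#include <vtkSmartPointer.h>
#include <vtkImageData.h>
#include <vtkMarchingCubes.h>
#include <vtkDecimatePro.h>
#include <vtkPolyData.h>

// Extract and decimate an isodose surface at the given level (e.g. 240 for this volume).
vtkSmartPointer<vtkPolyData> ExtractIsodoseSurface(vtkImageData* doseVolume, double isoLevel)
{
  vtkSmartPointer<vtkMarchingCubes> marchingCubes = vtkSmartPointer<vtkMarchingCubes>::New();
  marchingCubes->SetInputData(doseVolume);
  marchingCubes->SetValue(0, isoLevel);

  vtkSmartPointer<vtkDecimatePro> decimate = vtkSmartPointer<vtkDecimatePro>::New();
  decimate->SetInputConnection(marchingCubes->GetOutputPort());
  decimate->SetTargetReduction(0.6); // value the logic currently uses (mentioned later in this thread)
  decimate->SetFeatureAngle(60);     // non-default; the vtkDecimatePro default is 15
  // decimate->SetFeatureAngle(15);  // reverting to the default fixes this case

  decimate->Update();
  return decimate->GetOutput();
}
```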

@cpinter cpinter added the bug label Feb 13, 2023
@cpinter cpinter added this to the SlicerRT 1.1 milestone Feb 13, 2023

lassoan commented Feb 13, 2023

Feature angle of 60 deg should not cause much trouble.

Setting MaximumError to 1 looks suspicious. Is it 1 mm or 100% of the bounding box? Either can be quite bad.

Setting the number of smoothing iterations to 2 does not make much sense either. Smoothing is an iterative method and we cannot expect much to happen in 2 iterations.
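As a quick sanity check, the VTK defaults for the parameters in question can be printed like this (a standalone sketch, not SlicerRT code; the smoothing filter class used here is an assumption):

```cpp
#include <vtkNew.h>
#include <vtkDecimatePro.h>
#include <vtkSmoothPolyDataFilter.h>
#include <iostream>

int main()
{
  vtkNew<vtkDecimatePro> decimate;
  vtkNew<vtkSmoothPolyDataFilter> smoother;

  // Expected defaults per this thread: FeatureAngle 15, MaximumError effectively
  // unlimited, 20 smoothing iterations.
  std::cout << "FeatureAngle default:         " << decimate->GetFeatureAngle() << "\n"
            << "MaximumError default:         " << decimate->GetMaximumError() << "\n"
            << "Smoothing iterations default: " << smoother->GetNumberOfIterations() << "\n";
  return 0;
}
```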


cpinter commented Feb 13, 2023

Feature angle of 60 deg should not cause much trouble

In this case, however, changing this single property fixes the problem (I can confirm this). Do you suggest experimenting with other properties to try to fix this issue and leaving the feature angle as is?


lassoan commented Feb 13, 2023

It might be just a matter of luck that changing the feature angle fixes the issue in this case. The number of iterations and maximum error values look more concerning to me.

The safest would be to reduce decimation and smoothing. Maybe remove decimation completely, or even better, replace it with uniform remeshing (unfortunately ACVD is not in VTK yet). Then test it on many cases to verify that the result is smooth and still accurate.

@lassoan lassoan closed this as completed Feb 13, 2023
@lassoan lassoan reopened this Feb 13, 2023

cpinter commented Feb 13, 2023

The safest would be to reduce decimation and smoothing

Yes, I agree with this.

  • MaximumError: the default value is 1e+299; reverting to it does not change this particular output.
  • NumberOfIterations: the default is 20; using it shrinks the object considerably, but there is no change in the "cutoff" that is visible with feature angle 60. Disabling smoothing only removes the reasonable-looking smoothing that was applied.
  • TargetReduction: reducing this from 0.6 to 0.4 or 0.3 also fixes the problem (the output looks much more detailed with 0.3).

We could just change TargetReduction to 0.4 or 0.3; that seems to be a safe option if you'd rather not touch FeatureAngle.
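A minimal sketch of that change (the function name is hypothetical; only the TargetReduction values come from this thread):

```cpp
#include <vtkDecimatePro.h>

// Leave FeatureAngle at its current value (60) and only relax the decimation target.
void ConfigureIsodoseDecimation(vtkDecimatePro* decimate)
{
  decimate->SetTargetReduction(0.4); // was 0.6; 0.3 gives an even more detailed surface
}
```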


lassoan commented Feb 13, 2023

My main concern is that the processing is unstable if changing the feature angle causes such a huge difference. Decreasing the feature angle does not really stabilize the processing. Decreasing the target reduction should stabilize the process, so if that fixes the issue then it is a better solution.

What is very strange is that for continuous images, marching cubes should provide a fairly smooth output. What is the scalar type of the dose volume and what are the isovalues? I'm also not sure why the vtkImageReslice is needed and what resolution is used (fixed isotropic 1 mm? what if the volume is small?).


cpinter commented Feb 13, 2023

The dose volume is of float type, its scalar range is 0 to 300, and the level used here was 240.


cpinter commented Feb 13, 2023

I think the image reslice is used to apply the volume node's geometry.


lassoan commented Feb 13, 2023

OK, so it seems that image resampling is done to achieve uniform remeshing. But then we don't need decimation at all; we should just use an output image spacing that matches the desired resolution. After this resampling we only need marching cubes and maybe a little smoothing (but if we used bicubic interpolation for the image resampling and the spacing was right, then we probably don't even need smoothing). Normal computation should be enabled in marching cubes to have a nice, smooth-looking surface in 3D.
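Something along these lines, as a hedged sketch of the proposed simplified pipeline (the spacing parameter, the names, and the choice to drop decimation and smoothing entirely are assumptions, not the current SlicerRT implementation):

```cpp
#include <vtkSmartPointer.h>
#include <vtkImageData.h>
#include <vtkImageReslice.h>
#include <vtkMarchingCubes.h>
#include <vtkPolyData.h>

vtkSmartPointer<vtkPolyData> CreateIsodoseSurface(vtkImageData* doseVolume, double isoLevel, double spacingMm)
{
  // Resampling doubles as uniform remeshing: the output spacing sets the mesh resolution.
  vtkSmartPointer<vtkImageReslice> reslice = vtkSmartPointer<vtkImageReslice>::New();
  reslice->SetInputData(doseVolume);
  reslice->SetInterpolationModeToCubic();
  reslice->SetOutputSpacing(spacingMm, spacingMm, spacingMm);

  // Marching cubes with normal computation enabled gives a smooth-looking surface in 3D.
  vtkSmartPointer<vtkMarchingCubes> marchingCubes = vtkSmartPointer<vtkMarchingCubes>::New();
  marchingCubes->SetInputConnection(reslice->GetOutputPort());
  marchingCubes->SetValue(0, isoLevel); // e.g. 240 for the thermal dose volume in this issue
  marchingCubes->ComputeNormalsOn();
  marchingCubes->ComputeGradientsOff();
  marchingCubes->ComputeScalarsOff();

  marchingCubes->Update();
  return marchingCubes->GetOutput();
}
```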


cpinter commented Feb 24, 2023

Do you think it would be enough if I just removed the decimation step? Or, if we want to keep it as an option, I can add a member to the parameter node.
