HyperspyUI for 4D STEM #268
@magnunor Any thoughts as well? I've increasingly been having problems getting people to use pyxem/hyperspy, and there is a fairly strong push to do processing on the computer next to the microscope.
Would it be better to handle this in hyperspy?
Could it be done by adding a docked widget which allows the user to define it manually or automatically?
Do you mean adding a plugin to support a specific workflow (if I infer correctly: handling several signals, one being a vector signal?)? Otherwise, I guess this should all be handled in hyperspy/pyxem.
You should already be able to run everything that you do in pyxem within HyperspyUI by using the embedded qtconsole or a script.
In both cases, the dependent process needs to connect to the already running Jupyter kernel when it starts.
It should be possible to do that within HyperSpyUI because we should have good control of the event loop. In the IPython qtconsole or Jupyter Lab/Notebook, we may not have that much control.
xref #129
Yes, I think generally the navigation image should be reused, especially for lazy signals.
That's a really good idea. Although we have to be better about not freezing the application when someone tries to compute when they shouldn't.
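A rough sketch of how to keep the UI responsive (plain Python with hypothetical names, not the actual HyperspyUI code): offload the `compute()` call to a worker thread so the GUI event loop is never blocked.

```python
from concurrent.futures import ThreadPoolExecutor

# Single worker so heavy compute() calls queue up instead of piling on.
_executor = ThreadPoolExecutor(max_workers=1)

def compute_async(signal, on_done):
    """Run signal.compute() off the GUI thread and invoke a callback.

    In a real Qt application, on_done would be marshalled back to the
    GUI thread (e.g. via a queued signal) rather than called directly.
    """
    def _work():
        on_done(signal.compute())
    return _executor.submit(_work)

class FakeLazySignal:
    """Stand-in for a lazy HyperSpy signal (hypothetical)."""
    def compute(self):
        return sum(range(1000))

results = []
future = compute_async(FakeLazySignal(), results.append)
future.result()  # blocking only in this demo; a GUI would not wait here
print(results[0])  # 499500
```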
Ultimately I would like to be able to do template matching or strain mapping etc. The problem with operations that return vectors is how to display them so that there is a selectable signal object. We can now just plot them over a signal.
Yeah, but it might be nice to have the ability to right-click and apply any function that the signal supports.
Hmm I might look into how to do this!
Another thing to look into, I guess.
It should be possible and there is functionality for this, see for example `hyperspyui/uiprogressbar.py`, lines 60 to 70 (at fce8b44).
It may be rough around the edges; quite possibly some code needs to be updated here to take into account changes in hyperspy. Any operation that is long enough should have a progress bar, for example in the case of decomposition: `hyperspyui/plugins/mva.py`, line 255 (at fce8b44).
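As a rough illustration of the mechanism (a sketch, not the actual uiprogressbar API): a tqdm-like wrapper that forwards progress to a callback, which a UI could connect to a progress bar widget instead of printing to the console.

```python
class ProgressReporter:
    """Minimal tqdm-like iterable wrapper that reports progress via a
    callback (hypothetical names; a sketch of the hook, not hyperspyui's
    actual implementation)."""

    def __init__(self, iterable, total=None, on_progress=None):
        self._iterable = iterable
        self.total = total if total is not None else len(iterable)
        self.n = 0
        self._on_progress = on_progress or (lambda n, total: None)

    def __iter__(self):
        for item in self._iterable:
            yield item
            self.n += 1
            # A GUI would update a QProgressBar here instead.
            self._on_progress(self.n, self.total)

updates = []
for _ in ProgressReporter(range(5), on_progress=lambda n, t: updates.append((n, t))):
    pass
print(updates[-1])  # (5, 5)
```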
Just some other thoughts related to this: @ericpre I'm curious about the "tree": 4D STEM Signal --> Centered 4D STEM Signal --> Polar 4D STEM Signal --> Orientation Match. In this case, it would be nice to represent this entire workflow as a single signal, with the ability to switch between viewing each of the different "steps".
Just curious if this is worth doing? If I were starting fresh for 4D STEM, the way I would tackle this is:
This could be done with some widgets (traitsui and ipywidgets, so that they can be used in HyperspyUI but also in a notebook), but these would be different signals, maybe sharing the same navigator for convenience? I don't know if typing will be enough to generate widgets automatically. It may be worth checking what/how napari is doing. For interactive plotting on lazy signals, it may be worth revisiting caching in dask, as you already mentioned in hyperspy/hyperspy#3326. ;)
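To illustrate why caching helps interactive plotting of lazy signals: interactive browsing revisits the same navigation positions repeatedly, so a cache turns repeated chunk computations into memory lookups. Dask's opportunistic cache (`dask.cache.Cache`) is the real mechanism; as a stdlib-only analogy (hypothetical function names):

```python
from functools import lru_cache

calls = []  # record how many times real work is done

@lru_cache(maxsize=128)
def process_chunk(chunk_id):
    """Stand-in for an expensive per-chunk computation (hypothetical)."""
    calls.append(chunk_id)
    return chunk_id * chunk_id  # pretend heavy transform

# First pass over some navigation positions does the work...
first = [process_chunk(i) for i in (0, 1, 2)]
# ...revisiting the same positions is served from the cache.
second = [process_chunk(i) for i in (0, 1, 2)]

print(first == second, len(calls))  # True 3
```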
@ericpre I was wondering what changes might be necessary to properly handle lazy datasets/4D STEM.
I think with the lazy plotting and annotations this might be a little easier/smoother than when it was first tried.
General Idea:
1. The navigator should be reused for all transformations of a lazy signal using the `map` function, so the navigator is not repeatedly computed.
2. Any map-reduce functions should only compute when asked to (i.e. virtual imaging).
3. Vector signals are just displayed over the signal.
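Point 1 could be sketched like this (plain Python, all names hypothetical): compute the navigation image once, then hand the same cached result to every transformed signal.

```python
class SharedNavigator:
    """Compute the navigation image once and reuse it across derived
    signals (a sketch of the idea, not a real HyperSpy API)."""

    def __init__(self, compute_fn):
        self._compute_fn = compute_fn
        self._cache = None
        self.computations = 0  # track how often real work happens

    def get(self):
        if self._cache is None:
            self.computations += 1
            self._cache = self._compute_fn()
        return self._cache

def expensive_navigator():
    # Stand-in for e.g. summing the diffraction pattern at every pixel.
    return [[1, 2], [3, 4]]

nav = SharedNavigator(expensive_navigator)

# Every transformed signal (centered, polar, ...) reuses the same navigator.
centered_nav = nav.get()
polar_nav = nav.get()
print(nav.computations)  # 1
```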
This should allow us to port most of the functions in pyxem into HyperspyUI and get pretty good functionality there.
One thing that might be nice is to have the UI run a separate event loop and have dask handle all of the parallelization etc. I'm not sure that is the case at the moment, but with lazy signals you tend to get a lot of spinning wheels.