the user should be able to
- click snap button
- view image from the microscope
- view objects in the image
- view object information (xy coordinate of the object)
we would have to figure out
- how we can create (using magicgui) a snap button that calls snap() in the backend, waits for the image to arrive over the hard-link, and displays it as the layers [image, segmented_object] in napari
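The workflow above could be sketched roughly as follows. This is only an illustration under stated assumptions: `snap()` is a hypothetical stand-in for the real backend call over the hard-link, and a simple mean-intensity threshold stands in for the actual segmentation; the napari/magicgui wiring uses their documented `Viewer.add_image`, `Viewer.add_labels`, and `@magicgui(call_button=...)` APIs.

```python
# Sketch of the snap-button idea. snap() and segment() are placeholders
# for the real backend and object detector, not the actual implementation.
import numpy as np


def snap():
    # stand-in for the real microscope call over the hard-link
    return np.random.default_rng(0).random((512, 512))


def segment(image):
    # placeholder segmentation: mark pixels above the mean intensity
    return (image > image.mean()).astype(np.uint8)


def object_coordinates(labels):
    # xy coordinates of segmented pixels, for the object-information view
    return np.argwhere(labels > 0)


if __name__ == "__main__":
    # GUI part: wire the button up with magicgui and show both layers
    import napari
    from magicgui import magicgui

    viewer = napari.Viewer()

    @magicgui(call_button="Snap")
    def snap_widget():
        image = snap()
        labels = segment(image)
        viewer.add_image(image, name="image")
        viewer.add_labels(labels, name="segmented_object")

    viewer.window.add_dock_widget(snap_widget)
    napari.run()
```

The non-GUI pieces (`snap`, `segment`, `object_coordinates`) are kept importable on their own so the backend logic can be tested without opening a napari window.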
references
start here: https://napari.org/tutorials/
and here: https://github.com/napari/magicgui
maybe as a plugin?: https://napari.org/docs/dev/plugins/for_plugin_developers.html
example application which is built on top of napari: https://github.com/quantumjot/BayesianTracker#Usage-with-Napari