
Arkitekt integration #31

Open
henrypinkard opened this issue Aug 27, 2024 · 5 comments

Comments

@henrypinkard
Member

@jhnnsrs I've written up the documentation with what I think you need to get started:

When you have a chance to take a look, let me know if you'd like to talk more or want additional guidance.

@jhnnsrs

jhnnsrs commented Aug 29, 2024

@henrypinkard super cool! Will have a look at it once I'm back. Are there any plans for a testing backend that relies on neither Micro-Manager nor ImSwitch? Just wondering because right now my Windows access is a little bit limited hahaha

@henrypinkard
Member Author

My hope is that someone who has been working on one of the virtual microscopes may put it in as a backend, but I don't know when that might happen.

Are you on Linux only? You can definitely use the MM demo on Mac, you just may need to make a conda env with this if you have Apple silicon.

@beniroquai

beniroquai commented Aug 29, 2024 via email

@jhnnsrs

jhnnsrs commented Sep 10, 2024

Hi everybody!

I finally have the time to write some of my thoughts down (got to play around with it the week before, but had the death week in between). Generally I really like the design, and the documentation is straightforward. Great great job! I have made mental notes of where I found some hurdles in using it, both when using it vanilla and with an asynchronous interface layered on top. Some additional examples would be really great!

A) What would be the design pattern for a LiveAcquisition, i.e. an acquisition where we would like to stream indefinitely?
Should this happen outside of the acquisition engine, or how would one send the interrupts?

- ArmEvent
- StartEvent
- ????
- StopEvent
- DisArmEvent

Would there be an IndefiniteEvent in between that would then be canceled by the governing context? How would we make
sure that the Stop and DisarmEvent are fired if an exception is thrown?

B) Likewise, should there be a way of inserting events into the acquisition queue? I am imagining a "Software Auto Focus" that would set the Z stage in between. Could that be handled by a device that receives an "AutoFocusEvent", takes the last notification from the stack, runs the autofocus, and puts in a "SetZPosition" event? Or would that again happen outside the context?

C) I am not yet sure I got the concept of the threaded queues: would that mean that different devices could declare their own threads (e.g. Camera vs. Stage) and get some parallelism? E.g. camera acquisition of a frame while we sweep in z? If so, how would we ensure synchronicity here? Also, what if we don't want this?

D) Same for the event and its threadability. How does that relate to the device thread? If I send an event to a different thread than the executing device, what happens?

E) I really like the abstraction of auto-sending events when you set parameters on the camera object directly (https://github.com/micro-manager/ExEngine/blob/main/src/exengine/examples/using_devices.py); however, it is currently a bit too magic for me and might lead to unexpected bugs later. Maybe abstracting this through a proxy object, which would mimic the base class but allow for the same style, could be an option, e.g.:

engine = ExecutionEngine()

t = Camera()  # the real object
x = proxy(t, engine)
# or
x = t.proxy(engine)

x.exposure = 300
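
To make that concrete, here is a rough sketch of how such a proxy could work in plain Python (SetPropertyEvent and the submit call are placeholders, not the real ExEngine API):

from dataclasses import dataclass
from typing import Any

@dataclass
class SetPropertyEvent:
    # placeholder event type, not the real ExEngine event class
    device: Any
    name: str
    value: Any

class DeviceProxy:
    """Wraps a device so that attribute writes become events submitted to the engine."""

    def __init__(self, device, engine):
        # bypass our own __setattr__ for the proxy's internal fields
        object.__setattr__(self, "_device", device)
        object.__setattr__(self, "_engine", engine)

    def __getattr__(self, name):
        # reads fall through to the wrapped device
        return getattr(self._device, name)

    def __setattr__(self, name, value):
        # writes are made explicit: submit an event instead of mutating directly
        self._engine.submit(SetPropertyEvent(self._device, name, value))

# usage: camera_proxy = DeviceProxy(camera, engine); camera_proxy.exposure = 300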

F) Again super cool stuff! :)

These are some preliminary thoughts on the matter; happy to go deeper into it in a meeting. :)

Best

Johannes

@henrypinkard
Member Author

henrypinkard commented Sep 13, 2024

Thanks!

A) What would be the design pattern for a LiveAcquisition, i.e. an acquisition where we would like to stream indefinitely?
Should this happen outside of the acquisition engine, or how would one send the interrupts?

Good question. I was able to implement this in a couple of different ways; I hadn't yet decided which would be the most intuitive and thus the officially documented one. The file I was playing with was here. I think I changed some argument calls and names since making these, but you can see the concepts.

In the first version, ReadoutImages runs indefinitely and asynchronously until you stop it from the user-code thread:

import itertools
import time

# camera, executor, and data_handler are assumed to be set up as in the ExEngine examples
start_capture_event = StartContinuousCapture(camera=camera)
readout_images_event = ReadoutImages(camera=camera,
                                     image_coordinate_iterator=(DataCoordinates(time=t) for t in itertools.count()),
                                     data_handler=data_handler)
stop_capture_event = StopCapture(camera=camera)

# ReadoutImages has no fixed image count, so it keeps running until explicitly stopped below
_, readout_future, _ = executor.submit([start_capture_event, readout_images_event, stop_capture_event])
time.sleep(2)
readout_future.stop(await_completion=True)

In version 2, you put in a delay and have it stop automatically, while continuously reading out images from another thread:

import itertools

start_capture_event = StartContinuousCapture(camera=camera)
readout_images_event = ReadoutImages(camera=camera, num_images=10,
                                     image_coordinate_iterator=(DataCoordinates(time=t) for t in itertools.count()),
                                     data_handler=data_handler)
stop_capture_event = StopCapture(camera=camera)
sleep_event = Sleep(time_s=2)

# The main sequence sleeps and then stops capture automatically...
_, _, _ = executor.submit([start_capture_event, sleep_event, stop_capture_event])
# ...while the readout runs concurrently on its own thread
executor.submit(readout_images_event, use_free_thread=True)

Any preference for either of these?

B) Likewise, should there be a way of inserting events into the acquisition queue? I am imagining a "Software Auto Focus" that would set the Z stage in between. Could that be handled by a device that receives an "AutoFocusEvent", takes the last notification from the stack, runs the autofocus, and puts in a "SetZPosition" event? Or would that again happen outside the context?

That was the idea behind the prioritize argument:

prioritize : bool, optional (default=False)
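
For example, something like this sketch (SetZPosition and the callback are hypothetical names, and it assumes prioritize is passed to submit as in its docstring):

# Hypothetical sketch: when a software autofocus finishes, push the resulting
# Z move to the front of the queue. SetZPosition is a made-up event name here.
def on_autofocus_done(z_stage, best_z):
    set_z_event = SetZPosition(z_stage=z_stage, position=best_z)
    # prioritize=True (assumed submit() parameter) runs this event
    # ahead of anything already waiting in the queue
    executor.submit(set_z_event, prioritize=True)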

C) I am not yet sure I got the concept of the threaded queues: would that mean that different devices could declare their own threads (e.g. Camera vs. Stage) and get some parallelism? E.g. camera acquisition of a frame while we sweep in z? If so, how would we ensure synchronicity here? Also, what if we don't want this?

By default, everything happens on one thread, though there are many ways of customizing this. Synchronization really depends on the devices in question; a lot of the time it may be handled by hardware. If you need to do software synchronization, you use futures etc. But this is why the default is single-threaded: potentially less performant, but you don't have to think about it unless you want to. (I'd also add that Micro-Manager/Pycro-Manager are mostly single-threaded acquisition systems, so this works for many cases.)
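
Concretely, reusing the events from the examples above (a sketch; the only assumption beyond that code is that submit returns the future directly when given a single event):

# Default: events submitted together run one after another on the engine's single thread
executor.submit([start_capture_event, sleep_event, stop_capture_event])

# Opt-in parallelism: run a specific event on its own thread.
# Coordinating it with the main sequence is then up to hardware triggering
# or to waiting on the returned future.
readout_future = executor.submit(readout_images_event, use_free_thread=True)
readout_future.stop(await_completion=True)  # block until the readout has actually stopped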

D) Same for the event and its threadability. How does that relate to the device thread? If I send an event to a different thread than the executing device, what happens?

There's a hierarchy of whose threading preferences get taken into account first, as described here.

E) I really like the abstraction of auto-sending events when you set parameters on the camera object directly (https://github.com/micro-manager/ExEngine/blob/main/src/exengine/examples/using_devices.py); however, it is currently a bit too magic for me and might lead to unexpected bugs later. Maybe abstracting this through a proxy object, which would mimic the base class but allow for the same style, could be an option.

Is there something more specific you're thinking of here? There's a way to bypass the auto-rerouting of events to executor threads using the no_executor attr on a per-device or per-method level. I could definitely expand this capability if there are more uncovered use cases.

F) Again super cool stuff! :)
These are some preliminary thoughts on the matter, happy to go deeper into in a meeting. :)

Thanks! I'm happy to talk more. At this point I think it's hard to do much more development in the abstract without end-user testing on specific applications.
