
Support for filterwheels on Pyro cameras, general refactoring #170

Merged
AnthonyHorton merged 18 commits into AstroHuntsman:develop from pyro-filterwheel
Apr 16, 2020

Conversation

@AnthonyHorton (Member) commented Feb 16, 2020

Adds a filterwheel interface to the distributed camera system, following largely the same approach as the focusers.

I'm doing something new with remote access to Events. Would appreciate some feedback on this.

Edit: Scope expanded, just a little. Now:

  • Adds filterwheel interface to the distributed camera system
  • Introduces a RemoteEvent class, providing an interface to threading.Events used by the remote cameras and their subcomponents.
  • Switches all asynchronous operations (take_exposure, autofocus & filterwheel moves) to use the remote events instead of using Pyro's asynchronous results mechanism.
  • No longer spawning local threads (async_wait) for remote asynchronous operations, allowing remote exceptions to be re-raised locally in the main thread.
  • Removes now redundant parameter/status checks from local take_exposure and autofocus methods because the same checks on the remote camera are now able to raise exceptions locally.
  • Added backup local timeouts for remote take_exposure and autofocus to make it less likely calling code ends up waiting forever for a failed operation.
  • Replaces large number of remote attribute getter/setter methods on the CameraServer with generic get() and set() methods.
  • Removes unused and non-functional key parameter from camera server startup script and function.
  • Adds fall back to 'localhost' to load_device_config to simplify running a camera server locally for testing, no longer necessary to override the default config files.
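The RemoteEvent idea in the second bullet can be sketched roughly as follows. This is a minimal illustration, not the PR's actual implementation; in particular the proxy method names event_is_set and event_wait are assumptions made for the example.

```python
# Minimal sketch of the RemoteEvent idea: a client-side stand-in for a
# threading.Event that lives on the camera server, forwarding calls through
# a Pyro proxy. The proxy method names (event_is_set, event_wait) are
# assumptions for illustration, not the PR's exact API.

class RemoteEvent:
    def __init__(self, proxy, event_type):
        self._proxy = proxy            # Pyro4 proxy for the CameraServer
        self._event_type = event_type  # e.g. "camera", "focuser", "filterwheel"

    def is_set(self):
        return self._proxy.event_is_set(self._event_type)

    def wait(self, timeout=None):
        # Matches threading.Event.wait() semantics: blocks until the remote
        # event is set or the timeout expires, returning the event state.
        return self._proxy.event_wait(self._event_type, timeout)
```

On the client side, the Pyro Camera/Focuser/FilterWheel wrappers can then treat a remote event exactly like a local threading.Event.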

@AnthonyHorton AnthonyHorton added the enhancement, camera and pyro (Related to Pyro4 based distributed cameras) labels Feb 16, 2020
@AnthonyHorton AnthonyHorton self-assigned this Feb 16, 2020
@wtgee (Collaborator) left a comment

LGTM. I like the class based event access although I'm curious what advantages you are hoping it gives. Seems slightly more organized to me but I'm not sure if that was the motivation.

Comment on lines 229 to 230
get_quantity_value(seconds, u.s) +
self.readout_time + self._timeout))
Collaborator

Maybe break this out into a variable beforehand just so it's explicit: total_exptime = ... + ...

Also makes me wonder if we shouldn't just have a property or method that returns that value (different PR).

huntsman/camera/pyro.py (outdated; resolved)
huntsman/filterwheel/pyro.py (resolved)
huntsman/filterwheel/pyro.py (resolved)
return self._camera.filterwheel._move_event.is_set()

def filterwheel_event_wait(self, timeout=None):
return self._camera.filterwheel._move_event.wait(timeout)
Collaborator

Suggested change (add a trailing newline after the last line):
return self._camera.filterwheel._move_event.wait(timeout)

End with a blank line.

@AnthonyHorton (Member, Author)
LGTM. I like the class based event access although I'm curious what advantages you are hoping it gives. Seems slightly more organized to me but I'm not sure if that was the motivation.

A few thoughts motivated this. The alternative approach, which I used for the remote camera take_exposure and autofocus methods, was to force the calls to the remote camera to be blocking, and use the Pyro async results machinery plus an _async_wait() method to poll from the client side and set an Event there. When faced with the prospect of doing the same for the filterwheel moves it occurred to me that there was a bunch of probably unnecessary duplication going on. The camera, focuser & filterwheel classes already have their own polling threads for setting Events when operations complete, and all I needed in order to make use of those was a way to access those remote Events.

Worse, by making the remote calls blocking I was preventing the CameraServer from responding to any other calls (e.g. status checks) while an operation was in progress. This would have effects such as making it harder to check whether a filterwheel move was already in progress when requesting another one (a call to is_moving() on the server would not return until any current move had completed).

@codecov codecov bot commented Apr 15, 2020

Codecov Report

Merging #170 into develop will decrease coverage by 0.18%.
The diff coverage is 79.16%.


@@             Coverage Diff             @@
##           develop     #170      +/-   ##
===========================================
- Coverage    61.08%   60.90%   -0.19%     
===========================================
  Files           31       33       +2     
  Lines         1344     1325      -19     
  Branches       161      161              
===========================================
- Hits           821      807      -14     
+ Misses         457      453       -4     
+ Partials        66       65       -1     
Impacted Files Coverage Δ
huntsman/utils/config.py 62.76% <23.52%> (-7.97%) ⬇️
huntsman/focuser/pyro.py 72.85% <46.66%> (-1.77%) ⬇️
huntsman/utils/pyro/event.py 87.50% <87.50%> (ø)
huntsman/camera/pyro.py 86.52% <92.22%> (+4.09%) ⬆️
huntsman/filterwheel/pyro.py 93.33% <93.33%> (ø)
huntsman/utils/pyro/camera_server.py 80.00% <100.00%> (ø)
huntsman/utils/pyro/serializers.py 83.78% <100.00%> (+3.13%) ⬆️
huntsman/utils/sshfs.py 55.55% <0.00%> (-1.86%) ⬇️

Continue to review full report at Codecov.

Legend:
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update b96c8a4...b6d9ad3. Read the comment docs.

@AnthonyHorton AnthonyHorton marked this pull request as ready for review April 15, 2020 15:00
@AnthonyHorton AnthonyHorton requested review from wtgee and fergusL April 15, 2020 15:01
@AnthonyHorton (Member, Author) commented Apr 15, 2020

Scope on this one expanded rather a lot as I attempted to fix some troublesome bugs. Now remote Events are used for all asynchronous operations (exposures, autofocus, filterwheel moves) and the Pyro asynchronous results machinery has been entirely removed. This has improved functionality, with a larger percentage of possible errors generating the same exceptions that they would do for a local camera. I've also made access to remote camera and subcomponent attributes and events generic, which has significantly reduced the amount of code in the CameraServer class and made it easier and cleaner to poke around inside the remote camera to artificially induce errors for unit tests.

@wtgee (Collaborator) left a comment

This looks good; my only concern is that with the move to accessing a lot of the properties on the proxy via get()/set(), you are now using strings instead of attributes (or method names), which I think could be more error-prone. I think it could also be cleaned up via some clever currying or use of functools.partialmethod, but that would require more thought so should be left for a separate PR.
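For reference, the functools.partialmethod idea might look something like this. It is purely illustrative (the CameraServerClient name and its attributes are invented for the example): named accessors are generated from the generic get(), so callers use methods instead of raw strings.

```python
from functools import partialmethod

class CameraServerClient:
    """Illustrative sketch only: generate named proxy accessors from a
    generic get() using functools.partialmethod."""

    def __init__(self, values):
        self._values = values  # stand-in for the remote camera's attributes

    def get(self, property_name):
        return self._values[property_name]

    # Named accessors derived from the generic get(); calling
    # client.get_readout_time() is equivalent to client.get("readout_time").
    get_readout_time = partialmethod(get, "readout_time")
    get_is_exposing = partialmethod(get, "is_exposing")
```

This keeps the attribute-name strings in one place while giving callers ordinary, typo-resistant method names.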

Can you also update PR description with included changes?

Comment on lines 25 to 26
run_camera_server(ignore_local=args.ignore_local,
key=args.key)
Collaborator

This is fine but it kind of looks like your auto-formatter is set at 80 chars.

Member Author

Reverted, and more, because I've taken the key parameter out.

I do have an issue with my linter plugin though. Whatever I do I can't get it to accept a custom line length, which means it flags issues all over POCS and huntsman-pocs code, making it hard to spot real issues.

Comment on lines +76 to +78
# These two are only here as a temporary workaround for some typos in POCS
SerializerBase.register_class_to_dict(ValueError, value_error_to_dict)
SerializerBase.register_dict_to_class("ValueError", dict_to_value_error)
Collaborator

I don't understand the comment and associated code. What are the typos in POCS?

Member Author

In POCS there are a few instances of

raise ValueError("Blah, blah, blah, {}!", obj)

where it should be

raise ValueError("Blah, blah, blah, {}!".format(obj))

or

raise ValueError(f"Blah, blah, blah, {obj}!")

See, e.g.,

https://github.com/panoptes/POCS/blob/22532dab81745aa5b805d809b4c494bd5495bc39/pocs/focuser/focuser.py#L242 and https://github.com/panoptes/POCS/blob/22532dab81745aa5b805d809b4c494bd5495bc39/pocs/focuser/focuser.py#L249

These don't cause problems with local cameras (beyond funny-looking exceptions) but they were causing test failures with the remote cameras, because Pyro was trying to serialise ValueErrors that contained, in their args list, a reference to an object that it couldn't serialise. It took me a while to track down the root cause of the opaque 'circular reference' exceptions that Pyro was throwing as a result! My workaround to get this PR working was to add a custom serialiser/deserialiser for ValueError that simply casts all of the args to str, thereby making sure that Pyro can handle them. If we fix this in POCS then I can take this out again.
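The workaround reads roughly like this. It is a sketch: the converter names match the registration calls quoted above, but the function bodies are an assumed reconstruction, and the Pyro4 registration lines are shown in comments so the sketch stands alone.

```python
# Sketch of the ValueError serialisation workaround described above. The
# converters cast every arg to str so Pyro's serializer can always handle
# them, even when an arg is an unserialisable object.

def value_error_to_dict(obj):
    # "__class__" is the key Pyro4's SerializerBase uses to identify the type.
    return {"__class__": "ValueError",
            "args": [str(arg) for arg in obj.args]}

def dict_to_value_error(classname, d):
    return ValueError(*d["args"])

# Registration with Pyro4 (requires Pyro4; shown for completeness):
# from Pyro4.util import SerializerBase
# SerializerBase.register_class_to_dict(ValueError, value_error_to_dict)
# SerializerBase.register_dict_to_class("ValueError", dict_to_value_error)
```

With this in place a ValueError whose args include an arbitrary object round-trips as plain strings instead of triggering the 'circular reference' failure.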

huntsman/utils/pyro/event.py (outdated; resolved)
@@ -9,7 +9,10 @@
from huntsman.camera.pyro import CameraServer


def run_camera_server(ignore_local=False, unmount_sshfs=True, logger=None,
def run_camera_server(ignore_local=False,
key=None,
Collaborator

I assumed this was an ssh key but it looks like it is a config key item? Maybe be more explicit and add it to the docstring.

Collaborator

Or maybe just pass it through via kwargs if run_camera_server itself doesn't need it. Although that hides it a bit.

Member Author

Yes, a config key. This parameter got added along with the config server stuff, but when I tried to actually use it to run a local camera server for testing purposes I discovered it didn't actually work. My changes in this file were the result of my attempt to fix it so that I could use a local camera server to work out some tricky bugs in the Pyro camera stuff. After a fair bit of wasted time poking at this I discovered that it wasn't going to work anyway, because there is no way to pass the key parameter to the CameraServer instance, which will always try to use a config for the host's IP address regardless. That's why the config server test fixture in conftest inserts a device config with the host's IP address on the fly, and why I ended up adding the localhost fallback in huntsman.utils.config.load_device_config. I should just remove all traces of the key parameter since it's not used and can't work.
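The localhost fallback amounts to something like the following sketch. This is written under assumptions: the real load_device_config in huntsman.utils.config takes different arguments and reads the config files itself, so the helper name lookup_device_config and its signature here are hypothetical.

```python
import socket

def lookup_device_config(device_configs):
    """Return the device config entry for this host's IP address, falling
    back to the 'localhost' entry so a camera server can be run locally
    for testing. Hypothetical helper illustrating the fallback described
    above, not the actual huntsman.utils.config implementation."""
    try:
        host = socket.gethostbyname(socket.gethostname())
    except OSError:
        host = None  # hostname not resolvable; rely on the fallback
    return device_configs.get(host, device_configs["localhost"])
```

With a 'localhost' entry present, the same config files work both on the real camera hosts and on a developer machine.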

@@ -235,7 +236,7 @@ def test_exposure_scaling(camera, tmpdir):
else:
fits_path = str(tmpdir.join('test_exposure_scaling.fits'))
camera.take_exposure(filename=fits_path, dark=True, blocking=True)
image_data, image_header = fits_utils.getdata(fits_path, header=True)
image_data, image_header = fits.getdata(fits_path, header=True)
Collaborator

Oh, hm. I should support returning the header from the fits_utils although I'm kind of surprised this just doesn't work.

Member Author

Yeah, this was showing up in test failures. This test is usually skipped, but something broke the skipping while I was working on this PR and I found that fits_utils doesn't have getdata (at least not the version from POCS develop).

Comment on lines 289 to 299
success = self._autofocus_event.wait(timeout=max_wait)
if not success:
self._timeout_response("autofocus")
else:
# If the remote autofocus fails after starting in such a way that the event
# doesn't get set then calling code could wait forever. Have a local timeout
# thread to be safe.
timeout_thread = Timer(interval=max_wait,
function=self._timeout_response,
args=("autofocus",))
timeout_thread.start()
Collaborator

This is almost a repeat of (new) lines 210-218, might be worth wrapping up into a helper method. Might not. 😆

Member Author

Dangling more opportunities for reducing line count in front of me... OK, done.
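The shared helper might look roughly like this. It is a sketch under assumptions: the class and method names (RemoteOperationTimeouts, _start_timeout_thread) are hypothetical, and the real method presumably also handles the blocking wait-then-check path shown in the diff above.

```python
from threading import Timer

class RemoteOperationTimeouts:
    """Sketch of factoring the backup-timeout logic shared by the remote
    take_exposure and autofocus paths into one helper (names hypothetical)."""

    def __init__(self):
        self.timed_out = []  # stand-in for the real timeout handling

    def _timeout_response(self, timeout_type):
        # In the real class this would raise/log a timeout error for the
        # named operation ("exposure", "autofocus", ...).
        self.timed_out.append(timeout_type)

    def _start_timeout_thread(self, timeout_type, max_wait):
        # Backup local timeout: fires _timeout_response if the remote event
        # never gets set, so calling code can't wait forever.
        thread = Timer(interval=max_wait,
                       function=self._timeout_response,
                       args=(timeout_type,))
        thread.daemon = True
        thread.start()
        return thread
```

Both call sites then reduce to one line, e.g. self._start_timeout_thread("autofocus", max_wait).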

# This could do more thorough checks for success, e.g. check is_exposing property,
# check for existence of output file, etc. It's supposed to be a last resort though,
# and most problems should be caught elsewhere.
relevant_event = getattr(self, f"_{timeout_type}_event")
Collaborator

Could have an AttributeError here without a default. I'll leave it to you as to whether to set default or let error happen. I see the note above so not too worried about it.

Member Author

I'm happy with an AttributeError happening here. If the relevant Event is missing, something has gone badly wrong and raising an exception is the appropriate response.

Comment on lines 319 to 324
try:
is_set = relevant_event.is_set()
except Pyro4.errors.CommunicationError:
# Can happen if everything has finished and shut down before timeout,
# e.g. when running tests.
pass
Collaborator

Suggested change
try:
is_set = relevant_event.is_set()
except Pyro4.errors.CommunicationError:
# Can happen if everything has finished and shut down before timeout,
# e.g. when running tests.
pass
# Exception can happen if everything has finished and shutdown before
# timeout, e.g. when running tests.
with suppress(Pyro4.errors.CommunicationError):
is_set = relevant_event.is_set()

Member Author

Done, but with some other overlapping changes so not using this suggested commit.

Comment on lines +412 to +416
def get(self, property_name, subcomponent=None):
obj = self._camera
if subcomponent:
obj = getattr(obj, subcomponent)
return getattr(obj, property_name)
Collaborator

I like the cleanup on this side of things but am a little worried about the string based calling from the user's perspective. See note in review comments.

huntsman/camera/pyro.py (outdated; resolved)
@AnthonyHorton AnthonyHorton changed the title Support for filterwheels on Pyro cameras (WIP) Support for filterwheels on Pyro cameras Apr 16, 2020
@AnthonyHorton AnthonyHorton changed the title Support for filterwheels on Pyro cameras Support for filterwheels on Pyro cameras, general refactoring Apr 16, 2020
@AnthonyHorton (Member, Author)

This looks good, my only concern would be that with the move to accessing a lot of the properties on the proxy via get/set, you are now using strings instead of attributes (or method names), which I think could be more error prone. I think it could also be cleaned up via some clever currying or use of functools.partialmethod, but that would require more thought so should be left for separate PR.

Yep, the generic, attribute-name-based get() and set() are clunky, but I make myself feel better about them by reminding myself that the CameraServer class is not intended to be a user-facing interface. It exists only as a wrapper around the remote Camera instance to enable the Pyro network layer. You're not supposed to interact with the CameraServer directly; you're supposed to use the Pyro Camera, Focuser and Filterwheel classes instead. Those three classes should be the only code calling CameraServer methods.
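As an illustration of that division of responsibility, the client-side wrapper classes can hide the string-based access behind ordinary properties. This is a hypothetical sketch (the real huntsman.filterwheel.pyro class differs, and the CameraServer.set() signature assumed here mirrors the get() shown earlier but is not confirmed by the diff):

```python
class FilterWheelClient:
    """Sketch: the only place that uses attribute-name strings is this
    wrapper; users just see normal Python properties."""

    def __init__(self, proxy):
        self._proxy = proxy  # Pyro4 proxy for the CameraServer

    @property
    def position(self):
        return self._proxy.get("position", subcomponent="filterwheel")

    @position.setter
    def position(self, value):
        # Assumed signature: set(property_name, value, subcomponent=None).
        self._proxy.set("position", value, subcomponent="filterwheel")
```

So the string-based interface never leaks past the three Pyro wrapper classes.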

Can you also update PR description with included changes?

Done. Quite a long list now!

@AnthonyHorton AnthonyHorton merged commit 595eedc into AstroHuntsman:develop Apr 16, 2020
@AnthonyHorton AnthonyHorton deleted the pyro-filterwheel branch April 16, 2020 08:54
3 participants