diff --git a/README.md b/README.md
index 3234f1923..122dfd650 100644
--- a/README.md
+++ b/README.md
@@ -3,17 +3,34 @@
[![codecov](https://codecov.io/gh/panoptes/panoptes-utils/branch/develop/graph/badge.svg)](https://codecov.io/gh/panoptes/panoptes-utils)
[![Documentation Status](https://readthedocs.org/projects/panoptes-utils/badge/?version=latest)](https://panoptes-utils.readthedocs.io/en/latest/?badge=latest)
-# PANOPTES Utils
+PANOPTES Utils
+--------------
+
+- [PANOPTES Utils](#panoptes-utils)
+- [Getting](#getting)
+ - [pip](#pip)
+ - [Docker](#docker)
+- [Using](#using)
+ - [Modules](#modules)
+ - [Services](#services)
+ - [Config Server](#config-server)
+ - [Messaging Hub](#messaging-hub)
+ - [Logger](#logger)
+- [Development](#development)
+ - [Logging](#logging)
Utility functions for use within the PANOPTES ecosystem and for general astronomical processing.
+This library defines a number of modules that contain useful functions as well as a few
+[services](#services).
+
See the full documentation at: https://panoptes-utils.readthedocs.io
-# Install
-
+## Getting
+
+See [Docker](#docker) for ways to run `panoptes-utils` services without installing to your host computer.
-> See [Docker](#docker) for ways to run that `panoptes-utils` without installing
-to your host computer.
+### pip
To install type:
@@ -24,21 +41,44 @@ pip install panoptes-utils
There are also a number of optional dependencies, which can be installed as follows:
```bash
-pip install "panoptes-utils[social,test]"
+pip install "panoptes-utils[social,testing]"
# -or-
pip install "panoptes-utils[all]"
```
-# Services
-
+### Docker
+
+Docker containers are available for running the `panoptes-utils` module and associated services; they also serve as the base for all other PANOPTES-related containers.
+
+See our [Docker documentation](https://panoptes-utils.readthedocs.io/en/latest/docker.html) for details.
+
+## Using
+### Modules
+
+The modules can be used as helper utilities anywhere you would like. See the complete documentation for details: [https://panoptes-utils.readthedocs.io/en/latest/](https://panoptes-utils.readthedocs.io/en/latest/).
-## Config Server
-
+### Services
+
+The services can be run either from a [docker](#docker) image or from the installed script, as described below.
+
+#### Config Server
A simple config param server. Runs as a Flask microservice that delivers JSON documents
in response to requests for config key items.
-For more details and usage examples, see the [config server README](panoptes/utils/config/README.md).
+
+The server can be run from the installed script (defaults to `http://localhost:6563/get-config`):
+
+```bash
+$ bin/panoptes-config-server
+ * Serving Flask app "panoptes.utils.config.server" (lazy loading)
+ * Environment: production
+ WARNING: This is a development server. Do not use it in a production deployment.
+ Use a production WSGI server instead.
+ * Debug mode: off
+```
+
+Or from within a Python process:
```python
>>> from panoptes.utils.config.server import config_server
@@ -52,21 +92,44 @@ For more details and usage examples, see the [config server README](panoptes/uti
>>> server_process.terminate() # Or just exit notebook/console
```
-## Messaging Hub
-
+For more details and usage examples, see the [config server README](panoptes/utils/config/README.md).
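+
+Config keys are addressed with dotted paths (e.g. `db.name`), which the server resolves against nested mappings (internally it uses `scalpl`). The helpers below are a minimal pure-Python sketch of that lookup, for illustration only; they are not part of the `panoptes-utils` API:
+
```python
# Illustrative sketch of dotted-key access into a nested config dict.
# The real config server uses scalpl's Cut for this; these helper
# names (get_entry / set_entry) are hypothetical.

def get_entry(config, key, default=None):
    """Resolve a dotted key like 'db.name' against nested dicts."""
    entry = config
    for part in key.split('.'):
        if not isinstance(entry, dict) or part not in entry:
            return default
        entry = entry[part]
    return entry

def set_entry(config, key, value):
    """Set a dotted key, creating intermediate dicts as needed."""
    *parents, leaf = key.split('.')
    entry = config
    for part in parents:
        entry = entry.setdefault(part, {})
    entry[leaf] = value

config = {'db': {'name': 'panoptes'}}
set_entry(config, 'scheduler.fields_file', 'simulator.yaml')
print(get_entry(config, 'db.name'))                 # -> panoptes
print(get_entry(config, 'scheduler.fields_file'))   # -> simulator.yaml
```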
+
+#### Messaging Hub
-The messaging hub is responsible for relaying zeromq messages between the various components of a
-PANOPTES system. Running the Messaging Hub will set up a forwarding service that allows for an arbitrary
-number of publishers and subscribers.
+The messaging hub is responsible for relaying zeromq messages between the various components of a PANOPTES system. Running the Messaging Hub will set up a forwarding service that allows for an arbitrary number of publishers and subscribers.
```bash
panoptes-messaging-hub --from-config
```
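+
+Conceptually the hub is a forwarder: any message from any publisher is relayed to every subscriber whose topic prefix matches, which is how zeromq pub/sub filtering works. The real service uses zeromq sockets; the class below is only a rough in-process sketch of the pattern (the `ForwardingHub` name is hypothetical, not part of this library):
+
```python
# Pure-Python sketch of a forwarding hub: any number of publishers
# send to the hub, which relays each message to every subscriber
# whose topic prefix matches (mirroring zeromq's prefix filtering).
# The real panoptes-messaging-hub does this with zeromq sockets.

class ForwardingHub:
    def __init__(self):
        self.subscribers = []

    def subscribe(self, callback, topic=''):
        """Register a callback for messages whose topic starts with `topic`."""
        self.subscribers.append((topic, callback))

    def publish(self, topic, message):
        """Relay a message to every matching subscriber."""
        for prefix, callback in self.subscribers:
            if topic.startswith(prefix):
                callback(topic, message)

hub = ForwardingHub()
received = []
hub.subscribe(lambda t, m: received.append((t, m)), topic='STATUS')
hub.publish('STATUS', {'state': 'observing'})   # relayed
hub.publish('WEATHER', {'safe': True})          # filtered out
```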
-## Docker
-
+#### Logger
-Docker containers are available for running the `panoptes - utils` module and associated services, which
-also serve as the base container for all other PANOPTES related containers.
+A basic logger is defined in `panoptes.utils.logger.get_root_logger()`, which configures a [loguru](https://github.com/Delgan/loguru) logger with defaults suitable for [POCS](https://github.com/panoptes/POCS).
-See our [Docker documentation](https://panoptes-utils.readthedocs.io/en/latest/docker.html) for details.
+For now this logger will remain part of this repository, but it may move into POCS in the future since it is unlikely to be used elsewhere.
+
+## Development
+
+### Logging
+
+The `panoptes-utils` module uses [`loguru`](https://github.com/Delgan/loguru) for logging, which also serves as the basis for the POCS logger (see [Logger](#logger)).
+
+To access the logs for the module, you can import directly from the `logger` module, i.e., `from panoptes.utils.logger import logger`. This is a simple wrapper around `loguru` with no extra configuration:
+
+```python
+>>> from panoptes.utils import CountdownTimer
+>>> # No logs by default
+>>> t0 = CountdownTimer(5)
+>>> t0.sleep()
+False
+
+>>> # Enable the logs
+>>> from panoptes.utils.logger import logger
+>>> logger.enable('panoptes')
+
+>>> t1 = CountdownTimer(5)
+2020-03-04 06:42:50 | DEBUG | panoptes.utils.time:restart:162 - Restarting Timer (blocking) 5.00/5.00
+>>> t1.sleep()
+2020-03-04 06:42:53 | DEBUG | panoptes.utils.time:sleep:183 - Sleeping for 2.43 seconds
+False
+```
diff --git a/bin/panoptes-messaging-hub b/bin/panoptes-messaging-hub
index 85fc28892..89f7086fc 100755
--- a/bin/panoptes-messaging-hub
+++ b/bin/panoptes-messaging-hub
@@ -7,8 +7,9 @@ import time
from astropy.utils import console
from panoptes.utils.config.client import get_config
-from panoptes.utils.logger import get_root_logger
from panoptes.utils.messaging import PanMessaging
+from panoptes.utils.logger import logger
+
the_root_logger = None
@@ -153,6 +154,6 @@ if __name__ == '__main__':
if not sub_and_pub_pairs:
arg_error('Found no port pairs to forward between.')
- the_root_logger = get_root_logger()
+ the_root_logger = logger
run_forwarders(sub_and_pub_pairs)
diff --git a/conda-requirements-amd64.yaml b/conda-requirements-amd64.yaml
index ce5469d62..1e0c7e805 100644
--- a/conda-requirements-amd64.yaml
+++ b/conda-requirements-amd64.yaml
@@ -1,19 +1,37 @@
-astropy
-cffi
-Flask
-jupyter_console
-libffi
-matplotlib-base
-numpy
-pip
-psycopg2
-pyserial
-python-dateutil
-PyYAML
-pyzmq
-readline
-scikit-image
-scikit-learn
-scipy
-tornado
-zeromq
+name: panoptes
+channels:
+ - astropy
+ - conda-forge
+dependencies:
+ - astroplan>=0.6
+ - astropy>=4.0.0
+ - codecov
+ - coverage
+ - coveralls
+ - Flask
+ - ipython
+ - loguru
+ - matplotlib-base>=3.0.0
+ - numpy
+ - photutils
+ - pip
+ - pycodestyle
+ - pyserial
+ - pytest-cov
+ - pytest-remotedata>=0.3.1
+ - pytest
+ - python-dateutil
+ - python-json-logger
+ - PyYAML
+ - pyzmq
+ - readline
+ - requests
+ - ruamel.yaml>=0.15
+ - scikit-image
+ - scipy
+ - tweepy
+ - versioneer
+ - zeromq
+ - pip:
+ - scalpl
+ - mocket
diff --git a/conftest.py b/conftest.py
index 564871000..4fab98235 100644
--- a/conftest.py
+++ b/conftest.py
@@ -13,8 +13,12 @@
import time
import shutil
+import logging
+from _pytest.logging import caplog as _caplog
+from contextlib import suppress
+
+from panoptes.utils.logger import logger
from panoptes.utils.database import PanDB
-from panoptes.utils.logger import get_root_logger
from panoptes.utils.messaging import PanMessaging
from panoptes.utils.config.client import set_config
from panoptes.utils.config.server import config_server
@@ -59,64 +63,6 @@ def pytest_addoption(parser):
". Note that travis-ci will test all of them by default."))
-def pytest_runtest_logstart(nodeid, location):
- """Signal the start of running a single test item.
-
- This hook will be called before pytest_runtest_setup(),
- pytest_runtest_call() and pytest_runtest_teardown() hooks.
-
- Args:
- nodeid (str) – full id of the item
- location – a triple of (filename, linenum, testname)
- """
- try:
- logger = get_root_logger()
- logger.critical('')
- logger.critical('##########' * 8)
- logger.critical(' START TEST {}', nodeid)
- except Exception:
- pass
-
-
-def pytest_runtest_logfinish(nodeid, location):
- """Signal the complete finish of running a single test item.
-
- This hook will be called after pytest_runtest_setup(),
- pytest_runtest_call() and pytest_runtest_teardown() hooks.
-
- Args:
- nodeid (str) – full id of the item
- location – a triple of (filename, linenum, testname)
- """
- try:
- logger = get_root_logger()
- logger.critical('')
- logger.critical(' END TEST {}', nodeid)
- logger.critical('##########' * 8)
- except Exception:
- pass
-
-
-def pytest_runtest_logreport(report):
- """Adds the failure info that pytest prints to stdout into the log."""
- if report.skipped or report.outcome != 'failed':
- return
- try:
- logger = get_root_logger()
- logger.critical('')
- logger.critical(' TEST {} FAILED during {}\n\n{}\n', report.nodeid, report.when,
- report.longreprtext)
- cnt = 15
- if report.capstdout:
- logger.critical('{}Captured stdout during {}{}\n{}\n', '= ' * cnt, report.when,
- ' =' * cnt, report.capstdout)
- if report.capstderr:
- logger.critical('{}Captured stderr during {}{}\n{}\n', '* ' * cnt, report.when,
- ' *' * cnt, report.capstderr)
- except Exception:
- pass
-
-
@pytest.fixture(scope='session')
def config_host():
return 'localhost'
@@ -162,8 +108,7 @@ def config_path():
@pytest.fixture(scope='session', autouse=True)
def static_config_server(config_host, static_config_port, config_path, images_dir, db_name):
- logger = get_root_logger()
- logger.critical(f'Starting config_server for testing session')
+    print('Starting config_server for testing session')
proc = config_server(
host=config_host,
@@ -172,7 +117,7 @@ def static_config_server(config_host, static_config_port, config_path, images_di
ignore_local=True,
)
- logger.info(f'config_server started with PID={proc.pid}')
+ print(f'config_server started with PID={proc.pid}')
# Give server time to start
time.sleep(1)
@@ -180,23 +125,23 @@ def static_config_server(config_host, static_config_port, config_path, images_di
# Adjust various config items for testing
unit_name = 'Generic PANOPTES Unit'
unit_id = 'PAN000'
- logger.info(f'Setting testing name and unit_id to {unit_id}')
+ print(f'Setting testing name and unit_id to {unit_id}')
set_config('name', unit_name, port=static_config_port)
set_config('pan_id', unit_id, port=static_config_port)
- logger.info(f'Setting testing database to {db_name}')
+ print(f'Setting testing database to {db_name}')
set_config('db.name', db_name, port=static_config_port)
fields_file = 'simulator.yaml'
- logger.info(f'Setting testing scheduler fields_file to {fields_file}')
+ print(f'Setting testing scheduler fields_file to {fields_file}')
set_config('scheduler.fields_file', fields_file, port=static_config_port)
# TODO(wtgee): determine if we need separate directories for each module.
- logger.info(f'Setting temporary image directory for testing')
+    print('Setting temporary image directory for testing')
set_config('directories.images', images_dir, port=static_config_port)
yield
- logger.critical(f'Killing config_server started with PID={proc.pid}')
+ print(f'Killing config_server started with PID={proc.pid}')
proc.terminate()
@@ -209,8 +154,7 @@ def dynamic_config_server(config_host, config_port, config_path, images_dir, db_
instances that are created (propagated through PanBase).
"""
- logger = get_root_logger()
- logger.critical(f'Starting config_server for testing function')
+    print('Starting config_server for testing function')
proc = config_server(
host=config_host,
@@ -219,7 +163,7 @@ def dynamic_config_server(config_host, config_port, config_path, images_dir, db_
ignore_local=True,
)
- logger.info(f'config_server started with PID={proc.pid}')
+ print(f'config_server started with PID={proc.pid}')
# Give server time to start
time.sleep(1)
@@ -227,23 +171,23 @@ def dynamic_config_server(config_host, config_port, config_path, images_dir, db_
# Adjust various config items for testing
unit_name = 'Generic PANOPTES Unit'
unit_id = 'PAN000'
- logger.info(f'Setting testing name and unit_id to {unit_id}')
+ print(f'Setting testing name and unit_id to {unit_id}')
set_config('name', unit_name, port=config_port)
set_config('pan_id', unit_id, port=config_port)
- logger.info(f'Setting testing database to {db_name}')
+ print(f'Setting testing database to {db_name}')
set_config('db.name', db_name, port=config_port)
fields_file = 'simulator.yaml'
- logger.info(f'Setting testing scheduler fields_file to {fields_file}')
+ print(f'Setting testing scheduler fields_file to {fields_file}')
set_config('scheduler.fields_file', fields_file, port=config_port)
# TODO(wtgee): determine if we need separate directories for each module.
- logger.info(f'Setting temporary image directory for testing')
+    print('Setting temporary image directory for testing')
set_config('directories.images', images_dir, port=config_port)
yield
- logger.critical(f'Killing config_server started with PID={proc.pid}')
+ print(f'Killing config_server started with PID={proc.pid}')
proc.terminate()
@@ -304,8 +248,7 @@ def db_type(request):
@pytest.fixture(scope='function')
def db(db_type):
- return PanDB(
- db_type=db_type, db_name='panoptes_testing', logger=get_root_logger(), connect=True)
+ return PanDB(db_type=db_type, db_name='panoptes_testing', connect=True)
@pytest.fixture(scope='function')
@@ -343,8 +286,7 @@ def message_forwarder(messaging_ports):
args.append(str(sub))
args.append(str(pub))
- logger = get_root_logger()
- logger.info('message_forwarder fixture starting: {}', args)
+ print(f'message_forwarder fixture starting: {args!r}')
proc = subprocess.Popen(args, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
# It takes a while for the forwarder to start, so allow for that.
# TODO(jamessynge): Come up with a way to speed up these fixtures.
@@ -352,8 +294,8 @@ def message_forwarder(messaging_ports):
# If message forwarder doesn't start, tell us why.
if proc.poll() is not None:
outs, errs = proc.communicate(timeout=5)
- logger.info(f'outs: {outs!r}')
- logger.info(f'errs: {errs!r}')
+ print(f'outs: {outs!r}')
+ print(f'errs: {errs!r}')
assert False
yield messaging_ports
@@ -450,3 +392,16 @@ def cr2_file():
def add_doctest_dependencies(doctest_namespace):
doctest_namespace['np'] = np
doctest_namespace['plt'] = plt
+
+
+@pytest.fixture
+def caplog(_caplog):
+    class PropagateHandler(logging.Handler):
+        def emit(self, record):
+            logging.getLogger(record.name).handle(record)
+
+    logger.enable('panoptes')
+    handler_id = logger.add(PropagateHandler(), format="{message}")
+ yield _caplog
+ with suppress(ValueError):
+ logger.remove(handler_id)
diff --git a/docker/Dockerfile.utils b/docker/Dockerfile.utils
index a4a2772fd..8f718b9cd 100644
--- a/docker/Dockerfile.utils
+++ b/docker/Dockerfile.utils
@@ -11,11 +11,6 @@ ENV PANDIR $pan_dir
RUN cd && \
# Create a panoptes group and change group ownership
groupadd panoptes && \
- # Install anaconda packages
- /opt/conda/bin/conda install --yes -c conda-forge -c astropy \
- --file "${PANDIR}/panoptes-utils/conda-requirements-${arch}.yaml" && \
- /opt/conda/bin/conda clean --all --yes && \
- /opt/conda/bin/conda clean -f --yes && \
# Create directories
mkdir -p ${POCS} && \
mkdir -p ${PANDIR}/logs && \
@@ -25,6 +20,12 @@ RUN cd && \
cd ${PANDIR}/panoptes-utils && \
# First deal with pip and PyYAML - see https://github.com/pypa/pip/issues/5247
pip install --no-cache-dir --no-deps --ignore-installed pip PyYAML && \
+ # Install anaconda packages
+ /opt/conda/bin/conda install --yes -c conda-forge -c astropy \
+ --file "${PANDIR}/panoptes-utils/conda-requirements-${arch}.yaml" && \
+ /opt/conda/bin/conda clean --all --yes && \
+ /opt/conda/bin/conda clean -f --yes && \
+ # Install requirements
pip install --no-cache-dir -r requirements.txt && \
pip install --no-cache-dir -e ".[all]" && \
# Change permissions on directories
diff --git a/docs/source/docker.md b/docker/README.md
similarity index 58%
rename from docs/source/docker.md
rename to docker/README.md
index 1fa298d3a..f08232034 100644
--- a/docs/source/docker.md
+++ b/docker/README.md
@@ -1,22 +1,33 @@
-Docker Containers
-=================
+Docker
+======
-**image**: A pre-built and configured Docker application (sort of like a virtualized OS environment with a running application). A Dockerfile will build an image. You download an image from a centralized server (e.g. Docker Hub or Google Cloud Registry). There is only one version of each image on your machine (although images support "tags", e.g. "latest", so you can have multiple tagged copies).
-
-**container**: A running instance of an image. You can run many copies of a single image.
+- [Docker](#docker)
+ - [PANOPTES Containers](#panoptes-containers)
+ - [panoptes-base](#panoptes-base)
+ - [panoptes-utils](#panoptes-utils)
+ - [Getting Docker Images](#getting-docker-images)
+ - [Running Docker Containers](#running-docker-containers)
+ - [Building Docker Images](#building-docker-images)
+ - [Definitions](#definitions)
-This repository creates two seaparate Docker images which act as the base for the other PANOPTES
-images. The first image (`panoptes-base`) is just the base operating system and system utilities but none of the
-PANOPTES software. The second image (`panoptes-utils`) builds off the first image but adds the contents
-of this repository.
+## PANOPTES Containers
> See [Development](#development) for tips on how to run the containers but still use local copies of your files.
+This repository creates two separate Docker images which act as the base for the other PANOPTES images. The first image (`panoptes-base`) is just the base operating system and system utilities but none of the PANOPTES software. The second image (`panoptes-utils`) builds off the first image but adds the contents
+of this repository.
+
There are two flavors for each image, which are tagged `amd64` and `arm32v7` (Raspberry Pi). A
[manifest](https://docs.docker.com/engine/reference/commandline/manifest/) file is created, which means
that you can simply pull `latest` and Docker will figure out which flavor you need.
-##### panoptes-base
+> :warning: NOTE: The `arm32v7` images are outdated as of March 2020 and need to be updated. Use with caution.
+
+Additionally, the `arm32v7` image builds and installs a modified Anaconda environment called [Berryconda](https://github.com/jjhelmus/berryconda).
+
+Both the `arm32v7` and the `amd64` create a specific conda environment called `panoptes` that is the base for all other environments in all PANOPTES images.
+
+#### panoptes-base
Included in the image:
@@ -24,21 +35,18 @@ Included in the image:
* sextractor
* dcraw, exiftool
* `zsh` (and `oh-my-zsh`) by default :)
+* `jupyter-console`
Additionally, the `amd64` image is built off of the [continuumio/miniconda3](https://hub.docker.com/r/continuumio/miniconda3) image so is *not* running the system python. The `arm32v7` builds the anaconda environment as part of the `panoptes-utils`.
-##### panoptes-utils
+#### panoptes-utils
Included in the image:
* `panoptes-utils` module
-Additionally, the `arm32v7` image builds and installs a modified Anconda environment called [Berryconda](https://github.com/jjhelmus/berryconda).
-
-Both the `arm32v7` and the `amd64` create a specific conda environment called `panoptes-env` that is
-the base for all other environments in all PANOPTES images.
-### Getting Docker Images
+## Getting Docker Images
The `panoptes-utils` repository provides the base for the PANOPTES Docker images. The image
is public and can be obtained via `docker`:
@@ -47,23 +55,26 @@ is public and can be obtained via `docker`:
docker pull gcr.io/panoptes-exp/panoptes-utils:latest
```
-### Running Docker Containers
+## Running Docker Containers
-The image contains an installed version of the `panoptes-utils` module as well as the system dependencies
-required to run the various scripts and functions (see below). The default `CMD` is just a shell so
-can run a machine with default options and get placed inside the virtual environment.
+The image contains an installed version of the `panoptes-utils` module as well as the system dependencies required to run the various scripts and functions (see [Services](../README.md#services)). The default `ENTRYPOINT` is a shell, so you can run the container with default options and be placed inside the virtual environment.
> Note that we are running this with `network=host`, which opens up all network ports on
the host to the running container. As of April 2019 this still presents problems on the Mac.
-For PANOPTES purposes, the `docker-compose.yaml` defines two containers each running `panoptes-utils` image.
-The first container runs the configuration server (i.e. `scripts/run_config_server.py`) as a local web service and the second container runs the zeromq messaging hub (i.e. `scripts/run_messaging_hub.py`).
+For PANOPTES purposes, the `docker-compose.yaml` defines two containers each running `panoptes-utils` image. The first container runs the configuration server (i.e. `scripts/run_config_server.py`) as a local web service and the second container runs the zeromq messaging hub (i.e. `scripts/run_messaging_hub.py`).
-### Building Docker Images
+## Building Docker Images
`docker/build-image.sh` builds:
* `cloudbuild-base.yaml` uses `Dockerfile` to create a `panoptes-base` image.
* `cloudbuild-utils.yaml` uses `Dockerfile.utils.[amd64|rpi]` to create a `panoptes-utils` image.
- * Uses `conda-environment-[amd64|rpi.yaml` to create a conda environment called `panoptes-env`
+ * Uses `conda-environment-[amd64|rpi].yaml` to create a conda environment called `panoptes-env`
`.travis.yaml` uses `panoptes-utils` image to run `scripts/testing/run_tests.sh` with the `$TRAVIS_BUILD_DIR` mapped to the working dir for the module.
+
+## Definitions
+
+**image**: A pre-built and configured Docker application (sort of like a virtualized OS environment with a running application). A Dockerfile will build an image. You download an image from a centralized server (e.g. Docker Hub or Google Cloud Registry). There is only one version of each image on your machine (although images support "tags", e.g. "latest", so you can have multiple tagged copies).
+
+**container**: A running instance of an image. You can run many copies of a single image.
\ No newline at end of file
diff --git a/panoptes/__init__.py b/panoptes/__init__.py
new file mode 100644
index 000000000..6d24d1471
--- /dev/null
+++ b/panoptes/__init__.py
@@ -0,0 +1,3 @@
+from .utils.logger import logger
+
+logger.disable('panoptes')
diff --git a/panoptes/utils/__init__.py b/panoptes/utils/__init__.py
index 5c407e85f..a1e28c7bc 100644
--- a/panoptes/utils/__init__.py
+++ b/panoptes/utils/__init__.py
@@ -1,6 +1,6 @@
-
from .utils import *
from .time import *
+
from ._version import get_versions
__version__ = get_versions()['version']
del get_versions
diff --git a/panoptes/utils/config/__init__.py b/panoptes/utils/config/__init__.py
index 5d3f8680b..fb3744633 100644
--- a/panoptes/utils/config/__init__.py
+++ b/panoptes/utils/config/__init__.py
@@ -2,8 +2,10 @@
from contextlib import suppress
from warnings import warn
-from panoptes.utils import listify
-from panoptes.utils import serializers
+from ..logger import logger
+from ..utils import listify
+from ..serializers import from_yaml
+from ..serializers import to_yaml
def load_config(config_files=None, simulator=None, parse=True, ignore_local=False):
@@ -70,7 +72,7 @@ def load_config(config_files=None, simulator=None, parse=True, ignore_local=Fals
try:
_add_to_conf(config, path, parse=parse)
except Exception as e:
- warn("Problem with config file {}, skipping. {}".format(path, e))
+ warn(f"Problem with config file {path}, skipping. {e}")
# Load local version of config
if ignore_local is False:
@@ -79,7 +81,7 @@ def load_config(config_files=None, simulator=None, parse=True, ignore_local=Fals
try:
_add_to_conf(config, local_version, parse=parse)
except Exception:
- warn("Problem with local config file {}, skipping".format(local_version))
+ warn(f"Problem with local config file {local_version}, skipping")
# parse_config currently only corrects directory names.
if parse:
@@ -120,13 +122,13 @@ def save_config(path, config, overwrite=True):
full_path = f'{base}{ext}'
if os.path.exists(full_path) and overwrite is False:
- warn("Path exists and overwrite=False: {}".format(full_path))
+ logger.warning(f"Path exists and overwrite=False: {full_path}")
else:
# Create directory if does not exist
os.makedirs(os.path.dirname(full_path), exist_ok=True)
with open(full_path, 'w') as f:
print(config)
- serializers.to_yaml(config, stream=f)
+ to_yaml(config, stream=f)
def parse_config(config):
@@ -155,6 +157,6 @@ def parse_config(config):
def _add_to_conf(config, fn, parse=False):
with suppress(IOError):
with open(fn, 'r') as f:
- c = serializers.from_yaml(f, parse=parse)
+ c = from_yaml(f, parse=parse)
if c is not None and isinstance(c, dict):
config.update(c)
diff --git a/panoptes/utils/config/client.py b/panoptes/utils/config/client.py
index f1f5c8e94..03ef2180d 100644
--- a/panoptes/utils/config/client.py
+++ b/panoptes/utils/config/client.py
@@ -1,6 +1,8 @@
import requests
-from panoptes.utils import serializers
-from panoptes.utils.logger import get_root_logger
+
+from ..logger import logger
+from ..serializers import from_json
+from ..serializers import to_json
def get_config(key=None, host='localhost', port='6563', parse=True, default=None):
@@ -53,14 +55,14 @@ def get_config(key=None, host='localhost', port='6563', parse=True, default=None
try:
response = requests.post(url, json={'key': key})
except Exception as e:
- get_root_logger().info(f'Problem with get_config: {e!r}')
+ logger.info(f'Problem with get_config: {e!r}')
else:
if not response.ok:
- get_root_logger().info(f'Problem with get_config: {response.content!r}')
+ logger.info(f'Problem with get_config: {response.content!r}')
else:
if response.text != 'null\n':
if parse:
- config_entry = serializers.from_json(response.content.decode('utf8'))
+ config_entry = from_json(response.content.decode('utf8'))
else:
config_entry = response.json()
@@ -102,7 +104,7 @@ def set_config(key, new_value, host='localhost', port='6563', parse=True):
"""
url = f'http://{host}:{port}/set-config'
- json_str = serializers.to_json({key: new_value})
+ json_str = to_json({key: new_value})
config_entry = None
try:
@@ -114,10 +116,10 @@ def set_config(key, new_value, host='localhost', port='6563', parse=True):
if not response.ok:
raise Exception(f'Cannot access config server: {response.text}')
except Exception as e:
- get_root_logger().info(f'Problem with set_config: {e!r}')
+ logger.info(f'Problem with set_config: {e!r}')
else:
if parse:
- config_entry = serializers.from_json(response.content.decode('utf8'))
+ config_entry = from_json(response.content.decode('utf8'))
else:
config_entry = response.json()
diff --git a/panoptes/utils/config/server.py b/panoptes/utils/config/server.py
index 1341dea9f..192b1c186 100644
--- a/panoptes/utils/config/server.py
+++ b/panoptes/utils/config/server.py
@@ -8,10 +8,10 @@
from multiprocessing import Process
from scalpl import Cut
-from panoptes.utils.config import load_config
-from panoptes.utils.config import save_config
-from panoptes.utils.serializers import _serialize_object
-from panoptes.utils.logger import get_root_logger
+from ..logger import logger
+from . import load_config
+from . import save_config
+from ..serializers import _serialize_object
logging.getLogger('werkzeug').setLevel(logging.WARNING)
@@ -163,7 +163,7 @@ def set_config_entry():
@app.route('/reset-config', methods=['POST'])
def reset_config():
if request.is_json:
- get_root_logger().warning(f'Resetting config server')
+        logger.warning('Resetting config server')
req_data = request.get_json()
if req_data['reset']:
diff --git a/panoptes/utils/data.py b/panoptes/utils/data.py
index bd236e24f..ff5067f2d 100644
--- a/panoptes/utils/data.py
+++ b/panoptes/utils/data.py
@@ -7,6 +7,8 @@
import sys
import warnings
+from .logger import logger
+
# Use custom location for download
from astropy.utils.iers import conf as iers_conf
iers_conf.iers_auto_url = 'https://storage.googleapis.com/panoptes-resources/iers/ser7.dat'
@@ -55,7 +57,7 @@ def download_all_files(self):
except Exception as e:
if not self.keep_going:
raise e
- print('Failed to download IERS A bulletin: {}'.format(e))
+ logger.warning(f'Failed to download IERS A bulletin: {e}')
result = False
if self.wide_field:
for i in range(4110, 4119):
@@ -78,7 +80,7 @@ def download_one_file(self, fn):
except Exception as e:
if not self.keep_going:
raise e
- print('Failed to download {}: {}'.format(url, e))
+ logger.warning(f'Failed to download {url}: {e}')
return False
# The file has been downloaded to some directory. Move the file into the data folder.
try:
@@ -88,13 +90,13 @@ def download_one_file(self, fn):
except OSError as e:
if not self.keep_going:
raise e
- print("Problem saving {}. Check permissions: {}".format(url, e))
+ logger.warning(f"Problem saving {url}. Check permissions: {e}")
return False
def create_data_folder(self):
"""Creates the data folder if it does not exist."""
if not os.path.exists(self.data_folder):
- print("Creating data folder: {}.".format(self.data_folder))
+            logger.info(f"Creating data folder: {self.data_folder}.")
os.makedirs(self.data_folder)
@@ -131,7 +133,7 @@ def main():
args = parser.parse_args()
if args.folder and not os.path.exists(args.folder):
- print("Warning, data folder {} does not exist.".format(args.folder))
+        logger.warning(f"Data folder {args.folder} does not exist.")
keep_going = args.keep_going or not args.no_keep_going
diff --git a/panoptes/utils/database/__init__.py b/panoptes/utils/database/__init__.py
index 86b2c18c6..c372aabeb 100644
--- a/panoptes/utils/database/__init__.py
+++ b/panoptes/utils/database/__init__.py
@@ -1,8 +1,9 @@
import abc
-from warnings import warn
-from panoptes.utils import current_time
-from panoptes.utils.library import load_module
+from ..logger import logger
+from .. import error
+from ..time import current_time
+from ..library import load_module
def _get_db_class(module_name='file'):
@@ -39,7 +40,7 @@ def _get_db_class(module_name='file'):
class AbstractPanDB(metaclass=abc.ABCMeta):
- def __init__(self, db_name=None, collection_names=list(), logger=None, **kwargs):
+ def __init__(self, db_name=None, collection_names=list(), **kwargs):
"""
Init base class for db instances.
@@ -49,24 +50,13 @@ def __init__(self, db_name=None, collection_names=list(), logger=None, **kwargs)
logger: (Optional) logger to use for warnings.
"""
self.logger = logger
- if self.logger:
- self.logger.info(f'Creating PanDB {db_name} with collections: {collection_names}')
+ self.logger.info(f'Creating PanDB {db_name} with collections: {collection_names}')
self.db_name = db_name
self.collection_names = collection_names
- def _warn(self, *args, **kwargs):
- if self.logger:
- self.logger.warning(*args, **kwargs)
- else:
- warn(*args)
-
def validate_collection(self, collection):
if collection not in self.collection_names:
- msg = 'Collection type {!r} not available'.format(collection)
- self._warn(msg)
- # Can't import panoptes.utils.error earlier
- from panoptes.utils.error import InvalidCollection
- raise InvalidCollection(msg)
+ raise error.InvalidCollection(f'Collection type {collection!r} not available')
@abc.abstractclassmethod
def insert_current(self, collection, obj, store_permanently=True): # pragma: no cover
diff --git a/panoptes/utils/database/file.py b/panoptes/utils/database/file.py
index 61f8c7fa3..2e5664000 100644
--- a/panoptes/utils/database/file.py
+++ b/panoptes/utils/database/file.py
@@ -1,12 +1,14 @@
import os
+from warnings import warn
from contextlib import suppress
from uuid import uuid4
from glob import glob
-from panoptes.utils.serializers import to_json
-from panoptes.utils.serializers import from_json
-from panoptes.utils.database import AbstractPanDB
-from panoptes.utils.database import create_storage_obj
+from .. import error
+from ..serializers import to_json
+from ..serializers import from_json
+from ..database import AbstractPanDB
+from ..database import create_storage_obj
class PanFileDB(AbstractPanDB):
@@ -40,20 +42,12 @@ def insert_current(self, collection, obj, store_permanently=True):
# Overwrite current collection file with obj.
to_json(obj, filename=current_fn, append=False)
except Exception as e:
- self._warn(f"Problem serializing object for insertion: {e} {current_fn} {obj!r}")
- result = None
+ raise error.InvalidSerialization(f"Problem serializing object for insertion: {e} {current_fn} {obj!r}")
if not store_permanently:
return result
-
- collection_fn = self._get_file(collection)
- try:
- # Append obj to collection file.
- to_json(obj, filename=collection_fn, append=True)
- return obj_id
- except Exception as e:
- self._warn("Problem inserting object into collection: {}, {!r}".format(e, obj))
- return None
+ else:
+ return self.insert(collection, obj)
def insert(self, collection, obj):
self.validate_collection(collection)
@@ -65,8 +59,7 @@ def insert(self, collection, obj):
to_json(obj, filename=collection_fn)
return obj_id
except Exception as e:
- self._warn("Problem inserting object into collection: {}, {!r}".format(e, obj))
- return None
+ raise error.InvalidSerialization(f"Problem inserting object into collection: {e}, {obj!r}")
def get_current(self, collection):
current_fn = self._get_file(collection, permanent=False)
@@ -77,7 +70,7 @@ def get_current(self, collection):
return msg
except FileNotFoundError:
- self._warn("No record found for {}".format(collection))
+ self.logger.warning(f"No record found for {collection}")
return None
def find(self, collection, obj_id):
diff --git a/panoptes/utils/database/memory.py b/panoptes/utils/database/memory.py
index ce9265b6e..a352dbb29 100644
--- a/panoptes/utils/database/memory.py
+++ b/panoptes/utils/database/memory.py
@@ -3,10 +3,11 @@
from uuid import uuid4
from contextlib import suppress
-from panoptes.utils.serializers import to_json
-from panoptes.utils.serializers import from_json
-from panoptes.utils.database import AbstractPanDB
-from panoptes.utils.database import create_storage_obj
+from .. import error
+from ..serializers import to_json
+from ..serializers import from_json
+from ..database import AbstractPanDB
+from ..database import create_storage_obj
class PanMemoryDB(AbstractPanDB):
@@ -49,8 +50,8 @@ def insert_current(self, collection, obj, store_permanently=True):
try:
obj = to_json(obj)
except Exception as e:
- self._warn("Problem inserting object into current collection: {}, {!r}".format(e, obj))
- return None
+ raise error.InvalidSerialization(f"Problem serializing object for insertion: {e} {obj!r}")
+
with self.lock:
self.current[collection] = obj
if store_permanently:
@@ -64,8 +65,8 @@ def insert(self, collection, obj):
try:
obj = to_json(obj)
except Exception as e:
- self._warn("Problem inserting object into collection: {}, {!r}".format(e, obj))
- return None
+ raise error.InvalidSerialization(f"Problem inserting object into collection: {e}, {obj!r}")
+
with self.lock:
self.collections.setdefault(collection, {})[obj_id] = obj
return obj_id
diff --git a/panoptes/utils/error.py b/panoptes/utils/error.py
index 6dd2d7ae9..d83159478 100644
--- a/panoptes/utils/error.py
+++ b/panoptes/utils/error.py
@@ -1,5 +1,5 @@
import sys
-import logging
+from .logger import logger
class PanError(Exception):
@@ -10,7 +10,7 @@ def __init__(self, msg=None, log=False, exit=False):
self.msg = msg
if self.msg and log:
- logging.error(str(self))
+ logger.error(str(self))
if exit:
self.exit_program(self.msg)
diff --git a/panoptes/utils/images/__init__.py b/panoptes/utils/images/__init__.py
index 64026a78e..742efc10d 100644
--- a/panoptes/utils/images/__init__.py
+++ b/panoptes/utils/images/__init__.py
@@ -1,28 +1,29 @@
import os
+import re
import subprocess
import shutil
from contextlib import suppress
-
from warnings import warn
+import numpy as np
from matplotlib.backends.backend_agg import FigureCanvasAgg as FigureCanvas
from matplotlib.figure import Figure
from astropy.wcs import WCS
from astropy.nddata import Cutout2D
-from astropy.io.fits import open as open_fits
from astropy.visualization import (PercentileInterval, LogStretch, ImageNormalize)
from dateutil import parser as date_parser
-from panoptes.utils import current_time
-from panoptes.utils import error
-from panoptes.utils.images import focus as focus_utils
-from panoptes.utils.images.plot import add_colorbar
-from panoptes.utils.images.plot import get_palette
+from .. import error
+from ..logger import logger
+from ..time import current_time
+from ..images import fits as fits_utils
+from ..images.plot import add_colorbar
+from ..images.plot import get_palette
-def crop_data(data, box_width=200, center=None, verbose=False, data_only=True, wcs=None):
+def crop_data(data, box_width=200, center=None, data_only=True, wcs=None, **kwargs):
"""Return a cropped portion of the image
Shape is a box centered around the middle of the data
@@ -31,7 +32,6 @@ def crop_data(data, box_width=200, center=None, verbose=False, data_only=True, w
data (`numpy.array`): Array of data.
box_width (int, optional): Size of box width in pixels, defaults to 200px.
center (tuple(int, int), optional): Crop around set of coords, default to image center.
- verbose (bool, optional): Print extra text output.
data_only (bool, optional): If True (default), return only data. If False
return the `Cutout2D` object.
wcs (None|`astropy.wcs.WCS`, optional): A valid World Coordinate System (WCS) that will
@@ -45,9 +45,6 @@ def crop_data(data, box_width=200, center=None, verbose=False, data_only=True, w
assert data.shape[0] >= box_width, "Can't clip data, it's smaller than {} ({})".format(
box_width, data.shape)
# Get the center
- if verbose:
- print("Data to crop: {}".format(data.shape))
-
if center is None:
x_len, y_len = data.shape
x_center = int(x_len / 2)
@@ -56,9 +53,8 @@ def crop_data(data, box_width=200, center=None, verbose=False, data_only=True, w
y_center = int(center[0])
x_center = int(center[1])
- if verbose:
- print("Using center: {} {}".format(x_center, y_center))
- print("Box width: {}".format(box_width))
+ logger.debug(f"Using center: {x_center} {y_center}")
+ logger.debug(f"Box width: {box_width}")
cutout = Cutout2D(data, (y_center, x_center), box_width, wcs=wcs)
@@ -106,7 +102,7 @@ def make_pretty_image(fname,
return None
elif img_type == '.cr2':
pretty_path = _make_pretty_from_cr2(fname, title=title, timeout=timeout, **kwargs)
- elif img_type == '.fits':
+ elif img_type in ['.fits', '.fz']:
pretty_path = _make_pretty_from_fits(fname, title=title, **kwargs)
else:
warn("File must be a Canon CR2 or FITS file.")
@@ -136,11 +132,9 @@ def _make_pretty_from_fits(fname=None,
clip_percent=99.9,
**kwargs):
- with open_fits(fname) as hdu:
- header = hdu[0].header
- data = hdu[0].data
- data = focus_utils.mask_saturated(data)
- wcs = WCS(header)
+ data = mask_saturated(fits_utils.getdata(fname))
+ header = fits_utils.getheader(fname)
+ wcs = WCS(header)
if not title:
field = header.get('FIELD', 'Unknown field')
@@ -201,7 +195,7 @@ def _make_pretty_from_fits(fname=None,
add_colorbar(im)
fig.suptitle(title)
- new_filename = fname.replace('.fits', '.jpg')
+ new_filename = re.sub(r'\.fits(\.fz)?', '.jpg', fname)
fig.savefig(new_filename, bbox_inches='tight')
# explicitly close and delete figure
@@ -212,34 +206,58 @@ def _make_pretty_from_fits(fname=None,
def _make_pretty_from_cr2(fname, title=None, timeout=15, **kwargs):
- verbose = kwargs.get('verbose', False)
-
script_name = shutil.which('cr2-to-jpg')
cmd = [script_name, fname]
if title:
cmd.append(title)
- if verbose:
- print(cmd)
+ logger.debug(cmd)
try:
output = subprocess.check_output(cmd, stderr=subprocess.STDOUT)
- if verbose:
- print(output)
+ logger.debug(output)
except Exception as e:
raise error.InvalidCommand(f"Error executing {script_name}: {e.output!r}\nCommand: {cmd}")
return fname.replace('cr2', 'jpg')
+def mask_saturated(data, saturation_level=None, threshold=0.9, dtype=np.float64):
+ """Convert data to masked array of requested dtype with saturated values masked.
+
+ Args:
+ data (array_like): The numpy data array.
+ saturation_level (float|None, optional): The saturation level. If None,
+ the level will be set to the threshold times the max value for the dtype.
+ threshold (float, optional): The percentage of the max value to use.
+ dtype (`numpy.dtype`, optional): The requested dtype for the new array.
+
+ Returns:
+ `numpy.ma.array`: The masked numpy array.
+ """
+ if not saturation_level:
+ try:
+ # If data is an integer type use iinfo to compute machine limits
+ dtype_info = np.iinfo(data.dtype)
+ except ValueError:
+ # Not an integer type. Assume for now we have 16 bit data
+ saturation_level = threshold * (2**16 - 1)
+ else:
+ # Data is an integer type, set saturation level at specified fraction of
+ # max value for the type
+ saturation_level = threshold * dtype_info.max
+
+ # Convert data to masked array of requested dtype, mask values above saturation level
+ return np.ma.array(data, mask=(data > saturation_level), dtype=dtype)
+
+
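Since `mask_saturated` moved here from `focus.py`, a quick standalone check of its dtype-driven saturation level is useful. The function is copied from the hunk above so the snippet runs on its own (requires numpy):

```python
import numpy as np


def mask_saturated(data, saturation_level=None, threshold=0.9, dtype=np.float64):
    # Mirror of the relocated function: derive the level from integer dtype
    # limits, falling back to a 16-bit assumption for non-integer input.
    if not saturation_level:
        try:
            dtype_info = np.iinfo(data.dtype)
        except ValueError:
            saturation_level = threshold * (2 ** 16 - 1)
        else:
            saturation_level = threshold * dtype_info.max
    return np.ma.array(data, mask=(data > saturation_level), dtype=dtype)


# A uint16 frame: the level is 0.9 * 65535, so only the last pixel is masked.
frame = np.array([0, 1000, 65535], dtype=np.uint16)
masked = mask_saturated(frame)
```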
def make_timelapse(
directory,
fn_out=None,
glob_pattern='20[1-9][0-9]*T[0-9]*.jpg',
overwrite=False,
timeout=60,
- verbose=False,
**kwargs):
"""Create a timelapse.
@@ -255,8 +273,7 @@ def make_timelapse(
to the local directory.
overwrite (bool, optional): Overwrite timelapse if exists, default False.
timeout (int): Timeout for making movie, default 60 seconds.
- verbose (bool, optional): Show output, default False.
- **kwargs (dict): Valid keywords: verbose
+ **kwargs (dict): Additional keyword arguments.
Returns:
str: Name of output file
@@ -275,9 +292,6 @@ def make_timelapse(
fname = '{}_{}_{}.mp4'.format(field_name, cam_name, tail)
fn_out = os.path.normpath(os.path.join(directory, fname))
- if verbose:
- print("Timelapse file: {}".format(fn_out))
-
if os.path.exists(fn_out) and not overwrite:
raise FileExistsError("Timelapse exists. Set overwrite=True if needed")
@@ -302,8 +316,7 @@ def make_timelapse(
ffmpeg_cmd.append(fn_out)
- if verbose:
- print(ffmpeg_cmd)
+ logger.debug(ffmpeg_cmd)
proc = subprocess.Popen(ffmpeg_cmd, universal_newlines=True,
stdout=subprocess.PIPE, stderr=subprocess.PIPE)
@@ -314,15 +327,13 @@ def make_timelapse(
proc.kill()
outs, errs = proc.communicate()
finally:
- if verbose:
- print(outs)
- print(errs)
+ logger.debug(f"Output: {outs}")
+ logger.debug(f"Errors: {errs}")
# Double-check for file existence
if not os.path.exists(fn_out):
fn_out = None
except Exception as e:
- warn("Problem creating timelapse in {}: {!r}".format(fn_out, e))
- fn_out = None
+ raise error.PanError(f"Problem creating timelapse in {fn_out}: {e!r}")
return fn_out
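The `.fz` support above hinges on mapping both plain and fpack-compressed FITS names to a `.jpg` name. A small helper shows the pattern (with the dots escaped and the match anchored to the end of the name, a slightly stricter variant than the hunk's literal pattern):

```python
import re


def jpg_name(fname):
    """Map a FITS filename (optionally fpack-compressed) to its JPG counterpart."""
    # Escape the dots so e.g. 'afits' inside a path component never matches.
    return re.sub(r'\.fits(\.fz)?$', '.jpg', fname)


jpg_name('field.fits')     # -> 'field.jpg'
jpg_name('field.fits.fz')  # -> 'field.jpg'
```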
diff --git a/panoptes/utils/images/cr2.py b/panoptes/utils/images/cr2.py
index 3a723185c..a14472f37 100644
--- a/panoptes/utils/images/cr2.py
+++ b/panoptes/utils/images/cr2.py
@@ -10,8 +10,9 @@
from astropy.io import fits
-from panoptes.utils import error
-from panoptes.utils.images import fits as fits_utils
+from .. import error
+from ..logger import logger
+from ..images import fits as fits_utils
def cr2_to_fits(
@@ -44,15 +45,11 @@ def cr2_to_fits(
str: The full path to the generated FITS file.
"""
-
- verbose = kwargs.get('verbose', False)
-
if fits_fname is None:
fits_fname = cr2_fname.replace('.cr2', '.fits')
if not os.path.exists(fits_fname) or overwrite:
- if verbose:
- print("Converting CR2 to PGM: {}".format(cr2_fname))
+ logger.debug(f"Converting CR2 to PGM: {cr2_fname}")
# Convert the CR2 to a PGM file then delete PGM
pgm = read_pgm(cr2_to_pgm(cr2_fname), remove_after=True)
@@ -86,9 +83,6 @@ def cr2_to_fits(
hdu.header.set('WBRGGB', exif.get('WB RGGBLevelAsShot', ''), 'From CR2')
hdu.header.set('DATE-OBS', obs_date)
- if verbose:
- print("Adding provided FITS header")
-
for key, value in fits_headers.items():
try:
hdu.header.set(key.upper()[0: 8], value)
@@ -96,8 +90,7 @@ def cr2_to_fits(
pass
try:
- if verbose:
- print("Saving fits file to: {}".format(fits_fname))
+ logger.debug(f"Saving fits file to: {fits_fname}")
hdu.writeto(fits_fname, output_verify='silentfix', overwrite=overwrite)
except Exception as e:
@@ -139,8 +132,6 @@ def cr2_to_pgm(
str -- Filename of PGM that was created
"""
- verbose = kwargs.get('verbose', False)
-
dcraw = shutil.which('dcraw')
if dcraw is None:
raise error.InvalidCommand('dcraw not found')
@@ -149,21 +140,17 @@ def cr2_to_pgm(
pgm_fname = cr2_fname.replace('.cr2', '.pgm')
if os.path.exists(pgm_fname) and not overwrite:
- if verbose:
- print("PGM file exists, returning existing file: {}".format(
- pgm_fname))
+ logger.warning(f"PGM file exists, returning existing file: {pgm_fname}")
else:
try:
# Build the command for this file
command = '{} -t 0 -D -4 {}'.format(dcraw, cr2_fname)
cmd_list = command.split()
- if verbose:
- print("PGM Conversion command: \n {}".format(cmd_list))
+ logger.debug(f"PGM Conversion command: \n {cmd_list}")
# Run the command
if subprocess.check_call(cmd_list) == 0:
- if verbose:
- print("PGM Conversion command successful")
+ logger.debug("PGM Conversion command successful")
except subprocess.CalledProcessError as err:
raise error.InvalidSystemCommand(msg="File: {} \n err: {}".format(cr2_fname, err))
diff --git a/panoptes/utils/images/fits.py b/panoptes/utils/images/fits.py
index 0bd2b1850..c53bd96eb 100644
--- a/panoptes/utils/images/fits.py
+++ b/panoptes/utils/images/fits.py
@@ -8,7 +8,8 @@
from astropy.wcs import WCS
from astropy import units as u
-from panoptes.utils import error
+from ..logger import logger
+from .. import error
def solve_field(fname, timeout=15, solve_opts=None, **kwargs):
@@ -19,12 +20,7 @@ def solve_field(fname, timeout=15, solve_opts=None, **kwargs):
timeout(int, optional): Timeout for the solve-field command,
defaults to 60 seconds.
solve_opts(list, optional): List of options for solve-field.
- verbose(bool, optional): Show output, defaults to False.
"""
- verbose = kwargs.get('verbose', False)
- if verbose:
- print("Entering solve_field")
-
solve_field_script = shutil.which('panoptes-solve-field')
if solve_field_script is None: # pragma: no cover
@@ -66,8 +62,6 @@ def solve_field(fname, timeout=15, solve_opts=None, **kwargs):
options.append('--extension=1')
cmd = [solve_field_script] + options + [fname]
- if verbose:
- print("Cmd:", cmd)
try:
proc = subprocess.Popen(cmd,
@@ -83,9 +77,6 @@ def solve_field(fname, timeout=15, solve_opts=None, **kwargs):
except Exception as e:
raise error.PanError("Timeout on plate solving: {}".format(e))
- if verbose:
- print("Returning proc from solve_field")
-
return proc
@@ -106,7 +97,6 @@ def get_solve_field(fname, replace=True, remove_extras=True, **kwargs):
Returns:
dict: Keyword information from the solved field
"""
- verbose = kwargs.get('verbose', False)
skip_solved = kwargs.get('skip_solved', True)
out_dict = {}
@@ -120,19 +110,12 @@ def get_solve_field(fname, replace=True, remove_extras=True, **kwargs):
# Check for solved file
if skip_solved and wcs.is_celestial:
-
- if verbose:
- print("Solved file exists, skipping",
- "(pass skip_solved=False to solve again):",
- fname)
+ logger.info(f"Solved file exists, skipping (use skip_solved=False to solve again): {fname}")
out_dict.update(header)
out_dict['solved_fits_file'] = fname
return out_dict
- if verbose:
- print("Entering get_solve_field:", fname)
-
# Set a default radius of 15
kwargs.setdefault('radius', 15)
@@ -147,10 +130,9 @@ def get_solve_field(fname, replace=True, remove_extras=True, **kwargs):
print(f'Errors on {fname}: {errs}')
raise error.Timeout(f'Timeout while solving: {output!r} {errs!r}')
else:
- if verbose:
- print(f'Returncode: {proc.returncode}')
- print(f'Output on {fname}: {output}')
- print(f'Errors on {fname}: {errs}')
+ logger.debug(f'Returncode: {proc.returncode}')
+ logger.debug(f'Output on {fname}: {output}')
+ logger.debug(f'Errors on {fname}: {errs}')
if proc.returncode == 3:
raise error.SolveError(f'solve-field not found: {output}')
@@ -190,31 +172,28 @@ def get_solve_field(fname, replace=True, remove_extras=True, **kwargs):
try:
out_dict.update(getheader(fname))
except OSError:
- if verbose:
- print("Can't read fits header for:", fname)
+ logger.warning(f"Can't read fits header for: {fname}")
return out_dict
-def get_wcsinfo(fits_fname, verbose=False):
+def get_wcsinfo(fits_fname, **kwargs):
"""Returns the WCS information for a FITS file.
Uses the `wcsinfo` astrometry.net utility script to get the WCS information
from a plate-solved file.
- Parameters
- ----------
- fits_fname : {str}
- Name of a FITS file that contains a WCS.
- verbose : {bool}, optional
- Verbose (the default is False)
- Returns
- -------
- dict
- Output as returned from `wcsinfo`
+ Args:
+ fits_fname ({str}): Name of a FITS file that contains a WCS.
+ **kwargs: Args that can be passed to wcsinfo.
+
+ Returns:
+ dict: Output as returned from `wcsinfo`
+
+ Raises:
+ error.InvalidCommand: Raised if `wcsinfo` is not found (part of astrometry.net)
"""
- assert os.path.exists(fits_fname), warn(
- "No file exists at: {}".format(fits_fname))
+ assert os.path.exists(fits_fname), f"No file exists at: {fits_fname}"
wcsinfo = shutil.which('wcsinfo')
if wcsinfo is None:
@@ -226,8 +205,7 @@ def get_wcsinfo(fits_fname, verbose=False):
run_cmd.append('-e')
run_cmd.append('1')
- if verbose:
- print("wcsinfo command: {}".format(run_cmd))
+ logger.debug(f"wcsinfo command: {run_cmd}")
proc = subprocess.Popen(run_cmd, stdout=subprocess.PIPE,
stderr=subprocess.STDOUT, universal_newlines=True)
@@ -303,8 +281,7 @@ def improve_wcs(fname, remove_extras=True, replace=True, timeout=30, **kwargs):
remove_extras (bool, optional): If generated files should be removed, default True.
replace (bool, optional): Overwrite existing file, default True.
timeout (int, optional): Timeout for the solve, default 30 seconds.
- **kwargs: Additional keyword args for `solve_field`. Can also include a
- `verbose` flag.
+ **kwargs: Additional keyword args for `solve_field`.
Returns:
dict: FITS headers, including solve information.
@@ -313,14 +290,10 @@ def improve_wcs(fname, remove_extras=True, replace=True, timeout=30, **kwargs):
error.SolveError: Description
error.Timeout: Description
"""
- verbose = kwargs.get('verbose', False)
out_dict = {}
output = None
errs = None
- if verbose:
- print("Entering improve_wcs: {}".format(fname))
-
options = [
'--continue',
'-t', '3',
@@ -343,9 +316,8 @@ def improve_wcs(fname, remove_extras=True, replace=True, timeout=30, **kwargs):
proc.kill()
raise error.Timeout("Timeout while solving")
else:
- if verbose:
- print("Output: {}", output)
- print("Errors: {}", errs)
+ logger.debug(f"Output: {output}")
+ logger.debug(f"Errors: {errs}")
if not os.path.exists(fname.replace('.fits', '.solved')):
raise error.SolveError('File not solved')
@@ -381,13 +353,12 @@ def improve_wcs(fname, remove_extras=True, replace=True, timeout=30, **kwargs):
try:
out_dict.update(fits.getheader(fname))
except OSError:
- if verbose:
- print("Can't read fits header for {}".format(fname))
+ logger.warning(f"Can't read fits header for {fname}")
return out_dict
-def fpack(fits_fname, unpack=False, verbose=False):
+def fpack(fits_fname, unpack=False):
"""Compress/Decompress a FITS file
Uses `fpack` (or `funpack` if `unpack=True`) to compress a FITS file
@@ -395,7 +366,6 @@ def fpack(fits_fname, unpack=False, verbose=False):
Args:
fits_fname ({str}): Name of a FITS file that contains a WCS.
unpack ({bool}, optional): file should decompressed instead of compressed, default False.
- verbose ({bool}, optional): Verbose, default False.
Returns:
str: Filename of compressed/decompressed file.
@@ -418,8 +388,7 @@ def fpack(fits_fname, unpack=False, verbose=False):
warn("fpack not found (try installing cfitsio). File has not been changed")
return fits_fname
- if verbose:
- print("fpack command: {}".format(run_cmd))
+ logger.debug(f"fpack command: {run_cmd}")
proc = subprocess.Popen(run_cmd, stdout=subprocess.PIPE,
stderr=subprocess.STDOUT, universal_newlines=True)
@@ -450,7 +419,7 @@ def funpack(*args, **kwargs):
return fpack(*args, unpack=True, **kwargs)
-def write_fits(data, header, filename, logger=None, exposure_event=None):
+def write_fits(data, header, filename, exposure_event=None, **kwargs):
"""Write FITS file to requested location.
>>> from panoptes.utils.images import fits as fits_utils
@@ -469,7 +438,6 @@ def write_fits(data, header, filename, logger=None, exposure_event=None):
data (array_like): The data to be written.
header (dict): Dictionary of items to be saved in header.
filename (str): Path to filename for output.
- logger (None|logger, optional): An optional logger.
exposure_event (None|`threading.Event`, optional): A `threading.Event` that
can be triggered when the image is written.
"""
@@ -485,12 +453,10 @@ def write_fits(data, header, filename, logger=None, exposure_event=None):
try:
hdu.writeto(filename)
except OSError as err:
- if logger:
- logger.error('Error writing image to {}!'.format(filename))
- logger.error(err)
+ logger.error('Error writing image to {}!'.format(filename))
+ logger.error(err)
else:
- if logger:
- logger.debug('Image written to {}'.format(filename))
+ logger.debug('Image written to {}'.format(filename))
finally:
if exposure_event:
exposure_event.set()
diff --git a/panoptes/utils/images/focus.py b/panoptes/utils/images/focus.py
index febdec0d7..9ea922b04 100644
--- a/panoptes/utils/images/focus.py
+++ b/panoptes/utils/images/focus.py
@@ -1,4 +1,3 @@
-import numpy as np
def focus_metric(data, merit_function='vollath_F4', **kwargs):
@@ -43,41 +42,21 @@ def vollath_F4(data, axis=None):
Returns:
float64: Calculated F4 value for y, x axis or both
"""
- if axis == 'Y' or axis == 'y':
- return _vollath_F4_y(data)
- elif axis == 'X' or axis == 'x':
- return _vollath_F4_x(data)
+ def _vollath_F4_y():
+ A1 = (data[1:] * data[:-1]).mean()
+ A2 = (data[2:] * data[:-2]).mean()
+ return A1 - A2
+
+ def _vollath_F4_x():
+ A1 = (data[:, 1:] * data[:, :-1]).mean()
+ A2 = (data[:, 2:] * data[:, :-2]).mean()
+ return A1 - A2
+
+ if str(axis).lower() == 'y':
+ return _vollath_F4_y()
+ elif str(axis).lower() == 'x':
+ return _vollath_F4_x()
elif not axis:
- return (_vollath_F4_y(data) + _vollath_F4_x(data)) / 2
+ return (_vollath_F4_y() + _vollath_F4_x()) / 2
else:
- raise ValueError(
- "axis must be one of 'Y', 'y', 'X', 'x' or None, got {}!".format(axis))
-
-
-def mask_saturated(data, saturation_level=None, threshold=0.9, dtype=np.float64):
- if not saturation_level:
- try:
- # If data is an integer type use iinfo to compute machine limits
- dtype_info = np.iinfo(data.dtype)
- except ValueError:
- # Not an integer type. Assume for now we have 16 bit data
- saturation_level = threshold * (2**16 - 1)
- else:
- # Data is an integer type, set saturation level at specified fraction of
- # max value for the type
- saturation_level = threshold * dtype_info.max
-
- # Convert data to masked array of requested dtype, mask values above saturation level
- return np.ma.array(data, mask=(data > saturation_level), dtype=dtype)
-
-
-def _vollath_F4_y(data):
- A1 = (data[1:] * data[:-1]).mean()
- A2 = (data[2:] * data[:-2]).mean()
- return A1 - A2
-
-
-def _vollath_F4_x(data):
- A1 = (data[:, 1:] * data[:, :-1]).mean()
- A2 = (data[:, 2:] * data[:, :-2]).mean()
- return A1 - A2
+ raise ValueError(f"axis must be one of 'Y', 'y', 'X', 'x' or None, got {axis}!")
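The refactor folds the two private helpers into closures and normalizes the `axis` argument. A self-contained copy demonstrates the metric end to end (numpy required; the sample arrays are illustrative):

```python
import numpy as np


def vollath_F4(data, axis=None):
    """Vollath F4 autocorrelation focus metric, matching the refactor above."""
    def f4_y():
        A1 = (data[1:] * data[:-1]).mean()
        A2 = (data[2:] * data[:-2]).mean()
        return A1 - A2

    def f4_x():
        A1 = (data[:, 1:] * data[:, :-1]).mean()
        A2 = (data[:, 2:] * data[:, :-2]).mean()
        return A1 - A2

    if str(axis).lower() == 'y':
        return f4_y()
    elif str(axis).lower() == 'x':
        return f4_x()
    elif not axis:
        return (f4_y() + f4_x()) / 2
    raise ValueError(f"axis must be one of 'Y', 'y', 'X', 'x' or None, got {axis}!")


grad = np.tile(np.arange(10, dtype=float), (10, 1))  # left-to-right ramp
```

A ramp has structure along x but none along y, so `axis='x'` scores positive while `axis='y'` scores zero.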
diff --git a/panoptes/utils/images/polar_alignment.py b/panoptes/utils/images/polar_alignment.py
index 069255b8d..1b895744c 100644
--- a/panoptes/utils/images/polar_alignment.py
+++ b/panoptes/utils/images/polar_alignment.py
@@ -12,7 +12,7 @@
from astropy.visualization.mpl_normalize import ImageNormalize
from astropy.wcs import WCS
-from panoptes.utils.images.fits import get_solve_field
+from ..images.fits import get_solve_field
def analyze_polar_rotation(pole_fn, *args, **kwargs):
diff --git a/panoptes/utils/library.py b/panoptes/utils/library.py
index 03b21e1cf..59fcb9bbd 100644
--- a/panoptes/utils/library.py
+++ b/panoptes/utils/library.py
@@ -1,11 +1,12 @@
import ctypes
import ctypes.util
+from .logger import logger
from astropy.utils import resolve_name
-from panoptes.utils import error
+from . import error
-def load_c_library(name, path=None, mode=ctypes.DEFAULT_MODE, logger=None):
+def load_c_library(name, path=None, mode=ctypes.DEFAULT_MODE, **kwargs):
"""Utility function to load a shared/dynamically linked library (.so/.dylib/.dll).
The name and location of the shared library can be manually specified with the library_path
@@ -18,7 +19,6 @@ def load_c_library(name, path=None, mode=ctypes.DEFAULT_MODE, logger=None):
mode (int, optional): mode in which to load the library, see dlopen(3) man page for
details. Should be one of ctypes.RTLD_GLOBAL, ctypes.RTLD_LOCAL, or
ctypes.DEFAULT_MODE. Default is ctypes.DEFAULT_MODE.
- logger (logging.Logger, optional): logger to use.
Returns:
ctypes.CDLL
@@ -32,12 +32,11 @@ def load_c_library(name, path=None, mode=ctypes.DEFAULT_MODE, logger=None):
# Interpret a value of None as the default.
mode = ctypes.DEFAULT_MODE
# Open library
- if logger:
- logger.debug("Opening {} library".format(name))
+ logger.debug(f"Opening {name} library")
if not path:
path = ctypes.util.find_library(name)
if not path:
- raise error.NotFound("Cound not find {} library!".format(name))
+ raise error.NotFound(f"Could not find {name} library!")
# This CDLL loader will raise OSError if the library could not be loaded
return ctypes.CDLL(path, mode=mode)
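The slimmed-down `load_c_library` reduces to find-then-open with a raise on failure. A runnable sketch (the `NotFound` class is a stand-in for `panoptes.utils.error.NotFound`; the demo assumes a POSIX host with the C math library, with a glibc soname fallback for minimal images where `ldconfig` is unavailable):

```python
import ctypes
import ctypes.util


class NotFound(Exception):
    """Stand-in for panoptes.utils.error.NotFound."""


def load_c_library(name, path=None, mode=ctypes.DEFAULT_MODE):
    """Locate and open a shared library, raising if it cannot be found."""
    if not path:
        path = ctypes.util.find_library(name)
    if not path:
        raise NotFound(f'Could not find {name} library!')
    # CDLL itself raises OSError if the file exists but cannot be loaded.
    return ctypes.CDLL(path, mode=mode)


libm_path = ctypes.util.find_library('m') or 'libm.so.6'  # glibc fallback
libm = load_c_library('m', path=libm_path)
libm.sqrt.restype = ctypes.c_double
libm.sqrt.argtypes = [ctypes.c_double]
```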
diff --git a/panoptes/utils/logger.py b/panoptes/utils/logger.py
index 64e4598bb..cefd3fa77 100644
--- a/panoptes/utils/logger.py
+++ b/panoptes/utils/logger.py
@@ -1,344 +1,61 @@
-import collections.abc
-import datetime
-import logging
-import logging.config
import os
-import re
-import string
-import sys
-from tempfile import gettempdir
-import time
-from warnings import warn
-from contextlib import suppress
+from loguru import logger
-from panoptes.utils.config import parse_config
-from panoptes.utils.serializers import from_yaml
-from panoptes.utils.serializers import to_json
-logging.getLogger('matplotlib').setLevel(logging.WARNING)
-logging.getLogger('urllib3').setLevel(logging.WARNING)
-logging.getLogger('requests').setLevel(logging.WARNING)
-# Don't want log messages from state machine library, it is very noisy and
-# we have our own way of logging state transitions
-logging.getLogger('transitions.core').setLevel(logging.WARNING)
-
-
-# We don't want to create multiple root loggers that are "identical",
-# so track the loggers in a dict keyed by a tuple of:
-# (profile, json_serialized_logger_config).
-all_loggers = {}
-
-
-def field_name_to_key(field_name):
- """Given a field_name from Formatter.parse(), extract the argument key.
-
- Args:
- field_name: expression used to identify the source of the value for
- a field. See string.Formatter.parse for more info.
-
- Returns:
- The name or index at the start of the field_name.
- """
- assert isinstance(field_name, str)
- assert len(field_name)
- m = re.match(r'^([^.[]+)', field_name)
- if not m:
- return None
- arg_name = m.group(1)
- if arg_name.isdigit():
- return int(arg_name)
- else:
- return arg_name
-
-
-def format_has_reference_keys(fmt, args):
- """Does fmt have named references in it?
-
- Args:
- fmt: A format string of the type supported by string.Formatter.
- args: A dictionary which *may* be providing values to be formatted
- according to the format fmt.
-
- Returns:
- True if fmt has any format substitution field that references an
- entry in args by a string key. False otherwise.
- """
- assert isinstance(args, dict)
- try:
- for literal_text, field_name, format_spec, conversion in string.Formatter().parse(fmt):
- if field_name:
- key = field_name_to_key(field_name)
- if isinstance(key, str) and key in args:
- return True
- except Exception:
- pass
- return False
-
-
-def format_has_legacy_style(fmt):
- """Does fmt have a % in it? I.e. is it a legacy style?
-
- We replace two %%'s in a row with nothing, see if any percents are
- left.
- """
- fmt = fmt.replace('%%', '')
- return '%' in fmt
-
-
-# formatting_methods encapsulates the different ways that we can apply
-# a format string to a dictionary of args. Those starting with legacy_
-# use the original printf style operator '%'. Those starting with
-# modern_ use the Advanced String Formatting method defined in PEP 3101.
-formatting_methods = dict(
- legacy_direct=lambda fmt, args: fmt % args,
- legacy_tuple=lambda fmt, args: fmt % (args, ),
- modern_direct=lambda fmt, args: fmt.format(args),
- modern_args=lambda fmt, args: fmt.format(*args),
- modern_kwargs=lambda fmt, args: fmt.format(**args),
-)
-
-
-def logger_msg_formatter(fmt, args):
- """Returns the formatted logger message.
-
- Python's logger package uses the old printf style formatting
- strings, rather than the newer PEP-3101 "Advanced String Formatting"
- style of formatting strings.
-
- This function supports using either style, though not both in one
- string. It examines msg to look for which style is in use,
- and is exposed as a function for easier testing.
-
- The logging package assumes that if the sole argument to the logger
- call is a dict, that the caller intends to use that dict as a source
- for mapping key substitutions in the formatting operation, so
- discards the sequence that surrounded the dict (as part of *args),
- keeping only the dict as the value of logging.LogRecord.args here.
- It happens that the old style formatting operator '%' would detect
- whether the string included keys mapping into the dict on the right
- hand side of the % operator, and if so would look them up; however,
- if the formatting string didn't include mapping keys, then a sole
- dict arg was treated as a single value, thus permitting a single
- substitution (e.g. 'This is the result: %r' % some_dict).
-
- The .format() method of strings doesn't have the described behavior,
- so this formatter class attempts to provide it.
- """
- if not args:
- return fmt
-
- # There are args, so fmt must be a format string. Select the
- # formatting methods to try based on the contents.
- method_names = []
- may_have_legacy_subst = format_has_legacy_style(fmt)
- args_are_mapping = isinstance(args, collections.abc.Mapping)
- if '{' in fmt:
- # Looks modern.
- if args_are_mapping:
- if format_has_reference_keys(fmt, args):
- method_names.append('modern_kwargs')
- else:
- method_names.append('modern_direct')
- else:
- method_names.append('modern_args')
- if may_have_legacy_subst:
- # Looks old school.
- method_names.append('legacy_direct')
-
- # Add fallback methods.
- def add_fallback(name):
- if name not in method_names:
- method_names.append(name)
- if '{' in fmt:
- add_fallback('modern_direct')
- if may_have_legacy_subst:
- add_fallback('legacy_tuple')
- elif '%' in fmt:
- add_fallback('legacy_direct')
-
- # Now try to format:
- for method_name in method_names:
- try:
- method = formatting_methods[method_name]
- return method(fmt, args)
- except Exception:
- pass
-
- warn(f'Unable to format log.')
- warn(f'Log message (format string): {fmt!r}')
- warn('Log args type: %s' % type(args))
- try:
- warn(f'Log args: {args!r}')
- except Exception: # pragma: no cover
- warn('Unable to represent log args in string form.')
- return fmt
-
-
-class StrFormatLogRecord(logging.LogRecord):
- """Allow for `str.format` style log messages
-
- Even though you can select '{' as the style for the formatter class,
- you still can't use {} formatting for your message. The custom
- `getMessage` tries new format, then falls back to legacy format.
-
- Originally inspired by https://goo.gl/Cyt5NH but much changed since
- then.
- """
-
- def getMessage(self):
- msg = str(self.msg)
- return logger_msg_formatter(msg, self.args)
-
- @property
- def text(self):
- return self.getMessage()
-
-
-def get_root_logger(profile='panoptes', log_config=None):
+def get_root_logger(profile='panoptes',
+ log_file='panoptes_{time:YYYYMMDD!UTC}.log',
+ log_dir=None,
+ log_level='DEBUG',
+ serialize=True,
+ stderr=False):
"""Creates a root logger for PANOPTES used by the PanBase object.
+    Note: The `log_dir` is determined first from `$PANLOG` if it is set, then
+    `$PANDIR/logs` if `$PANDIR` is set, otherwise `./logs`.
+
Args:
- profile (str, optional): The name of the logger to use, defaults
- to 'panoptes'.
- log_config (dict|None, optional): Configuration options for the logger.
- See https://docs.python.org/3/library/logging.config.html for
- available options. Default is `None`, which then looks up the
- values in the `log.yaml` config file.
+ profile (str, optional): The name of the logger to use, defaults to 'panoptes'.
+ log_file (str|None, optional): The filename, defaults to `panoptes_{time:YYYYMMDD!UTC}.log`.
+ log_dir (str|None, optional): The directory to place the log file, see note.
+ log_level (str, optional): Log level, defaults to 'DEBUG'. Note that it should be
+ a string that matches standard `logging` levels and also includes `TRACE`
+            (below `DEBUG`) and `SUCCESS` (above `INFO`).
+ serialize (bool, optional): If logs should be serialized to JSON, default True.
+ stderr (bool, optional): If the default `stderr` handler should be included,
+ defaults to False.
Returns:
- logger(logging.logger): A configured instance of the logger
+ `loguru.logger`: A configured instance of the logger.
"""
- # Get log info from config
- log_config = log_config if log_config else load_default()
-
- # If we already created a logger for this profile and log_config, return that.
- logger_key = (profile, to_json(log_config, sort_keys=True))
- try:
- return all_loggers[logger_key]
- except KeyError:
- pass
-
- # Alter the log_config to use UTC times
- if log_config.get('use_utc', True):
- # TODO(jamessynge): Figure out why 'formatters' is sometimes
- # missing from the log_config. It is hard to understand how
- # this could occur given that none of the callers of
- # get_root_logger pass in their own log_config.
- if 'formatters' not in log_config: # pragma: no cover
- # TODO(jamessynge): Raise a custom exception in this case instead
- # of issuing a warning; after all, a standard dict will throw a
- # KeyError in the for loop below if 'formatters' is missing.
- warn('formatters is missing from log_config!')
- warn(f'log_config: {log_config!r}')
-
- log_fname_datetime = datetime.datetime.utcnow().strftime('%Y%m%dT%H%M%SZ')
-
- # Make the log use UTC
- logging.Formatter.converter = time.gmtime
- else:
- log_fname_datetime = datetime.datetime.now().strftime('%Y%m%dT%H%M%S')
-
- # Setup log file names
- invoked_script = os.path.basename(sys.argv[0])
- log_dir = os.getenv('PANLOG', '')
- if not log_dir:
- log_dir = os.path.join(os.getenv('PANDIR', gettempdir()), 'logs')
- per_run_dir = os.path.join(log_dir, 'per-run', invoked_script)
- log_fname = '{}-{}-{}'.format(invoked_script, log_fname_datetime, os.getpid())
-
-    # Create the directory for the per-run files.
+    # Create the directory for the log files.
- os.makedirs(per_run_dir, exist_ok=True)
-
- # Set log filename and rotation
- for handler in log_config.get('handlers', []):
- # Set the filename
- partial_fname = '{}-{}.log'.format(log_fname, handler)
- full_log_fname = os.path.join(per_run_dir, partial_fname)
- log_config['handlers'][handler].setdefault('filename', full_log_fname)
-
- # Setup the TimedRotatingFileHandler for middle of day
- log_config['handlers'][handler].setdefault('atTime', datetime.time(hour=11, minute=30))
-
- # Create a symlink to the log file with just the name of the script and the handler
- # (level), as this makes it easier to find the latest file.
- log_symlink = os.path.join(log_dir, '{}-{}.log'.format(invoked_script, handler))
- log_symlink_target = os.path.abspath(full_log_fname)
- with suppress(FileNotFoundError):
- os.unlink(log_symlink)
-
- os.symlink(log_symlink_target, log_symlink)
-
- # Configure the logger
- logging.config.dictConfig(log_config)
-
- # Get the logger and set as attribute to class
- logger = logging.getLogger(profile)
-
- # Set custom LogRecord
- logging.setLogRecordFactory(StrFormatLogRecord)
+ if log_dir is None:
+ try:
+ log_dir = os.environ['PANLOG']
+ except KeyError:
+ log_dir = os.path.join(os.getenv('PANDIR', '.'), 'logs')
+ log_dir = os.path.normpath(log_dir)
+ os.makedirs(log_dir, exist_ok=True)
+
+ # Clear default stderr handler.
+ if stderr is False:
+ logger.remove()
+
+ # Serialize messages to a file.
+ log_path = os.path.normpath(os.path.join(log_dir, log_file))
+ # Turn on logging from this repo.
+ # TODO(wtgee): determine a retention (or upload) policy.
+ handler_id = logger.add(log_path,
+ rotation='11:30',
+ enqueue=True, # multiprocessing
+ serialize=serialize,
+ backtrace=True,
+ diagnose=True,
+ level=log_level)
+
+ logger._handlers = {
+ handler_id: log_path
+ }
+ logger.enable(profile)
- logger.info('{:*^80}'.format(' Starting PanLogger '))
- # TODO(jamessynge) Output name of script, cmdline args, etc. And do son
- # when the log rotates too!
- all_loggers[logger_key] = logger
return logger
-
-
-def load_default():
- return parse_config(from_yaml(DEFAULT_CONFIG))
-
-
-DEFAULT_CONFIG = """
-version: 1
-use_utc: True
-formatters:
- simple:
- format: '%(asctime)s - %(message)s'
- datefmt: '%H:%M:%S'
- detail:
- style: '{'
- format: '{levelname:.1s}{asctime}.{msecs:03.0f} {filename:>25s}:{lineno:03d}] {message}'
- datefmt: '%m%d %H:%M:%S'
-handlers:
- all:
- class: logging.handlers.TimedRotatingFileHandler
- level: DEBUG
- formatter: detail
- when: W6
- backupCount: 4
- info:
- class: logging.handlers.TimedRotatingFileHandler
- level: INFO
- formatter: detail
- when: W6
- backupCount: 4
- warn:
- class: logging.handlers.TimedRotatingFileHandler
- level: WARNING
- formatter: detail
- when: W6
- backupCount: 4
- error:
- class: logging.handlers.TimedRotatingFileHandler
- level: ERROR
- formatter: detail
- when: W6
- backupCount: 4
-loggers:
- all:
- handlers: [all]
- propagate: true
- info:
- handlers: [info]
- propagate: true
- warn:
- handlers: [warn]
- propagate: true
- error:
- handlers: [error]
- propagate: true
-root:
- level: DEBUG
- handlers: [all, warn]
-"""
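
The `$PANLOG`/`$PANDIR` fallback added to `get_root_logger` above can be exercised in isolation. This stdlib-only sketch (the `resolve_log_dir` name is invented here for illustration) mirrors that resolution order: an explicit argument wins, then `$PANLOG`, then `$PANDIR/logs`, and finally `./logs`:

```python
import os


def resolve_log_dir(log_dir=None):
    """Mirror the fallback order used in get_root_logger:
    explicit arg, then $PANLOG, then $PANDIR/logs, finally ./logs."""
    if log_dir is None:
        try:
            log_dir = os.environ['PANLOG']
        except KeyError:
            log_dir = os.path.join(os.getenv('PANDIR', '.'), 'logs')
    return os.path.normpath(log_dir)
```

Factoring the lookup this way keeps the environment-variable policy in one place, which is what the new `test_root_logger` cases below verify by toggling `PANLOG` and `PANDIR`.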
diff --git a/panoptes/utils/messaging.py b/panoptes/utils/messaging.py
index 2e3c6f225..a380c310d 100644
--- a/panoptes/utils/messaging.py
+++ b/panoptes/utils/messaging.py
@@ -1,10 +1,10 @@
import re
import zmq
-from panoptes.utils import current_time
-from panoptes.utils.logger import get_root_logger
-from panoptes.utils.serializers import from_json
-from panoptes.utils.serializers import to_json
+from .logger import logger
+from .time import current_time
+from .serializers import from_json
+from .serializers import to_json
class PanMessaging(object):
@@ -69,7 +69,7 @@ class PanMessaging(object):
a byte array of this format:
"""
- logger = get_root_logger()
+ logger = logger
# Topic names must consist of the characters.
topic_name_re = re.compile('[a-zA-Z][-a-zA-Z0-9_.:]*')
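
The `topic_name_re` rule kept above (a letter followed by letters, digits, `-`, `_`, `.`, or `:`) can be checked standalone. A small sketch, using `fullmatch` for strict validation (the `is_valid_topic` helper is invented here, not part of `PanMessaging`):

```python
import re

# Same pattern as PanMessaging.topic_name_re: a leading letter, then
# any run of letters, digits, hyphen, underscore, dot, or colon.
TOPIC_NAME_RE = re.compile('[a-zA-Z][-a-zA-Z0-9_.:]*')


def is_valid_topic(name):
    """Return True only if the entire name matches the topic rule."""
    return bool(TOPIC_NAME_RE.fullmatch(name))
```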
diff --git a/panoptes/utils/rs232.py b/panoptes/utils/rs232.py
index 1803722b5..9fc13123e 100644
--- a/panoptes/utils/rs232.py
+++ b/panoptes/utils/rs232.py
@@ -6,9 +6,9 @@
import time
from contextlib import suppress
-from panoptes.utils.logger import get_root_logger
-from panoptes.utils import error
-from panoptes.utils.serializers import from_json
+from . import error
+from .logger import logger
+from .serializers import from_json
# Note: get_serial_port_info is replaced by tests to override the normal
@@ -78,7 +78,7 @@ def __init__(self,
open_delay=0.0,
retry_limit=5,
retry_delay=0.5,
- logger=None,
+ **kwargs
):
"""Create a SerialData instance and attempt to open a connection.
@@ -96,15 +96,11 @@ def __init__(self,
open_delay: Seconds to wait after opening the port.
retry_limit: Number of times to try readline() calls in read().
retry_delay: Delay between readline() calls in read().
- logger (`logging.logger` or None, optional): A logger instance. If left as None
- then `panoptes.utils.logger.get_root_logger` will be called.
Raises:
ValueError: If the serial parameters are invalid (e.g. a negative baudrate).
"""
- if not logger:
- logger = get_root_logger()
self.logger = logger
if not port:
diff --git a/panoptes/utils/serializers.py b/panoptes/utils/serializers.py
index fd61c3980..7c56f44aa 100644
--- a/panoptes/utils/serializers.py
+++ b/panoptes/utils/serializers.py
@@ -9,7 +9,7 @@
from astropy.time import Time
from astropy import units as u
-from panoptes.utils import error
+from . import error
class StringYAML(YAML):
diff --git a/panoptes/utils/social/__init__.py b/panoptes/utils/social/__init__.py
new file mode 100644
index 000000000..e69de29bb
diff --git a/panoptes/utils/social/slack.py b/panoptes/utils/social/slack.py
index 0f06faba3..a6a85ae65 100644
--- a/panoptes/utils/social/slack.py
+++ b/panoptes/utils/social/slack.py
@@ -1,14 +1,11 @@
import requests
-
-from panoptes.utils.logger import get_root_logger
+from ..logger import logger
class SocialSlack(object):
"""Social Messaging sink to output to Slack."""
- logger = get_root_logger()
-
def __init__(self, **kwargs):
self.web_hook = kwargs.get('webhook_url', '')
if self.web_hook == '':
@@ -25,5 +22,5 @@ def send_message(self, msg, timestamp):
# We ignore the response body and headers of a successful post.
requests.post(self.web_hook, json={'text': post_msg})
- except Exception as e:
- self.logger.debug('Error posting to slack: {}'.format(e))
+ except Exception as e: # pragma: no cover
+ logger.warning('Error posting to slack: {}'.format(e))
diff --git a/panoptes/utils/social/twitter.py b/panoptes/utils/social/twitter.py
index 6200513e0..767b0dbe5 100644
--- a/panoptes/utils/social/twitter.py
+++ b/panoptes/utils/social/twitter.py
@@ -1,14 +1,11 @@
import tweepy
-
-from panoptes.utils.logger import get_root_logger
+from ..logger import logger
class SocialTwitter(object):
"""Social Messaging sink to output to Twitter."""
- logger = get_root_logger()
-
def __init__(self, **kwargs):
consumer_key = kwargs.get('consumer_key', '')
if consumer_key == '':
@@ -32,9 +29,9 @@ def __init__(self, **kwargs):
auth.set_access_token(access_token, access_token_secret)
self.api = tweepy.API(auth)
- except tweepy.TweepError:
+ except tweepy.TweepError: # pragma: no cover
msg = 'Error authenicating with Twitter. Please check your Twitter configuration.'
- self.logger.warning(msg)
+ logger.warning(msg)
raise ValueError(msg)
def send_message(self, msg, timestamp):
@@ -46,5 +43,5 @@ def send_message(self, msg, timestamp):
self.api.update_status('{} - {}'.format(msg, timestamp))
else:
self.api.update_status(msg)
- except tweepy.TweepError:
- self.logger.debug('Error tweeting message. Please check your Twitter configuration.')
+ except tweepy.TweepError: # pragma: no cover
+ logger.debug('Error tweeting message. Please check your Twitter configuration.')
diff --git a/panoptes/utils/tests/test_fits_utils.py b/panoptes/utils/tests/images/test_fits_utils.py
similarity index 83%
rename from panoptes/utils/tests/test_fits_utils.py
rename to panoptes/utils/tests/images/test_fits_utils.py
index a98e4afbf..11cb385f0 100644
--- a/panoptes/utils/tests/test_fits_utils.py
+++ b/panoptes/utils/tests/images/test_fits_utils.py
@@ -26,10 +26,10 @@ def test_fpack(solved_fits_file):
info = os.stat(copy_file)
assert info.st_size > 0.
- uncompressed = fits_utils.funpack(copy_file, verbose=True)
+ uncompressed = fits_utils.funpack(copy_file)
assert os.stat(uncompressed).st_size > info.st_size
- compressed = fits_utils.fpack(uncompressed, verbose=True)
+ compressed = fits_utils.fpack(uncompressed)
assert os.stat(compressed).st_size == info.st_size
os.remove(copy_file)
@@ -47,7 +47,7 @@ def test_getval(solved_fits_file):
def test_solve_field(solved_fits_file):
- proc = fits_utils.solve_field(solved_fits_file, verbose=True)
+ proc = fits_utils.solve_field(solved_fits_file)
assert isinstance(proc, subprocess.Popen)
proc.wait()
assert proc.returncode == 0
@@ -55,13 +55,15 @@ def test_solve_field(solved_fits_file):
def test_solve_options(solved_fits_file):
proc = fits_utils.solve_field(
- solved_fits_file, solve_opts=['--guess-scale'], verbose=False)
+ solved_fits_file, solve_opts=['--guess-scale'])
assert isinstance(proc, subprocess.Popen)
proc.wait()
assert proc.returncode == 0
def test_solve_bad_field(solved_fits_file):
- proc = fits_utils.solve_field('Foo', verbose=True)
+ proc = fits_utils.solve_field('Foo.fits')
outs, errs = proc.communicate()
+ print('outs', outs)
+ print('errs', errs)
assert 'ERROR' in errs
diff --git a/panoptes/utils/tests/test_focus_utils.py b/panoptes/utils/tests/images/test_focus_utils.py
similarity index 91%
rename from panoptes/utils/tests/test_focus_utils.py
rename to panoptes/utils/tests/images/test_focus_utils.py
index 5bca22008..8ea4f43b7 100644
--- a/panoptes/utils/tests/test_focus_utils.py
+++ b/panoptes/utils/tests/images/test_focus_utils.py
@@ -3,12 +3,13 @@
from astropy.io import fits
+from panoptes.utils.images import mask_saturated
from panoptes.utils.images import focus as focus_utils
def test_vollath_f4(data_dir):
data = fits.getdata(os.path.join(data_dir, 'unsolved.fits'))
- data = focus_utils.mask_saturated(data)
+ data = mask_saturated(data)
assert focus_utils.vollath_F4(data) == pytest.approx(14667.207897717599)
assert focus_utils.vollath_F4(data, axis='Y') == pytest.approx(14380.343807477504)
assert focus_utils.vollath_F4(data, axis='X') == pytest.approx(14954.071987957694)
@@ -18,7 +19,7 @@ def test_vollath_f4(data_dir):
def test_focus_metric_default(data_dir):
data = fits.getdata(os.path.join(data_dir, 'unsolved.fits'))
- data = focus_utils.mask_saturated(data)
+ data = mask_saturated(data)
assert focus_utils.focus_metric(data) == pytest.approx(14667.207897717599)
assert focus_utils.focus_metric(data, axis='Y') == pytest.approx(14380.343807477504)
assert focus_utils.focus_metric(data, axis='X') == pytest.approx(14954.071987957694)
@@ -28,7 +29,7 @@ def test_focus_metric_default(data_dir):
def test_focus_metric_vollath(data_dir):
data = fits.getdata(os.path.join(data_dir, 'unsolved.fits'))
- data = focus_utils.mask_saturated(data)
+ data = mask_saturated(data)
assert focus_utils.focus_metric(
data, merit_function='vollath_F4') == pytest.approx(14667.207897717599)
assert focus_utils.focus_metric(
@@ -45,6 +46,6 @@ def test_focus_metric_vollath(data_dir):
def test_focus_metric_bad_string(data_dir):
data = fits.getdata(os.path.join(data_dir, 'unsolved.fits'))
- data = focus_utils.mask_saturated(data)
+ data = mask_saturated(data)
with pytest.raises(KeyError):
focus_utils.focus_metric(data, merit_function='NOTAMERITFUNCTION')
diff --git a/panoptes/utils/tests/images/test_image_utils.py b/panoptes/utils/tests/images/test_image_utils.py
index 2c076ae23..9ae835570 100644
--- a/panoptes/utils/tests/images/test_image_utils.py
+++ b/panoptes/utils/tests/images/test_image_utils.py
@@ -14,18 +14,17 @@ def test_crop_data():
ones = np.ones((201, 201))
assert ones.sum() == 40401.
- cropped01 = img_utils.crop_data(ones, verbose=False) # False to exercise coverage.
+    cropped01 = img_utils.crop_data(ones)
assert cropped01.sum() == 40000.
- cropped02 = img_utils.crop_data(ones, verbose=True, box_width=10)
+ cropped02 = img_utils.crop_data(ones, box_width=10)
assert cropped02.sum() == 100.
- cropped03 = img_utils.crop_data(ones, verbose=True, box_width=6, center=(50, 50))
+ cropped03 = img_utils.crop_data(ones, box_width=6, center=(50, 50))
assert cropped03.sum() == 36.
# Test the Cutout2D object
cropped04 = img_utils.crop_data(ones,
- verbose=True,
box_width=20,
center=(50, 50),
data_only=False)
@@ -37,13 +36,8 @@ def test_crop_data():
def test_make_pretty_image(solved_fits_file, tiny_fits_file, save_environ):
- # Not a valid file type (can't automatically handle .fits.fz files).
- with pytest.warns(UserWarning, match='File must be'):
- assert not img_utils.make_pretty_image(solved_fits_file)
-
# Make a dir and put test image files in it.
with tempfile.TemporaryDirectory() as tmpdir:
- fz_file = os.path.join(tmpdir, os.path.basename(solved_fits_file))
fits_file = os.path.join(tmpdir, os.path.basename(tiny_fits_file))
# TODO Add a small CR2 file to our sample image files.
@@ -55,10 +49,6 @@ def test_make_pretty_image(solved_fits_file, tiny_fits_file, save_environ):
shutil.copy(solved_fits_file, tmpdir)
shutil.copy(tiny_fits_file, tmpdir)
- # Not a valid file type (can't automatically handle fits.fz files).
- with pytest.warns(UserWarning):
- assert not img_utils.make_pretty_image(fz_file)
-
# Can handle the fits file, and creating the images dir for linking
# the latest image.
imgdir = os.path.join(tmpdir, 'images')
@@ -92,10 +82,9 @@ def test_make_pretty_image_cr2_fail():
f.write('not an image file')
with pytest.raises(error.InvalidCommand):
img_utils.make_pretty_image(tmpfile,
- title='some text',
- verbose=True)
+ title='some text')
with pytest.raises(error.InvalidCommand):
- img_utils.make_pretty_image(tmpfile, verbose=True)
+ img_utils.make_pretty_image(tmpfile)
@pytest.mark.skipif("TRAVIS" not in os.environ, reason="Skipping this test if not on Travis CI.")
@@ -104,8 +93,7 @@ def test_make_pretty_image_cr2(cr2_file):
pretty_path = img_utils.make_pretty_image(cr2_file,
title='CR2 Test',
image_type='cr2',
- link_path=link_path,
- verbose=True)
+ link_path=link_path)
assert os.path.exists(pretty_path)
assert pretty_path == link_path
diff --git a/panoptes/utils/tests/test_polar_alignment.py b/panoptes/utils/tests/images/test_polar_alignment.py
similarity index 100%
rename from panoptes/utils/tests/test_polar_alignment.py
rename to panoptes/utils/tests/images/test_polar_alignment.py
diff --git a/panoptes/utils/tests/serial_handlers/protocol_arduinosimulator.py b/panoptes/utils/tests/serial_handlers/protocol_arduinosimulator.py
index abee9bf27..740829fef 100644
--- a/panoptes/utils/tests/serial_handlers/protocol_arduinosimulator.py
+++ b/panoptes/utils/tests/serial_handlers/protocol_arduinosimulator.py
@@ -14,10 +14,10 @@
import time
import urllib
+from panoptes.utils.logger import logger
from panoptes.utils.tests import serial_handlers
from panoptes.utils.serializers import to_json
from panoptes.utils.serializers import from_json
-from panoptes.utils.logger import get_root_logger
def _drain_queue(q):
@@ -36,7 +36,7 @@ class ArduinoSimulator:
at a rate similar to 9600 baud, the rate used by our Arduino sketches.
"""
- def __init__(self, message, relay_queue, json_queue, chunk_size, stop, logger):
+ def __init__(self, message, relay_queue, json_queue, chunk_size, stop):
"""
Args:
message: The message to be sent (millis and report_num will be added).
@@ -48,14 +48,13 @@ def __init__(self, message, relay_queue, json_queue, chunk_size, stop, logger):
length up to chunk_size).
chunk_size: The number of bytes to write to json_queue at a time.
stop: a threading.Event which is checked to see if run should stop executing.
- logger: the Python logger to use for reporting messages.
"""
self.message = copy.deepcopy(message)
- get_root_logger().critical(f'message: {message}')
+ self.logger = logger
+ self.logger.critical(f'message: {message}')
self.relay_queue = relay_queue
self.json_queue = json_queue
self.stop = stop
- self.logger = logger
# Time between producing messages.
self.message_delta = datetime.timedelta(seconds=2)
self.next_message_time = None
@@ -195,7 +194,7 @@ def generate_next_message_bytes(self, now):
class FakeArduinoSerialHandler(serial_handlers.NoOpSerial):
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
- self.logger = get_root_logger()
+ self.logger = logger
self.simulator_thread = None
self.relay_queue = queue.Queue(maxsize=1)
self.json_queue = queue.Queue(maxsize=1)
diff --git a/panoptes/utils/tests/test_database.py b/panoptes/utils/tests/test_database.py
index 9653bb977..0c9c580eb 100644
--- a/panoptes/utils/tests/test_database.py
+++ b/panoptes/utils/tests/test_database.py
@@ -1,7 +1,7 @@
import pytest
from panoptes.utils.database import PanDB
-from panoptes.utils.error import InvalidCollection
+from panoptes.utils import error
def test_bad_db():
@@ -50,21 +50,29 @@ def test_simple_insert(db):
assert record['data']['test'] == rec['test']
-# Filter out (hide) "UserWarning: Collection not available"
+def test_bad_insert(db):
+    """Can't serialize `db`, so an `error.InvalidSerialization` is raised."""
+ with pytest.raises(error.InvalidSerialization):
+ rec = db.insert_current('config', db, store_permanently=False)
+ assert rec is None
+
+ with pytest.raises(error.InvalidSerialization):
+ rec = db.insert('config', db)
+ assert rec is None
+
+
@pytest.mark.filterwarnings('ignore')
def test_bad_collection(db):
- with pytest.raises(InvalidCollection):
+ with pytest.raises(error.InvalidCollection):
db.insert_current('foobar', {'test': 'insert'})
- with pytest.raises(InvalidCollection):
+ with pytest.raises(error.InvalidCollection):
db.insert('foobar', {'test': 'insert'})
def test_warn_bad_object(db):
- db.logger = None
-
- with pytest.warns(UserWarning):
+ with pytest.raises(error.InvalidSerialization):
db.insert_current('observations', {'junk': db})
- with pytest.warns(UserWarning):
+ with pytest.raises(error.InvalidSerialization):
db.insert('observations', {'junk': db})
diff --git a/panoptes/utils/tests/test_logger.py b/panoptes/utils/tests/test_logger.py
index a4f4e7f60..6b351c154 100644
--- a/panoptes/utils/tests/test_logger.py
+++ b/panoptes/utils/tests/test_logger.py
@@ -1,104 +1,62 @@
+import time
+import os
import pytest
-from panoptes.utils.logger import field_name_to_key
-from panoptes.utils.logger import logger_msg_formatter
-
-
-def test_field_name_to_key():
- assert not field_name_to_key('.')
- assert not field_name_to_key('[')
- assert field_name_to_key('abc') == 'abc'
- assert field_name_to_key(' abc ') == ' abc '
- assert field_name_to_key('abc.def') == 'abc'
- assert field_name_to_key('abc[1].def') == 'abc'
-
-
-def test_logger_msg_formatter_1_dict():
- d = dict(abc='def', xyz=123)
-
- tests = [
- # Single anonymous reference, satisfied by the entire dict.
- ('{}', "{'abc': 'def', 'xyz': 123}"),
-
- # Single anonymous reference, satisfied by the entire dict.
- ('{!r}', "{'abc': 'def', 'xyz': 123}"),
-
- # Position zero references, satisfied by the entire dict.
- ('{0} {0}', "{'abc': 'def', 'xyz': 123} {'abc': 'def', 'xyz': 123}"),
-
- # Reference to a valid key in the dict.
- ('{xyz}', "123"),
-
- # Invalid modern reference, so %s format applied.
- ('%s {1}', "{'abc': 'def', 'xyz': 123} {1}"),
-
- # Valid legacy format applied to whole dict.
- ('%r', "{'abc': 'def', 'xyz': 123}"),
- ('%%', "%"),
- ]
-
- for fmt, msg in tests:
- assert logger_msg_formatter(fmt, d) == msg, fmt
-
- # Now tests with entirely invalid formats, so warnings should be issued.
- tests = [
- '%(2)s',
- '{def}',
- '{def',
- 'def}',
- '%d',
- # Bogus references either way.
- '{0} {1} %(2)s'
- ]
-
- for fmt in tests:
- with pytest.warns(UserWarning):
- assert logger_msg_formatter(fmt, d) == fmt
-
-
-def test_logger_msg_formatter_1_non_dict():
- d = ['abc', 123]
-
- tests = [
- # Single anonymous reference, satisfied by first element.
- ('{}', "abc"),
-
- # Single anonymous reference, satisfied by first element.
- ('{!r}', "'abc'"),
-
- # Position references, satisfied by elements.
- ('{1} {0!r}', "123 'abc'"),
-
- # Valid modern reference, %s ignored.
- ('%s {1}', "%s 123"),
-
- # Valid legacy format applied to whole list.
- ('%r', "['abc', 123]"),
-
- # Valid legacy format applied to whole list.
- ('%s', "['abc', 123]"),
- ]
-
- for fmt, msg in tests:
- assert logger_msg_formatter(fmt, d) == msg, fmt
-
- # Now tests with entirely invalid formats, so warnings should be issued.
- tests = [
- # We only have two args, so a reference to a third should fail.
- '{2}',
- '%(2)s',
- # Unknown key
- '{def}',
- '%(def)s',
- # Malformed key
- '{2',
- '{',
- '2}',
- '}',
- '{}{}{}',
- '%d',
- ]
-
- for fmt in tests:
- with pytest.warns(UserWarning):
- assert logger_msg_formatter(fmt, d) == fmt
+from panoptes.utils.logger import get_root_logger
+
+
+@pytest.fixture()
+def profile():
+ return 'testing'
+
+
+def test_logger_no_output(caplog, profile, tmp_path):
+    # With stderr=False there is no stderr handler, so caplog captures nothing.
+ log_file = os.path.join(str(tmp_path), 'testing.log')
+ logger = get_root_logger(log_file='testing.log',
+ log_dir=str(tmp_path),
+ profile=profile,
+ stderr=False)
+ msg = "You won't see me"
+ logger.debug(msg)
+ time.sleep(0.5) # Give it time to write.
+
+ # Not in stderr output
+ assert len(caplog.records) == 0
+
+ # But is in file
+ assert os.path.exists(log_file)
+ with open(log_file, 'r') as f:
+ assert msg in f.read()
+
+
+def test_base_logger(caplog, profile, tmp_path):
+ logger = get_root_logger(log_dir=str(tmp_path), profile=profile, stderr=True)
+ logger.debug('Hello')
+ assert caplog.records[-1].message == 'Hello'
+
+
+def test_root_logger(caplog, profile, tmp_path):
+ logger = get_root_logger(log_dir=str(tmp_path), profile=profile, stderr=True)
+ logger.debug('Hi')
+ assert os.listdir(tmp_path)[0].startswith('panoptes_')
+ assert caplog.records[-1].message == 'Hi'
+ assert caplog.records[-1].levelname == 'DEBUG'
+
+ os.environ['PANLOG'] = str(tmp_path)
+ logger = get_root_logger(log_file='foo.log', profile=profile, stderr=True)
+ logger.info('Bye', extra=dict(foo='bar'))
+ assert len(os.listdir(tmp_path)) == 2
+ assert os.listdir(tmp_path)[-1] == 'foo.log'
+ assert caplog.records[-1].message == 'Bye'
+ assert caplog.records[-1].levelname == 'INFO'
+
+ del os.environ['PANLOG']
+ os.environ['PANDIR'] = str(tmp_path)
+ logger = get_root_logger(profile=profile, stderr=True)
+ logger.critical('Bye Again')
+ dir_name = os.path.join(str(tmp_path), 'logs')
+ assert os.path.isdir(dir_name)
+ assert len(os.listdir(dir_name)) == 1
+ assert caplog.records[-1].message == 'Bye Again'
+ assert caplog.records[-1].levelname == 'CRITICAL'
diff --git a/panoptes/utils/tests/test_utils.py b/panoptes/utils/tests/test_utils.py
index 45bb93b9a..313992f5d 100644
--- a/panoptes/utils/tests/test_utils.py
+++ b/panoptes/utils/tests/test_utils.py
@@ -7,7 +7,6 @@
from panoptes.utils import error
from panoptes.utils.library import load_module
from panoptes.utils.library import load_c_library
-from panoptes.utils.logger import get_root_logger
def test_bad_load_module():
@@ -20,7 +19,7 @@ def test_load_c_library():
libc = load_c_library('c')
assert libc._name[:4] == 'libc'
- libc = load_c_library('c', mode=None, logger=get_root_logger())
+ libc = load_c_library('c', mode=None)
assert libc._name[:4] == 'libc'
diff --git a/panoptes/utils/theskyx.py b/panoptes/utils/theskyx.py
index 9b84c3b62..fa86b0327 100644
--- a/panoptes/utils/theskyx.py
+++ b/panoptes/utils/theskyx.py
@@ -1,7 +1,7 @@
import socket
-from panoptes.utils import error
-from panoptes.utils.logger import get_root_logger
+from .logger import logger
+from . import error
class TheSkyX(object):
@@ -12,7 +12,7 @@ class TheSkyX(object):
"""
def __init__(self, host='localhost', port=3040, connect=True, *args, **kwargs):
- self.logger = get_root_logger()
+ self.logger = logger
self._host = host
self._port = port
diff --git a/panoptes/utils/time.py b/panoptes/utils/time.py
index 17326c64e..59021642f 100644
--- a/panoptes/utils/time.py
+++ b/panoptes/utils/time.py
@@ -5,6 +5,8 @@
from astropy import units as u
from astropy.time import Time
+from .logger import logger
+
def current_time(flatten=False, datetime=False, pretty=False):
""" Convenience method to return the "current" time according to the system.
@@ -111,8 +113,7 @@ def __init__(self, duration):
if isinstance(duration, u.Quantity):
duration = duration.to(u.second).value
elif not isinstance(duration, (int, float)):
- raise ValueError(
- 'duration (%r) is not a supported type: %s' % (duration, type(duration)))
+ raise ValueError(f'duration ({duration}) is not a supported type: {type(duration)}')
#: bool: True IFF the duration is zero.
assert duration >= 0, "Duration must be non-negative."
@@ -121,6 +122,15 @@ def __init__(self, duration):
self.duration = float(duration)
self.restart()
+    def __str__(self):
+        is_blocking = '(blocking) ' if self.is_non_blocking is False else ''
+        is_expired = 'EXPIRED ' if self.expired() else ''
+        return f'{is_expired}Timer {is_blocking}{self.time_left():.02f}/{self.duration:.02f}'
+
def expired(self):
"""Return a boolean, telling if the timeout has expired.
@@ -149,6 +159,7 @@ def time_left(self):
def restart(self):
"""Restart the timed duration."""
self.target_time = time.monotonic() + self.duration
+ logger.debug(f'Restarting {self}')
def sleep(self, max_sleep=None):
"""Sleep until the timer expires, or for max_sleep, whichever is sooner.
@@ -158,12 +169,18 @@ def sleep(self, max_sleep=None):
Returns:
True if slept for less than time_left(), False otherwise.
"""
+ # Sleep for remaining time by default.
remaining = self.time_left()
if not remaining:
return False
+ sleep_time = remaining
+
+ # Sleep only for max time if requested.
if max_sleep and max_sleep < remaining:
assert max_sleep > 0
- time.sleep(max_sleep)
- return True
- time.sleep(remaining)
- return False
+ sleep_time = max_sleep
+
+ logger.debug(f'Sleeping for {sleep_time:.02f} seconds')
+ time.sleep(sleep_time)
+
+ return sleep_time < remaining
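
The reworked `sleep()` above changes the return contract: it now reports whether the sleep was shorter than the remaining time, rather than whether `max_sleep` was used. A self-contained sketch of that contract (the `MiniTimer` name is invented here; it mirrors only the relevant parts of `CountdownTimer`):

```python
import time


class MiniTimer:
    """Toy stand-in for CountdownTimer, showing the new sleep() contract:
    return True iff the sleep was capped below the remaining time."""

    def __init__(self, duration):
        self.duration = float(duration)
        self.target_time = time.monotonic() + self.duration

    def time_left(self):
        return max(0.0, self.target_time - time.monotonic())

    def sleep(self, max_sleep=None):
        # Sleep for the remaining time by default.
        remaining = self.time_left()
        if not remaining:
            return False
        sleep_time = remaining

        # Sleep only for the max time if requested.
        if max_sleep and max_sleep < remaining:
            assert max_sleep > 0
            sleep_time = max_sleep

        time.sleep(sleep_time)
        return sleep_time < remaining
```

So a capped sleep on a live timer returns True, while sleeping out the full duration (or calling `sleep()` on an expired timer) returns False.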
diff --git a/panoptes/utils/utils.py b/panoptes/utils/utils.py
index aab349dd7..f4d996582 100644
--- a/panoptes/utils/utils.py
+++ b/panoptes/utils/utils.py
@@ -11,7 +11,8 @@
from astropy.coordinates import ICRS
from astropy.coordinates import SkyCoord
-from panoptes.utils.time import current_time
+from .time import current_time
+
PATH_MATCHER = re.compile(
r'.*(?PPAN\d{3})/(?P[a-gA-G1-9]{6})/(?P.*?)/(?P.*?)\..*')
@@ -161,7 +162,7 @@ def string_to_params(opts):
return args, kwargs
-def altaz_to_radec(alt=35, az=90, location=None, obstime=None, verbose=False):
+def altaz_to_radec(alt=35, az=90, location=None, obstime=None, **kwargs):
"""Convert alt/az degrees to RA/Dec SkyCoord.
>>> from panoptes.utils import altaz_to_radec
@@ -178,8 +179,7 @@ def altaz_to_radec(alt=35, az=90, location=None, obstime=None, verbose=False):
- >>> altaz_to_radec(location=keck, obstime='2020-02-02T20:20:02.02', verbose=True)
- Getting coordinates for Alt 35 Az 90, from (-5464487..., -2492806..., 2151240.19451846) m at 2020-02-02T20:20:02.02
+ >>> altaz_to_radec(location=keck, obstime='2020-02-02T20:20:02.02')
@@ -196,7 +196,6 @@ def altaz_to_radec(alt=35, az=90, location=None, obstime=None, verbose=False):
az (int, optional): Azimute, defaults to 90 (east)
location (None|astropy.coordinates.EarthLocation, required): A valid location.
obstime (None, optional): Time for object, defaults to `current_time`
- verbose (bool, optional): Verbose, default False.
Returns:
astropy.coordinates.SkyCoord: Coordinates corresponding to the AltAz.
@@ -205,10 +204,6 @@ def altaz_to_radec(alt=35, az=90, location=None, obstime=None, verbose=False):
if obstime is None:
obstime = current_time()
- if verbose:
- print("Getting coordinates for Alt {} Az {}, from {} at {}".format(
- alt, az, location, obstime))
-
altaz = AltAz(obstime=obstime, location=location, alt=alt * u.deg, az=az * u.deg)
return SkyCoord(altaz.transform_to(ICRS))
diff --git a/requirements.txt b/requirements.txt
index f4ac1fa90..894a2dada 100644
--- a/requirements.txt
+++ b/requirements.txt
@@ -14,6 +14,7 @@ imageio==2.6.1 # via scikit-image
itsdangerous==1.1.0 # via flask
jinja2==2.11.1 # via flask
kiwisolver==1.1.0 # via matplotlib
+loguru==0.4.1
markupsafe==1.1.1 # via jinja2
matplotlib==3.1.3
networkx==2.4 # via scikit-image
@@ -23,6 +24,7 @@ pillow==7.0.0 # via imageio, scikit-image
pyparsing==2.4.6 # via matplotlib
pyserial==3.4
python-dateutil==2.8.1
+python-json-logger==0.1.11
pytz==2019.3 # via astroplan
pywavelets==1.1.1 # via scikit-image
pyyaml==5.3
@@ -37,4 +39,4 @@ versioneer==0.18
werkzeug==1.0.0 # via flask
# The following packages are considered to be unsafe in a requirements file:
-# setuptools
+# setuptools==45.2.0 # via kiwisolver
diff --git a/scripts/build_containers.sh b/scripts/build_containers.sh
deleted file mode 100755
index efabd7a66..000000000
--- a/scripts/build_containers.sh
+++ /dev/null
@@ -1,22 +0,0 @@
-#!/bin/bash -e
-SOURCE_DIR="${PANDIR}/panoptes-utils"
-BASE_CLOUD_FILE="cloudbuild-base-${1:-all}.yaml"
-CLOUD_FILE="cloudbuild-utils-${1:-all}.yaml"
-
-
-if [[ $* == *--base* ]]; then
- echo "Using ${BASE_CLOUD_FILE}"
- echo "Building panoptes-base!"
- gcloud builds submit \
- --timeout="5h" \
- --config "${SOURCE_DIR}/docker/${BASE_CLOUD_FILE}" \
- "${SOURCE_DIR}"
-fi
-
-echo "Using ${CLOUD_FILE}"
-echo "Building panoptes-utils"
-gcloud builds submit \
- --timeout="5h" \
- --config "${SOURCE_DIR}/docker/${CLOUD_FILE}" \
- "${SOURCE_DIR}"
-
diff --git a/scripts/testing/run-tests.sh b/scripts/testing/run-tests.sh
index 2f7dc7be2..3bd81eab9 100755
--- a/scripts/testing/run-tests.sh
+++ b/scripts/testing/run-tests.sh
@@ -11,7 +11,7 @@ pip install -e ".[all]"
export PYTHONPATH="$PYTHONPATH:$PANDIR/panoptes-utils/scripts/testing/coverage"
export COVERAGE_PROCESS_START="${PANDIR}/panoptes-utils/.coveragerc"
-coverage run "$(command -v pytest)" -vvrs --test-databases all
+coverage run "$(command -v pytest)" -vv -rfes --test-databases all
# Upload coverage reports if running from Travis.
if [[ $TRAVIS ]]; then
diff --git a/scripts/testing/test-software.sh b/scripts/testing/test-software.sh
index c8b788e6d..abc06f46d 100755
--- a/scripts/testing/test-software.sh
+++ b/scripts/testing/test-software.sh
@@ -20,6 +20,7 @@ sleep 5;
docker run --rm -it \
-e LOCAL_USER_ID=$(id -u) \
-v /var/panoptes/panoptes-utils:/var/panoptes/panoptes-utils \
+ -v /var/panoptes/logs:/var/panoptes/logs \
gcr.io/panoptes-exp/panoptes-utils \
"${PANDIR}/panoptes-utils/scripts/testing/run-tests.sh"
diff --git a/setup.py b/setup.py
index d872c2257..fd6b585ba 100644
--- a/setup.py
+++ b/setup.py
@@ -1,7 +1,6 @@
#!/usr/bin/env python
# Licensed under an MIT style license - see LICENSE.txt
-from distutils.command.build_py import build_py
from configparser import ConfigParser
from setuptools import setup, find_namespace_packages
@@ -25,20 +24,17 @@
PACKAGENAME = metadata.get('package_name', 'packagename')
URL = metadata.get('url', 'https://projectpanoptes.org')
-requirements = list()
-requirements_fn = 'requirements.txt'
-with open(requirements_fn) as f:
- requirements = f.read().splitlines()
-
modules = {
'required': [
'astroplan>=0.6',
'astropy>=4.0.0',
'Flask',
+ 'loguru',
'matplotlib>=3.0.0',
'numpy',
'photutils',
'pyserial',
+ 'python-json-logger',
'python-dateutil',
'PyYAML',
'pyzmq',
@@ -54,8 +50,8 @@
'coverage',
'coveralls',
'mocket',
- 'pycodestyle==2.3.1',
- 'pytest>=3.6',
+ 'pycodestyle',
+ 'pytest',
'pytest-cov',
'pytest-remotedata>=0.3.1'
],
@@ -75,10 +71,6 @@
python_requires='>=3.6',
setup_requires=['pytest-runner'],
tests_require=modules['testing'],
- # List additional groups of dependencies here (e.g. development
- # dependencies). You can install these using the following syntax,
- # for example:
- # $ pip install -e .[dev,test]
scripts=[
'bin/cr2-to-jpg',
'bin/panoptes-config-server',