
create data export example #33

Closed · prjemian opened this issue Nov 9, 2018 · 8 comments

prjemian commented Nov 9, 2018

Create an example notebook showing how to move scans from one broker to another using the Broker API.

Related to a need at APS 3-ID-D
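
For reference, a minimal sketch of what such a notebook might show, using the v1 Broker API. The configuration names and the search criteria here are hypothetical, and runs with external (filled) data would need extra care:

from databroker import Broker

# hypothetical configuration names for the source and target catalogs
source = Broker.named("source_config")
target = Broker.named("target_config")

# copy every document of each matching run into the target broker
for header in source(plan_name="scan"):      # any search criteria work here
    for name, doc in header.documents():    # start, descriptor, event, stop
        target.insert(name, doc)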

prjemian self-assigned this Nov 9, 2018
prjemian added a commit that referenced this issue Nov 9, 2018

prjemian commented Nov 9, 2018

These are the contents of the sqlite directory just after the exception:

-rw-r--r-- 1 user group    0 Nov  9 16:40 6ea22d45-7b57-4936-b007-a6129a04a28c.sqlite
-rw-r--r-- 1 user group    2 Nov  9 16:40 event_descriptors.json
-rw-r--r-- 1 user group 5.7K Nov  9 16:40 run_starts.json
-rw-r--r-- 1 user group    2 Nov  9 16:41 run_stops.json


prjemian commented Nov 9, 2018

OperationalError: duplicate column name: data_NFS

There is only one data column with "NFS" in it.

There seems to be trouble creating the SQLite columns from data keys (ophyd object names) that have embedded whitespace. These names come from the text in the scaler channel labels; it was OK to acquire data with such names.

This looks to me like an assumption bug in the export code, in one of these places (a reproduction sketch follows the list):

  • databroker/headersource/core.py
  • databroker/headersource/sqlite.py
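
For what it's worth, a sketch of one way this exact error can arise in SQLite when column names containing whitespace are not quoted. This is a guess at the failure mode, not the actual export code, and the channel labels here are made up (the real colliding keys in the session above are unknown):

import sqlite3

con = sqlite3.connect(":memory:")

# Unquoted SQL identifiers end at the first whitespace; everything after
# the space is parsed as a (free-form) column type.  Two keys that share
# the leading token therefore collide on the same column name:
keys = ["NFS mca1", "NFS mca2"]   # hypothetical scaler channel labels
columns = ", ".join("data_%s REAL" % key for key in keys)
try:
    con.execute("CREATE TABLE events (%s)" % columns)
except sqlite3.OperationalError as exc:
    print(exc)   # duplicate column name: data_NFS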


prjemian commented May 3, 2020

@tcaswell: FYI only. In a recent Zoom meeting, I mentioned previous problems trying to move data between two MongoDB servers for databroker. You emphasized the importance of reporting problems. This issue records part of that problem, which continues to sit untouched. I am not sure it ever rose to the level of a bug report, since I was learning at the time.

Since there is renewed attention on moving data between databroker instances with the Pilot project, this could follow that effort.


prjemian commented May 3, 2020

So, still not a problem to be fixed but rather a procedure to be determined.


prjemian commented Dec 4, 2020

For 4-ID-C Polar, follow this procedure:

pip install databroker-pack
databroker-pack mongodb_config --copy-external --query '{"scan_id": {"$in": [135, 136, 137]}}' exported_scans

Be careful about the use of single-quotes and double-quotes above; they matter.

Other search criteria are possible, using the MongoDB query language.
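
A sketch of previewing such a query from Python before packing, assuming the same mongodb_config catalog is reachable through the v1 API (raw MongoDB query operators pass through the search):

from databroker import Broker

db = Broker.named("mongodb_config")

# the same MongoDB-style query as above, to check what would be packed
for header in db(scan_id={"$in": [135, 136, 137]}):
    print(header.start["uid"], header.start["scan_id"])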


prjemian commented Dec 4, 2020

This command was a successful export:

databroker-pack mongodb_config -q "TimeRange(since='2020-09-01', until='2020-09-02')" /tmp/0901

Now, what about in a Jupyter notebook?
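
The search itself translates directly; a sketch of the equivalent query in a notebook, using databroker.queries.TimeRange from the v2 API and assuming the catalog is registered as mongodb_config:

import databroker
from databroker.queries import TimeRange

catalog = databroker.catalog["mongodb_config"]
results = catalog.search(TimeRange(since="2020-09-01", until="2020-09-02"))
for uid in results:     # iterating a catalog yields run uids
    print(uid)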


prjemian commented Dec 4, 2020

No notebook example here; the Python API looks complicated compared with the command-line tools. Maybe a Markdown page showing how to use the command-line tools instead.


prjemian commented Dec 4, 2020

See the documentation for instructions to pack and unpack. The documentation addresses copying from one broker to another using the command-line tools.
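
After unpacking (for example, databroker-unpack inplace exported_scans my_catalog, per those instructions), the new catalog can be read back from Python like any other; a sketch, with the catalog name hypothetical:

import databroker

catalog = databroker.catalog["my_catalog"]   # the name given at unpack time
run = catalog[-1]                 # most recent run in the unpacked catalog
print(run.metadata["start"])      # its start document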
