
TimeSeries Query for last 5 Minutes not Last 5 Minutes #2459

Closed
waprin opened this issue Sep 28, 2016 · 5 comments

Contributor

waprin commented Sep 28, 2016

The usage docs say

query = client.query(METRIC, minutes=5)

will give you stats for the last five minutes. But since the code replaces the seconds and microseconds with 0, it isn't actually the last 5 minutes: the interval between the end of the previous minute and the current second is never queried.
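
For illustration, a minimal sketch of the truncation being described (plain `datetime` only, not the library's internals):

```python
from datetime import datetime, timedelta

# Assumed default behavior: the end time is "now" with seconds and
# microseconds zeroed out, as described above.
now = datetime.utcnow()                          # e.g. 19:29:57.726637
end_time = now.replace(second=0, microsecond=0)  # 19:29:00.000000
start_time = end_time - timedelta(minutes=5)     # 19:24:00.000000

# Anything written between end_time and now (up to ~60 seconds of data)
# falls outside [start_time, end_time] and is never returned.
print(start_time, end_time, now)
```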

The reason this came up is that in the system tests I write a point and then try to query it back, but I never get the time series even though it's there, because:

write_point: 2016-09-28T19:29:57.726637
query end_time: 2016-09-28T19:29:00.000000Z

so even though write_point happened first, its end_time is after the end_time of the query.

Maybe people think second-level granularity doesn't matter? I'm curious why replacing the seconds/microseconds with 0 was done in the first place. At a minimum, I think the docs are slightly misleading, since it's not the last 5 minutes in the colloquial sense (it doesn't include the seconds elapsed so far in the current minute).

cc @rimey @supriyagarg

Contributor

supriyagarg commented Sep 28, 2016

@waprin: To get the data for a time period ending exactly now, it is recommended to explicitly specify the end time, either during initialization or using select_interval. Given that you can write up to one point per minute, the default round-minute end time could otherwise miss the latest data point.

The end time of the query determines the exact timestamps used when aligning, so zeroing out the seconds and microseconds makes the results look nicer and easier to understand.
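
For reference, a rough sketch of the two workarounds suggested above. This assumes `client.query()` accepts an explicit `end_time` at initialization and that `select_interval()` takes an end time and a start time; the exact signatures may differ:

```python
import datetime

from google.cloud import monitoring

client = monitoring.Client()
end_time = datetime.datetime.utcnow()

# Option 1 (assumed): pass an explicit end time at initialization.
query = client.query('compute.googleapis.com/instance/cpu/utilization',
                     end_time=end_time, minutes=5)

# Option 2: pin the interval explicitly with select_interval().
query = client.query('compute.googleapis.com/instance/cpu/utilization')
query = query.select_interval(end_time,
                              end_time - datetime.timedelta(minutes=5))

for timeseries in query:
    print(timeseries)
```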

Contributor Author

waprin commented Sep 28, 2016

OK, that's what I did in #2460, so we could leave it at that. However, I still think the behavior is a bit surprising, so I'm curious whether you:

  • Can explain the motivation for replacing the seconds
  • Agree we should at least call this out more explicitly in the docs, rather than just saying "the last 5 minutes"

Contributor

supriyagarg commented Sep 28, 2016

The motivation is as I mentioned in the previous comment:

The end time of the query determines the exact timestamps used when aligning, so zeroing out the seconds and microseconds makes the results look nicer and easier to understand.

In most cases alignment is indeed required, so this applies to the majority of use cases. Also, for specifying exact time intervals, select_interval is better suited. The init method is designed to support the common use cases, especially when trying out the library.

Agreed that the top level docs should be updated. Something like the following would work:

Display CPU utilization across your GCE instances over a 5-minute duration ending at the start of the current minute.
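
As an illustration of why the round-minute default pairs well with alignment, here is a hedged sketch; the `align()` call and the `'ALIGN_MEAN'` aligner name are assumptions about the client API, not taken from this thread:

```python
from google.cloud import monitoring

client = monitoring.Client()

# With the default round-minute end time, per-minute alignment produces
# points on :00 boundaries (e.g. 19:25:00, 19:26:00, ...) rather than at
# odd offsets like 19:25:57.726637, which keeps charts and tables tidy.
query = client.query('compute.googleapis.com/instance/cpu/utilization',
                     minutes=5)
query = query.align('ALIGN_MEAN', minutes=1)  # assumed aligner/signature

for timeseries in query:
    for point in timeseries.points:
        print(point)
```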

Contributor

rimey commented Sep 28, 2016

It's good to know that this pitfall exists, but we have to focus on normal end-user use cases. Using a round minute as the default end time makes aligned data much prettier. I found the library annoying to use when the exact current time was the default.

Contributor Author

waprin commented Sep 28, 2016

Yeah, it makes sense to me. Writing a time series and immediately reading it back is probably something we'll only do in the system tests, and I'm sure round minutes are cleaner on charts and graphs. I still think the doc change is a good idea just in case, so after that gets merged I think we can close the issue.

dhermes added the "api: monitoring" label (Issues related to the Cloud Monitoring API) on Sep 29, 2016
parthea pushed a commit that referenced this issue Oct 21, 2023
* Tables Notebooks [(#2090)](GoogleCloudPlatform/python-docs-samples#2090)

* initial commit
* update census
* update notebooks

* remove the reference to a bug [(#2100)](GoogleCloudPlatform/python-docs-samples#2100)

as the bug has been fixed in the public client lib

* delete this file. [(#2102)](GoogleCloudPlatform/python-docs-samples#2102)

* rename file name [(#2103)](GoogleCloudPlatform/python-docs-samples#2103)

* trying to fix images [(#2101)](GoogleCloudPlatform/python-docs-samples#2101)

* remove typo in installation [(#2110)](GoogleCloudPlatform/python-docs-samples#2110)

* Rename census_income_prediction.ipynb to getting_started_notebook.ipynb [(#2115)](GoogleCloudPlatform/python-docs-samples#2115)

renaming the notebooks to Getting Started (to be in sync with the doc). It would be great if the folder could be renamed too

* added back missing file package import [(#2150)](GoogleCloudPlatform/python-docs-samples#2150)

* added back missing file import [(#2145)](GoogleCloudPlatform/python-docs-samples#2145)

* remove incorrect reference to Iris dataset [(#2203)](GoogleCloudPlatform/python-docs-samples#2203)

* conversion to jupyter/colab [(#2340)](GoogleCloudPlatform/python-docs-samples#2340)

plus bug fixes

* updated for the Jupyter support [(#2337)](GoogleCloudPlatform/python-docs-samples#2337)

* updated readme for support Jupyter [(#2336)](GoogleCloudPlatform/python-docs-samples#2336)

to be approved together with the updated notebook supporting Jupyter

* conversion to jupyter/colab [(#2339)](GoogleCloudPlatform/python-docs-samples#2339)

plus bug fixes

* conversion of notebook for jupyter/Colab [(#2338)](GoogleCloudPlatform/python-docs-samples#2338)

conversion of the notebook to support both Jupyter and Colab + bug fixes

* [BLOCKED] AutoML Tables: Docs samples updated to use new (pending) client [(#2276)](GoogleCloudPlatform/python-docs-samples#2276)

* AutoML Tables: Docs samples updated to use new (pending) client

* Linter warnings

* add product recommendation for automl tables notebook [(#2257)](GoogleCloudPlatform/python-docs-samples#2257)

* added colab filtering notebook

* update to tables client

* update readme

* tell user to restart kernel for automl

* AutoML Tables: Notebook samples updated to use new tables client [(#2424)](GoogleCloudPlatform/python-docs-samples#2424)

* fix users bug and emphasize kernel restart [(#2407)](GoogleCloudPlatform/python-docs-samples#2407)

* fix problems with automl docs [(#2501)](GoogleCloudPlatform/python-docs-samples#2501)

Today, when we try to use the `batch_predict` function by following the docs, we receive an error saying `the parameter should be a pandas.Dataframe`. This happens because the first parameter of `batch_predict` is a pandas.DataFrame. To solve this problem we need to use Python's named parameters.
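
A hedged illustration of the fix being described; the client construction and parameter names below are assumptions drawn from typical AutoML Tables samples, not from this commit:

```python
from google.cloud import automl_v1beta1 as automl

# Hypothetical project/region/model values for illustration only.
client = automl.TablesClient(project='my-project', region='us-central1')

# Passing the GCS paths positionally would bind them to the first
# parameter (a pandas.DataFrame) and fail, so use keyword arguments.
response = client.batch_predict(
    model_display_name='my_model',
    gcs_input_uris='gs://my-bucket/input.csv',
    gcs_output_uri_prefix='gs://my-bucket/output/',
)
response.result()  # batch prediction runs asynchronously
```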

* Fix typo in GCS URI parameter [(#2459)](GoogleCloudPlatform/python-docs-samples#2459)

* fix: fix tables notebook links and bugs [(#2601)](GoogleCloudPlatform/python-docs-samples#2601)

* feat(tables): update samples to show explainability [(#2523)](GoogleCloudPlatform/python-docs-samples#2523)

* show xai

* local feature importance

* use updated client

* use fixed library

* use new model

* Auto-update dependencies. [(#2005)](GoogleCloudPlatform/python-docs-samples#2005)

* Auto-update dependencies.

* Revert update of appengine/flexible/datastore.

* revert update of appengine/flexible/scipy

* revert update of bigquery/bqml

* revert update of bigquery/cloud-client

* revert update of bigquery/datalab-migration

* revert update of bigtable/quickstart

* revert update of compute/api

* revert update of container_registry/container_analysis

* revert update of dataflow/run_template

* revert update of datastore/cloud-ndb

* revert update of dialogflow/cloud-client

* revert update of dlp

* revert update of functions/imagemagick

* revert update of functions/ocr/app

* revert update of healthcare/api-client/fhir

* revert update of iam/api-client

* revert update of iot/api-client/gcs_file_to_device

* revert update of iot/api-client/mqtt_example

* revert update of language/automl

* revert update of run/image-processing

* revert update of vision/automl

* revert update testing/requirements.txt

* revert update of vision/cloud-client/detect

* revert update of vision/cloud-client/product_search

* revert update of jobs/v2/api_client

* revert update of jobs/v3/api_client

* revert update of opencensus

* revert update of translate/cloud-client

* revert update to speech/cloud-client

Co-authored-by: Kurtis Van Gent <[email protected]>
Co-authored-by: Doug Mahugh <[email protected]>

* Update dependency google-cloud-automl to v0.10.0 [(#3033)](GoogleCloudPlatform/python-docs-samples#3033)

Co-authored-by: Bu Sun Kim <[email protected]>
Co-authored-by: Leah E. Cole <[email protected]>

* Simplify noxfile setup. [(#2806)](GoogleCloudPlatform/python-docs-samples#2806)

* chore(deps): update dependency requests to v2.23.0

* Simplify noxfile and add version control.

* Configure appengine/standard to only test Python 2.7.

* Update Kokoro configs to match noxfile.

* Add requirements-test to each folder.

* Remove Py2 versions from everything except appengine/standard.

* Remove conftest.py.

* Remove appengine/standard/conftest.py

* Remove 'no-sucess-flaky-report' from pytest.ini.

* Add GAE SDK back to appengine/standard tests.

* Fix typo.

* Roll pytest to python 2 version.

* Add a bunch of testing requirements.

* Remove typo.

* Add appengine lib directory back in.

* Add some additional requirements.

* Fix issue with flake8 args.

* Even more requirements.

* Readd appengine conftest.py.

* Add a few more requirements.

* Even more Appengine requirements.

* Add webtest for appengine/standard/mailgun.

* Add some additional requirements.

* Add workaround for issue with mailjet-rest.

* Add responses for appengine/standard/mailjet.

Co-authored-by: Renovate Bot <[email protected]>

* chore: some lint fixes [(#3750)](GoogleCloudPlatform/python-docs-samples#3750)

* automl: tables code sample clean-up [(#3571)](GoogleCloudPlatform/python-docs-samples#3571)

* delete unused tables_dataset samples

* delete args code associated with unused automl_tables samples

* delete tests associated with unused automl_tables samples

* restore get_dataset method/args without region tagging

* Restore update_dataset methods without region tagging

Co-authored-by: Takashi Matsuo <[email protected]>
Co-authored-by: Leah E. Cole <[email protected]>

* add example of creating AutoML Tables client with non-default endpoint ('new' sdk) [(#3929)](GoogleCloudPlatform/python-docs-samples#3929)

* add example of creating client with non-default endpoint

* more test file cleanup

* move connectivity print stmt out of test fn

Co-authored-by: Leah E. Cole <[email protected]>
Co-authored-by: Torry Yang <[email protected]>

* Replace GCLOUD_PROJECT with GOOGLE_CLOUD_PROJECT. [(#4022)](GoogleCloudPlatform/python-docs-samples#4022)

* chore(deps): update dependency google-cloud-automl to v1 [(#4127)](GoogleCloudPlatform/python-docs-samples#4127)

This PR contains the following updates:

| Package | Update | Change |
|---|---|---|
| [google-cloud-automl](https://togithub.com/googleapis/python-automl) | major | `==0.10.0` -> `==1.0.1` |

---

### Release Notes

<details>
<summary>googleapis/python-automl</summary>

### [`v1.0.1`](https://togithub.com/googleapis/python-automl/blob/master/CHANGELOG.md#&#8203;101-httpswwwgithubcomgoogleapispython-automlcomparev100v101-2020-06-18)

[Compare Source](https://togithub.com/googleapis/python-automl/compare/v0.10.0...v1.0.1)

</details>

---

### Renovate configuration

:date: **Schedule**: At any time (no schedule defined).

:vertical_traffic_light: **Automerge**: Disabled by config. Please merge this manually once you are satisfied.

:recycle: **Rebasing**: Never, or you tick the rebase/retry checkbox.

:no_bell: **Ignore**: Close this PR and you won't be reminded about this update again.

---

 - [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check this box

---

This PR has been generated by [WhiteSource Renovate](https://renovate.whitesourcesoftware.com). View repository job log [here](https://app.renovatebot.com/dashboard#GoogleCloudPlatform/python-docs-samples).

* [tables/automl] fix: update the csv file and the dataset name [(#4188)](GoogleCloudPlatform/python-docs-samples#4188)

fixes #4177
fixes #4178

* samples: Automl table batch test [(#4267)](GoogleCloudPlatform/python-docs-samples#4267)

* added rtest req.txt

* samples: added automl batch predict test

* added missing package

* Update tables/automl/batch_predict_test.py

Co-authored-by: Bu Sun Kim <[email protected]>

Co-authored-by: Bu Sun Kim <[email protected]>

* samples: fixed wrong format on GCS input Uri [(#4270)](GoogleCloudPlatform/python-docs-samples#4270)

## Description

The current predict sample indicates that it can take multiple GCS URI inputs, but it should be singular.

## Checklist
- [X] Please **merge** this PR for me once it is approved.

* chore(deps): update dependency pytest to v5.4.3 [(#4279)](GoogleCloudPlatform/python-docs-samples#4279)

* chore(deps): update dependency pytest to v5.4.3

* specify pytest for python 2 in appengine

Co-authored-by: Leah Cole <[email protected]>

* Update automl_tables_predict.py with batch_predict_bq sample [(#4142)](GoogleCloudPlatform/python-docs-samples#4142)

Added a new method, `batch_predict_bq`, demonstrating batch prediction using BigQuery.
Added notes in comments about the asynchronicity of the `batch_predict` method.

The region `automl_tables_batch_predict_bq` will be used on cloud.google.com (currently both sections for GCS and BigQuery use the same sample code which is incorrect).
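
A rough sketch of what such a BigQuery-based batch prediction call might look like; the parameter names and URI formats here are assumptions for illustration, not taken from the sample itself:

```python
from google.cloud import automl_v1beta1 as automl

# Hypothetical project/region/model/table values for illustration only.
client = automl.TablesClient(project='my-project', region='us-central1')

# Batch prediction is asynchronous: the call returns an operation that
# must be waited on before results appear in the BigQuery output dataset.
response = client.batch_predict(
    model_display_name='my_model',
    bigquery_input_uri='bq://my-project.my_dataset.my_table',
    bigquery_output_uri='bq://my-project',
)
response.result()  # block until the batch prediction job completes
```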

Fixes #4141

Note: It's a good idea to open an issue first for discussion.

- [x] Please **merge** this PR for me once it is approved.

* Update dependency pytest to v6 [(#4390)](GoogleCloudPlatform/python-docs-samples#4390)

* chore: exclude notebooks

* chore: update templates

* chore: add codeowners and fix tests

* chore: ignore warnings from sphinx

* chore: fix tables client

* test: fix unit tests

Co-authored-by: Torry Yang <[email protected]>
Co-authored-by: florencep <[email protected]>
Co-authored-by: Mike Burton <[email protected]>
Co-authored-by: Lars Wander <[email protected]>
Co-authored-by: Michael Hu <[email protected]>
Co-authored-by: Michael Hu <[email protected]>
Co-authored-by: Alefh Sousa <[email protected]>
Co-authored-by: DPEBot <[email protected]>
Co-authored-by: Kurtis Van Gent <[email protected]>
Co-authored-by: Doug Mahugh <[email protected]>
Co-authored-by: WhiteSource Renovate <[email protected]>
Co-authored-by: Leah E. Cole <[email protected]>
Co-authored-by: Takashi Matsuo <[email protected]>
Co-authored-by: Anthony <[email protected]>
Co-authored-by: Amy <[email protected]>
Co-authored-by: Mike <[email protected]>
Co-authored-by: Leah Cole <[email protected]>
Co-authored-by: Sergei Dorogin <[email protected]>
parthea pushed a commit that referenced this issue Oct 21, 2023
🤖 I have created a release \*beep\* \*boop\* 
---
## [2.0.0](https://www.github.com/googleapis/python-automl/compare/v1.0.1...v2.0.0) (2020-09-16)


### ⚠ BREAKING CHANGES

* move to microgen (#61)

### Features

* move to microgen ([#61](https://www.github.com/googleapis/python-automl/issues/61)) ([009085e](https://www.github.com/googleapis/python-automl/commit/009085e0a82d1d7729349746c2c8954d5d60e0a9))


### Bug Fixes

* **translate:** fix a broken test [([#4360](https://www.github.com/googleapis/python-automl/issues/4360))](https://github.com/GoogleCloudPlatform/python-docs-samples/issues/4360) ([5f7d141](https://www.github.com/googleapis/python-automl/commit/5f7d141afe732acf7458a9ac98618e93baa93d38)), closes [#4353](https://www.github.com/googleapis/python-automl/issues/4353)
* `update_column_spec` typo in TablesClient docstring ([#18](https://www.github.com/googleapis/python-automl/issues/18)) ([9feb4cc](https://www.github.com/googleapis/python-automl/commit/9feb4cc5e04a01a4199da43400457cca6c0bfa05)), closes [#17](https://www.github.com/googleapis/python-automl/issues/17)
* update retry configs ([#44](https://www.github.com/googleapis/python-automl/issues/44)) ([7df9059](https://www.github.com/googleapis/python-automl/commit/7df905910b86721a6ee3a3b6c916a4f8e27d0aa7))


### Documentation

* add cancel operation sample ([abc5070](https://www.github.com/googleapis/python-automl/commit/abc507005d5255ed5adf2c4b8e0b23042a0bdf47))
* add samples from tables/automl ([#54](https://www.github.com/googleapis/python-automl/issues/54)) ([d225a5f](https://www.github.com/googleapis/python-automl/commit/d225a5f97c2823218b91a79e77d3383132875231)), closes [#2090](https://www.github.com/googleapis/python-automl/issues/2090) [#2100](https://www.github.com/googleapis/python-automl/issues/2100) [#2102](https://www.github.com/googleapis/python-automl/issues/2102) [#2103](https://www.github.com/googleapis/python-automl/issues/2103) [#2101](https://www.github.com/googleapis/python-automl/issues/2101) [#2110](https://www.github.com/googleapis/python-automl/issues/2110) [#2115](https://www.github.com/googleapis/python-automl/issues/2115) [#2150](https://www.github.com/googleapis/python-automl/issues/2150) [#2145](https://www.github.com/googleapis/python-automl/issues/2145) [#2203](https://www.github.com/googleapis/python-automl/issues/2203) [#2340](https://www.github.com/googleapis/python-automl/issues/2340) [#2337](https://www.github.com/googleapis/python-automl/issues/2337) [#2336](https://www.github.com/googleapis/python-automl/issues/2336) [#2339](https://www.github.com/googleapis/python-automl/issues/2339) [#2338](https://www.github.com/googleapis/python-automl/issues/2338) [#2276](https://www.github.com/googleapis/python-automl/issues/2276) [#2257](https://www.github.com/googleapis/python-automl/issues/2257) [#2424](https://www.github.com/googleapis/python-automl/issues/2424) [#2407](https://www.github.com/googleapis/python-automl/issues/2407) [#2501](https://www.github.com/googleapis/python-automl/issues/2501) [#2459](https://www.github.com/googleapis/python-automl/issues/2459) [#2601](https://www.github.com/googleapis/python-automl/issues/2601) [#2523](https://www.github.com/googleapis/python-automl/issues/2523) [#2005](https://www.github.com/googleapis/python-automl/issues/2005) [#3033](https://www.github.com/googleapis/python-automl/issues/3033) [#2806](https://www.github.com/googleapis/python-automl/issues/2806) [#3750](https://www.github.com/googleapis/python-automl/issues/3750) [#3571](https://www.github.com/googleapis/python-automl/issues/3571) [#3929](https://www.github.com/googleapis/python-automl/issues/3929) [#4022](https://www.github.com/googleapis/python-automl/issues/4022) [#4127](https://www.github.com/googleapis/python-automl/issues/4127)
---


This PR was generated with [Release Please](https://github.com/googleapis/release-please).