
[BUG] Setting precision in influx_write returns an error #25

Closed
ckatsulis opened this issue Aug 15, 2017 · 8 comments

@ckatsulis

If the xts index has sub-second precision, that granularity is not carried through to the database by influx_write() by default: timestamps get clipped at the second level, so points that share a truncated timestamp (and the same tag values) overwrite one another in InfluxDB.

I then set precision = "ms" in the influx_write() call and received the error below. My guess is that timestamps in the line protocol must be integers, so the correct precision must be supplied together with the xts index converted to the matching unit (epoch seconds, ms, us, or ns).

> lapply(xtsWrite, 
+        function (x){
+          print(dim(x))
+          y = influx_write(con, 
+                           "stats",
+                           xts = x,
+                           measurement = "orderDataTest",
+                           precision = "ms"
+          )
+          print(y)
+        }
+ )
 Error: {"error":"unable to parse 'orderDataTest,symbol=AUD/USD OrdId=11,OrdPx=0.78326,OrdQty=3000,side=1 NA': bad timestamp\nunable to parse 'orderDataTest,symbol=AUD/USD OrdId=17,OrdPx=0.78319,OrdQty=5000,side=1 NA': bad timestamp\nunable to parse 'orderDataTest,symbol=AUD/USD OrdId=24,OrdPx=0.78333,OrdQty=3000,side=-1 NA': bad timestamp\nunable to parse 'orderDataTest,symbol=AUD/USD OrdId=36,OrdPx=0.78305,OrdQty=2000,side=-1 NA': bad timestamp\nunable to parse 'orderDataTest,symbol=AUD/USD OrdId=42,OrdPx=0.78278,OrdQty=2445,side=-1 NA': bad timestamp\nunable to parse 'orderDataTest,symbol=AUD/USD OrdId=49,OrdPx=0.78272,OrdQty=2445,side=-1 NA': bad timestamp\nunable to parse 'orderDataTest,symbol=AUD/USD OrdId=55,OrdPx=0.78261,OrdQty=5000,side=-1 NA': bad timestamp\nunable to parse 'orderDataTest,symbol=AUD/USD OrdId=60,OrdPx=0.78258,OrdQty=2555,side=-1 NA': bad timestamp\nunable to parse 'orderDataTest,symbol=AUD/USD OrdId=67,OrdPx=0.78202,OrdQty=8000,side=-1 NA': bad timestamp\nuna In addition: Warning message:
In format(as.integer(as.numeric(zoo::index(xts)) * div), scientific = FALSE) :
  NAs introduced by coercion to integer range
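The warning in the traceback above points at a likely cause: epoch timestamps scaled to milliseconds exceed R's 32-bit integer range (`.Machine$integer.max`, 2^31 − 1), so `as.integer()` yields `NA`, which then lands in the line protocol as a bad timestamp. A minimal base-R sketch (the `as.integer` coercion is quoted from the warning; `sprintf` is one hypothetical workaround, not necessarily what the package does):

```r
# Epoch seconds for 2017-07-17 09:30:00 GMT fit in a 32-bit integer...
secs <- as.numeric(as.POSIXct("2017-07-17 09:30:00", tz = "GMT"))
as.integer(secs)               # ok: ~1.5e9 < .Machine$integer.max (2^31 - 1)

# ...but epoch milliseconds do not, so the coercion yields NA with
# "NAs introduced by coercion to integer range":
as.integer(secs * 1000)        # NA: ~1.5e12 overflows the 32-bit range

# sprintf("%.0f", ...) formats the scaled double without integer coercion:
sprintf("%.0f", secs * 1000)   # "1500283800000"
```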

Do you know why this is happening?

@dleutnant (Owner) commented Aug 15, 2017

Sorry, I can't reproduce this without data. What does zoo::index(x) look like? Does your year exceed 2037?

@ckatsulis (Author) commented Aug 15, 2017 via email

@ckatsulis (Author) commented Aug 19, 2017

The sample data is an xts object with a sub-second index. When I write it to InfluxDB with influx_write() without setting precision, the write succeeds, but the timestamps are truncated to the second; because multiple rows share the same truncated timestamp and tag values, those rows overwrite one another. When I set precision to 'ms' or 'u', I get the following error:

 Error: {"error":"unable to parse 'orderDataTest,symbol=AUD/USD,side=Buy,action=ADD ordId=11,ordRootId=11,px=0.78326,size=3000,desiredRootPx=0.783265 NA': bad timestamp\nunable to parse 'orderDataTest,symbol=AUD/USD,side=Buy,action=ADD ordId=17,ordRootId=17,px=0.78319,size=5000,desiredRootPx=0.783195 NA': bad timestamp\nunable to parse 'orderDataTest,symbol=AUD/USD,side=Buy,action=ADD ordId=72,ordRootId=72,px=0.78173,size=3000,desiredRootPx=0.781735 NA': bad timestamp\nunable to parse 'orderDataTest,symbol=AUD/USD,side=Buy,action=ADD ordId=81,ordRootId=81,px=0.78178,size=8000,desiredRootPx=0.781785 NA': bad timestamp\nunable to parse 'orderDataTest,symbol=AUD/USD,side=Buy,action=ADD ordId=88,ordRootId=81,px=0.78183,size=8000,desiredRootPx=0.781785 NA': bad timestamp\nunable to parse 'orderDataTest,symbol=AUD/USD,side=Buy,action=ADD ordId=91,ordRootId=91,px=0.78174,size=3000,desiredRootPx=0.781745 NA': bad timestamp\nunable to parse 'orderDataTest,symbol=AUD/USD,side=Buy,action=ADD ordI In addition: Warning message:
In format(as.integer(as.numeric(zoo::index(xts)) * div), scientific = FALSE) :
  NAs introduced by coercion to integer range
lapply(xtsWrite, 
         function (x){
           influxdbr::influx_write(con, 
                                   "stats",
                                   xts = x,
                                   measurement = orderMeasurement,
                                   precision = "u"
           )
         }
  )

@dleutnant (Owner)

Could you please post the code that generates the xts object you are trying to write? Only the index is of interest; coredata can contain dummy values, e.g.
xts::xts(x = runif(100), order.by = seq(Sys.time(), by = "1 min", length.out = 100), tzone = "GMT")
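For the sub-second case reported here, a variant of that snippet with a fractional-second index (assuming the xts package is installed; the values are dummies, the 250 ms spacing is illustrative) might look like:

```r
library(xts)

# Index spaced 250 ms apart -- sub-second, like the reported data
idx <- as.POSIXct("2017-07-17 09:30:00", tz = "GMT") +
  seq(0, by = 0.25, length.out = 100)
x <- xts::xts(x = runif(100), order.by = idx, tzone = "GMT")

# The numeric index carries fractional seconds:
diff(as.numeric(zoo::index(x)))[1:3]  # 0.25 0.25 0.25
```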

@ckatsulis (Author) commented Aug 20, 2017

xts::xts(orderDataList[[x]][, fields], orderDataList[[x]]$timeSent)
orderDataList[[x]]$timeSent is an ordered, unique vector of class POSIXct.

My apologies — the last test data set was blank. Here is the correct file; you can read it with dget.

> head(testData)
                    ordId ordRootId      px size bidPx askPx desiredRootPx
2017-07-17 09:30:00    11        11 0.78326 3000    NA    NA      0.783265
2017-07-17 09:31:00    17        17 0.78319 5000    NA    NA      0.783195
2017-07-17 09:57:00    72        72 0.78173 3000    NA    NA      0.781735
2017-07-17 09:59:00    81        81 0.78178 8000    NA    NA      0.781785
2017-07-17 10:00:00    88        81 0.78183 8000    NA    NA      0.781785
2017-07-17 10:03:00    91        91 0.78174 3000    NA    NA      0.781745

testData (2).txt

Some other info:

> tail(diff(.index(testData)))
[1]  5220.097  1319.932 62280.164  1739.899  2520.053 19919.924

The fractional parts show that the index clearly carries sub-second precision.

@ckatsulis ckatsulis changed the title [Bug] Setting precision in influx_write returns an error [BUG] Setting precision in influx_write returns an error Aug 20, 2017
@dleutnant dleutnant added the bug label Aug 20, 2017
@dleutnant (Owner) commented Aug 20, 2017

@ckatsulis I would be happy if you could check the dev version of influxdbr. Does the error still occur?

@ckatsulis (Author) commented Aug 20, 2017 via email

@ckatsulis (Author)

Pulled the dev version and set precision to 'u':

devtools::install_github("dleutnant/influxdbr@dev")

That did the trick. Thanks for the quick turnaround.
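For anyone stuck on the CRAN version in the meantime, one overflow-free approach is to keep the scaled index as a double and format it with sprintf() instead of coercing to integer. This is a hypothetical helper sketch, not the actual dev-branch code:

```r
# Hypothetical helper: epoch timestamps at a given precision, avoiding
# the 32-bit as.integer() coercion that produced the NAs above.
# Note: at "ns" the scaled values exceed 2^53, beyond exact double
# precision, so a 64-bit integer type (e.g. the bit64 package) would
# be needed; this sketch stops at microseconds.
epoch_timestamps <- function(index, precision = c("s", "ms", "u")) {
  precision <- match.arg(precision)
  div <- c(s = 1, ms = 1e3, u = 1e6)[[precision]]
  sprintf("%.0f", as.numeric(index) * div)
}

epoch_timestamps(as.POSIXct("2017-07-17 09:30:00", tz = "GMT"), "ms")
# "1500283800000"
```

For an xts object, the index would be passed as zoo::index(x).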

dleutnant added a commit that referenced this issue Aug 21, 2017
* cran submission

* update description to indicate dev version

* Fix coercion error in `influx_write()` in case of sub-second accuracy (#25).

* `influx_select()` correctly parses integer arguments (#27)

* typo

* significant parsing speed improvement (naming series columns at later stage)

* Update NEWS.md

* prepare merge with master

* prepare merge with master