This is a feature request to prevent `dt_rows_all` from being set to a vector with one integer for each row in the entire table; instead, it should be trimmed to match `dt_rows_current`.
An unnecessarily large JSON object is sent over the network for very large, paged, server-side datatables: `DT_rows_all` is set to `seq_len(n)` (or some reordering of it), which is a very large vector, even though data is only sent for the rows in `DT_rows_current`. For my application, which uses a 70K-row table, `DT_rows_all` accounts for the majority of the data sent over the network.
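To give a rough sense of the overhead, here is a back-of-the-envelope sketch in base R. The comma-separated integer list approximates what the JSON encoder would emit, and the 10-row page size is a hypothetical example:

```r
# Approximate sizes of the JSON-encoded row index vectors for a 70K-row
# table displaying a single 10-row page (page size is an assumption).
n <- 70000L
page <- 10L

json_all     <- paste0("[", paste(seq_len(n),    collapse = ","), "]")  # DT_rows_all
json_current <- paste0("[", paste(seq_len(page), collapse = ","), "]")  # DT_rows_current

nchar(json_all)      # roughly 400 KB, re-sent on every draw
nchar(json_current)  # a couple dozen bytes
```

So the index vector alone dwarfs the data for the visible page.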
I would write a reproducible example, but the authors are clearly aware of this issue; it is marked as a TODO in the code itself (DT/R/shiny.R, lines 493 to 502 at commit 1163778):
```r
# TODO: if iAll is just 1:n, is it necessary to pass this vector to JSON, then
# to R? When n is large, it may not be very efficient
list(
  draw = as.integer(q$draw),
  recordsTotal = n,
  recordsFiltered = nrow(data),
  data = cleanDataFrame(fdata),
  DT_rows_all = iAll,
  DT_rows_current = iCurrent
)
```
Ideally, I think `DT_rows_all` would be set to `iCurrent`. That makes it redundant with `dt_rows_current`, but it has less chance of breaking existing user code than removing it outright. It would also have to change here: DT/R/shiny.R, line 399 at commit a314d79.
Alternatively, the `filterFun` function used in `sessionDataURL` could be modified to trim `DT_rows_all` before encoding it as JSON.
Without changing the package at all, I could write a custom filter for large data tables that trims `dt_rows_all`. However, I'm not quite sure how to write a custom filter, and I couldn't find any examples. The lack of examples was also mentioned in issue #194, so perhaps this would be a good example to provide? Any quick tips would be very helpful as well.
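For what it's worth, the trimming step itself looks trivial; the open question is where to hook it in. Here is a sketch operating on a plain list shaped like the snippet above (the wrapper name and the idea of copying `DT_rows_current` over `DT_rows_all` are my assumptions, not DT's actual filter API):

```r
# Hypothetical post-processing step: replace DT_rows_all with the
# current page's indices before the list is encoded as JSON.
trim_rows_all <- function(res) {
  res$DT_rows_all <- res$DT_rows_current
  res
}

# A response list shaped like the one in shiny.R (data payload omitted):
res <- list(
  draw = 1L,
  recordsTotal = 70000L,
  recordsFiltered = 70000L,
  data = list(),
  DT_rows_all = seq_len(70000L),
  DT_rows_current = 1:10
)
res <- trim_rows_all(res)
length(res$DT_rows_all)  # 10 instead of 70000
```

If the default filter were wrapped this way, user code reading the "all rows" input would of course see only the current page, which is why it would break some existing usage.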
I would be happy to provide a PR, but I wanted to get a sense of which strategy you think would work best: making `DT_rows_all` the same as `DT_rows_current`, or trimming it in `filterFun`.