
JSON object unnecessarily large for paged, server-side datatables; sends seq_len(DT) #504

Open
nograpes opened this issue Feb 14, 2018 · 2 comments

@nograpes

This is a feature request to prevent DT_rows_all from being set to a vector that has one integer for each row in the entire table; instead, it should be trimmed to match DT_rows_current.

An unnecessarily large JSON object is sent over the network for very large, paged, server-side datatables. DT_rows_all is set to seq_len(n) (or some reordering of it), which is a very large vector, even though data is only sent for the rows in DT_rows_current. Indeed, for my application, which uses a 70K-row table, the majority of the data sent over the network is DT_rows_all.
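
To give a rough sense of the size, here is a minimal sketch, assuming jsonlite for the serialization and a hypothetical 25-row page:

library(jsonlite)

n <- 70000                    # total rows in the table
page_rows <- 1:25             # indices of a hypothetical 25-row page

nchar(toJSON(seq_len(n)))     # ~400,000 characters for DT_rows_all
nchar(toJSON(page_rows))      # ~70 characters for DT_rows_current

So the row indices alone dwarf the actual page of data being sent.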

I would write a reproducible example, but it is clear that the authors are already aware of this issue; it is marked as a TODO in the code itself:

DT/R/shiny.R, lines 493 to 502 in 1163778:

# TODO: if iAll is just 1:n, is it necessary to pass this vector to JSON, then
# to R? When n is large, it may not be very efficient
list(
  draw = as.integer(q$draw),
  recordsTotal = n,
  recordsFiltered = nrow(data),
  data = cleanDataFrame(fdata),
  DT_rows_all = iAll,
  DT_rows_current = iCurrent
)

Ideally, I think, DT_rows_all would be set to iCurrent. That makes it redundant with DT_rows_current, but it has less chance of breaking existing user code than dropping the field outright. The value would also have to change here:

DT/R/shiny.R, line 399 in a314d79:

DT_rows_all = seq_len(n),

Alternatively, the filterFun function used in sessionDataURL could be modified to trim DT_rows_all down to DT_rows_current before encoding it as JSON.

Without changing the package at all, I could write a custom filter for large datatables that trims DT_rows_all. However, I'm not quite sure how to write a custom filter, and I couldn't find any examples; the lack of examples was mentioned in issue #194 as well. Perhaps this would be a good example to provide? Any quick tips would be very helpful too.

I would be happy to provide a PR, but I wanted to get a sense of what strategy you think would work best for this (making DT_rows_all the same as DT_rows_current, or trimming it in filterFun).
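
For concreteness, a minimal sketch of the first strategy, reusing iAll, iCurrent, and n from the snippet quoted above (all internal to dataTablesFilter; the exact placement would be up to the maintainers):

list(
  draw = as.integer(q$draw),
  recordsTotal = n,
  recordsFiltered = nrow(data),
  data = cleanDataFrame(fdata),
  # proposed: reuse the trimmed current-page indices rather than sending
  # one integer per row of the entire table
  DT_rows_all = iCurrent,
  DT_rows_current = iCurrent
)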

@nuno-agostinho

Hello, could you please look into this issue? I am getting really large JSON objects simply because the tables have many rows. Thanks!

@nuno-agostinho

As a workaround, I am now using the following code to remove the information from DT_rows_all (as I don't need this functionality for my app):

# Avoid large JSON responses from DT by patching its internal filter function
dt_mod <- getFromNamespace("dataTablesFilter", "DT")

# Find the statement in the function body that sets DT_rows_all
dt_rows_all_line <- grep("DT_rows_all = iAll", body(dt_mod))

if (length(dt_rows_all_line) == 1) {
    # Rewrite that statement so DT_rows_all carries the small iCurrent vector
    mod <- gsub("DT_rows_all = iAll", "DT_rows_all = iCurrent", fixed=TRUE,
                body(dt_mod)[dt_rows_all_line])
    body(dt_mod)[[dt_rows_all_line]] <- parse(text=mod)[[1]]
    # Install the patched function back into DT's namespace
    assignInNamespace("dataTablesFilter", dt_mod, "DT")
}
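
Note that this reaches into DT internals with assignInNamespace(), so it is tied to the exact source of dataTablesFilter; the length(dt_rows_all_line) == 1 guard simply makes the patch a no-op if that line ever changes in a future DT release.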

In context: https://github.com/nuno-agostinho/cTRAP/blob/master/R/shinyInterface_session.R#L456-L465
