Address several issues #103

Closed · wants to merge 32 commits

Commits
6f9c73d
Add drop fix.
colearendt Apr 23, 2017
55220a1
Add direct support for NSE dplyr verbs
colearendt May 14, 2017
e190f42
Add failing test for path with vector input
colearendt May 15, 2017
96ce780
Initial commit on multiple-apis vignette
colearendt May 15, 2017
9b7a965
Add appveyor support using devtools::use_appveyor()
colearendt May 2, 2017
ecb0c1b
Add packrat
colearendt May 19, 2017
1634b33
Add bind_rows support
colearendt May 19, 2017
27e50f1
Skip print tests
colearendt May 21, 2017
9a70a20
Add revdep
colearendt May 22, 2017
6d6b2aa
Improve vignettes and docs
colearendt May 24, 2017
4464ff0
Update docs
colearendt May 29, 2017
587734f
Update packrat.lock
colearendt May 29, 2017
64e361b
Fix spread_all recursive=FALSE
colearendt May 29, 2017
3ed3d07
Handle errors in print
colearendt May 29, 2017
392dc6a
Turn spread_all name dedupe into loop
colearendt May 30, 2017
b877d83
Change naming of j* functions to purrr
colearendt May 30, 2017
43496cb
Update docs, NEWS.md, vignettes, NAMESPACE
colearendt May 30, 2017
4ffec9d
Fix json_structure
colearendt Jun 2, 2017
34624aa
Impute document.id instead of bypass it
colearendt Jun 2, 2017
aaf1f3c
Update vignette and NEWS
colearendt Jun 2, 2017
9e2a4a6
.travis.yml config
colearendt Jun 10, 2017
f7f767b
Add tests for new functionality
colearendt Jun 11, 2017
d15e7ae
Deprecate append_values_* family
colearendt Jun 11, 2017
47b458c
Update docs
colearendt Jun 11, 2017
f6f13f4
Convert tbl_df to as_tibble
colearendt Jun 12, 2017
168b4d7
Merge pull request #3 from colearendt/developclean
colearendt Jun 15, 2017
81b00a4
Update badges
colearendt Aug 19, 2017
91167bf
Append_values undo rename
colearendt Aug 19, 2017
e5ed7c7
Undo is_json and json_ renames
colearendt Aug 19, 2017
a4f9222
Add osx builds
colearendt Aug 19, 2017
5cdc59f
Merge pull request #4 from colearendt/feature/fix_naming
colearendt Aug 20, 2017
d661b42
Merge pull request #5 from colearendt/develop
colearendt Aug 26, 2017
Changes from all commits
5 changes: 5 additions & 0 deletions .Rbuildignore
@@ -5,3 +5,8 @@
^codecov\.yml$
^README\.Rmd$
^README-.*\.png$
^packrat/
^\.Rprofile$
^working/
^appveyor\.yml$
^revdep/
4 changes: 4 additions & 0 deletions .gitattributes
@@ -0,0 +1,4 @@
* text=auto
data/* binary
src/* text=lf
R/* text=lf
4 changes: 4 additions & 0 deletions .gitignore
@@ -6,3 +6,7 @@ inst/doc
.Rproj.user
*.Rproj
.DS_Store
packrat/lib*/
packrat/src/
working/
.Rprofile
21 changes: 21 additions & 0 deletions .travis.yml
@@ -4,8 +4,29 @@ language: R
sudo: false
cache: packages

os:
- linux
- osx

r:
- oldrel
- release
- devel

matrix:
allow_failures:
- r: devel

r_packages:
- covr

script:
- |
travis_wait 60 R CMD build --no-build-vignettes --no-manual --no-resave-data .
travis_wait 60 R CMD check --no-build-vignettes --no-manual tidyjson*tar.gz

after_success:
- Rscript -e 'library(covr); codecov()'

after_script:
- ./travis-tool.sh dump_logs
7 changes: 4 additions & 3 deletions DESCRIPTION
@@ -1,6 +1,6 @@
Package: tidyjson
Title: Tidy Complex JSON
Version: 0.2.1.9000
Version: 0.2.1.9001
Author: Jeremy Stanley <[email protected]>
Maintainer: Jeremy Stanley <[email protected]>
Description: Turn complex JSON data into tidy data frames.
@@ -28,8 +28,9 @@ Suggests:
listviewer,
igraph,
RColorBrewer,
covr
covr,
lubridate
VignetteBuilder: knitr
URL: https://github.com/jeremystan/tidyjson
BugReports: https://github.com/jeremystan/tidyjson/issues
RoxygenNote: 5.0.1
RoxygenNote: 6.0.1
7 changes: 7 additions & 0 deletions NAMESPACE
@@ -1,20 +1,27 @@
# Generated by roxygen2: do not edit by hand

S3method("[",tbl_json)
S3method(arrange,tbl_json)
S3method(arrange_,tbl_json)
S3method(as.character,tbl_json)
S3method(as.tbl_json,character)
S3method(as.tbl_json,data.frame)
S3method(as.tbl_json,tbl_json)
S3method(filter,tbl_json)
S3method(filter_,tbl_json)
S3method(mutate,tbl_json)
S3method(mutate_,tbl_json)
S3method(print,tbl_json)
S3method(slice,tbl_json)
S3method(slice_,tbl_json)
export("%>%")
export(append_values_logical)
export(append_values_number)
export(append_values_string)
export(as.tbl_json)
export(as_data_frame)
export(as_tibble)
export(bind_rows)
export(enter_object)
export(gather_array)
export(gather_keys)
29 changes: 28 additions & 1 deletion NEWS.md
@@ -1,4 +1,31 @@
# purrr 0.2.1.9000
# tidyjson 0.2.1.9001

## New functions

* Add `bind_rows()` support. Though not currently an S3 implementation, it behaves as much like the `dplyr` variant as possible, preserving the `attr(., 'JSON')` components when all inputs are `tbl_json` objects. (#58)

## Documentation Changes

* "Using Multiple APIs" vignette added to show support for using tidyjson with multiple APIs (#85)

* Updated README.md to better explain `spread_all()` (#92)

## Bug fixes and minor changes

* `drop = TRUE` caused an error. Behavior is now consistent with `tbl_df` (throw a warning and do nothing)

* Fix `spread_all(recursive=FALSE)` bug that caused an error (#65)

* Alter `spread_all()` to de-duplicate column names in a loop, so repeated name collisions no longer cause an error (#76)

* Add direct support for the NSE versions of dplyr verbs (`filter()`, `mutate()`, `slice()`, etc.), since the SE variants are no longer called behind the scenes as of `dplyr` 0.6.0. (#97)

* Fix errors with `print.tbl_json()` when the JSON attribute is missing

* Fix `json_structure()` failure when `document.id` is missing by imputing the missing `document.id`. (#86)

# tidyjson 0.2.1.9000

## New functions

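A minimal sketch of how the two headline changes above (direct NSE verb support and `bind_rows()`) might be used; the sample JSON and column values are made up for illustration:

```r
library(dplyr)
library(tidyjson)

# Hypothetical input documents
people <- c('{"name": "bob", "age": 32}',
            '{"name": "susan", "age": 54}')

# NSE dplyr verbs such as filter() now dispatch on tbl_json objects directly (#97)
older <- people %>% spread_all() %>% filter(age > 40)

# bind_rows() preserves the "JSON" attribute when every input is a tbl_json (#58)
combined <- bind_rows(people[1] %>% spread_all(),
                      people[2] %>% spread_all())
```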
8 changes: 4 additions & 4 deletions R/append_values.R
@@ -52,12 +52,12 @@ append_values_factory <- function(type, as.value) {

if (!is.tbl_json(.x)) .x <- as.tbl_json(.x)

if (force == FALSE) assert_that(recursive == FALSE)
if (force == FALSE) assertthat::assert_that(recursive == FALSE)

# Extract json
json <- attr(.x, "JSON")

assert_that(length(json) == nrow(.x))
assertthat::assert_that(length(json) == nrow(.x))

# if json is empty, return empty
if (length(json) == 0) {
@@ -78,7 +78,7 @@ append_values_factory <- function(type, as.value) {
new_val[loc] <- NA
}
new_val <- new_val %>% as.value
assert_that(length(new_val) == nrow(.x))
assertthat::assert_that(length(new_val) == nrow(.x))
.x[column.name] <- new_val
}

@@ -92,7 +92,7 @@ append_values_factory <- function(type, as.value) {
#' @param l a list that we want to unlist
#' @param recursive logical indicating whether to unlist nested lists
my_unlist <- function(l, recursive = FALSE) {
nulls <- map_int(l, length) != 1
nulls <- purrr::map_int(l, length) != 1
l[nulls] <- NA
unlist(l, recursive = recursive)
}
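For context, a small usage sketch of the `append_values_*` family touched above; the input JSON is invented:

```r
library(tidyjson)

'["a", "b", "c"]' %>%
  gather_array() %>%        # one row per array element
  append_values_string()    # adds a "string" column holding each value
```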
4 changes: 2 additions & 2 deletions R/enter_object.R
@@ -16,7 +16,7 @@
#'
#' @seealso \code{\link{gather_object}} to find sub-objects that could be
#' entered into, \code{\link{gather_array}} to gather an array in an object
#' and \code{\link{spread_all}} to spread values in an object.
#' and \code{\link{spread_all}} or \code{\link{spread_values}} to spread values in an object.
#' @param .x a json string or tbl_json object
#' @param ... a quoted or unquoted sequence of strings designating the object
#' name or sequences of names you wish to enter
@@ -71,7 +71,7 @@ enter_object <- function(.x, ...) {
json <- attr(.x, "JSON")

# Access path
json <- map(json, path %>% as.list)
json <- purrr::map(json, path %>% as.list)

tbl_json(.x, json, drop.null.json = TRUE)

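A brief sketch of `enter_object()` with the quoted-or-unquoted name handling described in the roxygen block above; the document is hypothetical:

```r
library(tidyjson)

'{"name": {"first": "bob", "last": "jones"}, "age": 32}' %>%
  enter_object(name) %>%   # bare and quoted names are both accepted
  spread_all()             # spread the sub-object's values into columns
```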
18 changes: 9 additions & 9 deletions R/gather.R
@@ -12,8 +12,8 @@ gather_factory <- function(default.column.name, default.column.empty,

function(.x, column.name = default.column.name) {

assert_that(!("..name" %in% names(.x)))
assert_that(!("..json" %in% names(.x)))
assertthat::assert_that(!("..name" %in% names(.x)))
assertthat::assert_that(!("..json" %in% names(.x)))

if (!is.tbl_json(.x)) .x <- as.tbl_json(.x)

@@ -36,13 +36,13 @@ gather_factory <- function(default.column.name, default.column.empty,
stop(sprintf("%s records are not %ss", sum(bad_type), required.type))

y <- .x %>%
tbl_df %>%
mutate(
..name = json %>% map(expand.fun),
dplyr::as_tibble() %>%
dplyr::mutate(
..name = json %>% purrr::map(expand.fun),
..json = json %>%
map(~data_frame(..json = as.list(.)))
purrr::map(~dplyr::data_frame(..json = as.list(.)))
) %>%
unnest(..name, ..json, .drop = FALSE)
tidyr::unnest(..name, ..json, .drop = FALSE)

# Check to see if column.name exists, otherwise, increment until not
if (column.name %in% names(y)) {
@@ -58,10 +58,10 @@ gather_factory <- function(default.column.name, default.column.empty,
}

# Rename
y <- y %>% rename_(.dots = setNames("..name", column.name))
y <- y %>% dplyr::rename_(.dots = setNames("..name", column.name))

# Construct tbl_json
tbl_json(y %>% select(-..json), y$..json)
tbl_json(y %>% dplyr::select(-..json), y$..json)

}

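The column-name handling above (increment until the name is free) can be seen by gathering nested objects twice; a hedged sketch with an invented document:

```r
library(tidyjson)

'{"a": {"x": 1}, "b": {"y": 2}}' %>%
  gather_object() %>%   # adds a "name" column ("a", "b")
  gather_object()       # a second gather adds "name.2" rather than overwriting "name"
```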
6 changes: 3 additions & 3 deletions R/is_json.R
@@ -46,15 +46,15 @@ NULL

#' @rdname is_json
#' @export
is_json_string <- is_json_factory("string")
is_json_string <- is_json_factory('string')

#' @rdname is_json
#' @export
is_json_number <- is_json_factory("number")
is_json_number <- is_json_factory('number')

#' @rdname is_json
#' @export
is_json_logical <- is_json_factory("logical")
is_json_logical <- is_json_factory('logical')

#' @rdname is_json
#' @export
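A quick sketch of the `is_json_*` predicates generated above, which return one logical per row of the `tbl_json`; the input JSON is invented:

```r
library(tidyjson)

mixed <- '["a", 1, true]' %>% gather_array()

is_json_string(mixed)    # TRUE FALSE FALSE
is_json_number(mixed)    # FALSE TRUE FALSE
is_json_logical(mixed)   # FALSE FALSE TRUE
```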
2 changes: 1 addition & 1 deletion R/json_complexity.R
@@ -35,7 +35,7 @@ json_complexity <- function(.x, column.name = "complexity") {
json <- attr(.x, "JSON")

# Determine lengths
lengths <- json %>% map(unlist, recursive = TRUE) %>% map_int(length)
lengths <- json %>% purrr::map(unlist, recursive = TRUE) %>% purrr::map_int(length)

# Add as a column to x
.x[column.name] <- lengths
2 changes: 1 addition & 1 deletion R/json_lengths.R
@@ -36,7 +36,7 @@ json_lengths <- function(.x, column.name = "length") {
json <- attr(.x, "JSON")

# Determine lengths
lengths <- map_int(json, length)
lengths <- purrr::map_int(json, length)

# Add as a column to x
.x[column.name] <- lengths
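To illustrate the difference between the two helpers changed above: `json_lengths()` counts top-level elements, while `json_complexity()` counts leaves after a recursive unlist. The document is invented:

```r
library(tidyjson)

doc <- '{"a": [1, 2, 3], "b": {"x": true}}'

doc %>% json_lengths()      # adds a "length" column: 2 (two top-level names)
doc %>% json_complexity()   # adds a "complexity" column: 4 (four leaf values)
```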
44 changes: 22 additions & 22 deletions R/json_schema.R
@@ -74,7 +74,7 @@ json_schema <- function(.x, type = c("string", "value")) {

if (any(is_array)) {

array_schema <- json[is_array] %>% map(json_schema_array, type)
array_schema <- json[is_array] %>% purrr::map(json_schema_array, type)

array_schema <- array_schema %>%
unlist(recursive = FALSE) %>%
@@ -88,11 +88,11 @@

if (any(is_object)) {

object_schema <- json[is_object] %>% map(json_schema_object, type)
object_schema <- json[is_object] %>% purrr::map(json_schema_object, type)

object_schema <- object_schema %>%
bind_rows %>%
tbl_df %>%
dplyr::as_tibble() %>%
unique

object_schema <- collapse_object(object_schema)
@@ -124,7 +124,7 @@ json_schema <- function(.x, type = c("string", "value")) {

list_to_tbl_json <- function(l) {

tbl_json(data_frame(document.id = 1L), list(l))
tbl_json(dplyr::data_frame(document.id = 1L), list(l))

}

@@ -143,15 +143,15 @@ json_schema_array <- function(json, type) {

collapse_array <- function(schema) {

data_frame(schemas = schema) %>%
mutate(json = schemas) %>%
dplyr::data_frame(schemas = schema) %>%
dplyr::mutate(json = schemas) %>%
as.tbl_json(json.column = "json") %>%
json_types %>%
json_complexity %>%
tbl_df %>%
arrange(desc(complexity), type) %>%
slice(1) %>%
extract2("schemas") %>%
dplyr::as_tibble() %>%
dplyr::arrange(desc(complexity), type) %>%
dplyr::slice(1) %>%
magrittr::extract2("schemas") %>%
paste(collapse = ", ") %>%
sprintf("[%s]", .)

@@ -161,10 +161,10 @@ json_schema_array <- function(json, type) {

x <- json %>% list_to_tbl_json %>% gather_object

x$schemas <- attr(x, "JSON") %>% map(list_to_tbl_json) %>%
map_chr(json_schema, type)
x$schemas <- attr(x, "JSON") %>% purrr::map(list_to_tbl_json) %>%
purrr::map_chr(json_schema, type)

schemas <- x %>% select(name, schemas) %>% unique
schemas <- x %>% dplyr::select(name, schemas) %>% unique

schemas

@@ -173,18 +173,18 @@ json_schema_object <- function(json, type) {
collapse_object <- function(schema) {

schema %>%
mutate(json = schemas) %>%
dplyr::mutate(json = schemas) %>%
as.tbl_json(json.column = "json") %>%
json_types %>%
json_complexity %>%
tbl_df %>%
group_by(name) %>%
arrange(desc(complexity), type) %>%
slice(1) %>%
ungroup %>%
mutate(name = name %>% sprintf('"%s"', .)) %>%
mutate(schemas = map2(name, schemas, paste, sep = ": ")) %>%
extract2("schemas") %>%
dplyr::as_tibble() %>%
dplyr::group_by(name) %>%
dplyr::arrange(desc(complexity), type) %>%
dplyr::slice(1) %>%
dplyr::ungroup() %>%
dplyr::mutate(name = name %>% sprintf('"%s"', .)) %>%
dplyr::mutate(schemas = map2(name, schemas, paste, sep = ": ")) %>%
magrittr::extract2("schemas") %>%
paste(collapse = ", ") %>%
sprintf("{%s}", .)

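A rough sketch of `json_schema()` as modified above: it collapses a document into a single schema string, keeping the most complex representative of each object name and array element. The input document is invented, and the exact output formatting depends on the collapse rules shown in the diff:

```r
library(tidyjson)

'{"id": 1, "tags": ["a", "b"], "info": {"active": true}}' %>%
  json_schema(type = "string") %>%
  writeLines()
```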