Use purrr::pluck()
See #326
damianooldoni committed Jul 22, 2024
1 parent 4801760 commit 489314a
Showing 5 changed files with 75 additions and 22 deletions.
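
For context on the pattern this commit applies throughout: `purrr::pluck()` is a drop-in replacement for chained `$`/`[[` extraction that takes a mix of names and positions, and returns `NULL` (or a supplied `.default`) instead of erroring when an element is missing. A minimal, self-contained sketch of the equivalence — the `obs` data frame here is a made-up stand-in for `observations(x_dup)`, not package data:

```r
library(purrr)

# Toy stand-in for the observations table used in the diffs below
obs <- data.frame(
  sequenceID = c("seq1", "seq2", "seq3"),
  timestamp  = c("05:00:00", "05:30:00", "05:46:48")
)

obs$sequenceID[3]             # base-R chained extraction, as in the old code
pluck(obs, "sequenceID", 3)   # equivalent pluck() call, as in the new code

# Unlike `[[`, a missing name or out-of-bounds index does not error;
# pluck() returns NULL, or the value given via .default:
pluck(obs, "sequenceID", 99, .default = NA)
```

The `.default` behaviour is the usual motivation for this swap: deep extraction from nested list/data-frame structures fails softly instead of stopping with a subscript error.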
18 changes: 15 additions & 3 deletions R/camtrapR_recordTable.R
@@ -92,9 +92,21 @@
 #' # How to deal with duplicates
 #' x_dup <- x
 #' # create a duplicate at 2020-07-29 05:46:48, location: B_DL_val 5_beek kleine vijver
-#' x_dup$data$observations[4,"sequenceID"] <- observations(x_dup)$sequenceID[3]
-#' x_dup$data$observations[4, "deploymentID"] <- observations(x_dup)$deploymentID[3]
-#' x_dup$data$observations[4, "timestamp"] <- observations(x_dup)$timestamp[3]
+#' x_dup$data$observations[4,"sequenceID"] <- purrr::pluck(
+#'   observations(x_dup),
+#'   "sequenceID",
+#'   3
+#' )
+#' x_dup$data$observations[4, "deploymentID"] <- purrr::pluck(
+#'   observations(x_dup),
+#'   "deploymentID",
+#'   3
+#' )
+#' x_dup$data$observations[4, "timestamp"] <- purrr::pluck(
+#'   observations(x_dup),
+#'   "timestamp",
+#'   3
+#' )
 #'
 #' # duplicates are removed by default by camtrapR_recordTable()
 #' camtrapR_recordTable(x_dup)
18 changes: 15 additions & 3 deletions man/camtrapR_recordTable.Rd

Some generated files are not rendered by default.

17 changes: 9 additions & 8 deletions man/get_n_individuals.Rd


26 changes: 21 additions & 5 deletions tests/testthat/test-get_record_table.R
@@ -212,18 +212,34 @@ test_that(paste(
 x_dup <- x
 # create duplicates at 2020-07-29 05:46:48, location: B_DL_val 5_beek kleine vijver
 # use 3rd observation as the first two are unknown or blank (= no animal)
-x_dup$data$observations[,"sequenceID"] <- observations(x_dup)$sequenceID[3]
-x_dup$data$observations[, "deploymentID"] <- observations(x_dup)$deploymentID[3]
-x_dup$data$observations[, "timestamp"] <- observations(x_dup)$timestamp[3]
-x_dup$data$observations[, "scientificName"] <- observations(x_dup)$scientificName[3]
+x_dup$data$observations[,"sequenceID"] <- purrr::pluck(
+  observations(x_dup),
+  "sequenceID",
+  3
+)
+x_dup$data$observations[, "deploymentID"] <- purrr::pluck(
+  observations(x_dup),
+  "deploymentID",
+  3
+)
+x_dup$data$observations[, "timestamp"] <- purrr::pluck(
+  observations(x_dup),
+  "timestamp",
+  3
+)
+x_dup$data$observations[, "scientificName"] <- purrr::pluck(
+  observations(x_dup),
+  "scientificName",
+  3
+)
 
 rec_table <- camtrapR_recordTable(x_dup)
 rec_table_dup <- camtrapR_recordTable(x_dup,
   removeDuplicateRecords = FALSE
 )
 expect_identical(nrow(rec_table), 1L)
 expect_identical(
-  rec_table$DateTimeOriginal, observations(x)$timestamp[3]
+  rec_table$DateTimeOriginal, purrr::pluck(observations(x), "timestamp", 3)
 )
 expect_identical(rec_table$delta.time.secs, 0)
 expect_identical(names(rec_table_dup), names(rec_table))
18 changes: 15 additions & 3 deletions vignettes/record-table.Rmd
@@ -121,9 +121,21 @@ Let's create an easy example with a duplicate based on `x`:
 ```{r dummy_data_with_duplicates}
 x_dup <- x
 # create a duplicate at 2020-07-29 05:46:48, location: B_DL_val 5_beek kleine vijver
-x_dup$data$observations[4,"sequenceID"] <- observations(x_dup)$sequenceID[3]
-x_dup$data$observations[4, "deploymentID"] <- observations(x_dup)$deploymentID[3]
-x_dup$data$observations[4, "timestamp"] <- observations(x_dup)$timestamp[3]
+x_dup$data$observations[4,"sequenceID"] <- purrr::pluck(
+  observations(x_dup),
+  "sequenceID",
+  3
+)
+x_dup$data$observations[4, "deploymentID"] <- purrr::pluck(
+  observations(x_dup),
+  "deploymentID",
+  3
+)
+x_dup$data$observations[4, "timestamp"] <- purrr::pluck(
+  observations(x_dup),
+  "timestamp",
+  3
+)
 ```
 
 Record table without duplicates:
