Merged

Changes from all commits (61 commits)
9a30dcc
Update convert.input.R
moki1202 Mar 14, 2021
57290cd
Update download.url.R
moki1202 Mar 14, 2021
9c86b3f
Update met.process.R
moki1202 Mar 14, 2021
976a155
Update download.NLDAS.R
moki1202 Mar 20, 2021
675bcfa
Update download.GLDAS.R
moki1202 Mar 20, 2021
6dea9ad
Update extract_soil_nc.R
moki1202 Mar 20, 2021
eb977af
Update LandTrendr.AGB.R
moki1202 Mar 20, 2021
b172a68
Update land.utils.R
moki1202 Mar 20, 2021
15185f1
Update salix_benchmarks.Rmd
moki1202 Mar 20, 2021
55dccd8
Update convert.input.R
moki1202 Mar 21, 2021
4c3ccdf
Update convert.input.R
moki1202 Mar 21, 2021
95b8dc7
Update browndog.R
moki1202 Mar 21, 2021
ab9cbca
Update met.process.R
moki1202 Mar 21, 2021
4a9f382
Update DESCRIPTION
moki1202 Mar 21, 2021
fbcb2e5
Update abbreviated_workflow_SIPNET.R
moki1202 Mar 21, 2021
eeb787c
Update workflow.bm.R
moki1202 Mar 21, 2021
cc00f84
Update workflow.pda.R
moki1202 Mar 21, 2021
48c31f9
Update workflow.wcr.assim.R
moki1202 Mar 21, 2021
dd18db2
Update workflow.R
moki1202 Mar 21, 2021
53dcf86
Update DESCRIPTION
moki1202 Mar 21, 2021
2b50e5f
Update DESCRIPTION
moki1202 Mar 21, 2021
8fc0f5e
Update pecan.depends.R
moki1202 Mar 21, 2021
40ea64d
Update running_maat_in_pecan.Rmd
moki1202 Mar 21, 2021
7864b50
Update graph_fluxtowers.R
moki1202 Mar 21, 2021
7789db7
Update graph_SDA_fluxtowers.R
moki1202 Mar 21, 2021
6c3df94
Update workflow.template.R
moki1202 Mar 21, 2021
6110737
Update DESCRIPTION
moki1202 Mar 21, 2021
a0b835c
Update DESCRIPTION
moki1202 Mar 21, 2021
8c64719
Update LoadFLUXNETsites.R
moki1202 Mar 21, 2021
4bc6721
Update LoadPalEONsites.R
moki1202 Mar 21, 2021
f689ce5
Merge branch 'develop' into patch-8
infotroph May 23, 2021
8795d84
Merge branch 'develop' into patch-8
infotroph May 24, 2021
ddbbbe5
Merge branch 'develop' into replace-rcurl
infotroph Feb 16, 2022
6c3bff1
curl calls in PEcAn.utils, plus argument fix in download.url
infotroph Feb 16, 2022
ff15be7
viz was declaring (RC|c)url without using + reorder deps
infotroph Feb 17, 2022
33d00d2
Revert "Update browndog.R"
infotroph Feb 18, 2022
f0f8755
Revert "Update graph_SDA_fluxtowers.R"
infotroph Feb 20, 2022
e35e107
Revert "Update graph_fluxtowers.R"
infotroph Feb 20, 2022
7048846
Revert "Update workflow.template.R"
infotroph Feb 20, 2022
6159d54
Revert "Update LoadFLUXNETsites.R"
infotroph Feb 25, 2022
02db8b6
Revert "Update LoadPalEONsites.R"
infotroph Feb 25, 2022
57a8a3d
remove unneeded library calls
infotroph Feb 25, 2022
a7bf6d4
typo
infotroph Mar 3, 2022
7abfd12
typos
infotroph Mar 3, 2022
5afb1ce
Remove stray lines describing unrelated function
infotroph Mar 3, 2022
d669325
roxygen
infotroph Mar 3, 2022
34f2705
handle is already a connection
infotroph Mar 3, 2022
144abb0
Merge branch 'develop' into replace-rcurl
infotroph Mar 9, 2022
52eff3c
fix curl_download call in download.LandTrendr.AGB
infotroph Mar 26, 2022
e2e6b11
fix curl_download in get.elevation
infotroph Mar 26, 2022
5a5de06
fix browndog POST in met.process
infotroph Mar 26, 2022
5d37712
Merge branch 'develop' into replace-rcurl
infotroph Jun 23, 2022
6fddaee
whitespace
infotroph Jun 23, 2022
5dcd1c1
declare curl in data.atm
infotroph Jun 25, 2022
f7f0080
remove unused RCurl load
infotroph Jun 25, 2022
8d86c7f
typo
infotroph Jun 26, 2022
7a25855
Merge branch 'develop' into replace-rcurl
infotroph Sep 24, 2022
17d0096
sort
infotroph Sep 24, 2022
c52fd84
db description
infotroph Sep 24, 2022
3791b40
leave RCurl n SSURGO calls; will fix in #2964
infotroph Sep 24, 2022
525f0a7
update import count
infotroph Sep 24, 2022
2 changes: 1 addition & 1 deletion .github/workflows/docker.yml
@@ -9,7 +9,7 @@ name: Docker
 # latest and develop
 # - when a pull request is created and updated to make sure the
 # Dockerfile is still valid.
-# To be able to push to dockerhub, this execpts the following
+# To be able to push to dockerhub, this expects the following
 # secrets to be set in the project:
 # - DOCKERHUB_USERNAME : username that can push to the org
 # - DOCKERHUB_PASSWORD : password asscoaited with the username
2 changes: 2 additions & 0 deletions CHANGELOG.md
@@ -56,6 +56,8 @@ convert data for a single PFT fixed (#1329, #2974, #2981)
 
 - Using R4.0 and R4.1 tags to build PEcAn. Default is now 4.1
 - Database connections consistently use `DBI::dbConnect` instead of the deprecated `dplyr::src_postgres` (#2881). This change should be invisible to most users, but it involved converting a lot of internal variables from `bety$con` to `con`. If you see errors involving these symbols it means we missed a place, so please report them as bugs.
+- `PEcAn.utils::download.url` argument `retry404` is now renamed to `retry` and
+  now functions as intended (it was being ignored completely before).
 - Update URL for MERRA downloads (#2888)
 - PEcAn.logger is now BSD-3 License
 - Skipped ICOS and MERRA download tests when running in github actions
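For anyone updating call sites, a minimal sketch of the renamed argument in use; the URL and output path here are placeholders, not part of the changelog:

```r
# Hypothetical call site; URL and file path are placeholders
result <- PEcAn.utils::download.url(
  url = "http://localhost/data/example.nc",
  file = "downloads/example.nc",
  timeout = 600,
  retry = TRUE)  # was `retry404` before this change

if (is.na(result)) {
  message("file never became available within the timeout")
}
```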
2 changes: 1 addition & 1 deletion base/db/DESCRIPTION
@@ -41,6 +41,7 @@ Description: The Predictive Ecosystem Carbon Analyzer (PEcAn) is a scientific
     streamline the interaction between data and models, and to improve the
     efficacy of scientific investigation.
 Imports:
+    curl,
     DBI,
     dbplyr,
     dplyr,
@@ -55,7 +56,6 @@ Imports:
     PEcAn.utils,
     purrr,
     R.utils,
-    RCurl,
     rlang,
     tibble,
     tidyr,
32 changes: 22 additions & 10 deletions base/db/R/convert_input.R
@@ -605,18 +605,27 @@ convert_input <-
       }
 
       # create curl options
+      curloptions <- list(followlocation = TRUE)
       if (!is.null(browndog$username) && !is.null(browndog$password)) {
-        userpwd <- paste(browndog$username, browndog$password, sep = ":")
-        curloptions <- list(userpwd = userpwd, httpauth = 1L)
+        curloptions$userpwd = paste(
+          browndog$username, browndog$password, sep = ":")
+        curloptions$httpauth = 1L
       }
-      curloptions <- c(curloptions, followlocation = TRUE)
 
       # check if we can do conversion
-      out.html <- RCurl::getURL(paste0("http://dap-dev.ncsa.illinois.edu:8184/inputs/",
-                                       browndog$inputtype), .opts = curloptions)
-      if (outputtype %in% unlist(strsplit(out.html, "\n"))) {
-        PEcAn.logger::logger.info(paste("Conversion from", browndog$inputtype, "to", outputtype,
-                                        "through Brown Dog"))
+      h <- curl::new_handle()
+      curl::handle_setopt(h, .list = curloptions)
+      out.html <- readLines(
+        curl::curl(
+          url = paste0(
+            "http://dap-dev.ncsa.illinois.edu:8184/inputs/",
+            browndog$inputtype),
+          handle = h))
+      if (outputtype %in% out.html) {
+        PEcAn.logger::logger.info(
+          "Conversion from", browndog$inputtype,
+          "to", outputtype,
+          "through Brown Dog")
         conversion <- "browndog"
       }
     }
@@ -637,13 +646,16 @@
       }
 
       # post zipped file to Brown Dog
-      html <- RCurl::postForm(url, fileData = RCurl::fileUpload(zipfile), .opts = curloptions)
+      h <- curl::new_handle()
+      curl::handle_setopt(handle = h, .list = curloptions)
+      curl::handle_setform(handle = h, fileData = curl::form_file(zipfile))
+      html <- readLines(curl::curl(url = url, handle = h))
       link <- XML::getHTMLLinks(html)
       file.remove(zipfile)
 
       # download converted file
       outfile <- file.path(outfolder, unlist(strsplit(basename(link), "_"))[2])
-      PEcAn.utils::download.url(url = link, file = outfile, timeout = 600, .opts = curloptions, retry404 = TRUE)
+      PEcAn.utils::download.url(url = link, file = outfile, timeout = 600, .opts = curloptions, retry = TRUE)
 
       # unzip downloaded file if necessary
       if (file.exists(outfile)) {
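The upload flow above can be exercised on its own. A minimal sketch, assuming a reachable endpoint and a local example.zip (both placeholders):

```r
library(curl)

# Same options convert_input() assembles before the request
curloptions <- list(followlocation = TRUE)
h <- new_handle()
handle_setopt(h, .list = curloptions)

# Attaching a form turns the request into a multipart POST
handle_setform(h, fileData = form_file("example.zip"))  # placeholder file

# Opening the connection performs the POST; readLines() returns the body
html <- readLines(curl(url = "http://example.org/convert/", handle = h))
```

Note that `readLines()` on a `curl()` connection already splits the body into lines, which is why the rewritten check is `outputtype %in% out.html` with no `strsplit()`.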
2 changes: 1 addition & 1 deletion base/utils/DESCRIPTION
@@ -33,12 +33,12 @@ Description: The Predictive Ecosystem Carbon Analyzer
     investigation.
 Imports:
     abind (>= 1.4.5),
+    curl,
     dplyr,
     lubridate (>= 1.6.0),
     magrittr,
     ncdf4 (>= 1.15),
     PEcAn.logger,
-    RCurl,
     rlang,
     stringi,
     units
31 changes: 21 additions & 10 deletions base/utils/R/download.url.R
@@ -1,6 +1,6 @@
 ##' Try and download a file.
 ##'
-##' This will download a file, if retry404 and 404 is returned it will
+##' This will download a file, if retry is set and 404 is returned it will
 ##' wait until the file is available. If the file is still not available
 ##' after timeout tries, it will return NA. If the file is downloaded
 ##' it will return the name of the file
@@ -13,26 +13,37 @@
 ##' @param timeout number of seconds to wait for file (default 600)
 ##' @param .opts list of options for curl, for example to download from a
 ##'   protected site use list(userpwd=userpass, httpauth = 1L)
-##' @param retry404 retry on a 404, this is used by Brown Dog
+##' @param retry retry if url not found yet, this is used by Brown Dog
 ##' @return returns name of file if successful or NA if not.
 ##'
 ##' @examples
 ##' \dontrun{
 ##' download.url('http://localhost/', index.html)
 ##' }
-download.url <- function(url, file, timeout = 600, .opts = list(), retry404 = TRUE) {
-  dir.create(basename(file), recursive = TRUE)
+download.url <- function(url, file, timeout = 600, .opts = list(), retry = TRUE) {
   count <- 0
-  while (!RCurl::url.exists(url, .opts = .opts) && count < timeout) {
+  while (retry && !url_found(url) && count < timeout) {
     count <- count + 1
     Sys.sleep(1)
   }
-  if (count >= timeout) {
+  if (count >= timeout || (!retry && !url_found(url))) {
     return(NA)
   }
-  f <- RCurl::CFILE(file, mode = "wb")
-  RCurl::curlPerform(url = url, writedata = f@ref, .opts = .opts)
-  RCurl::close(f)
+  dir.create(dirname(file), recursive = TRUE)
+  res <- curl::curl_download(
+    url = url,
+    destfile = file,
+    handle = curl::new_handle(.list = .opts))
 
-  return(file)
+  res
 } # download.url
+
+
+# An approximate replacement for RCurl::url.exists
+# Treats any 200 status as success (NB does not follow redirects!)
+url_found <- function(url) {
+  h <- curl::new_handle(nobody = 1L) # "nobody" = header-only request
+  res <- curl_fetch_memory(url, handle = h)
+
+  res$status_code %/% 200 == 1
+}
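A quick way to see what `url_found()` reports, assuming any reachable test URL:

```r
library(curl)

# Header-only request, as in url_found(); the URL is a placeholder
h <- new_handle(nobody = 1L)
res <- curl_fetch_memory("http://example.org/", handle = h)
res$status_code               # e.g. 200
res$status_code %/% 200 == 1  # TRUE for any status from 200 through 399
```

Because of the integer division, every status from 200 through 399 counts as found, so an unfollowed 3xx redirect is reported as success without checking its target.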
2 changes: 0 additions & 2 deletions base/utils/R/mail.R
@@ -9,8 +9,6 @@
 
 ##' Sends email. This assumes the program sendmail is installed.
 ##'
-##' @title Clear EBI-CLUSTER worker node local scratch directories of old PEcAn output
-##' @name sendmail
 ##' @param from the sender of the mail message
 ##' @param to the receipient of the mail message
 ##' @param subject the subject of the mail message
6 changes: 3 additions & 3 deletions base/utils/man/download.url.Rd

(Generated file; diff not rendered.)

2 changes: 1 addition & 1 deletion base/utils/man/sendmail.Rd

(Generated file; diff not rendered.)

1 change: 0 additions & 1 deletion base/visualization/DESCRIPTION
@@ -37,7 +37,6 @@ Imports:
     PEcAn.logger,
     PEcAn.utils,
     plyr (>= 1.8.4),
-    RCurl,
     reshape2,
     rlang,
     stringr(>= 1.1.0)
1 change: 1 addition & 0 deletions docker/depends/pecan.depends.R
@@ -29,6 +29,7 @@ wanted <- c(
   'BrownDog',
   'coda',
   'corrplot',
+  'curl',
   'data.table',
   'dataone',
   'datapack',
20 changes: 11 additions & 9 deletions models/biocro/inst/salix_benchmarks.Rmd
@@ -2,10 +2,11 @@ BioCro fitting, parameterization, and testing
 ========================================================
 
 ```{r}
-require(data.table)
-require(lubridate)
-require(ggplot2)
-require(PEcAn.DB)
+library(data.table)
+library(lubridate)
+library(ggplot2)
+library(PEcAn.DB)
+library(curl)
 load(system.file("extdata", "salix.RData", package = "BioCro"))
 settings.xml <- system.file("extdata/pecan.biocro.xml",
                             package = "PEcAn.BIOCRO")
@@ -34,10 +35,11 @@ salix <- salix.us[stand_age > 2,list(lat = mean(lat), lon = mean(lon), yield = m
 
 ```{r}
 
-require(RCurl)
-file.url <- getURL("https://www.betydb.org/miscanthusyield.csv",
-                   ssl.verifypeer = FALSE)
-mxg <- read.csv(textConnection(file.url))
+
+mxg_file_con <- curl(
+  url = "https://www.betydb.org/miscanthusyield.csv",
+  handle = new_handle(ssl_verifypeer = FALSE))
+mxg <- read.csv(mxg_file_con)
 
 
 salix$model <- vector(mode="numeric", length = nrow(salix))
@@ -103,4 +105,4 @@ trait.summary <- salix.traits[sort(n),
 trait.summary2 <- trait.summary[with(trait.summary, rank(n + rank(trait)/1000)),]
 ggplot(data = trait.summary, aes(x = trait, y = n, order = n + rank(trait)/100)) + geom_point() + geom_linerange(aes(ymin = 0, ymax = n)) #+ coord_flip() + theme_bw()
 
-```
+```
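The connection-based download in this chunk, as a self-contained sketch; `read.csv()` opens the connection itself and closes it when done:

```r
library(curl)

# TLS peer verification disabled, mirroring the chunk above
con <- curl(
  url = "https://www.betydb.org/miscanthusyield.csv",
  handle = new_handle(ssl_verifypeer = FALSE))
mxg <- read.csv(con)
head(mxg)
```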
1 change: 0 additions & 1 deletion models/maat/vignettes/running_maat_in_pecan.Rmd
@@ -190,7 +190,6 @@ In this example we will run ten MAAT model ensembles in serial based on paramete
 ```{r run_maat}
 library(PEcAn.all)
 library(PEcAn.utils)
-library(RCurl)
 
 setwd("~")
 getwd()
2 changes: 1 addition & 1 deletion modules/data.atmosphere/DESCRIPTION
@@ -25,6 +25,7 @@ Depends:
 Imports:
     abind (>= 1.4.5),
     amerifluxr,
+    curl,
     data.table,
     dplyr,
     geonames (> 0.998),
@@ -45,7 +46,6 @@ Imports:
     PEcAn.utils,
     purrr (>= 0.2.3),
     raster,
-    RCurl,
     REddyProc,
     reshape2,
     rgdal,
4 changes: 2 additions & 2 deletions modules/data.atmosphere/R/download.GLDAS.R
@@ -128,7 +128,7 @@ download.GLDAS <- function(outfolder, start_date, end_date, site_id, lat.in, lon
     dap_file <- paste0(dap_base, "/", year, "/", doy, "/", dap.log[h, 1], ".ascii?")
 
     # Query lat/lon
-    latlon <- RCurl::getURL(paste0(dap_file, "lat[0:1:599],lon[0:1:1439]"))
+    latlon <- curl::curl_download(paste0(dap_file, "lat[0:1:599],lon[0:1:1439]"))
     lat.ind <- gregexpr("lat", latlon)
     lon.ind <- gregexpr("lon", latlon)
     lats <- as.vector(utils::read.table(
@@ -153,7 +153,7 @@ download.GLDAS <- function(outfolder, start_date, end_date, site_id, lat.in, lon
     }
     dap_query <- substr(dap_query, 2, nchar(dap_query))
 
-    dap.out <- RCurl::getURL(paste0(dap_file, dap_query))
+    dap.out <- curl::curl_download(paste0(dap_file, dap_query))
     for (v in seq_len(nrow(var))) {
       var.now <- var$DAP.name[v]
       ind.1 <- gregexpr(paste(var.now, var.now, sep = "."), dap.out)
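For reference, a sketch of pulling an OPeNDAP ASCII response straight into memory with the curl package; the endpoint is a placeholder, and `curl_fetch_memory()` returns the body as raw bytes, so it is decoded with `rawToChar()`:

```r
library(curl)

# Placeholder DAP endpoint; any OPeNDAP .ascii? URL has the same shape
dap_file <- "http://example.org/GLDAS.ascii?"

res <- curl_fetch_memory(paste0(dap_file, "lat[0:1:599],lon[0:1:1439]"))
latlon <- rawToChar(res$content)  # one character string, ready for gregexpr()
```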
4 changes: 2 additions & 2 deletions modules/data.atmosphere/R/download.NLDAS.R
@@ -123,7 +123,7 @@ download.NLDAS <- function(outfolder, start_date, end_date, site_id, lat.in, lon
                        mo.now, day.mo, ".", hr, ".002.grb.ascii?")
 
     # Query lat/lon
-    latlon <- RCurl::getURL(paste0(dap_file, "lat[0:1:223],lon[0:1:463]"))
+    latlon <- curl::curl_download(paste0(dap_file, "lat[0:1:223],lon[0:1:463]"))
     lat.ind <- gregexpr("lat", latlon)
     lon.ind <- gregexpr("lon", latlon)
     lats <- as.vector(utils::read.table(con <- textConnection(substr(latlon, lat.ind[[1]][3],
@@ -146,7 +146,7 @@ download.NLDAS <- function(outfolder, start_date, end_date, site_id, lat.in, lon
     }
     dap_query <- substr(dap_query, 2, nchar(dap_query))
 
-    dap.out <- RCurl::getURL(paste0(dap_file, dap_query))
+    dap.out <- curl::curl_download(paste0(dap_file, dap_query))
     for (v in seq_len(nrow(var))) {
       var.now <- var$DAP.name[v]
       ind.1 <- gregexpr(paste(var.now, var.now, sep = "."), dap.out)
10 changes: 8 additions & 2 deletions modules/data.atmosphere/R/met.process.R
@@ -527,8 +527,14 @@ browndog.met <- function(browndog, source, site, start_date, end_date, model, di
 
   userpass <- paste(browndog$username, browndog$password, sep = ":")
   curloptions <- list(userpwd = userpass, httpauth = 1L, followlocation = TRUE)
-  result <- RCurl::postForm(paste0(browndog$url, formatname, "/"),
-                            fileData = RCurl::fileUpload("pecan.xml", xmldata, "text/xml"), .opts = curloptions)
+
+  result <- httr::POST(
+    url = paste0(browndog$url, formatname, "/"),
+    config = do.call(httr::config, curloptions),
+    httr::content_type("text/xml"),
+    body = xmldata)
+  httr::warn_for_status(result)
+  result_txt <- httr::content(result, "text")
   url <- gsub(".*<a.*>(.*)</a>.*", "\\1", result)
   PEcAn.logger::logger.info("browndog download url :", url)
   downloadedfile <- PEcAn.utils::download.url(url, outputfile, 600, curloptions)
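A standalone version of this POST pattern, with placeholder endpoint, credentials, and body:

```r
library(httr)

# Placeholder credentials and endpoint; options mirror the list built above
curloptions <- list(userpwd = "user:pass", httpauth = 1L, followlocation = TRUE)

response <- POST(
  url = "http://example.org/convert/pecan.zip/",  # placeholder URL
  config = do.call(config, curloptions),
  content_type("text/xml"),
  body = "<pecan></pecan>")                       # placeholder XML payload
warn_for_status(response)

# Decode the body to text before pulling out the first link
response_txt <- content(response, "text")
download_link <- gsub(".*<a.*>(.*)</a>.*", "\\1", response_txt)
```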
3 changes: 2 additions & 1 deletion modules/data.land/DESCRIPTION
@@ -24,6 +24,7 @@ Description: The Predictive Ecosystem Carbon Analyzer (PEcAn) is a scientific
     efficacy of scientific investigation.
 Imports:
     coda,
+    curl,
     datapack,
     dplyr,
     dplR,
@@ -45,9 +46,9 @@ Imports:
     PEcAn.utils,
     PEcAn.visualization,
     purrr,
-    RCurl,
     rjags,
     rlang,
+    RCurl,
     RPostgreSQL,
     sf,
     sirt,
2 changes: 1 addition & 1 deletion modules/data.land/R/extract_soil_nc.R
@@ -77,7 +77,7 @@ extract_soil_gssurgo<-function(outdir, lat, lon, size=1, radius=500, depths=c(0.
   #the output is a gml file which need to be downloaded and read as a spatial file but I don't do that.
   #I just read the file as a text and parse it out and try to find the mukey==mapunitkey
   xmll <-
-    RCurl::getURL(mu.Path,
+    curl::curl_download(mu.Path,
       ssl.verifyhost = FALSE,
       ssl.verifypeer = FALSE
     )
2 changes: 1 addition & 1 deletion modules/data.land/R/land.utils.R
@@ -3,7 +3,7 @@ get.elevation <- function(lat, lon) {
 
   url <- paste("http://www.earthtools.org/height", lat, lon, sep = "/")
 
-  page <- RCurl::getURL(url)
+  page <- paste0(readLines(curl::curl(url)), collapse = "\n")
   ans <- XML::xmlTreeParse(page, useInternalNodes = TRUE)
   heightNode <- XML::xpathApply(ans, "//meters")[[1]]
   return(as.numeric(XML::xmlValue(heightNode)))
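The same fetch-then-parse pattern in isolation; the URL is a placeholder and the XML package is assumed available:

```r
library(curl)

url <- "http://example.org/height/40.1/-88.2"  # placeholder elevation service
page <- paste0(readLines(curl(url)), collapse = "\n")

doc <- XML::xmlTreeParse(page, useInternalNodes = TRUE)
as.numeric(XML::xmlValue(XML::xpathApply(doc, "//meters")[[1]]))
```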
4 changes: 2 additions & 2 deletions modules/data.land/tests/Rcheck_reference.log
@@ -10,12 +10,12 @@
 * checking package namespace information ... OK
 * checking package dependencies ... OK
 
-Imports includes 32 non-default packages.
+Imports includes 33 non-default packages.
 Importing from so many packages makes the package vulnerable to any of
 them becoming unavailable. Move as many as possible to Suggests and
 use conditionally.
 * checking package dependencies ... NOTE
-Imports includes 32 non-default packages.
+Imports includes 33 non-default packages.
 Importing from so many packages makes the package vulnerable to any of
 them becoming unavailable. Move as many as possible to Suggests and
 use conditionally.