Description
Hi, I am an enthusiast of the accessmap project and am trying to deploy it locally to do some learning. I was following the README instructions step by step. After fixing some bugs while setting up the environment, I thought the data extraction would go smoothly, but errors still occurred.
The errors are shown below:
```
~/Desktop/accessmapp-playground/opensidewalks-data/cities/seattle$ poetry run snakemake -j 8 --snakefile Snakefile.fetch
The currently activated Python version 3.6.9 is not supported by the project (^3.8).
Trying to find and use a compatible version.
Using python3.8 (3.8.11)
Building DAG of jobs...
Using shell: /bin/bash
Provided cores: 8
Rules claiming more threads will be scaled down.
Job counts:
count jobs
1 all
1 fetch_crosswalks
1 fetch_curbramps
1 fetch_dem
1 fetch_sidewalks
1 fetch_snd
1 fetch_streets
7

[Thu Aug 26 17:14:11 2021]
rule fetch_curbramps:
output: data_sources/curbramps.geojson
jobid: 2

[Thu Aug 26 17:14:11 2021]
rule fetch_crosswalks:
output: data_sources/crosswalks.geojson
jobid: 3

[Thu Aug 26 17:14:11 2021]
rule fetch_snd:
input: data-seattlecitygis.opendata.arcgis.com/datasets/0dd0ad79dc3845f3a296215d7c448a0d_2.geojson
output: data_sources/street_network_database.geojson
jobid: 5

[Thu Aug 26 17:14:11 2021]
rule fetch_sidewalks:
output: data_sources/sidewalks.geojson
jobid: 1

[Thu Aug 26 17:14:11 2021]
rule fetch_streets:
output: data_sources/streets.geojson
jobid: 4

[Thu Aug 26 17:14:11 2021]
rule fetch_dem:
output: data_sources/dem.tif
jobid: 6

Job counts:
count jobs
1 fetch_curbramps
1
Job counts:
count jobs
1 fetch_dem
1
Job counts:
count jobs
1 fetch_sidewalks
1
Job counts:
count jobs
1 fetch_crosswalks
1
Job counts:
count jobs
1 fetch_snd
1
Job counts:
count jobs
1 fetch_streets
1
[Thu Aug 26 17:14:13 2021]
Error in rule fetch_dem:
jobid: 0
output: data_sources/dem.tif

RuleException:
HTTPError in line 128 of /home/alpha/Desktop/accessmapp-playground/opensidewalks-data/cities/seattle/Snakefile.fetch:
404 Client Error: Not Found for url: https://prd-tnm.s3.amazonaws.com/StagedProducts/Elevation/13/ArcGrid/USGS_NED_13_n48w123_ArcGrid.zip
File "/home/alpha/.cache/pypoetry/virtualenvs/opensidewalks-data-d3J5Akl6-py3.8/lib/python3.8/site-packages/snakemake/executors/init.py", line 2168, in run_wrapper
File "/home/alpha/Desktop/accessmapp-playground/opensidewalks-data/cities/seattle/Snakefile.fetch", line 128, in __rule_fetch_dem
File "/home/alpha/.cache/pypoetry/virtualenvs/opensidewalks-data-d3J5Akl6-py3.8/lib/python3.8/site-packages/requests/models.py", line 941, in raise_for_status
File "/home/alpha/.cache/pypoetry/virtualenvs/opensidewalks-data-d3J5Akl6-py3.8/lib/python3.8/site-packages/snakemake/executors/init.py", line 529, in _callback
File "/usr/lib/python3.8/concurrent/futures/thread.py", line 57, in run
File "/home/alpha/.cache/pypoetry/virtualenvs/opensidewalks-data-d3J5Akl6-py3.8/lib/python3.8/site-packages/snakemake/executors/init.py", line 515, in cached_or_run
File "/home/alpha/.cache/pypoetry/virtualenvs/opensidewalks-data-d3J5Akl6-py3.8/lib/python3.8/site-packages/snakemake/executors/init.py", line 2199, in run_wrapper
Exiting because a job execution failed. Look above for error message
mv: cannot move 'data-seattlecitygis.opendata.arcgis.com/datasets/0dd0ad79dc3845f3a296215d7c448a0d_2.geojson' to 'data_sources/street_network_database.geojson': No such file or directory
[Thu Aug 26 17:14:16 2021]
Error in rule fetch_snd:
jobid: 0
output: data_sources/street_network_database.geojson

RuleException:
CalledProcessError in line 74 of /home/alpha/Desktop/accessmapp-playground/opensidewalks-data/cities/seattle/Snakefile.fetch:
Command 'set -euo pipefail; mv data-seattlecitygis.opendata.arcgis.com/datasets/0dd0ad79dc3845f3a296215d7c448a0d_2.geojson data_sources/street_network_database.geojson' returned non-zero exit status 1.
File "/home/alpha/.cache/pypoetry/virtualenvs/opensidewalks-data-d3J5Akl6-py3.8/lib/python3.8/site-packages/snakemake/executors/init.py", line 2168, in run_wrapper
File "/home/alpha/Desktop/accessmapp-playground/opensidewalks-data/cities/seattle/Snakefile.fetch", line 74, in __rule_fetch_snd
File "/home/alpha/.cache/pypoetry/virtualenvs/opensidewalks-data-d3J5Akl6-py3.8/lib/python3.8/site-packages/snakemake/executors/init.py", line 529, in _callback
File "/usr/lib/python3.8/concurrent/futures/thread.py", line 57, in run
File "/home/alpha/.cache/pypoetry/virtualenvs/opensidewalks-data-d3J5Akl6-py3.8/lib/python3.8/site-packages/snakemake/executors/init.py", line 515, in cached_or_run
File "/home/alpha/.cache/pypoetry/virtualenvs/opensidewalks-data-d3J5Akl6-py3.8/lib/python3.8/site-packages/snakemake/executors/init.py", line 2199, in run_wrapper
Exiting because a job execution failed. Look above for error message
[Thu Aug 26 17:14:26 2021]
Error in rule fetch_crosswalks:
jobid: 0
output: data_sources/crosswalks.geojson

RuleException:
FileNotFoundError in line 117 of /home/alpha/Desktop/accessmapp-playground/opensidewalks-data/cities/seattle/Snakefile.fetch:
[Errno 2] No such file or directory: 'data_sources/crosswalks.geojson'
File "/home/alpha/.cache/pypoetry/virtualenvs/opensidewalks-data-d3J5Akl6-py3.8/lib/python3.8/site-packages/snakemake/executors/init.py", line 2168, in run_wrapper
File "/home/alpha/Desktop/accessmapp-playground/opensidewalks-data/cities/seattle/Snakefile.fetch", line 117, in __rule_fetch_crosswalks
File "/home/alpha/.cache/pypoetry/virtualenvs/opensidewalks-data-d3J5Akl6-py3.8/lib/python3.8/site-packages/snakemake/executors/init.py", line 529, in _callback
File "/usr/lib/python3.8/concurrent/futures/thread.py", line 57, in run
File "/home/alpha/.cache/pypoetry/virtualenvs/opensidewalks-data-d3J5Akl6-py3.8/lib/python3.8/site-packages/snakemake/executors/init.py", line 515, in cached_or_run
File "/home/alpha/.cache/pypoetry/virtualenvs/opensidewalks-data-d3J5Akl6-py3.8/lib/python3.8/site-packages/snakemake/executors/init.py", line 2199, in run_wrapper
Exiting because a job execution failed. Look above for error message
```
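The fetch_dem failure looks like a dead link: the USGS staged-products URL hard-coded in Snakefile.fetch now returns 404. Below is the minimal check I ran outside of Snakemake to confirm it; the second, GeoTIFF-style URL is only my guess at where the tile may have moved, not something taken from the Snakefile:

```python
# Probe the DEM URL from Snakefile.fetch plus one guessed alternative location.
# The second URL (USGS "current" 1/3 arc-second TIFF layout) is an assumption.
import requests

candidates = [
    # URL from the Snakefile (returns 404 for me)
    "https://prd-tnm.s3.amazonaws.com/StagedProducts/Elevation/13/ArcGrid/"
    "USGS_NED_13_n48w123_ArcGrid.zip",
    # Guessed newer layout -- may or may not exist
    "https://prd-tnm.s3.amazonaws.com/StagedProducts/Elevation/13/TIFF/"
    "current/n48w123/USGS_13_n48w123.tif",
]

for url in candidates:
    resp = requests.head(url, allow_redirects=True, timeout=30)
    print(resp.status_code, url)
```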
Maybe I do not completely understand the data ETL pipeline behind the scenes. I would really appreciate any help!
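Also, in case it helps: the fetch_snd and fetch_crosswalks errors look like downstream symptoms of downloads that never produced a file, so the subsequent mv / read fails. As a workaround I tried fetching the street network dataset by hand and placing it where the rule expects it. The URL below is just the rule's mirrored input path from the log with https:// prepended, so treating it as the actual download source is an assumption on my part:

```python
# Manual fallback for the fetch_snd step: download the dataset directly and
# write it to the path the rule's mv expects. The URL is reconstructed from
# the input path shown in the log (assumption on my part).
import pathlib

import requests

url = (
    "https://data-seattlecitygis.opendata.arcgis.com/datasets/"
    "0dd0ad79dc3845f3a296215d7c448a0d_2.geojson"
)
out = pathlib.Path("data_sources/street_network_database.geojson")
out.parent.mkdir(parents=True, exist_ok=True)

resp = requests.get(url, timeout=300)
resp.raise_for_status()
out.write_bytes(resp.content)
print(f"wrote {out} ({len(resp.content)} bytes)")
```

With that file in place, re-running the fetch step at least gets past the mv error for me, though I am not sure this is the intended way to populate data_sources/.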