AutoZipping/Deleting Files #75
NickCondon
started this conversation in
Ideas
Replies: 1 comment
Rubbiya, Roger, and I just had our first discussion regarding data reduction last Friday. It is clear that everyone has been thinking about this, but there is no consistent way to do it. What we planned is to allow users to flag files to be deleted, but nothing is concrete yet. I need to understand more about your criteria for reducing data.
This may not be a 'portal' thing, but rather a general RDM thing, or perhaps a Pitschi tool...
Files/directories older than X days (e.g. 30?) get auto-zipped and flagged for removal/archival.
At that point, metadata is scraped from the file, and a job submission 'template' could be created at that location to 'regenerate this data'.
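The auto-zip step could be sketched roughly like this, assuming a simple mtime-based staleness test and a sidecar flag file; the `STALE_DAYS` default, the `.DELETE_FLAG` suffix, and the layout are all hypothetical, not anything the portal currently does:

```python
import time
import zipfile
from pathlib import Path

STALE_DAYS = 30  # the "X days" threshold; 30 is only the suggested default

def find_stale_files(root: Path, stale_days: int = STALE_DAYS):
    """Yield files under `root` not modified in the last `stale_days` days."""
    cutoff = time.time() - stale_days * 86400
    for path in root.rglob("*"):
        # Skip archives/flags created by an earlier pass of this tool.
        if path.suffix in (".zip", ".DELETE_FLAG"):
            continue
        if path.is_file() and path.stat().st_mtime < cutoff:
            yield path

def archive_and_flag(path: Path) -> Path:
    """Zip a stale file next to itself and flag it for removal/archival."""
    zip_path = path.parent / (path.name + ".zip")
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.write(path, arcname=path.name)
    # Sidecar flag marks the original for a (hypothetical) later cleanup job.
    (path.parent / (path.name + ".DELETE_FLAG")).touch()
    return zip_path
```

A cron/SLURM job running this sketch nightly over the instrument share would cover the "flag after X days" half; actually deleting the originals would stay a separate, human-reviewed step.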
Example use-case
Directory tree:
MyImagingData/
└── 20210815/
    ├── File1.sld, File2.sld
    └── File1_converted/
        ├── File1_t1.tif, File1_t2.tif, ...
        └── Decon_Output/
            └── File1_t1_decon.tif, File1_t2_decon.tif, ...
File1_t2_decon.tif would have the deconvolution job settings used to generate it appended to its metadata.
Upon going 'stale', the file, the SLURM template file, etc. would be flagged for deletion... this metadata would be scraped out so these files could be generated again later...
IF this regeneration file existed, it would put a lock on the parent files (i.e. the converted files) so they couldn't be deleted/moved...
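The locking rule could look something like the sketch below, assuming (hypothetically) that the regeneration template is a file named `regenerate.slurm` sitting in the output directory, and that its presence locks the input files in the parent directory:

```python
from pathlib import Path

# Hypothetical name for the regeneration job template left in the output dir.
REGEN_TEMPLATE = "regenerate.slurm"

def is_locked(parent_file: Path) -> bool:
    """True if any subdirectory beside `parent_file` (e.g. Decon_Output)
    holds a regeneration template that would need this file as input."""
    for entry in parent_file.parent.iterdir():
        if entry.is_dir() and (entry / REGEN_TEMPLATE).exists():
            return True
    return False

def safe_delete(path: Path) -> bool:
    """Delete a file only if no regeneration template depends on it."""
    if is_locked(path):
        return False  # locked: a child output still needs this input
    path.unlink()
    return True
```

Under this sketch, deleting `File1_t1.tif` would be refused while `Decon_Output/regenerate.slurm` exists, and allowed again once the deconvolved outputs and their template have themselves been cleaned up.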