Is your feature request related to a problem? Please describe.
The current documentation shows small examples. Working on real, large datasets differs in several ways and comes with specific needs (see the sketches after this list):

- limiting the number of Dask workers to bound memory usage
- running job-based (batch) rather than interactive sessions
- sometimes targeting a specific HPC setup
- requesting resources via SLURM, or using workflow managers such as Nextflow or Snakemake
- working with a distributed Dask cluster
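As a minimal sketch of the first point (limiting Dask workers to bound memory usage), something like the following could be shown; the store path, worker counts, and memory limit are illustrative placeholders, not values prescribed by this issue:

```python
# Sketch: cap local Dask resources before opening a large SpatialData store.
from dask.distributed import Client, LocalCluster
import spatialdata as sd

# Limit the number of workers and the memory each one may use so the whole
# pipeline stays within the node's RAM budget (values are placeholders).
cluster = LocalCluster(n_workers=4, threads_per_worker=2, memory_limit="8GB")
client = Client(cluster)

# read_zarr opens the store lazily; computation happens only when results
# are requested, so the limits above constrain actual memory use.
sdata = sd.read_zarr("large_dataset.zarr")  # placeholder path
print(sdata)

client.close()
cluster.close()
```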
Describe the solution you'd like
A documentation page should explain these workflows and link to existing resources. It would also be useful to gather existing examples of running large jobs with SpatialData.
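For the SLURM case mentioned above, such a page could include a sketch along these lines using dask-jobqueue (one possible tool, not prescribed by this issue); the partition name, resource requests, and zarr path are placeholders:

```python
# Sketch: run SpatialData computations on Dask workers requested from SLURM.
from dask.distributed import Client
from dask_jobqueue import SLURMCluster
import spatialdata as sd

# Each SLURM job launched by the cluster provides one Dask worker with the
# requested cores, memory, and walltime (placeholder values).
cluster = SLURMCluster(
    queue="normal",
    cores=8,
    memory="32GB",
    walltime="02:00:00",
)
cluster.scale(jobs=4)  # ask SLURM for 4 worker jobs
client = Client(cluster)

sdata = sd.read_zarr("large_dataset.zarr")  # lazy; computed on the workers
# ... run the actual analysis here ...

client.close()
cluster.close()
```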
Some resources: