
Commit

fix typos, etc.
annefou committed Jun 19, 2024
1 parent b334418 commit 91abd52
Showing 3 changed files with 12 additions and 14 deletions.
10 changes: 4 additions & 6 deletions docs/pangeo/visualization.ipynb
@@ -87,9 +87,7 @@
"\n",
"### Data\n",
"\n",
"We will use data that have been generated in the previous episode.\n",
"\n",
"If the dataset is not present in the same folder as this Jupyter notebook, it will be downloaded from zenodo using `pooch`, a very handy python-based library to download and cache your data files locally (see further info [here](https://www.fatiando.org/pooch/latest/index.html))."
"Data will be downloaded from zenodo using `pooch`, a very handy python-based library to download and cache your data files locally (see further info [here](https://www.fatiando.org/pooch/latest/index.html))."
]
},
{
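The new text above relies on `pooch` to fetch and cache the input file from Zenodo. A minimal sketch of such a download, assuming a hypothetical Zenodo record URL and leaving the checksum unverified (both are placeholders, not the values used in the lesson):

```python
import pooch

# Placeholder Zenodo record: replace XXXXXXX and the hash with the real values.
local_file = pooch.retrieve(
    url="https://zenodo.org/record/XXXXXXX/files/C_GLS_NDVI_20220101_20220701_Lombardia_S3_2_masked.netcdf",
    known_hash=None,  # e.g. "md5:..." to verify the download
)
print(local_file)  # path to the locally cached copy
```

`pooch.retrieve` downloads the file on the first call only; subsequent calls return the cached local path.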
@@ -202,7 +200,7 @@
},
"source": [
":::{tip}\n",
"If you get an error with the previous command, check the previous episode where the input file some_hash-C_GLS_NDVI_20220101_20220701_Lombardia_S3_2_masked.netcdf is downloaded locally and it is in the same directory as your Jupyter Notebook.\n",
"If you get an error with the previous command, check the input file some_hash-C_GLS_NDVI_20220101_20220701_Lombardia_S3_2_masked.netcdf is downloaded locally and it is in the same directory as your Jupyter Notebook.\n",
":::"
]
},
@@ -2321,7 +2319,7 @@
"tags": []
},
"source": [
"Having a look to data distribution can reveal a lot about the data."
"Let's have a look at the data distribution."
]
},
{
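For the "data distribution" cell above, a histogram is a quick way to inspect the distribution; a minimal sketch, assuming the NDVI file from this lesson has been opened and that its data variable is named `NDVI` (both names are illustrative):

```python
import xarray as xr
import matplotlib.pyplot as plt

# Illustrative file and variable names; adjust to the actual downloaded file.
ndvi = xr.open_dataset("C_GLS_NDVI_20220101_20220701_Lombardia_S3_2_masked.netcdf")["NDVI"]

# Histogram of all NDVI values, to see how they are distributed
ndvi.plot.hist(bins=50)
plt.show()
```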
@@ -2683,7 +2681,7 @@
"tags": []
},
"source": [
"### Plot a single point over the time dimension"
"### Plot a time series"
]
},
{
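The renamed section above plots a time series; typically this means selecting a single grid point and plotting it along the time dimension. A minimal sketch, under the same illustrative file and variable names as before (the coordinates and the lat/lon dimension names are also assumptions):

```python
import xarray as xr
import matplotlib.pyplot as plt

ndvi = xr.open_dataset("C_GLS_NDVI_20220101_20220701_Lombardia_S3_2_masked.netcdf")["NDVI"]

# Pick the grid cell nearest to a point of interest and plot it over time.
ndvi.sel(lat=45.5, lon=9.9, method="nearest").plot()
plt.show()
```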
14 changes: 7 additions & 7 deletions docs/pangeo/xarray_introduction.ipynb
@@ -1799,7 +1799,7 @@
"id": "ed915a07",
"metadata": {},
"source": [
"`cams.pm2p5_conc` is a 4-dimensional `xarray.DataArray` with tas values of type `float32`"
"`cams.pm2p5_conc` is a 4-dimensional `xarray.DataArray` with PM2.5 values of type `float32`"
]
},
{
@@ -2329,10 +2329,10 @@
"\n",
"Once a Data Array|Set is opened, xarray loads into memory only the coordinates and all the metadata needed to describe it.\n",
"The underlying data, the component written into the datastore, are loaded into memory as a NumPy array, only once directly accessed; once in there, it will be kept to avoid re-readings.\n",
"This brings the fact that it is good practice to have a look to the size of the data before accessing it. A classical mistake is to try loading arrays bigger than the memory with the obvious result of killing a notebook Kernel or Python process.\n",
"This brings the fact that it is good practice to have a look at the size of the data before accessing it. A classical mistake is to try loading arrays bigger than the memory with the obvious result of killing a notebook Kernel or Python process.\n",
"If the dataset does not fit in the available memory, then the only option will be to load it through the chunking; later on, in the tutorial 'chunking_introduction', we will introduce this concept.\n",
"\n",
"As the size of the data is not too big here, we can continue without any problem. But let's first have a look to the actual size and then how it impacts the memory once loaded into it."
"As the size of the data is not too big here, we can continue without any problem. But let's first have a look at the actual size and then how it impacts the memory once loaded into it."
]
},
{
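The paragraph above advises checking the data size before accessing it and, when it does not fit in memory, loading it in chunks. A minimal sketch of a chunked (lazy) open, assuming a placeholder file name and that dask is installed:

```python
import xarray as xr

# chunks= returns lazy, dask-backed arrays instead of loading everything at once.
cams = xr.open_dataset("cams_file.nc", chunks={"time": 24})  # placeholder file name

# Uncompressed in-memory size, computed without triggering any data loading.
print(f"{cams.pm2p5_conc.nbytes / 1024**2:.2f} MB")
```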
@@ -2360,7 +2360,7 @@
}
],
"source": [
"print(f'{np.round(cams.pm2p5_conc.nbytes / 1024**2, 2)} MB') # all the data are automatically loaded into memory as NumpyArray once they are accessed."
"print(f'{np.round(cams.pm2p5_conc.nbytes / 1024**2, 2)} MB') # all the data is automatically loaded into memory as NumpyArray once they are accessed."
]
},
{
@@ -10240,7 +10240,7 @@
"id": "19379f3b",
"metadata": {},
"source": [
"You can make a statistical operation over a dimension. For instance, let's retrieve the maximum tas value among all those available for different times, at each lat-lon location."
"You can make a statistical operation over a dimension. For instance, let's retrieve the maximum pm2p5_conc value among all those available for different times, at each lat-lon location."
]
},
{
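A minimal sketch of the reduction described above, assuming the dataset has been opened as `cams` (the file name is a placeholder) and that the dimension is named `time`:

```python
import xarray as xr

cams = xr.open_dataset("cams_file.nc")  # placeholder file name

# Maximum over the time dimension; only the spatial dimensions remain.
pm2p5_max = cams.pm2p5_conc.max(dim="time")
print(pm2p5_max.dims)  # e.g. the latitude/longitude dimensions of the file
```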
@@ -12631,7 +12631,7 @@
"\n",
"From the near-surface temperature dataset we already know that values are encoded as `float32`. A compression method can be defined as well; if the format is netCDF4 with the engine set to 'netcdf4' or 'h5netcdf' there are different compression options. The easiest solution is to stick with the default one for NetCDF4 files.\n",
"\n",
"Note that encoding parameters needs to be done through a nested dictionary and parameters has to be defined for each single variable."
"Note that encoding parameters needs to be done through a nested dictionary and parameters have to be defined for each single variable."
]
},
{
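The hunk above notes that the encoding must be passed as a nested dictionary, per variable. A minimal sketch using the default netCDF4 compression, assuming the dataset is available as `cams` (file names and compression level are illustrative):

```python
import xarray as xr

cams = xr.open_dataset("cams_file.nc")  # placeholder file name

# Per-variable encoding passed as a nested dictionary.
encoding = {
    "pm2p5_conc": {
        "dtype": "float32",
        "zlib": True,      # default deflate compression for netCDF4 files
        "complevel": 4,    # 1 = fastest, 9 = smallest
    }
}
cams.to_netcdf("cams_pm2p5_compressed.nc", encoding=encoding, engine="netcdf4")
```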
@@ -12677,7 +12677,7 @@
"id": "44c44075",
"metadata": {},
"source": [
"Through the datatype and the compression a compression of almost 10 time has been achieved; as drawback speed reading has been decreased."
"Through the datatype and the compression a compression of almost 10 time has been achieved; as drawback reading speed has been decreased."
]
},
{
2 changes: 1 addition & 1 deletion docs/setup/users-getting-started.md
@@ -50,7 +50,7 @@ In addition to the MinIO console, the API end point is `https://pangeo-eosc-mini

## Clone the github repository

-To get a local copy of the `geo-open-hack-2024` repository, you can clone it on your local computer and/or server:
+To get a local copy of the `geo-open-hack-2024` repository, you can clone it on your local computer and/or server. Open a terminal in the JupyterHub and clone the repository with the following command:

```
git clone https://github.com/pangeo-data/geo-open-hack-2024.git
