Improve brainglobe-segmentation 1D segmentation documentation (#134)
* some doc for seg output files

* add doc for 1d seg files

* some instruction for probe tracking rendering

* python code example

* Add Jingjie Li to contributors

* Update docs/source/tutorials/silicon-probe-tracking.md

Co-authored-by: Adam Tyson <[email protected]>

* Update docs/source/documentation/brainglobe-segmentation/user-guide/output-files.md

Co-authored-by: Adam Tyson <[email protected]>

* Update docs/source/documentation/brainglobe-segmentation/user-guide/output-files.md

Co-authored-by: Adam Tyson <[email protected]>

* Update docs/source/documentation/brainglobe-segmentation/user-guide/output-files.md

Co-authored-by: Adam Tyson <[email protected]>

* Update docs/source/tutorials/silicon-probe-tracking.md

Co-authored-by: Adam Tyson <[email protected]>

* Update docs/source/tutorials/silicon-probe-tracking.md

Co-authored-by: Adam Tyson <[email protected]>

* Update docs/source/tutorials/silicon-probe-tracking.md

Co-authored-by: Adam Tyson <[email protected]>

* moved the file format to 1d tutorial

* link removed

* probe track example linked to brainrender repo, and having an example pics

* Remove broken links

* Update image title

* update links in 1d segmentation tutorial

* Slight reword of probe tracking tutorial

---------

Co-authored-by: Adam Tyson <[email protected]>
jingjie-li and adamltyson authored Jan 11, 2024
1 parent 8cc8caf commit 01306be
Showing 5 changed files with 33 additions and 6 deletions.
2 changes: 1 addition & 1 deletion README.md
@@ -1,3 +1,3 @@
# Brainglobe main website
To contribute, please see the
[documentation](https://brainglobe.info/developers/index.html#to-improve-the-documentation).
[documentation](https://brainglobe.info/community/developers/index.html#to-improve-the-documentation).
4 changes: 4 additions & 0 deletions docs/source/people.md
@@ -418,6 +418,10 @@ In no particular order:
:link: https://github.com/carlocastoldi
:::

:::{grid-item-card} Jingjie Li
:img-bottom: https://avatars.githubusercontent.com/u/16413662?v=4
:link: https://github.com/jingjie-li
:::
::::

Inspired by [All Contributors](https://allcontributors.org/). All information is sourced from GitHub. If any changes
Binary file not shown.
11 changes: 9 additions & 2 deletions docs/source/tutorials/segmenting-1d-tracks.md
@@ -56,10 +56,17 @@ Make sure you select the points in the order you wish them to be joined.
**A new Points layer containing the fitted points, named `track_0_fit`, appears in the layer list on the left-hand side of the napari window, and a `.csv` file will be saved, listing the brain region for every spline point along the track, along with the distance from the start of the track.**

:::{note}
All data will be saved into your brainreg output directory
All data will be saved into the `segmentation/atlas_space/tracks` subfolder of your brainreg output directory if you loaded the data from atlas space; otherwise, it will be in the `sample_space` subfolder.
:::

14. (Optional) Use the `Save` button to save your points to be reloaded at a later date.
14. (Optional) Use the `Save` button to save your points as a `.points` file, to be reloaded at a later date. Use the `To Brainrender` button to save the fitted spline as a `.npy` file for [brainrender](/documentation/brainrender/index) visualisation.

:::{note}
Three files will be saved for each 1D track:
+ `TRACK_NAME.csv` - a `.csv` file summarising the depth, atlas region name, and atlas region ID (based on your chosen atlas) for each point of the fitted spline.
+ `TRACK_NAME.npy` - a numpy array containing the coordinates of each point of the fitted spline. This array can be [visualised in 3D with brainrender](https://github.com/brainglobe/brainrender/blob/main/examples/probe_tracks.py).
+ `TRACK_NAME.points` - a pandas DataFrame saved as [HDF5](https://pandas.pydata.org/docs/reference/api/pandas.DataFrame.to_hdf.html), containing the coordinates of each point used to create the track (e.g., from manual annotation).
:::
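
As a rough sketch, these files can be read back in Python with `numpy` and `pandas` (the track name `track_0` below is a placeholder, and reading the `.points` file requires the optional `pytables` dependency):

```python
import numpy as np
import pandas as pd

# Brain region summary for each point of the fitted spline
regions = pd.read_csv("track_0.csv")

# Coordinates of the fitted spline, e.g. for brainrender visualisation
spline = np.load("track_0.npy")

# Coordinates of the points used to create the track (HDF5, needs pytables)
points = pd.read_hdf("track_0.points")
```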

:::{hint}
For more information about how to use automated methods to segment your feature of interest, please see [Analysing segmentation from other napari plugins](../documentation/brainglobe-segmentation/user-guide/analysing-external-segmentation).
22 changes: 19 additions & 3 deletions docs/source/tutorials/silicon-probe-tracking.md
@@ -83,9 +83,25 @@ Make sure your conda environment is still activated!
To open the graphical user interface, open napari and then load the `brainglobe-segmentation` plugin (see
[User guide](/documentation/brainglobe-segmentation/user-guide/index)).

The `brainglobe-segmentation`graphical user interface opens and shows a set of tools.You can then load your brainreg output
directory, and follow the main brainglobe-segmentation instructions [here](./segmenting-1d-tracks) for
The `brainglobe-segmentation` graphical user interface opens and shows a set of tools. You can then load your brainreg output
directory, and follow the main `brainglobe-segmentation` instructions [here](./segmenting-1d-tracks) for
segmenting a 1D track. The `Spline points` setting determines how many positions along the length of the track
the brain region is sampled at. This can be used to determine the brain region for each recording site on your probe.

**Adapted from instructions by** [**Mateo Vélez-Fort**](https://www.sainsburywellcome.org/web/people/mateo-velez-fort)
After the spline fit is performed, a `.csv` file will be saved in `brainreg_output/segmentation/atlas_space/tracks` for each track.
You can then find the brain area for each recording channel by matching the `distance` values in the `.csv` file
to the position of each channel in the geometry of your recording probe, as sketched below.
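
For example, a minimal sketch of assigning a brain region to each recording channel (the file name, probe geometry, and column names below are illustrative; check the header of your own `.csv` file):

```python
import numpy as np
import pandas as pd

# Spline-fit summary saved by brainglobe-segmentation (file name is illustrative)
track = pd.read_csv("track_0.csv")

# Distance of each recording channel along the probe, in the same units and
# from the same reference point as the csv (here: a hypothetical 384-channel
# probe with 20 um channel spacing)
channel_depths = np.arange(384) * 20

# Match each channel to the nearest spline point and read off its brain region
distances = track["distance"].to_numpy()  # column name is illustrative
regions = track["region"].to_numpy()      # column name is illustrative
channel_regions = [regions[np.abs(distances - d).argmin()] for d in channel_depths]
```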

## Visualise the probe track with brainrender
If you then click the `To brainrender` button, a `.npy` file will be saved for each track.
With `brainrender`, you can load the `.npy` file using the [`Scene` class](/documentation/brainrender/usage/scene).
This will provide a 3D interactive display of the probe tracks:

![brainrender visualisation](./images/probe_tracks_brainrender.png)
**Visualisation of two probe tracks using brainrender**

:::{tip}
The code to run this example can be found at [probe_tracks.py](https://github.com/brainglobe/brainrender/blob/main/examples/probe_tracks.py).
:::
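
A minimal sketch along the lines of that example (the file name, colours, and point radius are assumptions, and your coordinates may need rescaling depending on the atlas; see the linked example for the full details):

```python
import numpy as np
from brainrender import Scene
from brainrender.actors import Points

# Create a scene in the default atlas
scene = Scene(title="Probe tracks")

# Load the spline coordinates saved by the `To brainrender` button
track = np.load("track_0.npy")

# Add the track as a series of points and render interactively
scene.add(Points(track, name="probe", colors="darkred", radius=50))
scene.render()
```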

**Tutorial adapted from instructions by** [**Mateo Vélez-Fort**](https://www.sainsburywellcome.org/web/people/mateo-velez-fort) and [**Jingjie Li**](https://www.sainsburywellcome.org/web/people/jingjie-li)
