You can use DeepLabCut in the cloud without any installation (see Demo using Google Colaboratory below).
On your computer, we recommend using Anaconda to install Python and Jupyter Notebooks; see the Installation page. Then, on your local machine, using these notebooks to guide you, you can (1) demo our labeled data (or create your own), (2) create a project, extract frames to label, use the GUI to label them, and create a training set.
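If you want to quickly confirm that a local install is working before opening the notebooks, a minimal check (run from a Jupyter cell or Python prompt inside your DeepLabCut Anaconda environment; this is just a suggested sanity check, not part of the demos) is:

```python
# Verify that DeepLabCut imports and print the installed version.
import deeplabcut
print(deeplabcut.__version__)
```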
We suggest making a "Fork" of this repo and/or then place DeepLabCut files in a folder :
git clone https://github.com/AlexEMG/DeepLabCut
so you can access it locally with Anaconda. You can also click the "download" button, rather than using git
. Then you can edit the Notebooks as you like!
Demo 1: Run DeepLabCut on our reaching data or on our open-field data
- This will give you a feel for the DeepLabCut workflow. Follow the instructions inside the notebook!
Note: the notebooks with labeled data (reaching data, open-field data) can be run on a CPU, a GPU, etc. The one with the trail-tracking data even achieves good results when trained for only half an hour on a GPU!
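For orientation, here is a hedged sketch of the core calls these labeled-data demo notebooks walk you through; the path below is illustrative (the notebooks set up the shipped demo projects for you), and exact arguments can differ between versions:

```python
import deeplabcut

# Illustrative path to the shipped open-field demo project (adjust to your clone).
config_path = "/path/to/DeepLabCut/examples/openfield-Pranav-2018-10-30/config.yaml"

deeplabcut.create_training_dataset(config_path)           # build a training set from the provided labels
deeplabcut.train_network(config_path, shuffle=1)          # runs on a CPU, but a GPU is much faster
deeplabcut.evaluate_network(config_path, plotting=True)   # report train/test errors
```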
Demo 2: Run DeepLabCut on a GPU in Docker (Linux only)
- This requires the DeepLabCut Docker!
Demo 3: Set up DeepLabCut on your own data
- Now that you're a master of the demos, this Notebook walks you through how to build your own pipeline:
- Create a new project
- Label new data
- Then, either use your CPU or your GPU (the Notebook will guide you at this junction) to train the network, analyze new videos, and perform some basic analysis of your data (a minimal sketch of these calls follows this list).
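As a rough guide to what this Notebook covers, here is a minimal sketch of the pipeline in Python; all paths, project and experimenter names are placeholders, and exact arguments may differ slightly between DeepLabCut versions:

```python
import deeplabcut

# 1. Create a new project from one or more of your videos.
config_path = deeplabcut.create_new_project(
    "MyProject", "MyName", ["/path/to/video1.avi"], copy_videos=True
)

# 2. Extract frames and label them (label_frames opens the labeling GUI).
deeplabcut.extract_frames(config_path)
deeplabcut.label_frames(config_path)
deeplabcut.check_labels(config_path)

# 3. Create the training set and train the network (GPU strongly recommended).
deeplabcut.create_training_dataset(config_path)
deeplabcut.train_network(config_path)

# 4. Evaluate, analyze new videos, and create labeled videos.
deeplabcut.evaluate_network(config_path)
deeplabcut.analyze_videos(config_path, ["/path/to/newvideo.avi"])
deeplabcut.create_labeled_video(config_path, ["/path/to/newvideo.avi"])
```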
For GPU-based training and analysis you will need to either switch to our supplied Docker container (and modify the Docker Demo Notebook for your project), install TensorFlow with GPU support in an Anaconda environment, or use Google Colab (more on this below).
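Whichever route you pick, it is worth checking that TensorFlow can actually see the GPU before starting a long training run; a small, version-tolerant check (not part of DeepLabCut itself) is:

```python
import tensorflow as tf

try:
    # TensorFlow 2.x style
    print("GPUs visible to TensorFlow:", tf.config.list_physical_devices("GPU"))
except AttributeError:
    # TensorFlow 1.x style (DeepLabCut 2.0.x targets TF 1.x)
    print("GPU available:", tf.test.is_gpu_available())
```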
We suggest making a "Fork" of this repo, git clone or download the folder into your google drive, then linking your google account to your GitHub (you'll see how to do this in the Notebook below). Then you can edit the Notebook for your own data too (just put https://colab.research.google.com/ in front of the web address of your own repo)
- You can use Google Colaboratory to demo running DeepLabCut on our data. Here is an example Colab-ready Jupyter Notebook for the open-field data, which you can launch by clicking the Open in Colab badge below:
- Using Colab on your data for the training and analysis of new videos, i.e. the parts that need a GPU!
- Click Open in Colab to launch the notebook.
- Make the notebook live by clicking 'Connect' in the Colab toolbar, then go to "Runtime > Change runtime type" and select Python 3 and GPU as the hardware accelerator. Follow the instructions in the Notebook.
- Be aware, Colab often won't let you run on their GPUs for very long, so make sure your save_iters variable is low for this setting (see the sketch below for one way to set this when training).
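One hedged way to respect those session limits is to pass iteration-related arguments directly to train_network (in recent 2.x versions these override the values in pose_cfg.yaml); the path and numbers below are only illustrative:

```python
import deeplabcut

config_path = "/path/to/your/project/config.yaml"  # placeholder path

deeplabcut.train_network(
    config_path,
    shuffle=1,
    displayiters=100,   # print the loss every 100 iterations
    saveiters=1000,     # write a snapshot every 1000 iterations (keep this low on Colab)
    maxiters=50000,     # stop well before the Colab session is likely to end
)
```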
Here is a demo of us using the Colab Notebooks: https://www.youtube.com/watch?v=qJGs8nxx80A & https://www.youtube.com/watch?v=j13aXxysI2E
Warning: Colab likely updates their CUDA/TensorFlow stack faster than we can keep up with, so this may not work at all future points in time (and, as a reminder, this whole package is released with a LICENSE that implies no Liability and no Warranty).
Ready to take your pose estimation to a new dimension? As of 2.0.7 we support 3D within the package. Please check out the dedicated 3D_Demo_DeepLabCut.ipynb above for more details!
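For orientation only, here is a hedged sketch of the 3D workflow introduced in 2.0.7 (arguments and paths are illustrative placeholders; the 3D_Demo_DeepLabCut.ipynb notebook is the authoritative walkthrough):

```python
import deeplabcut

# Create a 3D project that pairs two (or more) camera views.
config_path3d = deeplabcut.create_new_project_3d("MyProject3D", "MyName", num_cameras=2)

deeplabcut.calibrate_cameras(config_path3d)      # calibrate from checkerboard images of each camera
deeplabcut.check_undistortion(config_path3d)     # sanity-check the calibration
deeplabcut.triangulate(config_path3d, "/path/to/paired/videos/")  # triangulate analyzed camera pairs
deeplabcut.create_labeled_video_3d(config_path3d, ["/path/to/triangulated/output/"])  # 3D visualization
```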
All of DeepLabCut can be run from an IPython console in the terminal! Go here for detailed instructions!
We also have some video tutorials to demonstrate how we use Anaconda and Docker via the terminal:
https://www.youtube.com/watch?v=7xwOhUcIGio & https://www.youtube.com/watch?v=bgfnz1wtlpo