diff --git a/demos/README.ipynb b/demos/README.ipynb index 6711c930..a0cd480f 100644 --- a/demos/README.ipynb +++ b/demos/README.ipynb @@ -38,17 +38,15 @@ "\n", "## Image Classification\n", "\n", - "The [**image-classification**](image-classification/01-image-classification.ipynb) demo demonstrates image recognition: the application builds and trains an ML model that identifies (recognizes) and classifies images.\n", + "The [**image-classification**](image-classification/01-image-classification.ipynb) demo demonstrates an end-to-end solution for image recognition: the application uses TensorFlow, Keras, Horovod, and Nuclio to build and train an ML model that identifies (recognizes) and classifies images. \n", + "The application consists of four MLRun and Nuclio functions for performing the following operations:\n", "\n", - "This example is using TensorFlow, Horovod, and Nuclio demonstrating end to end solution for image classification, \n", - "it consists of 4 MLRun and Nuclio functions:\n", + "1. Import an image archive from from an Amazon Simple Storage (S3) bucket to the platform's data store.\n", + "2. Tag the images based on their name structure.\n", + "3. Train the image-classification ML model by using [TensorFlow](https://www.tensorflow.org/) and [Keras](https://keras.io/); use [Horovod](https://eng.uber.com/horovod/) to perform distributed training over either GPUs or CPUs.\n", + "4. Automatically deploy a Nuclio model-serving function from [Jupyter Notebook](nuclio-serving-tf-images.ipynb) or from a [Dockerfile](./inference-docker).\n", "\n", - "1. import an image archive from S3 to the cluster file system\n", - "2. Tag the images based on their name structure \n", - "3. Distrubuted training using TF, Keras and Horovod\n", - "4. Automated deployment of Nuclio model serving function (form [Notebook](nuclio-serving-tf-images.ipynb) and from [Dockerfile](./inference-docker))\n", - "\n", - "The Example also demonstrate an [automated pipeline](mlrun_mpijob_pipe.ipynb) using MLRun and KubeFlow pipelines " + "This demo also provides an example of an [automated pipeline](image-classification/02-create_pipeline.ipynb) using [MLRun](https://github.com/mlrun/mlrun) and [Kubeflow pipelines](https://github.com/kubeflow/pipelines)." ] }, { diff --git a/demos/README.md b/demos/README.md index 2184ef7a..8298a68f 100644 --- a/demos/README.md +++ b/demos/README.md @@ -17,17 +17,15 @@ The **demos** tutorials directory contains full end-to-end use-case applications ## Image Classification -The [**image-classification**](image-classification/01-image-classification.ipynb) demo demonstrates image recognition: the application builds and trains an ML model that identifies (recognizes) and classifies images. +The [**image-classification**](image-classification/01-image-classification.ipynb) demo demonstrates an end-to-end solution for image recognition: the application uses TensorFlow, Keras, Horovod, and Nuclio to build and train an ML model that identifies (recognizes) and classifies images. +The application consists of four MLRun and Nuclio functions for performing the following operations: -This example is using TensorFlow, Horovod, and Nuclio demonstrating end to end solution for image classification, -it consists of 4 MLRun and Nuclio functions: +1. Import an image archive from from an Amazon Simple Storage (S3) bucket to the platform's data store. +2. Tag the images based on their name structure. +3. 
Train the image-classification ML model by using [TensorFlow](https://www.tensorflow.org/) and [Keras](https://keras.io/); use [Horovod](https://eng.uber.com/horovod/) to perform distributed training over either GPUs or CPUs. +4. Automatically deploy a Nuclio model-serving function from [Jupyter Notebook](nuclio-serving-tf-images.ipynb) or from a [Dockerfile](./inference-docker). -1. import an image archive from S3 to the cluster file system -2. Tag the images based on their name structure -3. Distrubuted training using TF, Keras and Horovod -4. Automated deployment of Nuclio model serving function (form [Notebook](nuclio-serving-tf-images.ipynb) and from [Dockerfile](./inference-docker)) - -The Example also demonstrate an [automated pipeline](mlrun_mpijob_pipe.ipynb) using MLRun and KubeFlow pipelines +This demo also provides an example of an [automated pipeline](image-classification/02-create_pipeline.ipynb) using [MLRun](https://github.com/mlrun/mlrun) and [Kubeflow pipelines](https://github.com/kubeflow/pipelines). ## Predictive Infrastructure Monitoring diff --git a/demos/gpu/README.ipynb b/demos/gpu/README.ipynb index 36596ff6..ca362b65 100644 --- a/demos/gpu/README.ipynb +++ b/demos/gpu/README.ipynb @@ -25,14 +25,16 @@ "- A **horovod** directory with applications that use Uber's [Horovod](https://eng.uber.com/horovod/) distributed deep-learning framework, which can be used to convert a single-GPU TensorFlow, Keras, or PyTorch model-training program to a distributed program that trains the model simultaneously over multiple GPUs.\n", " The objective is to speed up your model training with minimal changes to your existing single-GPU code and without complicating the execution.\n", " Horovod code can also run over CPUs with only minor modifications.\n", - " The Horovod tutorials include the following:\n", - " - Benchmark tests (**benchmark-tf.ipynb**, which executes **tf_cnn_benchmarks.py**).\n", - " - Note that under the demo folder you will find an image classificaiton demo that is also running with Horovod and can be set to run with GPU
\n", + " For more information and examples, see the [Horovod GitHub repository](https://github.com/horovod/horovod).\n", + " \n", + " The Horovod GPU tutorials include benchmark tests (**benchmark-tf.ipynb**, which executes **tf_cnn_benchmarks.py**).
\n", + " In addition, the image-classification demo ([**demos/image-classification/**](../image-classification/01-image-classification.ipynb)) demonstrates how to use Horovod for image recognition, and can be configured to run over GPUs.\n", "\n", "- A **rapids** directory with applications that use NVIDIA's [RAPIDS](https://rapids.ai/) open-source libraries suite for executing end-to-end data science and analytics pipelines entirely on GPUs.\n", + "\n", " The RAPIDS tutorials include the following:\n", "\n", - " - Demo applications that use the [cuDF](https://rapidsai.github.io/projects/cudf/en/latest/index.html) RAPIDS GPU DataFrame library to perform batching and aggregation of data that's read from a Kafaka stream, and then write the results to a Parquet file.
\n", + " - Demo applications that use the [cuDF](https://rapidsai.github.io/projects/cudf/en/latest/index.html) RAPIDS GPU DataFrame library to perform batching and aggregation of data that's read from a Kafka stream, and then write the results to a Parquet file.
\n", " The **nuclio-cudf-agg.ipynb** demo implements this by using a Nuclio serverless function while the **python-agg.ipynb** demo implements this by using a standalone Python function.\n", " - Benchmark tests that compare the performance of RAPIDS cuDF to pandas DataFrames (**benchmark-cudf-vs-pd.ipynb**)." ] diff --git a/demos/gpu/README.md b/demos/gpu/README.md index a6fc7725..3432aea1 100644 --- a/demos/gpu/README.md +++ b/demos/gpu/README.md @@ -16,17 +16,14 @@ The **demos/gpu** directory includes the following: Horovod code can also run over CPUs with only minor modifications. For more information and examples, see the [Horovod GitHub repository](https://github.com/horovod/horovod). - The Horovod tutorials include the following: - - - An image-recognition demo application for execution over GPUs (**image-classification**). - - A slightly modified version of the GPU image-classification demo application for execution over CPUs (**cpu/image-classification**). - - Benchmark tests (**benchmark-tf.ipynb**, which executes **tf_cnn_benchmarks.py**). + The Horovod GPU tutorials include benchmark tests (**benchmark-tf.ipynb**, which executes **tf_cnn_benchmarks.py**).
+ In addition, the image-classification demo ([**demos/image-classification/**](../image-classification/01-image-classification.ipynb)) demonstrates how to use Horovod for image recognition, and can be configured to run over GPUs. - A **rapids** directory with applications that use NVIDIA's [RAPIDS](https://rapids.ai/) open-source libraries suite for executing end-to-end data science and analytics pipelines entirely on GPUs. The RAPIDS tutorials include the following: - - Demo applications that use the [cuDF](https://rapidsai.github.io/projects/cudf/en/latest/index.html) RAPIDS GPU DataFrame library to perform batching and aggregation of data that's read from a Kafaka stream, and then write the results to a Parquet file.
+ - Demo applications that use the [cuDF](https://rapidsai.github.io/projects/cudf/en/latest/index.html) RAPIDS GPU DataFrame library to perform batching and aggregation of data that's read from a Kafka stream, and then write the results to a Parquet file.
The **nuclio-cudf-agg.ipynb** demo implements this by using a Nuclio serverless function while the **python-agg.ipynb** demo implements this by using a standalone Python function. - Benchmark tests that compare the performance of RAPIDS cuDF to pandas DataFrames (**benchmark-cudf-vs-pd.ipynb**). diff --git a/demos/image-classification/README.ipynb b/demos/image-classification/README.ipynb new file mode 100644 index 00000000..d809b468 --- /dev/null +++ b/demos/image-classification/README.ipynb @@ -0,0 +1,71 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Image Classification Using Distributed Training\n", + "\n", + "- [Overview](#image-classif-demo-overview)\n", + "- [Notebooks and Code](#image-classif-demo-nbs-n-code)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "\n", + "## Overview\n", + "\n", + "This demo demonstrates an end-to-end solution for image recognition: the application uses TensorFlow, Keras, Horovod, and Nuclio to build and train an ML model that identifies (recognizes) and classifies images. \n", + "The application consists of four MLRun and Nuclio functions for performing the following operations:\n", + "\n", + "1. Import an image archive from from an Amazon Simple Storage (S3) bucket to the platform's data store.\n", + "2. Tag the images based on their name structure.\n", + "3. Train the image-classification ML model by using [TensorFlow](https://www.tensorflow.org/) and [Keras](https://keras.io/); use [Horovod](https://eng.uber.com/horovod/) to perform distributed training over either GPUs or CPUs.\n", + "4. Automatically deploy a Nuclio model-serving function from [Jupyter Notebook](nuclio-serving-tf-images.ipynb) or from a [Dockerfile](./inference-docker).\n", + "\n", + "


\n", + "\n", + "This demo also provides an example of an [automated pipeline](image-classification/02-create_pipeline.ipynb) using [MLRun](https://github.com/mlrun/mlrun) and [Kubeflow pipelines](https://github.com/kubeflow/pipelines)." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "\n", + "## Notebooks and Code\n", + "\n", + "- [**01-image-classification.ipynb**](01-image-classification.ipynb) — all-in-one: import, tag, launch train, deploy, and serve\n", + "- [**horovod-training.py**](horovod-training.py) — train function code\n", + "- [**nuclio-serving-tf-images.ipynb**](nuclio-serving-tf-images.ipynb) — serve function development and test\n", + "- [**02-create_pipeline.ipynb**](02-create_pipeline.ipynb) — auto-generate a Kubeflow pipeline workflow\n", + "- **inference-docker/** — build and serve functions using a Dockerfile:\n", + " - [**main.py**](./inference-docker/main.py) — function code\n", + " - [**Dockerfile**](./inference-docker/Dockerfile) — a Dockerfile" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.6.8" + } + }, + "nbformat": 4, + "nbformat_minor": 4 +} diff --git a/demos/image-classification/README.md b/demos/image-classification/README.md index c6a573fc..98d67746 100644 --- a/demos/image-classification/README.md +++ b/demos/image-classification/README.md @@ -1,24 +1,30 @@ # Image Classification Using Distributed Training -This example is using TensorFlow, Horovod, and Nuclio demonstrating end to end solution for image classification, -it consists of 4 MLRun and Nuclio functions: +- [Overview](#image-classif-demo-overview) +- [Notebooks and Code](#image-classif-demo-nbs-n-code) -1. import an image archive from S3 to the cluster file system -2. Tag the images based on their name structure -3. Distrubuted training using TF, Keras and Horovod -4. Automated deployment of Nuclio model serving function (form [Notebook](nuclio-serving-tf-images.ipynb) and from [Dockerfile](./inference-docker)) + +## Overview -


+This demo demonstrates an end-to-end solution for image recognition: the application uses TensorFlow, Keras, Horovod, and Nuclio to build and train an ML model that identifies (recognizes) and classifies images. 
+The application consists of four MLRun and Nuclio functions for performing the following operations:
+
+1. Import an image archive from an Amazon Simple Storage Service (S3) bucket to the platform's data store.
+2. Tag the images based on their name structure.
+3. Train the image-classification ML model by using [TensorFlow](https://www.tensorflow.org/) and [Keras](https://keras.io/); use [Horovod](https://eng.uber.com/horovod/) to perform distributed training over either GPUs or CPUs.
+4. Automatically deploy a Nuclio model-serving function from [Jupyter Notebook](nuclio-serving-tf-images.ipynb) or from a [Dockerfile](./inference-docker).
 
-The Example also demonstrate an [automated pipeline](mlrun_mpijob_pipe.ipynb) using MLRun and KubeFlow pipelines
+


-## Notebooks & Code +This demo also provides an example of an [automated pipeline](image-classification/02-create_pipeline.ipynb) using [MLRun](https://github.com/mlrun/mlrun) and [Kubeflow pipelines](https://github.com/kubeflow/pipelines). -* [All-in-one: Import, tag, launch training, deploy serving](01-image-classification.ipynb) -* [Training function code](horovod-training.py) -* [Serving function development and testing](nuclio-serving-tf-images.ipynb) -* [Auto generation of KubeFlow pipelines workflow](02-create_pipeline.ipynb) -* [Building serving function using Dockerfile](./inference-docker) - * [function code](./inference-docker/main.py) - * [Dockerfile](./inference-docker/Dockerfile) + +## Notebooks and Code +- [**01-image-classification.ipynb**](01-image-classification.ipynb) — all-in-one: import, tag, launch train, deploy, and serve +- [**horovod-training.py**](horovod-training.py) — train function code +- [**nuclio-serving-tf-images.ipynb**](nuclio-serving-tf-images.ipynb) — serve function development and test +- [**02-create_pipeline.ipynb**](02-create_pipeline.ipynb) — auto-generate a Kubeflow pipeline workflow +- **inference-docker/** — build and serve functions using a Dockerfile: + - [**main.py**](./inference-docker/main.py) — function code + - [**Dockerfile**](./inference-docker/Dockerfile) — a Dockerfile diff --git a/getting-started/frames.ipynb b/getting-started/frames.ipynb index 59115e0a..f66bdd5d 100644 --- a/getting-started/frames.ipynb +++ b/getting-started/frames.ipynb @@ -28,13 +28,13 @@ "The `Client` class features the following object methods for supporting basic data operations; the type of data is derived from the backend type (`tsdb` — TSDB table / `kv` — NoSQL table / `stream` — data stream):\n", "\n", "- `create` — creates a new TSDB table or stream (\"backend data\").\n", - "- `delete` — deletes a table or stream or specific NoSQL (\"KV\") table items.\n", + "- `delete` — deletes a table or stream.\n", "- `read` — reads data from a table or stream into pandas DataFrames.\n", "- `write` — writes data from pandas DataFrames to a table or stream.\n", "- `execute` — executes a command on a table or stream.\n", " Each backend may support multiple commands.\n", "\n", - "For a detailed description of the Frames API, see the [Frames documentation](https://github.com/v3io/frames/blob/development/README.md).
\n", + "For a detailed description of the Frames API, see the [Frames API reference](https://www.iguazio.com/docs/reference/latest-release/api-reference/frames/).
\n", "For more help and usage details, use the internal API help — `.?` in Jupyter Notebook or `print(..__doc__)`.
\n", "For example, the following command returns information about the read operation for a client object named `client`:\n", "```\n", @@ -111,7 +111,7 @@ "outputs": [], "source": [ "# Relative path to the NoSQL table within the parent platform data container\n", - "table = os.path.join(os.getenv(\"V3IO_USERNAME\") + \"/examples/bank\")\n", + "table = os.path.join(os.getenv(\"V3IO_USERNAME\"), \"examples/bank\")\n", "\n", "# Full path to the NoSQL table for SQL queries (platform Presto data-path syntax);\n", "# use the same data container as used for the Frames client (\"users\")\n", @@ -324,7 +324,7 @@ "metadata": {}, "outputs": [], "source": [ - "out = client.write(\"kv\", table=table, dfs=df)" + "client.write(\"kv\", table=table, dfs=df)" ] }, { @@ -380,175 +380,175 @@ " \n", " \n", " no\n", - " primary\n", + " tertiary\n", " 0\n", - " no\n", + " yes\n", " unknown\n", - " 323\n", - " single\n", + " 397\n", + " married\n", " no\n", - " 11262\n", - " aug\n", + " 14220\n", + " sep\n", " cellular\n", " 1\n", " yes\n", - " 368\n", - " technician\n", - " 26\n", - " 60\n", + " 2962\n", + " retired\n", + " 9\n", + " 71\n", " -1\n", " \n", " \n", " no\n", - " secondary\n", + " tertiary\n", " 0\n", - " no\n", + " yes\n", " unknown\n", - " 14\n", - " married\n", + " 95\n", + " single\n", " no\n", - " 17555\n", + " 11797\n", " aug\n", " cellular\n", - " 14\n", + " 2\n", " no\n", - " 1776\n", + " 3177\n", " management\n", - " 26\n", - " 43\n", + " 11\n", + " 32\n", " -1\n", " \n", " \n", - " no\n", - " primary\n", - " 4\n", " yes\n", - " success\n", - " 146\n", - " married\n", + " tertiary\n", + " 0\n", + " yes\n", + " unknown\n", + " 197\n", + " divorced\n", " no\n", - " 12519\n", - " apr\n", + " 13204\n", + " nov\n", " cellular\n", " 2\n", " no\n", - " 602\n", - " blue-collar\n", - " 17\n", - " 50\n", - " 147\n", + " 3329\n", + " management\n", + " 20\n", + " 34\n", + " -1\n", " \n", " \n", " no\n", " secondary\n", " 0\n", - " yes\n", + " no\n", " unknown\n", - " 60\n", + " 223\n", " married\n", " no\n", - " 14440\n", - " nov\n", + " 16873\n", + " oct\n", " cellular\n", " 1\n", " no\n", - " 3910\n", + " 64\n", " admin.\n", - " 21\n", - " 49\n", + " 7\n", + " 56\n", " -1\n", " \n", " \n", " no\n", - " tertiary\n", + " secondary\n", " 0\n", " no\n", " unknown\n", - " 420\n", + " 113\n", " married\n", " no\n", - " 15520\n", - " nov\n", - " cellular\n", + " 11084\n", + " jun\n", + " unknown\n", " 1\n", " no\n", - " 1778\n", - " management\n", - " 18\n", - " 56\n", + " 670\n", + " blue-collar\n", + " 11\n", + " 40\n", " -1\n", " \n", " \n", - " no\n", - " secondary\n", + " yes\n", + " tertiary\n", " 0\n", - " no\n", + " yes\n", " unknown\n", - " 29\n", - " married\n", + " 117\n", + " single\n", " no\n", - " 12186\n", - " jun\n", - " unknown\n", - " 3\n", + " 16874\n", + " may\n", + " cellular\n", + " 2\n", " no\n", - " 272\n", - " management\n", - " 20\n", - " 46\n", + " 3485\n", + " entrepreneur\n", + " 15\n", + " 25\n", " -1\n", " \n", " \n", " no\n", " secondary\n", " 0\n", - " no\n", + " yes\n", " unknown\n", - " 272\n", - " single\n", + " 66\n", + " married\n", " no\n", - " 10177\n", + " 10910\n", " may\n", " cellular\n", - " 4\n", + " 2\n", " no\n", - " 1211\n", - " admin.\n", - " 5\n", - " 66\n", + " 4394\n", + " blue-collar\n", + " 15\n", + " 43\n", " -1\n", " \n", " \n", " no\n", " tertiary\n", - " 1\n", - " no\n", - " failure\n", - " 172\n", - " married\n", + " 3\n", + " yes\n", + " success\n", + " 638\n", + " single\n", " no\n", - " 15834\n", - " apr\n", + " 13711\n", + " may\n", " 
cellular\n", - " 3\n", + " 1\n", " no\n", - " 1805\n", - " retired\n", - " 5\n", - " 70\n", - " 186\n", + " 1779\n", + " technician\n", + " 14\n", + " 32\n", + " 175\n", " \n", "" ], "text/plain": [ - "[('no', 'primary', 0, 'no', 'unknown', 323, 'single', 'no', 11262, 'aug', 'cellular', 1, 'yes', 368, 'technician', 26, 60, -1),\n", - " ('no', 'secondary', 0, 'no', 'unknown', 14, 'married', 'no', 17555, 'aug', 'cellular', 14, 'no', 1776, 'management', 26, 43, -1),\n", - " ('no', 'primary', 4, 'yes', 'success', 146, 'married', 'no', 12519, 'apr', 'cellular', 2, 'no', 602, 'blue-collar', 17, 50, 147),\n", - " ('no', 'secondary', 0, 'yes', 'unknown', 60, 'married', 'no', 14440, 'nov', 'cellular', 1, 'no', 3910, 'admin.', 21, 49, -1),\n", - " ('no', 'tertiary', 0, 'no', 'unknown', 420, 'married', 'no', 15520, 'nov', 'cellular', 1, 'no', 1778, 'management', 18, 56, -1),\n", - " ('no', 'secondary', 0, 'no', 'unknown', 29, 'married', 'no', 12186, 'jun', 'unknown', 3, 'no', 272, 'management', 20, 46, -1),\n", - " ('no', 'secondary', 0, 'no', 'unknown', 272, 'single', 'no', 10177, 'may', 'cellular', 4, 'no', 1211, 'admin.', 5, 66, -1),\n", - " ('no', 'tertiary', 1, 'no', 'failure', 172, 'married', 'no', 15834, 'apr', 'cellular', 3, 'no', 1805, 'retired', 5, 70, 186)]" + "[('no', 'tertiary', 0, 'yes', 'unknown', 397, 'married', 'no', 14220, 'sep', 'cellular', 1, 'yes', 2962, 'retired', 9, 71, -1),\n", + " ('no', 'tertiary', 0, 'yes', 'unknown', 95, 'single', 'no', 11797, 'aug', 'cellular', 2, 'no', 3177, 'management', 11, 32, -1),\n", + " ('yes', 'tertiary', 0, 'yes', 'unknown', 197, 'divorced', 'no', 13204, 'nov', 'cellular', 2, 'no', 3329, 'management', 20, 34, -1),\n", + " ('no', 'secondary', 0, 'no', 'unknown', 223, 'married', 'no', 16873, 'oct', 'cellular', 1, 'no', 64, 'admin.', 7, 56, -1),\n", + " ('no', 'secondary', 0, 'no', 'unknown', 113, 'married', 'no', 11084, 'jun', 'unknown', 1, 'no', 670, 'blue-collar', 11, 40, -1),\n", + " ('yes', 'tertiary', 0, 'yes', 'unknown', 117, 'single', 'no', 16874, 'may', 'cellular', 2, 'no', 3485, 'entrepreneur', 15, 25, -1),\n", + " ('no', 'secondary', 0, 'yes', 'unknown', 66, 'married', 'no', 10910, 'may', 'cellular', 2, 'no', 4394, 'blue-collar', 15, 43, -1),\n", + " ('no', 'tertiary', 3, 'yes', 'success', 638, 'single', 'no', 13711, 'may', 'cellular', 1, 'no', 1779, 'technician', 14, 32, 175)]" ] }, "execution_count": 5, @@ -651,6 +651,26 @@ " \n", " \n", " \n", + " 1821\n", + " 51\n", + " 21244\n", + " 2\n", + " cellular\n", + " 4\n", + " no\n", + " 166\n", + " unknown\n", + " no\n", + " housemaid\n", + " yes\n", + " married\n", + " aug\n", + " -1\n", + " unknown\n", + " 0\n", + " no\n", + " \n", + " \n", " 2624\n", " 53\n", " 22370\n", @@ -691,26 +711,6 @@ " no\n", " \n", " \n", - " 1821\n", - " 51\n", - " 21244\n", - " 2\n", - " cellular\n", - " 4\n", - " no\n", - " 166\n", - " unknown\n", - " no\n", - " housemaid\n", - " yes\n", - " married\n", - " aug\n", - " -1\n", - " unknown\n", - " 0\n", - " no\n", - " \n", - " \n", " 871\n", " 31\n", " 26965\n", @@ -751,6 +751,26 @@ " no\n", " \n", " \n", + " 650\n", + " 33\n", + " 23663\n", + " 2\n", + " cellular\n", + " 16\n", + " no\n", + " 199\n", + " tertiary\n", + " yes\n", + " housemaid\n", + " no\n", + " single\n", + " apr\n", + " 146\n", + " failure\n", + " 2\n", + " no\n", + " \n", + " \n", " 3830\n", " 57\n", " 27069\n", @@ -790,26 +810,6 @@ " 0\n", " no\n", " \n", - " \n", - " 650\n", - " 33\n", - " 23663\n", - " 2\n", - " cellular\n", - " 16\n", - " no\n", - " 199\n", - " tertiary\n", - " yes\n", - 
" housemaid\n", - " no\n", - " single\n", - " apr\n", - " 146\n", - " failure\n", - " 2\n", - " no\n", - " \n", " \n", "\n", "" @@ -817,25 +817,25 @@ "text/plain": [ " age balance campaign contact day default duration education \\\n", "idx \n", + "1821 51 21244 2 cellular 4 no 166 unknown \n", "2624 53 22370 1 unknown 15 no 106 tertiary \n", "4014 41 21515 1 unknown 5 no 87 secondary \n", - "1821 51 21244 2 cellular 4 no 166 unknown \n", "871 31 26965 2 cellular 21 no 654 primary \n", "1483 43 27733 7 unknown 3 no 164 tertiary \n", + "650 33 23663 2 cellular 16 no 199 tertiary \n", "3830 57 27069 3 unknown 20 no 174 tertiary \n", "2989 42 42045 2 cellular 8 no 205 tertiary \n", - "650 33 23663 2 cellular 16 no 199 tertiary \n", "\n", " housing job loan marital month pdays poutcome previous y \n", "idx \n", + "1821 no housemaid yes married aug -1 unknown 0 no \n", "2624 yes entrepreneur no married may -1 unknown 0 no \n", "4014 yes admin. no married jun -1 unknown 0 no \n", - "1821 no housemaid yes married aug -1 unknown 0 no \n", "871 no housemaid no single apr -1 unknown 0 yes \n", "1483 yes technician no single jun -1 unknown 0 no \n", + "650 yes housemaid no single apr 146 failure 2 no \n", "3830 no technician yes married jun -1 unknown 0 no \n", - "2989 no entrepreneur no married aug -1 unknown 0 no \n", - "650 yes housemaid no single apr 146 failure 2 no " + "2989 no entrepreneur no married aug -1 unknown 0 no " ] }, "execution_count": 6, @@ -889,7 +889,8 @@ } ], "source": [ - "dfs = client.read(backend=\"kv\", table=table, filter=\"balance > 20000\", iterator=True)\n", + "dfs = client.read(backend=\"kv\", table=table, filter=\"balance > 20000\",\n", + " iterator=True)\n", "for df in dfs:\n", " print(df.head())" ] @@ -927,6 +928,7 @@ "- [Create a TSDB Table](#frames-tsdb-create)\n", "- [Write to the TSDB Table](#frames-tsdb-write)\n", "- [Read from the TSDB Table](#frames-tsdb-read)\n", + " - [Conditional Read](#frames-tsdb-read-conditional)\n", "- [Delete the TSDB Table](#frames-tsdb-delete)" ] }, @@ -948,7 +950,7 @@ "outputs": [], "source": [ "# Relative path to the TSDB table within the parent platform data container\n", - "tsdb_table = os.path.join(os.getenv(\"V3IO_USERNAME\") + \"/examples/tsdb_tab\")" + "tsdb_table = os.path.join(os.getenv(\"V3IO_USERNAME\"), \"examples/tsdb_tab\")" ] }, { @@ -973,8 +975,8 @@ "metadata": {}, "outputs": [], "source": [ - "# Create a new TSDB table; ingestion rate = one sample per minute (\"1/m\")\n", - "client.create(backend=\"tsdb\", table=tsdb_table, attrs={\"rate\": \"1/m\"})" + "# Create a new TSDB table; ingestion rate = one sample per hour (\"1/h\")\n", + "client.create(backend=\"tsdb\", table=tsdb_table, attrs={\"rate\": \"1/h\"})" ] }, { @@ -990,16 +992,16 @@ "You can add labels to TSDB table items in one of two ways; you can also combine these methods:\n", "\n", "- Use the `labels` dictionary parameter of the `write` method to add labels to all the written metric-sample table items (DataFrame rows) — `{