Upgrade to sklearn-1.5 (#3293)
jeff-shepherd authored Jul 18, 2024
1 parent ae0a751 commit f52c498
Showing 4 changed files with 8 additions and 8 deletions.
@@ -119,7 +119,7 @@
" \"epochs\": 10,\n",
" \"lr\": 0.1,\n",
" },\n",
" environment=\"AzureML-sklearn-1.0-ubuntu20.04-py38-cpu@latest\",\n",
" environment=\"AzureML-sklearn-1.5@latest\",\n",
" display_name=\"pytorch-iris-example\",\n",
" description=\"Train a neural network with PyTorch on the Iris dataset.\",\n",
")"
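For context, a minimal sketch of how these notebooks build a command job against the upgraded curated environment with the azure-ai-ml v2 SDK; the code path and command string are illustrative assumptions, while the inputs, environment, display name, and description come from the hunk above:

from azure.ai.ml import command

# Command job that runs on the upgraded curated sklearn environment.
job = command(
    code="./src",  # assumed local folder containing the training script
    command="python main.py --epochs ${{inputs.epochs}} --lr ${{inputs.lr}}",  # assumed entry point
    inputs={
        "epochs": 10,
        "lr": 0.1,
    },
    environment="AzureML-sklearn-1.5@latest",  # the environment this commit switches to
    display_name="pytorch-iris-example",
    description="Train a neural network with PyTorch on the Iris dataset.",
)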
@@ -127,7 +127,7 @@
" - Azure ML `data`/`dataset` or `datastore` are of type `uri_folder`. To use `data`/`dataset` as input, you can use registered dataset in the workspace using the format '<data_name>:<version>'. For e.g Input(type='uri_folder', path='my_dataset:1')\n",
" - `mode` - \tMode of how the data should be delivered to the compute target. Allowed values are `ro_mount`, `rw_mount` and `download`. Default is `ro_mount`\n",
"- `environment` - This is the environment needed for the command to run. Curated or custom environments from the workspace can be used. Or a custom environment can be created and used as well. Check out the [environment](../../../../assets/environment/environment.ipynb) notebook for more examples.\n",
"- `compute` - The compute on which the command will run. In this example we are using [serverless compute (preview)](https://learn.microsoft.com/azure/machine-learning/how-to-use-serverless-compute?view=azureml-api-2&tabs=python) so there is no need to specify any compute. You can also replace serverless with any other compute in the workspace. You can run it on the local machine by using `local` for the compute. This will run the command on the local machine and all the run details and output of the job will be uploaded to the Azure ML workspace.\n",
"- `compute` - The compute on which the command will run. In this example we are using [serverless compute (preview)](https://learn.microsoft.com/azure/machine-learning/how-to-use-serverless-compute?view=azureml-api-2&tabs=python) so there is no need to specify any compute. You can also replace serverless with any other compute in the workspace. You can run it on the local machine by using `local` for the compute. This will run the command on the local machine and all the run details and output of the job will be uploaded to the Azure ML workspace.\n",
"- `distribution` - Distribution configuration for distributed training scenarios. Azure Machine Learning supports PyTorch, TensorFlow, and MPI-based distributed training. The allowed values are `PyTorch`, `TensorFlow` or `Mpi`.\n",
"- `display_name` - The display name of the Job\n",
"- `description` - The description of the experiment\n"
@@ -163,7 +163,7 @@
" path=\"https://azuremlexamples.blob.core.windows.net/datasets/diabetes.csv\",\n",
" )\n",
" },\n",
" environment=\"AzureML-sklearn-1.0-ubuntu20.04-py38-cpu@latest\",\n",
" environment=\"AzureML-sklearn-1.5@latest\",\n",
" display_name=\"sklearn-diabetes-example\",\n",
" # description,\n",
" # experiment_name\n",
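The diabetes notebook passes its dataset as an Input; a sketch of that pattern under the new environment, assuming an input name and script argument (only the dataset URL, environment, and display name appear in the hunk above):

from azure.ai.ml import command, Input

job = command(
    code="./src",  # assumed local source folder
    command="python main.py --diabetes-csv ${{inputs.diabetes}}",  # assumed script arguments
    inputs={
        "diabetes": Input(
            type="uri_file",
            path="https://azuremlexamples.blob.core.windows.net/datasets/diabetes.csv",
            mode="ro_mount",  # default; "rw_mount" and "download" are also allowed
        )
    },
    environment="AzureML-sklearn-1.5@latest",
    display_name="sklearn-diabetes-example",
)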
@@ -127,8 +127,8 @@
" - `path` - The path to the file or folder. These can be local or remote files or folders. For remote files - http/https, wasb are supported. \n",
" - Azure ML `data`/`dataset` or `datastore` are of type `uri_folder`. To use `data`/`dataset` as input, you can use registered dataset in the workspace using the format '<data_name>:<version>'. For e.g Input(type='uri_folder', path='my_dataset:1')\n",
" - `mode` - \tMode of how the data should be delivered to the compute target. Allowed values are `ro_mount`, `rw_mount` and `download`. Default is `ro_mount`\n",
"- `environment` - This is the environment needed for the command to run. Curated or custom environments from the workspace can be used. Or a custom environment can be created and used as well. Check out the [environment](../../../../assets/environment/environment.ipynb) notebook for more examples.\n",
"- `compute` - The compute on which the command will run. In this example we are using [serverless compute (preview)](https://learn.microsoft.com/azure/machine-learning/how-to-use-serverless-compute?view=azureml-api-2&tabs=python) so there is no need to specify any compute. You can also replace serverless with any other compute in the workspace. You can run it on the local machine by using `local` for the compute. This will run the command on the local machine and all the run details and output of the job will be uploaded to the Azure ML workspace.\n",
"- `environment` - This is the environment needed for the command to run. Curated or custom environments from the workspace can be used. Or a custom environment can be created and used as well. Check out the [environment](../../../../assets/environment/environment.ipynb) notebook for more examples.\n",
"- `compute` - The compute on which the command will run. In this example we are using [serverless compute (preview)](https://learn.microsoft.com/azure/machine-learning/how-to-use-serverless-compute?view=azureml-api-2&tabs=python) so there is no need to specify any compute. You can also replace serverless with any other compute in the workspace. You can run it on the local machine by using `local` for the compute. This will run the command on the local machine and all the run details and output of the job will be uploaded to the Azure ML workspace.\n",
"- `distribution` - Distribution configuration for distributed training scenarios. Azure Machine Learning supports PyTorch, TensorFlow, and MPI-based distributed training. The allowed values are `PyTorch`, `TensorFlow` or `Mpi`.\n",
"- `display_name` - The display name of the Job\n",
"- `description` - The description of the experiment"
@@ -177,7 +177,7 @@
" # }\n",
" # ) # uncomment add SSH Public Key to access job container via SSH\n",
" },\n",
" environment=\"AzureML-sklearn-1.0-ubuntu20.04-py38-cpu@latest\",\n",
" environment=\"AzureML-sklearn-1.5@latest\",\n",
" display_name=\"sklearn-iris-example\",\n",
" # experiment_name\n",
" # description\n",
@@ -107,7 +107,7 @@
" - Azure ML `data`/`dataset` or `datastore` are of type `uri_folder`. To use `data`/`dataset` as input, you can use registered dataset in the workspace using the format '<data_name>:<version>'. For e.g Input(type='uri_folder', path='my_dataset:1')\n",
" - `mode` - \tMode of how the data should be delivered to the compute target. Allowed values are `ro_mount`, `rw_mount` and `download`. Default is `ro_mount`\n",
"- `environment` - This is the environment needed for the command to run. Curated or custom environments from the workspace can be used. Or a custom environment can be created and used as well. Check out the [environment](../../../../assets/environment/environment.ipynb) notebook for more examples.\n",
"- `compute` - The compute on which the command will run. In this example we are using [serverless compute (preview)](https://learn.microsoft.com/azure/machine-learning/how-to-use-serverless-compute?view=azureml-api-2&tabs=python) so there is no need to specify any compute. You can also replace serverless with any other compute in the workspace. You can run it on the local machine by using `local` for the compute. This will run the command on the local machine and all the run details and output of the job will be uploaded to the Azure ML workspace.\n",
"- `compute` - The compute on which the command will run. In this example we are using [serverless compute (preview)](https://learn.microsoft.com/azure/machine-learning/how-to-use-serverless-compute?view=azureml-api-2&tabs=python) so there is no need to specify any compute. You can also replace serverless with any other compute in the workspace. You can run it on the local machine by using `local` for the compute. This will run the command on the local machine and all the run details and output of the job will be uploaded to the Azure ML workspace.\n",
"- `distribution` - Distribution configuration for distributed training scenarios. Azure Machine Learning supports PyTorch, TensorFlow, and MPI-based distributed training. The allowed values are `PyTorch`, `TensorFlow` or `Mpi`.\n",
"- `display_name` - The display name of the Job\n",
"- `description` - The description of the experiment"
@@ -128,7 +128,7 @@
" code=\"./src\", # local path where the code is stored\n",
" command=\"pip install -r requirements.txt && python main.py --C ${{inputs.C}} --penalty ${{inputs.penalty}}\",\n",
" inputs={\"C\": 0.8, \"penalty\": \"l2\"},\n",
" environment=\"AzureML-sklearn-1.0-ubuntu20.04-py38-cpu@latest\",\n",
" environment=\"AzureML-sklearn-1.5@latest\",\n",
" display_name=\"sklearn-mnist-example\"\n",
" # experiment_name: sklearn-mnist-example\n",
" # description: Train a scikit-learn LogisticRegression model on the MNSIT dataset.\n",
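To round out the picture, a hedged sketch of submitting the sklearn-mnist job from the last hunk; the workspace identifiers are placeholders, and leaving compute unset runs the job on serverless compute as the notebooks describe:

from azure.ai.ml import MLClient, command
from azure.identity import DefaultAzureCredential

# Connect to the workspace (placeholder identifiers).
ml_client = MLClient(
    credential=DefaultAzureCredential(),
    subscription_id="<SUBSCRIPTION_ID>",
    resource_group_name="<RESOURCE_GROUP>",
    workspace_name="<WORKSPACE_NAME>",
)

job = command(
    code="./src",  # assumed local source folder
    command="pip install -r requirements.txt && python main.py --C ${{inputs.C}} --penalty ${{inputs.penalty}}",
    inputs={"C": 0.8, "penalty": "l2"},
    environment="AzureML-sklearn-1.5@latest",
    display_name="sklearn-mnist-example",
    # compute is omitted, so the job runs on serverless compute; set compute="local" to run on the local machine
)

returned_job = ml_client.create_or_update(job)  # submit the job to the workspace
print(returned_job.studio_url)  # link to monitor the run in Azure ML studio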
