Update to HF_HOME from TRANSFORMERS_CACHE #4816

Merged
merged 21 commits into master from loadams/switch-hf-home on May 22, 2024

Changes from 3 commits

Commits (21)
b0ae592  Test moving to HF_HOME from TRANSFORMERS_CACHE (loadams, Dec 14, 2023)
ea3838c  Merge branch 'master' into loadams/switch-hf-home (loadams, Dec 15, 2023)
a48c4cd  Merge branch 'master' into loadams/switch-hf-home (loadams, Jan 5, 2024)
a9302c3  Merge branch 'master' into loadams/switch-hf-home (loadams, Jan 10, 2024)
2f1eb2f  Merge conflicts (loadams, Apr 8, 2024)
910aacf  Update remaining instances of TRANSFORMERS_CACHE to HF_HOME (loadams, Apr 8, 2024)
bfe17dc  Merge branch 'master' into loadams/switch-hf-home (loadams, Apr 9, 2024)
350aab1  Merge branch 'master' into loadams/switch-hf-home (loadams, Apr 10, 2024)
5b03367  Merge branch 'master' into loadams/switch-hf-home (loadams, Apr 19, 2024)
c17f956  Merge branch 'master' into loadams/switch-hf-home (loadams, Apr 22, 2024)
630d329  Merge branch 'master' into loadams/switch-hf-home (loadams, Apr 22, 2024)
618a6a3  Merge branch 'master' into loadams/switch-hf-home (loadams, Apr 22, 2024)
6d18742  Merge branch 'master' into loadams/switch-hf-home (loadams, Apr 23, 2024)
f8851d7  Merge branch 'master' into loadams/switch-hf-home (loadams, Apr 24, 2024)
c953e94  Merge branch 'master' into loadams/switch-hf-home (loadams, Apr 29, 2024)
8a06095  Merge branch 'master' into loadams/switch-hf-home (loadams, Apr 30, 2024)
65f8d18  Merge branch 'master' into loadams/switch-hf-home (loadams, May 15, 2024)
8037c7d  Merge branch 'master' into loadams/switch-hf-home (loadams, May 17, 2024)
8ca2504  Merge branch 'master' into loadams/switch-hf-home (loadams, May 20, 2024)
c7e21b5  Merge branch 'master' into loadams/switch-hf-home (loadams, May 21, 2024)
94e1646  Update path (loadams, May 21, 2024)
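
The diffs below (shown from 3 of the 21 commits) swap TRANSFORMERS_CACHE, which transformers has deprecated in favor of HF_HOME, for HF_HOME in the CI workflow environment and in one test helper. As a minimal sketch of how the two variables relate, assuming the usual Hugging Face defaults (the fallback paths and the HF_HUB_CACHE variable are assumptions about the libraries' behavior, not something this PR touches):

import os

# HF_HOME is the umbrella directory for Hugging Face caches; ~/.cache/huggingface is the usual default.
hf_home = os.getenv("HF_HOME", os.path.expanduser("~/.cache/huggingface"))

# The hub/model cache conventionally lives in a "hub" subdirectory of HF_HOME;
# TRANSFORMERS_CACHE used to point at that directory directly.
hub_cache = os.getenv("HF_HUB_CACHE", os.path.join(hf_home, "hub"))

print(hf_home, hub_cache)

In other words, pointing HF_HOME at the existing transformers_cache directories keeps the CI caches in the same place while using the variable the libraries now prefer.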
4 changes: 2 additions & 2 deletions .github/workflows/cpu-inference.yml
@@ -71,5 +71,5 @@ jobs:
   source oneCCL/build/_install/env/setvars.sh
   unset TORCH_CUDA_ARCH_LIST # only jit compile for current arch
   cd tests
-  TRANSFORMERS_CACHE=~/tmp/transformers_cache/ TORCH_EXTENSIONS_DIR=./torch-extensions pytest -m 'seq_inference' unit/
-  TRANSFORMERS_CACHE=~/tmp/transformers_cache/ TORCH_EXTENSIONS_DIR=./torch-extensions pytest -m 'inference_ops' -m 'inference' unit/
+  HF_HOME=~/tmp/transformers_cache/ TORCH_EXTENSIONS_DIR=./torch-extensions pytest -m 'seq_inference' unit/
+  HF_HOME=~/tmp/transformers_cache/ TORCH_EXTENSIONS_DIR=./torch-extensions pytest -m 'inference_ops' -m 'inference' unit/
4 changes: 2 additions & 2 deletions .github/workflows/nv-torch-latest-cpu.yml
@@ -45,5 +45,5 @@ jobs:
   run: |
     unset TORCH_CUDA_ARCH_LIST # only jit compile for current arch
     cd tests
-    TRANSFORMERS_CACHE=/tmp/transformers_cache/ pytest $PYTEST_OPTS -n 4 unit/ --torch_ver="1.12"
-    TRANSFORMERS_CACHE=/tmp/transformers_cache/ pytest $PYTEST_OPTS -m 'sequential' unit/ --torch_ver="1.12"
+    HF_HOME=/tmp/transformers_cache/ pytest $PYTEST_OPTS -n 4 unit/ --torch_ver="1.12"
+    HF_HOME=/tmp/transformers_cache/ pytest $PYTEST_OPTS -m 'sequential' unit/ --torch_ver="1.12"
2 changes: 1 addition & 1 deletion .github/workflows/setup-venv/action.yml
@@ -22,7 +22,7 @@ runs:
   - id: set-env-vars
     run: |
       echo TEST_DATA_DIR=/blob/ >> $GITHUB_ENV
-      echo TRANSFORMERS_CACHE=/blob/transformers_cache/ >> $GITHUB_ENV
+      echo HF_HOME=/blob/transformers_cache/ >> $GITHUB_ENV
       echo TORCH_EXTENSIONS_DIR=./torch-extensions/ >> $GITHUB_ENV
       echo TORCH_CACHE=/blob/torch_cache/ >> $GITHUB_ENV
       echo HF_DATASETS_CACHE=/blob/datasets_cache/ >> $GITHUB_ENV
2 changes: 1 addition & 1 deletion tests/unit/inference/test_checkpoint_sharding.py
@@ -110,7 +110,7 @@ def write_checkpoints_json(model_name, class_tmpdir):
     cached_repo_dir = snapshot_download(
         model_name,
         local_files_only=is_offline_mode(),
-        cache_dir=os.getenv("TRANSFORMERS_CACHE", None),
+        cache_dir=os.getenv("HF_HOME", None),
         ignore_patterns=["*.safetensors", "*.msgpack", "*.h5"],
     )
     file_list = [str(entry) for entry in Path(cached_repo_dir).rglob("*.[bp][it][n]") if entry.is_file()]
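
For reference, a hedged usage sketch of the updated helper call above: the repo id is a placeholder chosen for illustration, and the is_offline_mode() check is dropped so the snippet stays self-contained. When HF_HOME is unset, cache_dir falls back to None and huggingface_hub chooses its own default cache location.

import os
from huggingface_hub import snapshot_download

# Placeholder repo id for illustration only; the real test passes the model under test.
repo_id = "sshleifer/tiny-gpt2"

# Mirrors the updated test helper: cache_dir comes from HF_HOME when it is set,
# otherwise None lets huggingface_hub pick its default cache.
cached_repo_dir = snapshot_download(
    repo_id,
    cache_dir=os.getenv("HF_HOME", None),
    ignore_patterns=["*.safetensors", "*.msgpack", "*.h5"],
)
print(cached_repo_dir)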