Problem when editing with ‘./hparams/ROME/gpt2-xl.yaml’ #476

Open · AndyWangfl opened this issue Jan 23, 2025 · 0 comments
Labels: question (Further information is requested)

@AndyWangfl
I created a new function in edit.py:

from easyeditor import BaseEditor, ROMEHyperParams

def test_ROME_Messi():
    hparams = ROMEHyperParams.from_hparams('./hparams/ROME/gpt2-xl.yaml')
    prompts = ['Question:What sport does Lionel Messi play? Answer:']  # x_e
    ground_truth = ['football']    # y
    target_new = ['basketball']    # y_e
    subject = ['Lionel Messi']

    editor = BaseEditor.from_hparams(hparams)
    metrics, edited_model, _ = editor.edit(
        prompts=prompts,
        ground_truth=ground_truth,
        target_new=target_new,
        subject=subject,
        keep_original_weight=False
    )
    print(metrics)
    print(edited_model)

gpt2-xl.yaml is unchanged from the repository default:

alg_name: "ROME"
model_name: "./hugging_cache/gpt2-xl"
stats_dir: "./data/stats"
device: 0
layers: [17]
fact_token: "subject_last"
v_num_grad_steps: 20
v_lr: 5e-1
v_loss_layer: 47
v_weight_decay: 0.5
clamp_norm_factor: 4
kl_factor: 0.0625
mom2_adjustment: false
context_template_length_params: [[5, 10], [10, 10]]
rewrite_module_tmp: "transformer.h.{}.mlp.c_proj"
layer_module_tmp: "transformer.h.{}"
mlp_module_tmp: "transformer.h.{}.mlp"
attn_module_tmp: "transformer.h.{}.attn"
ln_f_module: "transformer.ln_f"
lm_head_module: "transformer.wte"
mom2_dataset: "wikipedia"
mom2_n_samples: 100000
mom2_dtype: "float32"
model_parallel: false
fp16: false
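
Note that `model_name` here is a relative local path. As the traceback below shows, transformers' `cached_file` only treats the string as a local model when `os.path.isdir(path_or_repo_id)` is true; otherwise it falls through to `hf_hub_download` and validates the string as a Hub repo id. A quick sanity check (plain Python, nothing EasyEdit-specific; the path is the one from the YAML above):

import os

# Expected when this issue occurs: the directory (or its config.json)
# is missing, so transformers treats the string as a Hub repo id, and
# './hugging_cache/gpt2-xl' contains two '/' characters, which fails
# repo-id validation.
print(os.path.isdir('./hugging_cache/gpt2-xl'))
print(os.path.isfile('./hugging_cache/gpt2-xl/config.json'))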

Running this function raises an exception. Trimmed traceback:

edit.py::test_ROME_Messi FAILED                                          [100%]
edit.py:3019 (test_ROME_Messi)

path_or_repo_id = './hugging_cache/gpt2-xl', filename = 'config.json'
cache_dir = '/Users/bytedance/.cache/huggingface/hub', force_download = False
resume_download = None, proxies = None, token = None, revision = None
local_files_only = False, subfolder = '', repo_type = None
user_agent = 'transformers/4.44.2; python/3.9.6; session_id/37a2ecdbdf6e42d9b75680efe199628d; torch/2.0.1'

    def cached_file(
        path_or_repo_id: Union[str, os.PathLike],
        filename: str,
        ...
    ) -> Optional[str]:
        ...
        path_or_repo_id = str(path_or_repo_id)
        full_filename = os.path.join(subfolder, filename)
        if os.path.isdir(path_or_repo_id):
            resolved_file = os.path.join(os.path.join(path_or_repo_id, subfolder), filename)
            ...
            return resolved_file
        ...
        try:
            # Load from URL or cache if already cached
>           resolved_file = hf_hub_download(
                path_or_repo_id,
                filename,
                ...
            )

venv/lib/python3.9/site-packages/transformers/utils/hub.py:402:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
venv/lib/python3.9/site-packages/huggingface_hub/utils/_validators.py:106: in _inner_fn
    validate_repo_id(arg_value)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

repo_id = './hugging_cache/gpt2-xl'

    def validate_repo_id(repo_id: str) -> None:
        ...
        if repo_id.count("/") > 1:
>           raise HFValidationError(
                "Repo id must be in the form 'repo_name' or 'namespace/repo_name':"
                f" '{repo_id}'. Use `repo_type` argument if needed."
            )
E           huggingface_hub.errors.HFValidationError: Repo id must be in the form 'repo_name' or 'namespace/repo_name': './hugging_cache/gpt2-xl'. Use `repo_type` argument if needed.

venv/lib/python3.9/site-packages/huggingface_hub/utils/_validators.py:154: HFValidationError

The above exception was the direct cause of the following exception:

    def test_ROME_Messi():
        hparams = ROMEHyperParams.from_hparams('./hparams/ROME/gpt2-xl.yaml')
        ...
>       editor = BaseEditor.from_hparams(hparams)

edit.py:3027:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
easyeditor/editors/editor.py:53: in from_hparams
    return cls(hparams)
easyeditor/editors/editor.py:96: in __init__
    self.model = AutoModelForCausalLM.from_pretrained(self.model_name, **model_kwargs)
venv/lib/python3.9/site-packages/transformers/models/auto/auto_factory.py:485: in from_pretrained
    resolved_config_file = cached_file(
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

(same cached_file frame as above, now in its HFValidationError handler)

        except HFValidationError as e:
>           raise EnvironmentError(
                f"Incorrect path_or_model_id: '{path_or_repo_id}'. Please provide either the path to a local folder or the repo_id of a model on the Hub."
            ) from e
E           OSError: Incorrect path_or_model_id: './hugging_cache/gpt2-xl'. Please provide either the path to a local folder or the repo_id of a model on the Hub.

venv/lib/python3.9/site-packages/transformers/utils/hub.py:466: OSError

What can I do to fix this?
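
For what it's worth, the exception chain points to a path-resolution problem rather than an EasyEdit bug: `./hugging_cache/gpt2-xl` does not exist as a directory relative to the working directory the test runs from, so `cached_file` falls back to interpreting the string as a Hub repo id, and repo-id validation rejects anything containing more than one `/`. A minimal sketch of two possible fixes, assuming the model simply has not been downloaded to that path:

# Option 1: download gpt2-xl into the directory the YAML expects.
# snapshot_download is part of huggingface_hub; local_dir places the
# files at the given path instead of the shared cache.
from huggingface_hub import snapshot_download

snapshot_download(repo_id="gpt2-xl", local_dir="./hugging_cache/gpt2-xl")

# Option 2: skip the local directory entirely and let transformers
# resolve the model from the Hub, by changing one line in gpt2-xl.yaml:
#   model_name: "gpt2-xl"

In either case, the relative paths in the YAML and in the test (`./hparams/...`, `./hugging_cache/...`) mean the test has to be launched from the EasyEdit repository root, or wherever those directories actually live.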

@zxlzr added the question (Further information is requested) label on Jan 23, 2025