Support the "meta" device on load_file() #281

Closed · lstein opened this issue Jun 26, 2023 · 3 comments · Fixed by #303

lstein commented Jun 26, 2023

Feature request

Support the following syntax for obtaining an object containing the tensor shapes and other metadata without loading the actual tensors:

model = safetensors.torch.load_file(path, device="meta")

Motivation

This would be analogous to torch.load(path, map_location="meta") and would allow the metadata to be interrogated without loading the whole safetensors file into memory.
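
For reference, a minimal sketch of that analogue, assuming torch.load() accepts "meta" as a map_location as described above (the checkpoint path is a placeholder):

import torch

# Placeholder checkpoint path. With map_location="meta", restored tensors
# land on the meta device: shape and dtype are available, but no tensor
# data is materialized.
state_dict = torch.load("model.ckpt", map_location="meta")

for name, tensor in state_dict.items():
    print(name, tuple(tensor.shape), tensor.dtype)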

Your contribution

Happy to test the implementation.


Narsil commented Jun 27, 2023

Have you tried the following?

from safetensors import safe_open

with safe_open(filename, framework="pt") as f:
    print(f.keys())
    print(f.get_slice("mytensor").get_shape())
    print(f.metadata())

It should be pretty much instant, and it doesn't allocate anything.


keturn commented Jul 29, 2023

Is there a way to get data types from that? I don't see them exposed on PySafeSlice.


Narsil commented Jul 31, 2023

Indeed it seems to be missing.
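
For context, a sketch of how shapes and dtypes could be read together once a dtype accessor is available on the slice object, which is what the linked fix adds; the get_dtype() call and the file name here are assumptions based on current safetensors releases:

from safetensors import safe_open

# Inspect shapes and dtypes without loading any tensor data.
with safe_open("model.safetensors", framework="pt") as f:
    for name in f.keys():
        tensor_slice = f.get_slice(name)
        # get_dtype() returns the stored dtype as a string, e.g. "F32".
        print(name, tensor_slice.get_shape(), tensor_slice.get_dtype())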
