This repository has been archived by the owner on Jul 1, 2024. It is now read-only.

adhere to lazy import rules #806

Open
fbgheith wants to merge 1 commit into classy_vision-0.7.0 from export-D58881291-to-classy_vision-0.7.0

Conversation

fbgheith

Summary:
Lazy imports change Python's import semantics, specifically when it comes to initialization of packages/modules: https://www.internalfb.com/intern/wiki/Python/Cinder/Onboarding/Tutorial/Lazy_Imports/Troubleshooting/

For example, this pattern is not guaranteed to work:

```
import torch.optim
...
torch.optim._multi_tensor.Adam   # may fail to resolve _multi_tensor
```

And this is guaranteed to work:

```
import torch.optim._multi_tensor
...
torch.optim._multi_tensor.Adam   # will always work
```
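
The underlying rule is not specific to lazy imports: importing a package does not, by itself, import its submodules, and `pkg.sub` is only bound as an attribute of `pkg` once the submodule has actually been imported somewhere. A quick stdlib illustration (`logging.handlers` is just a convenient example; run it in a fresh interpreter):

```
import logging

# Fresh interpreter: the parent package is loaded, but nothing has
# imported the `handlers` submodule yet, so the attribute is absent.
print(hasattr(logging, "handlers"))   # typically False

import logging.handlers

# Importing the submodule binds it as an attribute on the package,
# so the dotted access is now guaranteed to resolve.
print(hasattr(logging, "handlers"))   # True
```

Eager imports often mask this: some other module happens to import the submodule first, and the dotted access appears to work until the import order changes.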

A recent change to PyTorch altered its module initialization logic in a way that exposed this issue.

But the code has been working for years? That is the nature of undefined behavior: any change in the environment (in this case, the PyTorch code base) can make it fail.
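
The fix pattern is mechanical: make every dotted submodule access correspond to an explicit import of that submodule. A minimal sketch of the pattern (generic; not the exact ClassyVision hunk):

```
# Before: relies on torch.optim pulling in _multi_tensor as a side
# effect of its own initialization -- not guaranteed under lazy imports.
import torch.optim

adam_cls = torch.optim._multi_tensor.Adam   # may raise AttributeError

# After: the explicit import always binds _multi_tensor on torch.optim.
import torch.optim._multi_tensor

adam_cls = torch.optim._multi_tensor.Adam   # always resolves
```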

Differential Revision: D58881291

@facebook-github-bot added the CLA Signed label Jun 21, 2024
@facebook-github-bot (Contributor)

This pull request was exported from Phabricator. Differential Revision: D58881291

fbgheith added a commit to fbgheith/ClassyVision that referenced this pull request Jun 21, 2024

@fbgheith force-pushed the export-D58881291-to-classy_vision-0.7.0 branch from 6d0584b to 79665d2 on June 27, 2024 at 16:00
fbgheith added a commit to fbgheith/ClassyVision that referenced this pull request Jun 27, 2024
Summary:
Pull Request resolved: facebookresearch#807

Pull Request resolved: facebookresearch#806

Reviewed By: mannatsingh

Differential Revision: D58881291
Labels
CLA Signed, fb-exported