
Update pytorch lightning utilities #15310

Merged

Conversation

@Dalton-Murray (Contributor) commented Mar 18, 2024

Description

Changes pytorch_lightning.utilities.distributed to pytorch_lightning.utilities.rank_zero, which is the new, proper home of rank_zero_only; the symbol no longer exists in distributed. This also (mostly) fixes the web UI failing to start on a clean install for users hitting this error.

This move happened quite a few versions of pytorch_lightning ago, and it is definitely present in the current version (1.9.4). Docs

This fixes many issues, such as #13785, #13464, #11642, #11501, and #11458, among others.
Unfortunately, it doesn't fix everything, as the same change still needs to be made in the Stability-AI/stablediffusion repository.
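
In practice the fix is just the import path; a minimal before/after (illustrative, matching the description above):

# Old location; fails on recent pytorch_lightning releases:
#   from pytorch_lightning.utilities.distributed import rank_zero_only
# New location:
from pytorch_lightning.utilities.rank_zero import rank_zero_only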


@AUTOMATIC1111 (Owner) commented

What about an option where you assign to pytorch_lightning.utilities.distributed, something like pytorch_lightning.utilities.distributed = pytorch_lightning.utilities.rank_zero, to make the code work without changes?

@Dalton-Murray (Contributor, Author) commented Mar 18, 2024

I want to confirm what you mean, as I'm a little confused. This snippet is from ddpm; do you mean something like this:

from torchvision.utils import make_grid
pytorch_lightning.utilities.distributed = pytorch_lightning.utilities.rank_zero  # the proposed assignment
from pytorch_lightning.utilities.rank_zero import rank_zero_only
from omegaconf import ListConfig

If so, this doesn't work, because the name pytorch_lightning is not defined there (the snippet only uses from-imports, which never bind the package name).
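
(Even with an import pytorch_lightning added first, attribute assignment alone presumably wouldn't be enough; a minimal sketch of why, assuming standard Python import semantics:)

import pytorch_lightning
import pytorch_lightning.utilities.rank_zero

# Rebinding a package attribute registers nothing in sys.modules:
pytorch_lightning.utilities.distributed = pytorch_lightning.utilities.rank_zero

# A from-import resolves the dotted name through sys.modules and the
# import finders, not through package attributes, so this still raises
# ModuleNotFoundError:
from pytorch_lightning.utilities.distributed import rank_zero_only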

Another way I tried was this, but this doesn't seem to work:

import pytorch_lightning.utilities.rank_zero as rank_zero
import pytorch_lightning.utilities.distributed as distributed  # fails here: the module no longer exists
distributed = rank_zero  # would only rebind the local name, not the module
distributed.rank_zero_only()

However, this still requires initially modifying the from pytorch_lightning.utilities.distributed import rank_zero_only line.
It also doesn't work as written, since the import of distributed itself fails with an error for me:

ModuleNotFoundError: No module named 'pytorch_lightning.utilities.distributed'
Press any key to continue . . .

Perhaps I'm thinking about this, or going about it, incorrectly.

This is another way I thought of, but it doesn't work either:

import pytorch_lightning.utilities.rank_zero as rank_zero
def distributed():  # defines a local function, which the import system never sees
    return rank_zero
from pytorch_lightning.utilities.distributed import rank_zero_only  # still ModuleNotFoundError

Something like this would work, but it changes the code (and the alias is technically unnecessary, since nothing actually uses distributed):

import pytorch_lightning.utilities.rank_zero as distributed
from pytorch_lightning.utilities.rank_zero import rank_zero_only

If you prefer, I could instead replace this with a try/except like the following, which does work:

try:
    from pytorch_lightning.utilities.distributed import rank_zero_only
except ImportError:
    from pytorch_lightning.utilities.rank_zero import rank_zero_only

Sorry for the wall of text; this was just me trying to figure out a way of doing this.

@AUTOMATIC1111 (Owner) commented

Something like this:

import sys  # needed for the sys.modules registration below

import pytorch_lightning.utilities.rank_zero
sys.modules["pytorch_lightning.utilities.xxx"] = pytorch_lightning.utilities.rank_zero
from pytorch_lightning.utilities.xxx import rank_zero_only

rank_zero_only
> <function lightning_utilities.core.rank_zero.rank_zero_only(fn: Callable) -> Callable>

@Dalton-Murray (Contributor, Author) commented

That'd totally work! I initially skipped over it because I wasn't sure whether you'd want sys imported, since it isn't already imported there. I'll adapt the current edits.

@AUTOMATIC1111 (Owner) commented

What I'm suggesting is to do this once, somewhere in code that's executed early (i.e. a function in initialize_util called from initialize.initialize()); then all codebases will be able to use from pytorch_lightning.utilities.distributed import rank_zero_only.

You just have to be careful not to replace pytorch_lightning.utilities.distributed if it already exists (which I assume is the case in the currently used version of pytorch_lightning).
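
A minimal sketch of what that early-startup shim could look like (the function name is illustrative, not necessarily what was merged):

import importlib
import sys

def fix_pytorch_lightning_distributed():
    # Install the alias only when the real module is missing, so both
    # older and newer pytorch_lightning versions keep working.
    try:
        importlib.import_module("pytorch_lightning.utilities.distributed")
    except ImportError:
        import pytorch_lightning.utilities.rank_zero
        sys.modules["pytorch_lightning.utilities.distributed"] = pytorch_lightning.utilities.rank_zero

After this runs once, legacy from pytorch_lightning.utilities.distributed import rank_zero_only imports resolve to the rank_zero module.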

@Dalton-Murray (Contributor, Author) commented

That'd make more sense! I'll revert and work on implementing something like that.

@Dalton-Murray (Contributor, Author) commented Mar 21, 2024

Not sure whether you want to keep the print statement in there; it feels a bit unnecessary to me.

@Dalton-Murray (Contributor, Author) commented

Sorry, I forgot to revert a few lines earlier; it should be good now.

@Dalton-Murray (Contributor, Author) commented

Sorry, it's a bit of a mess: the import actually was required, but I've moved it further down into the if block.

@AUTOMATIC1111 AUTOMATIC1111 merged commit 6507589 into AUTOMATIC1111:dev Mar 21, 2024
2 of 3 checks passed