I was getting an error on the line `y = y.clip(0, 1).movedim(1, -1)` saying the tensor had only 1 dimension and a dim in the range [-1, 0] was expected, so I tracked the problem down with prints and eventually discovered that:
in `\lib_layerdiffusion\models.py`, lines 236-237,
result = torch.stack(result, dim=0)
returned a normal tensor as it should, but the next line:
median = torch.median(result, dim=0).values
returned an empty tensor. Even assigning `torch.median(result, dim=0)` to a variable and pulling the `.values` later didn't work.
So it seems `torch.median` doesn't work on DirectML. I managed to work around the problem with:
result = torch.stack(result, dim=0).to("cpu")
and then moving it back to the device right after:
return median.to(self.load_device)
This fixes the problem for DirectML users and doesn't seem to affect performance. I'm not entirely sure the problem hits all DirectML users, though; let's see if anyone else runs into it too.
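For reference, here is the workaround as a minimal standalone sketch, assuming all that's needed at this point is the element-wise median of the stacked results; the helper name and the `load_device` parameter are just placeholders for illustration, not the actual names in `models.py`:

```python
import torch

def stacked_median(tensors, load_device):
    # Sketch of the DirectML workaround: torch.median(..., dim=0) came back
    # empty on the DirectML device, so do the reduction on the CPU and move
    # the result back afterwards.
    result = torch.stack(tensors, dim=0).to("cpu")   # hop off the DirectML device
    median = torch.median(result, dim=0).values      # element-wise median on CPU
    return median.to(load_device)                    # hop back to the original device

# Example usage with a list of same-shaped tensors, e.g.:
# merged = stacked_median(result_list, self.load_device)
```

Since only the final reduction takes the CPU round-trip, it makes sense that I didn't notice any performance difference.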