
invoke-ai crashes during installation on Ubuntu 22.04.2 LTS #18

Closed
hansfbaier opened this issue Apr 3, 2023 · 4 comments

@hansfbaier

[18:03:52] ~ $ nix run "github:nixified-ai/flake#invokeai-amd" -- --web
warning: ignoring untrusted substituter 'https://hydra.iohk.io'
warning: ignoring untrusted substituter 'https://iohk.cachix.org'
warning: ignoring untrusted substituter 'https://nixbld.m-labs.hk'
warning: ignoring untrusted substituter 'https://unblob.cachix.org'
warning: ignoring untrusted substituter 'https://hydra.iohk.io'
warning: ignoring untrusted substituter 'https://iohk.cachix.org'
warning: ignoring untrusted substituter 'https://nixbld.m-labs.hk'
warning: ignoring untrusted substituter 'https://unblob.cachix.org'
warning: ignoring untrusted substituter 'https://ai.cachix.org'
[1/34/61 built, 477 copied (4899.6/4901.4 MiB), 933.7 MiB DL] building torch-1.13.1+rocm5.1.1-cp310-cp310-linux_x86_64.whl:                                  Dload  Upload   Total   Spent    Left  Speed
[1/34/61 built, 477 copied (4899.6/4901.4 MiB), 933.7 MiB DL] building torch-1.13.1+rocm5.1.1-cp310-cp310-linux_x86_64.whl:                                  Dload  Upload   Total   Spent    Left  Speed
[1/34/61 built, 477 copied (4899.6/4901.4 MiB), 933.7 MiB DL] building torch-1.13.1+rocm5.1.1-cp310-cp310-linux_x86_64.whl:                                  Dload  Upload   Total   Spent    Left  Speed
2023-04-03 18:33:55.429880: I tensorflow/core/platform/cpu_feature_guard.cc:193] This TensorFlow binary is optimized with oneAPI Deep Neural Network Library (oneDNN) to use the following CPU instructions in performance-critical operations:  SSE3 SSE4.1 SSE4.2 AVX AVX2 FMA
To enable them in other operations, rebuild TensorFlow with the appropriate compiler flags.
/opt/amdgpu/share/libdrm/amdgpu.ids: No such file or directory
/opt/amdgpu/share/libdrm/amdgpu.ids: No such file or directory
"hipErrorNoBinaryForGpu: Unable to find code object for all current devices!"

Aborted (core dumped)

@MatthewCroughan
Member

The AMD problems are a known issue; sadly, solving this requires some manual imperative steps for the moment, unless you're using NixOS and the NixOS modules. I have already mentioned this here: #13 (comment)

And the solution, as mentioned in that comment, is here: #2 (comment)
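
For reference, a minimal sketch of the manual workaround described in those comments (the exact HSA_OVERRIDE_GFX_VERSION value depends on the GPU; 10.3.0 is the value that appears later in this thread):

# Sketch only: export the ROCm GFX override so the torch/ROCm runtime inside
# the flake can find a matching code object for the installed GPU.
export HSA_OVERRIDE_GFX_VERSION=10.3.0
nix run "github:nixified-ai/flake#invokeai-amd" -- --web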

@hansfbaier
Author

[18:55:17] ~ $ $HSA_OVERRIDE_GFX_VERSION='10.3.0'
[18:55:43] ~ $ nix run -L "github:nixified-ai/flake#invokeai-amd" -- --web
warning: ignoring untrusted substituter 'https://hydra.iohk.io'
warning: ignoring untrusted substituter 'https://iohk.cachix.org'
warning: ignoring untrusted substituter 'https://nixbld.m-labs.hk'
warning: ignoring untrusted substituter 'https://unblob.cachix.org'
warning: ignoring untrusted substituter 'https://hydra.iohk.io'
warning: ignoring untrusted substituter 'https://iohk.cachix.org'
warning: ignoring untrusted substituter 'https://nixbld.m-labs.hk'
warning: ignoring untrusted substituter 'https://unblob.cachix.org'
warning: ignoring untrusted substituter 'https://ai.cachix.org'
2023-04-03 18:55:47.695088: I tensorflow/core/platform/cpu_feature_guard.cc:193] This TensorFlow binary is optimized with oneAPI Deep Neural Network Library (oneDNN) to use the following CPU instructions in performance-critical operations:  SSE3 SSE4.1 SSE4.2 AVX AVX2 FMA
To enable them in other operations, rebuild TensorFlow with the appropriate compiler flags.
/nix/store/s9rc6lw24i8bxa2hksb1zqcrqw74p1r4-python3.10-pytorch-lightning-1.9.0/lib/python3.10/site-packages/pytorch_lightning/utilities/distributed.py:258: LightningDeprecationWarning: `pytorch_lightning.utilities.distributed.rank_zero_only` has been deprecated in v1.8.1 and will be removed in v2.0.0. You can import it from `pytorch_lightning.utilities` instead.
  rank_zero_deprecation(
* Initializing, be patient...
>> Internet connectivity is True
>> InvokeAI, version 2.3.1.post2
>> InvokeAI runtime directory is "/home/jack/invokeai"
## NOT FOUND: GFPGAN model not found at /home/jack/invokeai/models/gfpgan/GFPGANv1.4.pth
>> GFPGAN Disabled
## NOT FOUND: CodeFormer model not found at /home/jack/invokeai/models/codeformer/codeformer.pth
>> CodeFormer Disabled
>> ESRGAN Initialized
** An error occurred while attempting to initialize the model: "[Errno 2] No such file or directory: '/home/jack/invokeai/configs/models.yaml'"
** This can be caused by a missing or corrupted models file, and can sometimes be fixed by (re)installing the models.
Do you want to run invokeai-configure script to select and/or reinstall models? [y/N]: y
invokeai-configure is launching....

Loading Python libraries...

** INITIALIZING INVOKEAI RUNTIME DIRECTORY **
* Initializing, be patient...
╭─────────────────────────────── Traceback (most recent call last) ────────────────────────────────╮
│ /nix/store/j8bsmlg9p2f74bvkqn8j7kibhzzaxr7s-python3.10-InvokeAI-2.3.1.post2/lib/python3.10/site- │
│ packages/ldm/invoke/CLI.py:128 in main                                                           │
│                                                                                                  │
│    125 │                                                                                         │
│    126 │   # creating a Generate object:                                                         │
│    127 │   try:                                                                                  │
│ ❱  128 │   │   gen = Generate(                                                                   │
│    129 │   │   │   conf=opt.conf,                                                                │
│    130 │   │   │   model=opt.model,                                                              │
│    131 │   │   │   sampler_name=opt.sampler_name,                                                │
│                                                                                                  │
│ /nix/store/j8bsmlg9p2f74bvkqn8j7kibhzzaxr7s-python3.10-InvokeAI-2.3.1.post2/lib/python3.10/site- │
│ packages/ldm/generate.py:162 in __init__                                                         │
│                                                                                                  │
│    159 │   │   weights=None,                                                                     │
│    160 │   │   config=None,                                                                      │
│    161 │   ):                                                                                    │
│ ❱  162 │   │   mconfig = OmegaConf.load(conf)                                                    │
│    163 │   │   self.height = None                                                                │
│    164 │   │   self.width = None                                                                 │
│    165 │   │   self.model_manager = None                                                         │
│                                                                                                  │
│ /nix/store/7wyc2gyn58nn39j8vwpqs5pgjgw1ikwq-python3.10-omegaconf-2.3.0/lib/python3.10/site-packa │
│ ges/omegaconf/omegaconf.py:189 in load                                                           │
│                                                                                                  │
│    186 │   │   from ._utils import get_yaml_loader                                               │
│    187 │   │                                                                                     │
│    188 │   │   if isinstance(file_, (str, pathlib.Path)):                                        │
│ ❱  189 │   │   │   with io.open(os.path.abspath(file_), "r", encoding="utf-8") as f:             │
│    190 │   │   │   │   obj = yaml.load(f, Loader=get_yaml_loader())                              │
│    191 │   │   elif getattr(file_, "read", None):                                                │
│    192 │   │   │   obj = yaml.load(file_, Loader=get_yaml_loader())                              │
╰──────────────────────────────────────────────────────────────────────────────────────────────────╯
FileNotFoundError: [Errno 2] No such file or directory: '/home/jack/invokeai/configs/models.yaml'

During handling of the above exception, another exception occurred:

╭─────────────────────────────── Traceback (most recent call last) ────────────────────────────────╮
│ /nix/store/j8bsmlg9p2f74bvkqn8j7kibhzzaxr7s-python3.10-InvokeAI-2.3.1.post2/bin/..invoke.py-wrap │
│ ped-wrapped:10 in <module>                                                                       │
│                                                                                                  │
│    7 from ldm.invoke.CLI import main                                                             │
│    8 if __name__ == '__main__':                                                                  │
│    9 │   sys.argv[0] = re.sub(r'(-script\.pyw|\.exe)?$', '', sys.argv[0])                        │
│ ❱ 10 │   sys.exit(main())                                                                        │
│   11                                                                                             │
│                                                                                                  │
│ /nix/store/j8bsmlg9p2f74bvkqn8j7kibhzzaxr7s-python3.10-InvokeAI-2.3.1.post2/lib/python3.10/site- │
│ packages/ldm/invoke/CLI.py:143 in main                                                           │
│                                                                                                  │
│    140 │   │   │   max_loaded_models=opt.max_loaded_models,                                      │
│    141 │   │   )                                                                                 │
│    142 │   except (FileNotFoundError, TypeError, AssertionError) as e:                           │
│ ❱  143 │   │   report_model_error(opt, e)                                                        │
│    144 │   except (IOError, KeyError) as e:                                                      │
│    145 │   │   print(f"{e}. Aborting.")                                                          │
│    146 │   │   sys.exit(-1)                                                                      │
│                                                                                                  │
│ /nix/store/j8bsmlg9p2f74bvkqn8j7kibhzzaxr7s-python3.10-InvokeAI-2.3.1.post2/lib/python3.10/site- │
│ packages/ldm/invoke/CLI.py:1216 in report_model_error                                            │
│                                                                                                  │
│   1213 │                                                                                         │
│   1214 │   from ldm.invoke.config import invokeai_configure                                      │
│   1215 │                                                                                         │
│ ❱ 1216 │   invokeai_configure.main()                                                             │
│   1217 │   print("** InvokeAI will now restart")                                                 │
│   1218 │   sys.argv = previous_args                                                              │
│   1219 │   main()  # would rather do a os.exec(), but doesn't exist?                             │
│                                                                                                  │
│ /nix/store/j8bsmlg9p2f74bvkqn8j7kibhzzaxr7s-python3.10-InvokeAI-2.3.1.post2/lib/python3.10/site- │
│ packages/ldm/invoke/config/invokeai_configure.py:828 in main                                     │
│                                                                                                  │
│   825 │   │   │   │   precision="float32" if opt.full_precision else "float16"                   │
│   826 │   │   │   )                                                                              │
│   827 │   │   else:                                                                              │
│ ❱ 828 │   │   │   init_options, models_to_download = run_console_ui(opt, init_file)              │
│   829 │   │   │   if init_options:                                                               │
│   830 │   │   │   │   write_opts(init_options, init_file)                                        │
│   831 │   │   │   else:                                                                          │
│                                                                                                  │
│ /nix/store/j8bsmlg9p2f74bvkqn8j7kibhzzaxr7s-python3.10-InvokeAI-2.3.1.post2/lib/python3.10/site- │
│ packages/ldm/invoke/config/invokeai_configure.py:681 in run_console_ui                           │
│                                                                                                  │
│   678 │                                                                                          │
│   679 │   set_min_terminal_size(MIN_COLS, MIN_LINES)                                             │
│   680 │   editApp = EditOptApplication(program_opts, invokeai_opts)                              │
│ ❱ 681 │   editApp.run()                                                                          │
│   682 │   if editApp.user_cancelled:                                                             │
│   683 │   │   return (None, None)                                                                │
│   684 │   else:                                                                                  │
│                                                                                                  │
│ /nix/store/8i8hsjcdxf838md0546vd5v7y2xkp3m2-python3.10-npyscreen-4.10.5/lib/python3.10/site-pack │
│ ages/npyscreen/apNPSApplication.py:30 in run                                                     │
│                                                                                                  │
│   27 │   def run(self, fork=None):                                                               │
│   28 │   │   """Run application.  Calls Mainloop wrapped properly."""                            │
│   29 │   │   if fork is None:                                                                    │
│ ❱ 30 │   │   │   return npyssafewrapper.wrapper(self.__remove_argument_call_main)                │
│   31 │   │   else:                                                                               │
│   32 │   │   │   return npyssafewrapper.wrapper(self.__remove_argument_call_main, fork=fork)     │
│   33                                                                                             │
│                                                                                                  │
│ /nix/store/8i8hsjcdxf838md0546vd5v7y2xkp3m2-python3.10-npyscreen-4.10.5/lib/python3.10/site-pack │
│ ages/npyscreen/npyssafewrapper.py:41 in wrapper                                                  │
│                                                                                                  │
│    38 │   │   wrapper_no_fork(call_function)                                                     │
│    39 │   else:                                                                                  │
│    40 │   │   if _NEVER_RUN_INITSCR:                                                             │
│ ❱  41 │   │   │   wrapper_no_fork(call_function)                                                 │
│    42 │   │   else:                                                                              │
│    43 │   │   │   wrapper_fork(call_function, reset=reset)                                       │
│    44                                                                                            │
│                                                                                                  │
│ /nix/store/8i8hsjcdxf838md0546vd5v7y2xkp3m2-python3.10-npyscreen-4.10.5/lib/python3.10/site-pack │
│ ages/npyscreen/npyssafewrapper.py:82 in wrapper_no_fork                                          │
│                                                                                                  │
│    79 │   return_code = None                                                                     │
│    80 │   if _NEVER_RUN_INITSCR:                                                                 │
│    81 │   │   _NEVER_RUN_INITSCR = False                                                         │
│ ❱  82 │   │   locale.setlocale(locale.LC_ALL, '')                                                │
│    83 │   │   _SCREEN = curses.initscr()                                                         │
│    84 │   │   try:                                                                               │
│    85 │   │   │   curses.start_color()                                                           │
│                                                                                                  │
│ /nix/store/iw1vmh509hcbby8dbpsaanbri4zsq7dj-python3-3.10.10/lib/python3.10/locale.py:620 in      │
│ setlocale                                                                                        │
│                                                                                                  │
│    617 │   if locale and not isinstance(locale, _builtin_str):                                   │
│    618 │   │   # convert to string                                                               │
│    619 │   │   locale = normalize(_build_localename(locale))                                     │
│ ❱  620 │   return _setlocale(category, locale)                                                   │
│    621                                                                                           │
│    622 def resetlocale(category=LC_ALL):                                                         │
│    623                                                                                           │
╰──────────────────────────────────────────────────────────────────────────────────────────────────╯
Error: unsupported locale setting

@hansfbaier
Author

hansfbaier commented Apr 3, 2023

Ah, I missed LC_ALL=C.
Also, I have an RX570, so I will need to do this: #13
I'll let you know how this turns out.
Thanks!
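
For anyone hitting both problems, a combined invocation might look roughly like this (a sketch; the right override value for an RX570 is whatever #13 recommends):

# Sketch only: `export` (or a VAR=value prefix on the command) is required;
# a bare `$HSA_OVERRIDE_GFX_VERSION='10.3.0'` only expands the variable
# instead of setting it.
export LC_ALL=C
export HSA_OVERRIDE_GFX_VERSION=10.3.0   # value depends on the card, see #13
nix run -L "github:nixified-ai/flake#invokeai-amd" -- --web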

@MatthewCroughan
Member

> Ah, I missed LC_ALL=C

Yeah, that's an interesting issue; you always find these issues when using Nix. The program won't set it, your distro isn't setting it, and we aren't setting it, so overall it is unset, and when this variable is unset a lot of programs misbehave. Maybe we should set it in the wrapper to avoid issues like this in the future.
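
A minimal sketch of what such a wrapper could do (hypothetical; the flake might implement this differently, e.g. via makeWrapper, and the `invokeai` entry-point name is assumed here):

#!/usr/bin/env bash
# Hypothetical wrapper sketch: fall back to the C locale if the user has not
# chosen one, since an unset LC_ALL makes npyscreen's locale.setlocale() fail.
: "${LC_ALL:=C}"
export LC_ALL
exec invokeai "$@"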
