fix: use OpenMP flag to avoid macOS segfault #34

Merged
merged 5 commits on Mar 18, 2024
Changes from 2 commits
README.md: 1 change, 1 addition & 0 deletions
@@ -117,6 +117,7 @@ For most users, the easiest way to install Selfie is to follow the [Quick Start]
6. Run `poetry install` to install required Python dependencies.
7. Optional: Run `./scripts/llama-cpp-python-cublas.sh` to enable hardware acceleration (for details, see [Scripts](#llama-cpp-python-cublassh)).
8. Run `poetry run python -m selfie`, or `poetry run python -m selfie --gpu` if your device is GPU-enabled. The first time you run this, it will download ~4GB of model weights.
+- On macOS, you may need to run `OMP_NUM_THREADS=1 poetry run python -m selfie` to avoid segmentation faults (with or without `--gpu`). [Read more here](https://github.com/vana-com/selfie/issues/33#issuecomment-2004637058).

[//]: # (1. `git clone
[//]: # (Disable this note about installing with GPU support until supported via transformers, etc.)
selfie/connectors/base_connector.py: 2 changes, 1 addition & 1 deletion
@@ -36,7 +36,7 @@ def get_documentation_markdown(self):
def _read_file(self, file_name: str) -> str | None:
file_path = os.path.join(os.path.dirname(__file__), self.id, file_name)
if os.path.exists(file_path):
-with open(file_path, 'r') as file:
+with open(file_path, 'r', encoding='utf-8') as file:
return file.read()
else:
return None
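
Both `encoding='utf-8'` changes in this PR follow the same pattern: pin the text encoding instead of relying on `open()`'s platform-dependent default, which comes from the locale and is not UTF-8 everywhere (notably on some Windows setups). A minimal, self-contained illustration of the behavior being avoided, separate from the PR's actual code:

```python
import locale
import tempfile

# open() without an explicit encoding uses the locale's preferred encoding,
# which varies by platform (e.g. cp1252 on some Windows installs).
print("default encoding:", locale.getpreferredencoding(False))

# Write a file containing non-ASCII text, the kind of content connector
# documentation or blacklist patterns might hold.
with tempfile.NamedTemporaryFile("w", encoding="utf-8", suffix=".md", delete=False) as f:
    f.write("café ☕")
    path = f.name

# Reading it back with an explicit encoding behaves the same on every platform;
# omitting encoding= can raise UnicodeDecodeError where the default is not UTF-8.
with open(path, "r", encoding="utf-8") as file:
    print(file.read())
```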
selfie/parsers/chat/__init__.py: 2 changes, 1 addition & 1 deletion
@@ -35,7 +35,7 @@
# current_dir = os.path.dirname(os.path.abspath(__file__))
# blacklist_file_path = os.path.join(current_dir, "blacklist_patterns.yaml")

-with open(blacklist_file_path, "r") as f:
+with open(blacklist_file_path, "r", encoding='utf-8') as f:
default_blacklist_patterns = yaml.safe_load(f)
default_blacklist_patterns = [
pattern.strip() for pattern in default_blacklist_patterns
selfie/types/completion_requests.py: 4 changes, 2 additions & 2 deletions
@@ -82,11 +82,11 @@ def openai_params(self):
return {
k: v
for k, v in self.model_dump().items()
-if k not in BaseCompletionRequest.custom_params and v is not None
+if k not in self.custom_params and v is not None
}

def selfie_params(self):
-return {k: v for k, v in self.model_dump().items() if k in BaseCompletionRequest.custom_params and v is not None}
+return {k: v for k, v in self.model_dump().items() if k in self.custom_params and v is not None}

def extra_params(self):
"""
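
The `completion_requests.py` change swaps a class-qualified lookup for an attribute lookup on `self`, so `custom_params` can be overridden by subclasses of `BaseCompletionRequest`. A simplified sketch of the difference, without the Pydantic `model_dump()` used in the real class; the subclass name and field set below are hypothetical, not taken from the PR:

```python
class BaseCompletionRequest:
    # Hypothetical set of selfie-specific fields, for illustration only.
    custom_params = {"api_key", "method"}

    def selfie_params(self, data: dict) -> dict:
        # `self.custom_params` resolves through the instance's class, so a
        # subclass override is respected; `BaseCompletionRequest.custom_params`
        # would always use the base class's set.
        return {k: v for k, v in data.items() if k in self.custom_params and v is not None}


class ExtendedCompletionRequest(BaseCompletionRequest):
    custom_params = BaseCompletionRequest.custom_params | {"disable_augmentation"}


data = {"model": "gpt-x", "api_key": "secret", "disable_augmentation": True}
print(BaseCompletionRequest().selfie_params(data))      # only 'api_key' survives
print(ExtendedCompletionRequest().selfie_params(data))  # 'disable_augmentation' too
```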
start.sh: 11 changes, 10 additions & 1 deletion
@@ -26,7 +26,7 @@ else
fi

echo "Installing Python dependencies with Poetry..."
-poetry check || poetry install
+poetry install

echo "Building UI with Yarn..."
./scripts/build-ui.sh
@@ -35,4 +35,13 @@ echo "Running llama-cpp-python-cublas.sh to enable hardware acceleration..."
./scripts/llama-cpp-python-cublas.sh

echo "Running selfie..."

if [ "$(uname -m)" = "arm64" ]; then
Member: arm64 can be Linux too, no? Is it okay?

Member Author: We could also check whether `uname -s` matches Darwin, but I'm not sure this is limited to Macs only... it should be okay to use a conservative setting here. Maybe it impacts performance, I'm not sure.

Ultimately it would be great if we could remove both of these flags, especially `KMP_DUPLICATE_LIB_OK`, which doesn't appear to be a 100% reliable fix. (A sketch of the Darwin-based check follows this diff.)

+ENV_FLAG="OMP_NUM_THREADS=1"
+fi

+if [ ! -z "$ENV_FLAG" ]; then
+export $ENV_FLAG
+fi

poetry run python -m selfie $GPU_FLAG
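
Regarding the review thread above: `uname -m` reporting `arm64` is not unique to macOS, which is why the author mentions checking `uname -s` for Darwin instead. A minimal sketch of that stricter check, written in Python for illustration only; the helper name `needs_openmp_workaround` is hypothetical, and the PR itself keeps the guard in `start.sh`:

```python
import os
import platform


def needs_openmp_workaround() -> bool:
    """Hypothetical helper: apply the workaround only on Apple Silicon macOS.

    platform.system() returns "Darwin" on macOS and "Linux" on Linux, so this
    is stricter than checking the machine architecture alone, which can also
    match 64-bit ARM Linux hosts.
    """
    return platform.system() == "Darwin" and platform.machine() == "arm64"


if needs_openmp_workaround():
    # Mirrors `export OMP_NUM_THREADS=1` from start.sh; setdefault() keeps any
    # value the user has already exported.
    os.environ.setdefault("OMP_NUM_THREADS", "1")
```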