Core Update (#34)
* updated deps and readme
* updated llm core
* change tooling from black and pylint to ruff
* updated to use Textuals new theme system
* added support for max context size
* various bug fixes and cleanup
* handle updated return values from Ollama lib
* fix some theme and layout issues
* changed from pyperclip to clipman
* fixed LlmConfig loading
* added ctrl+shift+c to copy markdown fence data from chat message
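The black/pylint → ruff switch noted above typically lands as a small section in `pyproject.toml`; a minimal sketch of what such a config can look like (the rule selection and line length here are assumptions for illustration, not taken from this commit):

```toml
[tool.ruff]
target-version = "py310"
line-length = 120

[tool.ruff.lint]
# E = pycodestyle errors, F = pyflakes, I = import sorting (replaces isort)
select = ["E", "F", "I"]
```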
paulrobello authored Dec 30, 2024
1 parent 421238a commit b935b70
Showing 91 changed files with 2,850 additions and 5,770 deletions.
1 change: 1 addition & 0 deletions .gitignore
@@ -23,6 +23,7 @@
**/build/
**/dist/
**/parllama.egg-info/
**/.DS_Store
/quantize_model/
/quantize_workspace/
/notes.md
51 changes: 16 additions & 35 deletions .pre-commit-config.yaml
@@ -18,37 +18,7 @@ repos:
- id: check-json
- id: pretty-format-json
args: [--autofix, --no-sort-keys]
exclude: tests(/\w*)*/functional/|tests/input|tests(/.*)+/conftest.py|doc/data/messages|tests(/\w*)*data/|Pipfile.lock

#- repo: https://github.com/asottile/reorder-python-imports
# rev: v3.13.0
# hooks:
# - id: reorder-python-imports
# args: [--py39-plus, --add-import, "from __future__ import annotations"]
# exclude: tests(/\w*)*/functional/|tests/input|tests(/.*)+/conftest.py|doc/data/messages|tests(/\w*)*data/

- repo: https://github.com/psf/black
rev: 24.8.0
hooks:
- id: black
args: [-t, py310, -t, py311, -t, py312]
exclude: tests(/\w*)*/functional/|tests/input|tests(/.*)+/conftest.py|doc/data/messages|tests(/\w*)*data/

#- repo: local
# hooks:
# - id: mypy
# name: mypy
# entry: make
# language: system
# pass_filenames: false
# args:
# [typecheck]
# exclude: tests(/\w*)*/functional/|tests/input|tests(/\w*)*data/|doc/

#- repo: https://github.com/RobertCraigie/pyright-python
# rev: v1.1.376
# hooks:
# - id: pyright
exclude: tests(/\w*)*/functional/|tests/input|tests(/.*)+/conftest.py|doc/data/messages|tests(/\w*)*data/|Pipfile.lock|output/.*

- repo: local
hooks:
@@ -59,15 +29,26 @@ repos:
pass_filenames: false
args:
[typecheck]
exclude: tests(/\w*)*/functional/|tests/input|tests(/\w*)*data/|doc/
exclude: tests(/\w*)*/functional/|tests/input|tests(/\w*)*data/|doc/|output/.*

- repo: local
hooks:
- id: format
name: format
entry: make
language: system
pass_filenames: false
args:
[format]
exclude: tests(/\w*)*/functional/|tests/input|tests(/\w*)*data/|doc/|output/.*

- repo: local
hooks:
- id: pylint
name: pylint
- id: lint
name: lint
entry: make
language: system
pass_filenames: false
args:
[lint]
exclude: tests(/\w*)*/functional/|tests/input|tests(/\w*)*data/|doc/
exclude: tests(/\w*)*/functional/|tests/input|tests(/\w*)*data/|doc/|output/.*
28 changes: 0 additions & 28 deletions .pylintrc

This file was deleted.

22 changes: 12 additions & 10 deletions Makefile
@@ -3,12 +3,10 @@
lib := parllama
run := uv run
python := $(run) python
lint := $(run) pylint
ruff := $(run) ruff
pyright := $(run) pyright
twine := $(run) twine
build := $(python) -m build
black := $(run) black
isort := $(run) isort

export UV_LINK_MODE=copy
export PIPENV_VERBOSITY=-1
@@ -103,9 +101,18 @@ shell: # Start shell inside of .venv
$(run) bash
##############################################################################
# Checking/testing/linting/etc.
.PHONY: format
format: # Reformat the code with ruff.
$(ruff) format src/$(lib)

.PHONY: lint
lint: # Run Pylint over the library
$(lint) $(lib)
lint: # Run ruff lint over the library
$(ruff) check src/$(lib) --fix

.PHONY: lint-unsafe
lint-unsafe: # Run ruff lint over the library
$(ruff) check src/$(lib) --fix --unsafe-fixes


.PHONY: typecheck
typecheck: # Perform static type checks with pyright
@@ -156,11 +163,6 @@ dist: packagecheck # Upload to pypi
get-venv-name:
$(run) which python

.PHONY: ugly
ugly: # Reformat the code with black.
# $(isort) $(lib)
$(black) $(lib)

.PHONY: repl
repl: # Start a Python REPL
$(python)
36 changes: 32 additions & 4 deletions README.md
@@ -21,13 +21,14 @@
* [Quick start chat workflow](#quick-start-chat-workflow)
* [Custom Prompts](#Custom-Prompts)
* [Themes](#themes)
* [Screen Help](https://raw.githubusercontent.com/paulrobello/parllama/main/src/parllama/help.md)
* [Screen Help](https://github.com/paulrobello/parllama/blob/main/src/parllama/help.md)
* [Contributing](#contributing)
* [FAQ](#faq)
* [Roadmap](#roadmap)
* [Where we are](#where-we-are)
* [Where we're going](#where-were-going)
* [What's new](#whats-new)
* [v0.3.11](#v0311)
* [v0.3.10](#v0310)
* [v0.3.9](#v039)
* [v0.3.8](#v038)
@@ -49,7 +50,7 @@
![PyPI - License](https://img.shields.io/pypi/l/parllama)

## About
PAR LLAMA is a TUI application designed for easy management and use of Ollama based LLMs.
PAR LLAMA is a TUI (Text UI) application designed for easy management and use of Ollama-based LLMs.
The application was built with [Textual](https://textual.textualize.io/) and [Rich](https://github.com/Textualize/rich?tab=readme-ov-file)
and runs on all major operating systems, including Windows, Windows WSL, macOS, and Linux.

@@ -303,6 +304,16 @@ make dev
* Towards the very top of the app you will see which model is loaded and what percent of it is loaded into the GPU / CPU. If a model can't be loaded 100% onto the GPU, it will run slower.
* Type "/help" or "/?" to see what other slash commands are available.

## LlamaCPP support
Parllama supports LlamaCPP running in OpenAI server mode. Parllama uses a default base_url of http://127.0.0.1:8080, which can be configured on the Options tab.
To start a LlamaCPP server, run one of the following commands in a separate terminal:
```bash
llama-server -m PATH_TO_MODEL
```
or
```bash
llama-server -mu URL_TO_MODEL
```
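Once llama-server is up, Parllama talks to it through its OpenAI-compatible HTTP API. As a sketch of what that wiring looks like, the chat endpoint can be derived from the default base_url above (the `/v1/chat/completions` path follows the OpenAI API convention and is an assumption here, not taken from this commit):

```python
# Parllama's default base_url for a local LlamaCPP server (see above)
base_url = "http://127.0.0.1:8080"

# OpenAI-convention chat endpoint; path assumed for illustration
endpoint = f"{base_url}/v1/chat/completions"
print(endpoint)
```

Pointing any OpenAI-style client at this base_url should work the same way, since llama-server emulates that API surface.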

## Custom Prompts
You can create a library of custom prompts for easy starting of new chats.
@@ -381,13 +392,18 @@ if anything remains to be fixed before the commit is allowed.
* A: ParLlama by default does not require any network / internet access unless you enable checking for updates or want to import / use data from an online source.
* Q: Does ParLlama run on ARM?
* A: Short answer: yes. ParLlama should run anywhere Python does. It has been tested on Windows 11 x64, Windows WSL x64, and macOS Intel and Apple Silicon
* Q: Does Parllama require Ollama be installed locally?
* Q: Does ParLlama require Ollama be installed locally?
* A: No. ParLlama has options to connect to remote Ollama instances
* Q: Does ParLlama require Ollama?
* A: No. ParLlama can be used with most online AI providers
* Q: Does ParLlama support vision LLMs?
* A: Yes. If the selected provider / model supports vision you can add images to the chat via /slash commands

## Roadmap

### Where we are
* Initial release - Find, maintain and create new models
* Theme support
* Connect to remote instances
* Chat with history / conversation management
* Chat tabs allow chat with multiple models at same time
@@ -398,13 +414,25 @@ if anything remains to be fixed before the commit is allowed.

### Where we're going

* Better image support via file pickers
* Ability to copy code and other subsections from chat
* RAG for local documents and web pages
* Expand ability to import custom prompts of other tools
* LLM tool use


## What's new

### v0.3.11

* Added ability to set max context size for Ollama and other providers that support it
* Limited support for LlamaCPP running in OpenAI server mode
* Added ability to cycle through fences in selected chat message and copy to clipboard with `ctrl+shift+c`
* Added theme selector
* Various bug fixes and performance improvements
* Updated core AI library and dependencies
* Fixed crash due to upstream library update

### v0.3.10
* Fixed crash issues on fresh installs
* Images are now stored in chat session json files
Binary file added docs/theme-select.png
122 changes: 0 additions & 122 deletions old-setup.cfg

This file was deleted.

