- updated dependencies; now using `prompt_manager` v0.4.2
- removed `semver`; replaced with `versionaire`
- fixed prompt pipelines
- added `//next` and `//pipeline` directives as shortcuts to `//config [next,pipeline]`
- Added new backend "client" as an internal OpenAI client
- Added --sm, --speech_model default: tts-1
- Added --tm, --transcription_model default: whisper-1
- Added --voice default: alloy (if set to "siri" on a Mac, the CLI tool `say` is used)
- Added --image_size and --image_quality (--is --iq)
- Added the ability to accept piped-in text, which is appended to the end of the prompt text: `curl $URL | aia ad_hoc`
- Fixed bugs with entering directives as follow-up prompts during a chat session
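The piped-in text behavior described above can be sketched in Ruby like this; `append_piped_text` is a hypothetical helper for illustration, not the gem's actual implementation:

```ruby
# Hypothetical helper (not the gem's actual code): append piped-in
# text -- e.g. from `curl $URL | aia ad_hoc` -- to the prompt text.
def append_piped_text(prompt, io = $stdin)
  return prompt if io.tty?   # a TTY means nothing was piped in
  piped = io.read.strip
  piped.empty? ? prompt : "#{prompt}\n\n#{piped}"
end
```

For example, `append_piped_text("Summarize this:", StringIO.new("page text"))` returns the prompt with the piped text appended after a blank line.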
- Directly access OpenAI to do text-to-speech when using the `--speak` option
- Added --voice to specify which voice to use
- Added --speech_model to specify which TTS model to use
- Added CLI utility `llm` as a backend processor
- Happy Birthday Ruby!
- Added --next CLI option
- Added --pipeline CLI option
- allow directives to return information that is inserted into the prompt text
- added //shell command directive
- added //ruby ruby_code directive
- added //include path_to_file directive
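The three directives above each return text that replaces the directive line in the prompt. A simplified (and deliberately unsafe, since it uses `eval` and backticks) sketch of how such expansion can work, not the gem's actual parser:

```ruby
# Simplified sketch of directive expansion (illustrative only):
# each //directive line is replaced by the text it returns.
def expand_directives(prompt_text)
  prompt_text.lines.map do |line|
    case line
    when %r{\A//shell\s+(.+)}   then `#{$1.strip}`          # run a shell command
    when %r{\A//ruby\s+(.+)}    then eval($1).to_s + "\n"   # evaluate Ruby code
    when %r{\A//include\s+(.+)} then File.read($1.strip)    # insert a file's contents
    else line
    end
  end.join
end
```

For example, `expand_directives("//ruby 1+1\n")` yields `"2\n"`, while ordinary lines pass through unchanged.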
- Added --roles_dir to isolate roles from other prompts if desired
- Changed --prompts to --prompts_dir to be consistent
- Refactored common fzf usage into its own tool class
- Added an "I'm working" spinner when "--verbose" is used, as an indication that the backend is composing its response to the prompt.
- Changed the behavior of the --dump option. It must now be followed by path/to/file.ext where ext is a supported config file format: yml, yaml, toml
- Added ERB processing to the config_file
- Added processing of directives, shell integration, and ERB to follow-up prompts in a chat session
- some code refactoring.
- added the ability to render markdown to the terminal using the `glow` CLI utility
- wrap the response when it is going to the terminal
- removed a wicked `puts "loaded"` statement
- fixed missed code when the options were changed to --out_file and --log_file
- fixed completion functions by updating $PROMPT_DIR to $AIA_PROMPTS_DIR to match the documentation.
- breaking changes:
  - changed `--config` to `--config_file`
  - changed `--env` to `--shell`
  - changed `--output` to `--out_file`
  - changed default `out_file` to STDOUT
- added --env to process embedded system environment variables and shell commands within a prompt.
- added --erb to process Embedded RuBy within a prompt, because having embedded shell commands will only get you into trouble; having ERB will really get you into trouble. Remember: the simple prompt is usually the best prompt.
- added the --role CLI option to prepend a "role" prompt to the front of a primary prompt.
- added a chat mode
- prompt directives now supported
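The --erb option above enables ERB processing of the prompt text. With Ruby's standard `erb` library, the substitution looks like this (a generic illustration, not the gem's own code):

```ruby
require 'erb'

# Generic illustration (not the gem's own code): <%= ... %> tags in the
# prompt are evaluated as Ruby and their results substituted into the text.
prompt   = "Today is <%= Time.now.strftime('%Y-%m-%d') %>. Summarize the news."
rendered = ERB.new(prompt).result(binding)
puts rendered
```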
- version bumped to match the `prompt_manager` gem
- added a workaround for an issue with multiple context files going to the `mods` backend
- added the `shellwords` gem to sanitize prompt text on the command line
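Ruby's standard `shellwords` library, mentioned above, escapes strings for safe interpolation into shell commands and splits command lines while honoring quoting:

```ruby
require 'shellwords'

# Escape a string so it can be interpolated safely into a shell command.
safe  = Shellwords.escape("a b")   # => "a\\ b" (the 4-character string a\ b)
# Split a command line into words, honoring single quotes.
words = Shellwords.split("summarize 'chapter one' notes.txt")
# words == ["summarize", "chapter one", "notes.txt"]
```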
- major code refactoring.
- supports config files *.yml, *.yaml and *.toml
- usage implemented as a man page; --help will display the man page.
- added "--dump <yml|yaml|toml>" to send current configuration to STDOUT
- added "--completion <bash|fish|zsh>" to send a completion function for the indicated shell to STDOUT
- added system environment variable (envar) overrides of default config values: uppercase environment variables prefixed with "AIA_" plus the config item name, for example AIA_PROMPTS_DIR and AIA_MODEL. All config items can be overridden by their corresponding envars.
- config value hierarchy is:
  - values from the config file, which override ...
  - command line values, which override ...
  - envar values, which override ...
  - default values
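Following the precedence listed above (config file highest, then command line, then envars, then defaults), the resolution can be sketched as a chain of hash merges; `resolve_config` is an illustrative helper, not the gem's code:

```ruby
# Sketch of the config value hierarchy described above (illustrative only):
# defaults are overridden by envar values, which are overridden by command
# line values, which are overridden by config file values.
def resolve_config(defaults:, envars:, cli:, config_file:)
  defaults.merge(envars).merge(cli).merge(config_file)
end
```

For example, with `defaults: {model: "a"}` and `envars: {model: "b"}` and nothing else set, the resolved value is `"b"`; a config-file value would win over all of them.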
- Matching version to prompt_manager. This version allows for the use of history in the entry of values for prompt keywords. KW_HISTORY_MAX is set at 5. Changed CLI interaction to use historical selection and editing of prior keyword values.
- Initial release