- Version update to v0.2.02
- Supports the new Llama 3 and Llama 3.1 models
- Supports streaming outputs
- Can be used through the official Groq Python API by pointing its base URL at the local groqon server (see the first sketch after this list)
- Faster than previous versions
- `groqon stop` now works as expected
- Added a server-client feature
- Processes queries asynchronously (see the async sketch after this list)
- Bug fixes
- Fixed selectors
- No longer waits until timeout after logging in
- Feature: `--reset_login` to clear the old cookie file
- Added async functionality
- Added system prompt support for more flavourful conversations
- Added documentation
- Printing output is now optional (toggled with a boolean flag)
- Added type hints
- Added a GitHub Action to publish to PyPI
- Published to PyPI via the GitHub Action
- Published the groqon package (read it as "groq on", like "rock on!!"; it's a sign of the times)
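
As a minimal sketch of the "official Groq Python API with base URL" usage above: the port (`8888`), the dummy API key, and the model name are assumptions for illustration, not values taken from groqon's defaults. The snippet also exercises the streaming output and system prompt support listed above.

```python
from groq import Groq

# Assumed local groqon server address and placeholder key;
# adjust to whatever your groqon instance actually exposes.
client = Groq(api_key="dummy", base_url="http://localhost:8888")

stream = client.chat.completions.create(
    model="llama-3.1-8b-instant",  # illustrative model name
    messages=[
        {"role": "system", "content": "You are a cheerful assistant."},  # system prompt
        {"role": "user", "content": "Tell me a one-line joke about rocks."},
    ],
    stream=True,  # streaming output
)

# Print tokens as they arrive.
for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)
print()
```

This relies on the local server mirroring the Groq API routes, which is what "can be used through the official Groq Python API" implies here.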
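
And a hedged sketch, in the spirit of the asynchronous query processing noted above, that fires several queries concurrently using the SDK's `AsyncGroq` client; again the URL, key, and model name are placeholders rather than groqon's own API.

```python
import asyncio

from groq import AsyncGroq


async def ask(client: AsyncGroq, question: str) -> str:
    # One non-streaming completion per question.
    resp = await client.chat.completions.create(
        model="llama-3.1-8b-instant",  # illustrative model name
        messages=[{"role": "user", "content": question}],
    )
    return resp.choices[0].message.content


async def main() -> None:
    # Assumed local groqon server address and placeholder key.
    client = AsyncGroq(api_key="dummy", base_url="http://localhost:8888")
    questions = [
        "What is Groq?",
        "Summarise Llama 3.1 in one sentence.",
        "Say hello in French.",
    ]
    # Send all queries concurrently and wait for the answers.
    answers = await asyncio.gather(*(ask(client, q) for q in questions))
    for q, a in zip(questions, answers):
        print(f"Q: {q}\nA: {a}\n")


asyncio.run(main())
```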