
Commit

V0.3.8 readme
bigbrother666sh committed Jan 24, 2025
1 parent b1ef7a2 commit fdfe5c0
Showing 10 changed files with 10 additions and 228 deletions.
4 changes: 2 additions & 2 deletions CHANGELOG.md
@@ -35,11 +35,11 @@

Provided a custom extractor interface to allow users to customize according to actual needs.

- - bug 修复以及其他改进(crawl4ai浏览器生命周期管理,异步 llm wrapper 等)(感谢 @tusik 贡献异步 llm wrapper
+ - bug 修复以及其他改进(crawl4ai浏览器生命周期管理,异步 llm wrapper 等)(感谢 @tusik 贡献

Bug fixes and other improvements (crawl4ai browser lifecycle management, asynchronous llm wrapper, etc.)

- Thanks to @tusik for contributing the asynchronous LLM wrapper
+ Thanks to @tusik for contributing

# V0.3.6
- 改用 Crawl4ai 作为底层爬虫框架,其实Crawl4ai 和 Crawlee 的获取效果差别不大,二者也都是基于 Playwright ,但 Crawl4ai 的 html2markdown 功能很实用,而这对llm 信息提取作用很大,另外 Crawl4ai 的架构也更加符合我的思路;
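The changelog hunk above credits @tusik with the asynchronous LLM wrapper. As background, here is a minimal sketch of the idea, assuming an OpenAI-compatible endpoint and illustrative environment-variable names; wiseflow's actual wrapper may be organized quite differently.

```python
import asyncio
import os

from openai import AsyncOpenAI  # assumes an OpenAI-compatible provider

# Hypothetical environment variables; wiseflow's real key names may differ.
_client = AsyncOpenAI(api_key=os.getenv("LLM_API_KEY", ""),
                      base_url=os.getenv("LLM_API_BASE") or None)
_semaphore = asyncio.Semaphore(5)  # cap concurrent requests to the provider

async def llm(messages: list[dict], model: str, **kwargs) -> str:
    """Send one chat request, with a concurrency cap and simple retries."""
    async with _semaphore:
        for attempt in range(3):
            try:
                resp = await _client.chat.completions.create(
                    model=model, messages=messages, **kwargs)
                return resp.choices[0].message.content
            except Exception:
                if attempt == 2:
                    raise
                await asyncio.sleep(2 ** attempt)  # back off before retrying

# usage (illustrative):
# asyncio.run(llm([{"role": "user", "content": "hello"}], model="Qwen/Qwen2.5-7B-Instruct"))
```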
4 changes: 0 additions & 4 deletions README.md
@@ -89,10 +89,6 @@ wiseflow 0.3.x版本使用 pocketbase 作为数据库,你当然也可以手动

🌟 **这里与之前版本不同**,V0.3.5开始需要把 .env 放置在 [core](./core) 文件夹中。

- **windows 用户可以参考 core文件夹下的 windows.env windows_run.py 文件,执行 windows_run.py 脚本**

- 感谢 @c469591 贡献的 windows 下原生启动脚本

#### 3.1 大模型相关配置

wiseflow 是 LLM 原生应用,请务必保证为程序提供稳定的 LLM 服务。
4 changes: 0 additions & 4 deletions README_EN.md
@@ -88,10 +88,6 @@ For details, please refer to [pb/README.md](/pb/README.md)

🌟 **This is different from previous versions** - starting from V0.3.5, the .env file needs to be placed in the [core](./core) folder.

- **Windows users can refer to the windows.env and windows_run.py files in the core folder and execute the windows_run.py script**

- Thanks to @c469591 for contributing the native Windows startup script

#### 3.1 Large Language Model Configuration

Wiseflow is a LLM native application, so please ensure you provide stable LLM service for the program.
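The README hunks above stress two setup points: the .env file now lives in the core folder (V0.3.5+), and a stable LLM service must be configured. A minimal sketch of loading such a file from Python follows; the key names are placeholders for illustration, not wiseflow's documented variables.

```python
import os

from dotenv import load_dotenv  # pip install python-dotenv

# V0.3.5+ expects the .env file inside the core folder
load_dotenv(os.path.join("core", ".env"))

# Placeholder key names, for illustration only.
api_key = os.getenv("LLM_API_KEY")
api_base = os.getenv("LLM_API_BASE", "")
primary_model = os.getenv("PRIMARY_MODEL", "Qwen/Qwen2.5-7B-Instruct")

if not api_key:
    raise RuntimeError("LLM_API_KEY is missing; wiseflow needs a stable LLM service to run")
```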
4 changes: 0 additions & 4 deletions README_JP.md
@@ -89,10 +89,6 @@ Wiseflow 0.3.xはデータベースとしてpocketbaseを使用しています

🌟 **これは以前のバージョンとは異なります** - V0.3.5以降、.envファイルは[core](./core)フォルダに配置する必要があります。

- **Windowsユーザーはcoreフォルダ内のwindows.envとwindows_run.pyファイルを参照し、windows_run.pyスクリプトを実行してください**

- @c469591によるWindows用ネイティブ起動スクリプトの貢献に感謝いたします

#### 3.1 大規模言語モデルの設定

Wiseflowは LLMネイティブアプリケーションですので、プログラムに安定したLLMサービスを提供するようにしてください。
4 changes: 0 additions & 4 deletions README_KR.md
@@ -88,10 +88,6 @@ Wiseflow 0.3.x는 데이터베이스로 pocketbase를 사용합니다. pocketbas

🌟 **이전 버전과 다릅니다** - V0.3.5부터 .env 파일은 [core](./core) 폴더에 위치해야 합니다.

- **windows 사용자는 core 폴더의 windows.env와 windows_run.py 파일을 참조하여 windows_run.py 스크립트를 실행할 수 있습니다**

- @c469591님이 기여해 주신 windows용 네이티브 실행 스크립트에 감사드립니다

#### 3.1 대규모 언어 모델 구성

Wiseflow는 LLM 네이티브 애플리케이션이므로 프로그램에 안정적인 LLM 서비스를 제공하도록 해주세요.
6 changes: 0 additions & 6 deletions core/windows.env

This file was deleted.

192 changes: 0 additions & 192 deletions core/windows_general_process.py

This file was deleted.

1 change: 0 additions & 1 deletion core/windows_run.py
@@ -72,7 +72,6 @@ def main():
print(f"Error: run_task.py not found at: {process_script}")
except subprocess.CalledProcessError as e:
print(f"Error running run_task.py: {e}")

else:
print("Failed to start services")

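For context on the fragment above, here is a self-contained sketch of the launch-and-handle-errors pattern it belongs to; the script path is hypothetical, and the real windows_run.py wires this into its service-startup logic.

```python
import subprocess
import sys
from pathlib import Path

# Hypothetical location of the task script; the real path in windows_run.py may differ.
process_script = Path(__file__).parent / "run_task.py"

if not process_script.exists():
    print(f"Error: run_task.py not found at: {process_script}")
else:
    try:
        # check=True makes subprocess raise CalledProcessError on a non-zero exit code
        subprocess.run([sys.executable, str(process_script)], check=True)
    except subprocess.CalledProcessError as e:
        print(f"Error running run_task.py: {e}")
```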
17 changes: 8 additions & 9 deletions test/get_info_test.py
@@ -10,7 +10,6 @@
sys.path.append(core_path)

# 现在可以直接导入模块,因为core目录已经在Python路径中
- from scrapers import *
from utils.general_utils import is_chinese
from agents.get_info import get_author_and_publish_date, get_info, get_more_related_urls
from agents.get_info_prompts import *
@@ -20,7 +19,7 @@
# models = ['deepseek-reasoner']
models = ['Qwen/Qwen2.5-7B-Instruct', 'Qwen/Qwen2.5-14B-Instruct', 'Qwen/Qwen2.5-32B-Instruct', 'deepseek-ai/DeepSeek-V2.5']

- async def main(sample: dict, include_ap: bool, prompts: list, focus_dict: dict, record_file: str):
+ async def main(sample: dict, include_ap: bool, prompts: list, record_file: str):
    link_dict, links_parts, contents = sample['link_dict'], sample['links_part'], sample['contents']
    get_link_sys_prompt, get_link_suffix_prompt, get_info_sys_prompt, get_info_suffix_prompt = prompts

@@ -91,16 +90,16 @@ async def main(sample: dict, include_ap: bool, prompts: list, focus_dict: dict,
focus_points = json.load(open(os.path.join(sample_dir, 'focus_point.json'), 'r'))
focus_statement = ''
for item in focus_points:
-   tag = item["focuspoint"]
-   expl = item["explanation"]
-   focus_statement = f"{focus_statement}//{tag}//\n"
+   tag = item["focuspoint"].strip()
+   expl = item["explanation"].strip()
+   focus_statement = f"{focus_statement}//{tag}//"
    if expl:
        if is_chinese(expl):
-           focus_statement = f"{focus_statement}解释{expl}\n"
+           focus_statement = f"{focus_statement}\n解释{expl}\n"
        else:
-           focus_statement = f"{focus_statement}Explanation: {expl}\n"
+           focus_statement = f"{focus_statement}\nExplanation: {expl}\n"

- focus_dict = {item["focuspoint"]: item["focuspoint"] for item in focus_points}
+ #focus_dict = {item["focuspoint"]: item["focuspoint"] for item in focus_points}
date_stamp = datetime.now().strftime('%Y-%m-%d')
if is_chinese(focus_statement):
get_link_sys_prompt = get_link_system.replace('{focus_statement}', focus_statement)
@@ -134,4 +133,4 @@ async def main(sample: dict, include_ap: bool, prompts: list, focus_dict: dict,
with open(record_file, 'a') as f:
    f.write(f"raw materials: {file}\n\n")
print(f'start testing {file}')
- asyncio.run(main(sample, include_ap, prompts, focus_dict, record_file))
+ asyncio.run(main(sample, include_ap, prompts, record_file))
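To see what the revised focus_statement construction above produces, here is a short, self-contained run of the new loop with made-up focus points; the real test loads them from focus_point.json and switches between the Chinese and English labels via is_chinese().

```python
# Illustrative focus points only; the real test reads them from focus_point.json.
focus_points = [
    {"focuspoint": " open-source LLM releases ", "explanation": "new model weights and licenses"},
    {"focuspoint": "AI chips", "explanation": ""},
]

focus_statement = ''
for item in focus_points:
    tag = item["focuspoint"].strip()
    expl = item["explanation"].strip()
    focus_statement = f"{focus_statement}//{tag}//"
    if expl:
        # the is_chinese() branch in the real test emits the same layout in Chinese
        focus_statement = f"{focus_statement}\nExplanation: {expl}\n"

print(focus_statement)
# //open-source LLM releases//
# Explanation: new model weights and licenses
# //AI chips//
```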
2 changes: 0 additions & 2 deletions weixin_mp/__init__.py
@@ -19,9 +19,7 @@

from general_process import main_process, wiseflow_logger, pb
from typing import Optional
- import logging

- logging.getLogger("httpx").setLevel(logging.WARNING)

# 千万注意扫码登录时不要选择"同步历史消息",否则会造成 bot 上来挨个回复历史消息
# 先检查下 wx 的登录状态,同时获取已登录微信的 wxid
