Multi-Threaded Application to Scrape Working Web Proxies
- Fetches more than 300 live proxies
- CLI support
- Proxy selection based on speed
- Proxy export
- Country filtering
pip install fastProxy==1.0.0
git clone https://github.com/1UC1F3R616/fastProxy.git
cd fastProxy/
pip install -r requirements.txt -U
- Threads: 100
- Request timeout: 4 seconds
# Basic usage
python cli.py
# With options
python cli.py --c=10 --t=5 --g --a
Flag | Description | Purpose | Default | Example |
---|---|---|---|---|
c | Thread count | Increases testing speed | 100 | --c=16 |
t | Request timeout in seconds | Lower values return faster proxies | 4 | --t=20 |
g | Generate CSV | Generates a CSV of working proxies only, using the given flags | False | --g |
a | All scraped proxies | Generates a CSV of all scraped proxies with more detail | False | --a |
- Pass flags as keyword arguments, or the default values are used
Flag | Description | Purpose | Default | Example |
---|---|---|---|---|
c | Thread count | Increases testing speed | 100 | c=256 |
t | Request timeout in seconds | Lower values return faster proxies | 4 | t=2 |
g | Generate CSV | Generates a CSV of working proxies only, using the given flags | False | g=True |
a | All scraped proxies | Generates a CSV of all scraped proxies with more detail | False | a=True |
from fastProxy import fetch_proxies
# Basic usage
proxies = fetch_proxies()
print(proxies)
# With options
proxies = fetch_proxies(c=10, t=5, g=True, a=True)
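The list returned by `fetch_proxies()` can then be handed to an HTTP client. The snippet below is only a sketch: the `ip`/`port` keys and the `to_requests_proxy` helper are illustrative assumptions, not part of the documented API, so adjust them to match the actual structure of the returned entries.

```python
import requests

from fastProxy import fetch_proxies

# Hypothetical helper: turn one fetched entry into a requests-style proxy
# mapping. The 'ip' and 'port' keys are assumed; adjust to the real structure.
def to_requests_proxy(entry):
    address = f"{entry['ip']}:{entry['port']}"
    return {"http": f"http://{address}", "https": f"http://{address}"}

proxies = fetch_proxies()
if proxies:
    proxy_map = to_requests_proxy(proxies[0])
    response = requests.get("https://httpbin.org/ip", proxies=proxy_map, timeout=10)
    print(response.json())
```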
Sample CSV File
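If CSV export is enabled, the file can be read back with the standard `csv` module. This is a minimal sketch; the `proxyList.csv` file name is an assumption, so point it at whatever file the `--g`/`--a` export actually writes.

```python
import csv

# Assumed output file name; replace with the path actually written by the
# CSV export (it may sit in a subdirectory of the working directory).
CSV_PATH = "proxyList.csv"

with open(CSV_PATH, newline="") as handle:
    for row in csv.DictReader(handle):
        print(row)  # each row is a dict keyed by the CSV header columns
```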
- Tag slow tests
- Fix failing tests
- Add support for https://proxyscrape.com/free-proxy-list using https://api.proxyscrape.com/v4/free-proxy-list/get?request=display_proxies&proxy_format=protocolipport&format=json (see the sketch after this list)
- Remove redundant code and files
- Refactor Linux-only code with proper handling
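For the proxyscrape item in the list above, a possible starting point is sketched here. The response layout (a top-level `proxies` list whose entries carry a `proxy` field in protocol://ip:port form) is an assumption about the endpoint, not verified behaviour.

```python
import requests

# Endpoint from the TODO item above. The JSON structure handled below is an
# assumption; inspect the real payload and adjust the keys if they differ.
PROXYSCRAPE_URL = (
    "https://api.proxyscrape.com/v4/free-proxy-list/get"
    "?request=display_proxies&proxy_format=protocolipport&format=json"
)

def fetch_proxyscrape_proxies(timeout=10):
    response = requests.get(PROXYSCRAPE_URL, timeout=timeout)
    response.raise_for_status()
    payload = response.json()
    return [entry.get("proxy") for entry in payload.get("proxies", [])]

if __name__ == "__main__":
    print(fetch_proxyscrape_proxies()[:5])
```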