A high-performance, asynchronous proxy scraper and checker tool that supports multiple protocols (HTTP, SOCKS4, SOCKS5) with advanced features for proxy validation and anonymity checking.
- Multi-Protocol Support: HTTP, SOCKS4, and SOCKS5 proxies
- Asynchronous Processing: High-performance proxy scraping and checking
- Advanced Validation:
  - Connection testing
  - Anonymity level detection (see the sketch below)
  - Response time measurement
  - SSL verification
- Flexible Output:
  - Multiple export formats (TXT, JSON, CSV)
  - Protocol-specific organization
  - Detailed statistics
- Robust Error Handling:
  - Retry mechanisms
  - Timeout management
  - Graceful shutdown
- Progress Tracking:
  - Real-time progress bars
  - Detailed logging
  - Performance statistics
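Anonymity detection in tools like this works by requesting a "judge" page through the proxy and inspecting which HTTP headers the remote server received (the note at the end of this README confirms the tool essentially checks HTTP headers). A minimal sketch of that classification, with illustrative names rather than this tool's exact logic:

```python
# Sketch only: classify a proxy from the headers a judge endpoint echoed back.
# The header set below is a common convention, not this tool's exact list.
PROXY_HEADERS = {"via", "x-forwarded-for", "forwarded", "x-real-ip", "proxy-connection"}

def classify_anonymity(echoed_headers: dict, real_ip: str) -> str:
    headers = {k.lower(): v for k, v in echoed_headers.items()}
    # Transparent: the target still sees your real IP somewhere in the headers.
    if any(real_ip in v for v in headers.values()):
        return "transparent"
    # Anonymous: real IP hidden, but proxy-identifying headers leak through.
    if PROXY_HEADERS & headers.keys():
        return "anonymous"
    # Elite: no trace of either the origin IP or the proxy itself.
    return "elite"

# classify_anonymity({"Via": "1.1 squid"}, "203.0.113.7") -> "anonymous"
```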
- Clone the repository:

```bash
git clone https://github.com/taygun08/elite-proxy-checker.git
cd elite-proxy-checker
```

- Install dependencies:

```bash
pip install -r requirements.txt
```
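The exact pins live in requirements.txt. For an asyncio-based checker with SOCKS support and progress bars, a plausible dependency set (an assumption, not the actual file contents) would be:

```text
aiohttp        # async HTTP client for scraping and checking
aiohttp-socks  # SOCKS4/SOCKS5 connector support
tqdm           # real-time progress bars
```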
Basic usage:

```bash
python proxy_scraper.py
```

Advanced usage with options:

```bash
python proxy_scraper.py --threads 100 --timeout 10 --protocols http,socks4 --anonymity-level elite
```
Core Arguments (all have defaults):

```text
--threads INT          Number of concurrent connections (default: 50)
--timeout FLOAT        Timeout in seconds for each request (default: 10)
--protocols STR        Comma-separated list of protocols (default: http,socks4,socks5)
```

Optional Arguments:

```text
--output STR           Output directory for results (default: output/)
--max-ms INT           Maximum response time in milliseconds (default: 10000)
--delay FLOAT          Delay between proxy checks in seconds (default: 0)
--anonymity-level STR  Required anonymity level (transparent, anonymous, elite)
--verify-ssl           Verify SSL certificates (default: True)
--export-format STR    Export format for results (txt, json, csv) (default: txt)
```

Output Control:

```text
--verbose              Show detailed output
--quiet                Show minimal output
--raw                  Output only IP:PORT format
```
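Despite the name, `--threads` caps concurrent connections rather than spawning OS threads. A minimal sketch of how such a cap and a per-request timeout can be enforced with asyncio (assuming aiohttp does the HTTP checks; `check_proxy` and `JUDGE_URL` are illustrative, not the tool's internals):

```python
import asyncio
import time

import aiohttp

JUDGE_URL = "http://httpbin.org/ip"  # any fast endpoint works

async def check_proxy(session, proxy, timeout, sem):
    async with sem:  # the "--threads" limit: at most N requests in flight
        start = time.monotonic()
        try:
            # aiohttp's proxy= accepts HTTP proxies; SOCKS needs aiohttp-socks.
            async with session.get(JUDGE_URL, proxy=proxy,
                                   timeout=aiohttp.ClientTimeout(total=timeout)) as resp:
                await resp.read()
                return proxy, int((time.monotonic() - start) * 1000)  # ms, for --max-ms
        except Exception:
            return proxy, None  # dead, refused, or timed out

async def check_all(proxies, threads=50, timeout=10.0):
    sem = asyncio.Semaphore(threads)
    async with aiohttp.ClientSession() as session:
        return await asyncio.gather(*(check_proxy(session, p, timeout, sem)
                                      for p in proxies))

# asyncio.run(check_all(["http://203.0.113.7:8080"], threads=100, timeout=10))
```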
- Check HTTP proxies with SSL verification:

```bash
python proxy_scraper.py --verify-ssl true --protocols http
```

- Find elite proxies with fast response time:

```bash
python proxy_scraper.py --anonymity-level elite --max-ms 1000
```

- Export results in JSON format:

```bash
python proxy_scraper.py --export-format json --protocols http,socks4
```
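The three export formats map naturally onto Python's standard library. A minimal sketch (the record fields here are an assumption about what the checker collects):

```python
import csv
import json

def export(results, fmt, path):
    """Write results as txt, json, or csv. Each result: {"proxy", "protocol", "ms"}."""
    if fmt == "json":
        with open(path, "w") as f:
            json.dump(results, f, indent=2)
    elif fmt == "csv":
        with open(path, "w", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=["proxy", "protocol", "ms"])
            writer.writeheader()
            writer.writerows(results)
    else:  # plain txt: one IP:PORT per line, like --raw
        with open(path, "w") as f:
            f.writelines(r["proxy"] + "\n" for r in results)

# export([{"proxy": "203.0.113.7:8080", "protocol": "http", "ms": 120}], "csv", "out.csv")
```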
- Customize user agents in `config/user_agents.txt`.
- Customize proxy sources in `config/proxy_sources.txt`.
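Both files are presumably plain line-oriented lists. A minimal sketch of consuming them (the one-entry-per-line format and comment handling are assumptions):

```python
def load_lines(path):
    """Read a config file: one entry per line, skipping blanks and # comments."""
    with open(path) as f:
        return [line.strip() for line in f
                if line.strip() and not line.lstrip().startswith("#")]

user_agents = load_lines("config/user_agents.txt")
proxy_sources = load_lines("config/proxy_sources.txt")
```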
Results are written to a per-protocol, timestamped directory tree:

```text
output/
├── http/
│   └── http_20231215_123456.txt
├── socks4/
│   └── socks4_20231215_123456.txt
├── socks5/
│   └── socks5_20231215_123456.txt
└── summary_20231215_123456.txt
```
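A minimal sketch of generating that layout; the `%Y%m%d_%H%M%S` pattern is inferred from the example filenames:

```python
from datetime import datetime
from pathlib import Path

def output_path(base, protocol):
    """Build output/<protocol>/<protocol>_<timestamp>.txt, creating directories."""
    stamp = datetime.now().strftime("%Y%m%d_%H%M%S")  # e.g. 20231215_123456
    path = Path(base) / protocol / f"{protocol}_{stamp}.txt"
    path.parent.mkdir(parents=True, exist_ok=True)
    return path

# output_path("output", "http") -> output/http/http_20231215_123456.txt
```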
This entire project was written by AI, including this README. It was a test project, but on the whole it works. Yes, the code structure and methods are terrible; it wasn't a project I worked on much, so don't judge me :D At its core it just checks HTTP headers.
- Fork the repository
- Create your feature branch
- Commit your changes
- Push to the branch
- Create a new Pull Request
This project is licensed under the MIT License - see the LICENSE file for details.