[WebWorm logo]

"WebWorm – Dig Deep, Download Easy!"

Web scraping tool designed to effortlessly navigate websites and automatically download all types of files.

Table Of Contents

  • About The Project
  • Getting Started
  • Usage
  • Roadmap
  • Contributing
  • Authors

About The Project

[Screenshot]

WebWorm is a Python script that scrapes and downloads files from a specified website URL. You can configure the crawl depth and the file extensions to scrape, and an optional flag detects the technologies used by the website.
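
To give a rough idea of how such a crawler can work, here is a minimal sketch of depth-limited crawling with an extension filter, assuming the requests and beautifulsoup4 packages. It is an illustration only, not WebWorm's actual implementation; the crawl function, its parameters, and the example URL are invented for this sketch.

# Minimal illustration of depth-limited crawling with an extension filter.
# This is NOT WebWorm's actual code; names and structure are hypothetical.
import os
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup


def crawl(url, depth=1, extensions=None, visited=None):
    """Visit url, download links whose extension matches, and recurse into pages."""
    if visited is None:
        visited = set()
    if depth < 0 or url in visited:
        return
    visited.add(url)

    page = requests.get(url, timeout=10)
    soup = BeautifulSoup(page.text, "html.parser")

    for tag in soup.find_all("a", href=True):
        link = urljoin(url, tag["href"])
        ext = os.path.splitext(urlparse(link).path)[1].lstrip(".").lower()
        if ext and (extensions is None or ext in extensions):
            # Looks like a downloadable file: save it under its own name.
            filename = os.path.basename(urlparse(link).path)
            with open(filename, "wb") as fh:
                fh.write(requests.get(link, timeout=10).content)
        else:
            # Looks like another page: crawl it one level deeper.
            crawl(link, depth - 1, extensions, visited)


crawl("https://example.com", depth=1, extensions={"jpg", "png", "docx"})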

Getting Started

Follow the steps below to set up and run the script.

Prerequisites

  • Ensure you have Python 3 installed on your system.

Installation

  1. Clone the repository:
git clone https://github.com/m0hs1ne/WebWorm.git
  2. Install the required packages:
pip install -r requirements.txt
  3. Run the script (see the example below):
python3 WebWorm.py <url> -d <depth> -e <extensions> [-t]
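
For example, to crawl a site two levels deep, download only PDF and DOCX files, and report detected technologies (https://example.com is a placeholder URL):

python3 WebWorm.py https://example.com -d 2 -e pdf,docx -t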

Usage

usage: WebWorm.py [-h] [-e EXTENSIONS] [-d DEPTH] [-t] url

positional arguments:
  url                   The URL of the website to scrape.

options:
  -h, --help            show this help message and exit
  -e EXTENSIONS, --extensions EXTENSIONS
                        Comma-separated list of file extensions to scrape (e.g., "jpg,png,docx"). If not specified, all files will be scraped.
  -d DEPTH, --depth DEPTH
                        The maximum depth to crawl the website. Default is 1.
  -t, --tech            Detect technologies used on the website.
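
The help text above corresponds to an argparse configuration along the following lines. This is a sketch reconstructed from the output, not the script's actual source.

# Illustrative argparse setup reconstructed from the help output above;
# the real script may differ.
import argparse

parser = argparse.ArgumentParser(prog="WebWorm.py")
parser.add_argument("url", help="The URL of the website to scrape.")
parser.add_argument("-e", "--extensions",
                    help='Comma-separated list of file extensions to scrape '
                         '(e.g., "jpg,png,docx"). If not specified, all files will be scraped.')
parser.add_argument("-d", "--depth", type=int, default=1,
                    help="The maximum depth to crawl the website. Default is 1.")
parser.add_argument("-t", "--tech", action="store_true",
                    help="Detect technologies used on the website.")
args = parser.parse_args()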

Using the -t flag detects the technologies used by the website.

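How WebWorm performs this detection is not documented here; the sketch below shows one common approach, inspecting response headers and the generator meta tag, purely as an illustration. The detect_technologies function and the header list are assumptions, not WebWorm's code.

# Hypothetical illustration of simple technology fingerprinting;
# not necessarily how WebWorm's -t flag works.
import requests
from bs4 import BeautifulSoup


def detect_technologies(url):
    response = requests.get(url, timeout=10)
    findings = []

    # Server-side hints often appear in response headers.
    for header in ("Server", "X-Powered-By"):
        if header in response.headers:
            findings.append(f"{header}: {response.headers[header]}")

    # Many CMSs advertise themselves in a <meta name="generator"> tag.
    soup = BeautifulSoup(response.text, "html.parser")
    generator = soup.find("meta", attrs={"name": "generator"})
    if generator and generator.get("content"):
        findings.append(f"Generator: {generator['content']}")

    return findings


print(detect_technologies("https://example.com"))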

Roadmap

  • Add support for scraping multiple websites.
  • Send requests with session cookies.
  • Enumerate directories.
  • Check for possible keys and secrets in JS files (see the sketch after this list).
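
As an illustration of what that last item could look like once implemented, here is a hypothetical sketch that scans a JavaScript file for common key patterns. Nothing in it is part of WebWorm today; the scan_js function, the regex list, and the example URL are made up for the example.

# Hypothetical sketch of scanning JavaScript for likely keys/secrets;
# this roadmap item is not implemented in WebWorm.
import re

import requests

SECRET_PATTERNS = {
    "AWS access key ID": re.compile(r"AKIA[0-9A-Z]{16}"),
    "Google API key": re.compile(r"AIza[0-9A-Za-z_\-]{35}"),
}


def scan_js(url):
    """Fetch a JavaScript file and report strings that look like secrets."""
    body = requests.get(url, timeout=10).text
    for label, pattern in SECRET_PATTERNS.items():
        for match in pattern.findall(body):
            print(f"[{label}] {match} in {url}")


scan_js("https://example.com/app.js")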

Contributing

Your contributions are welcome! Whether you're fixing bugs, adding new features, or improving documentation, we appreciate your help in making WebWorm better.

Creating A Pull Request

  1. Fork the Project
  2. Create your Feature Branch (git checkout -b feature/AmazingFeature)
  3. Commit your Changes (git commit -m 'Add some AmazingFeature')
  4. Push to the Branch (git push origin feature/AmazingFeature)
  5. Open a Pull Request

Authors

  • m0hs1ne (https://github.com/m0hs1ne)
