Dude is a very simple framework for writing web scrapers using Python decorators. The design, inspired by Flask, makes it easy to build a web scraper in just a few lines of code. Dude has an easy-to-learn syntax.
🚨 Dude is currently in Pre-Alpha. Please expect breaking changes.
To install, simply run the following from the terminal.

```bash
pip install pydude
playwright install  # Install playwright binaries for Chrome, Firefox and Webkit.
```
The simplest web scraper will look like this:
```python
from dude import select


@select(css="a")
def get_link(element):
    return {"url": element.get_attribute("href")}
```
The example above gets all the hyperlink elements on a page and calls the handler function `get_link()` for each element.
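It is also possible to run the scraper directly from the script instead of the command line. Below is a minimal sketch that assumes the `dude.run()` entry point; verify the exact signature against the documentation.

```python
from dude import select


@select(css="a")
def get_link(element):
    return {"url": element.get_attribute("href")}


if __name__ == "__main__":
    import dude

    # Assumption: dude.run() accepts a list of URLs to scrape.
    dude.run(urls=["https://dude.ron.sh/"])
```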
You can run your scraper from the terminal/shell/command line by supplying URLs, the output filename of your choice, and the paths to your Python scripts to the `dude scrape` command.
```bash
dude scrape --url "<url>" --output data.json path/to/script.py
```
The output in `data.json` should contain the actual URL and the metadata prepended with an underscore.
```json
[
  {
    "_page_number": 1,
    "_page_url": "https://dude.ron.sh/",
    "_group_id": 4502003824,
    "_group_index": 0,
    "_element_index": 0,
    "url": "/url-1.html"
  },
  {
    "_page_number": 1,
    "_page_url": "https://dude.ron.sh/",
    "_group_id": 4502003824,
    "_group_index": 0,
    "_element_index": 1,
    "url": "/url-2.html"
  },
  {
    "_page_number": 1,
    "_page_url": "https://dude.ron.sh/",
    "_group_id": 4502003824,
    "_group_index": 0,
    "_element_index": 2,
    "url": "/url-3.html"
  }
]
```
Changing the output option to `--output data.csv` should result in equivalent CSV content.
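Based on the JSON output above, the CSV would look roughly like this (a reconstruction; the exact column order may differ):

```csv
_page_number,_page_url,_group_id,_group_index,_element_index,url
1,https://dude.ron.sh/,4502003824,0,0,/url-1.html
1,https://dude.ron.sh/,4502003824,0,1,/url-2.html
1,https://dude.ron.sh/,4502003824,0,2,/url-3.html
```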
Dude's features include:

- Simple Flask-inspired design - build a scraper with decorators.
- Uses Playwright API - run your scraper in Chrome, Firefox and Webkit and leverage Playwright's powerful selector engine supporting CSS, XPath, text, regex, etc.
- Data grouping - group related results.
- URL pattern matching - run functions on matched URLs.
- Priority - reorder functions based on priority.
- Setup function - enable setup steps such as clicking dialogs or logging in (see the sketch after this list).
- Navigate function - enable navigation steps to move to other pages (also sketched after this list).
- Custom storage - option to save data to other formats or database.
- Async support - write async handlers.
- Option to use other parser backends aside from Playwright.
  - BeautifulSoup4 - `pip install pydude[bs4]`
  - Parsel - `pip install pydude[parsel]`
  - lxml - `pip install pydude[lxml]`
  - Selenium - `pip install pydude[selenium]`
- Option to follow all links indefinitely (Crawler/Spider).
- Events - attach functions to startup, pre-setup, post-setup and shutdown events.
- Option to save data on every page.
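The setup and navigate features referenced in the list above boil down to handlers registered with the same `@select` decorator. The sketch below is illustrative only: the `setup=True` and `navigate=True` keyword arguments and the CSS selectors are assumptions, so check the documentation for the exact parameter names.

```python
from dude import select


# Hypothetical setup handler: dismiss a consent dialog before scraping starts.
# The setup=True flag is an assumption; verify the exact name in the docs.
@select(css="#agree-button", setup=True)
def agree(element):
    element.click()


# Hypothetical navigate handler: move to the next page once the current one is scraped.
# The navigate=True flag is likewise an assumption.
@select(css="a.next-page", navigate=True)
def next_page(element):
    element.click()
```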
By default, Dude uses Playwright but gives you an option to use parser backends that you are familiar with. It is possible to use parser backends like BeautifulSoup4, Parsel, lxml, and Selenium.
Here is the summary of features supported by each parser backend.
| Parser Backend | Supports Sync? | Supports Async? | CSS | XPath | Text | Regex | Setup Handler | Navigate Handler | Comments |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Playwright | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | |
| BeautifulSoup4 | ✅ | ✅ | ✅ | 🚫 | 🚫 | 🚫 | 🚫 | 🚫 | |
| Parsel | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | 🚫 | 🚫 | |
| lxml | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | 🚫 | 🚫 | |
| Pyppeteer | 🚫 | ✅ | ✅ | ✅ | ✅ | 🚫 | ✅ | ✅ | Not supported from 0.23.0 |
| Selenium | ✅ | ✅ | ✅ | ✅ | ✅ | 🚫 | ✅ | ✅ | |
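As a concrete illustration of the differences, with the BeautifulSoup4 backend the handler receives a `bs4` `Tag` instead of a Playwright element handle, so attributes are read by key rather than via `get_attribute()`. The snippet below is a sketch; how the backend is selected at run time is covered in the documentation.

```python
from dude import select


# With the BeautifulSoup4 backend, the handler argument is a bs4 Tag,
# so the href attribute is read with dictionary-style indexing.
@select(css="a")
def get_link(soup):
    return {"url": soup["href"]}
```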
Pull the docker image using the following command.
```bash
docker pull roniemartinez/dude
```
Assuming that `script.py` exists in the current directory, run Dude using the following command.

```bash
docker run -it --rm -v "$PWD":/code roniemartinez/dude dude scrape --url <url> script.py
```
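Presumably the options shown earlier work the same way inside the container; for example, writing the JSON output back to the mounted directory (this assumes the container's working directory is the mounted `/code` folder):

```bash
docker run -it --rm -v "$PWD":/code roniemartinez/dude dude scrape --url "<url>" --output data.json script.py
```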
Read the complete documentation at https://roniemartinez.github.io/dude/. All the advanced and useful features are documented there.
- ✅ Any dude should know how to work with selectors (CSS or XPath).
- ✅ Familiarity with any backends that you love (see Supported Parser Backends).
- ✅ Python decorators... you'll live, dude!
- ✅ A recursive acronym looks nice.
- ✅ Adding "uncomplicated" (like `ufw`) into the name says it is a very simple framework.
- ✅ Puns! I also think that if you want to do web scraping, there's probably some random dude around the corner who can make it very easy for you to start with it.
Thanks goes to these wonderful people:

- Ronie Martinez
This project follows the all-contributors specification. Contributions of any kind welcome!