The following example is a simple script showing how to use Smartproxy with Scrapy. We suggest referring to the Scrapy documentation to continue development with this tool.
To get started with Scrapy, you will first need to install it using the methods described in the Scrapy documentation.
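For reference, Scrapy is typically installed with pip (see the Scrapy installation guide for platform-specific details):
pip install scrapy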
Once you have Scrapy up and running, create your project folder if you have not done so already. Open a Terminal/Command Prompt window and enter the command below:
scrapy startproject yourprojectname
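This creates a project skeleton roughly like the following (the exact set of files may vary slightly between Scrapy versions). Note the nested yourprojectname folder, which is why the path in the next step contains the name twice:
yourprojectname/
    scrapy.cfg
    yourprojectname/
        __init__.py
        items.py
        middlewares.py
        pipelines.py
        settings.py
        spiders/
            __init__.py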
Once the project directory is set up, you can download our test spider code:
- Navigate to the spiders folder inside your project directory:
cd .\yourprojectname\yourprojectname\spiders\
- To download our example script, run the command:
curl https://raw.githubusercontent.com/Smartproxy/Scrapy/master/smartproxy_spider.py > smartproxy_spider.py
- Open the smartproxy_spider.py file, enter your Endpoint and Port, and replace the Username and Password with your proxy authentication credentials (see the sketch after these steps).
- Run the script using the command:
scrapy crawl smartproxy
Note that the code may not run if the smartproxy_spider.py file is in the wrong directory.
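For reference, below is a minimal sketch of what such a proxied spider can look like. The target URL, CSS selector, endpoint, port, and credentials are placeholders for illustration only; the downloaded smartproxy_spider.py is the working version.

# Minimal sketch of a Scrapy spider routed through a proxy (illustrative only).
import scrapy


class SmartproxySpider(scrapy.Spider):
    name = "smartproxy"  # matches the name used by `scrapy crawl smartproxy`
    start_urls = ["https://books.toscrape.com/"]  # assumed example target

    def start_requests(self):
        # Placeholder values -- replace with the endpoint, port, username,
        # and password from your Smartproxy dashboard.
        proxy = "http://username:password@endpoint:port"
        for url in self.start_urls:
            # Scrapy's built-in HttpProxyMiddleware reads the 'proxy' meta key
            # and sends the request through that proxy.
            yield scrapy.Request(url, callback=self.parse, meta={"proxy": proxy})

    def parse(self, response):
        # Return a single value (the first listed price) as a basic check.
        yield {"price": response.css("p.price_color::text").get()}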
As mentioned, this script only sends a basic request and returns a single value from the target website.
If you have completed all the steps correctly, you should see the result {'price': '£51.77'} along with Scrapy's other output in the Terminal window.
If you have any questions, contact us by email at sales@smartproxy.com or via our 24/7 live chat.