A website for exploring Summoners War big data, i.e. the most common monster builds.
SWSTATS exposes an API that contains every contributor's monsters, artifacts and runes, including builds. The root endpoint is https://swstats.info/api/. Documentation for these endpoints is available in Swagger. This API is free and public but subject to rate limits; any abuse or abnormal use of server resources will result in restrictions.
Since March 2021 the Monster Report API has been public, but it is not visible in the Docs or the root endpoint because it was originally intended as a private API for the front end only. After many requests to make it public, I've finally decided to share it with the community. Here are the instructions on how to use it:
https://swstats.info/web/reports-generate/<com2us_id>
`<com2us_id>` - the monster ID used by Com2uS (e.g. 13413 for Lushen). You can get the full list of these IDs from the SWARFARM API. From this endpoint you get a response with a `task_id`, which is required to ask the next endpoint for the calculation status and, once it has finished, the report data. Worth mentioning that this response is cached for 15 minutes. Here's an example response:
```json
{
  "status": "PENDING",
  "task_id": "12345678-1234-1234-1234-1234567890ab"
}
```
After getting the `task_id` from the endpoint above, you need to poll this status endpoint repeatedly:
https://swstats.info/web/status/<task_id>
Given a `task_id`, this endpoint returns the current calculation status and, once the calculation has finished, the report data. I recommend calling it once per second (traffic is monitored, and more frequent calls may result in your being blocked) until you get a status other than PENDING. This response is cached for 30 minutes. The example response is quite long, so it's available here:
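The once-per-second polling described above can be sketched like this. The `fetch` parameter is a hypothetical injection seam (not part of the API) so the loop stays testable without network access:

```python
import json
import time
from typing import Callable

STATUS_URL = "https://swstats.info/web/status/{task_id}"

def poll_report(task_id: str, fetch: Callable[[str], str], delay: float = 1.0) -> dict:
    """Poll the status endpoint until the status is no longer PENDING.

    Waits `delay` seconds between calls (1 s, per the rate guidance above)
    and returns the final decoded JSON, which then contains the report data.
    """
    while True:
        data = json.loads(fetch(STATUS_URL.format(task_id=task_id)))
        if data.get("status") != "PENDING":
            return data
        time.sleep(delay)
```

In real use, `fetch` would be a function that performs the HTTP GET and returns the response body.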
- Charts data
- Condensed Monster data (some base data, most common 2/4/6 build, TOP3 most common sets)
- Condensed Family data (some base data, so people can move easily between family members)
- All records used to generate the report (don't worry, anonymised)
- Monster substats analysis (mean, std, min/max, percentiles)
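For illustration, the substat analysis listed above amounts to descriptive statistics over the anonymised records. A rough sketch (the function name and output keys are hypothetical, not the report's actual schema):

```python
import statistics

def summarize_substat(values: list[float]) -> dict:
    """Mean, std, min/max and quartile percentiles for one substat across records."""
    q = statistics.quantiles(values, n=4)  # 25th / 50th / 75th percentiles
    return {
        "mean": statistics.mean(values),
        "std": statistics.stdev(values),
        "min": min(values),
        "max": max(values),
        "p25": q[0], "p50": q[1], "p75": q[2],
    }
```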
No authentication needed.
It's an open-source project; everyone is welcome to contribute. You don't need to code to contribute ideas: if you have a feature request, notice a bug or anything else, submit an issue here on GitHub.
It's quite a complicated process, because the project isn't containerized yet:
- Create a PostgreSQL user (any username) & database (`swstats`)
- Copy `.env.example` to `.env`, changing `DJ_DATABASE_URL` to proper values (change only `user` and `passwd`)
- Run Redis on your PC
- Create a virtual environment and install requirements: `pip install -r requirements.txt`
- Apply Django migrations: `python manage.py migrate`
- Create a superuser: `python manage.py createsuperuser`
- Load fixtures: `python manage.py loaddata base_data.json`
- Run the Django server
- If everything works, run Celery workers (example in `dev/celery_worker.bat`; for Windows, `dev/celery_worker_solo.bat`)
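The `DJ_DATABASE_URL` entry in `.env` might look like this. This is a sketch with placeholder credentials, assuming the common database-URL format; the actual keys are defined in `.env.example`:

```shell
# Hypothetical .env values -- replace user/passwd with your PostgreSQL credentials
DJ_DATABASE_URL=postgres://user:passwd@localhost:5432/swstats
```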
No tests... yet...
- Python 3.6
- Django - The web framework
- Django REST Framework - REST API for Django
- Celery - Asynchronous task runner
- Redis - Broker and backend for Celery
- Many other packages. See requirements.txt
This project is licensed under the Apache 2.0 License - see the LICENSE file for details