
Commit

Permalink
Merge pull request #49 from CSCI-5828-S24/stage
Monitoring added
RanajitRoy authored May 3, 2024
2 parents 0bcdd13 + 1e4e9a4 commit f0bd84a
Showing 11 changed files with 144 additions and 93 deletions.
61 changes: 38 additions & 23 deletions README.md
@@ -1,34 +1,48 @@
# BetaFish
Group 12: Emily Parker, Ranajit Roy, Jonathan Gorman
# ![Denver Crime Tracker Banner](./diagrams/project-banner.png)

[![analyzer-test](https://github.com/CSCI-5828-S24/BetaFish/actions/workflows/test_analyzer.yml/badge.svg?branch=main)](https://github.com/CSCI-5828-S24/BetaFish/actions/workflows/test_analyzer.yml) [![collector-test](https://github.com/CSCI-5828-S24/BetaFish/actions/workflows/test_collector.yml/badge.svg?branch=main)](https://github.com/CSCI-5828-S24/BetaFish/actions/workflows/test_collector.yml) [![flask-server-test](https://github.com/CSCI-5828-S24/BetaFish/actions/workflows/test_flask_server.yml/badge.svg?branch=main)](https://github.com/CSCI-5828-S24/BetaFish/actions/workflows/test_flask_server.yml)
### Group 12 (Betafish): Emily Parker, Ranajit Roy, Jonathan Gorman

[![analyzer-test](https://github.com/CSCI-5828-S24/BetaFish/actions/workflows/test_analyzer.yml/badge.svg?branch=main)](https://github.com/CSCI-5828-S24/BetaFish/actions/workflows/test_analyzer.yml) [![collector-test](https://github.com/CSCI-5828-S24/BetaFish/actions/workflows/test_collector.yml/badge.svg?branch=main)](https://github.com/CSCI-5828-S24/BetaFish/actions/workflows/test_collector.yml) [![flask-server-test](https://github.com/CSCI-5828-S24/BetaFish/actions/workflows/test_flask_server.yml/badge.svg?branch=main)](https://github.com/CSCI-5828-S24/BetaFish/actions/workflows/test_flask_server.yml) [![integration-test](https://github.com/CSCI-5828-S24/BetaFish/actions/workflows/integration-tests.yml/badge.svg?branch=main)](https://github.com/CSCI-5828-S24/BetaFish/actions/workflows/integration-tests.yml) [![playwright-acceptance-test](https://github.com/CSCI-5828-S24/BetaFish/actions/workflows/playwright-acceptance-tests.yml/badge.svg?branch=main)](https://github.com/CSCI-5828-S24/BetaFish/actions/workflows/playwright-acceptance-tests.yml)

## Instructions

Below are the instructions to run the web app locally on a Linux/Ubuntu or Windows platform.

Install the following packages before proceeding -
1. Python3
2. Python packages: redis jsonpickle requests flask flask_cors flask-mysqldb python-dotenv
> pip3 install --upgrade redis jsonpickle requests flask flask_cors flask-mysqldb python-dotenv
3. Node.js 20.x lts

2. Node.js 20.x lts

First clone the repository

> git clone \<repo-clone-url\>
A `.env` file is necessary for the MySQL Connection. Create a `.env` in `\flask-backend\src\` and populate with the following 4 lines
A `.env` file is necessary for the MySQL connection. Create a `.env` in `./flask-backend`, `./cloud-data-collector`, and `./cloud-data-analyzer`, and populate it with the following 5 lines, filled in with the correct values. (Leave `ANALYZER_URL` as-is; it is used to call the analyzer service from the collector and is not required if the analyzer is invoked manually.)

```
MYSQL_HOST = host
MYSQL_USER = user
MYSQL_PASSWORD = password
MYSQL_DB = databasename
MYSQL_HOST=<hostname>
MYSQL_USER=<user>
MYSQL_PASSWORD=<password>
MYSQL_DB=crime_db
ANALYZER_URL=http://ip.jsontest.com
```
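These variables are read at startup; below is a minimal sketch of how they are consumed, assuming the standard `python-dotenv` API (it mirrors the `os.getenv` calls in `flask-backend/src/flask_server.py`, but is not itself part of the repo):

```
# Minimal sketch: load the .env file and read the MySQL settings.
import os
from dotenv import load_dotenv

load_dotenv()  # reads .env from the current working directory

db_config = {
    "host": os.getenv("MYSQL_HOST"),
    "user": os.getenv("MYSQL_USER"),
    "password": os.getenv("MYSQL_PASSWORD"),
    "database": os.getenv("MYSQL_DB"),
}
```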

To build the app, run the commands below in the cloned directory -
Run the collector:<br>
[working directory: ./cloud-data-collector]
```
pip install -r requirements.txt
python3 main.py
```


Run the analyzer:<br>
[working directory: ./cloud-data-analyzer]
```
pip install -r requirements.txt
python3 main.py
```
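The `ANALYZER_URL` entry in `.env` exists so the collector can trigger the analyzer service after a run; here is a hedged sketch of that handoff (the real trigger logic lives in `cloud-data-collector/main.py`; per the analyzer code, it replies "success" or "failed"):

```
# Hypothetical sketch of the collector-to-analyzer handoff via ANALYZER_URL.
import os
import requests
from dotenv import load_dotenv

load_dotenv()  # ANALYZER_URL must be set in .env for this call to work
resp = requests.get(os.getenv("ANALYZER_URL"))
print("analyzer said:", resp.text)  # expected: "success" or "failed"
```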


To build the app, run the commands below in the cloned directory -
```
npm --prefix ./react-frontend/ ci
npm --prefix ./react-frontend/ build
@@ -39,21 +53,22 @@ Now there are two options:
* run with a docker image


### Run locally

> python3 flask-backend/src/flask_server.py
### Run on Docker

To run in a Docker container, run the commands below:

### Run on terminal
[working directory: ./flask-backend]
```
docker build -t web-app .
docker run -p5000:5000 web-app
python3 src/flask_server.py
```

The web page should now be accessible at http://localhost:5000.
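To verify the server came up, the health endpoint can be probed; the `/api/health` path is the one exercised by the integration tests below, and this snippet is just a convenience, not part of the repo:

```
# Quick local health check against the running Flask server.
import requests

resp = requests.get("http://localhost:5000/api/health")
print(resp.status_code, resp.json())
```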

### Monitoring
We have implemented a `/metrics` endpoint that exposes **Prometheus** metrics; a quick scrape sketch follows the list below.

Metrics:
1. Counter: `total_req`
2. Histogram: `analytics_latency`
3. Histogram: `data_latency`
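
A simple way to sanity-check the endpoint once the server is running locally (a hypothetical snippet, not part of the repo):

```
# Scrape /metrics and check that the documented metric names appear.
import requests

resp = requests.get("http://localhost:5000/metrics")
assert resp.status_code == 200
for name in ("total_req", "analytics_latency", "data_latency"):
    assert name in resp.text
print(resp.text[:400])  # peek at the Prometheus exposition text
```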

## Current Completed Architecture
![image](./diagrams/architecture.png)

2 changes: 2 additions & 0 deletions cloud-data-analyzer/main.py
@@ -149,3 +149,5 @@ def analyze(request):
mydb.commit()
return "success" if status else "failed"

if __name__=="__main__":
analyze(None)
2 changes: 2 additions & 0 deletions cloud-data-collector/main.py
@@ -85,3 +85,5 @@ def collect(request):
return "done!!"


if __name__=="__main__":
collect(None)
Binary file added diagrams/project-banner.png
4 changes: 3 additions & 1 deletion flask-backend/README.md
@@ -1 +1,3 @@
Initial Creation of back end for HW5 Testing CI/CD
## Flask Backend

This backend hosts both the APIs and the web pages for the web app.
3 changes: 2 additions & 1 deletion flask-backend/requirements.txt
@@ -11,8 +11,9 @@ MarkupSafe==2.1.5
mysql-connector-python==8.3.0
packaging==24.0
pluggy==1.5.0
prometheus_client==0.20.0
pytest==8.2.0
python-dotenv==1.0.1
six==1.16.0
tomli==2.0.1
Werkzeug==3.0.2
Werkzeug==3.0.2
64 changes: 44 additions & 20 deletions flask-backend/src/flask_server.py
@@ -2,7 +2,9 @@
from flask import Flask, request, Response, g
from flask_cors import CORS
import jsonpickle
import os
import os, time
import prometheus_client
from prometheus_client.metrics import Histogram, Counter

import logging
logging.basicConfig(level=logging.DEBUG)
@@ -14,15 +16,22 @@
app = Flask(__name__, static_folder="../../react-frontend/build", static_url_path='/')
CORS(app, origins=["*"])

def create_app():
def create_app(graph:dict={}):
_INF = float("inf")
app = Flask(__name__, static_folder="../../react-frontend/build", static_url_path='/')


app.config['MYSQL_HOST'] = os.getenv("MYSQL_HOST")
app.config['MYSQL_USER'] = os.getenv("MYSQL_USER")
app.config['MYSQL_PASSWORD'] = os.getenv("MYSQL_PASSWORD")
app.config['MYSQL_DB'] = os.getenv("MYSQL_DB")


if 'count' not in graph.keys():
graph['count'] = Counter('total_req', 'total requests on all paths')
if 'analytics_latency' not in graph.keys():
graph['analytics_latency'] = Histogram('analytics_latency', 'Latency for analytics API', buckets=(100, 200, 300, _INF))
if 'data_latency' not in graph.keys():
graph['data_latency'] = Histogram('data_latency', 'Latency for data API', buckets=(100, 200, 300, _INF))

def getDB():
if 'db' not in g or not g.db.is_connected():
g.db = mysql.connector.connect(
@@ -41,6 +50,7 @@ def getDB():
"""
@app.route('/')
def index():
graph['count'].inc()
return app.send_static_file('index.html')

"""
@@ -55,10 +65,12 @@ def health():
}
except Exception as error:
response = {
'error' : error
'error' : error,
'data': False
}
status = 500
response_pickled = jsonpickle.encode(response)
graph['count'].inc()
return Response(response=response_pickled, status=status, mimetype='application/json')

# @app.route('/api/multiply/<int:x>/<int:y>', methods=["GET"])
@@ -78,16 +90,17 @@ def health():

@app.route('/api/crime_freq', methods=["GET"])
def get_crime_freq():
start = time.time_ns()
graph['count'].inc()
status = 200
try:
mydb = getDB()
cursor = mydb.cursor()
queryToExecute = f"SELECT * FROM crime_freq"
print(queryToExecute)
# print(queryToExecute)
cursor.execute(queryToExecute)
raw_data = cursor.fetchall()
row_headers = [x[0] for x in cursor.description]
print(row_headers)
json_data = []

for r in raw_data:
@@ -98,27 +111,29 @@ def get_crime_freq():
'data' : json_data
}

print(response)
except Exception as error:
response = {
'error' : error
'error' : error,
'data': []
}
status = 500
response_pickled = jsonpickle.encode(response)
graph['analytics_latency'].observe((time.time_ns() - start) // 1000000)
return Response(response=response_pickled, status=status, mimetype='application/json')

@app.route('/api/crime_totals', methods=["GET"])
def get_crime_totals():
start = time.time_ns()
graph['count'].inc()
status = 200
try:
mydb = getDB()
cursor = mydb.cursor()
queryToExecute = "SELECT * FROM crime_totals"
print(queryToExecute)
#print(queryToExecute)
cursor.execute(queryToExecute)
raw_data = cursor.fetchall()
row_headers = [x[0] for x in cursor.description]
print(row_headers)
json_data = []

for r in raw_data:
@@ -128,21 +143,21 @@ def get_crime_totals():
response = {
'data' : json_data
}

response = {
'data' : json_data
}
except Exception as error:
response = {
'error' : error
'error' : error,
'data' : []
}
status = 500
response_pickled = jsonpickle.encode(response)
graph['analytics_latency'].observe((time.time_ns() - start) // 1000000)
return Response(response=response_pickled, status=status, mimetype='application/json')

@app.route('/api/alldata', methods=["GET"])
def get_all():
status = 200
start = time.time_ns()
graph['count'].inc()
try:
mydb = getDB()
pageno = int(request.args["pageno"])
@@ -155,14 +170,12 @@ def get_all():
long = float(request.args["long"])
startTime = int(request.args["startTime"])
endTime = int(request.args["endTime"])
print("reached")
cursor = mydb.cursor()
queryToExecute = f"SELECT * FROM crime WHERE REPORTED_DATE < {endTime} AND REPORTED_DATE > {startTime} ORDER BY (POWER(GEO_LAT-{lat}, 2)+POWER(GEO_LON-{long}, 2)) LIMIT {pagesize} OFFSET {(pageno-1)*pagesize}"
print(queryToExecute)
# print(queryToExecute)
cursor.execute(queryToExecute)
raw_data = cursor.fetchall()
row_headers = [x[0] for x in cursor.description]
print(row_headers)
json_data = []

for r in raw_data:
@@ -183,11 +196,22 @@
}
except Exception as error:
response = {
'error' : error
'error' : error,
'data': [],
'pageno': 0,
'pagesize': 0
}
status = 500
response_pickled = jsonpickle.encode(response)
graph['data_latency'].observe((time.time_ns() - start) // 1000000)
return Response(response=response_pickled, status=status, mimetype='application/json')

@app.route('/metrics', methods=["GET"])
def get_metrics():
res = []
for k,v in graph.items():
res.append(prometheus_client.generate_latest(v))
return Response(res, mimetype='text/plain')

return app

13 changes: 12 additions & 1 deletion integration-test/requirements.txt
@@ -1,2 +1,13 @@
certifi==2024.2.2
charset-normalizer==3.3.2
exceptiongroup==1.2.1
idna==3.7
iniconfig==2.0.0
packaging==24.0
pluggy==1.5.0
pytest==8.2.0
python-dateutil==2.9.0.post0
requests==2.31.0
pytest==8.2.0
six==1.16.0
tomli==2.0.1
urllib3==2.2.1
40 changes: 38 additions & 2 deletions integration-test/test_integration.py
@@ -1,7 +1,43 @@
from datetime import datetime
import time
import requests
from dateutil.relativedelta import relativedelta

BACKEND_URL = "https://betafish-flask-backend-3asud65paa-uc.a.run.app/"
BACKEND_URL = "https://betafish-flask-backend-3asud65paa-uc.a.run.app"
COLLECTOR_URL = "https://us-central1-csci-5828-final-project.cloudfunctions.net/betafish-collector"
ANALYZER_URL = "https://us-central1-csci-5828-final-project.cloudfunctions.net/betafish-analyzer"

def test_backend_is_deployed_and_healthy():
response = requests.get(BACKEND_URL + "/api/health")
assert response.status_code is 200
assert response.status_code == 200

def test_backend_is_not_storing_beyond_31_days():
timeToFilter = datetime.today().replace(hour=0, minute=0, second=0, microsecond=0) - relativedelta(days=33)
startReportedDateFilter = int(time.mktime(timeToFilter.timetuple()) * 1000)
timeToFilter = datetime.today().replace(hour=0, minute=0, second=0, microsecond=0) - relativedelta(days=32)
endReportedDateFilter = int(time.mktime(timeToFilter.timetuple()) * 1000)
response = requests.get(BACKEND_URL + f"/api/alldata?startTime={startReportedDateFilter}&endTime={endReportedDateFilter}&lat=39.74956044238265&long=-104.95078325271608&pageno=1&pagesize=20")
assert len(response.json()['data']) == 0
assert response.status_code == 200

# premise: there will always be at least one crime each day
def test_backend_is_storing_for_last_30_days():
timeToFilter = datetime.today().replace(hour=0, minute=0, second=0, microsecond=0) - relativedelta(days=30)
startReportedDateFilter = int(time.mktime(timeToFilter.timetuple()) * 1000)
timeToFilter = datetime.today().replace(hour=0, minute=0, second=0, microsecond=0) - relativedelta(days=29)
endReportedDateFilter = int(time.mktime(timeToFilter.timetuple()) * 1000)
response = requests.get(BACKEND_URL + f"/api/alldata?startTime={startReportedDateFilter}&endTime={endReportedDateFilter}&lat=39.74956044238265&long=-104.95078325271608&pageno=1&pagesize=20")
assert len(response.json()['data']) > 0
assert response.status_code == 200


def test_collector_prod():
response = requests.get(COLLECTOR_URL)
assert response.text == "done!!"
assert response.status_code == 200


def test_analyzer_prod():
response = requests.get(ANALYZER_URL)
assert response.text == "success"
assert response.status_code == 200
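
These integration tests can be run with `pytest` from the `integration-test` directory (after `pip install -r requirements.txt`); they exercise the deployed backend, collector, and analyzer over HTTP rather than any local instance.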
