Monitoring added #47

Merged — 18 commits, May 3, 2024
19 changes: 19 additions & 0 deletions .github/workflows/integration-tests.yml
@@ -0,0 +1,19 @@
name: integration-test

on: [push, workflow_dispatch]

jobs:
run-python-tests:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- name: Set up Python
uses: actions/setup-python@v4
with:
python-version: '3.x'
- name: Install dependencies
working-directory: ./integration-test
run: python -m pip install -r requirements.txt
- name: Test with pytest
working-directory: ./integration-test
run: python -m pytest
22 changes: 22 additions & 0 deletions .github/workflows/playwright-acceptance-tests.yml
@@ -0,0 +1,22 @@
name: playwright-acceptance-test

on: [push, workflow_dispatch]

jobs:
run-python-tests:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- name: Set up Python
uses: actions/setup-python@v4
with:
python-version: '3.x'
- name: Install dependencies
working-directory: ./playwright-acceptance-test
run: python -m pip install -r requirements.txt
- name: Install playwright
working-directory: ./playwright-acceptance-test
run: python -m playwright install --with-deps
- name: Test with pytest
working-directory: ./playwright-acceptance-test
run: python -m pytest
22 changes: 17 additions & 5 deletions README.md
@@ -1,9 +1,9 @@
# BetaFish
Group 12: Emily Parker, Ranajit Roy, Jonathan Gorman

[![analyzer-test](https://github.com/CSCI-5828-S24/BetaFish/actions/workflows/test_analyzer.yml/badge.svg?branch=main)](https://github.com/CSCI-5828-S24/BetaFish/actions/workflows/test_analyzer.yml) [![collector-test](https://github.com/CSCI-5828-S24/BetaFish/actions/workflows/test_collector.yml/badge.svg?branch=main)](https://github.com/CSCI-5828-S24/BetaFish/actions/workflows/test_collector.yml) [![flask-server-test](https://github.com/CSCI-5828-S24/BetaFish/actions/workflows/test_flask_server.yml/badge.svg?branch=main)](https://github.com/CSCI-5828-S24/BetaFish/actions/workflows/test_flask_server.yml)
[![analyzer-test](https://github.com/CSCI-5828-S24/BetaFish/actions/workflows/test_analyzer.yml/badge.svg?branch=main)](https://github.com/CSCI-5828-S24/BetaFish/actions/workflows/test_analyzer.yml) [![collector-test](https://github.com/CSCI-5828-S24/BetaFish/actions/workflows/test_collector.yml/badge.svg?branch=main)](https://github.com/CSCI-5828-S24/BetaFish/actions/workflows/test_collector.yml) [![flask-server-test](https://github.com/CSCI-5828-S24/BetaFish/actions/workflows/test_flask_server.yml/badge.svg?branch=main)](https://github.com/CSCI-5828-S24/BetaFish/actions/workflows/test_flask_server.yml) [![integration-test](https://github.com/CSCI-5828-S24/BetaFish/actions/workflows/integration-tests.yml/badge.svg?branch=main)](https://github.com/CSCI-5828-S24/BetaFish/actions/workflows/integration-tests.yml) [![playwright-acceptance-test](https://github.com/CSCI-5828-S24/BetaFish/actions/workflows/playwright-acceptance-tests.yml/badge.svg?branch=main)](https://github.com/CSCI-5828-S24/BetaFish/actions/workflows/playwright-acceptance-tests.yml)

# Instructions
## Instructions

Below are the instructions to run the web app locally on a Linux/Ubuntu or Windows platform.

@@ -39,11 +39,11 @@ Now there are two options:
* run with a docker image


## Run locally
### Run locally

> python3 flask-backend/src/flask_server.py

## Run on Docker
### Run on Docker

To run with a Docker image, run the commands below:

@@ -55,4 +55,16 @@ docker run -p5000:5000 web-app
The web page should now be accessible on localhost at port 5000.
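Once the app is reachable, the `/api/health` route added in this PR offers a quick liveness check. Below is a self-contained sketch of that check pattern — the stub server stands in for the Flask backend (its response shape is illustrative, not the exact jsonpickle payload the real handler emits):

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

# Stub standing in for the backend's /api/health route (illustrative only).
class HealthHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = json.dumps({"data": True}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging
        pass

# Bind to an ephemeral port and serve in a background thread.
server = HTTPServer(("127.0.0.1", 0), HealthHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

url = f"http://127.0.0.1:{server.server_port}/api/health"
with urlopen(url) as resp:
    status, payload = resp.status, json.load(resp)
server.shutdown()
```

Against the running container, the same `urlopen` call would simply target `http://localhost:5000/api/health`.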

## Current Completed Architecture
![image](https://github.com/CSCI-5828-S24/BetaFish/assets/143036094/8c079df8-c48f-47c8-950b-8148625e16a0)
![image](./diagrams/architecture.png)

## Updated Link to the website

https://betafish-flask-backend-3asud65paa-uc.a.run.app/ (Link may not be active)


## UI Samples
Here are some UI samples (in case the link above is not active):

![Landing Page](./UI-samples/landing-page.png)
![Search Results](./UI-samples/search-results.png)
![Analytics Page](./UI-samples/analytics-page.png)
Binary file added UI-samples/analytics-page.png
Binary file added UI-samples/landing-page.png
Binary file added UI-samples/search-results.png
3 changes: 0 additions & 3 deletions cloud-data-analyzer/main.py
@@ -149,6 +149,3 @@ def analyze(request):
mydb.commit()
return "success" if status else "failed"


if __name__ == "__main__":
analyze()
Binary file added diagrams/architecture.png
4 changes: 3 additions & 1 deletion flask-backend/README.md
@@ -1 +1,3 @@
Initial Creation of back end for HW5 Testing CI/CD
## Flask Backend

This backend hosts both the APIs and the web pages for the web app.
1 change: 1 addition & 0 deletions flask-backend/requirements.txt
@@ -11,6 +11,7 @@ MarkupSafe==2.1.5
mysql-connector-python==8.3.0
packaging==24.0
pluggy==1.5.0
prometheus_client==0.20.0
pytest==8.2.0
python-dotenv==1.0.1
six==1.16.0
64 changes: 44 additions & 20 deletions flask-backend/src/flask_server.py
@@ -2,7 +2,9 @@
from flask import Flask, request, Response, g
from flask_cors import CORS
import jsonpickle
import os
import os, time
import prometheus_client
from prometheus_client.metrics import Histogram, Counter

import logging
logging.basicConfig(level=logging.DEBUG)
@@ -14,15 +16,22 @@
app = Flask(__name__, static_folder="../../react-frontend/build", static_url_path='/')
CORS(app, origins=["*"])

def create_app():
def create_app(graph:dict={}):
_INF = float("inf")
app = Flask(__name__, static_folder="../../react-frontend/build", static_url_path='/')


app.config['MYSQL_HOST'] = os.getenv("MYSQL_HOST")
app.config['MYSQL_USER'] = os.getenv("MYSQL_USER")
app.config['MYSQL_PASSWORD'] = os.getenv("MYSQL_PASSWORD")
app.config['MYSQL_DB'] = os.getenv("MYSQL_DB")


if 'count' not in graph.keys():
graph['count'] = Counter('total_req', 'total requests on all paths')
if 'analytics_latency' not in graph.keys():
graph['analytics_latency'] = Histogram('analytics_latency', 'Latency for analytics API', buckets=(100, 200, 300, _INF))
if 'data_latency' not in graph.keys():
graph['data_latency'] = Histogram('data_latency', 'Latency for data API', buckets=(100, 200, 300, _INF))

def getDB():
if 'db' not in g or not g.db.is_connected():
g.db = mysql.connector.connect(
@@ -41,6 +50,7 @@ def getDB():
"""
@app.route('/')
def index():
graph['count'].inc()
return app.send_static_file('index.html')

"""
@@ -55,10 +65,12 @@ def health():
}
except Exception as error:
response = {
'error' : error
'error' : error,
'data': False
}
status = 500
response_pickled = jsonpickle.encode(response)
graph['count'].inc()
return Response(response=response_pickled, status=status, mimetype='application/json')

# @app.route('/api/multiply/<int:x>/<int:y>', methods=["GET"])
@@ -78,16 +90,17 @@ def health():

@app.route('/api/crime_freq', methods=["GET"])
def get_crime_freq():
start = time.time_ns()
graph['count'].inc()
status = 200
try:
mydb = getDB()
cursor = mydb.cursor()
queryToExecute = f"SELECT * FROM crime_freq"
print(queryToExecute)
# print(queryToExecute)
cursor.execute(queryToExecute)
raw_data = cursor.fetchall()
row_headers = [x[0] for x in cursor.description]
print(row_headers)
json_data = []

for r in raw_data:
@@ -98,27 +111,29 @@ def get_crime_freq():
'data' : json_data
}

print(response)
except Exception as error:
response = {
'error' : error
'error' : error,
'data': []
}
status = 500
response_pickled = jsonpickle.encode(response)
graph['analytics_latency'].observe((time.time_ns() - start) // 1000000)
return Response(response=response_pickled, status=status, mimetype='application/json')

@app.route('/api/crime_totals', methods=["GET"])
def get_crime_totals():
start = time.time_ns()
graph['count'].inc()
status = 200
try:
mydb = getDB()
cursor = mydb.cursor()
queryToExecute = "SELECT * FROM crime_totals"
print(queryToExecute)
#print(queryToExecute)
cursor.execute(queryToExecute)
raw_data = cursor.fetchall()
row_headers = [x[0] for x in cursor.description]
print(row_headers)
json_data = []

for r in raw_data:
@@ -128,21 +143,21 @@ def get_crime_totals():
response = {
'data' : json_data
}

response = {
'data' : json_data
}
except Exception as error:
response = {
'error' : error
'error' : error,
'data' : []
}
status = 500
response_pickled = jsonpickle.encode(response)
graph['analytics_latency'].observe((time.time_ns() - start) // 1000000)
return Response(response=response_pickled, status=status, mimetype='application/json')

@app.route('/api/alldata', methods=["GET"])
def get_all():
status = 200
start = time.time_ns()
graph['count'].inc()
try:
mydb = getDB()
pageno = int(request.args["pageno"])
@@ -155,14 +170,12 @@ def get_all():
long = float(request.args["long"])
startTime = int(request.args["startTime"])
endTime = int(request.args["endTime"])
print("reached")
cursor = mydb.cursor()
queryToExecute = f"SELECT * FROM crime WHERE REPORTED_DATE < {endTime} AND REPORTED_DATE > {startTime} ORDER BY (POWER(GEO_LAT-{lat}, 2)+POWER(GEO_LON-{long}, 2)) LIMIT {pagesize} OFFSET {(pageno-1)*pagesize}"
print(queryToExecute)
# print(queryToExecute)
cursor.execute(queryToExecute)
raw_data = cursor.fetchall()
row_headers = [x[0] for x in cursor.description]
print(row_headers)
json_data = []

for r in raw_data:
@@ -183,11 +196,22 @@
}
except Exception as error:
response = {
'error' : error
'error' : error,
'data': [],
'pageno': 0,
'pagesize': 0
}
status = 500
response_pickled = jsonpickle.encode(response)
graph['data_latency'].observe((time.time_ns() - start) // 1000000)
return Response(response=response_pickled, status=status, mimetype='application/json')

@app.route('/metrics', methods=["GET"])
def get_metrics():
res = []
for k,v in graph.items():
res.append(prometheus_client.generate_latest(v))
return Response(res, mimetype='text/plain')

return app

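The latency instrumentation added to each route above follows one pattern: record `time.time_ns()` on entry and observe the elapsed whole milliseconds (`(end - start) // 1000000`) on exit. A minimal stdlib-only sketch of that pattern — the Prometheus `Histogram` is stubbed with a list, and the decorator and names are illustrative, not code from this PR:

```python
import time

class FakeHistogram:
    """Stand-in for prometheus_client.Histogram; records observed values."""
    def __init__(self):
        self.samples = []

    def observe(self, value):
        self.samples.append(value)

analytics_latency = FakeHistogram()

def timed(histogram):
    """Decorator that observes a handler's latency in whole milliseconds."""
    def wrap(fn):
        def inner(*args, **kwargs):
            start = time.time_ns()
            try:
                return fn(*args, **kwargs)
            finally:
                # Same conversion as the PR: nanoseconds -> floor milliseconds.
                histogram.observe((time.time_ns() - start) // 1_000_000)
        return inner
    return wrap

@timed(analytics_latency)
def handler():
    time.sleep(0.01)  # simulate work done by a route
    return "ok"
```

The PR inlines this start/observe pair in each route rather than using a decorator; the effect on the recorded histogram is the same.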
1 change: 1 addition & 0 deletions integration-test/README.md
@@ -0,0 +1 @@
Folder housing integration tests that determine whether the deployed services are up and running.
13 changes: 13 additions & 0 deletions integration-test/requirements.txt
@@ -0,0 +1,13 @@
certifi==2024.2.2
charset-normalizer==3.3.2
exceptiongroup==1.2.1
idna==3.7
iniconfig==2.0.0
packaging==24.0
pluggy==1.5.0
pytest==8.2.0
python-dateutil==2.9.0.post0
requests==2.31.0
six==1.16.0
tomli==2.0.1
urllib3==2.2.1
43 changes: 43 additions & 0 deletions integration-test/test_integration.py
@@ -0,0 +1,43 @@
from datetime import datetime
import time
import requests
from dateutil.relativedelta import relativedelta

BACKEND_URL = "https://betafish-flask-backend-3asud65paa-uc.a.run.app"
COLLECTOR_URL = "https://us-central1-csci-5828-final-project.cloudfunctions.net/betafish-collector"
ANALYZER_URL = "https://us-central1-csci-5828-final-project.cloudfunctions.net/betafish-analyzer"

def test_backend_is_deployed_and_healthy():
response = requests.get(BACKEND_URL + "/api/health")
assert response.status_code == 200

def test_backend_is_not_storing_beyond_31_days():
timeToFilter = datetime.today().replace(hour=0, minute=0, second=0, microsecond=0) - relativedelta(days=33)
startReportedDateFilter = int(time.mktime(timeToFilter.timetuple()) * 1000)
timeToFilter = datetime.today().replace(hour=0, minute=0, second=0, microsecond=0) - relativedelta(days=32)
endReportedDateFilter = int(time.mktime(timeToFilter.timetuple()) * 1000)
response = requests.get(BACKEND_URL + f"/api/alldata?startTime={startReportedDateFilter}&endTime={endReportedDateFilter}&lat=39.74956044238265&long=-104.95078325271608&pageno=1&pagesize=20")
assert len(response.json()['data']) == 0
assert response.status_code == 200

# premise: there will always be at least one crime each day
def test_backend_is_storing_for_last_30_days():
timeToFilter = datetime.today().replace(hour=0, minute=0, second=0, microsecond=0) - relativedelta(days=30)
startReportedDateFilter = int(time.mktime(timeToFilter.timetuple()) * 1000)
timeToFilter = datetime.today().replace(hour=0, minute=0, second=0, microsecond=0) - relativedelta(days=29)
endReportedDateFilter = int(time.mktime(timeToFilter.timetuple()) * 1000)
response = requests.get(BACKEND_URL + f"/api/alldata?startTime={startReportedDateFilter}&endTime={endReportedDateFilter}&lat=39.74956044238265&long=-104.95078325271608&pageno=1&pagesize=20")
assert len(response.json()['data']) > 0
assert response.status_code == 200


def test_collector_prod():
response = requests.get(COLLECTOR_URL)
assert response.text == "done!!"
assert response.status_code == 200


def test_analyzer_prod():
response = requests.get(ANALYZER_URL)
assert response.text == "success"
assert response.status_code == 200
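The date filters in the tests above are all built the same way: floor today to midnight, shift back N days, and convert to epoch milliseconds. A stdlib-only sketch of that computation (using `timedelta` in place of `relativedelta`, which behaves identically for whole-day offsets; the function name is illustrative):

```python
import time
from datetime import datetime, timedelta

def epoch_ms_days_ago(days: int) -> int:
    """Midnight local time, `days` days ago, as epoch milliseconds."""
    midnight = datetime.today().replace(hour=0, minute=0, second=0, microsecond=0)
    shifted = midnight - timedelta(days=days)
    # time.mktime interprets the struct_time in local time, like the tests do.
    return int(time.mktime(shifted.timetuple()) * 1000)

# One-day window, mirroring the 33-to-32-days-ago filter above.
start = epoch_ms_days_ago(33)
end = epoch_ms_days_ago(32)
```

Note that because `mktime` uses local time, a window spanning a DST transition is 23 or 25 hours rather than exactly 24 — harmless for these coarse "is data present" checks.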
3 changes: 3 additions & 0 deletions playwright-acceptance-test/README.md
@@ -0,0 +1,3 @@
Playwright tests that interact with the deployed application to exercise user-facing functionality.

https://playwright.dev/python/
1 change: 1 addition & 0 deletions playwright-acceptance-test/requirements.txt
@@ -0,0 +1 @@
pytest-playwright==0.4.4
26 changes: 26 additions & 0 deletions playwright-acceptance-test/test_playwright.py
@@ -0,0 +1,26 @@
import re
from playwright.sync_api import Page, expect

def test_home_page_has_title(page:Page):
page.goto("https://betafish-flask-backend-3asud65paa-uc.a.run.app/")
expect(page).to_have_title(re.compile("Denver Crime Tracker"))

def test_click_analytics_changes_display(page:Page):
page.goto("https://betafish-flask-backend-3asud65paa-uc.a.run.app/")

page.get_by_text("Analytics").click()
expect(page.locator("#first-chart")).to_be_visible()
expect(page.locator("#second-chart")).to_be_visible()

def test_search_returns_results(page:Page):
page.goto("https://betafish-flask-backend-3asud65paa-uc.a.run.app/")
expect(page.locator("tbody")).to_contain_text("-")
page.get_by_label("Start date").fill("2024-03-01")
page.get_by_role("button", name="Search").click()
expect(page.get_by_role("cell", name="4/25/").first).to_be_visible()

def test_search_adds_icon_to_map(page:Page):
page.goto("https://betafish-flask-backend-3asud65paa-uc.a.run.app/")
page.get_by_label("Start date").fill("2024-03-01")
page.get_by_role("button", name="Search").click()
expect(page.locator(".leaflet-pane > img:nth-child(2)")).to_be_visible()