Adding Throughput Anomaly detector files and CLI support
This PR is responsible for:
1. Adding Throughput Anomaly Detector files to Theia.
2. Adding CLI support. CLI support for get, list, and status is present; the commands are yet to be implemented.
3. Adding a CRD for the anomaly detector.
4. Adding e2e tests for all three algorithms of TAD.
5. Adding unit tests.
6. Using the same Spark operator for all Spark jobs.
NOTE: TODO Add stat as an API
Signed-off-by: Tushar Tathgur <tathgurt@tathgurtFLVDL.vmware.com>
Tushar Tathgur authored and committed on Feb 22, 2023
1 parent f2328e7 · commit 1628e5c
Showing 75 changed files with 6,059 additions and 233 deletions.
@@ -0,0 +1,74 @@
apiVersion: apiextensions.k8s.io/v1
kind: CustomResourceDefinition
metadata:
  name: throughputanomalydetectors.crd.theia.antrea.io
  labels:
    app: theia
spec:
  group: crd.theia.antrea.io
  versions:
    - name: v1alpha1
      served: true
      storage: true
      schema:
        openAPIV3Schema:
          type: object
          required:
            - spec
          properties:
            spec:
              type: object
              required:
                - jobType
              properties:
                jobType:
                  type: string
                startInterval:
                  type: string
                  format: datetime
                endInterval:
                  type: string
                  format: datetime
                executorInstances:
                  type: integer
                driverCoreRequest:
                  type: string
                driverMemory:
                  type: string
                executorCoreRequest:
                  type: string
                executorMemory:
                  type: string
            status:
              type: object
              properties:
                state:
                  type: string
                sparkApplication:
                  type: string
                completedStages:
                  type: integer
                totalStages:
                  type: integer
                startTime:
                  type: string
                  format: datetime
                endTime:
                  type: string
                  format: datetime
                errorMsg:
                  type: string
      additionalPrinterColumns:
        - description: Current state of the job
          jsonPath: .status.state
          name: State
          type: string
      subresources:
        status: {}
  scope: Namespaced
  names:
    plural: throughputanomalydetectors
    singular: throughputanomalydetector
    kind: ThroughputAnomalyDetector
    shortNames:
      - tad
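For reference, a custom resource conforming to this new schema might look like the sketch below. The name, namespace, algorithm value, timestamp format, and resource sizes are illustrative assumptions and are not taken from this commit.

# Illustrative ThroughputAnomalyDetector resource (all values assumed, not part of this diff)
apiVersion: crd.theia.antrea.io/v1alpha1
kind: ThroughputAnomalyDetector
metadata:
  name: tad-sample              # hypothetical name
  namespace: flow-visibility    # assumed namespace
spec:
  jobType: "ARIMA"              # assumed algorithm value for one of the TAD algorithms
  startInterval: "2023-02-01T00:00:00"   # assumed timestamp format
  endInterval: "2023-02-02T00:00:00"     # assumed timestamp format
  executorInstances: 1
  driverCoreRequest: "200m"
  driverMemory: "512M"
  executorCoreRequest: "200m"
  executorMemory: "512M"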
build/charts/theia/provisioning/datasources/migrators/000004_0-4-0.down.sql (2 additions, 0 deletions)
@@ -0,0 +1,2 @@
--Drop table
DROP TABLE tadetector_local;
build/charts/theia/provisioning/datasources/migrators/000004_0-4-0.up.sql (24 additions, 0 deletions)
@@ -0,0 +1,24 @@
--Create a table to store the Throughput Anomaly Detector results
CREATE TABLE IF NOT EXISTS tadetector_local (
    sourceIP String,
    sourceTransportPort UInt16,
    destinationIP String,
    destinationTransportPort UInt16,
    protocolIdentifier UInt16,
    flowStartSeconds DateTime,
    flowEndSeconds String,
    throughputStandardDeviation String,
    algoType String,
    algoCalc String,
    Throughputs String,
    anomaly String,
    id String
) engine=ReplicatedMergeTree('/clickhouse/tables/{shard}/{database}/{table}', '{replica}')
ORDER BY (flowStartSeconds);

--Move data from old table and drop the old table
INSERT INTO tadetector_local SELECT * FROM tadetector;
DROP TABLE tadetector;

CREATE TABLE IF NOT EXISTS tadetector AS tadetector_local
engine=Distributed('{cluster}', default, tadetector_local, rand());
@@ -0,0 +1,17 @@
FROM gcr.io/spark-operator/spark-py:v3.1.1

LABEL maintainer="Antrea <projectantrea-dev@googlegroups.com>"
LABEL description="A docker image to deploy Throughput Anomaly Detection Spark job."

WORKDIR /opt/spark/work-dir
USER root

RUN apt-get --allow-releaseinfo-change update && \
    apt-get install -y --no-install-recommends wget ca-certificates && \
    wget https://github.com/ClickHouse/clickhouse-jdbc/releases/download/v0.3.1/clickhouse-jdbc-0.3.1.jar -P /opt/spark/jars/

COPY plugins/anomaly-detection/AnomalyDetection.py /opt/spark/work-dir/AnomalyDetection.py
COPY plugins/anomaly-detection/requirements.txt /opt/spark/work-dir/requirements.txt

RUN pip3 install --upgrade pip && \
    pip3 install -r /opt/spark/work-dir/requirements.txt
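Since the commit notes that the same Spark operator is used for all Spark jobs, the image built here would presumably be referenced from a SparkApplication resource. A minimal sketch follows, assuming the spark-on-k8s-operator v1beta2 API; the image reference, name, namespace, and resource sizes are assumptions, not values from this commit.

# Illustrative SparkApplication running the copied AnomalyDetection.py (values assumed)
apiVersion: sparkoperator.k8s.io/v1beta2
kind: SparkApplication
metadata:
  name: tad-sample-job          # hypothetical name
  namespace: flow-visibility    # assumed namespace
spec:
  type: Python
  pythonVersion: "3"
  mode: cluster
  image: theia-anomaly-detection:latest   # assumed image reference built from this Dockerfile
  mainApplicationFile: local:///opt/spark/work-dir/AnomalyDetection.py
  sparkVersion: "3.1.1"
  driver:
    cores: 1
    memory: 512m
  executor:
    instances: 1
    cores: 1
    memory: 512m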