Live deployments:

- michigan-dining-api.tendiesti.me
- michigan-dining-api.herokuapp.com
A system for scraping and serving information from the University of Michigan dining API. This repository contains the code for fetching, analyzing, and serving dining information. For how to make use of this service, see the Usage section below or visit the mdining-proto repository for the protobuf service definitions.

Data is scraped from the University of Michigan's MDining API, then formatted and stored in a database to be served. This allows for data structures and formats that are easier to work with than the original API's, and for historical data to be retrieved beyond what the MDining API offers (2019-11-02 is the earliest date available through the tendies time michigan dining api service). Check out the MDining statistics page of tendies time for examples made using this historical data.
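Since historical data is only available back to 2019-11-02, a client building queries may want to clamp requested dates to that cutoff and format them in the yyyy-MM-dd form the API expects. A minimal Python sketch (the cutoff date comes from the paragraph above; the helper names are illustrative, not part of this API):

```python
from datetime import date

# Earliest date available through the tendies time michigan dining api service.
EARLIEST_DATE = date(2019, 11, 2)

def clamp_start(start: date) -> date:
    """Clamp a requested start date to the earliest date with historical data."""
    return max(start, EARLIEST_DATE)

def api_date(d: date) -> str:
    """Format a date in the yyyy-MM-dd form used by the query parameters."""
    return d.strftime("%Y-%m-%d")

# A request for data from before the cutoff gets clamped to 2019-11-02.
print(api_date(clamp_start(date(2019, 1, 1))))
```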
Contents:

- Setup
- Executables
- Deployment
- Usage

## Setup
Clone this repo:

```shell
git clone https://github.com/anders617/michigan-dining-api.git
```

Install the Bazel build system.
## Executables

This project uses the glog library for logging. The `--alsologtostderr` flag can be specified to send log output to stderr.
Run the web server:

```shell
bazel run //cmd:web -- --alsologtostderr
```
Run the fetch executable to fill the DiningHalls/Foods/Menus tables:

```shell
bazel run //cmd:fetch -- --alsologtostderr
```
Run the analyze executable to fill the FoodStats table (depends on data from running `//cmd:fetch` above):

```shell
bazel run //cmd:analyze -- --alsologtostderr
```
Run the db executable to create tables:

```shell
bazel run //cmd:db -- --alsologtostderr --create
```
Run the db executable to delete tables:

```shell
bazel run //cmd:db -- --alsologtostderr --delete
```
Run the testing client executable to connect to an instance of the web server:

```shell
bazel run //cmd:client -- --alsologtostderr --address=michigan-dining-api.tendiesti.me:443 --use_credentials
```
The `//cmd:web`, `//cmd:fetch`, and `//cmd:analyze` executables all have rules for creating distroless Docker images:

- `//cmd/web:web_image`
- `//cmd/fetch:fetch_image`
- `//cmd/analyze:analyze_image`
There are also rules for pushing these container images to container registries:

- `//cmd/web:web_image_publish`
- `//cmd/fetch:fetch_image_publish`
- `//cmd/analyze:analyze_image_publish`
Note that each target above needs to be run with the `--platforms=@io_bazel_rules_go//go/toolchain:linux_amd64` flag set to ensure the binaries are built for running in a Linux container. Alternatively, you can specify `--config=container` to use the config set in the `.bazelrc` and avoid having to remember the long platform name.
Currently these rules are configured to push the images to gcr.io/michigandiningapi, but they can easily be configured to publish to other container registries by editing the rules in the BUILD files. This means that the latest container image builds for each executable are available at:
- `gcr.io/michigandiningapi/web:latest`
- `gcr.io/michigandiningapi/fetch:latest`
- `gcr.io/michigandiningapi/analyze:latest`
Note that these container images need the `AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY` environment variables set when run, for the AWS account which will host the DynamoDB data tables.
Note that since these are distroless Docker images, they contain only the bare minimum needed to run the executable: no shell or other standard Linux programs are included. This means that traditional Docker healthchecks that depend on shell commands will not work and should not be used for determining container health.
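Because the distroless images ship no shell, container health has to be checked from outside the container, for example by probing the web server's listening port. A minimal sketch of such an external TCP probe in Python (the host and port are placeholders, not values from this repository; a load balancer's built-in TCP health check achieves the same thing without any code):

```python
import socket

def tcp_healthy(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout.

    This is a plain connect check, usable from outside a distroless
    container that cannot run command-based healthchecks.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```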
## Deployment

### AWS

Currently michigan-dining-api is deployed and hosted on AWS using the Elastic Container Service at michigan-dining-api.tendiesti.me.
There is a task definition for each executable container image (listed above). Within each task definition, the `AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY` environment variables must be specified for the AWS account hosting the DynamoDB tables. The analyze and web tasks each have 0.5 GiB of memory and 0.25 vCPU allocated. The fetch task requires more memory and is allocated 1.0 GiB of memory and 0.25 vCPU.
There is a Service defined for the web server using the web task definition. This service is deployed on a Fargate cluster and is configured with load balancing through a network load balancer. The network load balancer is configured with an SSL/TLS certificate on its :443 listener and decrypts HTTPS traffic before forwarding it to the web server. It is important that this is a network load balancer rather than an application load balancer, since AWS application load balancers do not handle gRPC-style HTTP/2 traffic correctly.
There are scheduled tasks for the fetch and analyze tasks to run once daily in order to update the dynamodb tables.
### Heroku

Currently michigan-dining-api is also deployed and hosted on Heroku at https://michigan-dining-api.herokuapp.com.

Heroku is not ideal for hosting gRPC servers since it does not support HTTP/2. Therefore, if you plan to take advantage of gRPC, I recommend using a different provider such as AWS.
In order to deploy your own server:

- Set up the Heroku application to point to this repository
- Add the custom heroku-buildpack-bazel buildpack to allow building with Bazel
- Set up the Heroku Scheduler add-on to run the commands `cmd/fetch/fetch` and `cmd/analyze/analyze` daily in order to fill the data tables
- Set the following Heroku config vars:
  - `AWS_ACCESS_KEY_ID` - Access key used for AWS DynamoDB access
  - `AWS_SECRET_ACCESS_KEY` - Secret used for AWS DynamoDB access
  - `BAZEL_BUILD_PATH` - `//cmd:all`
  - `BAZEL_VERSION` - `1.1.0` (or a later version)
  - `BUILD_CACHE_LOCATION` - Address of a Bazel remote cache server (optional)
- Go to the deploy tab and click deploy branch
## Usage

There are examples of gRPC usage and client libraries in the mdining-proto repository, which also contains the proto definitions of the messages and services provided by this service.

You can try out the following queries to get a sense of what is available through the API. Additionally, the homepage for the tendies time michigan dining api service has longer descriptions of the purpose of each query.
- `/v1/items`
- `/v1/diningHalls`
- `/v1/filterableEntries`
- `/v1/all`
- `/v1/menus?date={yyyy-MM-dd}&diningHall={DINING_HALL}&meal={MEAL}`
- `/v1/foods?name={LOWERCASE_FOOD_NAME}&date={yyyy-MM-dd}&meal={MEAL}`
- `/v1/summarystats`
- `/v1/stats`
- `/v1/hearts`
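As an illustration, the menus query above can be assembled with the Python standard library. The parameter values here ("Bursley", "LUNCH") are made-up placeholders for this sketch, not necessarily valid dining hall or meal names:

```python
from urllib.parse import urlencode

BASE = "https://michigan-dining-api.tendiesti.me"

def menus_url(date: str, dining_hall: str, meal: str) -> str:
    """Build a /v1/menus query URL using the parameters listed above."""
    query = urlencode({"date": date, "diningHall": dining_hall, "meal": meal})
    return f"{BASE}/v1/menus?{query}"

# Placeholder values for illustration only.
print(menus_url("2019-11-02", "Bursley", "LUNCH"))
```

The resulting URL can then be fetched with any HTTP client to retrieve the menu data as served by the web executable.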