⛔ This project is no longer actively maintained.
Riffusion is an app for real-time music generation with stable diffusion.
This repository contains the interactive web app that powers the website.
It is built with Next.js, React, TypeScript, three.js, Tailwind, and Vercel.
This is a Next.js project bootstrapped with `create-next-app`.
First, make sure you have Node v18 or greater installed by running `node --version`.
Install packages:

```bash
npm install
```

Run the development server:

```bash
npm run dev
# or
yarn dev
```
Open http://localhost:3000 with your browser to see the app.
The app home is at `pages/index.js`. The page auto-updates as you edit the file. The about page is at `pages/about.tsx`.
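For orientation, a minimal page under this convention looks like the sketch below. The file name and contents are illustrative only, not the actual contents of `pages/about.tsx`.

```tsx
// pages/example.tsx -- hypothetical page, shown only to illustrate the pages/ convention
export default function Example() {
  // Any component default-exported from pages/ is served at the matching route (/example).
  return <main>Riffusion example page</main>;
}
```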
The `pages/api` directory is mapped to `/api/*`. Files in this directory are treated as API routes instead of React pages.
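As a minimal sketch of that convention (the file name and response shape here are illustrative, not taken from this repository), an API route is just a default-exported handler:

```tsx
// pages/api/hello.ts -- hypothetical route name, for illustration only
import type { NextApiRequest, NextApiResponse } from "next";

export default function handler(req: NextApiRequest, res: NextApiResponse) {
  // A file under pages/api becomes an endpoint at /api/<name>.
  res.status(200).json({ message: "Hello from /api/hello" });
}
```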
To actually generate model outputs, you need a model backend that responds to inference requests via an API. If you have a GPU powerful enough to run Stable Diffusion in under five seconds, clone the inference server repository and follow its instructions to run the Flask app.
You will need to add a `.env.local` file in the root of this repository specifying the URL of the inference server:

```
RIFFUSION_FLASK_URL=http://127.0.0.1:3013/run_inference/
```
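As a rough sketch of how the app can use this variable, an API route could read `process.env.RIFFUSION_FLASK_URL` and proxy inference requests to the Flask server. The route name, request body, and response shape below are assumptions for illustration; see the repository's actual API routes for the real contract.

```tsx
// pages/api/inference.ts -- hypothetical proxy route; request/response shapes are illustrative
import type { NextApiRequest, NextApiResponse } from "next";

export default async function handler(req: NextApiRequest, res: NextApiResponse) {
  const flaskUrl = process.env.RIFFUSION_FLASK_URL;
  if (!flaskUrl) {
    res.status(500).json({ error: "RIFFUSION_FLASK_URL is not set in .env.local" });
    return;
  }

  // Forward the JSON inference request to the Flask backend unchanged.
  const response = await fetch(flaskUrl, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(req.body),
  });

  res.status(response.status).json(await response.json());
}
```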
If you build on this work, please cite it as follows:
```bibtex
@article{Forsgren_Martiros_2022,
  author = {Forsgren, Seth* and Martiros, Hayk*},
  title = {{Riffusion - Stable diffusion for real-time music generation}},
  url = {https://riffusion.com/about},
  year = {2022}
}
```