Rebase to noble (gpu) #19

Merged
merged 1 commit into from Aug 11, 2024
9 changes: 7 additions & 2 deletions Dockerfile
@@ -1,6 +1,6 @@
# syntax=docker/dockerfile:1

FROM ghcr.io/linuxserver/baseimage-ubuntu:jammy
FROM ghcr.io/linuxserver/baseimage-ubuntu:noble

# set version label
ARG BUILD_DATE
@@ -9,9 +9,12 @@ ARG WHISPER_VERSION
LABEL build_version="Linuxserver.io version:- ${VERSION} Build-date:- ${BUILD_DATE}"
LABEL maintainer="thespad"

ENV HOME=/config
ENV HOME=/config \
DEBIAN_FRONTEND="noninteractive" \
TMPDIR="/run/whisper-temp"

RUN \
echo "**** install packages ****" && \
apt-get update && \
apt-get install -y --no-install-recommends \
build-essential \
@@ -28,6 +31,8 @@ RUN \
nvidia-cublas-cu12 \
nvidia-cudnn-cu12==8.9.7.29 \
"wyoming-faster-whisper==${WHISPER_VERSION}" && \
printf "Linuxserver.io version: ${VERSION}\nBuild-date: ${BUILD_DATE}" > /build_version && \
echo "**** cleanup ****" && \
apt-get purge -y --auto-remove \
build-essential \
python3-dev && \
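For context, the Dockerfile consumes `BUILD_DATE`, `VERSION`, and `WHISPER_VERSION` as build arguments (the last pins the wyoming-faster-whisper release installed via pip). A hedged sketch of a local build; the image tag and all version values are placeholders, not the project's official build pipeline:

```bash
# Hypothetical local build; adjust the versions to the release you are packaging.
docker build \
  --build-arg BUILD_DATE="$(date -u +'%Y-%m-%dT%H:%M:%SZ')" \
  --build-arg VERSION="local" \
  --build-arg WHISPER_VERSION="2.1.0" \
  -t local/faster-whisper:noble-gpu .
```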
11 changes: 9 additions & 2 deletions README.md
@@ -73,6 +73,11 @@ When using the `gpu` tag with Nvidia GPUs, make sure you set the container to us

For more information see the [faster-whisper docs](https://github.com/SYSTRAN/faster-whisper),

## Read-Only Operation

This image can be run with a read-only container filesystem. For details please [read the docs](https://docs.linuxserver.io/misc/read-only/).
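
As a rough illustration of read-only operation (a minimal sketch, not the canonical command from the linked docs): the tmpfs mount for `/run` is an assumption, based on the container writing its temp files under `/run/whisper-temp`, and the host path, port, and tag are placeholders.

```bash
# Hedged read-only example; adapt paths and tag to your setup.
docker run -d \
  --name faster-whisper \
  --read-only \
  --tmpfs /run \
  -v /path/to/config:/config \
  -p 10300:10300 \
  lscr.io/linuxserver/faster-whisper:gpu
```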


## Usage

To help you get started creating a container from this image you can either use docker-compose or the docker cli.
@@ -130,6 +135,7 @@ Containers are configured using parameters passed at runtime (such as those abov
| `-e WHISPER_BEAM=1` | Number of candidates to consider simultaneously during transcription. |
| `-e WHISPER_LANG=en` | Language that you will speak to the add-on. |
| `-v /config` | Local path for Whisper config files. |
| `--read-only=true` | Run container with a read-only filesystem. Please [read the docs](https://docs.linuxserver.io/misc/read-only/). |
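
Putting the runtime parameters from the table together, a hedged `docker run` sketch for the `gpu` tag; the `--gpus all` flag, paths, and port are assumptions to adapt, and the read-only flags from the section above can be added as needed.

```bash
# Illustrative only; GPU access, host path, and port are placeholders.
docker run -d \
  --name faster-whisper-gpu \
  --gpus all \
  -e WHISPER_BEAM=1 \
  -e WHISPER_LANG=en \
  -v /path/to/config:/config \
  -p 10300:10300 \
  lscr.io/linuxserver/faster-whisper:gpu
```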

## Environment variables from files (Docker secrets)

@@ -292,6 +298,7 @@ Once registered you can define the dockerfile to use with `-f Dockerfile.aarch64`

## Versions

* **19.05.24:** - Bump CUDA to 12.
* **08.01.24:** - Add GPU Branch.
* **18.07.24:** - Rebase to Ubuntu Noble.
* **19.05.24:** - Bump CUDA to 12 on GPU branch.
* **08.01.24:** - Add GPU branch.
* **25.11.23:** - Initial Release.
8 changes: 5 additions & 3 deletions readme-vars.yml
@@ -6,7 +6,6 @@ project_url: "https://github.com/SYSTRAN/faster-whisper"
project_logo: "https://raw.githubusercontent.com/linuxserver/docker-templates/master/linuxserver.io/img/faster-whisper-logo.png"
project_blurb: "[{{ project_name|capitalize }}]({{ project_url }}) is a reimplementation of OpenAI's Whisper model using CTranslate2, which is a fast inference engine for Transformer models. This container provides a Wyoming protocol server for faster-whisper."
project_lsio_github_repo_url: "https://github.com/linuxserver/docker-{{ project_name }}"
project_blurb_optional_extras_enabled: false

# supported architectures
available_architectures:
@@ -37,6 +36,8 @@ opt_param_env_vars:
- { env_var: "WHISPER_BEAM", env_value: "1", desc: "Number of candidates to consider simultaneously during transcription." }
- { env_var: "WHISPER_LANG", env_value: "en", desc: "Language that you will speak to the add-on." }

readonly_supported: true

# application setup block
app_setup_block_enabled: true
app_setup_block: |
@@ -48,6 +49,7 @@
# changelog
changelogs:
- { date: "19.05.24:", desc: "Bump CUDA to 12." }
- { date: "08.01.24:", desc: "Add GPU Branch." }
- { date: "18.07.24:", desc: "Rebase to Ubuntu Noble." }
- { date: "19.05.24:", desc: "Bump CUDA to 12 on GPU branch." }
- { date: "08.01.24:", desc: "Add GPU branch." }
- { date: "25.11.23:", desc: "Initial Release." }
5 changes: 4 additions & 1 deletion root/etc/s6-overlay/s6-rc.d/init-whisper-config/run
@@ -1,6 +1,9 @@
#!/usr/bin/with-contenv bash
# shellcheck shell=bash

mkdir -p "/run/whisper-temp"

# permissions
lsiown -R abc:abc \
/config
/config \
/run/whisper-temp
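
The init step above pre-creates the temp directory and chowns both paths to the runtime user; a quick, hypothetical way to confirm that inside a running container (the container name is a placeholder):

```bash
# Hypothetical spot-check; expects both paths to be owned by abc:abc
# once the s6 init services have run.
docker exec faster-whisper ls -ld /config /run/whisper-temp
```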