19 Commits

Author SHA1 Message Date
28cb50492c Update jiggablend-runner script to accept additional runner flags
All checks were successful
Release Tag / release (push) Successful in 17s
- Modified the jiggablend-runner script to allow passing extra flags during execution, enhancing flexibility for users.
- Updated usage instructions to reflect the new syntax, providing examples for better clarity on how to utilize the runner with various options.
2026-03-13 21:18:04 -05:00
dc525fbaa4 Add hardware compatibility flags for CPU rendering and HIPRT control
- Introduced `--force-cpu-rendering` and `--disable-hiprt` flags to the runner command, allowing users to enforce CPU rendering and disable HIPRT acceleration.
- Updated the runner initialization and context structures to accommodate the new flags, enhancing flexibility in rendering configurations.
- Modified the rendering logic to respect these flags, improving compatibility and user control over rendering behavior in Blender.
2026-03-13 21:15:44 -05:00
5303f01f7c Implement GPU backend detection for Blender compatibility
- Added functionality to detect GPU backends (HIP and NVIDIA) during runner registration, enhancing compatibility for Blender versions below 4.x.
- Introduced a new method, DetectAndStoreGPUBackends, to download the latest Blender and run a detection script, storing the results for future rendering decisions.
- Updated rendering logic to force CPU rendering when HIP is detected on systems with Blender < 4.x, ensuring stability and compatibility.
- Enhanced the Context structure to include flags for GPU detection status, improving error handling and rendering decisions based on GPU availability.
2026-03-13 18:32:05 -05:00
bc39fd438b Add installation script for jiggablend binary
- Introduced a new installer.sh script to automate the installation of the latest jiggablend binary for Linux AMD64.
- The script fetches the latest release information, downloads the binary and its checksums, verifies the checksum, and installs the binary and wrapper scripts for the manager and runner.
- Added wrapper scripts for both the manager and runner with test setup instructions, enhancing user experience for initial setup.
2026-03-13 10:26:21 -05:00
4c7f168bce Enhance GPU error detection in RenderProcessor
- Updated gpuErrorSubstrings to include case-insensitive matching for GPU backend errors, improving error detection reliability.
- Modified checkGPUErrorLine to convert log lines to lowercase before checking for error indicators, ensuring consistent matching across different log formats.
2026-03-13 10:26:13 -05:00
6833bb4013 Add GPU error handling and lockout mechanism in Runner
- Introduced gpuLockedOut state in Runner to manage GPU rendering based on detected errors.
- Implemented SetGPULockedOut and IsGPULockedOut methods for controlling GPU usage.
- Enhanced Context to include GPULockedOut and OnGPUError for better error handling.
- Updated RenderProcessor to check for GPU errors in logs and trigger lockout as needed.
- Modified rendering logic to force CPU rendering when GPU lockout is active, improving stability during errors.
2026-03-13 10:01:39 -05:00
f9111ebac4 Remove xvfb-run dependency from rendering process
All checks were successful
Release Tag / release (push) Successful in 16s
- Eliminated the use of xvfb-run for headless rendering in the RenderProcessor, simplifying the command execution for Blender.
- Updated the CheckRequiredTools function to remove the check for xvfb-run, reflecting the change in rendering requirements.
2026-03-12 20:55:45 -05:00
34445dc5cd Update README to clarify Blender requirements for the runner
All checks were successful
Release Tag / release (push) Successful in 1m21s
- Revised the section on the runner to specify that it can run Blender without needing a pre-installed version, as it retrieves the required Blender version from the manager.
2026-03-12 19:59:11 -05:00
63b8ff34c1 Update README to specify fixed API key for testing 2026-03-12 19:47:05 -05:00
2deb47e5ad Refactor web build process and update documentation
- Removed Node.js build artifacts from .gitignore and adjusted Makefile to reflect changes in web UI build process, now using server-rendered Go templates instead of React.
- Updated README to clarify the new web UI architecture and output formats, emphasizing the removal of the Node.js build step.
- Added a command to set the number of frames per render task in manager configuration, enhancing user control over rendering settings.
- Improved Gitea workflow by removing unnecessary npm install step, streamlining the CI process.
2026-03-12 19:44:40 -05:00
d3c5ee0dba Add pagination support for file loading in JobDetails component
All checks were successful
Release Tag / release (push) Successful in 20s
- Introduced a helper function to load all files associated with a job using pagination, improving performance by fetching files in batches.
- Updated the loadDetails function to utilize the new pagination method for retrieving all files instead of just the first page.
- Adjusted file handling logic to ensure proper updates when new files are added, maintaining consistency with the paginated approach.
2026-01-03 10:58:36 -06:00
bb57ce8659 Update task status handling to reset runner_id on job cancellation and failure
All checks were successful
Release Tag / release (push) Successful in 20s
- Modified SQL queries in multiple functions to set runner_id to NULL when updating task statuses for cancelled jobs and failed tasks.
- Ensured that tasks are properly marked as failed with the correct error messages and updated completion timestamps.
- Improved handling of task statuses to prevent potential issues with task assignment and execution.
2026-01-03 09:01:08 -06:00
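The shape of such an update might look like the following. This is a hedged illustration only: the table and `runner_id` column match the migration shown later in this diff, but the status values and error text are assumptions, not the project's actual queries.

```sql
-- Hypothetical sketch of resetting runner_id while failing a cancelled job's
-- tasks; status names and error text are illustrative.
UPDATE tasks
SET status = 'failed',
    runner_id = NULL,
    completed_at = CURRENT_TIMESTAMP,
    error_message = 'job cancelled'
WHERE job_id = ? AND status IN ('pending', 'running');
```

Clearing `runner_id` ensures the task is no longer attributed to a runner, so reassignment logic cannot pick up a stale association.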
1a8836e6aa Merge pull request 'Refactor job status handling to prevent race conditions' (#4) from fix-race into master
All checks were successful
Release Tag / release (push) Successful in 18s
Reviewed-on: #4
2026-01-02 18:25:17 -06:00
b51b96a618 Refactor job status handling to prevent race conditions
All checks were successful
PR Check / check-and-test (pull_request) Successful in 26s
- Removed redundant error handling in handleListJobTasks.
- Introduced per-job mutexes in Manager to serialize updateJobStatusFromTasks calls, ensuring thread safety during concurrent task completions.
- Added methods to manage job status update mutexes, including creation and cleanup after job completion or failure.
- Improved error handling in handleGetJobStatusForRunner by consolidating error checks.
2026-01-02 18:22:55 -06:00
8e561922c9 Merge pull request 'Implement file deletion after successful uploads in runner and encoding processes' (#3) from fix-uploads into master
Reviewed-on: #3
2026-01-02 17:51:19 -06:00
1c4bd78f56 Add FFmpeg setup step to Gitea workflow for enhanced media processing
All checks were successful
PR Check / check-and-test (pull_request) Successful in 1m6s
- Included a new step in the test-pr.yaml workflow to set up FFmpeg, improving the project's media handling capabilities.
- This addition complements the existing build steps for Go and frontend assets, ensuring a more comprehensive build environment.
2026-01-02 17:48:46 -06:00
3f2982ddb3 Update Gitea workflow to include frontend build step and adjust Go build command
Some checks failed
PR Check / check-and-test (pull_request) Failing after 42s
- Added a step to install and build the frontend using npm in the test-pr.yaml workflow.
- Modified the Go build command to compile all packages instead of specifying the output binary location.
- This change improves the build process by integrating frontend assets with the backend build.
2026-01-02 17:46:03 -06:00
0b852c5087 Update Gitea workflow to specify output binary location for jiggablend build
Some checks failed
PR Check / check-and-test (pull_request) Failing after 8s
- Changed the build command in the test-pr.yaml workflow to output the jiggablend binary to the bin directory.
- This modification enhances the organization of build artifacts and aligns with project structure.
2026-01-02 17:40:34 -06:00
5e56c7f0e8 Implement file deletion after successful uploads in runner and encoding processes
Some checks failed
PR Check / check-and-test (pull_request) Failing after 9s
- Added logic to delete files after successful uploads in both runner and encode tasks to prevent duplicate uploads.
- Included logging for any errors encountered during file deletion to ensure visibility of issues.
2026-01-02 17:34:41 -06:00
84 changed files with 4507 additions and 12515 deletions

@@ -10,6 +10,7 @@ jobs:
 - uses: actions/setup-go@main
   with:
     go-version-file: 'go.mod'
+- uses: FedericoCarboni/setup-ffmpeg@v3
 - run: go mod tidy
 - run: go build ./...
 - run: go test -race -v -shuffle=on ./...

.gitignore (vendored, 10 changed lines)

@@ -43,16 +43,6 @@ runner-secrets-*.json
 jiggablend-storage/
 jiggablend-workspaces/

-# Node.js
-web/node_modules/
-web/dist/
-web/.vite/
-npm-debug.log*
-yarn-debug.log*
-yarn-error.log*
-pnpm-debug.log*
-lerna-debug.log*
-
 # IDE
 .vscode/
 .idea/

@@ -3,7 +3,6 @@ version: 2
 before:
   hooks:
     - go mod tidy -v
-    - sh -c "cd web && npm install && npm run build"
 builds:
   - id: default

@@ -5,11 +5,8 @@ build:
	@echo "Building with GoReleaser..."
	goreleaser build --clean --snapshot --single-target
	@mkdir -p bin
-	@find dist -name jiggablend -type f -exec cp {} bin/jiggablend \;
+	@find dist -name jiggablend -type f -exec cp {} bin/jiggablend.new \;
+	@mv -f bin/jiggablend.new bin/jiggablend

-# Build web UI
-build-web: clean-web
-	cd web && npm install && npm run build

 # Cleanup manager logs
 cleanup-manager:
@@ -63,7 +60,7 @@ clean-bin:
 # Clean web build artifacts
 clean-web:
-	rm -rf web/dist/
+	@echo "No generated web artifacts to clean."

 # Run tests
 test:
@@ -75,7 +72,7 @@ help:
	@echo ""
	@echo "Build targets:"
	@echo "  build      - Build jiggablend binary with embedded web UI"
-	@echo "  build-web  - Build web UI only"
+	@echo "  build-web  - Validate web UI assets (no build required)"
	@echo ""
	@echo "Run targets:"
	@echo "  run        - Run manager and runner in parallel (for testing)"
@@ -90,7 +87,7 @@ help:
	@echo ""
	@echo "Other targets:"
	@echo "  clean-bin  - Clean build artifacts"
-	@echo "  clean-web  - Clean web build artifacts"
+	@echo "  clean-web  - Clean generated web artifacts (currently none)"
	@echo "  test       - Run Go tests"
	@echo "  help       - Show this help"
	@echo ""

README.md

@@ -12,20 +12,20 @@ Both manager and runner are part of a single binary (`jiggablend`) with subcomma
 ## Features

 - **Authentication**: OAuth (Google and Discord) and local authentication with user management
-- **Web UI**: Modern React-based interface for job submission and monitoring
+- **Web UI**: Server-rendered Go templates with HTMX fragments for job submission and monitoring
 - **Distributed Rendering**: Scale across multiple runners with automatic job distribution
-- **Real-time Updates**: WebSocket-based progress tracking and job status updates
+- **Real-time Updates**: Polling-based UI updates with lightweight HTMX refreshes
-- **Video Encoding**: Automatic video encoding from EXR/PNG sequences with multiple codec support:
+- **Video Encoding**: Automatic video encoding from EXR sequences only. EXR→video always uses HDR (HLG, 10-bit); no option to disable. Codecs:
-  - H.264 (MP4) - SDR and HDR support
+  - H.264 (MP4) - HDR (HLG)
-  - AV1 (MP4) - With alpha channel support
+  - AV1 (MP4) - Alpha channel support, HDR
-  - VP9 (WebM) - With alpha channel and HDR support
+  - VP9 (WebM) - Alpha channel and HDR
-- **Output Formats**: PNG, JPEG, EXR, and video formats (MP4, WebM)
+- **Output Formats**: EXR frame sequence only, or EXR + video (H.264, AV1, VP9). Blender always renders EXR.
 - **Blender Version Management**: Support for multiple Blender versions with automatic detection
 - **Metadata Extraction**: Automatic extraction of scene metadata from Blender files
 - **Admin Panel**: User and runner management interface
 - **Runner Management**: API key-based authentication for runners with health monitoring
-- **HDR Support**: Preserve HDR range in video encoding with HLG transfer function
+- **HDR**: EXR→video is always encoded as HDR (HLG, 10-bit). There is no option to turn it off; for SDR-only output, download the EXR frames and encode locally.
-- **Alpha Channel**: Preserve alpha channel in video encoding (AV1 and VP9)
+- **Alpha**: Alpha is always preserved in EXR frames. In video, alpha is preserved when present in the EXR for AV1 and VP9; H.264 MP4 does not support alpha.

 ## Prerequisites
@@ -37,8 +37,8 @@ Both manager and runner are part of a single binary (`jiggablend`) with subcomma
 ### Runner

 - Linux amd64
-- Blender installed (can use bundled versions from storage)
 - FFmpeg installed (required for video encoding)
+- Able to run Blender (the runner gets the job's required Blender version from the manager; it does not need Blender pre-installed)

 ## Installation
@@ -68,7 +68,7 @@ make init-test
 This will:

 - Enable local authentication
-- Set a fixed API key for testing
+- Set a fixed API key for testing: `jk_r0_test_key_123456789012345678901234567890`
 - Create a test admin user (test@example.com / testpassword)

 #### Manual Configuration
@@ -154,10 +154,22 @@ bin/jiggablend runner --api-key <your-api-key>
 # With custom options
 bin/jiggablend runner --manager http://localhost:8080 --name my-runner --api-key <key> --log-file runner.log

+# Hardware compatibility flags (force CPU + disable HIPRT)
+bin/jiggablend runner --api-key <key> --force-cpu-rendering --disable-hiprt
+
 # Using environment variables
 JIGGABLEND_MANAGER=http://localhost:8080 JIGGABLEND_API_KEY=<key> bin/jiggablend runner
 ```

+### Render Chunk Size Note
+
+For one heavy production scene/profile, chunked rendering (`frames 800-804` in one Blender process) was much slower than one-frame tasks:
+
+- Chunked task (`800-804`): `27m49s` end-to-end (`Task assigned` -> last `Saved`)
+- Single-frame tasks (`800`, `801`, `802`, `803`, `804`): `15m04s` wall clock total
+
+In that test, any chunk size greater than `1` caused a major slowdown after the first frame. Fresh installs should already have it set to `1`, but if you see similar performance degradation, try forcing one frame per task (hard reset Blender each frame): `jiggablend manager config set frames-per-render-task 1`. If `1` is worse on your scene/hardware, benchmark and use a higher chunk size instead.
+
 ### Running Both (for Testing)

 ```bash
@@ -217,9 +229,9 @@ jiggablend/
 │   ├── executils/   # Execution utilities
 │   ├── scripts/     # Python scripts for Blender
 │   └── types/       # Shared types and models
-├── web/             # React web UI
-│   ├── src/         # Source files
-│   └── dist/        # Built files (embedded in binary)
+├── web/             # Embedded templates + static assets
+│   ├── templates/   # Go HTML templates and partials
+│   └── static/      # CSS/JS assets
 ├── go.mod
 └── Makefile
 ```
@@ -266,29 +278,25 @@ jiggablend/
 - `GET /api/admin/stats` - System statistics

 ### WebSocket

-- `WS /api/ws` - WebSocket connection for real-time updates
-  - Subscribe to job channels: `job:{jobId}`
-  - Receive job status updates, progress, and logs
+- `WS /api/jobs/ws` - Optional API channel for advanced clients
+- The default web UI uses polling + HTMX for status updates and task views.

 ## Output Formats

-The system supports the following output formats:
+The system supports the following output formats. Blender always renders EXR (linear); the chosen format is the deliverable (frames only or frames + video).

-### Image Formats
+### Deliverable Formats
-- **PNG** - Standard PNG output
+- **EXR** - EXR frame sequence only (no video)
-- **JPEG** - JPEG output
+- **EXR_264_MP4** - EXR frames + H.264 MP4 (always HDR, HLG)
-- **EXR** - OpenEXR format (HDR)
+- **EXR_AV1_MP4** - EXR frames + AV1 MP4 (alpha support, always HDR)
+- **EXR_VP9_WEBM** - EXR frames + VP9 WebM (alpha and HDR)

-### Video Formats
-- **EXR_264_MP4** - H.264 encoded MP4 from EXR sequence (SDR or HDR)
-- **EXR_AV1_MP4** - AV1 encoded MP4 from EXR sequence (with alpha channel support)
-- **EXR_VP9_WEBM** - VP9 encoded WebM from EXR sequence (with alpha channel and HDR support)
+Video encoding (EXR→video) is always HDR (HLG, 10-bit); there is no option to output SDR video. For SDR-only, download the EXR frames and encode locally.

 Video encoding features:

 - 2-pass encoding for optimal quality
-- HDR preservation using HLG transfer function
+- EXR→video only (no PNG source); always HLG (HDR), 10-bit, full range
 - Alpha channel preservation (AV1 and VP9 only)
-- Automatic detection of source format (EXR or PNG)
 - Software encoding (libx264, libaom-av1, libvpx-vp9)

 ## Storage Structure
@@ -320,16 +328,8 @@ go test ./... -timeout 30s
 ### Web UI Development

-The web UI is built with React and Vite. To develop the UI:
+The web UI is server-rendered from embedded templates and static assets in `web/templates` and `web/static`.
+No Node/Vite build step is required.

-```bash
-cd web
-npm install
-npm run dev   # Development server
-npm run build # Build for production
-```
-
-The built files are embedded in the Go binary using `embed.FS`.

 ## License

@@ -8,6 +8,7 @@ import (
 	"encoding/hex"
 	"fmt"
 	"os"
+	"strconv"
 	"strings"

 	"jiggablend/internal/config"
@@ -381,6 +382,25 @@ var setGoogleOAuthCmd = &cobra.Command{
 var setDiscordOAuthRedirectURL string

+var setFramesPerRenderTaskCmd = &cobra.Command{
+	Use:   "frames-per-render-task <n>",
+	Short: "Set number of frames per render task (min 1)",
+	Long:  `Set how many frames to batch into each render task. Job frame range is divided into chunks of this size. Default is 10.`,
+	Args:  cobra.ExactArgs(1),
+	Run: func(cmd *cobra.Command, args []string) {
+		n, err := strconv.Atoi(args[0])
+		if err != nil || n < 1 {
+			exitWithError("frames-per-render-task must be a positive integer")
+		}
+		withConfig(func(cfg *config.Config, db *database.DB) {
+			if err := cfg.SetInt(config.KeyFramesPerRenderTask, n); err != nil {
+				exitWithError("Failed to set frames_per_render_task: %v", err)
+			}
+			fmt.Printf("Frames per render task set to %d\n", n)
+		})
+	},
+}

 var setDiscordOAuthCmd = &cobra.Command{
 	Use:   "discord-oauth <client-id> <client-secret>",
 	Short: "Set Discord OAuth credentials",
@@ -558,6 +578,7 @@ func init() {
 	configCmd.AddCommand(setCmd)
 	setCmd.AddCommand(setFixedAPIKeyCmd)
 	setCmd.AddCommand(setAllowedOriginsCmd)
+	setCmd.AddCommand(setFramesPerRenderTaskCmd)
 	setCmd.AddCommand(setGoogleOAuthCmd)
 	setCmd.AddCommand(setDiscordOAuthCmd)

@@ -37,6 +37,8 @@ func init() {
 	runnerCmd.Flags().String("log-level", "info", "Log level (debug, info, warn, error)")
 	runnerCmd.Flags().BoolP("verbose", "v", false, "Enable verbose logging (same as --log-level=debug)")
 	runnerCmd.Flags().Duration("poll-interval", 5*time.Second, "Job polling interval")
+	runnerCmd.Flags().Bool("force-cpu-rendering", false, "Force CPU rendering for all jobs (disables GPU rendering)")
+	runnerCmd.Flags().Bool("disable-hiprt", false, "Disable HIPRT acceleration in Blender Cycles")

 	// Bind flags to viper with JIGGABLEND_ prefix
 	runnerViper.SetEnvPrefix("JIGGABLEND")
@@ -51,6 +53,8 @@ func init() {
 	runnerViper.BindPFlag("log_level", runnerCmd.Flags().Lookup("log-level"))
 	runnerViper.BindPFlag("verbose", runnerCmd.Flags().Lookup("verbose"))
 	runnerViper.BindPFlag("poll_interval", runnerCmd.Flags().Lookup("poll-interval"))
+	runnerViper.BindPFlag("force_cpu_rendering", runnerCmd.Flags().Lookup("force-cpu-rendering"))
+	runnerViper.BindPFlag("disable_hiprt", runnerCmd.Flags().Lookup("disable-hiprt"))
 }

 func runRunner(cmd *cobra.Command, args []string) {
@@ -63,6 +67,8 @@ func runRunner(cmd *cobra.Command, args []string) {
 	logLevel := runnerViper.GetString("log_level")
 	verbose := runnerViper.GetBool("verbose")
 	pollInterval := runnerViper.GetDuration("poll_interval")
+	forceCPURendering := runnerViper.GetBool("force_cpu_rendering")
+	disableHIPRT := runnerViper.GetBool("disable_hiprt")

 	var r *runner.Runner
@@ -118,7 +124,7 @@ func runRunner(cmd *cobra.Command, args []string) {
 	}

 	// Create runner
-	r = runner.New(managerURL, name, hostname)
+	r = runner.New(managerURL, name, hostname, forceCPURendering, disableHIPRT)

 	// Check for required tools early to fail fast
 	if err := r.CheckRequiredTools(); err != nil {
@@ -161,6 +167,9 @@ func runRunner(cmd *cobra.Command, args []string) {
 	runnerID, err = r.Register(apiKey)
 	if err == nil {
 		logger.Infof("Registered runner with ID: %d", runnerID)
+		// Download latest Blender and detect HIP vs NVIDIA so we only force CPU for Blender < 4.x when using HIP
+		logger.Info("Detecting GPU backends (HIP/NVIDIA) for Blender < 4.x policy...")
+		r.DetectAndStoreGPUBackends()
 		break
 	}

Binary file not shown (before: 24 MiB).

installer.sh (new file, 113 lines)

@@ -0,0 +1,113 @@
#!/bin/bash
set -euo pipefail
# Simple script to install the latest jiggablend binary for Linux AMD64
# and create wrapper scripts for manager and runner using test setup
# Dependencies: curl, jq, tar, sha256sum, sudo (for installation to /usr/local/bin)
REPO="s1d3sw1ped/jiggablend"
API_URL="https://git.s1d3sw1ped.com/api/v1/repos/${REPO}/releases/latest"
ASSET_NAME="jiggablend-linux-amd64.tar.gz"
echo "Fetching latest release information..."
RELEASE_JSON=$(curl -s "$API_URL")
TAG=$(echo "$RELEASE_JSON" | jq -r '.tag_name')
echo "Latest version: $TAG"
ASSET_URL=$(echo "$RELEASE_JSON" | jq -r ".assets[] | select(.name == \"$ASSET_NAME\") | .browser_download_url")
if [ -z "$ASSET_URL" ]; then
    echo "Error: Asset $ASSET_NAME not found in latest release."
    exit 1
fi
CHECKSUM_URL=$(echo "$RELEASE_JSON" | jq -r '.assets[] | select(.name == "checksums.txt") | .browser_download_url')
if [ -z "$CHECKSUM_URL" ]; then
    echo "Error: checksums.txt not found in latest release."
    exit 1
fi
echo "Downloading $ASSET_NAME..."
curl -L -o "$ASSET_NAME" "$ASSET_URL"
echo "Downloading checksums.txt..."
curl -L -o "checksums.txt" "$CHECKSUM_URL"
echo "Verifying checksum..."
if ! sha256sum --ignore-missing --quiet -c checksums.txt; then
    echo "Error: Checksum verification failed."
    rm -f "$ASSET_NAME" checksums.txt
    exit 1
fi
echo "Extracting..."
tar -xzf "$ASSET_NAME"
echo "Installing binary to /usr/local/bin (requires sudo)..."
sudo install -m 0755 jiggablend /usr/local/bin/
echo "Creating manager wrapper script..."
cat << 'EOF' > jiggablend-manager.sh
#!/bin/bash
set -euo pipefail
# Wrapper to run jiggablend manager with test setup
# Run this in a directory where you want the db, storage, and logs
mkdir -p logs
rm -f logs/manager.log
# Initialize test configuration
jiggablend manager config enable localauth
jiggablend manager config set fixed-apikey jk_r0_test_key_123456789012345678901234567890 -f -y
jiggablend manager config add user test@example.com testpassword --admin -f -y
# Run manager
jiggablend manager -l logs/manager.log
EOF
chmod +x jiggablend-manager.sh
sudo install -m 0755 jiggablend-manager.sh /usr/local/bin/jiggablend-manager
rm -f jiggablend-manager.sh
echo "Creating runner wrapper script..."
cat << 'EOF' > jiggablend-runner.sh
#!/bin/bash
set -euo pipefail
# Wrapper to run jiggablend runner with test setup
# Usage: jiggablend-runner [MANAGER_URL] [RUNNER_FLAGS...]
# Default MANAGER_URL: http://localhost:8080
# Run this in a directory where you want the logs
MANAGER_URL="http://localhost:8080"
if [[ $# -gt 0 && "$1" != -* ]]; then
    MANAGER_URL="$1"
    shift
fi
EXTRA_ARGS=("$@")
mkdir -p logs
rm -f logs/runner.log
# Run runner
jiggablend runner -l logs/runner.log --api-key=jk_r0_test_key_123456789012345678901234567890 --manager "$MANAGER_URL" "${EXTRA_ARGS[@]}"
EOF
chmod +x jiggablend-runner.sh
sudo install -m 0755 jiggablend-runner.sh /usr/local/bin/jiggablend-runner
rm -f jiggablend-runner.sh
echo "Cleaning up..."
rm -f "$ASSET_NAME" checksums.txt jiggablend
echo "Installation complete!"
echo "Binary: jiggablend"
echo "Wrappers: jiggablend-manager, jiggablend-runner"
echo "Run 'jiggablend-manager' to start the manager with test config."
echo "Run 'jiggablend-runner [url] [runner flags...]' to start the runner."
echo "Example: jiggablend-runner http://your-manager:8080 --force-cpu-rendering --disable-hiprt"
echo "Note: Depending on whether you're running the manager or runner, additional dependencies like Blender, ImageMagick, or FFmpeg may be required. See the project README for details."

@@ -21,7 +21,8 @@ const (
 	KeyFixedAPIKey         = "fixed_api_key"
 	KeyRegistrationEnabled = "registration_enabled"
 	KeyProductionMode      = "production_mode"
 	KeyAllowedOrigins      = "allowed_origins"
+	KeyFramesPerRenderTask = "frames_per_render_task"
 )

 // Config manages application configuration stored in the database
@@ -301,3 +302,12 @@ func (c *Config) AllowedOrigins() string {
 	return c.GetWithDefault(KeyAllowedOrigins, "")
 }

+// GetFramesPerRenderTask returns how many frames to include per render task (min 1, default 1).
+func (c *Config) GetFramesPerRenderTask() int {
+	n := c.GetIntWithDefault(KeyFramesPerRenderTask, 1)
+	if n < 1 {
+		return 1
+	}
+	return n
+}


@@ -0,0 +1,31 @@
-- SQLite does not support DROP COLUMN directly; recreate table without frame_end
CREATE TABLE tasks_new (
id INTEGER PRIMARY KEY AUTOINCREMENT,
job_id INTEGER NOT NULL,
runner_id INTEGER,
frame INTEGER NOT NULL,
status TEXT NOT NULL DEFAULT 'pending',
output_path TEXT,
task_type TEXT NOT NULL DEFAULT 'render',
current_step TEXT,
retry_count INTEGER NOT NULL DEFAULT 0,
max_retries INTEGER NOT NULL DEFAULT 3,
runner_failure_count INTEGER NOT NULL DEFAULT 0,
timeout_seconds INTEGER,
condition TEXT,
created_at TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP,
started_at TIMESTAMP,
completed_at TIMESTAMP,
error_message TEXT,
FOREIGN KEY (job_id) REFERENCES jobs(id),
FOREIGN KEY (runner_id) REFERENCES runners(id)
);
INSERT INTO tasks_new (id, job_id, runner_id, frame, status, output_path, task_type, current_step, retry_count, max_retries, runner_failure_count, timeout_seconds, condition, created_at, started_at, completed_at, error_message)
SELECT id, job_id, runner_id, frame, status, output_path, task_type, current_step, retry_count, max_retries, runner_failure_count, timeout_seconds, condition, created_at, started_at, completed_at, error_message FROM tasks;
DROP TABLE tasks;
ALTER TABLE tasks_new RENAME TO tasks;
CREATE INDEX idx_tasks_job_id ON tasks(job_id);
CREATE INDEX idx_tasks_runner_id ON tasks(runner_id);
CREATE INDEX idx_tasks_status ON tasks(status);
CREATE INDEX idx_tasks_job_status ON tasks(job_id, status);
CREATE INDEX idx_tasks_started_at ON tasks(started_at);


@@ -0,0 +1,2 @@
-- Add frame_end to tasks for range-based render tasks (NULL = single frame, same as frame)
ALTER TABLE tasks ADD COLUMN frame_end INTEGER;

View File

@@ -121,37 +121,6 @@ func (s *Manager) handleDeleteRunnerAPIKey(w http.ResponseWriter, r *http.Reques
s.respondJSON(w, http.StatusOK, map[string]string{"message": "API key deleted"}) s.respondJSON(w, http.StatusOK, map[string]string{"message": "API key deleted"})
} }
// handleVerifyRunner manually verifies a runner
func (s *Manager) handleVerifyRunner(w http.ResponseWriter, r *http.Request) {
runnerID, err := parseID(r, "id")
if err != nil {
s.respondError(w, http.StatusBadRequest, err.Error())
return
}
// Check if runner exists
var exists bool
err = s.db.With(func(conn *sql.DB) error {
return conn.QueryRow("SELECT EXISTS(SELECT 1 FROM runners WHERE id = ?)", runnerID).Scan(&exists)
})
if err != nil || !exists {
s.respondError(w, http.StatusNotFound, "Runner not found")
return
}
// Mark runner as verified
err = s.db.With(func(conn *sql.DB) error {
_, err := conn.Exec("UPDATE runners SET verified = 1 WHERE id = ?", runnerID)
return err
})
if err != nil {
s.respondError(w, http.StatusInternalServerError, fmt.Sprintf("Failed to verify runner: %v", err))
return
}
s.respondJSON(w, http.StatusOK, map[string]string{"message": "Runner verified"})
}
// handleDeleteRunner removes a runner // handleDeleteRunner removes a runner
func (s *Manager) handleDeleteRunner(w http.ResponseWriter, r *http.Request) { func (s *Manager) handleDeleteRunner(w http.ResponseWriter, r *http.Request) {
runnerID, err := parseID(r, "id") runnerID, err := parseID(r, "id")
@@ -415,6 +384,12 @@ func (s *Manager) handleSetRegistrationEnabled(w http.ResponseWriter, r *http.Re
 // handleSetUserAdminStatus sets a user's admin status (admin only)
 func (s *Manager) handleSetUserAdminStatus(w http.ResponseWriter, r *http.Request) {
+	currentUserID, err := getUserID(r)
+	if err != nil {
+		s.respondError(w, http.StatusUnauthorized, err.Error())
+		return
+	}
 	targetUserID, err := parseID(r, "id")
 	if err != nil {
 		s.respondError(w, http.StatusBadRequest, err.Error())
@@ -429,6 +404,12 @@ func (s *Manager) handleSetUserAdminStatus(w http.ResponseWriter, r *http.Reques
 		return
 	}
+	// Prevent admins from revoking their own admin status.
+	if targetUserID == currentUserID && !req.IsAdmin {
+		s.respondError(w, http.StatusBadRequest, "You cannot revoke your own admin status")
+		return
+	}
 	if err := s.auth.SetUserAdminStatus(targetUserID, req.IsAdmin); err != nil {
 		s.respondError(w, http.StatusBadRequest, err.Error())
 		return


@@ -331,8 +331,9 @@ func (s *Manager) GetBlenderArchivePath(version *BlenderVersion) (string, error)
 	// Need to download and decompress
 	log.Printf("Downloading Blender %s from %s", version.Full, version.URL)
+	// 60-minute timeout for large Blender tarballs; stream to disk via io.Copy below
 	client := &http.Client{
-		Timeout: 0, // No timeout for large downloads
+		Timeout: 60 * time.Minute,
 	}
 	resp, err := client.Get(version.URL)
 	if err != nil {

File diff suppressed because it is too large


@@ -59,6 +59,7 @@ type Manager struct {
 	secrets *authpkg.Secrets
 	storage *storage.Storage
 	router  *chi.Mux
+	ui      *uiRenderer
 	// WebSocket connections
 	wsUpgrader websocket.Upgrader
@@ -89,6 +90,9 @@ type Manager struct {
 	// Throttling for task status updates (per task)
 	taskUpdateTimes   map[int64]time.Time // key: taskID
 	taskUpdateTimesMu sync.RWMutex
+	// Per-job mutexes to serialize updateJobStatusFromTasks calls and prevent race conditions
+	jobStatusUpdateMu   map[int64]*sync.Mutex // key: jobID
+	jobStatusUpdateMuMu sync.RWMutex
 	// Client WebSocket connections (new unified WebSocket)
 	// Key is "userID:connID" to support multiple tabs per user
@@ -122,10 +126,24 @@ type ClientConnection struct {
 type UploadSession struct {
 	SessionID string
 	UserID    int64
+	TempDir   string
 	Progress  float64
 	Status    string // "uploading", "processing", "extracting_metadata", "creating_context", "completed", "error"
+	Phase     string // "upload", "processing", "ready", "error", "action_required"
 	Message   string
 	CreatedAt time.Time
+	// Result fields set when Status is "completed" (for async processing)
+	ResultContextArchive    string
+	ResultMetadata          interface{} // *types.BlendMetadata when set
+	ResultMainBlendFile     string
+	ResultFileName          string
+	ResultFileSize          int64
+	ResultZipExtracted      bool
+	ResultExtractedFilesCnt int
+	ResultMetadataExtracted bool
+	ResultMetadataError     string   // set when Status is "completed" but metadata extraction failed
+	ErrorMessage            string   // set when Status is "error"
+	ResultBlendFiles        []string // set when Status is "select_blend" (relative paths for user to pick)
 }
 // NewManager creates a new manager server
@@ -134,6 +152,10 @@ func NewManager(db *database.DB, cfg *config.Config, auth *authpkg.Auth, storage
 	if err != nil {
 		return nil, fmt.Errorf("failed to initialize secrets: %w", err)
 	}
+	ui, err := newUIRenderer()
+	if err != nil {
+		return nil, fmt.Errorf("failed to initialize UI renderer: %w", err)
+	}
 	s := &Manager{
 		db: db,
@@ -142,6 +164,7 @@ func NewManager(db *database.DB, cfg *config.Config, auth *authpkg.Auth, storage
 		secrets: secrets,
 		storage: storage,
 		router:  chi.NewRouter(),
+		ui:      ui,
 		startTime: time.Now(),
 		wsUpgrader: websocket.Upgrader{
 			CheckOrigin: checkWebSocketOrigin,
@@ -162,6 +185,8 @@ func NewManager(db *database.DB, cfg *config.Config, auth *authpkg.Auth, storage
 		runnerJobConns:          make(map[string]*websocket.Conn),
 		runnerJobConnsWriteMu:   make(map[string]*sync.Mutex),
 		runnerJobConnsWriteMuMu: sync.RWMutex{}, // Initialize the new field
+		// Per-job mutexes for serializing status updates
+		jobStatusUpdateMu: make(map[int64]*sync.Mutex),
 	}
 	// Check for required external tools
@@ -445,6 +470,7 @@ func (w *gzipResponseWriter) WriteHeader(statusCode int) {
 func (s *Manager) setupRoutes() {
 	// Health check endpoint (unauthenticated)
 	s.router.Get("/api/health", s.handleHealthCheck)
+	s.setupUIRoutes()
 	// Public routes (with stricter rate limiting for auth endpoints)
 	s.router.Route("/api/auth", func(r chi.Router) {
@@ -472,6 +498,7 @@ func (s *Manager) setupRoutes() {
 		})
 		r.Post("/", s.handleCreateJob)
 		r.Post("/upload", s.handleUploadFileForJobCreation) // Upload before job creation
+		r.Get("/upload/status", s.handleUploadStatus) // Poll upload processing status (session_id query param)
 		r.Get("/", s.handleListJobs)
 		r.Get("/summary", s.handleListJobsSummary)
 		r.Post("/batch", s.handleBatchGetJobs)
@@ -482,6 +509,7 @@ func (s *Manager) setupRoutes() {
 		r.Get("/{id}/files", s.handleListJobFiles)
 		r.Get("/{id}/files/count", s.handleGetJobFilesCount)
 		r.Get("/{id}/context", s.handleListContextArchive)
+		r.Get("/{id}/files/exr-zip", s.handleDownloadEXRZip)
 		r.Get("/{id}/files/{fileId}/download", s.handleDownloadJobFile)
 		r.Get("/{id}/files/{fileId}/preview-exr", s.handlePreviewEXR)
 		r.Get("/{id}/video", s.handleStreamVideo)
@@ -517,7 +545,6 @@ func (s *Manager) setupRoutes() {
 			r.Delete("/{id}", s.handleDeleteRunnerAPIKey)
 		})
 		r.Get("/", s.handleListRunnersAdmin)
-		r.Post("/{id}/verify", s.handleVerifyRunner)
 		r.Delete("/{id}", s.handleDeleteRunner)
 	})
 	r.Route("/users", func(r chi.Router) {
@@ -550,6 +577,7 @@ func (s *Manager) setupRoutes() {
 			return http.HandlerFunc(s.runnerAuthMiddleware(next.ServeHTTP))
 		})
 		r.Get("/blender/download", s.handleDownloadBlender)
+		r.Get("/jobs/{jobId}/status", s.handleGetJobStatusForRunner)
 		r.Get("/jobs/{jobId}/files", s.handleGetJobFilesForRunner)
 		r.Get("/jobs/{jobId}/metadata", s.handleGetJobMetadataForRunner)
 		r.Get("/files/{jobId}/{fileName}", s.handleDownloadFileForRunner)
@@ -559,8 +587,8 @@ func (s *Manager) setupRoutes() {
 	// Blender versions API (public, for job submission page)
 	s.router.Get("/api/blender/versions", s.handleGetBlenderVersions)
-	// Serve static files (embedded React app with SPA fallback)
-	s.router.Handle("/*", web.SPAHandler())
+	// Static assets for server-rendered UI.
+	s.router.Handle("/assets/*", web.StaticHandler())
 }
// ServeHTTP implements http.Handler // ServeHTTP implements http.Handler


@@ -0,0 +1,104 @@
package api
import (
"fmt"
"html/template"
"net/http"
"strings"
"time"
authpkg "jiggablend/internal/auth"
"jiggablend/web"
)
type uiRenderer struct {
templates *template.Template
}
type pageData struct {
Title string
CurrentPath string
ContentTemplate string
PageScript string
User *authpkg.Session
Error string
Notice string
Data interface{}
}
func newUIRenderer() (*uiRenderer, error) {
tpl, err := template.New("base").Funcs(template.FuncMap{
"formatTime": func(t time.Time) string {
if t.IsZero() {
return "-"
}
return t.Local().Format("2006-01-02 15:04:05")
},
"statusClass": func(status string) string {
switch status {
case "completed":
return "status-completed"
case "running":
return "status-running"
case "failed":
return "status-failed"
case "cancelled":
return "status-cancelled"
case "online":
return "status-online"
case "offline":
return "status-offline"
case "busy":
return "status-busy"
default:
return "status-pending"
}
},
"progressInt": func(v float64) int {
if v < 0 {
return 0
}
if v > 100 {
return 100
}
return int(v)
},
"derefInt": func(v *int) string {
if v == nil {
return ""
}
return fmt.Sprintf("%d", *v)
},
"derefString": func(v *string) string {
if v == nil {
return ""
}
return *v
},
"hasSuffixFold": func(value, suffix string) bool {
return strings.HasSuffix(strings.ToLower(value), strings.ToLower(suffix))
},
}).ParseFS(
web.GetTemplateFS(),
"templates/*.html",
"templates/partials/*.html",
)
if err != nil {
return nil, fmt.Errorf("parse templates: %w", err)
}
return &uiRenderer{templates: tpl}, nil
}
func (r *uiRenderer) render(w http.ResponseWriter, data pageData) {
w.Header().Set("Content-Type", "text/html; charset=utf-8")
if err := r.templates.ExecuteTemplate(w, "base", data); err != nil {
http.Error(w, "template render error", http.StatusInternalServerError)
}
}
func (r *uiRenderer) renderTemplate(w http.ResponseWriter, templateName string, data interface{}) {
w.Header().Set("Content-Type", "text/html; charset=utf-8")
if err := r.templates.ExecuteTemplate(w, templateName, data); err != nil {
http.Error(w, "template render error", http.StatusInternalServerError)
}
}


@@ -0,0 +1,13 @@
package api
import "testing"
func TestNewUIRendererParsesTemplates(t *testing.T) {
renderer, err := newUIRenderer()
if err != nil {
t.Fatalf("newUIRenderer returned error: %v", err)
}
if renderer == nil || renderer.templates == nil {
t.Fatalf("renderer/templates should not be nil")
}
}


@@ -275,7 +275,8 @@ type NextJobTaskInfo struct {
 	TaskID   int64  `json:"task_id"`
 	JobID    int64  `json:"job_id"`
 	JobName  string `json:"job_name"`
-	Frame    int    `json:"frame"`
+	Frame    int    `json:"frame"`     // frame start (inclusive)
+	FrameEnd int    `json:"frame_end"` // frame end (inclusive); same as Frame for single-frame
 	TaskType string `json:"task_type"`
 	Metadata *types.BlendMetadata `json:"metadata,omitempty"`
 }
@@ -376,6 +377,7 @@ func (s *Manager) handleNextJob(w http.ResponseWriter, r *http.Request) {
 		TaskID    int64
 		JobID     int64
 		Frame     int
+		FrameEnd  sql.NullInt64
 		TaskType  string
 		JobName   string
 		JobUserID int64
@@ -385,12 +387,12 @@ func (s *Manager) handleNextJob(w http.ResponseWriter, r *http.Request) {
 	err = s.db.With(func(conn *sql.DB) error {
 		rows, err := conn.Query(
-			`SELECT t.id, t.job_id, t.frame, t.task_type,
+			`SELECT t.id, t.job_id, t.frame, t.frame_end, t.task_type,
 			        j.name as job_name, j.user_id, j.blend_metadata,
 			        t.condition
 			FROM tasks t
 			JOIN jobs j ON t.job_id = j.id
-			WHERE t.status = ? AND j.status != ?
+			WHERE t.status = ? AND t.runner_id IS NULL AND j.status != ?
 			ORDER BY t.created_at ASC
 			LIMIT 50`,
 			types.TaskStatusPending, types.JobStatusCancelled,
@@ -403,7 +405,7 @@ func (s *Manager) handleNextJob(w http.ResponseWriter, r *http.Request) {
 		for rows.Next() {
 			var task taskCandidate
 			var condition sql.NullString
-			err := rows.Scan(&task.TaskID, &task.JobID, &task.Frame, &task.TaskType,
+			err := rows.Scan(&task.TaskID, &task.JobID, &task.Frame, &task.FrameEnd, &task.TaskType,
 				&task.JobName, &task.JobUserID, &task.BlendMetadata, &condition)
 			if err != nil {
 				continue
@@ -549,6 +551,11 @@ func (s *Manager) handleNextJob(w http.ResponseWriter, r *http.Request) {
 	// Update job status
 	s.updateJobStatusFromTasks(selectedTask.JobID)
+	// Frame end for response: use task range or single frame (NULL frame_end)
+	frameEnd := selectedTask.Frame
+	if selectedTask.FrameEnd.Valid {
+		frameEnd = int(selectedTask.FrameEnd.Int64)
+	}
 	// Build response
 	response := NextJobResponse{
 		JobToken: jobToken,
@@ -558,6 +565,7 @@ func (s *Manager) handleNextJob(w http.ResponseWriter, r *http.Request) {
 			JobID:    selectedTask.JobID,
 			JobName:  selectedTask.JobName,
 			Frame:    selectedTask.Frame,
+			FrameEnd: frameEnd,
 			TaskType: selectedTask.TaskType,
 			Metadata: metadata,
 		},
@@ -1019,6 +1027,10 @@ func (s *Manager) handleGetJobStatusForRunner(w http.ResponseWriter, r *http.Req
 			&job.CreatedAt, &startedAt, &completedAt, &errorMessage,
 		)
 	})
+	if err == sql.ErrNoRows {
+		s.respondError(w, http.StatusNotFound, "Job not found")
+		return
+	}
 	if err != nil {
 		s.respondError(w, http.StatusInternalServerError, fmt.Sprintf("Failed to query job: %v", err))
 		return
@@ -1037,15 +1049,6 @@ func (s *Manager) handleGetJobStatusForRunner(w http.ResponseWriter, r *http.Req
 		job.OutputFormat = &outputFormat.String
 	}
-	if err == sql.ErrNoRows {
-		s.respondError(w, http.StatusNotFound, "Job not found")
-		return
-	}
-	if err != nil {
-		s.respondError(w, http.StatusInternalServerError, fmt.Sprintf("Failed to query job: %v", err))
-		return
-	}
 	if startedAt.Valid {
 		job.StartedAt = &startedAt.Time
 	}
@@ -1368,7 +1371,7 @@ func (s *Manager) handleRunnerJobWebSocket(w http.ResponseWriter, r *http.Reques
 		log.Printf("Job WebSocket disconnected unexpectedly for task %d, marking as failed", taskID)
 		s.db.With(func(conn *sql.DB) error {
 			_, err := conn.Exec(
-				`UPDATE tasks SET status = ?, error_message = ?, completed_at = ? WHERE id = ?`,
+				`UPDATE tasks SET status = ?, runner_id = NULL, error_message = ?, completed_at = ? WHERE id = ?`,
 				types.TaskStatusFailed, "WebSocket connection lost", time.Now(), taskID,
 			)
 			return err
@@ -1683,11 +1686,10 @@ func (s *Manager) handleWebSocketTaskComplete(runnerID int64, taskUpdate WSTaskU
 	} else {
 		// No retries remaining - mark as failed
 		err = s.db.WithTx(func(tx *sql.Tx) error {
-			_, err := tx.Exec(`UPDATE tasks SET status = ? WHERE id = ?`, types.TaskStatusFailed, taskUpdate.TaskID)
-			if err != nil {
-				return err
-			}
-			_, err = tx.Exec(`UPDATE tasks SET completed_at = ? WHERE id = ?`, now, taskUpdate.TaskID)
+			_, err := tx.Exec(
+				`UPDATE tasks SET status = ?, runner_id = NULL, completed_at = ? WHERE id = ?`,
+				types.TaskStatusFailed, now, taskUpdate.TaskID,
+			)
 			if err != nil {
 				return err
 			}
@@ -1854,7 +1856,7 @@ func (s *Manager) cancelActiveTasksForJob(jobID int64) error {
 	// Tasks don't have a cancelled status - mark them as failed instead
 	err := s.db.With(func(conn *sql.DB) error {
 		_, err := conn.Exec(
-			`UPDATE tasks SET status = ?, error_message = ? WHERE job_id = ? AND status IN (?, ?)`,
+			`UPDATE tasks SET status = ?, runner_id = NULL, error_message = ? WHERE job_id = ? AND status IN (?, ?)`,
 			types.TaskStatusFailed, "Job cancelled", jobID, types.TaskStatusPending, types.TaskStatusRunning,
 		)
 		if err != nil {
@@ -1920,8 +1922,37 @@ func (s *Manager) evaluateTaskCondition(taskID int64, jobID int64, conditionJSON
 	}
 }
+// getJobStatusUpdateMutex returns the mutex for a specific jobID, creating it if needed.
+// This ensures serialized execution of updateJobStatusFromTasks per job to prevent race conditions.
+func (s *Manager) getJobStatusUpdateMutex(jobID int64) *sync.Mutex {
+	s.jobStatusUpdateMuMu.Lock()
+	defer s.jobStatusUpdateMuMu.Unlock()
+	mu, exists := s.jobStatusUpdateMu[jobID]
+	if !exists {
+		mu = &sync.Mutex{}
+		s.jobStatusUpdateMu[jobID] = mu
+	}
+	return mu
+}
+// cleanupJobStatusUpdateMutex removes the mutex for a jobID after it's no longer needed.
+// Should only be called when the job is in a final state (completed/failed) and no more updates are expected.
+func (s *Manager) cleanupJobStatusUpdateMutex(jobID int64) {
+	s.jobStatusUpdateMuMu.Lock()
+	defer s.jobStatusUpdateMuMu.Unlock()
+	delete(s.jobStatusUpdateMu, jobID)
+}
 // updateJobStatusFromTasks updates job status and progress based on task states
+// This function is serialized per jobID to prevent race conditions when multiple tasks
+// complete concurrently and trigger status updates simultaneously.
 func (s *Manager) updateJobStatusFromTasks(jobID int64) {
+	// Serialize updates per job to prevent race conditions
+	mu := s.getJobStatusUpdateMutex(jobID)
+	mu.Lock()
+	defer mu.Unlock()
 	now := time.Now()
 	// All jobs now use parallel runners (one task per frame), so we always use task-based progress
@@ -1936,6 +1967,12 @@ func (s *Manager) updateJobStatusFromTasks(jobID int64) {
 		return
 	}
+	// Cancellation is terminal from the user's perspective.
+	// Do not allow asynchronous task updates to revive cancelled jobs.
+	if currentStatus == string(types.JobStatusCancelled) {
+		return
+	}
 	// Count total tasks and completed tasks
 	var totalTasks, completedTasks int
 	err = s.db.With(func(conn *sql.DB) error {
@@ -2087,6 +2124,11 @@ func (s *Manager) updateJobStatusFromTasks(jobID int64) {
 			"progress":     progress,
 			"completed_at": now,
 		})
+		// Clean up mutex for jobs in final states (completed or failed)
+		// No more status updates will occur for these jobs
+		if jobStatus == string(types.JobStatusCompleted) || jobStatus == string(types.JobStatusFailed) {
+			s.cleanupJobStatusUpdateMutex(jobID)
+		}
 	}
 }

internal/manager/ui.go (new file)

@@ -0,0 +1,556 @@
package api
import (
"database/sql"
"fmt"
"net/http"
"strconv"
"strings"
"time"
authpkg "jiggablend/internal/auth"
"github.com/go-chi/chi/v5"
)
type uiJobSummary struct {
ID int64
Name string
Status string
Progress float64
FrameStart *int
FrameEnd *int
OutputFormat *string
CreatedAt time.Time
}
type uiTaskSummary struct {
ID int64
TaskType string
Status string
Frame int
FrameEnd *int
CurrentStep string
RetryCount int
Error string
StartedAt *time.Time
CompletedAt *time.Time
}
type uiFileSummary struct {
ID int64
FileName string
FileType string
FileSize int64
CreatedAt time.Time
}
func (s *Manager) setupUIRoutes() {
s.router.Get("/", s.handleUIRoot)
s.router.Get("/login", s.handleUILoginPage)
s.router.Post("/logout", s.handleUILogout)
s.router.Group(func(r chi.Router) {
r.Use(func(next http.Handler) http.Handler {
return http.HandlerFunc(s.auth.Middleware(next.ServeHTTP))
})
r.Get("/jobs", s.handleUIJobsPage)
r.Get("/jobs/new", s.handleUINewJobPage)
r.Get("/jobs/{id}", s.handleUIJobDetailPage)
r.Get("/ui/fragments/jobs", s.handleUIJobsFragment)
r.Get("/ui/fragments/jobs/{id}/tasks", s.handleUIJobTasksFragment)
r.Get("/ui/fragments/jobs/{id}/files", s.handleUIJobFilesFragment)
})
s.router.Group(func(r chi.Router) {
r.Use(func(next http.Handler) http.Handler {
return http.HandlerFunc(s.auth.AdminMiddleware(next.ServeHTTP))
})
r.Get("/admin", s.handleUIAdminPage)
r.Get("/ui/fragments/admin/runners", s.handleUIAdminRunnersFragment)
r.Get("/ui/fragments/admin/users", s.handleUIAdminUsersFragment)
r.Get("/ui/fragments/admin/apikeys", s.handleUIAdminAPIKeysFragment)
})
}
func (s *Manager) sessionFromRequest(r *http.Request) (*authpkg.Session, bool) {
cookie, err := r.Cookie("session_id")
if err != nil {
return nil, false
}
return s.auth.GetSession(cookie.Value)
}
func (s *Manager) handleUIRoot(w http.ResponseWriter, r *http.Request) {
if _, ok := s.sessionFromRequest(r); ok {
http.Redirect(w, r, "/jobs", http.StatusFound)
return
}
http.Redirect(w, r, "/login", http.StatusFound)
}
func (s *Manager) handleUILoginPage(w http.ResponseWriter, r *http.Request) {
if _, ok := s.sessionFromRequest(r); ok {
http.Redirect(w, r, "/jobs", http.StatusFound)
return
}
s.ui.render(w, pageData{
Title: "Login",
CurrentPath: "/login",
ContentTemplate: "page_login",
PageScript: "/assets/login.js",
Data: map[string]interface{}{
"google_enabled": s.auth.IsGoogleOAuthConfigured(),
"discord_enabled": s.auth.IsDiscordOAuthConfigured(),
"local_enabled": s.auth.IsLocalLoginEnabled(),
"error": r.URL.Query().Get("error"),
},
})
}
func (s *Manager) handleUILogout(w http.ResponseWriter, r *http.Request) {
cookie, err := r.Cookie("session_id")
if err == nil {
s.auth.DeleteSession(cookie.Value)
}
expired := &http.Cookie{
Name: "session_id",
Value: "",
Path: "/",
MaxAge: -1,
HttpOnly: true,
SameSite: http.SameSiteLaxMode,
}
if s.cfg.IsProductionMode() {
expired.Secure = true
}
http.SetCookie(w, expired)
http.Redirect(w, r, "/login", http.StatusFound)
}
func (s *Manager) handleUIJobsPage(w http.ResponseWriter, r *http.Request) {
user, _ := s.sessionFromRequest(r)
s.ui.render(w, pageData{
Title: "Jobs",
CurrentPath: "/jobs",
ContentTemplate: "page_jobs",
PageScript: "/assets/jobs.js",
User: user,
})
}
func (s *Manager) handleUINewJobPage(w http.ResponseWriter, r *http.Request) {
user, _ := s.sessionFromRequest(r)
s.ui.render(w, pageData{
Title: "New Job",
CurrentPath: "/jobs/new",
ContentTemplate: "page_jobs_new",
PageScript: "/assets/job_new.js",
User: user,
})
}
func (s *Manager) handleUIJobDetailPage(w http.ResponseWriter, r *http.Request) {
userID, err := getUserID(r)
if err != nil {
http.Redirect(w, r, "/login", http.StatusFound)
return
}
isAdmin := authpkg.IsAdmin(r.Context())
jobID, err := parseID(r, "id")
if err != nil {
http.NotFound(w, r)
return
}
job, err := s.getUIJob(jobID, userID, isAdmin)
if err != nil {
http.NotFound(w, r)
return
}
user, _ := s.sessionFromRequest(r)
s.ui.render(w, pageData{
Title: fmt.Sprintf("Job %d", jobID),
CurrentPath: "/jobs",
ContentTemplate: "page_job_show",
PageScript: "/assets/job_show.js",
User: user,
Data: map[string]interface{}{"job": job},
})
}
func (s *Manager) handleUIAdminPage(w http.ResponseWriter, r *http.Request) {
user, _ := s.sessionFromRequest(r)
regEnabled, _ := s.auth.IsRegistrationEnabled()
s.ui.render(w, pageData{
Title: "Admin",
CurrentPath: "/admin",
ContentTemplate: "page_admin",
PageScript: "/assets/admin.js",
User: user,
Data: map[string]interface{}{
"registration_enabled": regEnabled,
},
})
}
func (s *Manager) handleUIJobsFragment(w http.ResponseWriter, r *http.Request) {
userID, err := getUserID(r)
if err != nil {
http.Error(w, "unauthorized", http.StatusUnauthorized)
return
}
jobs, err := s.listUIJobSummaries(userID, 50, 0)
if err != nil {
http.Error(w, err.Error(), http.StatusInternalServerError)
return
}
s.ui.renderTemplate(w, "partial_jobs_table", map[string]interface{}{"jobs": jobs})
}
func (s *Manager) handleUIJobTasksFragment(w http.ResponseWriter, r *http.Request) {
userID, err := getUserID(r)
if err != nil {
http.Error(w, "unauthorized", http.StatusUnauthorized)
return
}
isAdmin := authpkg.IsAdmin(r.Context())
jobID, err := parseID(r, "id")
if err != nil {
http.Error(w, "invalid job id", http.StatusBadRequest)
return
}
if _, err := s.getUIJob(jobID, userID, isAdmin); err != nil {
http.Error(w, "job not found", http.StatusNotFound)
return
}
tasks, err := s.listUITasks(jobID)
if err != nil {
http.Error(w, err.Error(), http.StatusInternalServerError)
return
}
s.ui.renderTemplate(w, "partial_job_tasks", map[string]interface{}{
"job_id": jobID,
"tasks": tasks,
})
}
func (s *Manager) handleUIJobFilesFragment(w http.ResponseWriter, r *http.Request) {
userID, err := getUserID(r)
if err != nil {
http.Error(w, "unauthorized", http.StatusUnauthorized)
return
}
isAdmin := authpkg.IsAdmin(r.Context())
jobID, err := parseID(r, "id")
if err != nil {
http.Error(w, "invalid job id", http.StatusBadRequest)
return
}
if _, err := s.getUIJob(jobID, userID, isAdmin); err != nil {
http.Error(w, "job not found", http.StatusNotFound)
return
}
files, err := s.listUIFiles(jobID, 100)
if err != nil {
http.Error(w, err.Error(), http.StatusInternalServerError)
return
}
outputFiles := make([]uiFileSummary, 0, len(files))
adminInputFiles := make([]uiFileSummary, 0)
for _, file := range files {
if strings.EqualFold(file.FileType, "output") {
outputFiles = append(outputFiles, file)
continue
}
if isAdmin {
adminInputFiles = append(adminInputFiles, file)
}
}
s.ui.renderTemplate(w, "partial_job_files", map[string]interface{}{
"job_id": jobID,
"files": outputFiles,
"is_admin": isAdmin,
"admin_input_files": adminInputFiles,
})
}
func (s *Manager) handleUIAdminRunnersFragment(w http.ResponseWriter, r *http.Request) {
var rows *sql.Rows
err := s.db.With(func(conn *sql.DB) error {
var qErr error
rows, qErr = conn.Query(`SELECT id, name, hostname, status, last_heartbeat, priority, created_at FROM runners ORDER BY created_at DESC`)
return qErr
})
if err != nil {
http.Error(w, err.Error(), http.StatusInternalServerError)
return
}
defer rows.Close()
type runner struct {
ID int64
Name string
Hostname string
Status string
LastHeartbeat time.Time
Priority int
CreatedAt time.Time
}
all := make([]runner, 0)
for rows.Next() {
var item runner
if scanErr := rows.Scan(&item.ID, &item.Name, &item.Hostname, &item.Status, &item.LastHeartbeat, &item.Priority, &item.CreatedAt); scanErr != nil {
http.Error(w, scanErr.Error(), http.StatusInternalServerError)
return
}
all = append(all, item)
}
s.ui.renderTemplate(w, "partial_admin_runners", map[string]interface{}{"runners": all})
}
func (s *Manager) handleUIAdminUsersFragment(w http.ResponseWriter, r *http.Request) {
currentUserID, _ := getUserID(r)
firstUserID, _ := s.auth.GetFirstUserID()
var rows *sql.Rows
err := s.db.With(func(conn *sql.DB) error {
var qErr error
rows, qErr = conn.Query(`SELECT id, email, name, oauth_provider, is_admin, created_at FROM users ORDER BY created_at DESC`)
return qErr
})
if err != nil {
http.Error(w, err.Error(), http.StatusInternalServerError)
return
}
defer rows.Close()
type user struct {
ID int64
Email string
Name string
OAuthProvider string
IsAdmin bool
IsFirstUser bool
CreatedAt time.Time
}
all := make([]user, 0)
for rows.Next() {
var item user
if scanErr := rows.Scan(&item.ID, &item.Email, &item.Name, &item.OAuthProvider, &item.IsAdmin, &item.CreatedAt); scanErr != nil {
http.Error(w, scanErr.Error(), http.StatusInternalServerError)
return
}
item.IsFirstUser = item.ID == firstUserID
all = append(all, item)
}
s.ui.renderTemplate(w, "partial_admin_users", map[string]interface{}{
"users": all,
"current_user_id": currentUserID,
})
}
func (s *Manager) handleUIAdminAPIKeysFragment(w http.ResponseWriter, r *http.Request) {
keys, err := s.secrets.ListRunnerAPIKeys()
if err != nil {
http.Error(w, err.Error(), http.StatusInternalServerError)
return
}
type item struct {
ID int64
Name string
Scope string
Key string
IsActive bool
CreatedAt time.Time
}
out := make([]item, 0, len(keys))
for _, key := range keys {
out = append(out, item{
ID: key.ID,
Name: key.Name,
Scope: key.Scope,
Key: key.Key,
IsActive: key.IsActive,
CreatedAt: key.CreatedAt,
})
}
s.ui.renderTemplate(w, "partial_admin_apikeys", map[string]interface{}{"keys": out})
}
func (s *Manager) listUIJobSummaries(userID int64, limit int, offset int) ([]uiJobSummary, error) {
var rows *sql.Rows
err := s.db.With(func(conn *sql.DB) error {
var qErr error
rows, qErr = conn.Query(
`SELECT id, name, status, progress, frame_start, frame_end, output_format, created_at
FROM jobs WHERE user_id = ? ORDER BY created_at DESC LIMIT ? OFFSET ?`,
userID, limit, offset,
)
return qErr
})
if err != nil {
return nil, err
}
defer rows.Close()
out := make([]uiJobSummary, 0)
for rows.Next() {
var item uiJobSummary
var frameStart, frameEnd sql.NullInt64
var outputFormat sql.NullString
if scanErr := rows.Scan(&item.ID, &item.Name, &item.Status, &item.Progress, &frameStart, &frameEnd, &outputFormat, &item.CreatedAt); scanErr != nil {
return nil, scanErr
}
if frameStart.Valid {
v := int(frameStart.Int64)
item.FrameStart = &v
}
if frameEnd.Valid {
v := int(frameEnd.Int64)
item.FrameEnd = &v
}
if outputFormat.Valid {
item.OutputFormat = &outputFormat.String
}
out = append(out, item)
}
return out, nil
}
func (s *Manager) getUIJob(jobID int64, userID int64, isAdmin bool) (uiJobSummary, error) {
var item uiJobSummary
var frameStart, frameEnd sql.NullInt64
var outputFormat sql.NullString
err := s.db.With(func(conn *sql.DB) error {
if isAdmin {
return conn.QueryRow(
`SELECT id, name, status, progress, frame_start, frame_end, output_format, created_at
FROM jobs WHERE id = ?`,
jobID,
).Scan(&item.ID, &item.Name, &item.Status, &item.Progress, &frameStart, &frameEnd, &outputFormat, &item.CreatedAt)
}
return conn.QueryRow(
`SELECT id, name, status, progress, frame_start, frame_end, output_format, created_at
FROM jobs WHERE id = ? AND user_id = ?`,
jobID, userID,
).Scan(&item.ID, &item.Name, &item.Status, &item.Progress, &frameStart, &frameEnd, &outputFormat, &item.CreatedAt)
})
if err != nil {
return uiJobSummary{}, err
}
if frameStart.Valid {
v := int(frameStart.Int64)
item.FrameStart = &v
}
if frameEnd.Valid {
v := int(frameEnd.Int64)
item.FrameEnd = &v
}
if outputFormat.Valid {
item.OutputFormat = &outputFormat.String
}
return item, nil
}
func (s *Manager) listUITasks(jobID int64) ([]uiTaskSummary, error) {
var rows *sql.Rows
err := s.db.With(func(conn *sql.DB) error {
var qErr error
rows, qErr = conn.Query(
`SELECT id, task_type, status, frame, frame_end, current_step, retry_count, error_message, started_at, completed_at
FROM tasks WHERE job_id = ? ORDER BY id ASC`,
jobID,
)
return qErr
})
if err != nil {
return nil, err
}
defer rows.Close()
out := make([]uiTaskSummary, 0)
for rows.Next() {
var item uiTaskSummary
var frameEnd sql.NullInt64
var currentStep sql.NullString
var errMsg sql.NullString
var startedAt, completedAt sql.NullTime
if scanErr := rows.Scan(
&item.ID, &item.TaskType, &item.Status, &item.Frame, &frameEnd,
&currentStep, &item.RetryCount, &errMsg, &startedAt, &completedAt,
); scanErr != nil {
return nil, scanErr
}
if frameEnd.Valid {
v := int(frameEnd.Int64)
item.FrameEnd = &v
}
if currentStep.Valid {
item.CurrentStep = currentStep.String
}
if errMsg.Valid {
item.Error = errMsg.String
}
if startedAt.Valid {
item.StartedAt = &startedAt.Time
}
if completedAt.Valid {
item.CompletedAt = &completedAt.Time
}
out = append(out, item)
}
return out, nil
}
func (s *Manager) listUIFiles(jobID int64, limit int) ([]uiFileSummary, error) {
var rows *sql.Rows
err := s.db.With(func(conn *sql.DB) error {
var qErr error
rows, qErr = conn.Query(
`SELECT id, file_name, file_type, file_size, created_at
FROM job_files WHERE job_id = ? ORDER BY created_at DESC LIMIT ?`,
jobID, limit,
)
return qErr
})
if err != nil {
return nil, err
}
defer rows.Close()
out := make([]uiFileSummary, 0)
for rows.Next() {
var item uiFileSummary
if scanErr := rows.Scan(&item.ID, &item.FileName, &item.FileType, &item.FileSize, &item.CreatedAt); scanErr != nil {
return nil, scanErr
}
out = append(out, item)
}
return out, nil
}
func parseBoolForm(r *http.Request, key string) bool {
v := strings.TrimSpace(strings.ToLower(r.FormValue(key)))
return v == "1" || v == "true" || v == "on" || v == "yes"
}
func parseIntQuery(r *http.Request, key string, fallback int) int {
raw := strings.TrimSpace(r.URL.Query().Get(key))
if raw == "" {
return fallback
}
v, err := strconv.Atoi(raw)
if err != nil || v < 0 {
return fallback
}
return v
}

View File

@@ -0,0 +1,37 @@
package api
import (
"net/http/httptest"
"testing"
)
func TestParseBoolForm(t *testing.T) {
req := httptest.NewRequest("POST", "/", nil)
if err := req.ParseForm(); err != nil {
t.Fatalf("ParseForm: %v", err)
}
req.Form.Set("enabled", "true")
if !parseBoolForm(req, "enabled") {
t.Fatalf("expected true for enabled=true")
}
req.Form.Set("enabled", "no")
if parseBoolForm(req, "enabled") {
t.Fatalf("expected false for enabled=no")
}
}
func TestParseIntQuery(t *testing.T) {
req := httptest.NewRequest("GET", "/?limit=42", nil)
if got := parseIntQuery(req, "limit", 10); got != 42 {
t.Fatalf("expected 42, got %d", got)
}
req = httptest.NewRequest("GET", "/?limit=-1", nil)
if got := parseIntQuery(req, "limit", 10); got != 10 {
t.Fatalf("expected fallback 10, got %d", got)
}
req = httptest.NewRequest("GET", "/?limit=abc", nil)
if got := parseIntQuery(req, "limit", 10); got != 10 {
t.Fatalf("expected fallback 10, got %d", got)
}
}

View File

@@ -196,7 +196,8 @@ type NextJobTaskInfo struct {
TaskID   int64  `json:"task_id"`
JobID    int64  `json:"job_id"`
JobName  string `json:"job_name"`
Frame    int    `json:"frame"`     // frame start (inclusive)
FrameEnd int    `json:"frame_end"` // frame end (inclusive); same as Frame for single-frame
TaskType string `json:"task_type"`
Metadata *types.BlendMetadata `json:"metadata,omitempty"`
}
@@ -315,6 +316,28 @@ func (m *ManagerClient) GetJobMetadata(jobID int64) (*types.BlendMetadata, error
return &metadata, nil
}
// GetJobStatus retrieves the current status of a job.
func (m *ManagerClient) GetJobStatus(jobID int64) (types.JobStatus, error) {
path := fmt.Sprintf("/api/runner/jobs/%d/status?runner_id=%d", jobID, m.runnerID)
resp, err := m.Request("GET", path, nil)
if err != nil {
return "", err
}
defer resp.Body.Close()
if resp.StatusCode != http.StatusOK {
body, _ := io.ReadAll(resp.Body)
return "", fmt.Errorf("failed to get job status: %s", string(body))
}
var job types.Job
if err := json.NewDecoder(resp.Body).Decode(&job); err != nil {
return "", err
}
return job.Status, nil
}
// JobFile represents a file associated with a job.
type JobFile struct {
ID int64 `json:"id"`
@@ -419,3 +442,32 @@ func (m *ManagerClient) DownloadBlender(version string) (io.ReadCloser, error) {
return resp.Body, nil
}
// blenderVersionsResponse is the response from GET /api/blender/versions.
type blenderVersionsResponse struct {
Versions []struct {
Full string `json:"full"`
} `json:"versions"`
}
// GetLatestBlenderVersion returns the latest Blender version string (e.g. "4.2.3") from the manager.
// Uses the flat versions list which is newest-first.
func (m *ManagerClient) GetLatestBlenderVersion() (string, error) {
resp, err := m.Request("GET", "/api/blender/versions", nil)
if err != nil {
return "", fmt.Errorf("failed to fetch blender versions: %w", err)
}
defer resp.Body.Close()
if resp.StatusCode != http.StatusOK {
body, _ := io.ReadAll(resp.Body)
return "", fmt.Errorf("blender versions returned status %d: %s", resp.StatusCode, string(body))
}
var out blenderVersionsResponse
if err := json.NewDecoder(resp.Body).Decode(&out); err != nil {
return "", fmt.Errorf("failed to decode blender versions: %w", err)
}
if len(out.Versions) == 0 {
return "", fmt.Errorf("no blender versions available")
}
return out.Versions[0].Full, nil
}

View File

@@ -6,6 +6,7 @@ import (
"log"
"os"
"path/filepath"
"strings"

"jiggablend/internal/runner/api"
"jiggablend/internal/runner/workspace"
@@ -85,3 +86,42 @@ func (m *Manager) GetBinaryForJob(version string) (string, error) {
return m.GetBinaryPath(version)
}
// TarballEnv returns a copy of baseEnv with LD_LIBRARY_PATH set so that a
// tarball Blender installation can find its bundled libs (e.g. lib/python3.x).
// If blenderBinary is the system "blender" or has no path component, baseEnv is
// returned unchanged.
func TarballEnv(blenderBinary string, baseEnv []string) []string {
if blenderBinary == "" || blenderBinary == "blender" {
return baseEnv
}
if !strings.Contains(blenderBinary, string(os.PathSeparator)) {
return baseEnv
}
blenderDir := filepath.Dir(blenderBinary)
libDir := filepath.Join(blenderDir, "lib")
ldLib := libDir
for _, e := range baseEnv {
if strings.HasPrefix(e, "LD_LIBRARY_PATH=") {
existing := strings.TrimPrefix(e, "LD_LIBRARY_PATH=")
if existing != "" {
ldLib = libDir + ":" + existing
}
break
}
}
out := make([]string, 0, len(baseEnv)+1)
done := false
for _, e := range baseEnv {
if strings.HasPrefix(e, "LD_LIBRARY_PATH=") {
out = append(out, "LD_LIBRARY_PATH="+ldLib)
done = true
continue
}
out = append(out, e)
}
if !done {
out = append(out, "LD_LIBRARY_PATH="+ldLib)
}
return out
}

View File

@@ -0,0 +1,45 @@
// Package blender: GPU backend detection for HIP vs NVIDIA.
package blender
import (
"bufio"
"fmt"
"os"
"os/exec"
"path/filepath"
"strings"
"jiggablend/pkg/scripts"
)
// DetectGPUBackends runs a minimal Blender script to detect whether HIP (AMD) and/or
// NVIDIA (CUDA/OptiX) devices are available. Use this to decide whether to force CPU
// for Blender < 4.x (only force when HIP is present, since HIP has no official support pre-4).
func DetectGPUBackends(blenderBinary, scriptDir string) (hasHIP, hasNVIDIA bool, err error) {
scriptPath := filepath.Join(scriptDir, "detect_gpu_backends.py")
if err := os.WriteFile(scriptPath, []byte(scripts.DetectGPUBackends), 0644); err != nil {
return false, false, fmt.Errorf("write detection script: %w", err)
}
defer os.Remove(scriptPath)
env := TarballEnv(blenderBinary, os.Environ())
cmd := exec.Command(blenderBinary, "-b", "--python", scriptPath)
cmd.Env = env
cmd.Dir = scriptDir
out, err := cmd.CombinedOutput()
if err != nil {
return false, false, fmt.Errorf("run blender detection: %w (output: %s)", err, string(out))
}
scanner := bufio.NewScanner(strings.NewReader(string(out)))
for scanner.Scan() {
line := strings.TrimSpace(scanner.Text())
switch line {
case "HAS_HIP":
hasHIP = true
case "HAS_NVIDIA":
hasNVIDIA = true
}
}
return hasHIP, hasNVIDIA, scanner.Err()
}

View File

@@ -20,10 +20,8 @@ type EncodeConfig struct {
StartFrame int     // Starting frame number
FrameRate  float64 // Frame rate
WorkDir    string  // Working directory
UseAlpha   bool    // Whether to preserve alpha channel
TwoPass    bool    // Whether to use 2-pass encoding
}

// Selector selects the software encoder.

View File

@@ -1,5 +1,7 @@
package encoding

// Pipeline: Blender outputs only EXR (linear). Encode is EXR only: linear -> sRGB -> HLG (video), 10-bit, full range.
import (
"fmt"
"log"
@@ -56,97 +58,34 @@ func (e *SoftwareEncoder) Available() bool {
}

func (e *SoftwareEncoder) BuildCommand(config *EncodeConfig) *exec.Cmd {
// EXR only: HDR path (HLG, 10-bit, full range)
pixFmt := "yuv420p10le"
if config.UseAlpha {
pixFmt = "yuva420p10le"
}
colorPrimaries, colorTrc, colorspace, colorRange := "bt709", "arib-std-b67", "bt709", "pc"

var codecArgs []string
switch e.codec {
case "libaom-av1":
codecArgs = []string{"-crf", strconv.Itoa(CRFAV1), "-b:v", "0", "-tiles", "2x2", "-g", "240"}
case "libvpx-vp9":
codecArgs = []string{"-crf", strconv.Itoa(CRFVP9), "-b:v", "0", "-row-mt", "1", "-g", "240"}
default:
codecArgs = []string{"-preset", "veryslow", "-crf", strconv.Itoa(CRFH264), "-profile:v", "high10", "-level", "5.2", "-tune", "film", "-keyint_min", "24", "-g", "240", "-bf", "2", "-refs", "4"}
}

args := []string{"-y", "-f", "image2", "-start_number", fmt.Sprintf("%d", config.StartFrame), "-framerate", fmt.Sprintf("%.2f", config.FrameRate),
"-color_trc", "linear", "-color_primaries", "bt709"}
args = append(args, "-i", config.InputPattern, "-c:v", e.codec, "-pix_fmt", pixFmt, "-r", fmt.Sprintf("%.2f", config.FrameRate), "-color_primaries", colorPrimaries, "-color_trc", colorTrc, "-colorspace", colorspace, "-color_range", colorRange)

vf := "format=gbrpf32le,zscale=transferin=8:transfer=13:primariesin=1:primaries=1:matrixin=0:matrix=1:rangein=full:range=full,zscale=transferin=13:transfer=18:primariesin=1:primaries=1:matrixin=1:matrix=1:rangein=full:range=full"
if config.UseAlpha {
vf += ",format=yuva420p10le"
} else {
vf += ",format=yuv420p10le"
}
args = append(args, "-vf", vf)
args = append(args, codecArgs...)

if config.TwoPass {
@@ -168,97 +107,33 @@ func (e *SoftwareEncoder) BuildCommand(config *EncodeConfig) *exec.Cmd {
// BuildPass1Command builds the first pass command for 2-pass encoding.
func (e *SoftwareEncoder) BuildPass1Command(config *EncodeConfig) *exec.Cmd {
pixFmt := "yuv420p10le"
if config.UseAlpha {
pixFmt = "yuva420p10le"
}
colorPrimaries, colorTrc, colorspace, colorRange := "bt709", "arib-std-b67", "bt709", "pc"

var codecArgs []string
switch e.codec {
case "libaom-av1":
codecArgs = []string{"-crf", strconv.Itoa(CRFAV1), "-b:v", "0", "-tiles", "2x2", "-g", "240"}
case "libvpx-vp9":
codecArgs = []string{"-crf", strconv.Itoa(CRFVP9), "-b:v", "0", "-row-mt", "1", "-g", "240"}
default:
codecArgs = []string{"-preset", "veryslow", "-crf", strconv.Itoa(CRFH264), "-profile:v", "high10", "-level", "5.2", "-tune", "film", "-keyint_min", "24", "-g", "240", "-bf", "2", "-refs", "4"}
}

args := []string{"-y", "-f", "image2", "-start_number", fmt.Sprintf("%d", config.StartFrame), "-framerate", fmt.Sprintf("%.2f", config.FrameRate),
"-color_trc", "linear", "-color_primaries", "bt709"}
args = append(args, "-i", config.InputPattern, "-c:v", e.codec, "-pix_fmt", pixFmt, "-r", fmt.Sprintf("%.2f", config.FrameRate), "-color_primaries", colorPrimaries, "-color_trc", colorTrc, "-colorspace", colorspace, "-color_range", colorRange)

vf := "format=gbrpf32le,zscale=transferin=8:transfer=13:primariesin=1:primaries=1:matrixin=0:matrix=1:rangein=full:range=full,zscale=transferin=13:transfer=18:primariesin=1:primaries=1:matrixin=1:matrix=1:rangein=full:range=full"
if config.UseAlpha {
vf += ",format=yuva420p10le"
} else {
vf += ",format=yuv420p10le"
}
args = append(args, "-vf", vf)
args = append(args, codecArgs...)
args = append(args, "-pass", "1", "-f", "null", "/dev/null")

View File

@@ -18,7 +18,6 @@ func TestSoftwareEncoder_BuildCommand_H264_EXR(t *testing.T) {
WorkDir:  "/tmp",
UseAlpha: false,
TwoPass:  true,
}

cmd := encoder.BuildCommand(config)
@@ -37,7 +36,7 @@ func TestSoftwareEncoder_BuildCommand_H264_EXR(t *testing.T) {
args := cmd.Args[1:] // Skip "ffmpeg"
argsStr := strings.Join(args, " ")

// EXR always uses HDR path: 10-bit, HLG, full range
checks := []struct {
name     string
expected string
@@ -46,18 +45,19 @@ func TestSoftwareEncoder_BuildCommand_H264_EXR(t *testing.T) {
{"image2 format", "-f image2"},
{"start number", "-start_number 1"},
{"framerate", "-framerate 24.00"},
{"input color tag", "-color_trc linear"},
{"input pattern", "-i frame_%04d.exr"},
{"codec", "-c:v libx264"},
{"pixel format", "-pix_fmt yuv420p10le"},
{"frame rate", "-r 24.00"},
{"color primaries", "-color_primaries bt709"},
{"color trc", "-color_trc arib-std-b67"},
{"colorspace", "-colorspace bt709"},
{"color range", "-color_range pc"},
{"video filter", "-vf"},
{"preset", "-preset veryslow"},
{"crf", "-crf 15"},
{"profile", "-profile:v high10"},
{"pass 2", "-pass 2"},
{"output path", "output.mp4"},
}
@@ -68,40 +68,15 @@ func TestSoftwareEncoder_BuildCommand_H264_EXR(t *testing.T) {
}
}

// EXR: linear -> sRGB -> HLG filter
if !strings.Contains(argsStr, "format=gbrpf32le") {
t.Error("Expected format conversion filter for EXR source, but not found")
}
if !strings.Contains(argsStr, "zscale=transferin=8:transfer=13") {
t.Error("Expected linear to sRGB conversion for EXR source, but not found")
}
if !strings.Contains(argsStr, "transfer=18") {
t.Error("Expected sRGB to HLG conversion for EXR HDR, but not found")
}
}
@@ -113,18 +88,17 @@ func TestSoftwareEncoder_BuildCommand_AV1_WithAlpha(t *testing.T) {
StartFrame: 100,
FrameRate:  30.0,
WorkDir:    "/tmp",
UseAlpha:   true,
TwoPass:    true,
}

cmd := encoder.BuildCommand(config)
args := cmd.Args[1:]
argsStr := strings.Join(args, " ")

// EXR with alpha: 10-bit HDR path
if !strings.Contains(argsStr, "-pix_fmt yuva420p10le") {
t.Error("Expected yuva420p10le pixel format for EXR alpha, but not found")
}

// Check AV1-specific arguments
@@ -142,9 +116,9 @@ func TestSoftwareEncoder_BuildCommand_AV1_WithAlpha(t *testing.T) {
}
}

// Check tonemap filter includes alpha format (10-bit for EXR)
if !strings.Contains(argsStr, "format=yuva420p10le") {
t.Error("Expected tonemap filter to output yuva420p10le for EXR alpha, but not found")
}
}
@@ -156,9 +130,8 @@ func TestSoftwareEncoder_BuildCommand_VP9(t *testing.T) {
StartFrame: 1,
FrameRate:  24.0,
WorkDir:    "/tmp",
UseAlpha:   true,
TwoPass:    true,
}

cmd := encoder.BuildCommand(config)
@@ -191,7 +164,6 @@ func TestSoftwareEncoder_BuildPass1Command(t *testing.T) {
WorkDir:  "/tmp",
UseAlpha: false,
TwoPass:  true,
}

cmd := encoder.BuildPass1Command(config)
@@ -227,7 +199,6 @@ func TestSoftwareEncoder_BuildPass1Command_AV1(t *testing.T) {
WorkDir:  "/tmp",
UseAlpha: false,
TwoPass:  true,
}

cmd := encoder.BuildPass1Command(config)
@@ -273,7 +244,6 @@ func TestSoftwareEncoder_BuildPass1Command_VP9(t *testing.T) {
WorkDir:  "/tmp",
UseAlpha: false,
TwoPass:  true,
}

cmd := encoder.BuildPass1Command(config)
@@ -319,7 +289,6 @@ func TestSoftwareEncoder_BuildCommand_NoTwoPass(t *testing.T) {
WorkDir:  "/tmp",
UseAlpha: false,
TwoPass:  false,
}

cmd := encoder.BuildCommand(config)
@@ -432,28 +401,6 @@ func TestSoftwareEncoder_Available(t *testing.T) {
}
}

func TestCommandOrder(t *testing.T) {
encoder := &SoftwareEncoder{codec: "libx264"}
config := &EncodeConfig{
@@ -464,7 +411,6 @@ func TestCommandOrder(t *testing.T) {
WorkDir:  "/tmp",
UseAlpha: false,
TwoPass:  true,
}

cmd := encoder.BuildCommand(config)
@@ -519,20 +465,18 @@ func TestCommand_ColorspaceMetadata(t *testing.T) {
WorkDir:  "/tmp",
UseAlpha: false,
TwoPass:  false,
}

cmd := encoder.BuildCommand(config)
args := cmd.Args[1:]
argsStr := strings.Join(args, " ")

// EXR always uses HDR path: bt709 primaries, HLG, full range
colorspaceArgs := []string{
"-color_primaries bt709",
"-color_trc arib-std-b67",
"-colorspace bt709",
"-color_range pc",
}

for _, arg := range colorspaceArgs {
@@ -541,17 +485,11 @@ func TestCommand_ColorspaceMetadata(t *testing.T) {
}
}

if !strings.Contains(argsStr, "-pix_fmt yuv420p10le") {
t.Error("EXR encoding should use yuv420p10le pixel format")
}
if !strings.Contains(argsStr, "-profile:v high10") {
t.Error("EXR encoding should use high10 profile")
}
}
@@ -565,20 +503,18 @@ func TestCommand_HDR_ColorspaceMetadata(t *testing.T) {
WorkDir:  "/tmp",
UseAlpha: false,
TwoPass:  false,
}

cmd := encoder.BuildCommand(config)
args := cmd.Args[1:]
argsStr := strings.Join(args, " ")

// Verify all HDR colorspace metadata is present for EXR (full range to match zscale output)
colorspaceArgs := []string{
"-color_primaries bt709",
"-color_trc arib-std-b67",
"-colorspace bt709",
"-color_range pc",
}

for _, arg := range colorspaceArgs {
@@ -656,7 +592,6 @@ func TestIntegration_Encode_EXR_H264(t *testing.T) {
WorkDir:  tmpDir,
UseAlpha: false,
TwoPass:  false, // Use single pass for faster testing
}

// Build and run command
@@ -687,77 +622,6 @@ func TestIntegration_Encode_EXR_H264(t *testing.T) {
}
}
-func TestIntegration_Encode_PNG_H264(t *testing.T) {
-	if testing.Short() {
-		t.Skip("Skipping integration test in short mode")
-	}
-	// Check if example file exists
-	exampleDir := filepath.Join("..", "..", "..", "examples")
-	pngFile := filepath.Join(exampleDir, "frame_0800.png")
-	if _, err := os.Stat(pngFile); os.IsNotExist(err) {
-		t.Skipf("Example file not found: %s", pngFile)
-	}
-	// Get absolute paths
-	workspaceRoot, err := filepath.Abs(filepath.Join("..", "..", ".."))
-	if err != nil {
-		t.Fatalf("Failed to get workspace root: %v", err)
-	}
-	exampleDirAbs, err := filepath.Abs(exampleDir)
-	if err != nil {
-		t.Fatalf("Failed to get example directory: %v", err)
-	}
-	tmpDir := filepath.Join(workspaceRoot, "tmp")
-	if err := os.MkdirAll(tmpDir, 0755); err != nil {
-		t.Fatalf("Failed to create tmp directory: %v", err)
-	}
-	encoder := &SoftwareEncoder{codec: "libx264"}
-	config := &EncodeConfig{
-		InputPattern: filepath.Join(exampleDirAbs, "frame_%04d.png"),
-		OutputPath:   filepath.Join(tmpDir, "test_png_h264.mp4"),
-		StartFrame:   800,
-		FrameRate:    24.0,
-		WorkDir:      tmpDir,
-		UseAlpha:     false,
-		TwoPass:      false, // Use single pass for faster testing
-		SourceFormat: "png",
-	}
-	// Build and run command
-	cmd := encoder.BuildCommand(config)
-	if cmd == nil {
-		t.Fatal("BuildCommand returned nil")
-	}
-	// Verify no video filter is used for PNG
-	argsStr := strings.Join(cmd.Args, " ")
-	if strings.Contains(argsStr, "-vf") {
-		t.Error("PNG encoding should not use video filter, but -vf was found in command")
-	}
-	// Run the command
-	cmdOutput, err := cmd.CombinedOutput()
-	if err != nil {
-		t.Errorf("FFmpeg command failed: %v\nCommand output: %s", err, string(cmdOutput))
-		return
-	}
-	// Verify output file was created
-	if _, err := os.Stat(config.OutputPath); os.IsNotExist(err) {
-		t.Errorf("Output file was not created: %s\nCommand output: %s", config.OutputPath, string(cmdOutput))
-	} else {
-		t.Logf("Successfully created output file: %s", config.OutputPath)
-		info, _ := os.Stat(config.OutputPath)
-		if info.Size() == 0 {
-			t.Error("Output file was created but is empty")
-		} else {
-			t.Logf("Output file size: %d bytes", info.Size())
-		}
-	}
-}
 func TestIntegration_Encode_EXR_VP9(t *testing.T) {
 	if testing.Short() {
 		t.Skip("Skipping integration test in short mode")
@@ -800,7 +664,6 @@ func TestIntegration_Encode_EXR_VP9(t *testing.T) {
 		WorkDir:      tmpDir,
 		UseAlpha:     false,
 		TwoPass:      false, // Use single pass for faster testing
-		SourceFormat: "exr",
 	}
 	// Build and run command
@@ -873,7 +736,6 @@ func TestIntegration_Encode_EXR_AV1(t *testing.T) {
 		WorkDir:      tmpDir,
 		UseAlpha:     false,
 		TwoPass:      false,
-		SourceFormat: "exr",
 	}
 	// Build and run command
@@ -940,7 +802,6 @@ func TestIntegration_Encode_EXR_VP9_WithAlpha(t *testing.T) {
 		WorkDir:      tmpDir,
 		UseAlpha:     true,  // Test with alpha
 		TwoPass:      false, // Use single pass for faster testing
-		SourceFormat: "exr",
 	}
 	// Build and run command
@@ -4,6 +4,7 @@ package runner
 import (
 	"crypto/sha256"
 	"encoding/hex"
+	"errors"
 	"fmt"
 	"log"
 	"net"
@@ -39,10 +40,29 @@ type Runner struct {
 	fingerprint   string
 	fingerprintMu sync.RWMutex
+
+	// gpuLockedOut is set when logs indicate a GPU error (e.g. HIP "Illegal address");
+	// when true, the runner forces CPU rendering for all subsequent jobs.
+	gpuLockedOut   bool
+	gpuLockedOutMu sync.RWMutex
+
+	// hasHIP/hasNVIDIA are set at startup by running the latest Blender to detect GPU backends.
+	// Used to force CPU only for Blender < 4.x when HIP is present (no official HIP support pre-4).
+	// gpuDetectionFailed is true when detection could not run; we then force CPU for all versions
+	// (we could not determine HIP vs NVIDIA).
+	gpuBackendMu       sync.RWMutex
+	hasHIP             bool
+	hasNVIDIA          bool
+	gpuBackendProbed   bool
+	gpuDetectionFailed bool
+
+	// forceCPURendering forces CPU rendering for all jobs regardless of metadata/backend detection.
+	forceCPURendering bool
+	// disableHIPRT disables HIPRT acceleration when configuring Cycles HIP devices.
+	disableHIPRT bool
 }

 // New creates a new runner.
-func New(managerURL, name, hostname string) *Runner {
+func New(managerURL, name, hostname string, forceCPURendering, disableHIPRT bool) *Runner {
 	manager := api.NewManagerClient(managerURL)
 	r := &Runner{
@@ -52,6 +72,9 @@ func New(managerURL, name, hostname string) *Runner {
 		processes:  executils.NewProcessTracker(),
 		stopChan:   make(chan struct{}),
 		processors: make(map[string]tasks.Processor),
+
+		forceCPURendering: forceCPURendering,
+		disableHIPRT:      disableHIPRT,
 	}

 	// Generate fingerprint
@@ -67,10 +90,6 @@ func (r *Runner) CheckRequiredTools() error {
 	}
 	log.Printf("Found zstd for compressed blend file support")
-	if err := exec.Command("xvfb-run", "--help").Run(); err != nil {
-		return fmt.Errorf("xvfb-run not found - required for headless Blender rendering. Install with: apt install xvfb")
-	}
-	log.Printf("Found xvfb-run for headless rendering without -b option")
 	return nil
 }
@@ -122,6 +141,58 @@ func (r *Runner) Register(apiKey string) (int64, error) {
 	return id, nil
 }

+// DetectAndStoreGPUBackends downloads the latest Blender from the manager (if needed),
+// runs a detection script to see if HIP (AMD) and/or NVIDIA devices are available,
+// and stores the result. Call after Register. Used so we only force CPU for Blender < 4.x
+// when the runner has HIP (no official HIP support pre-4); NVIDIA is allowed.
+func (r *Runner) DetectAndStoreGPUBackends() {
+	r.gpuBackendMu.Lock()
+	defer r.gpuBackendMu.Unlock()
+	if r.gpuBackendProbed {
+		return
+	}
+	latestVer, err := r.manager.GetLatestBlenderVersion()
+	if err != nil {
+		log.Printf("GPU backend detection failed (could not get latest Blender version: %v). All jobs will use CPU because we could not determine HIP vs NVIDIA.", err)
+		r.gpuBackendProbed = true
+		r.gpuDetectionFailed = true
+		return
+	}
+	binaryPath, err := r.blender.GetBinaryPath(latestVer)
+	if err != nil {
+		log.Printf("GPU backend detection failed (could not get Blender binary: %v). All jobs will use CPU because we could not determine HIP vs NVIDIA.", err)
+		r.gpuBackendProbed = true
+		r.gpuDetectionFailed = true
+		return
+	}
+	hasHIP, hasNVIDIA, err := blender.DetectGPUBackends(binaryPath, r.workspace.BaseDir())
+	if err != nil {
+		log.Printf("GPU backend detection failed (script error: %v). All jobs will use CPU because we could not determine HIP vs NVIDIA.", err)
+		r.gpuBackendProbed = true
+		r.gpuDetectionFailed = true
+		return
+	}
+	r.hasHIP = hasHIP
+	r.hasNVIDIA = hasNVIDIA
+	r.gpuBackendProbed = true
+	r.gpuDetectionFailed = false
+	log.Printf("GPU backend detection: HIP=%v NVIDIA=%v (Blender < 4.x will force CPU only when HIP is present)", hasHIP, hasNVIDIA)
+}
+
+// HasHIP returns whether the runner detected HIP (AMD) devices. Used to force CPU for Blender < 4.x only when HIP is present.
+func (r *Runner) HasHIP() bool {
+	r.gpuBackendMu.RLock()
+	defer r.gpuBackendMu.RUnlock()
+	return r.hasHIP
+}
+
+// GPUDetectionFailed returns true when startup GPU backend detection could not run or failed. When true, all jobs use CPU because we could not determine HIP vs NVIDIA.
+func (r *Runner) GPUDetectionFailed() bool {
+	r.gpuBackendMu.RLock()
+	defer r.gpuBackendMu.RUnlock()
+	return r.gpuDetectionFailed
+}
 // Start starts the job polling loop.
 func (r *Runner) Start(pollInterval time.Duration) {
 	log.Printf("Starting job polling loop (interval: %v)", pollInterval)
@@ -182,6 +253,24 @@ func (r *Runner) Cleanup() {
 	}
 }

+func (r *Runner) withJobWorkspace(jobID int64, fn func(workDir string) error) error {
+	workDir, err := r.workspace.CreateJobDir(jobID)
+	if err != nil {
+		return fmt.Errorf("failed to create job workspace: %w", err)
+	}
+	defer func() {
+		if cleanupErr := r.workspace.CleanupJobDir(jobID); cleanupErr != nil {
+			log.Printf("Warning: failed to cleanup job workspace for job %d: %v", jobID, cleanupErr)
+		}
+		if cleanupErr := r.workspace.CleanupVideoDir(jobID); cleanupErr != nil {
+			log.Printf("Warning: failed to cleanup encode workspace for job %d: %v", jobID, cleanupErr)
+		}
+	}()
+	return fn(workDir)
+}
+
 // executeJob handles a job using per-job WebSocket connection.
 func (r *Runner) executeJob(job *api.NextJobResponse) (err error) {
 	// Recover from panics to prevent runner process crashes during task execution
@@ -192,72 +281,88 @@ func (r *Runner) executeJob(job *api.NextJobResponse) (err error) {
 		}
 	}()

-	// Connect to job WebSocket (no runnerID needed - authentication handles it)
-	jobConn := api.NewJobConnection()
-	if err := jobConn.Connect(r.manager.GetBaseURL(), job.JobPath, job.JobToken); err != nil {
-		return fmt.Errorf("failed to connect job WebSocket: %w", err)
-	}
-	defer jobConn.Close()
-
-	log.Printf("Job WebSocket authenticated for task %d", job.Task.TaskID)
-
-	// Create task context
-	workDir := r.workspace.JobDir(job.Task.JobID)
-	ctx := tasks.NewContext(
-		job.Task.TaskID,
-		job.Task.JobID,
-		job.Task.JobName,
-		job.Task.Frame,
-		job.Task.TaskType,
-		workDir,
-		job.JobToken,
-		job.Task.Metadata,
-		r.manager,
-		jobConn,
-		r.workspace,
-		r.blender,
-		r.encoder,
-		r.processes,
-	)
-
-	ctx.Info(fmt.Sprintf("Task assignment received (job: %d, type: %s)",
-		job.Task.JobID, job.Task.TaskType))
-
-	// Get processor for task type
-	processor, ok := r.processors[job.Task.TaskType]
-	if !ok {
-		return fmt.Errorf("unknown task type: %s", job.Task.TaskType)
-	}
-
-	// Process the task
-	var processErr error
-	switch job.Task.TaskType {
-	case "render": // render has an upload-outputs step: frames are not uploaded by the render task directly, so we upload them manually here. TODO: maybe make this work like the encode task
-		// Download context
-		contextPath := job.JobPath + "/context.tar"
-		if err := r.downloadContext(job.Task.JobID, contextPath, job.JobToken); err != nil {
-			jobConn.Log(job.Task.TaskID, types.LogLevelError, fmt.Sprintf("Failed to download context: %v", err))
-			jobConn.Complete(job.Task.TaskID, false, fmt.Errorf("failed to download context: %v", err))
-			return fmt.Errorf("failed to download context: %w", err)
-		}
-		processErr = processor.Process(ctx)
-		if processErr == nil {
-			processErr = r.uploadOutputs(ctx, job)
-		}
-	case "encode": // encode has no upload-outputs step: the video is uploaded by the encode task itself
-		processErr = processor.Process(ctx)
-	default:
-		return fmt.Errorf("unknown task type: %s", job.Task.TaskType)
-	}
-
-	if processErr != nil {
-		ctx.Error(fmt.Sprintf("Task failed: %v", processErr))
-		ctx.Complete(false, processErr)
-		return processErr
-	}
-
-	ctx.Complete(true, nil)
-	return nil
+	return r.withJobWorkspace(job.Task.JobID, func(workDir string) error {
+		// Connect to job WebSocket (no runnerID needed - authentication handles it)
+		jobConn := api.NewJobConnection()
+		if err := jobConn.Connect(r.manager.GetBaseURL(), job.JobPath, job.JobToken); err != nil {
+			return fmt.Errorf("failed to connect job WebSocket: %w", err)
+		}
+		defer jobConn.Close()
+
+		log.Printf("Job WebSocket authenticated for task %d", job.Task.TaskID)
+
+		// Create task context (frame range: Frame = start, FrameEnd = end; 0 or missing = single frame)
+		frameEnd := job.Task.FrameEnd
+		if frameEnd < job.Task.Frame {
+			frameEnd = job.Task.Frame
+		}
+		ctx := tasks.NewContext(
+			job.Task.TaskID,
+			job.Task.JobID,
+			job.Task.JobName,
+			job.Task.Frame,
+			frameEnd,
+			job.Task.TaskType,
+			workDir,
+			job.JobToken,
+			job.Task.Metadata,
+			r.manager,
+			jobConn,
+			r.workspace,
+			r.blender,
+			r.encoder,
+			r.processes,
+			r.IsGPULockedOut(),
+			r.HasHIP(),
+			r.GPUDetectionFailed(),
+			r.forceCPURendering,
+			r.disableHIPRT,
+			func() { r.SetGPULockedOut(true) },
+		)
+
+		ctx.Info(fmt.Sprintf("Task assignment received (job: %d, type: %s)",
+			job.Task.JobID, job.Task.TaskType))
+
+		// Get processor for task type
+		processor, ok := r.processors[job.Task.TaskType]
+		if !ok {
+			return fmt.Errorf("unknown task type: %s", job.Task.TaskType)
+		}
+
+		// Process the task
+		var processErr error
+		switch job.Task.TaskType {
+		case "render": // render has an upload-outputs step: frames are not uploaded by the render task directly, so we upload them manually here. TODO: maybe make this work like the encode task
+			// Download context
+			contextPath := job.JobPath + "/context.tar"
+			if err := r.downloadContext(job.Task.JobID, contextPath, job.JobToken); err != nil {
+				jobConn.Log(job.Task.TaskID, types.LogLevelError, fmt.Sprintf("Failed to download context: %v", err))
+				jobConn.Complete(job.Task.TaskID, false, fmt.Errorf("failed to download context: %v", err))
+				return fmt.Errorf("failed to download context: %w", err)
+			}
+			processErr = processor.Process(ctx)
+			if processErr == nil {
+				processErr = r.uploadOutputs(ctx, job)
+			}
+		case "encode": // encode has no upload-outputs step: the video is uploaded by the encode task itself
+			processErr = processor.Process(ctx)
+		default:
+			return fmt.Errorf("unknown task type: %s", job.Task.TaskType)
+		}
+
+		if processErr != nil {
+			if errors.Is(processErr, tasks.ErrJobCancelled) {
+				ctx.Warn("Stopping task early because the job was cancelled")
+				return nil
+			}
+			ctx.Error(fmt.Sprintf("Task failed: %v", processErr))
+			ctx.Complete(false, processErr)
+			return processErr
+		}
+
+		ctx.Complete(true, nil)
+		return nil
+	})
 }
 func (r *Runner) downloadContext(jobID int64, contextPath, jobToken string) error {
@@ -289,6 +394,10 @@ func (r *Runner) uploadOutputs(ctx *tasks.Context, job *api.NextJobResponse) err
 			log.Printf("Failed to upload %s: %v", filePath, err)
 		} else {
 			ctx.OutputUploaded(entry.Name())
+			// Delete file after successful upload to prevent duplicate uploads
+			if err := os.Remove(filePath); err != nil {
+				log.Printf("Warning: Failed to delete file %s after upload: %v", filePath, err)
+			}
 		}
 	}
@@ -359,3 +468,21 @@ func (r *Runner) GetFingerprint() string {
 func (r *Runner) GetID() int64 {
 	return r.id
 }
+
+// SetGPULockedOut sets whether GPU use is locked out due to a detected GPU error.
+// When true, the runner will force CPU rendering for all jobs.
+func (r *Runner) SetGPULockedOut(locked bool) {
+	r.gpuLockedOutMu.Lock()
+	defer r.gpuLockedOutMu.Unlock()
+	r.gpuLockedOut = locked
+	if locked {
+		log.Printf("GPU lockout enabled: GPU rendering disabled for subsequent jobs (CPU only)")
+	}
+}
+
+// IsGPULockedOut returns whether GPU use is currently locked out.
+func (r *Runner) IsGPULockedOut() bool {
+	r.gpuLockedOutMu.RLock()
+	defer r.gpuLockedOutMu.RUnlock()
+	return r.gpuLockedOut
+}
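The lockout above is tripped by inspecting render logs for GPU faults. A minimal sketch of a line matcher that could feed `SetGPULockedOut`, with purely illustrative patterns (the runner's actual trigger strings are not shown in this diff):

```go
package main

import (
	"fmt"
	"strings"
)

// gpuErrorPatterns are illustrative substrings only; the real
// runner's trigger list may differ.
var gpuErrorPatterns = []string{
	"Illegal address", // HIP illegal-address fault mentioned in the struct comment
	"HIP error",
	"CUDA error",
}

// looksLikeGPUError reports whether a Blender log line suggests a
// GPU fault that should trip the runner's CPU-only lockout.
func looksLikeGPUError(line string) bool {
	for _, p := range gpuErrorPatterns {
		if strings.Contains(line, p) {
			return true
		}
	}
	return false
}

func main() {
	lines := []string{
		"Fra:12 Mem:512M | Rendering 3/64 samples",
		"HIP error: Illegal address during memory copy",
	}
	for _, l := range lines {
		if looksLikeGPUError(l) {
			// In the runner this would call ctx.OnGPUError(),
			// which calls SetGPULockedOut(true).
			fmt.Println("GPU error detected:", l)
		}
	}
}
```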
@@ -12,6 +12,7 @@ import (
 	"regexp"
 	"sort"
 	"strings"
+	"sync"

 	"jiggablend/internal/runner/encoding"
 )
@@ -26,6 +27,10 @@ func NewEncodeProcessor() *EncodeProcessor {
 // Process executes an encode task.
 func (p *EncodeProcessor) Process(ctx *Context) error {
+	if err := ctx.CheckCancelled(); err != nil {
+		return err
+	}
+
 	ctx.Info(fmt.Sprintf("Starting encode task: job %d", ctx.JobID))
 	log.Printf("Processing encode task %d for job %d", ctx.TaskID, ctx.JobID)
@@ -64,23 +69,18 @@ func (p *EncodeProcessor) Process(ctx *Context) error {
 		ctx.Info(fmt.Sprintf("File: %s (type: %s, size: %d)", file.FileName, file.FileType, file.FileSize))
 	}

-	// Determine source format based on output format
-	sourceFormat := "exr"
+	// Encode from EXR frames only
 	fileExt := ".exr"

-	// Find and deduplicate frame files (EXR or PNG)
 	frameFileSet := make(map[string]bool)
 	var frameFilesList []string
 	for _, file := range files {
 		if file.FileType == "output" && strings.HasSuffix(strings.ToLower(file.FileName), fileExt) {
-			// Deduplicate by filename
 			if !frameFileSet[file.FileName] {
 				frameFileSet[file.FileName] = true
 				frameFilesList = append(frameFilesList, file.FileName)
 			}
 		}
 	}

 	if len(frameFilesList) == 0 {
 		// Log why no files matched (deduplicate for error reporting)
 		outputFileSet := make(map[string]bool)
@@ -103,37 +103,61 @@ func (p *EncodeProcessor) Process(ctx *Context) error {
 				}
 			}
 		}
-		ctx.Error(fmt.Sprintf("no %s frame files found for encode: found %d total files, %d unique output files, %d unique %s files (with other types)", strings.ToUpper(fileExt[1:]), len(files), len(outputFiles), len(frameFilesOtherType), strings.ToUpper(fileExt[1:])))
+		ctx.Error(fmt.Sprintf("no EXR frame files found for encode: found %d total files, %d unique output files, %d unique EXR files (with other types)", len(files), len(outputFiles), len(frameFilesOtherType)))
 		if len(outputFiles) > 0 {
 			ctx.Error(fmt.Sprintf("Output files found: %v", outputFiles))
 		}
 		if len(frameFilesOtherType) > 0 {
-			ctx.Error(fmt.Sprintf("%s files with wrong type: %v", strings.ToUpper(fileExt[1:]), frameFilesOtherType))
+			ctx.Error(fmt.Sprintf("EXR files with wrong type: %v", frameFilesOtherType))
 		}
-		err := fmt.Errorf("no %s frame files found for encode", strings.ToUpper(fileExt[1:]))
+		err := fmt.Errorf("no EXR frame files found for encode")
 		return err
 	}

-	ctx.Info(fmt.Sprintf("Found %d %s frames for encode", len(frameFilesList), strings.ToUpper(fileExt[1:])))
+	ctx.Info(fmt.Sprintf("Found %d EXR frames for encode", len(frameFilesList)))

-	// Download frames
-	ctx.Info(fmt.Sprintf("Downloading %d %s frames for encode...", len(frameFilesList), strings.ToUpper(fileExt[1:])))
+	// Download frames with bounded parallelism (8 concurrent downloads)
+	const downloadWorkers = 8
+	ctx.Info(fmt.Sprintf("Downloading %d EXR frames for encode...", len(frameFilesList)))
+
+	type result struct {
+		path string
+		err  error
+	}
+	results := make([]result, len(frameFilesList))
+	var wg sync.WaitGroup
+	sem := make(chan struct{}, downloadWorkers)
+	for i, fileName := range frameFilesList {
+		wg.Add(1)
+		go func(i int, fileName string) {
+			defer wg.Done()
+			sem <- struct{}{}
+			defer func() { <-sem }()
+			framePath := filepath.Join(workDir, fileName)
+			err := ctx.Manager.DownloadFrame(ctx.JobID, fileName, framePath)
+			if err != nil {
+				ctx.Error(fmt.Sprintf("Failed to download EXR frame %s: %v", fileName, err))
+				log.Printf("Failed to download EXR frame for encode %s: %v", fileName, err)
+				results[i] = result{"", err}
+				return
+			}
+			results[i] = result{framePath, nil}
+		}(i, fileName)
+	}
+	wg.Wait()

 	var frameFiles []string
-	for i, fileName := range frameFilesList {
-		ctx.Info(fmt.Sprintf("Downloading frame %d/%d: %s", i+1, len(frameFilesList), fileName))
-		framePath := filepath.Join(workDir, fileName)
-		if err := ctx.Manager.DownloadFrame(ctx.JobID, fileName, framePath); err != nil {
-			ctx.Error(fmt.Sprintf("Failed to download %s frame %s: %v", strings.ToUpper(fileExt[1:]), fileName, err))
-			log.Printf("Failed to download %s frame for encode %s: %v", strings.ToUpper(fileExt[1:]), fileName, err)
-			continue
-		}
-		ctx.Info(fmt.Sprintf("Successfully downloaded frame %d/%d: %s", i+1, len(frameFilesList), fileName))
-		frameFiles = append(frameFiles, framePath)
+	for _, r := range results {
+		if r.err == nil && r.path != "" {
+			frameFiles = append(frameFiles, r.path)
+		}
+	}
+
+	if err := ctx.CheckCancelled(); err != nil {
+		return err
 	}

 	if len(frameFiles) == 0 {
-		err := fmt.Errorf("failed to download any %s frames for encode", strings.ToUpper(fileExt[1:]))
+		err := fmt.Errorf("failed to download any EXR frames for encode")
 		ctx.Error(err.Error())
 		return err
 	}
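The download loop above uses the standard WaitGroup-plus-buffered-channel semaphore idiom, with a per-index results slice so no mutex is needed. Extracted as a standalone sketch:

```go
package main

import (
	"fmt"
	"sync"
)

// boundedParallel runs fn for every item with at most `workers`
// goroutines doing work at once: a buffered channel acts as a
// counting semaphore, and a WaitGroup waits for all of them.
func boundedParallel(items []string, workers int, fn func(i int, item string)) {
	var wg sync.WaitGroup
	sem := make(chan struct{}, workers)
	for i, item := range items {
		wg.Add(1)
		go func(i int, item string) {
			defer wg.Done()
			sem <- struct{}{}        // acquire a worker slot
			defer func() { <-sem }() // release it when done
			fn(i, item)
		}(i, item)
	}
	wg.Wait()
}

func main() {
	frames := []string{"frame_0001.exr", "frame_0002.exr", "frame_0003.exr"}
	// Each goroutine writes only its own index, so the slice
	// needs no locking (same trick as the results slice above).
	results := make([]string, len(frames))
	boundedParallel(frames, 2, func(i int, name string) {
		results[i] = name // stand-in for a DownloadFrame call
	})
	fmt.Println(results)
}
```

Writing into distinct indices of a pre-sized slice is race-free in Go, which is why the processor can collect results without a mutex and still preserve input order.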
@@ -141,11 +165,9 @@ func (p *EncodeProcessor) Process(ctx *Context) error {
 	sort.Strings(frameFiles)
 	ctx.Info(fmt.Sprintf("Downloaded %d frames", len(frameFiles)))

-	// Check if EXR files have alpha channel and HDR content (only for EXR source format)
+	// Check if EXR files have alpha channel (for encode decision)
 	hasAlpha := false
-	hasHDR := false
-	if sourceFormat == "exr" {
-		// Check first frame for alpha channel and HDR using ffprobe
+	{
 		firstFrame := frameFiles[0]
 		hasAlpha = detectAlphaChannel(ctx, firstFrame)
 		if hasAlpha {
@@ -153,45 +175,28 @@ func (p *EncodeProcessor) Process(ctx *Context) error {
 		} else {
 			ctx.Info("No alpha channel detected in EXR files")
 		}
-
-		hasHDR = detectHDR(ctx, firstFrame)
-		if hasHDR {
-			ctx.Info("Detected HDR content in EXR files")
-		} else {
-			ctx.Info("No HDR content detected in EXR files (SDR range)")
-		}
 	}

 	// Generate video
-	// Use alpha if:
-	// 1. User explicitly enabled it OR source has alpha channel AND
-	// 2. Codec supports alpha (AV1 or VP9)
-	preserveAlpha := ctx.ShouldPreserveAlpha()
-	useAlpha := (preserveAlpha || hasAlpha) && (outputFormat == "EXR_AV1_MP4" || outputFormat == "EXR_VP9_WEBM")
-	if (preserveAlpha || hasAlpha) && outputFormat == "EXR_264_MP4" {
-		ctx.Warn("Alpha channel requested/detected but H.264 does not support alpha. Consider using EXR_AV1_MP4 or EXR_VP9_WEBM to preserve alpha.")
-	}
-	if preserveAlpha && !hasAlpha {
-		ctx.Warn("Alpha preservation requested but no alpha channel detected in EXR files.")
+	// Use alpha when source EXR has alpha and codec supports it (AV1 or VP9). H.264 does not support alpha.
+	useAlpha := hasAlpha && (outputFormat == "EXR_AV1_MP4" || outputFormat == "EXR_VP9_WEBM")
+	if hasAlpha && outputFormat == "EXR_264_MP4" {
+		ctx.Warn("Alpha channel detected in EXR but H.264 does not support alpha. Use EXR_AV1_MP4 or EXR_VP9_WEBM to preserve alpha in video.")
 	}
 	if useAlpha {
-		if preserveAlpha && hasAlpha {
-			ctx.Info("Alpha preservation enabled: Using alpha channel encoding")
-		} else if hasAlpha {
-			ctx.Info("Alpha channel detected - automatically enabling alpha encoding")
-		}
+		ctx.Info("Alpha channel detected - encoding with alpha (AV1/VP9)")
 	}

 	var outputExt string
 	switch outputFormat {
 	case "EXR_VP9_WEBM":
 		outputExt = "webm"
-		ctx.Info("Encoding WebM video with VP9 codec (with alpha channel and HDR support)...")
+		ctx.Info("Encoding WebM video with VP9 codec (alpha, HDR)...")
 	case "EXR_AV1_MP4":
 		outputExt = "mp4"
-		ctx.Info("Encoding MP4 video with AV1 codec (with alpha channel)...")
+		ctx.Info("Encoding MP4 video with AV1 codec (alpha, HDR)...")
 	default:
 		outputExt = "mp4"
-		ctx.Info("Encoding MP4 video with H.264 codec...")
+		ctx.Info("Encoding MP4 video with H.264 codec (HDR, HLG)...")
 	}

 	outputVideo := filepath.Join(workDir, fmt.Sprintf("output_%d.%s", ctx.JobID, outputExt))
@@ -231,11 +236,6 @@ func (p *EncodeProcessor) Process(ctx *Context) error {
 		// Pass 1
 		ctx.Info("Pass 1/2: Analyzing content for optimal encode...")
 		softEncoder := encoder.(*encoding.SoftwareEncoder)
-
-		// Use HDR if: user explicitly enabled it OR HDR content was detected
-		preserveHDR := (ctx.ShouldPreserveHDR() || hasHDR) && sourceFormat == "exr"
-		if hasHDR && !ctx.ShouldPreserveHDR() {
-			ctx.Info("HDR content detected - automatically enabling HDR preservation")
-		}
 		pass1Cmd := softEncoder.BuildPass1Command(&encoding.EncodeConfig{
 			InputPattern: patternPath,
 			OutputPath:   outputVideo,
@@ -244,8 +244,6 @@ func (p *EncodeProcessor) Process(ctx *Context) error {
 			WorkDir:      workDir,
 			UseAlpha:     useAlpha,
 			TwoPass:      true,
-			SourceFormat: sourceFormat,
-			PreserveHDR:  preserveHDR,
 		})
 		if err := pass1Cmd.Run(); err != nil {
 			ctx.Warn(fmt.Sprintf("Pass 1 completed (warnings expected): %v", err))
@@ -254,15 +252,6 @@ func (p *EncodeProcessor) Process(ctx *Context) error {
 		// Pass 2
 		ctx.Info("Pass 2/2: Encoding with optimal quality...")
-
-		preserveHDR = (ctx.ShouldPreserveHDR() || hasHDR) && sourceFormat == "exr"
-		if preserveHDR {
-			if hasHDR && !ctx.ShouldPreserveHDR() {
-				ctx.Info("HDR preservation enabled (auto-detected): Using HLG transfer with bt709 primaries")
-			} else {
-				ctx.Info("HDR preservation enabled: Using HLG transfer with bt709 primaries")
-			}
-		}
 		config := &encoding.EncodeConfig{
 			InputPattern: patternPath,
 			OutputPath:   outputVideo,
@@ -271,8 +260,6 @@ func (p *EncodeProcessor) Process(ctx *Context) error {
 			WorkDir:      workDir,
 			UseAlpha:     useAlpha,
 			TwoPass:      true, // Software encoding always uses 2-pass for quality
-			SourceFormat: sourceFormat,
-			PreserveHDR:  preserveHDR,
 		}

 		cmd := encoder.BuildCommand(config)
@@ -294,6 +281,8 @@ func (p *EncodeProcessor) Process(ctx *Context) error {
 	if err := cmd.Start(); err != nil {
 		return fmt.Errorf("failed to start encode command: %w", err)
 	}
+	stopMonitor := ctx.StartCancellationMonitor(cmd, "encode")
+	defer stopMonitor()
 	ctx.Processes.Track(ctx.TaskID, cmd)
 	defer ctx.Processes.Untrack(ctx.TaskID)
@@ -329,6 +318,9 @@ func (p *EncodeProcessor) Process(ctx *Context) error {
 	<-stderrDone

 	if err != nil {
+		if cancelled, checkErr := ctx.IsJobCancelled(); checkErr == nil && cancelled {
+			return ErrJobCancelled
+		}
 		var errMsg string
 		if exitErr, ok := err.(*exec.ExitError); ok {
 			if exitErr.ExitCode() == 137 {
@@ -373,6 +365,12 @@ func (p *EncodeProcessor) Process(ctx *Context) error {
 	ctx.Info(fmt.Sprintf("Successfully uploaded %s: %s", strings.ToUpper(outputExt), filepath.Base(outputVideo)))

+	// Delete file after successful upload to prevent duplicate uploads
+	if err := os.Remove(outputVideo); err != nil {
+		log.Printf("Warning: Failed to delete video file %s after upload: %v", outputVideo, err)
+		ctx.Warn(fmt.Sprintf("Warning: Failed to delete video file after upload: %v", err))
+	}
+
 	log.Printf("Successfully generated and uploaded %s for job %d: %s", strings.ToUpper(outputExt), ctx.JobID, filepath.Base(outputVideo))
 	return nil
 }
@@ -2,12 +2,19 @@
 package tasks

 import (
+	"errors"
+	"fmt"
 	"jiggablend/internal/runner/api"
 	"jiggablend/internal/runner/blender"
 	"jiggablend/internal/runner/encoding"
 	"jiggablend/internal/runner/workspace"
 	"jiggablend/pkg/executils"
 	"jiggablend/pkg/types"
+	"os/exec"
+	"strconv"
+	"strings"
+	"sync"
+	"time"
 )

 // Processor handles a specific task type.
@@ -20,7 +27,8 @@ type Context struct {
 	TaskID   int64
 	JobID    int64
 	JobName  string
-	Frame    int
+	Frame    int // frame start (inclusive); kept for backward compat
+	FrameEnd int // frame end (inclusive); same as Frame for single-frame
 	TaskType string
 	WorkDir  string
 	JobToken string
@@ -32,13 +40,30 @@ type Context struct {
 	Blender   *blender.Manager
 	Encoder   *encoding.Selector
 	Processes *executils.ProcessTracker
+
+	// GPULockedOut is set when the runner has detected a GPU error (e.g. HIP) and disables GPU for all jobs.
+	GPULockedOut bool
+	// HasHIP is true when the runner detected HIP (AMD) devices at startup. Used to force CPU for Blender < 4.x only when HIP is present.
+	HasHIP bool
+	// GPUDetectionFailed is true when startup GPU backend detection could not run; we force CPU for all versions (could not determine HIP vs NVIDIA).
+	GPUDetectionFailed bool
+	// OnGPUError is called when a GPU error line is seen in render logs; typically sets runner GPU lockout.
+	OnGPUError func()
+	// ForceCPURendering is a runner-level override that forces CPU rendering for all jobs.
+	ForceCPURendering bool
+	// DisableHIPRT is a runner-level override that disables HIPRT acceleration in Blender.
+	DisableHIPRT bool
 }

-// NewContext creates a new task context.
+// ErrJobCancelled indicates the manager-side job was cancelled during execution.
+var ErrJobCancelled = errors.New("job cancelled")
+
+// NewContext creates a new task context. frameEnd should be >= frame; if 0 or less than frame, it is treated as single-frame (frameEnd = frame).
+// gpuLockedOut is the runner's current GPU lockout state; hasHIP means the runner has HIP (AMD) devices (force CPU for Blender < 4.x only when true); gpuDetectionFailed means detection failed at startup (force CPU for all versions, since we could not determine HIP vs NVIDIA); onGPUError is called when a GPU error is detected in logs (may be nil).
 func NewContext(
 	taskID, jobID int64,
 	jobName string,
-	frame int,
+	frameStart, frameEnd int,
 	taskType string,
 	workDir string,
 	jobToken string,
@@ -49,22 +74,38 @@ func NewContext(
 	blenderMgr *blender.Manager,
 	encoder *encoding.Selector,
 	processes *executils.ProcessTracker,
+	gpuLockedOut bool,
+	hasHIP bool,
+	gpuDetectionFailed bool,
+	forceCPURendering bool,
+	disableHIPRT bool,
+	onGPUError func(),
 ) *Context {
+	if frameEnd < frameStart {
+		frameEnd = frameStart
+	}
 	return &Context{
-		TaskID:    taskID,
-		JobID:     jobID,
-		JobName:   jobName,
-		Frame:     frame,
-		TaskType:  taskType,
-		WorkDir:   workDir,
-		JobToken:  jobToken,
-		Metadata:  metadata,
-		Manager:   manager,
-		JobConn:   jobConn,
-		Workspace: ws,
-		Blender:   blenderMgr,
-		Encoder:   encoder,
-		Processes: processes,
+		TaskID:             taskID,
+		JobID:              jobID,
+		JobName:            jobName,
+		Frame:              frameStart,
+		FrameEnd:           frameEnd,
+		TaskType:           taskType,
+		WorkDir:            workDir,
+		JobToken:           jobToken,
+		Metadata:           metadata,
+		Manager:            manager,
+		JobConn:            jobConn,
+		Workspace:          ws,
+		Blender:            blenderMgr,
+		Encoder:            encoder,
+		Processes:          processes,
+		GPULockedOut:       gpuLockedOut,
+		HasHIP:             hasHIP,
+		GPUDetectionFailed: gpuDetectionFailed,
+		ForceCPURendering:  forceCPURendering,
+		DisableHIPRT:       disableHIPRT,
+		OnGPUError:         onGPUError,
 	}
 }
@@ -145,12 +186,111 @@ func (c *Context) ShouldEnableExecution() bool {
 	return c.Metadata != nil && c.Metadata.EnableExecution != nil && *c.Metadata.EnableExecution
 }

-// ShouldPreserveHDR returns whether to preserve HDR range for EXR encoding.
-func (c *Context) ShouldPreserveHDR() bool {
-	return c.Metadata != nil && c.Metadata.PreserveHDR != nil && *c.Metadata.PreserveHDR
+// ShouldForceCPU returns true if GPU should be disabled and CPU rendering forced
+// (runner GPU lockout, GPU detection failed at startup for any version, metadata force_cpu,
+// or Blender < 4.x when the runner has HIP).
+func (c *Context) ShouldForceCPU() bool {
+	if c.ForceCPURendering {
+		return true
+	}
+	if c.GPULockedOut {
+		return true
+	}
+	// Detection failed at startup: we could not determine HIP vs NVIDIA, so force CPU for all versions.
+	if c.GPUDetectionFailed {
+		return true
+	}
+	v := c.GetBlenderVersion()
+	major := parseBlenderMajor(v)
+	isPre4 := v != "" && major >= 0 && major < 4
+	// Blender < 4.x: force CPU when runner has HIP (no official HIP support).
+	if isPre4 && c.HasHIP {
+		return true
+	}
+	if c.Metadata != nil && c.Metadata.RenderSettings.EngineSettings != nil {
+		if v, ok := c.Metadata.RenderSettings.EngineSettings["force_cpu"]; ok {
+			if b, ok := v.(bool); ok && b {
+				return true
+			}
+		}
+	}
+	return false
 }

-// ShouldPreserveAlpha returns whether to preserve alpha channel for EXR encoding.
-func (c *Context) ShouldPreserveAlpha() bool {
-	return c.Metadata != nil && c.Metadata.PreserveAlpha != nil && *c.Metadata.PreserveAlpha
+// parseBlenderMajor returns the major version number from a string like "4.2.3" or "3.6".
+// Returns -1 if the version cannot be parsed.
+func parseBlenderMajor(version string) int {
+	version = strings.TrimSpace(version)
+	if version == "" {
+		return -1
+	}
+	parts := strings.SplitN(version, ".", 2)
+	major, err := strconv.Atoi(parts[0])
+	if err != nil {
+		return -1
+	}
+	return major
+}
// IsJobCancelled checks whether the manager marked this job as cancelled.
func (c *Context) IsJobCancelled() (bool, error) {
if c.Manager == nil {
return false, nil
}
status, err := c.Manager.GetJobStatus(c.JobID)
if err != nil {
return false, err
}
return status == types.JobStatusCancelled, nil
}
// CheckCancelled returns ErrJobCancelled if the job was cancelled.
func (c *Context) CheckCancelled() error {
cancelled, err := c.IsJobCancelled()
if err != nil {
return fmt.Errorf("failed to check job status: %w", err)
}
if cancelled {
return ErrJobCancelled
}
return nil
}
// StartCancellationMonitor polls manager status and kills cmd if job is cancelled.
// Caller must invoke returned stop function when cmd exits.
func (c *Context) StartCancellationMonitor(cmd *exec.Cmd, taskLabel string) func() {
stop := make(chan struct{})
var once sync.Once
go func() {
ticker := time.NewTicker(2 * time.Second)
defer ticker.Stop()
for {
select {
case <-stop:
return
case <-ticker.C:
cancelled, err := c.IsJobCancelled()
if err != nil {
c.Warn(fmt.Sprintf("Could not check cancellation for %s task: %v", taskLabel, err))
continue
}
if !cancelled {
continue
}
c.Warn(fmt.Sprintf("Job %d was cancelled, stopping %s task early", c.JobID, taskLabel))
if cmd != nil && cmd.Process != nil {
_ = cmd.Process.Kill()
}
return
}
}
}()
return func() {
once.Do(func() {
close(stop)
})
}
} }
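The precedence in `ShouldForceCPU` (runner flag, then lockout, then failed detection, then pre-4.x HIP, then job metadata) can be sketched as a standalone program. `parseBlenderMajor` is reproduced from the diff; `renderCtx` is a hypothetical stand-in that reduces the runner's `Context` to the fields this decision actually reads:

```go
package main

import (
	"fmt"
	"strconv"
	"strings"
)

// parseBlenderMajor mirrors the helper in the diff: "4.2.3" -> 4, "" or junk -> -1.
func parseBlenderMajor(version string) int {
	version = strings.TrimSpace(version)
	if version == "" {
		return -1
	}
	parts := strings.SplitN(version, ".", 2)
	major, err := strconv.Atoi(parts[0])
	if err != nil {
		return -1
	}
	return major
}

// renderCtx is a reduced, hypothetical stand-in for the runner's Context.
type renderCtx struct {
	ForceCPURendering  bool
	GPULockedOut       bool
	GPUDetectionFailed bool
	HasHIP             bool
	BlenderVersion     string
}

// shouldForceCPU applies the same precedence order as Context.ShouldForceCPU.
func shouldForceCPU(c renderCtx) bool {
	if c.ForceCPURendering || c.GPULockedOut || c.GPUDetectionFailed {
		return true
	}
	major := parseBlenderMajor(c.BlenderVersion)
	if c.BlenderVersion != "" && major >= 0 && major < 4 && c.HasHIP {
		return true // Blender < 4.x with HIP: no official HIP support
	}
	return false
}

func main() {
	fmt.Println(shouldForceCPU(renderCtx{BlenderVersion: "3.6", HasHIP: true}))   // true
	fmt.Println(shouldForceCPU(renderCtx{BlenderVersion: "4.2.3", HasHIP: true})) // false
	fmt.Println(shouldForceCPU(renderCtx{GPUDetectionFailed: true}))              // true
}
```

Note that an unparseable version string yields `-1` and therefore does not trip the pre-4.x gate; only a confidently parsed major below 4 does.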

View File

@@ -25,11 +25,47 @@ func NewRenderProcessor() *RenderProcessor {
 	return &RenderProcessor{}
 }
 
+// gpuErrorSubstrings are log line substrings that indicate a GPU backend error (matched case-insensitively); any match triggers full GPU lockout.
+var gpuErrorSubstrings = []string{
+	"illegal address in hip", // HIP (AMD) e.g. "Illegal address in HIP" or "Illegal address in hip"
+	"hiperror",               // hipError* codes
+	"hip error",
+	"cuda error",
+	"cuerror",
+	"optix error",
+	"oneapi error",
+	"opencl error",
+}
+
+// checkGPUErrorLine checks a log line for GPU error indicators and triggers runner GPU lockout if found.
+func (p *RenderProcessor) checkGPUErrorLine(ctx *Context, line string) {
+	lower := strings.ToLower(line)
+	for _, sub := range gpuErrorSubstrings {
+		if strings.Contains(lower, sub) {
+			if ctx.OnGPUError != nil {
+				ctx.OnGPUError()
+			}
+			ctx.Warn(fmt.Sprintf("GPU error detected in log (%q); GPU disabled for subsequent jobs", sub))
+			return
+		}
+	}
+}
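The lockout trigger above is a plain case-insensitive substring scan over each log line, with the first match firing a callback. A minimal self-contained sketch (the `onGPUError` closure stands in for the runner's `ctx.OnGPUError` hook):

```go
package main

import (
	"fmt"
	"strings"
)

// gpuErrorSubstrings reproduces the list from the diff.
var gpuErrorSubstrings = []string{
	"illegal address in hip", "hiperror", "hip error",
	"cuda error", "cuerror", "optix error", "oneapi error", "opencl error",
}

// matchGPUError returns the first matching substring, or "" when the line is clean.
func matchGPUError(line string) string {
	lower := strings.ToLower(line)
	for _, sub := range gpuErrorSubstrings {
		if strings.Contains(lower, sub) {
			return sub
		}
	}
	return ""
}

func main() {
	lockedOut := false
	onGPUError := func() { lockedOut = true } // stand-in for ctx.OnGPUError

	for _, line := range []string{
		"Fra:1 Mem:512M | Rendering 1 / 64 samples", // hypothetical clean Cycles line
		"Illegal address in HIP queue copy_from_device", // hypothetical failure line
	} {
		if sub := matchGPUError(line); sub != "" {
			onGPUError()
			fmt.Printf("GPU error detected (%q)\n", sub)
		}
	}
	fmt.Println("locked out:", lockedOut)
}
```

Because the list is checked on every stdout and stderr line, one bad line anywhere in a render permanently flips the runner to CPU for subsequent jobs.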
 // Process executes a render task.
 func (p *RenderProcessor) Process(ctx *Context) error {
-	ctx.Info(fmt.Sprintf("Starting task: job %d, frame %d, format: %s",
-		ctx.JobID, ctx.Frame, ctx.GetOutputFormat()))
-	log.Printf("Processing task %d: job %d, frame %d", ctx.TaskID, ctx.JobID, ctx.Frame)
+	if err := ctx.CheckCancelled(); err != nil {
+		return err
+	}
+	if ctx.FrameEnd > ctx.Frame {
+		ctx.Info(fmt.Sprintf("Starting task: job %d, frames %d-%d, format: %s",
+			ctx.JobID, ctx.Frame, ctx.FrameEnd, ctx.GetOutputFormat()))
+		log.Printf("Processing task %d: job %d, frames %d-%d", ctx.TaskID, ctx.JobID, ctx.Frame, ctx.FrameEnd)
+	} else {
+		ctx.Info(fmt.Sprintf("Starting task: job %d, frame %d, format: %s",
+			ctx.JobID, ctx.Frame, ctx.GetOutputFormat()))
+		log.Printf("Processing task %d: job %d, frame %d", ctx.TaskID, ctx.JobID, ctx.Frame)
+	}
 
 	// Find .blend file
 	blendFile, err := workspace.FindFirstBlendFile(ctx.WorkDir)
@@ -64,11 +100,22 @@ func (p *RenderProcessor) Process(ctx *Context) error {
 		return fmt.Errorf("failed to create Blender home directory: %w", err)
 	}
 
-	// Determine render format
-	outputFormat := ctx.GetOutputFormat()
-	renderFormat := outputFormat
-	if outputFormat == "EXR_264_MP4" || outputFormat == "EXR_AV1_MP4" || outputFormat == "EXR_VP9_WEBM" {
-		renderFormat = "EXR" // Use EXR for maximum quality
+	// We always render EXR (linear) for VFX accuracy; job output_format is the deliverable (EXR sequence or video).
+	renderFormat := "EXR"
+
+	if ctx.ShouldForceCPU() {
+		v := ctx.GetBlenderVersion()
+		major := parseBlenderMajor(v)
+		isPre4 := v != "" && major >= 0 && major < 4
+		if ctx.ForceCPURendering {
+			ctx.Info("Runner compatibility flag is enabled: forcing CPU rendering for this job")
+		} else if ctx.GPUDetectionFailed {
+			ctx.Info("GPU backend detection failed at startup—we could not determine whether this machine has HIP (AMD) or NVIDIA GPUs, so rendering will use CPU to avoid compatibility issues")
+		} else if isPre4 && ctx.HasHIP {
+			ctx.Info("Blender < 4.x has no official HIP support: using CPU rendering only")
+		} else {
+			ctx.Info("GPU lockout active: using CPU rendering only")
+		}
 	}
 
 	// Create render script
@@ -77,18 +124,30 @@ func (p *RenderProcessor) Process(ctx *Context) error {
 	}
 
 	// Render
-	ctx.Info(fmt.Sprintf("Starting Blender render for frame %d...", ctx.Frame))
+	if ctx.FrameEnd > ctx.Frame {
+		ctx.Info(fmt.Sprintf("Starting Blender render for frames %d-%d...", ctx.Frame, ctx.FrameEnd))
+	} else {
+		ctx.Info(fmt.Sprintf("Starting Blender render for frame %d...", ctx.Frame))
+	}
 	if err := p.runBlender(ctx, blenderBinary, blendFile, outputDir, renderFormat, blenderHome); err != nil {
+		if errors.Is(err, ErrJobCancelled) {
+			ctx.Warn("Render stopped because job was cancelled")
+			return err
+		}
 		ctx.Error(fmt.Sprintf("Blender render failed: %v", err))
 		return err
 	}
 
-	// Verify output
-	if _, err := p.findOutputFile(ctx, outputDir, renderFormat); err != nil {
+	// Verify output (range or single frame)
+	if err := p.verifyOutputRange(ctx, outputDir, renderFormat); err != nil {
 		ctx.Error(fmt.Sprintf("Output verification failed: %v", err))
 		return err
 	}
 
-	ctx.Info(fmt.Sprintf("Blender render completed for frame %d", ctx.Frame))
+	if ctx.FrameEnd > ctx.Frame {
+		ctx.Info(fmt.Sprintf("Blender render completed for frames %d-%d", ctx.Frame, ctx.FrameEnd))
+	} else {
+		ctx.Info(fmt.Sprintf("Blender render completed for frame %d", ctx.Frame))
+	}
 	return nil
 }
@@ -116,22 +175,31 @@ func (p *RenderProcessor) createRenderScript(ctx *Context, renderFormat string)
 		return errors.New(errMsg)
 	}
 
-	// Write output format
-	outputFormat := ctx.GetOutputFormat()
-	ctx.Info(fmt.Sprintf("Writing output format '%s' to format file", outputFormat))
-	if err := os.WriteFile(formatFilePath, []byte(outputFormat), 0644); err != nil {
+	// Write EXR to format file so Blender script sets OPEN_EXR (job output_format is for downstream deliverable only).
+	ctx.Info("Writing output format 'EXR' to format file")
+	if err := os.WriteFile(formatFilePath, []byte("EXR"), 0644); err != nil {
 		errMsg := fmt.Sprintf("failed to create format file: %v", err)
 		ctx.Error(errMsg)
 		return errors.New(errMsg)
 	}
 
-	// Write render settings if available
+	// Write render settings: merge job metadata with runner force_cpu (GPU lockout)
+	var settingsMap map[string]interface{}
 	if ctx.Metadata != nil && ctx.Metadata.RenderSettings.EngineSettings != nil {
-		settingsJSON, err := json.Marshal(ctx.Metadata.RenderSettings)
-		if err == nil {
-			if err := os.WriteFile(renderSettingsFilePath, settingsJSON, 0644); err != nil {
-				ctx.Warn(fmt.Sprintf("Failed to write render settings file: %v", err))
-			}
+		raw, err := json.Marshal(ctx.Metadata.RenderSettings)
+		if err == nil {
+			_ = json.Unmarshal(raw, &settingsMap)
+		}
+	}
+	if settingsMap == nil {
+		settingsMap = make(map[string]interface{})
+	}
+	settingsMap["force_cpu"] = ctx.ShouldForceCPU()
+	settingsMap["disable_hiprt"] = ctx.DisableHIPRT
+	settingsJSON, err := json.Marshal(settingsMap)
+	if err == nil {
+		if err := os.WriteFile(renderSettingsFilePath, settingsJSON, 0644); err != nil {
+			ctx.Warn(fmt.Sprintf("Failed to write render settings file: %v", err))
 		}
 	}
@@ -151,17 +219,19 @@ func (p *RenderProcessor) runBlender(ctx *Context, blenderBinary, blendFile, out
 	outputAbsPattern, _ := filepath.Abs(outputPattern)
 	args = append(args, "-o", outputAbsPattern)
-	args = append(args, "-f", fmt.Sprintf("%d", ctx.Frame))
+	// Render single frame or range: -f N for one frame, -s start -e end -a for range
+	if ctx.FrameEnd > ctx.Frame {
+		args = append(args, "-s", fmt.Sprintf("%d", ctx.Frame), "-e", fmt.Sprintf("%d", ctx.FrameEnd), "-a")
+	} else {
+		args = append(args, "-f", fmt.Sprintf("%d", ctx.Frame))
+	}
 
-	// Wrap with xvfb-run
-	xvfbArgs := []string{"-a", "-s", "-screen 0 800x600x24", blenderBinary}
-	xvfbArgs = append(xvfbArgs, args...)
-	cmd := exec.Command("xvfb-run", xvfbArgs...)
+	cmd := exec.Command(blenderBinary, args...)
 	cmd.Dir = ctx.WorkDir
 
-	// Set up environment with custom HOME directory
+	// Set up environment: LD_LIBRARY_PATH for tarball Blender, then custom HOME
 	env := os.Environ()
-	// Remove existing HOME if present and add our custom one
+	env = blender.TarballEnv(blenderBinary, env)
 	newEnv := make([]string, 0, len(env)+1)
 	for _, e := range env {
 		if !strings.HasPrefix(e, "HOME=") {
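The frame-argument selection maps directly onto Blender's documented CLI flags: `-f N` renders one frame, while `-s start -e end -a` renders the animation over a range. Extracted as a pure function for illustration:

```go
package main

import "fmt"

// blenderFrameArgs mirrors the selection in runBlender: a single frame uses
// `-f N`, a range uses `-s start -e end -a` (render animation over the range).
func blenderFrameArgs(frameStart, frameEnd int) []string {
	if frameEnd > frameStart {
		return []string{"-s", fmt.Sprint(frameStart), "-e", fmt.Sprint(frameEnd), "-a"}
	}
	return []string{"-f", fmt.Sprint(frameStart)}
}

func main() {
	fmt.Println(blenderFrameArgs(7, 7))  // [-f 7]
	fmt.Println(blenderFrameArgs(1, 10)) // [-s 1 -e 10 -a]
}
```

Note that Blender evaluates `-s`/`-e` before `-a`, so flag order matters in the real invocation; the diff keeps that ordering.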
@@ -185,12 +255,14 @@ func (p *RenderProcessor) runBlender(ctx *Context, blenderBinary, blendFile, out
 	if err := cmd.Start(); err != nil {
 		return fmt.Errorf("failed to start blender: %w", err)
 	}
+	stopMonitor := ctx.StartCancellationMonitor(cmd, "render")
+	defer stopMonitor()
 
 	// Track process
 	ctx.Processes.Track(ctx.TaskID, cmd)
 	defer ctx.Processes.Untrack(ctx.TaskID)
 
-	// Stream stdout
+	// Stream stdout and watch for GPU error lines (lock out all GPU on any backend error)
 	stdoutDone := make(chan bool)
 	go func() {
 		defer close(stdoutDone)
@@ -198,6 +270,7 @@ func (p *RenderProcessor) runBlender(ctx *Context, blenderBinary, blendFile, out
 		for scanner.Scan() {
 			line := scanner.Text()
 			if line != "" {
+				p.checkGPUErrorLine(ctx, line)
 				shouldFilter, logLevel := blender.FilterLog(line)
 				if !shouldFilter {
 					ctx.Log(logLevel, line)
@@ -206,7 +279,7 @@ func (p *RenderProcessor) runBlender(ctx *Context, blenderBinary, blendFile, out
 		}
 	}()
 
-	// Stream stderr
+	// Stream stderr and watch for GPU error lines
 	stderrDone := make(chan bool)
 	go func() {
 		defer close(stderrDone)
@@ -214,6 +287,7 @@ func (p *RenderProcessor) runBlender(ctx *Context, blenderBinary, blendFile, out
 		for scanner.Scan() {
 			line := scanner.Text()
 			if line != "" {
+				p.checkGPUErrorLine(ctx, line)
 				shouldFilter, logLevel := blender.FilterLog(line)
 				if !shouldFilter {
 					if logLevel == types.LogLevelInfo {
@@ -231,6 +305,9 @@ func (p *RenderProcessor) runBlender(ctx *Context, blenderBinary, blendFile, out
 	<-stderrDone
 
 	if err != nil {
+		if cancelled, checkErr := ctx.IsJobCancelled(); checkErr == nil && cancelled {
+			return ErrJobCancelled
+		}
 		if exitErr, ok := err.(*exec.ExitError); ok {
 			if exitErr.ExitCode() == 137 {
 				return errors.New("Blender was killed due to excessive memory usage (OOM)")
@@ -242,60 +319,64 @@ func (p *RenderProcessor) runBlender(ctx *Context, blenderBinary, blendFile, out
 	return nil
 }
 
-func (p *RenderProcessor) findOutputFile(ctx *Context, outputDir, renderFormat string) (string, error) {
+// verifyOutputRange checks that output files exist for the task's frame range (first and last at minimum).
+func (p *RenderProcessor) verifyOutputRange(ctx *Context, outputDir, renderFormat string) error {
 	entries, err := os.ReadDir(outputDir)
 	if err != nil {
-		return "", fmt.Errorf("failed to read output directory: %w", err)
+		return fmt.Errorf("failed to read output directory: %w", err)
 	}
 	ctx.Info("Checking output directory for files...")
+	ext := strings.ToLower(renderFormat)
 
-	// Try exact match first
-	expectedFile := filepath.Join(outputDir, fmt.Sprintf("frame_%04d.%s", ctx.Frame, strings.ToLower(renderFormat)))
-	if _, err := os.Stat(expectedFile); err == nil {
-		ctx.Info(fmt.Sprintf("Found output file: %s", filepath.Base(expectedFile)))
-		return expectedFile, nil
-	}
-
-	// Try without zero padding
-	altFile := filepath.Join(outputDir, fmt.Sprintf("frame_%d.%s", ctx.Frame, strings.ToLower(renderFormat)))
-	if _, err := os.Stat(altFile); err == nil {
-		ctx.Info(fmt.Sprintf("Found output file: %s", filepath.Base(altFile)))
-		return altFile, nil
-	}
-
-	// Try just frame number
-	altFile2 := filepath.Join(outputDir, fmt.Sprintf("%04d.%s", ctx.Frame, strings.ToLower(renderFormat)))
-	if _, err := os.Stat(altFile2); err == nil {
-		ctx.Info(fmt.Sprintf("Found output file: %s", filepath.Base(altFile2)))
-		return altFile2, nil
-	}
-
-	// Search through all files
-	for _, entry := range entries {
-		if !entry.IsDir() {
-			fileName := entry.Name()
-			if strings.Contains(fileName, "%04d") || strings.Contains(fileName, "%d") {
-				ctx.Warn(fmt.Sprintf("Skipping file with literal pattern: %s", fileName))
-				continue
-			}
-			frameStr := fmt.Sprintf("%d", ctx.Frame)
-			frameStrPadded := fmt.Sprintf("%04d", ctx.Frame)
-			if strings.Contains(fileName, frameStrPadded) ||
-				(strings.Contains(fileName, frameStr) && strings.HasSuffix(strings.ToLower(fileName), strings.ToLower(renderFormat))) {
-				outputFile := filepath.Join(outputDir, fileName)
-				ctx.Info(fmt.Sprintf("Found output file: %s", fileName))
-				return outputFile, nil
-			}
-		}
-	}
-
-	// Not found
-	fileList := []string{}
-	for _, entry := range entries {
-		if !entry.IsDir() {
-			fileList = append(fileList, entry.Name())
-		}
-	}
-	return "", fmt.Errorf("output file not found: %s\nFiles in output directory: %v", expectedFile, fileList)
+	// Check first and last frame in range (minimum required for range; single frame = one check)
+	framesToCheck := []int{ctx.Frame}
+	if ctx.FrameEnd > ctx.Frame {
+		framesToCheck = append(framesToCheck, ctx.FrameEnd)
+	}
+	for _, frame := range framesToCheck {
+		found := false
+		// Try frame_0001.ext, frame_1.ext, 0001.ext
+		for _, name := range []string{
+			fmt.Sprintf("frame_%04d.%s", frame, ext),
+			fmt.Sprintf("frame_%d.%s", frame, ext),
+			fmt.Sprintf("%04d.%s", frame, ext),
+		} {
+			if _, err := os.Stat(filepath.Join(outputDir, name)); err == nil {
+				found = true
+				ctx.Info(fmt.Sprintf("Found output file: %s", name))
+				break
+			}
+		}
+		if !found {
+			// Search entries for this frame number
+			frameStr := fmt.Sprintf("%d", frame)
+			frameStrPadded := fmt.Sprintf("%04d", frame)
+			for _, entry := range entries {
+				if entry.IsDir() {
+					continue
+				}
+				fileName := entry.Name()
+				if strings.Contains(fileName, "%04d") || strings.Contains(fileName, "%d") {
+					continue
+				}
+				if (strings.Contains(fileName, frameStrPadded) ||
+					strings.Contains(fileName, frameStr)) && strings.HasSuffix(strings.ToLower(fileName), ext) {
+					found = true
+					ctx.Info(fmt.Sprintf("Found output file: %s", fileName))
+					break
+				}
+			}
+		}
+		if !found {
+			fileList := []string{}
+			for _, e := range entries {
+				if !e.IsDir() {
+					fileList = append(fileList, e.Name())
+				}
+			}
+			return fmt.Errorf("output file for frame %d not found; files in output directory: %v", frame, fileList)
+		}
+	}
+	return nil
 }
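The verification probes a fixed set of candidate filenames per frame before falling back to a directory scan. The candidate list can be isolated as a small helper:

```go
package main

import (
	"fmt"
	"strings"
)

// candidateNames lists the exact filenames verifyOutputRange probes for one
// frame before falling back to a substring scan of the directory listing.
func candidateNames(frame int, renderFormat string) []string {
	ext := strings.ToLower(renderFormat)
	return []string{
		fmt.Sprintf("frame_%04d.%s", frame, ext), // zero-padded, prefixed
		fmt.Sprintf("frame_%d.%s", frame, ext),   // unpadded, prefixed
		fmt.Sprintf("%04d.%s", frame, ext),       // zero-padded, bare
	}
}

func main() {
	fmt.Println(candidateNames(7, "EXR")) // [frame_0007.exr frame_7.exr 0007.exr]
}
```

Checking only the first and last frame of a range is a deliberate trade-off: it catches a render that aborted partway without paying for a stat call on every intermediate frame.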

View File

@@ -60,7 +60,7 @@ func (s *Storage) TempDir(pattern string) (string, error) {
 	if err := os.MkdirAll(s.tempPath(), 0755); err != nil {
 		return "", fmt.Errorf("failed to create temp directory: %w", err)
 	}
 	// Create temp directory under storage base path
 	return os.MkdirTemp(s.tempPath(), pattern)
 }
@@ -166,12 +166,12 @@ func (s *Storage) GetFileSize(filePath string) (int64, error) {
 // Returns a list of all extracted file paths
 func (s *Storage) ExtractZip(zipPath, destDir string) ([]string, error) {
 	log.Printf("Extracting ZIP archive: %s -> %s", zipPath, destDir)
 	// Ensure destination directory exists
 	if err := os.MkdirAll(destDir, 0755); err != nil {
 		return nil, fmt.Errorf("failed to create destination directory: %w", err)
 	}
 	r, err := zip.OpenReader(zipPath)
 	if err != nil {
 		return nil, fmt.Errorf("failed to open ZIP file: %w", err)
@@ -187,7 +187,7 @@ func (s *Storage) ExtractZip(zipPath, destDir string) ([]string, error) {
 	for _, f := range r.File {
 		// Sanitize file path to prevent directory traversal
 		destPath := filepath.Join(destDir, f.Name)
 		cleanDestPath := filepath.Clean(destPath)
 		cleanDestDir := filepath.Clean(destDir)
 		if !strings.HasPrefix(cleanDestPath, cleanDestDir+string(os.PathSeparator)) && cleanDestPath != cleanDestDir {
@@ -520,7 +520,7 @@ func (s *Storage) CreateJobContextFromDir(sourceDir string, jobID int64, exclude
 		if commonPrefix != "" && strings.HasPrefix(tarPath, commonPrefix) {
 			tarPath = strings.TrimPrefix(tarPath, commonPrefix)
 		}
 		// Check if it's a .blend file at root (no path separators after prefix stripping)
 		if strings.HasSuffix(strings.ToLower(tarPath), ".blend") {
 			// Check if it's at root level (no directory separators)
@@ -566,7 +566,7 @@ func (s *Storage) CreateJobContextFromDir(sourceDir string, jobID int64, exclude
 		// Get relative path and strip common prefix if present
 		relPath := relPaths[i]
 		tarPath := filepath.ToSlash(relPath)
 		// Strip common prefix if found
 		if commonPrefix != "" && strings.HasPrefix(tarPath, commonPrefix) {
 			tarPath = strings.TrimPrefix(tarPath, commonPrefix)
@@ -608,3 +608,129 @@ func (s *Storage) CreateJobContextFromDir(sourceDir string, jobID int64, exclude
 	return contextPath, nil
 }
// CreateContextArchiveFromDirToPath creates a context archive from files in sourceDir at destPath.
// This is used for pre-job upload sessions where the archive is staged before a job ID exists.
func (s *Storage) CreateContextArchiveFromDirToPath(sourceDir, destPath string, excludeFiles ...string) (string, error) {
excludeSet := make(map[string]bool)
for _, excludeFile := range excludeFiles {
excludePath := filepath.Clean(excludeFile)
excludeSet[excludePath] = true
excludeSet[filepath.ToSlash(excludePath)] = true
}
var filesToInclude []string
err := filepath.Walk(sourceDir, func(path string, info os.FileInfo, err error) error {
if err != nil {
return err
}
if info.IsDir() {
return nil
}
if isBlenderSaveFile(info.Name()) {
return nil
}
relPath, err := filepath.Rel(sourceDir, path)
if err != nil {
return err
}
cleanRelPath := filepath.Clean(relPath)
if strings.HasPrefix(cleanRelPath, "..") {
return fmt.Errorf("invalid file path: %s", relPath)
}
if excludeSet[cleanRelPath] || excludeSet[filepath.ToSlash(cleanRelPath)] {
return nil
}
filesToInclude = append(filesToInclude, path)
return nil
})
if err != nil {
return "", fmt.Errorf("failed to walk source directory: %w", err)
}
if len(filesToInclude) == 0 {
return "", fmt.Errorf("no files found to include in context archive")
}
relPaths := make([]string, 0, len(filesToInclude))
for _, filePath := range filesToInclude {
relPath, err := filepath.Rel(sourceDir, filePath)
if err != nil {
return "", fmt.Errorf("failed to get relative path: %w", err)
}
relPaths = append(relPaths, relPath)
}
commonPrefix := findCommonPrefix(relPaths)
blendFilesAtRoot := 0
for _, relPath := range relPaths {
tarPath := filepath.ToSlash(relPath)
if commonPrefix != "" && strings.HasPrefix(tarPath, commonPrefix) {
tarPath = strings.TrimPrefix(tarPath, commonPrefix)
}
if strings.HasSuffix(strings.ToLower(tarPath), ".blend") && !strings.Contains(tarPath, "/") {
blendFilesAtRoot++
}
}
if blendFilesAtRoot == 0 {
return "", fmt.Errorf("no .blend file found at root level in context archive - .blend files must be at the root level of the uploaded archive, not in subdirectories")
}
if blendFilesAtRoot > 1 {
return "", fmt.Errorf("multiple .blend files found at root level in context archive (found %d, expected 1)", blendFilesAtRoot)
}
contextFile, err := os.Create(destPath)
if err != nil {
return "", fmt.Errorf("failed to create context file: %w", err)
}
defer contextFile.Close()
tarWriter := tar.NewWriter(contextFile)
defer tarWriter.Close()
copyBuf := make([]byte, 32*1024)
for i, filePath := range filesToInclude {
file, err := os.Open(filePath)
if err != nil {
return "", fmt.Errorf("failed to open file: %w", err)
}
info, err := file.Stat()
if err != nil {
file.Close()
return "", fmt.Errorf("failed to stat file: %w", err)
}
tarPath := filepath.ToSlash(relPaths[i])
if commonPrefix != "" && strings.HasPrefix(tarPath, commonPrefix) {
tarPath = strings.TrimPrefix(tarPath, commonPrefix)
}
header, err := tar.FileInfoHeader(info, "")
if err != nil {
file.Close()
return "", fmt.Errorf("failed to create tar header: %w", err)
}
header.Name = tarPath
if err := tarWriter.WriteHeader(header); err != nil {
file.Close()
return "", fmt.Errorf("failed to write tar header: %w", err)
}
if _, err := io.CopyBuffer(tarWriter, file, copyBuf); err != nil {
file.Close()
return "", fmt.Errorf("failed to write file to tar: %w", err)
}
file.Close()
}
if err := tarWriter.Close(); err != nil {
return "", fmt.Errorf("failed to close tar writer: %w", err)
}
if err := contextFile.Close(); err != nil {
return "", fmt.Errorf("failed to close context file: %w", err)
}
return destPath, nil
}
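The archive builder enforces an invariant: after a common directory prefix is stripped, exactly one `.blend` file must sit at the root of the tar. A reduced sketch of that counting rule (the prefix is passed in directly here, whereas the real code derives it via `findCommonPrefix`, which is not shown in this diff):

```go
package main

import (
	"fmt"
	"strings"
)

// countRootBlends applies the archive invariant from the diff: after stripping
// a common directory prefix, count .blend files with no remaining "/" in them.
func countRootBlends(relPaths []string, commonPrefix string) int {
	n := 0
	for _, p := range relPaths {
		tarPath := strings.TrimPrefix(p, commonPrefix)
		if strings.HasSuffix(strings.ToLower(tarPath), ".blend") && !strings.Contains(tarPath, "/") {
			n++
		}
	}
	return n
}

func main() {
	paths := []string{
		"project/scene.blend",
		"project/textures/wood.png",
		"project/libs/asset.blend", // still nested after stripping "project/"
	}
	fmt.Println(countRootBlends(paths, "project/")) // 1
}
```

A count of 0 or more than 1 is rejected with an error before any tar bytes are written, so a bad upload fails fast instead of producing an unrenderable job.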

View File

@@ -2,10 +2,13 @@ package executils
 
 import (
 	"bufio"
+	"context"
 	"errors"
 	"fmt"
+	"io"
 	"os"
 	"os/exec"
+	"strings"
 	"sync"
 	"time"
@@ -107,6 +110,78 @@ type CommandResult struct {
 	ExitCode int
 }
// RunCommandWithTimeout is like RunCommand but kills the process after timeout.
// A zero timeout means no timeout.
func RunCommandWithTimeout(
timeout time.Duration,
cmdPath string,
args []string,
dir string,
env []string,
taskID int64,
tracker *ProcessTracker,
) (*CommandResult, error) {
if timeout <= 0 {
return RunCommand(cmdPath, args, dir, env, taskID, tracker)
}
ctx, cancel := context.WithTimeout(context.Background(), timeout)
defer cancel()
cmd := exec.CommandContext(ctx, cmdPath, args...)
cmd.Dir = dir
if env != nil {
cmd.Env = env
}
stdoutPipe, err := cmd.StdoutPipe()
if err != nil {
return nil, fmt.Errorf("failed to create stdout pipe: %w", err)
}
stderrPipe, err := cmd.StderrPipe()
if err != nil {
return nil, fmt.Errorf("failed to create stderr pipe: %w", err)
}
if err := cmd.Start(); err != nil {
return nil, fmt.Errorf("failed to start command: %w", err)
}
if tracker != nil {
tracker.Track(taskID, cmd)
defer tracker.Untrack(taskID)
}
var stdoutBuf, stderrBuf []byte
var stdoutErr, stderrErr error
var wg sync.WaitGroup
wg.Add(2)
go func() {
defer wg.Done()
stdoutBuf, stdoutErr = readAll(stdoutPipe)
}()
go func() {
defer wg.Done()
stderrBuf, stderrErr = readAll(stderrPipe)
}()
waitErr := cmd.Wait()
wg.Wait()
if stdoutErr != nil && !isBenignPipeReadError(stdoutErr) {
return nil, fmt.Errorf("failed to read stdout: %w", stdoutErr)
}
if stderrErr != nil && !isBenignPipeReadError(stderrErr) {
return nil, fmt.Errorf("failed to read stderr: %w", stderrErr)
}
result := &CommandResult{Stdout: string(stdoutBuf), Stderr: string(stderrBuf)}
if waitErr != nil {
if exitErr, ok := waitErr.(*exec.ExitError); ok {
result.ExitCode = exitErr.ExitCode()
} else {
result.ExitCode = -1
}
if ctx.Err() == context.DeadlineExceeded {
return result, fmt.Errorf("command timed out after %v: %w", timeout, waitErr)
}
return result, waitErr
}
result.ExitCode = 0
return result, nil
}
 // RunCommand executes a command and returns the output
 // If tracker is provided, the process will be registered for tracking
 // This is useful for commands where you need to capture output (like metadata extraction)
@@ -164,10 +239,10 @@ func RunCommand(
 	wg.Wait()
 
 	// Check for read errors
-	if stdoutErr != nil {
+	if stdoutErr != nil && !isBenignPipeReadError(stdoutErr) {
 		return nil, fmt.Errorf("failed to read stdout: %w", stdoutErr)
 	}
-	if stderrErr != nil {
+	if stderrErr != nil && !isBenignPipeReadError(stderrErr) {
 		return nil, fmt.Errorf("failed to read stderr: %w", stderrErr)
 	}
@@ -208,6 +283,18 @@ func readAll(r interface{ Read([]byte) (int, error) }) ([]byte, error) {
 	return buf, nil
 }
// isBenignPipeReadError treats EOF-like pipe close races as non-fatal.
func isBenignPipeReadError(err error) bool {
if err == nil {
return false
}
if errors.Is(err, io.EOF) || errors.Is(err, os.ErrClosed) || errors.Is(err, io.ErrClosedPipe) {
return true
}
// Some platforms return wrapped messages that don't map cleanly to sentinel errors.
return strings.Contains(strings.ToLower(err.Error()), "file already closed")
}
// LogSender is a function type for sending logs // LogSender is a function type for sending logs
type LogSender func(taskID int, level types.LogLevel, message string, stepName string) type LogSender func(taskID int, level types.LogLevel, message string, stepName string)

View File

@@ -0,0 +1,32 @@
package executils
import (
"errors"
"io"
"os"
"testing"
)
func TestIsBenignPipeReadError(t *testing.T) {
tests := []struct {
name string
err error
want bool
}{
{name: "nil", err: nil, want: false},
{name: "eof", err: io.EOF, want: true},
{name: "closed", err: os.ErrClosed, want: true},
{name: "closed pipe", err: io.ErrClosedPipe, want: true},
{name: "wrapped closed", err: errors.New("read |0: file already closed"), want: true},
{name: "other", err: errors.New("permission denied"), want: false},
}
for _, tc := range tests {
t.Run(tc.name, func(t *testing.T) {
got := isBenignPipeReadError(tc.err)
if got != tc.want {
t.Fatalf("got %v, want %v (err=%v)", got, tc.want, tc.err)
}
})
}
}

View File

@@ -11,3 +11,6 @@ var UnhideObjects string
 
 //go:embed scripts/render_blender.py.template
 var RenderBlenderTemplate string
//go:embed scripts/detect_gpu_backends.py
var DetectGPUBackends string

View File

@@ -0,0 +1,39 @@
# Minimal script to detect HIP (AMD) and NVIDIA (CUDA/OptiX) backends for Cycles.
# Run with: blender -b --python detect_gpu_backends.py
# Prints HAS_HIP and/or HAS_NVIDIA to stdout, one per line.
import sys
def main():
try:
prefs = bpy.context.preferences
if not hasattr(prefs, 'addons') or 'cycles' not in prefs.addons:
return
cprefs = prefs.addons['cycles'].preferences
has_hip = False
has_nvidia = False
for device_type in ('HIP', 'CUDA', 'OPTIX'):
try:
cprefs.compute_device_type = device_type
cprefs.refresh_devices()
devs = []
if hasattr(cprefs, 'get_devices'):
devs = cprefs.get_devices()
elif hasattr(cprefs, 'devices') and cprefs.devices:
devs = list(cprefs.devices) if hasattr(cprefs.devices, '__iter__') else [cprefs.devices]
if devs:
if device_type == 'HIP':
has_hip = True
if device_type in ('CUDA', 'OPTIX'):
has_nvidia = True
except Exception:
pass
if has_hip:
print('HAS_HIP', flush=True)
if has_nvidia:
print('HAS_NVIDIA', flush=True)
except Exception as e:
print('ERROR', str(e), file=sys.stderr, flush=True)
sys.exit(1)
import bpy
main()
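On the Go side, the runner's `DetectAndStoreGPUBackends` step presumably consumes this script's stdout by scanning for the two marker lines. A sketch of that parse, assuming only the `HAS_HIP`/`HAS_NVIDIA` contract the script defines (the surrounding Blender banner line is hypothetical):

```go
package main

import (
	"bufio"
	"fmt"
	"strings"
)

// parseBackendMarkers scans detect_gpu_backends.py stdout for its two markers.
// Any other output (Blender banners, addon noise) is ignored.
func parseBackendMarkers(stdout string) (hasHIP, hasNVIDIA bool) {
	sc := bufio.NewScanner(strings.NewReader(stdout))
	for sc.Scan() {
		switch strings.TrimSpace(sc.Text()) {
		case "HAS_HIP":
			hasHIP = true
		case "HAS_NVIDIA":
			hasNVIDIA = true
		}
	}
	return hasHIP, hasNVIDIA
}

func main() {
	out := "Blender 3.6.0\nHAS_HIP\n" // hypothetical blender -b output plus marker
	hip, nv := parseBackendMarkers(out)
	fmt.Println(hip, nv) // true false
}
```

Matching whole trimmed lines rather than substrings keeps an unrelated log line that merely mentions "HAS_HIP" mid-sentence from flipping the flag.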

View File

@@ -95,31 +95,20 @@ if current_device:
 print(f"Blend file output format: {current_output_format}")
 # Override output format if specified
-# The format file always takes precedence (it's written specifically for this job)
+# Render output is EXR only and must remain linear for the encode pipeline (linear -> sRGB -> HLG).
 if output_format_override:
-    print(f"Overriding output format from '{current_output_format}' to '{output_format_override}'")
-    # Map common format names to Blender's format constants
-    # For video formats, we render as appropriate frame format first
-    format_to_use = output_format_override.upper()
-    if format_to_use in ['EXR_264_MP4', 'EXR_AV1_MP4', 'EXR_VP9_WEBM']:
-        format_to_use = 'EXR'  # Render as EXR for EXR video formats
-    format_map = {
-        'PNG': 'PNG',
-        'JPEG': 'JPEG',
-        'JPG': 'JPEG',
-        'EXR': 'OPEN_EXR',
-        'OPEN_EXR': 'OPEN_EXR',
-        'TARGA': 'TARGA',
-        'TIFF': 'TIFF',
-        'BMP': 'BMP',
-    }
-    blender_format = format_map.get(format_to_use, format_to_use)
+    print(f"Overriding output format from '{current_output_format}' to OPEN_EXR (always EXR for pipeline)")
     try:
-        scene.render.image_settings.file_format = blender_format
-        print(f"Successfully set output format to: {blender_format}")
+        scene.render.image_settings.file_format = 'OPEN_EXR'
+        # Lock output color space to linear (defense in depth; EXR is linear for encode pipeline)
+        if getattr(scene.render.image_settings, 'has_linear_colorspace', False) and hasattr(scene.render.image_settings, 'linear_colorspace_settings'):
+            try:
+                scene.render.image_settings.linear_colorspace_settings.name = 'Linear'
+            except Exception as ex:
+                print(f"Note: Could not set linear output: {ex}")
+        print("Successfully set output format to: OPEN_EXR")
     except Exception as e:
-        print(f"Warning: Could not set output format to {blender_format}: {e}")
+        print(f"Warning: Could not set output format to OPEN_EXR: {e}")
         print(f"Using blend file's format: {current_output_format}")
 else:
     print(f"Using blend file's output format: {current_output_format}")
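The new comment describes a linear -> sRGB -> HLG encode chain, which is why the EXR output must stay linear: applying a display transfer function twice distorts the image. As a reference point only, the first step of that chain is the standard sRGB transfer function; a sketch of that piece, not code from this repo:

```python
def linear_to_srgb(x: float) -> float:
    """Standard sRGB transfer function (IEC 61966-2-1): linear light in [0, 1] -> encoded value."""
    if x <= 0.0031308:
        return 12.92 * x  # linear toe segment near black
    return 1.055 * x ** (1.0 / 2.4) - 0.055  # power-law segment
```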
@@ -186,9 +175,13 @@ if render_settings_override:
 if current_engine == 'CYCLES':
     # Check if CPU rendering is forced
     force_cpu = False
+    disable_hiprt = False
     if render_settings_override and render_settings_override.get('force_cpu'):
         force_cpu = render_settings_override.get('force_cpu', False)
         print("Force CPU rendering is enabled - skipping GPU detection")
+    if render_settings_override and render_settings_override.get('disable_hiprt'):
+        disable_hiprt = render_settings_override.get('disable_hiprt', False)
+        print("Disable HIPRT flag is enabled")

     # Ensure Cycles addon is enabled
     try:
@@ -332,7 +325,16 @@ if current_engine == 'CYCLES':
             try:
                 if best_device_type == 'HIP':
                     # HIPRT (HIP Ray Tracing) for AMD GPUs
-                    if hasattr(cycles_prefs, 'use_hiprt'):
+                    if disable_hiprt:
+                        if hasattr(cycles_prefs, 'use_hiprt'):
+                            cycles_prefs.use_hiprt = False
+                            print(f"  Disabled HIPRT (HIP Ray Tracing) via runner compatibility flag")
+                        elif hasattr(scene.cycles, 'use_hiprt'):
+                            scene.cycles.use_hiprt = False
+                            print(f"  Disabled HIPRT (HIP Ray Tracing) via runner compatibility flag")
+                        else:
+                            print(f"  HIPRT toggle not available on this Blender version")
+                    elif hasattr(cycles_prefs, 'use_hiprt'):
                         cycles_prefs.use_hiprt = True
                         print(f"  Enabled HIPRT (HIP Ray Tracing) for faster rendering")
                     elif hasattr(scene.cycles, 'use_hiprt'):


@@ -93,7 +93,8 @@ type Task struct {
 	ID       int64      `json:"id"`
 	JobID    int64      `json:"job_id"`
 	RunnerID *int64     `json:"runner_id,omitempty"`
-	Frame    int        `json:"frame"`
+	Frame    int        `json:"frame"`               // frame start (inclusive) for render tasks
+	FrameEnd *int       `json:"frame_end,omitempty"` // frame end (inclusive); nil = single frame
 	TaskType TaskType   `json:"task_type"`
 	Status   TaskStatus `json:"status"`
 	CurrentStep string  `json:"current_step,omitempty"`
@@ -138,8 +139,6 @@ type CreateJobRequest struct {
 	UnhideObjects   *bool   `json:"unhide_objects,omitempty"`   // Optional: Enable unhide tweaks for objects/collections
 	EnableExecution *bool   `json:"enable_execution,omitempty"` // Optional: Enable auto-execution in Blender (adds --enable-autoexec flag, defaults to false)
 	BlenderVersion  *string `json:"blender_version,omitempty"`  // Optional: Override Blender version (e.g., "4.2" or "4.2.3")
-	PreserveHDR     *bool   `json:"preserve_hdr,omitempty"`     // Optional: Preserve HDR range for EXR encoding (uses HLG with bt709 primaries)
-	PreserveAlpha   *bool   `json:"preserve_alpha,omitempty"`   // Optional: Preserve alpha channel for EXR encoding (requires AV1 or VP9 codec)
 }

 // UpdateJobProgressRequest represents a request to update job progress
@@ -234,8 +233,6 @@ type BlendMetadata struct {
 	UnhideObjects   *bool  `json:"unhide_objects,omitempty"`   // Enable unhide tweaks for objects/collections
 	EnableExecution *bool  `json:"enable_execution,omitempty"` // Enable auto-execution in Blender (adds --enable-autoexec flag, defaults to false)
 	BlenderVersion  string `json:"blender_version,omitempty"`  // Detected or overridden Blender version (e.g., "4.2" or "4.2.3")
-	PreserveHDR     *bool  `json:"preserve_hdr,omitempty"`     // Preserve HDR range for EXR encoding (uses HLG with bt709 primaries)
-	PreserveAlpha   *bool  `json:"preserve_alpha,omitempty"`   // Preserve alpha channel for EXR encoding (requires AV1 or VP9 codec)
 }

 // MissingFilesInfo represents information about missing files/addons


@@ -1,269 +0,0 @@
const API_BASE = '/api';
let currentUser = null;
// Check authentication on load
async function init() {
await checkAuth();
setupEventListeners();
if (currentUser) {
showMainPage();
loadJobs();
loadRunners();
} else {
showLoginPage();
}
}
async function checkAuth() {
try {
const response = await fetch(`${API_BASE}/auth/me`);
if (response.ok) {
currentUser = await response.json();
return true;
}
} catch (error) {
console.error('Auth check failed:', error);
}
return false;
}
function showLoginPage() {
document.getElementById('login-page').classList.remove('hidden');
document.getElementById('main-page').classList.add('hidden');
}
function showMainPage() {
document.getElementById('login-page').classList.add('hidden');
document.getElementById('main-page').classList.remove('hidden');
if (currentUser) {
document.getElementById('user-name').textContent = currentUser.name || currentUser.email;
}
}
function setupEventListeners() {
// Navigation
document.querySelectorAll('.nav-btn').forEach(btn => {
btn.addEventListener('click', (e) => {
const page = e.target.dataset.page;
switchPage(page);
});
});
// Logout
document.getElementById('logout-btn').addEventListener('click', async () => {
await fetch(`${API_BASE}/auth/logout`, { method: 'POST' });
currentUser = null;
showLoginPage();
});
// Job form
document.getElementById('job-form').addEventListener('submit', async (e) => {
e.preventDefault();
await submitJob();
});
}
function switchPage(page) {
document.querySelectorAll('.content-page').forEach(p => p.classList.add('hidden'));
document.querySelectorAll('.nav-btn').forEach(b => b.classList.remove('active'));
document.getElementById(`${page}-page`).classList.remove('hidden');
document.querySelector(`[data-page="${page}"]`).classList.add('active');
if (page === 'jobs') {
loadJobs();
} else if (page === 'runners') {
loadRunners();
}
}
async function submitJob() {
const form = document.getElementById('job-form');
const formData = new FormData(form);
const jobData = {
name: document.getElementById('job-name').value,
frame_start: parseInt(document.getElementById('frame-start').value),
frame_end: parseInt(document.getElementById('frame-end').value),
output_format: document.getElementById('output-format').value,
};
try {
// Create job
const jobResponse = await fetch(`${API_BASE}/jobs`, {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify(jobData),
});
if (!jobResponse.ok) {
throw new Error('Failed to create job');
}
const job = await jobResponse.json();
// Upload file
const fileInput = document.getElementById('blend-file');
if (fileInput.files.length > 0) {
const fileFormData = new FormData();
fileFormData.append('file', fileInput.files[0]);
const fileResponse = await fetch(`${API_BASE}/jobs/${job.id}/upload`, {
method: 'POST',
body: fileFormData,
});
if (!fileResponse.ok) {
throw new Error('Failed to upload file');
}
}
alert('Job submitted successfully!');
form.reset();
switchPage('jobs');
loadJobs();
} catch (error) {
alert('Failed to submit job: ' + error.message);
}
}
async function loadJobs() {
try {
const response = await fetch(`${API_BASE}/jobs`);
if (!response.ok) throw new Error('Failed to load jobs');
const jobs = await response.json();
displayJobs(jobs);
} catch (error) {
console.error('Failed to load jobs:', error);
}
}
function displayJobs(jobs) {
const container = document.getElementById('jobs-list');
if (jobs.length === 0) {
container.innerHTML = '<p>No jobs yet. Submit a job to get started!</p>';
return;
}
container.innerHTML = jobs.map(job => `
<div class="job-card">
<h3>${escapeHtml(job.name)}</h3>
<div class="job-meta">
<span>Frames: ${job.frame_start}-${job.frame_end}</span>
<span>Format: ${job.output_format}</span>
<span>Created: ${new Date(job.created_at).toLocaleString()}</span>
</div>
<div class="job-status ${job.status}">${job.status}</div>
<div class="progress-bar">
<div class="progress-fill" style="width: ${job.progress}%"></div>
</div>
<div class="job-actions">
<button onclick="viewJob(${job.id})" class="btn btn-primary">View Details</button>
${job.status === 'pending' || job.status === 'running' ?
`<button onclick="cancelJob(${job.id})" class="btn btn-secondary">Cancel</button>` : ''}
</div>
</div>
`).join('');
}
async function viewJob(jobId) {
try {
const response = await fetch(`${API_BASE}/jobs/${jobId}`);
if (!response.ok) throw new Error('Failed to load job');
const job = await response.json();
// Load files
const filesResponse = await fetch(`${API_BASE}/jobs/${jobId}/files`);
const files = filesResponse.ok ? await filesResponse.json() : [];
const outputFiles = files.filter(f => f.file_type === 'output');
if (outputFiles.length > 0) {
let message = 'Output files:\n';
outputFiles.forEach(file => {
message += `- ${file.file_name}\n`;
});
message += '\nWould you like to download them?';
if (confirm(message)) {
outputFiles.forEach(file => {
window.open(`${API_BASE}/jobs/${jobId}/files/${file.id}/download`, '_blank');
});
}
} else {
alert(`Job: ${job.name}\nStatus: ${job.status}\nProgress: ${job.progress.toFixed(1)}%`);
}
} catch (error) {
alert('Failed to load job details: ' + error.message);
}
}
async function cancelJob(jobId) {
if (!confirm('Are you sure you want to cancel this job?')) return;
try {
const response = await fetch(`${API_BASE}/jobs/${jobId}`, {
method: 'DELETE',
});
if (!response.ok) throw new Error('Failed to cancel job');
loadJobs();
} catch (error) {
alert('Failed to cancel job: ' + error.message);
}
}
async function loadRunners() {
try {
const response = await fetch(`${API_BASE}/runners`);
if (!response.ok) throw new Error('Failed to load runners');
const runners = await response.json();
displayRunners(runners);
} catch (error) {
console.error('Failed to load runners:', error);
}
}
function displayRunners(runners) {
const container = document.getElementById('runners-list');
if (runners.length === 0) {
container.innerHTML = '<p>No runners connected.</p>';
return;
}
container.innerHTML = runners.map(runner => {
const lastHeartbeat = new Date(runner.last_heartbeat);
const isOnline = (Date.now() - lastHeartbeat.getTime()) < 60000; // 1 minute
return `
<div class="runner-card">
<h3>${escapeHtml(runner.name)}</h3>
<div class="runner-info">
<span>Hostname: ${escapeHtml(runner.hostname)}</span>
<span>Last heartbeat: ${lastHeartbeat.toLocaleString()}</span>
</div>
<div class="runner-status ${isOnline ? 'online' : 'offline'}">
${isOnline ? 'Online' : 'Offline'}
</div>
</div>
`;
}).join('');
}
function escapeHtml(text) {
const div = document.createElement('div');
div.textContent = text;
return div.innerHTML;
}
// Auto-refresh jobs every 5 seconds
setInterval(() => {
if (currentUser && document.getElementById('jobs-page').classList.contains('hidden') === false) {
loadJobs();
}
}, 5000);
// Initialize on load
init();
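Both this jobs UI and the React admin panel later in the diff use the same liveness rule: a runner counts as online if its last heartbeat is less than 60 seconds old. The check is trivial, but the threshold must exceed the heartbeat interval or healthy runners will flicker offline; a sketch:

```python
def is_runner_online(last_heartbeat_ms: float, now_ms: float, threshold_ms: float = 60_000) -> bool:
    """Online means the last heartbeat arrived strictly within the threshold window."""
    return (now_ms - last_heartbeat_ms) < threshold_ms
```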


@@ -4,42 +4,26 @@ import (
 	"embed"
 	"io/fs"
 	"net/http"
-	"strings"
 )

-//go:embed dist/*
-var distFS embed.FS
+//go:embed templates templates/partials static
+var uiFS embed.FS

-// GetFileSystem returns an http.FileSystem for the embedded web UI files
-func GetFileSystem() http.FileSystem {
-	subFS, err := fs.Sub(distFS, "dist")
+// GetStaticFileSystem returns an http.FileSystem for embedded UI assets.
+func GetStaticFileSystem() http.FileSystem {
+	subFS, err := fs.Sub(uiFS, "static")
 	if err != nil {
 		panic(err)
 	}
 	return http.FS(subFS)
 }

-// SPAHandler returns an http.Handler that serves the embedded SPA
-// It serves static files if they exist, otherwise falls back to index.html
-func SPAHandler() http.Handler {
-	fsys := GetFileSystem()
-	fileServer := http.FileServer(fsys)
-	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
-		path := r.URL.Path
-		// Try to open the file
-		f, err := fsys.Open(strings.TrimPrefix(path, "/"))
-		if err != nil {
-			// File doesn't exist, serve index.html for SPA routing
-			r.URL.Path = "/"
-			fileServer.ServeHTTP(w, r)
-			return
-		}
-		f.Close()
-		// File exists, serve it
-		fileServer.ServeHTTP(w, r)
-	})
+// StaticHandler serves /assets/* files from embedded static assets.
+func StaticHandler() http.Handler {
+	return http.StripPrefix("/assets/", http.FileServer(GetStaticFileSystem()))
 }
+
+// GetTemplateFS returns the embedded template filesystem.
+func GetTemplateFS() fs.FS {
+	return uiFS
+}


@@ -1,13 +0,0 @@
<!doctype html>
<html lang="en">
<head>
<meta charset="UTF-8" />
<link rel="icon" type="image/svg+xml" href="/vite.svg" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
<title>JiggaBlend</title>
</head>
<body>
<div id="root"></div>
<script type="module" src="/src/main.jsx"></script>
</body>
</html>

web/package-lock.json (generated, 2677 lines): diff suppressed because it is too large.


@@ -1,21 +0,0 @@
{
  "name": "jiggablend-web",
  "version": "1.0.0",
  "type": "module",
  "scripts": {
    "dev": "vite",
    "build": "vite build",
    "preview": "vite preview"
  },
  "dependencies": {
    "react": "^18.2.0",
    "react-dom": "^18.2.0"
  },
  "devDependencies": {
    "@vitejs/plugin-react": "^4.2.1",
    "autoprefixer": "^10.4.16",
    "postcss": "^8.4.32",
    "tailwindcss": "^3.4.0",
    "vite": "^7.2.4"
  }
}


@@ -1,7 +0,0 @@
export default {
  plugins: {
    tailwindcss: {},
    autoprefixer: {},
  },
}


@@ -1,50 +0,0 @@
import { useState, useEffect, useMemo } from 'react';
import { useAuth } from './hooks/useAuth';
import Login from './components/Login';
import Layout from './components/Layout';
import JobList from './components/JobList';
import JobSubmission from './components/JobSubmission';
import AdminPanel from './components/AdminPanel';
import ErrorBoundary from './components/ErrorBoundary';
import LoadingSpinner from './components/LoadingSpinner';
import './styles/index.css';

function App() {
  const { user, loading, refresh } = useAuth();
  const [activeTab, setActiveTab] = useState('jobs');

  // Memoize login component to ensure it's ready immediately
  const loginComponent = useMemo(() => <Login key="login" />, []);

  if (loading) {
    return (
      <div className="min-h-screen flex items-center justify-center bg-gray-900">
        <LoadingSpinner size="md" />
      </div>
    );
  }

  if (!user) {
    return loginComponent;
  }

  // Wrapper to change tabs - only check auth on mount, not on every navigation
  const handleTabChange = (newTab) => {
    setActiveTab(newTab);
  };

  return (
    <Layout activeTab={activeTab} onTabChange={handleTabChange}>
      <ErrorBoundary>
        {activeTab === 'jobs' && <JobList />}
        {activeTab === 'submit' && (
          <JobSubmission onSuccess={() => handleTabChange('jobs')} />
        )}
        {activeTab === 'admin' && <AdminPanel />}
      </ErrorBoundary>
    </Layout>
  );
}

export default App;


@@ -1,810 +0,0 @@
import { useState, useEffect, useRef } from 'react';
import { admin, jobs, normalizeArrayResponse } from '../utils/api';
import { wsManager } from '../utils/websocket';
import UserJobs from './UserJobs';
import PasswordChange from './PasswordChange';
import LoadingSpinner from './LoadingSpinner';
export default function AdminPanel() {
const [activeSection, setActiveSection] = useState('api-keys');
const [apiKeys, setApiKeys] = useState([]);
const [runners, setRunners] = useState([]);
const [users, setUsers] = useState([]);
const [loading, setLoading] = useState(false);
const [newAPIKeyName, setNewAPIKeyName] = useState('');
const [newAPIKeyDescription, setNewAPIKeyDescription] = useState('');
const [newAPIKeyScope, setNewAPIKeyScope] = useState('user'); // Default to user scope
const [newAPIKey, setNewAPIKey] = useState(null);
const [selectedUser, setSelectedUser] = useState(null);
const [registrationEnabled, setRegistrationEnabled] = useState(true);
const [passwordChangeUser, setPasswordChangeUser] = useState(null);
const listenerIdRef = useRef(null); // Listener ID for shared WebSocket
const subscribedChannelsRef = useRef(new Set()); // Track confirmed subscribed channels
const pendingSubscriptionsRef = useRef(new Set()); // Track pending subscriptions (waiting for confirmation)
// Connect to shared WebSocket on mount
useEffect(() => {
listenerIdRef.current = wsManager.subscribe('adminpanel', {
open: () => {
console.log('AdminPanel: Shared WebSocket connected');
// Subscribe to runners if already viewing runners section
if (activeSection === 'runners') {
subscribeToRunners();
}
},
message: (data) => {
// Handle subscription responses - update both local refs and wsManager
if (data.type === 'subscribed' && data.channel) {
pendingSubscriptionsRef.current.delete(data.channel);
subscribedChannelsRef.current.add(data.channel);
wsManager.confirmSubscription(data.channel);
console.log('Successfully subscribed to channel:', data.channel);
} else if (data.type === 'subscription_error' && data.channel) {
pendingSubscriptionsRef.current.delete(data.channel);
subscribedChannelsRef.current.delete(data.channel);
wsManager.failSubscription(data.channel);
console.error('Subscription failed for channel:', data.channel, data.error);
}
// Handle runners channel messages
if (data.channel === 'runners' && data.type === 'runner_status') {
// Update runner in list
setRunners(prev => {
const index = prev.findIndex(r => r.id === data.runner_id);
if (index >= 0 && data.data) {
const updated = [...prev];
updated[index] = { ...updated[index], ...data.data };
return updated;
}
return prev;
});
}
},
error: (error) => {
console.error('AdminPanel: Shared WebSocket error:', error);
},
close: (event) => {
console.log('AdminPanel: Shared WebSocket closed:', event);
subscribedChannelsRef.current.clear();
pendingSubscriptionsRef.current.clear();
}
});
// Ensure connection is established
wsManager.connect();
return () => {
// Unsubscribe from all channels before unmounting
unsubscribeFromRunners();
if (listenerIdRef.current) {
wsManager.unsubscribe(listenerIdRef.current);
listenerIdRef.current = null;
}
};
}, []);
const subscribeToRunners = () => {
const channel = 'runners';
// Don't subscribe if already subscribed or pending
if (subscribedChannelsRef.current.has(channel) || pendingSubscriptionsRef.current.has(channel)) {
return;
}
wsManager.subscribeToChannel(channel);
subscribedChannelsRef.current.add(channel);
pendingSubscriptionsRef.current.add(channel);
console.log('Subscribing to runners channel');
};
const unsubscribeFromRunners = () => {
const channel = 'runners';
if (!subscribedChannelsRef.current.has(channel)) {
return; // Not subscribed
}
wsManager.unsubscribeFromChannel(channel);
subscribedChannelsRef.current.delete(channel);
pendingSubscriptionsRef.current.delete(channel);
console.log('Unsubscribed from runners channel');
};
useEffect(() => {
if (activeSection === 'api-keys') {
loadAPIKeys();
unsubscribeFromRunners();
} else if (activeSection === 'runners') {
loadRunners();
subscribeToRunners();
} else if (activeSection === 'users') {
loadUsers();
unsubscribeFromRunners();
} else if (activeSection === 'settings') {
loadSettings();
unsubscribeFromRunners();
}
}, [activeSection]);
const loadAPIKeys = async () => {
setLoading(true);
try {
const data = await admin.listAPIKeys();
setApiKeys(normalizeArrayResponse(data));
} catch (error) {
console.error('Failed to load API keys:', error);
setApiKeys([]);
alert('Failed to load API keys');
} finally {
setLoading(false);
}
};
const loadRunners = async () => {
setLoading(true);
try {
const data = await admin.listRunners();
setRunners(normalizeArrayResponse(data));
} catch (error) {
console.error('Failed to load runners:', error);
setRunners([]);
alert('Failed to load runners');
} finally {
setLoading(false);
}
};
const loadUsers = async () => {
setLoading(true);
try {
const data = await admin.listUsers();
setUsers(normalizeArrayResponse(data));
} catch (error) {
console.error('Failed to load users:', error);
setUsers([]);
alert('Failed to load users');
} finally {
setLoading(false);
}
};
const loadSettings = async () => {
setLoading(true);
try {
const data = await admin.getRegistrationEnabled();
setRegistrationEnabled(data.enabled);
} catch (error) {
console.error('Failed to load settings:', error);
alert('Failed to load settings');
} finally {
setLoading(false);
}
};
const handleToggleRegistration = async () => {
const newValue = !registrationEnabled;
setLoading(true);
try {
await admin.setRegistrationEnabled(newValue);
setRegistrationEnabled(newValue);
alert(`Registration ${newValue ? 'enabled' : 'disabled'}`);
} catch (error) {
console.error('Failed to update registration setting:', error);
alert('Failed to update registration setting');
} finally {
setLoading(false);
}
};
const generateAPIKey = async () => {
if (!newAPIKeyName.trim()) {
alert('API key name is required');
return;
}
setLoading(true);
try {
const data = await admin.generateAPIKey(newAPIKeyName.trim(), newAPIKeyDescription.trim() || undefined, newAPIKeyScope);
setNewAPIKey(data);
setNewAPIKeyName('');
setNewAPIKeyDescription('');
setNewAPIKeyScope('user');
await loadAPIKeys();
} catch (error) {
console.error('Failed to generate API key:', error);
alert('Failed to generate API key');
} finally {
setLoading(false);
}
};
const [deletingKeyId, setDeletingKeyId] = useState(null);
const [deletingRunnerId, setDeletingRunnerId] = useState(null);
const revokeAPIKey = async (keyId) => {
if (!confirm('Are you sure you want to delete this API key? This action cannot be undone.')) {
return;
}
setDeletingKeyId(keyId);
try {
await admin.deleteAPIKey(keyId);
await loadAPIKeys();
} catch (error) {
console.error('Failed to delete API key:', error);
alert('Failed to delete API key');
} finally {
setDeletingKeyId(null);
}
};
const deleteRunner = async (runnerId) => {
if (!confirm('Are you sure you want to delete this runner?')) {
return;
}
setDeletingRunnerId(runnerId);
try {
await admin.deleteRunner(runnerId);
await loadRunners();
} catch (error) {
console.error('Failed to delete runner:', error);
alert('Failed to delete runner');
} finally {
setDeletingRunnerId(null);
}
};
const copyToClipboard = (text) => {
navigator.clipboard.writeText(text);
alert('Copied to clipboard!');
};
const isAPIKeyActive = (isActive) => {
return isActive;
};
return (
<div className="space-y-6">
<div className="flex space-x-4 border-b border-gray-700">
<button
onClick={() => {
setActiveSection('api-keys');
setSelectedUser(null);
}}
className={`py-2 px-4 border-b-2 font-medium ${
activeSection === 'api-keys'
? 'border-orange-500 text-orange-500'
: 'border-transparent text-gray-400 hover:text-gray-300'
}`}
>
API Keys
</button>
<button
onClick={() => {
setActiveSection('runners');
setSelectedUser(null);
}}
className={`py-2 px-4 border-b-2 font-medium ${
activeSection === 'runners'
? 'border-orange-500 text-orange-500'
: 'border-transparent text-gray-400 hover:text-gray-300'
}`}
>
Runner Management
</button>
<button
onClick={() => {
setActiveSection('users');
setSelectedUser(null);
}}
className={`py-2 px-4 border-b-2 font-medium ${
activeSection === 'users'
? 'border-orange-500 text-orange-500'
: 'border-transparent text-gray-400 hover:text-gray-300'
}`}
>
Users
</button>
<button
onClick={() => {
setActiveSection('settings');
setSelectedUser(null);
}}
className={`py-2 px-4 border-b-2 font-medium ${
activeSection === 'settings'
? 'border-orange-500 text-orange-500'
: 'border-transparent text-gray-400 hover:text-gray-300'
}`}
>
Settings
</button>
</div>
{activeSection === 'api-keys' && (
<div className="space-y-6">
<div className="bg-gray-800 rounded-lg shadow-md p-6 border border-gray-700">
<h2 className="text-xl font-semibold mb-4 text-gray-100">Generate API Key</h2>
<div className="space-y-4">
<div className="grid grid-cols-1 md:grid-cols-3 gap-4">
<div>
<label className="block text-sm font-medium text-gray-300 mb-2">
Name *
</label>
<input
type="text"
value={newAPIKeyName}
onChange={(e) => setNewAPIKeyName(e.target.value)}
placeholder="e.g., production-runner-01"
className="w-full px-3 py-2 bg-gray-900 border border-gray-600 rounded-lg text-gray-100 focus:ring-2 focus:ring-orange-500 focus:border-transparent"
required
/>
</div>
<div>
<label className="block text-sm font-medium text-gray-300 mb-2">
Description
</label>
<input
type="text"
value={newAPIKeyDescription}
onChange={(e) => setNewAPIKeyDescription(e.target.value)}
placeholder="Optional description"
className="w-full px-3 py-2 bg-gray-900 border border-gray-600 rounded-lg text-gray-100 focus:ring-2 focus:ring-orange-500 focus:border-transparent"
/>
</div>
<div>
<label className="block text-sm font-medium text-gray-300 mb-2">
Scope
</label>
<select
value={newAPIKeyScope}
onChange={(e) => setNewAPIKeyScope(e.target.value)}
className="w-full px-3 py-2 bg-gray-900 border border-gray-600 rounded-lg text-gray-100 focus:ring-2 focus:ring-orange-500 focus:border-transparent"
>
<option value="user">User - Only jobs from API key owner</option>
<option value="manager">Manager - All jobs from any user</option>
</select>
</div>
</div>
<div className="flex justify-end">
<button
onClick={generateAPIKey}
disabled={loading || !newAPIKeyName.trim()}
className="px-6 py-2 bg-orange-600 text-white rounded-lg hover:bg-orange-500 disabled:opacity-50 disabled:cursor-not-allowed transition-colors"
>
Generate API Key
</button>
</div>
</div>
{newAPIKey && (
<div className="mt-4 p-4 bg-green-400/20 border border-green-400/50 rounded-lg">
<p className="text-sm font-medium text-green-400 mb-2">New API Key Generated:</p>
<div className="space-y-2">
<div className="flex items-center gap-2">
<code className="flex-1 px-3 py-2 bg-gray-900 border border-green-400/50 rounded text-sm font-mono break-all text-gray-100">
{newAPIKey.key}
</code>
<button
onClick={() => copyToClipboard(newAPIKey.key)}
className="px-4 py-2 bg-green-600 text-white rounded hover:bg-green-500 transition-colors text-sm whitespace-nowrap"
>
Copy Key
</button>
</div>
<div className="text-xs text-green-400/80">
<p><strong>Name:</strong> {newAPIKey.name}</p>
{newAPIKey.description && <p><strong>Description:</strong> {newAPIKey.description}</p>}
</div>
<p className="text-xs text-green-400/80 mt-2">
Save this API key securely. It will not be shown again.
</p>
</div>
</div>
)}
</div>
<div className="bg-gray-800 rounded-lg shadow-md p-6 border border-gray-700">
<h2 className="text-xl font-semibold mb-4 text-gray-100">API Keys</h2>
{loading ? (
<LoadingSpinner size="sm" className="py-8" />
) : !apiKeys || apiKeys.length === 0 ? (
<p className="text-gray-400 text-center py-8">No API keys generated yet.</p>
) : (
<div className="overflow-x-auto">
<table className="min-w-full divide-y divide-gray-700">
<thead className="bg-gray-900">
<tr>
<th className="px-6 py-3 text-left text-xs font-medium text-gray-400 uppercase tracking-wider">
Name
</th>
<th className="px-6 py-3 text-left text-xs font-medium text-gray-400 uppercase tracking-wider">
Scope
</th>
<th className="px-6 py-3 text-left text-xs font-medium text-gray-400 uppercase tracking-wider">
Key Prefix
</th>
<th className="px-6 py-3 text-left text-xs font-medium text-gray-400 uppercase tracking-wider">
Status
</th>
<th className="px-6 py-3 text-left text-xs font-medium text-gray-400 uppercase tracking-wider">
Created At
</th>
<th className="px-6 py-3 text-left text-xs font-medium text-gray-400 uppercase tracking-wider">
Actions
</th>
</tr>
</thead>
<tbody className="bg-gray-800 divide-y divide-gray-700">
{apiKeys.map((key) => {
return (
<tr key={key.id}>
<td className="px-6 py-4 whitespace-nowrap">
<div>
<div className="text-sm font-medium text-gray-100">{key.name}</div>
{key.description && (
<div className="text-sm text-gray-400">{key.description}</div>
)}
</div>
</td>
<td className="px-6 py-4 whitespace-nowrap">
<span className={`px-2 py-1 text-xs font-medium rounded-full ${
key.scope === 'manager'
? 'bg-purple-400/20 text-purple-400'
: 'bg-blue-400/20 text-blue-400'
}`}>
{key.scope === 'manager' ? 'Manager' : 'User'}
</span>
</td>
<td className="px-6 py-4 whitespace-nowrap">
<code className="text-sm font-mono text-gray-300">
{key.key_prefix}
</code>
</td>
<td className="px-6 py-4 whitespace-nowrap">
{!key.is_active ? (
<span className="px-2 py-1 text-xs font-medium rounded-full bg-gray-500/20 text-gray-400">
Revoked
</span>
) : (
<span className="px-2 py-1 text-xs font-medium rounded-full bg-green-400/20 text-green-400">
Active
</span>
)}
</td>
<td className="px-6 py-4 whitespace-nowrap text-sm text-gray-400">
{new Date(key.created_at).toLocaleString()}
</td>
<td className="px-6 py-4 whitespace-nowrap text-sm space-x-2">
<button
onClick={() => revokeAPIKey(key.id)}
disabled={deletingKeyId === key.id}
className="text-red-400 hover:text-red-300 font-medium disabled:opacity-50 disabled:cursor-not-allowed"
title="Delete API key"
>
{deletingKeyId === key.id ? 'Deleting...' : 'Delete'}
</button>
</td>
</tr>
);
})}
</tbody>
</table>
</div>
)}
</div>
</div>
)}
{activeSection === 'runners' && (
<div className="bg-gray-800 rounded-lg shadow-md p-6 border border-gray-700">
<h2 className="text-xl font-semibold mb-4 text-gray-100">Runner Management</h2>
{loading ? (
<LoadingSpinner size="sm" className="py-8" />
) : !runners || runners.length === 0 ? (
<p className="text-gray-400 text-center py-8">No runners registered.</p>
) : (
<div className="overflow-x-auto">
<table className="min-w-full divide-y divide-gray-700">
<thead className="bg-gray-900">
<tr>
<th className="px-6 py-3 text-left text-xs font-medium text-gray-400 uppercase tracking-wider">
Name
</th>
<th className="px-6 py-3 text-left text-xs font-medium text-gray-400 uppercase tracking-wider">
Hostname
</th>
<th className="px-6 py-3 text-left text-xs font-medium text-gray-400 uppercase tracking-wider">
Status
</th>
<th className="px-6 py-3 text-left text-xs font-medium text-gray-400 uppercase tracking-wider">
API Key
</th>
<th className="px-6 py-3 text-left text-xs font-medium text-gray-400 uppercase tracking-wider">
Priority
</th>
<th className="px-6 py-3 text-left text-xs font-medium text-gray-400 uppercase tracking-wider">
Capabilities
</th>
<th className="px-6 py-3 text-left text-xs font-medium text-gray-400 uppercase tracking-wider">
Last Heartbeat
</th>
<th className="px-6 py-3 text-left text-xs font-medium text-gray-400 uppercase tracking-wider">
Actions
</th>
</tr>
</thead>
<tbody className="bg-gray-800 divide-y divide-gray-700">
{runners.map((runner) => {
const isOnline = new Date(runner.last_heartbeat) > new Date(Date.now() - 60000);
return (
<tr key={runner.id}>
<td className="px-6 py-4 whitespace-nowrap text-sm font-medium text-gray-100">
{runner.name}
</td>
<td className="px-6 py-4 whitespace-nowrap text-sm text-gray-400">
{runner.hostname}
</td>
<td className="px-6 py-4 whitespace-nowrap">
<span
className={`px-2 py-1 text-xs font-medium rounded-full ${
isOnline
? 'bg-green-400/20 text-green-400'
: 'bg-gray-500/20 text-gray-400'
}`}
>
{isOnline ? 'Online' : 'Offline'}
</span>
</td>
<td className="px-6 py-4 whitespace-nowrap text-sm text-gray-400">
<code className="text-xs font-mono bg-gray-900 px-2 py-1 rounded">
jk_r{runner.id % 10}_...
</code>
</td>
<td className="px-6 py-4 whitespace-nowrap text-sm text-gray-400">
{runner.priority}
</td>
<td className="px-6 py-4 whitespace-nowrap text-sm text-gray-400">
{runner.capabilities ? (
(() => {
try {
const caps = JSON.parse(runner.capabilities);
const enabled = Object.entries(caps)
.filter(([_, v]) => v)
.map(([k, _]) => k)
.join(', ');
return enabled || 'None';
} catch {
return runner.capabilities;
}
})()
) : (
'None'
)}
</td>
<td className="px-6 py-4 whitespace-nowrap text-sm text-gray-400">
{new Date(runner.last_heartbeat).toLocaleString()}
</td>
<td className="px-6 py-4 whitespace-nowrap text-sm">
<button
onClick={() => deleteRunner(runner.id)}
disabled={deletingRunnerId === runner.id}
className="text-red-400 hover:text-red-300 font-medium disabled:opacity-50 disabled:cursor-not-allowed"
>
{deletingRunnerId === runner.id ? 'Deleting...' : 'Delete'}
</button>
</td>
</tr>
);
})}
</tbody>
</table>
</div>
)}
</div>
)}
{activeSection === 'change-password' && passwordChangeUser && (
<div className="bg-gray-800 rounded-lg shadow-md p-6 border border-gray-700">
<button
onClick={() => {
setPasswordChangeUser(null);
setActiveSection('users');
}}
className="text-gray-400 hover:text-gray-300 mb-4 flex items-center gap-2"
>
<svg className="w-5 h-5" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M15 19l-7-7 7-7" />
</svg>
Back to Users
</button>
<PasswordChange
targetUserId={passwordChangeUser.id}
targetUserName={passwordChangeUser.name || passwordChangeUser.email}
onSuccess={() => {
setPasswordChangeUser(null);
setActiveSection('users');
}}
/>
</div>
)}
{activeSection === 'users' && (
<div className="space-y-6">
{selectedUser ? (
<UserJobs
userId={selectedUser.id}
userName={selectedUser.name || selectedUser.email}
onBack={() => setSelectedUser(null)}
/>
) : (
<div className="bg-gray-800 rounded-lg shadow-md p-6 border border-gray-700">
<h2 className="text-xl font-semibold mb-4 text-gray-100">User Management</h2>
{loading ? (
<LoadingSpinner size="sm" className="py-8" />
) : !users || users.length === 0 ? (
<p className="text-gray-400 text-center py-8">No users found.</p>
) : (
<div className="overflow-x-auto">
<table className="min-w-full divide-y divide-gray-700">
<thead className="bg-gray-900">
<tr>
<th className="px-6 py-3 text-left text-xs font-medium text-gray-400 uppercase tracking-wider">
Email
</th>
<th className="px-6 py-3 text-left text-xs font-medium text-gray-400 uppercase tracking-wider">
Name
</th>
<th className="px-6 py-3 text-left text-xs font-medium text-gray-400 uppercase tracking-wider">
Provider
</th>
<th className="px-6 py-3 text-left text-xs font-medium text-gray-400 uppercase tracking-wider">
Admin
</th>
<th className="px-6 py-3 text-left text-xs font-medium text-gray-400 uppercase tracking-wider">
Jobs
</th>
<th className="px-6 py-3 text-left text-xs font-medium text-gray-400 uppercase tracking-wider">
Created
</th>
<th className="px-6 py-3 text-left text-xs font-medium text-gray-400 uppercase tracking-wider">
Actions
</th>
</tr>
</thead>
<tbody className="bg-gray-800 divide-y divide-gray-700">
{users.map((user) => (
<tr key={user.id}>
<td className="px-6 py-4 whitespace-nowrap text-sm text-gray-100">
{user.email}
</td>
<td className="px-6 py-4 whitespace-nowrap text-sm text-gray-300">
{user.name}
</td>
<td className="px-6 py-4 whitespace-nowrap text-sm text-gray-400">
{user.oauth_provider}
</td>
<td className="px-6 py-4 whitespace-nowrap">
<div className="flex items-center gap-2">
{user.is_admin ? (
<span className="px-2 py-1 text-xs font-medium rounded-full bg-orange-400/20 text-orange-400">
Admin
</span>
) : (
<span className="px-2 py-1 text-xs font-medium rounded-full bg-gray-500/20 text-gray-400">
User
</span>
)}
<button
onClick={async () => {
if (user.is_first_user && user.is_admin) {
alert('Cannot remove admin status from the first user');
return;
}
if (!confirm(`Are you sure you want to ${user.is_admin ? 'remove admin privileges from' : 'grant admin privileges to'} ${user.name || user.email}?`)) {
return;
}
try {
await admin.setUserAdminStatus(user.id, !user.is_admin);
await loadUsers();
alert(`Admin status ${user.is_admin ? 'removed' : 'granted'} successfully`);
} catch (error) {
console.error('Failed to update admin status:', error);
const errorMsg = error.message || 'Failed to update admin status';
if (errorMsg.includes('first user')) {
alert('Cannot remove admin status from the first user');
} else {
alert(errorMsg);
}
}
}}
disabled={user.is_first_user && user.is_admin}
className={`text-xs px-2 py-1 rounded ${
user.is_first_user && user.is_admin
? 'text-gray-500 bg-gray-500/10 cursor-not-allowed'
: user.is_admin
? 'text-red-400 hover:text-red-300 bg-red-400/10 hover:bg-red-400/20'
: 'text-green-400 hover:text-green-300 bg-green-400/10 hover:bg-green-400/20'
} transition-colors`}
title={user.is_first_user && user.is_admin ? 'First user must remain admin' : user.is_admin ? 'Remove admin privileges' : 'Grant admin privileges'}
>
{user.is_admin ? 'Remove Admin' : 'Make Admin'}
</button>
</div>
</td>
<td className="px-6 py-4 whitespace-nowrap text-sm text-gray-400">
{user.job_count || 0}
</td>
<td className="px-6 py-4 whitespace-nowrap text-sm text-gray-400">
{new Date(user.created_at).toLocaleString()}
</td>
<td className="px-6 py-4 whitespace-nowrap text-sm">
<div className="flex gap-3">
<button
onClick={() => setSelectedUser(user)}
className="text-orange-400 hover:text-orange-300 font-medium"
>
View Jobs
</button>
{user.oauth_provider === 'local' && (
<button
onClick={() => {
const userForPassword = { id: user.id, name: user.name || user.email };
setPasswordChangeUser(userForPassword);
setSelectedUser(null);
setActiveSection('change-password');
}}
className="text-blue-400 hover:text-blue-300 font-medium"
>
Change Password
</button>
)}
</div>
</td>
</tr>
))}
</tbody>
</table>
</div>
)}
</div>
)}
</div>
)}
{activeSection === 'settings' && (
<div className="space-y-6">
<PasswordChange />
<div className="bg-gray-800 rounded-lg shadow-md p-6 border border-gray-700">
<h2 className="text-xl font-semibold mb-6 text-gray-100">System Settings</h2>
<div className="space-y-6">
<div className="flex items-center justify-between p-4 bg-gray-900 rounded-lg border border-gray-700">
<div>
<h3 className="text-lg font-medium text-gray-100 mb-1">User Registration</h3>
<p className="text-sm text-gray-400">
{registrationEnabled
? 'New users can register via OAuth or local login'
: 'Registration is disabled. Only existing users can log in.'}
</p>
</div>
<div className="flex items-center gap-4">
<span className={`text-sm font-medium ${registrationEnabled ? 'text-green-400' : 'text-red-400'}`}>
{registrationEnabled ? 'Enabled' : 'Disabled'}
</span>
<button
onClick={handleToggleRegistration}
disabled={loading}
className={`px-6 py-2 rounded-lg font-medium transition-colors ${
registrationEnabled
? 'bg-red-600 hover:bg-red-500 text-white'
: 'bg-green-600 hover:bg-green-500 text-white'
} disabled:opacity-50 disabled:cursor-not-allowed`}
>
{registrationEnabled ? 'Disable' : 'Enable'}
</button>
</div>
</div>
</div>
</div>
</div>
)}
</div>
);
}
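The runner table above marks a runner online when its last heartbeat falls within the past 60 seconds (`new Date(runner.last_heartbeat) > new Date(Date.now() - 60000)`). A minimal standalone sketch of that check; the function name is illustrative, not from the source:

```javascript
// Decide whether a runner counts as "online", mirroring the
// 60000 ms heartbeat window used in the runner table.
function isRunnerOnline(lastHeartbeat, nowMs = Date.now(), windowMs = 60000) {
  return new Date(lastHeartbeat).getTime() > nowMs - windowMs;
}
```

Passing `nowMs` explicitly keeps the check deterministic in tests instead of depending on the wall clock.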

View File

@@ -1,41 +0,0 @@
import React from 'react';
class ErrorBoundary extends React.Component {
constructor(props) {
super(props);
this.state = { hasError: false, error: null };
}
static getDerivedStateFromError(error) {
return { hasError: true, error };
}
componentDidCatch(error, errorInfo) {
console.error('ErrorBoundary caught an error:', error, errorInfo);
}
render() {
if (this.state.hasError) {
return (
<div className="p-6 bg-red-400/20 border border-red-400/50 rounded-lg text-red-400">
<h2 className="text-xl font-semibold mb-2">Something went wrong</h2>
<p className="mb-4">{this.state.error?.message || 'An unexpected error occurred'}</p>
<button
onClick={() => {
this.setState({ hasError: false, error: null });
window.location.reload();
}}
className="px-4 py-2 bg-red-600 text-white rounded-lg hover:bg-red-500 transition-colors"
>
Reload Page
</button>
</div>
);
}
return this.props.children;
}
}
export default ErrorBoundary;

View File

@@ -1,26 +0,0 @@
import React from 'react';
/**
* Shared ErrorMessage component for consistent error display
* Sanitizes error messages to prevent XSS
*/
export default function ErrorMessage({ error, className = '' }) {
if (!error) return null;
// Sanitize error message - escape HTML entities
const sanitize = (text) => {
const div = document.createElement('div');
div.textContent = text;
return div.innerHTML;
};
const sanitizedError = typeof error === 'string' ? sanitize(error) : sanitize(error.message || 'An error occurred');
return (
<div className={`p-4 bg-red-400/20 border border-red-400/50 rounded-lg text-red-400 ${className}`}>
<p className="font-semibold">Error:</p>
<p dangerouslySetInnerHTML={{ __html: sanitizedError }} />
</div>
);
}
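The `ErrorMessage` component escapes error text via a `textContent`/`innerHTML` round-trip, which needs a browser DOM. A pure-string equivalent of what that round-trip produces (illustrative sketch, not the component's actual code), escaping the same three characters the DOM escapes:

```javascript
// String-based equivalent of the DOM textContent -> innerHTML trick:
// the browser escapes &, <, and > (quotes are left alone).
function escapeHTML(text) {
  return String(text)
    .replace(/&/g, '&amp;')
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;');
}
```

The `&` replacement must run first, or the `&` produced by the later replacements would be double-escaped.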

View File

@@ -1,191 +0,0 @@
import { useState } from 'react';
export default function FileExplorer({ files, onDownload, onPreview, onVideoPreview, isImageFile }) {
const [expandedPaths, setExpandedPaths] = useState(new Set()); // Root folder collapsed by default
// Build directory tree from file paths
const buildTree = (files) => {
const tree = {};
files.forEach(file => {
const path = file.file_name;
// Handle both paths with slashes and single filenames
const parts = path.includes('/') ? path.split('/').filter(p => p) : [path];
// If it's a single file at root (no slashes), treat it specially
if (parts.length === 1 && !path.includes('/')) {
tree[parts[0]] = {
name: parts[0],
isFile: true,
file: file,
children: {},
path: parts[0]
};
return;
}
let current = tree;
parts.forEach((part, index) => {
if (!current[part]) {
current[part] = {
name: part,
isFile: index === parts.length - 1,
file: index === parts.length - 1 ? file : null,
children: {},
path: parts.slice(0, index + 1).join('/')
};
}
current = current[part].children;
});
});
return tree;
};
const togglePath = (path) => {
const newExpanded = new Set(expandedPaths);
if (newExpanded.has(path)) {
newExpanded.delete(path);
} else {
newExpanded.add(path);
}
setExpandedPaths(newExpanded);
};
const renderTree = (node, level = 0, parentPath = '') => {
const items = Object.values(node).sort((a, b) => {
// Directories first, then files
if (a.isFile !== b.isFile) {
return a.isFile ? 1 : -1;
}
return a.name.localeCompare(b.name);
});
return items.map((item) => {
const fullPath = parentPath ? `${parentPath}/${item.name}` : item.name;
const isExpanded = expandedPaths.has(fullPath);
const indent = level * 20;
if (item.isFile) {
const file = item.file;
const isImage = isImageFile && isImageFile(file.file_name);
const isVideo = file.file_name.toLowerCase().endsWith('.mp4');
const sizeMB = (file.file_size / 1024 / 1024).toFixed(2);
const isArchive = file.file_name.endsWith('.tar') || file.file_name.endsWith('.zip');
return (
<div key={fullPath} className="flex items-center justify-between py-1.5 hover:bg-gray-800/50 rounded px-2" style={{ paddingLeft: `${indent + 8}px` }}>
<div className="flex items-center gap-2 flex-1 min-w-0">
<span className="text-gray-500 text-sm">{isArchive ? '📦' : isVideo ? '🎬' : '📄'}</span>
<span className="text-gray-200 text-sm truncate" title={item.name}>
{item.name}
</span>
<span className="text-gray-500 text-xs ml-2">{sizeMB} MB</span>
</div>
<div className="flex gap-2 ml-4 shrink-0">
{isVideo && onVideoPreview && (
<button
onClick={() => onVideoPreview(file)}
className="px-2 py-1 bg-purple-600 text-white rounded text-xs hover:bg-purple-500 transition-colors"
title="Play Video"
>
▶
</button>
)}
{isImage && onPreview && (
<button
onClick={() => onPreview(file)}
className="px-2 py-1 bg-blue-600 text-white rounded text-xs hover:bg-blue-500 transition-colors"
title="Preview"
>
👁
</button>
)}
{onDownload && file.id && (
<button
onClick={() => onDownload(file.id, file.file_name)}
className="px-2 py-1 bg-orange-600 text-white rounded text-xs hover:bg-orange-500 transition-colors"
title="Download"
>
⬇
</button>
)}
</div>
</div>
);
} else {
const hasChildren = Object.keys(item.children).length > 0;
return (
<div key={fullPath}>
<div
className="flex items-center gap-2 py-1.5 hover:bg-gray-800/50 rounded px-2 cursor-pointer select-none"
style={{ paddingLeft: `${indent + 8}px` }}
onClick={() => hasChildren && togglePath(fullPath)}
>
<span className="text-gray-400 text-xs w-4 flex items-center justify-center">
{hasChildren ? (isExpanded ? '▼' : '▶') : '•'}
</span>
<span className="text-gray-500 text-sm">
{hasChildren ? (isExpanded ? '📂' : '📁') : '📁'}
</span>
<span className="text-gray-300 text-sm font-medium">{item.name}</span>
{hasChildren && (
<span className="text-gray-500 text-xs ml-2">
({Object.keys(item.children).length})
</span>
)}
</div>
{hasChildren && isExpanded && (
<div className="ml-2">
{renderTree(item.children, level + 1, fullPath)}
</div>
)}
</div>
);
}
});
};
const tree = buildTree(files);
if (Object.keys(tree).length === 0) {
return (
<div className="text-gray-400 text-sm py-4 text-center">
No files
</div>
);
}
// Wrap tree in a root folder
const rootExpanded = expandedPaths.has('');
return (
<div className="bg-gray-900 rounded-lg border border-gray-700 p-3">
<div className="space-y-1">
<div>
<div
className="flex items-center gap-2 py-1.5 hover:bg-gray-800/50 rounded px-2 cursor-pointer select-none"
onClick={() => togglePath('')}
>
<span className="text-gray-400 text-xs w-4 flex items-center justify-center">
{rootExpanded ? '▼' : '▶'}
</span>
<span className="text-gray-500 text-sm">
{rootExpanded ? '📂' : '📁'}
</span>
<span className="text-gray-300 text-sm font-medium">Files</span>
<span className="text-gray-500 text-xs ml-2">
({Object.keys(tree).length})
</span>
</div>
{rootExpanded && (
<div className="ml-2">
{renderTree(tree)}
</div>
)}
</div>
</div>
</div>
);
}
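The `buildTree` helper above turns slash-separated file paths into a nested object tree. The same splitting approach can be exercised outside React; this trimmed sketch keeps only the path logic and drops the file metadata:

```javascript
// Build a nested tree from slash-separated paths (simplified
// version of FileExplorer's buildTree, without file objects).
function buildTree(paths) {
  const tree = {};
  for (const path of paths) {
    let current = tree;
    for (const part of path.split('/').filter(Boolean)) {
      // Create the node on first visit, then descend into it.
      current[part] = current[part] || { name: part, children: {} };
      current = current[part].children;
    }
  }
  return tree;
}
```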

File diff suppressed because it is too large

View File

@@ -1,289 +0,0 @@
import { useState, useEffect, useRef } from 'react';
import { jobs, normalizeArrayResponse } from '../utils/api';
import { wsManager } from '../utils/websocket';
import JobDetails from './JobDetails';
import LoadingSpinner from './LoadingSpinner';
export default function JobList() {
const [jobList, setJobList] = useState([]);
const [loading, setLoading] = useState(true);
const [selectedJob, setSelectedJob] = useState(null);
const [pagination, setPagination] = useState({ total: 0, limit: 50, offset: 0 });
const [hasMore, setHasMore] = useState(true);
const listenerIdRef = useRef(null);
useEffect(() => {
loadJobs();
// Use shared WebSocket manager for real-time updates
listenerIdRef.current = wsManager.subscribe('joblist', {
open: () => {
console.log('JobList: Shared WebSocket connected');
// Load initial job list via HTTP to get current state
loadJobs();
},
message: (data) => {
console.log('JobList: Client WebSocket message received:', data.type, data.channel, data);
// Handle jobs channel messages (always broadcasted)
if (data.channel === 'jobs') {
if (data.type === 'job_update' && data.data) {
console.log('JobList: Updating job:', data.job_id, data.data);
// Update job in list
setJobList(prev => {
const prevArray = Array.isArray(prev) ? prev : [];
const index = prevArray.findIndex(j => j.id === data.job_id);
if (index >= 0) {
const updated = [...prevArray];
updated[index] = { ...updated[index], ...data.data };
console.log('JobList: Updated job at index', index, updated[index]);
return updated;
}
// If job not in current page, reload to get updated list
if (data.data.status === 'completed' || data.data.status === 'failed') {
loadJobs();
}
return prevArray;
});
} else if (data.type === 'job_created' && data.data) {
console.log('JobList: New job created:', data.job_id, data.data);
// New job created - add to list
setJobList(prev => {
const prevArray = Array.isArray(prev) ? prev : [];
// Check if job already exists (avoid duplicates)
if (prevArray.findIndex(j => j.id === data.job_id) >= 0) {
return prevArray;
}
// Add new job at the beginning
return [data.data, ...prevArray];
});
}
} else if (data.type === 'connected') {
// Connection established
console.log('JobList: WebSocket connected');
}
},
error: (error) => {
console.error('JobList: Shared WebSocket error:', error);
},
close: (event) => {
console.log('JobList: Shared WebSocket closed:', event);
}
});
// Ensure connection is established
wsManager.connect();
return () => {
if (listenerIdRef.current) {
wsManager.unsubscribe(listenerIdRef.current);
listenerIdRef.current = null;
}
};
}, []);
const loadJobs = async (append = false) => {
try {
const offset = append ? pagination.offset + pagination.limit : 0;
const result = await jobs.listSummary({
limit: pagination.limit,
offset,
sort: 'created_at:desc'
});
// Handle both old format (array) and new format (object with data, total, etc.)
const jobsArray = normalizeArrayResponse(result);
const total = result.total !== undefined ? result.total : jobsArray.length;
if (append) {
setJobList(prev => {
const prevArray = Array.isArray(prev) ? prev : [];
return [...prevArray, ...jobsArray];
});
setPagination(prev => ({ ...prev, offset, total }));
} else {
setJobList(jobsArray);
setPagination({ total, limit: result.limit || pagination.limit, offset: result.offset || 0 });
}
setHasMore(offset + jobsArray.length < total);
} catch (error) {
console.error('Failed to load jobs:', error);
// Ensure jobList is always an array even on error
if (!append) {
setJobList([]);
}
} finally {
setLoading(false);
}
};
const loadMore = () => {
if (!loading && hasMore) {
loadJobs(true);
}
};
// Keep selectedJob in sync with the job list when it refreshes
useEffect(() => {
if (selectedJob && jobList.length > 0) {
const freshJob = jobList.find(j => j.id === selectedJob.id);
if (freshJob) {
// Update to the fresh object from the list to keep it in sync
setSelectedJob(freshJob);
} else {
// Job was deleted or no longer exists, clear selection
setSelectedJob(null);
}
}
// eslint-disable-next-line react-hooks/exhaustive-deps
}, [jobList]); // Only depend on jobList, not selectedJob to avoid infinite loops
const handleCancel = async (jobId) => {
if (!confirm('Are you sure you want to cancel this job?')) return;
try {
await jobs.cancel(jobId);
loadJobs();
} catch (error) {
alert('Failed to cancel job: ' + error.message);
}
};
const handleDelete = async (jobId) => {
if (!confirm('Are you sure you want to permanently delete this job? This action cannot be undone.')) return;
try {
// Optimistically update the list
setJobList(prev => {
const prevArray = Array.isArray(prev) ? prev : [];
return prevArray.filter(j => j.id !== jobId);
});
if (selectedJob && selectedJob.id === jobId) {
setSelectedJob(null);
}
// Then actually delete
await jobs.delete(jobId);
// Reload to ensure consistency
loadJobs();
} catch (error) {
// On error, reload to restore correct state
loadJobs();
alert('Failed to delete job: ' + error.message);
}
};
const getStatusColor = (status) => {
const colors = {
pending: 'bg-yellow-400/20 text-yellow-400',
running: 'bg-orange-400/20 text-orange-400',
completed: 'bg-green-400/20 text-green-400',
failed: 'bg-red-400/20 text-red-400',
cancelled: 'bg-gray-500/20 text-gray-400',
};
return colors[status] || colors.pending;
};
if (loading && jobList.length === 0) {
return <LoadingSpinner size="md" className="h-64" />;
}
if (jobList.length === 0) {
return (
<div className="text-center py-12">
<p className="text-gray-400 text-lg">No jobs yet. Submit a job to get started!</p>
</div>
);
}
return (
<>
<div className="grid gap-6 md:grid-cols-2 lg:grid-cols-3">
{jobList.map((job) => (
<div
key={job.id}
className="bg-gray-800 rounded-lg shadow-md hover:shadow-lg transition-shadow p-6 border-l-4 border-orange-500 border border-gray-700"
>
<div className="flex justify-between items-start mb-4">
<h3 className="text-xl font-semibold text-gray-100">{job.name}</h3>
<span className={`px-3 py-1 rounded-full text-xs font-medium ${getStatusColor(job.status)}`}>
{job.status}
</span>
</div>
<div className="space-y-2 text-sm text-gray-400 mb-4">
{job.frame_start !== undefined && job.frame_end !== undefined && (
<p>Frames: {job.frame_start} - {job.frame_end}</p>
)}
{job.output_format && <p>Format: {job.output_format}</p>}
<p>Created: {new Date(job.created_at).toLocaleString()}</p>
</div>
<div className="mb-4">
<div className="flex justify-between text-xs text-gray-400 mb-1">
<span>Progress</span>
<span>{job.progress.toFixed(1)}%</span>
</div>
<div className="w-full bg-gray-700 rounded-full h-2">
<div
className="bg-orange-500 h-2 rounded-full transition-all duration-300"
style={{ width: `${job.progress}%` }}
></div>
</div>
</div>
<div className="flex gap-2">
<button
onClick={() => {
// Fetch full job details when viewing
jobs.get(job.id).then(fullJob => {
setSelectedJob(fullJob);
}).catch(err => {
console.error('Failed to load job details:', err);
setSelectedJob(job); // Fallback to summary
});
}}
className="flex-1 px-4 py-2 bg-orange-600 text-white rounded-lg hover:bg-orange-500 transition-colors font-medium"
>
View Details
</button>
{(job.status === 'pending' || job.status === 'running') && (
<button
onClick={() => handleCancel(job.id)}
className="px-4 py-2 bg-gray-700 text-gray-200 rounded-lg hover:bg-gray-600 transition-colors font-medium"
>
Cancel
</button>
)}
{(job.status === 'completed' || job.status === 'failed' || job.status === 'cancelled') && (
<button
onClick={() => handleDelete(job.id)}
className="px-4 py-2 bg-red-600 text-white rounded-lg hover:bg-red-500 transition-colors font-medium"
title="Delete job"
>
Delete
</button>
)}
</div>
</div>
))}
</div>
{hasMore && (
<div className="flex justify-center mt-6">
<button
onClick={loadMore}
disabled={loading}
className="px-6 py-2 bg-gray-700 text-gray-200 rounded-lg hover:bg-gray-600 transition-colors font-medium disabled:opacity-50"
>
{loading ? 'Loading...' : 'Load More'}
</button>
</div>
)}
{selectedJob && (
<JobDetails
job={selectedJob}
onClose={() => setSelectedJob(null)}
onUpdate={loadJobs}
/>
)}
</>
);
}
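`loadJobs` above decides whether to show the "Load More" button with `setHasMore(offset + jobsArray.length < total)`. The same arithmetic in isolation, as a small sketch (function name is illustrative):

```javascript
// There are more pages while the items fetched so far
// (offset + current page length) don't yet cover the total.
function hasMorePages(offset, pageLength, total) {
  return offset + pageLength < total;
}
```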

File diff suppressed because it is too large

View File

@@ -1,76 +0,0 @@
import { useAuth } from '../hooks/useAuth';
export default function Layout({ children, activeTab, onTabChange }) {
const { user, logout } = useAuth();
const isAdmin = user?.is_admin || false;
// Note: If user becomes null, App.jsx will handle showing Login component
// We don't need to redirect here as App.jsx already checks for !user
return (
<div className="min-h-screen bg-gray-900">
<header className="bg-gray-800 shadow-sm border-b border-gray-700">
<div className="max-w-7xl mx-auto px-4 sm:px-6 lg:px-8">
<div className="flex justify-between items-center h-16">
<h1 className="text-2xl font-bold text-transparent bg-clip-text bg-gradient-to-r from-orange-500 to-amber-500">
JiggaBlend
</h1>
<div className="flex items-center gap-4">
<span className="text-gray-300">{user?.name || user?.email}</span>
<button
onClick={logout}
className="px-4 py-2 text-sm font-medium text-gray-200 bg-gray-700 border border-gray-600 rounded-lg hover:bg-gray-600 transition-colors"
>
Logout
</button>
</div>
</div>
</div>
</header>
<nav className="bg-gray-800 border-b border-gray-700">
<div className="max-w-7xl mx-auto px-4 sm:px-6 lg:px-8">
<div className="flex space-x-8">
<button
onClick={() => onTabChange('jobs')}
className={`py-4 px-1 border-b-2 font-medium text-sm transition-colors ${
activeTab === 'jobs'
? 'border-orange-500 text-orange-500'
: 'border-transparent text-gray-400 hover:text-gray-300 hover:border-gray-600'
}`}
>
Jobs
</button>
<button
onClick={() => onTabChange('submit')}
className={`py-4 px-1 border-b-2 font-medium text-sm transition-colors ${
activeTab === 'submit'
? 'border-orange-500 text-orange-500'
: 'border-transparent text-gray-400 hover:text-gray-300 hover:border-gray-600'
}`}
>
Submit Job
</button>
{isAdmin && (
<button
onClick={() => onTabChange('admin')}
className={`py-4 px-1 border-b-2 font-medium text-sm transition-colors ${
activeTab === 'admin'
? 'border-orange-500 text-orange-500'
: 'border-transparent text-gray-400 hover:text-gray-300 hover:border-gray-600'
}`}
>
Admin
</button>
)}
</div>
</div>
</nav>
<main className="max-w-7xl mx-auto px-4 sm:px-6 lg:px-8 py-8">
{children}
</main>
</div>
);
}

View File

@@ -1,19 +0,0 @@
import React from 'react';
/**
* Shared LoadingSpinner component with size variants
*/
export default function LoadingSpinner({ size = 'md', className = '', borderColor = 'border-orange-500' }) {
const sizeClasses = {
sm: 'h-8 w-8',
md: 'h-12 w-12',
lg: 'h-16 w-16',
};
return (
<div className={`flex justify-center items-center ${className}`}>
<div className={`animate-spin rounded-full border-b-2 ${borderColor} ${sizeClasses[size]}`}></div>
</div>
);
}

View File

@@ -1,277 +0,0 @@
import { useState, useEffect } from 'react';
import { auth } from '../utils/api';
import ErrorMessage from './ErrorMessage';
export default function Login() {
const [providers, setProviders] = useState({
google: false,
discord: false,
local: false,
});
const [showRegister, setShowRegister] = useState(false);
const [email, setEmail] = useState('');
const [name, setName] = useState('');
const [username, setUsername] = useState('');
const [password, setPassword] = useState('');
const [confirmPassword, setConfirmPassword] = useState('');
const [error, setError] = useState('');
const [loading, setLoading] = useState(false);
useEffect(() => {
checkAuthProviders();
// Check for registration disabled error in URL
const urlParams = new URLSearchParams(window.location.search);
if (urlParams.get('error') === 'registration_disabled') {
setError('Registration is currently disabled. Please contact an administrator.');
}
}, []);
const checkAuthProviders = async () => {
try {
const result = await auth.getProviders();
setProviders({
google: result.google || false,
discord: result.discord || false,
local: result.local || false,
});
} catch (error) {
// If endpoint fails, assume no providers are available
console.error('Failed to check auth providers:', error);
setProviders({ google: false, discord: false, local: false });
}
};
const handleLocalLogin = async (e) => {
e.preventDefault();
setError('');
setLoading(true);
try {
await auth.localLogin(username, password);
// Reload page to trigger auth check in App component
window.location.reload();
} catch (err) {
setError(err.message || 'Login failed');
setLoading(false);
}
};
const handleLocalRegister = async (e) => {
e.preventDefault();
setError('');
if (password !== confirmPassword) {
setError('Passwords do not match');
return;
}
if (password.length < 8) {
setError('Password must be at least 8 characters long');
return;
}
setLoading(true);
try {
await auth.localRegister(email, name, password);
// Reload page to trigger auth check in App component
window.location.reload();
} catch (err) {
setError(err.message || 'Registration failed');
setLoading(false);
}
};
return (
<div className="min-h-screen flex items-center justify-center bg-gray-900">
<div className="bg-gray-800 rounded-2xl shadow-2xl p-8 w-full max-w-md border border-gray-700">
<div className="text-center mb-8">
<h1 className="text-4xl font-bold text-transparent bg-clip-text bg-gradient-to-r from-orange-500 to-amber-500 mb-2">
JiggaBlend
</h1>
<p className="text-gray-400 text-lg">Blender Render Farm</p>
</div>
<div className="space-y-4">
<ErrorMessage error={error} className="text-sm" />
{providers.local && (
<div className="pb-4 border-b border-gray-700">
<div className="flex gap-2 mb-4">
<button
type="button"
onClick={() => {
setShowRegister(false);
setError('');
}}
className={`flex-1 py-2 px-4 rounded-lg font-medium transition-colors ${
!showRegister
? 'bg-orange-600 text-white'
: 'bg-gray-700 text-gray-300 hover:bg-gray-600'
}`}
>
Login
</button>
<button
type="button"
onClick={() => {
setShowRegister(true);
setError('');
}}
className={`flex-1 py-2 px-4 rounded-lg font-medium transition-colors ${
showRegister
? 'bg-orange-600 text-white'
: 'bg-gray-700 text-gray-300 hover:bg-gray-600'
}`}
>
Register
</button>
</div>
{!showRegister ? (
<form onSubmit={handleLocalLogin} className="space-y-4">
<div>
<label htmlFor="username" className="block text-sm font-medium text-gray-300 mb-1">
Email
</label>
<input
id="username"
type="email"
value={username}
onChange={(e) => setUsername(e.target.value)}
required
className="w-full px-4 py-2 bg-gray-900 border border-gray-600 rounded-lg text-gray-100 focus:ring-2 focus:ring-orange-500 focus:border-transparent placeholder-gray-500"
placeholder="Enter your email"
/>
</div>
<div>
<label htmlFor="password" className="block text-sm font-medium text-gray-300 mb-1">
Password
</label>
<input
id="password"
type="password"
value={password}
onChange={(e) => setPassword(e.target.value)}
required
className="w-full px-4 py-2 bg-gray-900 border border-gray-600 rounded-lg text-gray-100 focus:ring-2 focus:ring-orange-500 focus:border-transparent placeholder-gray-500"
placeholder="Enter password"
/>
</div>
<button
type="submit"
disabled={loading}
className="w-full bg-orange-600 text-white font-semibold py-3 px-6 rounded-lg hover:bg-orange-500 transition-all duration-200 shadow-lg disabled:opacity-50 disabled:cursor-not-allowed"
>
{loading ? 'Logging in...' : 'Login'}
</button>
</form>
) : (
<form onSubmit={handleLocalRegister} className="space-y-4">
<div>
<label htmlFor="reg-email" className="block text-sm font-medium text-gray-300 mb-1">
Email
</label>
<input
id="reg-email"
type="email"
value={email}
onChange={(e) => setEmail(e.target.value)}
required
className="w-full px-4 py-2 bg-gray-900 border border-gray-600 rounded-lg text-gray-100 focus:ring-2 focus:ring-orange-500 focus:border-transparent placeholder-gray-500"
placeholder="Enter your email"
/>
</div>
<div>
<label htmlFor="reg-name" className="block text-sm font-medium text-gray-300 mb-1">
Name
</label>
<input
id="reg-name"
type="text"
value={name}
onChange={(e) => setName(e.target.value)}
required
className="w-full px-4 py-2 bg-gray-900 border border-gray-600 rounded-lg text-gray-100 focus:ring-2 focus:ring-orange-500 focus:border-transparent placeholder-gray-500"
placeholder="Enter your name"
/>
</div>
<div>
<label htmlFor="reg-password" className="block text-sm font-medium text-gray-300 mb-1">
Password
</label>
<input
id="reg-password"
type="password"
value={password}
onChange={(e) => setPassword(e.target.value)}
required
minLength={8}
className="w-full px-4 py-2 bg-gray-900 border border-gray-600 rounded-lg text-gray-100 focus:ring-2 focus:ring-orange-500 focus:border-transparent placeholder-gray-500"
placeholder="At least 8 characters"
/>
</div>
<div>
<label htmlFor="reg-confirm-password" className="block text-sm font-medium text-gray-300 mb-1">
Confirm Password
</label>
<input
id="reg-confirm-password"
type="password"
value={confirmPassword}
onChange={(e) => setConfirmPassword(e.target.value)}
required
minLength={8}
className="w-full px-4 py-2 bg-gray-900 border border-gray-600 rounded-lg text-gray-100 focus:ring-2 focus:ring-orange-500 focus:border-transparent placeholder-gray-500"
placeholder="Confirm password"
/>
</div>
<button
type="submit"
disabled={loading}
className="w-full bg-orange-600 text-white font-semibold py-3 px-6 rounded-lg hover:bg-orange-500 transition-all duration-200 shadow-lg disabled:opacity-50 disabled:cursor-not-allowed"
>
{loading ? 'Registering...' : 'Register'}
</button>
</form>
)}
</div>
)}
{providers.google && (
<a
href="/api/auth/google/login"
className="w-full flex items-center justify-center gap-3 bg-gray-700 border-2 border-gray-600 text-gray-200 font-semibold py-3 px-6 rounded-lg hover:bg-gray-600 hover:border-gray-500 transition-all duration-200 shadow-sm"
>
<svg className="w-5 h-5" viewBox="0 0 24 24">
<path fill="#4285F4" d="M22.56 12.25c0-.78-.07-1.53-.2-2.25H12v4.26h5.92c-.26 1.37-1.04 2.53-2.21 3.31v2.77h3.57c2.08-1.92 3.28-4.74 3.28-8.09z"/>
<path fill="#34A853" d="M12 23c2.97 0 5.46-.98 7.28-2.66l-3.57-2.77c-.98.66-2.23 1.06-3.71 1.06-2.86 0-5.29-1.93-6.16-4.53H2.18v2.84C3.99 20.53 7.7 23 12 23z"/>
<path fill="#FBBC05" d="M5.84 14.09c-.22-.66-.35-1.36-.35-2.09s.13-1.43.35-2.09V7.07H2.18C1.43 8.55 1 10.22 1 12s.43 3.45 1.18 4.93l2.85-2.22.81-.62z"/>
<path fill="#EA4335" d="M12 5.38c1.62 0 3.06.56 4.21 1.64l3.15-3.15C17.45 2.09 14.97 1 12 1 7.7 1 3.99 3.47 2.18 7.07l3.66 2.84c.87-2.6 3.3-4.53 6.16-4.53z"/>
</svg>
Continue with Google
</a>
)}
{providers.discord && (
<a
href="/api/auth/discord/login"
className="w-full flex items-center justify-center gap-3 bg-[#5865F2] text-white font-semibold py-3 px-6 rounded-lg hover:bg-[#4752C4] transition-all duration-200 shadow-lg"
>
<svg className="w-5 h-5" fill="currentColor" viewBox="0 0 24 24">
<path d="M20.317 4.37a19.791 19.791 0 0 0-4.885-1.515a.074.074 0 0 0-.079.037c-.21.375-.444.864-.608 1.25a18.27 18.27 0 0 0-5.487 0a12.64 12.64 0 0 0-.617-1.25a.077.077 0 0 0-.079-.037A19.736 19.736 0 0 0 3.677 4.37a.07.07 0 0 0-.032.027C.533 9.046-.32 13.58.099 18.057a.082.082 0 0 0 .031.057a19.9 19.9 0 0 0 5.993 3.03a.078.078 0 0 0 .084-.028a14.09 14.09 0 0 0 1.226-1.994a.076.076 0 0 0-.041-.106a13.107 13.107 0 0 1-1.872-.892a.077.077 0 0 1-.008-.128a10.2 10.2 0 0 0 .372-.292a.074.074 0 0 1 .077-.01c3.928 1.793 8.18 1.793 12.062 0a.074.074 0 0 1 .078.01c.12.098.246.198.373.292a.077.077 0 0 1-.006.127a12.299 12.299 0 0 1-1.873.892a.077.077 0 0 0-.041.107c.36.698.772 1.362 1.225 1.993a.076.076 0 0 0 .084.028a19.839 19.839 0 0 0 6.002-3.03a.077.077 0 0 0 .032-.054c.5-5.177-.838-9.674-3.549-13.66a.061.061 0 0 0-.031-.03zM8.02 15.33c-1.183 0-2.157-1.085-2.157-2.419c0-1.333.956-2.419 2.157-2.419c1.21 0 2.176 1.096 2.157 2.42c0 1.333-.956 2.418-2.157 2.418zm7.975 0c-1.183 0-2.157-1.085-2.157-2.419c0-1.333.955-2.419 2.157-2.419c1.21 0 2.176 1.096 2.157 2.42c0 1.333-.946 2.418-2.157 2.418z"/>
</svg>
Continue with Discord
</a>
)}
{!providers.google && !providers.discord && !providers.local && (
<div className="p-4 bg-yellow-400/20 border border-yellow-400/50 rounded-lg text-yellow-400 text-sm text-center">
No authentication methods are configured. Please contact an administrator.
</div>
)}
</div>
</div>
</div>
);
}

@@ -1,137 +0,0 @@
import { useState } from 'react';
import { auth } from '../utils/api';
import ErrorMessage from './ErrorMessage';
import { useAuth } from '../hooks/useAuth';
export default function PasswordChange({ targetUserId = null, targetUserName = null, onSuccess }) {
const { user } = useAuth();
const [oldPassword, setOldPassword] = useState('');
const [newPassword, setNewPassword] = useState('');
const [confirmPassword, setConfirmPassword] = useState('');
const [error, setError] = useState('');
const [success, setSuccess] = useState('');
const [loading, setLoading] = useState(false);
const isAdmin = user?.is_admin || false;
const isChangingOtherUser = targetUserId !== null && isAdmin;
const handleSubmit = async (e) => {
e.preventDefault();
setError('');
setSuccess('');
if (newPassword !== confirmPassword) {
setError('New passwords do not match');
return;
}
if (newPassword.length < 8) {
setError('Password must be at least 8 characters long');
return;
}
if (!isChangingOtherUser && !oldPassword) {
setError('Old password is required');
return;
}
setLoading(true);
try {
await auth.changePassword(
isChangingOtherUser ? null : oldPassword,
newPassword,
isChangingOtherUser ? targetUserId : null
);
setSuccess('Password changed successfully');
setOldPassword('');
setNewPassword('');
setConfirmPassword('');
if (onSuccess) {
setTimeout(() => {
onSuccess();
}, 1500);
}
} catch (err) {
setError(err.message || 'Failed to change password');
} finally {
setLoading(false);
}
};
return (
<div className="bg-gray-800 rounded-lg shadow-md p-6 border border-gray-700">
<h2 className="text-xl font-semibold mb-4 text-gray-100">
{isChangingOtherUser ? `Change Password for ${targetUserName || 'User'}` : 'Change Password'}
</h2>
<ErrorMessage error={error} className="mb-4 text-sm" />
{success && (
<div className="mb-4 p-3 bg-green-400/20 border border-green-400/50 rounded-lg text-green-400 text-sm">
{success}
</div>
)}
<form onSubmit={handleSubmit} className="space-y-4">
{!isChangingOtherUser && (
<div>
<label htmlFor="old-password" className="block text-sm font-medium text-gray-300 mb-1">
Current Password
</label>
<input
id="old-password"
type="password"
value={oldPassword}
onChange={(e) => setOldPassword(e.target.value)}
required
className="w-full px-4 py-2 bg-gray-900 border border-gray-600 rounded-lg text-gray-100 focus:ring-2 focus:ring-orange-500 focus:border-transparent"
placeholder="Enter current password"
/>
</div>
)}
<div>
<label htmlFor="new-password" className="block text-sm font-medium text-gray-300 mb-1">
New Password
</label>
<input
id="new-password"
type="password"
value={newPassword}
onChange={(e) => setNewPassword(e.target.value)}
required
minLength={8}
className="w-full px-4 py-2 bg-gray-900 border border-gray-600 rounded-lg text-gray-100 focus:ring-2 focus:ring-orange-500 focus:border-transparent"
placeholder="At least 8 characters"
/>
</div>
<div>
<label htmlFor="confirm-password" className="block text-sm font-medium text-gray-300 mb-1">
Confirm New Password
</label>
<input
id="confirm-password"
type="password"
value={confirmPassword}
onChange={(e) => setConfirmPassword(e.target.value)}
required
minLength={8}
className="w-full px-4 py-2 bg-gray-900 border border-gray-600 rounded-lg text-gray-100 focus:ring-2 focus:ring-orange-500 focus:border-transparent"
placeholder="Confirm new password"
/>
</div>
<button
type="submit"
disabled={loading}
className="w-full bg-orange-600 text-white font-semibold py-2 px-4 rounded-lg hover:bg-orange-500 transition-colors disabled:opacity-50 disabled:cursor-not-allowed"
>
{loading ? 'Changing Password...' : 'Change Password'}
</button>
</form>
</div>
);
}
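The client-side checks in `handleSubmit` above (matching confirmation, minimum length, and requiring the old password unless an admin is changing another user's password) can be factored into a pure helper. This is an illustrative sketch, not code from the component; the function name is hypothetical:

```javascript
// Hypothetical helper mirroring the validation order in PasswordChange.handleSubmit.
// Returns an error string, or null when the input passes all checks.
function validatePasswordChange({ oldPassword, newPassword, confirmPassword, isChangingOtherUser }) {
  if (newPassword !== confirmPassword) return 'New passwords do not match';
  if (newPassword.length < 8) return 'Password must be at least 8 characters long';
  if (!isChangingOtherUser && !oldPassword) return 'Old password is required';
  return null;
}
```

A pure helper like this is trivially unit-testable, unlike validation interleaved with `setError` calls.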

@@ -1,179 +0,0 @@
import { useState, useEffect, useRef } from 'react';
import { admin, normalizeArrayResponse } from '../utils/api';
import { wsManager } from '../utils/websocket';
import JobDetails from './JobDetails';
import LoadingSpinner from './LoadingSpinner';
export default function UserJobs({ userId, userName, onBack }) {
const [jobList, setJobList] = useState([]);
const [loading, setLoading] = useState(true);
const [selectedJob, setSelectedJob] = useState(null);
const listenerIdRef = useRef(null);
useEffect(() => {
loadJobs();
// Use shared WebSocket manager for real-time updates instead of polling
listenerIdRef.current = wsManager.subscribe(`userjobs_${userId}`, {
open: () => {
console.log('UserJobs: Shared WebSocket connected');
loadJobs();
},
message: (data) => {
// Handle jobs channel messages (always broadcasted)
if (data.channel === 'jobs') {
if (data.type === 'job_update' && data.data) {
// Update job in list if it belongs to this user
setJobList(prev => {
const prevArray = Array.isArray(prev) ? prev : [];
const index = prevArray.findIndex(j => j.id === data.job_id);
if (index >= 0) {
const updated = [...prevArray];
updated[index] = { ...updated[index], ...data.data };
return updated;
}
// If job not in current list, reload to get updated list
if (data.data.status === 'completed' || data.data.status === 'failed') {
loadJobs();
}
return prevArray;
});
} else if (data.type === 'job_created' && data.data) {
// New job created - reload to check if it belongs to this user
loadJobs();
}
}
},
error: (error) => {
console.error('UserJobs: Shared WebSocket error:', error);
},
close: (event) => {
console.log('UserJobs: Shared WebSocket closed:', event);
}
});
// Ensure connection is established
wsManager.connect();
return () => {
if (listenerIdRef.current) {
wsManager.unsubscribe(listenerIdRef.current);
listenerIdRef.current = null;
}
};
}, [userId]);
const loadJobs = async () => {
try {
const data = await admin.getUserJobs(userId);
setJobList(normalizeArrayResponse(data));
} catch (error) {
console.error('Failed to load jobs:', error);
setJobList([]);
} finally {
setLoading(false);
}
};
const getStatusColor = (status) => {
const colors = {
pending: 'bg-yellow-400/20 text-yellow-400',
running: 'bg-orange-400/20 text-orange-400',
completed: 'bg-green-400/20 text-green-400',
failed: 'bg-red-400/20 text-red-400',
cancelled: 'bg-gray-500/20 text-gray-400',
};
return colors[status] || colors.pending;
};
if (selectedJob) {
return (
<JobDetails
job={selectedJob}
onClose={() => setSelectedJob(null)}
onUpdate={loadJobs}
/>
);
}
if (loading) {
return <LoadingSpinner size="md" className="h-64" />;
}
return (
<div className="space-y-6">
<div className="flex items-center justify-between">
<div>
<button
onClick={onBack}
className="text-gray-400 hover:text-gray-300 mb-2 flex items-center gap-2"
>
<svg className="w-5 h-5" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M15 19l-7-7 7-7" />
</svg>
Back to Users
</button>
<h2 className="text-2xl font-bold text-gray-100">Jobs for {userName}</h2>
</div>
</div>
{jobList.length === 0 ? (
<div className="text-center py-12">
<p className="text-gray-400 text-lg">No jobs found for this user.</p>
</div>
) : (
<div className="grid gap-6">
{jobList.map((job) => (
<div
key={job.id}
className="bg-gray-800 rounded-lg shadow-md hover:shadow-lg transition-shadow p-6 border-l-4 border-orange-500 border border-gray-700"
>
<div className="flex justify-between items-start mb-4">
<div>
<h3 className="text-xl font-semibold text-gray-100 mb-1">{job.name}</h3>
<p className="text-sm text-gray-400">
{job.job_type === 'render' && job.frame_start !== null && job.frame_end !== null
? `Frames ${job.frame_start}-${job.frame_end}`
: 'Metadata extraction'}
</p>
</div>
<span
className={`px-3 py-1 text-sm font-medium rounded-full ${getStatusColor(job.status)}`}
>
{job.status}
</span>
</div>
{job.status === 'running' && (
<div className="mb-4">
<div className="flex justify-between text-sm text-gray-400 mb-1">
<span>Progress</span>
<span>{Math.round(job.progress)}%</span>
</div>
<div className="w-full bg-gray-700 rounded-full h-2">
<div
className="bg-orange-500 h-2 rounded-full transition-all duration-300"
style={{ width: `${job.progress}%` }}
></div>
</div>
</div>
)}
<div className="flex items-center justify-between">
<div className="text-sm text-gray-400">
Created: {new Date(job.created_at).toLocaleString()}
</div>
<button
onClick={() => setSelectedJob(job)}
className="px-4 py-2 bg-orange-600 text-white rounded-lg hover:bg-orange-500 transition-colors font-medium"
>
View Details
</button>
</div>
</div>
))}
</div>
)}
</div>
);
}
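The immutable update inside the `job_update` handler above can be expressed as a standalone reducer: patch the matching job in place, or signal that a reload is needed when a finished job is not in the current list. A sketch under those assumptions (the function name is illustrative, not from the component):

```javascript
// Hypothetical pure version of the job_update merge used in UserJobs.
function applyJobUpdate(jobList, jobId, patch) {
  const list = Array.isArray(jobList) ? jobList : [];
  const index = list.findIndex((j) => j.id === jobId);
  if (index >= 0) {
    // Shallow-copy the array and the patched job so React sees new references.
    const updated = [...list];
    updated[index] = { ...updated[index], ...patch };
    return { list: updated, needsReload: false };
  }
  // Job not in this user's list: only a terminal status warrants a refetch.
  const finished = patch.status === 'completed' || patch.status === 'failed';
  return { list, needsReload: finished };
}
```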

@@ -1,90 +0,0 @@
import { useState, useRef, useEffect } from 'react';
import ErrorMessage from './ErrorMessage';
import LoadingSpinner from './LoadingSpinner';
export default function VideoPlayer({ videoUrl, onClose }) {
const videoRef = useRef(null);
const [loading, setLoading] = useState(true);
const [error, setError] = useState(null);
useEffect(() => {
const video = videoRef.current;
if (!video || !videoUrl) return;
const handleCanPlay = () => {
setLoading(false);
};
const handleError = (e) => {
console.error('Video playback error:', e, video.error);
// Get more detailed error information
let errorMsg = 'Failed to load video';
if (video.error) {
switch (video.error.code) {
case video.error.MEDIA_ERR_ABORTED:
errorMsg = 'Video loading aborted';
break;
case video.error.MEDIA_ERR_NETWORK:
errorMsg = 'Network error while loading video';
break;
case video.error.MEDIA_ERR_DECODE:
errorMsg = 'Video decoding error';
break;
case video.error.MEDIA_ERR_SRC_NOT_SUPPORTED:
errorMsg = 'Video format not supported';
break;
}
}
setError(errorMsg);
setLoading(false);
};
const handleLoadStart = () => {
setLoading(true);
setError(null);
};
video.addEventListener('canplay', handleCanPlay);
video.addEventListener('error', handleError);
video.addEventListener('loadstart', handleLoadStart);
return () => {
video.removeEventListener('canplay', handleCanPlay);
video.removeEventListener('error', handleError);
video.removeEventListener('loadstart', handleLoadStart);
};
}, [videoUrl]);
if (error) {
return (
<div>
<ErrorMessage error={error} />
<div className="mt-2 text-sm text-gray-400">
<a href={videoUrl} download className="text-orange-400 hover:text-orange-300 underline">Download video instead</a>
</div>
</div>
);
}
return (
<div className="relative bg-black rounded-lg overflow-hidden">
{loading && (
<div className="absolute inset-0 flex items-center justify-center bg-black bg-opacity-50 z-10">
<LoadingSpinner size="lg" className="border-white" />
</div>
)}
<video
ref={videoRef}
src={videoUrl}
controls
className="w-full"
onLoadedData={() => setLoading(false)}
preload="metadata"
>
Your browser does not support the video tag.
<a href={videoUrl} download>Download the video</a>
</video>
</div>
);
}
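The `switch` in the error handler above maps the four standard `MediaError` codes (1 ABORTED, 2 NETWORK, 3 DECODE, 4 SRC_NOT_SUPPORTED) to user-facing messages. A table-driven sketch with the same behavior (assumed equivalent; not the component's actual code):

```javascript
// Hypothetical lookup equivalent to the switch in VideoPlayer's error handler.
const MEDIA_ERROR_MESSAGES = {
  1: 'Video loading aborted',            // MEDIA_ERR_ABORTED
  2: 'Network error while loading video', // MEDIA_ERR_NETWORK
  3: 'Video decoding error',              // MEDIA_ERR_DECODE
  4: 'Video format not supported',        // MEDIA_ERR_SRC_NOT_SUPPORTED
};

function describeMediaError(mediaError) {
  if (!mediaError) return 'Failed to load video';
  return MEDIA_ERROR_MESSAGES[mediaError.code] || 'Failed to load video';
}
```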

@@ -1,88 +0,0 @@
import { useState, useEffect, useRef } from 'react';
import { auth, setAuthErrorHandler } from '../utils/api';
export function useAuth() {
const [user, setUser] = useState(null);
const [loading, setLoading] = useState(true);
const userRef = useRef(user);
// Keep userRef in sync with user state
useEffect(() => {
userRef.current = user;
}, [user]);
const checkAuth = async () => {
try {
const userData = await auth.getMe();
setUser(userData);
setLoading(false);
return userData; // Return user data for verification
} catch (error) {
// If 401/403, user is not authenticated
// Check if it's an auth error
if (error.message && (error.message.includes('Unauthorized') || error.message.includes('401') || error.message.includes('403'))) {
setUser(null);
setLoading(false);
throw error; // Re-throw to allow caller to handle
} else {
// Other errors (network, etc.) - don't log out, just re-throw
// This prevents network issues from logging users out
setLoading(false);
throw error; // Re-throw to allow caller to handle
}
}
};
const logout = async () => {
try {
await auth.logout();
} catch (error) {
console.error('Logout error:', error);
} finally {
// Refresh the page to show login
window.location.reload();
}
};
useEffect(() => {
// Set up global auth error handler
setAuthErrorHandler(() => {
setUser(null);
setLoading(false);
});
// Listen for auth errors from API calls
const handleAuthErrorEvent = () => {
setUser(null);
setLoading(false);
};
window.addEventListener('auth-error', handleAuthErrorEvent);
// Initial auth check
checkAuth();
// Periodic auth check every 10 seconds
const authInterval = setInterval(() => {
// Use ref to check current user state without dependency
if (userRef.current) {
// Only check if we have a user (don't spam when logged out)
checkAuth().catch((error) => {
// Only log out if it's actually an auth error, not a network error
// Network errors shouldn't log the user out
if (error.message && (error.message.includes('Unauthorized') || error.message.includes('401') || error.message.includes('403'))) {
// This is a real auth error - user will be set to null by checkAuth
}
// For other errors (network, etc.), don't log out - just silently fail
});
}
}, 10000); // 10 seconds
return () => {
window.removeEventListener('auth-error', handleAuthErrorEvent);
clearInterval(authInterval);
};
}, []); // Empty deps - only run on mount/unmount
return { user, loading, logout, refresh: checkAuth };
}
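The string test that distinguishes authentication failures from network failures appears twice above, in `checkAuth` and in the interval callback. A shared predicate would keep the two in sync; this is an illustrative sketch only, not part of the hook:

```javascript
// Hypothetical predicate matching the checks used in useAuth:
// only 401/403-style messages count as auth failures, so transient
// network errors never log the user out.
function isAuthError(error) {
  const msg = error && error.message;
  if (!msg) return false;
  return msg.includes('Unauthorized') || msg.includes('401') || msg.includes('403');
}
```

Matching on error-message strings is fragile; carrying the HTTP status on a custom error class would be sturdier, but the predicate at least centralizes the fragility.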

@@ -1,10 +0,0 @@
import React from 'react'
import ReactDOM from 'react-dom/client'
import App from './App.jsx'
ReactDOM.createRoot(document.getElementById('root')).render(
<React.StrictMode>
<App />
</React.StrictMode>,
)

@@ -1,14 +0,0 @@
@tailwind base;
@tailwind components;
@tailwind utilities;
body {
margin: 0;
font-family: -apple-system, BlinkMacSystemFont, 'Segoe UI', 'Roboto', 'Oxygen',
'Ubuntu', 'Cantarell', 'Fira Sans', 'Droid Sans', 'Helvetica Neue',
sans-serif;
-webkit-font-smoothing: antialiased;
-moz-osx-font-smoothing: grayscale;
@apply bg-gray-900 text-gray-100;
}

@@ -1,552 +0,0 @@
const API_BASE = '/api';
// Global auth error handler - will be set by useAuth hook
let onAuthError = null;
// Request debouncing and deduplication
const pendingRequests = new Map(); // key: endpoint+params, value: Promise
const requestQueue = new Map(); // key: endpoint+params, value: { resolve, reject, timestamp }
const DEBOUNCE_DELAY = 100; // 100ms debounce delay
const DEDUPE_WINDOW = 5000; // 5 seconds - same request within this window uses cached promise
// Generate cache key from endpoint and params
function getCacheKey(endpoint, options = {}) {
const params = new URLSearchParams();
Object.keys(options).sort().forEach(key => {
if (options[key] !== undefined && options[key] !== null) {
params.append(key, String(options[key]));
}
});
const query = params.toString();
return `${endpoint}${query ? '?' + query : ''}`;
}
// Utility function to normalize array responses (handles both old and new formats)
export function normalizeArrayResponse(response) {
const data = response?.data || response;
return Array.isArray(data) ? data : [];
}
// Sentinel value to indicate a request was superseded (instead of rejecting)
// Export it so components can check for it
export const REQUEST_SUPERSEDED = Symbol('REQUEST_SUPERSEDED');
// Debounced request wrapper
function debounceRequest(key, requestFn, delay = DEBOUNCE_DELAY) {
return new Promise((resolve, reject) => {
// Check if there's a pending request for this key
if (pendingRequests.has(key)) {
const pending = pendingRequests.get(key);
// If request is very recent (within dedupe window), reuse it
const now = Date.now();
if (pending.timestamp && (now - pending.timestamp) < DEDUPE_WINDOW) {
pending.promise.then(resolve).catch(reject);
return;
} else {
// Request is older than dedupe window - remove it and create new one
pendingRequests.delete(key);
}
}
// Clear any existing timeout for this key
if (requestQueue.has(key)) {
const queued = requestQueue.get(key);
clearTimeout(queued.timeout);
// Resolve with sentinel value instead of rejecting - this prevents errors from propagating
// The new request will handle the actual response
queued.resolve(REQUEST_SUPERSEDED);
}
// Queue new request
const timeout = setTimeout(() => {
requestQueue.delete(key);
const promise = requestFn();
const timestamp = Date.now();
pendingRequests.set(key, { promise, timestamp });
promise
.then(result => {
pendingRequests.delete(key);
resolve(result);
})
.catch(error => {
pendingRequests.delete(key);
reject(error);
});
}, delay);
requestQueue.set(key, { resolve, reject, timeout });
});
}
export const setAuthErrorHandler = (handler) => {
onAuthError = handler;
};
// Whitelist of endpoints that should NOT trigger auth error handling
// These are endpoints that can legitimately return 401/403 without meaning the user is logged out
const AUTH_CHECK_ENDPOINTS = ['/auth/me', '/auth/logout'];
const handleAuthError = (response, endpoint) => {
if (response.status === 401 || response.status === 403) {
// Don't trigger auth error handler for endpoints that check auth status
if (AUTH_CHECK_ENDPOINTS.includes(endpoint)) {
return;
}
// Trigger auth error handler if set (this will clear user state)
if (onAuthError) {
onAuthError();
}
// Force a re-check of auth status to ensure login is shown
// This ensures the App component re-renders with user=null
if (typeof window !== 'undefined') {
// Dispatch a custom event that useAuth can listen to
window.dispatchEvent(new CustomEvent('auth-error'));
}
}
};
// Extract error message from response - centralized to avoid duplication
async function extractErrorMessage(response) {
try {
const errorData = await response.json();
return errorData?.error || response.statusText;
} catch {
return response.statusText;
}
}
export const api = {
async get(endpoint, options = {}) {
// options.signal is an AbortSignal; fall back to a fresh controller's signal
const signal = options.signal || new AbortController().signal;
const response = await fetch(`${API_BASE}${endpoint}`, {
credentials: 'include', // Include cookies for session
signal,
});
if (!response.ok) {
// Handle auth errors before parsing response
handleAuthError(response, endpoint);
const errorMessage = await extractErrorMessage(response);
throw new Error(errorMessage);
}
return response.json();
},
async post(endpoint, data, options = {}) {
// options.signal is an AbortSignal; fall back to a fresh controller's signal
const signal = options.signal || new AbortController().signal;
const response = await fetch(`${API_BASE}${endpoint}`, {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify(data),
credentials: 'include', // Include cookies for session
signal,
});
if (!response.ok) {
// Handle auth errors before parsing response
handleAuthError(response, endpoint);
const errorMessage = await extractErrorMessage(response);
throw new Error(errorMessage);
}
return response.json();
},
async patch(endpoint, data, options = {}) {
// options.signal is an AbortSignal; fall back to a fresh controller's signal
const signal = options.signal || new AbortController().signal;
const response = await fetch(`${API_BASE}${endpoint}`, {
method: 'PATCH',
headers: { 'Content-Type': 'application/json' },
body: data ? JSON.stringify(data) : undefined,
credentials: 'include', // Include cookies for session
signal,
});
if (!response.ok) {
// Handle auth errors before parsing response
handleAuthError(response, endpoint);
const errorMessage = await extractErrorMessage(response);
throw new Error(errorMessage);
}
return response.json();
},
async delete(endpoint, options = {}) {
// options.signal is an AbortSignal; fall back to a fresh controller's signal
const signal = options.signal || new AbortController().signal;
const response = await fetch(`${API_BASE}${endpoint}`, {
method: 'DELETE',
credentials: 'include', // Include cookies for session
signal,
});
if (!response.ok) {
// Handle auth errors before parsing response
handleAuthError(response, endpoint);
const errorMessage = await extractErrorMessage(response);
throw new Error(errorMessage);
}
return response.json();
},
async uploadFile(endpoint, file, onProgress, mainBlendFile) {
return new Promise((resolve, reject) => {
const formData = new FormData();
formData.append('file', file);
if (mainBlendFile) {
formData.append('main_blend_file', mainBlendFile);
}
const xhr = new XMLHttpRequest();
// Track upload progress
if (onProgress) {
xhr.upload.addEventListener('progress', (e) => {
if (e.lengthComputable) {
const percentComplete = (e.loaded / e.total) * 100;
onProgress(percentComplete);
}
});
}
xhr.addEventListener('load', () => {
if (xhr.status >= 200 && xhr.status < 300) {
try {
const response = JSON.parse(xhr.responseText);
resolve(response);
} catch (err) {
resolve(xhr.responseText);
}
} else {
// Handle auth errors
if (xhr.status === 401 || xhr.status === 403) {
handleAuthError({ status: xhr.status }, endpoint);
}
try {
const errorData = JSON.parse(xhr.responseText);
reject(new Error(errorData.error || xhr.statusText));
} catch {
reject(new Error(xhr.statusText));
}
}
});
xhr.addEventListener('error', () => {
reject(new Error('Upload failed'));
});
xhr.addEventListener('abort', () => {
reject(new Error('Upload aborted'));
});
xhr.open('POST', `${API_BASE}${endpoint}`);
xhr.withCredentials = true; // Include cookies for session
xhr.send(formData);
});
},
};
export const auth = {
async getMe() {
return api.get('/auth/me');
},
async logout() {
return api.post('/auth/logout');
},
async getProviders() {
return api.get('/auth/providers');
},
async isLocalLoginAvailable() {
return api.get('/auth/local/available');
},
async localRegister(email, name, password) {
return api.post('/auth/local/register', { email, name, password });
},
async localLogin(username, password) {
return api.post('/auth/local/login', { username, password });
},
async changePassword(oldPassword, newPassword, targetUserId = null) {
const body = { old_password: oldPassword, new_password: newPassword };
if (targetUserId !== null) {
body.target_user_id = targetUserId;
}
return api.post('/auth/change-password', body);
},
};
export const jobs = {
async list(options = {}) {
const key = getCacheKey('/jobs', options);
return debounceRequest(key, () => {
const params = new URLSearchParams();
if (options.limit) params.append('limit', options.limit.toString());
if (options.offset) params.append('offset', options.offset.toString());
if (options.status) params.append('status', options.status);
if (options.sort) params.append('sort', options.sort);
const query = params.toString();
return api.get(`/jobs${query ? '?' + query : ''}`);
});
},
async listSummary(options = {}) {
const key = getCacheKey('/jobs/summary', options);
return debounceRequest(key, () => {
const params = new URLSearchParams();
if (options.limit) params.append('limit', options.limit.toString());
if (options.offset) params.append('offset', options.offset.toString());
if (options.status) params.append('status', options.status);
if (options.sort) params.append('sort', options.sort);
const query = params.toString();
return api.get(`/jobs/summary${query ? '?' + query : ''}`, options);
});
},
async get(id, options = {}) {
const key = getCacheKey(`/jobs/${id}`, options);
return debounceRequest(key, async () => {
if (options.etag) {
// Include ETag in request headers for conditional requests
const headers = { 'If-None-Match': options.etag };
const response = await fetch(`${API_BASE}/jobs/${id}`, {
credentials: 'include',
headers,
});
if (response.status === 304) {
return null; // Not modified
}
if (!response.ok) {
const errorData = await response.json().catch(() => null);
throw new Error(errorData?.error || response.statusText);
}
return response.json();
}
return api.get(`/jobs/${id}`, options);
});
},
async create(jobData) {
return api.post('/jobs', jobData);
},
async cancel(id) {
return api.delete(`/jobs/${id}`);
},
async delete(id) {
return api.post(`/jobs/${id}/delete`);
},
async uploadFile(jobId, file, onProgress, mainBlendFile) {
return api.uploadFile(`/jobs/${jobId}/upload`, file, onProgress, mainBlendFile);
},
async uploadFileForJobCreation(file, onProgress, mainBlendFile) {
return api.uploadFile(`/jobs/upload`, file, onProgress, mainBlendFile);
},
async getFiles(jobId, options = {}) {
const key = getCacheKey(`/jobs/${jobId}/files`, options);
return debounceRequest(key, () => {
const params = new URLSearchParams();
if (options.limit) params.append('limit', options.limit.toString());
if (options.offset) params.append('offset', options.offset.toString());
if (options.file_type) params.append('file_type', options.file_type);
if (options.extension) params.append('extension', options.extension);
const query = params.toString();
return api.get(`/jobs/${jobId}/files${query ? '?' + query : ''}`, options);
});
},
async getFilesCount(jobId, options = {}) {
const key = getCacheKey(`/jobs/${jobId}/files/count`, options);
return debounceRequest(key, () => {
const params = new URLSearchParams();
if (options.file_type) params.append('file_type', options.file_type);
const query = params.toString();
return api.get(`/jobs/${jobId}/files/count${query ? '?' + query : ''}`);
});
},
async getContextArchive(jobId, options = {}) {
return api.get(`/jobs/${jobId}/context`, options);
},
downloadFile(jobId, fileId) {
return `${API_BASE}/jobs/${jobId}/files/${fileId}/download`;
},
previewEXR(jobId, fileId) {
return `${API_BASE}/jobs/${jobId}/files/${fileId}/preview-exr`;
},
getVideoUrl(jobId) {
return `${API_BASE}/jobs/${jobId}/video`;
},
async getTaskLogs(jobId, taskId, options = {}) {
const key = getCacheKey(`/jobs/${jobId}/tasks/${taskId}/logs`, options);
return debounceRequest(key, async () => {
const params = new URLSearchParams();
if (options.stepName) params.append('step_name', options.stepName);
if (options.logLevel) params.append('log_level', options.logLevel);
if (options.limit) params.append('limit', options.limit.toString());
if (options.sinceId) params.append('since_id', options.sinceId.toString());
const query = params.toString();
const result = await api.get(`/jobs/${jobId}/tasks/${taskId}/logs${query ? '?' + query : ''}`, options);
// Handle both old format (array) and new format (object with logs, last_id, limit)
if (Array.isArray(result)) {
return { logs: result, last_id: result.length > 0 ? result[result.length - 1].id : 0, limit: options.limit || 100 };
}
return result;
});
},
async getTaskSteps(jobId, taskId, options = {}) {
return api.get(`/jobs/${jobId}/tasks/${taskId}/steps`, options);
},
// New unified client WebSocket - DEPRECATED: Use wsManager from websocket.js instead
// This is kept for backwards compatibility but should not be used
streamClientWebSocket() {
console.warn('streamClientWebSocket() is deprecated - use wsManager from websocket.js instead');
const wsProtocol = window.location.protocol === 'https:' ? 'wss:' : 'ws:';
const wsHost = window.location.host;
const url = `${wsProtocol}//${wsHost}${API_BASE}/ws`;
return new WebSocket(url);
},
// Old WebSocket methods (to be removed after migration)
streamTaskLogsWebSocket(jobId, taskId, lastId = 0) {
// Convert HTTP to WebSocket URL
const wsProtocol = window.location.protocol === 'https:' ? 'wss:' : 'ws:';
const wsHost = window.location.host;
const url = `${wsProtocol}//${wsHost}${API_BASE}/jobs/${jobId}/tasks/${taskId}/logs/ws?last_id=${lastId}`;
return new WebSocket(url);
},
streamJobsWebSocket() {
const wsProtocol = window.location.protocol === 'https:' ? 'wss:' : 'ws:';
const wsHost = window.location.host;
const url = `${wsProtocol}//${wsHost}${API_BASE}/jobs/ws-old`;
return new WebSocket(url);
},
streamJobWebSocket(jobId) {
const wsProtocol = window.location.protocol === 'https:' ? 'wss:' : 'ws:';
const wsHost = window.location.host;
const url = `${wsProtocol}//${wsHost}${API_BASE}/jobs/${jobId}/ws`;
return new WebSocket(url);
},
async retryTask(jobId, taskId) {
return api.post(`/jobs/${jobId}/tasks/${taskId}/retry`);
},
async getMetadata(jobId) {
return api.get(`/jobs/${jobId}/metadata`);
},
async getTasks(jobId, options = {}) {
const key = getCacheKey(`/jobs/${jobId}/tasks`, options);
return debounceRequest(key, () => {
const params = new URLSearchParams();
if (options.limit) params.append('limit', options.limit.toString());
if (options.offset) params.append('offset', options.offset.toString());
if (options.status) params.append('status', options.status);
if (options.frameStart) params.append('frame_start', options.frameStart.toString());
if (options.frameEnd) params.append('frame_end', options.frameEnd.toString());
if (options.sort) params.append('sort', options.sort);
const query = params.toString();
return api.get(`/jobs/${jobId}/tasks${query ? '?' + query : ''}`, options);
});
},
async getTasksSummary(jobId, options = {}) {
const key = getCacheKey(`/jobs/${jobId}/tasks/summary`, options);
return debounceRequest(key, () => {
const params = new URLSearchParams();
if (options.limit) params.append('limit', options.limit.toString());
if (options.offset) params.append('offset', options.offset.toString());
if (options.status) params.append('status', options.status);
if (options.sort) params.append('sort', options.sort);
const query = params.toString();
return api.get(`/jobs/${jobId}/tasks/summary${query ? '?' + query : ''}`, options);
});
},
async batchGetJobs(jobIds) {
// Sort jobIds for consistent cache key
const sortedIds = [...jobIds].sort((a, b) => a - b);
const key = getCacheKey('/jobs/batch', { job_ids: sortedIds.join(',') });
return debounceRequest(key, () => {
return api.post('/jobs/batch', { job_ids: jobIds });
});
},
async batchGetTasks(jobId, taskIds) {
// Sort taskIds for consistent cache key
const sortedIds = [...taskIds].sort((a, b) => a - b);
const key = getCacheKey(`/jobs/${jobId}/tasks/batch`, { task_ids: sortedIds.join(',') });
return debounceRequest(key, () => {
return api.post(`/jobs/${jobId}/tasks/batch`, { task_ids: taskIds });
});
},
};
export const runners = {
// Non-admin runner list removed - use admin.listRunners() instead
};
export const admin = {
async generateAPIKey(name, description, scope) {
const data = { name, scope };
if (description) data.description = description;
return api.post('/admin/runners/api-keys', data);
},
async listAPIKeys() {
return api.get('/admin/runners/api-keys');
},
async revokeAPIKey(keyId) {
return api.patch(`/admin/runners/api-keys/${keyId}/revoke`);
},
async deleteAPIKey(keyId) {
return api.delete(`/admin/runners/api-keys/${keyId}`);
},
async listRunners() {
return api.get('/admin/runners');
},
async verifyRunner(runnerId) {
return api.post(`/admin/runners/${runnerId}/verify`);
},
async deleteRunner(runnerId) {
return api.delete(`/admin/runners/${runnerId}`);
},
async listUsers() {
return api.get('/admin/users');
},
async getUserJobs(userId) {
return api.get(`/admin/users/${userId}/jobs`);
},
async setUserAdminStatus(userId, isAdmin) {
return api.post(`/admin/users/${userId}/admin`, { is_admin: isAdmin });
},
async getRegistrationEnabled() {
return api.get('/admin/settings/registration');
},
async setRegistrationEnabled(enabled) {
return api.post('/admin/settings/registration', { enabled });
},
};
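The task and batch helpers in the `jobs` export above build a stable cache key (sorting IDs so logically identical requests match) and funnel it through `debounceRequest` so concurrent identical calls share one HTTP request. The real `getCacheKey`/`debounceRequest` are defined elsewhere in this file; a minimal sketch of the pattern they imply, assumptions and all:

```javascript
// Hypothetical sketch of the cache-key + in-flight-request sharing the
// batch helpers above rely on; the actual implementations live elsewhere.
const inflight = new Map();

function getCacheKey(path, params = {}) {
  // Stable key: path plus query params in sorted order, so
  // {a: 1, b: 2} and {b: 2, a: 1} produce the same key.
  const parts = Object.keys(params).sort().map((k) => `${k}=${params[k]}`);
  return parts.length ? `${path}?${parts.join('&')}` : path;
}

function debounceRequest(key, fn) {
  // Reuse the in-flight promise for identical concurrent requests;
  // drop it from the map once it settles so later calls refetch.
  if (inflight.has(key)) return inflight.get(key);
  const p = Promise.resolve().then(fn).finally(() => inflight.delete(key));
  inflight.set(key, p);
  return p;
}
```

This is why `batchGetJobs` sorts a copy of `jobIds` for the key but posts the caller's original order: the key only needs to be canonical, not the payload.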

@@ -1,271 +0,0 @@
// Shared WebSocket connection manager
// All components should use this instead of creating their own connections
class WebSocketManager {
constructor() {
this.ws = null;
this.listeners = new Map(); // Map of listener IDs to callback functions
this.reconnectTimeout = null;
this.reconnectDelay = 2000;
this.isConnecting = false;
this.listenerIdCounter = 0;
this.verboseLogging = false; // Set to true to enable verbose WebSocket logging
// Track server-side channel subscriptions for re-subscription on reconnect
this.serverSubscriptions = new Set(); // Channels we want to be subscribed to
this.confirmedSubscriptions = new Set(); // Channels confirmed by server
this.pendingSubscriptions = new Set(); // Channels waiting for confirmation
}
connect() {
// If already connected or connecting, don't create a new connection
if (this.ws && (this.ws.readyState === WebSocket.CONNECTING || this.ws.readyState === WebSocket.OPEN)) {
return;
}
if (this.isConnecting) {
return;
}
this.isConnecting = true;
try {
const wsProtocol = window.location.protocol === 'https:' ? 'wss:' : 'ws:';
const wsHost = window.location.host;
const API_BASE = '/api';
const url = `${wsProtocol}//${wsHost}${API_BASE}/jobs/ws`;
this.ws = new WebSocket(url);
this.ws.onopen = () => {
if (this.verboseLogging) {
console.log('Shared WebSocket connected');
}
this.isConnecting = false;
// Re-subscribe to all channels that were previously subscribed
this.resubscribeToChannels();
this.notifyListeners('open', {});
};
this.ws.onmessage = (event) => {
try {
const data = JSON.parse(event.data);
if (this.verboseLogging) {
console.log('WebSocketManager: Message received:', data.type, data.channel || 'no channel', data);
}
this.notifyListeners('message', data);
} catch (error) {
console.error('WebSocketManager: Failed to parse message:', error, 'Raw data:', event.data);
}
};
this.ws.onerror = (error) => {
console.error('Shared WebSocket error:', error);
this.isConnecting = false;
this.notifyListeners('error', error);
};
this.ws.onclose = (event) => {
if (this.verboseLogging) {
console.log('Shared WebSocket closed:', {
code: event.code,
reason: event.reason,
wasClean: event.wasClean
});
}
this.ws = null;
this.isConnecting = false;
// Clear confirmed/pending but keep serverSubscriptions for re-subscription
this.confirmedSubscriptions.clear();
this.pendingSubscriptions.clear();
this.notifyListeners('close', event);
// Always retry connection if we have listeners
if (this.listeners.size > 0) {
if (this.reconnectTimeout) {
clearTimeout(this.reconnectTimeout);
}
this.reconnectTimeout = setTimeout(() => {
if (!this.ws || this.ws.readyState === WebSocket.CLOSED) {
this.connect();
}
}, this.reconnectDelay);
}
};
} catch (error) {
console.error('Failed to create WebSocket:', error);
this.isConnecting = false;
// Retry after delay
this.reconnectTimeout = setTimeout(() => {
this.connect();
}, this.reconnectDelay);
}
}
subscribe(listenerId, callbacks) {
// Generate ID if not provided
if (!listenerId) {
listenerId = `listener_${this.listenerIdCounter++}`;
}
if (this.verboseLogging) {
console.log('WebSocketManager: Subscribing listener:', listenerId, 'WebSocket state:', this.ws ? this.ws.readyState : 'no connection');
}
this.listeners.set(listenerId, callbacks);
// Connect if not already connected
if (!this.ws || this.ws.readyState === WebSocket.CLOSED) {
if (this.verboseLogging) {
console.log('WebSocketManager: WebSocket not connected, connecting...');
}
this.connect();
}
// If already open, notify immediately
if (this.ws && this.ws.readyState === WebSocket.OPEN && callbacks.open) {
if (this.verboseLogging) {
console.log('WebSocketManager: WebSocket already open, calling open callback for listener:', listenerId);
}
// Use setTimeout to ensure this happens after the listener is registered
setTimeout(() => {
if (callbacks.open) {
callbacks.open();
}
}, 0);
}
return listenerId;
}
unsubscribe(listenerId) {
this.listeners.delete(listenerId);
// If no more listeners, we could close the connection, but let's keep it open
// in case other components need it
}
send(data) {
if (this.ws && this.ws.readyState === WebSocket.OPEN) {
if (this.verboseLogging) {
console.log('WebSocketManager: Sending message:', data);
}
this.ws.send(JSON.stringify(data));
} else {
console.warn('WebSocketManager: Cannot send message - connection not open. State:', this.ws ? this.ws.readyState : 'no connection', 'Message:', data);
}
}
notifyListeners(eventType, data) {
this.listeners.forEach((callbacks) => {
if (callbacks[eventType]) {
try {
callbacks[eventType](data);
} catch (error) {
console.error('Error in WebSocket listener:', error);
}
}
});
}
getReadyState() {
return this.ws ? this.ws.readyState : WebSocket.CLOSED;
}
// Subscribe to a server-side channel (will be re-subscribed on reconnect)
subscribeToChannel(channel) {
if (this.serverSubscriptions.has(channel)) {
// Already subscribed or pending
return;
}
this.serverSubscriptions.add(channel);
if (this.ws && this.ws.readyState === WebSocket.OPEN) {
if (!this.confirmedSubscriptions.has(channel) && !this.pendingSubscriptions.has(channel)) {
this.pendingSubscriptions.add(channel);
this.send({ type: 'subscribe', channel });
if (this.verboseLogging) {
console.log('WebSocketManager: Subscribing to channel:', channel);
}
}
}
}
// Unsubscribe from a server-side channel (won't be re-subscribed on reconnect)
unsubscribeFromChannel(channel) {
this.serverSubscriptions.delete(channel);
this.confirmedSubscriptions.delete(channel);
this.pendingSubscriptions.delete(channel);
if (this.ws && this.ws.readyState === WebSocket.OPEN) {
this.send({ type: 'unsubscribe', channel });
if (this.verboseLogging) {
console.log('WebSocketManager: Unsubscribing from channel:', channel);
}
}
}
// Mark a channel subscription as confirmed (call this when server confirms)
confirmSubscription(channel) {
this.pendingSubscriptions.delete(channel);
this.confirmedSubscriptions.add(channel);
if (this.verboseLogging) {
console.log('WebSocketManager: Subscription confirmed for channel:', channel);
}
}
// Mark a channel subscription as failed (call this when server rejects)
failSubscription(channel) {
this.pendingSubscriptions.delete(channel);
this.serverSubscriptions.delete(channel);
if (this.verboseLogging) {
console.log('WebSocketManager: Subscription failed for channel:', channel);
}
}
// Check if subscribed to a channel
isSubscribedToChannel(channel) {
return this.confirmedSubscriptions.has(channel);
}
// Re-subscribe to all channels after reconnect
resubscribeToChannels() {
if (this.serverSubscriptions.size === 0) {
return;
}
if (this.verboseLogging) {
console.log('WebSocketManager: Re-subscribing to channels:', Array.from(this.serverSubscriptions));
}
for (const channel of this.serverSubscriptions) {
if (!this.pendingSubscriptions.has(channel)) {
this.pendingSubscriptions.add(channel);
this.send({ type: 'subscribe', channel });
}
}
}
disconnect() {
if (this.reconnectTimeout) {
clearTimeout(this.reconnectTimeout);
this.reconnectTimeout = null;
}
if (this.ws) {
this.ws.close();
this.ws = null;
}
this.listeners.clear();
this.serverSubscriptions.clear();
this.confirmedSubscriptions.clear();
this.pendingSubscriptions.clear();
}
}
// Export singleton instance
export const wsManager = new WebSocketManager();
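The three Sets in the manager above (`serverSubscriptions`, `pendingSubscriptions`, `confirmedSubscriptions`) are what make channel subscriptions survive a reconnect: intent persists across a close, confirmations do not. A stand-alone model of that bookkeeping, a sketch rather than the class's actual API:

```javascript
// Minimal model of WebSocketManager's channel bookkeeping: "desired"
// survives a disconnect, while pending/confirmed are cleared on close
// and every desired channel is re-sent on reconnect.
class ChannelBook {
  constructor() {
    this.desired = new Set();   // mirrors serverSubscriptions
    this.pending = new Set();   // mirrors pendingSubscriptions
    this.confirmed = new Set(); // mirrors confirmedSubscriptions
  }
  subscribe(channel) {
    this.desired.add(channel);
    if (!this.confirmed.has(channel) && !this.pending.has(channel)) {
      this.pending.add(channel); // a "subscribe" frame would be sent here
    }
  }
  confirm(channel) {
    this.pending.delete(channel);
    this.confirmed.add(channel);
  }
  onClose() {
    // Mirror ws.onclose: forget confirmations, keep intent.
    this.pending.clear();
    this.confirmed.clear();
  }
  resubscribe() {
    // Mirror resubscribeToChannels: re-send every desired channel.
    const sent = [];
    for (const channel of this.desired) {
      if (!this.pending.has(channel)) {
        this.pending.add(channel);
        sent.push(channel);
      }
    }
    return sent;
  }
}
```

After a close-then-reconnect, `resubscribe()` re-sends exactly the channels a consumer asked for, which is why `unsubscribeFromChannel` must remove from all three sets.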

web/static/admin.js Normal file

@@ -0,0 +1,95 @@
(function () {
const msgEl = document.getElementById("admin-message");
const errEl = document.getElementById("admin-error");
const saveRegBtn = document.getElementById("save-registration");
const regCheckbox = document.getElementById("registration-enabled");
const createKeyBtn = document.getElementById("create-api-key");
function showMessage(msg) {
msgEl.textContent = msg || "";
msgEl.classList.toggle("hidden", !msg);
}
function showError(msg) {
errEl.textContent = msg || "";
errEl.classList.toggle("hidden", !msg);
}
async function request(url, method, payload) {
const res = await fetch(url, {
method,
credentials: "include",
headers: { "Content-Type": "application/json" },
body: payload ? JSON.stringify(payload) : undefined,
});
const data = await res.json().catch(() => ({}));
if (!res.ok) throw new Error(data.error || "Request failed");
return data;
}
function refreshAll() {
if (!window.htmx) return window.location.reload();
htmx.ajax("GET", "/ui/fragments/admin/runners", "#admin-runners");
htmx.ajax("GET", "/ui/fragments/admin/users", "#admin-users");
htmx.ajax("GET", "/ui/fragments/admin/apikeys", "#admin-apikeys");
}
if (saveRegBtn && regCheckbox) {
saveRegBtn.addEventListener("click", async () => {
showError("");
try {
await request("/api/admin/settings/registration", "POST", { enabled: regCheckbox.checked });
showMessage("Registration setting saved.");
} catch (err) {
showError(err.message);
}
});
}
if (createKeyBtn) {
createKeyBtn.addEventListener("click", async () => {
const name = prompt("API key name:");
if (!name) return;
showError("");
try {
const data = await request("/api/admin/runners/api-keys", "POST", { name, scope: "manager" });
showMessage(`New API key created: ${data.key}`);
refreshAll();
} catch (err) {
showError(err.message);
}
});
}
document.body.addEventListener("click", async (e) => {
const deleteRunner = e.target.closest("[data-delete-runner]");
const setAdmin = e.target.closest("[data-set-admin]");
const revokeKey = e.target.closest("[data-revoke-apikey]");
const deleteKey = e.target.closest("[data-delete-apikey]");
if (!deleteRunner && !setAdmin && !revokeKey && !deleteKey) return;
showError("");
try {
if (deleteRunner) {
const id = deleteRunner.getAttribute("data-delete-runner");
if (!confirm("Delete this runner?")) return;
await request(`/api/admin/runners/${id}`, "DELETE");
}
if (setAdmin) {
const id = setAdmin.getAttribute("data-set-admin");
const value = setAdmin.getAttribute("data-admin-value") === "true";
await request(`/api/admin/users/${id}/admin`, "POST", { is_admin: value });
}
if (revokeKey) {
const id = revokeKey.getAttribute("data-revoke-apikey");
await request(`/api/admin/runners/api-keys/${id}/revoke`, "PATCH");
}
if (deleteKey) {
const id = deleteKey.getAttribute("data-delete-apikey");
await request(`/api/admin/runners/api-keys/${id}`, "DELETE");
}
refreshAll();
} catch (err) {
showError(err.message);
}
});
})();

web/static/job_new.js Normal file

@@ -0,0 +1,286 @@
(function () {
const uploadForm = document.getElementById("upload-analyze-form");
const configForm = document.getElementById("job-config-form");
const fileInput = document.getElementById("source-file");
const statusEl = document.getElementById("upload-status");
const errorEl = document.getElementById("job-create-error");
const blendVersionEl = document.getElementById("blender-version");
const mainBlendWrapper = document.getElementById("main-blend-wrapper");
const mainBlendSelect = document.getElementById("main-blend-select");
const metadataPreview = document.getElementById("metadata-preview");
const configSection = document.getElementById("job-config-section");
const uploadSection = document.getElementById("job-upload-section");
const uploadSubmitBtn = uploadForm.querySelector('button[type="submit"]');
const stepUpload = document.getElementById("step-upload");
const stepConfig = document.getElementById("step-config");
const nameInput = document.getElementById("job-name");
const frameStartInput = document.getElementById("frame-start");
const frameEndInput = document.getElementById("frame-end");
const outputFormatInput = document.getElementById("output-format");
const unhideObjectsInput = document.getElementById("unhide-objects");
const enableExecutionInput = document.getElementById("enable-execution");
let sessionID = "";
let pollTimer = null;
let uploadInProgress = false;
function showError(msg) {
errorEl.textContent = msg || "";
errorEl.classList.toggle("hidden", !msg);
}
function showStatus(msg) {
statusEl.classList.remove("hidden");
statusEl.innerHTML = `<p>${msg}</p>`;
}
function setUploadBusy(busy) {
uploadInProgress = busy;
if (!uploadSubmitBtn) return;
uploadSubmitBtn.disabled = busy;
}
function setStep(step) {
const uploadActive = step === 1;
stepUpload.classList.toggle("active", uploadActive);
stepUpload.classList.toggle("complete", !uploadActive);
stepConfig.classList.toggle("active", !uploadActive);
uploadSection.classList.toggle("hidden", !uploadActive);
configSection.classList.toggle("hidden", uploadActive);
}
function fileNameToJobName(fileName) {
const stem = (fileName || "Render Job").replace(/\.[^/.]+$/, "");
return stem.trim() || "Render Job";
}
function prefillFromMetadata(status, fileName) {
const metadata = status.metadata || {};
const render = metadata.render_settings || {};
nameInput.value = fileNameToJobName(fileName || status.file_name);
frameStartInput.value = Number.isFinite(metadata.frame_start) ? metadata.frame_start : 1;
frameEndInput.value = Number.isFinite(metadata.frame_end) ? metadata.frame_end : 250;
if (render.output_format && outputFormatInput.querySelector(`option[value="${render.output_format}"]`)) {
outputFormatInput.value = render.output_format;
} else {
outputFormatInput.value = "EXR";
}
if (metadata.blender_version && blendVersionEl.querySelector(`option[value="${metadata.blender_version}"]`)) {
blendVersionEl.value = metadata.blender_version;
} else {
blendVersionEl.value = "";
}
unhideObjectsInput.checked = Boolean(metadata.unhide_objects);
enableExecutionInput.checked = Boolean(metadata.enable_execution);
const scenes = metadata.scene_info || {};
metadataPreview.innerHTML = `
<div class="metadata-grid">
<div><strong>Detected file:</strong> ${status.file_name || fileName || "-"}</div>
<div><strong>Frames:</strong> ${metadata.frame_start ?? "-"} - ${metadata.frame_end ?? "-"}</div>
<div><strong>Render engine:</strong> ${render.engine || "-"}</div>
<div><strong>Resolution:</strong> ${render.resolution_x || "-"} x ${render.resolution_y || "-"}</div>
<div><strong>Frame rate:</strong> ${render.frame_rate || "-"}</div>
<div><strong>Objects:</strong> ${scenes.object_count ?? "-"}</div>
</div>
`;
}
async function loadBlenderVersions() {
try {
const res = await fetch("/api/blender/versions", { credentials: "include" });
if (!res.ok) return;
const data = await res.json();
const versions = data.versions || [];
versions.slice(0, 30).forEach((v) => {
const option = document.createElement("option");
option.value = v.full;
option.textContent = v.full;
blendVersionEl.appendChild(option);
});
} catch (_) {}
}
function uploadFile(mainBlendFile) {
return new Promise((resolve, reject) => {
const file = fileInput.files[0];
if (!file) {
reject(new Error("Select a file first"));
return;
}
const lowerName = file.name.toLowerCase();
const isAccepted = lowerName.endsWith(".blend") || lowerName.endsWith(".zip");
if (!isAccepted) {
reject(new Error("Only .blend or .zip files are supported."));
return;
}
const fd = new FormData();
fd.append("file", file);
if (mainBlendFile) {
fd.append("main_blend_file", mainBlendFile);
}
const xhr = new XMLHttpRequest();
xhr.open("POST", "/api/jobs/upload", true);
xhr.withCredentials = true;
xhr.upload.addEventListener("progress", (e) => {
if (!e.lengthComputable) return;
const pct = Math.round((e.loaded / e.total) * 100);
showStatus(`Uploading: ${pct}%`);
});
xhr.onload = () => {
try {
const data = JSON.parse(xhr.responseText || "{}");
if (xhr.status >= 400) {
reject(new Error(data.error || "Upload failed"));
return;
}
resolve(data);
} catch (err) {
reject(err);
}
};
xhr.onerror = () => reject(new Error("Upload failed"));
xhr.send(fd);
});
}
async function pollUploadStatus() {
if (!sessionID) return null;
const res = await fetch(`/api/jobs/upload/status?session_id=${encodeURIComponent(sessionID)}`, { credentials: "include" });
const data = await res.json().catch(() => ({}));
if (!res.ok) {
throw new Error(data.error || "Upload status check failed");
}
return data;
}
async function createJob(payload) {
const res = await fetch("/api/jobs", {
method: "POST",
credentials: "include",
headers: { "Content-Type": "application/json" },
body: JSON.stringify(payload),
});
const data = await res.json().catch(() => ({}));
if (!res.ok) {
throw new Error(data.error || "Job creation failed");
}
return data;
}
async function runSubmission(mainBlendFile) {
showError("");
setStep(1);
configSection.classList.add("hidden");
metadataPreview.innerHTML = "";
const upload = await uploadFile(mainBlendFile);
sessionID = upload.session_id;
showStatus("Upload complete. Processing...");
clearInterval(pollTimer);
await new Promise((resolve, reject) => {
pollTimer = setInterval(async () => {
try {
const status = await pollUploadStatus();
if (!status) return;
showStatus(`${status.message || status.status} (${Math.round((status.progress || 0) * 100)}%)`);
if (status.status === "select_blend") {
clearInterval(pollTimer);
mainBlendSelect.innerHTML = "";
(status.blend_files || []).forEach((path) => {
const option = document.createElement("option");
option.value = path;
option.textContent = path;
mainBlendSelect.appendChild(option);
});
mainBlendWrapper.classList.remove("hidden");
reject(new Error("Select a main blend file and submit again."));
return;
}
if (status.status === "error") {
clearInterval(pollTimer);
reject(new Error(status.error || "Upload processing failed"));
return;
}
if (status.status === "completed") {
clearInterval(pollTimer);
prefillFromMetadata(status, fileInput.files[0]?.name || "");
setStep(2);
resolve();
}
} catch (err) {
clearInterval(pollTimer);
reject(err);
}
}, 1500);
});
}
async function submitJobConfig() {
if (!sessionID) {
throw new Error("Upload and analyze a file first.");
}
const fd = new FormData(configForm);
const jobName = String(fd.get("name") || "").trim();
if (!jobName) {
throw new Error("Job name is required.");
}
nameInput.value = jobName;
const payload = {
job_type: "render",
name: jobName,
frame_start: Number(fd.get("frame_start")),
frame_end: Number(fd.get("frame_end")),
output_format: fd.get("output_format"),
upload_session_id: sessionID,
unhide_objects: Boolean(fd.get("unhide_objects")),
enable_execution: Boolean(fd.get("enable_execution")),
};
const blenderVersion = fd.get("blender_version");
if (blenderVersion) payload.blender_version = blenderVersion;
const job = await createJob(payload);
showStatus(`Job created (#${job.id}). Redirecting...`);
window.location.href = `/jobs/${job.id}`;
}
uploadForm.addEventListener("submit", async (e) => {
e.preventDefault();
if (uploadInProgress) {
return;
}
try {
setUploadBusy(true);
const selected = mainBlendWrapper.classList.contains("hidden") ? "" : mainBlendSelect.value;
await runSubmission(selected);
} catch (err) {
showError(err.message || "Failed to create job");
} finally {
// Re-enable the submit button whether the analyze step succeeded or
// failed, so returning to step 1 never leaves it stuck disabled.
setUploadBusy(false);
}
});
configForm.addEventListener("submit", async (e) => {
e.preventDefault();
try {
showError("");
await submitJobConfig();
} catch (err) {
showError(err.message || "Failed to create job");
}
});
setStep(1);
loadBlenderVersions();
})();
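The poll loop inside `runSubmission` above branches on four upload states: prompt for a main .blend, surface an error, move to the config step, or keep polling. Reduced to a pure function for illustration (the function name and action strings are hypothetical, not part of the file):

```javascript
// Hypothetical distillation of the runSubmission poll loop above:
// map an upload-status payload to the next UI action.
function nextAction(status) {
  switch (status.status) {
    case "select_blend":
      return "prompt-main-blend"; // zip contained several .blend files
    case "error":
      return "show-error";
    case "completed":
      return "show-config"; // prefill the form and advance to step 2
    default:
      return "keep-polling"; // still uploading or processing
  }
}
```

The real loop also clears its `setInterval` on every terminal state, which is the part that's easy to forget when extending it.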

web/static/job_show.js Normal file

@@ -0,0 +1,428 @@
(function () {
const jobID = window.location.pathname.split("/").pop();
const progressFill = document.querySelector(".progress-fill[data-progress]");
const progressText = document.getElementById("job-progress-text");
const statusBadge = document.getElementById("job-status-badge");
const tasksRefreshBtn = document.getElementById("tasks-refresh");
const tasksFragment = document.getElementById("tasks-fragment");
const filesRefreshBtn = document.getElementById("files-refresh");
const filesFragment = document.getElementById("files-fragment");
const cancelJobBtn = document.getElementById("cancel-job-btn");
const deleteJobBtn = document.getElementById("delete-job-btn");
const previewModal = document.getElementById("exr-preview-modal");
const previewImage = document.getElementById("exr-preview-image");
const previewLoading = document.getElementById("exr-preview-loading");
const previewError = document.getElementById("exr-preview-error");
const previewName = document.getElementById("exr-preview-name");
let lastJobSnapshot = null;
let lastSmartRefreshAt = 0;
if (progressFill) {
const value = Number(progressFill.getAttribute("data-progress") || "0");
const bounded = Math.max(0, Math.min(100, value));
progressFill.style.width = `${bounded}%`;
}
function statusClass(status) {
const normalized = String(status || "").toLowerCase();
if (normalized === "completed") return "status-completed";
if (normalized === "running") return "status-running";
if (normalized === "failed") return "status-failed";
if (normalized === "cancelled") return "status-cancelled";
return "status-pending";
}
function applyJobState(job) {
if (!job) return;
const normalizedStatus = String(job.status || "pending").toLowerCase();
const canCancel = normalizedStatus === "pending" || normalizedStatus === "running";
const canDelete = normalizedStatus === "completed" || normalizedStatus === "failed" || normalizedStatus === "cancelled";
const progressValue = Math.max(0, Math.min(100, Number(job.progress || 0)));
if (progressFill) {
progressFill.style.width = `${progressValue}%`;
progressFill.setAttribute("data-progress", String(Math.round(progressValue)));
}
if (progressText) {
progressText.textContent = `${Math.round(progressValue)}%`;
}
if (statusBadge) {
statusBadge.textContent = normalizedStatus;
statusBadge.classList.remove("status-pending", "status-running", "status-completed", "status-failed", "status-cancelled");
statusBadge.classList.add(statusClass(job.status));
}
if (cancelJobBtn) {
cancelJobBtn.classList.toggle("hidden", !canCancel);
}
if (deleteJobBtn) {
deleteJobBtn.classList.toggle("hidden", !canDelete);
}
}
function refreshTasksAndFiles() {
if (!window.htmx) return;
if (tasksFragment) {
htmx.ajax("GET", `/ui/fragments/jobs/${jobID}/tasks`, "#tasks-fragment");
}
if (filesFragment) {
htmx.ajax("GET", `/ui/fragments/jobs/${jobID}/files`, "#files-fragment");
}
lastSmartRefreshAt = Date.now();
}
async function pollJobState() {
try {
const res = await fetch(`/api/jobs/${jobID}`, { credentials: "include" });
if (!res.ok) return;
const job = await res.json();
applyJobState(job);
const snapshot = {
status: String(job.status || ""),
progress: Math.round(Number(job.progress || 0)),
startedAt: job.started_at || "",
completedAt: job.completed_at || "",
};
const changed =
!lastJobSnapshot ||
snapshot.status !== lastJobSnapshot.status ||
snapshot.progress !== lastJobSnapshot.progress ||
snapshot.startedAt !== lastJobSnapshot.startedAt ||
snapshot.completedAt !== lastJobSnapshot.completedAt;
lastJobSnapshot = snapshot;
// Smart refresh fragments only when job state changes.
if (changed) {
refreshTasksAndFiles();
return;
}
// Fallback while running: refresh infrequently even without visible progress deltas.
if (snapshot.status === "running" && Date.now() - lastSmartRefreshAt > 12000) {
refreshTasksAndFiles();
}
} catch (_) {
// Keep UI usable even if polling briefly fails.
}
}
if (tasksRefreshBtn && tasksFragment && window.htmx) {
tasksRefreshBtn.addEventListener("click", () => {
htmx.ajax("GET", `/ui/fragments/jobs/${jobID}/tasks`, "#tasks-fragment");
});
}
if (filesRefreshBtn && filesFragment && window.htmx) {
filesRefreshBtn.addEventListener("click", () => {
htmx.ajax("GET", `/ui/fragments/jobs/${jobID}/files`, "#files-fragment");
});
}
pollJobState();
setInterval(pollJobState, 2500);
async function apiRequest(url, method) {
const res = await fetch(url, {
method,
credentials: "include",
headers: { "Content-Type": "application/json" },
});
const data = await res.json().catch(() => ({}));
if (!res.ok) {
throw new Error(data.error || "Request failed");
}
return data;
}
function closePreviewModal() {
if (!previewModal) return;
previewModal.classList.add("hidden");
if (previewImage) {
previewImage.classList.add("hidden");
previewImage.removeAttribute("src");
}
if (previewLoading) previewLoading.classList.remove("hidden");
if (previewError) {
previewError.classList.add("hidden");
previewError.textContent = "";
}
}
function openPreviewModal(url, name) {
if (!previewModal || !previewImage) return;
previewModal.classList.remove("hidden");
if (previewName) previewName.textContent = name ? `File: ${name}` : "";
if (previewLoading) previewLoading.classList.remove("hidden");
if (previewError) {
previewError.classList.add("hidden");
previewError.textContent = "";
}
previewImage.classList.add("hidden");
previewImage.onload = () => {
if (previewLoading) previewLoading.classList.add("hidden");
previewImage.classList.remove("hidden");
};
previewImage.onerror = () => {
if (previewLoading) previewLoading.classList.add("hidden");
if (previewError) {
previewError.textContent = "Failed to load preview image.";
previewError.classList.remove("hidden");
}
};
previewImage.src = url;
}
document.body.addEventListener("click", async (e) => {
const previewBtn = e.target.closest("[data-exr-preview-url]");
if (previewBtn) {
const url = previewBtn.getAttribute("data-exr-preview-url");
const name = previewBtn.getAttribute("data-exr-preview-name");
if (url) {
openPreviewModal(url, name || "");
}
return;
}
const modalClose = e.target.closest("[data-modal-close]");
if (modalClose) {
closePreviewModal();
return;
}
const cancelBtn = e.target.closest("[data-cancel-job]");
const deleteBtn = e.target.closest("[data-delete-job]");
if (!cancelBtn && !deleteBtn) return;
const id = (cancelBtn || deleteBtn).getAttribute(cancelBtn ? "data-cancel-job" : "data-delete-job");
try {
if (cancelBtn) {
if (!confirm("Cancel this job?")) return;
await apiRequest(`/api/jobs/${id}`, "DELETE");
} else {
if (!confirm("Delete this job permanently?")) return;
await apiRequest(`/api/jobs/${id}/delete`, "POST");
window.location.href = "/jobs";
return;
}
window.location.reload();
} catch (err) {
alert(err.message);
}
});
document.addEventListener("keydown", (e) => {
if (e.key === "Escape") {
closePreviewModal();
}
});
const taskSelect = document.getElementById("task-log-task-id");
const levelFilter = document.getElementById("task-log-level-filter");
const autoRefreshToggle = document.getElementById("task-log-auto-refresh");
const followToggle = document.getElementById("task-log-follow");
const refreshBtn = document.getElementById("task-log-refresh");
const copyBtn = document.getElementById("task-log-copy");
const output = document.getElementById("task-log-output");
const statusEl = document.getElementById("task-log-status");
const state = {
timer: null,
activeTaskID: "",
lastLogID: 0,
logs: [],
seenIDs: new Set(),
};
function setStatus(text) {
if (statusEl) statusEl.textContent = text;
}
function levelClass(level) {
const normalized = String(level || "INFO").toUpperCase();
if (normalized === "ERROR") return "log-error";
if (normalized === "WARN") return "log-warn";
if (normalized === "DEBUG") return "log-debug";
return "log-info";
}
function formatTime(ts) {
if (!ts) return "--:--:--";
const d = new Date(ts);
if (Number.isNaN(d.getTime())) return "--:--:--";
return d.toLocaleTimeString();
}
function renderLogs() {
if (!output) return;
const selectedLevel = (levelFilter?.value || "").toUpperCase();
const filtered = state.logs.filter((entry) => {
if (!selectedLevel) return true;
return String(entry.log_level || "").toUpperCase() === selectedLevel;
});
if (filtered.length === 0) {
output.innerHTML = '<div class="log-line empty">No logs yet.</div>';
return;
}
output.innerHTML = filtered.map((entry) => {
const level = String(entry.log_level || "INFO").toUpperCase();
const step = entry.step_name ? ` <span class="log-step">(${entry.step_name})</span>` : "";
const message = String(entry.message || "").replaceAll("&", "&amp;").replaceAll("<", "&lt;").replaceAll(">", "&gt;");
return `<div class="log-line">
<span class="log-time">${formatTime(entry.created_at)}</span>
<span class="log-level ${levelClass(level)}">${level}</span>${step}
<span class="log-message">${message}</span>
</div>`;
}).join("");
if (followToggle?.checked) {
output.scrollTop = output.scrollHeight;
}
}
function getVisibleLogs() {
const selectedLevel = (levelFilter?.value || "").toUpperCase();
return state.logs.filter((entry) => {
if (!selectedLevel) return true;
return String(entry.log_level || "").toUpperCase() === selectedLevel;
});
}
function logsToText(entries) {
return entries.map((entry) => {
const level = String(entry.log_level || "INFO").toUpperCase();
const step = entry.step_name ? ` (${entry.step_name})` : "";
return `[${formatTime(entry.created_at)}] [${level}]${step} ${entry.message || ""}`;
}).join("\n");
}
function collectTaskOptions() {
if (!taskSelect) return;
const buttons = document.querySelectorAll("[data-view-logs-task-id]");
const current = taskSelect.value;
taskSelect.innerHTML = '<option value="">Choose a task...</option>';
buttons.forEach((btn) => {
const id = btn.getAttribute("data-view-logs-task-id");
if (!id) return;
const row = btn.closest("tr");
const status = row?.querySelector(".status")?.textContent?.trim() || "";
const type = row?.children?.[1]?.textContent?.trim() || "";
const option = document.createElement("option");
option.value = id;
option.textContent = `#${id} ${type ? `(${type})` : ""} ${status ? `- ${status}` : ""}`.trim();
taskSelect.appendChild(option);
});
if (current && taskSelect.querySelector(`option[value="${current}"]`)) {
taskSelect.value = current;
}
}
async function fetchLogs({ reset = false, full = false } = {}) {
const taskID = taskSelect?.value?.trim();
if (!taskID) {
setStatus("Select a task to view logs.");
return;
}
if (reset || taskID !== state.activeTaskID) {
state.activeTaskID = taskID;
state.lastLogID = 0;
state.logs = [];
state.seenIDs.clear();
renderLogs();
}
const params = new URLSearchParams();
params.set("limit", "0"); // backend: 0 = no limit
if (!full && state.lastLogID > 0) {
params.set("since_id", String(state.lastLogID));
}
try {
const res = await fetch(`/api/jobs/${jobID}/tasks/${taskID}/logs?${params.toString()}`, {
credentials: "include",
});
if (!res.ok) {
setStatus(`Failed to fetch logs (HTTP ${res.status}).`);
return;
}
const payload = await res.json();
const rows = Array.isArray(payload) ? payload : (payload.logs || []);
if (rows.length > 0) {
for (const row of rows) {
const id = Number(row.id || 0);
if (id > 0 && !state.seenIDs.has(id)) {
state.seenIDs.add(id);
state.logs.push(row);
if (id > state.lastLogID) state.lastLogID = id;
}
}
if (!Array.isArray(payload) && Number(payload.last_id || 0) > state.lastLogID) {
state.lastLogID = Number(payload.last_id);
}
}
setStatus(`Task #${taskID}: ${state.logs.length} log line(s).`);
renderLogs();
} catch (err) {
setStatus(`Failed to fetch logs: ${err.message}`);
}
}
function restartPolling() {
if (state.timer) {
clearInterval(state.timer);
state.timer = null;
}
if (!autoRefreshToggle?.checked) return;
state.timer = setInterval(() => {
if (taskSelect?.value) {
fetchLogs();
}
}, 2000);
}
if (tasksFragment) {
tasksFragment.addEventListener("htmx:afterSwap", () => {
collectTaskOptions();
});
}
collectTaskOptions();
document.body.addEventListener("click", (e) => {
const viewBtn = e.target.closest("[data-view-logs-task-id]");
if (!viewBtn || !taskSelect) return;
const taskID = viewBtn.getAttribute("data-view-logs-task-id");
if (!taskID) return;
taskSelect.value = taskID;
fetchLogs({ reset: true, full: true });
});
if (taskSelect) {
taskSelect.addEventListener("change", () => fetchLogs({ reset: true, full: true }));
}
if (levelFilter) {
levelFilter.addEventListener("change", renderLogs);
}
if (refreshBtn) {
refreshBtn.addEventListener("click", () => fetchLogs({ reset: true, full: true }));
}
if (copyBtn) {
copyBtn.addEventListener("click", async () => {
const visible = getVisibleLogs();
if (visible.length === 0) {
setStatus("No logs to copy.");
return;
}
try {
await navigator.clipboard.writeText(logsToText(visible));
setStatus(`Copied ${visible.length} log line(s).`);
} catch (_) {
setStatus("Clipboard copy failed.");
}
});
}
if (autoRefreshToggle) {
autoRefreshToggle.addEventListener("change", restartPolling);
}
restartPolling();
})();
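The incremental fetch in the script above dedupes rows by `id` and advances a high-water mark that feeds the next `since_id` request. That merge step can be isolated as a small pure helper; a sketch with a hypothetical name, not code from this commit:

```javascript
// Merge a freshly fetched batch of log rows into polling state,
// skipping rows already seen and advancing lastLogID so the next
// request can ask only for newer rows via since_id.
function mergeLogBatch(state, rows) {
  for (const row of rows) {
    const id = Number(row.id || 0);
    if (id > 0 && !state.seenIDs.has(id)) {
      state.seenIDs.add(id);
      state.logs.push(row);
      if (id > state.lastLogID) state.lastLogID = id;
    }
  }
  return state;
}
```

Because duplicate ids are dropped, re-fetching an overlapping window (e.g. after a full refresh) cannot double-render lines.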

web/static/jobs.js Normal file

@@ -0,0 +1,41 @@
(function () {
async function apiRequest(url, method) {
const res = await fetch(url, {
method,
credentials: "include",
headers: { "Content-Type": "application/json" },
});
const data = await res.json().catch(() => ({}));
if (!res.ok) {
throw new Error(data.error || "Request failed");
}
return data;
}
document.body.addEventListener("click", async (e) => {
const cancelBtn = e.target.closest("[data-cancel-job]");
const deleteBtn = e.target.closest("[data-delete-job]");
if (!cancelBtn && !deleteBtn) return;
try {
if (cancelBtn) {
const id = cancelBtn.getAttribute("data-cancel-job");
if (!confirm("Cancel this job?")) return;
await apiRequest(`/api/jobs/${id}`, "DELETE");
}
if (deleteBtn) {
const id = deleteBtn.getAttribute("data-delete-job");
if (!confirm("Delete this job permanently?")) return;
await apiRequest(`/api/jobs/${id}/delete`, "POST");
}
if (window.htmx) {
htmx.trigger("#jobs-fragment", "refresh");
htmx.ajax("GET", "/ui/fragments/jobs", "#jobs-fragment");
} else {
window.location.reload();
}
} catch (err) {
alert(err.message);
}
});
})();

web/static/login.js Normal file

@@ -0,0 +1,65 @@
(function () {
const loginForm = document.getElementById("login-form");
const registerForm = document.getElementById("register-form");
const errorEl = document.getElementById("auth-error");
function setError(msg) {
if (!errorEl) return;
if (!msg) {
errorEl.classList.add("hidden");
errorEl.textContent = "";
return;
}
errorEl.textContent = msg;
errorEl.classList.remove("hidden");
}
async function postJSON(url, payload) {
const res = await fetch(url, {
method: "POST",
credentials: "include",
headers: { "Content-Type": "application/json" },
body: JSON.stringify(payload),
});
const body = await res.json().catch(() => ({}));
if (!res.ok) {
throw new Error(body.error || "Request failed");
}
return body;
}
if (loginForm) {
loginForm.addEventListener("submit", async (e) => {
e.preventDefault();
setError("");
const fd = new FormData(loginForm);
try {
await postJSON("/api/auth/local/login", {
username: fd.get("username"),
password: fd.get("password"),
});
window.location.href = "/jobs";
} catch (err) {
setError(err.message);
}
});
}
if (registerForm) {
registerForm.addEventListener("submit", async (e) => {
e.preventDefault();
setError("");
const fd = new FormData(registerForm);
try {
await postJSON("/api/auth/local/register", {
name: fd.get("name"),
email: fd.get("email"),
password: fd.get("password"),
});
window.location.href = "/jobs";
} catch (err) {
setError(err.message);
}
});
}
})();

web/static/style.css Normal file

@@ -0,0 +1,241 @@
* { box-sizing: border-box; }
body {
margin: 0;
font-family: system-ui, -apple-system, Segoe UI, Roboto, sans-serif;
background: #0f172a;
color: #e2e8f0;
}
.container { max-width: 1200px; margin: 24px auto; padding: 0 16px; }
.topbar {
display: flex;
align-items: center;
justify-content: space-between;
gap: 16px;
padding: 12px 16px;
border-bottom: 1px solid #334155;
background: #111827;
}
.brand { font-weight: 700; }
.nav { display: flex; gap: 12px; }
.nav a { color: #cbd5e1; text-decoration: none; padding: 8px 10px; border-radius: 6px; }
.nav a.active, .nav a:hover { background: #1f2937; color: #fff; }
.account { display: flex; gap: 12px; align-items: center; }
.card {
background: #111827;
border: 1px solid #334155;
border-radius: 10px;
padding: 16px;
margin-bottom: 16px;
}
.card.narrow { max-width: 900px; margin-inline: auto; }
.section-head { display: flex; justify-content: space-between; align-items: center; gap: 12px; }
.btn {
border: 1px solid #475569;
color: #e2e8f0;
background: #1f2937;
border-radius: 7px;
padding: 8px 12px;
cursor: pointer;
text-decoration: none;
}
.btn:hover { background: #334155; }
.btn.primary { background: #2563eb; border-color: #2563eb; color: white; }
.btn:disabled,
.btn[disabled] {
cursor: not-allowed;
opacity: 1;
}
.btn.primary:disabled,
.btn.primary[disabled] {
background: #1e293b;
border-color: #475569;
color: #94a3b8;
}
.btn.danger { background: #b91c1c; border-color: #b91c1c; color: white; }
.btn.subtle { background: transparent; }
.btn.tiny { padding: 4px 8px; font-size: 12px; }
.table { width: 100%; border-collapse: collapse; }
.table th, .table td { border-bottom: 1px solid #334155; padding: 8px; text-align: left; vertical-align: top; }
.table th { font-size: 12px; text-transform: uppercase; color: #94a3b8; }
.job-link,
.job-link:visited,
.job-link:hover,
.job-link:active {
color: #93c5fd;
text-decoration: underline;
text-underline-offset: 2px;
text-decoration-thickness: 1px;
cursor: pointer;
}
.job-link:hover,
.job-link:focus-visible {
color: #bfdbfe;
text-decoration-thickness: 2px;
}
.status { border-radius: 999px; padding: 2px 8px; font-size: 12px; }
.status-pending { background: #7c2d12; color: #fdba74; }
.status-running { background: #164e63; color: #67e8f9; }
.status-completed { background: #14532d; color: #86efac; }
.status-failed { background: #7f1d1d; color: #fca5a5; }
.status-cancelled { background: #334155; color: #cbd5e1; }
.status-online { background: #14532d; color: #86efac; }
.status-offline { background: #334155; color: #cbd5e1; }
.status-busy { background: #164e63; color: #67e8f9; }
.progress {
width: 100%;
height: 10px;
background: #1e293b;
border-radius: 999px;
overflow: hidden;
}
.progress-fill { height: 100%; background: #2563eb; }
.alert {
border-radius: 8px;
padding: 10px 12px;
margin: 10px 0;
}
.alert.error { background: #7f1d1d; color: #fee2e2; border: 1px solid #ef4444; }
.alert.notice { background: #1e3a8a; color: #dbeafe; border: 1px solid #3b82f6; }
label { display: block; }
input, select {
width: 100%;
margin-top: 6px;
margin-bottom: 12px;
background: #0f172a;
border: 1px solid #334155;
border-radius: 6px;
color: #e2e8f0;
padding: 8px;
}
.stack { display: grid; gap: 8px; }
.grid-2 { display: grid; grid-template-columns: 1fr 1fr; gap: 12px; }
.split { display: grid; grid-template-columns: 1fr 1fr; gap: 16px; margin-top: 16px; }
.auth-grid { display: flex; gap: 10px; margin-bottom: 12px; }
.check-row { display: flex; gap: 12px; align-items: center; flex-wrap: wrap; }
.row { display: flex; gap: 8px; align-items: center; flex-wrap: wrap; }
.stepper { display: flex; gap: 10px; margin-bottom: 12px; }
.step {
border: 1px solid #334155;
border-radius: 999px;
padding: 6px 10px;
font-size: 12px;
color: #94a3b8;
}
.step.active {
border-color: #2563eb;
color: #bfdbfe;
background: #1e3a8a;
}
.step.complete {
border-color: #14532d;
color: #86efac;
background: #052e16;
}
.muted { color: #94a3b8; margin-top: 0; }
.metadata-grid {
display: grid;
grid-template-columns: repeat(2, minmax(0, 1fr));
gap: 8px;
margin: 8px 0 12px;
}
.logs {
max-height: 320px;
overflow: auto;
background: #020617;
border: 1px solid #334155;
border-radius: 8px;
padding: 10px;
white-space: pre-wrap;
}
.log-controls {
display: grid;
grid-template-columns: 2fr 1fr auto auto auto auto;
gap: 10px;
align-items: end;
margin-bottom: 10px;
}
.log-toggle {
display: flex;
gap: 6px;
align-items: center;
margin-bottom: 12px;
white-space: nowrap;
}
.log-toggle input {
width: auto;
margin: 0;
}
.log-lines {
font-family: ui-monospace, SFMono-Regular, Menlo, monospace;
white-space: normal;
}
.log-line {
display: grid;
grid-template-columns: auto auto auto 1fr;
gap: 8px;
align-items: start;
padding: 4px 0;
border-bottom: 1px solid #1e293b;
}
.log-line.empty {
display: block;
color: #94a3b8;
border-bottom: none;
}
.log-time { color: #64748b; }
.log-level {
border-radius: 999px;
padding: 0 6px;
font-size: 11px;
line-height: 18px;
}
.log-info { background: #164e63; color: #67e8f9; }
.log-warn { background: #7c2d12; color: #fdba74; }
.log-error { background: #7f1d1d; color: #fca5a5; }
.log-debug { background: #334155; color: #cbd5e1; }
.log-step { color: #93c5fd; }
.log-message {
color: #e2e8f0;
overflow-wrap: anywhere;
}
.modal {
position: fixed;
inset: 0;
z-index: 1000;
display: grid;
place-items: center;
}
.modal-backdrop {
position: absolute;
inset: 0;
background: rgba(2, 6, 23, 0.8);
}
.modal-content {
position: relative;
width: min(1100px, 94vw);
max-height: 90vh;
overflow: auto;
background: #0b1220;
border: 1px solid #334155;
border-radius: 10px;
padding: 12px;
}
.modal-body {
min-height: 220px;
}
.preview-image {
display: block;
max-width: 100%;
max-height: 70vh;
margin: 0 auto;
border: 1px solid #334155;
border-radius: 8px;
}
.hidden { display: none; }


@@ -1,325 +0,0 @@
* {
margin: 0;
padding: 0;
box-sizing: border-box;
}
body {
font-family: -apple-system, BlinkMacSystemFont, 'Segoe UI', Roboto, Oxygen, Ubuntu, Cantarell, sans-serif;
background: #f5f5f5;
color: #333;
}
.hidden {
display: none !important;
}
/* Login Page */
#login-page {
display: flex;
justify-content: center;
align-items: center;
min-height: 100vh;
background: linear-gradient(135deg, #667eea 0%, #764ba2 100%);
}
.login-container {
background: white;
padding: 3rem;
border-radius: 10px;
box-shadow: 0 10px 40px rgba(0,0,0,0.2);
text-align: center;
max-width: 400px;
width: 100%;
}
.login-container h1 {
margin-bottom: 0.5rem;
color: #667eea;
}
.login-container p {
color: #666;
margin-bottom: 2rem;
}
.login-buttons {
display: flex;
flex-direction: column;
gap: 1rem;
}
/* Main Page */
#main-page {
min-height: 100vh;
}
header {
background: white;
padding: 1rem 2rem;
box-shadow: 0 2px 4px rgba(0,0,0,0.1);
display: flex;
justify-content: space-between;
align-items: center;
}
header h1 {
color: #667eea;
}
.user-info {
display: flex;
align-items: center;
gap: 1rem;
}
nav {
background: white;
padding: 0 2rem;
border-bottom: 1px solid #e0e0e0;
display: flex;
gap: 1rem;
}
.nav-btn {
padding: 1rem 1.5rem;
background: none;
border: none;
border-bottom: 2px solid transparent;
cursor: pointer;
font-size: 1rem;
color: #666;
transition: all 0.2s;
}
.nav-btn:hover {
color: #667eea;
}
.nav-btn.active {
color: #667eea;
border-bottom-color: #667eea;
}
main {
max-width: 1200px;
margin: 2rem auto;
padding: 0 2rem;
}
.content-page {
background: white;
padding: 2rem;
border-radius: 8px;
box-shadow: 0 2px 4px rgba(0,0,0,0.1);
}
.content-page h2 {
margin-bottom: 1.5rem;
color: #333;
}
/* Buttons */
.btn {
padding: 0.75rem 1.5rem;
border: none;
border-radius: 5px;
cursor: pointer;
font-size: 1rem;
text-decoration: none;
display: inline-block;
transition: all 0.2s;
}
.btn-primary {
background: #667eea;
color: white;
}
.btn-primary:hover {
background: #5568d3;
}
.btn-secondary {
background: #6c757d;
color: white;
}
.btn-secondary:hover {
background: #5a6268;
}
.btn-google {
background: #db4437;
color: white;
}
.btn-google:hover {
background: #c23321;
}
.btn-discord {
background: #5865F2;
color: white;
}
.btn-discord:hover {
background: #4752C4;
}
/* Forms */
.form-group {
margin-bottom: 1.5rem;
}
.form-group label {
display: block;
margin-bottom: 0.5rem;
font-weight: 500;
color: #333;
}
.form-group input,
.form-group select {
width: 100%;
padding: 0.75rem;
border: 1px solid #ddd;
border-radius: 5px;
font-size: 1rem;
}
.form-group input:focus,
.form-group select:focus {
outline: none;
border-color: #667eea;
}
/* Jobs List */
#jobs-list {
display: grid;
gap: 1rem;
}
.job-card {
background: #f8f9fa;
padding: 1.5rem;
border-radius: 8px;
border-left: 4px solid #667eea;
}
.job-card h3 {
margin-bottom: 0.5rem;
color: #333;
}
.job-meta {
display: flex;
gap: 2rem;
margin: 1rem 0;
color: #666;
font-size: 0.9rem;
}
.job-status {
display: inline-block;
padding: 0.25rem 0.75rem;
border-radius: 20px;
font-size: 0.85rem;
font-weight: 500;
}
.job-status.pending {
background: #ffc107;
color: #000;
}
.job-status.running {
background: #17a2b8;
color: white;
}
.job-status.completed {
background: #28a745;
color: white;
}
.job-status.failed {
background: #dc3545;
color: white;
}
.job-status.cancelled {
background: #6c757d;
color: white;
}
.progress-bar {
width: 100%;
height: 8px;
background: #e0e0e0;
border-radius: 4px;
overflow: hidden;
margin: 1rem 0;
}
.progress-fill {
height: 100%;
background: #667eea;
transition: width 0.3s;
}
.job-actions {
margin-top: 1rem;
display: flex;
gap: 1rem;
}
/* Runners List */
#runners-list {
display: grid;
gap: 1rem;
}
.runner-card {
background: #f8f9fa;
padding: 1.5rem;
border-radius: 8px;
border-left: 4px solid #28a745;
}
.runner-card h3 {
margin-bottom: 0.5rem;
color: #333;
}
.runner-info {
display: flex;
gap: 2rem;
margin-top: 1rem;
color: #666;
font-size: 0.9rem;
}
.runner-status {
display: inline-block;
padding: 0.25rem 0.75rem;
border-radius: 20px;
font-size: 0.85rem;
font-weight: 500;
}
.runner-status.online {
background: #28a745;
color: white;
}
.runner-status.offline {
background: #6c757d;
color: white;
}
.runner-status.busy {
background: #ffc107;
color: #000;
}


@@ -1,20 +0,0 @@
/** @type {import('tailwindcss').Config} */
export default {
content: [
"./index.html",
"./src/**/*.{js,ts,jsx,tsx}",
],
darkMode: 'class',
theme: {
extend: {
colors: {
primary: {
500: '#f97316', // orange-500
600: '#ea580c', // orange-600
},
},
},
},
plugins: [],
}

web/templates/admin.html Normal file

@@ -0,0 +1,49 @@
{{ define "page_admin" }}
{{ $view := .Data }}
<section class="card">
<h1>Admin Panel</h1>
<div class="check-row">
<label>
<input id="registration-enabled" type="checkbox" {{ if index $view "registration_enabled" }}checked{{ end }}>
Allow new registrations
</label>
<button id="save-registration" class="btn">Save</button>
</div>
</section>
<section class="card">
<h2>Runners</h2>
<div id="admin-runners"
hx-get="/ui/fragments/admin/runners"
hx-trigger="load, every 6s"
hx-swap="innerHTML">
<p>Loading runners...</p>
</div>
</section>
<section class="card">
<h2>Users</h2>
<div id="admin-users"
hx-get="/ui/fragments/admin/users"
hx-trigger="load, every 10s"
hx-swap="innerHTML">
<p>Loading users...</p>
</div>
</section>
<section class="card">
<div class="section-head">
<h2>Runner API Keys</h2>
<button id="create-api-key" class="btn">Create API Key</button>
</div>
<div id="admin-apikeys"
hx-get="/ui/fragments/admin/apikeys"
hx-trigger="load, every 10s"
hx-swap="innerHTML">
<p>Loading API keys...</p>
</div>
</section>
<p id="admin-message" class="alert notice hidden"></p>
<p id="admin-error" class="alert error hidden"></p>
{{ end }}

web/templates/base.html Normal file

@@ -0,0 +1,48 @@
{{ define "base" }}
<!doctype html>
<html lang="en">
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1">
<title>{{ .Title }} - JiggaBlend</title>
<link rel="stylesheet" href="/assets/style.css">
<script src="https://unpkg.com/htmx.org@1.9.12"></script>
</head>
<body>
{{ if .User }}
<header class="topbar">
<div class="brand">JiggaBlend</div>
<nav class="nav">
<a href="/jobs" class="{{ if eq .CurrentPath "/jobs" }}active{{ end }}">Jobs</a>
<a href="/jobs/new" class="{{ if eq .CurrentPath "/jobs/new" }}active{{ end }}">Submit</a>
{{ if .User.IsAdmin }}<a href="/admin" class="{{ if eq .CurrentPath "/admin" }}active{{ end }}">Admin</a>{{ end }}
</nav>
<div class="account">
<span>{{ .User.Name }}</span>
<form method="post" action="/logout">
<button type="submit" class="btn subtle">Logout</button>
</form>
</div>
</header>
{{ end }}
<main class="container">
{{ if .Error }}<div class="alert error">{{ .Error }}</div>{{ end }}
{{ if .Notice }}<div class="alert notice">{{ .Notice }}</div>{{ end }}
{{ if eq .ContentTemplate "page_login" }}
{{ template "page_login" . }}
{{ else if eq .ContentTemplate "page_jobs" }}
{{ template "page_jobs" . }}
{{ else if eq .ContentTemplate "page_jobs_new" }}
{{ template "page_jobs_new" . }}
{{ else if eq .ContentTemplate "page_job_show" }}
{{ template "page_job_show" . }}
{{ else if eq .ContentTemplate "page_admin" }}
{{ template "page_admin" . }}
{{ end }}
</main>
{{ if .PageScript }}<script src="{{ .PageScript }}"></script>{{ end }}
</body>
</html>
{{ end }}


@@ -0,0 +1,60 @@
{{ define "page_jobs_new" }}
<section id="job-upload-section" class="card">
<h1>Create Render Job</h1>
<div class="stepper">
<div id="step-upload" class="step active">1. Upload & Analyze</div>
<div id="step-config" class="step">2. Review & Submit</div>
</div>
<form id="upload-analyze-form" class="stack">
<label>Upload Blend/ZIP
<input type="file" id="source-file" name="file" accept=".blend,.zip,application/zip,application/x-zip-compressed" required>
</label>
<label id="main-blend-wrapper" class="hidden">Main Blend (for ZIP with multiple .blend files)
<select id="main-blend-select"></select>
</label>
<button type="submit" class="btn primary">Upload and Analyze</button>
</form>
<div id="upload-status" class="stack hidden"></div>
<p id="job-create-error" class="alert error hidden"></p>
</section>
<section id="job-config-section" class="card hidden">
<h2>Review Render Settings</h2>
<p class="muted">Values are prefilled from extracted metadata; adjust before submission.</p>
<div id="metadata-preview" class="stack"></div>
<form id="job-config-form" class="stack">
<label>Job Name
<input type="text" id="job-name" name="name" required>
</label>
<div class="grid-2">
<label>Frame Start
<input type="number" id="frame-start" name="frame_start" min="0" required>
</label>
<label>Frame End
<input type="number" id="frame-end" name="frame_end" min="0" required>
</label>
</div>
<label>Output Format
<select name="output_format" id="output-format">
<option value="EXR">EXR</option>
<option value="EXR_264_MP4">EXR + H264 MP4</option>
<option value="EXR_AV1_MP4">EXR + AV1 MP4</option>
<option value="EXR_VP9_WEBM">EXR + VP9 WEBM</option>
</select>
</label>
<label>Blender Version (optional)
<select name="blender_version" id="blender-version">
<option value="">Auto-detect from file</option>
</select>
</label>
<div class="check-row">
<label><input type="checkbox" id="unhide-objects" name="unhide_objects"> Unhide objects/collections</label>
<label><input type="checkbox" id="enable-execution" name="enable_execution"> Enable auto-execution in Blender</label>
</div>
<button type="submit" class="btn primary">Create Job</button>
</form>
</section>
{{ end }}


@@ -0,0 +1,97 @@
{{ define "page_job_show" }}
{{ $view := .Data }}
{{ $job := index $view "job" }}
<section class="card">
<div class="section-head">
<h1>Job #{{ $job.ID }} - {{ $job.Name }}</h1>
<a href="/jobs" class="btn subtle">Back</a>
</div>
<p>Status: <span id="job-status-badge" class="status {{ statusClass $job.Status }}">{{ $job.Status }}</span></p>
<p>Progress: <span id="job-progress-text">{{ progressInt $job.Progress }}%</span></p>
<div class="progress">
<div class="progress-fill" data-progress="{{ progressInt $job.Progress }}"></div>
</div>
<div class="row">
{{ if $job.FrameStart }}<span>Frames: {{ derefInt $job.FrameStart }}{{ if $job.FrameEnd }}-{{ derefInt $job.FrameEnd }}{{ end }}</span>{{ end }}
{{ if $job.OutputFormat }}<span>Format: {{ derefString $job.OutputFormat }}</span>{{ end }}
<span>Created: {{ formatTime $job.CreatedAt }}</span>
</div>
<div class="section-head">
<button id="cancel-job-btn" class="btn{{ if not (or (eq $job.Status "pending") (eq $job.Status "running")) }} hidden{{ end }}" data-cancel-job="{{ $job.ID }}">Cancel Job</button>
<button id="delete-job-btn" class="btn danger{{ if not (or (eq $job.Status "completed") (eq $job.Status "failed") (eq $job.Status "cancelled")) }} hidden{{ end }}" data-delete-job="{{ $job.ID }}">Delete Job</button>
</div>
</section>
<section class="card">
<div class="section-head">
<h2>Tasks</h2>
<button id="tasks-refresh" class="btn tiny">Refresh tasks</button>
</div>
<div id="tasks-fragment"
hx-get="/ui/fragments/jobs/{{ $job.ID }}/tasks"
hx-trigger="load"
hx-swap="innerHTML">
<p>Loading tasks...</p>
</div>
</section>
<section class="card">
<div class="section-head">
<h2>Files</h2>
<div class="row">
<a href="/api/jobs/{{ $job.ID }}/files/exr-zip" class="btn tiny">Download all EXR (.zip)</a>
<button id="files-refresh" class="btn tiny">Refresh files</button>
</div>
</div>
<div id="files-fragment"
hx-get="/ui/fragments/jobs/{{ $job.ID }}/files"
hx-trigger="load"
hx-swap="innerHTML">
<p>Loading files...</p>
</div>
</section>
<div id="exr-preview-modal" class="modal hidden" role="dialog" aria-modal="true" aria-labelledby="exr-preview-title">
<div class="modal-backdrop" data-modal-close></div>
<div class="modal-content">
<div class="section-head">
<h3 id="exr-preview-title">EXR Preview</h3>
<button type="button" id="exr-preview-close" class="btn tiny subtle" data-modal-close>Close</button>
</div>
<p id="exr-preview-name" class="muted"></p>
<div class="modal-body">
<img id="exr-preview-image" alt="EXR preview" class="preview-image hidden">
<p id="exr-preview-loading" class="muted">Loading preview...</p>
<p id="exr-preview-error" class="alert error hidden"></p>
</div>
</div>
</div>
<section class="card">
<div class="section-head">
<h2>Task Logs</h2>
<span id="task-log-status" class="muted">Select a task to view logs.</span>
</div>
<div class="log-controls">
<label>Task
<select id="task-log-task-id">
<option value="">Choose a task...</option>
</select>
</label>
<label>Level
<select id="task-log-level-filter">
<option value="">All</option>
<option value="INFO">INFO</option>
<option value="WARN">WARN</option>
<option value="ERROR">ERROR</option>
<option value="DEBUG">DEBUG</option>
</select>
</label>
<label class="log-toggle"><input id="task-log-auto-refresh" type="checkbox" checked> Auto refresh</label>
<label class="log-toggle"><input id="task-log-follow" type="checkbox" checked> Follow tail</label>
<button id="task-log-refresh" class="btn">Refresh now</button>
<button id="task-log-copy" class="btn subtle">Copy logs</button>
</div>
<div id="task-log-output" class="logs log-lines"></div>
</section>
{{ end }}

web/templates/jobs.html Normal file

@@ -0,0 +1,16 @@
{{ define "page_jobs" }}
<section class="card">
<div class="section-head">
<h1>Your Jobs</h1>
<a href="/jobs/new" class="btn primary">New Job</a>
</div>
<div
id="jobs-fragment"
hx-get="/ui/fragments/jobs"
hx-trigger="load, every 5s"
hx-swap="innerHTML"
>
<p>Loading jobs...</p>
</div>
</section>
{{ end }}

web/templates/login.html Normal file

@@ -0,0 +1,41 @@
{{ define "page_login" }}
<section class="card narrow">
<h1>Sign in to JiggaBlend</h1>
{{ $view := .Data }}
{{ if index $view "error" }}
<div class="alert error">Login error: {{ index $view "error" }}</div>
{{ end }}
<div class="auth-grid">
{{ if index $view "google_enabled" }}
<a class="btn" href="/api/auth/google/login">Continue with Google</a>
{{ end }}
{{ if index $view "discord_enabled" }}
<a class="btn" href="/api/auth/discord/login">Continue with Discord</a>
{{ end }}
</div>
{{ if index $view "local_enabled" }}
<div class="split">
<form id="login-form" class="stack">
<h2>Local Login</h2>
<label>Email or Username<input type="text" name="username" required></label>
<label>Password<input type="password" name="password" required></label>
<button type="submit" class="btn primary">Login</button>
</form>
<form id="register-form" class="stack">
<h2>Register</h2>
<label>Name<input type="text" name="name" required></label>
<label>Email<input type="email" name="email" required></label>
<label>Password<input type="password" name="password" minlength="8" required></label>
<button type="submit" class="btn">Register</button>
</form>
</div>
{{ else }}
<p>Local authentication is disabled.</p>
{{ end }}
<p id="auth-error" class="alert error hidden"></p>
</section>
{{ end }}


@@ -0,0 +1,36 @@
{{ define "partial_admin_apikeys" }}
{{ $keys := index . "keys" }}
{{ if not $keys }}
<p>No API keys generated yet.</p>
{{ else }}
<table class="table">
<thead>
<tr>
<th>ID</th>
<th>Name</th>
<th>Scope</th>
<th>Prefix</th>
<th>Active</th>
<th>Created</th>
<th>Actions</th>
</tr>
</thead>
<tbody>
{{ range $key := $keys }}
<tr>
<td>{{ $key.ID }}</td>
<td>{{ $key.Name }}</td>
<td>{{ $key.Scope }}</td>
<td>{{ $key.Key }}</td>
<td>{{ if $key.IsActive }}yes{{ else }}no{{ end }}</td>
<td>{{ formatTime $key.CreatedAt }}</td>
<td class="row">
<button class="btn tiny" data-revoke-apikey="{{ $key.ID }}">Revoke</button>
<button class="btn tiny danger" data-delete-apikey="{{ $key.ID }}">Delete</button>
</td>
</tr>
{{ end }}
</tbody>
</table>
{{ end }}
{{ end }}


@@ -0,0 +1,35 @@
{{ define "partial_admin_runners" }}
{{ $runners := index . "runners" }}
{{ if not $runners }}
<p>No runners registered.</p>
{{ else }}
<table class="table">
<thead>
<tr>
<th>ID</th>
<th>Name</th>
<th>Host</th>
<th>Status</th>
<th>Priority</th>
<th>Heartbeat</th>
<th>Actions</th>
</tr>
</thead>
<tbody>
{{ range $runner := $runners }}
<tr>
<td>{{ $runner.ID }}</td>
<td>{{ $runner.Name }}</td>
<td>{{ $runner.Hostname }}</td>
<td><span class="status {{ statusClass $runner.Status }}">{{ $runner.Status }}</span></td>
<td>{{ $runner.Priority }}</td>
<td>{{ formatTime $runner.LastHeartbeat }}</td>
<td class="row">
<button class="btn tiny danger" data-delete-runner="{{ $runner.ID }}">Delete</button>
</td>
</tr>
{{ end }}
</tbody>
</table>
{{ end }}
{{ end }}


@@ -0,0 +1,44 @@
{{ define "partial_admin_users" }}
{{ $users := index . "users" }}
{{ $currentUserID := index . "current_user_id" }}
{{ if not $users }}
<p>No users found.</p>
{{ else }}
<table class="table">
<thead>
<tr>
<th>ID</th>
<th>Name</th>
<th>Email</th>
<th>Provider</th>
<th>Admin</th>
<th>Created</th>
<th>Actions</th>
</tr>
</thead>
<tbody>
{{ range $user := $users }}
<tr>
<td>{{ $user.ID }}</td>
<td>{{ $user.Name }}</td>
<td>{{ $user.Email }}</td>
<td>{{ if $user.OAuthProvider }}{{ $user.OAuthProvider }}{{ else }}local{{ end }}</td>
<td>{{ if $user.IsAdmin }}yes{{ else }}no{{ end }}</td>
<td>{{ formatTime $user.CreatedAt }}</td>
<td class="row">
{{ if and $user.IsAdmin (eq $user.ID $currentUserID) }}
<button class="btn tiny" disabled title="You cannot revoke your own admin status">
Revoke Admin
</button>
{{ else }}
<button class="btn tiny" data-set-admin="{{ $user.ID }}" data-admin-value="{{ if $user.IsAdmin }}false{{ else }}true{{ end }}">
{{ if $user.IsAdmin }}Revoke Admin{{ else }}Make Admin{{ end }}
</button>
{{ end }}
</td>
</tr>
{{ end }}
</tbody>
</table>
{{ end }}
{{ end }}


@@ -0,0 +1,82 @@
{{ define "partial_job_files" }}
{{ $jobID := index . "job_id" }}
{{ $files := index . "files" }}
{{ $isAdmin := index . "is_admin" }}
{{ $adminInputFiles := index . "admin_input_files" }}
{{ if not $files }}
<p>No output files found yet.</p>
{{ else }}
<table class="table">
<thead>
<tr>
<th>ID</th>
<th>Name</th>
{{ if $isAdmin }}<th>Type</th>{{ end }}
<th>Size</th>
<th>Created</th>
<th>Actions</th>
</tr>
</thead>
<tbody>
{{ range $file := $files }}
<tr>
<td>{{ $file.ID }}</td>
<td>{{ $file.FileName }}</td>
{{ if $isAdmin }}<td>{{ $file.FileType }}</td>{{ end }}
<td>{{ $file.FileSize }}</td>
<td>{{ formatTime $file.CreatedAt }}</td>
<td class="row">
<a class="btn tiny" href="/api/jobs/{{ $jobID }}/files/{{ $file.ID }}/download">Download</a>
{{ if hasSuffixFold $file.FileName ".exr" }}
<button
type="button"
class="btn tiny"
data-exr-preview-url="/api/jobs/{{ $jobID }}/files/{{ $file.ID }}/preview-exr"
data-exr-preview-name="{{ $file.FileName }}"
>
Preview
</button>
{{ end }}
</td>
</tr>
{{ end }}
</tbody>
</table>
{{ end }}
{{ if $isAdmin }}
<details class="admin-context">
<summary>Admin: context/input files</summary>
{{ if not $adminInputFiles }}
<p>No context/input files found.</p>
{{ else }}
<table class="table">
<thead>
<tr>
<th>ID</th>
<th>Name</th>
<th>Type</th>
<th>Size</th>
<th>Created</th>
<th>Download</th>
</tr>
</thead>
<tbody>
{{ range $file := $adminInputFiles }}
<tr>
<td>{{ $file.ID }}</td>
<td>{{ $file.FileName }}</td>
<td>{{ $file.FileType }}</td>
<td>{{ $file.FileSize }}</td>
<td>{{ formatTime $file.CreatedAt }}</td>
<td>
<a class="btn tiny" href="/api/jobs/{{ $jobID }}/files/{{ $file.ID }}/download">Download</a>
</td>
</tr>
{{ end }}
</tbody>
</table>
{{ end }}
</details>
{{ end }}
{{ end }}


@@ -0,0 +1,37 @@
{{ define "partial_job_tasks" }}
{{ $tasks := index . "tasks" }}
{{ if not $tasks }}
<p>No tasks yet.</p>
{{ else }}
<table class="table">
<thead>
<tr>
<th>ID</th>
<th>Type</th>
<th>Status</th>
<th>Frame(s)</th>
<th>Step</th>
<th>Retries</th>
<th>Error</th>
<th>Logs</th>
</tr>
</thead>
<tbody>
{{ range $task := $tasks }}
<tr>
<td>{{ $task.ID }}</td>
<td>{{ $task.TaskType }}</td>
<td><span class="status {{ statusClass $task.Status }}">{{ $task.Status }}</span></td>
<td>{{ $task.Frame }}{{ if $task.FrameEnd }}-{{ derefInt $task.FrameEnd }}{{ end }}</td>
<td>{{ if $task.CurrentStep }}{{ $task.CurrentStep }}{{ else }}-{{ end }}</td>
<td>{{ $task.RetryCount }}</td>
<td>{{ if $task.Error }}{{ $task.Error }}{{ else }}-{{ end }}</td>
<td>
<button class="btn tiny" data-view-logs-task-id="{{ $task.ID }}">View logs</button>
</td>
</tr>
{{ end }}
</tbody>
</table>
{{ end }}
{{ end }}


@@ -0,0 +1,40 @@
{{ define "partial_jobs_table" }}
{{ $jobs := index . "jobs" }}
{{ if not $jobs }}
<p>No jobs yet. Submit one to get started.</p>
{{ else }}
<table class="table">
<thead>
<tr>
<th>Name</th>
<th>Status</th>
<th>Progress</th>
<th>Frames</th>
<th>Format</th>
<th>Created</th>
<th>Actions</th>
</tr>
</thead>
<tbody>
{{ range $job := $jobs }}
<tr>
<td><a class="job-link" href="/jobs/{{ $job.ID }}">{{ $job.Name }}</a></td>
<td><span class="status {{ statusClass $job.Status }}">{{ $job.Status }}</span></td>
<td>{{ progressInt $job.Progress }}%</td>
<td>{{ if $job.FrameStart }}{{ derefInt $job.FrameStart }}{{ end }}{{ if $job.FrameEnd }}-{{ derefInt $job.FrameEnd }}{{ end }}</td>
<td>{{ if $job.OutputFormat }}{{ derefString $job.OutputFormat }}{{ else }}-{{ end }}</td>
<td>{{ formatTime $job.CreatedAt }}</td>
<td class="row">
{{ if or (eq $job.Status "pending") (eq $job.Status "running") }}
<button class="btn tiny" data-cancel-job="{{ $job.ID }}">Cancel</button>
{{ end }}
{{ if or (eq $job.Status "completed") (eq $job.Status "failed") (eq $job.Status "cancelled") }}
<button class="btn tiny danger" data-delete-job="{{ $job.ID }}">Delete</button>
{{ end }}
</td>
</tr>
{{ end }}
</tbody>
</table>
{{ end }}
{{ end }}


@@ -1,19 +0,0 @@
import { defineConfig } from 'vite'
import react from '@vitejs/plugin-react'
export default defineConfig({
plugins: [react()],
build: {
outDir: 'dist',
emptyOutDir: true,
},
server: {
proxy: {
'/api': {
target: 'http://localhost:8080',
changeOrigin: true,
},
},
},
})