initial commit
Makefile · new file · 30 lines
@@ -0,0 +1,30 @@
.PHONY: build-manager build-runner run-manager run-runner clean test build-web

# Build manager
build-manager:
	go build -o bin/manager ./cmd/manager

# Build runner
build-runner:
	GOOS=linux GOARCH=amd64 go build -o bin/runner ./cmd/runner

# Build web UI
build-web:
	cd web && npm install && npm run build

# Run manager
run-manager:
	go run ./cmd/manager

# Run runner
run-runner:
	go run ./cmd/runner

# Clean build artifacts
clean:
	rm -rf bin/ web/dist/

# Run tests
test:
	go test ./... -timeout 30s
README.md · new file · 174 lines
@@ -0,0 +1,174 @@
# Fuego - Blender Render Farm

A distributed Blender render farm system built with Go. The system consists of a manager server that handles job submission, file storage, and runner coordination, and runner clients that execute Blender renders on Linux amd64 systems.

## Architecture

- **Manager**: Central server with REST API, web UI, SQLite database, and local file storage
- **Runner**: Linux amd64 client that connects to the manager, receives jobs, executes Blender renders, and reports back

## Features

- OAuth authentication (Google and Discord)
- Web-based job submission and monitoring
- Distributed rendering across multiple runners
- Real-time job progress tracking
- File upload/download for Blender files and rendered outputs
- Runner health monitoring

## Prerequisites

### Manager

- Go 1.25 or later (per `go.mod`)
- SQLite3

### Runner

- Linux amd64
- Blender installed and in PATH
- FFmpeg installed (optional, for video processing)
## Installation

1. Clone the repository:
```bash
git clone <repository-url>
cd fuego
```

2. Install dependencies:
```bash
go mod download
```

## Configuration

### Manager

Set the following environment variables for OAuth (optional):

```bash
export GOOGLE_CLIENT_ID="your-google-client-id"
export GOOGLE_CLIENT_SECRET="your-google-client-secret"
export GOOGLE_REDIRECT_URL="http://localhost:8080/api/auth/google/callback"

export DISCORD_CLIENT_ID="your-discord-client-id"
export DISCORD_CLIENT_SECRET="your-discord-client-secret"
export DISCORD_REDIRECT_URL="http://localhost:8080/api/auth/discord/callback"
```
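
The manager's own settings can be supplied as flags or environment variables; the defaults below come from `cmd/manager/main.go`:

```bash
export PORT=8080               # same as -port
export DB_PATH=fuego.db        # same as -db
export STORAGE_PATH=./storage  # same as -storage
```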

### Runner

Apart from a registration token on the first run (see below), no configuration is required; the runner auto-detects its hostname and IP address.
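
The runner's flags can likewise be set through environment variables (example values shown; names and defaults from `cmd/runner/main.go`):

```bash
export MANAGER_URL=http://localhost:8080         # same as -manager
export RUNNER_NAME=my-runner                     # same as -name; defaults to runner-<hostname>
export RUNNER_HOSTNAME=render-01                 # same as -hostname; defaults to os.Hostname()
export RUNNER_IP=127.0.0.1                       # same as -ip
export REGISTRATION_TOKEN=your-token             # same as -token; required on first run
export SECRETS_FILE=/var/lib/fuego/secrets.json  # same as -secrets-file
```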

## Usage

### Running the Manager

```bash
# Using make
make run-manager

# Or directly
go run ./cmd/manager

# With custom options
go run ./cmd/manager -port 8080 -db fuego.db -storage ./storage
```

The manager will start on `http://localhost:8080` by default.
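
A quick way to confirm it is running:

```bash
curl -i http://localhost:8080/api/auth/me
# An authentication error response still confirms the server is answering.
```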

### Running a Runner

```bash
# Using make
make run-runner

# Or directly
go run ./cmd/runner

# With custom options
go run ./cmd/runner -manager http://localhost:8080 -name my-runner
```
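
On first run the runner must register with the manager using a registration token (generated through the admin token endpoint); passing `-secrets-file` persists the resulting credentials so later runs skip registration:

```bash
go run ./cmd/runner -manager http://localhost:8080 \
  -token "$REGISTRATION_TOKEN" \
  -secrets-file ./runner-secrets.json
```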

### Building

```bash
# Build manager
make build-manager

# Build runner (Linux amd64)
make build-runner
```
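
Note that `make build-runner` cross-compiles for linux/amd64 (`GOOS=linux GOARCH=amd64` in the Makefile), so it can be run from any development host.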

## OAuth Setup

### Google OAuth

1. Go to the [Google Cloud Console](https://console.cloud.google.com/)
2. Create a new project or select an existing one
3. Enable the Google+ API
4. Create OAuth 2.0 credentials
5. Add the authorized redirect URI: `http://localhost:8080/api/auth/google/callback`
6. Set the environment variables with the Client ID and Secret

### Discord OAuth

1. Go to the [Discord Developer Portal](https://discord.com/developers/applications)
2. Create a new application
3. Go to the OAuth2 section
4. Add the redirect URI: `http://localhost:8080/api/auth/discord/callback`
5. Set the environment variables with the Client ID and Secret

## Project Structure

```
fuego/
├── cmd/
│   ├── manager/      # Manager server application
│   └── runner/       # Runner client application
├── internal/
│   ├── api/          # REST API handlers
│   ├── auth/         # OAuth authentication
│   ├── database/     # SQLite database models and migrations
│   ├── queue/        # Job queue management
│   ├── storage/      # File storage operations
│   └── runner/       # Runner management logic
├── pkg/
│   └── types/        # Shared types and models
├── web/              # Static web UI files
├── go.mod
└── Makefile
```

## API Endpoints

### Authentication

- `GET /api/auth/google/login` - Initiate Google OAuth
- `GET /api/auth/google/callback` - Google OAuth callback
- `GET /api/auth/discord/login` - Initiate Discord OAuth
- `GET /api/auth/discord/callback` - Discord OAuth callback
- `POST /api/auth/logout` - Logout
- `GET /api/auth/me` - Get current user
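
Login is browser-driven: open `http://localhost:8080/api/auth/google/login` (or the Discord equivalent) and complete the provider's consent flow; the callback route then signs the session in.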

### Jobs

- `POST /api/jobs` - Create a new job
- `GET /api/jobs` - List user's jobs
- `GET /api/jobs/{id}` - Get job details
- `DELETE /api/jobs/{id}` - Cancel a job
- `POST /api/jobs/{id}/upload` - Upload job file
- `GET /api/jobs/{id}/files` - List job files
- `GET /api/jobs/{id}/files/{fileId}/download` - Download job file
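
A minimal command-line session, assuming the request's JSON field names mirror the `jobs` table columns (`name`, `frame_start`, `frame_end`, `output_format`); the upload uses the multipart field `file` that the upload handler expects:

```bash
# Create a job rendering frames 1-10 as PNGs (cookie jar holds the OAuth session)
curl -b cookies.txt -X POST http://localhost:8080/api/jobs \
  -H 'Content-Type: application/json' \
  -d '{"name": "demo", "frame_start": 1, "frame_end": 10, "output_format": "PNG"}'

# Upload the .blend file for job 1
curl -b cookies.txt -X POST http://localhost:8080/api/jobs/1/upload \
  -F 'file=@scene.blend'
```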

### Runners

- `GET /api/runners` - List all runners
- `POST /api/runner/register` - Register a runner
- `POST /api/runner/heartbeat` - Update runner heartbeat
- `GET /api/runner/tasks` - Get pending tasks for runner
- `POST /api/runner/tasks/{id}/complete` - Mark task as complete
- `GET /api/runner/files/{jobId}/{fileName}` - Download file for runner
- `POST /api/runner/files/{jobId}/upload` - Upload file from runner
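
All `/api/runner/*` endpoints except registration are authenticated per request: the runner passes its ID as a `runner_id` query parameter and signs the request with its runner secret (HMAC, verified server-side with a 5-minute clock-skew window).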

## License

MIT
cmd/manager/main.go · new file · 66 lines
@@ -0,0 +1,66 @@
package main

import (
	"flag"
	"fmt"
	"log"
	"net/http"
	"os"

	"fuego/internal/api"
	"fuego/internal/auth"
	"fuego/internal/database"
	"fuego/internal/storage"
)

func main() {
	var (
		port        = flag.String("port", getEnv("PORT", "8080"), "Server port")
		dbPath      = flag.String("db", getEnv("DB_PATH", "fuego.db"), "Database path")
		storagePath = flag.String("storage", getEnv("STORAGE_PATH", "./storage"), "Storage path")
	)
	flag.Parse()

	// Initialize database
	db, err := database.NewDB(*dbPath)
	if err != nil {
		log.Fatalf("Failed to initialize database: %v", err)
	}
	defer db.Close()

	// Initialize auth
	authHandler, err := auth.NewAuth(db.DB)
	if err != nil {
		log.Fatalf("Failed to initialize auth: %v", err)
	}

	// Initialize storage
	storageHandler, err := storage.NewStorage(*storagePath)
	if err != nil {
		log.Fatalf("Failed to initialize storage: %v", err)
	}

	// Create API server
	server, err := api.NewServer(db, authHandler, storageHandler)
	if err != nil {
		log.Fatalf("Failed to create server: %v", err)
	}

	// Start server
	addr := fmt.Sprintf(":%s", *port)
	log.Printf("Starting manager server on %s", addr)
	log.Printf("Database: %s", *dbPath)
	log.Printf("Storage: %s", *storagePath)

	if err := http.ListenAndServe(addr, server); err != nil {
		log.Fatalf("Server failed: %v", err)
	}
}

func getEnv(key, defaultValue string) string {
	if value := os.Getenv(key); value != "" {
		return value
	}
	return defaultValue
}
cmd/runner/main.go · new file · 120 lines
@@ -0,0 +1,120 @@
package main

import (
	"encoding/json"
	"flag"
	"fmt"
	"log"
	"os"

	"fuego/internal/runner"
)

type SecretsFile struct {
	RunnerID      int64  `json:"runner_id"`
	RunnerSecret  string `json:"runner_secret"`
	ManagerSecret string `json:"manager_secret"`
}

func main() {
	var (
		managerURL  = flag.String("manager", getEnv("MANAGER_URL", "http://localhost:8080"), "Manager URL")
		name        = flag.String("name", getEnv("RUNNER_NAME", ""), "Runner name")
		hostname    = flag.String("hostname", getEnv("RUNNER_HOSTNAME", ""), "Runner hostname")
		ipAddress   = flag.String("ip", getEnv("RUNNER_IP", ""), "Runner IP address")
		token       = flag.String("token", getEnv("REGISTRATION_TOKEN", ""), "Registration token")
		secretsFile = flag.String("secrets-file", getEnv("SECRETS_FILE", ""), "Path to secrets file for persistent storage")
	)
	flag.Parse()

	if *name == "" {
		// Distinct variable name so the hostname flag above is not shadowed
		hn, _ := os.Hostname()
		*name = fmt.Sprintf("runner-%s", hn)
	}
	if *hostname == "" {
		*hostname, _ = os.Hostname()
	}
	if *ipAddress == "" {
		*ipAddress = "127.0.0.1"
	}

	client := runner.NewClient(*managerURL, *name, *hostname, *ipAddress)

	// Try to load secrets from file
	var runnerID int64
	var runnerSecret, managerSecret string
	if *secretsFile != "" {
		if secrets, err := loadSecrets(*secretsFile); err == nil {
			runnerID = secrets.RunnerID
			runnerSecret = secrets.RunnerSecret
			managerSecret = secrets.ManagerSecret
			client.SetSecrets(runnerID, runnerSecret, managerSecret)
			log.Printf("Loaded secrets from %s", *secretsFile)
		}
	}

	// If no secrets were loaded, register with the token
	if runnerID == 0 {
		if *token == "" {
			log.Fatalf("Registration token required (use --token or set REGISTRATION_TOKEN env var)")
		}

		var err error
		runnerID, runnerSecret, managerSecret, err = client.Register(*token)
		if err != nil {
			log.Fatalf("Failed to register runner: %v", err)
		}
		log.Printf("Registered runner with ID: %d", runnerID)

		// Save secrets to file if specified
		if *secretsFile != "" {
			secrets := SecretsFile{
				RunnerID:      runnerID,
				RunnerSecret:  runnerSecret,
				ManagerSecret: managerSecret,
			}
			if err := saveSecrets(*secretsFile, secrets); err != nil {
				log.Printf("Warning: Failed to save secrets: %v", err)
			} else {
				log.Printf("Saved secrets to %s", *secretsFile)
			}
		}
	}

	// Start heartbeat loop
	go client.HeartbeatLoop()

	// Start task processing loop
	client.ProcessTasks()
}

func loadSecrets(path string) (*SecretsFile, error) {
	data, err := os.ReadFile(path)
	if err != nil {
		return nil, err
	}

	var secrets SecretsFile
	if err := json.Unmarshal(data, &secrets); err != nil {
		return nil, err
	}

	return &secrets, nil
}

func saveSecrets(path string, secrets SecretsFile) error {
	data, err := json.MarshalIndent(secrets, "", "  ")
	if err != nil {
		return err
	}

	return os.WriteFile(path, data, 0600)
}

func getEnv(key, defaultValue string) string {
	if value := os.Getenv(key); value != "" {
		return value
	}
	return defaultValue
}
go.mod · new file · 12 lines
@@ -0,0 +1,12 @@
module fuego

go 1.25.4

require (
	cloud.google.com/go/compute/metadata v0.3.0 // indirect
	github.com/go-chi/chi/v5 v5.2.3 // indirect
	github.com/go-chi/cors v1.2.2 // indirect
	github.com/google/uuid v1.6.0 // indirect
	github.com/mattn/go-sqlite3 v1.14.32 // indirect
	golang.org/x/oauth2 v0.33.0 // indirect
)
go.sum · new file · 12 lines
@@ -0,0 +1,12 @@
cloud.google.com/go/compute/metadata v0.3.0 h1:Tz+eQXMEqDIKRsmY3cHTL6FVaynIjX2QxYC4trgAKZc=
cloud.google.com/go/compute/metadata v0.3.0/go.mod h1:zFmK7XCadkQkj6TtorcaGlCW1hT1fIilQDwofLpJ20k=
github.com/go-chi/chi/v5 v5.2.3 h1:WQIt9uxdsAbgIYgid+BpYc+liqQZGMHRaUwp0JUcvdE=
github.com/go-chi/chi/v5 v5.2.3/go.mod h1:L2yAIGWB3H+phAw1NxKwWM+7eUH/lU8pOMm5hHcoops=
github.com/go-chi/cors v1.2.2 h1:Jmey33TE+b+rB7fT8MUy1u0I4L+NARQlK6LhzKPSyQE=
github.com/go-chi/cors v1.2.2/go.mod h1:sSbTewc+6wYHBBCW7ytsFSn836hqM7JxpglAy2Vzc58=
github.com/google/uuid v1.6.0 h1:NIvaJDMOsjHA8n1jAhLSgzrAzy1Hgr+hNrb57e+94F0=
github.com/google/uuid v1.6.0/go.mod h1:TIyPZe4MgqvfeYDBFedMoGGpEw/LqOeaOT+nhxU+yHo=
github.com/mattn/go-sqlite3 v1.14.32 h1:JD12Ag3oLy1zQA+BNn74xRgaBbdhbNIDYvQUEuuErjs=
github.com/mattn/go-sqlite3 v1.14.32/go.mod h1:Uh1q+B4BYcTPb+yiD3kU8Ct7aC0hY9fxUwlHK0RXw+Y=
golang.org/x/oauth2 v0.33.0 h1:4Q+qn+E5z8gPRJfmRy7C2gGG3T4jIprK6aSYgTXGRpo=
golang.org/x/oauth2 v0.33.0/go.mod h1:lzm5WQJQwKZ3nwavOZ3IS5Aulzxi68dUSgRHujetwEA=
internal/api/admin.go · new file · 172 lines
@@ -0,0 +1,172 @@
package api

import (
	"database/sql"
	"encoding/json"
	"fmt"
	"net/http"
	"time"

	"fuego/pkg/types"
)

// handleGenerateRegistrationToken generates a new registration token
func (s *Server) handleGenerateRegistrationToken(w http.ResponseWriter, r *http.Request) {
	userID, err := getUserID(r)
	if err != nil {
		s.respondError(w, http.StatusUnauthorized, err.Error())
		return
	}

	// Default expiration: 24 hours
	expiresIn := 24 * time.Hour

	var req struct {
		ExpiresInHours int `json:"expires_in_hours,omitempty"`
	}
	if r.Body != nil && r.ContentLength > 0 {
		if err := json.NewDecoder(r.Body).Decode(&req); err == nil && req.ExpiresInHours > 0 {
			expiresIn = time.Duration(req.ExpiresInHours) * time.Hour
		}
	}

	token, err := s.secrets.GenerateRegistrationToken(userID, expiresIn)
	if err != nil {
		s.respondError(w, http.StatusInternalServerError, fmt.Sprintf("Failed to generate token: %v", err))
		return
	}

	s.respondJSON(w, http.StatusCreated, map[string]interface{}{
		"token":      token,
		"expires_in": expiresIn.String(),
		"expires_at": time.Now().Add(expiresIn),
	})
}

// handleListRegistrationTokens lists all registration tokens
func (s *Server) handleListRegistrationTokens(w http.ResponseWriter, r *http.Request) {
	tokens, err := s.secrets.ListRegistrationTokens()
	if err != nil {
		s.respondError(w, http.StatusInternalServerError, fmt.Sprintf("Failed to list tokens: %v", err))
		return
	}

	s.respondJSON(w, http.StatusOK, tokens)
}

// handleRevokeRegistrationToken revokes a registration token
func (s *Server) handleRevokeRegistrationToken(w http.ResponseWriter, r *http.Request) {
	tokenID, err := parseID(r, "id")
	if err != nil {
		s.respondError(w, http.StatusBadRequest, err.Error())
		return
	}

	if err := s.secrets.RevokeRegistrationToken(tokenID); err != nil {
		s.respondError(w, http.StatusInternalServerError, fmt.Sprintf("Failed to revoke token: %v", err))
		return
	}

	s.respondJSON(w, http.StatusOK, map[string]string{"message": "Token revoked"})
}

// handleVerifyRunner manually verifies a runner
func (s *Server) handleVerifyRunner(w http.ResponseWriter, r *http.Request) {
	runnerID, err := parseID(r, "id")
	if err != nil {
		s.respondError(w, http.StatusBadRequest, err.Error())
		return
	}

	// Check if the runner exists
	var exists bool
	err = s.db.QueryRow("SELECT EXISTS(SELECT 1 FROM runners WHERE id = ?)", runnerID).Scan(&exists)
	if err != nil || !exists {
		s.respondError(w, http.StatusNotFound, "Runner not found")
		return
	}

	// Mark the runner as verified
	_, err = s.db.Exec("UPDATE runners SET verified = 1 WHERE id = ?", runnerID)
	if err != nil {
		s.respondError(w, http.StatusInternalServerError, fmt.Sprintf("Failed to verify runner: %v", err))
		return
	}

	s.respondJSON(w, http.StatusOK, map[string]string{"message": "Runner verified"})
}

// handleDeleteRunner removes a runner
func (s *Server) handleDeleteRunner(w http.ResponseWriter, r *http.Request) {
	runnerID, err := parseID(r, "id")
	if err != nil {
		s.respondError(w, http.StatusBadRequest, err.Error())
		return
	}

	// Check if the runner exists
	var exists bool
	err = s.db.QueryRow("SELECT EXISTS(SELECT 1 FROM runners WHERE id = ?)", runnerID).Scan(&exists)
	if err != nil || !exists {
		s.respondError(w, http.StatusNotFound, "Runner not found")
		return
	}

	// Delete the runner
	_, err = s.db.Exec("DELETE FROM runners WHERE id = ?", runnerID)
	if err != nil {
		s.respondError(w, http.StatusInternalServerError, fmt.Sprintf("Failed to delete runner: %v", err))
		return
	}

	s.respondJSON(w, http.StatusOK, map[string]string{"message": "Runner deleted"})
}

// handleListRunnersAdmin lists all runners with admin details
func (s *Server) handleListRunnersAdmin(w http.ResponseWriter, r *http.Request) {
	rows, err := s.db.Query(
		`SELECT id, name, hostname, ip_address, status, last_heartbeat, capabilities,
		        registration_token, verified, created_at
		 FROM runners ORDER BY created_at DESC`,
	)
	if err != nil {
		s.respondError(w, http.StatusInternalServerError, fmt.Sprintf("Failed to query runners: %v", err))
		return
	}
	defer rows.Close()

	runners := []map[string]interface{}{}
	for rows.Next() {
		var runner types.Runner
		var registrationToken sql.NullString
		var verified bool

		err := rows.Scan(
			&runner.ID, &runner.Name, &runner.Hostname, &runner.IPAddress,
			&runner.Status, &runner.LastHeartbeat, &runner.Capabilities,
			&registrationToken, &verified, &runner.CreatedAt,
		)
		if err != nil {
			s.respondError(w, http.StatusInternalServerError, fmt.Sprintf("Failed to scan runner: %v", err))
			return
		}

		runners = append(runners, map[string]interface{}{
			"id":                 runner.ID,
			"name":               runner.Name,
			"hostname":           runner.Hostname,
			"ip_address":         runner.IPAddress,
			"status":             runner.Status,
			"last_heartbeat":     runner.LastHeartbeat,
			"capabilities":       runner.Capabilities,
			"registration_token": registrationToken.String,
			"verified":           verified,
			"created_at":         runner.CreatedAt,
		})
	}

	s.respondJSON(w, http.StatusOK, runners)
}
internal/api/jobs.go · new file · 498 lines
@@ -0,0 +1,498 @@
package api

import (
	"database/sql"
	"encoding/json"
	"fmt"
	"io"
	"net/http"
	"time"

	"fuego/pkg/types"
)

// handleCreateJob creates a new job
func (s *Server) handleCreateJob(w http.ResponseWriter, r *http.Request) {
	userID, err := getUserID(r)
	if err != nil {
		s.respondError(w, http.StatusUnauthorized, err.Error())
		return
	}

	var req types.CreateJobRequest
	if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
		s.respondError(w, http.StatusBadRequest, "Invalid request body")
		return
	}

	if req.Name == "" {
		s.respondError(w, http.StatusBadRequest, "Job name is required")
		return
	}

	if req.FrameStart < 0 || req.FrameEnd < req.FrameStart {
		s.respondError(w, http.StatusBadRequest, "Invalid frame range")
		return
	}

	if req.OutputFormat == "" {
		req.OutputFormat = "PNG"
	}

	result, err := s.db.Exec(
		`INSERT INTO jobs (user_id, name, status, progress, frame_start, frame_end, output_format)
		 VALUES (?, ?, ?, ?, ?, ?, ?)`,
		userID, req.Name, types.JobStatusPending, 0.0, req.FrameStart, req.FrameEnd, req.OutputFormat,
	)
	if err != nil {
		s.respondError(w, http.StatusInternalServerError, fmt.Sprintf("Failed to create job: %v", err))
		return
	}

	jobID, _ := result.LastInsertId()

	// Create tasks for the job (one task per frame for simplicity; could be batched)
	for frame := req.FrameStart; frame <= req.FrameEnd; frame++ {
		_, err = s.db.Exec(
			`INSERT INTO tasks (job_id, frame_start, frame_end, status) VALUES (?, ?, ?, ?)`,
			jobID, frame, frame, types.TaskStatusPending,
		)
		if err != nil {
			s.respondError(w, http.StatusInternalServerError, fmt.Sprintf("Failed to create tasks: %v", err))
			return
		}
	}

	job := types.Job{
		ID:           jobID,
		UserID:       userID,
		Name:         req.Name,
		Status:       types.JobStatusPending,
		Progress:     0.0,
		FrameStart:   req.FrameStart,
		FrameEnd:     req.FrameEnd,
		OutputFormat: req.OutputFormat,
		CreatedAt:    time.Now(),
	}

	s.respondJSON(w, http.StatusCreated, job)
}

// handleListJobs lists jobs for the current user
func (s *Server) handleListJobs(w http.ResponseWriter, r *http.Request) {
	userID, err := getUserID(r)
	if err != nil {
		s.respondError(w, http.StatusUnauthorized, err.Error())
		return
	}

	rows, err := s.db.Query(
		`SELECT id, user_id, name, status, progress, frame_start, frame_end, output_format,
		        created_at, started_at, completed_at, error_message
		 FROM jobs WHERE user_id = ? ORDER BY created_at DESC`,
		userID,
	)
	if err != nil {
		s.respondError(w, http.StatusInternalServerError, fmt.Sprintf("Failed to query jobs: %v", err))
		return
	}
	defer rows.Close()

	jobs := []types.Job{}
	for rows.Next() {
		var job types.Job
		var startedAt, completedAt sql.NullTime

		err := rows.Scan(
			&job.ID, &job.UserID, &job.Name, &job.Status, &job.Progress,
			&job.FrameStart, &job.FrameEnd, &job.OutputFormat,
			&job.CreatedAt, &startedAt, &completedAt, &job.ErrorMessage,
		)
		if err != nil {
			s.respondError(w, http.StatusInternalServerError, fmt.Sprintf("Failed to scan job: %v", err))
			return
		}

		if startedAt.Valid {
			job.StartedAt = &startedAt.Time
		}
		if completedAt.Valid {
			job.CompletedAt = &completedAt.Time
		}

		jobs = append(jobs, job)
	}

	s.respondJSON(w, http.StatusOK, jobs)
}

// handleGetJob gets a specific job
func (s *Server) handleGetJob(w http.ResponseWriter, r *http.Request) {
	userID, err := getUserID(r)
	if err != nil {
		s.respondError(w, http.StatusUnauthorized, err.Error())
		return
	}

	jobID, err := parseID(r, "id")
	if err != nil {
		s.respondError(w, http.StatusBadRequest, err.Error())
		return
	}

	var job types.Job
	var startedAt, completedAt sql.NullTime

	err = s.db.QueryRow(
		`SELECT id, user_id, name, status, progress, frame_start, frame_end, output_format,
		        created_at, started_at, completed_at, error_message
		 FROM jobs WHERE id = ? AND user_id = ?`,
		jobID, userID,
	).Scan(
		&job.ID, &job.UserID, &job.Name, &job.Status, &job.Progress,
		&job.FrameStart, &job.FrameEnd, &job.OutputFormat,
		&job.CreatedAt, &startedAt, &completedAt, &job.ErrorMessage,
	)

	if err == sql.ErrNoRows {
		s.respondError(w, http.StatusNotFound, "Job not found")
		return
	}
	if err != nil {
		s.respondError(w, http.StatusInternalServerError, fmt.Sprintf("Failed to query job: %v", err))
		return
	}

	if startedAt.Valid {
		job.StartedAt = &startedAt.Time
	}
	if completedAt.Valid {
		job.CompletedAt = &completedAt.Time
	}

	s.respondJSON(w, http.StatusOK, job)
}

// handleCancelJob cancels a job
func (s *Server) handleCancelJob(w http.ResponseWriter, r *http.Request) {
	userID, err := getUserID(r)
	if err != nil {
		s.respondError(w, http.StatusUnauthorized, err.Error())
		return
	}

	jobID, err := parseID(r, "id")
	if err != nil {
		s.respondError(w, http.StatusBadRequest, err.Error())
		return
	}

	result, err := s.db.Exec(
		`UPDATE jobs SET status = ? WHERE id = ? AND user_id = ?`,
		types.JobStatusCancelled, jobID, userID,
	)
	if err != nil {
		s.respondError(w, http.StatusInternalServerError, fmt.Sprintf("Failed to cancel job: %v", err))
		return
	}

	rowsAffected, _ := result.RowsAffected()
	if rowsAffected == 0 {
		s.respondError(w, http.StatusNotFound, "Job not found")
		return
	}

	// Cancel pending tasks
	_, err = s.db.Exec(
		`UPDATE tasks SET status = ? WHERE job_id = ? AND status = ?`,
		types.TaskStatusFailed, jobID, types.TaskStatusPending,
	)
	if err != nil {
		s.respondError(w, http.StatusInternalServerError, fmt.Sprintf("Failed to cancel tasks: %v", err))
		return
	}

	s.respondJSON(w, http.StatusOK, map[string]string{"message": "Job cancelled"})
}

// handleUploadJobFile handles file upload for a job
func (s *Server) handleUploadJobFile(w http.ResponseWriter, r *http.Request) {
	userID, err := getUserID(r)
	if err != nil {
		s.respondError(w, http.StatusUnauthorized, err.Error())
		return
	}

	jobID, err := parseID(r, "id")
	if err != nil {
		s.respondError(w, http.StatusBadRequest, err.Error())
		return
	}

	// Verify the job belongs to the user
	var jobUserID int64
	err = s.db.QueryRow("SELECT user_id FROM jobs WHERE id = ?", jobID).Scan(&jobUserID)
	if err == sql.ErrNoRows {
		s.respondError(w, http.StatusNotFound, "Job not found")
		return
	}
	if err != nil {
		s.respondError(w, http.StatusInternalServerError, fmt.Sprintf("Failed to verify job: %v", err))
		return
	}
	if jobUserID != userID {
		s.respondError(w, http.StatusForbidden, "Access denied")
		return
	}

	// Parse the multipart form
	err = r.ParseMultipartForm(100 << 20) // 100 MB
	if err != nil {
		s.respondError(w, http.StatusBadRequest, "Failed to parse form")
		return
	}

	file, header, err := r.FormFile("file")
	if err != nil {
		s.respondError(w, http.StatusBadRequest, "No file provided")
		return
	}
	defer file.Close()

	// Save the file
	filePath, err := s.storage.SaveUpload(jobID, header.Filename, file)
	if err != nil {
		s.respondError(w, http.StatusInternalServerError, fmt.Sprintf("Failed to save file: %v", err))
		return
	}

	// Record in the database
	result, err := s.db.Exec(
		`INSERT INTO job_files (job_id, file_type, file_path, file_name, file_size)
		 VALUES (?, ?, ?, ?, ?)`,
		jobID, types.JobFileTypeInput, filePath, header.Filename, header.Size,
	)
	if err != nil {
		s.respondError(w, http.StatusInternalServerError, fmt.Sprintf("Failed to record file: %v", err))
		return
	}

	fileID, _ := result.LastInsertId()

	s.respondJSON(w, http.StatusCreated, map[string]interface{}{
		"id":        fileID,
		"file_name": header.Filename,
		"file_path": filePath,
		"file_size": header.Size,
	})
}

// handleListJobFiles lists files for a job
func (s *Server) handleListJobFiles(w http.ResponseWriter, r *http.Request) {
	userID, err := getUserID(r)
	if err != nil {
		s.respondError(w, http.StatusUnauthorized, err.Error())
		return
	}

	jobID, err := parseID(r, "id")
	if err != nil {
		s.respondError(w, http.StatusBadRequest, err.Error())
		return
	}

	// Verify the job belongs to the user
	var jobUserID int64
	err = s.db.QueryRow("SELECT user_id FROM jobs WHERE id = ?", jobID).Scan(&jobUserID)
	if err == sql.ErrNoRows {
		s.respondError(w, http.StatusNotFound, "Job not found")
		return
	}
	if err != nil {
		s.respondError(w, http.StatusInternalServerError, fmt.Sprintf("Failed to verify job: %v", err))
		return
	}
	if jobUserID != userID {
		s.respondError(w, http.StatusForbidden, "Access denied")
		return
	}

	rows, err := s.db.Query(
		`SELECT id, job_id, file_type, file_path, file_name, file_size, created_at
		 FROM job_files WHERE job_id = ? ORDER BY created_at DESC`,
		jobID,
	)
	if err != nil {
		s.respondError(w, http.StatusInternalServerError, fmt.Sprintf("Failed to query files: %v", err))
		return
	}
	defer rows.Close()

	files := []types.JobFile{}
	for rows.Next() {
		var file types.JobFile
		err := rows.Scan(
			&file.ID, &file.JobID, &file.FileType, &file.FilePath,
			&file.FileName, &file.FileSize, &file.CreatedAt,
		)
		if err != nil {
			s.respondError(w, http.StatusInternalServerError, fmt.Sprintf("Failed to scan file: %v", err))
			return
		}
		files = append(files, file)
	}

	s.respondJSON(w, http.StatusOK, files)
}

// handleDownloadJobFile downloads a job file
func (s *Server) handleDownloadJobFile(w http.ResponseWriter, r *http.Request) {
	userID, err := getUserID(r)
	if err != nil {
		s.respondError(w, http.StatusUnauthorized, err.Error())
		return
	}

	jobID, err := parseID(r, "id")
	if err != nil {
		s.respondError(w, http.StatusBadRequest, err.Error())
		return
	}

	fileID, err := parseID(r, "fileId")
	if err != nil {
		s.respondError(w, http.StatusBadRequest, err.Error())
		return
	}

	// Verify the job belongs to the user
	var jobUserID int64
	err = s.db.QueryRow("SELECT user_id FROM jobs WHERE id = ?", jobID).Scan(&jobUserID)
	if err == sql.ErrNoRows {
		s.respondError(w, http.StatusNotFound, "Job not found")
		return
	}
	if err != nil {
		s.respondError(w, http.StatusInternalServerError, fmt.Sprintf("Failed to verify job: %v", err))
		return
	}
	if jobUserID != userID {
		s.respondError(w, http.StatusForbidden, "Access denied")
		return
	}

	// Get the file info
	var filePath, fileName string
	err = s.db.QueryRow(
		`SELECT file_path, file_name FROM job_files WHERE id = ? AND job_id = ?`,
		fileID, jobID,
	).Scan(&filePath, &fileName)
	if err == sql.ErrNoRows {
		s.respondError(w, http.StatusNotFound, "File not found")
		return
	}
	if err != nil {
		s.respondError(w, http.StatusInternalServerError, fmt.Sprintf("Failed to query file: %v", err))
		return
	}

	// Open the file
	file, err := s.storage.GetFile(filePath)
	if err != nil {
		s.respondError(w, http.StatusNotFound, "File not found on disk")
		return
	}
	defer file.Close()

	// Set headers
	w.Header().Set("Content-Disposition", fmt.Sprintf("attachment; filename=%s", fileName))
	w.Header().Set("Content-Type", "application/octet-stream")

	// Stream the file
	io.Copy(w, file)
}

// handleStreamVideo streams an MP4 video file with range support
func (s *Server) handleStreamVideo(w http.ResponseWriter, r *http.Request) {
	userID, err := getUserID(r)
	if err != nil {
		s.respondError(w, http.StatusUnauthorized, err.Error())
		return
	}

	jobID, err := parseID(r, "id")
	if err != nil {
		s.respondError(w, http.StatusBadRequest, err.Error())
		return
	}

	// Verify the job belongs to the user
	var jobUserID int64
	var outputFormat string
	err = s.db.QueryRow("SELECT user_id, output_format FROM jobs WHERE id = ?", jobID).Scan(&jobUserID, &outputFormat)
	if err == sql.ErrNoRows {
		s.respondError(w, http.StatusNotFound, "Job not found")
		return
	}
	if err != nil {
		s.respondError(w, http.StatusInternalServerError, fmt.Sprintf("Failed to verify job: %v", err))
		return
	}
	if jobUserID != userID {
		s.respondError(w, http.StatusForbidden, "Access denied")
		return
	}

	// Find the MP4 file
	var filePath, fileName string
	err = s.db.QueryRow(
		`SELECT file_path, file_name FROM job_files
		 WHERE job_id = ? AND file_type = ? AND file_name LIKE '%.mp4'
		 ORDER BY created_at DESC LIMIT 1`,
		jobID, types.JobFileTypeOutput,
	).Scan(&filePath, &fileName)
	if err == sql.ErrNoRows {
		s.respondError(w, http.StatusNotFound, "Video file not found")
		return
	}
	if err != nil {
		s.respondError(w, http.StatusInternalServerError, fmt.Sprintf("Failed to query file: %v", err))
		return
	}

	// Open the file
	file, err := s.storage.GetFile(filePath)
	if err != nil {
		s.respondError(w, http.StatusNotFound, "File not found on disk")
		return
	}
	defer file.Close()

	// Get file info
	fileInfo, err := file.Stat()
	if err != nil {
		s.respondError(w, http.StatusInternalServerError, "Failed to get file info")
		return
	}

	fileSize := fileInfo.Size()

	// Handle range requests for video seeking
	rangeHeader := r.Header.Get("Range")
	if rangeHeader != "" {
		// Naive parsing: handles "bytes=start-end" and "bytes=start-";
		// suffix ranges such as "bytes=-500" are not supported.
		var start, end int64
		fmt.Sscanf(rangeHeader, "bytes=%d-%d", &start, &end)
		if end == 0 {
			end = fileSize - 1
		}

		// Seek to the start position
		file.Seek(start, 0)

		// Set headers for partial content
		w.Header().Set("Content-Range", fmt.Sprintf("bytes %d-%d/%d", start, end, fileSize))
		w.Header().Set("Accept-Ranges", "bytes")
		w.Header().Set("Content-Length", fmt.Sprintf("%d", end-start+1))
		w.Header().Set("Content-Type", "video/mp4")
		w.WriteHeader(http.StatusPartialContent)

		// Copy the partial content
		io.CopyN(w, file, end-start+1)
	} else {
		// Full file
		w.Header().Set("Content-Type", "video/mp4")
		w.Header().Set("Content-Length", fmt.Sprintf("%d", fileSize))
		w.Header().Set("Accept-Ranges", "bytes")
		io.Copy(w, file)
	}
}
internal/api/runners.go · new file · 582 lines
@@ -0,0 +1,582 @@
|
|||||||
|
package api
|
||||||
|
|
||||||
|
import (
|
||||||
|
"context"
|
||||||
|
"database/sql"
|
||||||
|
"encoding/json"
|
||||||
|
"fmt"
|
||||||
|
"io"
|
||||||
|
"log"
|
||||||
|
"net/http"
|
||||||
|
"strings"
|
||||||
|
"time"
|
||||||
|
|
||||||
|
"github.com/go-chi/chi/v5"
|
||||||
|
"fuego/internal/auth"
|
||||||
|
"fuego/pkg/types"
|
||||||
|
)
|
||||||
|
|
||||||
|
// handleListRunners lists all runners
|
||||||
|
func (s *Server) handleListRunners(w http.ResponseWriter, r *http.Request) {
|
||||||
|
_, err := getUserID(r)
|
||||||
|
if err != nil {
|
||||||
|
s.respondError(w, http.StatusUnauthorized, err.Error())
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
rows, err := s.db.Query(
|
||||||
|
`SELECT id, name, hostname, ip_address, status, last_heartbeat, capabilities, created_at
|
||||||
|
FROM runners ORDER BY created_at DESC`,
|
||||||
|
)
|
||||||
|
if err != nil {
|
||||||
|
s.respondError(w, http.StatusInternalServerError, fmt.Sprintf("Failed to query runners: %v", err))
|
||||||
|
return
|
||||||
|
}
|
||||||
|
defer rows.Close()
|
||||||
|
|
||||||
|
runners := []types.Runner{}
|
||||||
|
for rows.Next() {
|
||||||
|
var runner types.Runner
|
||||||
|
err := rows.Scan(
|
||||||
|
&runner.ID, &runner.Name, &runner.Hostname, &runner.IPAddress,
|
||||||
|
&runner.Status, &runner.LastHeartbeat, &runner.Capabilities, &runner.CreatedAt,
|
||||||
|
)
|
||||||
|
if err != nil {
|
||||||
|
s.respondError(w, http.StatusInternalServerError, fmt.Sprintf("Failed to scan runner: %v", err))
|
||||||
|
return
|
||||||
|
}
|
||||||
|
runners = append(runners, runner)
|
||||||
|
}
|
||||||
|
|
||||||
|
s.respondJSON(w, http.StatusOK, runners)
|
||||||
|
}
|
||||||
|
|
||||||
|
// runnerAuthMiddleware verifies runner requests using HMAC signatures
|
||||||
|
func (s *Server) runnerAuthMiddleware(next http.HandlerFunc) http.HandlerFunc {
|
||||||
|
return func(w http.ResponseWriter, r *http.Request) {
|
||||||
|
// Get runner ID from query string
|
||||||
|
runnerIDStr := r.URL.Query().Get("runner_id")
|
||||||
|
if runnerIDStr == "" {
|
||||||
|
s.respondError(w, http.StatusBadRequest, "runner_id required in query string")
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
var runnerID int64
|
||||||
|
_, err := fmt.Sscanf(runnerIDStr, "%d", &runnerID)
|
||||||
|
if err != nil {
|
||||||
|
s.respondError(w, http.StatusBadRequest, "invalid runner_id")
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
// Get runner secret
|
||||||
|
runnerSecret, err := s.secrets.GetRunnerSecret(runnerID)
|
||||||
|
if err != nil {
|
||||||
|
s.respondError(w, http.StatusUnauthorized, "runner not found or not verified")
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
// Verify request signature
|
||||||
|
valid, err := auth.VerifyRequest(r, runnerSecret, 5*time.Minute)
|
||||||
|
if err != nil || !valid {
|
||||||
|
s.respondError(w, http.StatusUnauthorized, "invalid signature")
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
// Add runner ID to context
|
||||||
|
ctx := r.Context()
|
||||||
|
ctx = context.WithValue(ctx, "runner_id", runnerID)
|
||||||
|
next(w, r.WithContext(ctx))
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// handleRegisterRunner registers a new runner
|
||||||
|
func (s *Server) handleRegisterRunner(w http.ResponseWriter, r *http.Request) {
|
||||||
|
var req struct {
|
||||||
|
types.RegisterRunnerRequest
|
||||||
|
RegistrationToken string `json:"registration_token"`
|
||||||
|
}
|
||||||
|
if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
|
||||||
|
s.respondError(w, http.StatusBadRequest, "Invalid request body")
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
if req.Name == "" {
|
||||||
|
s.respondError(w, http.StatusBadRequest, "Runner name is required")
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
if req.RegistrationToken == "" {
|
||||||
|
s.respondError(w, http.StatusBadRequest, "Registration token is required")
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
// Validate registration token
|
||||||
|
valid, err := s.secrets.ValidateRegistrationToken(req.RegistrationToken)
|
||||||
|
if err != nil {
|
||||||
|
s.respondError(w, http.StatusInternalServerError, fmt.Sprintf("Failed to validate token: %v", err))
|
||||||
|
return
|
||||||
|
}
|
||||||
|
if !valid {
|
||||||
|
s.respondError(w, http.StatusUnauthorized, "Invalid or expired registration token")
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
// Get manager secret
|
||||||
|
managerSecret, err := s.secrets.GetManagerSecret()
|
||||||
|
if err != nil {
|
||||||
|
s.respondError(w, http.StatusInternalServerError, "Failed to get manager secret")
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
// Generate runner secret
|
||||||
|
runnerSecret, err := s.secrets.GenerateRunnerSecret()
|
||||||
|
if err != nil {
|
||||||
|
s.respondError(w, http.StatusInternalServerError, "Failed to generate runner secret")
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
// Register runner
|
||||||
|
result, err := s.db.Exec(
|
||||||
|
`INSERT INTO runners (name, hostname, ip_address, status, last_heartbeat, capabilities,
|
||||||
|
registration_token, runner_secret, manager_secret, verified)
|
||||||
|
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?)`,
|
||||||
|
req.Name, req.Hostname, req.IPAddress, types.RunnerStatusOnline, time.Now(), req.Capabilities,
|
||||||
|
req.RegistrationToken, runnerSecret, managerSecret, true,
|
||||||
|
)
|
||||||
|
if err != nil {
|
||||||
|
s.respondError(w, http.StatusInternalServerError, fmt.Sprintf("Failed to register runner: %v", err))
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
runnerID, _ := result.LastInsertId()
|
||||||
|
|
||||||
|
// Return runner info with secrets
|
||||||
|
s.respondJSON(w, http.StatusCreated, map[string]interface{}{
|
||||||
|
"id": runnerID,
|
||||||
|
"name": req.Name,
|
||||||
|
"hostname": req.Hostname,
|
||||||
|
"ip_address": req.IPAddress,
|
||||||
|
"status": types.RunnerStatusOnline,
|
||||||
|
"runner_secret": runnerSecret,
|
||||||
|
"manager_secret": managerSecret,
|
||||||
|
"verified": true,
|
||||||
|
})
|
||||||
|
}
|
||||||
|
|
||||||
|
// handleRunnerHeartbeat updates runner heartbeat
|
||||||
|
func (s *Server) handleRunnerHeartbeat(w http.ResponseWriter, r *http.Request) {
|
||||||
|
runnerID, ok := r.Context().Value("runner_id").(int64)
|
||||||
|
if !ok {
|
||||||
|
s.respondError(w, http.StatusBadRequest, "runner_id not found in context")
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
_, err := s.db.Exec(
|
||||||
|
`UPDATE runners SET last_heartbeat = ?, status = ? WHERE id = ?`,
|
||||||
|
time.Now(), types.RunnerStatusOnline, runnerID,
|
||||||
|
)
|
||||||
|
if err != nil {
|
||||||
|
s.respondError(w, http.StatusInternalServerError, fmt.Sprintf("Failed to update heartbeat: %v", err))
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
s.respondJSON(w, http.StatusOK, map[string]string{"message": "Heartbeat updated"})
|
||||||
|
}
|
||||||
|
|
||||||
|
// handleGetRunnerTasks gets pending tasks for a runner
|
||||||
|
func (s *Server) handleGetRunnerTasks(w http.ResponseWriter, r *http.Request) {
|
||||||
|
runnerID, ok := r.Context().Value("runner_id").(int64)
|
||||||
|
if !ok {
|
||||||
|
s.respondError(w, http.StatusBadRequest, "runner_id not found in context")
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
// Get pending tasks
|
||||||
|
rows, err := s.db.Query(
|
||||||
|
`SELECT t.id, t.job_id, t.runner_id, t.frame_start, t.frame_end, t.status, t.output_path,
|
||||||
|
t.created_at, t.started_at, t.completed_at, t.error_message,
|
||||||
|
j.name as job_name, j.output_format
|
||||||
|
FROM tasks t
|
||||||
|
JOIN jobs j ON t.job_id = j.id
|
||||||
|
WHERE t.status = ? AND j.status != ?
|
||||||
|
ORDER BY t.created_at ASC
|
||||||
|
LIMIT 10`,
|
||||||
|
types.TaskStatusPending, types.JobStatusCancelled,
|
||||||
|
)
|
||||||
|
if err != nil {
|
||||||
|
s.respondError(w, http.StatusInternalServerError, fmt.Sprintf("Failed to query tasks: %v", err))
|
||||||
|
return
|
||||||
|
}
|
||||||
|
defer rows.Close()
|
||||||
|
|
||||||
|
tasks := []map[string]interface{}{}
|
||||||
|
for rows.Next() {
|
||||||
|
var task types.Task
|
||||||
|
var runnerID sql.NullInt64
|
||||||
|
var startedAt, completedAt sql.NullTime
|
||||||
|
var jobName, outputFormat string
|
||||||
|
|
||||||
|
err := rows.Scan(
|
||||||
|
&task.ID, &task.JobID, &runnerID, &task.FrameStart, &task.FrameEnd,
|
||||||
|
&task.Status, &task.OutputPath, &task.CreatedAt,
|
||||||
|
&startedAt, &completedAt, &task.ErrorMessage,
|
||||||
|
&jobName, &outputFormat,
|
||||||
|
)
|
||||||
|
if err != nil {
|
||||||
|
s.respondError(w, http.StatusInternalServerError, fmt.Sprintf("Failed to scan task: %v", err))
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
if runnerID.Valid {
|
||||||
|
task.RunnerID = &runnerID.Int64
|
||||||
|
}
|
||||||
|
if startedAt.Valid {
|
||||||
|
task.StartedAt = &startedAt.Time
|
||||||
|
}
|
||||||
|
if completedAt.Valid {
|
||||||
|
task.CompletedAt = &completedAt.Time
|
||||||
|
}
|
||||||
|
|
||||||
|
// Get input files for the job
|
||||||
|
var inputFiles []string
|
||||||
|
fileRows, err := s.db.Query(
|
||||||
|
`SELECT file_path FROM job_files WHERE job_id = ? AND file_type = ?`,
|
||||||
|
task.JobID, types.JobFileTypeInput,
|
||||||
|
)
|
||||||
|
if err == nil {
|
||||||
|
for fileRows.Next() {
|
||||||
|
var filePath string
|
||||||
|
if err := fileRows.Scan(&filePath); err == nil {
|
||||||
|
inputFiles = append(inputFiles, filePath)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
fileRows.Close()
|
||||||
|
}
|
||||||
|
|
||||||
|
tasks = append(tasks, map[string]interface{}{
|
||||||
|
"task": task,
|
||||||
|
"job_name": jobName,
|
||||||
|
"output_format": outputFormat,
|
||||||
|
"input_files": inputFiles,
|
||||||
|
})
|
||||||
|
|
||||||
|
// Assign task to runner
|
||||||
|
_, err = s.db.Exec(
|
||||||
|
`UPDATE tasks SET runner_id = ?, status = ? WHERE id = ?`,
|
||||||
|
runnerID, types.TaskStatusRunning, task.ID,
|
||||||
|
)
|
||||||
|
if err != nil {
|
||||||
|
s.respondError(w, http.StatusInternalServerError, fmt.Sprintf("Failed to assign task: %v", err))
|
||||||
|
return
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
s.respondJSON(w, http.StatusOK, tasks)
|
||||||
|
}
|
||||||
|
|
||||||
|
// handleCompleteTask marks a task as completed
|
||||||
|
func (s *Server) handleCompleteTask(w http.ResponseWriter, r *http.Request) {
|
||||||
|
taskID, err := parseID(r, "id")
|
||||||
|
if err != nil {
|
||||||
|
s.respondError(w, http.StatusBadRequest, err.Error())
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
var req struct {
|
||||||
|
OutputPath string `json:"output_path"`
|
||||||
|
Success bool `json:"success"`
|
||||||
|
Error string `json:"error,omitempty"`
|
||||||
|
}
|
||||||
|
if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
|
||||||
|
s.respondError(w, http.StatusBadRequest, "Invalid request body")
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
status := types.TaskStatusCompleted
|
||||||
|
if !req.Success {
|
||||||
|
status = types.TaskStatusFailed
|
||||||
|
}
|
||||||
|
|
||||||
|
now := time.Now()
|
||||||
|
_, err = s.db.Exec(
|
||||||
|
`UPDATE tasks SET status = ?, output_path = ?, completed_at = ?, error_message = ? WHERE id = ?`,
|
||||||
|
status, req.OutputPath, now, req.Error, taskID,
|
||||||
|
)
|
||||||
|
if err != nil {
|
||||||
|
s.respondError(w, http.StatusInternalServerError, fmt.Sprintf("Failed to update task: %v", err))
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
// Update job progress
|
||||||
|
var jobID int64
|
||||||
|
var frameStart, frameEnd int
|
||||||
|
err = s.db.QueryRow(
|
||||||
|
`SELECT job_id, frame_start, frame_end FROM tasks WHERE id = ?`,
|
||||||
|
taskID,
|
||||||
|
).Scan(&jobID, &frameStart, &frameEnd)
|
||||||
|
if err == nil {
|
||||||
|
// Count completed tasks
|
||||||
|
var totalTasks, completedTasks int
|
||||||
|
s.db.QueryRow(
|
||||||
|
`SELECT COUNT(*) FROM tasks WHERE job_id = ?`,
|
||||||
|
jobID,
|
||||||
|
).Scan(&totalTasks)
|
||||||
|
s.db.QueryRow(
|
||||||
|
`SELECT COUNT(*) FROM tasks WHERE job_id = ? AND status = ?`,
|
||||||
|
jobID, types.TaskStatusCompleted,
|
||||||
|
).Scan(&completedTasks)
|
||||||
|
|
||||||
|
progress := float64(completedTasks) / float64(totalTasks) * 100.0
|
||||||
|
|
||||||
|
// Update job status
|
||||||
|
var jobStatus string
|
||||||
|
var outputFormat string
|
||||||
|
s.db.QueryRow(`SELECT output_format FROM jobs WHERE id = ?`, jobID).Scan(&outputFormat)
|
||||||
|
|
||||||
|
if completedTasks == totalTasks {
|
||||||
|
jobStatus = string(types.JobStatusCompleted)
|
||||||
|
now := time.Now()
|
||||||
|
s.db.Exec(
|
||||||
|
`UPDATE jobs SET status = ?, progress = ?, completed_at = ? WHERE id = ?`,
|
||||||
|
jobStatus, progress, now, jobID,
|
||||||
|
)
|
||||||
|
|
||||||
|
// For MP4 jobs, create a video generation task
|
||||||
|
if outputFormat == "MP4" {
|
||||||
|
go s.generateMP4Video(jobID)
|
||||||
|
}
|
||||||
|
} else {
|
||||||
|
jobStatus = string(types.JobStatusRunning)
|
||||||
|
var startedAt sql.NullTime
|
||||||
|
s.db.QueryRow(`SELECT started_at FROM jobs WHERE id = ?`, jobID).Scan(&startedAt)
|
||||||
|
if !startedAt.Valid {
|
||||||
|
now := time.Now()
|
||||||
|
s.db.Exec(`UPDATE jobs SET started_at = ? WHERE id = ?`, now, jobID)
|
||||||
|
}
|
||||||
|
s.db.Exec(
|
||||||
|
`UPDATE jobs SET status = ?, progress = ? WHERE id = ?`,
|
||||||
|
jobStatus, progress, jobID,
|
||||||
|
)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
s.respondJSON(w, http.StatusOK, map[string]string{"message": "Task completed"})
|
||||||
|
}
|
||||||
|
|
||||||
|
// handleUpdateTaskProgress updates task progress
|
||||||
|
func (s *Server) handleUpdateTaskProgress(w http.ResponseWriter, r *http.Request) {
|
||||||
|
_, err := parseID(r, "id")
|
||||||
|
if err != nil {
|
||||||
|
s.respondError(w, http.StatusBadRequest, err.Error())
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
var req struct {
|
||||||
|
Progress float64 `json:"progress"`
|
||||||
|
}
|
||||||
|
if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
|
||||||
|
s.respondError(w, http.StatusBadRequest, "Invalid request body")
|
||||||
|
return
|
||||||
|
}
|
||||||
|
|
||||||
|
// This is mainly for logging/debugging, actual progress is calculated from completed tasks
|
||||||
|
s.respondJSON(w, http.StatusOK, map[string]string{"message": "Progress updated"})
|
||||||
|
}
|
||||||
|
|
||||||
|
// handleDownloadFileForRunner allows runners to download job files
func (s *Server) handleDownloadFileForRunner(w http.ResponseWriter, r *http.Request) {
	jobID, err := parseID(r, "jobId")
	if err != nil {
		s.respondError(w, http.StatusBadRequest, err.Error())
		return
	}

	fileName := chi.URLParam(r, "fileName")

	// Find the file in the database
	var filePath string
	err = s.db.QueryRow(
		`SELECT file_path FROM job_files WHERE job_id = ? AND file_name = ?`,
		jobID, fileName,
	).Scan(&filePath)
	if err == sql.ErrNoRows {
		s.respondError(w, http.StatusNotFound, "File not found")
		return
	}
	if err != nil {
		s.respondError(w, http.StatusInternalServerError, fmt.Sprintf("Failed to query file: %v", err))
		return
	}

	// Open and serve the file
	file, err := s.storage.GetFile(filePath)
	if err != nil {
		s.respondError(w, http.StatusNotFound, "File not found on disk")
		return
	}
	defer file.Close()

	w.Header().Set("Content-Type", "application/octet-stream")
	// Quote the filename so names containing spaces or semicolons don't break the header
	w.Header().Set("Content-Disposition", fmt.Sprintf("attachment; filename=%q", fileName))
	io.Copy(w, file)
}

// handleUploadFileFromRunner allows runners to upload output files
func (s *Server) handleUploadFileFromRunner(w http.ResponseWriter, r *http.Request) {
	jobID, err := parseID(r, "jobId")
	if err != nil {
		s.respondError(w, http.StatusBadRequest, err.Error())
		return
	}

	err = r.ParseMultipartForm(100 << 20) // 100 MB in-memory limit
	if err != nil {
		s.respondError(w, http.StatusBadRequest, "Failed to parse form")
		return
	}

	file, header, err := r.FormFile("file")
	if err != nil {
		s.respondError(w, http.StatusBadRequest, "No file provided")
		return
	}
	defer file.Close()

	// Save the file
	filePath, err := s.storage.SaveOutput(jobID, header.Filename, file)
	if err != nil {
		s.respondError(w, http.StatusInternalServerError, fmt.Sprintf("Failed to save file: %v", err))
		return
	}

	// Record it in the database
	_, err = s.db.Exec(
		`INSERT INTO job_files (job_id, file_type, file_path, file_name, file_size)
		 VALUES (?, ?, ?, ?, ?)`,
		jobID, types.JobFileTypeOutput, filePath, header.Filename, header.Size,
	)
	if err != nil {
		s.respondError(w, http.StatusInternalServerError, fmt.Sprintf("Failed to record file: %v", err))
		return
	}

	s.respondJSON(w, http.StatusCreated, map[string]interface{}{
		"file_path": filePath,
		"file_name": header.Filename,
	})
}

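// Example request against the endpoint above (illustrative; the route is wired
// in setupRoutes as POST /api/runner/files/{jobId}/upload, and the HMAC
// headers produced by the runner client are omitted for brevity):
//
//	curl -X POST -F "file=@frame_0001.png" \
//	     http://localhost:8080/api/runner/files/42/upload
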
// generateMP4Video generates MP4 video from PNG frames for a completed job
func (s *Server) generateMP4Video(jobID int64) {
	// Video generation itself is performed by a runner or an external process;
	// the manager only verifies that rendered frames exist. In a production
	// system you might use a job queue or a dedicated video processor.

	// Get all PNG output files for this job
	rows, err := s.db.Query(
		`SELECT file_path, file_name FROM job_files
		 WHERE job_id = ? AND file_type = ? AND file_name LIKE '%.png'
		 ORDER BY file_name`,
		jobID, types.JobFileTypeOutput,
	)
	if err != nil {
		log.Printf("Failed to query PNG files for job %d: %v", jobID, err)
		return
	}
	defer rows.Close()

	var pngFiles []string
	for rows.Next() {
		var filePath, fileName string
		if err := rows.Scan(&filePath, &fileName); err == nil {
			pngFiles = append(pngFiles, filePath)
		}
	}

	if len(pngFiles) == 0 {
		log.Printf("No PNG files found for job %d", jobID)
		return
	}

	// Note: video generation is handled by runners when they complete tasks.
	// Runners check the job status and generate the MP4 once all frames are complete.
	log.Printf("Job %d completed with %d PNG frames - ready for MP4 generation", jobID, len(pngFiles))
}

// handleGetJobStatusForRunner allows runners to check job status
func (s *Server) handleGetJobStatusForRunner(w http.ResponseWriter, r *http.Request) {
	jobID, err := parseID(r, "jobId")
	if err != nil {
		s.respondError(w, http.StatusBadRequest, err.Error())
		return
	}

	var job types.Job
	var startedAt, completedAt sql.NullTime

	err = s.db.QueryRow(
		`SELECT id, user_id, name, status, progress, frame_start, frame_end, output_format,
		        created_at, started_at, completed_at, error_message
		 FROM jobs WHERE id = ?`,
		jobID,
	).Scan(
		&job.ID, &job.UserID, &job.Name, &job.Status, &job.Progress,
		&job.FrameStart, &job.FrameEnd, &job.OutputFormat,
		&job.CreatedAt, &startedAt, &completedAt, &job.ErrorMessage,
	)

	if err == sql.ErrNoRows {
		s.respondError(w, http.StatusNotFound, "Job not found")
		return
	}
	if err != nil {
		s.respondError(w, http.StatusInternalServerError, fmt.Sprintf("Failed to query job: %v", err))
		return
	}

	if startedAt.Valid {
		job.StartedAt = &startedAt.Time
	}
	if completedAt.Valid {
		job.CompletedAt = &completedAt.Time
	}

	s.respondJSON(w, http.StatusOK, job)
}

// handleGetJobFilesForRunner allows runners to get job files
func (s *Server) handleGetJobFilesForRunner(w http.ResponseWriter, r *http.Request) {
	jobID, err := parseID(r, "jobId")
	if err != nil {
		s.respondError(w, http.StatusBadRequest, err.Error())
		return
	}

	rows, err := s.db.Query(
		`SELECT id, job_id, file_type, file_path, file_name, file_size, created_at
		 FROM job_files WHERE job_id = ? ORDER BY file_name`,
		jobID,
	)
	if err != nil {
		s.respondError(w, http.StatusInternalServerError, fmt.Sprintf("Failed to query files: %v", err))
		return
	}
	defer rows.Close()

	files := []types.JobFile{}
	for rows.Next() {
		var file types.JobFile
		err := rows.Scan(
			&file.ID, &file.JobID, &file.FileType, &file.FilePath,
			&file.FileName, &file.FileSize, &file.CreatedAt,
		)
		if err != nil {
			s.respondError(w, http.StatusInternalServerError, fmt.Sprintf("Failed to scan file: %v", err))
			return
		}
		files = append(files, file)
	}

	s.respondJSON(w, http.StatusOK, files)
}

284
internal/api/server.go
Normal file
@@ -0,0 +1,284 @@
package api

import (
	"encoding/json"
	"fmt"
	"log"
	"net/http"
	"strconv"
	"time"

	"github.com/go-chi/chi/v5"
	"github.com/go-chi/chi/v5/middleware"
	"github.com/go-chi/cors"

	"fuego/internal/auth"
	"fuego/internal/database"
	"fuego/internal/storage"
)

// Server represents the API server
type Server struct {
	db      *database.DB
	auth    *auth.Auth
	secrets *auth.Secrets
	storage *storage.Storage
	router  *chi.Mux
}

// NewServer creates a new API server. The parameters are deliberately not
// named after their packages so that auth.NewSecrets below still refers to
// the package-level constructor.
func NewServer(db *database.DB, authSvc *auth.Auth, store *storage.Storage) (*Server, error) {
	secrets, err := auth.NewSecrets(db.DB)
	if err != nil {
		return nil, fmt.Errorf("failed to initialize secrets: %w", err)
	}

	s := &Server{
		db:      db,
		auth:    authSvc,
		secrets: secrets,
		storage: store,
		router:  chi.NewRouter(),
	}

	s.setupMiddleware()
	s.setupRoutes()

	return s, nil
}

// setupMiddleware configures middleware
func (s *Server) setupMiddleware() {
	s.router.Use(middleware.Logger)
	s.router.Use(middleware.Recoverer)
	s.router.Use(middleware.Timeout(60 * time.Second))

	s.router.Use(cors.Handler(cors.Options{
		AllowedOrigins:   []string{"*"},
		AllowedMethods:   []string{"GET", "POST", "PUT", "DELETE", "OPTIONS"},
		AllowedHeaders:   []string{"Accept", "Authorization", "Content-Type"},
		ExposedHeaders:   []string{"Link"},
		AllowCredentials: true,
		MaxAge:           300,
	}))
}

// setupRoutes configures routes
func (s *Server) setupRoutes() {
	// Public routes
	s.router.Route("/api/auth", func(r chi.Router) {
		r.Get("/google/login", s.handleGoogleLogin)
		r.Get("/google/callback", s.handleGoogleCallback)
		r.Get("/discord/login", s.handleDiscordLogin)
		r.Get("/discord/callback", s.handleDiscordCallback)
		r.Post("/logout", s.handleLogout)
		r.Get("/me", s.handleGetMe)
	})

	// Protected routes
	s.router.Route("/api/jobs", func(r chi.Router) {
		r.Use(func(next http.Handler) http.Handler {
			return http.HandlerFunc(s.auth.Middleware(next.ServeHTTP))
		})
		r.Post("/", s.handleCreateJob)
		r.Get("/", s.handleListJobs)
		r.Get("/{id}", s.handleGetJob)
		r.Delete("/{id}", s.handleCancelJob)
		r.Post("/{id}/upload", s.handleUploadJobFile)
		r.Get("/{id}/files", s.handleListJobFiles)
		r.Get("/{id}/files/{fileId}/download", s.handleDownloadJobFile)
		r.Get("/{id}/video", s.handleStreamVideo)
	})

	s.router.Route("/api/runners", func(r chi.Router) {
		r.Use(func(next http.Handler) http.Handler {
			return http.HandlerFunc(s.auth.Middleware(next.ServeHTTP))
		})
		r.Get("/", s.handleListRunners)
	})

	// Admin routes
	s.router.Route("/api/admin", func(r chi.Router) {
		r.Use(func(next http.Handler) http.Handler {
			return http.HandlerFunc(s.auth.AdminMiddleware(next.ServeHTTP))
		})
		r.Route("/runners", func(r chi.Router) {
			r.Route("/tokens", func(r chi.Router) {
				r.Post("/", s.handleGenerateRegistrationToken)
				r.Get("/", s.handleListRegistrationTokens)
				r.Delete("/{id}", s.handleRevokeRegistrationToken)
			})
			r.Get("/", s.handleListRunnersAdmin)
			r.Post("/{id}/verify", s.handleVerifyRunner)
			r.Delete("/{id}", s.handleDeleteRunner)
		})
	})

	// Runner API
	s.router.Route("/api/runner", func(r chi.Router) {
		// Registration doesn't require auth (it uses a registration token)
		r.Post("/register", s.handleRegisterRunner)

		// All other endpoints require runner authentication
		r.Group(func(r chi.Router) {
			r.Use(func(next http.Handler) http.Handler {
				return http.HandlerFunc(s.runnerAuthMiddleware(next.ServeHTTP))
			})
			r.Post("/heartbeat", s.handleRunnerHeartbeat)
			r.Get("/tasks", s.handleGetRunnerTasks)
			r.Post("/tasks/{id}/complete", s.handleCompleteTask)
			r.Post("/tasks/{id}/progress", s.handleUpdateTaskProgress)
			r.Get("/files/{jobId}/{fileName}", s.handleDownloadFileForRunner)
			r.Post("/files/{jobId}/upload", s.handleUploadFileFromRunner)
			r.Get("/jobs/{jobId}/status", s.handleGetJobStatusForRunner)
			r.Get("/jobs/{jobId}/files", s.handleGetJobFilesForRunner)
		})
	})

	// Serve static files (the built React app)
	s.router.Handle("/*", http.FileServer(http.Dir("./web/dist")))
}

// ServeHTTP implements http.Handler
func (s *Server) ServeHTTP(w http.ResponseWriter, r *http.Request) {
	s.router.ServeHTTP(w, r)
}

// JSON response helpers
func (s *Server) respondJSON(w http.ResponseWriter, status int, data interface{}) {
	w.Header().Set("Content-Type", "application/json")
	w.WriteHeader(status)
	if err := json.NewEncoder(w).Encode(data); err != nil {
		log.Printf("Failed to encode JSON response: %v", err)
	}
}

func (s *Server) respondError(w http.ResponseWriter, status int, message string) {
	s.respondJSON(w, status, map[string]string{"error": message})
}

// Auth handlers
func (s *Server) handleGoogleLogin(w http.ResponseWriter, r *http.Request) {
	url, err := s.auth.GoogleLoginURL()
	if err != nil {
		s.respondError(w, http.StatusInternalServerError, err.Error())
		return
	}
	http.Redirect(w, r, url, http.StatusFound)
}

func (s *Server) handleGoogleCallback(w http.ResponseWriter, r *http.Request) {
	code := r.URL.Query().Get("code")
	if code == "" {
		s.respondError(w, http.StatusBadRequest, "Missing code parameter")
		return
	}

	session, err := s.auth.GoogleCallback(r.Context(), code)
	if err != nil {
		s.respondError(w, http.StatusInternalServerError, err.Error())
		return
	}

	sessionID := s.auth.CreateSession(session)
	http.SetCookie(w, &http.Cookie{
		Name:     "session_id",
		Value:    sessionID,
		Path:     "/",
		MaxAge:   86400,
		HttpOnly: true,
		SameSite: http.SameSiteLaxMode,
	})

	http.Redirect(w, r, "/", http.StatusFound)
}

func (s *Server) handleDiscordLogin(w http.ResponseWriter, r *http.Request) {
	url, err := s.auth.DiscordLoginURL()
	if err != nil {
		s.respondError(w, http.StatusInternalServerError, err.Error())
		return
	}
	http.Redirect(w, r, url, http.StatusFound)
}

func (s *Server) handleDiscordCallback(w http.ResponseWriter, r *http.Request) {
	code := r.URL.Query().Get("code")
	if code == "" {
		s.respondError(w, http.StatusBadRequest, "Missing code parameter")
		return
	}

	session, err := s.auth.DiscordCallback(r.Context(), code)
	if err != nil {
		s.respondError(w, http.StatusInternalServerError, err.Error())
		return
	}

	sessionID := s.auth.CreateSession(session)
	http.SetCookie(w, &http.Cookie{
		Name:     "session_id",
		Value:    sessionID,
		Path:     "/",
		MaxAge:   86400,
		HttpOnly: true,
		SameSite: http.SameSiteLaxMode,
	})

	http.Redirect(w, r, "/", http.StatusFound)
}

func (s *Server) handleLogout(w http.ResponseWriter, r *http.Request) {
	cookie, err := r.Cookie("session_id")
	if err == nil {
		s.auth.DeleteSession(cookie.Value)
	}
	http.SetCookie(w, &http.Cookie{
		Name:     "session_id",
		Value:    "",
		Path:     "/",
		MaxAge:   -1,
		HttpOnly: true,
	})
	s.respondJSON(w, http.StatusOK, map[string]string{"message": "Logged out"})
}

func (s *Server) handleGetMe(w http.ResponseWriter, r *http.Request) {
	cookie, err := r.Cookie("session_id")
	if err != nil {
		s.respondError(w, http.StatusUnauthorized, "Not authenticated")
		return
	}

	session, ok := s.auth.GetSession(cookie.Value)
	if !ok {
		s.respondError(w, http.StatusUnauthorized, "Invalid session")
		return
	}

	s.respondJSON(w, http.StatusOK, map[string]interface{}{
		"id":       session.UserID,
		"email":    session.Email,
		"name":     session.Name,
		"is_admin": session.IsAdmin,
	})
}

// getUserID extracts the authenticated user's ID from the request context
func getUserID(r *http.Request) (int64, error) {
	userID, ok := auth.GetUserID(r.Context())
	if !ok {
		return 0, fmt.Errorf("user ID not found in context")
	}
	return userID, nil
}

// parseID parses a numeric ID from a URL parameter
func parseID(r *http.Request, param string) (int64, error) {
	idStr := chi.URLParam(r, param)
	id, err := strconv.ParseInt(idStr, 10, 64)
	if err != nil {
		return 0, fmt.Errorf("invalid ID: %s", idStr)
	}
	return id, nil
}

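// Minimal wiring sketch (illustrative; cmd/manager is not part of this
// excerpt, and storage.NewStorage is a hypothetical constructor name):
//
//	db, err := database.NewDB("fuego.db")
//	authSvc, err := auth.NewAuth(db.DB)
//	store := storage.NewStorage("./data") // hypothetical
//	srv, err := api.NewServer(db, authSvc, store)
//	log.Fatal(http.ListenAndServe(":8080", srv))
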
302
internal/auth/auth.go
Normal file
@@ -0,0 +1,302 @@
package auth

import (
	"context"
	"database/sql"
	"encoding/json"
	"fmt"
	"net/http"
	"os"
	"sync"
	"time"

	"github.com/google/uuid"
	"golang.org/x/oauth2"
	"golang.org/x/oauth2/google"
)

// Auth handles authentication
type Auth struct {
	db            *sql.DB
	googleConfig  *oauth2.Config
	discordConfig *oauth2.Config

	mu           sync.Mutex // guards sessionStore, which is accessed from concurrent handlers
	sessionStore map[string]*Session
}

// Session represents a user session
type Session struct {
	UserID    int64
	Email     string
	Name      string
	IsAdmin   bool
	ExpiresAt time.Time
}

// NewAuth creates a new auth instance
func NewAuth(db *sql.DB) (*Auth, error) {
	auth := &Auth{
		db:           db,
		sessionStore: make(map[string]*Session),
	}

	// Initialize Google OAuth
	googleClientID := os.Getenv("GOOGLE_CLIENT_ID")
	googleClientSecret := os.Getenv("GOOGLE_CLIENT_SECRET")
	if googleClientID != "" && googleClientSecret != "" {
		auth.googleConfig = &oauth2.Config{
			ClientID:     googleClientID,
			ClientSecret: googleClientSecret,
			RedirectURL:  os.Getenv("GOOGLE_REDIRECT_URL"),
			Scopes:       []string{"openid", "profile", "email"},
			Endpoint:     google.Endpoint,
		}
	}

	// Initialize Discord OAuth
	discordClientID := os.Getenv("DISCORD_CLIENT_ID")
	discordClientSecret := os.Getenv("DISCORD_CLIENT_SECRET")
	if discordClientID != "" && discordClientSecret != "" {
		auth.discordConfig = &oauth2.Config{
			ClientID:     discordClientID,
			ClientSecret: discordClientSecret,
			RedirectURL:  os.Getenv("DISCORD_REDIRECT_URL"),
			Scopes:       []string{"identify", "email"},
			Endpoint: oauth2.Endpoint{
				AuthURL:  "https://discord.com/api/oauth2/authorize",
				TokenURL: "https://discord.com/api/oauth2/token",
			},
		}
	}

	return auth, nil
}

// GoogleLoginURL returns the Google OAuth login URL
func (a *Auth) GoogleLoginURL() (string, error) {
	if a.googleConfig == nil {
		return "", fmt.Errorf("Google OAuth not configured")
	}
	state := uuid.New().String() // note: the state parameter is not currently verified in the callback
	return a.googleConfig.AuthCodeURL(state), nil
}

// DiscordLoginURL returns the Discord OAuth login URL
func (a *Auth) DiscordLoginURL() (string, error) {
	if a.discordConfig == nil {
		return "", fmt.Errorf("Discord OAuth not configured")
	}
	state := uuid.New().String() // note: the state parameter is not currently verified in the callback
	return a.discordConfig.AuthCodeURL(state), nil
}

// GoogleCallback handles the Google OAuth callback
func (a *Auth) GoogleCallback(ctx context.Context, code string) (*Session, error) {
	if a.googleConfig == nil {
		return nil, fmt.Errorf("Google OAuth not configured")
	}

	token, err := a.googleConfig.Exchange(ctx, code)
	if err != nil {
		return nil, fmt.Errorf("failed to exchange token: %w", err)
	}

	client := a.googleConfig.Client(ctx, token)
	resp, err := client.Get("https://www.googleapis.com/oauth2/v2/userinfo")
	if err != nil {
		return nil, fmt.Errorf("failed to get user info: %w", err)
	}
	defer resp.Body.Close()

	var userInfo struct {
		ID    string `json:"id"`
		Email string `json:"email"`
		Name  string `json:"name"`
	}
	if err := json.NewDecoder(resp.Body).Decode(&userInfo); err != nil {
		return nil, fmt.Errorf("failed to decode user info: %w", err)
	}

	return a.getOrCreateUser("google", userInfo.ID, userInfo.Email, userInfo.Name)
}

// DiscordCallback handles the Discord OAuth callback
func (a *Auth) DiscordCallback(ctx context.Context, code string) (*Session, error) {
	if a.discordConfig == nil {
		return nil, fmt.Errorf("Discord OAuth not configured")
	}

	token, err := a.discordConfig.Exchange(ctx, code)
	if err != nil {
		return nil, fmt.Errorf("failed to exchange token: %w", err)
	}

	client := a.discordConfig.Client(ctx, token)
	resp, err := client.Get("https://discord.com/api/users/@me")
	if err != nil {
		return nil, fmt.Errorf("failed to get user info: %w", err)
	}
	defer resp.Body.Close()

	var userInfo struct {
		ID       string `json:"id"`
		Email    string `json:"email"`
		Username string `json:"username"`
	}
	if err := json.NewDecoder(resp.Body).Decode(&userInfo); err != nil {
		return nil, fmt.Errorf("failed to decode user info: %w", err)
	}

	return a.getOrCreateUser("discord", userInfo.ID, userInfo.Email, userInfo.Username)
}

// getOrCreateUser gets or creates a user in the database
func (a *Auth) getOrCreateUser(provider, oauthID, email, name string) (*Session, error) {
	var userID int64
	var dbEmail, dbName string
	var isAdmin bool

	err := a.db.QueryRow(
		"SELECT id, email, name, is_admin FROM users WHERE oauth_provider = ? AND oauth_id = ?",
		provider, oauthID,
	).Scan(&userID, &dbEmail, &dbName, &isAdmin)

	if err == sql.ErrNoRows {
		// Check if this is the first user; the first user becomes an admin
		var userCount int
		a.db.QueryRow("SELECT COUNT(*) FROM users").Scan(&userCount)
		isAdmin = userCount == 0

		// Create a new user
		result, err := a.db.Exec(
			"INSERT INTO users (email, name, oauth_provider, oauth_id, is_admin) VALUES (?, ?, ?, ?, ?)",
			email, name, provider, oauthID, isAdmin,
		)
		if err != nil {
			return nil, fmt.Errorf("failed to create user: %w", err)
		}
		userID, _ = result.LastInsertId()
	} else if err != nil {
		return nil, fmt.Errorf("failed to query user: %w", err)
	} else {
		// Update user info if it changed
		if dbEmail != email || dbName != name {
			_, err = a.db.Exec(
				"UPDATE users SET email = ?, name = ? WHERE id = ?",
				email, name, userID,
			)
			if err != nil {
				return nil, fmt.Errorf("failed to update user: %w", err)
			}
		}
	}

	session := &Session{
		UserID:    userID,
		Email:     email,
		Name:      name,
		IsAdmin:   isAdmin,
		ExpiresAt: time.Now().Add(24 * time.Hour),
	}

	return session, nil
}

// CreateSession creates a new session and returns a session ID
func (a *Auth) CreateSession(session *Session) string {
	sessionID := uuid.New().String()
	a.mu.Lock()
	a.sessionStore[sessionID] = session
	a.mu.Unlock()
	return sessionID
}

// GetSession retrieves a session by ID, evicting it if it has expired
func (a *Auth) GetSession(sessionID string) (*Session, bool) {
	a.mu.Lock()
	session, ok := a.sessionStore[sessionID]
	if ok && time.Now().After(session.ExpiresAt) {
		delete(a.sessionStore, sessionID)
		ok = false
	}
	a.mu.Unlock()
	if !ok {
		return nil, false
	}

	// Refresh admin status from the database
	var isAdmin bool
	err := a.db.QueryRow("SELECT is_admin FROM users WHERE id = ?", session.UserID).Scan(&isAdmin)
	if err == nil {
		session.IsAdmin = isAdmin
	}
	return session, true
}

// DeleteSession deletes a session
func (a *Auth) DeleteSession(sessionID string) {
	a.mu.Lock()
	delete(a.sessionStore, sessionID)
	a.mu.Unlock()
}

// Middleware creates an authentication middleware
func (a *Auth) Middleware(next http.HandlerFunc) http.HandlerFunc {
	return func(w http.ResponseWriter, r *http.Request) {
		cookie, err := r.Cookie("session_id")
		if err != nil {
			http.Error(w, "Unauthorized", http.StatusUnauthorized)
			return
		}

		session, ok := a.GetSession(cookie.Value)
		if !ok {
			http.Error(w, "Unauthorized", http.StatusUnauthorized)
			return
		}

		// Add user info to the request context
		ctx := context.WithValue(r.Context(), "user_id", session.UserID)
		ctx = context.WithValue(ctx, "user_email", session.Email)
		ctx = context.WithValue(ctx, "user_name", session.Name)
		ctx = context.WithValue(ctx, "is_admin", session.IsAdmin)
		next(w, r.WithContext(ctx))
	}
}

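// Usage sketch (illustrative only): given an *Auth value a, the middleware
// wraps a plain http.HandlerFunc, which is how setupRoutes in
// internal/api/server.go adapts it for chi:
//
//	mux := http.NewServeMux()
//	mux.HandleFunc("/api/private", a.Middleware(func(w http.ResponseWriter, r *http.Request) {
//		if id, ok := GetUserID(r.Context()); ok {
//			fmt.Fprintf(w, "hello, user %d", id)
//		}
//	}))
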
// GetUserID gets the user ID from context
func GetUserID(ctx context.Context) (int64, bool) {
	userID, ok := ctx.Value("user_id").(int64)
	return userID, ok
}

// IsAdmin checks if the user in context is an admin
func IsAdmin(ctx context.Context) bool {
	isAdmin, ok := ctx.Value("is_admin").(bool)
	return ok && isAdmin
}

// AdminMiddleware creates an admin-only middleware
func (a *Auth) AdminMiddleware(next http.HandlerFunc) http.HandlerFunc {
	return func(w http.ResponseWriter, r *http.Request) {
		// First check authentication
		cookie, err := r.Cookie("session_id")
		if err != nil {
			http.Error(w, "Unauthorized", http.StatusUnauthorized)
			return
		}

		session, ok := a.GetSession(cookie.Value)
		if !ok {
			http.Error(w, "Unauthorized", http.StatusUnauthorized)
			return
		}

		// Then check admin status
		if !session.IsAdmin {
			http.Error(w, "Forbidden: Admin access required", http.StatusForbidden)
			return
		}

		// Add user info to the request context
		ctx := context.WithValue(r.Context(), "user_id", session.UserID)
		ctx = context.WithValue(ctx, "user_email", session.Email)
		ctx = context.WithValue(ctx, "user_name", session.Name)
		ctx = context.WithValue(ctx, "is_admin", session.IsAdmin)
		next(w, r.WithContext(ctx))
	}
}

244
internal/auth/secrets.go
Normal file
@@ -0,0 +1,244 @@
package auth

import (
	"crypto/hmac"
	"crypto/rand"
	"crypto/sha256"
	"database/sql"
	"encoding/hex"
	"fmt"
	"io"
	"net/http"
	"strings"
	"time"
)

// Secrets handles secret and token management
type Secrets struct {
	db *sql.DB
}

// NewSecrets creates a new secrets manager
func NewSecrets(db *sql.DB) (*Secrets, error) {
	s := &Secrets{db: db}

	// Ensure a manager secret exists
	if err := s.ensureManagerSecret(); err != nil {
		return nil, fmt.Errorf("failed to ensure manager secret: %w", err)
	}

	return s, nil
}

// ensureManagerSecret ensures a manager secret exists in the database
func (s *Secrets) ensureManagerSecret() error {
	var count int
	err := s.db.QueryRow("SELECT COUNT(*) FROM manager_secrets").Scan(&count)
	if err != nil {
		return fmt.Errorf("failed to check manager secrets: %w", err)
	}

	if count == 0 {
		// Generate a new manager secret
		secret, err := generateSecret(32)
		if err != nil {
			return fmt.Errorf("failed to generate manager secret: %w", err)
		}

		_, err = s.db.Exec("INSERT INTO manager_secrets (secret) VALUES (?)", secret)
		if err != nil {
			return fmt.Errorf("failed to store manager secret: %w", err)
		}
	}

	return nil
}

// GetManagerSecret retrieves the current manager secret
func (s *Secrets) GetManagerSecret() (string, error) {
	var secret string
	err := s.db.QueryRow("SELECT secret FROM manager_secrets ORDER BY created_at DESC LIMIT 1").Scan(&secret)
	if err != nil {
		return "", fmt.Errorf("failed to get manager secret: %w", err)
	}
	return secret, nil
}

// GenerateRegistrationToken generates a new registration token
func (s *Secrets) GenerateRegistrationToken(createdBy int64, expiresIn time.Duration) (string, error) {
	token, err := generateSecret(32)
	if err != nil {
		return "", fmt.Errorf("failed to generate token: %w", err)
	}

	expiresAt := time.Now().Add(expiresIn)

	_, err = s.db.Exec(
		"INSERT INTO registration_tokens (token, expires_at, created_by) VALUES (?, ?, ?)",
		token, expiresAt, createdBy,
	)
	if err != nil {
		return "", fmt.Errorf("failed to store registration token: %w", err)
	}

	return token, nil
}

// ValidateRegistrationToken validates a registration token and, on success,
// marks it as used so it cannot be redeemed twice.
func (s *Secrets) ValidateRegistrationToken(token string) (bool, error) {
	var used bool
	var expiresAt time.Time
	var id int64

	err := s.db.QueryRow(
		"SELECT id, expires_at, used FROM registration_tokens WHERE token = ?",
		token,
	).Scan(&id, &expiresAt, &used)

	if err == sql.ErrNoRows {
		return false, nil
	}
	if err != nil {
		return false, fmt.Errorf("failed to query token: %w", err)
	}

	if used {
		return false, nil
	}

	if time.Now().After(expiresAt) {
		return false, nil
	}

	// Mark the token as used
	_, err = s.db.Exec("UPDATE registration_tokens SET used = 1 WHERE id = ?", id)
	if err != nil {
		return false, fmt.Errorf("failed to mark token as used: %w", err)
	}

	return true, nil
}

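// Illustrative flow (not part of the original commit): an admin mints a token
// and a runner redeems it exactly once; validation consumes the token.
//
//	tok, _ := s.GenerateRegistrationToken(adminID, 24*time.Hour)
//	ok, _ := s.ValidateRegistrationToken(tok) // true; the token is now marked used
//	ok, _ = s.ValidateRegistrationToken(tok)  // false on any subsequent attempt
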
// ListRegistrationTokens lists all registration tokens
func (s *Secrets) ListRegistrationTokens() ([]map[string]interface{}, error) {
	rows, err := s.db.Query(
		`SELECT id, token, expires_at, used, created_at, created_by
		 FROM registration_tokens
		 ORDER BY created_at DESC`,
	)
	if err != nil {
		return nil, fmt.Errorf("failed to query tokens: %w", err)
	}
	defer rows.Close()

	var tokens []map[string]interface{}
	for rows.Next() {
		var id, createdBy sql.NullInt64
		var token string
		var expiresAt, createdAt time.Time
		var used bool

		err := rows.Scan(&id, &token, &expiresAt, &used, &createdAt, &createdBy)
		if err != nil {
			continue
		}

		tokens = append(tokens, map[string]interface{}{
			"id":         id.Int64,
			"token":      token,
			"expires_at": expiresAt,
			"used":       used,
			"created_at": createdAt,
			"created_by": createdBy.Int64,
		})
	}

	return tokens, nil
}

// RevokeRegistrationToken revokes a registration token
func (s *Secrets) RevokeRegistrationToken(tokenID int64) error {
	_, err := s.db.Exec("UPDATE registration_tokens SET used = 1 WHERE id = ?", tokenID)
	return err
}

// GenerateRunnerSecret generates a unique secret for a runner
func (s *Secrets) GenerateRunnerSecret() (string, error) {
	return generateSecret(32)
}

// SignRequest signs a request with the given secret. The signed message is
// the method, the request URI, the body, and a Unix timestamp, joined by
// newlines and MACed with HMAC-SHA256.
func SignRequest(method, path, body, secret string, timestamp time.Time) string {
	message := fmt.Sprintf("%s\n%s\n%s\n%d", method, path, body, timestamp.Unix())
	h := hmac.New(sha256.New, []byte(secret))
	h.Write([]byte(message))
	return hex.EncodeToString(h.Sum(nil))
}

// VerifyRequest verifies a signed request
func VerifyRequest(r *http.Request, secret string, maxAge time.Duration) (bool, error) {
	signature := r.Header.Get("X-Runner-Signature")
	if signature == "" {
		return false, fmt.Errorf("missing signature")
	}

	timestampStr := r.Header.Get("X-Runner-Timestamp")
	if timestampStr == "" {
		return false, fmt.Errorf("missing timestamp")
	}

	// The header carries a Unix timestamp; parse it as an integer first
	// (scanning %d directly into a time.Time would always fail).
	var unixTS int64
	if _, err := fmt.Sscanf(timestampStr, "%d", &unixTS); err != nil {
		return false, fmt.Errorf("invalid timestamp: %w", err)
	}
	timestamp := time.Unix(unixTS, 0)

	// Check that the timestamp is not too old
	if time.Since(timestamp) > maxAge {
		return false, fmt.Errorf("request too old")
	}

	// Check that the timestamp is not in the future (allow 1 minute of clock skew)
	if timestamp.After(time.Now().Add(1 * time.Minute)) {
		return false, fmt.Errorf("timestamp in future")
	}

	// Read the body
	bodyBytes, err := io.ReadAll(r.Body)
	if err != nil {
		return false, fmt.Errorf("failed to read body: %w", err)
	}
	// Restore the body for the handler
	r.Body = io.NopCloser(strings.NewReader(string(bodyBytes)))

	// Verify the signature. The runner signs the full request URI including
	// the query string, so verify against RequestURI rather than the bare Path.
	expectedSig := SignRequest(r.Method, r.URL.RequestURI(), string(bodyBytes), secret, timestamp)

	return hmac.Equal([]byte(signature), []byte(expectedSig)), nil
}

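// signVerifyRoundTrip is an illustrative sketch (not part of the original
// commit): a request signed with SignRequest passes VerifyRequest when the
// same secret, URI, body, and timestamp are used. The secret and URL here
// are placeholder values.
func signVerifyRoundTrip() (bool, error) {
	secret := "0123abcd" // hypothetical shared secret
	now := time.Now()
	body := `{"ok":true}`
	sig := SignRequest("POST", "/api/runner/heartbeat?runner_id=1", body, secret, now)

	req, err := http.NewRequest("POST", "http://manager:8080/api/runner/heartbeat?runner_id=1", strings.NewReader(body))
	if err != nil {
		return false, err
	}
	req.Header.Set("X-Runner-Signature", sig)
	req.Header.Set("X-Runner-Timestamp", fmt.Sprintf("%d", now.Unix()))

	return VerifyRequest(req, secret, 5*time.Minute) // expected: true, nil
}
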
// GetRunnerSecret retrieves the runner secret for a runner ID
func (s *Secrets) GetRunnerSecret(runnerID int64) (string, error) {
	var secret string
	err := s.db.QueryRow("SELECT runner_secret FROM runners WHERE id = ?", runnerID).Scan(&secret)
	if err == sql.ErrNoRows {
		return "", fmt.Errorf("runner not found")
	}
	if err != nil {
		return "", fmt.Errorf("failed to get runner secret: %w", err)
	}
	if secret == "" {
		return "", fmt.Errorf("runner not verified")
	}
	return secret, nil
}

// generateSecret generates a hex-encoded secret from length random bytes
// (so the returned string is 2*length characters long)
func generateSecret(length int) (string, error) {
	bytes := make([]byte, length)
	if _, err := rand.Read(bytes); err != nil {
		return "", err
	}
	return hex.EncodeToString(bytes), nil
}

159
internal/database/schema.go
Normal file
@@ -0,0 +1,159 @@
package database

import (
	"database/sql"
	"fmt"

	_ "github.com/mattn/go-sqlite3"
)

// DB wraps the database connection
type DB struct {
	*sql.DB
}

// NewDB creates a new database connection
func NewDB(dbPath string) (*DB, error) {
	db, err := sql.Open("sqlite3", dbPath+"?_foreign_keys=1")
	if err != nil {
		return nil, fmt.Errorf("failed to open database: %w", err)
	}

	if err := db.Ping(); err != nil {
		return nil, fmt.Errorf("failed to ping database: %w", err)
	}

	database := &DB{DB: db}
	if err := database.migrate(); err != nil {
		return nil, fmt.Errorf("failed to migrate database: %w", err)
	}

	return database, nil
}

// migrate runs database migrations
func (db *DB) migrate() error {
	schema := `
	CREATE TABLE IF NOT EXISTS users (
		id INTEGER PRIMARY KEY AUTOINCREMENT,
		email TEXT UNIQUE NOT NULL,
		name TEXT NOT NULL,
		oauth_provider TEXT NOT NULL,
		oauth_id TEXT NOT NULL,
		is_admin BOOLEAN NOT NULL DEFAULT 0,
		created_at DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
		UNIQUE(oauth_provider, oauth_id)
	);

	CREATE TABLE IF NOT EXISTS jobs (
		id INTEGER PRIMARY KEY AUTOINCREMENT,
		user_id INTEGER NOT NULL,
		name TEXT NOT NULL,
		status TEXT NOT NULL DEFAULT 'pending',
		progress REAL NOT NULL DEFAULT 0.0,
		frame_start INTEGER NOT NULL,
		frame_end INTEGER NOT NULL,
		output_format TEXT NOT NULL DEFAULT 'PNG',
		created_at DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
		started_at DATETIME,
		completed_at DATETIME,
		error_message TEXT,
		FOREIGN KEY (user_id) REFERENCES users(id) ON DELETE CASCADE
	);

	CREATE TABLE IF NOT EXISTS runners (
		id INTEGER PRIMARY KEY AUTOINCREMENT,
		name TEXT NOT NULL,
		hostname TEXT NOT NULL,
		ip_address TEXT NOT NULL,
		status TEXT NOT NULL DEFAULT 'offline',
		last_heartbeat DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
		capabilities TEXT,
		registration_token TEXT,
		runner_secret TEXT,
		manager_secret TEXT,
		verified BOOLEAN NOT NULL DEFAULT 0,
		created_at DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP
	);

	CREATE TABLE IF NOT EXISTS tasks (
		id INTEGER PRIMARY KEY AUTOINCREMENT,
		job_id INTEGER NOT NULL,
		runner_id INTEGER,
		frame_start INTEGER NOT NULL,
		frame_end INTEGER NOT NULL,
		status TEXT NOT NULL DEFAULT 'pending',
		output_path TEXT,
		created_at DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
		started_at DATETIME,
		completed_at DATETIME,
		error_message TEXT,
		FOREIGN KEY (job_id) REFERENCES jobs(id) ON DELETE CASCADE,
		FOREIGN KEY (runner_id) REFERENCES runners(id) ON DELETE SET NULL
	);

	CREATE TABLE IF NOT EXISTS job_files (
		id INTEGER PRIMARY KEY AUTOINCREMENT,
		job_id INTEGER NOT NULL,
		file_type TEXT NOT NULL,
		file_path TEXT NOT NULL,
		file_name TEXT NOT NULL,
		file_size INTEGER NOT NULL,
		created_at DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
		FOREIGN KEY (job_id) REFERENCES jobs(id) ON DELETE CASCADE
	);

	CREATE TABLE IF NOT EXISTS manager_secrets (
		id INTEGER PRIMARY KEY AUTOINCREMENT,
		secret TEXT UNIQUE NOT NULL,
		created_at DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP
	);

	CREATE TABLE IF NOT EXISTS registration_tokens (
		id INTEGER PRIMARY KEY AUTOINCREMENT,
		token TEXT UNIQUE NOT NULL,
		expires_at DATETIME NOT NULL,
		used BOOLEAN NOT NULL DEFAULT 0,
		created_at DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
		created_by INTEGER,
		FOREIGN KEY (created_by) REFERENCES users(id) ON DELETE SET NULL
	);

	CREATE INDEX IF NOT EXISTS idx_jobs_user_id ON jobs(user_id);
	CREATE INDEX IF NOT EXISTS idx_jobs_status ON jobs(status);
	CREATE INDEX IF NOT EXISTS idx_tasks_job_id ON tasks(job_id);
	CREATE INDEX IF NOT EXISTS idx_tasks_runner_id ON tasks(runner_id);
	CREATE INDEX IF NOT EXISTS idx_tasks_status ON tasks(status);
	CREATE INDEX IF NOT EXISTS idx_job_files_job_id ON job_files(job_id);
	CREATE INDEX IF NOT EXISTS idx_registration_tokens_token ON registration_tokens(token);
	CREATE INDEX IF NOT EXISTS idx_registration_tokens_expires_at ON registration_tokens(expires_at);
	`

	if _, err := db.Exec(schema); err != nil {
		return fmt.Errorf("failed to create schema: %w", err)
	}

	// Migrate existing tables to add new columns
	migrations := []string{
		// Add is_admin to users if it doesn't exist
		`ALTER TABLE users ADD COLUMN is_admin BOOLEAN NOT NULL DEFAULT 0`,
		// Add new columns to runners if they don't exist
		`ALTER TABLE runners ADD COLUMN registration_token TEXT`,
		`ALTER TABLE runners ADD COLUMN runner_secret TEXT`,
		`ALTER TABLE runners ADD COLUMN manager_secret TEXT`,
		`ALTER TABLE runners ADD COLUMN verified BOOLEAN NOT NULL DEFAULT 0`,
	}

	for _, migration := range migrations {
		// SQLite doesn't support IF NOT EXISTS for ALTER TABLE, so we ignore errors
		db.Exec(migration)
	}

	return nil
}

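// hasColumn is an illustrative sketch (not part of the original commit): an
// alternative to blindly ignoring ALTER TABLE errors is to check column
// existence first via SQLite's PRAGMA table_info. The helper name is
// hypothetical and the function is not used by the code above.
func (db *DB) hasColumn(table, column string) (bool, error) {
	rows, err := db.Query(fmt.Sprintf("PRAGMA table_info(%s)", table))
	if err != nil {
		return false, err
	}
	defer rows.Close()
	for rows.Next() {
		// PRAGMA table_info yields: cid, name, type, notnull, dflt_value, pk
		var cid, notNull, pk int
		var name, ctype string
		var dflt sql.NullString
		if err := rows.Scan(&cid, &name, &ctype, &notNull, &dflt, &pk); err != nil {
			return false, err
		}
		if name == column {
			return true, nil
		}
	}
	return false, rows.Err()
}
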
// Close closes the database connection
func (db *DB) Close() error {
	return db.DB.Close()
}

627
internal/runner/client.go
Normal file
@@ -0,0 +1,627 @@
package runner

import (
	"bytes"
	"crypto/hmac"
	"crypto/sha256"
	"encoding/hex"
	"encoding/json"
	"fmt"
	"io"
	"log"
	"mime/multipart"
	"net/http"
	"os"
	"os/exec"
	"path/filepath"
	"sort"
	"strings"
	"time"
)

// Client represents a runner client
type Client struct {
	managerURL    string
	name          string
	hostname      string
	ipAddress     string
	httpClient    *http.Client
	runnerID      int64
	runnerSecret  string
	managerSecret string
}

// NewClient creates a new runner client
func NewClient(managerURL, name, hostname, ipAddress string) *Client {
	return &Client{
		managerURL: managerURL,
		name:       name,
		hostname:   hostname,
		ipAddress:  ipAddress,
		httpClient: &http.Client{Timeout: 30 * time.Second},
	}
}

// SetSecrets sets the runner and manager secrets
func (c *Client) SetSecrets(runnerID int64, runnerSecret, managerSecret string) {
	c.runnerID = runnerID
	c.runnerSecret = runnerSecret
	c.managerSecret = managerSecret
}

// Register registers the runner with the manager using a registration token
func (c *Client) Register(registrationToken string) (int64, string, string, error) {
	req := map[string]interface{}{
		"name":               c.name,
		"hostname":           c.hostname,
		"ip_address":         c.ipAddress,
		"capabilities":       "blender,ffmpeg",
		"registration_token": registrationToken,
	}

	body, _ := json.Marshal(req)
	resp, err := c.httpClient.Post(
		fmt.Sprintf("%s/api/runner/register", c.managerURL),
		"application/json",
		bytes.NewReader(body),
	)
	if err != nil {
		return 0, "", "", fmt.Errorf("failed to register: %w", err)
	}
	defer resp.Body.Close()

	if resp.StatusCode != http.StatusCreated {
		body, _ := io.ReadAll(resp.Body)
		return 0, "", "", fmt.Errorf("registration failed: %s", string(body))
	}

	var result struct {
		ID            int64  `json:"id"`
		RunnerSecret  string `json:"runner_secret"`
		ManagerSecret string `json:"manager_secret"`
	}
	if err := json.NewDecoder(resp.Body).Decode(&result); err != nil {
		return 0, "", "", fmt.Errorf("failed to decode response: %w", err)
	}

	c.runnerID = result.ID
	c.runnerSecret = result.RunnerSecret
	c.managerSecret = result.ManagerSecret

	return result.ID, result.RunnerSecret, result.ManagerSecret, nil
}

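// Usage sketch (illustrative; the actual wiring lives in cmd/runner, which is
// outside this excerpt):
//
//	c := NewClient("http://manager:8080", "runner-1", hostname, ip)
//	id, runnerSecret, managerSecret, err := c.Register(token)
//	if err != nil {
//		log.Fatal(err)
//	}
//	c.SetSecrets(id, runnerSecret, managerSecret) // also lets a restarted runner reuse stored secrets
//	go c.HeartbeatLoop()
//	c.ProcessTasks()
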
// signRequest signs a request with the runner secret
func (c *Client) signRequest(method, path string, body []byte) (string, time.Time) {
	timestamp := time.Now()
	message := fmt.Sprintf("%s\n%s\n%s\n%d", method, path, string(body), timestamp.Unix())
	h := hmac.New(sha256.New, []byte(c.runnerSecret))
	h.Write([]byte(message))
	signature := hex.EncodeToString(h.Sum(nil))
	return signature, timestamp
}

// doSignedRequest performs a signed HTTP request
func (c *Client) doSignedRequest(method, path string, body []byte) (*http.Response, error) {
	if c.runnerSecret == "" {
		return nil, fmt.Errorf("runner not authenticated")
	}

	signature, timestamp := c.signRequest(method, path, body)

	req, err := http.NewRequest(method, fmt.Sprintf("%s%s", c.managerURL, path), bytes.NewReader(body))
	if err != nil {
		return nil, err
	}

	req.Header.Set("Content-Type", "application/json")
	req.Header.Set("X-Runner-Signature", signature)
	req.Header.Set("X-Runner-Timestamp", fmt.Sprintf("%d", timestamp.Unix()))

	return c.httpClient.Do(req)
}

// HeartbeatLoop sends periodic heartbeats to the manager
func (c *Client) HeartbeatLoop() {
	ticker := time.NewTicker(30 * time.Second)
	defer ticker.Stop()

	for range ticker.C {
		req := map[string]interface{}{}
		body, _ := json.Marshal(req)

		resp, err := c.doSignedRequest("POST", fmt.Sprintf("/api/runner/heartbeat?runner_id=%d", c.runnerID), body)
		if err != nil {
			log.Printf("Heartbeat failed: %v", err)
			continue
		}
		resp.Body.Close()
	}
}

// ProcessTasks polls for tasks and processes them
func (c *Client) ProcessTasks() {
	ticker := time.NewTicker(5 * time.Second)
	defer ticker.Stop()

	for range ticker.C {
		tasks, err := c.getTasks()
		if err != nil {
			log.Printf("Failed to get tasks: %v", err)
			continue
		}

		for _, taskData := range tasks {
			taskMap, ok := taskData["task"].(map[string]interface{})
			if !ok {
				continue
			}

			jobName, _ := taskData["job_name"].(string)
			outputFormat, _ := taskData["output_format"].(string)
			inputFilesRaw, _ := taskData["input_files"].([]interface{})

			if len(inputFilesRaw) == 0 {
				log.Printf("No input files for task %v", taskMap["id"])
				continue
			}

			// Process the task
			if err := c.processTask(taskMap, jobName, outputFormat, inputFilesRaw); err != nil {
				taskID, _ := taskMap["id"].(float64)
				log.Printf("Failed to process task %v: %v", taskID, err)
				c.completeTask(int64(taskID), "", false, err.Error())
			}
		}
	}
}

// getTasks fetches tasks from the manager
func (c *Client) getTasks() ([]map[string]interface{}, error) {
	path := fmt.Sprintf("/api/runner/tasks?runner_id=%d", c.runnerID)
	resp, err := c.doSignedRequest("GET", path, nil)
	if err != nil {
		return nil, err
	}
	defer resp.Body.Close()

	if resp.StatusCode != http.StatusOK {
		body, _ := io.ReadAll(resp.Body)
		return nil, fmt.Errorf("failed to get tasks: %s", string(body))
	}

	var tasks []map[string]interface{}
	if err := json.NewDecoder(resp.Body).Decode(&tasks); err != nil {
		return nil, err
	}

	return tasks, nil
}

// processTask processes a single task
func (c *Client) processTask(task map[string]interface{}, jobName, outputFormat string, inputFiles []interface{}) error {
	taskID := int64(task["id"].(float64))
	jobID := int64(task["job_id"].(float64))
	frameStart := int(task["frame_start"].(float64))
	frameEnd := int(task["frame_end"].(float64))

	log.Printf("Processing task %d: job %d, frames %d-%d, format: %s", taskID, jobID, frameStart, frameEnd, outputFormat)

	// Create a work directory
	workDir := filepath.Join(os.TempDir(), fmt.Sprintf("fuego-task-%d", taskID))
	if err := os.MkdirAll(workDir, 0755); err != nil {
		return fmt.Errorf("failed to create work directory: %w", err)
	}
	defer os.RemoveAll(workDir)

	// Download input files
	blendFile := ""
	for _, filePath := range inputFiles {
		filePathStr := filePath.(string)
		if err := c.downloadFile(filePathStr, workDir); err != nil {
			return fmt.Errorf("failed to download file %s: %w", filePathStr, err)
		}
		if filepath.Ext(filePathStr) == ".blend" {
			blendFile = filepath.Join(workDir, filepath.Base(filePathStr))
		}
	}

	if blendFile == "" {
		return fmt.Errorf("no .blend file found in input files")
	}

	// Render frames
	outputDir := filepath.Join(workDir, "output")
	if err := os.MkdirAll(outputDir, 0755); err != nil {
		return fmt.Errorf("failed to create output directory: %w", err)
	}

	// For MP4, render as PNG first, then combine the frames into a video
	renderFormat := outputFormat
	if outputFormat == "MP4" {
		renderFormat = "PNG"
	}

	outputPattern := filepath.Join(outputDir, fmt.Sprintf("frame_%%04d.%s", strings.ToLower(renderFormat)))

	// Execute Blender headlessly; note that only the task's start frame is rendered here
	cmd := exec.Command("blender", "-b", blendFile, "-o", outputPattern, "-f", fmt.Sprintf("%d", frameStart))
	cmd.Dir = workDir
	output, err := cmd.CombinedOutput()
	if err != nil {
		return fmt.Errorf("blender failed: %w\nOutput: %s", err, string(output))
	}

	// Find the rendered output file
	outputFile := filepath.Join(outputDir, fmt.Sprintf("frame_%04d.%s", frameStart, strings.ToLower(renderFormat)))
	if _, err := os.Stat(outputFile); os.IsNotExist(err) {
		return fmt.Errorf("output file not found: %s", outputFile)
	}

	// Upload the frame file
	outputPath, err := c.uploadFile(jobID, outputFile)
	if err != nil {
		return fmt.Errorf("failed to upload output: %w", err)
	}

	// Mark the task as complete
	if err := c.completeTask(taskID, outputPath, true, ""); err != nil {
		return err
	}

	// For the MP4 format, check whether all frames are done and generate the video
	if outputFormat == "MP4" {
		if err := c.checkAndGenerateMP4(jobID); err != nil {
			log.Printf("Failed to generate MP4 for job %d: %v", jobID, err)
			// Don't fail the task if video generation fails - the frames are already uploaded
		}
	}

	return nil
}

// checkAndGenerateMP4 checks if all frames are complete and generates the MP4 if so
func (c *Client) checkAndGenerateMP4(jobID int64) error {
	// Check the job status
	job, err := c.getJobStatus(jobID)
	if err != nil {
		return fmt.Errorf("failed to get job status: %w", err)
	}

	if job["status"] != "completed" {
		log.Printf("Job %d not yet complete (%v), skipping MP4 generation", jobID, job["status"])
		return nil
	}

	// Get all output files for this job
	files, err := c.getJobFiles(jobID)
	if err != nil {
		return fmt.Errorf("failed to get job files: %w", err)
	}

	// Find all PNG frame files
	var pngFiles []map[string]interface{}
	for _, file := range files {
		fileType, _ := file["file_type"].(string)
		fileName, _ := file["file_name"].(string)
		if fileType == "output" && strings.HasSuffix(fileName, ".png") {
			pngFiles = append(pngFiles, file)
		}
	}

	if len(pngFiles) == 0 {
		return fmt.Errorf("no PNG frame files found for MP4 generation")
	}

	log.Printf("Generating MP4 for job %d from %d PNG frames", jobID, len(pngFiles))

	// Create a work directory for video generation
	workDir := filepath.Join(os.TempDir(), fmt.Sprintf("fuego-video-%d", jobID))
	if err := os.MkdirAll(workDir, 0755); err != nil {
		return fmt.Errorf("failed to create work directory: %w", err)
	}
	defer os.RemoveAll(workDir)

	// Download all PNG frames
	var frameFiles []string
	for _, file := range pngFiles {
		fileName, _ := file["file_name"].(string)
		framePath := filepath.Join(workDir, fileName)
		if err := c.downloadFrameFile(jobID, fileName, framePath); err != nil {
			log.Printf("Failed to download frame %s: %v", fileName, err)
			continue
		}
		frameFiles = append(frameFiles, framePath)
	}

	if len(frameFiles) == 0 {
		return fmt.Errorf("failed to download any frame files")
	}

	// Sort frame files by name to ensure the correct order
	sort.Strings(frameFiles)

	// Generate the MP4 using ffmpeg
	outputMP4 := filepath.Join(workDir, fmt.Sprintf("output_%d.mp4", jobID))

	// Method 1: image-sequence input (more reliable when frames are contiguous)
	firstFrame := frameFiles[0]
	// Derive the printf-style pattern (e.g. frame_0001.png -> frame_%04d.png)
	baseName := filepath.Base(firstFrame)
	pattern := strings.Replace(baseName, fmt.Sprintf("%04d", extractFrameNumber(baseName)), "%04d", 1)
	patternPath := filepath.Join(workDir, pattern)

	// Run ffmpeg to combine the frames into an MP4 at 24 fps
	cmd := exec.Command("ffmpeg", "-y", "-framerate", "24", "-i", patternPath,
		"-c:v", "libx264", "-pix_fmt", "yuv420p", "-r", "24", outputMP4)
	cmd.Dir = workDir
	output, err := cmd.CombinedOutput()
	if err != nil {
		// Fall back to the concat demuxer
		log.Printf("First ffmpeg attempt failed, trying concat method: %s", string(output))
		return c.generateMP4WithConcat(frameFiles, outputMP4, workDir)
	}

	// Check that the MP4 was created
	if _, err := os.Stat(outputMP4); os.IsNotExist(err) {
		return fmt.Errorf("MP4 file not created: %s", outputMP4)
	}

	// Upload the MP4 file
	mp4Path, err := c.uploadFile(jobID, outputMP4)
	if err != nil {
		return fmt.Errorf("failed to upload MP4: %w", err)
	}

	log.Printf("Successfully generated and uploaded MP4 for job %d: %s", jobID, mp4Path)
	return nil
}

// generateMP4WithConcat uses ffmpeg concat demuxer as fallback
|
||||||
|
func (c *Client) generateMP4WithConcat(frameFiles []string, outputMP4, workDir string) error {
|
||||||
|
// Create file list for ffmpeg concat demuxer
|
||||||
|
listFile := filepath.Join(workDir, "frames.txt")
|
||||||
|
listFileHandle, err := os.Create(listFile)
|
||||||
|
if err != nil {
|
||||||
|
return fmt.Errorf("failed to create list file: %w", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
for _, frameFile := range frameFiles {
|
||||||
|
absPath, _ := filepath.Abs(frameFile)
|
||||||
|
fmt.Fprintf(listFileHandle, "file '%s'\n", absPath)
|
||||||
|
}
|
||||||
|
listFileHandle.Close()
|
||||||
|
|
||||||
|
// Run ffmpeg with concat demuxer
|
||||||
|
cmd := exec.Command("ffmpeg", "-f", "concat", "-safe", "0", "-i", listFile,
|
||||||
|
"-c:v", "libx264", "-pix_fmt", "yuv420p", "-r", "24", "-y", outputMP4)
|
||||||
|
output, err := cmd.CombinedOutput()
|
||||||
|
if err != nil {
|
||||||
|
return fmt.Errorf("ffmpeg concat failed: %w\nOutput: %s", err, string(output))
|
||||||
|
}
|
||||||
|
|
||||||
|
if _, err := os.Stat(outputMP4); os.IsNotExist(err) {
|
||||||
|
return fmt.Errorf("MP4 file not created: %s", outputMP4)
|
||||||
|
}
|
||||||
|
|
||||||
|
return nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// extractFrameNumber extracts frame number from filename like "frame_0001.png"
|
||||||
|
func extractFrameNumber(filename string) int {
|
||||||
|
parts := strings.Split(filepath.Base(filename), "_")
|
||||||
|
if len(parts) < 2 {
|
||||||
|
return 0
|
||||||
|
}
|
||||||
|
framePart := strings.Split(parts[1], ".")[0]
|
||||||
|
var frameNum int
|
||||||
|
fmt.Sscanf(framePart, "%d", &frameNum)
|
||||||
|
return frameNum
|
||||||
|
}
|
||||||
|
|
||||||
|
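The `%04d` substitution above only yields a usable ffmpeg image-sequence pattern when Blender emits zero-padded, four-digit frame numbers (e.g. `frame_0001.png`). A minimal standalone sketch of that assumption, duplicating the `extractFrameNumber` logic so it runs on its own; the file name is a hypothetical example:

```go
package main

import (
    "fmt"
    "path/filepath"
    "strings"
)

// extractFrameNumber mirrors the helper above: it reads the digits
// between the first "_" and the extension, e.g. "frame_0001.png" -> 1.
func extractFrameNumber(filename string) int {
    parts := strings.Split(filepath.Base(filename), "_")
    if len(parts) < 2 {
        return 0
    }
    framePart := strings.Split(parts[1], ".")[0]
    var frameNum int
    fmt.Sscanf(framePart, "%d", &frameNum)
    return frameNum
}

func main() {
    // Hypothetical frame name, zero-padded to four digits as assumed above.
    base := "frame_0057.png"
    num := extractFrameNumber(base) // 57
    // Rebuild the ffmpeg image-sequence pattern the same way
    // checkAndGenerateMP4 does: "frame_0057.png" -> "frame_%04d.png".
    pattern := strings.Replace(base, fmt.Sprintf("%04d", num), "%04d", 1)
    fmt.Println(num, pattern)
}
```
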
// getJobStatus gets job status from manager
func (c *Client) getJobStatus(jobID int64) (map[string]interface{}, error) {
    path := fmt.Sprintf("/api/runner/jobs/%d/status?runner_id=%d", jobID, c.runnerID)
    resp, err := c.doSignedRequest("GET", path, nil)
    if err != nil {
        return nil, err
    }
    defer resp.Body.Close()

    if resp.StatusCode != http.StatusOK {
        body, _ := io.ReadAll(resp.Body)
        return nil, fmt.Errorf("failed to get job status: %s", string(body))
    }

    var job map[string]interface{}
    if err := json.NewDecoder(resp.Body).Decode(&job); err != nil {
        return nil, err
    }

    return job, nil
}

// getJobFiles gets job files from manager
func (c *Client) getJobFiles(jobID int64) ([]map[string]interface{}, error) {
    path := fmt.Sprintf("/api/runner/jobs/%d/files?runner_id=%d", jobID, c.runnerID)
    resp, err := c.doSignedRequest("GET", path, nil)
    if err != nil {
        return nil, err
    }
    defer resp.Body.Close()

    if resp.StatusCode != http.StatusOK {
        body, _ := io.ReadAll(resp.Body)
        return nil, fmt.Errorf("failed to get job files: %s", string(body))
    }

    var files []map[string]interface{}
    if err := json.NewDecoder(resp.Body).Decode(&files); err != nil {
        return nil, err
    }

    return files, nil
}

// downloadFrameFile downloads a frame file for MP4 generation
func (c *Client) downloadFrameFile(jobID int64, fileName, destPath string) error {
    path := fmt.Sprintf("/api/runner/files/%d/%s?runner_id=%d", jobID, fileName, c.runnerID)
    resp, err := c.doSignedRequest("GET", path, nil)
    if err != nil {
        return err
    }
    defer resp.Body.Close()

    if resp.StatusCode != http.StatusOK {
        body, _ := io.ReadAll(resp.Body)
        return fmt.Errorf("download failed: %s", string(body))
    }

    file, err := os.Create(destPath)
    if err != nil {
        return err
    }
    defer file.Close()

    _, err = io.Copy(file, resp.Body)
    return err
}

// downloadFile downloads a file from the manager
func (c *Client) downloadFile(filePath, destDir string) error {
    // Extract job ID and filename from path
    // Path format: storage/jobs/{jobID}/{filename}
    // (filepath.SplitList splits $PATH-style lists, not path components,
    // so split on path separators instead)
    parts := strings.Split(filepath.ToSlash(filePath), "/")
    if len(parts) < 3 {
        return fmt.Errorf("invalid file path format: %s", filePath)
    }

    // Find job ID in path (look for "jobs" directory)
    jobID := ""
    fileName := filepath.Base(filePath)
    for i, part := range parts {
        if part == "jobs" && i+1 < len(parts) {
            jobID = parts[i+1]
            break
        }
    }

    if jobID == "" {
        return fmt.Errorf("could not extract job ID from path: %s", filePath)
    }

    // Download via HTTP
    path := fmt.Sprintf("/api/runner/files/%s/%s?runner_id=%d", jobID, fileName, c.runnerID)
    resp, err := c.doSignedRequest("GET", path, nil)
    if err != nil {
        return fmt.Errorf("failed to download file: %w", err)
    }
    defer resp.Body.Close()

    if resp.StatusCode != http.StatusOK {
        body, _ := io.ReadAll(resp.Body)
        return fmt.Errorf("download failed: %s", string(body))
    }

    destPath := filepath.Join(destDir, fileName)
    file, err := os.Create(destPath)
    if err != nil {
        return fmt.Errorf("failed to create destination file: %w", err)
    }
    defer file.Close()

    _, err = io.Copy(file, resp.Body)
    return err
}

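For reference, the pitfall fixed above: `filepath.SplitList` splits `$PATH`-style lists on `:` (on Linux), not paths on `/`, so the original call returned the whole path as a single element and the component scan never found the `jobs` directory. A quick standalone comparison, using a hypothetical storage path:

```go
package main

import (
    "fmt"
    "path/filepath"
    "strings"
)

func main() {
    p := "storage/jobs/42/frame_0001.png" // hypothetical manager storage path

    // filepath.SplitList splits $PATH-style lists, so on Linux the whole
    // path comes back as one element.
    fmt.Println(filepath.SplitList(p)) // [storage/jobs/42/frame_0001.png]

    // Splitting on path separators yields the components we actually want.
    fmt.Println(strings.Split(filepath.ToSlash(p), "/")) // [storage jobs 42 frame_0001.png]
}
```
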
// uploadFile uploads a file to the manager
func (c *Client) uploadFile(jobID int64, filePath string) (string, error) {
    file, err := os.Open(filePath)
    if err != nil {
        return "", fmt.Errorf("failed to open file: %w", err)
    }
    defer file.Close()

    // Create multipart form
    var buf bytes.Buffer
    formWriter := multipart.NewWriter(&buf)

    part, err := formWriter.CreateFormFile("file", filepath.Base(filePath))
    if err != nil {
        return "", fmt.Errorf("failed to create form file: %w", err)
    }

    _, err = io.Copy(part, file)
    if err != nil {
        return "", fmt.Errorf("failed to copy file data: %w", err)
    }

    formWriter.Close()

    // Upload file with signature
    path := fmt.Sprintf("/api/runner/files/%d/upload?runner_id=%d", jobID, c.runnerID)
    timestamp := time.Now()
    message := fmt.Sprintf("POST\n%s\n%s\n%d", path, buf.String(), timestamp.Unix())
    h := hmac.New(sha256.New, []byte(c.runnerSecret))
    h.Write([]byte(message))
    signature := hex.EncodeToString(h.Sum(nil))

    url := fmt.Sprintf("%s%s", c.managerURL, path)
    req, err := http.NewRequest("POST", url, &buf)
    if err != nil {
        return "", fmt.Errorf("failed to create request: %w", err)
    }

    req.Header.Set("Content-Type", formWriter.FormDataContentType())
    req.Header.Set("X-Runner-Signature", signature)
    req.Header.Set("X-Runner-Timestamp", fmt.Sprintf("%d", timestamp.Unix()))

    resp, err := c.httpClient.Do(req)
    if err != nil {
        return "", fmt.Errorf("failed to upload file: %w", err)
    }
    defer resp.Body.Close()

    if resp.StatusCode != http.StatusCreated {
        body, _ := io.ReadAll(resp.Body)
        return "", fmt.Errorf("upload failed: %s", string(body))
    }

    var result struct {
        FilePath string `json:"file_path"`
        FileName string `json:"file_name"`
    }
    if err := json.NewDecoder(resp.Body).Decode(&result); err != nil {
        return "", fmt.Errorf("failed to decode response: %w", err)
    }

    return result.FilePath, nil
}

// completeTask marks a task as complete
func (c *Client) completeTask(taskID int64, outputPath string, success bool, errorMsg string) error {
    req := map[string]interface{}{
        "output_path": outputPath,
        "success":     success,
    }
    if !success {
        req["error"] = errorMsg
    }

    body, _ := json.Marshal(req)
    path := fmt.Sprintf("/api/runner/tasks/%d/complete?runner_id=%d", taskID, c.runnerID)
    resp, err := c.doSignedRequest("POST", path, body)
    if err != nil {
        return err
    }
    defer resp.Body.Close()

    if resp.StatusCode != http.StatusOK {
        body, _ := io.ReadAll(resp.Body)
        return fmt.Errorf("failed to complete task: %s", string(body))
    }

    return nil
}

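The upload above signs the string `POST\n{path}\n{body}\n{unix-timestamp}` with the runner's shared secret. The manager's verification handler is not part of this excerpt; the sketch below shows what a matching check could look like under that message format. The package name, the `maxSkew` parameter, and the placement are assumptions, not the actual manager code:

```go
// Package manager is a hypothetical placement for this sketch.
package manager

import (
    "crypto/hmac"
    "crypto/sha256"
    "encoding/hex"
    "fmt"
    "io"
    "net/http"
    "strconv"
    "time"
)

// verifySignature checks X-Runner-Signature/X-Runner-Timestamp against
// the same message the runner signs: "POST\n{path+query}\n{body}\n{ts}".
// Note: it consumes r.Body, so a real handler would need to re-buffer it.
func verifySignature(r *http.Request, runnerSecret string, maxSkew time.Duration) error {
    ts, err := strconv.ParseInt(r.Header.Get("X-Runner-Timestamp"), 10, 64)
    if err != nil {
        return fmt.Errorf("bad timestamp: %w", err)
    }
    if skew := time.Since(time.Unix(ts, 0)); skew > maxSkew || skew < -maxSkew {
        return fmt.Errorf("timestamp outside allowed window")
    }

    body, err := io.ReadAll(r.Body)
    if err != nil {
        return err
    }

    // RequestURI includes the query string, matching the runner's signed path.
    message := fmt.Sprintf("POST\n%s\n%s\n%d", r.URL.RequestURI(), body, ts)
    h := hmac.New(sha256.New, []byte(runnerSecret))
    h.Write([]byte(message))
    expected := h.Sum(nil)

    got, err := hex.DecodeString(r.Header.Get("X-Runner-Signature"))
    if err != nil || !hmac.Equal(got, expected) {
        return fmt.Errorf("signature mismatch")
    }
    return nil
}
```
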
137
internal/storage/storage.go
Normal file
@@ -0,0 +1,137 @@
package storage

import (
    "fmt"
    "io"
    "os"
    "path/filepath"
)

// Storage handles file storage operations
type Storage struct {
    basePath string
}

// NewStorage creates a new storage instance
func NewStorage(basePath string) (*Storage, error) {
    s := &Storage{basePath: basePath}
    if err := s.init(); err != nil {
        return nil, err
    }
    return s, nil
}

// init creates necessary directories
func (s *Storage) init() error {
    dirs := []string{
        s.basePath,
        s.uploadsPath(),
        s.outputsPath(),
    }

    for _, dir := range dirs {
        if err := os.MkdirAll(dir, 0755); err != nil {
            return fmt.Errorf("failed to create directory %s: %w", dir, err)
        }
    }

    return nil
}

// uploadsPath returns the path for uploads
func (s *Storage) uploadsPath() string {
    return filepath.Join(s.basePath, "uploads")
}

// outputsPath returns the path for outputs
func (s *Storage) outputsPath() string {
    return filepath.Join(s.basePath, "outputs")
}

// JobPath returns the path for a specific job's files
func (s *Storage) JobPath(jobID int64) string {
    return filepath.Join(s.basePath, "jobs", fmt.Sprintf("%d", jobID))
}

// SaveUpload saves an uploaded file
func (s *Storage) SaveUpload(jobID int64, filename string, reader io.Reader) (string, error) {
    jobPath := s.JobPath(jobID)
    if err := os.MkdirAll(jobPath, 0755); err != nil {
        return "", fmt.Errorf("failed to create job directory: %w", err)
    }

    filePath := filepath.Join(jobPath, filename)
    file, err := os.Create(filePath)
    if err != nil {
        return "", fmt.Errorf("failed to create file: %w", err)
    }
    defer file.Close()

    if _, err := io.Copy(file, reader); err != nil {
        return "", fmt.Errorf("failed to write file: %w", err)
    }

    return filePath, nil
}

// SaveOutput saves an output file
func (s *Storage) SaveOutput(jobID int64, filename string, reader io.Reader) (string, error) {
    outputPath := filepath.Join(s.outputsPath(), fmt.Sprintf("%d", jobID))
    if err := os.MkdirAll(outputPath, 0755); err != nil {
        return "", fmt.Errorf("failed to create output directory: %w", err)
    }

    filePath := filepath.Join(outputPath, filename)
    file, err := os.Create(filePath)
    if err != nil {
        return "", fmt.Errorf("failed to create file: %w", err)
    }
    defer file.Close()

    if _, err := io.Copy(file, reader); err != nil {
        return "", fmt.Errorf("failed to write file: %w", err)
    }

    return filePath, nil
}

// GetFile returns a file reader for the given path
func (s *Storage) GetFile(filePath string) (*os.File, error) {
    return os.Open(filePath)
}

// DeleteFile deletes a file
func (s *Storage) DeleteFile(filePath string) error {
    return os.Remove(filePath)
}

// DeleteJobFiles deletes all files for a job
func (s *Storage) DeleteJobFiles(jobID int64) error {
    jobPath := s.JobPath(jobID)
    if err := os.RemoveAll(jobPath); err != nil && !os.IsNotExist(err) {
        return fmt.Errorf("failed to delete job files: %w", err)
    }

    outputPath := filepath.Join(s.outputsPath(), fmt.Sprintf("%d", jobID))
    if err := os.RemoveAll(outputPath); err != nil && !os.IsNotExist(err) {
        return fmt.Errorf("failed to delete output files: %w", err)
    }

    return nil
}

// FileExists checks if a file exists
func (s *Storage) FileExists(filePath string) bool {
    _, err := os.Stat(filePath)
    return err == nil
}

// GetFileSize returns the size of a file
func (s *Storage) GetFileSize(filePath string) (int64, error) {
    info, err := os.Stat(filePath)
    if err != nil {
        return 0, err
    }
    return info.Size(), nil
}

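A short usage sketch for the `Storage` type above; the import path is assumed from the repository layout (`internal/storage`, module name `fuego`) and the byte payload is a stand-in for a real upload stream:

```go
package main

import (
    "bytes"
    "fmt"
    "log"
    "os"
    "path/filepath"

    // Assumed import path based on the repository layout; adjust to the module name.
    "fuego/internal/storage"
)

func main() {
    base := filepath.Join(os.TempDir(), "fuego-storage-demo")
    defer os.RemoveAll(base)

    s, err := storage.NewStorage(base) // creates base, uploads/ and outputs/
    if err != nil {
        log.Fatal(err)
    }

    // Save a stand-in .blend upload for job 1, then inspect it.
    path, err := s.SaveUpload(1, "scene.blend", bytes.NewReader([]byte("not a real blend file")))
    if err != nil {
        log.Fatal(err)
    }
    size, _ := s.GetFileSize(path)
    fmt.Println(path, size, s.FileExists(path))

    // Clean up everything that belongs to the job.
    if err := s.DeleteJobFiles(1); err != nil {
        log.Fatal(err)
    }
}
```
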
127
pkg/types/types.go
Normal file
@@ -0,0 +1,127 @@
package types

import "time"

// User represents a user in the system
type User struct {
    ID            int64     `json:"id"`
    Email         string    `json:"email"`
    Name          string    `json:"name"`
    OAuthProvider string    `json:"oauth_provider"` // "google" or "discord"
    OAuthID       string    `json:"oauth_id"`
    CreatedAt     time.Time `json:"created_at"`
}

// JobStatus represents the status of a job
type JobStatus string

const (
    JobStatusPending   JobStatus = "pending"
    JobStatusRunning   JobStatus = "running"
    JobStatusCompleted JobStatus = "completed"
    JobStatusFailed    JobStatus = "failed"
    JobStatusCancelled JobStatus = "cancelled"
)

// Job represents a render job
type Job struct {
    ID           int64      `json:"id"`
    UserID       int64      `json:"user_id"`
    Name         string     `json:"name"`
    Status       JobStatus  `json:"status"`
    Progress     float64    `json:"progress"` // 0.0 to 100.0
    FrameStart   int        `json:"frame_start"`
    FrameEnd     int        `json:"frame_end"`
    OutputFormat string     `json:"output_format"` // PNG, JPEG, EXR, etc.
    CreatedAt    time.Time  `json:"created_at"`
    StartedAt    *time.Time `json:"started_at,omitempty"`
    CompletedAt  *time.Time `json:"completed_at,omitempty"`
    ErrorMessage string     `json:"error_message,omitempty"`
}

// RunnerStatus represents the status of a runner
type RunnerStatus string

const (
    RunnerStatusOnline  RunnerStatus = "online"
    RunnerStatusOffline RunnerStatus = "offline"
    RunnerStatusBusy    RunnerStatus = "busy"
)

// Runner represents a render runner
type Runner struct {
    ID            int64        `json:"id"`
    Name          string       `json:"name"`
    Hostname      string       `json:"hostname"`
    IPAddress     string       `json:"ip_address"`
    Status        RunnerStatus `json:"status"`
    LastHeartbeat time.Time    `json:"last_heartbeat"`
    Capabilities  string       `json:"capabilities"` // JSON string of capabilities
    CreatedAt     time.Time    `json:"created_at"`
}

// TaskStatus represents the status of a task
type TaskStatus string

const (
    TaskStatusPending   TaskStatus = "pending"
    TaskStatusRunning   TaskStatus = "running"
    TaskStatusCompleted TaskStatus = "completed"
    TaskStatusFailed    TaskStatus = "failed"
)

// Task represents a render task assigned to a runner
type Task struct {
    ID           int64      `json:"id"`
    JobID        int64      `json:"job_id"`
    RunnerID     *int64     `json:"runner_id,omitempty"`
    FrameStart   int        `json:"frame_start"`
    FrameEnd     int        `json:"frame_end"`
    Status       TaskStatus `json:"status"`
    OutputPath   string     `json:"output_path,omitempty"`
    CreatedAt    time.Time  `json:"created_at"`
    StartedAt    *time.Time `json:"started_at,omitempty"`
    CompletedAt  *time.Time `json:"completed_at,omitempty"`
    ErrorMessage string     `json:"error_message,omitempty"`
}

// JobFileType represents the type of file
type JobFileType string

const (
    JobFileTypeInput  JobFileType = "input"
    JobFileTypeOutput JobFileType = "output"
)

// JobFile represents a file associated with a job
type JobFile struct {
    ID        int64       `json:"id"`
    JobID     int64       `json:"job_id"`
    FileType  JobFileType `json:"file_type"`
    FilePath  string      `json:"file_path"`
    FileName  string      `json:"file_name"`
    FileSize  int64       `json:"file_size"`
    CreatedAt time.Time   `json:"created_at"`
}

// CreateJobRequest represents a request to create a new job
type CreateJobRequest struct {
    Name         string `json:"name"`
    FrameStart   int    `json:"frame_start"`
    FrameEnd     int    `json:"frame_end"`
    OutputFormat string `json:"output_format"`
}

// UpdateJobProgressRequest represents a request to update job progress
type UpdateJobProgressRequest struct {
    Progress float64 `json:"progress"`
}

// RegisterRunnerRequest represents a request to register a runner
type RegisterRunnerRequest struct {
    Name         string `json:"name"`
    Hostname     string `json:"hostname"`
    IPAddress    string `json:"ip_address"`
    Capabilities string `json:"capabilities"`
}

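How the manager carves a `Job`'s frame range into per-runner `Task` rows is not shown in this excerpt. The sketch below illustrates one plausible split using the fields defined above; the `splitJob` helper and the 100-frame chunk size are illustrative assumptions, not the manager's actual scheduler:

```go
package main

import (
    "fmt"
    "time"
)

// Minimal local copies of the relevant pkg/types definitions.
type TaskStatus string

const TaskStatusPending TaskStatus = "pending"

type Task struct {
    JobID      int64
    FrameStart int
    FrameEnd   int
    Status     TaskStatus
    CreatedAt  time.Time
}

// splitJob is a hypothetical helper: it carves [frameStart, frameEnd]
// into tasks of at most chunk frames each.
func splitJob(jobID int64, frameStart, frameEnd, chunk int) []Task {
    var tasks []Task
    for start := frameStart; start <= frameEnd; start += chunk {
        end := start + chunk - 1
        if end > frameEnd {
            end = frameEnd
        }
        tasks = append(tasks, Task{
            JobID:      jobID,
            FrameStart: start,
            FrameEnd:   end,
            Status:     TaskStatusPending,
            CreatedAt:  time.Now(),
        })
    }
    return tasks
}

func main() {
    // Frames 1-250 in chunks of 100 -> [1-100], [101-200], [201-250].
    for _, t := range splitJob(7, 1, 250, 100) {
        fmt.Printf("job %d: frames %d-%d (%s)\n", t.JobID, t.FrameStart, t.FrameEnd, t.Status)
    }
}
```
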
270
web/app.js
Normal file
@@ -0,0 +1,270 @@
const API_BASE = '/api';

let currentUser = null;

// Check authentication on load
async function init() {
  await checkAuth();
  setupEventListeners();
  if (currentUser) {
    showMainPage();
    loadJobs();
    loadRunners();
  } else {
    showLoginPage();
  }
}

async function checkAuth() {
  try {
    const response = await fetch(`${API_BASE}/auth/me`);
    if (response.ok) {
      currentUser = await response.json();
      return true;
    }
  } catch (error) {
    console.error('Auth check failed:', error);
  }
  return false;
}

function showLoginPage() {
  document.getElementById('login-page').classList.remove('hidden');
  document.getElementById('main-page').classList.add('hidden');
}

function showMainPage() {
  document.getElementById('login-page').classList.add('hidden');
  document.getElementById('main-page').classList.remove('hidden');
  if (currentUser) {
    document.getElementById('user-name').textContent = currentUser.name || currentUser.email;
  }
}

function setupEventListeners() {
  // Navigation
  document.querySelectorAll('.nav-btn').forEach(btn => {
    btn.addEventListener('click', (e) => {
      const page = e.target.dataset.page;
      switchPage(page);
    });
  });

  // Logout
  document.getElementById('logout-btn').addEventListener('click', async () => {
    await fetch(`${API_BASE}/auth/logout`, { method: 'POST' });
    currentUser = null;
    showLoginPage();
  });

  // Job form
  document.getElementById('job-form').addEventListener('submit', async (e) => {
    e.preventDefault();
    await submitJob();
  });
}

function switchPage(page) {
  document.querySelectorAll('.content-page').forEach(p => p.classList.add('hidden'));
  document.querySelectorAll('.nav-btn').forEach(b => b.classList.remove('active'));

  document.getElementById(`${page}-page`).classList.remove('hidden');
  document.querySelector(`[data-page="${page}"]`).classList.add('active');

  if (page === 'jobs') {
    loadJobs();
  } else if (page === 'runners') {
    loadRunners();
  }
}

async function submitJob() {
  const form = document.getElementById('job-form');

  const jobData = {
    name: document.getElementById('job-name').value,
    frame_start: parseInt(document.getElementById('frame-start').value),
    frame_end: parseInt(document.getElementById('frame-end').value),
    output_format: document.getElementById('output-format').value,
  };

  try {
    // Create job
    const jobResponse = await fetch(`${API_BASE}/jobs`, {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify(jobData),
    });

    if (!jobResponse.ok) {
      throw new Error('Failed to create job');
    }

    const job = await jobResponse.json();

    // Upload file
    const fileInput = document.getElementById('blend-file');
    if (fileInput.files.length > 0) {
      const fileFormData = new FormData();
      fileFormData.append('file', fileInput.files[0]);

      const fileResponse = await fetch(`${API_BASE}/jobs/${job.id}/upload`, {
        method: 'POST',
        body: fileFormData,
      });

      if (!fileResponse.ok) {
        throw new Error('Failed to upload file');
      }
    }

    alert('Job submitted successfully!');
    form.reset();
    switchPage('jobs');
    loadJobs();
  } catch (error) {
    alert('Failed to submit job: ' + error.message);
  }
}

async function loadJobs() {
  try {
    const response = await fetch(`${API_BASE}/jobs`);
    if (!response.ok) throw new Error('Failed to load jobs');

    const jobs = await response.json();
    displayJobs(jobs);
  } catch (error) {
    console.error('Failed to load jobs:', error);
  }
}

function displayJobs(jobs) {
  const container = document.getElementById('jobs-list');
  if (jobs.length === 0) {
    container.innerHTML = '<p>No jobs yet. Submit a job to get started!</p>';
    return;
  }

  container.innerHTML = jobs.map(job => `
    <div class="job-card">
      <h3>${escapeHtml(job.name)}</h3>
      <div class="job-meta">
        <span>Frames: ${job.frame_start}-${job.frame_end}</span>
        <span>Format: ${job.output_format}</span>
        <span>Created: ${new Date(job.created_at).toLocaleString()}</span>
      </div>
      <div class="job-status ${job.status}">${job.status}</div>
      <div class="progress-bar">
        <div class="progress-fill" style="width: ${job.progress}%"></div>
      </div>
      <div class="job-actions">
        <button onclick="viewJob(${job.id})" class="btn btn-primary">View Details</button>
        ${job.status === 'pending' || job.status === 'running' ?
          `<button onclick="cancelJob(${job.id})" class="btn btn-secondary">Cancel</button>` : ''}
      </div>
    </div>
  `).join('');
}

async function viewJob(jobId) {
  try {
    const response = await fetch(`${API_BASE}/jobs/${jobId}`);
    if (!response.ok) throw new Error('Failed to load job');

    const job = await response.json();

    // Load files
    const filesResponse = await fetch(`${API_BASE}/jobs/${jobId}/files`);
    const files = filesResponse.ok ? await filesResponse.json() : [];

    const outputFiles = files.filter(f => f.file_type === 'output');
    if (outputFiles.length > 0) {
      let message = 'Output files:\n';
      outputFiles.forEach(file => {
        message += `- ${file.file_name}\n`;
      });
      message += '\nWould you like to download them?';
      if (confirm(message)) {
        outputFiles.forEach(file => {
          window.open(`${API_BASE}/jobs/${jobId}/files/${file.id}/download`, '_blank');
        });
      }
    } else {
      alert(`Job: ${job.name}\nStatus: ${job.status}\nProgress: ${job.progress.toFixed(1)}%`);
    }
  } catch (error) {
    alert('Failed to load job details: ' + error.message);
  }
}

async function cancelJob(jobId) {
  if (!confirm('Are you sure you want to cancel this job?')) return;

  try {
    const response = await fetch(`${API_BASE}/jobs/${jobId}`, {
      method: 'DELETE',
    });
    if (!response.ok) throw new Error('Failed to cancel job');
    loadJobs();
  } catch (error) {
    alert('Failed to cancel job: ' + error.message);
  }
}

async function loadRunners() {
  try {
    const response = await fetch(`${API_BASE}/runners`);
    if (!response.ok) throw new Error('Failed to load runners');

    const runners = await response.json();
    displayRunners(runners);
  } catch (error) {
    console.error('Failed to load runners:', error);
  }
}

function displayRunners(runners) {
  const container = document.getElementById('runners-list');
  if (runners.length === 0) {
    container.innerHTML = '<p>No runners connected.</p>';
    return;
  }

  container.innerHTML = runners.map(runner => {
    const lastHeartbeat = new Date(runner.last_heartbeat);
    const isOnline = (Date.now() - lastHeartbeat.getTime()) < 60000; // 1 minute

    return `
      <div class="runner-card">
        <h3>${escapeHtml(runner.name)}</h3>
        <div class="runner-info">
          <span>Hostname: ${escapeHtml(runner.hostname)}</span>
          <span>IP: ${escapeHtml(runner.ip_address)}</span>
          <span>Last heartbeat: ${lastHeartbeat.toLocaleString()}</span>
        </div>
        <div class="runner-status ${isOnline ? 'online' : 'offline'}">
          ${isOnline ? 'Online' : 'Offline'}
        </div>
      </div>
    `;
  }).join('');
}

function escapeHtml(text) {
  const div = document.createElement('div');
  div.textContent = text;
  return div.innerHTML;
}

// Auto-refresh jobs every 5 seconds
setInterval(() => {
  if (currentUser && document.getElementById('jobs-page').classList.contains('hidden') === false) {
    loadJobs();
  }
}, 5000);

// Initialize on load
init();

13
web/index.html
Normal file
@@ -0,0 +1,13 @@
<!doctype html>
<html lang="en">
  <head>
    <meta charset="UTF-8" />
    <link rel="icon" type="image/svg+xml" href="/vite.svg" />
    <meta name="viewport" content="width=device-width, initial-scale=1.0" />
    <title>Fuego Render Farm</title>
  </head>
  <body>
    <div id="root"></div>
    <script type="module" src="/src/main.jsx"></script>
  </body>
</html>
1
web/node_modules/.bin/autoprefixer
generated
vendored
Symbolic link
@@ -0,0 +1 @@
../autoprefixer/bin/autoprefixer
1
web/node_modules/.bin/baseline-browser-mapping
generated
vendored
Symbolic link
@@ -0,0 +1 @@
../baseline-browser-mapping/dist/cli.js
1
web/node_modules/.bin/browserslist
generated
vendored
Symbolic link
@@ -0,0 +1 @@
../browserslist/cli.js
1
web/node_modules/.bin/esbuild
generated
vendored
Symbolic link
@@ -0,0 +1 @@
../esbuild/bin/esbuild
1
web/node_modules/.bin/jsesc
generated
vendored
Symbolic link
@@ -0,0 +1 @@
../jsesc/bin/jsesc
1
web/node_modules/.bin/json5
generated
vendored
Symbolic link
@@ -0,0 +1 @@
../json5/lib/cli.js
1
web/node_modules/.bin/nanoid
generated
vendored
Symbolic link
@@ -0,0 +1 @@
../nanoid/bin/nanoid.cjs
1
web/node_modules/.bin/parser
generated
vendored
Symbolic link
@@ -0,0 +1 @@
../@babel/parser/bin/babel-parser.js
1
web/node_modules/.bin/rollup
generated
vendored
Symbolic link
@@ -0,0 +1 @@
../rollup/dist/bin/rollup
1
web/node_modules/.bin/semver
generated
vendored
Symbolic link
@@ -0,0 +1 @@
../semver/bin/semver.js
1
web/node_modules/.bin/update-browserslist-db
generated
vendored
Symbolic link
@@ -0,0 +1 @@
../update-browserslist-db/cli.js
1
web/node_modules/.bin/vite
generated
vendored
Symbolic link
@@ -0,0 +1 @@
../vite/bin/vite.js
1010
web/node_modules/.package-lock.json
generated
vendored
Normal file
File diff suppressed because it is too large
22
web/node_modules/@babel/code-frame/LICENSE
generated
vendored
Normal file
@@ -0,0 +1,22 @@
MIT License

Copyright (c) 2014-present Sebastian McKenzie and other contributors

Permission is hereby granted, free of charge, to any person obtaining
a copy of this software and associated documentation files (the
"Software"), to deal in the Software without restriction, including
without limitation the rights to use, copy, modify, merge, publish,
distribute, sublicense, and/or sell copies of the Software, and to
permit persons to whom the Software is furnished to do so, subject to
the following conditions:

The above copyright notice and this permission notice shall be
included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE
LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION
OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION
WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
19
web/node_modules/@babel/code-frame/README.md
generated
vendored
Normal file
@@ -0,0 +1,19 @@
# @babel/code-frame

> Generate errors that contain a code frame that point to source locations.

See our website [@babel/code-frame](https://babeljs.io/docs/babel-code-frame) for more information.

## Install

Using npm:

```sh
npm install --save-dev @babel/code-frame
```

or using yarn:

```sh
yarn add @babel/code-frame --dev
```
216
web/node_modules/@babel/code-frame/lib/index.js
generated
vendored
Normal file
@@ -0,0 +1,216 @@
'use strict';

Object.defineProperty(exports, '__esModule', { value: true });

var picocolors = require('picocolors');
var jsTokens = require('js-tokens');
var helperValidatorIdentifier = require('@babel/helper-validator-identifier');

function isColorSupported() {
  return (typeof process === "object" && (process.env.FORCE_COLOR === "0" || process.env.FORCE_COLOR === "false") ? false : picocolors.isColorSupported
  );
}
const compose = (f, g) => v => f(g(v));
function buildDefs(colors) {
  return {
    keyword: colors.cyan,
    capitalized: colors.yellow,
    jsxIdentifier: colors.yellow,
    punctuator: colors.yellow,
    number: colors.magenta,
    string: colors.green,
    regex: colors.magenta,
    comment: colors.gray,
    invalid: compose(compose(colors.white, colors.bgRed), colors.bold),
    gutter: colors.gray,
    marker: compose(colors.red, colors.bold),
    message: compose(colors.red, colors.bold),
    reset: colors.reset
  };
}
const defsOn = buildDefs(picocolors.createColors(true));
const defsOff = buildDefs(picocolors.createColors(false));
function getDefs(enabled) {
  return enabled ? defsOn : defsOff;
}

const sometimesKeywords = new Set(["as", "async", "from", "get", "of", "set"]);
const NEWLINE$1 = /\r\n|[\n\r\u2028\u2029]/;
const BRACKET = /^[()[\]{}]$/;
let tokenize;
{
  const JSX_TAG = /^[a-z][\w-]*$/i;
  const getTokenType = function (token, offset, text) {
    if (token.type === "name") {
      if (helperValidatorIdentifier.isKeyword(token.value) || helperValidatorIdentifier.isStrictReservedWord(token.value, true) || sometimesKeywords.has(token.value)) {
        return "keyword";
      }
      if (JSX_TAG.test(token.value) && (text[offset - 1] === "<" || text.slice(offset - 2, offset) === "</")) {
        return "jsxIdentifier";
      }
      if (token.value[0] !== token.value[0].toLowerCase()) {
        return "capitalized";
      }
    }
    if (token.type === "punctuator" && BRACKET.test(token.value)) {
      return "bracket";
    }
    if (token.type === "invalid" && (token.value === "@" || token.value === "#")) {
      return "punctuator";
    }
    return token.type;
  };
  tokenize = function* (text) {
    let match;
    while (match = jsTokens.default.exec(text)) {
      const token = jsTokens.matchToToken(match);
      yield {
        type: getTokenType(token, match.index, text),
        value: token.value
      };
    }
  };
}
function highlight(text) {
  if (text === "") return "";
  const defs = getDefs(true);
  let highlighted = "";
  for (const {
    type,
    value
  } of tokenize(text)) {
    if (type in defs) {
      highlighted += value.split(NEWLINE$1).map(str => defs[type](str)).join("\n");
    } else {
      highlighted += value;
    }
  }
  return highlighted;
}

let deprecationWarningShown = false;
const NEWLINE = /\r\n|[\n\r\u2028\u2029]/;
function getMarkerLines(loc, source, opts) {
  const startLoc = Object.assign({
    column: 0,
    line: -1
  }, loc.start);
  const endLoc = Object.assign({}, startLoc, loc.end);
  const {
    linesAbove = 2,
    linesBelow = 3
  } = opts || {};
  const startLine = startLoc.line;
  const startColumn = startLoc.column;
  const endLine = endLoc.line;
  const endColumn = endLoc.column;
  let start = Math.max(startLine - (linesAbove + 1), 0);
  let end = Math.min(source.length, endLine + linesBelow);
  if (startLine === -1) {
    start = 0;
  }
  if (endLine === -1) {
    end = source.length;
  }
  const lineDiff = endLine - startLine;
  const markerLines = {};
  if (lineDiff) {
    for (let i = 0; i <= lineDiff; i++) {
      const lineNumber = i + startLine;
      if (!startColumn) {
        markerLines[lineNumber] = true;
      } else if (i === 0) {
        const sourceLength = source[lineNumber - 1].length;
        markerLines[lineNumber] = [startColumn, sourceLength - startColumn + 1];
      } else if (i === lineDiff) {
        markerLines[lineNumber] = [0, endColumn];
      } else {
        const sourceLength = source[lineNumber - i].length;
        markerLines[lineNumber] = [0, sourceLength];
      }
    }
  } else {
    if (startColumn === endColumn) {
      if (startColumn) {
        markerLines[startLine] = [startColumn, 0];
      } else {
        markerLines[startLine] = true;
      }
    } else {
      markerLines[startLine] = [startColumn, endColumn - startColumn];
    }
  }
  return {
    start,
    end,
    markerLines
  };
}
function codeFrameColumns(rawLines, loc, opts = {}) {
  const shouldHighlight = opts.forceColor || isColorSupported() && opts.highlightCode;
  const defs = getDefs(shouldHighlight);
  const lines = rawLines.split(NEWLINE);
  const {
    start,
    end,
    markerLines
  } = getMarkerLines(loc, lines, opts);
  const hasColumns = loc.start && typeof loc.start.column === "number";
  const numberMaxWidth = String(end).length;
  const highlightedLines = shouldHighlight ? highlight(rawLines) : rawLines;
  let frame = highlightedLines.split(NEWLINE, end).slice(start, end).map((line, index) => {
    const number = start + 1 + index;
    const paddedNumber = ` ${number}`.slice(-numberMaxWidth);
    const gutter = ` ${paddedNumber} |`;
    const hasMarker = markerLines[number];
    const lastMarkerLine = !markerLines[number + 1];
    if (hasMarker) {
      let markerLine = "";
      if (Array.isArray(hasMarker)) {
        const markerSpacing = line.slice(0, Math.max(hasMarker[0] - 1, 0)).replace(/[^\t]/g, " ");
        const numberOfMarkers = hasMarker[1] || 1;
        markerLine = ["\n ", defs.gutter(gutter.replace(/\d/g, " ")), " ", markerSpacing, defs.marker("^").repeat(numberOfMarkers)].join("");
        if (lastMarkerLine && opts.message) {
          markerLine += " " + defs.message(opts.message);
        }
      }
      return [defs.marker(">"), defs.gutter(gutter), line.length > 0 ? ` ${line}` : "", markerLine].join("");
    } else {
      return ` ${defs.gutter(gutter)}${line.length > 0 ? ` ${line}` : ""}`;
    }
  }).join("\n");
  if (opts.message && !hasColumns) {
    frame = `${" ".repeat(numberMaxWidth + 1)}${opts.message}\n${frame}`;
  }
  if (shouldHighlight) {
    return defs.reset(frame);
  } else {
    return frame;
  }
}
function index (rawLines, lineNumber, colNumber, opts = {}) {
  if (!deprecationWarningShown) {
    deprecationWarningShown = true;
    const message = "Passing lineNumber and colNumber is deprecated to @babel/code-frame. Please use `codeFrameColumns`.";
    if (process.emitWarning) {
      process.emitWarning(message, "DeprecationWarning");
    } else {
      const deprecationError = new Error(message);
      deprecationError.name = "DeprecationWarning";
      console.warn(new Error(message));
    }
  }
  colNumber = Math.max(colNumber, 0);
  const location = {
    start: {
      column: colNumber,
      line: lineNumber
    }
  };
  return codeFrameColumns(rawLines, location, opts);
}

exports.codeFrameColumns = codeFrameColumns;
exports.default = index;
exports.highlight = highlight;
//# sourceMappingURL=index.js.map
1
web/node_modules/@babel/code-frame/lib/index.js.map
generated
vendored
Normal file
File diff suppressed because one or more lines are too long
31
web/node_modules/@babel/code-frame/package.json
generated
vendored
Normal file
@@ -0,0 +1,31 @@
{
  "name": "@babel/code-frame",
  "version": "7.27.1",
  "description": "Generate errors that contain a code frame that point to source locations.",
  "author": "The Babel Team (https://babel.dev/team)",
  "homepage": "https://babel.dev/docs/en/next/babel-code-frame",
  "bugs": "https://github.com/babel/babel/issues?utf8=%E2%9C%93&q=is%3Aissue+is%3Aopen",
  "license": "MIT",
  "publishConfig": {
    "access": "public"
  },
  "repository": {
    "type": "git",
    "url": "https://github.com/babel/babel.git",
    "directory": "packages/babel-code-frame"
  },
  "main": "./lib/index.js",
  "dependencies": {
    "@babel/helper-validator-identifier": "^7.27.1",
    "js-tokens": "^4.0.0",
    "picocolors": "^1.1.1"
  },
  "devDependencies": {
    "import-meta-resolve": "^4.1.0",
    "strip-ansi": "^4.0.0"
  },
  "engines": {
    "node": ">=6.9.0"
  },
  "type": "commonjs"
}
22
web/node_modules/@babel/compat-data/LICENSE
generated
vendored
Normal file
@@ -0,0 +1,22 @@
MIT License

Copyright (c) 2014-present Sebastian McKenzie and other contributors

Permission is hereby granted, free of charge, to any person obtaining
a copy of this software and associated documentation files (the
"Software"), to deal in the Software without restriction, including
without limitation the rights to use, copy, modify, merge, publish,
distribute, sublicense, and/or sell copies of the Software, and to
permit persons to whom the Software is furnished to do so, subject to
the following conditions:

The above copyright notice and this permission notice shall be
included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE
LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION
OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION
WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
19
web/node_modules/@babel/compat-data/README.md
generated
vendored
Normal file
@@ -0,0 +1,19 @@
# @babel/compat-data

> The compat-data to determine required Babel plugins

See our website [@babel/compat-data](https://babeljs.io/docs/babel-compat-data) for more information.

## Install

Using npm:

```sh
npm install --save @babel/compat-data
```

or using yarn:

```sh
yarn add @babel/compat-data
```
2
web/node_modules/@babel/compat-data/corejs2-built-ins.js
generated
vendored
Normal file
@@ -0,0 +1,2 @@
// Todo (Babel 8): remove this file as Babel 8 drop support of core-js 2
module.exports = require("./data/corejs2-built-ins.json");
2
web/node_modules/@babel/compat-data/corejs3-shipped-proposals.js
generated
vendored
Normal file
@@ -0,0 +1,2 @@
// Todo (Babel 8): remove this file now that it is included in babel-plugin-polyfill-corejs3
module.exports = require("./data/corejs3-shipped-proposals.json");
2106
web/node_modules/@babel/compat-data/data/corejs2-built-ins.json
generated
vendored
Normal file
File diff suppressed because it is too large
5
web/node_modules/@babel/compat-data/data/corejs3-shipped-proposals.json
generated
vendored
Normal file
@@ -0,0 +1,5 @@
[
  "esnext.promise.all-settled",
  "esnext.string.match-all",
  "esnext.global-this"
]
18
web/node_modules/@babel/compat-data/data/native-modules.json
generated
vendored
Normal file
@@ -0,0 +1,18 @@
{
  "es6.module": {
    "chrome": "61",
    "and_chr": "61",
    "edge": "16",
    "firefox": "60",
    "and_ff": "60",
    "node": "13.2.0",
    "opera": "48",
    "op_mob": "45",
    "safari": "10.1",
    "ios": "10.3",
    "samsung": "8.2",
    "android": "61",
    "electron": "2.0",
    "ios_saf": "10.3"
  }
}
35
web/node_modules/@babel/compat-data/data/overlapping-plugins.json
generated
vendored
Normal file
@@ -0,0 +1,35 @@
{
  "transform-async-to-generator": [
    "bugfix/transform-async-arrows-in-class"
  ],
  "transform-parameters": [
    "bugfix/transform-edge-default-parameters",
    "bugfix/transform-safari-id-destructuring-collision-in-function-expression"
  ],
  "transform-function-name": [
    "bugfix/transform-edge-function-name"
  ],
  "transform-block-scoping": [
    "bugfix/transform-safari-block-shadowing",
    "bugfix/transform-safari-for-shadowing"
  ],
  "transform-template-literals": [
    "bugfix/transform-tagged-template-caching"
  ],
  "transform-optional-chaining": [
    "bugfix/transform-v8-spread-parameters-in-optional-chaining"
  ],
  "proposal-optional-chaining": [
    "bugfix/transform-v8-spread-parameters-in-optional-chaining"
  ],
  "transform-class-properties": [
    "bugfix/transform-v8-static-class-fields-redefine-readonly",
    "bugfix/transform-firefox-class-in-computed-class-key",
    "bugfix/transform-safari-class-field-initializer-scope"
  ],
  "proposal-class-properties": [
    "bugfix/transform-v8-static-class-fields-redefine-readonly",
    "bugfix/transform-firefox-class-in-computed-class-key",
    "bugfix/transform-safari-class-field-initializer-scope"
  ]
}
203
web/node_modules/@babel/compat-data/data/plugin-bugfixes.json
generated
vendored
Normal file
@@ -0,0 +1,203 @@
{
  "bugfix/transform-async-arrows-in-class": {
    "chrome": "55",
    "opera": "42",
    "edge": "15",
    "firefox": "52",
    "safari": "11",
    "node": "7.6",
    "deno": "1",
    "ios": "11",
    "samsung": "6",
    "opera_mobile": "42",
    "electron": "1.6"
  },
  "bugfix/transform-edge-default-parameters": {
    "chrome": "49",
    "opera": "36",
    "edge": "18",
    "firefox": "52",
    "safari": "10",
    "node": "6",
    "deno": "1",
    "ios": "10",
    "samsung": "5",
    "opera_mobile": "36",
    "electron": "0.37"
  },
  "bugfix/transform-edge-function-name": {
    "chrome": "51",
    "opera": "38",
    "edge": "79",
    "firefox": "53",
    "safari": "10",
    "node": "6.5",
    "deno": "1",
    "ios": "10",
    "samsung": "5",
    "opera_mobile": "41",
    "electron": "1.2"
  },
  "bugfix/transform-safari-block-shadowing": {
    "chrome": "49",
    "opera": "36",
    "edge": "12",
    "firefox": "44",
    "safari": "11",
    "node": "6",
    "deno": "1",
    "ie": "11",
    "ios": "11",
    "samsung": "5",
    "opera_mobile": "36",
    "electron": "0.37"
  },
  "bugfix/transform-safari-for-shadowing": {
    "chrome": "49",
    "opera": "36",
    "edge": "12",
    "firefox": "4",
    "safari": "11",
    "node": "6",
    "deno": "1",
    "ie": "11",
    "ios": "11",
    "samsung": "5",
    "rhino": "1.7.13",
    "opera_mobile": "36",
    "electron": "0.37"
  },
  "bugfix/transform-safari-id-destructuring-collision-in-function-expression": {
    "chrome": "49",
    "opera": "36",
    "edge": "14",
    "firefox": "2",
    "safari": "16.3",
    "node": "6",
    "deno": "1",
    "ios": "16.3",
    "samsung": "5",
    "opera_mobile": "36",
    "electron": "0.37"
  },
  "bugfix/transform-tagged-template-caching": {
    "chrome": "41",
    "opera": "28",
    "edge": "12",
    "firefox": "34",
    "safari": "13",
    "node": "4",
    "deno": "1",
    "ios": "13",
    "samsung": "3.4",
    "rhino": "1.7.14",
    "opera_mobile": "28",
    "electron": "0.21"
  },
  "bugfix/transform-v8-spread-parameters-in-optional-chaining": {
    "chrome": "91",
    "opera": "77",
    "edge": "91",
    "firefox": "74",
    "safari": "13.1",
    "node": "16.9",
    "deno": "1.9",
    "ios": "13.4",
    "samsung": "16",
    "opera_mobile": "64",
    "electron": "13.0"
  },
  "transform-optional-chaining": {
    "chrome": "80",
    "opera": "67",
    "edge": "80",
    "firefox": "74",
    "safari": "13.1",
    "node": "14",
    "deno": "1",
    "ios": "13.4",
    "samsung": "13",
    "rhino": "1.8",
    "opera_mobile": "57",
    "electron": "8.0"
  },
  "proposal-optional-chaining": {
    "chrome": "80",
    "opera": "67",
    "edge": "80",
    "firefox": "74",
    "safari": "13.1",
    "node": "14",
    "deno": "1",
    "ios": "13.4",
    "samsung": "13",
    "rhino": "1.8",
    "opera_mobile": "57",
    "electron": "8.0"
  },
  "transform-parameters": {
    "chrome": "49",
    "opera": "36",
    "edge": "15",
    "firefox": "52",
    "safari": "10",
    "node": "6",
    "deno": "1",
    "ios": "10",
    "samsung": "5",
    "opera_mobile": "36",
    "electron": "0.37"
  },
  "transform-async-to-generator": {
    "chrome": "55",
    "opera": "42",
    "edge": "15",
    "firefox": "52",
    "safari": "10.1",
    "node": "7.6",
    "deno": "1",
    "ios": "10.3",
    "samsung": "6",
    "opera_mobile": "42",
    "electron": "1.6"
  },
  "transform-template-literals": {
    "chrome": "41",
    "opera": "28",
    "edge": "13",
    "firefox": "34",
    "safari": "9",
    "node": "4",
    "deno": "1",
    "ios": "9",
    "samsung": "3.4",
    "opera_mobile": "28",
    "electron": "0.21"
  },
  "transform-function-name": {
    "chrome": "51",
    "opera": "38",
    "edge": "14",
    "firefox": "53",
    "safari": "10",
    "node": "6.5",
    "deno": "1",
    "ios": "10",
    "samsung": "5",
    "opera_mobile": "41",
    "electron": "1.2"
  },
  "transform-block-scoping": {
    "chrome": "50",
    "opera": "37",
    "edge": "14",
    "firefox": "53",
    "safari": "10",
    "node": "6",
    "deno": "1",
    "ios": "10",
    "samsung": "5",
    "opera_mobile": "37",
    "electron": "1.1"
  }
}
838 web/node_modules/@babel/compat-data/data/plugins.json generated vendored Normal file
@@ -0,0 +1,838 @@
{
  "transform-explicit-resource-management": {
    "chrome": "134",
    "edge": "134",
    "firefox": "141",
    "node": "24",
    "electron": "35.0"
  },
  "transform-duplicate-named-capturing-groups-regex": {
    "chrome": "126",
    "opera": "112",
    "edge": "126",
    "firefox": "129",
    "safari": "17.4",
    "node": "23",
    "ios": "17.4",
    "electron": "31.0"
  },
  "transform-regexp-modifiers": {
    "chrome": "125",
    "opera": "111",
    "edge": "125",
    "firefox": "132",
    "node": "23",
    "samsung": "27",
    "electron": "31.0"
  },
  "transform-unicode-sets-regex": {
    "chrome": "112",
    "opera": "98",
    "edge": "112",
    "firefox": "116",
    "safari": "17",
    "node": "20",
    "deno": "1.32",
    "ios": "17",
    "samsung": "23",
    "opera_mobile": "75",
    "electron": "24.0"
  },
  "bugfix/transform-v8-static-class-fields-redefine-readonly": {
    "chrome": "98",
    "opera": "84",
    "edge": "98",
    "firefox": "75",
    "safari": "15",
    "node": "12",
    "deno": "1.18",
    "ios": "15",
    "samsung": "11",
    "opera_mobile": "52",
    "electron": "17.0"
  },
  "bugfix/transform-firefox-class-in-computed-class-key": {
    "chrome": "74",
    "opera": "62",
    "edge": "79",
    "firefox": "126",
    "safari": "16",
    "node": "12",
    "deno": "1",
    "ios": "16",
    "samsung": "11",
    "opera_mobile": "53",
    "electron": "6.0"
  },
  "bugfix/transform-safari-class-field-initializer-scope": {
    "chrome": "74",
    "opera": "62",
    "edge": "79",
    "firefox": "69",
    "safari": "16",
    "node": "12",
    "deno": "1",
    "ios": "16",
    "samsung": "11",
    "opera_mobile": "53",
    "electron": "6.0"
  },
  "transform-class-static-block": {
    "chrome": "94",
    "opera": "80",
    "edge": "94",
    "firefox": "93",
    "safari": "16.4",
    "node": "16.11",
    "deno": "1.14",
    "ios": "16.4",
    "samsung": "17",
    "opera_mobile": "66",
    "electron": "15.0"
  },
  "proposal-class-static-block": {
    "chrome": "94",
    "opera": "80",
    "edge": "94",
    "firefox": "93",
    "safari": "16.4",
    "node": "16.11",
    "deno": "1.14",
    "ios": "16.4",
    "samsung": "17",
    "opera_mobile": "66",
    "electron": "15.0"
  },
  "transform-private-property-in-object": {
    "chrome": "91",
    "opera": "77",
    "edge": "91",
    "firefox": "90",
    "safari": "15",
    "node": "16.9",
    "deno": "1.9",
    "ios": "15",
    "samsung": "16",
    "opera_mobile": "64",
    "electron": "13.0"
  },
  "proposal-private-property-in-object": {
    "chrome": "91",
    "opera": "77",
    "edge": "91",
    "firefox": "90",
    "safari": "15",
    "node": "16.9",
    "deno": "1.9",
    "ios": "15",
    "samsung": "16",
    "opera_mobile": "64",
    "electron": "13.0"
  },
  "transform-class-properties": {
    "chrome": "74",
    "opera": "62",
    "edge": "79",
    "firefox": "90",
    "safari": "14.1",
    "node": "12",
    "deno": "1",
    "ios": "14.5",
    "samsung": "11",
    "opera_mobile": "53",
    "electron": "6.0"
  },
  "proposal-class-properties": {
    "chrome": "74",
    "opera": "62",
    "edge": "79",
    "firefox": "90",
    "safari": "14.1",
    "node": "12",
    "deno": "1",
    "ios": "14.5",
    "samsung": "11",
    "opera_mobile": "53",
    "electron": "6.0"
  },
  "transform-private-methods": {
    "chrome": "84",
    "opera": "70",
    "edge": "84",
    "firefox": "90",
    "safari": "15",
    "node": "14.6",
    "deno": "1",
    "ios": "15",
    "samsung": "14",
    "opera_mobile": "60",
    "electron": "10.0"
  },
  "proposal-private-methods": {
    "chrome": "84",
    "opera": "70",
    "edge": "84",
    "firefox": "90",
    "safari": "15",
    "node": "14.6",
    "deno": "1",
    "ios": "15",
    "samsung": "14",
    "opera_mobile": "60",
    "electron": "10.0"
  },
  "transform-numeric-separator": {
    "chrome": "75",
    "opera": "62",
    "edge": "79",
    "firefox": "70",
    "safari": "13",
    "node": "12.5",
    "deno": "1",
    "ios": "13",
    "samsung": "11",
    "rhino": "1.7.14",
    "opera_mobile": "54",
    "electron": "6.0"
  },
  "proposal-numeric-separator": {
    "chrome": "75",
    "opera": "62",
    "edge": "79",
    "firefox": "70",
    "safari": "13",
    "node": "12.5",
    "deno": "1",
    "ios": "13",
    "samsung": "11",
    "rhino": "1.7.14",
    "opera_mobile": "54",
    "electron": "6.0"
  },
  "transform-logical-assignment-operators": {
    "chrome": "85",
    "opera": "71",
    "edge": "85",
    "firefox": "79",
    "safari": "14",
    "node": "15",
    "deno": "1.2",
    "ios": "14",
    "samsung": "14",
    "opera_mobile": "60",
    "electron": "10.0"
  },
  "proposal-logical-assignment-operators": {
    "chrome": "85",
    "opera": "71",
    "edge": "85",
    "firefox": "79",
    "safari": "14",
    "node": "15",
    "deno": "1.2",
    "ios": "14",
    "samsung": "14",
    "opera_mobile": "60",
    "electron": "10.0"
  },
  "transform-nullish-coalescing-operator": {
    "chrome": "80",
    "opera": "67",
    "edge": "80",
    "firefox": "72",
    "safari": "13.1",
    "node": "14",
    "deno": "1",
    "ios": "13.4",
    "samsung": "13",
    "rhino": "1.8",
    "opera_mobile": "57",
    "electron": "8.0"
  },
  "proposal-nullish-coalescing-operator": {
    "chrome": "80",
    "opera": "67",
    "edge": "80",
    "firefox": "72",
    "safari": "13.1",
    "node": "14",
    "deno": "1",
    "ios": "13.4",
    "samsung": "13",
    "rhino": "1.8",
    "opera_mobile": "57",
    "electron": "8.0"
  },
  "transform-optional-chaining": {
    "chrome": "91",
    "opera": "77",
    "edge": "91",
    "firefox": "74",
    "safari": "13.1",
    "node": "16.9",
    "deno": "1.9",
    "ios": "13.4",
    "samsung": "16",
    "opera_mobile": "64",
    "electron": "13.0"
  },
  "proposal-optional-chaining": {
    "chrome": "91",
    "opera": "77",
    "edge": "91",
    "firefox": "74",
    "safari": "13.1",
    "node": "16.9",
    "deno": "1.9",
    "ios": "13.4",
    "samsung": "16",
    "opera_mobile": "64",
    "electron": "13.0"
  },
  "transform-json-strings": {
    "chrome": "66",
    "opera": "53",
    "edge": "79",
    "firefox": "62",
    "safari": "12",
    "node": "10",
    "deno": "1",
    "ios": "12",
    "samsung": "9",
    "rhino": "1.7.14",
    "opera_mobile": "47",
    "electron": "3.0"
  },
  "proposal-json-strings": {
    "chrome": "66",
    "opera": "53",
    "edge": "79",
    "firefox": "62",
    "safari": "12",
    "node": "10",
    "deno": "1",
    "ios": "12",
    "samsung": "9",
    "rhino": "1.7.14",
    "opera_mobile": "47",
    "electron": "3.0"
  },
  "transform-optional-catch-binding": {
    "chrome": "66",
    "opera": "53",
    "edge": "79",
    "firefox": "58",
    "safari": "11.1",
    "node": "10",
    "deno": "1",
    "ios": "11.3",
    "samsung": "9",
    "opera_mobile": "47",
    "electron": "3.0"
  },
  "proposal-optional-catch-binding": {
    "chrome": "66",
    "opera": "53",
    "edge": "79",
    "firefox": "58",
    "safari": "11.1",
    "node": "10",
    "deno": "1",
    "ios": "11.3",
    "samsung": "9",
    "opera_mobile": "47",
    "electron": "3.0"
  },
  "transform-parameters": {
    "chrome": "49",
    "opera": "36",
    "edge": "18",
    "firefox": "52",
    "safari": "16.3",
    "node": "6",
    "deno": "1",
    "ios": "16.3",
    "samsung": "5",
    "opera_mobile": "36",
    "electron": "0.37"
  },
  "transform-async-generator-functions": {
    "chrome": "63",
    "opera": "50",
    "edge": "79",
    "firefox": "57",
    "safari": "12",
    "node": "10",
    "deno": "1",
    "ios": "12",
    "samsung": "8",
    "opera_mobile": "46",
    "electron": "3.0"
  },
  "proposal-async-generator-functions": {
    "chrome": "63",
    "opera": "50",
    "edge": "79",
    "firefox": "57",
    "safari": "12",
    "node": "10",
    "deno": "1",
    "ios": "12",
    "samsung": "8",
    "opera_mobile": "46",
    "electron": "3.0"
  },
  "transform-object-rest-spread": {
    "chrome": "60",
    "opera": "47",
    "edge": "79",
    "firefox": "55",
    "safari": "11.1",
    "node": "8.3",
    "deno": "1",
    "ios": "11.3",
    "samsung": "8",
    "opera_mobile": "44",
    "electron": "2.0"
  },
  "proposal-object-rest-spread": {
    "chrome": "60",
    "opera": "47",
    "edge": "79",
    "firefox": "55",
    "safari": "11.1",
    "node": "8.3",
    "deno": "1",
    "ios": "11.3",
    "samsung": "8",
    "opera_mobile": "44",
    "electron": "2.0"
  },
  "transform-dotall-regex": {
    "chrome": "62",
    "opera": "49",
    "edge": "79",
    "firefox": "78",
    "safari": "11.1",
    "node": "8.10",
    "deno": "1",
    "ios": "11.3",
    "samsung": "8",
    "rhino": "1.7.15",
    "opera_mobile": "46",
    "electron": "3.0"
  },
  "transform-unicode-property-regex": {
    "chrome": "64",
    "opera": "51",
    "edge": "79",
    "firefox": "78",
    "safari": "11.1",
    "node": "10",
    "deno": "1",
    "ios": "11.3",
    "samsung": "9",
    "opera_mobile": "47",
    "electron": "3.0"
  },
  "proposal-unicode-property-regex": {
    "chrome": "64",
    "opera": "51",
    "edge": "79",
    "firefox": "78",
    "safari": "11.1",
    "node": "10",
    "deno": "1",
    "ios": "11.3",
    "samsung": "9",
    "opera_mobile": "47",
    "electron": "3.0"
  },
  "transform-named-capturing-groups-regex": {
    "chrome": "64",
    "opera": "51",
    "edge": "79",
    "firefox": "78",
    "safari": "11.1",
    "node": "10",
    "deno": "1",
    "ios": "11.3",
    "samsung": "9",
    "opera_mobile": "47",
    "electron": "3.0"
  },
  "transform-async-to-generator": {
    "chrome": "55",
    "opera": "42",
    "edge": "15",
    "firefox": "52",
    "safari": "11",
    "node": "7.6",
    "deno": "1",
    "ios": "11",
    "samsung": "6",
    "opera_mobile": "42",
    "electron": "1.6"
  },
  "transform-exponentiation-operator": {
    "chrome": "52",
    "opera": "39",
    "edge": "14",
    "firefox": "52",
    "safari": "10.1",
    "node": "7",
    "deno": "1",
    "ios": "10.3",
    "samsung": "6",
    "rhino": "1.7.14",
    "opera_mobile": "41",
    "electron": "1.3"
  },
  "transform-template-literals": {
    "chrome": "41",
    "opera": "28",
    "edge": "13",
    "firefox": "34",
    "safari": "13",
    "node": "4",
    "deno": "1",
    "ios": "13",
    "samsung": "3.4",
    "opera_mobile": "28",
    "electron": "0.21"
  },
  "transform-literals": {
    "chrome": "44",
    "opera": "31",
    "edge": "12",
    "firefox": "53",
    "safari": "9",
    "node": "4",
    "deno": "1",
    "ios": "9",
    "samsung": "4",
    "rhino": "1.7.15",
    "opera_mobile": "32",
    "electron": "0.30"
  },
  "transform-function-name": {
    "chrome": "51",
    "opera": "38",
    "edge": "79",
    "firefox": "53",
    "safari": "10",
    "node": "6.5",
    "deno": "1",
    "ios": "10",
    "samsung": "5",
    "opera_mobile": "41",
    "electron": "1.2"
  },
  "transform-arrow-functions": {
    "chrome": "47",
    "opera": "34",
    "edge": "13",
    "firefox": "43",
    "safari": "10",
    "node": "6",
    "deno": "1",
    "ios": "10",
    "samsung": "5",
    "rhino": "1.7.13",
    "opera_mobile": "34",
    "electron": "0.36"
  },
  "transform-block-scoped-functions": {
    "chrome": "41",
    "opera": "28",
    "edge": "12",
    "firefox": "46",
    "safari": "10",
    "node": "4",
    "deno": "1",
    "ie": "11",
    "ios": "10",
    "samsung": "3.4",
    "opera_mobile": "28",
    "electron": "0.21"
  },
  "transform-classes": {
    "chrome": "46",
    "opera": "33",
    "edge": "13",
    "firefox": "45",
    "safari": "10",
    "node": "5",
    "deno": "1",
    "ios": "10",
    "samsung": "5",
    "opera_mobile": "33",
    "electron": "0.36"
  },
  "transform-object-super": {
    "chrome": "46",
    "opera": "33",
    "edge": "13",
    "firefox": "45",
    "safari": "10",
    "node": "5",
    "deno": "1",
    "ios": "10",
    "samsung": "5",
    "opera_mobile": "33",
    "electron": "0.36"
  },
  "transform-shorthand-properties": {
    "chrome": "43",
    "opera": "30",
    "edge": "12",
    "firefox": "33",
    "safari": "9",
    "node": "4",
    "deno": "1",
    "ios": "9",
    "samsung": "4",
    "rhino": "1.7.14",
    "opera_mobile": "30",
    "electron": "0.27"
  },
  "transform-duplicate-keys": {
    "chrome": "42",
    "opera": "29",
    "edge": "12",
    "firefox": "34",
    "safari": "9",
    "node": "4",
    "deno": "1",
    "ios": "9",
    "samsung": "3.4",
    "opera_mobile": "29",
    "electron": "0.25"
  },
  "transform-computed-properties": {
    "chrome": "44",
    "opera": "31",
    "edge": "12",
    "firefox": "34",
    "safari": "7.1",
    "node": "4",
    "deno": "1",
    "ios": "8",
    "samsung": "4",
    "rhino": "1.8",
    "opera_mobile": "32",
    "electron": "0.30"
  },
  "transform-for-of": {
    "chrome": "51",
    "opera": "38",
    "edge": "15",
    "firefox": "53",
    "safari": "10",
    "node": "6.5",
    "deno": "1",
    "ios": "10",
    "samsung": "5",
    "opera_mobile": "41",
    "electron": "1.2"
  },
  "transform-sticky-regex": {
    "chrome": "49",
    "opera": "36",
    "edge": "13",
    "firefox": "3",
    "safari": "10",
    "node": "6",
    "deno": "1",
    "ios": "10",
    "samsung": "5",
    "rhino": "1.7.15",
    "opera_mobile": "36",
    "electron": "0.37"
  },
  "transform-unicode-escapes": {
    "chrome": "44",
    "opera": "31",
    "edge": "12",
    "firefox": "53",
    "safari": "9",
    "node": "4",
    "deno": "1",
    "ios": "9",
    "samsung": "4",
    "rhino": "1.7.15",
    "opera_mobile": "32",
    "electron": "0.30"
  },
  "transform-unicode-regex": {
    "chrome": "50",
    "opera": "37",
    "edge": "13",
    "firefox": "46",
    "safari": "12",
    "node": "6",
    "deno": "1",
    "ios": "12",
    "samsung": "5",
    "opera_mobile": "37",
    "electron": "1.1"
  },
  "transform-spread": {
    "chrome": "46",
    "opera": "33",
    "edge": "13",
    "firefox": "45",
    "safari": "10",
    "node": "5",
    "deno": "1",
    "ios": "10",
    "samsung": "5",
    "opera_mobile": "33",
    "electron": "0.36"
  },
  "transform-destructuring": {
    "chrome": "51",
    "opera": "38",
    "edge": "15",
    "firefox": "53",
    "safari": "10",
    "node": "6.5",
    "deno": "1",
    "ios": "10",
    "samsung": "5",
    "opera_mobile": "41",
    "electron": "1.2"
  },
  "transform-block-scoping": {
    "chrome": "50",
    "opera": "37",
    "edge": "14",
    "firefox": "53",
    "safari": "11",
    "node": "6",
    "deno": "1",
    "ios": "11",
    "samsung": "5",
    "opera_mobile": "37",
    "electron": "1.1"
  },
  "transform-typeof-symbol": {
    "chrome": "48",
    "opera": "35",
    "edge": "12",
    "firefox": "36",
    "safari": "9",
    "node": "6",
    "deno": "1",
    "ios": "9",
    "samsung": "5",
    "rhino": "1.8",
    "opera_mobile": "35",
    "electron": "0.37"
  },
  "transform-new-target": {
    "chrome": "46",
    "opera": "33",
    "edge": "14",
    "firefox": "41",
    "safari": "10",
    "node": "5",
    "deno": "1",
    "ios": "10",
    "samsung": "5",
    "opera_mobile": "33",
    "electron": "0.36"
  },
  "transform-regenerator": {
    "chrome": "50",
    "opera": "37",
    "edge": "13",
    "firefox": "53",
    "safari": "10",
    "node": "6",
    "deno": "1",
    "ios": "10",
    "samsung": "5",
    "opera_mobile": "37",
    "electron": "1.1"
  },
  "transform-member-expression-literals": {
    "chrome": "7",
    "opera": "12",
    "edge": "12",
    "firefox": "2",
    "safari": "5.1",
    "node": "0.4",
    "deno": "1",
    "ie": "9",
    "android": "4",
    "ios": "6",
    "phantom": "1.9",
    "samsung": "1",
    "rhino": "1.7.13",
    "opera_mobile": "12",
    "electron": "0.20"
  },
  "transform-property-literals": {
    "chrome": "7",
    "opera": "12",
    "edge": "12",
    "firefox": "2",
    "safari": "5.1",
    "node": "0.4",
    "deno": "1",
    "ie": "9",
    "android": "4",
    "ios": "6",
    "phantom": "1.9",
    "samsung": "1",
    "rhino": "1.7.13",
    "opera_mobile": "12",
    "electron": "0.20"
  },
  "transform-reserved-words": {
    "chrome": "13",
    "opera": "10.50",
    "edge": "12",
    "firefox": "2",
    "safari": "3.1",
    "node": "0.6",
    "deno": "1",
    "ie": "9",
    "android": "4.4",
    "ios": "6",
    "phantom": "1.9",
    "samsung": "1",
    "rhino": "1.7.13",
    "opera_mobile": "10.1",
    "electron": "0.20"
  },
  "transform-export-namespace-from": {
    "chrome": "72",
    "deno": "1.0",
    "edge": "79",
    "firefox": "80",
    "node": "13.2.0",
    "opera": "60",
    "opera_mobile": "51",
    "safari": "14.1",
    "ios": "14.5",
    "samsung": "11.0",
    "android": "72",
    "electron": "5.0"
  },
  "proposal-export-namespace-from": {
    "chrome": "72",
    "deno": "1.0",
    "edge": "79",
    "firefox": "80",
    "node": "13.2.0",
    "opera": "60",
    "opera_mobile": "51",
    "safari": "14.1",
    "ios": "14.5",
    "samsung": "11.0",
    "android": "72",
    "electron": "5.0"
  }
}
2 web/node_modules/@babel/compat-data/native-modules.js generated vendored Normal file
@@ -0,0 +1,2 @@
// Todo (Babel 8): remove this file, in Babel 8 users import the .json directly
module.exports = require("./data/native-modules.json");
2 web/node_modules/@babel/compat-data/overlapping-plugins.js generated vendored Normal file
@@ -0,0 +1,2 @@
// Todo (Babel 8): remove this file, in Babel 8 users import the .json directly
module.exports = require("./data/overlapping-plugins.json");
40 web/node_modules/@babel/compat-data/package.json generated vendored Normal file
@@ -0,0 +1,40 @@
{
  "name": "@babel/compat-data",
  "version": "7.28.5",
  "author": "The Babel Team (https://babel.dev/team)",
  "license": "MIT",
  "description": "The compat-data to determine required Babel plugins",
  "repository": {
    "type": "git",
    "url": "https://github.com/babel/babel.git",
    "directory": "packages/babel-compat-data"
  },
  "publishConfig": {
    "access": "public"
  },
  "exports": {
    "./plugins": "./plugins.js",
    "./native-modules": "./native-modules.js",
    "./corejs2-built-ins": "./corejs2-built-ins.js",
    "./corejs3-shipped-proposals": "./corejs3-shipped-proposals.js",
    "./overlapping-plugins": "./overlapping-plugins.js",
    "./plugin-bugfixes": "./plugin-bugfixes.js"
  },
  "scripts": {
    "build-data": "./scripts/download-compat-table.sh && node ./scripts/build-data.mjs && node ./scripts/build-modules-support.mjs && node ./scripts/build-bugfixes-targets.mjs"
  },
  "keywords": [
    "babel",
    "compat-table",
    "compat-data"
  ],
  "devDependencies": {
    "@mdn/browser-compat-data": "^6.0.8",
    "core-js-compat": "^3.43.0",
    "electron-to-chromium": "^1.5.140"
  },
  "engines": {
    "node": ">=6.9.0"
  },
  "type": "commonjs"
}
2 web/node_modules/@babel/compat-data/plugin-bugfixes.js generated vendored Normal file
@@ -0,0 +1,2 @@
// Todo (Babel 8): remove this file, in Babel 8 users import the .json directly
module.exports = require("./data/plugin-bugfixes.json");
2 web/node_modules/@babel/compat-data/plugins.js generated vendored Normal file
@@ -0,0 +1,2 @@
// Todo (Babel 8): remove this file, in Babel 8 users import the .json directly
module.exports = require("./data/plugins.json");
22 web/node_modules/@babel/core/LICENSE generated vendored Normal file
@@ -0,0 +1,22 @@
MIT License

Copyright (c) 2014-present Sebastian McKenzie and other contributors

Permission is hereby granted, free of charge, to any person obtaining
a copy of this software and associated documentation files (the
"Software"), to deal in the Software without restriction, including
without limitation the rights to use, copy, modify, merge, publish,
distribute, sublicense, and/or sell copies of the Software, and to
permit persons to whom the Software is furnished to do so, subject to
the following conditions:

The above copyright notice and this permission notice shall be
included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE
LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION
OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION
WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
19 web/node_modules/@babel/core/README.md generated vendored Normal file
@@ -0,0 +1,19 @@
# @babel/core

> Babel compiler core.

See our website [@babel/core](https://babeljs.io/docs/babel-core) for more information or the [issues](https://github.com/babel/babel/issues?utf8=%E2%9C%93&q=is%3Aissue+label%3A%22pkg%3A%20core%22+is%3Aopen) associated with this package.

## Install

Using npm:

```sh
npm install --save-dev @babel/core
```

or using yarn:

```sh
yarn add @babel/core --dev
```
5 web/node_modules/@babel/core/lib/config/cache-contexts.js generated vendored Normal file
@@ -0,0 +1,5 @@
"use strict";

0 && 0;

//# sourceMappingURL=cache-contexts.js.map
1 web/node_modules/@babel/core/lib/config/cache-contexts.js.map generated vendored Normal file
@@ -0,0 +1 @@
{"version":3,"names":[],"sources":["../../src/config/cache-contexts.ts"],"sourcesContent":["import type { ConfigContext } from \"./config-chain.ts\";\nimport type {\n  CallerMetadata,\n  TargetsListOrObject,\n} from \"./validation/options.ts\";\n\nexport type { ConfigContext as FullConfig };\n\nexport type FullPreset = {\n  targets: TargetsListOrObject;\n} & ConfigContext;\nexport type FullPlugin = {\n  assumptions: { [name: string]: boolean };\n} & FullPreset;\n\n// Context not including filename since it is used in places that cannot\n// process 'ignore'/'only' and other filename-based logic.\nexport type SimpleConfig = {\n  envName: string;\n  caller: CallerMetadata | undefined;\n};\nexport type SimplePreset = {\n  targets: TargetsListOrObject;\n} & SimpleConfig;\nexport type SimplePlugin = {\n  assumptions: {\n    [name: string]: boolean;\n  };\n} & SimplePreset;\n"],"mappings":"","ignoreList":[]}
261 web/node_modules/@babel/core/lib/config/caching.js generated vendored Normal file
@@ -0,0 +1,261 @@
"use strict";

Object.defineProperty(exports, "__esModule", {
  value: true
});
exports.assertSimpleType = assertSimpleType;
exports.makeStrongCache = makeStrongCache;
exports.makeStrongCacheSync = makeStrongCacheSync;
exports.makeWeakCache = makeWeakCache;
exports.makeWeakCacheSync = makeWeakCacheSync;
function _gensync() {
  const data = require("gensync");
  _gensync = function () {
    return data;
  };
  return data;
}
var _async = require("../gensync-utils/async.js");
var _util = require("./util.js");
const synchronize = gen => {
  return _gensync()(gen).sync;
};
function* genTrue() {
  return true;
}
function makeWeakCache(handler) {
  return makeCachedFunction(WeakMap, handler);
}
function makeWeakCacheSync(handler) {
  return synchronize(makeWeakCache(handler));
}
function makeStrongCache(handler) {
  return makeCachedFunction(Map, handler);
}
function makeStrongCacheSync(handler) {
  return synchronize(makeStrongCache(handler));
}
function makeCachedFunction(CallCache, handler) {
  const callCacheSync = new CallCache();
  const callCacheAsync = new CallCache();
  const futureCache = new CallCache();
  return function* cachedFunction(arg, data) {
    const asyncContext = yield* (0, _async.isAsync)();
    const callCache = asyncContext ? callCacheAsync : callCacheSync;
    const cached = yield* getCachedValueOrWait(asyncContext, callCache, futureCache, arg, data);
    if (cached.valid) return cached.value;
    const cache = new CacheConfigurator(data);
    const handlerResult = handler(arg, cache);
    let finishLock;
    let value;
    if ((0, _util.isIterableIterator)(handlerResult)) {
      value = yield* (0, _async.onFirstPause)(handlerResult, () => {
        finishLock = setupAsyncLocks(cache, futureCache, arg);
      });
    } else {
      value = handlerResult;
    }
    updateFunctionCache(callCache, cache, arg, value);
    if (finishLock) {
      futureCache.delete(arg);
      finishLock.release(value);
    }
    return value;
  };
}
function* getCachedValue(cache, arg, data) {
  const cachedValue = cache.get(arg);
  if (cachedValue) {
    for (const {
      value,
      valid
    } of cachedValue) {
      if (yield* valid(data)) return {
        valid: true,
        value
      };
    }
  }
  return {
    valid: false,
    value: null
  };
}
function* getCachedValueOrWait(asyncContext, callCache, futureCache, arg, data) {
  const cached = yield* getCachedValue(callCache, arg, data);
  if (cached.valid) {
    return cached;
  }
  if (asyncContext) {
    const cached = yield* getCachedValue(futureCache, arg, data);
    if (cached.valid) {
      const value = yield* (0, _async.waitFor)(cached.value.promise);
      return {
        valid: true,
        value
      };
    }
  }
  return {
    valid: false,
    value: null
  };
}
function setupAsyncLocks(config, futureCache, arg) {
  const finishLock = new Lock();
  updateFunctionCache(futureCache, config, arg, finishLock);
  return finishLock;
}
function updateFunctionCache(cache, config, arg, value) {
  if (!config.configured()) config.forever();
  let cachedValue = cache.get(arg);
  config.deactivate();
  switch (config.mode()) {
    case "forever":
      cachedValue = [{
        value,
        valid: genTrue
      }];
      cache.set(arg, cachedValue);
      break;
    case "invalidate":
      cachedValue = [{
        value,
        valid: config.validator()
      }];
      cache.set(arg, cachedValue);
      break;
    case "valid":
      if (cachedValue) {
        cachedValue.push({
          value,
          valid: config.validator()
        });
      } else {
        cachedValue = [{
          value,
          valid: config.validator()
        }];
        cache.set(arg, cachedValue);
      }
  }
}
class CacheConfigurator {
  constructor(data) {
    this._active = true;
    this._never = false;
    this._forever = false;
    this._invalidate = false;
    this._configured = false;
    this._pairs = [];
    this._data = void 0;
    this._data = data;
  }
  simple() {
    return makeSimpleConfigurator(this);
  }
  mode() {
    if (this._never) return "never";
    if (this._forever) return "forever";
    if (this._invalidate) return "invalidate";
    return "valid";
  }
  forever() {
    if (!this._active) {
      throw new Error("Cannot change caching after evaluation has completed.");
    }
    if (this._never) {
      throw new Error("Caching has already been configured with .never()");
    }
    this._forever = true;
    this._configured = true;
  }
  never() {
    if (!this._active) {
      throw new Error("Cannot change caching after evaluation has completed.");
    }
    if (this._forever) {
      throw new Error("Caching has already been configured with .forever()");
    }
    this._never = true;
    this._configured = true;
  }
  using(handler) {
    if (!this._active) {
      throw new Error("Cannot change caching after evaluation has completed.");
    }
    if (this._never || this._forever) {
      throw new Error("Caching has already been configured with .never or .forever()");
    }
    this._configured = true;
    const key = handler(this._data);
    const fn = (0, _async.maybeAsync)(handler, `You appear to be using an async cache handler, but Babel has been called synchronously`);
    if ((0, _async.isThenable)(key)) {
      return key.then(key => {
        this._pairs.push([key, fn]);
        return key;
      });
    }
    this._pairs.push([key, fn]);
    return key;
  }
  invalidate(handler) {
    this._invalidate = true;
    return this.using(handler);
  }
  validator() {
    const pairs = this._pairs;
    return function* (data) {
      for (const [key, fn] of pairs) {
        if (key !== (yield* fn(data))) return false;
      }
      return true;
    };
  }
  deactivate() {
    this._active = false;
  }
  configured() {
    return this._configured;
  }
}
function makeSimpleConfigurator(cache) {
  function cacheFn(val) {
    if (typeof val === "boolean") {
      if (val) cache.forever();else cache.never();
      return;
    }
    return cache.using(() => assertSimpleType(val()));
  }
  cacheFn.forever = () => cache.forever();
  cacheFn.never = () => cache.never();
  cacheFn.using = cb => cache.using(() => assertSimpleType(cb()));
  cacheFn.invalidate = cb => cache.invalidate(() => assertSimpleType(cb()));
  return cacheFn;
}
function assertSimpleType(value) {
  if ((0, _async.isThenable)(value)) {
    throw new Error(`You appear to be using an async cache handler, ` + `which your current version of Babel does not support. ` + `We may add support for this in the future, ` + `but if you're on the most recent version of @babel/core and still ` + `seeing this error, then you'll need to synchronously handle your caching logic.`);
  }
  if (value != null && typeof value !== "string" && typeof value !== "boolean" && typeof value !== "number") {
    throw new Error("Cache keys must be either string, boolean, number, null, or undefined.");
  }
  return value;
}
class Lock {
  constructor() {
    this.released = false;
    this.promise = void 0;
    this._resolve = void 0;
    this.promise = new Promise(resolve => {
      this._resolve = resolve;
    });
  }
  release(value) {
    this.released = true;
    this._resolve(value);
  }
}
0 && 0;

//# sourceMappingURL=caching.js.map
1 web/node_modules/@babel/core/lib/config/caching.js.map generated vendored Normal file
File diff suppressed because one or more lines are too long
469 web/node_modules/@babel/core/lib/config/config-chain.js generated vendored Normal file
@@ -0,0 +1,469 @@
"use strict";

Object.defineProperty(exports, "__esModule", {
  value: true
});
exports.buildPresetChain = buildPresetChain;
exports.buildPresetChainWalker = void 0;
exports.buildRootChain = buildRootChain;
function _path() {
  const data = require("path");
  _path = function () {
    return data;
  };
  return data;
}
function _debug() {
  const data = require("debug");
  _debug = function () {
    return data;
  };
  return data;
}
var _options = require("./validation/options.js");
var _patternToRegex = require("./pattern-to-regex.js");
var _printer = require("./printer.js");
var _rewriteStackTrace = require("../errors/rewrite-stack-trace.js");
var _configError = require("../errors/config-error.js");
var _index = require("./files/index.js");
var _caching = require("./caching.js");
var _configDescriptors = require("./config-descriptors.js");
const debug = _debug()("babel:config:config-chain");
function* buildPresetChain(arg, context) {
  const chain = yield* buildPresetChainWalker(arg, context);
  if (!chain) return null;
  return {
    plugins: dedupDescriptors(chain.plugins),
    presets: dedupDescriptors(chain.presets),
    options: chain.options.map(o => createConfigChainOptions(o)),
    files: new Set()
  };
}
const buildPresetChainWalker = exports.buildPresetChainWalker = makeChainWalker({
  root: preset => loadPresetDescriptors(preset),
  env: (preset, envName) => loadPresetEnvDescriptors(preset)(envName),
  overrides: (preset, index) => loadPresetOverridesDescriptors(preset)(index),
  overridesEnv: (preset, index, envName) => loadPresetOverridesEnvDescriptors(preset)(index)(envName),
  createLogger: () => () => {}
});
const loadPresetDescriptors = (0, _caching.makeWeakCacheSync)(preset => buildRootDescriptors(preset, preset.alias, _configDescriptors.createUncachedDescriptors));
const loadPresetEnvDescriptors = (0, _caching.makeWeakCacheSync)(preset => (0, _caching.makeStrongCacheSync)(envName => buildEnvDescriptors(preset, preset.alias, _configDescriptors.createUncachedDescriptors, envName)));
const loadPresetOverridesDescriptors = (0, _caching.makeWeakCacheSync)(preset => (0, _caching.makeStrongCacheSync)(index => buildOverrideDescriptors(preset, preset.alias, _configDescriptors.createUncachedDescriptors, index)));
const loadPresetOverridesEnvDescriptors = (0, _caching.makeWeakCacheSync)(preset => (0, _caching.makeStrongCacheSync)(index => (0, _caching.makeStrongCacheSync)(envName => buildOverrideEnvDescriptors(preset, preset.alias, _configDescriptors.createUncachedDescriptors, index, envName))));
function* buildRootChain(opts, context) {
  let configReport, babelRcReport;
  const programmaticLogger = new _printer.ConfigPrinter();
  const programmaticChain = yield* loadProgrammaticChain({
    options: opts,
    dirname: context.cwd
  }, context, undefined, programmaticLogger);
  if (!programmaticChain) return null;
  const programmaticReport = yield* programmaticLogger.output();
  let configFile;
  if (typeof opts.configFile === "string") {
    configFile = yield* (0, _index.loadConfig)(opts.configFile, context.cwd, context.envName, context.caller);
  } else if (opts.configFile !== false) {
    configFile = yield* (0, _index.findRootConfig)(context.root, context.envName, context.caller);
  }
  let {
    babelrc,
    babelrcRoots
  } = opts;
  let babelrcRootsDirectory = context.cwd;
  const configFileChain = emptyChain();
  const configFileLogger = new _printer.ConfigPrinter();
  if (configFile) {
    const validatedFile = validateConfigFile(configFile);
    const result = yield* loadFileChain(validatedFile, context, undefined, configFileLogger);
    if (!result) return null;
    configReport = yield* configFileLogger.output();
    if (babelrc === undefined) {
      babelrc = validatedFile.options.babelrc;
    }
    if (babelrcRoots === undefined) {
      babelrcRootsDirectory = validatedFile.dirname;
      babelrcRoots = validatedFile.options.babelrcRoots;
    }
    mergeChain(configFileChain, result);
  }
  let ignoreFile, babelrcFile;
  let isIgnored = false;
  const fileChain = emptyChain();
  if ((babelrc === true || babelrc === undefined) && typeof context.filename === "string") {
    const pkgData = yield* (0, _index.findPackageData)(context.filename);
    if (pkgData && babelrcLoadEnabled(context, pkgData, babelrcRoots, babelrcRootsDirectory)) {
      ({
        ignore: ignoreFile,
        config: babelrcFile
      } = yield* (0, _index.findRelativeConfig)(pkgData, context.envName, context.caller));
      if (ignoreFile) {
        fileChain.files.add(ignoreFile.filepath);
      }
      if (ignoreFile && shouldIgnore(context, ignoreFile.ignore, null, ignoreFile.dirname)) {
        isIgnored = true;
      }
      if (babelrcFile && !isIgnored) {
        const validatedFile = validateBabelrcFile(babelrcFile);
        const babelrcLogger = new _printer.ConfigPrinter();
        const result = yield* loadFileChain(validatedFile, context, undefined, babelrcLogger);
        if (!result) {
          isIgnored = true;
        } else {
          babelRcReport = yield* babelrcLogger.output();
          mergeChain(fileChain, result);
        }
      }
      if (babelrcFile && isIgnored) {
        fileChain.files.add(babelrcFile.filepath);
      }
    }
  }
  if (context.showConfig) {
    console.log(`Babel configs on "${context.filename}" (ascending priority):\n` + [configReport, babelRcReport, programmaticReport].filter(x => !!x).join("\n\n") + "\n-----End Babel configs-----");
  }
  const chain = mergeChain(mergeChain(mergeChain(emptyChain(), configFileChain), fileChain), programmaticChain);
  return {
    plugins: isIgnored ? [] : dedupDescriptors(chain.plugins),
    presets: isIgnored ? [] : dedupDescriptors(chain.presets),
    options: isIgnored ? [] : chain.options.map(o => createConfigChainOptions(o)),
    fileHandling: isIgnored ? "ignored" : "transpile",
    ignore: ignoreFile || undefined,
    babelrc: babelrcFile || undefined,
    config: configFile || undefined,
    files: chain.files
  };
}
function babelrcLoadEnabled(context, pkgData, babelrcRoots, babelrcRootsDirectory) {
  if (typeof babelrcRoots === "boolean") return babelrcRoots;
  const absoluteRoot = context.root;
  if (babelrcRoots === undefined) {
    return pkgData.directories.includes(absoluteRoot);
  }
  let babelrcPatterns = babelrcRoots;
  if (!Array.isArray(babelrcPatterns)) {
    babelrcPatterns = [babelrcPatterns];
  }
  babelrcPatterns = babelrcPatterns.map(pat => {
    return typeof pat === "string" ? _path().resolve(babelrcRootsDirectory, pat) : pat;
  });
  if (babelrcPatterns.length === 1 && babelrcPatterns[0] === absoluteRoot) {
    return pkgData.directories.includes(absoluteRoot);
  }
  return babelrcPatterns.some(pat => {
    if (typeof pat === "string") {
      pat = (0, _patternToRegex.default)(pat, babelrcRootsDirectory);
    }
    return pkgData.directories.some(directory => {
      return matchPattern(pat, babelrcRootsDirectory, directory, context);
    });
  });
}
const validateConfigFile = (0, _caching.makeWeakCacheSync)(file => ({
  filepath: file.filepath,
  dirname: file.dirname,
  options: (0, _options.validate)("configfile", file.options, file.filepath)
}));
const validateBabelrcFile = (0, _caching.makeWeakCacheSync)(file => ({
  filepath: file.filepath,
  dirname: file.dirname,
  options: (0, _options.validate)("babelrcfile", file.options, file.filepath)
}));
const validateExtendFile = (0, _caching.makeWeakCacheSync)(file => ({
  filepath: file.filepath,
  dirname: file.dirname,
  options: (0, _options.validate)("extendsfile", file.options, file.filepath)
}));
const loadProgrammaticChain = makeChainWalker({
  root: input => buildRootDescriptors(input, "base", _configDescriptors.createCachedDescriptors),
  env: (input, envName) => buildEnvDescriptors(input, "base", _configDescriptors.createCachedDescriptors, envName),
  overrides: (input, index) => buildOverrideDescriptors(input, "base", _configDescriptors.createCachedDescriptors, index),
  overridesEnv: (input, index, envName) => buildOverrideEnvDescriptors(input, "base", _configDescriptors.createCachedDescriptors, index, envName),
  createLogger: (input, context, baseLogger) => buildProgrammaticLogger(input, context, baseLogger)
});
const loadFileChainWalker = makeChainWalker({
  root: file => loadFileDescriptors(file),
  env: (file, envName) => loadFileEnvDescriptors(file)(envName),
  overrides: (file, index) => loadFileOverridesDescriptors(file)(index),
  overridesEnv: (file, index, envName) => loadFileOverridesEnvDescriptors(file)(index)(envName),
  createLogger: (file, context, baseLogger) => buildFileLogger(file.filepath, context, baseLogger)
});
function* loadFileChain(input, context, files, baseLogger) {
  const chain = yield* loadFileChainWalker(input, context, files, baseLogger);
  chain == null || chain.files.add(input.filepath);
  return chain;
}
const loadFileDescriptors = (0, _caching.makeWeakCacheSync)(file => buildRootDescriptors(file, file.filepath, _configDescriptors.createUncachedDescriptors));
const loadFileEnvDescriptors = (0, _caching.makeWeakCacheSync)(file => (0, _caching.makeStrongCacheSync)(envName => buildEnvDescriptors(file, file.filepath, _configDescriptors.createUncachedDescriptors, envName)));
const loadFileOverridesDescriptors = (0, _caching.makeWeakCacheSync)(file => (0, _caching.makeStrongCacheSync)(index => buildOverrideDescriptors(file, file.filepath, _configDescriptors.createUncachedDescriptors, index)));
const loadFileOverridesEnvDescriptors = (0, _caching.makeWeakCacheSync)(file => (0, _caching.makeStrongCacheSync)(index => (0, _caching.makeStrongCacheSync)(envName => buildOverrideEnvDescriptors(file, file.filepath, _configDescriptors.createUncachedDescriptors, index, envName))));
function buildFileLogger(filepath, context, baseLogger) {
  if (!baseLogger) {
    return () => {};
  }
  return baseLogger.configure(context.showConfig, _printer.ChainFormatter.Config, {
    filepath
  });
}
function buildRootDescriptors({
  dirname,
  options
}, alias, descriptors) {
  return descriptors(dirname, options, alias);
}
function buildProgrammaticLogger(_, context, baseLogger) {
  var _context$caller;
  if (!baseLogger) {
    return () => {};
  }
  return baseLogger.configure(context.showConfig, _printer.ChainFormatter.Programmatic, {
    callerName: (_context$caller = context.caller) == null ? void 0 : _context$caller.name
  });
}
function buildEnvDescriptors({
  dirname,
  options
}, alias, descriptors, envName) {
  var _options$env;
  const opts = (_options$env = options.env) == null ? void 0 : _options$env[envName];
  return opts ? descriptors(dirname, opts, `${alias}.env["${envName}"]`) : null;
}
function buildOverrideDescriptors({
  dirname,
  options
}, alias, descriptors, index) {
  var _options$overrides;
  const opts = (_options$overrides = options.overrides) == null ? void 0 : _options$overrides[index];
  if (!opts) throw new Error("Assertion failure - missing override");
  return descriptors(dirname, opts, `${alias}.overrides[${index}]`);
}
function buildOverrideEnvDescriptors({
  dirname,
  options
}, alias, descriptors, index, envName) {
  var _options$overrides2, _override$env;
  const override = (_options$overrides2 = options.overrides) == null ? void 0 : _options$overrides2[index];
  if (!override) throw new Error("Assertion failure - missing override");
  const opts = (_override$env = override.env) == null ? void 0 : _override$env[envName];
  return opts ? descriptors(dirname, opts, `${alias}.overrides[${index}].env["${envName}"]`) : null;
}
function makeChainWalker({
  root,
  env,
  overrides,
  overridesEnv,
  createLogger
}) {
  return function* chainWalker(input, context, files = new Set(), baseLogger) {
    const {
      dirname
    } = input;
    const flattenedConfigs = [];
    const rootOpts = root(input);
    if (configIsApplicable(rootOpts, dirname, context, input.filepath)) {
      flattenedConfigs.push({
        config: rootOpts,
        envName: undefined,
        index: undefined
      });
      const envOpts = env(input, context.envName);
      if (envOpts && configIsApplicable(envOpts, dirname, context, input.filepath)) {
        flattenedConfigs.push({
          config: envOpts,
          envName: context.envName,
          index: undefined
        });
      }
      (rootOpts.options.overrides || []).forEach((_, index) => {
        const overrideOps = overrides(input, index);
        if (configIsApplicable(overrideOps, dirname, context, input.filepath)) {
          flattenedConfigs.push({
            config: overrideOps,
            index,
            envName: undefined
          });
          const overrideEnvOpts = overridesEnv(input, index, context.envName);
          if (overrideEnvOpts && configIsApplicable(overrideEnvOpts, dirname, context, input.filepath)) {
            flattenedConfigs.push({
              config: overrideEnvOpts,
              index,
              envName: context.envName
            });
          }
        }
      });
    }
    if (flattenedConfigs.some(({
      config: {
        options: {
          ignore,
          only
        }
      }
    }) => shouldIgnore(context, ignore, only, dirname))) {
      return null;
    }
    const chain = emptyChain();
    const logger = createLogger(input, context, baseLogger);
    for (const {
      config,
      index,
      envName
    } of flattenedConfigs) {
      if (!(yield* mergeExtendsChain(chain, config.options, dirname, context, files, baseLogger))) {
        return null;
      }
      logger(config, index, envName);
      yield* mergeChainOpts(chain, config);
    }
    return chain;
  };
}
function* mergeExtendsChain(chain, opts, dirname, context, files, baseLogger) {
  if (opts.extends === undefined) return true;
  const file = yield* (0, _index.loadConfig)(opts.extends, dirname, context.envName, context.caller);
  if (files.has(file)) {
    throw new Error(`Configuration cycle detected loading ${file.filepath}.\n` + `File already loaded following the config chain:\n` + Array.from(files, file => ` - ${file.filepath}`).join("\n"));
  }
  files.add(file);
  const fileChain = yield* loadFileChain(validateExtendFile(file), context, files, baseLogger);
  files.delete(file);
  if (!fileChain) return false;
  mergeChain(chain, fileChain);
  return true;
}
function mergeChain(target, source) {
  target.options.push(...source.options);
  target.plugins.push(...source.plugins);
  target.presets.push(...source.presets);
  for (const file of source.files) {
    target.files.add(file);
  }
  return target;
}
function* mergeChainOpts(target, {
  options,
  plugins,
  presets
}) {
  target.options.push(options);
  target.plugins.push(...(yield* plugins()));
  target.presets.push(...(yield* presets()));
  return target;
}
function emptyChain() {
  return {
    options: [],
    presets: [],
    plugins: [],
    files: new Set()
  };
}
function createConfigChainOptions(opts) {
  const options = Object.assign({}, opts);
  delete options.extends;
  delete options.env;
  delete options.overrides;
  delete options.plugins;
  delete options.presets;
  delete options.passPerPreset;
  delete options.ignore;
  delete options.only;
  delete options.test;
  delete options.include;
  delete options.exclude;
  if (hasOwnProperty.call(options, "sourceMap")) {
    options.sourceMaps = options.sourceMap;
    delete options.sourceMap;
  }
  return options;
}
function dedupDescriptors(items) {
  const map = new Map();
  const descriptors = [];
  for (const item of items) {
    if (typeof item.value === "function") {
      const fnKey = item.value;
      let nameMap = map.get(fnKey);
      if (!nameMap) {
        nameMap = new Map();
        map.set(fnKey, nameMap);
      }
      let desc = nameMap.get(item.name);
      if (!desc) {
        desc = {
          value: item
        };
        descriptors.push(desc);
        if (!item.ownPass) nameMap.set(item.name, desc);
      } else {
        desc.value = item;
      }
    } else {
      descriptors.push({
        value: item
      });
    }
  }
  return descriptors.reduce((acc, desc) => {
    acc.push(desc.value);
|
return acc;
|
||||||
|
}, []);
|
||||||
|
}
|
||||||
|
function configIsApplicable({
|
||||||
|
options
|
||||||
|
}, dirname, context, configName) {
|
||||||
|
return (options.test === undefined || configFieldIsApplicable(context, options.test, dirname, configName)) && (options.include === undefined || configFieldIsApplicable(context, options.include, dirname, configName)) && (options.exclude === undefined || !configFieldIsApplicable(context, options.exclude, dirname, configName));
|
||||||
|
}
|
||||||
|
function configFieldIsApplicable(context, test, dirname, configName) {
|
||||||
|
const patterns = Array.isArray(test) ? test : [test];
|
||||||
|
return matchesPatterns(context, patterns, dirname, configName);
|
||||||
|
}
|
||||||
|
function ignoreListReplacer(_key, value) {
|
||||||
|
if (value instanceof RegExp) {
|
||||||
|
return String(value);
|
||||||
|
}
|
||||||
|
return value;
|
||||||
|
}
|
||||||
|
function shouldIgnore(context, ignore, only, dirname) {
|
||||||
|
if (ignore && matchesPatterns(context, ignore, dirname)) {
|
||||||
|
var _context$filename;
|
||||||
|
const message = `No config is applied to "${(_context$filename = context.filename) != null ? _context$filename : "(unknown)"}" because it matches one of \`ignore: ${JSON.stringify(ignore, ignoreListReplacer)}\` from "${dirname}"`;
|
||||||
|
debug(message);
|
||||||
|
if (context.showConfig) {
|
||||||
|
console.log(message);
|
||||||
|
}
|
||||||
|
return true;
|
||||||
|
}
|
||||||
|
if (only && !matchesPatterns(context, only, dirname)) {
|
||||||
|
var _context$filename2;
|
||||||
|
const message = `No config is applied to "${(_context$filename2 = context.filename) != null ? _context$filename2 : "(unknown)"}" because it fails to match one of \`only: ${JSON.stringify(only, ignoreListReplacer)}\` from "${dirname}"`;
|
||||||
|
debug(message);
|
||||||
|
if (context.showConfig) {
|
||||||
|
console.log(message);
|
||||||
|
}
|
||||||
|
return true;
|
||||||
|
}
|
||||||
|
return false;
|
||||||
|
}
|
||||||
|
function matchesPatterns(context, patterns, dirname, configName) {
|
||||||
|
return patterns.some(pattern => matchPattern(pattern, dirname, context.filename, context, configName));
|
||||||
|
}
|
||||||
|
function matchPattern(pattern, dirname, pathToTest, context, configName) {
|
||||||
|
if (typeof pattern === "function") {
|
||||||
|
return !!(0, _rewriteStackTrace.endHiddenCallStack)(pattern)(pathToTest, {
|
||||||
|
dirname,
|
||||||
|
envName: context.envName,
|
||||||
|
caller: context.caller
|
||||||
|
});
|
||||||
|
}
|
||||||
|
if (typeof pathToTest !== "string") {
|
||||||
|
throw new _configError.default(`Configuration contains string/RegExp pattern, but no filename was passed to Babel`, configName);
|
||||||
|
}
|
||||||
|
if (typeof pattern === "string") {
|
||||||
|
pattern = (0, _patternToRegex.default)(pattern, dirname);
|
||||||
|
}
|
||||||
|
return pattern.test(pathToTest);
|
||||||
|
}
|
||||||
|
0 && 0;
|
||||||
|
|
||||||
|
//# sourceMappingURL=config-chain.js.map
1
web/node_modules/@babel/core/lib/config/config-chain.js.map
generated
vendored
Normal file
File diff suppressed because one or more lines are too long
190
web/node_modules/@babel/core/lib/config/config-descriptors.js
generated
vendored
Normal file
@@ -0,0 +1,190 @@
"use strict";

Object.defineProperty(exports, "__esModule", {
  value: true
});
exports.createCachedDescriptors = createCachedDescriptors;
exports.createDescriptor = createDescriptor;
exports.createUncachedDescriptors = createUncachedDescriptors;
function _gensync() {
  const data = require("gensync");
  _gensync = function () {
    return data;
  };
  return data;
}
var _functional = require("../gensync-utils/functional.js");
var _index = require("./files/index.js");
var _item = require("./item.js");
var _caching = require("./caching.js");
var _resolveTargets = require("./resolve-targets.js");
function isEqualDescriptor(a, b) {
  var _a$file, _b$file, _a$file2, _b$file2;
  return a.name === b.name && a.value === b.value && a.options === b.options && a.dirname === b.dirname && a.alias === b.alias && a.ownPass === b.ownPass && ((_a$file = a.file) == null ? void 0 : _a$file.request) === ((_b$file = b.file) == null ? void 0 : _b$file.request) && ((_a$file2 = a.file) == null ? void 0 : _a$file2.resolved) === ((_b$file2 = b.file) == null ? void 0 : _b$file2.resolved);
}
function* handlerOf(value) {
  return value;
}
function optionsWithResolvedBrowserslistConfigFile(options, dirname) {
  if (typeof options.browserslistConfigFile === "string") {
    options.browserslistConfigFile = (0, _resolveTargets.resolveBrowserslistConfigFile)(options.browserslistConfigFile, dirname);
  }
  return options;
}
function createCachedDescriptors(dirname, options, alias) {
  const {
    plugins,
    presets,
    passPerPreset
  } = options;
  return {
    options: optionsWithResolvedBrowserslistConfigFile(options, dirname),
    plugins: plugins ? () => createCachedPluginDescriptors(plugins, dirname)(alias) : () => handlerOf([]),
    presets: presets ? () => createCachedPresetDescriptors(presets, dirname)(alias)(!!passPerPreset) : () => handlerOf([])
  };
}
function createUncachedDescriptors(dirname, options, alias) {
  return {
    options: optionsWithResolvedBrowserslistConfigFile(options, dirname),
    plugins: (0, _functional.once)(() => createPluginDescriptors(options.plugins || [], dirname, alias)),
    presets: (0, _functional.once)(() => createPresetDescriptors(options.presets || [], dirname, alias, !!options.passPerPreset))
  };
}
const PRESET_DESCRIPTOR_CACHE = new WeakMap();
const createCachedPresetDescriptors = (0, _caching.makeWeakCacheSync)((items, cache) => {
  const dirname = cache.using(dir => dir);
  return (0, _caching.makeStrongCacheSync)(alias => (0, _caching.makeStrongCache)(function* (passPerPreset) {
    const descriptors = yield* createPresetDescriptors(items, dirname, alias, passPerPreset);
    return descriptors.map(desc => loadCachedDescriptor(PRESET_DESCRIPTOR_CACHE, desc));
  }));
});
const PLUGIN_DESCRIPTOR_CACHE = new WeakMap();
const createCachedPluginDescriptors = (0, _caching.makeWeakCacheSync)((items, cache) => {
  const dirname = cache.using(dir => dir);
  return (0, _caching.makeStrongCache)(function* (alias) {
    const descriptors = yield* createPluginDescriptors(items, dirname, alias);
    return descriptors.map(desc => loadCachedDescriptor(PLUGIN_DESCRIPTOR_CACHE, desc));
  });
});
const DEFAULT_OPTIONS = {};
function loadCachedDescriptor(cache, desc) {
  const {
    value,
    options = DEFAULT_OPTIONS
  } = desc;
  if (options === false) return desc;
  let cacheByOptions = cache.get(value);
  if (!cacheByOptions) {
    cacheByOptions = new WeakMap();
    cache.set(value, cacheByOptions);
  }
  let possibilities = cacheByOptions.get(options);
  if (!possibilities) {
    possibilities = [];
    cacheByOptions.set(options, possibilities);
  }
  if (!possibilities.includes(desc)) {
    const matches = possibilities.filter(possibility => isEqualDescriptor(possibility, desc));
    if (matches.length > 0) {
      return matches[0];
    }
    possibilities.push(desc);
  }
  return desc;
}
function* createPresetDescriptors(items, dirname, alias, passPerPreset) {
  return yield* createDescriptors("preset", items, dirname, alias, passPerPreset);
}
function* createPluginDescriptors(items, dirname, alias) {
  return yield* createDescriptors("plugin", items, dirname, alias);
}
function* createDescriptors(type, items, dirname, alias, ownPass) {
  const descriptors = yield* _gensync().all(items.map((item, index) => createDescriptor(item, dirname, {
    type,
    alias: `${alias}$${index}`,
    ownPass: !!ownPass
  })));
  assertNoDuplicates(descriptors);
  return descriptors;
}
function* createDescriptor(pair, dirname, {
  type,
  alias,
  ownPass
}) {
  const desc = (0, _item.getItemDescriptor)(pair);
  if (desc) {
    return desc;
  }
  let name;
  let options;
  let value = pair;
  if (Array.isArray(value)) {
    if (value.length === 3) {
      [value, options, name] = value;
    } else {
      [value, options] = value;
    }
  }
  let file = undefined;
  let filepath = null;
  if (typeof value === "string") {
    if (typeof type !== "string") {
      throw new Error("To resolve a string-based item, the type of item must be given");
    }
    const resolver = type === "plugin" ? _index.loadPlugin : _index.loadPreset;
    const request = value;
    ({
      filepath,
      value
    } = yield* resolver(value, dirname));
    file = {
      request,
      resolved: filepath
    };
  }
  if (!value) {
    throw new Error(`Unexpected falsy value: ${String(value)}`);
  }
  if (typeof value === "object" && value.__esModule) {
    if (value.default) {
      value = value.default;
    } else {
      throw new Error("Must export a default export when using ES6 modules.");
    }
  }
  if (typeof value !== "object" && typeof value !== "function") {
    throw new Error(`Unsupported format: ${typeof value}. Expected an object or a function.`);
  }
  if (filepath !== null && typeof value === "object" && value) {
    throw new Error(`Plugin/Preset files are not allowed to export objects, only functions. In ${filepath}`);
  }
  return {
    name,
    alias: filepath || alias,
    value,
    options,
    dirname,
    ownPass,
    file
  };
}
function assertNoDuplicates(items) {
  const map = new Map();
  for (const item of items) {
    if (typeof item.value !== "function") continue;
    let nameMap = map.get(item.value);
    if (!nameMap) {
      nameMap = new Set();
      map.set(item.value, nameMap);
    }
    if (nameMap.has(item.name)) {
      const conflicts = items.filter(i => i.value === item.value);
      throw new Error([`Duplicate plugin/preset detected.`, `If you'd like to use two separate instances of a plugin,`, `they need separate names, e.g.`, ``, `  plugins: [`, `    ['some-plugin', {}],`, `    ['some-plugin', {}, 'some unique name'],`, `  ]`, ``, `Duplicates detected are:`, `${JSON.stringify(conflicts, null, 2)}`].join("\n"));
    }
    nameMap.add(item.name);
  }
}
0 && 0;

//# sourceMappingURL=config-descriptors.js.map
1
web/node_modules/@babel/core/lib/config/config-descriptors.js.map
generated
vendored
Normal file
File diff suppressed because one or more lines are too long
290
web/node_modules/@babel/core/lib/config/files/configuration.js
generated
vendored
Normal file
@@ -0,0 +1,290 @@
"use strict";

Object.defineProperty(exports, "__esModule", {
  value: true
});
exports.ROOT_CONFIG_FILENAMES = void 0;
exports.findConfigUpwards = findConfigUpwards;
exports.findRelativeConfig = findRelativeConfig;
exports.findRootConfig = findRootConfig;
exports.loadConfig = loadConfig;
exports.resolveShowConfigPath = resolveShowConfigPath;
function _debug() {
  const data = require("debug");
  _debug = function () {
    return data;
  };
  return data;
}
function _fs() {
  const data = require("fs");
  _fs = function () {
    return data;
  };
  return data;
}
function _path() {
  const data = require("path");
  _path = function () {
    return data;
  };
  return data;
}
function _json() {
  const data = require("json5");
  _json = function () {
    return data;
  };
  return data;
}
function _gensync() {
  const data = require("gensync");
  _gensync = function () {
    return data;
  };
  return data;
}
var _caching = require("../caching.js");
var _configApi = require("../helpers/config-api.js");
var _utils = require("./utils.js");
var _moduleTypes = require("./module-types.js");
var _patternToRegex = require("../pattern-to-regex.js");
var _configError = require("../../errors/config-error.js");
var fs = require("../../gensync-utils/fs.js");
require("module");
var _rewriteStackTrace = require("../../errors/rewrite-stack-trace.js");
var _async = require("../../gensync-utils/async.js");
const debug = _debug()("babel:config:loading:files:configuration");
const ROOT_CONFIG_FILENAMES = exports.ROOT_CONFIG_FILENAMES = ["babel.config.js", "babel.config.cjs", "babel.config.mjs", "babel.config.json", "babel.config.cts", "babel.config.ts", "babel.config.mts"];
const RELATIVE_CONFIG_FILENAMES = [".babelrc", ".babelrc.js", ".babelrc.cjs", ".babelrc.mjs", ".babelrc.json", ".babelrc.cts"];
const BABELIGNORE_FILENAME = ".babelignore";
const runConfig = (0, _caching.makeWeakCache)(function* runConfig(options, cache) {
  yield* [];
  return {
    options: (0, _rewriteStackTrace.endHiddenCallStack)(options)((0, _configApi.makeConfigAPI)(cache)),
    cacheNeedsConfiguration: !cache.configured()
  };
});
function* readConfigCode(filepath, data) {
  if (!_fs().existsSync(filepath)) return null;
  let options = yield* (0, _moduleTypes.default)(filepath, (yield* (0, _async.isAsync)()) ? "auto" : "require", "You appear to be using a native ECMAScript module configuration " + "file, which is only supported when running Babel asynchronously " + "or when using the Node.js `--experimental-require-module` flag.", "You appear to be using a configuration file that contains top-level " + "await, which is only supported when running Babel asynchronously.");
  let cacheNeedsConfiguration = false;
  if (typeof options === "function") {
    ({
      options,
      cacheNeedsConfiguration
    } = yield* runConfig(options, data));
  }
  if (!options || typeof options !== "object" || Array.isArray(options)) {
    throw new _configError.default(`Configuration should be an exported JavaScript object.`, filepath);
  }
  if (typeof options.then === "function") {
    options.catch == null || options.catch(() => {});
    throw new _configError.default(`You appear to be using an async configuration, ` + `which your current version of Babel does not support. ` + `We may add support for this in the future, ` + `but if you're on the most recent version of @babel/core and still ` + `seeing this error, then you'll need to synchronously return your config.`, filepath);
  }
  if (cacheNeedsConfiguration) throwConfigError(filepath);
  return buildConfigFileObject(options, filepath);
}
const cfboaf = new WeakMap();
function buildConfigFileObject(options, filepath) {
  let configFilesByFilepath = cfboaf.get(options);
  if (!configFilesByFilepath) {
    cfboaf.set(options, configFilesByFilepath = new Map());
  }
  let configFile = configFilesByFilepath.get(filepath);
  if (!configFile) {
    configFile = {
      filepath,
      dirname: _path().dirname(filepath),
      options
    };
    configFilesByFilepath.set(filepath, configFile);
  }
  return configFile;
}
const packageToBabelConfig = (0, _caching.makeWeakCacheSync)(file => {
  const babel = file.options.babel;
  if (babel === undefined) return null;
  if (typeof babel !== "object" || Array.isArray(babel) || babel === null) {
    throw new _configError.default(`.babel property must be an object`, file.filepath);
  }
  return {
    filepath: file.filepath,
    dirname: file.dirname,
    options: babel
  };
});
const readConfigJSON5 = (0, _utils.makeStaticFileCache)((filepath, content) => {
  let options;
  try {
    options = _json().parse(content);
  } catch (err) {
    throw new _configError.default(`Error while parsing config - ${err.message}`, filepath);
  }
  if (!options) throw new _configError.default(`No config detected`, filepath);
  if (typeof options !== "object") {
    throw new _configError.default(`Config returned typeof ${typeof options}`, filepath);
  }
  if (Array.isArray(options)) {
    throw new _configError.default(`Expected config object but found array`, filepath);
  }
  delete options.$schema;
  return {
    filepath,
    dirname: _path().dirname(filepath),
    options
  };
});
const readIgnoreConfig = (0, _utils.makeStaticFileCache)((filepath, content) => {
  const ignoreDir = _path().dirname(filepath);
  const ignorePatterns = content.split("\n").map(line => line.replace(/#.*$/, "").trim()).filter(Boolean);
  for (const pattern of ignorePatterns) {
    if (pattern[0] === "!") {
      throw new _configError.default(`Negation of file paths is not supported.`, filepath);
    }
  }
  return {
    filepath,
    dirname: _path().dirname(filepath),
    ignore: ignorePatterns.map(pattern => (0, _patternToRegex.default)(pattern, ignoreDir))
  };
});
function findConfigUpwards(rootDir) {
  let dirname = rootDir;
  for (;;) {
    for (const filename of ROOT_CONFIG_FILENAMES) {
      if (_fs().existsSync(_path().join(dirname, filename))) {
        return dirname;
      }
    }
    const nextDir = _path().dirname(dirname);
    if (dirname === nextDir) break;
    dirname = nextDir;
  }
  return null;
}
function* findRelativeConfig(packageData, envName, caller) {
  let config = null;
  let ignore = null;
  const dirname = _path().dirname(packageData.filepath);
  for (const loc of packageData.directories) {
    if (!config) {
      var _packageData$pkg;
      config = yield* loadOneConfig(RELATIVE_CONFIG_FILENAMES, loc, envName, caller, ((_packageData$pkg = packageData.pkg) == null ? void 0 : _packageData$pkg.dirname) === loc ? packageToBabelConfig(packageData.pkg) : null);
    }
    if (!ignore) {
      const ignoreLoc = _path().join(loc, BABELIGNORE_FILENAME);
      ignore = yield* readIgnoreConfig(ignoreLoc);
      if (ignore) {
        debug("Found ignore %o from %o.", ignore.filepath, dirname);
      }
    }
  }
  return {
    config,
    ignore
  };
}
function findRootConfig(dirname, envName, caller) {
  return loadOneConfig(ROOT_CONFIG_FILENAMES, dirname, envName, caller);
}
function* loadOneConfig(names, dirname, envName, caller, previousConfig = null) {
  const configs = yield* _gensync().all(names.map(filename => readConfig(_path().join(dirname, filename), envName, caller)));
  const config = configs.reduce((previousConfig, config) => {
    if (config && previousConfig) {
      throw new _configError.default(`Multiple configuration files found. Please remove one:\n` + ` - ${_path().basename(previousConfig.filepath)}\n` + ` - ${config.filepath}\n` + `from ${dirname}`);
    }
    return config || previousConfig;
  }, previousConfig);
  if (config) {
    debug("Found configuration %o from %o.", config.filepath, dirname);
  }
  return config;
}
function* loadConfig(name, dirname, envName, caller) {
  const filepath = (((v, w) => (v = v.split("."), w = w.split("."), +v[0] > +w[0] || v[0] == w[0] && +v[1] >= +w[1]))(process.versions.node, "8.9") ? require.resolve : (r, {
    paths: [b]
  }, M = require("module")) => {
    let f = M._findPath(r, M._nodeModulePaths(b).concat(b));
    if (f) return f;
    f = new Error(`Cannot resolve module '${r}'`);
    f.code = "MODULE_NOT_FOUND";
    throw f;
  })(name, {
    paths: [dirname]
  });
  const conf = yield* readConfig(filepath, envName, caller);
  if (!conf) {
    throw new _configError.default(`Config file contains no configuration data`, filepath);
  }
  debug("Loaded config %o from %o.", name, dirname);
  return conf;
}
function readConfig(filepath, envName, caller) {
  const ext = _path().extname(filepath);
  switch (ext) {
    case ".js":
    case ".cjs":
    case ".mjs":
    case ".ts":
    case ".cts":
    case ".mts":
      return readConfigCode(filepath, {
        envName,
        caller
      });
    default:
      return readConfigJSON5(filepath);
  }
}
function* resolveShowConfigPath(dirname) {
  const targetPath = process.env.BABEL_SHOW_CONFIG_FOR;
  if (targetPath != null) {
    const absolutePath = _path().resolve(dirname, targetPath);
    const stats = yield* fs.stat(absolutePath);
    if (!stats.isFile()) {
      throw new Error(`${absolutePath}: BABEL_SHOW_CONFIG_FOR must refer to a regular file, directories are not supported.`);
    }
    return absolutePath;
  }
  return null;
}
function throwConfigError(filepath) {
  throw new _configError.default(`\
Caching was left unconfigured. Babel's plugins, presets, and .babelrc.js files can be configured
for various types of caching, using the first param of their handler functions:

module.exports = function(api) {
  // The API exposes the following:

  // Cache the returned value forever and don't call this function again.
  api.cache(true);

  // Don't cache at all. Not recommended because it will be very slow.
  api.cache(false);

  // Cached based on the value of some function. If this function returns a value different from
  // a previously-encountered value, the plugins will re-evaluate.
  var env = api.cache(() => process.env.NODE_ENV);

  // If testing for a specific env, we recommend specifics to avoid instantiating a plugin for
  // any possible NODE_ENV value that might come up during plugin execution.
  var isProd = api.cache(() => process.env.NODE_ENV === "production");

  // .cache(fn) will perform a linear search though instances to find the matching plugin based
  // based on previous instantiated plugins. If you want to recreate the plugin and discard the
  // previous instance whenever something changes, you may use:
  var isProd = api.cache.invalidate(() => process.env.NODE_ENV === "production");

  // Note, we also expose the following more-verbose versions of the above examples:
  api.cache.forever(); // api.cache(true)
  api.cache.never();   // api.cache(false)
  api.cache.using(fn); // api.cache(fn)

  // Return the value that will be cached.
  return { };
};`, filepath);
}
0 && 0;

//# sourceMappingURL=configuration.js.map
1
web/node_modules/@babel/core/lib/config/files/configuration.js.map
generated
vendored
Normal file
File diff suppressed because one or more lines are too long
6
web/node_modules/@babel/core/lib/config/files/import.cjs
generated
vendored
Normal file
@@ -0,0 +1,6 @@
module.exports = function import_(filepath) {
  return import(filepath);
};
0 && 0;

//# sourceMappingURL=import.cjs.map
1
web/node_modules/@babel/core/lib/config/files/import.cjs.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"names":["module","exports","import_","filepath"],"sources":["../../../src/config/files/import.cjs"],"sourcesContent":["// We keep this in a separate file so that in older node versions, where\n// import() isn't supported, we can try/catch around the require() call\n// when loading this file.\n\nmodule.exports = function import_(filepath) {\n return import(filepath);\n};\n"],"mappings":"AAIAA,MAAM,CAACC,OAAO,GAAG,SAASC,OAAOA,CAACC,QAAQ,EAAE;EAC1C,OAAO,OAAOA,QAAQ,CAAC;AACzB,CAAC;AAAC","ignoreList":[]}
58
web/node_modules/@babel/core/lib/config/files/index-browser.js
generated
vendored
Normal file
@@ -0,0 +1,58 @@
"use strict";

Object.defineProperty(exports, "__esModule", {
  value: true
});
exports.ROOT_CONFIG_FILENAMES = void 0;
exports.findConfigUpwards = findConfigUpwards;
exports.findPackageData = findPackageData;
exports.findRelativeConfig = findRelativeConfig;
exports.findRootConfig = findRootConfig;
exports.loadConfig = loadConfig;
exports.loadPlugin = loadPlugin;
exports.loadPreset = loadPreset;
exports.resolvePlugin = resolvePlugin;
exports.resolvePreset = resolvePreset;
exports.resolveShowConfigPath = resolveShowConfigPath;
function findConfigUpwards(rootDir) {
  return null;
}
function* findPackageData(filepath) {
  return {
    filepath,
    directories: [],
    pkg: null,
    isPackage: false
  };
}
function* findRelativeConfig(pkgData, envName, caller) {
  return {
    config: null,
    ignore: null
  };
}
function* findRootConfig(dirname, envName, caller) {
  return null;
}
function* loadConfig(name, dirname, envName, caller) {
  throw new Error(`Cannot load ${name} relative to ${dirname} in a browser`);
}
function* resolveShowConfigPath(dirname) {
  return null;
}
const ROOT_CONFIG_FILENAMES = exports.ROOT_CONFIG_FILENAMES = [];
function resolvePlugin(name, dirname) {
  return null;
}
function resolvePreset(name, dirname) {
  return null;
}
function loadPlugin(name, dirname) {
  throw new Error(`Cannot load plugin ${name} relative to ${dirname} in a browser`);
}
function loadPreset(name, dirname) {
  throw new Error(`Cannot load preset ${name} relative to ${dirname} in a browser`);
}
0 && 0;

//# sourceMappingURL=index-browser.js.map
1
web/node_modules/@babel/core/lib/config/files/index-browser.js.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"names":["findConfigUpwards","rootDir","findPackageData","filepath","directories","pkg","isPackage","findRelativeConfig","pkgData","envName","caller","config","ignore","findRootConfig","dirname","loadConfig","name","Error","resolveShowConfigPath","ROOT_CONFIG_FILENAMES","exports","resolvePlugin","resolvePreset","loadPlugin","loadPreset"],"sources":["../../../src/config/files/index-browser.ts"],"sourcesContent":["/* c8 ignore start */\n\nimport type { Handler } from \"gensync\";\n\nimport type {\n ConfigFile,\n IgnoreFile,\n RelativeConfig,\n FilePackageData,\n} from \"./types.ts\";\n\nimport type { CallerMetadata } from \"../validation/options.ts\";\n\nexport type { ConfigFile, IgnoreFile, RelativeConfig, FilePackageData };\n\nexport function findConfigUpwards(\n // eslint-disable-next-line @typescript-eslint/no-unused-vars\n rootDir: string,\n): string | null {\n return null;\n}\n\n// eslint-disable-next-line require-yield\nexport function* findPackageData(filepath: string): Handler<FilePackageData> {\n return {\n filepath,\n directories: [],\n pkg: null,\n isPackage: false,\n };\n}\n\n// eslint-disable-next-line require-yield\nexport function* findRelativeConfig(\n // eslint-disable-next-line @typescript-eslint/no-unused-vars\n pkgData: FilePackageData,\n // eslint-disable-next-line @typescript-eslint/no-unused-vars\n envName: string,\n // eslint-disable-next-line @typescript-eslint/no-unused-vars\n caller: CallerMetadata | undefined,\n): Handler<RelativeConfig> {\n return { config: null, ignore: null };\n}\n\n// eslint-disable-next-line require-yield\nexport function* findRootConfig(\n // eslint-disable-next-line @typescript-eslint/no-unused-vars\n dirname: string,\n // eslint-disable-next-line @typescript-eslint/no-unused-vars\n envName: string,\n // eslint-disable-next-line @typescript-eslint/no-unused-vars\n caller: CallerMetadata | undefined,\n): Handler<ConfigFile | null> {\n return null;\n}\n\n// eslint-disable-next-line require-yield\nexport function* loadConfig(\n name: string,\n dirname: string,\n // eslint-disable-next-line @typescript-eslint/no-unused-vars\n envName: string,\n // eslint-disable-next-line @typescript-eslint/no-unused-vars\n caller: CallerMetadata | undefined,\n): Handler<ConfigFile> {\n throw new Error(`Cannot load ${name} relative to ${dirname} in a browser`);\n}\n\n// eslint-disable-next-line require-yield\nexport function* resolveShowConfigPath(\n // eslint-disable-next-line @typescript-eslint/no-unused-vars\n dirname: string,\n): Handler<string | null> {\n return null;\n}\n\nexport const ROOT_CONFIG_FILENAMES: string[] = [];\n\ntype Resolved =\n | { loader: \"require\"; filepath: string }\n | { loader: \"import\"; filepath: string };\n\n// eslint-disable-next-line @typescript-eslint/no-unused-vars\nexport function resolvePlugin(name: string, dirname: string): Resolved | null {\n return null;\n}\n\n// eslint-disable-next-line @typescript-eslint/no-unused-vars\nexport function resolvePreset(name: string, dirname: string): Resolved | null {\n return null;\n}\n\nexport function loadPlugin(\n name: string,\n dirname: string,\n): Handler<{\n filepath: string;\n value: unknown;\n}> {\n throw new Error(\n `Cannot load plugin ${name} relative to ${dirname} in a browser`,\n );\n}\n\nexport function loadPreset(\n name: string,\n dirname: string,\n): Handler<{\n filepath: string;\n value: unknown;\n}> {\n throw new Error(\n `Cannot load preset ${name} relative to ${dirname} in a browser`,\n 
);\n}\n"],"mappings":";;;;;;;;;;;;;;;;AAeO,SAASA,iBAAiBA,CAE/BC,OAAe,EACA;EACf,OAAO,IAAI;AACb;AAGO,UAAUC,eAAeA,CAACC,QAAgB,EAA4B;EAC3E,OAAO;IACLA,QAAQ;IACRC,WAAW,EAAE,EAAE;IACfC,GAAG,EAAE,IAAI;IACTC,SAAS,EAAE;EACb,CAAC;AACH;AAGO,UAAUC,kBAAkBA,CAEjCC,OAAwB,EAExBC,OAAe,EAEfC,MAAkC,EACT;EACzB,OAAO;IAAEC,MAAM,EAAE,IAAI;IAAEC,MAAM,EAAE;EAAK,CAAC;AACvC;AAGO,UAAUC,cAAcA,CAE7BC,OAAe,EAEfL,OAAe,EAEfC,MAAkC,EACN;EAC5B,OAAO,IAAI;AACb;AAGO,UAAUK,UAAUA,CACzBC,IAAY,EACZF,OAAe,EAEfL,OAAe,EAEfC,MAAkC,EACb;EACrB,MAAM,IAAIO,KAAK,CAAC,eAAeD,IAAI,gBAAgBF,OAAO,eAAe,CAAC;AAC5E;AAGO,UAAUI,qBAAqBA,CAEpCJ,OAAe,EACS;EACxB,OAAO,IAAI;AACb;AAEO,MAAMK,qBAA+B,GAAAC,OAAA,CAAAD,qBAAA,GAAG,EAAE;AAO1C,SAASE,aAAaA,CAACL,IAAY,EAAEF,OAAe,EAAmB;EAC5E,OAAO,IAAI;AACb;AAGO,SAASQ,aAAaA,CAACN,IAAY,EAAEF,OAAe,EAAmB;EAC5E,OAAO,IAAI;AACb;AAEO,SAASS,UAAUA,CACxBP,IAAY,EACZF,OAAe,EAId;EACD,MAAM,IAAIG,KAAK,CACb,sBAAsBD,IAAI,gBAAgBF,OAAO,eACnD,CAAC;AACH;AAEO,SAASU,UAAUA,CACxBR,IAAY,EACZF,OAAe,EAId;EACD,MAAM,IAAIG,KAAK,CACb,sBAAsBD,IAAI,gBAAgBF,OAAO,eACnD,CAAC;AACH;AAAC","ignoreList":[]}
78
web/node_modules/@babel/core/lib/config/files/index.js
generated
vendored
Normal file
@@ -0,0 +1,78 @@
"use strict";

Object.defineProperty(exports, "__esModule", {
  value: true
});
Object.defineProperty(exports, "ROOT_CONFIG_FILENAMES", {
  enumerable: true,
  get: function () {
    return _configuration.ROOT_CONFIG_FILENAMES;
  }
});
Object.defineProperty(exports, "findConfigUpwards", {
  enumerable: true,
  get: function () {
    return _configuration.findConfigUpwards;
  }
});
Object.defineProperty(exports, "findPackageData", {
  enumerable: true,
  get: function () {
    return _package.findPackageData;
  }
});
Object.defineProperty(exports, "findRelativeConfig", {
  enumerable: true,
  get: function () {
    return _configuration.findRelativeConfig;
  }
});
Object.defineProperty(exports, "findRootConfig", {
  enumerable: true,
  get: function () {
    return _configuration.findRootConfig;
  }
});
Object.defineProperty(exports, "loadConfig", {
  enumerable: true,
  get: function () {
    return _configuration.loadConfig;
  }
});
Object.defineProperty(exports, "loadPlugin", {
  enumerable: true,
  get: function () {
    return _plugins.loadPlugin;
  }
});
Object.defineProperty(exports, "loadPreset", {
  enumerable: true,
  get: function () {
    return _plugins.loadPreset;
  }
});
Object.defineProperty(exports, "resolvePlugin", {
  enumerable: true,
  get: function () {
    return _plugins.resolvePlugin;
  }
});
Object.defineProperty(exports, "resolvePreset", {
  enumerable: true,
  get: function () {
    return _plugins.resolvePreset;
  }
});
Object.defineProperty(exports, "resolveShowConfigPath", {
  enumerable: true,
  get: function () {
    return _configuration.resolveShowConfigPath;
  }
});
var _package = require("./package.js");
var _configuration = require("./configuration.js");
var _plugins = require("./plugins.js");
({});
0 && 0;

//# sourceMappingURL=index.js.map
1
web/node_modules/@babel/core/lib/config/files/index.js.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"names":["_package","require","_configuration","_plugins"],"sources":["../../../src/config/files/index.ts"],"sourcesContent":["type indexBrowserType = typeof import(\"./index-browser\");\ntype indexType = typeof import(\"./index\");\n\n// Kind of gross, but essentially asserting that the exports of this module are the same as the\n// exports of index-browser, since this file may be replaced at bundle time with index-browser.\n({}) as any as indexBrowserType as indexType;\n\nexport { findPackageData } from \"./package.ts\";\n\nexport {\n findConfigUpwards,\n findRelativeConfig,\n findRootConfig,\n loadConfig,\n resolveShowConfigPath,\n ROOT_CONFIG_FILENAMES,\n} from \"./configuration.ts\";\nexport type {\n ConfigFile,\n IgnoreFile,\n RelativeConfig,\n FilePackageData,\n} from \"./types.ts\";\nexport {\n loadPlugin,\n loadPreset,\n resolvePlugin,\n resolvePreset,\n} from \"./plugins.ts\";\n"],"mappings":";;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;AAOA,IAAAA,QAAA,GAAAC,OAAA;AAEA,IAAAC,cAAA,GAAAD,OAAA;AAcA,IAAAE,QAAA,GAAAF,OAAA;AAlBA,CAAC,CAAC,CAAC;AAA0C","ignoreList":[]}
211
web/node_modules/@babel/core/lib/config/files/module-types.js
generated
vendored
Normal file
@@ -0,0 +1,211 @@
"use strict";

Object.defineProperty(exports, "__esModule", {
  value: true
});
exports.default = loadCodeDefault;
exports.supportsESM = void 0;
var _async = require("../../gensync-utils/async.js");
function _path() {
  const data = require("path");
  _path = function () {
    return data;
  };
  return data;
}
function _url() {
  const data = require("url");
  _url = function () {
    return data;
  };
  return data;
}
require("module");
function _semver() {
  const data = require("semver");
  _semver = function () {
    return data;
  };
  return data;
}
function _debug() {
  const data = require("debug");
  _debug = function () {
    return data;
  };
  return data;
}
var _rewriteStackTrace = require("../../errors/rewrite-stack-trace.js");
var _configError = require("../../errors/config-error.js");
var _transformFile = require("../../transform-file.js");
function asyncGeneratorStep(n, t, e, r, o, a, c) { try { var i = n[a](c), u = i.value; } catch (n) { return void e(n); } i.done ? t(u) : Promise.resolve(u).then(r, o); }
function _asyncToGenerator(n) { return function () { var t = this, e = arguments; return new Promise(function (r, o) { var a = n.apply(t, e); function _next(n) { asyncGeneratorStep(a, r, o, _next, _throw, "next", n); } function _throw(n) { asyncGeneratorStep(a, r, o, _next, _throw, "throw", n); } _next(void 0); }); }; }
const debug = _debug()("babel:config:loading:files:module-types");
{
  try {
    var import_ = require("./import.cjs");
  } catch (_unused) {}
}
const supportsESM = exports.supportsESM = _semver().satisfies(process.versions.node, "^12.17 || >=13.2");
const LOADING_CJS_FILES = new Set();
function loadCjsDefault(filepath) {
  if (LOADING_CJS_FILES.has(filepath)) {
    debug("Auto-ignoring usage of config %o.", filepath);
    return {};
  }
  let module;
  try {
    LOADING_CJS_FILES.add(filepath);
    module = (0, _rewriteStackTrace.endHiddenCallStack)(require)(filepath);
  } finally {
    LOADING_CJS_FILES.delete(filepath);
  }
  {
    return module != null && (module.__esModule || module[Symbol.toStringTag] === "Module") ? module.default || (arguments[1] ? module : undefined) : module;
  }
}
const loadMjsFromPath = (0, _rewriteStackTrace.endHiddenCallStack)(function () {
  var _loadMjsFromPath = _asyncToGenerator(function* (filepath) {
    const url = (0, _url().pathToFileURL)(filepath).toString() + "?import";
    {
      if (!import_) {
        throw new _configError.default("Internal error: Native ECMAScript modules aren't supported by this platform.\n", filepath);
      }
      return yield import_(url);
    }
  });
  function loadMjsFromPath(_x) {
    return _loadMjsFromPath.apply(this, arguments);
  }
  return loadMjsFromPath;
}());
const tsNotSupportedError = ext => `\
You are using a ${ext} config file, but Babel only supports transpiling .cts configs. Either:
- Use a .cts config file
- Update to Node.js 23.6.0, which has native TypeScript support
- Install tsx to transpile ${ext} files on the fly\
`;
const SUPPORTED_EXTENSIONS = {
  ".js": "unknown",
  ".mjs": "esm",
  ".cjs": "cjs",
  ".ts": "unknown",
  ".mts": "esm",
  ".cts": "cjs"
};
const asyncModules = new Set();
function* loadCodeDefault(filepath, loader, esmError, tlaError) {
  let async;
  const ext = _path().extname(filepath);
  const isTS = ext === ".ts" || ext === ".cts" || ext === ".mts";
  const type = SUPPORTED_EXTENSIONS[hasOwnProperty.call(SUPPORTED_EXTENSIONS, ext) ? ext : ".js"];
  const pattern = `${loader} ${type}`;
  switch (pattern) {
    case "require cjs":
    case "auto cjs":
      if (isTS) {
        return ensureTsSupport(filepath, ext, () => loadCjsDefault(filepath));
      } else {
        return loadCjsDefault(filepath, arguments[2]);
      }
    case "auto unknown":
    case "require unknown":
    case "require esm":
      try {
        if (isTS) {
          return ensureTsSupport(filepath, ext, () => loadCjsDefault(filepath));
        } else {
          return loadCjsDefault(filepath, arguments[2]);
        }
      } catch (e) {
        if (e.code === "ERR_REQUIRE_ASYNC_MODULE" || e.code === "ERR_REQUIRE_CYCLE_MODULE" && asyncModules.has(filepath)) {
          asyncModules.add(filepath);
          if (!(async != null ? async : async = yield* (0, _async.isAsync)())) {
            throw new _configError.default(tlaError, filepath);
          }
        } else if (e.code === "ERR_REQUIRE_ESM" || type === "esm") {} else {
          throw e;
        }
      }
    case "auto esm":
      if (async != null ? async : async = yield* (0, _async.isAsync)()) {
        const promise = isTS ? ensureTsSupport(filepath, ext, () => loadMjsFromPath(filepath)) : loadMjsFromPath(filepath);
        return (yield* (0, _async.waitFor)(promise)).default;
      }
      if (isTS) {
        throw new _configError.default(tsNotSupportedError(ext), filepath);
      } else {
        throw new _configError.default(esmError, filepath);
      }
    default:
      throw new Error("Internal Babel error: unreachable code.");
  }
}
function ensureTsSupport(filepath, ext, callback) {
  if (process.features.typescript || require.extensions[".ts"] || require.extensions[".cts"] || require.extensions[".mts"]) {
    return callback();
  }
  if (ext !== ".cts") {
    throw new _configError.default(tsNotSupportedError(ext), filepath);
  }
  const opts = {
    babelrc: false,
    configFile: false,
    sourceType: "unambiguous",
    sourceMaps: "inline",
    sourceFileName: _path().basename(filepath),
    presets: [[getTSPreset(filepath), Object.assign({
      onlyRemoveTypeImports: true,
      optimizeConstEnums: true
    }, {
      allowDeclareFields: true
    })]]
  };
  let handler = function (m, filename) {
    if (handler && filename.endsWith(".cts")) {
      try {
        return m._compile((0, _transformFile.transformFileSync)(filename, Object.assign({}, opts, {
          filename
        })).code, filename);
      } catch (error) {
        const packageJson = require("@babel/preset-typescript/package.json");
        if (_semver().lt(packageJson.version, "7.21.4")) {
          console.error("`.cts` configuration file failed to load, please try to update `@babel/preset-typescript`.");
        }
        throw error;
      }
    }
    return require.extensions[".js"](m, filename);
  };
  require.extensions[ext] = handler;
  try {
    return callback();
  } finally {
    if (require.extensions[ext] === handler) delete require.extensions[ext];
    handler = undefined;
  }
}
function getTSPreset(filepath) {
  try {
    return require("@babel/preset-typescript");
  } catch (error) {
    if (error.code !== "MODULE_NOT_FOUND") throw error;
    let message = "You appear to be using a .cts file as Babel configuration, but the `@babel/preset-typescript` package was not found: please install it!";
    {
      if (process.versions.pnp) {
        message += `
If you are using Yarn Plug'n'Play, you may also need to add the following configuration to your .yarnrc.yml file:

packageExtensions:
\t"@babel/core@*":
\t\tpeerDependencies:
\t\t\t"@babel/preset-typescript": "*"
`;
      }
    }
    throw new _configError.default(message, filepath);
  }
}
0 && 0;

//# sourceMappingURL=module-types.js.map
1
web/node_modules/@babel/core/lib/config/files/module-types.js.map
generated
vendored
Normal file
File diff suppressed because one or more lines are too long
61
web/node_modules/@babel/core/lib/config/files/package.js
generated
vendored
Normal file
@@ -0,0 +1,61 @@
"use strict";

Object.defineProperty(exports, "__esModule", {
  value: true
});
exports.findPackageData = findPackageData;
function _path() {
  const data = require("path");
  _path = function () {
    return data;
  };
  return data;
}
var _utils = require("./utils.js");
var _configError = require("../../errors/config-error.js");
const PACKAGE_FILENAME = "package.json";
const readConfigPackage = (0, _utils.makeStaticFileCache)((filepath, content) => {
  let options;
  try {
    options = JSON.parse(content);
  } catch (err) {
    throw new _configError.default(`Error while parsing JSON - ${err.message}`, filepath);
  }
  if (!options) throw new Error(`${filepath}: No config detected`);
  if (typeof options !== "object") {
    throw new _configError.default(`Config returned typeof ${typeof options}`, filepath);
  }
  if (Array.isArray(options)) {
    throw new _configError.default(`Expected config object but found array`, filepath);
  }
  return {
    filepath,
    dirname: _path().dirname(filepath),
    options
  };
});
function* findPackageData(filepath) {
  let pkg = null;
  const directories = [];
  let isPackage = true;
  let dirname = _path().dirname(filepath);
  while (!pkg && _path().basename(dirname) !== "node_modules") {
    directories.push(dirname);
    pkg = yield* readConfigPackage(_path().join(dirname, PACKAGE_FILENAME));
    const nextLoc = _path().dirname(dirname);
    if (dirname === nextLoc) {
      isPackage = false;
      break;
    }
    dirname = nextLoc;
  }
  return {
    filepath,
    directories,
    pkg,
    isPackage
  };
}
0 && 0;

//# sourceMappingURL=package.js.map
1
web/node_modules/@babel/core/lib/config/files/package.js.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"names":["_path","data","require","_utils","_configError","PACKAGE_FILENAME","readConfigPackage","makeStaticFileCache","filepath","content","options","JSON","parse","err","ConfigError","message","Error","Array","isArray","dirname","path","findPackageData","pkg","directories","isPackage","basename","push","join","nextLoc"],"sources":["../../../src/config/files/package.ts"],"sourcesContent":["import path from \"node:path\";\nimport type { Handler } from \"gensync\";\nimport { makeStaticFileCache } from \"./utils.ts\";\n\nimport type { ConfigFile, FilePackageData } from \"./types.ts\";\n\nimport ConfigError from \"../../errors/config-error.ts\";\n\nconst PACKAGE_FILENAME = \"package.json\";\n\nconst readConfigPackage = makeStaticFileCache(\n (filepath, content): ConfigFile => {\n let options;\n try {\n options = JSON.parse(content) as unknown;\n } catch (err) {\n throw new ConfigError(\n `Error while parsing JSON - ${err.message}`,\n filepath,\n );\n }\n\n if (!options) throw new Error(`${filepath}: No config detected`);\n\n if (typeof options !== \"object\") {\n throw new ConfigError(\n `Config returned typeof ${typeof options}`,\n filepath,\n );\n }\n if (Array.isArray(options)) {\n throw new ConfigError(`Expected config object but found array`, filepath);\n }\n\n return {\n filepath,\n dirname: path.dirname(filepath),\n options,\n };\n },\n);\n\n/**\n * Find metadata about the package that this file is inside of. Resolution\n * of Babel's config requires general package information to decide when to\n * search for .babelrc files\n */\nexport function* findPackageData(filepath: string): Handler<FilePackageData> {\n let pkg = null;\n const directories = [];\n let isPackage = true;\n\n let dirname = path.dirname(filepath);\n while (!pkg && path.basename(dirname) !== \"node_modules\") {\n directories.push(dirname);\n\n pkg = yield* readConfigPackage(path.join(dirname, PACKAGE_FILENAME));\n\n const nextLoc = path.dirname(dirname);\n if (dirname === nextLoc) {\n isPackage = false;\n break;\n }\n dirname = nextLoc;\n }\n\n return { filepath, directories, pkg, isPackage 
};\n}\n"],"mappings":";;;;;;AAAA,SAAAA,MAAA;EAAA,MAAAC,IAAA,GAAAC,OAAA;EAAAF,KAAA,YAAAA,CAAA;IAAA,OAAAC,IAAA;EAAA;EAAA,OAAAA,IAAA;AAAA;AAEA,IAAAE,MAAA,GAAAD,OAAA;AAIA,IAAAE,YAAA,GAAAF,OAAA;AAEA,MAAMG,gBAAgB,GAAG,cAAc;AAEvC,MAAMC,iBAAiB,GAAG,IAAAC,0BAAmB,EAC3C,CAACC,QAAQ,EAAEC,OAAO,KAAiB;EACjC,IAAIC,OAAO;EACX,IAAI;IACFA,OAAO,GAAGC,IAAI,CAACC,KAAK,CAACH,OAAO,CAAY;EAC1C,CAAC,CAAC,OAAOI,GAAG,EAAE;IACZ,MAAM,IAAIC,oBAAW,CACnB,8BAA8BD,GAAG,CAACE,OAAO,EAAE,EAC3CP,QACF,CAAC;EACH;EAEA,IAAI,CAACE,OAAO,EAAE,MAAM,IAAIM,KAAK,CAAC,GAAGR,QAAQ,sBAAsB,CAAC;EAEhE,IAAI,OAAOE,OAAO,KAAK,QAAQ,EAAE;IAC/B,MAAM,IAAII,oBAAW,CACnB,0BAA0B,OAAOJ,OAAO,EAAE,EAC1CF,QACF,CAAC;EACH;EACA,IAAIS,KAAK,CAACC,OAAO,CAACR,OAAO,CAAC,EAAE;IAC1B,MAAM,IAAII,oBAAW,CAAC,wCAAwC,EAAEN,QAAQ,CAAC;EAC3E;EAEA,OAAO;IACLA,QAAQ;IACRW,OAAO,EAAEC,MAAGA,CAAC,CAACD,OAAO,CAACX,QAAQ,CAAC;IAC/BE;EACF,CAAC;AACH,CACF,CAAC;AAOM,UAAUW,eAAeA,CAACb,QAAgB,EAA4B;EAC3E,IAAIc,GAAG,GAAG,IAAI;EACd,MAAMC,WAAW,GAAG,EAAE;EACtB,IAAIC,SAAS,GAAG,IAAI;EAEpB,IAAIL,OAAO,GAAGC,MAAGA,CAAC,CAACD,OAAO,CAACX,QAAQ,CAAC;EACpC,OAAO,CAACc,GAAG,IAAIF,MAAGA,CAAC,CAACK,QAAQ,CAACN,OAAO,CAAC,KAAK,cAAc,EAAE;IACxDI,WAAW,CAACG,IAAI,CAACP,OAAO,CAAC;IAEzBG,GAAG,GAAG,OAAOhB,iBAAiB,CAACc,MAAGA,CAAC,CAACO,IAAI,CAACR,OAAO,EAAEd,gBAAgB,CAAC,CAAC;IAEpE,MAAMuB,OAAO,GAAGR,MAAGA,CAAC,CAACD,OAAO,CAACA,OAAO,CAAC;IACrC,IAAIA,OAAO,KAAKS,OAAO,EAAE;MACvBJ,SAAS,GAAG,KAAK;MACjB;IACF;IACAL,OAAO,GAAGS,OAAO;EACnB;EAEA,OAAO;IAAEpB,QAAQ;IAAEe,WAAW;IAAED,GAAG;IAAEE;EAAU,CAAC;AAClD;AAAC","ignoreList":[]}
230
web/node_modules/@babel/core/lib/config/files/plugins.js
generated
vendored
Normal file
@@ -0,0 +1,230 @@
"use strict";

Object.defineProperty(exports, "__esModule", {
  value: true
});
exports.loadPlugin = loadPlugin;
exports.loadPreset = loadPreset;
exports.resolvePreset = exports.resolvePlugin = void 0;
function _debug() {
  const data = require("debug");
  _debug = function () {
    return data;
  };
  return data;
}
function _path() {
  const data = require("path");
  _path = function () {
    return data;
  };
  return data;
}
var _async = require("../../gensync-utils/async.js");
var _moduleTypes = require("./module-types.js");
function _url() {
  const data = require("url");
  _url = function () {
    return data;
  };
  return data;
}
var _importMetaResolve = require("../../vendor/import-meta-resolve.js");
require("module");
function _fs() {
  const data = require("fs");
  _fs = function () {
    return data;
  };
  return data;
}
const debug = _debug()("babel:config:loading:files:plugins");
const EXACT_RE = /^module:/;
const BABEL_PLUGIN_PREFIX_RE = /^(?!@|module:|[^/]+\/|babel-plugin-)/;
const BABEL_PRESET_PREFIX_RE = /^(?!@|module:|[^/]+\/|babel-preset-)/;
const BABEL_PLUGIN_ORG_RE = /^(@babel\/)(?!plugin-|[^/]+\/)/;
const BABEL_PRESET_ORG_RE = /^(@babel\/)(?!preset-|[^/]+\/)/;
const OTHER_PLUGIN_ORG_RE = /^(@(?!babel\/)[^/]+\/)(?![^/]*babel-plugin(?:-|\/|$)|[^/]+\/)/;
const OTHER_PRESET_ORG_RE = /^(@(?!babel\/)[^/]+\/)(?![^/]*babel-preset(?:-|\/|$)|[^/]+\/)/;
const OTHER_ORG_DEFAULT_RE = /^(@(?!babel$)[^/]+)$/;
const resolvePlugin = exports.resolvePlugin = resolveStandardizedName.bind(null, "plugin");
const resolvePreset = exports.resolvePreset = resolveStandardizedName.bind(null, "preset");
function* loadPlugin(name, dirname) {
  const {
    filepath,
    loader
  } = resolvePlugin(name, dirname, yield* (0, _async.isAsync)());
  const value = yield* requireModule("plugin", loader, filepath);
  debug("Loaded plugin %o from %o.", name, dirname);
  return {
    filepath,
    value
  };
}
function* loadPreset(name, dirname) {
  const {
    filepath,
    loader
  } = resolvePreset(name, dirname, yield* (0, _async.isAsync)());
  const value = yield* requireModule("preset", loader, filepath);
  debug("Loaded preset %o from %o.", name, dirname);
  return {
    filepath,
    value
  };
}
function standardizeName(type, name) {
  if (_path().isAbsolute(name)) return name;
  const isPreset = type === "preset";
  return name.replace(isPreset ? BABEL_PRESET_PREFIX_RE : BABEL_PLUGIN_PREFIX_RE, `babel-${type}-`).replace(isPreset ? BABEL_PRESET_ORG_RE : BABEL_PLUGIN_ORG_RE, `$1${type}-`).replace(isPreset ? OTHER_PRESET_ORG_RE : OTHER_PLUGIN_ORG_RE, `$1babel-${type}-`).replace(OTHER_ORG_DEFAULT_RE, `$1/babel-${type}`).replace(EXACT_RE, "");
}
function* resolveAlternativesHelper(type, name) {
  const standardizedName = standardizeName(type, name);
  const {
    error,
    value
  } = yield standardizedName;
  if (!error) return value;
  if (error.code !== "MODULE_NOT_FOUND") throw error;
  if (standardizedName !== name && !(yield name).error) {
    error.message += `\n- If you want to resolve "${name}", use "module:${name}"`;
  }
  if (!(yield standardizeName(type, "@babel/" + name)).error) {
    error.message += `\n- Did you mean "@babel/${name}"?`;
  }
  const oppositeType = type === "preset" ? "plugin" : "preset";
  if (!(yield standardizeName(oppositeType, name)).error) {
    error.message += `\n- Did you accidentally pass a ${oppositeType} as a ${type}?`;
  }
  if (type === "plugin") {
    const transformName = standardizedName.replace("-proposal-", "-transform-");
    if (transformName !== standardizedName && !(yield transformName).error) {
      error.message += `\n- Did you mean "${transformName}"?`;
    }
  }
  error.message += `\n
Make sure that all the Babel plugins and presets you are using
are defined as dependencies or devDependencies in your package.json
file. It's possible that the missing plugin is loaded by a preset
you are using that forgot to add the plugin to its dependencies: you
can workaround this problem by explicitly adding the missing package
to your top-level package.json.
`;
  throw error;
}
function tryRequireResolve(id, dirname) {
  try {
    if (dirname) {
      return {
        error: null,
        value: (((v, w) => (v = v.split("."), w = w.split("."), +v[0] > +w[0] || v[0] == w[0] && +v[1] >= +w[1]))(process.versions.node, "8.9") ? require.resolve : (r, {
          paths: [b]
        }, M = require("module")) => {
          let f = M._findPath(r, M._nodeModulePaths(b).concat(b));
          if (f) return f;
          f = new Error(`Cannot resolve module '${r}'`);
          f.code = "MODULE_NOT_FOUND";
          throw f;
        })(id, {
          paths: [dirname]
        })
      };
    } else {
      return {
        error: null,
        value: require.resolve(id)
      };
    }
  } catch (error) {
    return {
      error,
      value: null
    };
  }
}
function tryImportMetaResolve(id, options) {
  try {
    return {
      error: null,
      value: (0, _importMetaResolve.resolve)(id, options)
    };
  } catch (error) {
    return {
      error,
      value: null
    };
  }
}
function resolveStandardizedNameForRequire(type, name, dirname) {
  const it = resolveAlternativesHelper(type, name);
  let res = it.next();
  while (!res.done) {
    res = it.next(tryRequireResolve(res.value, dirname));
  }
  return {
    loader: "require",
    filepath: res.value
  };
}
function resolveStandardizedNameForImport(type, name, dirname) {
  const parentUrl = (0, _url().pathToFileURL)(_path().join(dirname, "./babel-virtual-resolve-base.js")).href;
  const it = resolveAlternativesHelper(type, name);
  let res = it.next();
  while (!res.done) {
    res = it.next(tryImportMetaResolve(res.value, parentUrl));
  }
  return {
    loader: "auto",
    filepath: (0, _url().fileURLToPath)(res.value)
  };
}
function resolveStandardizedName(type, name, dirname, allowAsync) {
  if (!_moduleTypes.supportsESM || !allowAsync) {
    return resolveStandardizedNameForRequire(type, name, dirname);
  }
  try {
    const resolved = resolveStandardizedNameForImport(type, name, dirname);
    if (!(0, _fs().existsSync)(resolved.filepath)) {
      throw Object.assign(new Error(`Could not resolve "${name}" in file ${dirname}.`), {
        type: "MODULE_NOT_FOUND"
      });
    }
    return resolved;
  } catch (e) {
    try {
      return resolveStandardizedNameForRequire(type, name, dirname);
    } catch (e2) {
      if (e.type === "MODULE_NOT_FOUND") throw e;
      if (e2.type === "MODULE_NOT_FOUND") throw e2;
      throw e;
    }
  }
}
{
  var LOADING_MODULES = new Set();
}
function* requireModule(type, loader, name) {
  {
    if (!(yield* (0, _async.isAsync)()) && LOADING_MODULES.has(name)) {
      throw new Error(`Reentrant ${type} detected trying to load "${name}". This module is not ignored ` + "and is trying to load itself while compiling itself, leading to a dependency cycle. " + 'We recommend adding it to your "ignore" list in your babelrc, or to a .babelignore.');
    }
  }
  try {
    {
      LOADING_MODULES.add(name);
    }
    {
      return yield* (0, _moduleTypes.default)(name, loader, `You appear to be using a native ECMAScript module ${type}, ` + "which is only supported when running Babel asynchronously " + "or when using the Node.js `--experimental-require-module` flag.", `You appear to be using a ${type} that contains top-level await, ` + "which is only supported when running Babel asynchronously.", true);
    }
  } catch (err) {
    err.message = `[BABEL]: ${err.message} (While processing: ${name})`;
    throw err;
  } finally {
    {
      LOADING_MODULES.delete(name);
    }
  }
}
0 && 0;

//# sourceMappingURL=plugins.js.map
1
web/node_modules/@babel/core/lib/config/files/plugins.js.map
generated
vendored
Normal file
File diff suppressed because one or more lines are too long
5
web/node_modules/@babel/core/lib/config/files/types.js
generated
vendored
Normal file
@@ -0,0 +1,5 @@
"use strict";

0 && 0;

//# sourceMappingURL=types.js.map
1
web/node_modules/@babel/core/lib/config/files/types.js.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"names":[],"sources":["../../../src/config/files/types.ts"],"sourcesContent":["import type { InputOptions } from \"../index.ts\";\n\nexport type ConfigFile = {\n filepath: string;\n dirname: string;\n options: InputOptions & { babel?: unknown };\n};\n\nexport type IgnoreFile = {\n filepath: string;\n dirname: string;\n ignore: Array<RegExp>;\n};\n\nexport type RelativeConfig = {\n // The actual config, either from package.json#babel, .babelrc, or\n // .babelrc.js, if there was one.\n config: ConfigFile | null;\n // The .babelignore, if there was one.\n ignore: IgnoreFile | null;\n};\n\nexport type FilePackageData = {\n // The file in the package.\n filepath: string;\n // Any ancestor directories of the file that are within the package.\n directories: Array<string>;\n // The contents of the package.json. May not be found if the package just\n // terminated at a node_modules folder without finding one.\n pkg: ConfigFile | null;\n // True if a package.json or node_modules folder was found while traversing\n // the directory structure.\n isPackage: boolean;\n};\n"],"mappings":"","ignoreList":[]}
36
web/node_modules/@babel/core/lib/config/files/utils.js
generated
vendored
Normal file
@@ -0,0 +1,36 @@
"use strict";

Object.defineProperty(exports, "__esModule", {
  value: true
});
exports.makeStaticFileCache = makeStaticFileCache;
var _caching = require("../caching.js");
var fs = require("../../gensync-utils/fs.js");
function _fs2() {
  const data = require("fs");
  _fs2 = function () {
    return data;
  };
  return data;
}
function makeStaticFileCache(fn) {
  return (0, _caching.makeStrongCache)(function* (filepath, cache) {
    const cached = cache.invalidate(() => fileMtime(filepath));
    if (cached === null) {
      return null;
    }
    return fn(filepath, yield* fs.readFile(filepath, "utf8"));
  });
}
function fileMtime(filepath) {
  if (!_fs2().existsSync(filepath)) return null;
  try {
    return +_fs2().statSync(filepath).mtime;
  } catch (e) {
    if (e.code !== "ENOENT" && e.code !== "ENOTDIR") throw e;
  }
  return null;
}
0 && 0;

//# sourceMappingURL=utils.js.map
1
web/node_modules/@babel/core/lib/config/files/utils.js.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"names":["_caching","require","fs","_fs2","data","makeStaticFileCache","fn","makeStrongCache","filepath","cache","cached","invalidate","fileMtime","readFile","nodeFs","existsSync","statSync","mtime","e","code"],"sources":["../../../src/config/files/utils.ts"],"sourcesContent":["import type { Handler } from \"gensync\";\n\nimport { makeStrongCache } from \"../caching.ts\";\nimport type { CacheConfigurator } from \"../caching.ts\";\nimport * as fs from \"../../gensync-utils/fs.ts\";\nimport nodeFs from \"node:fs\";\n\nexport function makeStaticFileCache<T>(\n fn: (filepath: string, contents: string) => T,\n) {\n return makeStrongCache(function* (\n filepath: string,\n cache: CacheConfigurator<void>,\n ): Handler<null | T> {\n const cached = cache.invalidate(() => fileMtime(filepath));\n\n if (cached === null) {\n return null;\n }\n\n return fn(filepath, yield* fs.readFile(filepath, \"utf8\"));\n });\n}\n\nfunction fileMtime(filepath: string): number | null {\n if (!nodeFs.existsSync(filepath)) return null;\n\n try {\n return +nodeFs.statSync(filepath).mtime;\n } catch (e) {\n if (e.code !== \"ENOENT\" && e.code !== \"ENOTDIR\") throw e;\n }\n\n return null;\n}\n"],"mappings":";;;;;;AAEA,IAAAA,QAAA,GAAAC,OAAA;AAEA,IAAAC,EAAA,GAAAD,OAAA;AACA,SAAAE,KAAA;EAAA,MAAAC,IAAA,GAAAH,OAAA;EAAAE,IAAA,YAAAA,CAAA;IAAA,OAAAC,IAAA;EAAA;EAAA,OAAAA,IAAA;AAAA;AAEO,SAASC,mBAAmBA,CACjCC,EAA6C,EAC7C;EACA,OAAO,IAAAC,wBAAe,EAAC,WACrBC,QAAgB,EAChBC,KAA8B,EACX;IACnB,MAAMC,MAAM,GAAGD,KAAK,CAACE,UAAU,CAAC,MAAMC,SAAS,CAACJ,QAAQ,CAAC,CAAC;IAE1D,IAAIE,MAAM,KAAK,IAAI,EAAE;MACnB,OAAO,IAAI;IACb;IAEA,OAAOJ,EAAE,CAACE,QAAQ,EAAE,OAAON,EAAE,CAACW,QAAQ,CAACL,QAAQ,EAAE,MAAM,CAAC,CAAC;EAC3D,CAAC,CAAC;AACJ;AAEA,SAASI,SAASA,CAACJ,QAAgB,EAAiB;EAClD,IAAI,CAACM,KAAKA,CAAC,CAACC,UAAU,CAACP,QAAQ,CAAC,EAAE,OAAO,IAAI;EAE7C,IAAI;IACF,OAAO,CAACM,KAAKA,CAAC,CAACE,QAAQ,CAACR,QAAQ,CAAC,CAACS,KAAK;EACzC,CAAC,CAAC,OAAOC,CAAC,EAAE;IACV,IAAIA,CAAC,CAACC,IAAI,KAAK,QAAQ,IAAID,CAAC,CAACC,IAAI,KAAK,SAAS,EAAE,MAAMD,CAAC;EAC1D;EAEA,OAAO,IAAI;AACb;AAAC","ignoreList":[]}
312
web/node_modules/@babel/core/lib/config/full.js
generated
vendored
Normal file
@@ -0,0 +1,312 @@
"use strict";

Object.defineProperty(exports, "__esModule", {
  value: true
});
exports.default = void 0;
function _gensync() {
  const data = require("gensync");
  _gensync = function () {
    return data;
  };
  return data;
}
var _async = require("../gensync-utils/async.js");
var _util = require("./util.js");
var context = require("../index.js");
var _plugin = require("./plugin.js");
var _item = require("./item.js");
var _configChain = require("./config-chain.js");
var _deepArray = require("./helpers/deep-array.js");
function _traverse() {
  const data = require("@babel/traverse");
  _traverse = function () {
    return data;
  };
  return data;
}
var _caching = require("./caching.js");
var _options = require("./validation/options.js");
var _plugins = require("./validation/plugins.js");
var _configApi = require("./helpers/config-api.js");
var _partial = require("./partial.js");
var _configError = require("../errors/config-error.js");
var _default = exports.default = _gensync()(function* loadFullConfig(inputOpts) {
  var _opts$assumptions;
  const result = yield* (0, _partial.default)(inputOpts);
  if (!result) {
    return null;
  }
  const {
    options,
    context,
    fileHandling
  } = result;
  if (fileHandling === "ignored") {
    return null;
  }
  const optionDefaults = {};
  const {
    plugins,
    presets
  } = options;
  if (!plugins || !presets) {
    throw new Error("Assertion failure - plugins and presets exist");
  }
  const presetContext = Object.assign({}, context, {
    targets: options.targets
  });
  const toDescriptor = item => {
    const desc = (0, _item.getItemDescriptor)(item);
    if (!desc) {
      throw new Error("Assertion failure - must be config item");
    }
    return desc;
  };
  const presetsDescriptors = presets.map(toDescriptor);
  const initialPluginsDescriptors = plugins.map(toDescriptor);
  const pluginDescriptorsByPass = [[]];
  const passes = [];
  const externalDependencies = [];
  const ignored = yield* enhanceError(context, function* recursePresetDescriptors(rawPresets, pluginDescriptorsPass) {
    const presets = [];
    for (let i = 0; i < rawPresets.length; i++) {
      const descriptor = rawPresets[i];
      if (descriptor.options !== false) {
        try {
          var preset = yield* loadPresetDescriptor(descriptor, presetContext);
        } catch (e) {
          if (e.code === "BABEL_UNKNOWN_OPTION") {
            (0, _options.checkNoUnwrappedItemOptionPairs)(rawPresets, i, "preset", e);
          }
          throw e;
        }
        externalDependencies.push(preset.externalDependencies);
        if (descriptor.ownPass) {
          presets.push({
            preset: preset.chain,
            pass: []
          });
        } else {
          presets.unshift({
            preset: preset.chain,
            pass: pluginDescriptorsPass
          });
        }
      }
    }
    if (presets.length > 0) {
      pluginDescriptorsByPass.splice(1, 0, ...presets.map(o => o.pass).filter(p => p !== pluginDescriptorsPass));
      for (const {
        preset,
        pass
      } of presets) {
        if (!preset) return true;
        pass.push(...preset.plugins);
        const ignored = yield* recursePresetDescriptors(preset.presets, pass);
        if (ignored) return true;
        preset.options.forEach(opts => {
          (0, _util.mergeOptions)(optionDefaults, opts);
        });
      }
    }
  })(presetsDescriptors, pluginDescriptorsByPass[0]);
  if (ignored) return null;
  const opts = optionDefaults;
  (0, _util.mergeOptions)(opts, options);
  const pluginContext = Object.assign({}, presetContext, {
    assumptions: (_opts$assumptions = opts.assumptions) != null ? _opts$assumptions : {}
  });
  yield* enhanceError(context, function* loadPluginDescriptors() {
    pluginDescriptorsByPass[0].unshift(...initialPluginsDescriptors);
    for (const descs of pluginDescriptorsByPass) {
      const pass = [];
      passes.push(pass);
      for (let i = 0; i < descs.length; i++) {
        const descriptor = descs[i];
        if (descriptor.options !== false) {
          try {
            var plugin = yield* loadPluginDescriptor(descriptor, pluginContext);
          } catch (e) {
            if (e.code === "BABEL_UNKNOWN_PLUGIN_PROPERTY") {
              (0, _options.checkNoUnwrappedItemOptionPairs)(descs, i, "plugin", e);
            }
            throw e;
          }
          pass.push(plugin);
          externalDependencies.push(plugin.externalDependencies);
        }
      }
    }
  })();
  opts.plugins = passes[0];
  opts.presets = passes.slice(1).filter(plugins => plugins.length > 0).map(plugins => ({
    plugins
  }));
  opts.passPerPreset = opts.presets.length > 0;
  return {
    options: opts,
    passes: passes,
    externalDependencies: (0, _deepArray.finalize)(externalDependencies)
  };
});
function enhanceError(context, fn) {
  return function* (arg1, arg2) {
    try {
      return yield* fn(arg1, arg2);
    } catch (e) {
      if (!/^\[BABEL\]/.test(e.message)) {
        var _context$filename;
        e.message = `[BABEL] ${(_context$filename = context.filename) != null ? _context$filename : "unknown file"}: ${e.message}`;
      }
      throw e;
    }
  };
}
const makeDescriptorLoader = apiFactory => (0, _caching.makeWeakCache)(function* ({
  value,
  options,
  dirname,
  alias
}, cache) {
  if (options === false) throw new Error("Assertion failure");
  options = options || {};
  const externalDependencies = [];
  let item = value;
  if (typeof value === "function") {
    const factory = (0, _async.maybeAsync)(value, `You appear to be using an async plugin/preset, but Babel has been called synchronously`);
    const api = Object.assign({}, context, apiFactory(cache, externalDependencies));
    try {
      item = yield* factory(api, options, dirname);
    } catch (e) {
      if (alias) {
        e.message += ` (While processing: ${JSON.stringify(alias)})`;
      }
      throw e;
    }
  }
  if (!item || typeof item !== "object") {
    throw new Error("Plugin/Preset did not return an object.");
  }
  if ((0, _async.isThenable)(item)) {
    yield* [];
    throw new Error(`You appear to be using a promise as a plugin, ` + `which your current version of Babel does not support. ` + `If you're using a published plugin, ` + `you may need to upgrade your @babel/core version. ` + `As an alternative, you can prefix the promise with "await". ` + `(While processing: ${JSON.stringify(alias)})`);
  }
  if (externalDependencies.length > 0 && (!cache.configured() || cache.mode() === "forever")) {
    let error = `A plugin/preset has external untracked dependencies ` + `(${externalDependencies[0]}), but the cache `;
    if (!cache.configured()) {
      error += `has not been configured to be invalidated when the external dependencies change. `;
    } else {
      error += ` has been configured to never be invalidated. `;
    }
    error += `Plugins/presets should configure their cache to be invalidated when the external ` + `dependencies change, for example using \`api.cache.invalidate(() => ` + `statSync(filepath).mtimeMs)\` or \`api.cache.never()\`\n` + `(While processing: ${JSON.stringify(alias)})`;
    throw new Error(error);
  }
  return {
    value: item,
    options,
    dirname,
    alias,
    externalDependencies: (0, _deepArray.finalize)(externalDependencies)
  };
});
const pluginDescriptorLoader = makeDescriptorLoader(_configApi.makePluginAPI);
const presetDescriptorLoader = makeDescriptorLoader(_configApi.makePresetAPI);
const instantiatePlugin = (0, _caching.makeWeakCache)(function* ({
  value,
  options,
  dirname,
  alias,
  externalDependencies
}, cache) {
  const pluginObj = (0, _plugins.validatePluginObject)(value);
  const plugin = Object.assign({}, pluginObj);
  if (plugin.visitor) {
    plugin.visitor = _traverse().default.explode(Object.assign({}, plugin.visitor));
  }
  if (plugin.inherits) {
    const inheritsDescriptor = {
      name: undefined,
      alias: `${alias}$inherits`,
      value: plugin.inherits,
      options,
      dirname
    };
    const inherits = yield* (0, _async.forwardAsync)(loadPluginDescriptor, run => {
      return cache.invalidate(data => run(inheritsDescriptor, data));
    });
    plugin.pre = chainMaybeAsync(inherits.pre, plugin.pre);
    plugin.post = chainMaybeAsync(inherits.post, plugin.post);
    plugin.manipulateOptions = chainMaybeAsync(inherits.manipulateOptions, plugin.manipulateOptions);
    plugin.visitor = _traverse().default.visitors.merge([inherits.visitor || {}, plugin.visitor || {}]);
    if (inherits.externalDependencies.length > 0) {
      if (externalDependencies.length === 0) {
        externalDependencies = inherits.externalDependencies;
      } else {
        externalDependencies = (0, _deepArray.finalize)([externalDependencies, inherits.externalDependencies]);
      }
    }
  }
  return new _plugin.default(plugin, options, alias, externalDependencies);
});
function* loadPluginDescriptor(descriptor, context) {
  if (descriptor.value instanceof _plugin.default) {
    if (descriptor.options) {
      throw new Error("Passed options to an existing Plugin instance will not work.");
    }
    return descriptor.value;
  }
  return yield* instantiatePlugin(yield* pluginDescriptorLoader(descriptor, context), context);
}
const needsFilename = val => val && typeof val !== "function";
const validateIfOptionNeedsFilename = (options, descriptor) => {
  if (needsFilename(options.test) || needsFilename(options.include) || needsFilename(options.exclude)) {
    const formattedPresetName = descriptor.name ? `"${descriptor.name}"` : "/* your preset */";
    throw new _configError.default([`Preset ${formattedPresetName} requires a filename to be set when babel is called directly,`, `\`\`\``, `babel.transformSync(code, { filename: 'file.ts', presets: [${formattedPresetName}] });`, `\`\`\``, `See https://babeljs.io/docs/en/options#filename for more information.`].join("\n"));
  }
};
const validatePreset = (preset, context, descriptor) => {
  if (!context.filename) {
    var _options$overrides;
    const {
      options
    } = preset;
    validateIfOptionNeedsFilename(options, descriptor);
    (_options$overrides = options.overrides) == null || _options$overrides.forEach(overrideOptions => validateIfOptionNeedsFilename(overrideOptions, descriptor));
  }
};
const instantiatePreset = (0, _caching.makeWeakCacheSync)(({
  value,
  dirname,
  alias,
  externalDependencies
}) => {
  return {
    options: (0, _options.validate)("preset", value),
    alias,
    dirname,
    externalDependencies
  };
});
function* loadPresetDescriptor(descriptor, context) {
  const preset = instantiatePreset(yield* presetDescriptorLoader(descriptor, context));
  validatePreset(preset, context, descriptor);
  return {
    chain: yield* (0, _configChain.buildPresetChain)(preset, context),
    externalDependencies: preset.externalDependencies
  };
}
function chainMaybeAsync(a, b) {
  if (!a) return b;
  if (!b) return a;
  return function (...args) {
    const res = a.apply(this, args);
    if (res && typeof res.then === "function") {
      return res.then(() => b.apply(this, args));
    }
    return b.apply(this, args);
  };
}
0 && 0;

//# sourceMappingURL=full.js.map
1
web/node_modules/@babel/core/lib/config/full.js.map
generated
vendored
Normal file
File diff suppressed because one or more lines are too long
84
web/node_modules/@babel/core/lib/config/helpers/config-api.js
generated
vendored
Normal file
@@ -0,0 +1,84 @@
"use strict";

Object.defineProperty(exports, "__esModule", {
  value: true
});
exports.makeConfigAPI = makeConfigAPI;
exports.makePluginAPI = makePluginAPI;
exports.makePresetAPI = makePresetAPI;
function _semver() {
  const data = require("semver");
  _semver = function () {
    return data;
  };
  return data;
}
var _index = require("../../index.js");
var _caching = require("../caching.js");
function makeConfigAPI(cache) {
  const env = value => cache.using(data => {
    if (value === undefined) return data.envName;
    if (typeof value === "function") {
      return (0, _caching.assertSimpleType)(value(data.envName));
    }
    return (Array.isArray(value) ? value : [value]).some(entry => {
      if (typeof entry !== "string") {
        throw new Error("Unexpected non-string value");
      }
      return entry === data.envName;
    });
  });
  const caller = cb => cache.using(data => (0, _caching.assertSimpleType)(cb(data.caller)));
  return {
    version: _index.version,
    cache: cache.simple(),
    env,
    async: () => false,
    caller,
    assertVersion
  };
}
function makePresetAPI(cache, externalDependencies) {
  const targets = () => JSON.parse(cache.using(data => JSON.stringify(data.targets)));
  const addExternalDependency = ref => {
    externalDependencies.push(ref);
  };
  return Object.assign({}, makeConfigAPI(cache), {
    targets,
    addExternalDependency
  });
}
function makePluginAPI(cache, externalDependencies) {
  const assumption = name => cache.using(data => data.assumptions[name]);
  return Object.assign({}, makePresetAPI(cache, externalDependencies), {
    assumption
  });
}
function assertVersion(range) {
  if (typeof range === "number") {
    if (!Number.isInteger(range)) {
      throw new Error("Expected string or integer value.");
    }
    range = `^${range}.0.0-0`;
  }
  if (typeof range !== "string") {
    throw new Error("Expected string or integer value.");
  }
  if (range === "*" || _semver().satisfies(_index.version, range)) return;
  const limit = Error.stackTraceLimit;
  if (typeof limit === "number" && limit < 25) {
    Error.stackTraceLimit = 25;
  }
  const err = new Error(`Requires Babel "${range}", but was loaded with "${_index.version}". ` + `If you are sure you have a compatible version of @babel/core, ` + `it is likely that something in your build process is loading the ` + `wrong version. Inspect the stack trace of this error to look for ` + `the first entry that doesn't mention "@babel/core" or "babel-core" ` + `to see what is calling Babel.`);
  if (typeof limit === "number") {
    Error.stackTraceLimit = limit;
  }
  throw Object.assign(err, {
    code: "BABEL_VERSION_UNSUPPORTED",
    version: _index.version,
    range
  });
}
0 && 0;

//# sourceMappingURL=config-api.js.map
1
web/node_modules/@babel/core/lib/config/helpers/config-api.js.map
generated
vendored
Normal file
File diff suppressed because one or more lines are too long
23
web/node_modules/@babel/core/lib/config/helpers/deep-array.js
generated
vendored
Normal file
@@ -0,0 +1,23 @@
"use strict";

Object.defineProperty(exports, "__esModule", {
  value: true
});
exports.finalize = finalize;
exports.flattenToSet = flattenToSet;
function finalize(deepArr) {
  return Object.freeze(deepArr);
}
function flattenToSet(arr) {
  const result = new Set();
  const stack = [arr];
  while (stack.length > 0) {
    for (const el of stack.pop()) {
      if (Array.isArray(el)) stack.push(el);else result.add(el);
    }
  }
  return result;
}
0 && 0;

//# sourceMappingURL=deep-array.js.map
1
web/node_modules/@babel/core/lib/config/helpers/deep-array.js.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"names":["finalize","deepArr","Object","freeze","flattenToSet","arr","result","Set","stack","length","el","pop","Array","isArray","push","add"],"sources":["../../../src/config/helpers/deep-array.ts"],"sourcesContent":["export type DeepArray<T> = Array<T | ReadonlyDeepArray<T>>;\n\n// Just to make sure that DeepArray<T> is not assignable to ReadonlyDeepArray<T>\ndeclare const __marker: unique symbol;\nexport type ReadonlyDeepArray<T> = ReadonlyArray<T | ReadonlyDeepArray<T>> & {\n [__marker]: true;\n};\n\nexport function finalize<T>(deepArr: DeepArray<T>): ReadonlyDeepArray<T> {\n return Object.freeze(deepArr) as ReadonlyDeepArray<T>;\n}\n\nexport function flattenToSet<T extends string>(\n arr: ReadonlyDeepArray<T>,\n): Set<T> {\n const result = new Set<T>();\n const stack = [arr];\n while (stack.length > 0) {\n for (const el of stack.pop()) {\n if (Array.isArray(el)) stack.push(el as ReadonlyDeepArray<T>);\n else result.add(el as T);\n }\n }\n return result;\n}\n"],"mappings":";;;;;;;AAQO,SAASA,QAAQA,CAAIC,OAAqB,EAAwB;EACvE,OAAOC,MAAM,CAACC,MAAM,CAACF,OAAO,CAAC;AAC/B;AAEO,SAASG,YAAYA,CAC1BC,GAAyB,EACjB;EACR,MAAMC,MAAM,GAAG,IAAIC,GAAG,CAAI,CAAC;EAC3B,MAAMC,KAAK,GAAG,CAACH,GAAG,CAAC;EACnB,OAAOG,KAAK,CAACC,MAAM,GAAG,CAAC,EAAE;IACvB,KAAK,MAAMC,EAAE,IAAIF,KAAK,CAACG,GAAG,CAAC,CAAC,EAAE;MAC5B,IAAIC,KAAK,CAACC,OAAO,CAACH,EAAE,CAAC,EAAEF,KAAK,CAACM,IAAI,CAACJ,EAA0B,CAAC,CAAC,KACzDJ,MAAM,CAACS,GAAG,CAACL,EAAO,CAAC;IAC1B;EACF;EACA,OAAOJ,MAAM;AACf;AAAC","ignoreList":[]}
12
web/node_modules/@babel/core/lib/config/helpers/environment.js
generated
vendored
Normal file
@@ -0,0 +1,12 @@
"use strict";

Object.defineProperty(exports, "__esModule", {
  value: true
});
exports.getEnv = getEnv;
function getEnv(defaultValue = "development") {
  return process.env.BABEL_ENV || process.env.NODE_ENV || defaultValue;
}
0 && 0;

//# sourceMappingURL=environment.js.map
1
web/node_modules/@babel/core/lib/config/helpers/environment.js.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"names":["getEnv","defaultValue","process","env","BABEL_ENV","NODE_ENV"],"sources":["../../../src/config/helpers/environment.ts"],"sourcesContent":["export function getEnv(defaultValue: string = \"development\"): string {\n return process.env.BABEL_ENV || process.env.NODE_ENV || defaultValue;\n}\n"],"mappings":";;;;;;AAAO,SAASA,MAAMA,CAACC,YAAoB,GAAG,aAAa,EAAU;EACnE,OAAOC,OAAO,CAACC,GAAG,CAACC,SAAS,IAAIF,OAAO,CAACC,GAAG,CAACE,QAAQ,IAAIJ,YAAY;AACtE;AAAC","ignoreList":[]}
93
web/node_modules/@babel/core/lib/config/index.js
generated
vendored
Normal file
@@ -0,0 +1,93 @@
"use strict";

Object.defineProperty(exports, "__esModule", {
  value: true
});
exports.createConfigItem = createConfigItem;
exports.createConfigItemAsync = createConfigItemAsync;
exports.createConfigItemSync = createConfigItemSync;
Object.defineProperty(exports, "default", {
  enumerable: true,
  get: function () {
    return _full.default;
  }
});
exports.loadOptions = loadOptions;
exports.loadOptionsAsync = loadOptionsAsync;
exports.loadOptionsSync = loadOptionsSync;
exports.loadPartialConfig = loadPartialConfig;
exports.loadPartialConfigAsync = loadPartialConfigAsync;
exports.loadPartialConfigSync = loadPartialConfigSync;
function _gensync() {
  const data = require("gensync");
  _gensync = function () {
    return data;
  };
  return data;
}
var _full = require("./full.js");
var _partial = require("./partial.js");
var _item = require("./item.js");
var _rewriteStackTrace = require("../errors/rewrite-stack-trace.js");
const loadPartialConfigRunner = _gensync()(_partial.loadPartialConfig);
function loadPartialConfigAsync(...args) {
  return (0, _rewriteStackTrace.beginHiddenCallStack)(loadPartialConfigRunner.async)(...args);
}
function loadPartialConfigSync(...args) {
  return (0, _rewriteStackTrace.beginHiddenCallStack)(loadPartialConfigRunner.sync)(...args);
}
function loadPartialConfig(opts, callback) {
  if (callback !== undefined) {
    (0, _rewriteStackTrace.beginHiddenCallStack)(loadPartialConfigRunner.errback)(opts, callback);
  } else if (typeof opts === "function") {
    (0, _rewriteStackTrace.beginHiddenCallStack)(loadPartialConfigRunner.errback)(undefined, opts);
  } else {
    {
      return loadPartialConfigSync(opts);
    }
  }
}
function* loadOptionsImpl(opts) {
  var _config$options;
  const config = yield* (0, _full.default)(opts);
  return (_config$options = config == null ? void 0 : config.options) != null ? _config$options : null;
}
const loadOptionsRunner = _gensync()(loadOptionsImpl);
function loadOptionsAsync(...args) {
  return (0, _rewriteStackTrace.beginHiddenCallStack)(loadOptionsRunner.async)(...args);
}
function loadOptionsSync(...args) {
  return (0, _rewriteStackTrace.beginHiddenCallStack)(loadOptionsRunner.sync)(...args);
}
function loadOptions(opts, callback) {
  if (callback !== undefined) {
    (0, _rewriteStackTrace.beginHiddenCallStack)(loadOptionsRunner.errback)(opts, callback);
  } else if (typeof opts === "function") {
    (0, _rewriteStackTrace.beginHiddenCallStack)(loadOptionsRunner.errback)(undefined, opts);
  } else {
    {
      return loadOptionsSync(opts);
    }
  }
}
const createConfigItemRunner = _gensync()(_item.createConfigItem);
function createConfigItemAsync(...args) {
  return (0, _rewriteStackTrace.beginHiddenCallStack)(createConfigItemRunner.async)(...args);
}
function createConfigItemSync(...args) {
  return (0, _rewriteStackTrace.beginHiddenCallStack)(createConfigItemRunner.sync)(...args);
}
function createConfigItem(target, options, callback) {
  if (callback !== undefined) {
    (0, _rewriteStackTrace.beginHiddenCallStack)(createConfigItemRunner.errback)(target, options, callback);
  } else if (typeof options === "function") {
    (0, _rewriteStackTrace.beginHiddenCallStack)(createConfigItemRunner.errback)(target, undefined, callback);
  } else {
    {
      return createConfigItemSync(target, options);
    }
  }
}
0 && 0;

//# sourceMappingURL=index.js.map
1
web/node_modules/@babel/core/lib/config/index.js.map
generated
vendored
Normal file
File diff suppressed because one or more lines are too long
67
web/node_modules/@babel/core/lib/config/item.js
generated
vendored
Normal file
@@ -0,0 +1,67 @@
"use strict";

Object.defineProperty(exports, "__esModule", {
  value: true
});
exports.createConfigItem = createConfigItem;
exports.createItemFromDescriptor = createItemFromDescriptor;
exports.getItemDescriptor = getItemDescriptor;
function _path() {
  const data = require("path");
  _path = function () {
    return data;
  };
  return data;
}
var _configDescriptors = require("./config-descriptors.js");
function createItemFromDescriptor(desc) {
  return new ConfigItem(desc);
}
function* createConfigItem(value, {
  dirname = ".",
  type
} = {}) {
  const descriptor = yield* (0, _configDescriptors.createDescriptor)(value, _path().resolve(dirname), {
    type,
    alias: "programmatic item"
  });
  return createItemFromDescriptor(descriptor);
}
const CONFIG_ITEM_BRAND = Symbol.for("@babel/core@7 - ConfigItem");
function getItemDescriptor(item) {
  if (item != null && item[CONFIG_ITEM_BRAND]) {
    return item._descriptor;
  }
  return undefined;
}
class ConfigItem {
  constructor(descriptor) {
    this._descriptor = void 0;
    this[CONFIG_ITEM_BRAND] = true;
    this.value = void 0;
    this.options = void 0;
    this.dirname = void 0;
    this.name = void 0;
    this.file = void 0;
    this._descriptor = descriptor;
    Object.defineProperty(this, "_descriptor", {
      enumerable: false
    });
    Object.defineProperty(this, CONFIG_ITEM_BRAND, {
      enumerable: false
    });
    this.value = this._descriptor.value;
    this.options = this._descriptor.options;
    this.dirname = this._descriptor.dirname;
    this.name = this._descriptor.name;
    this.file = this._descriptor.file ? {
      request: this._descriptor.file.request,
      resolved: this._descriptor.file.resolved
    } : undefined;
    Object.freeze(this);
  }
}
Object.freeze(ConfigItem.prototype);
0 && 0;

//# sourceMappingURL=item.js.map
1
web/node_modules/@babel/core/lib/config/item.js.map
generated
vendored
Normal file
File diff suppressed because one or more lines are too long
158
web/node_modules/@babel/core/lib/config/partial.js
generated
vendored
Normal file
@@ -0,0 +1,158 @@
"use strict";

Object.defineProperty(exports, "__esModule", {
  value: true
});
exports.default = loadPrivatePartialConfig;
exports.loadPartialConfig = loadPartialConfig;
function _path() {
  const data = require("path");
  _path = function () {
    return data;
  };
  return data;
}
var _plugin = require("./plugin.js");
var _util = require("./util.js");
var _item = require("./item.js");
var _configChain = require("./config-chain.js");
var _environment = require("./helpers/environment.js");
var _options = require("./validation/options.js");
var _index = require("./files/index.js");
var _resolveTargets = require("./resolve-targets.js");
const _excluded = ["showIgnoredFiles"];
function _objectWithoutPropertiesLoose(r, e) { if (null == r) return {}; var t = {}; for (var n in r) if ({}.hasOwnProperty.call(r, n)) { if (-1 !== e.indexOf(n)) continue; t[n] = r[n]; } return t; }
function resolveRootMode(rootDir, rootMode) {
  switch (rootMode) {
    case "root":
      return rootDir;
    case "upward-optional":
      {
        const upwardRootDir = (0, _index.findConfigUpwards)(rootDir);
        return upwardRootDir === null ? rootDir : upwardRootDir;
      }
    case "upward":
      {
        const upwardRootDir = (0, _index.findConfigUpwards)(rootDir);
        if (upwardRootDir !== null) return upwardRootDir;
        throw Object.assign(new Error(`Babel was run with rootMode:"upward" but a root could not ` + `be found when searching upward from "${rootDir}".\n` + `One of the following config files must be in the directory tree: ` + `"${_index.ROOT_CONFIG_FILENAMES.join(", ")}".`), {
          code: "BABEL_ROOT_NOT_FOUND",
          dirname: rootDir
        });
      }
    default:
      throw new Error(`Assertion failure - unknown rootMode value.`);
  }
}
function* loadPrivatePartialConfig(inputOpts) {
  if (inputOpts != null && (typeof inputOpts !== "object" || Array.isArray(inputOpts))) {
    throw new Error("Babel options must be an object, null, or undefined");
  }
  const args = inputOpts ? (0, _options.validate)("arguments", inputOpts) : {};
  const {
    envName = (0, _environment.getEnv)(),
    cwd = ".",
    root: rootDir = ".",
    rootMode = "root",
    caller,
    cloneInputAst = true
  } = args;
  const absoluteCwd = _path().resolve(cwd);
  const absoluteRootDir = resolveRootMode(_path().resolve(absoluteCwd, rootDir), rootMode);
  const filename = typeof args.filename === "string" ? _path().resolve(cwd, args.filename) : undefined;
  const showConfigPath = yield* (0, _index.resolveShowConfigPath)(absoluteCwd);
  const context = {
    filename,
    cwd: absoluteCwd,
    root: absoluteRootDir,
    envName,
    caller,
    showConfig: showConfigPath === filename
  };
  const configChain = yield* (0, _configChain.buildRootChain)(args, context);
  if (!configChain) return null;
  const merged = {
    assumptions: {}
  };
  configChain.options.forEach(opts => {
    (0, _util.mergeOptions)(merged, opts);
  });
  const options = Object.assign({}, merged, {
    targets: (0, _resolveTargets.resolveTargets)(merged, absoluteRootDir),
    cloneInputAst,
    babelrc: false,
    configFile: false,
    browserslistConfigFile: false,
    passPerPreset: false,
    envName: context.envName,
    cwd: context.cwd,
    root: context.root,
    rootMode: "root",
    filename: typeof context.filename === "string" ? context.filename : undefined,
    plugins: configChain.plugins.map(descriptor => (0, _item.createItemFromDescriptor)(descriptor)),
    presets: configChain.presets.map(descriptor => (0, _item.createItemFromDescriptor)(descriptor))
  });
  return {
    options,
    context,
    fileHandling: configChain.fileHandling,
    ignore: configChain.ignore,
    babelrc: configChain.babelrc,
    config: configChain.config,
    files: configChain.files
  };
}
function* loadPartialConfig(opts) {
  let showIgnoredFiles = false;
  if (typeof opts === "object" && opts !== null && !Array.isArray(opts)) {
    var _opts = opts;
    ({
      showIgnoredFiles
    } = _opts);
    opts = _objectWithoutPropertiesLoose(_opts, _excluded);
    _opts;
  }
  const result = yield* loadPrivatePartialConfig(opts);
  if (!result) return null;
  const {
    options,
    babelrc,
    ignore,
    config,
    fileHandling,
    files
  } = result;
  if (fileHandling === "ignored" && !showIgnoredFiles) {
    return null;
  }
  (options.plugins || []).forEach(item => {
    if (item.value instanceof _plugin.default) {
      throw new Error("Passing cached plugin instances is not supported in " + "babel.loadPartialConfig()");
    }
  });
  return new PartialConfig(options, babelrc ? babelrc.filepath : undefined, ignore ? ignore.filepath : undefined, config ? config.filepath : undefined, fileHandling, files);
}
class PartialConfig {
  constructor(options, babelrc, ignore, config, fileHandling, files) {
    this.options = void 0;
    this.babelrc = void 0;
    this.babelignore = void 0;
    this.config = void 0;
    this.fileHandling = void 0;
    this.files = void 0;
    this.options = options;
    this.babelignore = ignore;
    this.babelrc = babelrc;
    this.config = config;
    this.fileHandling = fileHandling;
    this.files = files;
    Object.freeze(this);
  }
  hasFilesystemConfig() {
    return this.babelrc !== undefined || this.config !== undefined;
  }
}
Object.freeze(PartialConfig.prototype);
0 && 0;

//# sourceMappingURL=partial.js.map
1
web/node_modules/@babel/core/lib/config/partial.js.map
generated
vendored
Normal file
File diff suppressed because one or more lines are too long
38
web/node_modules/@babel/core/lib/config/pattern-to-regex.js
generated
vendored
Normal file
@@ -0,0 +1,38 @@
"use strict";

Object.defineProperty(exports, "__esModule", {
  value: true
});
exports.default = pathToPattern;
function _path() {
  const data = require("path");
  _path = function () {
    return data;
  };
  return data;
}
const sep = `\\${_path().sep}`;
const endSep = `(?:${sep}|$)`;
const substitution = `[^${sep}]+`;
const starPat = `(?:${substitution}${sep})`;
const starPatLast = `(?:${substitution}${endSep})`;
const starStarPat = `${starPat}*?`;
const starStarPatLast = `${starPat}*?${starPatLast}?`;
function escapeRegExp(string) {
  return string.replace(/[|\\{}()[\]^$+*?.]/g, "\\$&");
}
function pathToPattern(pattern, dirname) {
  const parts = _path().resolve(dirname, pattern).split(_path().sep);
  return new RegExp(["^", ...parts.map((part, i) => {
    const last = i === parts.length - 1;
    if (part === "**") return last ? starStarPatLast : starStarPat;
    if (part === "*") return last ? starPatLast : starPat;
    if (part.indexOf("*.") === 0) {
      return substitution + escapeRegExp(part.slice(1)) + (last ? endSep : sep);
    }
    return escapeRegExp(part) + (last ? endSep : sep);
  })].join(""));
}
0 && 0;

//# sourceMappingURL=pattern-to-regex.js.map
1
web/node_modules/@babel/core/lib/config/pattern-to-regex.js.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"names":["_path","data","require","sep","path","endSep","substitution","starPat","starPatLast","starStarPat","starStarPatLast","escapeRegExp","string","replace","pathToPattern","pattern","dirname","parts","resolve","split","RegExp","map","part","i","last","length","indexOf","slice","join"],"sources":["../../src/config/pattern-to-regex.ts"],"sourcesContent":["import path from \"node:path\";\n\nconst sep = `\\\\${path.sep}`;\nconst endSep = `(?:${sep}|$)`;\n\nconst substitution = `[^${sep}]+`;\n\nconst starPat = `(?:${substitution}${sep})`;\nconst starPatLast = `(?:${substitution}${endSep})`;\n\nconst starStarPat = `${starPat}*?`;\nconst starStarPatLast = `${starPat}*?${starPatLast}?`;\n\nfunction escapeRegExp(string: string) {\n return string.replace(/[|\\\\{}()[\\]^$+*?.]/g, \"\\\\$&\");\n}\n\n/**\n * Implement basic pattern matching that will allow users to do the simple\n * tests with * and **. If users want full complex pattern matching, then can\n * always use regex matching, or function validation.\n */\nexport default function pathToPattern(\n pattern: string,\n dirname: string,\n): RegExp {\n const parts = path.resolve(dirname, pattern).split(path.sep);\n\n return new RegExp(\n [\n \"^\",\n ...parts.map((part, i) => {\n const last = i === parts.length - 1;\n\n // ** matches 0 or more path parts.\n if (part === \"**\") return last ? starStarPatLast : starStarPat;\n\n // * matches 1 path part.\n if (part === \"*\") return last ? starPatLast : starPat;\n\n // *.ext matches a wildcard with an extension.\n if (part.indexOf(\"*.\") === 0) {\n return (\n substitution + escapeRegExp(part.slice(1)) + (last ? endSep : sep)\n );\n }\n\n // Otherwise match the pattern text.\n return escapeRegExp(part) + (last ? endSep : sep);\n }),\n ].join(\"\"),\n );\n}\n"],"mappings":";;;;;;AAAA,SAAAA,MAAA;EAAA,MAAAC,IAAA,GAAAC,OAAA;EAAAF,KAAA,YAAAA,CAAA;IAAA,OAAAC,IAAA;EAAA;EAAA,OAAAA,IAAA;AAAA;AAEA,MAAME,GAAG,GAAG,KAAKC,MAAGA,CAAC,CAACD,GAAG,EAAE;AAC3B,MAAME,MAAM,GAAG,MAAMF,GAAG,KAAK;AAE7B,MAAMG,YAAY,GAAG,KAAKH,GAAG,IAAI;AAEjC,MAAMI,OAAO,GAAG,MAAMD,YAAY,GAAGH,GAAG,GAAG;AAC3C,MAAMK,WAAW,GAAG,MAAMF,YAAY,GAAGD,MAAM,GAAG;AAElD,MAAMI,WAAW,GAAG,GAAGF,OAAO,IAAI;AAClC,MAAMG,eAAe,GAAG,GAAGH,OAAO,KAAKC,WAAW,GAAG;AAErD,SAASG,YAAYA,CAACC,MAAc,EAAE;EACpC,OAAOA,MAAM,CAACC,OAAO,CAAC,qBAAqB,EAAE,MAAM,CAAC;AACtD;AAOe,SAASC,aAAaA,CACnCC,OAAe,EACfC,OAAe,EACP;EACR,MAAMC,KAAK,GAAGb,MAAGA,CAAC,CAACc,OAAO,CAACF,OAAO,EAAED,OAAO,CAAC,CAACI,KAAK,CAACf,MAAGA,CAAC,CAACD,GAAG,CAAC;EAE5D,OAAO,IAAIiB,MAAM,CACf,CACE,GAAG,EACH,GAAGH,KAAK,CAACI,GAAG,CAAC,CAACC,IAAI,EAAEC,CAAC,KAAK;IACxB,MAAMC,IAAI,GAAGD,CAAC,KAAKN,KAAK,CAACQ,MAAM,GAAG,CAAC;IAGnC,IAAIH,IAAI,KAAK,IAAI,EAAE,OAAOE,IAAI,GAAGd,eAAe,GAAGD,WAAW;IAG9D,IAAIa,IAAI,KAAK,GAAG,EAAE,OAAOE,IAAI,GAAGhB,WAAW,GAAGD,OAAO;IAGrD,IAAIe,IAAI,CAACI,OAAO,CAAC,IAAI,CAAC,KAAK,CAAC,EAAE;MAC5B,OACEpB,YAAY,GAAGK,YAAY,CAACW,IAAI,CAACK,KAAK,CAAC,CAAC,CAAC,CAAC,IAAIH,IAAI,GAAGnB,MAAM,GAAGF,GAAG,CAAC;IAEtE;IAGA,OAAOQ,YAAY,CAACW,IAAI,CAAC,IAAIE,IAAI,GAAGnB,MAAM,GAAGF,GAAG,CAAC;EACnD,CAAC,CAAC,CACH,CAACyB,IAAI,CAAC,EAAE,CACX,CAAC;AACH;AAAC","ignoreList":[]}
33
web/node_modules/@babel/core/lib/config/plugin.js
generated
vendored
Normal file
33
web/node_modules/@babel/core/lib/config/plugin.js
generated
vendored
Normal file
@@ -0,0 +1,33 @@
"use strict";

Object.defineProperty(exports, "__esModule", {
  value: true
});
exports.default = void 0;
var _deepArray = require("./helpers/deep-array.js");
class Plugin {
  constructor(plugin, options, key, externalDependencies = (0, _deepArray.finalize)([])) {
    this.key = void 0;
    this.manipulateOptions = void 0;
    this.post = void 0;
    this.pre = void 0;
    this.visitor = void 0;
    this.parserOverride = void 0;
    this.generatorOverride = void 0;
    this.options = void 0;
    this.externalDependencies = void 0;
    this.key = plugin.name || key;
    this.manipulateOptions = plugin.manipulateOptions;
    this.post = plugin.post;
    this.pre = plugin.pre;
    this.visitor = plugin.visitor || {};
    this.parserOverride = plugin.parserOverride;
    this.generatorOverride = plugin.generatorOverride;
    this.options = options;
    this.externalDependencies = externalDependencies;
  }
}
exports.default = Plugin;
0 && 0;

//# sourceMappingURL=plugin.js.map
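`plugin.js` is @babel/core's internal normalized representation of a loaded plugin: the constructor copies the plugin object's hooks (`pre`, `post`, `visitor`, parser/generator overrides) into fixed fields, defaults `visitor` to an empty object, and prefers `plugin.name` over the supplied cache key. A minimal sketch, assuming a script saved next to the vendored file (the demo plugin object and options are illustrative):

```js
// Minimal sketch: constructing the internal Plugin wrapper directly.
// The require path and the demo plugin object are assumptions;
// @babel/core normally builds these itself during config resolution.
const Plugin = require("./plugin.js").default;

const wrapped = new Plugin(
  { name: "demo-plugin", visitor: { Identifier() {} } }, // plugin object
  { loose: true },                                       // plugin options
  "fallback-key"                                         // cache key
);

console.log(wrapped.key);     // "demo-plugin" (plugin.name wins over the key)
console.log(wrapped.options); // { loose: true }
```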
1
web/node_modules/@babel/core/lib/config/plugin.js.map
generated
vendored
Normal file
1
web/node_modules/@babel/core/lib/config/plugin.js.map
generated
vendored
Normal file
@@ -0,0 +1 @@
{"version":3,"names":["_deepArray","require","Plugin","constructor","plugin","options","key","externalDependencies","finalize","manipulateOptions","post","pre","visitor","parserOverride","generatorOverride","name","exports","default"],"sources":["../../src/config/plugin.ts"],"sourcesContent":["import { finalize } from \"./helpers/deep-array.ts\";\nimport type { ReadonlyDeepArray } from \"./helpers/deep-array.ts\";\nimport type { PluginObject } from \"./validation/plugins.ts\";\n\nexport default class Plugin {\n key: string | undefined | null;\n manipulateOptions?: PluginObject[\"manipulateOptions\"];\n post?: PluginObject[\"post\"];\n pre?: PluginObject[\"pre\"];\n visitor: PluginObject[\"visitor\"];\n\n parserOverride?: PluginObject[\"parserOverride\"];\n generatorOverride?: PluginObject[\"generatorOverride\"];\n\n options: object;\n\n externalDependencies: ReadonlyDeepArray<string>;\n\n constructor(\n plugin: PluginObject,\n options: object,\n key?: string,\n externalDependencies: ReadonlyDeepArray<string> = finalize([]),\n ) {\n this.key = plugin.name || key;\n\n this.manipulateOptions = plugin.manipulateOptions;\n this.post = plugin.post;\n this.pre = plugin.pre;\n this.visitor = plugin.visitor || {};\n this.parserOverride = plugin.parserOverride;\n this.generatorOverride = plugin.generatorOverride;\n\n this.options = options;\n this.externalDependencies = externalDependencies;\n }\n}\n"],"mappings":";;;;;;AAAA,IAAAA,UAAA,GAAAC,OAAA;AAIe,MAAMC,MAAM,CAAC;EAc1BC,WAAWA,CACTC,MAAoB,EACpBC,OAAe,EACfC,GAAY,EACZC,oBAA+C,GAAG,IAAAC,mBAAQ,EAAC,EAAE,CAAC,EAC9D;IAAA,KAlBFF,GAAG;IAAA,KACHG,iBAAiB;IAAA,KACjBC,IAAI;IAAA,KACJC,GAAG;IAAA,KACHC,OAAO;IAAA,KAEPC,cAAc;IAAA,KACdC,iBAAiB;IAAA,KAEjBT,OAAO;IAAA,KAEPE,oBAAoB;IAQlB,IAAI,CAACD,GAAG,GAAGF,MAAM,CAACW,IAAI,IAAIT,GAAG;IAE7B,IAAI,CAACG,iBAAiB,GAAGL,MAAM,CAACK,iBAAiB;IACjD,IAAI,CAACC,IAAI,GAAGN,MAAM,CAACM,IAAI;IACvB,IAAI,CAACC,GAAG,GAAGP,MAAM,CAACO,GAAG;IACrB,IAAI,CAACC,OAAO,GAAGR,MAAM,CAACQ,OAAO,IAAI,CAAC,CAAC;IACnC,IAAI,CAACC,cAAc,GAAGT,MAAM,CAACS,cAAc;IAC3C,IAAI,CAACC,iBAAiB,GAAGV,MAAM,CAACU,iBAAiB;IAEjD,IAAI,CAACT,OAAO,GAAGA,OAAO;IACtB,IAAI,CAACE,oBAAoB,GAAGA,oBAAoB;EAClD;AACF;AAACS,OAAA,CAAAC,OAAA,GAAAf,MAAA;AAAA","ignoreList":[]}
113
web/node_modules/@babel/core/lib/config/printer.js
generated
vendored
Normal file
113
web/node_modules/@babel/core/lib/config/printer.js
generated
vendored
Normal file
@@ -0,0 +1,113 @@
"use strict";

Object.defineProperty(exports, "__esModule", {
  value: true
});
exports.ConfigPrinter = exports.ChainFormatter = void 0;
function _gensync() {
  const data = require("gensync");
  _gensync = function () {
    return data;
  };
  return data;
}
const ChainFormatter = exports.ChainFormatter = {
  Programmatic: 0,
  Config: 1
};
const Formatter = {
  title(type, callerName, filepath) {
    let title = "";
    if (type === ChainFormatter.Programmatic) {
      title = "programmatic options";
      if (callerName) {
        title += " from " + callerName;
      }
    } else {
      title = "config " + filepath;
    }
    return title;
  },
  loc(index, envName) {
    let loc = "";
    if (index != null) {
      loc += `.overrides[${index}]`;
    }
    if (envName != null) {
      loc += `.env["${envName}"]`;
    }
    return loc;
  },
  *optionsAndDescriptors(opt) {
    const content = Object.assign({}, opt.options);
    delete content.overrides;
    delete content.env;
    const pluginDescriptors = [...(yield* opt.plugins())];
    if (pluginDescriptors.length) {
      content.plugins = pluginDescriptors.map(d => descriptorToConfig(d));
    }
    const presetDescriptors = [...(yield* opt.presets())];
    if (presetDescriptors.length) {
      content.presets = [...presetDescriptors].map(d => descriptorToConfig(d));
    }
    return JSON.stringify(content, undefined, 2);
  }
};
function descriptorToConfig(d) {
  var _d$file;
  let name = (_d$file = d.file) == null ? void 0 : _d$file.request;
  if (name == null) {
    if (typeof d.value === "object") {
      name = d.value;
    } else if (typeof d.value === "function") {
      name = `[Function: ${d.value.toString().slice(0, 50)} ... ]`;
    }
  }
  if (name == null) {
    name = "[Unknown]";
  }
  if (d.options === undefined) {
    return name;
  } else if (d.name == null) {
    return [name, d.options];
  } else {
    return [name, d.options, d.name];
  }
}
class ConfigPrinter {
  constructor() {
    this._stack = [];
  }
  configure(enabled, type, {
    callerName,
    filepath
  }) {
    if (!enabled) return () => {};
    return (content, index, envName) => {
      this._stack.push({
        type,
        callerName,
        filepath,
        content,
        index,
        envName
      });
    };
  }
  static *format(config) {
    let title = Formatter.title(config.type, config.callerName, config.filepath);
    const loc = Formatter.loc(config.index, config.envName);
    if (loc) title += ` ${loc}`;
    const content = yield* Formatter.optionsAndDescriptors(config.content);
    return `${title}\n${content}`;
  }
  *output() {
    if (this._stack.length === 0) return "";
    const configs = yield* _gensync().all(this._stack.map(s => ConfigPrinter.format(s)));
    return configs.join("\n\n");
  }
}
exports.ConfigPrinter = ConfigPrinter;
0 && 0;

//# sourceMappingURL=printer.js.map
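`printer.js` renders the resolved config chain for debugging: `configure()` returns a callback that records each config source on an internal stack, and the gensync-based `output()` formats the stack into a readable report. A minimal sketch, assuming a script next to the vendored file; the stub `content` object is an assumption shaped to satisfy `optionsAndDescriptors`, which expects generator-returning `plugins`/`presets` fields:

```js
// Minimal sketch: recording one programmatic config source and printing it.
// The require paths and the stub content object are assumptions.
const gensync = require("gensync");
const { ConfigPrinter, ChainFormatter } = require("./printer.js");

const printer = new ConfigPrinter();
const record = printer.configure(true, ChainFormatter.Programmatic, {
  callerName: "babel-cli",
});

record(
  {
    options: { sourceMaps: true },
    plugins: function* () { return []; }, // no plugin descriptors
    presets: function* () { return []; }, // no preset descriptors
  },
  undefined, // not inside .overrides[i]
  undefined  // not inside .env[name]
);

// output() is a gensync generator, so drive it through gensync.
const render = gensync(function* () {
  return yield* printer.output();
});
console.log(render.sync());
// programmatic options from babel-cli
// followed by the pretty-printed options JSON
```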
Some files were not shown because too many files have changed in this diff.