Compare commits


No commits in common. "2923510c5132a60a3f3b63d67fd3cc625a85cade" and "8706a4f3f71be333f89ae92ca89693cc23188e92" have entirely different histories.

23 changed files with 578 additions and 2591 deletions


@@ -3,43 +3,25 @@ FROM python:3.11-slim
ENV PYTHONUNBUFFERED=1
# Install ffmpeg, Node.js 20 LTS, and required tooling
# Node.js is required by yt-dlp --js-runtimes to solve YouTube's n-challenge/signature
# Install ffmpeg and required tooling
RUN apt-get update \
&& apt-get install -y --no-install-recommends \
ffmpeg \
curl \
ca-certificates \
gnupg \
&& curl -fsSL https://deb.nodesource.com/setup_20.x | bash - \
&& apt-get install -y --no-install-recommends nodejs \
&& rm -rf /var/lib/apt/lists/*
WORKDIR /app
# Copy requirements and install Python dependencies
# Copy requirements and install dependencies
COPY requirements.txt /app/requirements.txt
RUN pip install --no-cache-dir -r /app/requirements.txt
RUN pip install --no-cache-dir -r /app/requirements.txt \
&& pip install --no-cache-dir yt-dlp
# Install yt-dlp from the latest official release binary (not pip) to always have the newest version
RUN curl -L https://github.com/yt-dlp/yt-dlp/releases/latest/download/yt-dlp -o /usr/local/bin/yt-dlp \
&& chmod a+rx /usr/local/bin/yt-dlp
# ARG used to invalidate the COPY layer cache when rebuilding with --build-arg CACHEBUST=$(date +%s)
ARG CACHEBUST=1
# Copy the rest of the code
COPY . /app
# Create the data folder with open permissions so any UID can read/write
RUN mkdir -p /app/data && chmod 777 /app/data
# Create user appuser (UID 1000) and grant it access to /app
RUN groupadd -g 1000 appgroup && useradd -u 1000 -g appgroup -s /bin/sh appuser \
&& chown -R appuser:appgroup /app
USER appuser
EXPOSE 8000
# Command to run the API
# Default command to run the API
CMD ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "8000"]


@@ -27,7 +27,7 @@ chmod +x docker-start.sh
# 3. Open in a browser
# Web panel: http://localhost:8501
# API: http://localhost:8282
# API: http://localhost:8080
```
📚 See [DOCKER_README.md](DOCKER_README.md) for more information.
@@ -48,8 +48,8 @@ chmod +x docker-start.sh
This will start:
- **Streamlit web panel**: http://localhost:8501
- **FastAPI API**: http://localhost:8282
- **API docs**: http://localhost:8282/docs
- **FastAPI API**: http://localhost:8080
- **API docs**: http://localhost:8080/docs
📚 Full documentation: [DOCKER_GUIDE.md](DOCKER_GUIDE.md)
@@ -214,25 +214,18 @@ docker-compose down
This will start:
- **Streamlit panel**: http://localhost:8501 (frontend)
- **FastAPI API**: http://localhost:8282 (backend)
- **API docs**: http://localhost:8282/docs (Swagger UI)
- **FastAPI API**: http://localhost:8080 (backend)
- **API docs**: http://localhost:8080/docs (Swagger UI)
### Configuration volume: `./data`
### Docker features
With the current configuration, the project mounts a single local folder, `./data`, into the container at `/app/data`.
Place configuration and persistence files there (for example: `cookies.txt`, `stream_config.json`, `streams_state.json`).
- ✅ Automatic health checks
- ✅ Auto-restart on failure
- ✅ Shared network between services
- ✅ Persistent volumes for configuration
- ✅ FFmpeg included in the image
- Advantages:
- Keeps all configuration files in one place
- Lets you replace `cookies.txt` from outside the server (host) without editing the compose file
- Avoids per-file mounts that cause permission conflicts
- Example (create the folder if it does not exist):
```bash
mkdir -p ./data
chmod 755 ./data
```
📚 **Full documentation**: [DOCKER_GUIDE.md](DOCKER_GUIDE.md)
## 📁 Project Structure
@@ -242,15 +235,15 @@ TubeScript-API/
├── streamlit_app.py # Web control panel
├── requirements.txt # Python dependencies
├── Dockerfile # Optimized Docker image
├── docker-compose.yml # Service orchestration (mounts ./data -> /app/data)
├── docker-compose.yml # Service orchestration
├── docker-start.sh # Automatic startup script
├── docker-stop.sh # Stop script
├── docker-logs.sh # Log-viewing script
├── data/ # Folder mounted into the container (/app/data) for persistent configuration
│ ├── stream_config.json # Platform configuration (generated/managed here)
│ ├── streams_state.json # Stream state (generated/managed here)
│ └── cookies.txt # YouTube cookies (optional: place here or upload via the endpoint)
└── README.md # Documentation
├── Dockerfile # Docker configuration
├── docker-compose.yml # Service orchestration
├── stream_config.json # Platform configuration (generated)
├── streams_state.json # Stream state (generated)
└── cookies.txt # YouTube cookies (optional)
```
## 🔧 Advanced Configuration
@@ -262,23 +255,7 @@ To access restricted videos, you can provide cookies:
1. Install the "Get cookies.txt" extension in your browser
2. Visit youtube.com and sign in
3. Export the cookies as `cookies.txt`
4. Place the file at `./data/cookies.txt` or upload it via the `/upload_cookies` endpoint
Example: copy it manually into the mounted volume:
```bash
cp /path/to/cookies.txt ./data/cookies.txt
# (if the service is already running, restart the container so processes pick up the new cookie if needed)
```
Or use the API endpoint (if the API is exposed on the host):
```bash
# If you use docker-compose.yml (port 8282)
curl -v -X POST "http://127.0.0.1:8282/upload_cookies" -F "file=@/path/to/cookies.txt" -H "Accept: application/json"
# If you use docker-compose.local.yml exposing port 8000, change the port to 8000
```
4. Place the file in the project root
### Customize Video Quality
@@ -312,7 +289,7 @@ command = [
### Error: "No se pudo obtener la URL del stream" (could not obtain the stream URL)
- Verify that the video is actually live
- Try adding YouTube cookies (place them at `./data/cookies.txt` or upload them via `/upload_cookies`)
- Try adding YouTube cookies
- Check your internet connection
### Error: "Transmisión con estado error" (stream in error state)
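The upload steps above assume a Netscape-format cookies.txt (the format yt-dlp reads and writes). A quick format sanity check before uploading can be sketched in Python; the helper name and the sample string are illustrative, not part of the project:

```python
def looks_like_netscape_cookies(text: str) -> bool:
    """Rough check: a Netscape cookies.txt has 7 tab-separated fields per entry."""
    lines = [l for l in text.splitlines() if l.strip() and not l.startswith("#")]
    if not lines:
        return False  # empty or comment-only file
    return all(len(l.split("\t")) == 7 for l in lines)

# Hypothetical sample for illustration only.
sample = (
    "# Netscape HTTP Cookie File\n"
    ".youtube.com\tTRUE\t/\tTRUE\t1999999999\tPREF\tf1=50000000\n"
)
print(looks_like_netscape_cookies(sample))  # True
```

A file that fails this check (for example, a JSON export from another extension) would be rejected or misread by yt-dlp, so it is worth checking before the curl upload.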


@@ -7,14 +7,14 @@ services:
dockerfile: Dockerfile.api
container_name: tubescript-api
image: tubescript-api:local
user: "${LOCAL_UID:-1000}:${LOCAL_GID:-1000}"
ports:
- "8000:8000"
volumes:
- ./data:/app/data:rw
- ./cookies.txt:/app/cookies.txt
- ./stream_config.json:/app/stream_config.json:ro
- ./streams_state.json:/app/streams_state.json:rw
environment:
API_BASE_URL: http://localhost:8000
API_COOKIES_PATH: /app/data/cookies.txt
TZ: UTC
restart: unless-stopped
healthcheck:
@@ -34,7 +34,8 @@ services:
ports:
- "8501:8501"
volumes:
- ./data:/app/data:ro
- ./stream_config.json:/app/stream_config.json:ro
- ./cookies.txt:/app/cookies.txt:ro
environment:
API_BASE_URL: http://localhost:8000
TZ: UTC


@@ -1,31 +1,23 @@
services:
# FastAPI service - backend API
tubescript-api:
build:
context: .
dockerfile: Dockerfile.api
args:
# Invalidates the COPY . /app layer without needing a full --no-cache
CACHEBUST: "${CACHEBUST:-1}"
image: tubescript-api:latest
container_name: tubescript_api
command: uvicorn main:app --host 0.0.0.0 --port 8000 --reload
ports:
- "8282:8000"
- "8000:8000"
volumes:
# Persistent data: cookies.txt, config, etc.
- ./data:/app/data:rw
# ── HOST browser profiles (read-only) ──────────────────────────
# yt-dlp can read cookies directly from the browser via
# POST /extract_chrome_cookies?browser=chrome
# Uncomment the browser installed on the host:
- ${HOME}/.config/google-chrome:/host-chrome:ro
# - ${HOME}/.config/chromium:/host-chromium:ro
# - ${HOME}/.config/BraveSoftware/Brave-Browser:/host-brave:ro
# - ${HOME}/.mozilla/firefox:/host-firefox:ro
# - ${HOME}/.config/microsoft-edge:/host-edge:ro
- ./:/app:rw
- ./cookies.txt:/app/cookies.txt:ro
- ./stream_config.json:/app/stream_config.json:ro
- ./streams_state.json:/app/streams_state.json
- ./data:/app/data
environment:
- PYTHONUNBUFFERED=1
- API_COOKIES_PATH=/app/data/cookies.txt
# Optional proxy: socks5h://127.0.0.1:9050
- API_COOKIES_PATH=/app/cookies.txt
# Optional: set API_PROXY when you want the container to use a SOCKS/HTTP proxy (e.g. tor)
- API_PROXY=${API_PROXY:-}
restart: unless-stopped
networks:
@@ -40,4 +32,3 @@ services:
networks:
tubescript-network:
name: tubescript-network
driver: bridge
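Each entry under `volumes:` in the compose files above uses the short syntax `host:container[:mode]`, where the mode defaults to read-write when omitted. A minimal sketch of how such a spec decomposes (the helper is hypothetical, not project code):

```python
def parse_volume(spec: str):
    """Split a compose short-syntax volume spec into (host, container, mode)."""
    parts = spec.split(":")
    if len(parts) == 2:
        host, container = parts
        mode = "rw"  # compose default when no mode is given
    else:
        host, container, mode = parts[0], parts[1], parts[2]
    return host, container, mode

print(parse_volume("./data:/app/data:rw"))
print(parse_volume("./cookies.txt:/app/cookies.txt:ro"))
print(parse_volume("./streams_state.json:/app/streams_state.json"))
```

This is why `- ./streams_state.json:/app/streams_state.json` is writable even without an explicit `:rw`, while the browser-profile mounts are deliberately pinned to `:ro`.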


@@ -1,83 +1,111 @@
#!/bin/bash
# Script to rebuild TubeScript-API from scratch
# Script to rebuild the TubeScript Docker images
set -e
GREEN='\033[0;32m'; YELLOW='\033[1;33m'; RED='\033[0;31m'; NC='\033[0m'
ok() { echo -e "${GREEN}$1${NC}"; }
warn() { echo -e "${YELLOW}⚠️ $1${NC}"; }
err() { echo -e "${RED}$1${NC}"; }
echo "════════════════════════════════════════════════════════════"
echo " 🔨 TubeScript-API - full rebuild"
echo " 🔨 TubeScript-API - Docker rebuild"
echo "════════════════════════════════════════════════════════════"
echo ""
# ── Check Docker (compose plugin, not legacy docker-compose) ──────────────
if ! docker compose version &>/dev/null; then
err "docker compose is not available. Install Docker Desktop or the compose plugin."
# Colors
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
RED='\033[0;31m'
NC='\033[0m'
print_success() {
echo -e "${GREEN}$1${NC}"
}
print_warning() {
echo -e "${YELLOW}⚠️ $1${NC}"
}
print_error() {
echo -e "${RED}$1${NC}"
}
# Check Docker
echo "🔍 Checking Docker..."
if ! command -v docker &> /dev/null; then
print_error "Docker is not installed"
exit 1
fi
ok "Docker compose available: $(docker compose version --short 2>/dev/null || echo 'ok')"
echo ""
# ── data folder ──────────────────────────────────────────────────────────────
mkdir -p ./data
chmod 777 ./data 2>/dev/null || true
ok "./data folder ready (permissions 777)"
echo " Place cookies.txt at ./data/cookies.txt for authentication"
echo ""
# ── Stop existing containers ──────────────────────────────────────────
echo "🛑 Stopping containers..."
docker compose down --remove-orphans 2>/dev/null || true
ok "Containers stopped"
echo ""
# ── Remove the previous image to force a clean build ─────────────────────────
echo "🧹 Removing previous image (tubescript-api:latest)..."
docker rmi tubescript-api:latest 2>/dev/null && ok "Previous image removed" || warn "No previous image found"
echo ""
# ── Build without cache ───────────────────────────────────────────────────────
echo "🔨 Building the image from scratch (--no-cache)..."
echo " This can take 3-5 minutes the first time..."
echo ""
CACHEBUST=$(date +%s) docker compose build --no-cache
ok "Image built successfully"
echo ""
# ── Start services ─────────────────────────────────────────────────────────
echo "🚀 Starting services..."
docker compose up -d
ok "Services started"
echo ""
# ── Wait and show status ──────────────────────────────────────────────────
echo "⏳ Waiting for the API to start (15s)..."
sleep 15
echo ""
echo "📊 Container status:"
docker compose ps
echo ""
# ── Health check ──────────────────────────────────────────────────────────────
echo "🩺 Checking the API..."
if curl -sf http://localhost:8282/docs -o /dev/null; then
ok "API responding at http://localhost:8282"
else
warn "API not responding yet (it may need more time). Check: docker compose logs -f"
if ! command -v docker-compose &> /dev/null; then
print_error "Docker Compose is not installed"
exit 1
fi
print_success "Docker found"
echo ""
# Stop containers
echo "🛑 Stopping existing containers..."
docker-compose down 2>/dev/null || true
print_success "Containers stopped"
echo ""
# Clean up old images (optional)
echo "🧹 Do you want to remove the old images? (y/N)"
read -p "> " clean_images
if [ "$clean_images" = "y" ] || [ "$clean_images" = "Y" ]; then
echo "Removing old images..."
docker-compose down --rmi all 2>/dev/null || true
print_success "Old images removed"
fi
echo ""
# Rebuild without cache
echo "🔨 Rebuilding images without cache..."
echo "This can take several minutes..."
echo ""
docker-compose build --no-cache
if [ $? -eq 0 ]; then
print_success "Images rebuilt successfully"
else
print_error "Failed to rebuild images"
exit 1
fi
echo ""
# Ask whether to start now
echo "🚀 Do you want to start the services now? (Y/n)"
read -p "> " start_services
if [ "$start_services" != "n" ] && [ "$start_services" != "N" ]; then
echo ""
echo "🚀 Starting services..."
docker-compose up -d
if [ $? -eq 0 ]; then
print_success "Services started"
echo ""
echo "📊 Service status:"
sleep 3
docker-compose ps
echo ""
echo "════════════════════════════════════════════════════════════"
print_success "Rebuild complete!"
echo "════════════════════════════════════════════════════════════"
echo ""
echo "🌐 Available services:"
echo " Web panel: http://localhost:8501"
echo " API: http://localhost:8080"
echo ""
else
print_error "Failed to start services"
exit 1
fi
else
echo ""
print_success "Rebuild complete (services not started)"
echo ""
echo "To start the services:"
echo " docker-compose up -d"
fi
echo "════════════════════════════════════════════════════════════"
ok "Rebuild complete!"
echo "════════════════════════════════════════════════════════════"
echo ""
echo " API: http://localhost:8282"
echo " Docs: http://localhost:8282/docs"
echo " Logs: docker compose logs -f"
echo " Cookies: curl -X POST http://localhost:8282/upload_cookies -F 'file=@cookies.txt'"
echo ""
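The rebuild script waits a fixed 15 s and then probes the API once, which can report a false failure on slow machines. A retry loop is more forgiving; a minimal sketch in Python, using a stand-in probe (against the real stack the probe would be an HTTP GET to the docs URL, which is not assumed here):

```python
import time

def wait_for(probe, attempts=10, delay=1.5):
    """Call probe() until it returns True or the attempts run out."""
    for _ in range(attempts):
        if probe():
            return True
        time.sleep(delay)
    return False

# Stand-in probe that "succeeds" on the third call, to show the flow.
calls = {"n": 0}
def fake_probe():
    calls["n"] += 1
    return calls["n"] >= 3

print(wait_for(fake_probe, attempts=5, delay=0.01))  # True
```

The same pattern works in bash with a `for` loop around `curl -sf`, which would replace both the `sleep 15` and the single health-check call.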


@@ -21,9 +21,11 @@ docker run -d \
--name tubescript_api \
--network tubescript-network \
-p 8080:8000 \
-v "$(pwd)/data:/app/data:rw" \
-e API_COOKIES_PATH=/app/data/cookies.txt \
-v "$(pwd)/cookies.txt:/app/cookies.txt:ro" \
-v "$(pwd)/stream_config.json:/app/stream_config.json" \
-v "$(pwd)/streams_state.json:/app/streams_state.json" \
-v "$(pwd)/process_state.json:/app/process_state.json" \
-v "$(pwd)/data:/app/data" \
-e PYTHONUNBUFFERED=1 \
tubescript-api \
uvicorn main:app --host 0.0.0.0 --port 8000 --reload


@@ -30,8 +30,11 @@ docker run -d \
--name streamlit_panel \
--network tubescript-network \
-p 8501:8501 \
-v "$(pwd)/data:/app/data:ro" \
-e API_COOKIES_PATH=/app/data/cookies.txt \
-v "$(pwd)/cookies.txt:/app/cookies.txt:ro" \
-v "$(pwd)/stream_config.json:/app/stream_config.json" \
-v "$(pwd)/streams_state.json:/app/streams_state.json" \
-v "$(pwd)/process_state.json:/app/process_state.json" \
-v "$(pwd)/data:/app/data" \
-e PYTHONUNBUFFERED=1 \
-e API_URL="$API_URL" \
tubescript-api \


@@ -1,79 +1,180 @@
#!/bin/bash
# Script to start TubeScript-API with docker compose
# Script to start the full TubeScript stack with Docker
set -e
GREEN='\033[0;32m'; YELLOW='\033[1;33m'; RED='\033[0;31m'; NC='\033[0m'
print_success() { echo -e "${GREEN}$1${NC}"; }
print_warning() { echo -e "${YELLOW}⚠️ $1${NC}"; }
print_error() { echo -e "${RED}$1${NC}"; }
echo "════════════════════════════════════════════════════════════"
echo " 🐳 TubeScript-API - docker compose up"
echo " 🐳 TubeScript-API - startup with Docker"
echo "════════════════════════════════════════════════════════════"
echo ""
# Check Docker
if ! command -v docker &>/dev/null; then
print_error "Docker is not installed"; exit 1
fi
print_success "Docker found: $(docker --version)"
echo ""
# Output colors
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
RED='\033[0;31m'
NC='\033[0m' # No Color
# Create the data folder with correct permissions (needed for cookies.txt and others)
if [ ! -d "data" ]; then
mkdir -p data && chmod 755 data
print_success "Created ./data directory"
else
print_success "./data directory already exists"
fi
# Helper to print colored messages
print_success() {
echo -e "${GREEN}$1${NC}"
}
# Cookies hint
if [ ! -f "data/cookies.txt" ]; then
touch data/cookies.txt
print_warning "Created empty data/cookies.txt (upload cookies with POST /upload_cookies)"
else
print_success "data/cookies.txt found"
fi
echo ""
print_warning() {
echo -e "${YELLOW}⚠️ $1${NC}"
}
# Stop existing containers
echo "🛑 Stopping existing containers..."
docker compose down 2>/dev/null || true
echo ""
print_error() {
echo -e "${RED}$1${NC}"
}
# Build + start with CACHEBUST to force a fresh copy of the code
export CACHEBUST="$(date +%s)"
echo "🔨 Building and starting services..."
echo " (CACHEBUST=${CACHEBUST}: only invalidates the code layer, not the apt/pip layers)"
echo ""
docker compose up -d --build
if [ $? -eq 0 ]; then
echo ""
echo "⏳ Waiting for uvicorn to start..."
sleep 8
echo ""
echo "📊 Status:"
docker compose ps
echo ""
echo "📋 Recent logs:"
docker compose logs --tail=6
echo ""
echo "════════════════════════════════════════════════════════════"
print_success "Done!"
echo "════════════════════════════════════════════════════════════"
echo ""
echo " 🌐 API: http://localhost:8282"
echo " 📖 Docs: http://localhost:8282/docs"
echo " 🍪 Upload cookies: curl -X POST http://localhost:8282/upload_cookies -F 'file=@cookies.txt'"
echo ""
echo " 📝 Useful commands:"
echo " Live logs: docker compose logs -f tubescript-api"
echo " Stop: docker compose down"
echo " Rebuild: CACHEBUST=\$(date +%s) docker compose up -d --build"
echo ""
else
print_error "Failed to start services"
docker compose logs --tail=20
# Check that Docker is installed
if ! command -v docker &> /dev/null; then
print_error "Docker is not installed"
echo "Install Docker from: https://www.docker.com/get-started"
exit 1
fi
if ! command -v docker-compose &> /dev/null; then
print_error "Docker Compose is not installed"
exit 1
fi
print_success "Docker and Docker Compose found"
# Ask for the API URL if it is not configured
echo ""
echo "🌐 API URL configuration..."
# Check whether a .env file exists
if [ ! -f ".env" ]; then
echo ""
echo "Please enter the API domain URL:"
echo "(Examples: https://api.tubescript.com, http://localhost:8080, https://mi-dominio.com)"
read -p "API URL [http://localhost:8080]: " api_url
api_url=${api_url:-http://localhost:8080}
echo "API_URL=$api_url" > .env
print_success "Created .env with API_URL=$api_url"
else
# Read the existing URL
source .env
print_success "Using existing API_URL: $API_URL"
echo "Do you want to change the API URL? (y/N)"
read -p "> " change_url
if [ "$change_url" = "y" ] || [ "$change_url" = "Y" ]; then
read -p "New API URL: " new_api_url
if [ ! -z "$new_api_url" ]; then
sed -i.bak "s|API_URL=.*|API_URL=$new_api_url|" .env
print_success "API_URL updated to: $new_api_url"
fi
fi
fi
# Create configuration files if they do not exist
echo ""
echo "📝 Checking configuration files..."
if [ ! -f "stream_config.json" ]; then
echo '{
"platforms": {
"YouTube": {"rtmp_url": "", "stream_key": "", "enabled": false},
"Facebook": {"rtmp_url": "", "stream_key": "", "enabled": false},
"Twitch": {"rtmp_url": "", "stream_key": "", "enabled": false},
"X (Twitter)": {"rtmp_url": "", "stream_key": "", "enabled": false},
"Instagram": {"rtmp_url": "", "stream_key": "", "enabled": false},
"TikTok": {"rtmp_url": "", "stream_key": "", "enabled": false}
}
}' > stream_config.json
print_success "Created stream_config.json"
else
print_success "stream_config.json already exists"
fi
if [ ! -f "streams_state.json" ]; then
echo '{}' > streams_state.json
print_success "Created streams_state.json"
else
print_success "streams_state.json already exists"
fi
if [ ! -f "cookies.txt" ]; then
touch cookies.txt
print_warning "Created empty cookies.txt (optional, for restricted videos)"
else
print_success "cookies.txt exists"
fi
# Create the data directory if it does not exist
if [ ! -d "data" ]; then
mkdir -p data
print_success "Created data/ directory"
fi
# Stop any existing containers
echo ""
echo "🛑 Stopping existing containers..."
docker-compose down 2>/dev/null || true
# Build the images
echo ""
echo "🔨 Building Docker images..."
docker-compose build
# Start the services
echo ""
echo "🚀 Starting services..."
docker-compose up -d
# Wait for the services to be ready
echo ""
echo "⏳ Waiting for services to start..."
sleep 5
# Check service status
echo ""
echo "📊 Service status:"
docker-compose ps
# Show initial logs
echo ""
echo "📋 Recent logs:"
docker-compose logs --tail=10
echo ""
echo "════════════════════════════════════════════════════════════"
print_success "Services started successfully!"
echo "════════════════════════════════════════════════════════════"
echo ""
echo "📡 Available services:"
echo ""
echo " 🌐 Streamlit web panel:"
echo " http://localhost:8501"
echo ""
echo " 📡 FastAPI API:"
echo " http://localhost:8080"
echo " http://localhost:8080/docs (Swagger documentation)"
echo ""
echo "────────────────────────────────────────────────────────────"
echo "📝 Useful commands:"
echo ""
echo " View logs in real time:"
echo " docker-compose logs -f"
echo ""
echo " View logs for one service:"
echo " docker-compose logs -f streamlit-panel"
echo " docker-compose logs -f tubescript-api"
echo ""
echo " Stop services:"
echo " docker-compose down"
echo ""
echo " Restart services:"
echo " docker-compose restart"
echo ""
echo " View status:"
echo " docker-compose ps"
echo ""
echo "════════════════════════════════════════════════════════════"
echo "🎉 Ready to stream!"
echo "════════════════════════════════════════════════════════════"
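The default `stream_config.json` that the start script writes can also be generated programmatically, which avoids drift between the shell heredoc and the API's expectations. A sketch mirroring the JSON above (the helper name is illustrative, not project code):

```python
import json

PLATFORMS = ["YouTube", "Facebook", "Twitch", "X (Twitter)", "Instagram", "TikTok"]

def default_stream_config() -> dict:
    """Build the same default structure docker-start.sh writes to stream_config.json."""
    return {
        "platforms": {
            name: {"rtmp_url": "", "stream_key": "", "enabled": False}
            for name in PLATFORMS
        }
    }

# Serialize exactly as the script would persist it.
print(json.dumps(default_stream_config(), indent=2)[:120])
```

Generating the structure from one list of platform names means adding a platform is a one-line change instead of editing a quoted JSON blob inside the shell script.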


@@ -1,141 +0,0 @@
#!/bin/bash
# ─────────────────────────────────────────────────────────────────────────────
# export-chrome-cookies.sh
# Exports YouTube cookies from the HOST browser profile
# using yt-dlp, and copies them to ./data/cookies.txt for the API to use.
#
# Usage:
# ./export-chrome-cookies.sh # Chrome (default)
# ./export-chrome-cookies.sh chromium # Chromium
# ./export-chrome-cookies.sh brave # Brave
# ./export-chrome-cookies.sh firefox # Firefox
# ./export-chrome-cookies.sh edge # Edge
#
# IMPORTANT:
# - Close the browser before running (Chrome locks the cookies file)
# - On Linux this needs no password or special keychain
# ─────────────────────────────────────────────────────────────────────────────
set -e
BROWSER="${1:-chrome}"
OUTPUT="./data/cookies.txt"
GREEN='\033[0;32m'; YELLOW='\033[1;33m'; RED='\033[0;31m'; NC='\033[0m'
ok() { echo -e "${GREEN}$1${NC}"; }
warn() { echo -e "${YELLOW}⚠️ $1${NC}"; }
err() { echo -e "${RED}$1${NC}"; exit 1; }
echo ""
echo "🍪 Exporting YouTube cookies from: $BROWSER"
echo ""
# Check yt-dlp
if ! command -v yt-dlp &>/dev/null; then
err "yt-dlp is not installed. Install it with: pip install yt-dlp"
fi
# Make sure the browser is not running (it can cause lock errors)
BROWSER_PROC=""
case "$BROWSER" in
chrome) BROWSER_PROC="google-chrome\|chrome" ;;
chromium) BROWSER_PROC="chromium" ;;
brave) BROWSER_PROC="brave" ;;
firefox) BROWSER_PROC="firefox" ;;
edge) BROWSER_PROC="msedge\|microsoft-edge" ;;
esac
if [ -n "$BROWSER_PROC" ] && pgrep -f "$BROWSER_PROC" &>/dev/null; then
warn "The '$BROWSER' browser appears to be running."
warn "Close it before exporting to avoid cookie DB lock errors."
echo ""
read -p "Continue anyway? (y/N): " confirm
[[ "$confirm" =~ ^[yY]$ ]] || { echo "Cancelled."; exit 0; }
echo ""
fi
# Create the destination directory
mkdir -p "$(dirname "$OUTPUT")"
# Detect the profile path
PROFILE_PATH=""
case "$BROWSER" in
chrome)
for p in "$HOME/.config/google-chrome/Default" "$HOME/.config/google-chrome/Profile 1"; do
[ -d "$p" ] && PROFILE_PATH="$p" && break
done
;;
chromium)
PROFILE_PATH="$HOME/.config/chromium/Default"
;;
brave)
PROFILE_PATH="$HOME/.config/BraveSoftware/Brave-Browser/Default"
;;
firefox)
# Firefox: yt-dlp auto-detects the profile
PROFILE_PATH=""
;;
edge)
PROFILE_PATH="$HOME/.config/microsoft-edge/Default"
;;
*)
err "Browser '$BROWSER' not supported. Use: chrome, chromium, brave, firefox, edge"
;;
esac
if [ -n "$PROFILE_PATH" ] && [ ! -d "$PROFILE_PATH" ]; then
err "Could not find the $BROWSER profile at: $PROFILE_PATH"
fi
# Build the --cookies-from-browser argument
if [ -n "$PROFILE_PATH" ]; then
BROWSER_ARG="${BROWSER}:${PROFILE_PATH}"
echo " Profile: $PROFILE_PATH"
else
BROWSER_ARG="$BROWSER"
echo " Profile: auto-detected"
fi
echo " Destination: $OUTPUT"
echo ""
# Export cookies with yt-dlp
echo "⏳ Extracting cookies..."
yt-dlp \
--cookies-from-browser "$BROWSER_ARG" \
--cookies "$OUTPUT" \
--skip-download \
--no-warnings \
--extractor-args "youtube:player_client=tv_embedded" \
"https://www.youtube.com/watch?v=dQw4w9WgXcQ" 2>&1 || {
err "Failed to extract cookies. Make sure the browser is closed and you are logged in to YouTube."
}
# Check the result
if [ ! -f "$OUTPUT" ] || [ ! -s "$OUTPUT" ]; then
err "The cookies file was not generated or is empty."
fi
YT_LINES=$(grep -c "youtube.com" "$OUTPUT" 2>/dev/null || echo 0)
FILE_SIZE=$(du -h "$OUTPUT" | cut -f1)
echo ""
ok "Cookies exported successfully"
echo " File: $OUTPUT"
echo " Size: $FILE_SIZE"
echo " youtube.com lines: $YT_LINES"
echo ""
if [ "$YT_LINES" -lt 3 ]; then
warn "Few YouTube cookies found ($YT_LINES)."
warn "Verify that you are logged in to YouTube in $BROWSER."
else
ok "YouTube cookies found: $YT_LINES lines"
fi
echo ""
echo "📋 Next steps:"
echo " 1. If the container is running, the cookies are already available in /app/data/"
echo " 2. If it is not running: docker compose up -d"
echo " 3. Test: curl http://localhost:8282/cookies/status"
echo ""
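The script's final sanity check (`grep -c "youtube.com"` on the exported file) amounts to counting matching lines. The same check, sketched in Python with the grep pattern approximated as a plain substring (the sample cookie values are invented for illustration):

```python
def count_youtube_cookie_lines(cookies_text: str) -> int:
    """Count lines mentioning youtube.com, like `grep -c "youtube.com"` in the script."""
    return sum(1 for line in cookies_text.splitlines() if "youtube.com" in line)

sample = (
    "# Netscape HTTP Cookie File\n"
    ".youtube.com\tTRUE\t/\tTRUE\t0\tCONSENT\tYES\n"
    ".google.com\tTRUE\t/\tTRUE\t0\tNID\tabc\n"
    ".youtube.com\tTRUE\t/\tTRUE\t0\tVISITOR_INFO1_LIVE\txyz\n"
)
print(count_youtube_cookie_lines(sample))  # 2
```

A count below the script's threshold of 3 suggests the browser session had no usable YouTube login, which matches the warning branch above.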


@@ -17,7 +17,7 @@ import os
import subprocess
import tempfile
import glob
from main import parse_subtitle_format, get_transcript_data
from main import parse_subtitle_format
def fetch_with_browser_cookies(video_id, lang="es", browser="chrome"):
"""Try to fetch the transcript using cookies read directly from the browser."""
@@ -78,15 +78,18 @@ def main():
print(f" Language: {lang}")
if browser:
print(" Method: cookies from {}".format(browser))
print(f" Method: cookies from {browser}")
segments, error = fetch_with_browser_cookies(video_id, lang, browser)
else:
print(" Method: project API")
print(" Cookies: {}".format(os.getenv('API_COOKIES_PATH', './data/cookies.txt')))
print(f" Method: project API")
print(f" Cookies: {os.getenv('API_COOKIES_PATH', './cookies.txt')}")
from main import get_transcript_data
segments, error = get_transcript_data(video_id, lang)
print("")
# Try to fetch the transcript
segments, error = get_transcript_data(video_id, lang)
if error:
print(f"❌ ERROR: {error}")


@@ -1,124 +0,0 @@
#!/bin/bash
# Script to rebuild and bring up TubeScript-API with proper YouTube support
set -e
REPO_DIR="/home/xesar/PycharmProjects/TubeScript-API"
cd "$REPO_DIR"
echo "======================================================"
echo " TubeScript-API - Fix & Restart"
echo "======================================================"
# 1. Stop the previous container if it exists
echo ""
echo ">>> [1/7] Stopping previous container..."
docker stop tubescript_api 2>/dev/null && echo " Stopped." || echo " Was not running."
docker rm tubescript_api 2>/dev/null && echo " Removed." || echo " Did not exist."
# 2. Build the image with an explicit tag (always without cache to force the latest yt-dlp)
echo ""
echo ">>> [2/7] Building image tubescript-api:latest ..."
docker build -f Dockerfile.api -t tubescript-api:latest .
echo " Build OK."
# 3. Ensure ./data permissions
echo ""
echo ">>> [3/7] Ensuring ./data permissions ..."
mkdir -p ./data
chown -R "$(id -u):$(id -g)" ./data 2>/dev/null || sudo chown -R "$(id -u):$(id -g)" ./data
chmod -R u+rwX ./data
ls -la ./data
echo " Permissions OK."
# 4. Create the network if it does not exist
echo ""
echo ">>> [4/7] Ensuring tubescript-network network ..."
docker network create tubescript-network 2>/dev/null && echo " Network created." || echo " Network already exists."
# 5. Start the container
echo ""
echo ">>> [5/7] Starting container ..."
docker run -d \
--name tubescript_api \
--network tubescript-network \
-p 8282:8000 \
-v "${REPO_DIR}/data:/app/data:rw" \
-e PYTHONUNBUFFERED=1 \
-e API_COOKIES_PATH=/app/data/cookies.txt \
--restart unless-stopped \
tubescript-api:latest
echo " Container started. Waiting for uvicorn to come up..."
sleep 6
# 6. Container checks
echo ""
echo ">>> [6/7] Container checks ..."
echo ""
echo "-- Status:"
docker ps --filter "name=tubescript_api" --format " ID={{.ID}} STATUS={{.Status}} PORTS={{.Ports}}"
echo ""
echo "-- uvicorn logs:"
docker logs tubescript_api 2>&1 | tail -6
echo ""
echo "-- Versions:"
docker exec tubescript_api sh -c "
echo ' node :' \$(node --version 2>/dev/null || echo 'not installed')
echo ' yt-dlp :' \$(yt-dlp --version 2>/dev/null || echo 'not installed')
"
# 7. Real yt-dlp test with player_client=android (avoids the n-challenge without extra Node)
echo ""
echo ">>> [7/7] yt-dlp test (android client) ..."
echo ""
echo "-- Without cookies (android client):"
docker exec tubescript_api yt-dlp \
--no-warnings --skip-download \
--extractor-args "youtube:player_client=android" \
--print title \
"https://www.youtube.com/watch?v=dQw4w9WgXcQ" 2>&1 \
&& echo " OK" || echo " FAILED"
echo ""
echo "-- With cookies (mweb client: accepts web cookies without the n-challenge):"
if [ -s "${REPO_DIR}/data/cookies.txt" ]; then
docker exec tubescript_api yt-dlp \
--cookies /app/data/cookies.txt \
--no-warnings --skip-download \
--extractor-args "youtube:player_client=mweb" \
--print title \
"https://www.youtube.com/watch?v=dQw4w9WgXcQ" 2>&1 \
&& echo " OK - title fetched with cookies" || echo " FAILED with cookies"
else
echo " NOTE: cookies.txt is empty or missing."
echo " Upload your cookies: curl 'http://127.0.0.1:8282/upload_cookies' -F 'file=@/path/to/cookies.txt'"
fi
echo ""
echo "-- /debug/metadata endpoint:"
sleep 2
curl -s --max-time 30 "http://127.0.0.1:8282/debug/metadata/dQw4w9WgXcQ" \
| python3 -c "
import sys, json
try:
d = json.loads(sys.stdin.read())
print(' title :', d.get('title','?'))
print(' is_live :', d.get('is_live','?'))
print(' id :', d.get('id','?'))
except Exception as e:
print(' ERROR:', e)
" 2>&1
echo ""
echo "======================================================"
echo " DONE."
echo " API: http://127.0.0.1:8282"
echo " Docs: http://127.0.0.1:8282/docs"
echo ""
echo " Upload cookies:"
echo " curl 'http://127.0.0.1:8282/upload_cookies' -F 'file=@./data/cookies.txt'"
echo "======================================================"

main.py: 1243 lines changed (file diff suppressed because it is too large)


@ -1,138 +0,0 @@
#!/bin/bash
# Sin cookies → android (sin n-challenge, sin Node.js)
# Con cookies → web + Node.js (Node.js resuelve n-challenge/signature)
# for_stream → android (mejor HLS en lives)
# Script de prueba completo — guarda TODO en /tmp/resultado.txt
exec > /tmp/resultado.txt 2>&1
REPO="/home/xesar/PycharmProjects/TubeScript-API"
cd "$REPO"
echo "=== $(date) ==="
# ---------- 1. Rebuild imagen ----------
echo "--- Parando contenedor anterior ---"
docker rm -f tubescript_api 2>/dev/null || true
echo "--- Construyendo imagen (CACHEBUST para forzar COPY . /app fresco) ---"
# --build-arg CACHEBUST=$(date +%s) invalida solo la capa COPY . /app
# (mucho más rápido que --no-cache que descarga todo desde cero)
docker build \
--build-arg CACHEBUST="$(date +%s)" \
-f Dockerfile.api \
-t tubescript-api:latest . 2>&1 | tail -8
echo "BUILD_RC=$?"
# ---------- 2. Levantar ----------
echo "--- Levantando contenedor ---"
docker run -d \
--name tubescript_api \
--network tubescript-network \
-p 8282:8000 \
-v "${REPO}/data:/app/data:rw" \
-e PYTHONUNBUFFERED=1 \
-e API_COOKIES_PATH=/app/data/cookies.txt \
--restart unless-stopped \
tubescript-api:latest
echo "RC_RUN=$?"
sleep 10
echo "--- docker ps ---"
docker ps --format "{{.Names}} {{.Status}} {{.Ports}}" | grep tube || echo "NO CORRIENDO"
echo "--- uvicorn logs ---"
docker logs tubescript_api 2>&1 | tail -4
echo "--- _yt_client_args en imagen (verificar lógica nueva) ---"
docker exec tubescript_api grep -A12 "def _yt_client_args" /app/main.py
echo ""
echo "=== PRUEBA A: android SIN cookies ==="
docker exec tubescript_api yt-dlp \
--no-warnings --skip-download \
--extractor-args "youtube:player_client=android" \
--print title \
"https://www.youtube.com/watch?v=dQw4w9WgXcQ" 2>&1
echo "RC_A=$?"
echo ""
echo "=== PRUEBA B: web + Node.js CON cookies ==="
docker exec tubescript_api yt-dlp \
--cookies /app/data/cookies.txt \
--no-warnings --skip-download \
--extractor-args "youtube:player_client=web" \
--js-runtimes "node:/usr/bin/node" \
--print title \
"https://www.youtube.com/watch?v=dQw4w9WgXcQ" 2>&1
echo "RC_B=$?"
echo ""
echo "=== PRUEBA C: endpoint /debug/metadata ==="
sleep 2
curl -s --max-time 30 "http://127.0.0.1:8282/debug/metadata/dQw4w9WgXcQ" \
| python3 -c "
import sys,json
raw=sys.stdin.read()
try:
d=json.loads(raw)
if 'detail' in d:
print('ERROR:', d['detail'][:200])
else:
print('title :', d.get('title','?'))
print('uploader:', d.get('uploader','?'))
print('duration:', d.get('duration','?'))
except Exception as e:
print('PARSE ERROR:', e)
print('RAW:', raw[:300])
"
echo "RC_C=$?"
echo ""
echo "=== PRUEBA D: endpoint /transcript?lang=en ==="
curl -s --max-time 90 "http://127.0.0.1:8282/transcript/dQw4w9WgXcQ?lang=en" \
| python3 -c "
import sys,json
raw=sys.stdin.read()
try:
d=json.loads(raw)
if 'detail' in d:
print('ERROR:', d['detail'][:200])
else:
print('count :', d.get('count','?'))
print('preview:', str(d.get('text','?'))[:120])
except Exception as e:
print('PARSE ERROR:', e)
print('RAW:', raw[:200])
"
echo "RC_D=$?"
echo ""
echo "=== PRUEBA E: /transcript/QjK5wq8L3Ac (sin subtítulos — mensaje claro esperado) ==="
curl -s --max-time 60 "http://127.0.0.1:8282/transcript/QjK5wq8L3Ac?lang=es" \
| python3 -c "
import sys,json
raw=sys.stdin.read()
try:
d=json.loads(raw)
if 'detail' in d:
print('DETAIL:', d['detail'][:250])
else:
print('OK count:', d.get('count','?'))
except Exception as e:
print('RAW:', raw[:200])
"
echo "RC_E=$?"
echo ""
echo "=== PRUEBA F: /debug/metadata/QjK5wq8L3Ac (title con cookies) ==="
curl -s --max-time 30 "http://127.0.0.1:8282/debug/metadata/QjK5wq8L3Ac" \
| python3 -c "
import sys,json
d=json.loads(sys.stdin.read())
print('title:',d.get('title','?')) if 'title' in d else print('ERROR:',d.get('detail','?')[:200])
"
echo "RC_F=$?"
echo ""
echo "=== FIN ==="


@@ -1,116 +0,0 @@
#!/bin/bash
# Full TubeScript-API test with real cookies
set -e
REPO="/home/xesar/PycharmProjects/TubeScript-API"
cd "$REPO"
LOG="/tmp/tubescript_test_$(date +%H%M%S).log"
echo "======================================================"
echo " TubeScript-API — Test completo"
echo " Log: $LOG"
echo "======================================================"
# ---------- 1. Rebuild the image ----------
echo ""
echo ">>> [1/5] Parando contenedor anterior..."
docker stop tubescript_api 2>/dev/null && echo " Parado." || echo " No estaba corriendo."
docker rm tubescript_api 2>/dev/null && echo " Eliminado." || echo " No existia."
echo ""
echo ">>> [2/5] Construyendo imagen sin caché..."
docker build --no-cache -f Dockerfile.api -t tubescript-api:latest . 2>&1 \
| grep -E "^#|DONE|ERROR|naming|Built" || true
echo " Build OK."
# ---------- 2. Start the container ----------
echo ""
echo ">>> [3/5] Levantando contenedor..."
docker run -d \
--name tubescript_api \
--network tubescript-network \
-p 8282:8000 \
-v "${REPO}/data:/app/data:rw" \
-e PYTHONUNBUFFERED=1 \
-e API_COOKIES_PATH=/app/data/cookies.txt \
--restart unless-stopped \
tubescript-api:latest
echo " Esperando arranque (8s)..."
sleep 8
docker logs tubescript_api 2>&1 | grep -E "Uvicorn running|startup|ERROR" | head -5
# ---------- 3. Verify the code inside the image ----------
echo ""
echo ">>> [4/5] Verificando lógica de player_client en imagen..."
echo " Líneas clave en main.py:"
docker exec tubescript_api grep -n "mweb\|_yt_client_args\|client =" /app/main.py | head -10
# ---------- 4. Direct yt-dlp tests ----------
echo ""
echo ">>> [5/5] Pruebas yt-dlp..."
echo ""
echo " [A] android SIN cookies (cliente base, sin n-challenge):"
docker exec tubescript_api yt-dlp \
--no-warnings --skip-download \
--extractor-args "youtube:player_client=android" \
--print title \
"https://www.youtube.com/watch?v=dQw4w9WgXcQ" 2>&1 \
&& echo " ✅ OK" || echo " ❌ FALLO"
echo ""
echo " [B] mweb,android CON cookies (mweb acepta cookies web, android como fallback):"
docker exec tubescript_api yt-dlp \
--cookies /app/data/cookies.txt \
--no-warnings --skip-download \
--extractor-args "youtube:player_client=mweb,android" \
--print title \
"https://www.youtube.com/watch?v=dQw4w9WgXcQ" 2>&1 \
&& echo " ✅ OK" || echo " ❌ FALLO"
echo ""
echo " [C] dump-json CON cookies (para /debug/metadata):"
docker exec tubescript_api yt-dlp \
--cookies /app/data/cookies.txt \
--no-warnings --skip-download \
--extractor-args "youtube:player_client=mweb" \
--dump-json \
"https://www.youtube.com/watch?v=dQw4w9WgXcQ" 2>&1 \
| python3 -c "import sys,json; d=json.loads(sys.stdin.read()); print(' title:', d.get('title')); print(' uploader:', d.get('uploader'))" \
&& echo " ✅ OK" || echo " ❌ FALLO"
# ---------- 5. API endpoints ----------
echo ""
echo " [D] Endpoint /debug/metadata:"
sleep 2
RESULT=$(curl -s --max-time 30 "http://127.0.0.1:8282/debug/metadata/dQw4w9WgXcQ")
echo "$RESULT" | python3 -c "
import sys,json
d=json.loads(sys.stdin.read())
if 'detail' in d:
print(' ❌ ERROR:', d['detail'][:200])
else:
print(' ✅ title :', d.get('title','?'))
print(' ✅ uploader:', d.get('uploader','?'))
print(' ✅ is_live :', d.get('is_live','?'))
" 2>&1
echo ""
echo " [E] Endpoint /transcript/dQw4w9WgXcQ?lang=en:"
RESULT2=$(curl -s --max-time 60 "http://127.0.0.1:8282/transcript/dQw4w9WgXcQ?lang=en")
echo "$RESULT2" | python3 -c "
import sys,json
d=json.loads(sys.stdin.read())
if 'detail' in d:
print(' ❌ ERROR:', d['detail'][:200])
else:
print(' ✅ count :', d.get('count','?'))
print(' ✅ preview :', str(d.get('text',''))[:100])
" 2>&1
echo ""
echo "======================================================"
echo " DONE. API: http://127.0.0.1:8282 Docs: http://127.0.0.1:8282/docs"
echo "======================================================"


@@ -1,27 +0,0 @@
Playwright extractor
=====================
This script opens a YouTube video with Playwright, captures network requests and looks
for M3U8/HLS URLs. Optionally it exports cookies in Netscape format to `./data/cookies.txt`.
Requirements (host):
pip install playwright
python -m playwright install
Example usage (headless, using your Chrome profile):
python3 tools/playwright_extract_m3u8.py --video https://www.youtube.com/watch?v=cmqVmX2UVBM --profile ~/.config/google-chrome --headless
If you don't use a profile, drop `--profile` and the script will open a temporary context.
JSON output:
{
"m3u8_urls": [ ... ],
"cookies_file": "./data/cookies.txt",
"errors": []
}
Tips:
- Run it on the host (not in a container) if you want to use your real Chrome profile.
- If Playwright cannot find the browser executable, run `python -m playwright install`.
- To check the cookies exported for the API: `curl -s http://localhost:8282/cookies/status`.
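The exported `./data/cookies.txt` is a plain Netscape cookie file (seven tab-separated fields per line). A minimal reader, shown here as a hypothetical helper that is not part of the repo, could look like:

```python
def read_netscape_cookies(path: str) -> list[dict]:
    """Parse a Netscape-format cookie file into a list of dicts.

    Each non-comment line has 7 tab-separated fields:
    domain, include-subdomains flag, path, secure, expires, name, value.
    """
    cookies = []
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            line = line.rstrip("\n")
            if not line or line.startswith("#"):
                continue
            fields = line.split("\t")
            if len(fields) != 7:
                continue  # skip malformed lines
            domain, flag, cpath, secure, expires, name, value = fields
            cookies.append({
                "domain": domain,
                "path": cpath,
                "secure": secure == "TRUE",
                "expires": int(expires),
                "name": name,
                "value": value,
            })
    return cookies
```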


@@ -1,113 +0,0 @@
#!/usr/bin/env python3
"""
expand_and_test_proxies.py
Reads tools/user_proxies.txt, generates variants (also trying SOCKS5/SOCKS5H on common ports)
and runs tools/generate_proxy_whitelist.py with the expanded list.
Usage:
python3 tools/expand_and_test_proxies.py
Output:
- tools/expanded_proxies.txt (the expanded list)
- invokes generate_proxy_whitelist.py, producing tools/whitelist.json and tools/whitelist.txt
"""
import os
import re
import subprocess
from pathlib import Path
BASE = Path(__file__).resolve().parent
USER_FILE = BASE / 'user_proxies.txt'
EXPANDED_FILE = BASE / 'expanded_proxies.txt'
GEN_SCRIPT = BASE / 'generate_proxy_whitelist.py'
COMMON_SOCKS_PORTS = [1080, 10808, 9050]
def normalize_line(line: str) -> str | None:
s = line.strip()
if not s or s.startswith('#'):
return None
return s
def parse_host_port(s: str):
# split into optional scheme, host and port
m = re.match(r'^(?:(?P<scheme>[a-zA-Z0-9+.-]+)://)?(?P<host>[^:/@]+)(?::(?P<port>\d+))?(?:@.*)?$', s)
if not m:
return None, None, None
scheme = m.group('scheme')
host = m.group('host')
port = m.group('port')
port = int(port) if port else None
return scheme, host, port
def build_variants(s: str):
scheme, host, port = parse_host_port(s)
variants = []
# keep original if it has scheme
if scheme:
variants.append(s)
else:
# assume http by default if none
if port:
variants.append(f'http://{host}:{port}')
else:
variants.append(f'http://{host}:80')
# Try socks5h on same port if port present
if port:
variants.append(f'socks5h://{host}:{port}')
# Try socks5h on common ports
for p in COMMON_SOCKS_PORTS:
variants.append(f'socks5h://{host}:{p}')
# Deduplicate preserving order
seen = set()
out = []
for v in variants:
if v in seen:
continue
seen.add(v)
out.append(v)
return out
def main():
if not USER_FILE.exists():
print(f'{USER_FILE} not found. Create the file with proxies (one per line).')
return
all_variants = []
with USER_FILE.open('r', encoding='utf-8') as fh:
for line in fh:
s = normalize_line(line)
if not s:
continue
variants = build_variants(s)
all_variants.extend(variants)
# write expanded file
with EXPANDED_FILE.open('w', encoding='utf-8') as fh:
for v in all_variants:
fh.write(v + '\n')
print(f'Wrote expanded proxies to {EXPANDED_FILE} ({len(all_variants)} entries)')
# Call generator
cmd = [ 'python3', str(GEN_SCRIPT), '--input', str(EXPANDED_FILE), '--out-json', str(BASE / 'whitelist.json'), '--out-txt', str(BASE / 'whitelist.txt'), '--test-url', 'https://www.youtube.com/watch?v=dQw4w9WgXcQ', '--concurrency', '6']
print('Running generator...')
try:
res = subprocess.run(cmd, capture_output=True, text=True, timeout=600)
print('Generator exit code:', res.returncode)
print('stdout:\n', res.stdout)
print('stderr:\n', res.stderr)
except Exception as e:
print('Error running generator:', e)
if __name__ == '__main__':
main()
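For a single bare `host:port` line, the expansion above produces one http:// candidate plus socks5h candidates on the same port and on the common SOCKS ports, deduplicated in order. A compact standalone restatement of that logic (for illustration only):

```python
COMMON_SOCKS_PORTS = [1080, 10808, 9050]

def expand_one(host: str, port: int) -> list[str]:
    """Expansion for a bare host:port entry, mirroring build_variants:
    keep an http:// form, then try socks5h on the same port and on the
    common SOCKS ports, deduplicating while preserving order."""
    variants = [f"http://{host}:{port}", f"socks5h://{host}:{port}"]
    variants += [f"socks5h://{host}:{p}" for p in COMMON_SOCKS_PORTS]
    seen: set[str] = set()
    out = []
    for v in variants:
        if v not in seen:
            seen.add(v)
            out.append(v)
    return out
```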


@@ -1,30 +0,0 @@
http://48.210.225.96:80
socks5h://48.210.225.96:80
socks5h://48.210.225.96:1080
socks5h://48.210.225.96:10808
socks5h://48.210.225.96:9050
http://107.174.231.218:8888
socks5h://107.174.231.218:8888
socks5h://107.174.231.218:1080
socks5h://107.174.231.218:10808
socks5h://107.174.231.218:9050
http://188.239.43.6:80
socks5h://188.239.43.6:80
socks5h://188.239.43.6:1080
socks5h://188.239.43.6:10808
socks5h://188.239.43.6:9050
http://52.229.30.3:80
socks5h://52.229.30.3:80
socks5h://52.229.30.3:1080
socks5h://52.229.30.3:10808
socks5h://52.229.30.3:9050
http://142.93.202.130:3128
socks5h://142.93.202.130:3128
socks5h://142.93.202.130:1080
socks5h://142.93.202.130:10808
socks5h://142.93.202.130:9050
http://154.219.101.86:8888
socks5h://154.219.101.86:8888
socks5h://154.219.101.86:1080
socks5h://154.219.101.86:10808
socks5h://154.219.101.86:9050


@@ -1,242 +0,0 @@
#!/usr/bin/env python3
"""
generate_proxy_whitelist.py
Reads a list of proxies from a file (proxies.txt), tests each one with yt-dlp
by fetching minimal YouTube metadata, measures latency, and generates:
- whitelist.json : structured list of proxies with status and metrics
- whitelist.txt  : valid proxies only, sorted by latency
proxies.txt format: one URL per line, e.g.:
socks5h://127.0.0.1:1080
http://10.0.0.1:3128
Usage:
python3 tools/generate_proxy_whitelist.py --input tools/proxies.txt --out-json tools/whitelist.json --test-url https://www.youtube.com/watch?v=dQw4w9WgXcQ
Notes:
- Requires `yt-dlp` installed in the environment where this script runs.
- yt-dlp is used because it directly validates that the proxy works for
  YouTube calls (including JS/signature handling where applicable).
- Adjust timeouts and concurrency to your needs.
"""
import argparse
import json
import subprocess
import time
import os
from concurrent.futures import ThreadPoolExecutor, as_completed
from urllib.parse import urlparse
import requests
# yt-dlp messages that indicate a block / bot check
BOT_MARKERS = ("sign in to confirm", "not a bot", "sign in to", "HTTP Error 403", "HTTP Error 429")
def test_proxy(proxy: str, test_url: str, timeout: int = 25) -> dict:
"""Prueba un proxy ejecutando yt-dlp --dump-json sobre test_url.
Retorna dict con info: proxy, ok, rc, stderr, elapsed_ms, stdout_preview
"""
proxy = proxy.strip()
if not proxy:
return {"proxy": proxy, "ok": False, "error": "empty"}
cmd = [
"yt-dlp",
"--skip-download",
"--dump-json",
"--no-warnings",
"--extractor-args", "youtube:player_client=tv_embedded",
"--socket-timeout", "10",
test_url,
"--proxy", proxy,
]
start = time.perf_counter()
try:
proc = subprocess.run(cmd, capture_output=True, text=True, timeout=timeout)
elapsed = (time.perf_counter() - start) * 1000.0
stdout = proc.stdout or ""
stderr = proc.stderr or ""
rc = proc.returncode
# success heuristic: rc == 0, non-empty stdout, and no bot markers in stderr
stderr_low = stderr.lower()
bot_hit = any(m.lower() in stderr_low for m in BOT_MARKERS)
ok = (rc == 0 and stdout.strip() != "" and not bot_hit)
return {
"proxy": proxy,
"ok": ok,
"rc": rc,
"elapsed_ms": int(elapsed),
"bot_detected": bool(bot_hit),
"stderr_preview": stderr[:1000],
"stdout_preview": stdout[:2000],
}
except subprocess.TimeoutExpired:
elapsed = (time.perf_counter() - start) * 1000.0
return {"proxy": proxy, "ok": False, "error": "timeout", "elapsed_ms": int(elapsed)}
except FileNotFoundError:
return {"proxy": proxy, "ok": False, "error": "yt-dlp-not-found"}
except Exception as e:
elapsed = (time.perf_counter() - start) * 1000.0
return {"proxy": proxy, "ok": False, "error": str(e), "elapsed_ms": int(elapsed)}
def generate_whitelist(input_file: str, out_json: str, out_txt: str, test_url: str, concurrency: int = 6):
proxies = []
with open(input_file, 'r', encoding='utf-8') as fh:
for line in fh:
line = line.strip()
if not line or line.startswith('#'):
continue
proxies.append(line)
results = []
with ThreadPoolExecutor(max_workers=concurrency) as ex:
futures = {ex.submit(test_proxy, p, test_url): p for p in proxies}
for fut in as_completed(futures):
try:
r = fut.result()
except Exception as e:
r = {"proxy": futures[fut], "ok": False, "error": str(e)}
results.append(r)
print(f"Tested: {r.get('proxy')} ok={r.get('ok')} rc={r.get('rc', '-') } elapsed={r.get('elapsed_ms','-')}ms")
# Sort valid proxies by ascending latency
valid = [r for r in results if r.get('ok')]
valid_sorted = sorted(valid, key=lambda x: x.get('elapsed_ms', 999999))
# Save the full JSON results
out = {"tested_at": int(time.time()), "test_url": test_url, "results": results, "valid_count": len(valid_sorted)}
with open(out_json, 'w', encoding='utf-8') as fh:
json.dump(out, fh, indent=2, ensure_ascii=False)
# Save the TXT whitelist in preferred order
with open(out_txt, 'w', encoding='utf-8') as fh:
for r in valid_sorted:
fh.write(r['proxy'] + '\n')
return out, valid_sorted
def _extract_proxies_from_json(obj):
"""Dado un objeto JSON (parsed), intenta extraer una lista de proxies en forma de URLs.
Soporta varias estructuras comunes:
- lista simple de strings: ["socks5h://1.2.3.4:1080", ...]
- lista de objetos con keys como ip, port, protocol
- objetos anidados con 'proxy' o 'url' o 'address'
"""
proxies = []
if isinstance(obj, list):
for item in obj:
if isinstance(item, str):
proxies.append(item.strip())
elif isinstance(item, dict):
# try common keys
# e.g.: {"ip":"1.2.3.4","port":1080, "protocol":"socks5"}
ip = item.get('ip') or item.get('host') or item.get('address') or item.get('ip_address')
port = item.get('port') or item.get('p')
proto = item.get('protocol') or item.get('proto') or item.get('type') or item.get('scheme')
if ip and port:
proto = proto or 'http'
proxies.append(f"{proto}://{ip}:{port}")
continue
# look for keys whose values may contain a URL
for k in ('proxy','url','address','connect'):
v = item.get(k)
if isinstance(v, str) and v.strip():
proxies.append(v.strip())
break
elif isinstance(obj, dict):
# find lists nested inside the dict
for v in obj.values():
if isinstance(v, (list, dict)):
proxies.extend(_extract_proxies_from_json(v))
# the dict itself may have a 'proxy'-like field
for k in ('proxies','list','data'):
if k in obj and isinstance(obj[k], (list,dict)):
proxies.extend(_extract_proxies_from_json(obj[k]))
return [p for p in proxies if p]
def download_and_write_proxies(url: str, out_file: str) -> int:
"""Descarga JSON desde `url`, extrae proxies y las escribe en `out_file`.
Retorna número de proxies escritas.
"""
try:
r = requests.get(url, timeout=30)
r.raise_for_status()
data = r.json()
except Exception as e:
raise RuntimeError(f"Error descargando/parsing JSON desde {url}: {e}")
proxies = _extract_proxies_from_json(data)
# normalize: turn a bare 'ip:port' into http://ip:port
normalized = []
for p in proxies:
p = p.strip()
if not p:
continue
# bare 'ip:port' (or 'ip port')
if ':' in p and not p.lower().startswith(('http://','https://','socks5://','socks5h://','socks4://')):
# assume http
normalized.append('http://' + p)
else:
normalized.append(p)
# dedup preserving order
seen = set()
out = []
for p in normalized:
if p in seen:
continue
seen.add(p)
out.append(p)
if not out:
# a flat JSON of objects with 'ip' and 'port' was already handled above;
# if nothing was extracted, fail
raise RuntimeError(f"No proxies extracted from JSON: {url}")
with open(out_file, 'w', encoding='utf-8') as fh:
for p in out:
fh.write(p + '\n')
return len(out)
if __name__ == '__main__':
parser = argparse.ArgumentParser(description='Test a list of proxies with yt-dlp and generate a whitelist')
parser.add_argument('--input', default='tools/proxies.txt', help='Input file with proxies (one per line)')
parser.add_argument('--out-json', default='tools/whitelist.json', help='Output JSON results')
parser.add_argument('--out-txt', default='tools/whitelist.txt', help='Output whitelist (one proxy per line)')
parser.add_argument('--test-url', default='https://www.youtube.com/watch?v=dQw4w9WgXcQ', help='YouTube test URL to use')
parser.add_argument('--concurrency', type=int, default=6, help='Concurrent workers')
parser.add_argument('--from-url', default='', help='Download a JSON of proxies from a URL and use it as input')
args = parser.parse_args()
# If from-url provided, download and write to temporary input file
input_file = args.input
temp_written = False
try:
if args.from_url:
print(f"Downloading proxies JSON from: {args.from_url}")
written = download_and_write_proxies(args.from_url, input_file)
print(f"Wrote {written} proxies to {input_file}")
temp_written = True
if not os.path.exists(input_file):
print(f"Input file {input_file} not found. Create it with one proxy per line or use --from-url.")
raise SystemExit(1)
out, valid_sorted = generate_whitelist(input_file, args.out_json, args.out_txt, args.test_url, args.concurrency)
print('\nSummary:')
print(f" Tested: {len(out['results'])}, Valid: {len(valid_sorted)}")
print(f" JSON: {args.out_json}, TXT whitelist: {args.out_txt}")
finally:
# optionally remove temp file? keep it for inspection
pass
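The pass/fail decision inside `test_proxy` reduces to three checks; isolated here as a pure function (the function name is illustrative, only `BOT_MARKERS` comes from the script):

```python
# yt-dlp messages that indicate a block / bot check
BOT_MARKERS = ("sign in to confirm", "not a bot", "sign in to",
               "HTTP Error 403", "HTTP Error 429")

def looks_successful(rc: int, stdout: str, stderr: str) -> bool:
    """Success heuristic: exit code 0, non-empty metadata on stdout,
    and no bot-check markers anywhere in stderr (case-insensitive)."""
    stderr_low = stderr.lower()
    bot_hit = any(m.lower() in stderr_low for m in BOT_MARKERS)
    return rc == 0 and stdout.strip() != "" and not bot_hit
```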


@@ -1,177 +0,0 @@
#!/usr/bin/env python3
"""playwright_extract_m3u8.py
Opens a YouTube page with Playwright and captures the first m3u8/HLS URL
seen in the network requests. It can also export cookies in Netscape
format for use with yt-dlp / the API.
Usage:
python3 tools/playwright_extract_m3u8.py --video https://www.youtube.com/watch?v=ID [--profile /path/to/profile] [--headless]
Requirements (host):
pip install playwright
python -m playwright install
Notes:
- Run it on the host (not in the container) to reuse your Chrome profile and
  so Playwright can show a UI if you need to log in manually.
- If you pass --profile, a persistent session is launched from that directory
  (useful to reuse an already-logged-in Chrome session). If omitted, a clean
  context is used.
"""
import argparse
import os
import json
import time
from pathlib import Path
try:
from playwright.sync_api import sync_playwright, TimeoutError as PWTimeout
except Exception as e:
print("playwright no está instalado. Instala con: pip install playwright && python -m playwright install")
raise
def write_netscape_cookie_file(cookies, target_path):
# cookies: list of dicts like Playwright provides
lines = ["# Netscape HTTP Cookie File"]
for c in cookies:
domain = c.get("domain", "")
flag = "TRUE" if domain.startswith('.') else "FALSE"
path = c.get("path", "/")
secure = "TRUE" if c.get("secure") else "FALSE"
expires = str(int(c.get("expires", 0))) if c.get("expires") else "0"
name = c.get("name", "")
value = c.get("value", "")
lines.append("\t".join([domain, flag, path, secure, expires, name, value]))
Path(target_path).parent.mkdir(parents=True, exist_ok=True)
with open(target_path, "w", encoding="utf-8") as fh:
fh.write("\n".join(lines) + "\n")
def extract_m3u8(video_url: str, profile: str | None, headless: bool, timeout: int = 45, save_cookies: bool = True):
result = {"m3u8_urls": [], "cookies_file": None, "errors": []}
data_dir = Path.cwd() / "data"
data_dir.mkdir(exist_ok=True)
target_cookies = str(data_dir / "cookies.txt")
with sync_playwright() as p:
# Use Chromium for best compatibility with a Chrome profile
browser_type = p.chromium
# set a realistic User-Agent to mimic a regular browser
ua = "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36"
extra_headers = {"Accept-Language": "en-US,en;q=0.9", "Accept": "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8"}
launch_args = ['--no-sandbox', '--disable-setuid-sandbox', '--disable-dev-shm-usage']
if profile:
# persistent context uses a profile dir (user data dir)
user_data_dir = profile
# avoid passing user_agent due to some Playwright builds missing API; set headers only
context = browser_type.launch_persistent_context(user_data_dir=user_data_dir, headless=headless, extra_http_headers=extra_headers, args=launch_args)
else:
# pass common args to help in container environments
browser = browser_type.launch(headless=headless, args=launch_args)
# do not pass user_agent param; rely on browser default and headers
context = browser.new_context(extra_http_headers=extra_headers)
# debug info
try:
print(f"[playwright] started browser headless={headless} profile={'yes' if profile else 'no'}")
except Exception:
pass
page = context.new_page()
collected = set()
def on_response(resp):
try:
url = resp.url
# heuristic: m3u8 in the URL or in the response content type
if ".m3u8" in url.lower():
collected.add(url)
else:
ct = resp.headers.get("content-type", "")
if "application/vnd.apple.mpegurl" in ct or "vnd.apple.mpegurl" in ct or "application/x-mpegURL" in ct:
collected.add(url)
except Exception:
pass
page.on("response", on_response)
try:
page.goto(video_url, timeout=timeout * 1000)
# wait a bit so the manifest requests have time to fire
wait_seconds = 6
for i in range(wait_seconds):
time.sleep(1)
# break early if we already found something
if collected:
break
# If no m3u8 was found, try to force the player to start playing
if not collected:
try:
# click play
page.evaluate("() => { const v = document.querySelector('video'); if (v) v.play(); }")
except Exception:
pass
# wait a bit longer
time.sleep(3)
# collect URLs
result_urls = list(collected)
# deduplicate and sort
result_urls = sorted(set(result_urls))
result['m3u8_urls'] = result_urls
# save cookies if requested
if save_cookies:
try:
cookies = context.cookies()
write_netscape_cookie_file(cookies, target_cookies)
result['cookies_file'] = target_cookies
except Exception as e:
result['errors'].append(f"cookie_export_error:{e}")
except PWTimeout as e:
result['errors'].append(f"page_timeout: {e}")
except Exception as e:
import traceback
result['errors'].append(traceback.format_exc())
finally:
# try to close the context and browser if they exist
try:
if 'context' in locals() and context:
try:
context.close()
except Exception:
pass
except Exception:
pass
try:
if 'browser' in locals() and browser:
try:
browser.close()
except Exception:
pass
except Exception:
pass
return result
if __name__ == '__main__':
parser = argparse.ArgumentParser(description='Playwright m3u8 extractor for YouTube')
parser.add_argument('--video', required=True, help='Video URL or ID (e.g. https://www.youtube.com/watch?v=ID)')
parser.add_argument('--profile', default='', help='Path to browser profile (user data dir) to reuse logged session')
parser.add_argument('--headless', action='store_true', help='Run headless')
parser.add_argument('--timeout', type=int, default=45, help='Timeout for page load (seconds)')
parser.add_argument('--no-cookies', dest='save_cookies', action='store_false', help='Don\'t save cookies to ./data/cookies.txt')
args = parser.parse_args()
video = args.video
if len(video) == 11 and not video.startswith('http'):
video = f'https://www.youtube.com/watch?v={video}'
res = extract_m3u8(video, profile=args.profile or None, headless=args.headless, timeout=args.timeout, save_cookies=args.save_cookies)
print(json.dumps(res, indent=2, ensure_ascii=False))
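The `on_response` detection heuristic boils down to two checks per response; a standalone sketch (slightly generalized in that it lowercases the content type, where the script compares a few spellings verbatim):

```python
HLS_CONTENT_TYPES = ("application/vnd.apple.mpegurl", "application/x-mpegurl")

def is_hls_response(url: str, content_type: str) -> bool:
    """True if the response looks like an HLS manifest: '.m3u8' in the
    URL, or an HLS manifest content type."""
    if ".m3u8" in url.lower():
        return True
    ct = content_type.lower()
    return any(t in ct for t in HLS_CONTENT_TYPES)
```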


@@ -1,10 +0,0 @@
# Proxies provided by the user (format: scheme://ip:port)
# Source: JSON list supplied by the user; verified by Google (field "google": true)
http://48.210.225.96:80
http://107.174.231.218:8888
http://188.239.43.6:80
http://52.229.30.3:80
http://142.93.202.130:3128
http://154.219.101.86:8888


@@ -1,256 +0,0 @@
{
"tested_at": 1772912928,
"test_url": "https://www.youtube.com/watch?v=dQw4w9WgXcQ",
"results": [
{
"proxy": "http://107.174.231.218:8888",
"ok": false,
"rc": 1,
"elapsed_ms": 2714,
"bot_detected": false,
"stderr_preview": "ERROR: [youtube] dQw4w9WgXcQ: Unable to download API page: ('Unable to connect to proxy', OSError('Tunnel connection failed: 400 Bad Request')) (caused by ProxyError(\"('Unable to connect to proxy', OSError('Tunnel connection failed: 400 Bad Request'))\")); please report this issue on https://github.com/yt-dlp/yt-dlp/issues?q= , filling out the appropriate issue template. Confirm you are on the latest version using yt-dlp -U\n",
"stdout_preview": ""
},
{
"proxy": "socks5h://107.174.231.218:8888",
"ok": false,
"rc": 1,
"elapsed_ms": 1473,
"bot_detected": false,
"stderr_preview": "ERROR: [youtube] dQw4w9WgXcQ: Unable to download API page: ('[Errno 0] Invalid response version from server. Expected 05 got 48', InvalidVersionError(0, 'Invalid response version from server. Expected 05 got 48')) (caused by ProxyError(\"('[Errno 0] Invalid response version from server. Expected 05 got 48', InvalidVersionError(0, 'Invalid response version from server. Expected 05 got 48'))\")); please report this issue on https://github.com/yt-dlp/yt-dlp/issues?q= , filling out the appropriate issue template. Confirm you are on the latest version using yt-dlp -U\n",
"stdout_preview": ""
},
{
"proxy": "socks5h://48.210.225.96:9050",
"ok": false,
"rc": 1,
"elapsed_ms": 4559,
"bot_detected": false,
"stderr_preview": "ERROR: [youtube] dQw4w9WgXcQ: Unable to download API page: ('[Errno 0] Invalid response version from server. Expected 05 got 48', InvalidVersionError(0, 'Invalid response version from server. Expected 05 got 48')) (caused by ProxyError(\"('[Errno 0] Invalid response version from server. Expected 05 got 48', InvalidVersionError(0, 'Invalid response version from server. Expected 05 got 48'))\")); please report this issue on https://github.com/yt-dlp/yt-dlp/issues?q= , filling out the appropriate issue template. Confirm you are on the latest version using yt-dlp -U\n",
"stdout_preview": ""
},
{
"proxy": "socks5h://48.210.225.96:80",
"ok": false,
"rc": 1,
"elapsed_ms": 4850,
"bot_detected": false,
"stderr_preview": "ERROR: [youtube] dQw4w9WgXcQ: Unable to download API page: ('[Errno 0] Invalid response version from server. Expected 05 got 48', InvalidVersionError(0, 'Invalid response version from server. Expected 05 got 48')) (caused by ProxyError(\"('[Errno 0] Invalid response version from server. Expected 05 got 48', InvalidVersionError(0, 'Invalid response version from server. Expected 05 got 48'))\")); please report this issue on https://github.com/yt-dlp/yt-dlp/issues?q= , filling out the appropriate issue template. Confirm you are on the latest version using yt-dlp -U\n",
"stdout_preview": ""
},
{
"proxy": "http://48.210.225.96:80",
"ok": false,
"rc": 1,
"elapsed_ms": 5159,
"bot_detected": false,
"stderr_preview": "ERROR: [youtube] dQw4w9WgXcQ: Unable to download API page: ('Unable to connect to proxy', OSError('Tunnel connection failed: 400 Bad Request')) (caused by ProxyError(\"('Unable to connect to proxy', OSError('Tunnel connection failed: 400 Bad Request'))\")); please report this issue on https://github.com/yt-dlp/yt-dlp/issues?q= , filling out the appropriate issue template. Confirm you are on the latest version using yt-dlp -U\n",
"stdout_preview": ""
},
{
"proxy": "socks5h://107.174.231.218:1080",
"ok": false,
"rc": 1,
"elapsed_ms": 1057,
"bot_detected": false,
"stderr_preview": "ERROR: [youtube] dQw4w9WgXcQ: Unable to download API page: SocksHTTPSConnection(host='www.youtube.com', port=443): Failed to establish a new connection: [Errno 111] Connection refused (caused by TransportError(\"SocksHTTPSConnection(host='www.youtube.com', port=443): Failed to establish a new connection: [Errno 111] Connection refused\"))\n",
"stdout_preview": ""
},
{
"proxy": "socks5h://107.174.231.218:10808",
"ok": false,
"rc": 1,
"elapsed_ms": 1208,
"bot_detected": false,
"stderr_preview": "ERROR: [youtube] dQw4w9WgXcQ: Unable to download API page: SocksHTTPSConnection(host='www.youtube.com', port=443): Failed to establish a new connection: [Errno 111] Connection refused (caused by TransportError(\"SocksHTTPSConnection(host='www.youtube.com', port=443): Failed to establish a new connection: [Errno 111] Connection refused\"))\n",
"stdout_preview": ""
},
{
"proxy": "socks5h://107.174.231.218:9050",
"ok": false,
"rc": 1,
"elapsed_ms": 1123,
"bot_detected": false,
"stderr_preview": "ERROR: [youtube] dQw4w9WgXcQ: Unable to download API page: SocksHTTPSConnection(host='www.youtube.com', port=443): Failed to establish a new connection: [Errno 111] Connection refused (caused by TransportError(\"SocksHTTPSConnection(host='www.youtube.com', port=443): Failed to establish a new connection: [Errno 111] Connection refused\"))\n",
"stdout_preview": ""
},
{
"proxy": "socks5h://188.239.43.6:80",
"ok": false,
"rc": 1,
"elapsed_ms": 7075,
"bot_detected": false,
"stderr_preview": "ERROR: [youtube] dQw4w9WgXcQ: Unable to download API page: SocksHTTPSConnection(host='www.youtube.com', port=443): Failed to establish a new connection: [Errno 104] Connection reset by peer (caused by TransportError(\"SocksHTTPSConnection(host='www.youtube.com', port=443): Failed to establish a new connection: [Errno 104] Connection reset by peer\"))\n",
"stdout_preview": ""
},
{
"proxy": "http://188.239.43.6:80",
"ok": false,
"rc": 1,
"elapsed_ms": 7192,
"bot_detected": false,
"stderr_preview": "ERROR: [youtube] dQw4w9WgXcQ: Unable to download API page: ('Connection aborted.', ConnectionResetError(104, 'Connection reset by peer')) (caused by TransportError(\"('Connection aborted.', ConnectionResetError(104, 'Connection reset by peer'))\"))\n",
"stdout_preview": ""
},
{
"proxy": "http://52.229.30.3:80",
"ok": false,
"rc": 1,
"elapsed_ms": 2332,
"bot_detected": false,
"stderr_preview": "ERROR: [youtube] dQw4w9WgXcQ: Unable to download API page: ('Unable to connect to proxy', OSError('Tunnel connection failed: 400 Bad Request')) (caused by ProxyError(\"('Unable to connect to proxy', OSError('Tunnel connection failed: 400 Bad Request'))\")); please report this issue on https://github.com/yt-dlp/yt-dlp/issues?q= , filling out the appropriate issue template. Confirm you are on the latest version using yt-dlp -U\n",
"stdout_preview": ""
},
{
"proxy": "socks5h://52.229.30.3:80",
"ok": false,
"rc": 1,
"elapsed_ms": 2265,
"bot_detected": false,
"stderr_preview": "ERROR: [youtube] dQw4w9WgXcQ: Unable to download API page: ('[Errno 0] Invalid response version from server. Expected 05 got 48', InvalidVersionError(0, 'Invalid response version from server. Expected 05 got 48')) (caused by ProxyError(\"('[Errno 0] Invalid response version from server. Expected 05 got 48', InvalidVersionError(0, 'Invalid response version from server. Expected 05 got 48'))\")); please report this issue on https://github.com/yt-dlp/yt-dlp/issues?q= , filling out the appropriate issue template. Confirm you are on the latest version using yt-dlp -U\n",
"stdout_preview": ""
},
{
"proxy": "socks5h://48.210.225.96:1080",
"ok": false,
"error": "timeout",
"elapsed_ms": 25022
},
{
"proxy": "socks5h://48.210.225.96:10808",
"ok": false,
"error": "timeout",
"elapsed_ms": 25036
},
{
"proxy": "socks5h://52.229.30.3:9050",
"ok": false,
"rc": 1,
"elapsed_ms": 2430,
"bot_detected": false,
"stderr_preview": "ERROR: [youtube] dQw4w9WgXcQ: Unable to download API page: ('[Errno 0] Invalid response version from server. Expected 05 got 48', InvalidVersionError(0, 'Invalid response version from server. Expected 05 got 48')) (caused by ProxyError(\"('[Errno 0] Invalid response version from server. Expected 05 got 48', InvalidVersionError(0, 'Invalid response version from server. Expected 05 got 48'))\")); please report this issue on https://github.com/yt-dlp/yt-dlp/issues?q= , filling out the appropriate issue template. Confirm you are on the latest version using yt-dlp -U\n",
"stdout_preview": ""
},
{
"proxy": "http://142.93.202.130:3128",
"ok": false,
"rc": 1,
"elapsed_ms": 1668,
"bot_detected": false,
"stderr_preview": "ERROR: [youtube] dQw4w9WgXcQ: Unable to download API page: ('Unable to connect to proxy', OSError('Tunnel connection failed: 400 Bad Request')) (caused by ProxyError(\"('Unable to connect to proxy', OSError('Tunnel connection failed: 400 Bad Request'))\")); please report this issue on https://github.com/yt-dlp/yt-dlp/issues?q= , filling out the appropriate issue template. Confirm you are on the latest version using yt-dlp -U\n",
"stdout_preview": ""
},
{
"proxy": "socks5h://142.93.202.130:3128",
"ok": false,
"rc": 1,
"elapsed_ms": 1652,
"bot_detected": false,
"stderr_preview": "ERROR: [youtube] dQw4w9WgXcQ: Unable to download API page: ('[Errno 0] Invalid response version from server. Expected 05 got 48', InvalidVersionError(0, 'Invalid response version from server. Expected 05 got 48')) (caused by ProxyError(\"('[Errno 0] Invalid response version from server. Expected 05 got 48', InvalidVersionError(0, 'Invalid response version from server. Expected 05 got 48'))\")); please report this issue on https://github.com/yt-dlp/yt-dlp/issues?q= , filling out the appropriate issue template. Confirm you are on the latest version using yt-dlp -U\n",
"stdout_preview": ""
},
{
"proxy": "socks5h://188.239.43.6:1080",
"ok": false,
"error": "timeout",
"elapsed_ms": 25031
},
{
"proxy": "socks5h://188.239.43.6:10808",
"ok": false,
"error": "timeout",
"elapsed_ms": 25030
},
{
"proxy": "socks5h://142.93.202.130:1080",
"ok": false,
"rc": 1,
"elapsed_ms": 1364,
"bot_detected": false,
"stderr_preview": "ERROR: [youtube] dQw4w9WgXcQ: Unable to download API page: SocksHTTPSConnection(host='www.youtube.com', port=443): Failed to establish a new connection: [Errno 111] Connection refused (caused by TransportError(\"SocksHTTPSConnection(host='www.youtube.com', port=443): Failed to establish a new connection: [Errno 111] Connection refused\"))\n",
"stdout_preview": ""
},
{
"proxy": "socks5h://142.93.202.130:10808",
"ok": false,
"rc": 1,
"elapsed_ms": 1405,
"bot_detected": false,
"stderr_preview": "ERROR: [youtube] dQw4w9WgXcQ: Unable to download API page: SocksHTTPSConnection(host='www.youtube.com', port=443): Failed to establish a new connection: [Errno 111] Connection refused (caused by TransportError(\"SocksHTTPSConnection(host='www.youtube.com', port=443): Failed to establish a new connection: [Errno 111] Connection refused\"))\n",
"stdout_preview": ""
},
{
"proxy": "socks5h://142.93.202.130:9050",
"ok": false,
"rc": 1,
"elapsed_ms": 1322,
"bot_detected": false,
"stderr_preview": "ERROR: [youtube] dQw4w9WgXcQ: Unable to download API page: SocksHTTPSConnection(host='www.youtube.com', port=443): Failed to establish a new connection: [Errno 111] Connection refused (caused by TransportError(\"SocksHTTPSConnection(host='www.youtube.com', port=443): Failed to establish a new connection: [Errno 111] Connection refused\"))\n",
"stdout_preview": ""
},
{
"proxy": "socks5h://154.219.101.86:1080",
"ok": false,
"rc": 1,
"elapsed_ms": 2199,
"bot_detected": false,
"stderr_preview": "ERROR: [youtube] dQw4w9WgXcQ: Unable to download API page: SocksHTTPSConnection(host='www.youtube.com', port=443): Failed to establish a new connection: [Errno 111] Connection refused (caused by TransportError(\"SocksHTTPSConnection(host='www.youtube.com', port=443): Failed to establish a new connection: [Errno 111] Connection refused\"))\n",
"stdout_preview": ""
},
{
"proxy": "http://154.219.101.86:8888",
"ok": false,
"rc": 1,
"elapsed_ms": 3651,
"bot_detected": false,
"stderr_preview": "ERROR: [youtube] dQw4w9WgXcQ: Unable to download API page: ('Unable to connect to proxy', OSError('Tunnel connection failed: 400 Bad Request')) (caused by ProxyError(\"('Unable to connect to proxy', OSError('Tunnel connection failed: 400 Bad Request'))\")); please report this issue on https://github.com/yt-dlp/yt-dlp/issues?q= , filling out the appropriate issue template. Confirm you are on the latest version using yt-dlp -U\n",
"stdout_preview": ""
},
{
"proxy": "socks5h://154.219.101.86:8888",
"ok": false,
"rc": 1,
"elapsed_ms": 3628,
"bot_detected": false,
"stderr_preview": "ERROR: [youtube] dQw4w9WgXcQ: Unable to download API page: ('[Errno 0] Invalid response version from server. Expected 05 got 48', InvalidVersionError(0, 'Invalid response version from server. Expected 05 got 48')) (caused by ProxyError(\"('[Errno 0] Invalid response version from server. Expected 05 got 48', InvalidVersionError(0, 'Invalid response version from server. Expected 05 got 48'))\")); please report this issue on https://github.com/yt-dlp/yt-dlp/issues?q= , filling out the appropriate issue template. Confirm you are on the latest version using yt-dlp -U\n",
"stdout_preview": ""
},
{
"proxy": "socks5h://154.219.101.86:10808",
"ok": false,
"rc": 1,
"elapsed_ms": 1981,
"bot_detected": false,
"stderr_preview": "ERROR: [youtube] dQw4w9WgXcQ: Unable to download API page: SocksHTTPSConnection(host='www.youtube.com', port=443): Failed to establish a new connection: [Errno 111] Connection refused (caused by TransportError(\"SocksHTTPSConnection(host='www.youtube.com', port=443): Failed to establish a new connection: [Errno 111] Connection refused\"))\n",
"stdout_preview": ""
},
{
"proxy": "socks5h://188.239.43.6:9050",
"ok": false,
"error": "timeout",
"elapsed_ms": 25023
},
{
"proxy": "socks5h://154.219.101.86:9050",
"ok": false,
"rc": 1,
"elapsed_ms": 1962,
"bot_detected": false,
"stderr_preview": "ERROR: [youtube] dQw4w9WgXcQ: Unable to download API page: SocksHTTPSConnection(host='www.youtube.com', port=443): Failed to establish a new connection: [Errno 111] Connection refused (caused by TransportError(\"SocksHTTPSConnection(host='www.youtube.com', port=443): Failed to establish a new connection: [Errno 111] Connection refused\"))\n",
"stdout_preview": ""
},
{
"proxy": "socks5h://52.229.30.3:1080",
"ok": false,
"error": "timeout",
"elapsed_ms": 25026
},
{
"proxy": "socks5h://52.229.30.3:10808",
"ok": false,
"error": "timeout",
"elapsed_ms": 25028
}
],
"valid_count": 0
}
