derp/plugins/searx.py
feat: playlist shuffle, lazy resolution, TTS ducking, kept repair
Music:
- #random URL fragment shuffles playlist tracks before enqueuing
- Lazy playlist resolution: first 10 tracks resolve immediately,
  remaining are fetched in a background task
- !kept repair re-downloads kept tracks with missing local files
- !kept shows [MISSING] marker for tracks without local files
- TTS ducking: music ducks when merlin speaks via voice peer,
  smooth restore after TTS finishes
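The lazy-resolution behaviour above can be sketched with plain asyncio. This is a minimal illustration, not the plugin's actual code: `resolve_track`, `enqueue_playlist`, and the `EAGER_COUNT` constant are hypothetical names standing in for whatever the music plugin really uses.

```python
import asyncio

EAGER_COUNT = 10  # hypothetical: tracks resolved before playback starts


async def resolve_track(entry):
    """Stand-in resolver; the real one would hit the network."""
    await asyncio.sleep(0)
    return {"title": entry, "resolved": True}


async def enqueue_playlist(entries, queue):
    """Resolve the first EAGER_COUNT tracks now, the rest in the background."""
    for entry in entries[:EAGER_COUNT]:
        queue.append(await resolve_track(entry))

    async def resolve_rest():
        for entry in entries[EAGER_COUNT:]:
            queue.append(await resolve_track(entry))

    # Schedule the remainder without blocking the caller.
    return asyncio.create_task(resolve_rest())


async def _demo():
    queue: list = []
    background = await enqueue_playlist([f"track{i}" for i in range(15)], queue)
    eager = len(queue)   # tracks available immediately
    await background     # normally left running while playback starts
    return eager, len(queue)


eager, total = asyncio.run(_demo())
```

Because `create_task` only schedules the background coroutine, the caller gets control back as soon as the eager tracks are queued.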

Performance (from profiling):
- Connection pool: preload_content=True for SOCKS connection reuse
- Pool tuning: 30 pools / 8 connections (up from 20/4)
- _PooledResponse wrapper for stdlib-compatible read interface
- Iterative _extract_videos (replace 51K-deep recursion with stack)
- proxy=False for local SearXNG
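The iterative `_extract_videos` rewrite replaces recursion with an explicit stack, which is what lets it survive 51K-deep payloads. A minimal sketch of the pattern, assuming a hypothetical `"videoId"` marker key rather than the plugin's real schema:

```python
def extract_videos(node):
    """Walk an arbitrarily nested dict/list payload with an explicit
    stack, collecting video ids, so nesting depth never hits Python's
    recursion limit (default ~1000 frames)."""
    videos = []
    stack = [node]
    while stack:
        current = stack.pop()
        if isinstance(current, dict):
            if "videoId" in current:
                videos.append(current["videoId"])
            stack.extend(current.values())
        elif isinstance(current, list):
            stack.extend(current)
    return videos


# A payload nested far deeper than the recursion limit:
deep = {"videoId": "v0"}
for _ in range(50_000):
    deep = {"child": deep}
found = extract_videos(deep)  # a recursive walker would overflow here
```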

Voice + multi-bot:
- Per-bot voice config lookup ([<username>.voice] in TOML)
- Mute detection: skip duck silence when all users muted
- Autoplay shuffle deck (no repeats until full cycle)
- Seek clamp to track duration (prevent seek-past-end stall)
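The "no repeats until full cycle" shuffle is the classic deck pattern: draw from a shuffled copy until it empties, then reshuffle. A minimal sketch under that assumption (the `ShuffleDeck` name is illustrative, not the plugin's):

```python
import random


class ShuffleDeck:
    """Draw tracks in random order with no repeats until every track
    has been drawn once; then the deck reshuffles for the next cycle."""

    def __init__(self, tracks):
        self._tracks = list(tracks)
        self._deck: list = []

    def draw(self):
        if not self._deck:            # cycle complete: reshuffle a fresh copy
            self._deck = self._tracks[:]
            random.shuffle(self._deck)
        return self._deck.pop()


deck = ShuffleDeck(["a", "b", "c"])
cycle_one = {deck.draw() for _ in range(3)}  # every track exactly once
cycle_two = {deck.draw() for _ in range(3)}  # next cycle covers all again
```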

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
2026-02-22 16:21:47 +01:00


"""Plugin: SearXNG web search."""
from __future__ import annotations
import json
import urllib.parse
import urllib.request
from derp.http import urlopen as _urlopen
from derp.plugin import command
# -- Constants ---------------------------------------------------------------
_SEARX_URL = "https://searx.mymx.me/search"
_FETCH_TIMEOUT = 10
_MAX_RESULTS = 3
_MAX_TITLE_LEN = 80
_MAX_QUERY_LEN = 200
# -- Pure helpers ------------------------------------------------------------
def _truncate(text: str, max_len: int = _MAX_TITLE_LEN) -> str:
"""Truncate text with ellipsis if needed."""
if len(text) <= max_len:
return text
return text[: max_len - 3].rstrip() + "..."
# -- SearXNG search (blocking) ----------------------------------------------
def _search(query: str) -> list[dict]:
"""Search SearXNG. Blocking.
Returns list of dicts with keys: title, url, snippet.
Raises on HTTP or parse errors.
"""
params = urllib.parse.urlencode({"q": query, "format": "json"})
url = f"{_SEARX_URL}?{params}"
req = urllib.request.Request(url, method="GET")
resp = _urlopen(req, timeout=_FETCH_TIMEOUT, proxy=False)
raw = resp.read()
resp.close()
data = json.loads(raw)
results: list[dict] = []
for item in data.get("results", []):
results.append({
"title": item.get("title", ""),
"url": item.get("url", ""),
"snippet": item.get("content", item.get("snippet", "")),
})
return results
# -- Command handler ---------------------------------------------------------
@command("searx", help="Search: !searx <query>")
async def cmd_searx(bot, message):
"""Search SearXNG and show top results.
Usage: !searx <query...>
"""
if not message.is_channel:
await bot.reply(message, "Use this command in a channel")
return
parts = message.text.split(None, 1)
if len(parts) < 2 or not parts[1].strip():
await bot.reply(message, "Usage: !searx <query>")
return
query = parts[1].strip()
if len(query) > _MAX_QUERY_LEN:
await bot.reply(message, f"Query too long (max {_MAX_QUERY_LEN} chars)")
return
import asyncio
loop = asyncio.get_running_loop()
try:
results = await loop.run_in_executor(None, _search, query)
except Exception as exc:
await bot.reply(message, f"Search failed: {exc}")
return
if not results:
await bot.reply(message, f"No results for: {query}")
return
for item in results[:_MAX_RESULTS]:
title = _truncate(item["title"]) if item["title"] else "(no title)"
url = item["url"]
await bot.reply(message, f"{title} -- {url}")