feat: add wave 4 plugins (opslog, note, subdomain, headers)
- Opslog: timestamped operational log per channel with add, list, search, and delete. SQLite-backed; clearing is admin-only.
- Note: persistent per-channel key-value store with set, get, del, list, and clear. SQLite-backed; clearing is admin-only.
- Subdomain: enumeration via crt.sh CT log queries, with optional DNS brute force using a built-in ~90-word prefix wordlist. Discovered subdomains are resolved concurrently.
- Headers: HTTP header fingerprinting against 50+ signature patterns. Detects servers, frameworks, CDNs, and security headers (HSTS, CSP, XFO, etc.).

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
@@ -55,6 +55,10 @@ make down # Stop
 | torcheck  | tor       | Tor exit node check (local list) |
 | iprep     | iprep     | IP reputation (Firehol/ET blocklists) |
 | cve       | cve       | CVE lookup + search (local NVD mirror) |
+| opslog    | opslog    | Timestamped operational notes (SQLite) |
+| note      | note      | Per-channel persistent key-value store |
+| subdomain | subdomain | Subdomain enum (crt.sh + DNS brute) |
+| headers   | headers   | HTTP header fingerprinting |
 | example   | echo      | Demo plugin |

 ## Writing Plugins
ROADMAP.md (12 lines changed)

@@ -39,7 +39,7 @@
 - [x] Admin/owner permission system (hostmask + IRCOP)
 - [x] !whoami and !admins commands

-## v0.4.0 -- Wave 3 Plugins (Local Databases) (current)
+## v0.4.0 -- Wave 3 Plugins (Local Databases) (done)

 - [ ] GeoIP plugin (MaxMind GeoLite2-City mmdb)
 - [ ] ASN plugin (GeoLite2-ASN mmdb)
@@ -48,12 +48,12 @@
 - [ ] CVE lookup plugin (local NVD JSON feed)
 - [ ] Data update script (cron-friendly, all local DBs)

-## v0.5.0 -- Wave 4 Plugins (Advanced)
+## v0.5.0 -- Wave 4 Plugins (Advanced) (current)

-- [ ] Operational logging plugin (SQLite per-channel)
-- [ ] Persistent notes plugin (per-channel key-value)
-- [ ] Subdomain enumeration (crt.sh + wordlist DNS brute)
-- [ ] HTTP header fingerprinting (local signature db)
+- [x] Operational logging plugin (SQLite per-channel)
+- [x] Persistent notes plugin (per-channel key-value)
+- [x] Subdomain enumeration (crt.sh + wordlist DNS brute)
+- [x] HTTP header fingerprinting (local signature db)
 - [ ] ExploitDB search (local CSV clone)
 - [ ] Payload template library (SQLi, XSS, SSTI)
TASKS.md (17 lines changed)

@@ -1,21 +1,22 @@
 # derp - Tasks

-## Current Sprint -- v0.4.0 Wave 3 (2026-02-15)
+## Current Sprint -- v0.5.0 Wave 4 (2026-02-15)

 | Pri | Status | Task |
 |-----|--------|------|
-| P0 | [x] | GeoIP plugin (GeoLite2-City mmdb) |
-| P0 | [x] | ASN plugin (GeoLite2-ASN mmdb) |
-| P0 | [x] | Tor exit node check plugin |
-| P0 | [x] | IP reputation plugin (Firehol blocklists) |
-| P0 | [x] | CVE lookup plugin (NVD JSON feed) |
-| P0 | [x] | Data update script (scripts/update-data.sh) |
-| P0 | [x] | Documentation update (all docs current) |
+| P0 | [x] | Opslog plugin (SQLite per-channel notes) |
+| P0 | [x] | Note plugin (per-channel key-value store) |
+| P0 | [x] | Subdomain plugin (crt.sh + DNS brute force) |
+| P0 | [x] | Headers plugin (HTTP header fingerprinting) |
+| P1 | [ ] | ExploitDB search plugin (local CSV clone) |
+| P1 | [ ] | Payload template plugin (SQLi, XSS, SSTI) |
+| P1 | [x] | Documentation update |

 ## Completed

 | Date | Task |
 |------|------|
+| 2026-02-15 | Wave 4 batch 1 (opslog, note, subdomain, headers) |
 | 2026-02-15 | Wave 3 plugins (geoip, asn, torcheck, iprep, cve) + update script |
 | 2026-02-15 | Admin/owner permission system (hostmask + IRCOP) |
 | 2026-02-15 | SASL PLAIN, rate limiting, CTCP responses |
@@ -86,6 +86,25 @@ IRC operators are auto-detected via WHO. Hostmask patterns use fnmatch.
 !cert example.com        # CT log lookup (max 5 domains)
 !whois example.com       # WHOIS domain lookup
 !whois 8.8.8.8           # WHOIS IP lookup
+!subdomain example.com         # CT log subdomain enum
+!subdomain example.com brute   # + DNS wordlist brute
+!headers example.com           # HTTP fingerprint (tech + security)
+```
+
+## Ops
+
+```
+!opslog add Compromised target   # Add timestamped entry
+!opslog list                     # Show last 10 entries
+!opslog list 10                  # Show last 10
+!opslog search pivot             # Search entries
+!opslog del 3                    # Delete entry by ID
+!opslog clear                    # Clear channel log (admin)
+!note set target 10.0.0.1        # Store a note
+!note get target                 # Retrieve a note
+!note del target                 # Delete a note
+!note list                       # List all keys
+!note clear                      # Clear all notes (admin)
 ```

 ## Red Team
@@ -90,6 +90,10 @@ level = "info" # Logging level: debug, info, warning, error
 | `!tor <ip\|update>` | Check IP against Tor exit nodes |
 | `!iprep <ip\|update>` | Check IP against Firehol/ET blocklists |
 | `!cve <id\|search>` | CVE lookup from local NVD mirror |
+| `!opslog <add\|list\|search\|del\|clear>` | Timestamped operational log |
+| `!note <set\|get\|del\|list\|clear>` | Per-channel key-value notes |
+| `!subdomain <domain> [brute]` | Subdomain enumeration (crt.sh + DNS) |
+| `!headers <url>` | HTTP header fingerprinting |

 ### Command Shorthand
plugins/headers.py (new file, 183 lines):

```python
"""Plugin: HTTP header fingerprinting with local signature patterns."""

from __future__ import annotations

import asyncio
import logging
import re
import ssl
import urllib.error
import urllib.request

from derp.plugin import command

log = logging.getLogger(__name__)

_TIMEOUT = 10
_USER_AGENT = "Mozilla/5.0 (compatible; derp-bot/1.0)"

# -- Signature database -------------------------------------------------------
# Each entry: (header_name, pattern_regex, technology_label)
# Patterns are case-insensitive.

_SIGNATURES: list[tuple[str, str, str]] = [
    # Web servers
    ("Server", r"Apache/?(\S+)?", "Apache {0}"),
    ("Server", r"nginx/?(\S+)?", "nginx {0}"),
    ("Server", r"Microsoft-IIS/?(\S+)?", "IIS {0}"),
    ("Server", r"LiteSpeed", "LiteSpeed"),
    ("Server", r"Caddy", "Caddy"),
    ("Server", r"openresty/?(\S+)?", "OpenResty {0}"),
    ("Server", r"Cowboy", "Cowboy (Erlang)"),
    ("Server", r"gunicorn/?(\S+)?", "Gunicorn {0}"),
    ("Server", r"uvicorn", "Uvicorn"),
    ("Server", r"Werkzeug/?(\S+)?", "Werkzeug {0}"),
    ("Server", r"Kestrel", "Kestrel (.NET)"),
    ("Server", r"Jetty", "Jetty (Java)"),

    # Frameworks / languages
    ("X-Powered-By", r"PHP/?(\S+)?", "PHP {0}"),
    ("X-Powered-By", r"ASP\.NET", "ASP.NET"),
    ("X-Powered-By", r"Express", "Express (Node.js)"),
    ("X-Powered-By", r"Next\.js", "Next.js"),
    ("X-Powered-By", r"Phusion Passenger", "Passenger"),
    ("X-Powered-By", r"Django", "Django"),
    ("X-Powered-By", r"Flask", "Flask"),
    ("X-AspNet-Version", r"(\S+)", "ASP.NET {0}"),
    ("X-Drupal-Cache", r".*", "Drupal"),
    ("X-Generator", r"WordPress", "WordPress"),
    ("X-Generator", r"Drupal", "Drupal"),
    ("X-Shopify-Stage", r".*", "Shopify"),
    ("X-Wix-Request-Id", r".*", "Wix"),

    # CDN / proxy
    ("CF-RAY", r".*", "Cloudflare"),
    ("X-Cache", r".*cloudfront.*", "CloudFront"),
    ("X-Served-By", r".*cache.*", "Fastly/Varnish"),
    ("X-Varnish", r".*", "Varnish"),
    ("Via", r".*varnish.*", "Varnish"),
    ("Via", r".*cloudfront.*", "CloudFront"),
    ("X-Akamai-Transformed", r".*", "Akamai"),
    ("X-Azure-Ref", r".*", "Azure CDN"),
    ("X-Vercel-Id", r".*", "Vercel"),
    ("X-Netlify-Request-Id", r".*", "Netlify"),
    ("Fly-Request-Id", r".*", "Fly.io"),

    # Security headers (presence check)
    ("Strict-Transport-Security", r".*", "HSTS"),
    ("Content-Security-Policy", r".*", "CSP"),
    ("X-Content-Type-Options", r"nosniff", "X-CTO"),
    ("X-Frame-Options", r".*", "XFO"),
    ("X-XSS-Protection", r".*", "X-XSS"),
    ("Permissions-Policy", r".*", "Permissions-Policy"),
    ("Referrer-Policy", r".*", "Referrer-Policy"),
]

# Compile patterns once
_COMPILED: list[tuple[str, re.Pattern, str]] = []
for _hdr, _pat, _label in _SIGNATURES:
    _COMPILED.append((_hdr.lower(), re.compile(_pat, re.IGNORECASE), _label))


def _fetch_headers(url: str) -> tuple[dict[str, str], str]:
    """Blocking GET request. Returns (headers_dict, error_str)."""
    ctx = ssl.create_default_context()
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE

    opener = urllib.request.build_opener(
        urllib.request.HTTPSHandler(context=ctx),
    )
    req = urllib.request.Request(url, method="GET")
    req.add_header("User-Agent", _USER_AGENT)

    try:
        resp = opener.open(req, timeout=_TIMEOUT)
        hdrs = {k.lower(): v for k, v in resp.headers.items()}
        resp.close()
        return hdrs, ""
    except urllib.error.HTTPError as exc:
        hdrs = {k.lower(): v for k, v in exc.headers.items()}
        return hdrs, ""
    except Exception as exc:
        return {}, str(exc)[:100]


def _fingerprint(headers: dict[str, str]) -> tuple[list[str], list[str]]:
    """Match headers against signature database.

    Returns (tech_list, security_list).
    """
    tech: list[str] = []
    security: list[str] = []
    seen: set[str] = set()

    for hdr_lower, pattern, label_fmt in _COMPILED:
        value = headers.get(hdr_lower, "")
        if not value:
            continue
        m = pattern.search(value)
        if not m:
            continue

        # Format label with captured groups
        groups = m.groups()
        label = label_fmt
        for i, g in enumerate(groups):
            label = label.replace(f"{{{i}}}", g or "")
        label = label.strip()

        if label in seen:
            continue
        seen.add(label)

        # Categorize: security headers vs tech
        if hdr_lower in ("strict-transport-security", "content-security-policy",
                         "x-content-type-options", "x-frame-options",
                         "x-xss-protection", "permissions-policy",
                         "referrer-policy"):
            security.append(label)
        else:
            tech.append(label)

    return tech, security


@command("headers", help="HTTP fingerprint: !headers <url>")
async def cmd_headers(bot, message):
    """Fetch HTTP headers and fingerprint server technology.

    Usage:
        !headers example.com
        !headers https://10.0.0.1:8080
    """
    parts = message.text.split(None, 2)
    if len(parts) < 2:
        await bot.reply(message, "Usage: !headers <url>")
        return

    url = parts[1]
    if not url.startswith(("http://", "https://")):
        url = f"https://{url}"

    loop = asyncio.get_running_loop()
    headers, error = await loop.run_in_executor(None, _fetch_headers, url)

    if error:
        await bot.reply(message, f"{url} -> error: {error}")
        return

    if not headers:
        await bot.reply(message, f"{url} -> no headers received")
        return

    tech, security = _fingerprint(headers)

    out = []
    if tech:
        out.append(f"Tech: {', '.join(tech)}")
    if security:
        out.append(f"Security: {', '.join(security)}")
    if not out:
        out.append("No signatures matched")

    await bot.reply(message, f"{url} -> {' | '.join(out)}")
```
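As a standalone sanity check of the signature-matching approach in plugins/headers.py, the capture-group-to-`{0}`-template substitution can be exercised without any network I/O. This is an illustrative sketch, not part of the commit; `SIGS` and `match_label` are hypothetical names mirroring the plugin's `(header, pattern, label)` shape:

```python
import re

# Hypothetical mini-database mirroring the (header, pattern, label) tuples
# used by plugins/headers.py.
SIGS = [
    ("server", re.compile(r"nginx/?(\S+)?", re.IGNORECASE), "nginx {0}"),
    ("server", re.compile(r"Apache/?(\S+)?", re.IGNORECASE), "Apache {0}"),
]

def match_label(headers: dict[str, str]) -> list[str]:
    """Return formatted labels for every signature that matches."""
    out = []
    for hdr, pat, fmt in SIGS:
        m = pat.search(headers.get(hdr, ""))
        if not m:
            continue
        # Substitute each captured group into its {i} placeholder;
        # an optional group that did not match becomes the empty string.
        label = fmt
        for i, g in enumerate(m.groups()):
            label = label.replace(f"{{{i}}}", g or "")
        out.append(label.strip())
    return out

print(match_label({"server": "nginx/1.25.3"}))  # ['nginx 1.25.3']
print(match_label({"server": "Apache"}))        # ['Apache']
```

The `.strip()` at the end is what makes version-less matches ("Apache" with no `/2.4.x`) come out clean instead of with a trailing space.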
plugins/note.py (new file, 130 lines):

```python
"""Plugin: per-channel persistent key-value notes (SQLite)."""

from __future__ import annotations

import logging
import sqlite3
from pathlib import Path

from derp.plugin import command

log = logging.getLogger(__name__)

_DB_PATH = Path("data/notes.db")
_MAX_LIST = 20

_conn: sqlite3.Connection | None = None


def _db() -> sqlite3.Connection:
    """Lazy-init the database connection and schema."""
    global _conn
    if _conn is not None:
        return _conn
    _DB_PATH.parent.mkdir(parents=True, exist_ok=True)
    _conn = sqlite3.connect(str(_DB_PATH))
    _conn.execute("""
        CREATE TABLE IF NOT EXISTS notes (
            channel TEXT NOT NULL,
            key TEXT NOT NULL,
            value TEXT NOT NULL,
            nick TEXT NOT NULL,
            PRIMARY KEY (channel, key)
        )
    """)
    _conn.commit()
    return _conn


@command("note", help="Notes: !note set|get|del|list|clear")
async def cmd_note(bot, message):
    """Per-channel persistent key-value store.

    Usage:
        !note set <key> <value>   Store a note
        !note get <key>           Retrieve a note
        !note del <key>           Delete a note
        !note list                List all keys
        !note clear               Clear all notes for this channel (admin)
    """
    parts = message.text.split(None, 3)
    if len(parts) < 2:
        await bot.reply(message, "Usage: !note <set|get|del|list|clear> [args]")
        return

    sub = parts[1].lower()
    channel = message.target or "dm"

    if sub == "set":
        if len(parts) < 4:
            await bot.reply(message, "Usage: !note set <key> <value>")
            return
        key = parts[2].lower()
        value = parts[3]
        db = _db()
        db.execute(
            "INSERT OR REPLACE INTO notes (channel, key, value, nick) VALUES (?, ?, ?, ?)",
            (channel, key, value, message.nick or "?"),
        )
        db.commit()
        await bot.reply(message, f"{key}: saved")

    elif sub == "get":
        if len(parts) < 3:
            await bot.reply(message, "Usage: !note get <key>")
            return
        key = parts[2].lower()
        db = _db()
        row = db.execute(
            "SELECT value, nick FROM notes WHERE channel = ? AND key = ?",
            (channel, key),
        ).fetchone()
        if row:
            value, nick = row
            await bot.reply(message, f"{key}: {value} (set by {nick})")
        else:
            await bot.reply(message, f"{key}: not found")

    elif sub == "del":
        if len(parts) < 3:
            await bot.reply(message, "Usage: !note del <key>")
            return
        key = parts[2].lower()
        db = _db()
        cur = db.execute(
            "DELETE FROM notes WHERE channel = ? AND key = ?",
            (channel, key),
        )
        db.commit()
        if cur.rowcount:
            await bot.reply(message, f"{key}: deleted")
        else:
            await bot.reply(message, f"{key}: not found")

    elif sub == "list":
        db = _db()
        rows = db.execute(
            "SELECT key FROM notes WHERE channel = ? ORDER BY key LIMIT ?",
            (channel, _MAX_LIST),
        ).fetchall()
        if not rows:
            await bot.reply(message, "No notes")
            return
        keys = [r[0] for r in rows]
        total = db.execute(
            "SELECT COUNT(*) FROM notes WHERE channel = ?", (channel,),
        ).fetchone()[0]
        suffix = f" ({total} total)" if total > _MAX_LIST else ""
        await bot.reply(message, f"Notes: {', '.join(keys)}{suffix}")

    elif sub == "clear":
        if not bot._is_admin(message):
            await bot.reply(message, "Permission denied: clear requires admin")
            return
        db = _db()
        cur = db.execute("DELETE FROM notes WHERE channel = ?", (channel,))
        db.commit()
        await bot.reply(message, f"Cleared {cur.rowcount} notes")

    else:
        await bot.reply(message, "Usage: !note <set|get|del|list|clear> [args]")
```
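The `INSERT OR REPLACE` upsert in plugins/note.py relies on the `(channel, key)` composite primary key: re-setting a key overwrites the old row, while the same key in another channel stays independent. A minimal in-memory sketch of that behavior (illustrative only; `note_set` is not part of the plugin):

```python
import sqlite3

# In-memory copy of the notes schema from plugins/note.py.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE notes (
        channel TEXT NOT NULL,
        key TEXT NOT NULL,
        value TEXT NOT NULL,
        nick TEXT NOT NULL,
        PRIMARY KEY (channel, key)
    )
""")

def note_set(channel: str, key: str, value: str, nick: str) -> None:
    # Same upsert the plugin issues: the (channel, key) primary key makes
    # a second set on the same key replace the old row instead of erroring.
    conn.execute(
        "INSERT OR REPLACE INTO notes (channel, key, value, nick) VALUES (?, ?, ?, ?)",
        (channel, key, value, nick),
    )

note_set("#ops", "target", "10.0.0.1", "alice")
note_set("#ops", "target", "10.0.0.2", "bob")       # overwrites alice's row
note_set("#dev", "target", "192.168.0.5", "carol")  # separate channel, separate row

rows = conn.execute(
    "SELECT channel, value, nick FROM notes ORDER BY channel"
).fetchall()
print(rows)  # [('#dev', '192.168.0.5', 'carol'), ('#ops', '10.0.0.2', 'bob')]
```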
plugins/opslog.py (new file, 140 lines):

```python
"""Plugin: timestamped operational log (SQLite per-channel)."""

from __future__ import annotations

import logging
import sqlite3
from datetime import datetime, timezone
from pathlib import Path

from derp.plugin import command

log = logging.getLogger(__name__)

_DB_PATH = Path("data/opslog.db")
_MAX_LIST = 10
_MAX_SEARCH = 10

_conn: sqlite3.Connection | None = None


def _db() -> sqlite3.Connection:
    """Lazy-init the database connection and schema."""
    global _conn
    if _conn is not None:
        return _conn
    _DB_PATH.parent.mkdir(parents=True, exist_ok=True)
    _conn = sqlite3.connect(str(_DB_PATH))
    _conn.execute("""
        CREATE TABLE IF NOT EXISTS entries (
            id INTEGER PRIMARY KEY AUTOINCREMENT,
            channel TEXT NOT NULL,
            nick TEXT NOT NULL,
            ts TEXT NOT NULL,
            message TEXT NOT NULL
        )
    """)
    _conn.execute("CREATE INDEX IF NOT EXISTS idx_entries_channel ON entries(channel)")
    _conn.commit()
    return _conn


@command("opslog", help="Op log: !opslog add|list|search|del|clear")
async def cmd_opslog(bot, message):
    """Timestamped operational log per channel.

    Usage:
        !opslog add <text>      Add a log entry
        !opslog list [n]        Show last n entries (default 10)
        !opslog search <term>   Search entries
        !opslog del <id>        Delete an entry
        !opslog clear           Clear all entries for this channel (admin)
    """
    parts = message.text.split(None, 2)
    if len(parts) < 2:
        await bot.reply(message, "Usage: !opslog <add|list|search|del|clear> [args]")
        return

    sub = parts[1].lower()
    rest = parts[2] if len(parts) > 2 else ""
    channel = message.target or "dm"

    if sub == "add":
        if not rest:
            await bot.reply(message, "Usage: !opslog add <text>")
            return
        ts = datetime.now(timezone.utc).strftime("%Y-%m-%d %H:%M")
        db = _db()
        cur = db.execute(
            "INSERT INTO entries (channel, nick, ts, message) VALUES (?, ?, ?, ?)",
            (channel, message.nick or "?", ts, rest),
        )
        db.commit()
        await bot.reply(message, f"[{cur.lastrowid}] logged")

    elif sub == "list":
        limit = _MAX_LIST
        if rest:
            try:
                limit = min(int(rest), _MAX_LIST)
            except ValueError:
                pass
        db = _db()
        rows = db.execute(
            "SELECT id, nick, ts, message FROM entries WHERE channel = ? "
            "ORDER BY id DESC LIMIT ?",
            (channel, limit),
        ).fetchall()
        if not rows:
            await bot.reply(message, "No entries")
            return
        for row_id, nick, ts, msg in reversed(rows):
            await bot.reply(message, f"[{row_id}] {ts} <{nick}> {msg}")

    elif sub == "search":
        if not rest:
            await bot.reply(message, "Usage: !opslog search <term>")
            return
        db = _db()
        rows = db.execute(
            "SELECT id, nick, ts, message FROM entries "
            "WHERE channel = ? AND message LIKE ? ORDER BY id DESC LIMIT ?",
            (channel, f"%{rest}%", _MAX_SEARCH),
        ).fetchall()
        if not rows:
            await bot.reply(message, f"No entries matching '{rest}'")
            return
        for row_id, nick, ts, msg in reversed(rows):
            await bot.reply(message, f"[{row_id}] {ts} <{nick}> {msg}")

    elif sub == "del":
        if not rest:
            await bot.reply(message, "Usage: !opslog del <id>")
            return
        try:
            entry_id = int(rest)
        except ValueError:
            await bot.reply(message, "Invalid ID")
            return
        db = _db()
        cur = db.execute(
            "DELETE FROM entries WHERE id = ? AND channel = ?",
            (entry_id, channel),
        )
        db.commit()
        if cur.rowcount:
            await bot.reply(message, f"Deleted entry {entry_id}")
        else:
            await bot.reply(message, f"Entry {entry_id} not found")

    elif sub == "clear":
        if not bot._is_admin(message):
            await bot.reply(message, "Permission denied: clear requires admin")
            return
        db = _db()
        cur = db.execute("DELETE FROM entries WHERE channel = ?", (channel,))
        db.commit()
        await bot.reply(message, f"Cleared {cur.rowcount} entries")

    else:
        await bot.reply(message, "Usage: !opslog <add|list|search|del|clear> [args]")
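The `!opslog list` query in plugins/opslog.py fetches the newest n rows with `ORDER BY id DESC LIMIT ?`, then iterates `reversed(rows)` so the entries print oldest-first in the channel. A quick in-memory demonstration of that ordering trick (standalone sketch, not part of the commit):

```python
import sqlite3

# Minimal copy of the entries schema from plugins/opslog.py.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE entries (id INTEGER PRIMARY KEY AUTOINCREMENT, "
    "channel TEXT, nick TEXT, ts TEXT, message TEXT)"
)
for i in range(1, 8):
    conn.execute(
        "INSERT INTO entries (channel, nick, ts, message) VALUES (?, ?, ?, ?)",
        ("#ops", "alice", f"2026-02-15 10:0{i}", f"entry {i}"),
    )

# Same query shape as !opslog list 3: grab the newest 3 rows by id...
rows = conn.execute(
    "SELECT id, message FROM entries WHERE channel = ? ORDER BY id DESC LIMIT 3",
    ("#ops",),
).fetchall()
# ...then reverse so they read oldest-first, the way a log should scroll.
ordered = list(reversed(rows))
print(ordered)  # [(5, 'entry 5'), (6, 'entry 6'), (7, 'entry 7')]
```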
plugins/subdomain.py (new file, 229 lines):

```python
"""Plugin: subdomain enumeration via crt.sh + DNS brute force."""

from __future__ import annotations

import asyncio
import ipaddress
import json
import logging
import os
import re
import socket
import struct
import urllib.request

from derp.plugin import command

log = logging.getLogger(__name__)

_CRTSH_URL = "https://crt.sh/?q=%25.{domain}&output=json"
_CRTSH_TIMEOUT = 30
_DNS_TIMEOUT = 3.0
_MAX_RESULTS = 20

# Built-in wordlist for brute force (common subdomain prefixes)
_WORDLIST = [
    "www", "mail", "ftp", "smtp", "pop", "imap", "webmail", "mx",
    "ns1", "ns2", "ns3", "dns", "dns1", "dns2",
    "dev", "staging", "stage", "test", "qa", "uat", "beta", "demo",
    "api", "app", "web", "portal", "admin", "panel", "dashboard",
    "vpn", "remote", "gateway", "proxy", "cdn", "static", "assets",
    "media", "img", "images", "files", "download", "upload",
    "db", "database", "mysql", "postgres", "redis", "mongo", "elastic",
    "git", "svn", "repo", "ci", "jenkins", "build",
    "ldap", "ad", "auth", "sso", "login", "id", "accounts",
    "docs", "wiki", "help", "support", "status", "monitor",
    "backup", "bak", "old", "new", "internal", "intranet", "corp",
    "shop", "store", "blog", "forum", "crm", "erp",
    "cloud", "aws", "s3", "gcp", "azure",
    "mx1", "mx2", "relay", "smtp2", "autodiscover",
]

_DOMAIN_RE = re.compile(r"^[a-zA-Z0-9]([a-zA-Z0-9-]*[a-zA-Z0-9])?(\.[a-zA-Z]{2,})+$")


def _get_resolver() -> str:
    """Read first IPv4 nameserver from /etc/resolv.conf."""
    try:
        with open("/etc/resolv.conf") as f:
            for line in f:
                line = line.strip()
                if line.startswith("nameserver"):
                    addr = line.split()[1]
                    try:
                        ipaddress.IPv4Address(addr)
                        return addr
                    except ValueError:
                        continue
    except (OSError, IndexError):
        pass
    return "8.8.8.8"


def _build_a_query(name: str) -> bytes:
    """Build a minimal DNS A query."""
    tid = os.urandom(2)
    flags = struct.pack("!H", 0x0100)
    counts = struct.pack("!HHHH", 1, 0, 0, 0)
    encoded = b""
    for label in name.rstrip(".").split("."):
        encoded += bytes([len(label)]) + label.encode("ascii")
    encoded += b"\x00"
    return tid + flags + counts + encoded + struct.pack("!HH", 1, 1)


def _parse_a_response(data: bytes) -> list[str]:
    """Extract A record IPs from a DNS response."""
    if len(data) < 12:
        return []
    _, flags, _, ancount = struct.unpack_from("!HHHH", data, 0)
    rcode = flags & 0x0F
    if rcode != 0 or ancount == 0:
        return []

    offset = 12
    # Skip question section
    while offset < len(data) and data[offset] != 0:
        if (data[offset] & 0xC0) == 0xC0:
            offset += 2
            break
        offset += data[offset] + 1
    else:
        offset += 1
    offset += 4  # QTYPE + QCLASS

    results = []
    for _ in range(ancount):
        if offset + 12 > len(data):
            break
        # Skip name (may be pointer)
        if (data[offset] & 0xC0) == 0xC0:
            offset += 2
        else:
            while offset < len(data) and data[offset] != 0:
                offset += data[offset] + 1
            offset += 1
        rtype, _, _, rdlength = struct.unpack_from("!HHIH", data, offset)
        offset += 10
        if rtype == 1 and rdlength == 4:
            results.append(socket.inet_ntoa(data[offset:offset + 4]))
        offset += rdlength
    return results


def _resolve_a(name: str, server: str) -> list[str]:
    """Blocking DNS A lookup. Returns list of IPs."""
    query = _build_a_query(name)
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(_DNS_TIMEOUT)
    try:
        sock.sendto(query, (server, 53))
        data = sock.recv(4096)
        return _parse_a_response(data)
    except (socket.timeout, OSError):
        return []
    finally:
        sock.close()


def _fetch_crtsh(domain: str) -> set[str]:
    """Fetch subdomains from crt.sh CT logs. Blocking."""
    url = _CRTSH_URL.format(domain=domain)
    req = urllib.request.Request(url, headers={"User-Agent": "derp-bot"})
    with urllib.request.urlopen(req, timeout=_CRTSH_TIMEOUT) as resp:  # noqa: S310
        data = json.loads(resp.read())

    subs: set[str] = set()
    for entry in data:
        name = entry.get("common_name", "").strip().lower()
        if name and name.endswith(f".{domain}") and "*" not in name:
            subs.add(name)
        # Also check SAN entries
        name_value = entry.get("name_value", "")
        for line in name_value.split("\n"):
            line = line.strip().lower()
            if line and line.endswith(f".{domain}") and "*" not in line:
                subs.add(line)
    return subs


async def _brute_one(prefix: str, domain: str,
                     server: str) -> tuple[str, list[str]]:
    """Resolve one subdomain. Returns (fqdn, [ips])."""
    fqdn = f"{prefix}.{domain}"
    loop = asyncio.get_running_loop()
    ips = await loop.run_in_executor(None, _resolve_a, fqdn, server)
    return fqdn, ips


@command("subdomain", help="Subdomain enum: !subdomain <domain> [brute]")
async def cmd_subdomain(bot, message):
    """Enumerate subdomains via CT logs and optional DNS brute force.

    Usage:
        !subdomain example.com          CT log lookup only
        !subdomain example.com brute    CT + DNS brute force
    """
    parts = message.text.split(None, 3)
    if len(parts) < 2:
        await bot.reply(message, "Usage: !subdomain <domain> [brute]")
        return

    domain = parts[1].lower().rstrip(".")
    brute = len(parts) > 2 and parts[2].lower() == "brute"

    if not _DOMAIN_RE.match(domain):
        await bot.reply(message, f"Invalid domain: {domain}")
        return

    await bot.reply(message, f"Enumerating subdomains for {domain}...")

    loop = asyncio.get_running_loop()
    found: dict[str, list[str]] = {}

    # Phase 1: crt.sh CT log lookup
    try:
        ct_subs = await asyncio.wait_for(
            loop.run_in_executor(None, _fetch_crtsh, domain),
            timeout=35.0,
        )
        # Resolve the CT-discovered subdomains
        server = _get_resolver()
        tasks = [_brute_one(sub.removesuffix(f".{domain}"), domain, server)
                 for sub in ct_subs]
        if tasks:
            results = await asyncio.gather(*tasks)
            for fqdn, ips in results:
                if ips:
                    found[fqdn] = ips
    except TimeoutError:
        await bot.reply(message, "crt.sh: timeout (continuing...)")
    except Exception as exc:
        reason = str(exc)[:60] if str(exc) else type(exc).__name__
        await bot.reply(message, f"crt.sh: {reason} (continuing...)")

    # Phase 2: DNS brute force (optional)
    if brute:
        server = _get_resolver()
        tasks = [_brute_one(w, domain, server) for w in _WORDLIST
                 if f"{w}.{domain}" not in found]
        if tasks:
            results = await asyncio.gather(*tasks)
            for fqdn, ips in results:
                if ips:
                    found[fqdn] = ips

    if not found:
        await bot.reply(message, f"{domain}: no subdomains found")
        return

    # Sort and output
    sorted_subs = sorted(found.items())
    total = len(sorted_subs)
    shown = sorted_subs[:_MAX_RESULTS]

    for fqdn, ips in shown:
        await bot.reply(message, f"  {fqdn} -> {', '.join(ips)}")

    suffix = f" ({total - _MAX_RESULTS} more)" if total > _MAX_RESULTS else ""
    await bot.reply(message, f"{domain}: {total} subdomains found{suffix}")
```
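The DNS wire format used by `_build_a_query` in plugins/subdomain.py encodes each name label as a length byte followed by the label bytes, terminated by a zero byte, with QTYPE/QCLASS appended after the name. A standalone sketch of just the name encoding (illustrative; `encode_qname` is not part of the plugin, which inlines this logic):

```python
import struct

def encode_qname(name: str) -> bytes:
    """DNS wire encoding: each label prefixed by its length, 0x00 terminator."""
    out = b""
    for label in name.rstrip(".").split("."):
        out += bytes([len(label)]) + label.encode("ascii")
    return out + b"\x00"

qname = encode_qname("www.example.com")
print(qname)  # b'\x03www\x07example\x03com\x00'

# QTYPE=1 (A) and QCLASS=1 (IN) trail the encoded name, exactly as the
# plugin appends struct.pack("!HH", 1, 1) to build the question section.
question = qname + struct.pack("!HH", 1, 1)
```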