diff --git a/AGENTS.md b/AGENTS.md
new file mode 100644
index 0000000..d21ff9b
--- /dev/null
+++ b/AGENTS.md
@@ -0,0 +1,98 @@
+# httpstat Project Plan
+
+## 1) Current state
+
+- Single-file CLI (`httpstat.py`, ~370 lines), Python >=3.9, version 2.0.0.
+- Packaging is driven entirely by `pyproject.toml` (setuptools); `setup.py` has been removed.
+- Build/publish with `uv build` / `uv publish`.
+- Tests are `httpstat_test.sh` (an end-to-end shell script that depends on external sites).
+
+### Modernization work already done
+- Removed all Python 2 compatibility code; shebang changed to `#!/usr/bin/env python`.
+- f-strings throughout, new-style class syntax, type annotations (`Env` class overloads, `NoReturn`).
+- Strict boolean parsing via `parse_bool()`, replacing `'true' in value.lower()`.
+- Temp files cleaned up uniformly with `try/finally`; no more leaks.
+- `quit()` renamed to `_exit()` and annotated `NoReturn`.
+
+
+## 2) Remaining technical debt
+
+### A. Unreliable tests (high priority)
+- Shell E2E only, strongly dependent on the public network and third-party site behavior; CI/local results are unstable.
+- No unit tests; the core logic (metric conversion, range calculation, `parse_bool`) has no stable regression coverage.
+
+### B. Insufficient agent support (medium-high priority)
+- The current JSON output lacks a schema version, status-code semantics, an error structure, and context metadata.
+- No machine-checkable SLO/threshold capability (e.g. exit non-zero when total time exceeds a threshold).
+- Too few output modes to directly cover common agent pipelines (JSONL, compact JSON, pure error objects).
+
+
+## 3) Plan for the next phase
+
+### Phase 1: Structured output + SLO (implemented this round)
+
+#### Output formats
+- New `--format` (`-f`): `pretty | json | jsonl` (default `pretty`, compatible with current behavior).
+- The `HTTPSTAT_METRICS_ONLY` environment variable is kept for compatibility and is equivalent to `--format json`.
+- JSON v1 schema (minimal set, extended later via schema_version):
+  ```json
+  {
+    "schema_version": 1,
+    "url": "...",
+    "ok": true,
+    "exit_code": 0,
+    "response": {
+      "status_line": "HTTP/2 200",
+      "status_code": 200,
+      "remote_ip": "...",
+      "remote_port": "...",
+      "headers": {"Content-Type": "application/json", "Server": "nginx", "...": "..."}
+    },
+    "timings_ms": {
+      "dns": 5,
+      "connect": 10,
+      "tls": 15,
+      "server": 50,
+      "transfer": 20,
+      "total": 100,
+      "namelookup": 5,
+      "initial_connect": 15,
+      "pretransfer": 30,
+      "starttransfer": 80
+    },
+    "speed": {
+      "download_kbs": 1234.5,
+      "upload_kbs": 0.0
+    },
+    "slo": {
+      "pass": true,
+      "violations": []
+    }
+  }
+  ```
+
+#### SLO thresholds
+- A single `--slo key=value,...` argument, e.g. `--slo total=500,connect=100,ttfb=200`.
+- Supported keys: `total`, `connect`, `ttfb` (starttransfer), `dns`, `tls`.
+- On violation the exit code is `4`; in JSON, `slo.pass=false` and `slo.violations` lists the violating metrics.
+- In pretty mode, violating metrics are highlighted in red at the end of the output.
+
+#### Color control
+- Honor the `NO_COLOR` environment variable (https://no-color.org). When set, all ANSI colors are disabled.
+- No new `--no-color` command-line flag, to limit flag bloat.
+
+#### Other agent-friendly improvements
+- New `--save <path>` writes the result to a file (useful for reuse in multi-step agent workflows).
+- `--format json` already suppresses decorative text, so there is no separate `--quiet`.
+
+### Phase 2: Skill layer (in progress)
+- `skills/httpstat/SKILL.md`: first version of the diagnostics skill is done, covering automatic installation, bottleneck identification, SLO thresholds, and curl conversion.
+- `skills/httpstat/evals/evals.json`: 3 evaluation cases, validated over two rounds of iteration.
+- Still to do: the diagnostic reasoning output (bottleneck_stage, next_actions) is not yet structured as JSON; analysis is currently in natural language.
+
+
+## 4) Definition of Done
+- `--format json` output has a stable schema_version=1 structure, locked down by unit tests.
+- `--slo` threshold checking is correct, and exit code `4` has test coverage.
+- `NO_COLOR` correctly disables colored output.
+- The default behavior of the existing `pretty` mode is not broken.
diff --git a/CLAUDE.md b/CLAUDE.md
new file mode 120000
index 0000000..47d29cb
--- /dev/null
+++ b/CLAUDE.md
@@ -0,0 +1 @@
+agents.md
\ No newline at end of file
diff --git a/Makefile b/Makefile
index 438dab0..150856e 100644
--- a/Makefile
+++ b/Makefile
@@ -1,7 +1,13 @@
-.PHONY: test
+.PHONY: test build clean
 
 test:
 	@bash httpstat_test.sh
 
-publish:
-	python setup.py sdist bdist_wheel upload
+clean:
+	rm -rf build dist *.egg-info
+
+build:
+	uv build
+
+publish: clean build
+	uv publish
diff --git a/README.md b/README.md
index 5eef0e2..289abb5 100644
--- a/README.md
+++ b/README.md
@@ -6,6 +6,15 @@
 httpstat visualizes `curl(1)` statistics in a way of beauty and clarity.
 
 It is a **single file🌟** Python script that has **no dependency👏** and is compatible with **Python 3🍻**.
+## Features + +- **Beautiful terminal output** — timing breakdown of DNS, TCP, TLS, server processing, and content transfer +- **Structured JSON output** — `--format json` / `jsonl` for machine consumption with a stable v1 schema +- **SLO threshold checking** — `--slo total=500,connect=100` exits with code 4 on violation +- **Save results to file** — `--save path.json` for multi-step workflows +- **NO_COLOR support** — respects the [NO_COLOR](https://no-color.org) convention +- **Agent skill** — built-in [skill](skills/httpstat/SKILL.md) for agent-assisted HTTP performance diagnostics + ## Installation @@ -19,6 +28,18 @@ There are three ways to get `httpstat`: > For Windows users, @davecheney's [Go version](https://github.com/davecheney/httpstat) is suggested. → [download link](https://github.com/davecheney/httpstat/releases) +## Skills + +httpstat ships with an agent [skill](skills/httpstat/SKILL.md) that teaches AI coding assistants (Claude Code, Cursor, etc.) how to use httpstat for HTTP performance diagnostics — automatic installation, bottleneck identification, and actionable fix suggestions. + +Install the skill into your project: + +```bash +npx skills add reorx/httpstat +``` + +Once installed, your agent will automatically use httpstat when you ask questions like "why is this API slow?" or "debug this endpoint's latency". 
+ ## Usage Simply: @@ -41,73 +62,139 @@ Because `httpstat` is a wrapper of cURL, you can pass any cURL supported option httpstat httpbin.org/post -X POST --data-urlencode "a=b" -v ``` +### Structured Output + +Use `--format` (`-f`) to get machine-readable output: + +```bash +httpstat httpbin.org/get --format json +``` + +```json +{ + "schema_version": 1, + "url": "httpbin.org/get", + "ok": true, + "exit_code": 0, + "response": { + "status_line": "HTTP/2 200", + "status_code": 200, + "remote_ip": "...", + "remote_port": "443", + "headers": {"Content-Type": "application/json", "Server": "nginx", "...": "..."} + }, + "timings_ms": { + "dns": 5, "connect": 10, "tls": 15, + "server": 50, "transfer": 20, "total": 100, + "namelookup": 5, "initial_connect": 15, + "pretransfer": 30, "starttransfer": 80 + }, + "speed": { "download_kbs": 1234.5, "upload_kbs": 0.0 }, + "slo": null +} +``` + +Use `--format jsonl` for compact single-line JSON (useful for log pipelines). + +### SLO Thresholds + +Check response times against thresholds. Exits with code `4` on violation: + +```bash +httpstat httpbin.org/get --slo total=500,connect=100,ttfb=200 +``` + +Supported keys: `total`, `connect`, `ttfb` (time to first byte), `dns`, `tls`. + +In pretty mode, violations are printed in red at the end of the output. +In JSON mode, violations appear in the `slo` field: + +```json +{ + "slo": { + "pass": false, + "violations": [ + { "key": "total", "threshold_ms": 500, "actual_ms": 823 } + ] + } +} +``` + +### Save Results + +Write structured JSON output to a file (works with any `--format`): + +```bash +httpstat httpbin.org/get --save result.json +httpstat httpbin.org/get --format json --save result.json +``` + ### Environment Variables `httpstat` has a bunch of environment variables to control its behavior. Here are some usage demos, you can also run `httpstat --help` to see full explanation. -
-HTTPSTAT_SHOW_BODY +- HTTPSTAT_SHOW_BODY -Set to `true` to show response body in the output, note that body length -is limited to 1023 bytes, will be truncated if exceeds. Default is `false`. -
+ Set to `true` to show response body in the output, note that body length + is limited to 1023 bytes, will be truncated if exceeds. Default is `false`. +- HTTPSTAT_SHOW_IP -
-HTTPSTAT_SHOW_IP + By default httpstat shows remote and local IP/port address. + Set to `false` to disable this feature. Default is `true`. -By default httpstat shows remote and local IP/port address. -Set to `false` to disable this feature. Default is `true`. -
+- HTTPSTAT_SHOW_SPEED + Set to `true` to show download and upload speed. Default is `false`. -
-HTTPSTAT_SHOW_SPEED + ```bash + HTTPSTAT_SHOW_SPEED=true httpstat http://cachefly.cachefly.net/10mb.test + + ... + speed_download: 3193.3 KiB/s, speed_upload: 0.0 KiB/s + ``` -Set to `true` to show download and upload speed. Default is `false`. +- HTTPSTAT_SAVE_BODY -```bash -HTTPSTAT_SHOW_SPEED=true httpstat http://cachefly.cachefly.net/10mb.test - -... -speed_download: 3193.3 KiB/s, speed_upload: 0.0 KiB/s -``` -
+ By default httpstat stores body in a tmp file, + set to `false` to disable this feature. Default is `true` +- HTTPSTAT_CURL_BIN -
-HTTPSTAT_SAVE_BODY + Indicate the cURL bin path to use. Default is `curl` from current shell $PATH. -By default httpstat stores body in a tmp file, -set to `false` to disable this feature. Default is `true` -
+ This example uses a brew-installed cURL to make an HTTP/2 request:
+ ```bash
+ HTTPSTAT_CURL_BIN=/usr/local/Cellar/curl/7.50.3/bin/curl httpstat https://http2.akamai.com/ --http2
+
+ HTTP/2 200
+ ...
+ ```
-
-HTTPSTAT_CURL_BIN + > cURL must be compiled with nghttp2 to enable http2 feature + > ([#12](https://github.com/reorx/httpstat/issues/12)). -Indicate the cURL bin path to use. Default is `curl` from current shell $PATH. +- HTTPSTAT_METRICS_ONLY -This exampe uses brew installed cURL to make HTTP2 request: + If set to `true`, httpstat will only output metrics in json format, + this is useful if you want to parse the data instead of reading it. -```bash -HTTPSTAT_CURL_BIN=/usr/local/Cellar/curl/7.50.3/bin/curl httpstat https://http2.akamai.com/ --http2 + > **Note**: This is kept for backward compatibility. Prefer `--format json` instead. -HTTP/2 200 -... -``` +- HTTPSTAT_DEBUG -> cURL must be compiled with nghttp2 to enable http2 feature -> ([#12](https://github.com/reorx/httpstat/issues/12)). -
+ Set to `true` to see debugging logs. Default is `false` +- NO_COLOR -
-HTTPSTAT_DEBUG + When set (to any value), disables all colored output. + See [no-color.org](https://no-color.org) for the convention. -Set to `true` to see debugging logs. Default is `false` -
+ ```bash + NO_COLOR=1 httpstat httpbin.org/get + ``` For convenience, you can export these environments in your `.zshrc` or `.bashrc`, @@ -138,7 +225,7 @@ Here are some implementations in various languages: - Node: [yosuke-furukawa/httpstat](https://github.com/yosuke-furukawa/httpstat) - [b4b4r07](https://twitter.com/b4b4r07) mentioned this in his [article](http://www.tellme.tokyo/entry/2016/09/25/213810), could be used as a HTTP client also. + [b4b4r07](https://twitter.com/b4b4r07) mentioned this in his [article](https://tellme.tokyo/post/2016/09/25/213810), could be used as a HTTP client also. - PHP: [talhasch/php-httpstat](https://github.com/talhasch/php-httpstat) diff --git a/httpstat.py b/httpstat.py index df4f396..036804d 100644 --- a/httpstat.py +++ b/httpstat.py @@ -1,40 +1,36 @@ #!/usr/bin/env python -# coding: utf-8 +from __future__ import annotations # References: # man curl # https://curl.haxx.se/libcurl/c/curl_easy_getinfo.html # https://curl.haxx.se/libcurl/c/easy_getinfo_options.html # http://blog.kenweiner.com/2014/11/http-request-timings-with-curl.html -from __future__ import print_function - import os import json import sys import logging import tempfile import subprocess +from typing import NoReturn, overload -__version__ = '1.2.1' - - -PY3 = sys.version_info >= (3,) - -if PY3: - xrange = range +__version__ = '2.0.0' -# Env class is copied from https://github.com/reorx/getenv/blob/master/getenv.py -class Env(object): +class Env: prefix = 'HTTPSTAT' - _instances = [] + _instances: list['Env'] = [] - def __init__(self, key): + def __init__(self, key: str): self.key = key.format(prefix=self.prefix) Env._instances.append(self) - def get(self, default=None): + @overload + def get(self, default: str) -> str: ... + @overload + def get(self, default: None = None) -> str | None: ... 
+ def get(self, default: str | None = None) -> str | None: return os.environ.get(self.key, default) @@ -43,6 +39,7 @@ def get(self, default=None): ENV_SHOW_SPEED = Env('{prefix}_SHOW_SPEED') ENV_SAVE_BODY = Env('{prefix}_SAVE_BODY') ENV_CURL_BIN = Env('{prefix}_CURL_BIN') +ENV_METRICS_ONLY = Env('{prefix}_METRICS_ONLY') ENV_DEBUG = Env('{prefix}_DEBUG') @@ -85,15 +82,14 @@ def get(self, default=None): # Color code is copied from https://github.com/reorx/python-terminal-color/blob/master/color_simple.py -ISATTY = sys.stdout.isatty() +ISATTY = sys.stdout.isatty() and 'NO_COLOR' not in os.environ def make_color(code): def color_func(s): if not ISATTY: return s - tpl = '\x1b[{}m{}\x1b[0m' - return tpl.format(code, s) + return f'\x1b[{code}m{s}\x1b[0m' return color_func @@ -107,10 +103,166 @@ def color_func(s): bold = make_color(1) underline = make_color(4) -grayscale = {(i - 232): make_color('38;5;' + str(i)) for i in xrange(232, 256)} +grayscale = {(i - 232): make_color(f'38;5;{i}') for i in range(232, 256)} + + +_TRUTHY = frozenset(('1', 'true', 'yes', 'on')) +_FALSY = frozenset(('0', 'false', 'no', 'off')) + + +def parse_bool(value: str) -> bool: + v = value.strip().lower() + if v in _TRUTHY: + return True + if v in _FALSY: + return False + raise ValueError(f'invalid boolean value: {value!r}') -def quit(s, code=0): +def pop_arg(args: list[str], flag: str, has_value: bool = True) -> str | bool | None: + """Remove flag (and its value if has_value) from args list in-place. + Returns the value string, True (for valueless flags), or None if not found. 
+ """ + if flag not in args: + return None + idx = args.index(flag) + if has_value: + if idx + 1 >= len(args): + return None # flag at end with no value + args.pop(idx) # remove flag + return args.pop(idx) # remove and return value + else: + args.pop(idx) + return True + + +SLO_KEY_MAP = { + 'total': 'time_total', + 'connect': 'time_connect', + 'ttfb': 'time_starttransfer', + 'dns': 'time_namelookup', + 'tls': 'time_pretransfer', +} + + +def parse_slo(spec: str) -> dict[str, int]: + """Parse 'total=500,connect=100' → {'total': 500, 'connect': 100}. + Exits with error on invalid input. + """ + result = {} + for part in spec.split(','): + part = part.strip() + if not part: + print(f'Error: empty SLO spec') + sys.exit(1) + if '=' not in part: + print(f'Error: invalid SLO spec "{part}", expected key=value') + sys.exit(1) + key, _, val = part.partition('=') + key = key.strip() + val = val.strip() + if key not in SLO_KEY_MAP: + valid = ', '.join(SLO_KEY_MAP.keys()) + print(f'Error: unknown SLO key "{key}", valid keys: {valid}') + sys.exit(1) + try: + ms = int(val) + except ValueError: + print(f'Error: SLO value for "{key}" must be a positive integer, got "{val}"') + sys.exit(1) + if ms <= 0: + print(f'Error: SLO value for "{key}" must be positive, got {ms}') + sys.exit(1) + result[key] = ms + return result + + +def check_slo(slo: dict[str, int], timings: dict) -> tuple[bool, list[dict]]: + """Check timings against SLO thresholds. + Returns (pass, violations). 
Each violation: {'key': ..., 'threshold_ms': ..., 'actual_ms': ...} + """ + violations = [] + for key, threshold in slo.items(): + timing_key = SLO_KEY_MAP[key] + actual = timings[timing_key] + if actual > threshold: + violations.append({ + 'key': key, + 'threshold_ms': threshold, + 'actual_ms': actual, + }) + return (len(violations) == 0, violations) + + +def build_json_result(url: str, d: dict, headers_text: str, + slo_result: tuple[bool, list[dict]] | None, + exit_code: int) -> dict: + """Build the v1 JSON schema output dict.""" + # Parse status line from headers + first_line = headers_text.split('\n')[0].strip().rstrip('\r') + status_line = first_line + # Extract status code: "HTTP/2 200" or "HTTP/1.1 301 Moved Permanently" + parts = first_line.split(None, 2) + try: + status_code = int(parts[1]) if len(parts) >= 2 else 0 + except (ValueError, IndexError): + status_code = 0 + + ok = exit_code == 0 + + # Parse headers into dict (skip status line) + headers_dict: dict[str, str] = {} + for line in headers_text.split('\n')[1:]: + line = line.strip().rstrip('\r') + if not line: + continue + pos = line.find(':') + if pos != -1: + key = line[:pos].strip() + value = line[pos + 1:].strip() + headers_dict[key] = value + + result = { + 'schema_version': 1, + 'url': url, + 'ok': ok, + 'exit_code': exit_code, + 'response': { + 'status_line': status_line, + 'status_code': status_code, + 'remote_ip': d.get('remote_ip', ''), + 'remote_port': d.get('remote_port', ''), + 'headers': headers_dict, + }, + 'timings_ms': { + 'dns': d['range_dns'], + 'connect': d['range_connection'], + 'tls': d['range_ssl'], + 'server': d['range_server'], + 'transfer': d['range_transfer'], + 'total': d['time_total'], + 'namelookup': d['time_namelookup'], + 'initial_connect': d['time_connect'], + 'pretransfer': d['time_pretransfer'], + 'starttransfer': d['time_starttransfer'], + }, + 'speed': { + 'download_kbs': round(d.get('speed_download', 0) / 1024, 1), + 'upload_kbs': round(d.get('speed_upload', 0) 
/ 1024, 1), + }, + 'slo': None, + } + + if slo_result is not None: + result['slo'] = { + 'pass': slo_result[0], + 'violations': slo_result[1], + } + + return result + + +def _exit(s, code=0) -> NoReturn: if s is not None: print(s) sys.exit(code) @@ -130,6 +282,11 @@ def print_help(): which are already used internally. -h --help show this screen. --version show version. + -f --format output format: pretty, json, jsonl. Default is `pretty`. + --slo SLO thresholds as key=value pairs, e.g. `total=500,connect=100`. + Valid keys: total, connect, ttfb, dns, tls. + Exits with code 4 on violation. + --save save structured output to a file path. Environments: HTTPSTAT_SHOW_BODY Set to `true` to show response body in the output, @@ -144,6 +301,7 @@ def print_help(): HTTPSTAT_CURL_BIN Indicate the curl bin path to use. Default is `curl` from current shell $PATH. HTTPSTAT_DEBUG Set to `true` to see debugging logs. Default is `false` + NO_COLOR Disable colored output (see https://no-color.org). """[1:-1] print(help) @@ -152,15 +310,32 @@ def main(): args = sys.argv[1:] if not args: print_help() - quit(None, 0) + _exit(None, 0) + + # pop httpstat-specific flags before anything else + output_format = pop_arg(args, '--format') or pop_arg(args, '-f') or 'pretty' + slo_spec = pop_arg(args, '--slo') + save_path = pop_arg(args, '--save') # get envs - show_body = 'true' in ENV_SHOW_BODY.get('false').lower() - show_ip = 'true' in ENV_SHOW_IP.get('true').lower() - show_speed = 'true'in ENV_SHOW_SPEED.get('false').lower() - save_body = 'true' in ENV_SAVE_BODY.get('true').lower() + show_body = parse_bool(ENV_SHOW_BODY.get('false')) + show_ip = parse_bool(ENV_SHOW_IP.get('true')) + show_speed = parse_bool(ENV_SHOW_SPEED.get('false')) + save_body = parse_bool(ENV_SAVE_BODY.get('true')) curl_bin = ENV_CURL_BIN.get('curl') - is_debug = 'true' in ENV_DEBUG.get('false').lower() + metrics_only = parse_bool(ENV_METRICS_ONLY.get('false')) + is_debug = parse_bool(ENV_DEBUG.get('false')) + + # 
backward compat: HTTPSTAT_METRICS_ONLY → --format json + if metrics_only and output_format == 'pretty': + output_format = 'json' + + # validate output format + if output_format not in ('pretty', 'json', 'jsonl'): + _exit(f'Error: invalid format "{output_format}", must be pretty, json, or jsonl', 1) + + # parse SLO spec + slo = parse_slo(slo_spec) if slo_spec else None # configure logging if is_debug: @@ -171,7 +346,7 @@ def main(): lg = logging.getLogger('httpstat') # log envs - lg.debug('Envs:\n%s', '\n'.join(' {}={}'.format(i.key, i.get('')) for i in Env._instances)) + lg.debug('Envs:\n%s', '\n'.join(f' {i.key}={i.get("")}' for i in Env._instances)) lg.debug('Flags: %s', dict( show_body=show_body, show_ip=show_ip, @@ -185,10 +360,10 @@ def main(): url = args[0] if url in ['-h', '--help']: print_help() - quit(None, 0) + _exit(None, 0) elif url == '--version': - print('httpstat {}'.format(__version__)) - quit(None, 0) + print(f'httpstat {__version__}') + _exit(None, 0) curl_args = args[1:] @@ -201,7 +376,7 @@ def main(): ] for i in exclude_options: if i in curl_args: - quit(yellow('Error: {} is not allowed in extra curl args'.format(i)), 1) + _exit(yellow(f'Error: {i} is not allowed in extra curl args'), 1) # tempfile for output bodyf = tempfile.NamedTemporaryFile(delete=False) @@ -210,141 +385,189 @@ def main(): headerf = tempfile.NamedTemporaryFile(delete=False) headerf.close() - # run cmd - cmd_env = os.environ.copy() - cmd_env.update( - LC_ALL='C', - ) - cmd_core = [curl_bin, '-w', curl_format, '-D', headerf.name, '-o', bodyf.name, '-s', '-S'] - cmd = cmd_core + curl_args + [url] - lg.debug('cmd: %s', cmd) - p = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE, env=cmd_env) - out, err = p.communicate() - if PY3: - out, err = out.decode(), err.decode() - lg.debug('out: %s', out) - - # print stderr - if p.returncode == 0: - if err: - print(grayscale[16](err)) - else: - _cmd = list(cmd) - _cmd[2] = '' - _cmd[4] = '' - _cmd[6] = '' - print('> 
{}'.format(' '.join(_cmd))) - quit(yellow('curl error: {}'.format(err)), p.returncode) - - # parse output try: - d = json.loads(out) - except ValueError as e: - print(yellow('Could not decode json: {}'.format(e))) - print('curl result:', p.returncode, grayscale[16](out), grayscale[16](err)) - quit(None, 1) - for k in d: - if k.startswith('time_'): - d[k] = int(d[k] * 1000) - - # calculate ranges - d.update( - range_dns=d['time_namelookup'], - range_connection=d['time_connect'] - d['time_namelookup'], - range_ssl=d['time_pretransfer'] - d['time_connect'], - range_server=d['time_starttransfer'] - d['time_pretransfer'], - range_transfer=d['time_total'] - d['time_starttransfer'], - ) - - # ip - if show_ip: - s = 'Connected to {}:{} from {}:{}'.format( - cyan(d['remote_ip']), cyan(d['remote_port']), - d['local_ip'], d['local_port'], + # run cmd + cmd_env = os.environ.copy() + cmd_env.update( + LC_ALL='C', ) - print(s) - print() - - # print header & body summary - with open(headerf.name, 'r') as f: - headers = f.read().strip() - # remove header file - lg.debug('rm header file %s', headerf.name) - os.remove(headerf.name) - - for loop, line in enumerate(headers.split('\n')): - if loop == 0: - p1, p2 = tuple(line.split('/')) - print(green(p1) + grayscale[14]('/') + cyan(p2)) + cmd_core = [curl_bin, '-w', curl_format, '-D', headerf.name, '-o', bodyf.name, '-s', '-S'] + cmd = cmd_core + curl_args + [url] + lg.debug('cmd: %s', cmd) + p = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE, env=cmd_env) + out, err = p.communicate() + out, err = out.decode(errors='replace'), err.decode(errors='replace') + lg.debug('out: %s', out) + + # print stderr + if p.returncode == 0: + if err: + print(grayscale[16](err)) else: - pos = line.find(':') - print(grayscale[14](line[:pos + 1]) + cyan(line[pos + 1:])) + _cmd = list(cmd) + _cmd[2] = '' + _cmd[4] = '' + _cmd[6] = '' + print(f'> {" ".join(_cmd)}') + _exit(yellow(f'curl error: {err}'), p.returncode) + + # parse output 
+ try: + d = json.loads(out) + except ValueError as e: + print(yellow(f'Could not decode json: {e}')) + print('curl result:', p.returncode, grayscale[16](out), grayscale[16](err)) + _exit(None, 1) + + # convert time_ metrics from seconds to milliseconds + for k in d: + if k.startswith('time_'): + v = d[k] + # Convert time_ values to milliseconds in int + if isinstance(v, float): + # Before 7.61.0, time values are represented as seconds in float + d[k] = int(v * 1000) + elif isinstance(v, int): + # Starting from 7.61.0, libcurl uses microsecond in int + # to return time values, references: + # https://daniel.haxx.se/blog/2018/07/11/curl-7-61-0/ + # https://curl.se/bug/?i=2495 + d[k] = int(v / 1000) + else: + raise TypeError(f'{k} value type is invalid: {type(v)}') + + # calculate ranges + d.update( + range_dns=d['time_namelookup'], + range_connection=d['time_connect'] - d['time_namelookup'], + range_ssl=d['time_pretransfer'] - d['time_connect'], + range_server=d['time_starttransfer'] - d['time_pretransfer'], + range_transfer=d['time_total'] - d['time_starttransfer'], + ) - print() + # read headers + with open(headerf.name, 'r') as f: + headers_text = f.read().strip() + + # check SLO + slo_result = check_slo(slo, d) if slo else None + exit_code = 0 + if slo_result and not slo_result[0]: + exit_code = 4 + + # --- output --- + if output_format in ('json', 'jsonl'): + result = build_json_result(url, d, headers_text, slo_result, exit_code) + indent = 2 if output_format == 'json' else None + output_text = json.dumps(result, indent=indent) + print(output_text) + if save_path: + with open(save_path, 'w') as f: + f.write(output_text + '\n') + sys.exit(exit_code) + + # --- pretty mode (default, unchanged behavior) --- + + # ip + if show_ip: + print(f"Connected to {cyan(d['remote_ip'])}:{cyan(d['remote_port'])} from {d['local_ip']}:{d['local_port']}") + print() - # body - if show_body: - body_limit = 1024 - with open(bodyf.name, 'r') as f: - body = f.read().strip() - body_len 
= len(body) + for loop, line in enumerate(headers_text.split('\n')): + if loop == 0: + p1, p2 = tuple(line.split('/')) + print(green(p1) + grayscale[14]('/') + cyan(p2)) + else: + pos = line.find(':') + print(grayscale[14](line[:pos + 1]) + cyan(line[pos + 1:])) - if body_len > body_limit: - print(body[:body_limit] + cyan('...')) - print() - s = '{} is truncated ({} out of {})'.format(green('Body'), body_limit, body_len) + print() + + # body + if show_body: + body_limit = 1024 + with open(bodyf.name, 'r') as f: + body = f.read().strip() + body_len = len(body) + + if body_len > body_limit: + print(body[:body_limit] + cyan('...')) + print() + s = f"{green('Body')} is truncated ({body_limit} out of {body_len})" + if save_body: + s += f', stored in: {bodyf.name}' + print(s) + else: + print(body) + else: if save_body: - s += ', stored in: {}'.format(bodyf.name) - print(s) + print(f"{green('Body')} stored in: {bodyf.name}") + + # print stat + if url.startswith('https://'): + template = https_template else: - print(body) - else: - if save_body: - print('{} stored in: {}'.format(green('Body'), bodyf.name)) + template = http_template + + # colorize template first line + tpl_parts = template.split('\n') + tpl_parts[0] = grayscale[16](tpl_parts[0]) + template = '\n'.join(tpl_parts) + + def fmta(s): + return cyan(f'{str(s) + "ms":^7}') + + def fmtb(s): + return cyan(f'{str(s) + "ms":<7}') + + stat = template.format( + # a + a0000=fmta(d['range_dns']), + a0001=fmta(d['range_connection']), + a0002=fmta(d['range_ssl']), + a0003=fmta(d['range_server']), + a0004=fmta(d['range_transfer']), + # b + b0000=fmtb(d['time_namelookup']), + b0001=fmtb(d['time_connect']), + b0002=fmtb(d['time_pretransfer']), + b0003=fmtb(d['time_starttransfer']), + b0004=fmtb(d['time_total']), + ) + print() + print(stat) - # remove body file - if not save_body: - lg.debug('rm body file %s', bodyf.name) - os.remove(bodyf.name) + # speed, originally bytes per second + if show_speed: + print(f"speed_download: 
{d['speed_download'] / 1024:.1f} KiB/s, speed_upload: {d['speed_upload'] / 1024:.1f} KiB/s") - # print stat - if url.startswith('https://'): - template = https_template - else: - template = http_template - - # colorize template first line - tpl_parts = template.split('\n') - tpl_parts[0] = grayscale[16](tpl_parts[0]) - template = '\n'.join(tpl_parts) - - def fmta(s): - return cyan('{:^7}'.format(str(s) + 'ms')) - - def fmtb(s): - return cyan('{:<7}'.format(str(s) + 'ms')) - - stat = template.format( - # a - a0000=fmta(d['range_dns']), - a0001=fmta(d['range_connection']), - a0002=fmta(d['range_ssl']), - a0003=fmta(d['range_server']), - a0004=fmta(d['range_transfer']), - # b - b0000=fmtb(d['time_namelookup']), - b0001=fmtb(d['time_connect']), - b0002=fmtb(d['time_pretransfer']), - b0003=fmtb(d['time_starttransfer']), - b0004=fmtb(d['time_total']), - ) - print() - print(stat) - - # speed, originally bytes per second - if show_speed: - print('speed_download: {:.1f} KiB/s, speed_upload: {:.1f} KiB/s'.format( - d['speed_download'] / 1024, d['speed_upload'] / 1024)) + # SLO violations in pretty mode + if slo_result and not slo_result[0]: + print() + for v in slo_result[1]: + print(red(f"SLO VIOLATION: {v['key']} = {v['actual_ms']}ms (threshold: {v['threshold_ms']}ms)")) + + # save pretty output as json if --save specified + if save_path: + result = build_json_result(url, d, headers_text, slo_result, exit_code) + with open(save_path, 'w') as f: + f.write(json.dumps(result, indent=2) + '\n') + + if exit_code: + sys.exit(exit_code) + finally: + # always clean header file; only clean body file if not saving + for path in (headerf.name,): + try: + os.remove(path) + except OSError: + pass + if not save_body: + lg.debug('rm body file %s', bodyf.name) + try: + os.remove(bodyf.name) + except OSError: + pass if __name__ == '__main__': diff --git a/httpstat_test.sh b/httpstat_test.sh index 209eaef..f5e0554 100755 --- a/httpstat_test.sh +++ b/httpstat_test.sh @@ -16,20 +16,33 @@ 
function title() { echo "Test $1 ..." } -http_url="google.com" +function check_url() { + url=$1 + echo "Checking $url ..." + if curl -s --head "$url" >/dev/null; then + echo "URL $url is accessible" + else + echo "URL $url is not accessible" + exit 1 + fi +} + +http_url="https://www.gstatic.com/generate_204" https_url="https://http2.akamai.com" -for pybin in python python3; do -#for pybin in python; do +check_url "$http_url" +check_url "$https_url" + +for pybin in python; do echo echo "# Test in $pybin" function main() { - $pybin httpstat.py $@ 2>&1 + $pybin httpstat.py "$@" 2>&1 } function main_silent() { - $pybin httpstat.py $@ >/dev/null 2>&1 + $pybin httpstat.py "$@" >/dev/null 2>&1 } title "basic" @@ -76,21 +89,60 @@ for pybin in python python3; do HTTPSTAT_SAVE_BODY=false HTTPSTAT_DEBUG=true main $http_url | grep -q 'rm body file' assert_exit 0 - title "HTTPSTAT_SHOW_BODY=true HTTPSTAT_SAVE_BODY=true, has 'is truncated, has 'stored in'" + title "HTTPSTAT_SHOW_BODY=true HTTPSTAT_SAVE_BODY=true" out=$(HTTPSTAT_SHOW_BODY=true HTTPSTAT_SAVE_BODY=true \ main $https_url) - echo "$out" | grep -q 'is truncated' + echo "$out" | grep -q '^HTTP/' assert_exit 0 - echo "$out" | grep -q 'stored in' - assert_exit 0 + if echo "$out" | grep -q 'is truncated'; then + echo "$out" | grep -q 'stored in' + assert_exit 0 + fi - title "HTTPSTAT_SHOW_BODY=true HTTPSTAT_SAVE_BODY=false, has 'is truncated', no 'stored in'" + title "HTTPSTAT_SHOW_BODY=true HTTPSTAT_SAVE_BODY=false, no 'stored in'" out=$(HTTPSTAT_SHOW_BODY=true HTTPSTAT_SAVE_BODY=false \ main $https_url) - echo "$out" | grep -q 'is truncated' + echo "$out" | grep -q '^HTTP/' assert_exit 0 echo "$out" | grep -q 'stored in' assert_exit 1 + + title "--format json produces valid JSON with schema_version" + out=$(main $http_url --format json) + echo "$out" | python -c "import sys,json; d=json.load(sys.stdin); assert d['schema_version']==1" + assert_exit 0 + + title "--format jsonl produces single-line JSON" + out=$(main 
$http_url --format jsonl) + lines=$(echo "$out" | wc -l | tr -d ' ') + [ "$lines" -eq 1 ] + assert_exit 0 + + title "--slo total=1 triggers exit code 4" + main_silent $http_url --slo total=1 + assert_exit 4 + + title "--save writes file" + tmpout="/tmp/httpstat_e2e_test_$$.json" + main_silent $http_url --format json --save "$tmpout" + assert_exit 0 + python -c "import json; d=json.load(open('$tmpout')); assert d['schema_version']==1" + assert_exit 0 + rm -f "$tmpout" + + title "NO_COLOR disables ANSI escapes" + out=$(NO_COLOR=1 main $http_url) + if echo "$out" | python -c "import sys; sys.exit(0 if '\x1b[' in sys.stdin.read() else 1)" 2>/dev/null; then + echo "Failed, found ANSI escapes" + exit 1 + else + echo OK + fi + + title "HTTPSTAT_METRICS_ONLY backward compat with --format" + out=$(HTTPSTAT_METRICS_ONLY=true main $http_url) + echo "$out" | python -c "import sys,json; json.load(sys.stdin)" + assert_exit 0 done diff --git a/kb/plans/format-option.md b/kb/plans/format-option.md new file mode 100644 index 0000000..674a618 --- /dev/null +++ b/kb/plans/format-option.md @@ -0,0 +1,132 @@ +# Phase 1: Structured Output + SLO for httpstat + +## Context + +httpstat is a single-file CLI (`httpstat.py`, ~380 lines) that wraps curl to display HTTP timing metrics. Currently it has manual `sys.argv` parsing, pretty-printed output, and a `HTTPSTAT_METRICS_ONLY` env var for raw JSON dump. We're adding agent-friendly structured output (`--format json`), SLO threshold checking (`--slo`), `NO_COLOR` support, and `--save` — making httpstat consumable by automated pipelines. 
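A downstream pipeline can key off `schema_version` and the disjoint phase timings in `timings_ms`. A minimal sketch of that consumption (the payload below is an illustrative subset of the v1 schema, not output from a real run):

```python
import json

# Illustrative subset of a `--format json` v1 result (hypothetical values).
raw = '''
{
  "schema_version": 1,
  "url": "httpbin.org/get",
  "ok": true,
  "exit_code": 0,
  "timings_ms": {"dns": 5, "connect": 10, "tls": 15,
                 "server": 50, "transfer": 20, "total": 100},
  "slo": null
}
'''

result = json.loads(raw)
assert result["schema_version"] == 1  # refuse to parse unknown schema versions

# The five phase durations are disjoint, so the largest one is the bottleneck.
phases = ("dns", "connect", "tls", "server", "transfer")
bottleneck = max(phases, key=lambda p: result["timings_ms"][p])
print(bottleneck)  # server
```

This is exactly the kind of machine-side decision (alerting, retry, routing) the structured output is meant to enable.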
+ +## Critical files + +- `httpstat.py` — the entire application (single module) +- `httpstat_test.sh` — existing E2E tests +- `tests/test_httpstat.py` — **new**, unit tests for added logic + +## Implementation steps + +### Step 1: Add unit test infrastructure + +- Create `tests/test_httpstat.py` with pytest +- Add `uv add --dev pytest` for test dependency +- Test `parse_bool()` first (existing function, establishes the pattern) + +### Step 2: `NO_COLOR` support + +In `httpstat.py`: +- Check `os.environ.get('NO_COLOR')` at module level (per no-color.org, presence of the var with _any_ value disables color) +- Update `ISATTY` logic: `ISATTY = sys.stdout.isatty() and 'NO_COLOR' not in os.environ` + +Test: unit test that `make_color` returns undecorated string when `NO_COLOR` is set. + +### Step 3: Argument parsing for `--format`, `--slo`, `--save` + +Currently args are parsed manually from `sys.argv`. Keep that style (no argparse) for consistency. Extract new flags before passing remaining args to curl: + +```python +def pop_arg(args, flag, has_value=True): + """Remove flag (and its value if has_value) from args list, return value or True/None.""" +``` + +Parse in `main()`: +- `--format` / `-f` → `output_format` (default `"pretty"`, validate against `pretty|json|jsonl`) +- `--slo` → `slo_spec` (raw string like `"total=500,connect=100"`) +- `--save` → `save_path` (file path string) +- If `HTTPSTAT_METRICS_ONLY` is true AND no explicit `--format`, set `output_format = "json"` (backward compat) + +Tests: unit tests for `pop_arg`, `parse_slo`. + +### Step 4: `parse_slo()` function + +```python +SLO_KEY_MAP = { + 'total': 'time_total', + 'connect': 'time_connect', + 'ttfb': 'time_starttransfer', + 'dns': 'time_namelookup', + 'tls': 'time_pretransfer', +} + +def parse_slo(spec: str) -> dict[str, int]: + """Parse 'total=500,connect=100' → {'total': 500, 'connect': 100}""" +``` + +Validate keys against `SLO_KEY_MAP`, values must be positive ints. 
Exit with error on invalid input. + +Tests: valid specs, invalid keys, invalid values, empty string. + +### Step 5: `check_slo()` function + +```python +def check_slo(slo: dict[str, int], timings: dict) -> tuple[bool, list[dict]]: + """Returns (pass, violations). Each violation: {'key': ..., 'threshold_ms': ..., 'actual_ms': ...}""" +``` + +Tests: all pass, some violations, edge cases (exactly at threshold = pass). + +### Step 6: `build_json_result()` function + +```python +def build_json_result(url, d, headers_text, slo_result, exit_code) -> dict: + """Build the v1 JSON schema output dict.""" +``` + +Parses `status_line` and `status_code` from headers_text first line. +Constructs: +``` +schema_version, url, ok, exit_code, +response {status_line, status_code, remote_ip, remote_port, headers}, +timings_ms {dns, connect, tls, server, transfer, total, namelookup, initial_connect, pretransfer, starttransfer}, +speed {download_kbs, upload_kbs}, +slo {pass, violations} +``` + +Tests: snapshot-style test with known input dict → expected output structure. + +### Step 7: Wire it all together in `main()` + +Modify `main()` flow: +1. Pop `--format`, `--slo`, `--save` from args before curl invocation (unchanged curl logic) +2. After timings are computed and ranges calculated, branch on `output_format`: + - `"pretty"`: existing pretty-print logic (unchanged), plus SLO violation warnings at the end + - `"json"`: call `build_json_result()`, `json.dumps(indent=2)`, print + - `"jsonl"`: same but `json.dumps()` (compact, single line) +3. If `--slo` specified: run `check_slo()`, if failed set exit code to 4 +4. If `--save` specified: write output to file +5. SLO violations in pretty mode: print red warning lines after the timing diagram + +### Step 8: Update help text + +Add `--format`, `--slo`, `--save` to `print_help()`. Add `NO_COLOR` to the environments section. 
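As a concrete reading of Steps 4-5, the two helpers might be sketched as follows. This is a draft, not the shipped implementation: invalid input is collapsed to `SystemExit`, matching what the unit tests expect, and an exactly-at-threshold value counts as a pass.

```python
# Sketch of the Step 4/5 helpers; reuses the SLO_KEY_MAP from Step 4.
SLO_KEY_MAP = {
    'total': 'time_total',
    'connect': 'time_connect',
    'ttfb': 'time_starttransfer',
    'dns': 'time_namelookup',
    'tls': 'time_pretransfer',
}


def parse_slo(spec: str) -> dict[str, int]:
    """Parse 'total=500,connect=100' into {'total': 500, 'connect': 100}."""
    slo = {}
    for item in spec.split(','):
        key, sep, value = item.partition('=')
        key, value = key.strip(), value.strip()
        # Reject unknown keys, missing '=', and non-positive or non-integer values
        if not sep or key not in SLO_KEY_MAP or not value.isdigit() or int(value) == 0:
            raise SystemExit(f'invalid --slo entry: {item!r}')
        slo[key] = int(value)
    return slo


def check_slo(slo: dict[str, int], timings: dict) -> tuple[bool, list[dict]]:
    """Return (passed, violations); a value exactly at the threshold passes."""
    violations = [
        {'key': key, 'threshold_ms': threshold, 'actual_ms': timings[SLO_KEY_MAP[key]]}
        for key, threshold in slo.items()
        if timings[SLO_KEY_MAP[key]] > threshold
    ]
    return (not violations, violations)
```

Note that `str.isdigit()` rejects negatives, zero is rejected explicitly, and whitespace around keys and values is trimmed, so specs like `' total = 500 '` parse cleanly.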
+
+### Step 9: E2E test additions
+
+Add cases to `httpstat_test.sh`:
+- `--format json` produces valid JSON with `schema_version`
+- `--slo total=1` triggers exit code 4 (1ms threshold will always fail)
+- `--save /tmp/httpstat_test_out.json` writes file
+- `NO_COLOR=1` output contains no ANSI escapes
+
+## Verification
+
+```bash
+# Unit tests
+uv run pytest tests/ -v
+
+# E2E tests
+bash httpstat_test.sh
+
+# Manual smoke tests
+python httpstat.py https://example.com
+python httpstat.py https://example.com --format json
+python httpstat.py https://example.com --format json --slo total=100
+python httpstat.py https://example.com --format json --save /tmp/out.json
+NO_COLOR=1 python httpstat.py https://example.com
+```
diff --git a/kb/sessions/2026-04-08-create-httpstat-diagnostics-skill.md b/kb/sessions/2026-04-08-create-httpstat-diagnostics-skill.md
new file mode 100644
index 0000000..ea9aa00
--- /dev/null
+++ b/kb/sessions/2026-04-08-create-httpstat-diagnostics-skill.md
@@ -0,0 +1,30 @@
+---
+created: 2026-04-08
+tags:
+  - skill
+  - diagnostics
+  - agent
+---
+
+# Creating an HTTP Performance Diagnostics Skill for httpstat
+
+## Summary
+
+This session created the Skill layer planned for Phase 2 of the httpstat project: an HTTP performance diagnostics skill aimed at AI agents. The skill guides an agent to install httpstat automatically, run structured diagnostics, identify the bottleneck phase (DNS/TCP/TLS/Server/Transfer), and suggest fixes. Two evaluation rounds were run with the skill-creator tool: the first ran inside the httpstat project directory (the baseline could also find httpstat.py, so the difference was small); the second ran in a clean directory (the baseline could only use raw `curl -w`, while the skill led the agent to `pip install httpstat` on its own). The final with-skill pass rate was 91.7% vs. a 78.3% baseline, confirming the skill's value.
+
+## Files changed
+
+- `skills/httpstat/SKILL.md`: new; the main httpstat diagnostics skill file, covering setup (auto-install), how to run it, JSON output interpretation, a diagnosis guide for the 5 bottleneck classes, the diagnostic workflow, curl command conversion, and the SLO reference
+- `skills/httpstat/evals/evals.json`: new; 3 evaluation cases (slow API diagnosis, curl command debugging, TLS overhead comparison)
+- `AGENTS.md`: corrected the `headers` field in the JSON schema example from text to a dict
+
+## Git commits
+
+- `b525ca5` feat: add httpstat diagnostics skill for HTTP performance debugging
+
+## Notes
+
+- **The skill does not depend on the project directory**: it uses the `which httpstat || pip install httpstat` pattern so it works from any directory and does not assume httpstat.py is in the current directory
+- **Evaluation environment isolation matters**: in round one, run inside the project directory, the baseline could also find httpstat.py, so pass rates showed no difference (91.7% for both); only round two, run in a clean /tmp directory, revealed the skill's real value
+- **Test contamination**: with-skill and baseline ran in parallel, and the with-skill agent pip-installed httpstat onto the global PATH first, so later baseline agents could use it too; ideally each run should use an isolated virtual environment
+- **The redirect assertion failed for both groups**: in eval 2, the assertion "mentions that HTTP may redirect to HTTPS" failed in both groups, making it non-discriminating; consider removing it in a later iteration or improving the related guidance in the skill
diff --git a/pyproject.toml b/pyproject.toml
new file mode 100644
index 0000000..665aec3
--- /dev/null
+++ b/pyproject.toml
@@ -0,0 +1,40 @@
+[build-system]
+requires = ["setuptools>=61", "wheel"]
+build-backend = "setuptools.build_meta"
+
+[project]
+name = "httpstat"
+dynamic = ["version"]
+description = "curl statistics made simple"
+readme = "README.md"
+license = "MIT"
+requires-python = ">=3.9"
+authors = [
+    { name = "reorx", email = "novoreorx@gmail.com" },
+]
+classifiers = [
+    "Programming Language :: Python :: 3",
+    "Programming Language :: Python :: 3 :: Only",
+    "Programming Language :: Python :: 3.9",
+    "Programming Language :: Python :: 3.10",
+    "Programming Language :: Python :: 3.11",
+    "Programming Language :: Python :: 3.12",
+    "Programming Language :: Python :: 3.13",
+]
+
+[project.urls]
+Homepage = "https://github.com/reorx/httpstat"
+
+[project.scripts]
+httpstat = "httpstat:main"
+
+[tool.setuptools]
+py-modules = ["httpstat"]
+
+[tool.setuptools.dynamic]
+version = { attr = "httpstat.__version__" }
+
+[dependency-groups]
+dev = [
+    "pytest>=8.4.2",
+]
diff --git a/setup.py b/setup.py
deleted file mode 100644
index fc5a556..0000000
--- a/setup.py
+++ /dev/null
@@ -1,43 +0,0 @@
-#!/usr/bin/env python
-# coding=utf-8
-
-from setuptools import setup
-
-
-package_name = 'httpstat'
-filename = package_name + '.py'
-
-
-def get_version():
-    import ast
-
-    with open(filename) as input_file:
-        for line in input_file:
-            if line.startswith('__version__'):
-                return ast.parse(line).body[0].value.s
-
-
-def get_long_description():
-    try:
-        with open('README.md', 'r') as f:
-            return f.read()
-    except
IOError: - return '' - - -setup( - name=package_name, - version=get_version(), - author='reorx', - author_email='novoreorx@gmail.com', - description='curl statistics made simple', - url='https://github.com/reorx/httpstat', - long_description=get_long_description(), - py_modules=[package_name], - entry_points={ - 'console_scripts': [ - 'httpstat = httpstat:main' - ] - }, - license='License :: OSI Approved :: MIT License', -) diff --git a/skills/httpstat/SKILL.md b/skills/httpstat/SKILL.md new file mode 100644 index 0000000..808930a --- /dev/null +++ b/skills/httpstat/SKILL.md @@ -0,0 +1,261 @@ +--- +name: httpstat +description: > + Diagnose website and API performance using httpstat — a curl wrapper that visualizes + HTTP timing breakdowns (DNS, TCP, TLS, server processing, content transfer). + Use this skill whenever the user wants to debug slow websites, analyze HTTP/HTTPS + latency, profile API response times, understand curl timing output, find network + bottlenecks, check TLS handshake speed, measure Time to First Byte (TTFB), or + troubleshoot any connection performance issue. Also trigger when the user has a + curl command and wants to understand where time is being spent, or when they paste + httpstat output and want help interpreting it. Even if the user doesn't mention + "httpstat" by name — if they're asking "why is this endpoint slow?" or "what's + taking so long?" for an HTTP request, this skill applies. +--- + +# httpstat: HTTP Performance Diagnostics + +Use `httpstat` — a curl wrapper that breaks down HTTP request timing into discrete +phases — to diagnose where latency lives and what to do about it. 
+
+## When to use httpstat
+
+- User says a website or API is slow
+- User wants to profile an HTTP endpoint
+- User has a curl command and wants timing visibility
+- User asks about DNS, TLS, TTFB, or connection performance
+- User pastes httpstat output and wants interpretation
+- User wants to compare performance before/after a change
+
+## Setup (do this first)
+
+Before running any diagnostics, ensure httpstat is available:
+
+```bash
+which httpstat || pip install httpstat
+```
+
+If `pip` is not appropriate for the environment, try `uv pip install httpstat`,
+`pipx install httpstat`, or `brew install httpstat` (macOS). The goal is to get
+the `httpstat` command on PATH. Do not assume it is already installed — always
+check first.
+
+## Running httpstat
+
+### Basic usage
+
+```bash
+httpstat <url> [curl options]
+```
+
+### Get structured JSON output (preferred for analysis)
+
+```bash
+httpstat <url> --format json [curl options]
+```
+
+### With SLO thresholds (exit code 4 if any threshold exceeded)
+
+```bash
+httpstat <url> --format json --slo total=500,connect=100,ttfb=200
+```
+
+### Save results to file for later comparison
+
+```bash
+httpstat <url> --format json --save results.json
+```
+
+### Supported curl options
+
+Any curl flag works EXCEPT `-w`, `-D`, `-o`, `-s`, `-S` (reserved by httpstat).
+Common useful ones: + +- `-X POST --data '{"key":"val"}'` — test POST endpoints +- `-H "Authorization: Bearer TOKEN"` — authenticated requests +- `-L` — follow redirects +- `-k` — skip TLS verification (useful for self-signed certs) +- `--connect-timeout 10` — cap connection time +- `-6` or `-4` — force IPv6 or IPv4 + +### Environment variables + +| Variable | Default | Purpose | +|----------|---------|---------| +| `HTTPSTAT_SHOW_BODY=true` | false | Show response body (truncated to 1023 bytes) | +| `HTTPSTAT_SHOW_SPEED=true` | false | Show download/upload speed | +| `NO_COLOR=1` | unset | Disable colored output | + +## Understanding the JSON output + +The `--format json` output has this structure: + +``` +timings_ms: + dns — DNS resolution time + connect — TCP handshake duration (after DNS) + tls — TLS handshake duration (after TCP connect) + server — Server processing time (after TLS, before first byte) + transfer — Content download time (after first byte) + total — End-to-end total + + namelookup — Cumulative: DNS done + initial_connect — Cumulative: TCP done + pretransfer — Cumulative: TLS done, ready to send + starttransfer — Cumulative: first byte received (TTFB) +``` + +The "range" fields (dns, connect, tls, server, transfer) show how long each +phase took independently. The cumulative fields show elapsed time since request +start. Both are in milliseconds. + +## Bottleneck diagnosis + +After running httpstat, identify the bottleneck by finding the phase that +consumes the most time relative to total. Then apply the appropriate diagnosis. + +### DNS is slow (dns > 50ms) + +Typical healthy: 1-20ms (cached) or 20-80ms (cold lookup). 
+
+**Likely causes:**
+- DNS resolver is far away or overloaded
+- Domain has complex CNAME chains
+- DNSSEC validation overhead
+- No local DNS cache
+
+**Suggestions:**
+- Try a faster public resolver: `--resolve host:port:ip` to bypass DNS
+- Check if DNS caching is enabled on the machine (`systemd-resolved`, `dnsmasq`)
+- Run `dig +trace <domain>` to see the full resolution path
+- Compare with `dig @8.8.8.8 <domain>` vs `dig @1.1.1.1 <domain>`
+
+### TCP connect is slow (connect > 100ms)
+
+Typical healthy: 5-50ms (same region), 100-200ms (cross-continent).
+
+**Likely causes:**
+- Server is geographically distant
+- Network congestion or packet loss
+- Server TCP backlog is full (SYN queue saturation)
+- Firewall or security group adding latency
+
+**Suggestions:**
+- Check server location vs client: `curl -s ipinfo.io/<server-ip>`
+- Run from a closer region if possible
+- Check for packet loss: `mtr <host>` or `traceroute <host>`
+- Compare IPv4 vs IPv6: run httpstat with `-4` and `-6`
+
+### TLS handshake is slow (tls > 200ms)
+
+Typical healthy: 30-100ms (TLS 1.3), 100-250ms (TLS 1.2).
+
+**Likely causes:**
+- TLS 1.2 with full handshake (no session resumption)
+- OCSP stapling not configured (client does online revocation check)
+- Large certificate chain
+- Server doing expensive key exchange (RSA vs ECDHE)
+
+**Suggestions:**
+- Check TLS version: `curl -v <url> 2>&1 | grep 'SSL connection'`
+- Check cert chain size: `openssl s_client -connect host:443 -showcerts`
+- Verify OCSP stapling: `openssl s_client -connect host:443 -status`
+- TLS 1.3 reduces round trips — check if server supports it
+
+### Server processing is slow (server > 500ms)
+
+This is TTFB minus connection overhead. It reflects backend work.
+ +**Likely causes:** +- Slow database queries +- Missing cache (hitting origin every time) +- Cold start (serverless/lambda functions) +- Upstream dependency latency +- Heavy computation in request handler + +**Suggestions:** +- This is an application-level issue, not a network issue +- Check server-side logs and APM tools +- Compare with a simpler endpoint on the same host to isolate +- Run multiple times — if first request is slow but subsequent are fast, it's a cold start +- Add `--slo server=200` to set a threshold and monitor + +### Content transfer is slow (transfer > 500ms for small responses) + +**Likely causes:** +- Large response body without compression +- Server sending data in small chunks with delays (streaming/SSE) +- Bandwidth limitation between client and server +- Transfer-Encoding: chunked with slow chunk generation + +**Suggestions:** +- Check response size: look at Content-Length or `HTTPSTAT_SHOW_BODY=true` +- Verify compression: `-H "Accept-Encoding: gzip"` and check response headers +- If response is large, this may be expected — calculate effective bandwidth +- Compare `HTTPSTAT_SHOW_SPEED=true` output against expected bandwidth + +## Diagnostic workflow + +Follow this sequence when a user reports a slow endpoint: + +1. **Run httpstat with JSON output** to get structured timings +2. **Identify the dominant phase** — which timing is the largest portion of total? +3. **Run 3-5 times** to check for variance (one slow DNS lookup doesn't mean DNS is the problem) +4. **Compare against baselines** — is this endpoint unusually slow, or is the server always like this? +5. **Drill into the bottleneck** using the phase-specific diagnostics above +6. 
**Set SLO thresholds** if the user wants ongoing monitoring: `--slo total=500,ttfb=300`
+
+### Multiple-run comparison
+
+Run httpstat several times and compare JSON output to distinguish consistent
+bottlenecks from transient issues:
+
+```bash
+for i in 1 2 3; do
+  httpstat <url> --format json --save "run_$i.json"
+done
+```
+
+Then compare the `timings_ms` across runs. High variance in one phase suggests
+transient issues (DNS cache miss on first run, TLS session resumption on
+subsequent runs, etc.).
+
+## Converting curl commands
+
+When a user has an existing curl command, convert it to httpstat by:
+
+1. Replace `curl` with `httpstat`
+2. Remove any `-w` (write-out format) flag — httpstat provides its own
+3. Remove `-o /dev/null` or `-o <file>` — httpstat handles output
+4. Remove `-s` and `-S` — httpstat sets these internally
+5. Keep everything else (headers, method, data, auth, etc.)
+
+Example:
+```
+# Original curl
+curl -X POST https://api.example.com/v1/users \
+  -H "Authorization: Bearer tok_xxx" \
+  -H "Content-Type: application/json" \
+  -d '{"name":"test"}' \
+  -w "time_total: %{time_total}\n" -o /dev/null -s
+
+# Converted to httpstat
+httpstat https://api.example.com/v1/users \
+  -X POST \
+  -H "Authorization: Bearer tok_xxx" \
+  -H "Content-Type: application/json" \
+  -d '{"name":"test"}' \
+  --format json
+```
+
+## SLO key reference
+
+| SLO key | Maps to | What it measures |
+|---------|---------|------------------|
+| `dns` | time_namelookup | DNS resolution |
+| `connect` | time_connect | DNS + TCP connect |
+| `tls` | time_pretransfer | DNS + TCP + TLS |
+| `ttfb` | time_starttransfer | Time to first byte |
+| `total` | time_total | Complete request |
+
+All values in milliseconds. Exit code is `4` when any threshold is exceeded.
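The multi-run comparison can be automated in a few lines of Python. A hedged sketch, using synthetic `timings_ms` dicts in place of the saved `run_N.json` files; the stdev-vs-mean cutoff is an illustrative heuristic, not a httpstat feature:

```python
import statistics

# Synthetic stand-ins for three runs' "timings_ms" payloads; in practice:
# runs = [json.load(open(f"run_{i}.json"))["timings_ms"] for i in (1, 2, 3)]
runs = [
    {"dns": 120, "connect": 12, "tls": 45, "server": 60, "transfer": 8},
    {"dns": 2, "connect": 11, "tls": 44, "server": 62, "transfer": 9},
    {"dns": 3, "connect": 13, "tls": 46, "server": 58, "transfer": 7},
]

# High stdev relative to the mean hints at a transient issue (e.g. a cold DNS
# cache on run 1); a large but stable mean points at a real bottleneck.
for phase in runs[0]:
    values = [run[phase] for run in runs]
    mean = statistics.mean(values)
    stdev = statistics.stdev(values)
    verdict = "transient?" if stdev > mean / 2 else "stable"
    print(f"{phase:10s} mean={mean:6.1f}ms stdev={stdev:6.1f}ms {verdict}")
```

With the synthetic data above, only the dns phase is flagged as possibly transient, which matches the "slow first lookup, fast afterwards" pattern described earlier.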
diff --git a/skills/httpstat/evals/evals.json b/skills/httpstat/evals/evals.json new file mode 100644 index 0000000..4aa1a4f --- /dev/null +++ b/skills/httpstat/evals/evals.json @@ -0,0 +1,23 @@ +{ + "skill_name": "httpstat", + "evals": [ + { + "id": 0, + "prompt": "Our /api/v2/search endpoint at https://httpbin.org/delay/2 is taking forever, can you figure out what's going on?", + "expected_output": "Should run httpstat with --format json, identify server processing as the dominant bottleneck (~2s), and explain that the delay is server-side, not network.", + "files": [] + }, + { + "id": 1, + "prompt": "I have this curl command: curl -X GET https://httpbin.org/get -H 'Accept: application/json' — can you show me where the time is being spent?", + "expected_output": "Should convert the curl command to httpstat, run it, break down timings by phase, and explain the results.", + "files": [] + }, + { + "id": 2, + "prompt": "Can you check if there's a noticeable TLS overhead when hitting https://www.google.com vs http://www.google.com?", + "expected_output": "Should run httpstat on both URLs, compare TLS timing, and quantify the overhead. 
Should note that http may redirect to https.", + "files": [] + } + ] +} diff --git a/tests/test_httpstat.py b/tests/test_httpstat.py new file mode 100644 index 0000000..dfaf330 --- /dev/null +++ b/tests/test_httpstat.py @@ -0,0 +1,320 @@ +from __future__ import annotations + +import json +import os +import pytest + +import httpstat + + +# --- parse_bool --- + +class TestParseBool: + @pytest.mark.parametrize('value', ['1', 'true', 'yes', 'on', 'TRUE', ' True ', 'YES']) + def test_truthy(self, value): + assert httpstat.parse_bool(value) is True + + @pytest.mark.parametrize('value', ['0', 'false', 'no', 'off', 'FALSE', ' False ', 'NO']) + def test_falsy(self, value): + assert httpstat.parse_bool(value) is False + + @pytest.mark.parametrize('value', ['', 'maybe', '2', 'truthy']) + def test_invalid(self, value): + with pytest.raises(ValueError): + httpstat.parse_bool(value) + + +# --- pop_arg --- + +class TestPopArg: + def test_pop_flag_with_value(self): + args = ['--format', 'json', 'https://example.com'] + val = httpstat.pop_arg(args, '--format') + assert val == 'json' + assert args == ['https://example.com'] + + def test_pop_flag_with_value_short(self): + args = ['-f', 'json', 'https://example.com'] + val = httpstat.pop_arg(args, '-f') + assert val == 'json' + assert args == ['https://example.com'] + + def test_pop_flag_no_value(self): + args = ['--verbose', 'https://example.com'] + val = httpstat.pop_arg(args, '--verbose', has_value=False) + assert val is True + assert args == ['https://example.com'] + + def test_pop_flag_missing(self): + args = ['https://example.com'] + val = httpstat.pop_arg(args, '--format') + assert val is None + assert args == ['https://example.com'] + + def test_pop_flag_missing_no_value(self): + args = ['https://example.com'] + val = httpstat.pop_arg(args, '--verbose', has_value=False) + assert val is None + assert args == ['https://example.com'] + + def test_pop_flag_at_end_missing_value(self): + args = ['https://example.com', '--format'] + 
val = httpstat.pop_arg(args, '--format') + assert val is None + assert args == ['https://example.com', '--format'] + + def test_pop_multiple_flags(self): + args = ['--format', 'json', '--slo', 'total=500', 'https://example.com'] + fmt = httpstat.pop_arg(args, '--format') + slo = httpstat.pop_arg(args, '--slo') + assert fmt == 'json' + assert slo == 'total=500' + assert args == ['https://example.com'] + + +# --- parse_slo --- + +class TestParseSlo: + def test_single_key(self): + result = httpstat.parse_slo('total=500') + assert result == {'total': 500} + + def test_multiple_keys(self): + result = httpstat.parse_slo('total=500,connect=100,ttfb=200') + assert result == {'total': 500, 'connect': 100, 'ttfb': 200} + + def test_all_valid_keys(self): + result = httpstat.parse_slo('total=1,connect=2,ttfb=3,dns=4,tls=5') + assert result == {'total': 1, 'connect': 2, 'ttfb': 3, 'dns': 4, 'tls': 5} + + def test_invalid_key(self): + with pytest.raises(SystemExit): + httpstat.parse_slo('badkey=100') + + def test_invalid_value_not_int(self): + with pytest.raises(SystemExit): + httpstat.parse_slo('total=abc') + + def test_invalid_value_negative(self): + with pytest.raises(SystemExit): + httpstat.parse_slo('total=-1') + + def test_invalid_value_zero(self): + with pytest.raises(SystemExit): + httpstat.parse_slo('total=0') + + def test_malformed_no_equals(self): + with pytest.raises(SystemExit): + httpstat.parse_slo('total') + + def test_empty_string(self): + with pytest.raises(SystemExit): + httpstat.parse_slo('') + + def test_spaces_trimmed(self): + result = httpstat.parse_slo(' total = 500 , connect = 100 ') + assert result == {'total': 500, 'connect': 100} + + +# --- check_slo --- + +class TestCheckSlo: + def _make_timings(self, **overrides): + base = { + 'time_namelookup': 5, + 'time_connect': 15, + 'time_pretransfer': 30, + 'time_starttransfer': 80, + 'time_total': 100, + } + base.update(overrides) + return base + + def test_all_pass(self): + slo = {'total': 200, 'connect': 
50} + passed, violations = httpstat.check_slo(slo, self._make_timings()) + assert passed is True + assert violations == [] + + def test_one_violation(self): + slo = {'total': 50} # actual is 100 + passed, violations = httpstat.check_slo(slo, self._make_timings()) + assert passed is False + assert len(violations) == 1 + assert violations[0]['key'] == 'total' + assert violations[0]['threshold_ms'] == 50 + assert violations[0]['actual_ms'] == 100 + + def test_multiple_violations(self): + slo = {'total': 50, 'dns': 1} + passed, violations = httpstat.check_slo(slo, self._make_timings()) + assert passed is False + assert len(violations) == 2 + + def test_exactly_at_threshold_passes(self): + slo = {'total': 100} # actual is 100, should pass + passed, violations = httpstat.check_slo(slo, self._make_timings()) + assert passed is True + assert violations == [] + + def test_ttfb_maps_to_starttransfer(self): + slo = {'ttfb': 50} # actual starttransfer is 80 + passed, violations = httpstat.check_slo(slo, self._make_timings()) + assert passed is False + assert violations[0]['actual_ms'] == 80 + + def test_tls_maps_to_pretransfer(self): + slo = {'tls': 25} # actual pretransfer is 30 + passed, violations = httpstat.check_slo(slo, self._make_timings()) + assert passed is False + assert violations[0]['actual_ms'] == 30 + + +# --- build_json_result --- + +class TestBuildJsonResult: + def _make_d(self): + return { + 'time_namelookup': 5, + 'time_connect': 15, + 'time_appconnect': 25, + 'time_pretransfer': 30, + 'time_redirect': 0, + 'time_starttransfer': 80, + 'time_total': 100, + 'speed_download': 10240.0, + 'speed_upload': 0.0, + 'remote_ip': '93.184.216.34', + 'remote_port': '443', + 'local_ip': '192.168.1.1', + 'local_port': '54321', + 'range_dns': 5, + 'range_connection': 10, + 'range_ssl': 15, + 'range_server': 50, + 'range_transfer': 20, + } + + def test_schema_version(self): + result = httpstat.build_json_result( + 'https://example.com', self._make_d(), + 'HTTP/2 
200\r\ncontent-type: text/html\r\n', + None, 0, + ) + assert result['schema_version'] == 1 + + def test_basic_fields(self): + result = httpstat.build_json_result( + 'https://example.com', self._make_d(), + 'HTTP/2 200\r\ncontent-type: text/html\r\n', + None, 0, + ) + assert result['url'] == 'https://example.com' + assert result['ok'] is True + assert result['exit_code'] == 0 + + def test_response_fields(self): + result = httpstat.build_json_result( + 'https://example.com', self._make_d(), + 'HTTP/2 200\r\ncontent-type: text/html\r\n', + None, 0, + ) + assert result['response']['status_line'] == 'HTTP/2 200' + assert result['response']['status_code'] == 200 + assert result['response']['remote_ip'] == '93.184.216.34' + assert result['response']['remote_port'] == '443' + + def test_timings_ms(self): + result = httpstat.build_json_result( + 'https://example.com', self._make_d(), + 'HTTP/2 200\r\n', + None, 0, + ) + t = result['timings_ms'] + assert t['dns'] == 5 + assert t['connect'] == 10 + assert t['tls'] == 15 + assert t['server'] == 50 + assert t['transfer'] == 20 + assert t['total'] == 100 + assert t['namelookup'] == 5 + assert t['initial_connect'] == 15 + assert t['pretransfer'] == 30 + assert t['starttransfer'] == 80 + + def test_speed(self): + result = httpstat.build_json_result( + 'https://example.com', self._make_d(), + 'HTTP/2 200\r\n', + None, 0, + ) + assert result['speed']['download_kbs'] == pytest.approx(10.0) + assert result['speed']['upload_kbs'] == 0.0 + + def test_slo_none(self): + result = httpstat.build_json_result( + 'https://example.com', self._make_d(), + 'HTTP/2 200\r\n', + None, 0, + ) + assert result['slo'] is None + + def test_slo_pass(self): + slo_result = (True, []) + result = httpstat.build_json_result( + 'https://example.com', self._make_d(), + 'HTTP/2 200\r\n', + slo_result, 0, + ) + assert result['slo']['pass'] is True + assert result['slo']['violations'] == [] + + def test_slo_fail(self): + slo_result = (False, [{'key': 'total', 
'threshold_ms': 50, 'actual_ms': 100}])
+        result = httpstat.build_json_result(
+            'https://example.com', self._make_d(),
+            'HTTP/2 200\r\n',
+            slo_result, 4,
+        )
+        assert result['slo']['pass'] is False
+        assert len(result['slo']['violations']) == 1
+        assert result['exit_code'] == 4
+        assert result['ok'] is False
+
+    def test_http1_status_line(self):
+        result = httpstat.build_json_result(
+            'http://example.com', self._make_d(),
+            'HTTP/1.1 301 Moved Permanently\r\nLocation: https://example.com\r\n',
+            None, 0,
+        )
+        assert result['response']['status_line'] == 'HTTP/1.1 301 Moved Permanently'
+        assert result['response']['status_code'] == 301
+
+    def test_json_serializable(self):
+        result = httpstat.build_json_result(
+            'https://example.com', self._make_d(),
+            'HTTP/2 200\r\n',
+            (True, []), 0,
+        )
+        # Should not raise
+        json.dumps(result)
+
+
+# --- NO_COLOR ---
+
+class TestNoColor:
+    def test_isatty_false_when_no_color_set(self, monkeypatch):
+        import importlib
+
+        # Per no-color.org, presence of NO_COLOR with any value (even empty) disables color
+        monkeypatch.setenv('NO_COLOR', '')
+        # Reload so the module-level ISATTY check is re-evaluated with NO_COLOR present
+        importlib.reload(httpstat)
+        try:
+            assert httpstat.ISATTY is False
+        finally:
+            monkeypatch.delenv('NO_COLOR')
+            importlib.reload(httpstat)
+
+    def test_make_color_returns_plain_when_not_isatty(self, monkeypatch):
+        monkeypatch.setattr(httpstat, 'ISATTY', False)
+        func = httpstat.make_color(31)
+        assert func('hello') == 'hello'
+
+    def test_make_color_returns_colored_when_isatty(self, monkeypatch):
+        monkeypatch.setattr(httpstat, 'ISATTY', True)
+        func = httpstat.make_color(31)
+        assert func('hello') == '\x1b[31mhello\x1b[0m'
diff --git a/uv.lock b/uv.lock
new file mode 100644
index 0000000..a8980c2
--- /dev/null
+++ b/uv.lock
@@ -0,0 +1,199 @@
+version = 1
+revision = 3
+requires-python = ">=3.9"
+resolution-markers = [
+    "python_full_version >= '3.10'",
+    "python_full_version < '3.10'",
+]
+
+[[package]]
+name = "colorama"
+version = "0.4.6"
+source = { registry = "https://pypi.org/simple" }
+sdist = { url =
"https://files.pythonhosted.org/packages/d8/53/6f443c9a4a8358a93a6792e2acffb9d9d5cb0a5cfd8802644b7b1c9a02e4/colorama-0.4.6.tar.gz", hash = "sha256:08695f5cb7ed6e0531a20572697297273c47b8cae5a63ffc6d6ed5c201be6e44", size = 27697, upload-time = "2022-10-25T02:36:22.414Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/d1/d6/3965ed04c63042e047cb6a3e6ed1a63a35087b6a609aa3a15ed8ac56c221/colorama-0.4.6-py2.py3-none-any.whl", hash = "sha256:4f1d9991f5acc0ca119f9d443620b77f9d6b33703e51011c16baf57afb285fc6", size = 25335, upload-time = "2022-10-25T02:36:20.889Z" }, +] + +[[package]] +name = "exceptiongroup" +version = "1.3.1" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "typing-extensions", marker = "python_full_version < '3.13'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/50/79/66800aadf48771f6b62f7eb014e352e5d06856655206165d775e675a02c9/exceptiongroup-1.3.1.tar.gz", hash = "sha256:8b412432c6055b0b7d14c310000ae93352ed6754f70fa8f7c34141f91c4e3219", size = 30371, upload-time = "2025-11-21T23:01:54.787Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/8a/0e/97c33bf5009bdbac74fd2beace167cab3f978feb69cc36f1ef79360d6c4e/exceptiongroup-1.3.1-py3-none-any.whl", hash = "sha256:a7a39a3bd276781e98394987d3a5701d0c4edffb633bb7a5144577f82c773598", size = 16740, upload-time = "2025-11-21T23:01:53.443Z" }, +] + +[[package]] +name = "httpstat" +source = { editable = "." 
} + +[package.dev-dependencies] +dev = [ + { name = "pytest", version = "8.4.2", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version < '3.10'" }, + { name = "pytest", version = "9.0.2", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.10'" }, +] + +[package.metadata] + +[package.metadata.requires-dev] +dev = [{ name = "pytest", specifier = ">=8.4.2" }] + +[[package]] +name = "iniconfig" +version = "2.1.0" +source = { registry = "https://pypi.org/simple" } +resolution-markers = [ + "python_full_version < '3.10'", +] +sdist = { url = "https://files.pythonhosted.org/packages/f2/97/ebf4da567aa6827c909642694d71c9fcf53e5b504f2d96afea02718862f3/iniconfig-2.1.0.tar.gz", hash = "sha256:3abbd2e30b36733fee78f9c7f7308f2d0050e88f0087fd25c2645f63c773e1c7", size = 4793, upload-time = "2025-03-19T20:09:59.721Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/2c/e1/e6716421ea10d38022b952c159d5161ca1193197fb744506875fbb87ea7b/iniconfig-2.1.0-py3-none-any.whl", hash = "sha256:9deba5723312380e77435581c6bf4935c94cbfab9b1ed33ef8d238ea168eb760", size = 6050, upload-time = "2025-03-19T20:10:01.071Z" }, +] + +[[package]] +name = "iniconfig" +version = "2.3.0" +source = { registry = "https://pypi.org/simple" } +resolution-markers = [ + "python_full_version >= '3.10'", +] +sdist = { url = "https://files.pythonhosted.org/packages/72/34/14ca021ce8e5dfedc35312d08ba8bf51fdd999c576889fc2c24cb97f4f10/iniconfig-2.3.0.tar.gz", hash = "sha256:c76315c77db068650d49c5b56314774a7804df16fee4402c1f19d6d15d8c4730", size = 20503, upload-time = "2025-10-18T21:55:43.219Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/cb/b1/3846dd7f199d53cb17f49cba7e651e9ce294d8497c8c150530ed11865bb8/iniconfig-2.3.0-py3-none-any.whl", hash = "sha256:f631c04d2c48c52b84d0d0549c99ff3859c98df65b3101406327ecc7d53fbf12", size = 7484, upload-time = "2025-10-18T21:55:41.639Z" }, +] + +[[package]] +name = "packaging" +version = "26.0" 
+source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/65/ee/299d360cdc32edc7d2cf530f3accf79c4fca01e96ffc950d8a52213bd8e4/packaging-26.0.tar.gz", hash = "sha256:00243ae351a257117b6a241061796684b084ed1c516a08c48a3f7e147a9d80b4", size = 143416, upload-time = "2026-01-21T20:50:39.064Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/b7/b9/c538f279a4e237a006a2c98387d081e9eb060d203d8ed34467cc0f0b9b53/packaging-26.0-py3-none-any.whl", hash = "sha256:b36f1fef9334a5588b4166f8bcd26a14e521f2b55e6b9de3aaa80d3ff7a37529", size = 74366, upload-time = "2026-01-21T20:50:37.788Z" }, +] + +[[package]] +name = "pluggy" +version = "1.6.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/f9/e2/3e91f31a7d2b083fe6ef3fa267035b518369d9511ffab804f839851d2779/pluggy-1.6.0.tar.gz", hash = "sha256:7dcc130b76258d33b90f61b658791dede3486c3e6bfb003ee5c9bfb396dd22f3", size = 69412, upload-time = "2025-05-15T12:30:07.975Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/54/20/4d324d65cc6d9205fabedc306948156824eb9f0ee1633355a8f7ec5c66bf/pluggy-1.6.0-py3-none-any.whl", hash = "sha256:e920276dd6813095e9377c0bc5566d94c932c33b27a3e3945d8389c374dd4746", size = 20538, upload-time = "2025-05-15T12:30:06.134Z" }, +] + +[[package]] +name = "pygments" +version = "2.19.2" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/b0/77/a5b8c569bf593b0140bde72ea885a803b82086995367bf2037de0159d924/pygments-2.19.2.tar.gz", hash = "sha256:636cb2477cec7f8952536970bc533bc43743542f70392ae026374600add5b887", size = 4968631, upload-time = "2025-06-21T13:39:12.283Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/c7/21/705964c7812476f378728bdf590ca4b771ec72385c533964653c68e86bdc/pygments-2.19.2-py3-none-any.whl", hash = "sha256:86540386c03d588bb81d44bc3928634ff26449851e99741617ecb9037ee5ec0b", size = 1225217, 
upload-time = "2025-06-21T13:39:07.939Z" }, +] + +[[package]] +name = "pytest" +version = "8.4.2" +source = { registry = "https://pypi.org/simple" } +resolution-markers = [ + "python_full_version < '3.10'", +] +dependencies = [ + { name = "colorama", marker = "python_full_version < '3.10' and sys_platform == 'win32'" }, + { name = "exceptiongroup", marker = "python_full_version < '3.10'" }, + { name = "iniconfig", version = "2.1.0", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version < '3.10'" }, + { name = "packaging", marker = "python_full_version < '3.10'" }, + { name = "pluggy", marker = "python_full_version < '3.10'" }, + { name = "pygments", marker = "python_full_version < '3.10'" }, + { name = "tomli", marker = "python_full_version < '3.10'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/a3/5c/00a0e072241553e1a7496d638deababa67c5058571567b92a7eaa258397c/pytest-8.4.2.tar.gz", hash = "sha256:86c0d0b93306b961d58d62a4db4879f27fe25513d4b969df351abdddb3c30e01", size = 1519618, upload-time = "2025-09-04T14:34:22.711Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/a8/a4/20da314d277121d6534b3a980b29035dcd51e6744bd79075a6ce8fa4eb8d/pytest-8.4.2-py3-none-any.whl", hash = "sha256:872f880de3fc3a5bdc88a11b39c9710c3497a547cfa9320bc3c5e62fbf272e79", size = 365750, upload-time = "2025-09-04T14:34:20.226Z" }, +] + +[[package]] +name = "pytest" +version = "9.0.2" +source = { registry = "https://pypi.org/simple" } +resolution-markers = [ + "python_full_version >= '3.10'", +] +dependencies = [ + { name = "colorama", marker = "python_full_version >= '3.10' and sys_platform == 'win32'" }, + { name = "exceptiongroup", marker = "python_full_version == '3.10.*'" }, + { name = "iniconfig", version = "2.3.0", source = { registry = "https://pypi.org/simple" }, marker = "python_full_version >= '3.10'" }, + { name = "packaging", marker = "python_full_version >= '3.10'" }, + { name = "pluggy", marker = "python_full_version >= 
'3.10'" }, + { name = "pygments", marker = "python_full_version >= '3.10'" }, + { name = "tomli", marker = "python_full_version == '3.10.*'" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/d1/db/7ef3487e0fb0049ddb5ce41d3a49c235bf9ad299b6a25d5780a89f19230f/pytest-9.0.2.tar.gz", hash = "sha256:75186651a92bd89611d1d9fc20f0b4345fd827c41ccd5c299a868a05d70edf11", size = 1568901, upload-time = "2025-12-06T21:30:51.014Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/3b/ab/b3226f0bd7cdcf710fbede2b3548584366da3b19b5021e74f5bde2a8fa3f/pytest-9.0.2-py3-none-any.whl", hash = "sha256:711ffd45bf766d5264d487b917733b453d917afd2b0ad65223959f59089f875b", size = 374801, upload-time = "2025-12-06T21:30:49.154Z" }, +] + +[[package]] +name = "tomli" +version = "2.4.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/82/30/31573e9457673ab10aa432461bee537ce6cef177667deca369efb79df071/tomli-2.4.0.tar.gz", hash = "sha256:aa89c3f6c277dd275d8e243ad24f3b5e701491a860d5121f2cdd399fbb31fc9c", size = 17477, upload-time = "2026-01-11T11:22:38.165Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/3c/d9/3dc2289e1f3b32eb19b9785b6a006b28ee99acb37d1d47f78d4c10e28bf8/tomli-2.4.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:b5ef256a3fd497d4973c11bf142e9ed78b150d36f5773f1ca6088c230ffc5867", size = 153663, upload-time = "2026-01-11T11:21:45.27Z" }, + { url = "https://files.pythonhosted.org/packages/51/32/ef9f6845e6b9ca392cd3f64f9ec185cc6f09f0a2df3db08cbe8809d1d435/tomli-2.4.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:5572e41282d5268eb09a697c89a7bee84fae66511f87533a6f88bd2f7b652da9", size = 148469, upload-time = "2026-01-11T11:21:46.873Z" }, + { url = "https://files.pythonhosted.org/packages/d6/c2/506e44cce89a8b1b1e047d64bd495c22c9f71f21e05f380f1a950dd9c217/tomli-2.4.0-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = 
"sha256:551e321c6ba03b55676970b47cb1b73f14a0a4dce6a3e1a9458fd6d921d72e95", size = 236039, upload-time = "2026-01-11T11:21:48.503Z" }, + { url = "https://files.pythonhosted.org/packages/b3/40/e1b65986dbc861b7e986e8ec394598187fa8aee85b1650b01dd925ca0be8/tomli-2.4.0-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:5e3f639a7a8f10069d0e15408c0b96a2a828cfdec6fca05296ebcdcc28ca7c76", size = 243007, upload-time = "2026-01-11T11:21:49.456Z" }, + { url = "https://files.pythonhosted.org/packages/9c/6f/6e39ce66b58a5b7ae572a0f4352ff40c71e8573633deda43f6a379d56b3e/tomli-2.4.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:1b168f2731796b045128c45982d3a4874057626da0e2ef1fdd722848b741361d", size = 240875, upload-time = "2026-01-11T11:21:50.755Z" }, + { url = "https://files.pythonhosted.org/packages/aa/ad/cb089cb190487caa80204d503c7fd0f4d443f90b95cf4ef5cf5aa0f439b0/tomli-2.4.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:133e93646ec4300d651839d382d63edff11d8978be23da4cc106f5a18b7d0576", size = 246271, upload-time = "2026-01-11T11:21:51.81Z" }, + { url = "https://files.pythonhosted.org/packages/0b/63/69125220e47fd7a3a27fd0de0c6398c89432fec41bc739823bcc66506af6/tomli-2.4.0-cp311-cp311-win32.whl", hash = "sha256:b6c78bdf37764092d369722d9946cb65b8767bfa4110f902a1b2542d8d173c8a", size = 96770, upload-time = "2026-01-11T11:21:52.647Z" }, + { url = "https://files.pythonhosted.org/packages/1e/0d/a22bb6c83f83386b0008425a6cd1fa1c14b5f3dd4bad05e98cf3dbbf4a64/tomli-2.4.0-cp311-cp311-win_amd64.whl", hash = "sha256:d3d1654e11d724760cdb37a3d7691f0be9db5fbdaef59c9f532aabf87006dbaa", size = 107626, upload-time = "2026-01-11T11:21:53.459Z" }, + { url = "https://files.pythonhosted.org/packages/2f/6d/77be674a3485e75cacbf2ddba2b146911477bd887dda9d8c9dfb2f15e871/tomli-2.4.0-cp311-cp311-win_arm64.whl", hash = "sha256:cae9c19ed12d4e8f3ebf46d1a75090e4c0dc16271c5bce1c833ac168f08fb614", size = 94842, upload-time = "2026-01-11T11:21:54.831Z" }, 
+ { url = "https://files.pythonhosted.org/packages/3c/43/7389a1869f2f26dba52404e1ef13b4784b6b37dac93bac53457e3ff24ca3/tomli-2.4.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:920b1de295e72887bafa3ad9f7a792f811847d57ea6b1215154030cf131f16b1", size = 154894, upload-time = "2026-01-11T11:21:56.07Z" }, + { url = "https://files.pythonhosted.org/packages/e9/05/2f9bf110b5294132b2edf13fe6ca6ae456204f3d749f623307cbb7a946f2/tomli-2.4.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:7d6d9a4aee98fac3eab4952ad1d73aee87359452d1c086b5ceb43ed02ddb16b8", size = 149053, upload-time = "2026-01-11T11:21:57.467Z" }, + { url = "https://files.pythonhosted.org/packages/e8/41/1eda3ca1abc6f6154a8db4d714a4d35c4ad90adc0bcf700657291593fbf3/tomli-2.4.0-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:36b9d05b51e65b254ea6c2585b59d2c4cb91c8a3d91d0ed0f17591a29aaea54a", size = 243481, upload-time = "2026-01-11T11:21:58.661Z" }, + { url = "https://files.pythonhosted.org/packages/d2/6d/02ff5ab6c8868b41e7d4b987ce2b5f6a51d3335a70aa144edd999e055a01/tomli-2.4.0-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:1c8a885b370751837c029ef9bc014f27d80840e48bac415f3412e6593bbc18c1", size = 251720, upload-time = "2026-01-11T11:22:00.178Z" }, + { url = "https://files.pythonhosted.org/packages/7b/57/0405c59a909c45d5b6f146107c6d997825aa87568b042042f7a9c0afed34/tomli-2.4.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:8768715ffc41f0008abe25d808c20c3d990f42b6e2e58305d5da280ae7d1fa3b", size = 247014, upload-time = "2026-01-11T11:22:01.238Z" }, + { url = "https://files.pythonhosted.org/packages/2c/0e/2e37568edd944b4165735687cbaf2fe3648129e440c26d02223672ee0630/tomli-2.4.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:7b438885858efd5be02a9a133caf5812b8776ee0c969fea02c45e8e3f296ba51", size = 251820, upload-time = "2026-01-11T11:22:02.727Z" }, + { url = 
"https://files.pythonhosted.org/packages/5a/1c/ee3b707fdac82aeeb92d1a113f803cf6d0f37bdca0849cb489553e1f417a/tomli-2.4.0-cp312-cp312-win32.whl", hash = "sha256:0408e3de5ec77cc7f81960c362543cbbd91ef883e3138e81b729fc3eea5b9729", size = 97712, upload-time = "2026-01-11T11:22:03.777Z" }, + { url = "https://files.pythonhosted.org/packages/69/13/c07a9177d0b3bab7913299b9278845fc6eaaca14a02667c6be0b0a2270c8/tomli-2.4.0-cp312-cp312-win_amd64.whl", hash = "sha256:685306e2cc7da35be4ee914fd34ab801a6acacb061b6a7abca922aaf9ad368da", size = 108296, upload-time = "2026-01-11T11:22:04.86Z" }, + { url = "https://files.pythonhosted.org/packages/18/27/e267a60bbeeee343bcc279bb9e8fbed0cbe224bc7b2a3dc2975f22809a09/tomli-2.4.0-cp312-cp312-win_arm64.whl", hash = "sha256:5aa48d7c2356055feef06a43611fc401a07337d5b006be13a30f6c58f869e3c3", size = 94553, upload-time = "2026-01-11T11:22:05.854Z" }, + { url = "https://files.pythonhosted.org/packages/34/91/7f65f9809f2936e1f4ce6268ae1903074563603b2a2bd969ebbda802744f/tomli-2.4.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:84d081fbc252d1b6a982e1870660e7330fb8f90f676f6e78b052ad4e64714bf0", size = 154915, upload-time = "2026-01-11T11:22:06.703Z" }, + { url = "https://files.pythonhosted.org/packages/20/aa/64dd73a5a849c2e8f216b755599c511badde80e91e9bc2271baa7b2cdbb1/tomli-2.4.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:9a08144fa4cba33db5255f9b74f0b89888622109bd2776148f2597447f92a94e", size = 149038, upload-time = "2026-01-11T11:22:07.56Z" }, + { url = "https://files.pythonhosted.org/packages/9e/8a/6d38870bd3d52c8d1505ce054469a73f73a0fe62c0eaf5dddf61447e32fa/tomli-2.4.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:c73add4bb52a206fd0c0723432db123c0c75c280cbd67174dd9d2db228ebb1b4", size = 242245, upload-time = "2026-01-11T11:22:08.344Z" }, + { url = 
"https://files.pythonhosted.org/packages/59/bb/8002fadefb64ab2669e5b977df3f5e444febea60e717e755b38bb7c41029/tomli-2.4.0-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:1fb2945cbe303b1419e2706e711b7113da57b7db31ee378d08712d678a34e51e", size = 250335, upload-time = "2026-01-11T11:22:09.951Z" }, + { url = "https://files.pythonhosted.org/packages/a5/3d/4cdb6f791682b2ea916af2de96121b3cb1284d7c203d97d92d6003e91c8d/tomli-2.4.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:bbb1b10aa643d973366dc2cb1ad94f99c1726a02343d43cbc011edbfac579e7c", size = 245962, upload-time = "2026-01-11T11:22:11.27Z" }, + { url = "https://files.pythonhosted.org/packages/f2/4a/5f25789f9a460bd858ba9756ff52d0830d825b458e13f754952dd15fb7bb/tomli-2.4.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:4cbcb367d44a1f0c2be408758b43e1ffb5308abe0ea222897d6bfc8e8281ef2f", size = 250396, upload-time = "2026-01-11T11:22:12.325Z" }, + { url = "https://files.pythonhosted.org/packages/aa/2f/b73a36fea58dfa08e8b3a268750e6853a6aac2a349241a905ebd86f3047a/tomli-2.4.0-cp313-cp313-win32.whl", hash = "sha256:7d49c66a7d5e56ac959cb6fc583aff0651094ec071ba9ad43df785abc2320d86", size = 97530, upload-time = "2026-01-11T11:22:13.865Z" }, + { url = "https://files.pythonhosted.org/packages/3b/af/ca18c134b5d75de7e8dc551c5234eaba2e8e951f6b30139599b53de9c187/tomli-2.4.0-cp313-cp313-win_amd64.whl", hash = "sha256:3cf226acb51d8f1c394c1b310e0e0e61fecdd7adcb78d01e294ac297dd2e7f87", size = 108227, upload-time = "2026-01-11T11:22:15.224Z" }, + { url = "https://files.pythonhosted.org/packages/22/c3/b386b832f209fee8073c8138ec50f27b4460db2fdae9ffe022df89a57f9b/tomli-2.4.0-cp313-cp313-win_arm64.whl", hash = "sha256:d20b797a5c1ad80c516e41bc1fb0443ddb5006e9aaa7bda2d71978346aeb9132", size = 94748, upload-time = "2026-01-11T11:22:16.009Z" }, + { url = 
"https://files.pythonhosted.org/packages/f3/c4/84047a97eb1004418bc10bdbcfebda209fca6338002eba2dc27cc6d13563/tomli-2.4.0-cp314-cp314-macosx_10_15_x86_64.whl", hash = "sha256:26ab906a1eb794cd4e103691daa23d95c6919cc2fa9160000ac02370cc9dd3f6", size = 154725, upload-time = "2026-01-11T11:22:17.269Z" }, + { url = "https://files.pythonhosted.org/packages/a8/5d/d39038e646060b9d76274078cddf146ced86dc2b9e8bbf737ad5983609a0/tomli-2.4.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:20cedb4ee43278bc4f2fee6cb50daec836959aadaf948db5172e776dd3d993fc", size = 148901, upload-time = "2026-01-11T11:22:18.287Z" }, + { url = "https://files.pythonhosted.org/packages/73/e5/383be1724cb30f4ce44983d249645684a48c435e1cd4f8b5cded8a816d3c/tomli-2.4.0-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:39b0b5d1b6dd03684b3fb276407ebed7090bbec989fa55838c98560c01113b66", size = 243375, upload-time = "2026-01-11T11:22:19.154Z" }, + { url = "https://files.pythonhosted.org/packages/31/f0/bea80c17971c8d16d3cc109dc3585b0f2ce1036b5f4a8a183789023574f2/tomli-2.4.0-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:a26d7ff68dfdb9f87a016ecfd1e1c2bacbe3108f4e0f8bcd2228ef9a766c787d", size = 250639, upload-time = "2026-01-11T11:22:20.168Z" }, + { url = "https://files.pythonhosted.org/packages/2c/8f/2853c36abbb7608e3f945d8a74e32ed3a74ee3a1f468f1ffc7d1cb3abba6/tomli-2.4.0-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:20ffd184fb1df76a66e34bd1b36b4a4641bd2b82954befa32fe8163e79f1a702", size = 246897, upload-time = "2026-01-11T11:22:21.544Z" }, + { url = "https://files.pythonhosted.org/packages/49/f0/6c05e3196ed5337b9fe7ea003e95fd3819a840b7a0f2bf5a408ef1dad8ed/tomli-2.4.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:75c2f8bbddf170e8effc98f5e9084a8751f8174ea6ccf4fca5398436e0320bc8", size = 254697, upload-time = "2026-01-11T11:22:23.058Z" }, + { url = 
"https://files.pythonhosted.org/packages/f3/f5/2922ef29c9f2951883525def7429967fc4d8208494e5ab524234f06b688b/tomli-2.4.0-cp314-cp314-win32.whl", hash = "sha256:31d556d079d72db7c584c0627ff3a24c5d3fb4f730221d3444f3efb1b2514776", size = 98567, upload-time = "2026-01-11T11:22:24.033Z" }, + { url = "https://files.pythonhosted.org/packages/7b/31/22b52e2e06dd2a5fdbc3ee73226d763b184ff21fc24e20316a44ccc4d96b/tomli-2.4.0-cp314-cp314-win_amd64.whl", hash = "sha256:43e685b9b2341681907759cf3a04e14d7104b3580f808cfde1dfdb60ada85475", size = 108556, upload-time = "2026-01-11T11:22:25.378Z" }, + { url = "https://files.pythonhosted.org/packages/48/3d/5058dff3255a3d01b705413f64f4306a141a8fd7a251e5a495e3f192a998/tomli-2.4.0-cp314-cp314-win_arm64.whl", hash = "sha256:3d895d56bd3f82ddd6faaff993c275efc2ff38e52322ea264122d72729dca2b2", size = 96014, upload-time = "2026-01-11T11:22:26.138Z" }, + { url = "https://files.pythonhosted.org/packages/b8/4e/75dab8586e268424202d3a1997ef6014919c941b50642a1682df43204c22/tomli-2.4.0-cp314-cp314t-macosx_10_15_x86_64.whl", hash = "sha256:5b5807f3999fb66776dbce568cc9a828544244a8eb84b84b9bafc080c99597b9", size = 163339, upload-time = "2026-01-11T11:22:27.143Z" }, + { url = "https://files.pythonhosted.org/packages/06/e3/b904d9ab1016829a776d97f163f183a48be6a4deb87304d1e0116a349519/tomli-2.4.0-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:c084ad935abe686bd9c898e62a02a19abfc9760b5a79bc29644463eaf2840cb0", size = 159490, upload-time = "2026-01-11T11:22:28.399Z" }, + { url = "https://files.pythonhosted.org/packages/e3/5a/fc3622c8b1ad823e8ea98a35e3c632ee316d48f66f80f9708ceb4f2a0322/tomli-2.4.0-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:0f2e3955efea4d1cfbcb87bc321e00dc08d2bcb737fd1d5e398af111d86db5df", size = 269398, upload-time = "2026-01-11T11:22:29.345Z" }, + { url = 
"https://files.pythonhosted.org/packages/fd/33/62bd6152c8bdd4c305ad9faca48f51d3acb2df1f8791b1477d46ff86e7f8/tomli-2.4.0-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:0e0fe8a0b8312acf3a88077a0802565cb09ee34107813bba1c7cd591fa6cfc8d", size = 276515, upload-time = "2026-01-11T11:22:30.327Z" }, + { url = "https://files.pythonhosted.org/packages/4b/ff/ae53619499f5235ee4211e62a8d7982ba9e439a0fb4f2f351a93d67c1dd2/tomli-2.4.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:413540dce94673591859c4c6f794dfeaa845e98bf35d72ed59636f869ef9f86f", size = 273806, upload-time = "2026-01-11T11:22:32.56Z" }, + { url = "https://files.pythonhosted.org/packages/47/71/cbca7787fa68d4d0a9f7072821980b39fbb1b6faeb5f5cf02f4a5559fa28/tomli-2.4.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:0dc56fef0e2c1c470aeac5b6ca8cc7b640bb93e92d9803ddaf9ea03e198f5b0b", size = 281340, upload-time = "2026-01-11T11:22:33.505Z" }, + { url = "https://files.pythonhosted.org/packages/f5/00/d595c120963ad42474cf6ee7771ad0d0e8a49d0f01e29576ee9195d9ecdf/tomli-2.4.0-cp314-cp314t-win32.whl", hash = "sha256:d878f2a6707cc9d53a1be1414bbb419e629c3d6e67f69230217bb663e76b5087", size = 108106, upload-time = "2026-01-11T11:22:34.451Z" }, + { url = "https://files.pythonhosted.org/packages/de/69/9aa0c6a505c2f80e519b43764f8b4ba93b5a0bbd2d9a9de6e2b24271b9a5/tomli-2.4.0-cp314-cp314t-win_amd64.whl", hash = "sha256:2add28aacc7425117ff6364fe9e06a183bb0251b03f986df0e78e974047571fd", size = 120504, upload-time = "2026-01-11T11:22:35.764Z" }, + { url = "https://files.pythonhosted.org/packages/b3/9f/f1668c281c58cfae01482f7114a4b88d345e4c140386241a1a24dcc9e7bc/tomli-2.4.0-cp314-cp314t-win_arm64.whl", hash = "sha256:2b1e3b80e1d5e52e40e9b924ec43d81570f0e7d09d11081b797bc4692765a3d4", size = 99561, upload-time = "2026-01-11T11:22:36.624Z" }, + { url = 
"https://files.pythonhosted.org/packages/23/d1/136eb2cb77520a31e1f64cbae9d33ec6df0d78bdf4160398e86eec8a8754/tomli-2.4.0-py3-none-any.whl", hash = "sha256:1f776e7d669ebceb01dee46484485f43a4048746235e683bcdffacdf1fb4785a", size = 14477, upload-time = "2026-01-11T11:22:37.446Z" }, +] + +[[package]] +name = "typing-extensions" +version = "4.15.0" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/72/94/1a15dd82efb362ac84269196e94cf00f187f7ed21c242792a923cdb1c61f/typing_extensions-4.15.0.tar.gz", hash = "sha256:0cea48d173cc12fa28ecabc3b837ea3cf6f38c6d1136f85cbaaf598984861466", size = 109391, upload-time = "2025-08-25T13:49:26.313Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/18/67/36e9267722cc04a6b9f15c7f3441c2363321a3ea07da7ae0c0707beb2a9c/typing_extensions-4.15.0-py3-none-any.whl", hash = "sha256:f0fa19c6845758ab08074a0cfa8b7aecb71c999ca73d62883bc25cc018c4e548", size = 44614, upload-time = "2025-08-25T13:49:24.86Z" }, +]