runtime error

Exit code: 2. Reason:
ources_1.11.0.json: 439kB [00:00, 161MB/s]
2026-03-25 04:38:15 INFO: Downloaded file to /home/appuser/.cache/stanza/1.11.0/resources/resources.json
2026-03-25 04:38:15 DEBUG: Loading resource file...
2026-03-25 04:38:15 DEBUG: Processing parameter "processors"...
2026-03-25 04:38:15 DEBUG: Found tokenize: combined.
2026-03-25 04:38:15 WARNING: Can not find ner: ontonotes_bert from official model list. Ignoring it.
2026-03-25 04:38:15 DEBUG: Found dependencies [] for processor tokenize model combined
2026-03-25 04:38:15 DEBUG: Downloading these customized packages for language: en (English)...
========================
| Processor | Package  |
------------------------
| tokenize  | combined |
========================
Downloading https://huggingface.co/stanfordnlp/stanza-en/resolve/v1.11.0/models/tokenize/combined.pt:   0%|          | 0.00/651k [00:00<?, ?B/s]
Downloading https://huggingface.co/stanfordnlp/stanza-en/resolve/v1.11.0/models/tokenize/combined.pt:  20%|██        | 131k/651k [00:00<00:01, 366kB/s]
Downloading https://huggingface.co/stanfordnlp/stanza-en/resolve/v1.11.0/models/tokenize/combined.pt: 100%|██████████| 651k/651k [00:00<00:00, 1.80MB/s]
2026-03-25 04:38:15 DEBUG: Downloaded file to /home/appuser/.cache/stanza/1.11.0/resources/en/tokenize/combined.pt
2026-03-25 04:38:15 INFO: Loading these models for language: en (English):
========================
| Processor | Package  |
------------------------
| tokenize  | combined |
========================
2026-03-25 04:38:15 INFO: Using device: cpu
2026-03-25 04:38:15 INFO: Loading: tokenize
2026-03-25 04:38:15 DEBUG: With settings:
2026-03-25 04:38:15 DEBUG: {'model_path': '/home/appuser/.cache/stanza/1.11.0/resources/en/tokenize/combined.pt', 'pretokenized': True, 'lang': 'en', 'mode': 'predict'}
2026-03-25 04:38:15 INFO: Done loading processors!
0it [00:00, ?it/s]
0it [00:00, ?it/s]
usage: start_mcp.py [-h] filenames [filenames ...]
start_mcp.py: error: the following arguments are required: filenames
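The usage line and the exit code fit argparse's standard behavior: when a required positional argument is missing, argparse prints the usage message and exits with status 2. A minimal sketch reproducing that failure mode (the parser definition below is an assumption; the real `start_mcp.py` may define additional options):

```python
import argparse

# Hypothetical reconstruction of start_mcp.py's argument parser:
# one required positional argument accepting one or more filenames.
parser = argparse.ArgumentParser(prog="start_mcp.py")
parser.add_argument("filenames", nargs="+", help="input files to process")

try:
    # Simulate invoking the script with no arguments, as in the failing run.
    parser.parse_args([])
except SystemExit as exc:
    # argparse raises SystemExit(2) after printing the usage/error message.
    print(f"exit code: {exc.code}")  # → exit code: 2
```

So the fix is to pass at least one filename on the command line; the container's entrypoint is apparently launching the script without any.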
