Risk Score

100/100 (Very Low)

OpenClaw: benign
VirusTotal: benign
StaticScan: clean

Crawlee Web Scraper

Author: Bryan Tegomoh, MD, MPH
Slug: crawlee-web-scraper
Version: 1.0.0
Updated: 2026-03-22 00:21:47

Risk Information

OpenClaw: benign

OpenClaw analysis summary (first 200 characters):
The skill's code, instructions, and requirements align with its stated purpose (a Crawlee-based fallback scraper); it requests no unrelated credentials or system access and contains no hidden exfiltra...

[Content truncated]

VirusTotal: benign (VT report)

Static scan: clean

No suspicious patterns detected.
README

No README provided

Files

No file information available

Raw JSON Data
{
    "latestVersion": {
        "_creationTime": 1774109660679,
        "_id": "k97555hfe90wskh0x74dc12m7x83a3zy",
        "changelog": "Initial release of crawlee-web-scraper.\n\n- Provides resilient web scraping with evasion for bot detection and rate limits using Crawlee.\n- Supports both single URLs and bulk file input for scraping.\n- Implements automatic fallback: tries regular requests, then uses Crawlee on 403\/429\/503 errors.\n- Returns standardized JSON output per URL with metadata and extracted content.\n- Drop-in replacement for web_fetch, with simple command-line and Python library usage.",
        "changelogSource": "auto",
        "createdAt": 1774109660679,
        "version": "1.0.0"
    },
    "owner": {
        "_creationTime": 0,
        "_id": "publishers:missing",
        "displayName": "Bryan Tegomoh, MD, MPH",
        "handle": "bryantegomoh",
        "image": "https:\/\/avatars.githubusercontent.com\/u\/67350434?v=4",
        "kind": "user",
        "linkedUserId": "kn72bqwej19pkna60r4fb51x5d818hrf"
    },
    "ownerHandle": "bryantegomoh",
    "skill": {
        "_creationTime": 1774109660679,
        "_id": "kd71bmtx1gv3fkbd62049cagq983a5bz",
        "badges": [],
        "createdAt": 1774109660679,
        "displayName": "Crawlee Web Scraper",
        "latestVersionId": "k97555hfe90wskh0x74dc12m7x83a3zy",
        "ownerUserId": "kn72bqwej19pkna60r4fb51x5d818hrf",
        "slug": "crawlee-web-scraper",
        "stats": {
            "comments": 0,
            "downloads": 33,
            "installsAllTime": 0,
            "installsCurrent": 0,
            "stars": 0,
            "versions": 1
        },
        "summary": "Resilient web scraper with bot-detection evasion using the Crawlee library. Use when web_fetch is blocked by rate limits or bot detection. Supports single UR...",
        "tags": {
            "latest": "k97555hfe90wskh0x74dc12m7x83a3zy"
        },
        "updatedAt": 1774110107974
    }
}
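The changelog describes the skill's core mechanism: try a regular HTTP request first, and fall back to Crawlee only when the response indicates bot detection or rate limiting (403/429/503). A minimal sketch of that fallback pattern, assuming illustrative stand-in fetchers (`plain_fetch` and `crawlee_fetch` are hypothetical names, not the skill's actual API):

```python
# Sketch of the fallback pattern described in the changelog: attempt a
# plain request first, and retry with a Crawlee-style evasion scraper on
# bot-detection / rate-limit status codes. The fetcher functions below
# are illustrative stubs, not the skill's real implementation.

FALLBACK_STATUSES = {403, 429, 503}

def fetch_with_fallback(url, plain_fetch, crawlee_fetch):
    """Return (content, used_fallback) for a URL."""
    status, body = plain_fetch(url)
    if status in FALLBACK_STATUSES:
        # Plain request was blocked; retry with the evasion-capable scraper.
        return crawlee_fetch(url), True
    return body, False

# Stub fetchers simulating a site that rate-limits plain requests.
def plain_fetch(url):
    return (429, "")          # pretend the plain request was rate-limited

def crawlee_fetch(url):
    return "<html>ok</html>"  # pretend the fallback scraper succeeded

content, used_fallback = fetch_with_fallback(
    "https://example.com", plain_fetch, crawlee_fetch
)
print(content, used_fallback)  # → <html>ok</html> True
```

On a 200 response the same call would return the plain body with `used_fallback` set to `False`, which matches the changelog's "drop-in replacement for web_fetch" framing: normal fetches stay cheap, and the heavier Crawlee path runs only when needed.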