junta-leiloeiros

Collects and queries data on official auctioneers (leiloeiros) from all 27 Brazilian Boards of Trade (Juntas Comerciais). Multi-state scraper, SQLite database, FastAPI REST API, and CSV/JSON export.

Safety Notice

This listing is imported from skills.sh public index metadata. Review upstream SKILL.md and repository scripts before running.

Copy the following and send it to your AI assistant:

Install skill "junta-leiloeiros" with this command: npx skills add sickn33/antigravity-awesome-skills/sickn33-antigravity-awesome-skills-junta-leiloeiros

Skill: Leiloeiros das Juntas Comerciais do Brasil

Overview

Collects and queries data on official auctioneers (leiloeiros) from all 27 Brazilian Boards of Trade (Juntas Comerciais). Multi-state scraper, SQLite database, FastAPI REST API, and CSV/JSON export.

When to Use This Skill

  • When the user mentions "leiloeiro junta" or related topics
  • When the user mentions "junta comercial leiloeiro" or related topics
  • When the user mentions "scraper junta" or related topics
  • When the user mentions "jucesp leiloeiro" or related topics
  • When the user mentions "jucerja" or related topics
  • When the user mentions "jucemg leiloeiro" or related topics

Do Not Use This Skill When

  • The task is unrelated to Brazilian Junta Comercial auctioneer data
  • A simpler, more specific tool can handle the request
  • The user needs general-purpose assistance without domain expertise

How It Works

Collects public data on official auctioneers from all 27 state Boards of Trade (Juntas Comerciais), persists it in a local SQLite database, and exposes a REST API plus export in multiple formats.
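As a rough sketch of the storage side, the flow can be modeled with Python's built-in `sqlite3` module. The column names below are illustrative assumptions inferred from this page; the actual schema is documented in references/schema.md.

```python
import sqlite3

# Hypothetical minimal schema (the real one lives in references/schema.md).
conn = sqlite3.connect(":memory:")
conn.execute(
    """CREATE TABLE leiloeiros (
        id INTEGER PRIMARY KEY,
        nome TEXT NOT NULL,
        matricula TEXT,
        estado TEXT NOT NULL,  -- two-letter state code, e.g. 'SP'
        junta TEXT,            -- e.g. 'JUCESP'
        situacao TEXT          -- e.g. 'ATIVO'
    )"""
)
# A scraper run would upsert one row per auctioneer it finds.
conn.execute(
    "INSERT INTO leiloeiros (nome, matricula, estado, junta, situacao) "
    "VALUES (?, ?, ?, ?, ?)",
    ("Fulano de Tal", "123", "SP", "JUCESP", "ATIVO"),
)
# The per-state statistics view is then a simple GROUP BY.
rows = conn.execute(
    "SELECT estado, COUNT(*) FROM leiloeiros GROUP BY estado"
).fetchall()
print(rows)  # [('SP', 1)]
```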

Location

C:\Users\renat\skills\junta-leiloeiros\
├── scripts/
│   ├── scraper/
│   │   ├── base_scraper.py      ← abstract base class
│   │   ├── states.py            ← registry of the 27 scrapers
│   │   ├── jucesp.py / jucerja.py / jucemg.py / jucec.py / jucis_df.py
│   │   └── generic_scraper.py   ← used by the remaining 22 states
│   ├── db.py                    ← SQLite database layer
│   ├── run_all.py               ← scraping orchestrator
│   ├── serve_api.py             ← FastAPI server
│   ├── export.py                ← export utilities
│   └── requirements.txt
├── references/
│   ├── juntas_urls.md           ← URLs and status for all 27 juntas
│   ├── schema.md                ← database schema
│   └── legal.md                 ← legal basis
└── data/
    ├── leiloeiros.db            ← SQLite database (created on first run)
    ├── scraping_log.json        ← log of each run
    └── exports/                 ← exported files

Installation (One-Time)

pip install -r C:\Users\renat\skills\junta-leiloeiros\scripts\requirements.txt

## For Sites That Require JavaScript

playwright install chromium

Collect Data


## All 27 States

python C:\Users\renat\skills\junta-leiloeiros\scripts\run_all.py

## Specific States

python C:\Users\renat\skills\junta-leiloeiros\scripts\run_all.py --estado SP RJ MG

## Preview What Would Be Collected (Dry Run)

python C:\Users\renat\skills\junta-leiloeiros\scripts\run_all.py --dry-run

## Control Parallelism (Default: 5)

python C:\Users\renat\skills\junta-leiloeiros\scripts\run_all.py --concurrency 3
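The `--concurrency` flag suggests a bounded-parallelism pattern. A minimal sketch with `asyncio.Semaphore` follows; the actual run_all.py implementation may differ, and `scrape_state` here is a placeholder:

```python
import asyncio

async def scrape_state(uf: str, sem: asyncio.Semaphore) -> str:
    async with sem:             # at most N scrapers run at once
        await asyncio.sleep(0)  # placeholder for real network I/O
        return uf

async def run_all(states, concurrency=5):
    sem = asyncio.Semaphore(concurrency)
    # gather() preserves input order regardless of completion order
    return await asyncio.gather(*(scrape_state(uf, sem) for uf in states))

done = asyncio.run(run_all(["SP", "RJ", "MG"], concurrency=3))
print(done)  # ['SP', 'RJ', 'MG']
```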

Per-State Statistics

python C:\Users\renat\skills\junta-leiloeiros\scripts\db.py

Direct SQL

sqlite3 C:\Users\renat\skills\junta-leiloeiros\data\leiloeiros.db "SELECT estado, COUNT(*) FROM leiloeiros GROUP BY estado"


Serve the REST API

```bash
python C:\Users\renat\skills\junta-leiloeiros\scripts\serve_api.py
```

Interactive docs: http://localhost:8000/docs

Endpoints:

  • GET /leiloeiros?estado=SP&situacao=ATIVO&nome=silva&limit=100
  • GET /leiloeiros/{estado} — ex: /leiloeiros/SP
  • GET /busca?q=texto
  • GET /stats
  • GET /export/json
  • GET /export/csv
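For scripting against these endpoints, query URLs can be composed with the standard library. The helper below is a hypothetical convenience, not part of the skill; `BASE` assumes the default local port shown above:

```python
from urllib.parse import urlencode

BASE = "http://localhost:8000"  # assumed default serve_api.py address

def leiloeiros_url(**params) -> str:
    """Build a /leiloeiros query URL, skipping unset parameters."""
    query = urlencode({k: v for k, v in params.items() if v is not None})
    return f"{BASE}/leiloeiros?{query}" if query else f"{BASE}/leiloeiros"

url = leiloeiros_url(estado="SP", situacao="ATIVO", limit=100)
print(url)  # http://localhost:8000/leiloeiros?estado=SP&situacao=ATIVO&limit=100
```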

Export Data

python C:\Users\renat\skills\junta-leiloeiros\scripts\export.py --format csv
python C:\Users\renat\skills\junta-leiloeiros\scripts\export.py --format json
python C:\Users\renat\skills\junta-leiloeiros\scripts\export.py --format all
python C:\Users\renat\skills\junta-leiloeiros\scripts\export.py --format csv --estado SP
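A minimal sketch of what a CSV export can look like, using only the standard library; the field names are illustrative assumptions, not necessarily export.py's actual columns:

```python
import csv
import io

# Hypothetical rows, shaped like the auctioneer records described above.
rows = [
    {"nome": "Fulano de Tal", "matricula": "123",
     "estado": "SP", "situacao": "ATIVO"},
]

buf = io.StringIO()
writer = csv.DictWriter(
    buf, fieldnames=["nome", "matricula", "estado", "situacao"]
)
writer.writeheader()   # first line: column names
writer.writerows(rows) # one line per auctioneer
print(buf.getvalue())
```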

Use from Python Code

```python
import sys
sys.path.insert(0, r"C:\Users\renat\skills\junta-leiloeiros\scripts")
from db import Database

db = Database()
db.init()

# All active auctioneers in SP
leiloeiros = db.get_all(estado="SP", situacao="ATIVO")

# Search by name
resultados = db.search("silva")

# Statistics
stats = db.get_stats()
```

Add a Custom Scraper

If a state needs specific logic (e.g., its site relies on JavaScript):

```python
# scripts/scraper/meu_estado.py
from .base_scraper import AbstractJuntaScraper, Leiloeiro
from typing import List

class MeuEstadoScraper(AbstractJuntaScraper):
    estado = "XX"
    junta = "JUCEX"
    url = "https://www.jucex.xx.gov.br/leiloeiros"

    async def parse_leiloeiros(self) -> List[Leiloeiro]:
        soup = await self.fetch_page()
        if not soup:
            return []
        # state-specific parsing logic here
        return [self.make_leiloeiro(nome="...", matricula="...")]
```

Register it in scripts/scraper/states.py:

```python
from .meu_estado import MeuEstadoScraper
SCRAPERS["XX"] = MeuEstadoScraper
```
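The register-in-a-dict pattern can be sketched in isolation. Everything below is simplified for illustration (the real `parse_leiloeiros` is async, and the real `SCRAPERS` lives in states.py); class and helper names are assumptions:

```python
from abc import ABC, abstractmethod

class AbstractJuntaScraper(ABC):
    """Simplified stand-in for the skill's base scraper class."""
    estado: str

    @abstractmethod
    def parse_leiloeiros(self) -> list:
        ...

# Central registry mapping two-letter state codes to scraper classes.
SCRAPERS: dict = {}

def register(cls):
    """Hypothetical decorator that adds a scraper to the registry."""
    SCRAPERS[cls.estado] = cls
    return cls

@register
class MeuEstadoScraper(AbstractJuntaScraper):
    estado = "XX"

    def parse_leiloeiros(self) -> list:
        return []  # state-specific parsing would go here

print(sorted(SCRAPERS))  # ['XX']
```

run_all.py can then iterate over `SCRAPERS.items()` to dispatch every registered state without knowing any concrete class.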

References

  • URLs for all juntas: references/juntas_urls.md
  • Database schema: references/schema.md
  • Legal basis for collection: references/legal.md
  • Scraping log: data/scraping_log.json

Best Practices

  • Provide clear, specific context about your project and requirements
  • Review all suggestions before applying them to production code
  • Combine with other complementary skills for comprehensive analysis

Common Pitfalls

  • Using this skill for tasks outside its domain expertise
  • Applying recommendations without understanding your specific context
  • Not providing enough project context for accurate analysis

Related Skills

  • leiloeiro-avaliacao - Complementary skill for enhanced analysis
  • leiloeiro-edital - Complementary skill for enhanced analysis
  • leiloeiro-ia - Complementary skill for enhanced analysis
  • leiloeiro-juridico - Complementary skill for enhanced analysis
  • leiloeiro-mercado - Complementary skill for enhanced analysis

Source Transparency

This detail page is rendered from real SKILL.md content. Trust labels are metadata-based hints, not a safety guarantee.
