dataforseo

Complete DataForSEO API integration for SEO data and analysis. Use when the user asks for keyword research, search volume, SERP analysis, backlink audits, competitor analysis, rank tracking, domain authority, technical SEO audits, content monitoring, Google Trends, or any SEO-related data queries. Covers all DataForSEO APIs including SERP, Keywords Data, DataForSEO Labs, Backlinks, OnPage, Domain Analytics, Content Analysis, Business Data, Merchant, App Data, and AI Optimization APIs. Outputs CSV files.

Safety Notice

This listing is imported from skills.sh public index metadata. Review upstream SKILL.md and repository scripts before running.

Copy the command below and send it to your AI assistant to install the skill:

Install skill "dataforseo" with this command: npx skills add nikhilbhansali/dataforseo-skill-claude/nikhilbhansali-dataforseo-skill-claude-dataforseo

DataForSEO API Skill

Universal interface to all DataForSEO APIs for comprehensive SEO data retrieval and analysis.

Credential Setup

Before first use, set up credentials:

import sys, os
sys.path.insert(0, os.path.expanduser('~/.agents/skills/dataforseo/scripts'))
from dataforseo_client import save_credentials, verify_credentials

# Get credentials from https://app.dataforseo.com/
login = "your_email@example.com"  # API login (email)
password = "your_api_password"    # API password (from dashboard)

# Verify and save
if verify_credentials(login, password):
    save_credentials(login, password)
    print("Credentials saved!")

Credentials are stored at ~/.dataforseo_config.json. To update them, run the setup again.

Quick Start

import sys, os
sys.path.insert(0, os.path.expanduser('~/.agents/skills/dataforseo/scripts'))
from dataforseo_client import *

# Example: Get search volume
response = keywords_search_volume(
    keywords=["seo tools", "keyword research"],
    location_name="United States"
)
results = extract_results(response)
csv_path = to_csv(results, "keyword_volumes")
print(f"Results saved to: {csv_path}")

API Selection Guide

User request → function to use:

  • Search volume, CPC, competition → keywords_search_volume()
  • Keyword ideas/suggestions → labs_keyword_ideas() or labs_related_keywords()
  • Keywords a site ranks for → labs_ranked_keywords()
  • SERP results for a keyword → serp_google_organic()
  • Local/Maps rankings → serp_google_maps()
  • YouTube rankings → serp_youtube()
  • Backlink profile → backlinks_summary()
  • List of backlinks → backlinks_list()
  • Referring domains → backlinks_referring_domains()
  • Domain authority/rank → backlinks_bulk_ranks()
  • Competing domains → labs_competitors_domain()
  • Keyword gap analysis → labs_domain_intersection()
  • Link gap analysis → backlinks_domain_intersection()
  • Technical page audit → onpage_instant_pages()
  • Lighthouse scores → lighthouse_live()
  • Technology stack → domain_technologies()
  • Brand mentions → content_search()
  • Google Trends → google_trends()

Core Workflow

  1. Import client: Add skill path and import functions
  2. Call API function: Pass required parameters
  3. Extract results: Use extract_results(response)
  4. Export to CSV: Use to_csv(results, "filename")

import sys, os
sys.path.insert(0, os.path.expanduser('~/.agents/skills/dataforseo/scripts'))
from dataforseo_client import labs_ranked_keywords, extract_results, to_csv

response = labs_ranked_keywords(
    target="competitor.com",
    location_name="United States",
    language_name="English",
    limit=500
)
results = extract_results(response)
csv_path = to_csv(results, "ranked_keywords")

Default Parameters

Most functions use these defaults:

  • location_name: "United States" (override with "India", "United Kingdom", etc.)
  • language_name: "English"
  • limit: 100 (increase up to 1000 for more results)
  • device: "desktop" (or "mobile" for SERP)
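How per-call overrides layer on top of these defaults can be sketched as follows. The DEFAULTS dict and build_params helper here are illustrative only; the real dataforseo_client may handle defaults differently internally.

```python
# Illustrative defaults, mirroring the list above; not the
# client's actual internals.
DEFAULTS = {
    "location_name": "United States",
    "language_name": "English",
    "limit": 100,
    "device": "desktop",
}

def build_params(**overrides):
    """Merge per-call overrides onto the documented defaults."""
    params = dict(DEFAULTS)
    params.update(overrides)
    return params

params = build_params(location_name="India", device="mobile")
print(params["location_name"], params["limit"])  # India 100
```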

Common Location Names

  • United States, United Kingdom, India, Germany, Australia, Canada
  • For city-level: "New York,New York,United States", "London,England,United Kingdom"
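City-level names follow a comma-separated City,Region,Country pattern with no space after the commas. A small hypothetical helper (not part of dataforseo_client) can build them consistently:

```python
def city_location(city, region, country):
    """Join location parts in the City,Region,Country format
    shown above (no space after the commas)."""
    return ",".join([city, region, country])

print(city_location("New York", "New York", "United States"))
# New York,New York,United States
```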

Output

All results export to CSV at ~/dataforseo_outputs/. Files auto-named with timestamp if not specified.
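The auto-naming behavior can be approximated like this; this is a sketch of the documented behavior, and the actual naming scheme inside to_csv may differ:

```python
import os
from datetime import datetime

def output_path(name=None, out_dir="~/dataforseo_outputs"):
    """Build a CSV path under the output directory, falling back to
    a timestamped name when none is given (mimics to_csv's behavior
    as documented above)."""
    if name is None:
        name = "results_" + datetime.now().strftime("%Y%m%d_%H%M%S")
    return os.path.join(os.path.expanduser(out_dir), name + ".csv")

print(output_path("keyword_volumes"))
```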

Reference Files

  • API Reference: references/api_reference.md - Complete endpoint documentation
  • Use Cases: references/use_cases.md - Ready-to-use code recipes

Error Handling

response = some_api_function(...)
if response.get("status_code") == 20000:
    results = extract_results(response)
    # Process results
else:
    print(f"Error: {response.get('status_message')}")
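The same check can be wrapped in a helper so every call fails loudly instead of silently returning an error payload. This is a sketch against the response shape shown above; 20000 is DataForSEO's success code, and the helper name is an illustration, not part of dataforseo_client:

```python
def check_response(response):
    """Return the response if status_code is 20000 (success),
    otherwise raise with the API's status_message."""
    if response.get("status_code") == 20000:
        return response
    raise RuntimeError(f"DataForSEO error: {response.get('status_message')}")

# Mock payloads standing in for real API responses:
ok = check_response({"status_code": 20000, "tasks": []})
try:
    check_response({"status_code": 40101, "status_message": "Auth failed."})
except RuntimeError as e:
    print(e)  # DataForSEO error: Auth failed.
```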

Rate Limits & Costs

  • 2000 requests/minute max
  • Live methods cost more than Standard
  • Check usage with get_user_data()
  • Response includes cost field
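Because each response carries a cost field, session spend can be tallied from the responses themselves. A sketch using mock response dicts in place of real API responses:

```python
def total_cost(responses):
    """Sum the top-level cost field across response payloads."""
    return sum(r.get("cost", 0.0) for r in responses)

# Mock responses standing in for real API results:
session = [
    {"status_code": 20000, "cost": 0.0105},
    {"status_code": 20000, "cost": 0.002},
]
print(f"Session spend: ${total_cost(session):.4f}")  # Session spend: $0.0125
```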

Important Notes

  1. Async endpoints: Some APIs (merchant, app_data, business reviews) create tasks. Check task status separately.
  2. Limits: Increase limit parameter for comprehensive data (default 100, max usually 1000)
  3. Multiple keywords: Pass as list: keywords=["kw1", "kw2", "kw3"]
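When passing many keywords, splitting them into fixed-size batches keeps each request under the per-call cap. The batch size of 1000 below is an assumption; check the endpoint documentation for the actual limit:

```python
def chunk_keywords(keywords, size=1000):
    """Split a keyword list into batches of at most `size` items,
    so each batch fits in a single API request."""
    return [keywords[i:i + size] for i in range(0, len(keywords), size)]

batches = chunk_keywords([f"kw{i}" for i in range(2500)], size=1000)
print([len(b) for b in batches])  # [1000, 1000, 500]
```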

Source Transparency

This detail page is rendered from real SKILL.md content. Trust labels are metadata-based hints, not a safety guarantee.

Related Skills

Related by shared tags or category signals.

General

  • enterprise-proposal — no summary provided by upstream source. (Repository source; needs review.)
  • basecamp-assistant — no summary provided by upstream source. (Repository source; needs review.)

Security

  • Open Code Review — Scan AI-generated code for hallucinated packages, stale APIs, security anti-patterns, and over-engineering. Use when: (1) reviewing PRs with AI-generated cod... (Registry source; recently updated.)
  • Nginx Config — Nginx config generator with reverse proxy, SSL, caching, security hardening, performance optimization, and server operations. (Registry source; recently updated.)