tavily-api-expert

Build production-ready Tavily integrations with best practices baked in. Reference documentation for developers using coding assistants (Claude Code, Cursor, etc.) to implement web search, content extraction, crawling, and research in agentic workflows, RAG systems, or autonomous agents.

2 stars · 0 forks · Updated 1/22/2026
SKILL.md
name: tavily-api-expert
description: Build production-ready Tavily integrations with best practices baked in. Reference documentation for developers using coding assistants (Claude Code, Cursor, etc.) to implement web search, content extraction, crawling, and research in agentic workflows, RAG systems, or autonomous agents.

Tavily is a search API designed specifically for LLMs, enabling developers to build AI applications that access real-time, accurate web data. This skill uses the Tavily Python SDK.

Prerequisites

Tavily API Key Required - Get your key at https://tavily.com

Add to ~/.claude/settings.json:

{
  "env": {
    "TAVILY_API_KEY": "tvly-your-api-key-here"
  }
}

Restart Claude Code after adding your API key.
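
Before constructing a client, it can help to fail fast if the key is missing. This is an optional sketch (the helper name and the check for the tvly- prefix shown on keys above are this guide's assumptions, not part of the SDK):

```python
import os

def require_tavily_key() -> str:
    """Return the Tavily API key from the environment, failing fast if absent.

    Assumes keys start with "tvly-", as in the settings.json example above.
    """
    key = os.environ.get("TAVILY_API_KEY", "")
    if not key.startswith("tvly-"):
        raise RuntimeError(
            "TAVILY_API_KEY is not set; add it to ~/.claude/settings.json and restart"
        )
    return key
```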

Tavily Python SDK

Installation

pip install tavily-python

Client Initialization

from tavily import TavilyClient

client = TavilyClient(api_key="tvly-YOUR_API_KEY")

# Or use environment variable TAVILY_API_KEY
client = TavilyClient()
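
A minimal search sketch, plus a pure helper for turning results into LLM context. The helper assumes each entry in the response's results list carries title, url, and content fields; the live call is commented out because it needs a valid TAVILY_API_KEY:

```python
def results_to_context(response: dict, max_chars: int = 4000) -> str:
    """Flatten a Tavily search response into one context string for an LLM prompt.

    Assumes each result dict has "title", "url", and "content" keys.
    """
    chunks = [
        f"{result['title']} ({result['url']}):\n{result['content']}"
        for result in response.get("results", [])
    ]
    return "\n\n".join(chunks)[:max_chars]

# Live usage (requires TAVILY_API_KEY):
# response = client.search("latest LLM evaluation benchmarks", max_results=5)
# print(results_to_context(response))
```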

Async client:

The async client enables parallel query execution, ideal for agentic workflows that need to gather information quickly before passing it to a model for analysis.

from tavily import AsyncTavilyClient

async_client = AsyncTavilyClient(api_key="tvly-YOUR_API_KEY")
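
The parallel pattern described above can be sketched as follows. The client is passed in rather than imported, so this works with the async client shown above; the query strings in the commented usage line are placeholders:

```python
import asyncio

async def gather_searches(client, queries: list[str]) -> list[dict]:
    """Fan out one Tavily search per query and await all responses concurrently."""
    tasks = [client.search(query, max_results=3) for query in queries]
    return await asyncio.gather(*tasks)  # results come back in input order

# Live usage with the async client above (requires TAVILY_API_KEY):
# responses = asyncio.run(gather_searches(async_client, ["query one", "query two"]))
```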

Available Endpoints

| Endpoint | Purpose | Use case |
| --- | --- | --- |
| search() | Web search | Real-time data retrieval from the web |
| extract() | Scrape content from URLs | Page content extraction |
| crawl() and map() | Traverse website structures and scrape pages along the way | Documentation, site-wide extraction |
| research() | Out-of-the-box research agent | Ready-to-use iterative research |
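
As one example beyond search, here is a hedged extract() sketch with a pure helper for indexing the scraped pages. The "url" and "raw_content" field names in the response are this guide's assumptions; check the official SDK reference, and note the live call needs TAVILY_API_KEY:

```python
def extracted_text_by_url(response: dict) -> dict[str, str]:
    """Index an extract() response by URL for easy lookup.

    Assumes each result dict has "url" and "raw_content" keys.
    """
    return {result["url"]: result["raw_content"] for result in response.get("results", [])}

# Live usage (requires TAVILY_API_KEY):
# response = client.extract(urls=["https://docs.tavily.com"])
# pages = extracted_text_by_url(response)
```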

Choosing the Right Method

If you are building a custom agent or agentic workflow:

| Need | Method |
| --- | --- |
| Web search results | search() |
| Content from specific URLs | extract() |
| Content from an entire site | crawl() |
| URL discovery from a site | map() |

These methods give you full control but require additional work: data processing, LLM integration, and workflow orchestration.
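
The "data processing" part of that additional work can be sketched with a small helper that merges results from several search responses, dropping duplicate URLs before they reach your LLM. The helper name and the assumption that each result carries a "url" key are this guide's, not the SDK's:

```python
def dedupe_results(responses: list[dict]) -> list[dict]:
    """Merge results from several search responses, keeping the first hit per URL.

    Assumes each result dict has a "url" key.
    """
    seen: set[str] = set()
    merged: list[dict] = []
    for response in responses:
        for result in response.get("results", []):
            if result["url"] not in seen:
                seen.add(result["url"])
                merged.append(result)
    return merged
```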

If you want an out-of-the-box solution:

| Need | Method |
| --- | --- |
| End-to-end research with AI synthesis and built-in context engineering | research() |

The research endpoint provides faster time-to-value with AI-synthesized insights, but offers less flexibility than building custom workflows.

Detailed Guides

For detailed usage instructions, parameters, patterns, and best practices, see the detailed guides.
