Overview

This guide walks through practical workflows for using AnySite MCP tools with Claude Code, showing how the CLI-based integration supports development, automation, and team collaboration.

Basic Usage

Starting a Session with MCP

Once configured, simply start Claude Code and the MCP server connects automatically:
claude
Your AnySite tools are immediately available. Try:
What MCP tools do I have access to?
Claude will list all available AnySite tools from the connected MCP server.

Quick Data Extraction

Example: LinkedIn Profile Analysis
Extract information from this LinkedIn profile:
https://linkedin.com/in/satyanadella

Focus on:
- Current role and company
- Career progression
- Education background
Claude will use the linkedin_user MCP tool to fetch and analyze the data.

Scope Management Examples

Example 1: Personal User-Scoped Setup

For tools you use across all your projects:
# Add to user scope (available everywhere)
claude mcp add --scope user anysite "https://api.anysite.io/mcp/direct?api_key=YOUR_KEY"
Use case: You’re a researcher who frequently extracts LinkedIn data across multiple projects.

Benefits:
  • Configure once, use everywhere
  • No per-project setup needed
  • Personal API key stays private

Example 2: Team Project-Scoped Setup

For shared team projects with version-controlled config:
# Navigate to project directory
cd my-team-project

# Add with project scope
claude mcp add --scope project anysite "https://mcp.anysite.io/mcp" \
  --env API_KEY=$ANYSITE_API_KEY

# Commit .mcp.json to git
git add .mcp.json
git commit -m "Add AnySite MCP configuration"
Team members then:
# Pull the project
git pull

# Set their own API key
export ANYSITE_API_KEY="their_key_here"

# Claude Code automatically uses project config
claude
Benefits:
  • Shared configuration
  • Individual API keys (not committed)
  • Consistent team setup
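Committing the configuration is safe because the file stores the server definition, not the key itself. A sketch of what the committed .mcp.json might contain (the field names here are an assumption and can vary by Claude Code version — inspect the file your own `claude mcp add` writes):

```shell
# Illustrative .mcp.json for the project above — written with a heredoc
# so the ${ANYSITE_API_KEY} placeholder stays literal in the file
cat > .mcp.json <<'JSON'
{
  "mcpServers": {
    "anysite": {
      "type": "sse",
      "url": "https://mcp.anysite.io/mcp",
      "env": {
        "API_KEY": "${ANYSITE_API_KEY}"
      }
    }
  }
}
JSON
```

Each teammate then supplies ANYSITE_API_KEY from their own environment, as shown above.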

Example 3: Temporary Local Setup

For quick tests or temporary work:
# Add with local scope (default)
claude mcp add anysite "https://api.anysite.io/mcp/direct?api_key=TEMP_KEY"

# Use for this session
claude

# Remove when done
claude mcp remove anysite
Use case: Testing with a trial API key or working on a temporary proof-of-concept.

Development Workflows

Workflow 1: Competitive Intelligence Research

Setup:
# Create research project
mkdir competitor-research
cd competitor-research

# Add AnySite with project scope
claude mcp add --scope project anysite "YOUR_URL"
Usage in Claude Code:
I'm researching competitors in the CRM space. For each company, get:

1. LinkedIn company pages:
   - https://linkedin.com/company/salesforce
   - https://linkedin.com/company/hubspot
   - https://linkedin.com/company/zoho

2. Extract:
   - Company size
   - Growth trends
   - Recent updates
   - Key executives

3. Create a comparison table
Automated with script:
#!/bin/bash
# research.sh

companies=(
  "salesforce"
  "hubspot"
  "zoho"
)

for company in "${companies[@]}"; do
  echo "Researching $company..." >> research.log
  claude <<EOF
Extract company information from https://linkedin.com/company/$company
Save key metrics to ${company}_data.json
EOF
done

Workflow 2: Lead Generation Pipeline

Project structure:
lead-gen/
├── .mcp.json          # Project config
├── leads.csv          # Input: URLs to research
├── scripts/
│   └── process_leads.sh
└── output/
    └── enriched_leads.json
Setup:
cd lead-gen
claude mcp add --scope project anysite "YOUR_URL"
Processing script:
#!/bin/bash
# scripts/process_leads.sh

while IFS=, read -r name linkedin_url; do
  echo "Processing: $name"

  claude <<EOF
Extract profile data from: $linkedin_url
Output JSON format:
{
  "name": "$name",
  "current_role": "",
  "company": "",
  "location": "",
  "experience_years": 0,
  "skills": []
}
EOF
done < leads.csv
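The script expects one `name,linkedin_url` pair per line with no header row. An illustrative leads.csv (the profiles are made up for the example):

```shell
# Create a sample input file in the format process_leads.sh reads
cat > leads.csv <<'CSV'
Jane Doe,https://linkedin.com/in/janedoe
John Smith,https://linkedin.com/in/johnsmith
CSV

# Then run the pipeline:
# bash scripts/process_leads.sh
```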

Workflow 3: Content Monitoring Dashboard

Setup multiple servers for different sources:
# AnySite for social data
claude mcp add --scope user anysite "ANYSITE_URL"

# Monitor Reddit
claude <<EOF
Track mentions of "our-product" in these subreddits:
- r/technology
- r/startups
- r/SaaS

Check every 6 hours and summarize sentiment
EOF
Automated monitoring:
#!/bin/bash
# monitor.sh - Run via cron

claude <<EOF
Monitor these Reddit posts for new comments:
$(cat monitored_posts.txt)

If sentiment changes significantly, send summary to monitor@company.com
EOF

CI/CD Integration

Example 1: Automated Profile Verification

GitHub Actions workflow:
# .github/workflows/verify-profiles.yml
name: Verify Team Profiles

on:
  schedule:
    - cron: '0 9 * * 1'  # Every Monday at 9 AM
  workflow_dispatch:

jobs:
  verify:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3

      - name: Setup Claude Code
        run: |
          # Install Claude Code
          curl -sSL https://claude.ai/install.sh | bash

      - name: Configure MCP
        env:
          ANYSITE_API_KEY: ${{ secrets.ANYSITE_API_KEY }}
        run: |
          claude mcp add --scope local anysite \
            "https://api.anysite.io/mcp/direct?api_key=$ANYSITE_API_KEY"

      - name: Verify Profiles
        run: |
          claude <<EOF
          Verify all LinkedIn profiles in team_profiles.json are still active.
          Check for:
          - Profile still exists
          - Job title changes
          - Company changes

          Output differences to profile_changes.md
          EOF

      - name: Create Issue if Changes
        env:
          GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        run: |
          if [ -s profile_changes.md ]; then
            gh issue create \
              --title "Team Profile Changes Detected" \
              --body-file profile_changes.md \
              --label "hr,automated"
          fi

Example 2: Data Pipeline Integration

Airflow DAG:
# dags/linkedin_enrichment_dag.py

from airflow import DAG
from airflow.operators.bash import BashOperator
from datetime import datetime, timedelta

default_args = {
    'owner': 'data-team',
    'depends_on_past': False,
    'start_date': datetime(2024, 1, 1),
    'retries': 2,
    'retry_delay': timedelta(minutes=5),
}

dag = DAG(
    'linkedin_enrichment',
    default_args=default_args,
    schedule_interval='@daily',
    catchup=False
)

# Configure MCP
setup_mcp = BashOperator(
    task_id='setup_mcp',
    bash_command='''
        claude mcp add --scope local anysite "$ANYSITE_MCP_URL"
    ''',
    dag=dag
)

# Enrich leads
enrich_data = BashOperator(
    task_id='enrich_leads',
    bash_command='''
        claude <<EOF
        Read leads from /data/new_leads.csv
        For each LinkedIn URL, extract:
        - Profile data
        - Current company info
        - Recent activity

        Output to /data/enriched_leads.json
        EOF
    ''',
    dag=dag
)

# Load to database
load_db = BashOperator(
    task_id='load_to_database',
    bash_command='python scripts/load_enriched_data.py',
    dag=dag
)

setup_mcp >> enrich_data >> load_db

Multi-Workspace Management

Managing Multiple Projects

Scenario: You work on different projects with different MCP needs.
# Project A: Marketing research
cd ~/projects/marketing-research
claude mcp add --scope project anysite-marketing "URL1"

# Project B: Sales intelligence
cd ~/projects/sales-intel
claude mcp add --scope project anysite-sales "URL2"

# Personal utilities (available everywhere)
claude mcp add --scope user anysite-personal "URL3"
Check current configuration:
# In any project
claude mcp list

# Output shows:
# anysite-personal (user scope) ✔ connected
# anysite-marketing (project scope, if in marketing-research/) ✔ connected

Switching Contexts

# Work on marketing project
cd ~/projects/marketing-research
claude  # Uses marketing-research/.mcp.json + user scope

# Switch to sales project
cd ~/projects/sales-intel
claude  # Uses sales-intel/.mcp.json + user scope

# Personal work
cd ~/documents
claude  # Uses only user scope

Advanced Techniques

Batch Processing with Scripts

Process multiple LinkedIn profiles:
#!/bin/bash
# batch_linkedin_extract.sh

input_file="linkedin_urls.txt"
output_dir="./extracted_data"
mkdir -p "$output_dir"

while IFS= read -r url; do
  # Extract profile ID from URL
  profile_id=$(echo "$url" | sed 's/.*linkedin.com\/in\///' | sed 's/\/.*//')

  echo "Processing: $profile_id"

  # Use Claude Code with MCP
  claude <<EOF > "${output_dir}/${profile_id}.json"
Extract LinkedIn profile data from: $url

Output as JSON with these fields:
{
  "profile_id": "$profile_id",
  "extracted_at": "$(date -Iseconds)",
  "data": {
    "name": "",
    "headline": "",
    "location": "",
    "current_position": {},
    "experience": [],
    "education": [],
    "skills": []
  }
}
EOF

  # Rate limiting
  sleep 2

done < "$input_file"

echo "Batch processing complete. Extracted $(ls -1 "$output_dir" | wc -l) profiles."

Conditional Logic Based on MCP Data

#!/bin/bash
# conditional_analysis.sh

company_url="https://linkedin.com/company/target-company"

# Extract data and analyze
result=$(claude <<EOF
Analyze this company: $company_url

Determine if they are:
1. Growing rapidly (>20% employee growth)
2. In tech industry
3. Located in US

Output only: YES or NO
EOF
)

# Normalize whitespace in the model's reply before comparing
result=$(printf '%s' "$result" | tr -d '[:space:]')

if [ "$result" = "YES" ]; then
  echo "Company matches criteria. Generating detailed report..."

  claude <<EOF
Create comprehensive company report for: $company_url
Include competitive analysis and market positioning.
Save to: reports/$(date +%Y%m%d)_target_company.md
EOF

  # Notify team
  echo "Report ready" | mail -s "New Target Company Report" team@company.com
else
  echo "Company does not match criteria. Skipping detailed analysis."
fi

Integration with Data Analysis Tools

Python script using Claude Code:
#!/usr/bin/env python3
# analyze_linkedin_data.py

import subprocess
import json
import pandas as pd

def extract_profile(linkedin_url):
    """Extract LinkedIn profile using Claude Code MCP"""

    prompt = f"""
    Extract profile data from: {linkedin_url}
    Output valid JSON only, no explanation.
    Include fields: name, years_experience (number), skills (list of strings).
    """

    result = subprocess.run(
        ['claude'],
        input=prompt,
        capture_output=True,
        text=True,
        check=True  # fail fast if the CLI exits non-zero
    )

    return json.loads(result.stdout)

def main():
    # Load list of profiles to analyze
    with open('profiles_list.txt', 'r') as f:
        urls = [line.strip() for line in f]

    # Extract data
    profiles = []
    for url in urls:
        print(f"Extracting: {url}")
        data = extract_profile(url)
        profiles.append(data)

    # Create DataFrame
    df = pd.DataFrame(profiles)

    # Analysis
    print("\n=== Profile Analysis ===")
    print(f"Total profiles: {len(df)}")
    print(f"Average experience: {df['years_experience'].mean():.1f} years")
    print(f"\nTop 5 skills:")
    print(df['skills'].explode().value_counts().head())

    # Export results
    df.to_csv('linkedin_analysis.csv', index=False)
    print("\nResults saved to linkedin_analysis.csv")

if __name__ == '__main__':
    main()
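The json.loads call above assumes the model returns bare JSON; in practice the output can include surrounding prose despite the prompt. A defensive variant (parse_json_loosely is a hypothetical helper written for this guide, not a library function) that pulls the first balanced JSON object out of mixed text:

```python
import json

def parse_json_loosely(text: str) -> dict:
    """Return the first balanced {...} block in text that parses as JSON."""
    start = text.find("{")
    while start != -1:
        depth = 0
        for i in range(start, len(text)):
            if text[i] == "{":
                depth += 1
            elif text[i] == "}":
                depth -= 1
                if depth == 0:
                    try:
                        return json.loads(text[start:i + 1])
                    except json.JSONDecodeError:
                        break  # not valid JSON; try the next candidate
        start = text.find("{", start + 1)
    raise ValueError("no JSON object found in model output")
```

Swapping this in for the bare `json.loads(result.stdout)` makes the pipeline tolerant of explanatory text around the JSON (though it is a sketch — braces inside string values can still confuse the scan).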

Debugging and Monitoring

Debug Mode Usage

Enable debug output:
# Run with debug flag
claude --mcp-debug

# Or set environment variable
export CLAUDE_MCP_DEBUG=1
claude
Debug output shows:
[MCP] Connecting to: anysite (sse)
[MCP] Transport: sse
[MCP] URL: https://api.anysite.io/mcp/direct?api_key=***
[MCP] Connection established
[MCP] Tools discovered: 15
[MCP] Tools: linkedin_user, linkedin_company, instagram_profile, ...

Monitoring MCP Usage

Check server status script:
#!/bin/bash
# check_mcp_status.sh

echo "=== MCP Server Status ==="
echo

# List all servers
echo "Configured servers:"
claude mcp list

echo
echo "Testing connection to anysite:"
claude mcp get anysite

# Log to file
{
  echo "Status check at: $(date)"
  claude mcp list
  echo "---"
} >> mcp_status.log
Run periodically:
# Add to crontab
0 */4 * * * /path/to/check_mcp_status.sh
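If a status check can outlast the four-hour interval, overlapping runs can be avoided with flock (assuming util-linux flock is installed on the host):

```shell
# crontab entry: -n skips this run if the previous one still holds the lock
0 */4 * * * flock -n /tmp/mcp_status.lock /path/to/check_mcp_status.sh
```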

Error Handling in Scripts

#!/bin/bash
# robust_extraction.sh

extract_with_retry() {
  local url=$1
  local max_attempts=3
  local attempt=1

  while [ $attempt -le $max_attempts ]; do
    echo "Attempt $attempt of $max_attempts"

    output=$(claude <<EOF 2>&1
Extract data from: $url
Output as JSON
EOF
    )

    # Check if extraction succeeded
    if echo "$output" | jq . > /dev/null 2>&1; then
      echo "$output"
      return 0
    fi

    echo "Extraction failed, retrying..."
    attempt=$((attempt + 1))
    sleep 5
  done

  echo "ERROR: Failed after $max_attempts attempts"
  return 1
}

# Usage
if result=$(extract_with_retry "https://linkedin.com/in/profile"); then
  echo "$result" > output.json
  echo "Success"
else
  echo "Failed to extract data" >&2
  exit 1
fi

Best Practices

1. Scope Selection Strategy

User Scope

Use for:
  • Personal API keys
  • Tools you use everywhere
  • Cross-project utilities
Example:
claude mcp add --scope user \
  anysite "URL"

Project Scope

Use for:
  • Team collaborations
  • Version-controlled configs
  • Project-specific setups
Example:
claude mcp add --scope project \
  anysite "URL" \
  --env API_KEY=$KEY

Local Scope

Use for:
  • Temporary setups
  • Testing
  • Sensitive credentials
Example:
claude mcp add \
  anysite-test "URL"

2. Security Checklist

  • ✅ Use environment variables for API keys
  • ✅ Add config files to .gitignore
  • ✅ Rotate keys regularly
  • ✅ Use --scope local for sensitive keys
  • ✅ Audit configurations with claude mcp list
  • ✅ Remove unused servers
  • ❌ Never commit API keys to version control
  • ❌ Don’t share Direct URLs publicly

3. Performance Optimization

Rate limiting:
# Add delays between requests
for url in "${urls[@]}"; do
  claude <<< "Extract: $url"
  sleep 2  # Respect API rate limits
done
Batch similar requests:
# Instead of multiple calls
claude <<EOF
Extract all these profiles in one request:
1. linkedin.com/in/profile1
2. linkedin.com/in/profile2
3. linkedin.com/in/profile3
EOF
Cache results:
# Save extracted data
output_file="cache/profile_${profile_id}.json"

if [ ! -f "$output_file" ]; then
  # Extract only if not cached
  claude <<< "Extract: $url" > "$output_file"
fi
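Cached profiles go stale, so it helps to expire old entries before the extraction loop runs. A sketch relying on GNU find's -mtime flag (adjust the seven-day threshold to your needs):

```shell
# Drop cached profiles older than 7 days so the next run re-extracts them
mkdir -p cache
find cache -name 'profile_*.json' -mtime +7 -delete
```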

Common Patterns

Pattern 1: Daily Automated Report

#!/bin/bash
# daily_report.sh

date=$(date +%Y-%m-%d)
report_file="reports/daily_${date}.md"

claude <<EOF > "$report_file"
Generate daily intelligence report:

1. Check these LinkedIn company pages for updates:
   - https://linkedin.com/company/competitor1
   - https://linkedin.com/company/competitor2

2. Monitor Reddit posts in r/industry for trending topics

3. Analyze sentiment and key themes

4. Format as executive summary in markdown
EOF

# Email the report (-a attaches the file with s-nail/bsd-mailx; GNU mailutils uses -A)
mail -s "Daily Intelligence Report - $date" \
  -a "$report_file" \
  executives@company.com < /dev/null

Pattern 2: Interactive Research Session

# Start research session
claude

# Then interactively:
  1. Extract target company profile
  2. Get list of key employees
  3. Analyze their backgrounds
  4. Identify common patterns
  5. Generate hiring strategy recommendations

Pattern 3: Data Validation Pipeline

#!/bin/bash
# validate_data.sh

input_csv="leads.csv"
output_csv="validated_leads.csv"

# Validate each LinkedIn URL
while IFS=, read -r id name linkedin_url; do
  # Check if profile exists and is accessible
  status=$(claude <<EOF
Check if this LinkedIn profile is valid and accessible:
$linkedin_url

Output only: VALID or INVALID
EOF
  )

  echo "$id,$name,$linkedin_url,$status" >> "$output_csv"
done < "$input_csv"

Need Help?

Contact our support team for assistance with Claude Code MCP workflows.