Getting Started
🎰 For Sportsbooks

Welcome to AltSportsLeagues.ai! This comprehensive guide will help betting platforms and sportsbooks discover new sports markets, access compliant league data, and expand their betting offerings through our platform.

🎯 Why AltSportsLeagues.ai?

Market Expansion Opportunities

  • Access New Sports - Discover leagues in underserved sports categories
  • Global Reach - Connect with international leagues and markets
  • Compliance Assurance - Work only with verified, compliant leagues
  • Data Quality - Access structured, real-time sports data

Competitive Advantages

  • First-Mover Advantage - Be the first to offer betting on emerging sports
  • Revenue Diversification - Expand beyond traditional sports
  • Risk Management - Partner with compliant, well-governed leagues
  • Data-Driven Decisions - Make informed market expansion choices

📋 Prerequisites

Before integrating with our platform:

  • Licensed Sportsbook - Valid gambling license in target jurisdictions
  • Technical Team - Developers familiar with REST APIs and data integration
  • Compliance Team - Staff to handle regulatory requirements
  • API Credentials - Obtain API access (see below)

🚀 Integration Process

Phase 1: API Access & Setup (1-2 days)

1. Request API Access

  1. Visit altsportsleagues.ai/api-access
  2. Select "Sportsbook Integration" account type
  3. Provide company information and licensing details
  4. Submit technical contact information
  5. Complete compliance verification

2. Install Python SDK

pip install altsportsleagues

3. Initialize Client

from altsportsleagues import AltSportsLeagues
 
# Initialize with your API key
client = AltSportsLeagues(api_key="your_api_key_here")
 
# Test connection
try:
    info = client.get_api_info()
    print(f"API Version: {info['version']}")
    print(f"Available endpoints: {len(info['endpoints'])}")
except Exception as e:
    print(f"Connection failed: {e}")

Phase 2: Data Exploration (2-3 days)

Discover Available Leagues

# Get all available leagues
all_leagues = client.get_leagues()
print(f"Total leagues available: {len(all_leagues)}")
 
# Filter by sport type
basketball_leagues = client.get_leagues(sport_type="basketball")
combat_leagues = client.get_leagues(sport_type="combat")
racing_leagues = client.get_leagues(sport_type="racing")
 
print(f"Basketball leagues: {len(basketball_leagues)}")
print(f"Combat leagues: {len(combat_leagues)}")
print(f"Racing leagues: {len(racing_leagues)}")

Analyze League Profiles

# Examine league details and compliance
for league in basketball_leagues[:10]:  # First 10 leagues
    detail = client.get_league(league.league_id)
 
    print(f"League: {detail.league_name}")
    print(f"Compliance Score: {detail.compliance_score}/100")
    print(f"Status: {detail.get_onboarding_progress()}")
    print(f"Location: {detail.location}")
    print(f"Teams: {detail.number_of_teams}")
    print(f"Founded: {detail.founded_year}")
    print("---")

Evaluate Market Potential

# Assess betting market potential
def analyze_market_potential(league):
    """Analyze if a league is suitable for betting integration"""
 
    # Must have minimum compliance score
    if league.compliance_score < 70:
        return "Not Compliant", 0
 
    # Calculate market potential score
    potential = 0
 
    # Size factors
    if league.number_of_teams and league.number_of_teams > 20:
        potential += 20
    elif league.number_of_teams and league.number_of_teams > 10:
        potential += 10
 
    # Attendance factors
    if league.average_attendance and league.average_attendance > 1000:
        potential += 25
    elif league.average_attendance and league.average_attendance > 500:
        potential += 15
 
    # Established leagues score higher
    if league.founded_year and league.founded_year < 2000:
        potential += 15
    elif league.founded_year and league.founded_year < 2010:
        potential += 10
 
    # Compliance bonus
    if league.compliance_score >= 90:
        potential += 20
    elif league.compliance_score >= 80:
        potential += 10
 
    # Determine recommendation
    if potential >= 60:
        return "High Potential", potential
    elif potential >= 40:
        return "Medium Potential", potential
    else:
        return "Low Potential", potential
 
# Analyze top leagues
for league in basketball_leagues[:20]:
    recommendation, score = analyze_market_potential(league)
    print(f"{league.league_name}: {recommendation} ({score}/100)")

Phase 3: Data Integration (1-2 weeks)

Access Historical Event Data

# Get historical events for market analysis
events = client.get_historical_events(
    league_id="nba",
    season="2023-2024",
    limit=1000
)
 
print(f"Retrieved {len(events)} NBA events")
 
# Analyze event patterns
from collections import defaultdict
import statistics
 
season_stats = defaultdict(list)
 
for event in events:
    if event.status == 'completed':
        season_stats['total_events'].append(event)
 
        # Home team win rate (check for None so a 0 score still counts)
        if event.home_score is not None and event.away_score is not None:
            if event.home_score > event.away_score:
                season_stats['home_wins'].append(1)
            else:
                season_stats['home_wins'].append(0)
 
print(f"Total completed events: {len(season_stats['total_events'])}")
if season_stats['home_wins']:
    home_win_pct = statistics.mean(season_stats['home_wins']) * 100
    print(f"Home team win percentage: {home_win_pct:.1f}%")

Set Up Real-Time Data Feeds

# Monitor live events and odds
import time
from datetime import datetime, timedelta
 
def monitor_live_events(league_ids):
    """Monitor live events for real-time betting opportunities"""
 
    while True:
        current_time = datetime.now()
 
        for league_id in league_ids:
            # Get upcoming events in next 24 hours
            upcoming = client.get_events(
                league_id=league_id,
                start_date=current_time.isoformat(),
                end_date=(current_time + timedelta(days=1)).isoformat()
            )
 
            for event in upcoming:
                if event.is_upcoming():
                    # Get current odds
                    odds = client.get_odds(event.event_id)
 
                    print(f"Event: {event.home_team} vs {event.away_team}")
                    print(f"Start: {event.start_time}")
                    print(f"Available markets: {len(odds)}")
 
                    for odd in odds:
                        print(f"  {odd.bookmaker}: {odd.market}")
                        if odd.selections:
                            for selection in odd.selections[:3]:
                                print(f"    {selection['name']}: {selection['odds']}")
 
        # Check every 5 minutes
        time.sleep(300)
 
# Monitor NBA and NFL events
monitor_live_events(['nba', 'nfl'])

Implement Odds Aggregation

def aggregate_odds(event_id):
    """Aggregate odds from multiple sources for best pricing"""
 
    odds_data = client.get_odds(event_id)
    aggregated = {}
 
    for odd in odds_data:
        market_name = odd.market
 
        if market_name not in aggregated:
            aggregated[market_name] = {
                'selections': {},
                'sources': []
            }
 
        aggregated[market_name]['sources'].append(odd.bookmaker)
 
        # Aggregate odds for each selection
        for selection in odd.selections:
            sel_name = selection['name']
            sel_odds = selection['odds']
 
            if sel_name not in aggregated[market_name]['selections']:
                aggregated[market_name]['selections'][sel_name] = {
                    'odds': [],
                    'best_odds': 0,
                    'sources': []
                }
 
            aggregated[market_name]['selections'][sel_name]['odds'].append(sel_odds)
            aggregated[market_name]['selections'][sel_name]['sources'].append(odd.bookmaker)
 
            # Track best odds
            if sel_odds > aggregated[market_name]['selections'][sel_name]['best_odds']:
                aggregated[market_name]['selections'][sel_name]['best_odds'] = sel_odds
 
    return aggregated
 
# Example: Aggregate odds for an NBA game
event_odds = aggregate_odds("nba_game_123")
 
for market, data in event_odds.items():
    print(f"Market: {market}")
    print(f"Sources: {len(data['sources'])}")
 
    for selection, sel_data in data['selections'].items():
        best_odds = sel_data['best_odds']
        num_sources = len(sel_data['sources'])
        avg_odds = sum(sel_data['odds']) / len(sel_data['odds'])
 
        print(f"  {selection}:")
        print(f"    Best odds: {best_odds}")
        print(f"    Average odds: {avg_odds:.2f}")
        print(f"    Sources: {num_sources}")

Phase 4: Compliance & Legal Setup (3-5 days)

Verify League Compliance

# Check compliance status for target leagues
target_leagues = ['nba', 'mlb', 'nfl', 'nhl']
 
for league_id in target_leagues:
    try:
        status = client.get_compliance_status(league_id)
 
        print(f"League: {league_id.upper()}")
        print(f"Compliance Score: {status['overall_score']}/100")
        print(f"Level: {status['compliance_level']}")
        print(f"Assessment Date: {status['last_assessed']}")
 
        if status['overall_score'] >= 80:
            print("βœ… Suitable for integration")
        elif status['overall_score'] >= 70:
            print("⚠️ Requires additional due diligence")
        else:
            print("❌ Not recommended for integration")
 
        print("---")
 
    except Exception as e:
        print(f"Error checking {league_id}: {e}")

Set Up Partnership Agreements

  1. Legal Review - Have legal team review partnership terms
  2. Data Licensing - Negotiate data usage rights and fees
  3. Revenue Sharing - Agree on revenue sharing arrangements
  4. Compliance Requirements - Ensure league meets your standards

Phase 5: Launch & Monitoring (1-2 weeks)

Soft Launch Process

# Set up monitoring for new betting markets
def monitor_new_market_performance(league_id, market_type):
    """Monitor performance of newly launched betting markets"""
 
    # Get recent events
    events = client.get_events(league_id=league_id, limit=50)
 
    performance_metrics = {
        'total_events': len(events),
        'live_events': 0,
        'total_volume': 0,
        'successful_events': 0
    }
 
    for event in events:
        if event.status == 'completed':
            performance_metrics['successful_events'] += 1
 
            # Get odds data for volume estimation
            try:
                odds = client.get_odds(event.event_id)
                # Estimate volume from odds data
                event_volume = sum(
                    selection.get('volume', 0)
                    for odd in odds
                    for selection in odd.selections
                )
                performance_metrics['total_volume'] += event_volume
 
            except Exception:
                pass  # Skip if odds data is unavailable for this event
 
        elif event.status == 'in_progress':
            performance_metrics['live_events'] += 1
 
    return performance_metrics
 
# Monitor NBA market performance
nba_metrics = monitor_new_market_performance('nba', 'moneyline')
print(f"NBA Market Performance:")
print(f"Total Events: {nba_metrics['total_events']}")
print(f"Live Events: {nba_metrics['live_events']}")
print(f"Successful Events: {nba_metrics['successful_events']}")
print(f"Estimated Volume: ${nba_metrics['total_volume']:,.2f}")

Performance Optimization

# Implement caching for frequently accessed data
from functools import lru_cache
import time
 
@lru_cache(maxsize=1000)
def cached_get_league(league_id):
    """Cache league data to reduce API calls"""
    return client.get_league(league_id)
 
@lru_cache(maxsize=5000)
def cached_get_odds(event_id):
    """Cache odds lookups (lru_cache has no TTL, so call
    cached_get_odds.cache_clear() periodically because odds change quickly)"""
    return client.get_odds(event_id)
 
# Batch processing for efficiency
def batch_process_events(event_ids, batch_size=10):
    """Process events in batches to optimize API usage"""
 
    results = []
    for i in range(0, len(event_ids), batch_size):
        batch = event_ids[i:i + batch_size]
 
        batch_results = []
        for event_id in batch:
            try:
                odds = cached_get_odds(event_id)
                batch_results.append({'event_id': event_id, 'odds': odds})
            except Exception as e:
                batch_results.append({'event_id': event_id, 'error': str(e)})
 
        results.extend(batch_results)
 
        # Rate limiting
        time.sleep(0.1)
 
    return results

📊 Data Formats & Integration

Understanding Our Data Models

League Data Structure

# League object structure
league = {
    'league_id': 'nba',
    'league_name': 'National Basketball Association',
    'sport_bucket': 'team',
    'compliance_score': 98,
    'location': 'United States',
    'number_of_teams': 30,
    'founded_year': 1946,
    'onboarding_status': 'onboarded'
}

Event Data Structure

# Event object structure
event = {
    'event_id': 'nba_2024_01_15_lakers_celtics',
    'league_id': 'nba',
    'home_team': 'Los Angeles Lakers',
    'away_team': 'Boston Celtics',
    'start_time': '2024-01-15T20:00:00Z',
    'venue': 'Crypto.com Arena',
    'status': 'completed',
    'home_score': 105,
    'away_score': 108,
    'season': '2023-2024'
}

Odds Data Structure

# Odds object structure
odds = {
    'odds_id': 'odds_123',
    'event_id': 'nba_2024_01_15_lakers_celtics',
    'bookmaker': 'FanDuel',
    'market': 'moneyline',
    'selections': [
        {'name': 'Los Angeles Lakers', 'odds': 1.85, 'volume': 125000},
        {'name': 'Boston Celtics', 'odds': 1.95, 'volume': 98000}
    ],
    'last_updated': '2024-01-15T19:45:00Z',
    'is_live': False
}
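
The decimal odds in each selection can be converted to implied probabilities to sanity-check pricing. A minimal sketch, assuming odds payloads shaped like the dictionary above:

# Convert decimal odds to implied probabilities and check the overround
def implied_probabilities(odds_record):
    """Return implied probability per selection plus the total book (overround)"""
    probs = {
        sel['name']: 1 / sel['odds']
        for sel in odds_record['selections']
        if sel['odds'] > 0
    }
    overround = sum(probs.values())  # > 1.0 reflects the bookmaker margin
    return probs, overround

probs, overround = implied_probabilities(odds)
for name, prob in probs.items():
    print(f"{name}: {prob:.1%} implied")
print(f"Overround: {overround:.3f} (margin ~{overround - 1:.1%})")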

🔧 Production Integration

API Rate Limits & Optimization

# Implement intelligent rate limiting
import time
from collections import defaultdict
 
class RateLimiter:
    def __init__(self, max_calls_per_minute=60):
        self.max_calls = max_calls_per_minute
        self.calls = defaultdict(list)
 
    def can_make_call(self, endpoint):
        """Check if we can make another API call"""
        now = time.time()
        # Remove calls older than 1 minute
        self.calls[endpoint] = [
            call_time for call_time in self.calls[endpoint]
            if now - call_time < 60
        ]
 
        return len(self.calls[endpoint]) < self.max_calls
 
    def record_call(self, endpoint):
        """Record an API call"""
        self.calls[endpoint].append(time.time())
 
# Use rate limiter
limiter = RateLimiter(max_calls_per_minute=50)
 
def safe_api_call(func, *args, **kwargs):
    """Make an API call, waiting until the rate limiter allows it"""
    while not limiter.can_make_call(func.__name__):
        time.sleep(1)  # Keep waiting while we are at the per-minute limit
 
    result = func(*args, **kwargs)
    limiter.record_call(func.__name__)
    return result
 
# Usage
leagues = safe_api_call(client.get_leagues)

Error Handling & Resilience

# Implement comprehensive error handling
import logging
from tenacity import retry, stop_after_attempt, wait_exponential
 
logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)
 
@retry(
    stop=stop_after_attempt(3),
    wait=wait_exponential(multiplier=1, min=4, max=10)
)
def resilient_api_call(func, *args, **kwargs):
    """Make API calls with automatic retry and exponential backoff"""
    try:
        return func(*args, **kwargs)
    except Exception as e:
        logger.error(f"API call failed: {e}")
        raise
 
# Usage with error handling
try:
    leagues = resilient_api_call(client.get_leagues)
    logger.info(f"Successfully retrieved {len(leagues)} leagues")
except Exception as e:
    logger.error(f"Failed to retrieve leagues after retries: {e}")
    # Fallback to cached data or alternative source
    leagues = get_cached_leagues()

Monitoring & Alerting

# Set up monitoring for production integration
def setup_monitoring():
    """Set up monitoring and alerting for API integration"""
 
    # Health check
    try:
        health = client.ping()
        if health['status'] != 'healthy':
            alert_team("API health check failed")
    except Exception as e:
        alert_team(f"API connectivity issue: {e}")
 
    # Performance monitoring
    import psutil
    import time
 
    while True:
        # Monitor API response times
        start_time = time.time()
        try:
            leagues = client.get_leagues(limit=10)
            response_time = time.time() - start_time
 
            if response_time > 2.0:  # Alert if > 2 seconds
                alert_team(f"Slow API response: {response_time:.2f}s")
 
        except Exception as e:
            alert_team(f"API call failed: {e}")
 
        # Monitor system resources
        cpu_percent = psutil.cpu_percent()
        memory_percent = psutil.virtual_memory().percent
 
        if cpu_percent > 80 or memory_percent > 80:
            alert_team(f"High resource usage - CPU: {cpu_percent}%, Memory: {memory_percent}%")
 
        time.sleep(300)  # Check every 5 minutes
 
def alert_team(message):
    """Send alert to operations team"""
    print(f"ALERT: {message}")
    # Implement actual alerting (email, Slack, PagerDuty, etc.)

📈 Success Metrics

Key Performance Indicators

  • Market Expansion - Number of new sports added to platform
  • Revenue Growth - Additional revenue from new betting markets
  • User Engagement - Increased betting activity on new sports
  • Data Quality - Accuracy of odds and event data
  • System Reliability - API uptime and response times

Measuring Success

def calculate_integration_success():
    """Calculate success metrics for sportsbook integration"""
 
    metrics = {
        'new_sports_added': 0,
        'new_markets_launched': 0,
        'additional_revenue': 0,
        'api_uptime': 0,
        'average_response_time': 0
    }
 
    # Calculate metrics based on your tracking data
    # This would integrate with your analytics system
 
    return metrics
 
# Example success metrics
success = calculate_integration_success()
print("Integration Success Metrics:")
for metric, value in success.items():
    print(f"{metric}: {value}")

🆘 Support & Resources

Documentation

Support Channels

  • Partnership inquiries: partnership@altsportsdata.com
  • Developer support: dev@altsportsleagues.ai

Common Integration Issues

API Rate Limiting

  • Implement caching and batching
  • Use webhooks for real-time updates (see the receiver sketch below)
  • Consider premium API tiers
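
If your integration plan relies on webhooks rather than polling, a small receiver endpoint is usually enough to start. Below is a minimal Flask sketch; the /webhooks/odds route, the payload fields, and the X-Signature header are assumptions rather than a documented contract, so confirm the actual delivery format with your integration contact:

# Minimal webhook receiver sketch (Flask). Route, payload shape, and signature
# header are hypothetical; verify the real contract before going live.
import hashlib
import hmac
import os

from flask import Flask, abort, request

app = Flask(__name__)
WEBHOOK_SECRET = os.environ.get("ALTSPORTS_WEBHOOK_SECRET", "")

@app.route("/webhooks/odds", methods=["POST"])
def odds_webhook():
    # Verify the (assumed) HMAC-SHA256 signature before trusting the payload
    signature = request.headers.get("X-Signature", "")
    expected = hmac.new(WEBHOOK_SECRET.encode(), request.data, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(signature, expected):
        abort(401)

    payload = request.get_json(force=True)
    # Hand the update to your odds pipeline instead of re-polling the REST API
    print(f"Odds update received for event {payload.get('event_id')}")
    return "", 204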

Data Synchronization

  • Implement proper data versioning
  • Handle event updates gracefully
  • Set up data validation pipelines (a sketch follows this list)
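
A lightweight validation pass catches malformed events before they reach pricing or settlement systems. A minimal sketch, using the field names from the event structure shown earlier; adjust the required fields to your own schema:

# Minimal event validation sketch; field names follow the event data structure above
from datetime import datetime

REQUIRED_FIELDS = ['event_id', 'league_id', 'home_team', 'away_team', 'start_time', 'status']

def validate_event(event):
    """Return a list of validation errors (an empty list means the event is usable)"""
    errors = [f"missing field: {field}" for field in REQUIRED_FIELDS if not event.get(field)]

    # start_time should parse as ISO 8601
    if event.get('start_time'):
        try:
            datetime.fromisoformat(event['start_time'].replace('Z', '+00:00'))
        except ValueError:
            errors.append(f"unparseable start_time: {event['start_time']}")

    # Completed events need final scores
    if event.get('status') == 'completed':
        if event.get('home_score') is None or event.get('away_score') is None:
            errors.append("completed event is missing scores")

    return errors

# Example: a partially populated event fails validation with clear messages
print(validate_event({'event_id': 'evt_1', 'league_id': 'nba', 'status': 'scheduled'}))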

Compliance Concerns

  • Regular compliance audits
  • Automated monitoring alerts (see the sketch after this list)
  • Clear escalation procedures
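
Compliance checks can also run on a schedule so that a score drop triggers an alert rather than waiting for the next manual audit. A minimal sketch that reuses client.get_compliance_status and the alert_team helper defined earlier; the 80-point threshold is an assumed policy value, not a platform rule:

# Periodic compliance re-check sketch; threshold and interval are assumptions
import time

def watch_compliance(league_ids, min_score=80, interval_seconds=86400):
    """Alert the team whenever a partner league drops below the agreed score"""
    while True:
        for league_id in league_ids:
            try:
                status = client.get_compliance_status(league_id)
                if status['overall_score'] < min_score:
                    alert_team(
                        f"{league_id} compliance dropped to {status['overall_score']}"
                    )
            except Exception as e:
                alert_team(f"Compliance check failed for {league_id}: {e}")
        time.sleep(interval_seconds)  # Re-check once per day by default

# watch_compliance(['nba', 'mlb'])  # Run from a scheduler or background worker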

Ready to expand your betting platform? Request API access today!
