Version: 1.0.0
Category: API Patterns
Purpose: Comprehensive guide for AI agents on implementing effective retry strategies for API requests
This pattern provides detailed implementation strategies for retrying failed API requests. It complements the Error Handling Protocol by focusing specifically on retry logic, circuit breakers, and advanced retry patterns.
Key Principles:
Always Retry:
- GENERAL_ERROR (code: 1)
- INVALID_SIGNATURE (code: 3) - May succeed on retry
- INSUFFICIENT_GAS (code: 4) - May succeed if gas available
- INTERNAL_SERVER_ERROR (code: 500)
- REQUEST_TIMEOUT (timeout)
- NETWORK_ERROR (network)
- RATE_LIMIT_EXCEEDED (code: 429) - Retry after delay

Never Retry (will always fail):
- INSUFFICIENT_FUNDS (code: 2)
- INVALID_MESSAGE (code: 5)
- PLAYER_HALTED (code: 6)
- INSUFFICIENT_CHARGE (code: 7)
- INVALID_LOCATION (code: 8)
- INVALID_TARGET (code: 9)
- ENTITY_NOT_FOUND (code: 404)
- BAD_REQUEST (code: 400)

Reference: See protocols/error-handling.md for complete error categorization
Exponential Backoff
Best For: Most retryable errors
Implementation:
{
"strategy": "exponential-backoff",
"maxRetries": 3,
"initialDelay": 1000,
"maxDelay": 10000,
"backoffMultiplier": 2,
"jitter": true
}
Delay Calculation:
{
"attempt": 1,
"delay": 1000,
"calculation": "initialDelay * (backoffMultiplier ^ (attempt - 1))"
}
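The calculation above can be expressed as a small helper (`calculateBackoffDelay` is an illustrative name; the defaults follow the configuration shown earlier):

```javascript
// Delay for a given attempt: initialDelay * (backoffMultiplier ^ (attempt - 1)),
// capped at maxDelay.
function calculateBackoffDelay(attempt, {
  initialDelay = 1000,
  backoffMultiplier = 2,
  maxDelay = 10000,
} = {}) {
  return Math.min(initialDelay * Math.pow(backoffMultiplier, attempt - 1), maxDelay);
}
```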
Example Delays: attempt 1 → 1000ms, attempt 2 → 2000ms, attempt 3 → 4000ms (capped at maxDelay 10000ms)
With Jitter (randomization to prevent thundering herd):
{
"jitter": {
"enabled": true,
"type": "full",
"range": "0 to calculated delay"
}
}
Code Example:
// Assumes isRetryableError(error) implements the retryable/non-retryable
// tables above. sleep(ms) resolves after ms milliseconds.
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

async function retryWithExponentialBackoff(fn, maxRetries = 3) {
  let attempt = 0;
  let delay = 1000;
  while (attempt < maxRetries) {
    try {
      return await fn();
    } catch (error) {
      if (!isRetryableError(error)) {
        throw error; // Don't retry non-retryable errors
      }
      attempt++;
      if (attempt >= maxRetries) {
        throw error; // Max retries reached
      }
      // Full jitter: sleep a random duration between 0 and the calculated delay
      await sleep(Math.random() * delay);
      // Exponential backoff
      delay = Math.min(delay * 2, 10000); // Cap at 10 seconds
    }
  }
}
Fixed Delay
Best For: Rate limiting (429 errors)
Implementation:
{
"strategy": "fixed-delay",
"maxRetries": 3,
"delay": 5000,
"useWhen": "rate-limit-errors"
}
Example:
{
"rateLimitRetry": {
"error": 429,
"delay": 5000,
"reason": "Wait for rate limit window to reset",
"maxRetries": 3
}
}
Code Example:
async function retryWithFixedDelay(fn, delay = 5000, maxRetries = 3) {
let attempt = 0;
while (attempt < maxRetries) {
try {
return await fn();
} catch (error) {
if (error.status === 429) {
attempt++;
if (attempt >= maxRetries) {
throw error;
}
await sleep(delay); // Fixed delay for rate limits
} else {
throw error; // Don't retry non-rate-limit errors
}
}
}
}
Linear Backoff
Best For: Predictable retry scenarios
Implementation:
{
"strategy": "linear-backoff",
"maxRetries": 5,
"initialDelay": 1000,
"increment": 1000,
"maxDelay": 5000
}
Delay Calculation:
{
"attempt": 1,
"delay": 1000,
"calculation": "initialDelay + (increment * (attempt - 1))"
}
Example Delays: attempt 1 → 1000ms, attempt 2 → 2000ms, attempt 3 → 3000ms, attempt 4 → 4000ms, attempt 5 → 5000ms (capped at maxDelay)
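Linear backoff can be sketched much like the exponential example; `sleep` and a placeholder `isRetryableError` are defined inline so the sketch stands alone:

```javascript
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));
// Placeholder: a real implementation would consult the error tables above
const isRetryableError = (error) => error.retryable === true;

// Linear backoff: the delay grows by a fixed increment on each attempt.
async function retryWithLinearBackoff(fn, {
  maxRetries = 5,
  initialDelay = 1000,
  increment = 1000,
  maxDelay = 5000,
} = {}) {
  let attempt = 0;
  while (attempt < maxRetries) {
    try {
      return await fn();
    } catch (error) {
      if (!isRetryableError(error)) throw error;
      attempt++;
      if (attempt >= maxRetries) throw error;
      // initialDelay + (increment * (attempt - 1)), capped at maxDelay
      const delay = Math.min(initialDelay + increment * (attempt - 1), maxDelay);
      await sleep(delay);
    }
  }
}
```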
Circuit Breaker Pattern
Purpose: Prevent cascading failures by stopping retries after repeated failures
States:
- closed: normal operation, requests pass through
- open: requests fail fast without reaching the service
- half-open: a limited number of test requests probe whether the service has recovered
Implementation:
{
"circuitBreaker": {
"failureThreshold": 5,
"successThreshold": 2,
"timeout": 60000,
"halfOpenRetries": 1,
"states": ["closed", "open", "half-open"]
}
}
State Transitions:
{
"transitions": {
"closed-to-open": "After 5 consecutive failures",
"open-to-half-open": "After timeout (60 seconds)",
"half-open-to-closed": "After 2 successful requests",
"half-open-to-open": "After 1 failure"
}
}
Code Example:
class CircuitBreaker {
constructor(options = {}) {
this.failureThreshold = options.failureThreshold || 5;
this.successThreshold = options.successThreshold || 2;
this.timeout = options.timeout || 60000;
this.state = 'closed';
this.failureCount = 0;
this.successCount = 0;
this.nextAttempt = Date.now();
}
async execute(fn) {
if (this.state === 'open') {
if (Date.now() < this.nextAttempt) {
throw new Error('Circuit breaker is open');
}
this.state = 'half-open';
this.successCount = 0;
}
try {
const result = await fn();
this.onSuccess();
return result;
} catch (error) {
this.onFailure();
throw error;
}
}
onSuccess() {
this.failureCount = 0;
if (this.state === 'half-open') {
this.successCount++;
if (this.successCount >= this.successThreshold) {
this.state = 'closed';
}
}
}
onFailure() {
this.failureCount++;
if (this.state === 'half-open') {
this.state = 'open';
this.nextAttempt = Date.now() + this.timeout;
} else if (this.failureCount >= this.failureThreshold) {
this.state = 'open';
this.nextAttempt = Date.now() + this.timeout;
}
}
}
Retry with Validation
Purpose: Validate conditions before retrying
Implementation:
{
"retryWithValidation": {
"validateBeforeRetry": true,
"validations": [
"Check if error is retryable",
"Verify retry count < max retries",
"Validate request parameters unchanged",
"Check if conditions changed (e.g., player online)"
]
}
}
Example:
{
"scenario": "Retry mining after insufficient charge",
"validation": {
"beforeRetry": [
"Query player charge level",
"Verify charge >= required amount",
"Check player is not halted"
],
"retryIf": "charge >= required AND player.halted === false"
}
}
Code Example:
async function retryWithValidation(fn, validator, maxRetries = 3) {
let attempt = 0;
while (attempt < maxRetries) {
try {
return await fn();
} catch (error) {
if (!isRetryableError(error)) {
throw error;
}
// Validate before retrying
const canRetry = await validator(error, attempt);
if (!canRetry) {
throw error;
}
attempt++;
if (attempt >= maxRetries) {
throw error;
}
await sleep(calculateDelay(attempt));
}
}
}
Retry with Context
Purpose: Pass context between retry attempts
Implementation:
{
"retryWithContext": {
"context": {
"requestId": "unique-request-id",
"attempt": 1,
"previousErrors": [],
"metadata": {}
},
"updateContext": "On each retry attempt"
}
}
Use Cases:
- Correlating log entries across attempts via requestId
- Aggregating previousErrors for diagnostics
- Adjusting strategy based on earlier failures
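A context-threading retry can be sketched as follows (the requestId scheme and the fields recorded per failure are illustrative):

```javascript
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

// Thread a context object through retries, recording each failure so the
// wrapped function (and any logging) can see the history of attempts.
async function retryWithContext(fn, { maxRetries = 3, delay = 1000 } = {}) {
  const context = {
    requestId: `req-${Date.now()}`, // illustrative ID scheme
    attempt: 0,
    previousErrors: [],
    metadata: {},
  };
  while (true) {
    context.attempt++;
    try {
      return await fn(context); // fn can inspect attempt and previousErrors
    } catch (error) {
      context.previousErrors.push({ message: error.message, at: Date.now() });
      if (context.attempt >= maxRetries) throw error;
      await sleep(delay);
    }
  }
}
```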
Adaptive Retry
Purpose: Adjust retry strategy based on error patterns
Implementation:
{
"adaptiveRetry": {
"strategy": "adaptive",
"adjustments": {
"ifManyTimeouts": "Increase initial delay",
"ifManyRateLimits": "Use fixed delay",
"ifManyServerErrors": "Use exponential backoff",
"ifManyNetworkErrors": "Increase max retries"
}
}
}
Example:
{
"adaptiveStrategy": {
"timeoutErrors": {
"count": 5,
"action": "Increase initial delay to 2000ms"
},
"rateLimitErrors": {
"count": 3,
"action": "Switch to fixed delay strategy"
}
}
}
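The adjustments above can be sketched with simple per-category counters (the class name is illustrative; the thresholds follow the example):

```javascript
// Track error categories and adjust the retry configuration once a
// category crosses its threshold, mirroring the example above.
class AdaptiveRetryConfig {
  constructor() {
    this.counts = { timeout: 0, rateLimit: 0, server: 0, network: 0 };
    this.config = { strategy: 'exponential-backoff', initialDelay: 1000, maxRetries: 3 };
  }
  record(category) {
    this.counts[category] = (this.counts[category] || 0) + 1;
    if (category === 'timeout' && this.counts.timeout >= 5) {
      this.config.initialDelay = 2000; // many timeouts: slow down
    }
    if (category === 'rateLimit' && this.counts.rateLimit >= 3) {
      this.config.strategy = 'fixed-delay'; // many 429s: fixed delay
    }
    if (category === 'network' && this.counts.network >= 5) {
      this.config.maxRetries = 5; // flaky network: allow more retries
    }
    return this.config;
  }
}
```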
Rate Limit Errors (429)
Strategy: Fixed delay based on rate limit headers
Implementation:
{
"rateLimitRetry": {
"error": 429,
"strategy": "fixed-delay",
"delaySource": "Retry-After header or default 5000ms",
"maxRetries": 3
}
}
Code Example:
async function handleRateLimit(error, fn) {
  // Honor the Retry-After header (in seconds); default to 5 seconds
  const retryAfter = parseInt(error.headers['retry-after'] ?? '5', 10);
  await sleep(retryAfter * 1000);
  return await fn();
}
Timeout Errors
Strategy: Exponential backoff with longer initial delay
Implementation:
{
"timeoutRetry": {
"error": "timeout",
"strategy": "exponential-backoff",
"initialDelay": 2000,
"maxDelay": 30000,
"maxRetries": 5
}
}
Network Errors
Strategy: Exponential backoff with connection check
Implementation:
{
"networkErrorRetry": {
"error": "network",
"strategy": "exponential-backoff",
"validateConnection": true,
"maxRetries": 5
}
}
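The validateConnection idea can be sketched as a pre-retry connectivity probe (`checkConnection` is a hypothetical async helper, e.g. a HEAD request to a health endpoint, returning true when the network is reachable):

```javascript
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

// Exponential backoff for network errors that also waits for connectivity
// before retrying, so retries aren't wasted while the network is down.
async function retryNetworkError(fn, checkConnection, {
  maxRetries = 5,
  initialDelay = 2000,
  maxDelay = 30000,
} = {}) {
  let delay = initialDelay;
  for (let attempt = 1; attempt <= maxRetries; attempt++) {
    try {
      return await fn();
    } catch (error) {
      if (attempt === maxRetries) throw error;
      // Don't burn a retry while the network is known to be down
      while (!(await checkConnection())) {
        await sleep(delay);
      }
      await sleep(delay);
      delay = Math.min(delay * 2, maxDelay);
    }
  }
}
```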
Server Errors (500)
Strategy: Exponential backoff with circuit breaker
Implementation:
{
"serverErrorRetry": {
"error": 500,
"strategy": "exponential-backoff",
"circuitBreaker": true,
"maxRetries": 3
}
}
Default Configuration:
{
"default": {
"maxRetries": 3,
"strategy": "exponential-backoff",
"initialDelay": 1000,
"maxDelay": 10000,
"backoffMultiplier": 2,
"jitter": true
}
}
Rate Limit Configuration:
{
"rateLimit": {
"maxRetries": 3,
"strategy": "fixed-delay",
"delay": 5000,
"useRetryAfterHeader": true
}
}
Network Error Configuration:
{
"networkError": {
"maxRetries": 5,
"strategy": "exponential-backoff",
"initialDelay": 2000,
"maxDelay": 30000
}
}
Timeout Configuration:
{
"timeout": {
"maxRetries": 5,
"strategy": "exponential-backoff",
"initialDelay": 2000,
"maxDelay": 30000
}
}
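These configurations could be selected by a small dispatcher (a hypothetical helper; the `status` and `code` error fields match those used in the code examples above):

```javascript
// Pick the retry configuration matching an error's category.
const RETRY_CONFIGS = {
  default: { maxRetries: 3, strategy: 'exponential-backoff', initialDelay: 1000, maxDelay: 10000 },
  rateLimit: { maxRetries: 3, strategy: 'fixed-delay', delay: 5000 },
  networkError: { maxRetries: 5, strategy: 'exponential-backoff', initialDelay: 2000, maxDelay: 30000 },
  timeout: { maxRetries: 5, strategy: 'exponential-backoff', initialDelay: 2000, maxDelay: 30000 },
};

function configFor(error) {
  if (error.status === 429) return RETRY_CONFIGS.rateLimit;
  if (error.code === 'timeout') return RETRY_CONFIGS.timeout;
  if (error.code === 'network') return RETRY_CONFIGS.networkError;
  return RETRY_CONFIGS.default;
}
```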
Do:
- Check whether an error is retryable before retrying (see the lists above)
Don’t:
- Retry errors that will always fail, such as BAD_REQUEST (400) or ENTITY_NOT_FOUND (404)
Do:
- Log every retry attempt with enough context to diagnose failures
{
"logging": {
"include": [
"error code and message",
"retry attempt number",
"delay used",
"request details",
"timestamp"
]
}
}
Benefits:
- Easier debugging of failed requests
- Visibility into error patterns over time
- Data for tuning retry configuration
Do:
- Add jitter to retry delays
Benefits:
- Prevents thundering-herd retries when many clients fail simultaneously
- Spreads retry load over time
Do:
- Validate that conditions have changed before retrying
Example:
{
"validation": {
"beforeRetry": [
"Is error retryable?",
"Has player come online?",
"Is charge sufficient?",
"Are parameters still valid?"
]
}
}
Do:
- Cap the number of retries and the maximum delay
Benefits:
- Bounds worst-case latency
- Avoids wasting work on requests that will never succeed
Pattern: Combine retry with rate limit handling
{
"integration": {
"retry": "Exponential backoff",
"rateLimit": "Respect rate limit headers",
"strategy": "Retry after rate limit window"
}
}
Reference: patterns/rate-limiting.md
Pattern: Use cache as fallback during retries
{
"integration": {
"retry": "Exponential backoff",
"cache": "Use cached data if retry fails",
"strategy": "Retry → Cache → Error"
}
}
Reference: patterns/caching.md
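The Retry → Cache → Error chain might be sketched like this (the cache's get/set interface is assumed; a Map works for demonstration, and backoff between attempts is omitted for brevity):

```javascript
// Try the request with retries; on final failure, fall back to cached
// data if any exists, otherwise rethrow the error.
async function fetchWithRetryAndCache(key, fn, cache, { maxRetries = 3 } = {}) {
  for (let attempt = 1; attempt <= maxRetries; attempt++) {
    try {
      const result = await fn();
      cache.set(key, result); // refresh cache on success
      return result;
    } catch (error) {
      if (attempt === maxRetries) {
        const cached = cache.get(key);
        if (cached !== undefined) return cached; // stale but usable
        throw error; // Retry → Cache → Error
      }
    }
  }
}
```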
Pattern: Use circuit breaker to prevent excessive retries
{
"integration": {
"retry": "Exponential backoff",
"circuitBreaker": "Open after threshold failures",
"strategy": "Retry → Circuit Breaker → Fail Fast"
}
}
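Combining the two, in the spirit of the CircuitBreaker class shown earlier (a simplified inline breaker is used here so the sketch stands alone):

```javascript
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

// Simplified breaker: opens after `threshold` consecutive failures.
class SimpleBreaker {
  constructor(threshold = 5) { this.failures = 0; this.threshold = threshold; }
  get open() { return this.failures >= this.threshold; }
  success() { this.failures = 0; }
  failure() { this.failures++; }
}

// Retry with exponential backoff, failing fast once the breaker opens.
async function retryWithBreaker(fn, breaker, { maxRetries = 3, initialDelay = 1000 } = {}) {
  let delay = initialDelay;
  for (let attempt = 1; attempt <= maxRetries; attempt++) {
    if (breaker.open) throw new Error('Circuit breaker is open: failing fast');
    try {
      const result = await fn();
      breaker.success();
      return result;
    } catch (error) {
      breaker.failure();
      if (attempt === maxRetries || breaker.open) throw error;
      await sleep(delay);
      delay *= 2;
    }
  }
}
```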
Retry Metrics:
{
"metrics": {
"retryCount": "Number of retries attempted",
"retrySuccessRate": "Percentage of successful retries",
"averageRetries": "Average retries per request",
"retryDelay": "Average delay between retries",
"circuitBreakerState": "Current circuit breaker state"
}
}
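A minimal tracker for these metrics might look like the following sketch (field names follow the table above; this is not a metrics library):

```javascript
// Accumulate retry metrics and derive the aggregate indicators above.
class RetryMetrics {
  constructor() {
    this.requests = 0;            // total requests observed
    this.retries = 0;             // total retry attempts across all requests
    this.requestsWithRetries = 0; // requests that needed at least one retry
    this.retrySuccesses = 0;      // requests that succeeded after retrying
  }
  recordRequest(retryCount, succeeded) {
    this.requests++;
    this.retries += retryCount;
    if (retryCount > 0) {
      this.requestsWithRetries++;
      if (succeeded) this.retrySuccesses++;
    }
  }
  get averageRetries() {
    return this.requests === 0 ? 0 : this.retries / this.requests;
  }
  get retrySuccessRate() {
    return this.requestsWithRetries === 0
      ? 1
      : this.retrySuccesses / this.requestsWithRetries;
  }
}
```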
Key Indicators:
- Rising retry counts: upstream instability or misconfiguration
- Falling retry success rate: errors may not actually be transient
- Frequent circuit-breaker opens: a dependency is persistently failing
Related Documentation:
- protocols/error-handling.md - Complete error handling guide
- patterns/rate-limiting.md - Rate limit handling
- patterns/caching.md - Caching strategies
- api/error-codes.md - Complete error code catalog

Quick Reference: Retryable Errors
- GENERAL_ERROR (1)
- INVALID_SIGNATURE (3)
- INSUFFICIENT_GAS (4)
- INTERNAL_SERVER_ERROR (500)
- REQUEST_TIMEOUT
- NETWORK_ERROR
- RATE_LIMIT_EXCEEDED (429)

Pattern Version: 1.0.0 - January 2025