
API

This guide covers how to use the Alien Giraffe service REST API to create ephemeral environments, upload data, execute queries, and manage the environment lifecycle.

All API endpoints are relative to the base URL:

https://a10e.internal-domain/api

Most operations require authentication using a token provided when creating an environment. Include the token in the Authorization header:

Authorization: Bearer YOUR_TOKEN_HERE
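For repeated calls it can help to centralize the header in one place. A minimal Python sketch, assuming a hypothetical `authed_request` helper and a placeholder token (the real token comes from environment creation, below):

```python
# Sketch: attaching the bearer token to every request in one place.
# `authed_request` is a hypothetical helper, not part of the service API.
from urllib.request import Request

BASE_URL = "https://a10e.internal-domain/api"
TOKEN = "YOUR_TOKEN_HERE"  # placeholder: use the token returned at creation


def authed_request(path: str, method: str = "GET") -> Request:
    """Build a request against the API with the Authorization header set."""
    return Request(
        BASE_URL + path,
        method=method,
        headers={"Authorization": f"Bearer {TOKEN}"},
    )


req = authed_request("/environments/example-id/health")
print(req.get_header("Authorization"))  # Bearer YOUR_TOKEN_HERE
```

The request object is only constructed here, not sent; pass it to `urllib.request.urlopen` (or use any HTTP client) to actually call the API.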

Create a new isolated data environment:

Endpoint: POST /environments/create

curl -X POST https://a10e.internal-domain/api/environments/create \
  -H "Content-Type: application/json"

Response:

{
  "environment_id": "550e8400-e29b-41d4-a716-446655440000",
  "token": "eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9...",
  "expires_at": "2025-01-01T12:00:00Z",
  "api_url": "https://a10e.internal-domain/api",
  "mcp_url": "https://a10e.internal-domain/mcp",
  "socket_url": "wss://a10e.internal-domain/ws"
}

Response Fields:

  • environment_id - Unique identifier for your environment
  • token - Authentication token for this environment (save this!)
  • expires_at - When the environment will expire if unused
  • api_url - REST API endpoint
  • mcp_url - MCP endpoint for AI integration
  • socket_url - WebSocket endpoint for interactive sessions
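The response can be unpacked directly from JSON. A short Python sketch using the documented example body (the 'Z'-suffix normalization is our addition for compatibility with older Python versions, not part of the API):

```python
# Sketch: unpacking the creation response shown above.
import json
from datetime import datetime

# The documented example response body.
body = """{
  "environment_id": "550e8400-e29b-41d4-a716-446655440000",
  "token": "eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9...",
  "expires_at": "2025-01-01T12:00:00Z",
  "api_url": "https://a10e.internal-domain/api",
  "mcp_url": "https://a10e.internal-domain/mcp",
  "socket_url": "wss://a10e.internal-domain/ws"
}"""

env = json.loads(body)
# Keep the token out of logs; store it somewhere secure instead of printing it.
token = env["token"]
# Older Pythons' fromisoformat does not accept a trailing 'Z', so normalize it.
expires = datetime.fromisoformat(env["expires_at"].replace("Z", "+00:00"))
print(env["environment_id"], expires.year)
```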

Verify that your environment is running:

Endpoint: GET /environments/{id}/health

curl https://a10e.internal-domain/api/environments/550e8400-e29b-41d4-a716-446655440000/health

Response:

{
  "success": true,
  "elapsed": 12313
}

Each environment automatically has access to any data sources configured as part of the service deployment; these are available in every new environment without additional setup.

For this example, we’ll use the Upload Data endpoint to load our own CSV files, which provides a quick way to get started without needing to configure external data sources.

Upload one or more CSV files to your environment. Files are automatically loaded into database tables with names derived from the filenames.

Endpoint: POST /environments/{id}/upload

curl -X POST https://a10e.internal-domain/api/environments/550e8400-e29b-41d4-a716-446655440000/upload \
  -H "Authorization: Bearer YOUR_TOKEN_HERE" \
  -F "files=@users.csv" \
  -F "files=@orders.csv" \
  -F "override=false"

Parameters:

  • files - One or more CSV files to upload
  • override - Whether to overwrite existing tables (default: false)
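The exact rules for deriving table names from filenames are not documented here. As a rough mental model, a sketch of one plausible mapping — file stem, lowercased, with non-alphanumeric runs collapsed to underscores (an assumption, not the service's confirmed behavior):

```python
# Sketch: one plausible filename-to-table-name mapping. This is an assumption;
# the service's actual sanitization rules may differ.
import re
from pathlib import Path


def table_name_for(filename: str) -> str:
    stem = Path(filename).stem.lower()          # drop directory and extension
    return re.sub(r"[^a-z0-9]+", "_", stem).strip("_")


print(table_name_for("users.csv"))       # users
print(table_name_for("Order Data.csv"))  # order_data
```

When in doubt, run `SHOW TABLES;` after the upload (see the query examples below) to see the names the service actually assigned.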

Response:

{
  "success": true,
  "tables": [
    {
      "table_name": "users",
      "row_count": 1000
    },
    {
      "table_name": "orders",
      "row_count": 5000
    }
  ]
}

Execute SQL queries against your environment’s database instance:

Endpoint: POST /environments/{id}/query

curl -X POST https://a10e.internal-domain/api/environments/550e8400-e29b-41d4-a716-446655440000/query \
  -H "Authorization: Bearer YOUR_TOKEN_HERE" \
  -H "Content-Type: application/json" \
  -d '{
    "query": "SELECT COUNT(*) as total_users FROM users WHERE age > 25"
  }'
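Because the SQL statement is embedded inside a JSON body, escaping quotes by hand is error-prone. A small Python sketch that builds the payload safely with `json.dumps`:

```python
# Sketch: building the query request body without hand-escaping the SQL string.
import json

sql = "SELECT COUNT(*) as total_users FROM users WHERE age > 25"
body = json.dumps({"query": sql})

# json.dumps handles any quoting the SQL needs; round-tripping recovers it exactly.
print(body)
```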

Response:

{
  "columns": ["total_users"],
  "data": [
    {"total_users": 750}
  ]
}
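Since the `columns` array fixes the column order, a response can be reshaped into plain row tuples. A minimal Python sketch against the documented response shape (`rows_from` is a hypothetical helper):

```python
# Sketch: reshaping a query response ({"columns": [...], "data": [...]})
# into row tuples that follow the declared column order.
def rows_from(resp: dict) -> list:
    cols = resp["columns"]
    return [tuple(rec[c] for c in cols) for rec in resp["data"]]


resp = {"columns": ["total_users"], "data": [{"total_users": 750}]}
print(rows_from(resp))  # [(750,)]
```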

View available tables:

SHOW TABLES;

Describe a table structure:

DESCRIBE users;

Join data from multiple tables:

SELECT u.name, COUNT(o.id) as order_count
FROM users u
LEFT JOIN orders o ON u.id = o.user_id
GROUP BY u.name
ORDER BY order_count DESC
LIMIT 10;

Aggregate analysis:

SELECT
  DATE_TRUNC('month', order_date) as month,
  SUM(amount) as monthly_revenue,
  COUNT(*) as order_count
FROM orders
WHERE order_date >= '2024-01-01'
GROUP BY month
ORDER BY month;

For interactive sessions, establish a WebSocket connection:

Endpoint: GET /environments/{id}/connect

// Example WebSocket connection
const ws = new WebSocket('wss://a10e.internal-domain/api/environments/550e8400-e29b-41d4-a716-446655440000/connect?token=YOUR_TOKEN_HERE');

ws.onopen = function () {
  console.log('Connected to environment');
  // Send a SQL query
  ws.send(JSON.stringify({
    query: "SELECT * FROM users LIMIT 5"
  }));
};

ws.onmessage = function (event) {
  const result = JSON.parse(event.data);
  console.log('Query result:', result);
};

Explicitly destroy an environment when you’re done (optional, as environments auto-expire):

Endpoint: DELETE /environments/{id}

curl -X DELETE https://a10e.internal-domain/api/environments/550e8400-e29b-41d4-a716-446655440000 \
  -H "Authorization: Bearer YOUR_TOKEN_HERE"

Response:

{
  "success": true
}

The API returns standard HTTP status codes and JSON error responses:

{
  "error": "Environment not found"
}

Common Error Codes:

  • 400 - Bad request (invalid query, malformed data)
  • 401 - Unauthorized (invalid or missing token)
  • 404 - Environment not found (expired or invalid ID)
  • 500 - Server error
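One way to centralize error handling is to map these status codes to an exception. A Python sketch in which the HTTP layer is abstracted away; `ApiError` and `handle_response` are hypothetical helpers, not part of the API:

```python
# Sketch: mapping the documented status codes to an exception.
# handle_response takes an HTTP status code and the parsed JSON body.
class ApiError(Exception):
    pass


def handle_response(status: int, body: dict) -> dict:
    """Raise on the error statuses listed above, otherwise return the body."""
    if status in (400, 401, 404, 500):
        raise ApiError(f"{status}: {body.get('error', 'unknown error')}")
    return body


try:
    handle_response(404, {"error": "Environment not found"})
except ApiError as e:
    print(e)  # 404: Environment not found
```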

Best Practices:

  • Store environment tokens securely
  • Don’t include tokens in URLs or logs
  • Clean up environments when done (or rely on auto-expiry)
  • Upload data in appropriate batch sizes
  • Use WebSocket connections for multiple queries
  • Leverage the built-in analytical query capabilities
  • Validate data after upload
  • Use views to implement data contracts
  • Keep track of table schemas and relationships
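The "validate data after upload" item can be made concrete by checking the row counts in the upload response against the local files. A Python sketch with illustrative data (the response shape is the documented one; the counts are made up):

```python
# Sketch of post-upload validation: compare row counts reported by the upload
# response against counts taken from the local CSV files.
import csv
import io


def local_row_count(csv_text: str) -> int:
    reader = csv.reader(io.StringIO(csv_text))
    next(reader, None)              # skip the header row
    return sum(1 for _ in reader)


# Documented upload-response shape; the data here is illustrative.
upload_resp = {"success": True, "tables": [{"table_name": "users", "row_count": 2}]}
local = {"users": local_row_count("id,name\n1,Ada\n2,Lin\n")}

for t in upload_resp["tables"]:
    assert local[t["table_name"]] == t["row_count"], f"mismatch in {t['table_name']}"
print("row counts match")
```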

Here’s a complete example of creating an environment, uploading data, and analyzing it:

# 1. Create environment
RESPONSE=$(curl -s -X POST https://a10e.internal-domain/api/environments/create)
ENV_ID=$(echo "$RESPONSE" | jq -r '.environment_id')
TOKEN=$(echo "$RESPONSE" | jq -r '.token')
echo "Created environment: $ENV_ID"

# 2. Upload data
curl -X POST "https://a10e.internal-domain/api/environments/$ENV_ID/upload" \
  -H "Authorization: Bearer $TOKEN" \
  -F "files=@sales_data.csv" \
  -F "files=@customer_data.csv"

# 3. Run analysis query
curl -X POST "https://a10e.internal-domain/api/environments/$ENV_ID/query" \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "query": "SELECT c.region, SUM(s.amount) as total_sales FROM customer_data c JOIN sales_data s ON c.id = s.customer_id GROUP BY c.region ORDER BY total_sales DESC"
  }'

# 4. Clean up (optional)
curl -X DELETE "https://a10e.internal-domain/api/environments/$ENV_ID" \
  -H "Authorization: Bearer $TOKEN"

This API provides a powerful foundation for building data applications, AI integrations, and automated data processing workflows on top of ephemeral, secure environments.