# AWS Lambda Deployment
Deploy Fair Forge generators and runners as AWS Lambda functions for serverless execution.

## Available Lambda Functions
| Function | Purpose | Endpoint |
|---|---|---|
| Generators | Generate test datasets from context | POST /run |
| Runners | Execute tests against AI systems | POST /run |
## Generators Lambda
Generate synthetic test datasets from markdown content using any supported LLM.

### Supported LLM Providers
| Provider | class_path |
|---|---|
| Groq | langchain_groq.chat_models.ChatGroq |
| OpenAI | langchain_openai.chat_models.ChatOpenAI |
| Google Gemini | langchain_google_genai.chat_models.ChatGoogleGenerativeAI |
| Ollama | langchain_ollama.chat_models.ChatOllama |
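The provider is selected by its LangChain `class_path`. As a hedged sketch (the surrounding field names `"config"` and the model/key values are assumptions, not confirmed by the Fair Forge API; only the `class_path` values come from the table above), a provider block might look like:

```python
# Illustrative provider selection by class_path.
# "config" contents (model name, api_key) are assumed, provider-specific fields.
llm_config = {
    "class_path": "langchain_groq.chat_models.ChatGroq",  # any class_path from the table
    "config": {
        "model": "llama-3.1-8b-instant",   # example Groq model name (assumed)
        "api_key": "<GROQ_API_KEY>",
    },
}
```

Swapping providers is then just a matter of changing `class_path` and the provider-specific `config` keys.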
### Request Format

#### Configuration Parameters
| Parameter | Type | Default | Description |
|---|---|---|---|
| assistant_id | str | Required | ID for the generated dataset |
| num_queries | int | 3 | Questions per chunk |
| language | str | "english" | Language for generation |
| conversation_mode | bool | false | Generate conversations |
| max_chunk_size | int | 2000 | Max chars per chunk |
| min_chunk_size | int | 200 | Min chars per chunk |
| seed_examples | list[str] | null | Example questions for style |
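Putting the parameters above together, a generator request body might look like the following sketch. Only the keys under `"config"` come from the table; the top-level field names (`"content"`, `"llm"`, `"config"`) are assumptions for illustration.

```python
import json

# Hedged sketch of a POST /run body for the generators Lambda.
# Top-level field names are assumed; "config" keys match the table above.
request_body = {
    "content": "# Product FAQ\n\nOur service supports ...",  # markdown source (assumed field)
    "llm": {
        "class_path": "langchain_openai.chat_models.ChatOpenAI",
        "config": {"model": "gpt-4o-mini", "api_key": "<OPENAI_API_KEY>"},
    },
    "config": {
        "assistant_id": "faq-assistant",
        "num_queries": 3,
        "language": "english",
        "conversation_mode": False,
        "max_chunk_size": 2000,
        "min_chunk_size": 200,
        "seed_examples": None,
    },
}

payload = json.dumps(request_body)
```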
### Example: Using Groq
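A hypothetical Groq request, using the `ChatGroq` class_path from the provider table. Field names beyond the documented config parameters, the model name, and `seed_examples` content are illustrative assumptions.

```python
# Hypothetical generator request targeting Groq.
groq_request = {
    "content": "# Product FAQ\n\n...",  # markdown source (assumed field name)
    "llm": {
        "class_path": "langchain_groq.chat_models.ChatGroq",
        "config": {"model": "llama-3.1-8b-instant", "api_key": "<GROQ_API_KEY>"},
    },
    "config": {
        "assistant_id": "faq-groq",
        "num_queries": 3,
        "seed_examples": ["How do I reset my password?"],  # optional style guidance
    },
}
```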
### Example: Using OpenAI
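The same request shape with the `ChatOpenAI` class_path swapped in. As above, the top-level field names and model name are assumptions; only the `class_path` and config keys come from this document.

```python
# Hypothetical generator request targeting OpenAI.
openai_request = {
    "content": "# Product FAQ\n\n...",  # markdown source (assumed field name)
    "llm": {
        "class_path": "langchain_openai.chat_models.ChatOpenAI",
        "config": {"model": "gpt-4o-mini", "api_key": "<OPENAI_API_KEY>"},
    },
    "config": {"assistant_id": "faq-openai", "num_queries": 5, "language": "english"},
}
```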
### Response Format
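The response schema is not spelled out in this document; the following is only a guess at a plausible shape (the dataset ID echoed back plus the generated question/answer pairs). Every field name here is hypothetical.

```python
# Guess at a plausible generator response shape; not a confirmed schema.
example_response = {
    "assistant_id": "faq-assistant",
    "queries": [
        {"chunk": "...", "question": "...", "expected_answer": "..."},
    ],
}
```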
## Runners Lambda
Execute test datasets against AI systems.

### Modes
- **LLM Mode**: Direct execution against any LangChain-compatible LLM
- **Alquimia Mode**: Execution against Alquimia AI agents

### LLM Mode Request
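A hedged sketch of an LLM-mode runner request. Only the LangChain `class_path` convention carries over from the generator section; the other field names (`"mode"`, `"dataset_id"`) are assumptions.

```python
# Hypothetical LLM-mode runner request body.
llm_mode_request = {
    "mode": "llm",  # assumed mode selector
    "llm": {
        "class_path": "langchain_ollama.chat_models.ChatOllama",
        "config": {"model": "llama3.1", "base_url": "http://localhost:11434"},
    },
    "dataset_id": "faq-assistant",  # dataset produced by the generators Lambda (assumed field)
}
```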
### Alquimia Mode Request
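The Alquimia-mode schema is not documented here, so every field below (`"agent_id"`, `"base_url"`, `"api_key"`) is a guess at what targeting an Alquimia AI agent might require.

```python
# Heavily hedged sketch of an Alquimia-mode runner request; all field names assumed.
alquimia_request = {
    "mode": "alquimia",
    "alquimia": {
        "base_url": "https://alquimia.example.com",  # placeholder endpoint
        "agent_id": "<AGENT_ID>",
        "api_key": "<ALQUIMIA_API_KEY>",
    },
    "dataset_id": "faq-assistant",
}
```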
### Example: LLM Mode
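End to end, invoking the runner is a `POST /run` with a JSON body. This sketch uses only the Python standard library; the URL is a placeholder for your deployed endpoint, and the body field names are the same assumptions as above.

```python
import json
import urllib.request

# Placeholder: replace with your API Gateway or Lambda function URL.
URL = "https://example.execute-api.us-east-1.amazonaws.com/run"

body = json.dumps({
    "mode": "llm",  # assumed field names throughout
    "llm": {
        "class_path": "langchain_groq.chat_models.ChatGroq",
        "config": {"model": "llama-3.1-8b-instant", "api_key": "<GROQ_API_KEY>"},
    },
    "dataset_id": "faq-assistant",
}).encode()

req = urllib.request.Request(
    URL, data=body,
    headers={"Content-Type": "application/json"},
    method="POST",
)
# response = urllib.request.urlopen(req)  # uncomment to actually invoke
```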
### Response Format
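As with the generator response, the runner's response schema is not shown in this document; the shape below (per-query results pairing expected and actual answers) is only a plausible guess, with every field name hypothetical.

```python
# Guess at a plausible runner response shape; not a confirmed schema.
runner_response = {
    "dataset_id": "faq-assistant",
    "results": [
        {"question": "...", "expected_answer": "...", "actual_answer": "..."},
    ],
}
```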
## Deployment

### Prerequisites
- AWS CLI configured
- Docker installed
- AWS ECR repository access
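With those prerequisites in place, a container-image deployment typically follows the standard ECR push plus `aws lambda create-function` flow. This is a sketch under assumptions: the repository name, function name, IAM role, and the presence of a Dockerfile for the Fair Forge image are placeholders, not documented defaults.

```shell
# Placeholders: substitute your own account, region, repo, and role.
AWS_ACCOUNT=123456789012
REGION=us-east-1
REPO=fair-forge-generators

# Authenticate Docker against ECR.
aws ecr get-login-password --region "$REGION" \
  | docker login --username AWS --password-stdin "$AWS_ACCOUNT.dkr.ecr.$REGION.amazonaws.com"

# Build, tag, and push the Lambda container image (assumes a Dockerfile in the current directory).
docker build -t "$REPO" .
docker tag "$REPO:latest" "$AWS_ACCOUNT.dkr.ecr.$REGION.amazonaws.com/$REPO:latest"
docker push "$AWS_ACCOUNT.dkr.ecr.$REGION.amazonaws.com/$REPO:latest"

# Create the Lambda function from the pushed image.
aws lambda create-function \
  --function-name fair-forge-generators \
  --package-type Image \
  --code ImageUri="$AWS_ACCOUNT.dkr.ecr.$REGION.amazonaws.com/$REPO:latest" \
  --role "arn:aws:iam::$AWS_ACCOUNT:role/lambda-exec-role"
```

Repeat the same flow with the runners image to deploy the second function.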