
This guide covers using Eden AI’s Prompt Manager for efficient LLM prompt management. It walks through creating, testing, and optimizing prompts with API integration via FastAPI, including features like version control, A/B testing, and error handling.
As large language models (LLMs) become integral to products across industries, prompt engineering has emerged as a critical discipline. But managing prompts isn't just about crafting clever sentences; it involves iteration, testing, version control, and performance tracking.
In this blog post, we’ll delve into Eden AI’s Prompt Manager, a tool designed to streamline the entire lifecycle of prompts, including creation, A/B testing, and API-based execution.
Whether you're an engineer, data scientist, or prompt enthusiast, this detailed walkthrough will empower you to manage prompts like a pro.
For a hands-on learning experience, you can also watch the tutorial on the Eden AI YouTube channel, where Krishna, our Developer Advocate, walks you through it step by step so you can follow along as you build and test your own prompt management API.
We’ll walk you through:
- Why structured prompt management matters
- What Eden AI’s Prompt Manager offers
- Setting up a FastAPI proxy service
- Creating, testing, updating, and deleting prompts via API
- Version control, A/B testing, and error handling
Prompt engineering is both art and science. But even well-written prompts can underperform in real applications. Without a structured way to manage them, teams often:
- Lose track of which prompt version is running in production
- Hardcode prompts into application code, making every change risky
- Lack a systematic way to test and compare prompt performance
Eden AI solves these challenges with a Prompt Manager that acts like GitHub for prompts, providing creation, storage, testing, and analytics tools, all accessible via UI and API.
Eden AI’s Prompt Manager allows you to:
- Create and store prompts with reusable template variables
- Test prompts by substituting values and running them against LLM providers
- Maintain a version history for each prompt, enabling A/B testing
- Update or delete prompts as your use cases evolve
You can use it visually (via their dashboard) or programmatically using an API — the latter being ideal for automated testing, CI/CD pipelines, or prompt orchestration engines.
The demo code provided in the video sets up a FastAPI proxy service that interfaces with Eden AI’s backend. This is useful for abstracting credentials and routing requests from your application logic.
First, ensure you have a .env file with your Eden AI API key:
The code loads this securely using python-dotenv:
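A minimal sketch of that setup (the variable name `EDEN_AI_API_KEY` is an assumption; use whatever name your `.env` file defines):

```python
import os

try:
    from dotenv import load_dotenv  # pip install python-dotenv
    load_dotenv()  # reads key=value pairs from the local .env file
except ImportError:
    pass  # fall back to variables already exported in the shell

# The .env file is expected to contain a line like:
# EDEN_AI_API_KEY=your_api_key_here
EDEN_AI_API_KEY = os.getenv("EDEN_AI_API_KEY", "")
HEADERS = {"Authorization": f"Bearer {EDEN_AI_API_KEY}"}
```

Keeping the key in `.env` (and out of version control) means the FastAPI proxy can attach it to every outbound request without ever exposing it to clients.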
Now, let’s break down the full functionality implemented in custom_prompts.py.
You can retrieve all prompts stored in your Eden AI dashboard with pagination:
This endpoint proxies GET requests to:
Useful for listing available versions or prompts used across different use cases.
To define a new prompt with template variables, send a POST request to:
Request body example:
This allows structured, reusable prompts with placeholders (e.g. {{text}}).
Fetch any individual prompt by name:
This enables loading the full template, including the version history and metadata.
Once a prompt is defined, test it by sending values for its variables:
Example body:
Eden AI will substitute the variable and send it to the chosen LLM provider, returning the response.
Prompts can be updated using either a full replacement (PUT) or a partial update (PATCH). Both endpoints are routed to Eden AI’s API, maintaining synchronization between your local service and Eden AI’s prompt store.
Remove a prompt when it is deprecated or no longer needed:
Returns a 204 No Content response on success.
Version control is a key advantage of Eden AI’s prompt manager. Each time you modify a prompt and want to test performance, you can create a versioned history entry:
This allows you to:
- Compare the performance of different prompt versions (A/B testing)
- Roll back to an earlier version if a change underperforms
- Keep an audit trail of how each prompt evolved
You can retrieve, update, or delete these history entries using:
You can even extract template variables from any version with:
All endpoints include robust error handling to:
- Surface Eden AI’s error codes and messages to the caller
- Catch network failures and timeouts before they crash the service
- Return consistent HTTP status codes from the proxy layer
For example, if Eden AI returns an error, the proxy forwards the upstream status code and message instead of failing opaquely. This ensures the FastAPI layer remains clean and stable.
To run the application locally, install the dependencies (`fastapi`, `uvicorn`, `httpx`, `python-dotenv`) and start the server with `uvicorn custom_prompts:app --reload`.
Once live, you can use Postman or curl to test all endpoints. Better yet, hook it into your application to dynamically fetch and execute prompts from Eden AI.
Eden AI’s Prompt Manager is not just a storage solution; it’s a full lifecycle platform for LLM prompt optimization. By combining a powerful UI with a flexible API, it supports scalable, collaborative, and performance-driven prompt engineering.
With the FastAPI proxy, you can:
- Keep your Eden AI credentials server-side instead of exposing them to client code
- Fetch and execute prompts dynamically from your application logic
- Plug prompt testing into automated workflows and CI/CD pipelines
Whether you're building a chatbot, summarization tool, or content generator, prompt management will be crucial to quality and consistency, and Eden AI provides one of the most developer-friendly solutions in the space.
You can directly start building now. If you have any questions, feel free to chat with us!