
Mastering Prompt Management: A Comprehensive Guide to Building, Testing, and Optimizing LLM Prompts

This guide covers using Eden AI’s Prompt Manager for efficient LLM prompt management. It walks through creating, testing, and optimizing prompts with API integration via FastAPI, including features like version control, A/B testing, and error handling.


As large language models (LLMs) become integral to products across industries, prompt engineering has emerged as a critical discipline. But managing prompts isn't just about crafting clever sentences: it involves iteration, testing, version control, and performance tracking.

In this blog post, we’ll delve into Eden AI’s Prompt Manager, a tool designed to streamline the entire lifecycle of prompts, including creation, A/B testing, and API-based execution.

Whether you're an engineer, data scientist, or prompt enthusiast, this detailed walkthrough will empower you to manage prompts like a pro.

For a hands-on learning experience, you can also watch the video on the Eden AI YouTube channel, where Krishna, our Developer Advocate, walks you through this tutorial step by step so you can follow along as you build and test your own prompt management API.

We’ll walk you through:

  1. Why prompt management matters
  2. How Eden AI's Prompt Manager works
  3. Setting up a FastAPI service to manage and test prompts
  4. Full API walkthrough: create, list, retrieve, execute, update, and delete prompts
  5. Version history and A/B testing for optimization

Why Prompt Management Matters

Prompt engineering is both art and science. But even well-written prompts can underperform in real applications. Without a structured way to manage them, teams often:

  • Hardcode prompts into backends
  • Lose track of which versions are deployed
  • Lack performance metrics or version control
  • Struggle with collaboration and iteration

Eden AI solves these challenges with a Prompt Manager that acts like GitHub for prompts, providing creation, storage, testing, and analytics tools, all accessible via UI and API.

Overview of Eden AI’s Prompt Manager

Eden AI’s Prompt Manager allows you to:

  • Create and version prompts with metadata
  • Test prompts in real-time using multiple LLM providers
  • Monitor performance across versions
  • Integrate via API for use in production apps

You can use it visually (via the dashboard) or programmatically through the API, with the latter being ideal for automated testing, CI/CD pipelines, or prompt orchestration engines.

Building a Prompt Management API with FastAPI

The demo code provided in the video sets up a FastAPI proxy service that interfaces with Eden AI’s backend. This is useful for abstracting credentials and routing requests from your application logic.

Step 1: Environment Setup

First, ensure you have a .env file with your Eden AI API key:


EDEN_AI_API_KEY=your_api_key_here

The code loads this securely using python-dotenv:


import os

from dotenv import load_dotenv

load_dotenv()
EDEN_AI_API_KEY = os.getenv("EDEN_AI_API_KEY")
if not EDEN_AI_API_KEY:
    raise ValueError("EDEN_AI_API_KEY environment variable not set.")
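
With the key in place, the rest of the module can initialize the FastAPI app and define the headers sent with every outbound request. The sketch below continues the snippet above and is an assumption about how the proxy is wired up: the BASE_URL constant, the HEADERS dict, and the Bearer-style Authorization header are illustrative, so check the linked repo and Eden AI's API docs for the exact setup.


import requests
from fastapi import FastAPI, HTTPException, Request

app = FastAPI(title="Prompt Management Proxy")

# Base URL for Eden AI's prompt endpoints, reused by every route below
BASE_URL = "https://api.edenai.run/v2/prompts/"

# Headers attached to every request forwarded to Eden AI
# (the Bearer scheme is an assumption; confirm against Eden AI's API docs)
HEADERS = {
    "Authorization": f"Bearer {EDEN_AI_API_KEY}",
    "Content-Type": "application/json",
}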

Core Features and Endpoints

Now, let’s break down the full functionality implemented in custom_prompts.py.

1. List All Prompts

You can retrieve all prompts stored in your Eden AI dashboard with pagination:


@app.get("/v2/prompts/")
async def list_prompts(page: int = 1, page_size: int = 10):

This endpoint proxies GET requests to:


GET https://api.edenai.run/v2/prompts/?page=1&page_size=10

This is useful for seeing which prompts, and which versions, are available across your different use cases.
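
A minimal sketch of how this route can forward the request is shown below. It uses a small helper that the remaining sketches in this post will reuse; the helper is illustrative (the repo may implement each route inline with its own try/except) and relies on the BASE_URL and HEADERS assumptions from the setup sketch.


def forward(method: str, path: str = "", **kwargs):
    """Send a request to Eden AI and translate failures into HTTPExceptions."""
    try:
        response = requests.request(
            method, f"{BASE_URL}{path}", headers=HEADERS, timeout=30, **kwargs
        )
        response.raise_for_status()
        # DELETE returns 204 No Content, which carries no JSON body
        return response.json() if response.content else None
    except requests.exceptions.RequestException as e:
        status_code = e.response.status_code if e.response is not None else 500
        raise HTTPException(status_code=status_code, detail=str(e))


@app.get("/v2/prompts/")
async def list_prompts(page: int = 1, page_size: int = 10):
    """Forward a paginated list request to Eden AI and return its JSON."""
    return forward("GET", params={"page": page, "page_size": page_size})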

2. Create a New Prompt

To define a new prompt with template variables, send a POST request to:


@app.post("/v2/prompts/")
async def create_prompt(request: Request):

Request body example:


{
  "name": "extract_keywords",
  "provider": "openai",
  "template": "Extract keywords from this text: {{text}}"
}

This allows structured, reusable prompts with placeholders (e.g. {{text}}).
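
On the FastAPI side, the create route only has to read the incoming JSON and pass it along. A minimal sketch, reusing the illustrative forward helper from above:


@app.post("/v2/prompts/")
async def create_prompt(request: Request):
    """Read the caller's JSON body and forward it to Eden AI unchanged."""
    payload = await request.json()
    return forward("POST", json=payload)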

3. Retrieve a Specific Prompt

Fetch any individual prompt by name:


@app.get("/v2/prompts/{name}/")

This enables loading the full template, including the version history and metadata.
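
The corresponding proxy route is a thin wrapper, sketched below with the same assumed forward helper:


@app.get("/v2/prompts/{name}/")
async def get_prompt(name: str):
    """Fetch a single prompt, including its metadata and version history."""
    return forward("GET", f"{name}/")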

4. Execute (Call) a Prompt

Once a prompt is defined, test it by sending values for its variables:


@app.post("/v2/prompts/{name}/")

Example body:


{
  "variables": {
    "text": "Eden AI is revolutionizing prompt management."
  }
}

Eden AI will substitute the variable and send it to the chosen LLM provider, returning the response.
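
With the proxy running locally (see "Running the API" below), you can exercise this endpoint from a few lines of Python. The localhost URL, port, and prompt name are the illustrative values used throughout this post; inspect the returned JSON to see exactly where the generated text lives.


import requests

resp = requests.post(
    "http://localhost:8000/v2/prompts/extract_keywords/",
    json={"variables": {"text": "Eden AI is revolutionizing prompt management."}},
    timeout=60,
)
resp.raise_for_status()
print(resp.json())  # the provider's completion plus whatever metadata Eden AI returns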

5. Update or Patch a Prompt

Prompts can be updated using either:

  • PUT (for full replacement) or
  • PATCH (for partial updates)

Both endpoints:


@app.put("/v2/prompts/{name}/")
@app.patch("/v2/prompts/{name}/")

are routed to Eden AI’s API, maintaining synchronization between your local service and Eden AI’s prompt store.
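
For example, a PATCH through the local proxy only needs to carry the fields you want to change. The body below reuses the template field from the create example and is illustrative rather than the complete update schema:


import requests

resp = requests.patch(
    "http://localhost:8000/v2/prompts/extract_keywords/",
    json={"template": "List the five most important keywords in: {{text}}"},
    timeout=30,
)
resp.raise_for_status()
print(resp.json())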

6. Delete a Prompt

Remove a prompt when it is deprecated or no longer needed:


@app.delete("/v2/prompts/{name}/")

Returns a 204 No Content response on success.
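
Returning 204 from FastAPI means sending back an empty response rather than JSON. One way to sketch the route, again assuming the forward helper from earlier:


from fastapi import Response

@app.delete("/v2/prompts/{name}/", status_code=204)
async def delete_prompt(name: str):
    """Delete the prompt on Eden AI and mirror the empty 204 response."""
    forward("DELETE", f"{name}/")
    return Response(status_code=204)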

Advanced Feature: Prompt Version History and A/B Testing

Version control is a key advantage of Eden AI’s prompt manager. Each time you modify a prompt and want to test performance, you can create a versioned history entry:


@app.post("/v2/prompts/{name}/history/")

This allows you to:

  • Track performance per version
  • A/B test variants in production
  • Roll back underperforming prompts

You can retrieve, update, or delete these history entries using:


@app.get("/v2/prompts/{name}/history/")
@app.get("/v2/prompts/{name}/history/{id}/")
@app.put("/v2/prompts/{name}/history/{id}/")
@app.delete("/v2/prompts/{name}/history/{id}/")

You can even extract template variables from any version with:


@app.get("/v2/prompts/{name}/history/{id}/template-variables/")

Error Handling and Security

All endpoints include robust error handling to:

  • Return appropriate status codes (400, 500, etc.)
  • Provide user-friendly error messages
  • Catch JSON decoding issues and request failures

For example, if Eden AI returns an error:


except requests.exceptions.RequestException as e:
    # A 4xx/5xx Response is falsy in requests, so check "is not None"
    status_code = e.response.status_code if e.response is not None else 500
    try:
        error_detail = e.response.json() if e.response is not None else str(e)
    except ValueError:  # non-JSON error body
        error_detail = e.response.text
    raise HTTPException(status_code=status_code, detail=error_detail)

This ensures the FastAPI layer remains clean and stable.

Running the API

To run the application locally:


uvicorn custom_prompts:app --reload

Once it's live, you can use Postman or curl to test all endpoints. Better yet, hook it into your application to dynamically fetch and execute prompts from Eden AI.
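
As a quick smoke test, the short script below walks one prompt through its lifecycle against the local proxy; the prompt name, template, and port are the illustrative values used throughout this guide.


import requests

BASE = "http://localhost:8000/v2/prompts/"

# 1. Create a prompt with a {{text}} placeholder
requests.post(BASE, json={
    "name": "extract_keywords",
    "provider": "openai",
    "template": "Extract keywords from this text: {{text}}",
}).raise_for_status()

# 2. Execute it with a concrete variable value
result = requests.post(BASE + "extract_keywords/", json={
    "variables": {"text": "Eden AI is revolutionizing prompt management."}
})
result.raise_for_status()
print(result.json())

# 3. Clean up when the test is done
requests.delete(BASE + "extract_keywords/").raise_for_status()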

Final Thoughts

Eden AI’s Prompt Manager is not just a storage solution; it’s a full lifecycle platform for LLM prompt optimization. By combining a powerful UI with a flexible API, it supports scalable, collaborative, and performance-driven prompt engineering.

With the FastAPI proxy, you can:

  • Seamlessly integrate prompt management into your backend
  • Track and test prompt variants in production
  • Manage updates without service interruptions

Whether you're building a chatbot, summarization tool, or content generator, prompt management will be crucial to quality and consistency, and Eden AI provides one of the most developer-friendly solutions in the space.

Resources

  • 📄 Explore the Documentation
    Access the full Prompt Manager API documentation to understand every available feature and parameter: Prompt Management Documentation
  • 💻 View the Full Demo Code
    Want to see how everything works under the hood? Check out the complete FastAPI implementation in our open-source repo: custom_prompts.py on GitHub
  • 🚀 Start Using Prompt Management Today
    Experience the power of structured, version-controlled, API-driven prompt workflows, no setup required: Try Eden AI Prompt Manager