
Amazon Bedrock Prompt Management

Amazon Bedrock Prompt Management Cheat Sheet

  • Amazon Bedrock Prompt Management is a centralized system for creating, testing, versioning, and deploying structured generative AI prompts, separating prompt engineering from application code.
  • It offers key capabilities to streamline the GenAI workflow:
    • Structured prompts: Define system instructions, tools, and user messages in a standardized format.
    • Converse and InvokeModel API integration: Invoke cataloged prompts directly in your code using their Amazon Resource Names (ARNs), eliminating the need to hardcode prompts in source files.

 

Core Concepts

The following list introduces the basic concepts of Prompt Management:
  • Prompt:
    • The text or instruction provided to a model to control its output.
  • Variable:
    • A dynamic placeholder (e.g., {{variable_name}}) in a prompt, filled with custom values during testing or runtime (see the sketch after this list).
  • Prompt Variant:
    • An alternate configuration of a prompt (for example, one using a different model or different instructions), used for side-by-side testing and optimization.
  • Prompt Builder:
    • A visual tool in the Bedrock console for creating, editing, and testing prompts and their variants, no manual JSON editing required.
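
To make the Variable and Prompt Variant concepts concrete, here is a minimal sketch of how a templated prompt with one variable could be expressed as a text variant for the CreatePrompt API. The template text and the genre variable name are illustrative, not from the source:

```python
# A minimal sketch of a TEXT prompt variant with one {{variable}}.
# The template text and the "genre" variable are illustrative examples.
variant = {
    "name": "draft-variant",
    "templateType": "TEXT",
    "templateConfiguration": {
        "text": {
            "text": "Recommend three {{genre}} books and briefly explain each choice.",
            # Every {{placeholder}} used above must be declared here.
            "inputVariables": [{"name": "genre"}],
        }
    },
}
```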

 

Supported Regions and Models for Prompt Management

  • Prompt management is supported in the following AWS Regions:
    • ap-northeast-1, ap-northeast-2, ap-northeast-3, ap-south-1, ap-south-2, ap-southeast-1, ap-southeast-2, ca-central-1, eu-central-1, eu-central-2, eu-north-1, eu-south-1, eu-south-2, eu-west-1, eu-west-2, eu-west-3, sa-east-1, us-east-1, us-east-2, us-gov-east-1, us-gov-west-1, us-west-2
  • Prompt management is supported for all text models available through the Converse API.

 

Prerequisites

Before using Prompt Management, ensure you meet the following requirements:
  • Permissions: Your IAM user or role must have the necessary permissions to access Amazon Bedrock Prompt Management (e.g., bedrock:CreatePrompt, bedrock:GetPrompt).
  • Model Access: You must enable access to the foundation models you intend to use. This is configured in the Model access page of the Amazon Bedrock console.

 

Creating a Prompt

When creating a prompt, you define the message, variables, and model configurations.
  • Prompt Message:
    • The textual input that serves as the instruction for the Foundation Model (FM) to generate output.
  • Variables:
    • Dynamic placeholders defined using double curly braces (e.g., {{customer_name}}) that are populated when the prompt is invoked.
  • Model Selection:
    • You can associate a prompt with a specific model to configure inference parameters, or leave it unspecified for use with agents.
  • Inference Parameters:
    • maxTokens: The hard limit on the number of tokens the model generates in its response.
    • stopSequences: A defined list of character sequences that, when generated, force the model to immediately stop generating further text.
    • temperature: Controls the “creativity” or randomness of the output. Higher values increase the likelihood of selecting lower-probability options.
    • topP: A sampling technique (nucleus sampling) that limits the model’s token choices to the top percentage of most likely candidates.
    • Additional Fields: JSON objects used to specify model-specific parameters not covered by the base list.
  • Console Steps

    1. Navigate to Prompt management in the Amazon Bedrock console.
    2. Select Create prompt.
    3. Enter a name and optional description, then choose Create.
    4. The prompt is created and ready for editing in the Prompt Builder.
  • API Steps

    • Send a CreatePrompt request using the build-time endpoint.
    • Include the required fields such as name and optional fields like description or variants.
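
As a rough sketch, a CreatePrompt call with boto3 (the AWS SDK for Python) might look like the following; the prompt name, template text, and model ID are placeholders, not values from the source:

```python
import boto3

# Build-time operations live on the "bedrock-agent" client.
client = boto3.client("bedrock-agent", region_name="us-east-1")

response = client.create_prompt(
    name="book-recommender",                    # required
    description="Recommends books by genre.",   # optional
    variants=[                                  # optional; defines the prompt content
        {
            "name": "variant-one",
            "templateType": "TEXT",
            "templateConfiguration": {
                "text": {
                    "text": "Recommend three {{genre}} books.",
                    "inputVariables": [{"name": "genre"}],
                }
            },
            "modelId": "anthropic.claude-3-haiku-20240307-v1:0",
            "inferenceConfiguration": {
                "text": {"temperature": 0.7, "topP": 0.9, "maxTokens": 512}
            },
        }
    ],
    defaultVariant="variant-one",
)

print(response["id"], response["arn"])  # identifiers used by later calls
```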

 

Viewing Prompt Information

  • Access details regarding your prompt’s metadata, drafts, and version history.
  • Console Steps

    1. Open the Amazon Bedrock console and select Prompt management.
    2. Choose a prompt from the Prompts list.
    3. Review the Overview (creation/update dates), Prompt draft (current configuration), and Prompt versions (deployment history) sections.
  • API Steps

    • Get Details: Send a GetPrompt request specifying the prompt’s ARN or ID as the promptIdentifier. To see a specific version, populate the promptVersion field.
    • List All: Send a ListPrompts request. Use maxResults to limit the return count and nextToken to paginate through results.
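
A minimal sketch of both calls with boto3, using a placeholder prompt ID:

```python
import boto3

client = boto3.client("bedrock-agent", region_name="us-east-1")

# Fetch one prompt; omit promptVersion to get the working draft.
prompt = client.get_prompt(
    promptIdentifier="PROMPT12345",  # placeholder prompt ID or ARN
    # promptVersion="1",             # uncomment to inspect a specific version
)
print(prompt["name"], prompt.get("version"))

# Page through all prompts in the account and Region.
token = None
while True:
    kwargs = {"maxResults": 10}
    if token:
        kwargs["nextToken"] = token
    page = client.list_prompts(**kwargs)
    for summary in page["promptSummaries"]:
        print(summary["id"], summary["name"], summary["version"])
    token = page.get("nextToken")
    if not token:
        break
```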

 

Modifying a Prompt

  • Update your prompt’s metadata, content, or inference configurations.
  • Console Steps

    • In Prompt management, select the specific prompt you wish to edit.
    • Edit Metadata: Choose Edit in the Overview section to change the Name or Description, then Save.
    • Edit Content: Choose Edit in prompt builder to modify the message, variables, or model parameters.
  • API Steps

    • Send an UpdatePrompt request to the build-time endpoint.
      • Note: UpdatePrompt overwrites the prompt’s existing configuration, so you must include every field you want to keep as well as the fields you are changing.
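
A rough sketch of the read-modify-write pattern this implies, assuming the prompt already has variants and a default variant (the prompt ID is a placeholder):

```python
import boto3

client = boto3.client("bedrock-agent", region_name="us-east-1")

# UpdatePrompt replaces the configuration, so read the current draft
# first and send it back with only the desired change applied.
current = client.get_prompt(promptIdentifier="PROMPT12345")  # placeholder ID

client.update_prompt(
    promptIdentifier="PROMPT12345",
    name=current["name"],                  # required even when unchanged
    description="Updated description.",    # the field actually being changed
    defaultVariant=current["defaultVariant"],
    variants=current["variants"],          # resent so the content is kept
)
```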

 

Testing a Prompt

  • Validate the behavior of your prompt using real-time inference before creating a version.
  • Console Steps

    1. Select a prompt and choose Edit in Prompt builder (for drafts) or select a specific Version.
    2. Configure Variables: If your prompt uses {{variables}}, enter temporary Test values in the Test variables pane (these values are not saved).
    3. Run Inference: Choose Run in the Test window to generate a response.
    4. Iterate: Modify configurations and re-run as needed. If satisfied, choose Create version to snapshot the prompt for production.
  • API Steps

    • Run Inference: Send a request to InvokeModel, Converse, or ConverseStream using the prompt’s ARN as the modelId.
      • Restriction: You cannot override inferenceConfig, system, or toolConfig during this call.
      • Restriction: Messages included in the call are appended after the prompt’s defined messages.
    • Test in Flow: Create a flow with a prompt node pointing to the prompt ARN, then use InvokeFlow.
    • Test with Agent: Use InvokeAgent and pass the prompt text into the inputText field.
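
A minimal sketch of the Run Inference path with boto3; the account ID and prompt ID are placeholders, and the genre variable assumes the prompt defines a {{genre}} placeholder:

```python
import boto3

# Runtime inference goes through the "bedrock-runtime" client.
runtime = boto3.client("bedrock-runtime", region_name="us-east-1")

response = runtime.converse(
    # The prompt's ARN stands in for a model ID; append ":<version>"
    # to target a specific version instead of the draft.
    modelId="arn:aws:bedrock:us-east-1:111122223333:prompt/PROMPT12345",
    # Fill the prompt's {{variables}} at invocation time.
    promptVariables={"genre": {"text": "science fiction"}},
)

print(response["output"]["message"]["content"][0]["text"])
```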

 

Optimizing a Prompt

  • Automatically rewrite prompts to improve performance and output quality for a specific model.
  • Console Steps

    1. In the Prompt builder or playground, write your initial prompt and select a model.
    2. Select the Optimize (wand icon) button.
    3. Compare: View the original and optimized prompts side-by-side.
    4. Select: Choose Replace original prompt (or “Use optimized prompt”) to accept the changes, or exit to keep your original.
  • API Steps

    • Send an OptimizePrompt request to the Agents for Amazon Bedrock runtime endpoint, providing the input prompt object and the targetModelId. The response streams an analyzePromptEvent followed by an optimizedPromptEvent containing the rewritten text.
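
A sketch of the call and of reading the event stream with boto3; the prompt text and target model ID are placeholders:

```python
import boto3

# OptimizePrompt is served by the Agents for Amazon Bedrock runtime client.
runtime = boto3.client("bedrock-agent-runtime", region_name="us-east-1")

response = runtime.optimize_prompt(
    input={"textPrompt": {"text": "Tell me about books."}},  # illustrative prompt
    targetModelId="anthropic.claude-3-haiku-20240307-v1:0",  # placeholder model
)

# The result arrives as an event stream: an analysis event,
# then the rewritten prompt.
for event in response["optimizedPrompt"]:
    if "analyzePromptEvent" in event:
        print("Analysis:", event["analyzePromptEvent"]["message"])
    elif "optimizedPromptEvent" in event:
        optimized = event["optimizedPromptEvent"]["optimizedPrompt"]
        print("Optimized:", optimized["textPrompt"]["text"])
```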

 

Application Deployment using Versions

  • To deploy a prompt to production, you must create a version.
  • A version is an immutable snapshot of your prompt taken at a specific point in time, allowing you to switch between configurations safely.
  • Create a version:

    • Save a snapshot of your current draft to stabilize it for use in applications.
  • View information about versions:

    • Access the history of all created snapshots to track changes over time.
  • Compare versions:

    • Analyze the differences between two versions (or a version and the draft) to see exactly what changed between them.
  • Delete a version:

    • Remove specific snapshots that are no longer needed (ensure they are not in production use).
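
Creating a version is a single API call; a minimal sketch with boto3 (the prompt ID is a placeholder):

```python
import boto3

client = boto3.client("bedrock-agent", region_name="us-east-1")

# Snapshot the current draft as an immutable, numbered version.
version = client.create_prompt_version(
    promptIdentifier="PROMPT12345",              # placeholder prompt ID or ARN
    description="Stable snapshot for production use.",
)
print(version["version"], version["arn"])        # e.g., "1" and a versioned ARN
```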

 

Deleting a Prompt

  • Permanently remove a prompt and all its associated versions.
  • Console Steps

    1. In Prompt management, select the prompt you want to remove.
    2. Choose Delete.
    3. Type confirm in the warning dialog (acknowledging that runtime errors may occur if resources still use this prompt) and choose Delete.
  • API Steps

    • Send a DeletePrompt request specifying the prompt ARN/ID. To delete only a specific version, populate the promptVersion field.
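
A minimal sketch of both deletion modes with boto3 (placeholder prompt ID):

```python
import boto3

client = boto3.client("bedrock-agent", region_name="us-east-1")

# Remove one version only.
client.delete_prompt(promptIdentifier="PROMPT12345", promptVersion="1")

# Or remove the prompt together with all of its versions.
client.delete_prompt(promptIdentifier="PROMPT12345")
```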

 

Security

  • IAM Permissions:
    • Access to create, edit, and invoke prompts is controlled via AWS Identity and Access Management (IAM) policies.
  • Encryption:
    • Prompts and their versions are encrypted at rest using AWS KMS keys.
  • Guardrails Integration:
    • You can attach Amazon Bedrock Guardrails to your prompts to enforce safety policies, content filtering, and redaction of sensitive information during testing and inference.
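
If a customer managed KMS key is required instead of the default AWS managed key, CreatePrompt accepts a key ARN at creation time; a minimal sketch, with the prompt name and key ARN as placeholders:

```python
import boto3

client = boto3.client("bedrock-agent", region_name="us-east-1")

# Sketch: encrypt a prompt at rest with a customer managed KMS key.
client.create_prompt(
    name="secure-prompt",  # placeholder name
    customerEncryptionKeyArn=(
        "arn:aws:kms:us-east-1:111122223333:"
        "key/1234abcd-12ab-34cd-56ef-1234567890ab"
    ),
)
```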

 

Best Practices

  • Use Variables:
    • Always use {{variables}} for dynamic input. This helps mitigate prompt injection risks and keeps prompts reusable across different contexts.
  • Version Control:
    • Never use the DRAFT version in production. Always create immutable numbered versions (e.g., v1, v2) to ensure stability.
  • Model Specificity:
    • Prompts are often model-specific. If switching from Claude to Llama, create a new variant tailored to the new model’s specific prompt engineering patterns.
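
As a sketch of the variant-per-model practice above, one prompt can carry a tuned variant for each target model; the names and model IDs below are illustrative:

```python
import boto3

client = boto3.client("bedrock-agent", region_name="us-east-1")

# Shared template; each variant can diverge as the models' prompt
# engineering patterns diverge.
template = {
    "text": {
        "text": "Summarize the following text:\n{{document}}",
        "inputVariables": [{"name": "document"}],
    }
}

client.create_prompt(
    name="summarizer",
    variants=[
        {
            "name": "claude-variant",
            "templateType": "TEXT",
            "templateConfiguration": template,
            "modelId": "anthropic.claude-3-haiku-20240307-v1:0",
        },
        {
            "name": "llama-variant",
            "templateType": "TEXT",
            "templateConfiguration": template,
            "modelId": "meta.llama3-70b-instruct-v1:0",
        },
    ],
    defaultVariant="claude-variant",
)
```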

Pricing

  • Management:
    • There is generally no additional charge for storing and organizing prompts in the management library.
  • Testing & Optimization:
    • You are charged standard On-Demand model inference fees (based on input/output tokens) whenever you run a prompt in the console for testing, comparison, or optimization.
  • Prompt Flows:
    • If you use prompts within Amazon Bedrock Flows, you are charged per node transition.

 

Amazon Bedrock Prompt Management References:

https://aws.amazon.com/bedrock/prompt-management/

https://aws.amazon.com/blogs/machine-learning/amazon-bedrock-prompt-management-is-now-available-in-ga/

https://docs.aws.amazon.com/bedrock/latest/userguide/prompt-management.html

https://aws.amazon.com/bedrock/pricing/

Written by: Cristieneil Ceballos

Cristieneil Ceballos, “Cris” for short, is a Computer Science student at the University of the Philippines Mindanao and an IT Intern at Tutorials Dojo. Passionate about continuous learning, she volunteers and engages with various tech communities—viewing each experience as both a chance to contribute and an opportunity to explore areas she’s interested in.
