
Multilingual Voice App

# Crafting the Ultimate Retell System Prompt for Your Multilingual n8n Voice App

So, you're building a voice AI agent with **Retell AI** and orchestrating the backend with **n8n**. You've got the webhooks set up, the nodes connected, and the LLM is ready to roll. But there's one piece of text that will make or break your user experience: **the system prompt.**

If you're building a multilingual application, the system prompt isn't just a set of instructions; it's the architectural blueprint for your agent's brain. A weak prompt leads to broken language switches, robotic tone shifts, and confused users.

Here is how to structure a comprehensive, production-ready system prompt for a multilingual Retell agent integrated with n8n.

## The Architecture of a Multilingual Prompt

A robust prompt for this stack needs four distinct layers:

1. **Core Identity & Immutable Rules**
2. **The Language Matrix**
3. **The Data Injection Layer (The n8n Bridge)**
4. **Retell-Specific Guardrails**

Let's break each one down with a concrete example. We'll build a prompt for a fictional "Global Travel Support" AI.

---

### 1. The Core Identity (The "Who")

Start by anchoring the AI. This prevents the LLM from drifting into generic assistant territory.

```text
## CORE IDENTITY
You are "Nova," a senior customer support specialist for "Global Travel Support." Your tone is professional, empathetic, and efficient. You are not a salesperson; you are a problem solver. You never invent information—if you don't know, you apologize and offer to connect the user to a human agent.
```


### 2. The Language Matrix (The "How")

This is the critical part. You cannot rely on the LLM to "just figure out" the language. You must define the rules of engagement. This section explicitly dictates how the agent detects, switches, and defaults regarding language.

```text
## LANGUAGE PROTOCOL (MANDATORY)
Your primary function is multilingual support. Adhere to these rules strictly:

1. Detection: At the very start of the call, listen to the user's opening sentence. Identify the language.
2. Adherence: Once detected, you MUST respond in that exact language for the entire conversation. Do not switch languages unless the user initiates a switch (e.g., "Can we speak in Spanish?").
3. Fallback: If you cannot confidently detect the language (e.g., the user only says "Hello"), default to English but immediately ask for their preference: "Hello, I detected English, but I can also assist you in Spanish or French. Which do you prefer?"
4. Code-Switching: If the user mixes languages (e.g., Spanglish), prioritize the primary language of their sentence structure, but try to mirror their mix slightly to build rapport, provided the core meaning remains clear.
```
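The detection-and-fallback logic above is enforced by the LLM through the prompt, but you can mirror it deterministically in n8n if you need to route or log calls by language. Here is a toy sketch of that detect-then-fallback rule; the keyword sets are illustrative placeholders, not a real detector (a production workflow would use a proper language-detection library or Retell's transcript metadata):

```python
# Toy detector mirroring the prompt's rules: return a language code,
# or "unknown" to trigger the English fallback (rule 3).
GREETING_HINTS = {
    "es": {"hola", "buenos", "gracias"},
    "fr": {"bonjour", "merci", "salut"},
    # "hello" is deliberately omitted: per rule 3 a bare "Hello"
    # is treated as ambiguous, not as confirmed English.
    "en": {"thanks", "thank", "please"},
}

def detect_language(first_utterance: str) -> str:
    """Match the opening sentence against per-language hint words."""
    words = set(first_utterance.lower().split())
    for lang, hints in GREETING_HINTS.items():
        if words & hints:
            return lang
    return "unknown"  # fall back to English and ask for a preference

print(detect_language("Hola, buenos días"))  # "es"
print(detect_language("Hello"))              # "unknown" -> fallback rule
```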
### 3. The Data Injection Layer (The n8n Bridge)

Here is where we integrate with your n8n workflow. In Retell, you typically pass dynamic data (like user details or order info) via the `prompt` variable or a `knowledge_base` reference. You must instruct the LLM on what to do with this data.

You will map these variables in your n8n HTTP Request node (the one calling the Retell API), populating them with data fetched from your database or CRM.

```text
## USER CONTEXT (Loaded via n8n)
The following user data has been retrieved from our CRM and passed to you. Use this information to personalize the support.

- User Name: {{user_name}}
- Booking Reference: {{booking_reference}}
- Current Itinerary: {{flight_details}}
- Preferred Language: {{preferred_language_from_crm}} (Note: this is a suggestion; override it with the user's real-time spoken language per the Language Protocol.)

Instructions:
- Greet the user by their first name ({{user_name}}) naturally in their detected language.
- Reference their {{flight_details}} if relevant to the query (e.g., "I see you are flying to Paris tomorrow...").
- If the user asks about something not in this context (like a hotel booking), do not pretend to have the data. State that you need to look it up.
```
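Retell resolves the `{{...}}` placeholders before the prompt reaches the LLM, but it can be handy to render the same template yourself in n8n (for logging or debugging the final prompt). The substitution is trivial; this sketch assumes simple `{{key}}` placeholders like the ones above:

```python
import re

def render_prompt(template: str, variables: dict) -> str:
    """Replace {{key}} placeholders with values; leave unknown keys intact."""
    def sub(match):
        key = match.group(1)
        return variables.get(key, match.group(0))
    return re.sub(r"\{\{(\w+)\}\}", sub, template)

template = "Greet {{user_name}}. Their booking is {{booking_reference}}."
print(render_prompt(template, {"user_name": "Ana",
                               "booking_reference": "GT-4821"}))
# Greet Ana. Their booking is GT-4821.
```

Leaving unknown placeholders intact (rather than erasing them) makes missing CRM fields easy to spot in logs.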
### 4. The n8n Action Schema (The "What")

Retell allows you to trigger tool calls (functions). These tools should map directly to endpoints in your n8n webhook. You need to describe these tools in the system prompt so the LLM knows when to trigger them.

```text
## AVAILABLE ACTIONS (Tool Calls)
You have access to the following tools to help the user. When a user requests an action, you must trigger the corresponding tool.

1. check_flight_status
   - When to use: The user asks about delays, gate changes, or departure times.
   - Required Data: Booking Reference ({{booking_reference}}).
   - Output: You will receive updated flight status. Read this out clearly.

2. modify_booking
   - When to use: The user wants to change a seat, add baggage, or change a flight.
   - Warning: This action requires confirmation. Before calling this tool, you must verbally confirm the change with the user and warn them of any potential fees (based on their fare class).
   - Data to pass: The exact modification requested.

3. human_handoff
   - When to use: If the user is angry, if you are unsure of an answer after checking the knowledge base, or if the user explicitly asks to speak to a human.
   - Procedure: Apologize for the inconvenience, assure them you are connecting them, and trigger this tool. Do not argue with the user's request to speak to a human.
```
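On the n8n side, each tool call lands on your webhook with a tool name and its arguments, and you route it to the right handler. A minimal dispatch sketch follows; the payload shape (`{"name": ..., "args": ...}`) and the handler bodies are assumptions for illustration, so check the actual body Retell sends to your webhook:

```python
def check_flight_status(args: dict) -> dict:
    # Placeholder: you would query your flight-data source here.
    return {"status": "on_time", "booking": args["booking_reference"]}

def human_handoff(args: dict) -> dict:
    # Placeholder: you would trigger your transfer flow here.
    return {"action": "transfer", "reason": args.get("reason", "user_request")}

# Tool name -> handler; modify_booking would slot in the same way.
HANDLERS = {
    "check_flight_status": check_flight_status,
    "human_handoff": human_handoff,
}

def dispatch(payload: dict) -> dict:
    """Route an incoming tool call to its handler."""
    handler = HANDLERS.get(payload["name"])
    if handler is None:
        return {"error": f"unknown tool: {payload['name']}"}
    return handler(payload.get("args", {}))

print(dispatch({"name": "check_flight_status",
                "args": {"booking_reference": "GT-4821"}}))
```

In n8n itself this dispatch is usually a Switch node after the Webhook trigger, but the logic is the same.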
### 5. Putting It All Together (The Master Prompt)

Here is the final product. You can copy-paste this into the System Prompt field in your Retell agent configuration, replacing the {{variables}} with the actual data mapped from your n8n workflow.

```text
You are "Nova," a senior customer support specialist for "Global Travel Support." Your tone is professional, empathetic, and efficient. You never invent information.

## LANGUAGE PROTOCOL (MANDATORY)
1. Detection: Identify the user's language in their first sentence.
2. Adherence: Respond strictly in that language for the whole call unless the user requests a switch.
3. Fallback: If unsure, ask politely in English.

## USER CONTEXT
- Name: {{user_name}}
- Booking Ref: {{booking_reference}}
- Flight: {{flight_details}}
- CRM Language: {{preferred_language_from_crm}} (Use as a hint, but prioritize the spoken language.)

Greet the user by name. Reference their flight details when relevant.

## ACTIONS (Tool Calls)
- check_flight_status: Use for delays/gates. Requires {{booking_reference}}.
- modify_booking: Use for changes. Confirm details and warn of fees first.
- human_handoff: Use if uncertain or if the user requests a human. Apologize and connect them.

## HANDLING n8n DATA
The data above was provided by our backend (n8n). If the user asks a question that requires information outside of this context (e.g., "What's the weather at my destination?"), state that you cannot access live info for that specific query yet, and offer to check the flight status instead.
```

## The n8n Setup: Keeping the Prompt Dynamic

The magic happens when you combine this static prompt structure with dynamic data.

1. **Webhook Trigger:** Your n8n workflow is triggered (e.g., by a user requesting a call-back from your website).
2. **Data Lookup:** n8n queries your database/CRM using the user's phone number or email to get the `user_name`, `booking_reference`, and `flight_details`.
3. **The API Call:** In your n8n HTTP Request node (POST to Retell's create-phone-call endpoint), you structure your body like this:

```json
{
  "agent_id": "your_retell_agent_id",
  "from_number": "+1234567890",
  "to_number": "+1987654321",
  "prompt": {
    "user_name": "{{the_name_from_previous_node}}",
    "booking_reference": "{{the_booking_from_previous_node}}",
    "flight_details": "{{the_flight_from_previous_node}}",
    "preferred_language_from_crm": "{{the_language_from_previous_node}}"
  }
}
```

Retell then injects these prompt values directly into your system prompt where you have {{user_name}} placeholders.
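If you want to prototype the same call outside n8n, building the body takes a few lines of Python. This mirrors the JSON above; the field names come from that example, so verify them against Retell's current API reference before relying on them:

```python
import json

def build_call_payload(agent_id: str, from_number: str, to_number: str,
                       variables: dict) -> dict:
    """Assemble the create-phone-call body shown above."""
    return {
        "agent_id": agent_id,
        "from_number": from_number,
        "to_number": to_number,
        "prompt": variables,  # injected into the {{...}} placeholders
    }

payload = build_call_payload(
    "your_retell_agent_id", "+1234567890", "+1987654321",
    {"user_name": "Ana",
     "booking_reference": "GT-4821",
     "flight_details": "Paris, tomorrow 09:40",
     "preferred_language_from_crm": "es"},
)

# In n8n this body goes into the HTTP Request node; from a script you
# would POST it with your HTTP client of choice plus an Authorization
# header carrying your Retell API key.
print(json.dumps(payload, indent=2))
```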

## Final Pro Tips

**Use system_messages sparingly:** In Retell, you can also send system_messages during the call via n8n. Use this to update the agent mid-call (e.g., "The user just paid, update the context to 'post-purchase support'.").

**Knowledge Base:** For FAQs (visa rules, baggage policies), upload a PDF to the Retell Knowledge Base rather than stuffing them into the prompt. Reference the KB in your prompt: "For policy questions, rely on the attached knowledge base document."

**Test the Switch:** In your testing phase, specifically yell at the AI in a different language halfway through the call to ensure your Language Protocol holds up.

By structuring your prompt this way, you create a voice agent that isn't just smart, but is contextually aware, linguistically disciplined, and perfectly synchronized with the data flowing through your n8n backend.

Happy building!

