Handling Azure OpenAI content filter errors per node (AgentNode / Prompt Node) in Kore.ai

Hello Kore.ai Support,
We are using Azure OpenAI in Kore.ai and occasionally receive errors when the Azure content filter is triggered. In that case, we want to respond with a user-friendly message such as: “Your request was blocked by the safety filter. Please rephrase and try again.”

However, if we handle this via a global error event/handler, the same fallback message gets sent for errors coming from other nodes as well, which is not what we want. We need a way to handle Azure OpenAI filter-triggered errors specifically and locally, ideally per AgentNode or Prompt Node, so that only the node that invoked Azure OpenAI returns the “please rephrase” message.

Could you please advise:

How can we detect that the error is specifically due to the Azure OpenAI content filter (e.g., based on error code / error type / response payload)?

Is there a recommended approach to implement a node-scoped fallback message for AgentNode / Prompt Node (instead of a global error handler)?

If node-scoped handling is not supported, what is the best workaround to ensure the fallback message is shown only for Azure filter errors and not for other node errors?

Thank you.

Hello Team,

Please note that Azure OpenAI enforces Responsible AI content filters that can reject prompts or responses when they trigger policy rules. When that happens, the Azure API returns an error indicating that the content was filtered, and this error bubbles up to the Kore.ai runtime.

Unlike other errors (timeouts, connectivity issues), the content filter error is a specific type of rejection from the Azure API.

How to Detect a Content Filter Error

Azure OpenAI usually includes filtering metadata when the content filter triggers. For example, API responses may contain a structure like:

"content_filter_result": { … },
"error": { "type": "content_filter", … }

While Kore.ai doesn’t surface that directly in the UI, you can inspect the raw GenAI response payload for the keys/flags that indicate a filter rejection.
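As a rough illustration, the payload check could be factored into a small helper like the one below. The exact field names (`error.type` vs. `error.code`, `content_filter_result`) vary by Azure API version, so treat them as assumptions to verify against the raw response your deployment actually returns:

```javascript
// Hypothetical helper: detect an Azure content-filter rejection in a raw
// response payload. The field names checked here (error.type, error.code,
// content_filter_result) are assumptions -- verify them against your
// deployment's actual payload.
function isContentFilterError(payload) {
    if (!payload) {
        return false;
    }
    var err = payload.error;
    if (err && (err.type === "content_filter" || err.code === "content_filter")) {
        return true;
    }
    // Some responses carry filter metadata without a top-level error object
    if (payload.content_filter_result) {
        return true;
    }
    return false;
}
```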

Recommended Approach: Per-Node Error Handling

Use Post-Processor Script in the Prompt Node

If you are using a Prompt Node or AgentNode with custom prompt scripting, you can add a post-processor script to detect and handle filter errors at that node:

if (response.error && response.error.type === "content_filter") {
    // Detected an Azure content filter rejection; return a node-scoped
    // fallback message instead of the model output
    return {
        text: "Your request was blocked by the safety filter. Please rephrase and try again."
    };
}

// Otherwise return the normal response
return response;

This checks the error type in the payload that Azure returns and outputs a node-scoped fallback message.

This script is evaluated per node instance, so it applies only to that node, not globally.

Prompt-Level Error Response Checks

If you are not using post-processor scripts but rely on the built-in Prompt Node features, you can:

  • Enable Custom Response Mapping
  • Map an output key that checks for the presence of a filter result (e.g., content_filter_blocked) and routes to a fallback reply within that node’s transition logic

This lets you branch only within that node when Azure OpenAI’s filter triggers.
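A sketch of such a mapping is shown below. The key name `content_filter_blocked` comes from the example above and is one you define yourself, not a built-in platform key, and the payload fields are assumptions to verify:

```javascript
// Illustrative sketch: derive a boolean output key from the raw GenAI
// response so the node's transition logic can branch on it.
// "content_filter_blocked" is a user-defined key, not a built-in one.
function mapResponse(rawResponse) {
    var blocked = !!(rawResponse &&
                     rawResponse.error &&
                     rawResponse.error.type === "content_filter");
    return {
        content_filter_blocked: blocked,
        text: blocked
            ? "Your request was blocked by the safety filter. Please rephrase and try again."
            : (rawResponse && rawResponse.text) || ""
    };
}
```

The node's transitions can then branch on `content_filter_blocked`, keeping the fallback reply scoped to that single node.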

What Not to Do

  • Don’t rely on a global error handler alone: it catches all errors and cannot distinguish content filter errors from other nodes’ exceptions.
  • Don’t rely on the HTTP status code alone: Azure may return content filter errors alongside standard success codes, with the filter details included only in the payload.

If Node-Scoped Handling Is Not Supported

If your version of the platform does not allow post-processor scripts or local error checks for GenAI nodes:

Workaround

  1. Wrap the call in a Script Node

  2. Do a custom API invocation to Azure OpenAI

  3. Parse the response manually

  4. Return either:

    • Normal response, or
    • Friendly “please rephrase” message

This gives you full control over the error handling logic before passing the result back into the dialog. (You’ll need to use a Service/Webhook call instead of the built-in GenAI node.)
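Under those assumptions, the decision step of the workaround might look like the sketch below. The HTTP call itself is elided because it depends on your environment’s service/webhook mechanism; this function only decides what to hand back to the dialog, and the response field names are assumptions to verify:

```javascript
// Sketch of the response-handling step of the workaround. The HTTP call to
// the Azure OpenAI deployment happens elsewhere; this function only maps
// (status, body) to a reply. Field names are assumptions to verify.
var FALLBACK = "Your request was blocked by the safety filter. Please rephrase and try again.";

function buildReply(statusCode, body) {
    var filtered =
        (body && body.error &&
         (body.error.type === "content_filter" || body.error.code === "content_filter")) ||
        // Output-side filtering can surface as a finish_reason on a success response
        (body && body.choices && body.choices.length > 0 &&
         body.choices[0].finish_reason === "content_filter");

    if (filtered) {
        return { text: FALLBACK, filtered: true };
    }
    if (statusCode >= 400) {
        // Some other failure: let normal error handling deal with it
        throw new Error("Azure OpenAI call failed with status " + statusCode);
    }
    var content = body && body.choices && body.choices[0] &&
                  body.choices[0].message && body.choices[0].message.content;
    return { text: content || "", filtered: false };
}
```

This keeps the “please rephrase” message tied to filter rejections only, while other failures still surface as regular errors.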

Please let us know if this response helps.