AI Agents

Harness artificial intelligence to automate audience engagement.

AI agents, formerly known as assistants or chatbots, are the first line of support, triaging, answering, and resolving your audience's questions, issues, and concerns. All chats begin with an AI agent.


Chatbot vs LLM

Scripted chatbots have long been a staple of customer service automation, predating modern large language models (LLMs). LLMs introduced important advancements for audience engagement, such as unstructured conversations and the voice and vision modalities.

Proto AICX marries the best parts of each technology:

| Capability | Chatbots | LLMs | Proto AI Agents |
| --- | --- | --- | --- |
| Content privacy | ✔ | | ✔ |
| Scripted chats | ✔ | | ✔ |
| Runs in messaging apps | ✔ | | ✔ |
| Human handoffs | ✔ | | ✔ |
| Trainable on resources | | ✔ | ✔ |
| Unstructured chats | | ✔ | ✔ |
| Voice chats | | ✔ | ✔ |
| Realtime language translation | | ✔ | ✔ |

How Agents Work

ProtoAI

This combination of traditional chatbot and modern LLM features is made possible by ProtoAI, our natural language processing (NLP) engine. It assesses every inbound message at runtime, using sophisticated NLP pattern recognition to match the person's conversational intent with an appropriate action in the agent's scripted flow.

If there is no match, and a specified LLM has been enabled for the agent, ProtoAI will send the message through to the LLM for an informed, organic response.

By empowering your AI agents with both a scripted flow and a trained, personalised LLM, you'll have an incredibly robust, well-rounded bot at your disposal.
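
ProtoAI's matching itself is internal to the platform, but the decision it makes can be pictured with a rough sketch. The regex-based scoring, names, and threshold below are illustrative assumptions, not Proto's actual engine or API:

```typescript
// Illustrative stand-in for intent matching; real matching uses trained
// NLP models, not regex patterns, and these names are not Proto's API.

interface ScriptedTrigger {
  intent: string;       // e.g. "refund_request"
  patterns: RegExp[];   // simplified stand-in for NLP pattern recognition
}

// Return the trigger that best matches the message, or null if no trigger
// clears the confidence threshold (letting the LLM take over, if enabled).
function matchIntent(
  message: string,
  triggers: ScriptedTrigger[],
  threshold = 0.5,
): ScriptedTrigger | null {
  let best: { trigger: ScriptedTrigger; score: number } | null = null;

  for (const trigger of triggers) {
    // Score = fraction of this trigger's patterns found in the message.
    const hits = trigger.patterns.filter((p) => p.test(message)).length;
    const score = trigger.patterns.length ? hits / trigger.patterns.length : 0;
    if (!best || score > best.score) {
      best = { trigger, score };
    }
  }

  return best && best.score >= threshold ? best.trigger : null;
}

// Example: a hypothetical "refund_request" trigger matches; anything that
// scores below the threshold returns null and falls through to the LLM.
const triggers = [
  { intent: "refund_request", patterns: [/refund/i, /money back/i] },
];
matchIntent("Can I get a refund?", triggers); // -> the refund_request trigger
```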


Chat Flow

AI agents run this sequence for every inbound text or voice message (a code sketch of the full flow follows the steps):

1. Receive message

A message is received through one of the AI agent's enabled messaging apps or website interfaces.

2. Check triggers, apply actions

ProtoAI checks for scripted triggers that fire on a system event or match the person's intent, and applies the corresponding actions, such as a stock reply or human handoff. The flow ends here if any trigger/action is applied.

3. Use Fallback trigger (if enabled)

The Fallback trigger is a catch-all for unmatched conversational intent. It intercepts the message and replies with a scripted response, ending the flow.

4. Use LLM (if enabled)

If nothing was triggered in steps 2 and 3, the message is sent to a specified LLM, which generates an unstructured response. LLMs can be trained and personalised for your use case.
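
To make the ordering concrete, here is a minimal sketch of the same sequence. Every type and function name is a hypothetical placeholder, not part of Proto's Developer API:

```typescript
// Hypothetical sketch of the four-step chat flow described above.

type Reply = { text: string; source: "scripted" | "fallback" | "llm" };

interface Agent {
  // Step 2: returns the matched trigger's action, or null if nothing matches.
  matchTrigger: (message: string) => (() => Promise<Reply>) | null;
  // Step 3: scripted catch-all response, if the Fallback trigger is enabled.
  fallbackReply?: string;
  // Step 4: a specified LLM, if one is enabled for this agent.
  llm?: { generate: (message: string) => Promise<string> };
}

// Step 1: called whenever a message arrives from an enabled channel.
async function handleInbound(agent: Agent, message: string): Promise<Reply | null> {
  // Step 2: check scripted triggers (system events or matched intent).
  const action = agent.matchTrigger(message);
  if (action) {
    return action(); // the flow ends once any trigger/action is applied
  }

  // Step 3: the Fallback trigger intercepts unmatched intent, if enabled.
  if (agent.fallbackReply) {
    return { text: agent.fallbackReply, source: "fallback" };
  }

  // Step 4: otherwise hand the message to the LLM for an unstructured reply.
  if (agent.llm) {
    return { text: await agent.llm.generate(message), source: "llm" };
  }

  return null; // nothing configured to respond
}
```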



AI agents are interacted with through realtime messaging, and can be configured to hand off chats to a human. For human-to-human emailing, see Tickets.