Collect input

A Collect input block can be used to get information from the user. When the user gives information, the bot will first check if the info corresponds to an already known variable.

Add a Collect input block by dragging and dropping it to your flow.

A Collect input gets info from the user, checks it, and saves it as a variable.

A Collect input will typically do 3 things:

  • Define which input type you're looking at.

  • Check if the user response matches.

  • Configure your bot behaviour after that.

Add a question step

A Collect input should clearly ask the user for some input.

Capture user response as

The Collect input first checks if the input matches an input type. If so, the Collect input saves that input under a destination variable.

Input types

Collect input blocks have 3 types of input recognition:

  • General input types, to check if the input follows a desired format.

  • System entity input types, to check if the input follows a certain Chatlayer built-in entity.

  • Entity input types, to check the user input against a bot entity.

It is important to know that intents and entities are processed before parsers. This can be useful for automatically extracting certain pieces of a sentence as an answer to a question. We've provided a great example of this in our tutorial.

Chatlayer extracts data from user inputs. For instance, if the input type is Date and the user says 'I need to be in Paris in two days,' the parser will identify 'in two days' as the date. It converts this into the DD-MM-YYYY format and stores the result in the user session.
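Below is a minimal sketch of that stored format only; it is not Chatlayer's internal parser, just an illustration of how a relative expression like 'in two days' ends up as a DD-MM-YYYY string:

// Illustration only: format a resolved date the way the bot stores it (DD-MM-YYYY).
function toDdMmYyyy(date: Date): string {
  const dd = String(date.getDate()).padStart(2, "0");
  const mm = String(date.getMonth() + 1).padStart(2, "0");
  return `${dd}-${mm}-${date.getFullYear()}`;
}

// "in two days", resolved against today's date:
const inTwoDays = new Date();
inTwoDays.setDate(inTwoDays.getDate() + 2);
console.log(toDdMmYyyy(inTwoDays)); // e.g. "24-04-2018" if today were 22-04-2018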

General input

The General input type checks if the input follows a desired format.

Any

The Any input type will accept all string values as an input.

Date

The Date input parser type will parse the response as a date. Sentences like 'next week Monday' are automatically converted to a DD-MM-YYYY date object. Supported formats (also in other supported NLP languages) are:

  • 22-04-2018

  • 22-04

  • 22 apr

  • 22 april 18

  • twenty two April 2018

  • yesterday

  • today

  • now

  • last night

  • tomorrow, tmr

  • in two weeks

  • in 3 days

  • next Monday

  • next week Friday

  • last/past Monday

  • last/past week

  • within/in 5/five days

  • Friday/Fri

Image

The Image format type allows you to check if a user has uploaded an image or other file (such as a PDF).

The image will be saved as an array. If you chose {img} as the variable, this means that you should use {img[0]} to retrieve the URL of the first saved image.

To save a user's attachment at any point in the flow, use the defaultOnFileUpload variable. This variable will store the URL of the attachment uploaded by the user, regardless of where they are in the conversation.

For the chat widget (web channel), we recommend using the file upload step.
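As a quick sketch of how such an array-stored variable is read (the variable name img and the URL below are placeholders, not real values):

// Hypothetical session snapshot, assuming the Image input was captured under "img".
const session = {
  img: ["https://cdn.example.com/uploads/invoice.pdf"], // placeholder URL
};

// {img} refers to the whole array; {img[0]} is the URL of the first upload.
const firstUploadUrl = session.img[0];
console.log(firstUploadUrl);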

Location

The Location parser sends the user's input to a Google Geocoding API service. When a correct address or location is recognized, the Chatlayer platform will automatically create an object that contains all relevant geo-data.

Take a Collect input block that asks the user "Where do you work?". When the user answers with a valid location, this information will be stored as a userLocationInformed variable (you can rename this variable if needed).

Below is an example that shows how the userLocationInformed variable would be stored when the user responds with 'Chatlayer.ai':

{
    fullAddress: "Oudeleeuwenrui 39, 2000 Antwerpen, Belgium",
    latitude: 51.227317,
    longitude: 4.409155999999999,
    streetNumber: "39",
    streetName: "Oudeleeuwenrui",
    city: "Antwerpen",
    country: "Belgium",
    zipcode: "2000",
}

To show the address as a full address (street, street number, zip code and city), you need to add .fullAddress to the variable.

So in the example above, the bot can display the entire location by using the following variable: {userLocationInformed.fullAddress}

A bot message containing the following info:

Thank you, shall I send your package to {userLocationInformed.fullAddress}?

Will display the following message to the user:

Thank you, shall I send your package to Oudeleeuwenrui 39, 2000 Antwerpen, Belgium?
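Below is a minimal sketch of how such a {variable.path} reference resolves against the stored object; the resolver itself is illustrative and not Chatlayer code:

// Illustrative resolver: replaces {path.to.value} references with values from the session.
const userLocationInformed = {
  fullAddress: "Oudeleeuwenrui 39, 2000 Antwerpen, Belgium",
  city: "Antwerpen",
  zipcode: "2000",
};

function resolve(template: string, vars: Record<string, any>): string {
  return template.replace(/\{([\w.]+)\}/g, (_, path: string) =>
    String(path.split(".").reduce((obj: any, key: string) => obj?.[key], vars) ?? "")
  );
}

console.log(
  resolve("Thank you, shall I send your package to {userLocationInformed.fullAddress}?", { userLocationInformed })
);
// -> "Thank you, shall I send your package to Oudeleeuwenrui 39, 2000 Antwerpen, Belgium?"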

Language

This input type will parse and validate the supported NLP languages.

  • English (en-us): 'engels', 'English', 'en', 'anglais'

  • Dutch (nl-nl): 'nederlands', 'Dutch', 'ned', 'nl', 'vlaams', 'hollands', 'be', 'ned', 'néerlandais', 'belgisch'

  • French (fr-fr): 'French', 'français', 'frans', 'fr', 'francais'

  • Chinese (zh-cn): 'Chinese', 'cn', 'zh', 'chinees'

  • Spanish (es-es): 'Spanish', 'español', 'es', 'spaans'

  • Italian (it-it): 'Italian', 'italiaans', 'italiano', 'it'

  • German (de-de): 'German', 'duits', 'de', 'deutsch'

  • Japanese (ja-jp): 'Japanese', 'japans', 'jp', '日本の'

  • Brazilian Portuguese (pt-br): 'Brazil Portugese', 'Portugese', 'portugees', 'braziliaans portugees', 'português'

voiceMessage

Use the voiceMessage input type to save voice channel messages as text. Configure the maximum duration and completion time for these messages.

Hours

This input type will parse and validate timestamps.

System entity input

The Collect input parser can check if the given input is consistent with the format of one of the supported system entities. Whenever a system entity is chosen in the 'Check if response matches' dropdown, you can give the variable a name that works for you.

Please note that the number of system entities that you can use inside a Collect input block is limited. The system entities that you can use inside those blocks are: sys.email, sys.phone_number, sys.url, sys.number, and sys.time.

🆕 LLM-based system entity recognition

Your bot is now capable of recognizing system entities depending on the context of the conversation, using LLM technology.

For now, this feature is only available for system entities. For example, when your bot asks for the number of passengers, a sentence like 'there will be three of us' can be recognized as the number 3.

To turn this on:

  1. Go to your bot Settings.

  2. Under Generative AI, click the toggle next to Turn on generative AI features.

  3. Toggle on LLM-based entity recognition.

  4. Click Save.

  5. Go back to your Flows.

  6. Create a Collect input that checks the number of passengers and check if it matches a @sys.number input.

  7. Display that variable in the next block.

  8. Test the bot: the bot now recognizes more complex sentences as the right number of people!

Entity input

After you created an entity, you can check that the user input matches it. Learn more about which entity type suits your use case.

Check if the input matches

A Collect input block checks whether the user response matches an already-known variable:

  • If the variable does not have a value yet, the bot will ask the question written in the Collect input block. At this point, either:

    • The value matches, and the variable is filled.

    • The value doesn't match, and the fallback questions are asked.

    • The user doesn't answer, and you decide to detect this silence.

  • If the variable has a value already, the bot will automatically skip the Collect input block.

When the user response matches

If the Collect input block is skipped, that is because the variable is already known. Variables can be known already for various reasons:

  • The user has answered this question before.

  • A previous entity was detected with the same variable name.

  • The user is authenticated and the variable was automatically set.

If the response is matched at the time the Collect input gets triggered, it will be saved correctly under the specified variable name in the debugger.

🆕 When the user response doesn't match

When the user provides an invalid response, the bot should inform them that their answer was invalid.

Retries

Set up how many times you want the bot to ask the question again. Typically, the retry message simply asks the user to reformulate their answer.

Fallback

Set up a fallback message to redirect the user to when they have used all their retries. Typically, this leads the user to help from customer support.

No response

You can configure the bot to detect when a user remains silent for a specified period. It triggers a specific block when no response is received within the set timeframe or by a predetermined time.

You can set how long it takes for the new block to trigger in the duration field (in minutes or at a specific time). The duration of silence can be from 1 minute up to 1440 minutes (24 hours).
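A small sketch of the difference between those two modes (the 'wait for' / 'wait until' naming follows the selector in the block; the code below is illustrative, not platform behaviour):

// Illustration only: "wait for" adds a silence duration (1-1440 minutes),
// while "wait until" fires at a fixed time of day.
function triggerAfterMinutes(lastMessageAt: Date, minutes: number): Date {
  return new Date(lastMessageAt.getTime() + minutes * 60_000);
}

function triggerAtTime(lastMessageAt: Date, hour: number, minute: number): Date {
  const trigger = new Date(lastMessageAt);
  trigger.setHours(hour, minute, 0, 0);
  if (trigger <= lastMessageAt) trigger.setDate(trigger.getDate() + 1); // roll over to the next day
  return trigger;
}

const lastMessageAt = new Date(2024, 4, 1, 14, 0); // user went silent at 14:00
console.log(triggerAfterMinutes(lastMessageAt, 30)); // "wait for": fires at 14:30
console.log(triggerAtTime(lastMessageAt, 18, 0));    // "wait until": fires at 18:00 the same day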

Capture user response as

The bottom part of your Collect input block can be configured to make sure you detect the answer you're looking for.

Disable NLP

Users are able to leave the Collect input if an intent is recognized. For bots with a very small NLP model, this might trigger a false positive. The 'disable NLP' checkbox allows you to disable the NLP model while in the Collect input, which makes sure that whatever the user says gets saved as input.

For date variables: Always past - always future

When you check a date variable, Chatlayer parses the user expression to match a default date format. If the date you ask for should always be in the past or always in the future, you can use these options. A user saying "Thursday", for example, will then be mapped to either last or next Thursday.
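As a minimal illustration of what those options change (not platform code), here is how 'Thursday' resolves under 'always past' versus 'always future':

// Illustration only: resolve a weekday name relative to a reference date.
function resolveWeekday(reference: Date, weekday: number, direction: "past" | "future"): Date {
  // weekday: 0 = Sunday ... 6 = Saturday (4 = Thursday)
  const result = new Date(reference);
  let diff = (weekday - reference.getDay() + 7) % 7;            // days until the next such weekday
  if (direction === "future" && diff === 0) diff = 7;           // "always future" never picks today
  if (direction === "past") diff = diff === 0 ? -7 : diff - 7;  // step back to the previous one
  result.setDate(result.getDate() + diff);
  return result;
}

const today = new Date(2024, 4, 7); // a Tuesday
console.log(resolveWeekday(today, 4, "future").toDateString()); // Thu May 09 2024
console.log(resolveWeekday(today, 4, "past").toDateString());   // Thu May 02 2024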
