Bidirectional Voice+
Bidirectional Voice+ enables voice-driven web experiences through bidirectional communication between your application and NLX. Users can navigate pages, fill forms, and interact with your site using natural voice commands.
- Getting Started
- Voice+ Page Navigation
- Voice+ Form Fill
- Voice+ Knowledge Base Enhanced Responses
- Sending Page Context
- Complete Custom Implementation Example
Getting Started
Voice+ with bidirectional mode enabled requires the `voiceMini` input mode. This mode allows your application to handle voice commands while still maintaining a conversational flow with the user. The configuration below is all you need for an out-of-the-box experience that lets users navigate your site and fill out forms.
```html
<html lang="en">
  <head>
    <title>Touchpoint Sample HTML</title>
    <meta name="viewport" content="width=device-width, initial-scale=1">
  </head>
  <body>
    <script type="module">
      import { create } from "https://unpkg.com/@nlxai/touchpoint-ui@1.1.8-alpha.0/lib/index.js?module";

      const touchpointOptions = {
        config: {
          applicationUrl: "YOUR_APPLICATION_URL",
          headers: {
            "nlx-api-key": "YOUR_API_KEY",
          },
          languageCode: "en-US",
        },
        input: "voiceMini", // Enables voice input with bidirectional support
        bidirectional: {},
      };

      const touchpoint = await create(touchpointOptions);
    </script>
  </body>
</html>
```
Voice Command Concepts
Bidirectional Voice+ supports three command types:

| Classification | Actions | Description |
| --- | --- | --- |
| `navigation` | `page_next`, `page_previous`, `page_custom` | Navigate between pages |
| `input` | Form field updates | Fill form fields with voice data |
| `custom` | Application-specific | Custom commands defined by your flow |
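Each classification maps to an optional handler slot in the `bidirectional` option, which the sections below cover individually. As a quick orientation, here is a minimal sketch wiring all three at once; the handler bodies are placeholders:

```js
import { create } from "@nlxai/touchpoint-ui";

// Minimal sketch: one handler per command classification.
// See the sections below for full handler implementations.
const touchpoint = await create({
  config: {
    applicationUrl: "YOUR_APPLICATION_URL",
    headers: { "nlx-api-key": "YOUR_API_KEY" },
    languageCode: "en-US",
  },
  input: "voiceMini",
  bidirectional: {
    navigation: (action, destination) => {
      // page_next / page_previous / page_custom
    },
    input: (fields, formElements) => {
      // write each field.value into the matching form element
    },
    custom: (action, payload) => {
      // application-specific behavior
    },
  },
});
```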
Voice+ Page Navigation
Touchpoint has default handlers for bidirectional Voice+ commands from the user, such as "page back", "next page", and basic "navigate to the {} page" requests.
Ensuring Persistence During Navigation
If you are using a framework like React, Vue, or Angular, handle navigation in a custom navigation handler that uses the framework's routing library; the default `window`-based navigation triggers full page loads, which discards client-side state.
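As a sketch of that approach, here is a hypothetical `VoicePlusProvider` component for a React app, assuming `react-router-dom` v6, where client-side navigation preserves component state:

```js
import { useEffect } from "react";
import { useNavigate } from "react-router-dom";
import { create } from "@nlxai/touchpoint-ui";

// Hypothetical provider component: registers a navigation handler that
// uses React Router instead of full-page loads.
function VoicePlusProvider() {
  const navigate = useNavigate();

  useEffect(() => {
    // In a real app, guard against re-creating the touchpoint on re-renders.
    create({
      config: {
        applicationUrl: "YOUR_APPLICATION_URL",
        headers: { "nlx-api-key": "YOUR_API_KEY" },
        languageCode: "en-US",
      },
      input: "voiceMini",
      bidirectional: {
        navigation: (action, destination) => {
          if (action === "page_next") navigate(1); // forward in history
          else if (action === "page_previous") navigate(-1); // back in history
          else if (action === "page_custom") navigate(destination); // client-side route
        },
      },
    });
  }, [navigate]);

  return null;
}
```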
Payload from NLX
| Key | Example Value | Description |
| --- | --- | --- |
| `action` | `page_next`, `page_previous`, `page_custom` | Type of navigation action |
| `destination` | `/about` | Relative or absolute URL to navigate to |
Example Payload:
```json
{
  "classification": "navigation",
  "action": "page_next",
  "destination": "/about"
}
```
Sample Custom Navigation Handler
Touchpoint includes an application-agnostic navigation handler built into the SDK that leverages `window`-based navigation. If you're using a web framework or would like to change routing behavior, build your own navigation handler.
```html
<html lang="en">
  <head>
    <title>Touchpoint Sample HTML</title>
    <meta name="viewport" content="width=device-width, initial-scale=1">
  </head>
  <body>
    <script type="module">
      import { create } from "https://unpkg.com/@nlxai/touchpoint-ui@1.1.8-alpha.0/lib/index.js?module";

      function handleNavigation(action, destination) {
        switch (action) {
          case "page_next":
            // Navigate to the next page
            window.history.forward();
            break;

          case "page_previous":
            // Navigate to the previous page
            window.history.back();
            break;

          /**
           * If you are using a framework like React, Vue, or Angular,
           * use its routing library to handle navigation instead.
           */
          case "page_custom":
            // Handle relative or absolute navigation
            if (destination.startsWith("/")) {
              window.location.pathname = destination;
            } else {
              window.location.href = destination;
            }
            break;
        }
      }

      const touchpointOptions = {
        config: {
          applicationUrl: "YOUR_APPLICATION_URL",
          headers: {
            "nlx-api-key": "YOUR_API_KEY",
          },
          languageCode: "en-US",
        },
        input: "voiceMini", // Enables voice input with bidirectional support
        bidirectional: {
          navigation: handleNavigation,
        },
      };

      const touchpoint = await create(touchpointOptions);
    </script>
  </body>
</html>
```
Voice+ Form Fill
Touchpoint includes an application-agnostic form-input handler built into the SDK that sets the values received from NLX during a conversation.
Customizing Form Fill Behavior
To customize the form-fill behavior, build and specify a custom input handler. NLX will send input commands with field IDs that match the IDs in the `formElements` object returned from `analyzePageForms()`.
Important Notes:
- The `field.id` in the command will match the element IDs in your `formElements` object.
  - This is different from the element's own `id` attribute; it is unique to Voice+.
  - Voice+ generates its own `id` for reference because the HTML element may not have an `id` at all, or the page might violate the HTML spec and assign the same `id` to multiple elements.
- Always check if the element exists before trying to update it.
Payload from NLX
| Key | Value | Description |
| --- | --- | --- |
| `classification` | `input` | Indicates this is a form-fill command |
| `fields` | Array of field objects | Each object contains an `id` and a `value` |
| `id` | NLX's unique identifier for the form field | Used to match the ID in your `formElements` object. This is different from the element's own `id` attribute; it is unique to Voice+ |
| `value` | Value to set for the form field | The value to fill into the form field |
Example Payload:
```json
{
  "classification": "input",
  "fields": [
    { "id": "firstName", "value": "John" },
    { "id": "email", "value": "john@example.com" }
  ]
}
```
Sample Custom Form Handler
```html
<html lang="en">
  <head>
    <title>Touchpoint Sample HTML</title>
    <meta name="viewport" content="width=device-width, initial-scale=1">
  </head>
  <body>
    <script type="module">
      import { create } from "https://unpkg.com/@nlxai/touchpoint-ui@1.1.8-alpha.0/lib/index.js?module";

      function handleFormInput(fields, formElements) {
        fields.forEach((field) => {
          // Use the stored formElements to find the DOM element
          if (formElements[field.id]) {
            const element = formElements[field.id];
            element.value = field.value;
          } else {
            console.warn(`Field with id "${field.id}" not found in formElements`);
          }
        });
      }

      const touchpointOptions = {
        config: {
          applicationUrl: "YOUR_APPLICATION_URL",
          headers: {
            "nlx-api-key": "YOUR_API_KEY",
          },
          languageCode: "en-US",
        },
        input: "voiceMini", // Enables voice input with bidirectional support
        bidirectional: {
          input: handleFormInput, // Custom input handler; explicitly enables bidirectional mode
        },
      };

      const touchpoint = await create(touchpointOptions);
    </script>
  </body>
</html>
```
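If your form is rendered by a framework that tracks input state, setting `element.value` directly may not update the framework's internal model. One common workaround, also used in the complete example at the end of this page, is to dispatch synthetic events after setting the value:

```js
// Inside your custom input handler, after locating the element:
element.value = field.value;

// Notify framework listeners (React, Vue, etc.) that the value changed
element.dispatchEvent(new Event("input", { bubbles: true }));
element.dispatchEvent(new Event("change", { bubbles: true }));
```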
Voice+ Knowledge Base Enhanced Responses
Implement application-specific voice commands by attaching a knowledge base to your Voice+ node in the flow builder. This allows you to define custom actions that can be triggered by voice commands.
Enriching the Knowledge Base
To enrich article Q&A knowledge base responses with custom Voice+ commands, add metadata to each response.
There are built-in metadata keys that will trigger the `navigation` or `custom` classifications, but you can also define your own custom actions.
| Key | Classification | Action | Description |
| --- | --- | --- | --- |
| `nlx:destination` | `navigation` | `page_custom` | Navigate to a specific page or section |
| `nlx:action` | `custom` | Custom action name | Send custom actions to the frontend |
| `nlx:actionPayload` | `custom` | Custom action data | Optional; only taken into account when the `nlx:action` key is present. Sent to the frontend as the `payload` key of a `custom` command whose `action` is the `nlx:action` value |
Example Payloads
Suppose I want to create a custom command that sends users to the contact page when they'd like to get in touch about animal policy, and that also passes along extra information.
I create a new Article in the Knowledge Base attached to the Voice+ Node with the following content:
- Question: How do I get in touch about animal policy?
- Answer: You can contact us about animal policy by visiting our Contact Page.
| metadata key | value |
| --- | --- |
| `nlx:destination` | contact |
| `nlx:action` | animalPolicy |
| `nlx:actionPayload` | {} |
| `nlx:actionPayload.dog` | true |
| `nlx:actionPayload.cat` | true |
I will receive two payloads from NLX when this article is triggered: one for the navigation command and one for the custom command.
Example Navigation Command:
```json
{
  "classification": "navigation",
  "action": "page_custom",
  "destination": "contact"
}
```
Example Custom Command:
```json
{
  "classification": "custom",
  "action": "animalPolicy",
  "payload": {
    "dog": true,
    "cat": true
  }
}
```
Sample Knowledge Base Response Handler
```html
<html lang="en">
  <head>
    <title>Touchpoint Sample HTML</title>
    <meta name="viewport" content="width=device-width, initial-scale=1">
  </head>
  <body>
    <script type="module">
      import { create } from "https://unpkg.com/@nlxai/touchpoint-ui@1.1.8-alpha.0/lib/index.js?module";

      function handleCustomCommand(action, payload) {
        // Example: apply the animal policy payload from the knowledge base article.
        // setDogPolicy and setCatPolicy are placeholders for your application logic.
        if (action === "animalPolicy") {
          setDogPolicy(payload.dog);
          setCatPolicy(payload.cat);
        }
      }

      const touchpointOptions = {
        config: {
          applicationUrl: "YOUR_APPLICATION_URL",
          headers: {
            "nlx-api-key": "YOUR_API_KEY",
          },
          languageCode: "en-US",
        },
        input: "voiceMini", // Enables voice input with bidirectional support
        bidirectional: {
          custom: handleCustomCommand,
        },
      };

      const touchpoint = await create(touchpointOptions);
    </script>
  </body>
</html>
```
Sending Page Context
Touchpoint provides NLX with information about your page structure on page load, after filling forms, and on page refresh. All available form information and any href links found on the page are analyzed by NLX and then used to send payloads and small talk during user interactions.
Directly Sending Page Context
You shouldn't need to send context manually for basic bidirectional Voice+ interactions. If you have other page interactions that change the DOM, send the context manually after processing those changes.

The `analyzePageForms` function scans your page for form elements and returns two important objects:
```js
import { analyzePageForms } from "@nlxai/touchpoint-ui";

// Analyze forms on the current page
const { context, formElements } = analyzePageForms();
```
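Here, `context` is the page description you send to NLX, while `formElements` maps Voice+-generated field IDs to the live DOM elements your input handler updates.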
Best Practices
- Always store the `formElements` reference for use in your command handlers
- Re-analyze and resend context when your page structure changes
- Use good form accessibility practices, such as labeling fields
- Provide form instructions for better voice recognition
When to Send Context
- On page load - Send initial context after touchpoint initialization
- On route changes - In SPAs, resend context when navigating to new pages
- After dynamic form updates - If forms are added/removed dynamically (see the sketch after the example below)
- After significant DOM changes - When form structure changes
Sending Context Example
```html
<html lang="en">
  <head>
    <title>Touchpoint Sample HTML</title>
    <meta name="viewport" content="width=device-width, initial-scale=1">
  </head>
  <body>
    <script type="module">
      import { create, analyzePageForms } from "https://unpkg.com/@nlxai/touchpoint-ui@1.1.8-alpha.0/lib/index.js?module";

      const touchpointOptions = {
        config: {
          applicationUrl: "YOUR_APPLICATION_URL",
          headers: {
            "nlx-api-key": "YOUR_API_KEY",
          },
          languageCode: "en-US",
        },
        input: "voiceMini", // Enables voice input with bidirectional support
        bidirectional: {
          automaticContext: false,
        },
      };

      const touchpoint = await create(touchpointOptions);

      const { context, formElements } = analyzePageForms();
      // Store formElements for later use when handling commands
      window.formElements = formElements; // or use state management

      // Array of destinations for navigation commands
      const destinations = ["about", "contact", "pricing"];

      touchpoint.conversationHandler.sendContext({
        "nlx:vpContext": {
          url: window.location.origin,
          fields: context,
          destinations: destinations,
        },
      });
    </script>
  </body>
</html>
```
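For the dynamic-update cases in the list above, a minimal sketch that re-analyzes the page and resends context whenever the DOM changes; it assumes the `touchpoint`, `analyzePageForms`, and `destinations` values from the example above are in scope, and the 250 ms debounce is an arbitrary choice to avoid flooding NLX with updates:

```js
// Sketch: watch for DOM changes and resend page context.
let debounceTimer;
const observer = new MutationObserver(() => {
  clearTimeout(debounceTimer);
  debounceTimer = setTimeout(() => {
    const { context, formElements } = analyzePageForms();
    window.formElements = formElements; // keep the handler lookup in sync

    touchpoint.conversationHandler.sendContext({
      "nlx:vpContext": {
        url: window.location.origin,
        fields: context,
        destinations,
      },
    });
  }, 250);
});

// Observe added/removed nodes anywhere in the document body
observer.observe(document.body, { childList: true, subtree: true });
```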
Complete Custom Implementation Example
A comprehensive example implementing voice-driven form filling, navigation, and a custom command handler:
```html
<!doctype html>
<html lang="en">
  <head>
    <meta charset="UTF-8" />
    <title>Enhanced Voice+ Example</title>
    <style>
      @keyframes highlight {
        0% {
          background-color: #ffeb3b;
        }
        100% {
          background-color: transparent;
        }
      }

      /* Applied briefly to fields that were just filled by voice */
      .voice-updated {
        animation: highlight 2s ease-out;
      }
    </style>
  </head>
  <body>
    <h1>Enhanced Voice+ Demo</h1>

    <form id="contact-form">
      <input
        type="text"
        id="firstName"
        name="firstName"
        placeholder="First Name"
      />
      <input
        type="text"
        id="lastName"
        name="lastName"
        placeholder="Last Name"
      />
      <input type="email" id="email" name="email" placeholder="Email" />
      <input type="tel" id="phone" name="phone" placeholder="Phone" />
      <textarea id="message" name="message" placeholder="Message"></textarea>
      <button type="submit">Submit</button>
    </form>

    <script type="module">
      import { create } from "https://unpkg.com/@nlxai/touchpoint-ui@1.1.8-alpha.0/lib/index.js?module";

      async function initializeVoicePlus() {
        // Create touchpoint with voiceMini and bidirectional enabled
        const touchpoint = await create({
          config: {
            applicationUrl: "YOUR_APPLICATION_URL",
            headers: {
              "nlx-api-key": "YOUR_API_KEY",
            },
            languageCode: "en-US",
          },
          input: "voiceMini",
          bidirectional: {
            navigation: handleNavigation,
            input: handleFormInput,
            custom: handleCustom,
          },
        });

        return touchpoint;
      }

      // Handle navigation commands
      function handleNavigation(action, destination) {
        console.log("Navigation command:", action, destination);
        switch (action) {
          case "page_next":
            window.history.forward();
            break;

          case "page_previous":
            window.history.back();
            break;

          case "page_custom":
            if (destination.startsWith("/")) {
              window.location.pathname = destination;
            } else {
              window.location.href = destination;
            }
            break;
        }
      }

      // Handle form input commands
      function handleFormInput(fields, formElements) {
        console.log("Form input fields received:", fields);
        fields.forEach((field) => {
          if (formElements[field.id]) {
            const element = formElements[field.id];
            element.value = field.value;
            element.classList.add("voice-updated");

            // Trigger events for frameworks that listen to them
            element.dispatchEvent(new Event("input", { bubbles: true }));
            element.dispatchEvent(new Event("change", { bubbles: true }));

            setTimeout(() => {
              element.classList.remove("voice-updated");
            }, 2000);
          }
        });
      }

      // Handle custom commands
      function handleCustom(action, payload) {
        console.log("Custom command:", action, payload);

        // Example: handle a custom search command
        if (action === "search") {
          // Implement search functionality
          console.log("Searching for:", payload.query);
        }
      }

      // Initialize when the page loads
      if (document.readyState === "loading") {
        document.addEventListener("DOMContentLoaded", initializeVoicePlus);
      } else {
        initializeVoicePlus();
      }
    </script>
  </body>
</html>
```