Touchpoint Setup
Bidirectional Voice Plus
Bidirectional Voice Plus enables voice-driven web experiences through bidirectional communication between your application and NLX. Users can navigate pages, fill forms, and interact with your site using natural voice commands.
- Getting Started
- Voice Commands Concepts
- Sending Page Context
- Your Voice Plus Command Handler
- Navigation Command Handler
- Form Fill Command Handler
- Custom Command Handler
- Complete Implementation Example
Getting Started
Voice+ with bidirectional mode enabled requires the `voiceMini` input mode. This mode allows your application to handle voice commands while still maintaining a conversational flow with the user:
```html
<script type="module">
  import {
    create,
    React,
    html,
    analyzePageForms,
  } from "https://unpkg.com/@nlxai/touchpoint-ui@1.0.5-alpha.13/lib/index.js?module";

  const touchpointOptions = {
    config: {
      applicationUrl: "YOUR_APPLICATION_URL",
      headers: {
        "nlx-api-key": "YOUR_API_KEY",
      },
      bidirectional: true, // Explicitly enable bidirectional mode
      languageCode: "en-US",
      userId: crypto.randomUUID(), // Required for voice
    },
    input: "voiceMini", // Enables voice input with bidirectional support
  };

  const touchpoint = await create(touchpointOptions);
</script>
```
Voice Commands Concepts
Enhanced Voice Plus supports three command types:
Classification | Actions | Description |
---|---|---|
navigation | page_next, page_previous, page_custom | Navigate between pages |
input | Form field updates | Fill form fields with voice data |
custom | Application-specific | Custom commands defined by your flow |
Sending Page Context
Provide NLX with information about your page structure using the Voice Plus context API. This powers the input (form fill) commands by giving NLX context about which fields are available and their types.
The `analyzePageForms` function scans your page for form elements and returns two important objects:
```javascript
// Analyze forms on the current page
const { context, formElements } = analyzePageForms();
```
When to Send Context
- On page load - Send initial context after touchpoint initialization
- On route changes - In SPAs, resend context when navigating to new pages (see the sketch after this list)
- After dynamic form updates - If forms are added/removed dynamically
- After significant DOM changes - When form structure changes
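For example, in a single-page app you can wrap the analyze-and-send step in a helper and call it on initial load, on route changes, and after dynamic form updates. A minimal sketch, assuming `touchpoint` was created as shown above; the helper name `refreshVoicePlusContext`, the destination list, and the MutationObserver wiring are illustrative:

```javascript
let formElements = {};

// Re-analyze the page and resend Voice Plus context
function refreshVoicePlusContext(touchpoint) {
  const { context, formElements: elements } = analyzePageForms();
  formElements = elements; // keep the latest reference for command handlers

  touchpoint.conversationHandler.sendContext({
    "nlx:vpContext": {
      url: window.location.origin,
      fields: context,
      destinations: ["about", "contact", "pricing"],
    },
  });
}

// Initial context after touchpoint initialization
refreshVoicePlusContext(touchpoint);

// Route changes driven by the History API (back/forward navigation)
window.addEventListener("popstate", () => refreshVoicePlusContext(touchpoint));

// Dynamic form updates: watch for added/removed fields (consider debouncing
// this callback if your page mutates the DOM frequently)
const observer = new MutationObserver(() => refreshVoicePlusContext(touchpoint));
observer.observe(document.body, { childList: true, subtree: true });
```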
Best Practices
- Always store the `formElements` reference for use in your command handlers
- Re-analyze and resend context when your page structure changes
- Use good form accessibility practices such as labeling fields (see the markup sketch after this list)
- Provide form instructions for better voice recognition
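For instance, a field with a programmatically associated label and a short, visible instruction gives the voice agent more to work with than a bare placeholder. A minimal markup sketch (the `aria-describedby` hint is general accessibility guidance, not an API requirement):

```html
<label for="email">Email address</label>
<input
  type="email"
  id="email"
  name="email"
  autocomplete="email"
  aria-describedby="email-hint"
/>
<!-- Short, visible instructions associated with the field -->
<p id="email-hint">Use the address where you'd like to receive your confirmation.</p>
```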
Sending Context Example
```html
<script type="module">
  import {
    create,
    React,
    html,
    analyzePageForms,
  } from "https://unpkg.com/@nlxai/touchpoint-ui@1.0.5-alpha.13/lib/index.js?module";

  const touchpointOptions = {
    config: {
      applicationUrl: "YOUR_APPLICATION_URL",
      headers: {
        "nlx-api-key": "YOUR_API_KEY",
      },
      bidirectional: true, // Explicitly enable bidirectional mode
      languageCode: "en-US",
      userId: crypto.randomUUID(), // Required for voice
    },
    input: "voiceMini", // Enables voice input with bidirectional support
  };

  const touchpoint = await create(touchpointOptions);

  const { context, formElements } = analyzePageForms();
  // Store formElements for later use when handling commands
  window.formElements = formElements; // or use state management

  // Array of destinations for navigation commands
  const destinations = ["about", "contact", "pricing"];

  touchpoint.conversationHandler.sendContext({
    "nlx:vpContext": {
      url: window.location.origin,
      fields: context,
      destinations: destinations,
    },
  });
</script>
```
Your Voice Plus Command Handler
Register a handler to process voice commands from NLX:
```javascript
touchpoint.conversationHandler.addEventListener(
  "voicePlusCommand",
  (command) => {
    const { classification, action } = command;

    switch (classification) {
      case "navigation":
        handleNavigation(action, command);
        break;
      case "input":
        handleFormInput(command);
        break;
      case "custom":
        handleCustomCommand(action, command);
        break;
    }
  },
);
```
Navigation Command Handler
Handle voice-driven navigation between pages:
Payload from NLX
Key | Value | Description |
---|---|---|
classification | navigation | Indicates this is a navigation command |
action | page_next, page_previous, page_custom | Type of navigation action |
destination | /about | Relative or absolute URL to navigate to |
Example Payload:
```json
{
  "classification": "navigation",
  "action": "page_next",
  "destination": "/about"
}
```
Sample Handler
This basic navigation handling logic should be adapted to your application's routing. For instance, if you are using a framework like React, Vue, or Angular, use its routing library to handle navigation instead of the History API (a framework-oriented sketch follows the sample handler).
```javascript
function handleNavigation(action, command) {
  switch (action) {
    case "page_next":
      // Navigate to next page
      window.history.forward();
      break;

    case "page_previous":
      // Navigate to previous page
      window.history.back();
      break;

    case "page_custom":
      // Navigate to specific page
      if (command.destination) {
        // Handle relative or absolute navigation
        if (command.destination.startsWith("/")) {
          window.location.pathname = command.destination;
        } else {
          window.location.href = command.destination;
        }
      }
      break;
  }
}
```
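As a sketch of that framework-router variant, the same handler can delegate to a programmatic navigation function instead of the History API. This assumes React Router's `navigate` function (the result of `useNavigate()`); substitute your own router's equivalent:

```javascript
// Sketch: SPA-friendly navigation. `navigate` is your router's programmatic
// navigation function -- here assumed to be React Router's useNavigate() result.
function handleNavigation(action, command, navigate) {
  switch (action) {
    case "page_next":
      navigate(1); // move forward one history entry
      break;

    case "page_previous":
      navigate(-1); // move back one history entry
      break;

    case "page_custom":
      if (command.destination) {
        navigate(command.destination); // client-side route change, no full reload
      }
      break;
  }
}
```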
Form Fill Command Handler
Automatically fill form fields based on voice input. The voice agent sends back commands with field IDs that match the IDs from your `formElements` object:
Important Notes:
- The `field.id` in the command will match the element IDs in your `formElements` object
  - This is different from the element's own `id` attribute; it is unique to Voice+.
  - Voice+ generates its own `id` for reference because the HTML element may not have an `id` at all, or the page might violate the HTML spec and assign the same id to multiple elements.
- Always check that the element exists before trying to update it
Payload from NLX
Key | Value | Description |
---|---|---|
classification | input | Indicates this is a form fill command |
fields | Array of field objects | Each object contains id and value |
id | NLX's unique identifier for the form field | Used to match against the IDs in your formElements object. This is different from the element's own id attribute; it is unique to Voice+ |
value | Value to set for the form field | The text to fill into the matched form field |
Example Payload:
```json
{
  "classification": "input",
  "fields": [
    { "id": "firstName", "value": "John" },
    { "id": "email", "value": "john@example.com" }
  ]
}
```
Sample Handler
```javascript
function handleFormInput(command) {
  if (!command.fields) return;

  command.fields.forEach((field) => {
    // Use the stored formElements to find the DOM element
    if (formElements[field.id]) {
      const element = formElements[field.id];
      element.value = field.value;
    } else {
      console.warn(`Field with id "${field.id}" not found in formElements`);
    }
  });
}
```
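If your form is rendered by a framework such as React or Vue, you may also need to dispatch `input` and `change` events after setting `element.value` so the framework registers the update; the complete implementation example below does exactly this.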
Custom Command Handler
Implement application-specific voice commands by attaching a knowledge base to your Voice+ node in the flow builder. This allows you to define custom actions that can be triggered by voice commands.
Enriching the Knowledge Base
To enrich article Q&A Knowledge Base responses with custom Voice+ commands, add metadata to each response.
There are built-in metadata keys that trigger the `input` or `navigation` classifications, but you can also define your own custom actions.
Key | Classification | Action | Description |
---|---|---|---|
nlx:destination | navigation | page_custom | Navigate to a specific page or section |
nlx:action | custom | Custom action name | Send custom actions to the frontend. |
nlx:actionPayload | custom | Custom action data | Optional; only used when the nlx:action key is present. Sent to the frontend as the payload of the custom command, with action set to the nlx:action value |
Example Payloads
Suppose I want to create a custom command that, when users want to get in touch about animal policy, sends them to the contact page along with some extra information.
I create a new Article in the Knowledge Base attached to the Voice+ Node with the following content:
- Question: How do I get in touch about animal policy?
- Answer: You can contact us about animal policy by visiting our Contact Page.
metadata key | value |
---|---|
nlx:destination | contact |
nlx:action | animalPolicy |
nlx:actionPayload | {} |
nlx:actionPayload.dog | true |
nlx:actionPayload.cat | true |
I will receive TWO payloads from NLX when this article is triggered, one for the navigation command and one for the custom command.
Example Navigation Command:
```json
{
  "classification": "navigation",
  "action": "page_custom",
  "destination": "contact"
}
```
Example Custom Command:
```json
{
  "classification": "custom",
  "action": "animalPolicy",
  "payload": {
    "dog": true,
    "cat": true
  }
}
```
Sample Handler
```javascript
function handleCustomCommand(action, command) {
  // Example: handle the animalPolicy action defined in the knowledge base
  if (action === "animalPolicy") {
    setDogPolicy(command.payload.dog);
    setCatPolicy(command.payload.cat);
  }
}
```
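If your flow defines many custom actions, a small registry keyed by action name keeps the handler flat. A sketch, reusing the hypothetical `setDogPolicy`/`setCatPolicy` helpers from above:

```javascript
// Sketch: dispatch custom commands through a registry keyed by the
// nlx:action value configured in the knowledge base metadata.
const customActions = {
  animalPolicy: (payload) => {
    setDogPolicy(payload.dog); // hypothetical application helpers
    setCatPolicy(payload.cat);
  },
  // Add further actions here as your flow grows
};

function handleCustomCommand(action, command) {
  const handler = customActions[action];
  if (handler) {
    handler(command.payload ?? {});
  } else {
    console.warn(`Unhandled custom Voice+ command: ${action}`);
  }
}
```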
Complete Implementation Example
A comprehensive example implementing voice-driven form filling and navigation:
```html
<!doctype html>
<html lang="en">
  <head>
    <meta charset="UTF-8" />
    <title>Enhanced Voice Plus Example</title>
    <style>
      @keyframes highlight {
        0% {
          background-color: #ffeb3b;
        }
        100% {
          background-color: transparent;
        }
      }
      /* Applied briefly to fields that were just filled by voice */
      .voice-updated {
        animation: highlight 2s ease-out;
      }
    </style>
  </head>
  <body>
    <h1>Enhanced Voice Plus Demo</h1>

    <form id="contact-form">
      <input
        type="text"
        id="firstName"
        name="firstName"
        placeholder="First Name"
      />
      <input
        type="text"
        id="lastName"
        name="lastName"
        placeholder="Last Name"
      />
      <input type="email" id="email" name="email" placeholder="Email" />
      <input type="tel" id="phone" name="phone" placeholder="Phone" />
      <textarea id="message" name="message" placeholder="Message"></textarea>
      <button type="submit">Submit</button>
    </form>

    <script type="module">
      import {
        create,
        React,
        html,
        analyzePageForms,
      } from "https://unpkg.com/@nlxai/touchpoint-ui@1.0.5-alpha.13/lib/index.js?module";

      // Initialize Enhanced Voice Plus with bidirectional support
      const userId = crypto.randomUUID();
      const conversationId = crypto.randomUUID();
      let formElements = {};

      async function initializeVoicePlus() {
        // Create touchpoint with voiceMini and bidirectional enabled
        const touchpoint = await create({
          config: {
            applicationUrl: "YOUR_APPLICATION_URL",
            headers: {
              "nlx-api-key": "YOUR_API_KEY",
            },
            bidirectional: true, // Enable bidirectional communication
            languageCode: "en-US",
            userId,
            conversationId,
          },
          input: "voiceMini",
        });

        // Send initial page context
        sendPageContext(touchpoint);

        // Set up voice command handler
        setupCommandHandler(touchpoint);

        return touchpoint;
      }

      // Send page context to NLX
      function sendPageContext(touchpoint) {
        const { context, formElements: elements } = analyzePageForms();
        formElements = elements;

        // Array of destinations for navigation commands
        const destinations = ["home", "about", "contact", "products"];

        // Send context using the sendContext method
        touchpoint.conversationHandler.sendContext({
          "nlx:vpContext": {
            url: window.location.origin,
            fields: context,
            destinations: destinations,
          },
        });
      }

      // Set up voice command handler
      function setupCommandHandler(touchpoint) {
        touchpoint.conversationHandler.addEventListener(
          "voicePlusCommand",
          (command) => {
            console.log("Voice command received:", command);

            switch (command.classification) {
              case "navigation":
                handleNavigation(command.action, command);
                break;

              case "input":
                handleFormInput(command);
                break;

              case "custom":
                handleCustomCommand(command.action, command);
                break;
            }
          },
        );
      }

      // Handle navigation commands
      function handleNavigation(action, command) {
        const destination = command.destination || command.data?.destination;

        switch (action) {
          case "page_next":
            window.history.forward();
            break;

          case "page_previous":
            window.history.back();
            break;

          case "page_custom":
            if (destination) {
              if (destination.startsWith("/")) {
                window.location.pathname = destination;
              } else {
                window.location.href = destination;
              }
            }
            break;
        }
      }

      // Handle form input commands
      function handleFormInput(command) {
        if (!command.fields) return;

        command.fields.forEach((field) => {
          if (formElements[field.id]) {
            const element = formElements[field.id];
            element.value = field.value;
            element.classList.add("voice-updated");

            // Trigger events for frameworks that listen to them
            element.dispatchEvent(new Event("input", { bubbles: true }));
            element.dispatchEvent(new Event("change", { bubbles: true }));

            setTimeout(() => {
              element.classList.remove("voice-updated");
            }, 2000);
          }
        });
      }

      // Handle custom commands
      function handleCustomCommand(action, command) {
        console.log("Custom command:", action, command.payload);

        // Example: Handle custom search command
        if (action === "search") {
          // Implement search functionality
          console.log("Searching for:", command.payload.query);
        }
      }

      // Initialize when page loads
      if (document.readyState === "loading") {
        document.addEventListener("DOMContentLoaded", initializeVoicePlus);
      } else {
        initializeVoicePlus();
      }
    </script>
  </body>
</html>
```