Bidirectional Voice Plus

Bidirectional Voice Plus enables voice-driven web experiences through bidirectional communication between your application and NLX. Users can navigate pages, fill forms, and interact with your site using natural voice commands.

Getting Started

Voice+ with bidirectional mode enabled requires the voiceMini input mode. This mode lets your application handle voice commands while maintaining a conversational flow with the user. The configuration below is all you need for an out-of-the-box experience that lets users navigate your site and fill out forms by voice.

```html
<html lang="en">
  <head>
    <title>Touchpoint Sample HTML</title>
    <meta name="viewport" content="width=device-width, initial-scale=1">
  </head>
  <body>
    <script type="module">
      import { create } from "https://unpkg.com/@nlxai/touchpoint-ui@1.1.4/lib/index.js?module";

      const touchpointOptions = {
        config: {
          applicationUrl: "YOUR_APPLICATION_URL",
          headers: {
            "nlx-api-key": "YOUR_API_KEY",
          },
          languageCode: "en-US",
        },
        input: "voiceMini", // Enables voice input with bidirectional support
        bidirectional: {},
      };

      const touchpoint = await create(touchpointOptions);
    </script>
  </body>
</html>
```

Voice Command Concepts

Enhanced Voice Plus supports three command types:

| Classification | Actions | Description |
| --- | --- | --- |
| navigation | page_next, page_previous, page_custom | Navigate between pages |
| input | Form field updates | Fill form fields with voice data |
| custom | Application-specific | Custom commands defined by your flow |
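Taken together, a handler for bidirectional mode can branch on the command's classification field. A minimal dispatcher sketch, assuming the command shapes shown in the payload examples on this page; dispatchCommand and the handlers object are illustrative names, not SDK API:

```javascript
// Route an incoming Voice+ command to the matching handler.
// The { classification, ... } shape mirrors the example payloads
// documented below; the three handler callbacks are supplied by you.
function dispatchCommand(command, handlers) {
  switch (command.classification) {
    case "navigation":
      return handlers.navigation(command.action, command.destination);
    case "input":
      return handlers.input(command.fields);
    case "custom":
      return handlers.custom(command.action, command.payload);
    default:
      console.warn("Unknown classification:", command.classification);
      return undefined;
  }
}
```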

Sending Page Context

Provide NLX with information about your page structure using the Voice Plus context API. This powers the input (form-fill) commands by giving NLX context about which fields are available and their types.

The analyzePageForms function scans your page for form elements and returns two important objects:

```javascript
import { analyzePageForms } from "@nlxai/touchpoint-ui";

// Analyze forms on the current page
const { context, formElements } = analyzePageForms();
```

When to Send Context

  1. On page load - Send initial context after touchpoint initialization
  2. On route changes - In SPAs, resend context when navigating to new pages
  3. After dynamic form updates - If forms are added/removed dynamically
  4. After significant DOM changes - When form structure changes

Best Practices

  • Always store the formElements reference for use in your command handlers
  • Re-analyze and resend context when your page structure changes
  • Use good form accessibility practices such as labeling fields
  • Provide form instructions for better voice recognition
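Putting these practices together, the context payload can be assembled in one place and re-sent whenever the page changes. A sketch assuming the payload shape shown in the example below; buildVpContext and sendVoicePlusContext are illustrative helper names, not SDK API:

```javascript
// Build the "nlx:vpContext" payload from analyzed form context and
// a list of navigable destinations.
function buildVpContext(context, destinations, url) {
  return {
    "nlx:vpContext": {
      url: url,
      fields: context,
      destinations: destinations,
    },
  };
}

// Illustrative wrapper: call this on page load, on route changes,
// and after dynamic form updates.
function sendVoicePlusContext(touchpoint, context, destinations) {
  touchpoint.conversationHandler.sendContext(
    buildVpContext(context, destinations, window.location.origin),
  );
}
```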

Sending Context Example

```html
<html lang="en">
  <head>
    <title>Touchpoint Sample HTML</title>
    <meta name="viewport" content="width=device-width, initial-scale=1">
  </head>
  <body>
    <script type="module">
      import { create, analyzePageForms } from "https://unpkg.com/@nlxai/touchpoint-ui@1.1.4/lib/index.js?module";

      const touchpointOptions = {
        config: {
          applicationUrl: "YOUR_APPLICATION_URL",
          headers: {
            "nlx-api-key": "YOUR_API_KEY",
          },
          languageCode: "en-US",
        },
        input: "voiceMini", // Enables voice input with bidirectional support
        bidirectional: {
          automaticContext: false,
        }, // Explicitly enable bidirectional mode
      };

      const touchpoint = await create(touchpointOptions);

      const { context, formElements } = analyzePageForms();
      // Store formElements for later use when handling commands
      window.formElements = formElements; // or use state management

      // Array of destinations for navigation commands
      const destinations = ["about", "contact", "pricing"];

      touchpoint.conversationHandler.sendContext({
        "nlx:vpContext": {
          url: window.location.origin,
          fields: context,
          destinations: destinations,
        },
      });
    </script>
  </body>
</html>
```

Navigation Command Handler

Customize the handler for voice-driven navigation between pages:

Payload from NLX

| Key | Value | Description |
| --- | --- | --- |
| action | page_next, page_previous, page_custom | Type of navigation action |
| destination | /about | Relative or absolute URL to navigate to |

Example Payload:

```json
{
  "classification": "navigation",
  "action": "page_next",
  "destination": "/about"
}
```

Sample Handler

Touchpoint includes an application-agnostic navigation handler built into the SDK that uses window-based navigation.

If you are using a framework like React, Vue, or Angular, you would use their respective routing libraries to handle navigation in a custom navigation handler.
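For example, a router-backed handler can be produced from any client-side router. A sketch, assuming a router object exposing forward(), back(), and push() methods (roughly what React Router and Vue Router instances provide; adjust the method names to your routing library):

```javascript
// Build a bidirectional navigation handler around a client-side router
// instead of window-based navigation. `router` is any object exposing
// forward/back/push (an assumption; adapt to your routing library's API).
function makeRouterNavigationHandler(router) {
  return function handleNavigation(action, destination) {
    switch (action) {
      case "page_next":
        router.forward();
        break;
      case "page_previous":
        router.back();
        break;
      case "page_custom":
        router.push(destination);
        break;
    }
  };
}
```

The returned function can then be passed as the `navigation` option in `bidirectional`.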

```html
<html lang="en">
  <head>
    <title>Touchpoint Sample HTML</title>
    <meta name="viewport" content="width=device-width, initial-scale=1">
  </head>
  <body>
    <script type="module">
      import { create } from "https://unpkg.com/@nlxai/touchpoint-ui@1.1.4/lib/index.js?module";

      function handleNavigation(action, destination) {
        switch (action) {
          case "page_next":
            // Navigate to next page
            window.history.forward();
            break;
          case "page_previous":
            // Navigate to previous page
            window.history.back();
            break;
          case "page_custom":
            // Handle relative or absolute navigation
            if (destination.startsWith("/")) {
              window.location.pathname = destination;
            } else {
              window.location.href = destination;
            }
            break;
        }
      }

      const touchpointOptions = {
        config: {
          applicationUrl: "YOUR_APPLICATION_URL",
          headers: {
            "nlx-api-key": "YOUR_API_KEY",
          },
          languageCode: "en-US",
        },
        input: "voiceMini", // Enables voice input with bidirectional support
        bidirectional: {
          automaticContext: false,
          navigation: handleNavigation,
        }, // Explicitly enable bidirectional mode
      };

      const touchpoint = await create(touchpointOptions);
    </script>
  </body>
</html>
```

Form Fill Command Handler

Touchpoint includes an application-agnostic form-input handler built into the SDK that sets the values received from NLX during a conversation.

To customize the form-fill behavior, build and register a custom input handler. NLX sends input commands with field IDs that match the IDs in the formElements object returned by analyzePageForms():

Important Notes:

  • The field.id in the command will match the element IDs in your formElements object
    • This is different from the element's own id attribute. It is unique to Voice+.
    • Voice+ generates its own id because the HTML element may not have an id at all, or the page might violate the HTML spec by assigning the same id to multiple elements.
  • Always check if the element exists before trying to update it
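The notes above can be folded into a small helper that applies a command's fields against the stored formElements map and reports any unmatched IDs. A sketch; applyFields is an illustrative name, and the formElements values are assumed to be the element references returned by analyzePageForms:

```javascript
// Apply incoming field values to the stored formElements map.
// Returns the IDs that could not be matched, so callers can log them.
function applyFields(fields, formElements) {
  const missing = [];
  fields.forEach((field) => {
    const element = formElements[field.id];
    if (element) {
      element.value = field.value;
    } else {
      missing.push(field.id);
    }
  });
  return missing;
}
```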

Payload from NLX

| Key | Value | Description |
| --- | --- | --- |
| classification | input | Indicates this is a form fill command |
| fields | Array of field objects | Each object contains id and value |
| id | NLX's unique identifier for the form field | Used to match the ID in your formElements. This is different from the element's own id attribute; it is unique to Voice+ |
| value | Value to set for the form field | The value to fill in the form field |

Example Payload:

```json
{
  "classification": "input",
  "fields": [
    { "id": "firstName", "value": "John" },
    { "id": "email", "value": "john@example.com" }
  ]
}
```

Sample Handler

```html
<html lang="en">
  <head>
    <title>Touchpoint Sample HTML</title>
    <meta name="viewport" content="width=device-width, initial-scale=1">
  </head>
  <body>
    <script type="module">
      import { create } from "https://unpkg.com/@nlxai/touchpoint-ui@1.1.4/lib/index.js?module";

      function handleFormInput(fields, formElements) {
        fields.forEach((field) => {
          // Use the stored formElements to find the DOM element
          if (formElements[field.id]) {
            const element = formElements[field.id];
            element.value = field.value;
          } else {
            console.warn(`Field with id "${field.id}" not found in formElements`);
          }
        });
      }

      const touchpointOptions = {
        config: {
          applicationUrl: "YOUR_APPLICATION_URL",
          headers: {
            "nlx-api-key": "YOUR_API_KEY",
          },
          languageCode: "en-US",
        },
        input: "voiceMini", // Enables voice input with bidirectional support
        bidirectional: {
          input: handleFormInput,
        }, // Explicitly enable bidirectional mode
      };

      const touchpoint = await create(touchpointOptions);
    </script>
  </body>
</html>
```

Custom Command Handler

Implement application-specific voice commands by attaching a knowledge base to your Voice+ node in the flow builder. This allows you to define custom actions that can be triggered by voice commands.

Enriching the Knowledge Base

To enrich the article Q&A Knowledge Base responses with custom Voice+ commands, add metadata to each response.

There are built-in metadata keys that trigger the input or navigation classifications, but you can also define your own custom actions.

| Key | Classification | Action | Description |
| --- | --- | --- | --- |
| nlx:destination | navigation | page_custom | Navigate to a specific page or section |
| nlx:action | custom | Custom action name | Send custom actions to the frontend |
| nlx:actionPayload | custom | Custom action data | Optional; only taken into account if the nlx:action key is present. Sent as the payload key to the frontend, with classification custom and action set to the nlx:action value |
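To illustrate how dotted nlx:actionPayload.* metadata keys combine into the single payload object the frontend receives, the sketch below models that expansion. This is a model of the behavior described above for explanation purposes, not an NLX API; the string-to-boolean coercion is our assumption:

```javascript
// Expand metadata entries like { "nlx:actionPayload.dog": "true" }
// into the payload object delivered with a custom command.
function expandActionPayload(metadata) {
  const prefix = "nlx:actionPayload.";
  const payload = {};
  for (const [key, value] of Object.entries(metadata)) {
    if (key.startsWith(prefix)) {
      const field = key.slice(prefix.length);
      // Metadata values are strings; coerce simple booleans (assumption).
      payload[field] =
        value === "true" ? true : value === "false" ? false : value;
    }
  }
  return payload;
}
```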

Example Payloads

Suppose I want to create a custom command that sends users to the contact page when they'd like to get in touch about animal policy along with extra information.

I create a new Article in the Knowledge Base attached to the Voice+ Node with the following content:

  • Question: How do I get in touch about animal policy?
  • Answer: You can contact us about animal policy by visiting our Contact Page.
| Metadata key | Value |
| --- | --- |
| nlx:destination | contact |
| nlx:action | animalPolicy |
| nlx:actionPayload.dog | true |
| nlx:actionPayload.cat | true |

I will receive TWO payloads from NLX when this article is triggered: one for the navigation command and one for the custom command.

Example Navigation Command:

```json
{
  "classification": "navigation",
  "action": "page_custom",
  "destination": "contact"
}
```

Example Custom Command:

```json
{
  "classification": "custom",
  "action": "animalPolicy",
  "payload": {
    "dog": true,
    "cat": true
  }
}
```

Sample Handler

```html
<html lang="en">
  <head>
    <title>Touchpoint Sample HTML</title>
    <meta name="viewport" content="width=device-width, initial-scale=1">
  </head>
  <body>
    <script type="module">
      import { create } from "https://unpkg.com/@nlxai/touchpoint-ui@1.1.4/lib/index.js?module";

      function handleCustomCommand(action, payload) {
        // Example: apply animal-policy settings received from the knowledge base.
        // setDogPolicy and setCatPolicy are application-defined functions.
        if (action === "animalPolicy") {
          setDogPolicy(payload.dog);
          setCatPolicy(payload.cat);
        }
      }

      const touchpointOptions = {
        config: {
          applicationUrl: "YOUR_APPLICATION_URL",
          headers: {
            "nlx-api-key": "YOUR_API_KEY",
          },
          languageCode: "en-US",
        },
        input: "voiceMini", // Enables voice input with bidirectional support
        bidirectional: {
          custom: handleCustomCommand,
        }, // Explicitly enable bidirectional mode
      };

      const touchpoint = await create(touchpointOptions);
    </script>
  </body>
</html>
```

Complete Custom Implementation Example

A comprehensive example implementing voice-driven form filling and navigation:

```html
<!doctype html>
<html lang="en">
  <head>
    <meta charset="UTF-8" />
    <title>Enhanced Voice Plus Example</title>
    <style>
      @keyframes highlight {
        0% {
          background-color: #ffeb3b;
        }
        100% {
          background-color: transparent;
        }
      }
      .voice-updated {
        animation: highlight 2s ease-out;
      }
    </style>
  </head>
  <body>
    <h1>Enhanced Voice Plus Demo</h1>

    <form id="contact-form">
      <input
        type="text"
        id="firstName"
        name="firstName"
        placeholder="First Name"
      />
      <input
        type="text"
        id="lastName"
        name="lastName"
        placeholder="Last Name"
      />
      <input type="email" id="email" name="email" placeholder="Email" />
      <input type="tel" id="phone" name="phone" placeholder="Phone" />
      <textarea id="message" name="message" placeholder="Message"></textarea>
      <button type="submit">Submit</button>
    </form>

    <script type="module">
      import {
        create,
        analyzePageForms,
      } from "https://unpkg.com/@nlxai/touchpoint-ui@1.1.4/lib/index.js?module";

      // Initialize Enhanced Voice Plus with bidirectional support
      let formElements = {};

      async function initializeVoicePlus() {
        // Create touchpoint with voiceMini and bidirectional enabled
        const touchpoint = await create({
          config: {
            applicationUrl: "YOUR_APPLICATION_URL",
            headers: {
              "nlx-api-key": "YOUR_API_KEY",
            },
            languageCode: "en-US",
          },
          input: "voiceMini",
          bidirectional: {
            automaticContext: false,
            navigation: handleNavigation,
            input: handleFormInput,
            custom: handleCustomCommand,
          },
        });

        // Send initial page context
        sendPageContext(touchpoint);

        return touchpoint;
      }

      // Send page context to NLX
      function sendPageContext(touchpoint) {
        const { context, formElements: elements } = analyzePageForms();
        formElements = elements;

        // Array of destinations for navigation commands
        const destinations = ["home", "about", "contact", "products"];

        // Send context using the sendContext method
        touchpoint.conversationHandler.sendContext({
          "nlx:vpContext": {
            url: window.location.origin,
            fields: context,
            destinations: destinations,
          },
        });
      }

      // Handle navigation commands
      function handleNavigation(action, destination) {
        switch (action) {
          case "page_next":
            window.history.forward();
            break;
          case "page_previous":
            window.history.back();
            break;
          case "page_custom":
            if (destination.startsWith("/")) {
              window.location.pathname = destination;
            } else {
              window.location.href = destination;
            }
            break;
        }
      }

      // Handle form input commands
      function handleFormInput(fields) {
        fields.forEach((field) => {
          if (formElements[field.id]) {
            const element = formElements[field.id];
            element.value = field.value;
            element.classList.add("voice-updated");

            // Trigger events for frameworks that listen to them
            element.dispatchEvent(new Event("input", { bubbles: true }));
            element.dispatchEvent(new Event("change", { bubbles: true }));

            setTimeout(() => {
              element.classList.remove("voice-updated");
            }, 2000);
          }
        });
      }

      // Handle custom commands
      function handleCustomCommand(action, payload) {
        console.log("Custom command:", action, payload);

        // Example: handle a custom search command
        if (action === "search") {
          // Implement search functionality
          console.log("Searching for:", payload.query);
        }
      }

      // Initialize when page loads
      if (document.readyState === "loading") {
        document.addEventListener("DOMContentLoaded", initializeVoicePlus);
      } else {
        initializeVoicePlus();
      }
    </script>
  </body>
</html>
```