Venky Naganathan

SE Radio 480: Venky Naganathan on Chatbots for Enterprise Applications

Venky Naganathan, Sr. Director of Engineering at Conga specializing in Artificial Intelligence and Chatbots, talks about the buzz around conversational UI and how B2B companies have started adopting the paradigm. Host Kanchan Shringi speaks with Naganathan about the need for this new UI paradigm for enterprise apps, what the enablers have been, and the business use cases suited to the paradigm. They discuss the basics of Natural Language Processing (NLP) and Natural Language Understanding (NLU), as well as the roles that machine learning and deep learning play. The discussion then drills into the platform services required to implement a conversational assistant for enterprise apps and the challenges that are unique to B2B use cases. The discussion also covers staffing a team, testing, and some insights into build vs. buy for the NLU engine.

This episode is sponsored by SingleStore and NetApp.


Show Notes

Related Links

Transcript

Transcript brought to you by IEEE Software
This transcript was automatically generated. To suggest improvements in the text, please contact [email protected].

SE Radio 00:00:00 This is Software Engineering Radio, the podcast for professional developers, on the [email protected] SE Radio is brought to you by the IEEE Computer Society, as well as IEEE Software magazine, online at computer.org/software. Support for this episode comes from SingleStore. Scaling modern SaaS is hard. It's even harder when you're using databases that were built the last time boy bands and low-rise jeans were all the rage. SingleStore is a modern cloud database built for today's data-intensive applications. For the smart SaaS generation, SingleStore powers the new wave of SaaS technologies displacing legacy providers. Use it to break the data bottleneck and reduce your monthly cloud bill in the process. Learn how SingleStore can help [email protected] slash se radio.

Kanchan Shringi 00:00:50 Hello everyone. This is your host Kanchan Shringi. Welcome to Software Engineering Radio. Today our guest is Venky Naganathan. Venky is Senior Director of Engineering at Conga, specializing in artificial intelligence and chatbots. In the last two years, Venky has led the Apttus (now Conga) Max AI mobile assistant for quote-to-contract from a cool new lab concept to a live product. Max is a Siri, Google Assistant, and Alexa-like AI-powered intelligent platform that acts as a personal assistant for sales and marketing personnel. Prior to Conga, Venky was director of engineering for the AI platform [email protected]. Welcome Venky, really looking forward to our conversation today. Is there anything else you'd like to add to your bio?

Venky Naganathan 00:01:40 Um, no, that looks great. I'm looking forward to our chat. Thank you.

Kanchan Shringi 00:01:44 Let's jump in with a very basic question. What was the need for this new conversational AI for your B2B app, and why now?

Venky Naganathan 00:01:55 So I think traditional user interfaces are suitable for use cases that we can anticipate in advance. For example, if you're building a banking website, your traditional user interface is good for users logging in and retrieving their basic account information and transaction history, but I'm sure the application is capable of delivering a far more diverse set of information at much greater depth. For instance, if the user has questions like "what's my average daily balance?" or "how many ATM transactions were performed in the last month?", unless the user interface has these options, there's no way for the user to retrieve this information. Chatbots help collapse the complexity of your user interfaces and deliver far more information using a much simpler UX. So that's essentially how I view chatbots and virtual assistants for most applications. And also, if you look at it from a historical perspective, the virtual assistant is basically a natural evolution of user interfaces. At one point we used to live on command-line interfaces, and then they became GUIs, and then they got subsumed into the web interface, and soon it was on mobile. Nowadays applications are becoming more complex and users are becoming more demanding. So to address all of the use cases and deliver an optimal UX, you've got to mix traditional user interfaces with non-traditional ones.

Venky Naganathan 00:03:26 That's where it fits in.

Kanchan Shringi 00:03:29 So why is it taking off now though? Was this need not felt earlier or is it just that we have the technology at this time?

Venky Naganathan 00:03:38 Certainly the rise of Siri, Alexa, and Google Assistant. Platforms like those have popularized this, right? A lot of users are beginning to use them and they consider them useful. And also the technology has evolved to a point where building such technology, such innovation, in a credible manner is possible now.

Kanchan Shringi 00:04:01 Tell us a little bit more about Max. You certainly alluded to the fact that the technological evolution helped, and there was a business need too, but maybe you can tell us a little bit more about the business aspects as well.

Venky Naganathan 00:04:14 Yeah. So first of all, Max is, like you said, a Siri, Alexa, Google Assistant-like virtual assistant. It's capable of information retrieval and skill automation, and also it's capable of many other things, which I'll talk about as we go on. There are a couple of reasons why we really built Max. First is the business need, and second is that we also saw an interesting industry trend. On the business need, before I talk about it, I need to give you a little bit more context about what we do at Conga. At Conga, we build software applications that help our customers run their revenue operations. Our typical end-user is an enterprise salesperson. They're not the only persona, but they are the typical persona for our products. And the typical use case is a salesperson taking a pre-sales opportunity all the way through the quotation process, to turning that into a contract and managing the redlining process, then getting that signed and managing the life cycle of contracts, turning all of that into cash, et cetera.

Venky Naganathan 00:05:25 So this is the quote-to-cash domain, and workflows in this domain tend to be really complex. So we needed a tool or an engine that can simplify user experience and improve productivity in this domain, right? That's the genesis of Max. And second, on the industry trend: not too long ago, when Steve Jobs introduced the iPhone, probably a decade and a half or so ago, BlackBerry used to be the dominant brand in the enterprise. Today you don't see BlackBerrys; they have been completely replaced by smartphones. Similarly, once upon a time you used to make phone calls to your colleagues to collaborate; today we use messaging, right? So these are all examples of strong consumer trends pushing an incumbent trend out and making their way into the enterprise. You see a similar thing happening with chatbots, right? The advent of Siri, Alexa, Google Assistant, like you said, has improved user experience in the consumer world, and we think that it's just a matter of time before they find their way into the enterprise. It's still very early stages, but it's going to happen. So those are really the drivers for Max.

Kanchan Shringi 00:06:43 Can we take an example of a question that Max is asked, and then talk about what distinguishes Max from the others?

Venky Naganathan 00:06:52 Yeah. So you can ask Max to automate a workflow or create a workflow. For example, you can tell Max to create an NDA for Google and send it to Kanchan for signature. Max will automatically figure out all of the details involved in orchestrating the workflow and accomplish that; it's like your virtual secretary. Or you can tell Max to pull up information that satisfies a particular criterion. For instance, you can tell Max to look up all MSAs over $20,000 from last year that are set to expire by this quarter, and Max will understand the nuances in the sentence and pull up the exact information. Max is also capable of providing you just-in-time notifications of important events that happen in the system. It behaves exactly like a very intelligent human secretary, constantly reminding their bosses what needs to be done and also helping the bosses be productive. In terms of differences between Max and the conventional bots that you see out there:

Venky Naganathan 00:07:56 So first, let me talk about Google Assistant, Alexa, and Siri. They are more encyclopedic in nature; they go a lot broader. They try to address all possible questions, but they are a little shallow. You can ask them things like "how's the weather in San Francisco?" or "when is Thanksgiving next year?" Typically they answer one or two questions, and that's the scope of those assistants. You can still go deeper; you can build vertical skills with Alexa, for instance, but if you do that, you will end up creating an application like Max. Max is more comparable to the assistants that you see on travel websites and retail websites and so on. It's purpose-built for a particular domain, to serve use cases in that domain, right? One of the things that we did before we built Max was exhaustive research on what's out there, to understand how to build Max.

Venky Naganathan 00:08:59 And what we found is that the majority of the bots on retail websites and media websites over-promise and under-deliver. They are usually very rigid. What that means is they're just a chatbot-like interface, but they behave like traditional menus, right? You pretty much have to do what they expect you to do. If you veer off course, then you end up with a suboptimal user experience, and typically users notice that quickly and disengage right away, which is why these bots don't work. Max is built more like Google Maps. What that means is it helps the user, guides the user, stays with the user, and helps the user get where they want. But if the user veers off course, it stays with the user and still continues to help. And as I talk a little bit about the nuances of what goes into this and how to build it, you will understand the capabilities that we have thought about.

Kanchan Shringi 00:10:09 So will Max co-exist with your web and mobile UX, or, over time, do you expect it to take over more and more of that?

Venky Naganathan 00:10:21 I see it as a supplementary interface to what exists, right? Chatbots won't work just because they are exciting. This is a common misconception that people have: oh, I built this cool new thing, everybody's going to start using it. First of all, you have to think of them as a tool to improve productivity in specific contexts, right? Also, for them to be successful, they have to do skill automation. It cannot just be that it saves 14 clicks; that's not why people use a tool, right? More generally speaking, if you rewind to maybe 15 or 20 years ago, when you called an airline, you would have heard "Welcome to XYZ Airline. Press one for English, two for Spanish." And when you pressed one, you heard "one for flight status, two for reservations," et cetera. There were these hierarchies of menus, and each menu had nested menus.

Venky Naganathan 00:11:16 And then it led to more nested menus. The complexity of the UI was just through the roof, and if you made a single mistake, you probably had to start all over again. It made for a really frustrating user experience. So think about how the same menus are delivered today. Today it asks you what you want; if you want to check flight status, it pretty much takes you there right away. That's a classic example of how people have used virtual assistants to simplify the user experience. And in the case of Max, I talked about creating an NDA and sending it to somebody for signature. There is a little bit of skill involved in creating an NDA: you need to choose the right templates that conform, you need to understand a little bit of legal, and you also need to know your company processes to send it to somebody for signature and get it all done.

Venky Naganathan 00:12:14 So that is an opportunity for us to do skill automation. By just giving a simple command, you have a tool that behaves like a secretary and accomplishes exactly what you want. In addition, you can think of these virtual assistants as something that can help you access information on the go. If you are on a mobile in an airport, you can access information quickly, you can get reminders or notifications and take follow-up actions on them, et cetera. So these are all the different scenarios where we think these bots or virtual assistants will be useful. I don't think of them as interfaces that can completely replace the traditional interfaces out there, because traditional interfaces are still very valuable. If you're doing something really lengthy and tedious, think about creating a quote with 10,000 line items: there's no way you're going to do that effectively with a bot. You would much rather have an Excel-like spreadsheet tool where you have complete visibility and control of everything, and the flexibility to go and tweak anything you want. So the traditional interfaces are still going to be there, and I see bots and virtual assistants as an opportunity to supplement those user interfaces in a targeted manner and make them a lot more useful and productive.

Kanchan Shringi 00:13:38 So my takeaway from what we've talked about so far is that this new assistant is purpose-built; it has to be really conversational, not rule-based, and capable of deep conversation, handling follow-up questions. Given that, can we now discuss what it takes to build a system like this?

Venky Naganathan 00:14:00 Absolutely. I think there are four major pieces that go into building a really good virtual assistant, so let me talk through them a bit. First is the user interface. The second is the NLU engine, with all of the capabilities that you need to build a sophisticated virtual assistant. The third piece is a dialogue orchestrator; you can think of it like a little operating system for running your bot application. And then finally, your actual bot application that runs through your use cases and workflows, right? So let's start with the user interface. You need to think about where you place the bot: how do you surface the bot? Typically bots don't exist in a vacuum. It's very rare for a bot to be the only application that delivers all of the user experiences that you want to deliver.

Venky Naganathan 00:14:58 Typically they coexist as part of a larger ecosystem. So typically people put these virtual assistants on their websites. It's a pretty straightforward interface: you just have a way to launch your bot application on your website, and then you can engage with the user using either voice or text; typically it's text. Or you can surface your bot application as part of your IVR, like you've seen with the airline companies. You may already have an IVR, in which case you just attach your bot to that, or build a little IVR app and attach it to your existing system. Or you could consider placing a bot on social channels: if you already have an existing channel on Facebook Messenger, or WhatsApp, or one of these social channels, you could consider placing a bot there as part of that ecosystem. Or you could consider placing it in a mobile app.

Venky Naganathan 00:16:00 If you already have one, right? In the absence of any of these alternatives, you want to think about surfacing the bot by meeting the user where the user already lives, as opposed to dragging the user to your own app. This is a classic mistake; a lot of people think, oh, I need to build a mobile app. What you want to think about is this: I have an iPhone 12 and I have 24 apps on my home screen. How many of them do I really launch on a daily basis? Maybe one or two. And then there are three or four pages of these apps; I very rarely or never launch an app on the third or fourth page. So if you're going to pour a lot of R&D dollars into building an application, you'd better make sure that it is going to be useful to an end-user.

Venky Naganathan 00:16:48 And that they're going to use it on a daily basis. Otherwise, you may consider placing a bot in existing channels. We build for users who live in the enterprise, and they already use tools like Slack and Teams, so we thought it more appropriate for Max to live in those ecosystems. Max can be accessed via Slack and Teams, and that makes it easier for us to solve user adoption problems and so on, and it reduces the friction. So that's the first decision you need to make. Once you've decided how you want to surface it, you build the appropriate integrations and the appropriate surface to deliver your output as well as solicit the input. Then come the NLU capabilities. Here you typically have to make a buy-or-build decision. Unless you're trying to differentiate your technology on NLU itself, which is pretty rare, or you are operating at such mind-numbing scale that the economics wouldn't work out, you probably want to consider buying this technology; there are a lot of good ones out there.

Kanchan Shringi 00:18:05 So NLU, meaning natural language understanding. And as you’re talking, maybe you can also distinguish that from NLP or natural language processing.

Venky Naganathan 00:18:15 Sure. So let me talk through the components, and then we'll talk about NLP and NLU, et cetera. I was talking about the potential buy-versus-build on the NLU, and I was leaning more toward buy; there is a lot of pretty good technology out there. There is Microsoft LUIS, IBM Watson, Google Dialogflow, and many more that you can consider. So the NLU helps you take user inputs, process them, and understand the semantics of what is being spoken, so you know what to process and how to process it, right? In addition to the NLU engine that I suggested buying, you probably need to augment it with your own domain-specific NL handling. For example, in the case of Max, we had to build a stack of NL technologies and parsers that understood business speak: currencies, timelines. So our users can say things like "pull up quotes that are set to expire by the first work day of the third quarter of 2021."

Venky Naganathan 00:19:26 This is a complex timeline expression, right? In order for us to process this correctly, we need to translate these types of expressions into accurate dates. So in addition to the generic NLU handling that understands the semantics, you also need to solve these vertical or domain-specific NL expressions, so you typically end up building on top of the NLU engine that you use. That's the NL piece. Once you've received the input, processed it, understood the semantics, and handled all of this complex business speak, you have all of the information organized to take it to the next step. And then comes your application-level processing. Here you need to think about whether you need to build any further infrastructure in order to process this. For example, we had to solve a natural-language-to-query translation problem so we could accurately translate the information the user spoke into actual queries that can be run on the database, so the appropriate results are produced.
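
To make the domain-specific NL handling concrete, here is a minimal sketch in plain Python (an illustration only, not Conga's actual parser) of resolving the timeline expression above once an NLU layer has already extracted the pieces: quarter=3, year=2021, ordinal="first work day". It ignores holidays; a real business-speak parser would consult a business calendar.

```python
from datetime import date, timedelta

def quarter_start(year: int, quarter: int) -> date:
    """First calendar day of the given quarter."""
    return date(year, 3 * (quarter - 1) + 1, 1)

def first_work_day(d: date) -> date:
    """Roll forward past Saturday (5) and Sunday (6) to the next weekday."""
    while d.weekday() >= 5:
        d += timedelta(days=1)
    return d

# "first work day of the third quarter of 2021"
print(first_work_day(quarter_start(2021, 3)))  # 2021-07-01, a Thursday
```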

Venky Naganathan 00:20:35 So that's again a part of your NLU system that you need to build, right? Okay, so that's the NLU part. Then there is the dialogue orchestrator, which I would describe as a mini operating system: a capability to host your bot application. Once you understand what the user said, you need to hand it over to a bot. You need to initialize the bot, you need to provide the space for the bot to execute (a container-type infrastructure to execute in), and you need to initialize the state of the bot. Then, as the bot executes step by step, you need to stay with the bot and help it execute all of these steps. And when the bot produces an output, you need to hand it back to the user interface, and then you need to swap the bot out and persist all of that state. And multiple bots could potentially exist in this ecosystem.

Venky Naganathan 00:21:30 So that's what the dialogue orchestrator does. Now, is this a buy or a build? If you are building a one-off bot, one or two bots, you can consider purchasing this capability as part of the core NLU engine itself, because some of the NLU vendors try to solve this for you, though usually they are very limited. But if you are building something generic and you're going to build hundreds of bots, you may want to consider building your own dialogue orchestrator, because that gives you a lot of flexibility. And then finally you have the bot application. The bot application is the one that actually understands the user input, executes the use case, and produces results, right? Here there are a lot of best-practice considerations you may want to think about. First, when a bot is launched, you probably need to offer the user some kind of menu or a customized welcome screen that invites the user to engage in a productive manner.

Venky Naganathan 00:22:34 It helps the user understand what the possibilities are. And then as the user moves along, you want to offer the user proactive tips, right? If the user pulls up accounts, you want to tell the user what they can do with these accounts, because unlike traditional user interfaces, it's not very intuitive what the bots can do, and bots can potentially have infinite capabilities. How does the user discover these capabilities? You need to help the user understand what's possible without being too invasive. You also want to offer users help that is context-sensitive: if the user is looking at payment options and asks for help, you want to offer help related to payment options. The other thing that is important to consider is topic switches. As humans, we tend to switch topics.

Venky Naganathan 00:23:33 That's how we think: when you're asked a question, you often ask a counter-question in order to uncover the answer to the previous question. That is how we think, but bots don't naturally understand it. If the bots are rigid and don't allow the user to ask these questions, they'll be pretty limited and not very useful. It's kind of like a little subroutine call from your main thread, if you're a programmer, right? Allow the user to do the topic switch, go explore this new world, and then help them come back. Navigation controls are another consideration. This is a constant source of frustration for users: bots often take the user down a path and then box the user into that little space, and the user cannot come out of it. This makes for a really frustrating user experience. The user may have gone there by mistake, because they didn't know what exists there, but you need to allow the user to come back.

Venky Naganathan 00:24:28 "Go back two steps." "Go back to the previous step." "Let's go back to payments." Right? Users should have the ability to navigate the bot from wherever they are. And bonus points for handling graceful endings: usually users say "thank you," and it's good to say "you're welcome" as opposed to meeting that with silence. And then sometimes users don't necessarily ask the questions that you are anticipating, right? Users may have a related question, but still a relevant question. If I go to a Best Buy website and the bot, let's say, can handle questions about TVs and DVRs, I may have a question about the store location: where is the store nearest to my home? And if the bot cannot answer that, then it doesn't help me. So being able to handle related questions that are still valid questions goes a long way in establishing trust and credibility with the user.

Venky Naganathan 00:25:34 Other capabilities you should consider building are some sort of mechanism for the user to provide feedback, because you don't want to assume the bot works perfectly and produces good results. That's one of the challenges: when a user asks something and you produce something, you don't know in software whether you produced the right output or not. It's the user who's going to judge, so give the user an opportunity to provide feedback, then listen to this feedback and learn. Handle chatbot lingo if you can, because these days terms like BRB and TTYL are pretty common; that's what users use in other chat tools, and they tend to do the same thing with your bot, so understand those things. Handle profanity: unfortunately, no matter how good your bot is, sometimes it can make for a frustrating user experience, and this can cause users to swear at the bot.

Venky Naganathan 00:26:35 So be prepared to handle that. And use analytics to track the usage of your bot, track the performance of your bot, and then pay attention to this data and keep refining the bot. So these are basically the characteristics of your bot application. Overall, to summarize: you need to think about where to place your bot in the user interface and how best to deliver it. Second, you need to think about the NLU capabilities that you want to have. Then you need to think about a dialogue orchestrator, and finally you want to think about the actual details of the bot application itself. Those are the pieces that go into building a really good and powerful bot that makes the end-user productive.
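
Several of the bot-application practices above (a welcome menu, context-sensitive help, a navigation stack for "go back," and graceful endings) fit in a surprisingly small loop. Here is a toy sketch of the pattern in Python; it is an illustration only, not Max's implementation:

```python
class BotSession:
    """Toy bot application: welcome menu, context-sensitive help,
    a breadcrumb stack for 'go back', and a graceful ending."""

    def __init__(self):
        self.stack = []  # topics the user has visited, newest last

    def handle(self, utterance: str) -> str:
        text = utterance.lower().strip()
        if text in ("hi", "hello", "start"):
            return "Welcome! I can help with quotes, contracts, and approvals."
        if text in ("thank you", "thanks"):
            return "You're welcome!"  # graceful ending, never silence
        if text == "go back":
            if self.stack:
                self.stack.pop()
            here = self.stack[-1] if self.stack else "the main menu"
            return f"Okay, back to {here}."
        if text == "help":
            here = self.stack[-1] if self.stack else "the main menu"
            return f"Here's what you can do with {here}..."  # contextual help
        self.stack.append(text)  # treat anything else as a topic switch
        return f"Looking into {text}. Say 'go back' anytime to return."

bot = BotSession()
for line in ("hello", "contracts", "help", "go back", "thanks"):
    print(f"> {line}\n  {bot.handle(line)}")
```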

Kanchan Shringi 00:27:22 A couple of follow-up questions on that. You mentioned there could be hundreds of bots, or at least several. So what is the scope of a bot, typically, in this context? And I assume there are certainly multiple conversations going on, so there must be some work to manage the state across these as well. Can you comment on these two things?

Venky Naganathan 00:27:46 Yeah. So if you're a B2B vendor building bots for multiple businesses, each bot could be a bot that serves a different tenant. Or even within the same tenant, you can have a payment journey, an FAQ journey, an account access and cancellation journey. Say you're in the airline business: there would be a flight status bot, a reservations bot, a mileage account access bot, and so on. These bots could be totally independent, or they could interplay with each other; all of those are possibilities, right? How these bots actually interact with each other, the details of that, lives within the bot applications, but the dialogue orchestrator provides the infrastructural piece for you to build these bots so that they can interplay with each other more effectively.
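
The orchestrator described here (initialize a bot, run it a step at a time, persist its state, swap it out) can be pictured as a small routing loop over pluggable bots. A minimal sketch, assuming an intent has already been resolved upstream and using a plain dict as the stand-in state store:

```python
import json

class Orchestrator:
    """Minimal dialogue-orchestrator pattern: route each utterance to the
    right bot, restore its persisted state, run one step, persist again."""

    def __init__(self, bots, store):
        self.bots = bots    # {intent_name: bot_function}
        self.store = store  # any key-value store; a dict stands in here

    def handle(self, session_id: str, intent: str, utterance: str) -> str:
        bot = self.bots[intent]                               # pick the bot
        state = json.loads(self.store.get(session_id, "{}"))  # restore state
        reply, state = bot(utterance, state)                  # run one step
        self.store[session_id] = json.dumps(state)            # persist, swap out
        return reply

def flight_status_bot(utterance, state):
    state["turns"] = state.get("turns", 0) + 1
    return f"Checking flight status for '{utterance}' (turn {state['turns']})", state

orch = Orchestrator({"flight_status": flight_status_bot}, store={})
print(orch.handle("user-1", "flight_status", "UA 22 tomorrow"))
print(orch.handle("user-1", "flight_status", "and the return leg?"))
```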

Kanchan Shringi 00:28:42 Thanks for that, Venky. Let's now drill deeper into NLP and NLU, and also where machine learning and deep learning fit here.

Venky Naganathan 00:28:53 Absolutely. So all of them are related. Think of artificial intelligence as a superset; machine learning is a part of that, and NLP and NLU are part of machine learning, as is deep learning. Let's talk about them one at a time. NLP and NLU are close cousins: you can think of them as overlapping, or as one within the other. NLP is basically a set of techniques to take user input, process it, and prepare it for the next steps. So when you say "I am running a hundred meters in the Olympics," there are a lot of words in that sentence. NLP tries to tokenize your input and split it into tokens. Then it does stop-word removal to remove unnecessary words.

Venky Naganathan 00:29:49 It does root-word identification by doing something called stemming and lemmatization. In the case of "running," running is not as interesting as the word "run"; run is really the root word. So it does all of this preparation to ensure that the subsequent steps can be performed more easily. That's essentially what NLP tries to do. NLU is understanding the semantics of these utterances: what is the user really trying to say? What is the intent behind the statement? For example, if the user says "pull up contracts over 20K from last year that are set to expire next year," the user simply wants to look up contracts, or agreement records. That is the intent. And there are several parameters to this intent: "over 20K," "from last year," "expire next year" are the various parameters of this sentence.
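
The preprocessing steps named here (tokenization, stop-word removal, lemmatization) can be tried out with NLTK. A minimal sketch, assuming the NLTK data packages have been downloaded once as shown:

```python
# pip install nltk
import nltk
for pkg in ("punkt", "punkt_tab", "stopwords", "wordnet"):
    nltk.download(pkg, quiet=True)  # one-time corpus downloads

from nltk.corpus import stopwords
from nltk.stem import WordNetLemmatizer
from nltk.tokenize import word_tokenize

sentence = "I am running a hundred meters in the Olympics"

tokens = word_tokenize(sentence.lower())          # tokenization
stop = set(stopwords.words("english"))
content = [t for t in tokens if t not in stop]    # stop-word removal
lemmatize = WordNetLemmatizer().lemmatize
roots = [lemmatize(t, pos="v") for t in content]  # lemmatization

print(content)  # ['running', 'hundred', 'meters', 'olympics']
print(roots)    # 'running' is reduced to its root word 'run'
```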

Venky Naganathan 00:30:49 So NLU tries to understand the parameters for the intent, and in some cases you probably want to understand the sentiment, and so on. This is essentially what NLU tries to do: NLP works with the syntax and the pre-processing, and then NLU works with the semantics. All right, now let's switch to machine learning. All of this is part of machine learning, but for us to understand the machine learning part, let's understand how the world would be without machine learning, right? If you were to take a bunch of user utterances and try to figure out what the user is trying to say, you'd probably want to write a whole bunch of rules: okay, if the sentence has some lookup action, and it refers to an object like contracts, and it

Venky Naganathan 00:31:48 refers to some parameters, then the user is probably talking about looking up contracts. But "contracts" may not be the only word the user uses; the user may use many other words to describe contracts. "Agreements" would be one way to describe contracts. So you code all of these rules in your application: if this is the case, or this is the case, or this is the case, then the user is trying to look up a contract. And then you will probably notice some exceptions to these rules; the user may combine multiple objects in one sentence, so the rule space you coded is not adequate to handle that. So you code more rules, and more rules, and more rules. One thing that is guaranteed to happen with rules-based programming is that your rule set is going to keep on growing, and at some point your rules may conflict with each other, in which case you need to code exception rules, and so on.

Venky Naganathan 00:32:41 This makes your program very complex, and rules are something only an expert programmer can write; a non-expert programmer or a non-expert user cannot build these rules. That makes the application really hard to build. Machine learning basically turns this paradigm on its head: why do we have to write these rules? Is there a way for us to synthesize these rules automatically? So how do you synthesize them? Machine learning is basically the art of writing a program that takes training examples and then tries to automatically build these rules, right? If you are trying to recognize an image, the training examples could be: okay, this image is a cat, this image is a dog, this image is some other animal, and so on. And the machine learning algorithm tries to understand the input and your labels and tries to make an association.

Venky Naganathan 00:33:38 Okay, I found these things, and the user said this is actually a cat, so I think these four things could mean cat. It can still lead to ridiculous rule-building; that's where your training examples have to be adequate, otherwise it's garbage in, garbage out. With this type of programming paradigm, it's easy to scale your rule set by providing more and more training examples. And this is something a non-expert user can do: training is something that business analysts can do, and that makes it easier for you to build it out. Data is something that you're going to constantly keep getting as more users interact with the system, and it may be possible to automate this training to a degree, so your algorithm gets smarter and smarter. So that is machine learning. Now onto deep learning. Machine learning is good and powerful, but it doesn't keep on getting smarter as you add more data, which is what you want; beyond a point, the smartness of the algorithm saturates.
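
Here is the "training examples instead of hand-coded rules" idea in miniature: a sketch with scikit-learn on a toy data set. The utterances and labels are invented for illustration; a real intent model would train on far more examples.

```python
# pip install scikit-learn
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Labeled training utterances take the place of hand-coded rules.
utterances = [
    "pull up contracts from last year",
    "show me agreements expiring next quarter",
    "look up the Cisco agreement",
    "create an NDA for Google",
    "draft a new contract for Acme",
    "start an agreement with this supplier",
]
intents = ["lookup", "lookup", "lookup", "create", "create", "create"]

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(utterances, intents)

# The classifier generalizes to wordings no rule anticipated.
print(model.predict(["find all agreements from 2021"]))  # likely ['lookup']
```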

Venky Naganathan 00:34:49 So that's where deep learning comes in. Deep learning is, in some ways, old wine in a new bottle; it used to be popular in the nineties. It's based on neural networks: there are layers and layers of neural networks, and each layer processes information and sends it to the next layer, and so on, until the real output comes out. That's essentially how deep learning systems work. Deep learning has become popular now because back in the nineties, first of all, we didn't have access to all of this data that we have access to now, and on top of that, we didn't have the capability to process all of this big data; we didn't have the computing power, et cetera. Now we have all of that, so deep learning algorithms have come back. Deep learning works better than machine learning because the models can keep getting smarter as you add more and more data.

Venky Naganathan 00:35:44 And we have access to all of that data now. It also has some properties like transfer learning: it can learn things from one context and take that to another context, right? So you can build models that have been trained to understand the English language, and then you can do some fine-tuning of these models to understand how to read English contracts. It also has the ability to self-select features: as the input passes through the neural net, it can highlight or suppress features that are important or unimportant. All of these properties make deep learning really interesting. Obviously, deep learning is a lot more complex a science than machine learning. So yeah, that's the difference between them.

SE Radio 00:36:27 This episode is sponsored by Spot by NetApp. Spot provides a comprehensive suite of CloudOps tools that make it easy to deliver continuously optimized and reliable infrastructure at the lowest possible cost. Imagine automating your infrastructure to proactively meet the needs of your applications. Imagine leveraging the latest in machine learning and automation to scale your infrastructure, using the most efficient mix of instances and pricing models, eliminating the risks of over-provisioning and expensive lock-in. From cost management to infrastructure automation and CD, to running serverless Spark on Kubernetes, Spot ensures you maximize your cloud investment. The end result is simply more cloud at less cost. Discover how the most innovative companies, from cloud-native growth machines to forward-thinking enterprises, are automating, simplifying, and optimizing their cloud infrastructures with Spot by NetApp. Check them out at spot.io/seradio, where you can find more information, request a demo, or give it a try by starting a free trial.

Kanchan Shringi 00:37:25 I've heard of word2vec, and BERT (B-E-R-T), in this context. Could you elaborate on these?

Venky Naganathan 00:37:34 Sure. So word2vec is basically a technique to transform words into numbers. It's important to do that because the computer needs a mathematical model to work with. Once you have that mathematical model, you can start performing word arithmetic, comparisons of words. One of the most celebrated examples here is that these models can be used to complete analogies: if you say "man is to woman as king is to..." then the model can say "king is to queen." It can associate meanings; or rather, it can understand that Sweden is closer to Norway than, say, to the US. All of this is pretty much learned in an unsupervised manner. It can read pages and pages of text, and once that's transformed into a mathematical model, it can make all of these associations pretty much on its own, without any human intelligence.

Venky Naganathan 00:38:38 And people have built interesting models on top of this, using this principle, by embedding more and more contextual meaning: taking words and mapping them to multiple vector forms. Using those models, you can do lots of interesting things. Examples of models that work this way are BERT, and then GPT-2 and GPT-3: models that can read pages of text and practically answer questions based on that text like a human would, with no additional training. So that makes it really interesting.
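
The word-arithmetic example is easy to reproduce with gensim and a set of pretrained vectors. A small sketch; here GloVe vectors stand in for word2vec (both map words to vectors you can add and subtract), and the first run downloads roughly 66 MB of data:

```python
# pip install gensim
import gensim.downloader as api

vectors = api.load("glove-wiki-gigaword-50")  # downloads on first use

# king - man + woman ~= queen
print(vectors.most_similar(positive=["king", "woman"], negative=["man"], topn=1))

# Sweden sits closer to Norway than to America in the vector space
print(vectors.similarity("sweden", "norway"))   # relatively high
print(vectors.similarity("sweden", "america"))  # noticeably lower
```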

Kanchan Shringi 00:39:18 It won't be in the scope of our conversation today to go deeper into BERT or word2vec, but it would be great if you can point our listeners to any reading material out there.

Venky Naganathan 00:39:29 Yeah. If you're a casual user, you can Google your way to a lot of this information. If you are a little bit more serious, there are YouTube lectures (Stanford lectures on YouTube) that you can listen to. If you want to do something more specialized, there are of course the courses by Andrew Ng, very comprehensive machine learning and deep learning courses, or there are Udemy courses. So you can do all of this. And if you're really, really serious about having a career in this space, you need to actually build some applications to understand these technologies in more detail.

Kanchan Shringi 00:40:07 So we'll check with you, Venky, and put some of the key links that you found useful in our show notes.

Venky Naganathan 00:40:13 Absolutely.

Kanchan Shringi 00:40:14 Let’s now talk a little bit about other B2B problems that could be solved with this infrastructure. You know, have you encountered any of these?

Venky Naganathan 00:40:25 Yeah, there are lots of problems we can solve with good NLP/NLU capabilities. Chatbots are one immediate application; we've talked about them at length. In the world of contracts, there are a lot of interesting problems you can solve in this domain. For example, so many times we receive different contracts by mail: your home contracts, your Verizon contract that's like 15 pages long, and you never read them because it's a lot of legal language. It would be nice if somebody could summarize the four or five bullet points that you care about: things like, what is my annual or monthly payment? What is my early-termination penalty? What if I move to a different city? What if I want to change my telephone number? That sort of thing, right? So being able to read a contract, provide critique on the contract, provide a summary of the contract: those are useful problems to solve.

Venky Naganathan 00:41:27 Being able to have some kind of best-practice template and compare a contract against it semantically would be another interesting problem to solve. Being able to read a contract and understand the different risks in it would be another. And so on; and that's just the world of contracts. You might have used Zoom meetings; a Zoom meeting now has this live transcript, or not just the live transcript, but the transcript of the meeting that you were in. That's another NLP problem. And auto-summarization of your conference call would be another interesting one: if you were on the call for two hours, what's the summary, right? These are all various problems that are really interesting to look at. I think some companies are already beginning to solve them, but it hasn't become mainstream yet.

Kanchan Shringi 00:42:19 Let's switch gears a little bit and talk about the data. These bots and conversational interfaces that we have been talking about are very focused on the domain. So what about the data for training?

Venky Naganathan 00:42:33 Yeah, so you definitely need domain-specific data for them to be meaningfully trained, and you need to work with domain experts. In the case of contracts, you probably need legal operations people; you probably need to hire lawyers to train and build these models. So it's going to be a combination of using that expertise and the rich data out there on the internet. If you just need public-domain contracts, there are probably thousands of them, millions of them, so you can download from those data sets and use them as well. But you're also going to learn a lot from your customer data, and this is a sensitive subject, because customers need to opt in and allow you to use that data. Our hope is that this industry is going to evolve somewhat like the music industry: when Napster came along and said, "I'm going to publish all the music online," there was huge resistance in the industry to going in that direction, but 20 years later pretty much all music is consumed online. So we think customers will eventually be more open to the thought of sharing that information for their own good and for the goodness of the entire community. So yeah, using a combination of homegrown and public-domain data, as well as your customer data, you can build your models and make them smarter.

Kanchan Shringi 00:44:03 How does testing fit in?

Venky Naganathan 00:44:06 Yeah, testing is a little tricky here, because this is not conventional software, where you have written requirements, you code to those requirements, and you can just test to the requirements. If you built a wizard-like screen in conventional software, unless there is a Next button, you cannot click on it and go to the next screen. But this is a chatbot, so anybody can say anything at any point in time. So the testing is somewhat unconventional. You still need your regular automated unit testing, your regression testing, your regular functional testing, your performance testing, and all of that. But in addition, to ensure that what you built is functionally complete and makes sense to an end-user, you need to put it through some kind of crowd-sourced alpha testing where you engage users across different teams.

Venky Naganathan 00:44:56 If you have a sales team or a marketing team willing to participate in this testing, invite them, engage them, and get feedback from them. If you're targeting a consumer audience, you can test it with a focus group to understand whether your application is working or not. And once you have a deployment of your app, you have a lot of data from your customer usage, and from that you can get meaningful insights and an understanding of how it is going to be used. So typically, when you introduce some new feature, you want to run it through some kind of trial process to ensure that it is going to work.
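
One concrete way to fold that feedback into the regular automated testing mentioned above is a golden-utterance regression suite: utterances collected from alpha testers, pinned to their expected intents, so a retrained model can't silently regress on them. A sketch with pytest; `classify` is a hypothetical stand-in for whatever call your NLU engine exposes (for example, an HTTP request to LUIS, Watson, or Dialogflow):

```python
# pip install pytest
import pytest

def classify(utterance: str) -> str:
    # Placeholder: swap in your deployed NLU engine's prediction call.
    return "lookup" if ("pull up" in utterance or "show" in utterance) else "other"

# Golden utterances gathered from alpha testers and production logs.
GOLDEN = [
    ("pull up contracts from last year", "lookup"),
    ("show me expiring agreements", "lookup"),
    ("thanks, that's all", "other"),
]

@pytest.mark.parametrize("utterance,expected", GOLDEN)
def test_intent_regression(utterance, expected):
    assert classify(utterance) == expected
```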

Kanchan Shringi 00:45:35 So you also said that once the product has been used by customers, that’s additional feedback. Is that data also useful for the training?

Venky Naganathan 00:45:45 Yeah, absolutely. Here you probably need to disclose to the customer that you're planning on doing that; it's pretty much part of your software agreements, the licensing agreement. But in some cases, customers may be okay with you using certain data and not okay with other data. In that case it becomes a little more complex: you need to start anonymizing the data, masking important pieces of information that are not significant for your model, because your model probably doesn't need to know that the value of a contract is $20,000; it probably cares about other aspects of the contract, right? So in that case you can try to solve the problem that way. But in some cases customers may be extremely sensitive and say, "I don't want you to touch my data at all; I want you to use it just for processing and nothing else." In that case, you pretty much have to honor what the customer is saying and not use that data for any other purpose.
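
The masking described here (stripping out the values the model doesn't need while keeping the phrasing it learns from) can start as simply as a few substitutions. A toy sketch; real anonymization pipelines go much further, covering names, addresses, and entity linking:

```python
import re

def mask_sensitive(text: str) -> str:
    """Replace values the model doesn't need with neutral placeholders."""
    text = re.sub(r"\$[\d,]+(?:\.\d+)?", "<AMOUNT>", text)        # $20,000
    text = re.sub(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b", "<EMAIL>", text)
    text = re.sub(r"\b\d{6,}\b", "<ACCOUNT>", text)               # long digit runs
    return text

print(mask_sensitive("Renew the $20,000 contract; notify [email protected], account 12345678."))
# Renew the <AMOUNT> contract; notify <EMAIL>, account <ACCOUNT>.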

Kanchan Shringi 00:46:50 In terms of training and testing and the whole life cycle, what about localization? If you're done with English and you want to now move on to Spanish, what does that mean?

Venky Naganathan 00:47:04 So this is a much more complex topic, because you need to start with your language models, right? You need a language-agnostic model, first of all. And if you've written all of these business-speak layers, which you've probably written for English, you need to rewrite them for the different languages. So that's a big, complex piece of work. And then as you run your bot and the bot produces output, you need to tailor the output to the appropriate language. I think the localization, as in presenting the output in a particular language, is probably the easier part of the problem. The harder parts are understanding what the user is saying, doing NLP and NLU for your languages, and then doing all of the added layers, the domain-specific NL layers, in the specific language. That's where the complexity is.

Kanchan Shringi 00:48:00 We did talk about the Q&A part and the dialogue orchestrator. What about actions? What are the additional integrations needed to perform actions? Maybe there's an integration with a notification service; does that make sense? Is there something else that you need to build for that?

Venky Naganathan 00:48:19 I mean, your actions are going to depend on what type of workflow it is. Your action could very well be, if they are asking for information on something, that you dip into your CRM system and lift that information. Or the action could be, if you're turning on a heating system, that you externally interface with some Nest or some thermostat to turn that on. Or it could even be the user asking, "hey, send me reminder notifications about something that's happening in the system." Speaking of notifications, which seems like the essence of your question: yes, it may seem like mobile notifications and chatbots are worlds apart, but they have an opportunity to coexist and deliver a pretty good set of use cases. In Max, we have integrated these things very closely.

Venky Naganathan 00:49:15 We send a combination of reminder notifications and just-in-time notifications, and the user can take follow-up actions on these notifications. For example, if you send an agreement for approval, you want to alert the user who needs to approve the agreement, and right there, as part of the notification, you provide the user the option to follow up. When the user takes follow-up actions, it notifies the sender that actions have been taken: the ball is in your court to do the next steps. So this helps run small workflows at the speed of the user, not bound or shackled by technology. Chatbots play an important role in making all of this happen.

Kanchan Shringi 00:50:00 So we talked about languages, and now I'd like to ask you about voice. Did you integrate with voice?

Venky Naganathan 00:50:08 So voice is a very complex topic. Siri and Alexa work as two-way voice engines, right? They work like that primarily because they solve very simple problems. If you want to really build a good, solid B2B application with voice, what does it take? First, you need a voice recognition engine. Typically you would simply buy these engines, because there are a lot of good ones out there, but there are some unique complexities that you will have to handle. In the world of B2B, the customer says, "pull up the Cisco agreements from last year," right? Which Cisco are they talking about? Are they talking about C-I-S-C-O, the networking company, or S-Y-S-C-O, the food distribution company? In text it's pretty clear which one it is; in voice, it's not clear, right? There is ambiguity in what is being spoken.

Venky Naganathan 00:51:04 So you probably need some dynamic dictionary integration with your input processing. And then when you come back with information, say you are pulling up a quote, and there are like 15 line items in the quote: how do you render this in voice? If it's visual, you can simply show it; the user can scan it, and it makes a lot of sense to the user. If it has to be rendered in voice, how do you render it in a manner that helps the user consume all of these details easily? In that case, you probably need to summarize the two or three pieces of information that the user may care about, and you may need a little state machine to render the output itself. All of this makes voice somewhat less useful here. What may be more useful is one-way voice: you can use Siri or Alexa-like dictation interfaces, or your built-in mobile phone dictation services, to inject your input via voice, and then get back the output in text. That way you don't have to type on your mobile, which may be awkward with limited real estate and limited keyboard capabilities, but you are still able to use the bot effectively and retrieve the information you are looking for. So that's how we thought about voice: it's more one-way voice for us.

Kanchan Shringi 00:52:33 For the last part of the platform section, we'd like to get a little bit of insight into deployment. What were the additional components you had to think of deploying, and did they change your basic framework or your pipelines in any way?

Venky Naganathan 00:52:49 So when it comes to deployment, you still have the traditional software stack. You are running your microservices in containers, and you probably have a need for your cloud infrastructure: your Redis, Mongo, or whatever database you're using, et cetera, right? So you still need to solve all of the traditional deployment problems. When it comes to deploying the NLU-related stuff, there are models: maybe a base model, and then there may be customer-specific models. And even the base model may not be one thing; it may be a combination of things. You need tooling to deploy all of that. And sometimes you may not be sure that a particular model will function well, so you probably need infrastructure to do some kind of A/B testing: you deploy a newer version of a particular model, and you want to see how it behaves and study that before you roll it out to the general user base. So you need infrastructure to do that type of rollout. In addition, there may be configurations and other things that you may want to deploy as well. So you need tooling and infrastructure for all of this.
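
The model A/B rollout can be as simple as deterministic traffic splitting: hash each user into a bucket so that a small, stable slice sees the new model while its metrics are compared against the old one. A sketch; the model names are hypothetical placeholders:

```python
import hashlib

def model_version_for(user_id: str, canary_percent: int = 10) -> str:
    """Deterministically route a slice of users to the candidate model.
    The same user always lands in the same bucket, so their experience
    stays consistent while the two versions' metrics are compared."""
    bucket = int(hashlib.sha256(user_id.encode()).hexdigest(), 16) % 100
    return "intent-model-v2" if bucket < canary_percent else "intent-model-v1"

for uid in ("alice", "bob", "carol"):
    print(uid, "->", model_version_for(uid))
```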

Kanchan Shringi 00:54:02 Let's switch gears and now talk about building the team. What is the makeup of a team that builds this up? What skill sets are needed? Perhaps you can talk about how you built the team; was the build-vs-buy decision made before or after? Some information about the team would be great.

Venky Naganathan 00:54:23 So build versus buy is never straightforwardly one or the other. In practically all projects that I've been part of, it's almost always a hybrid. You buy things where you can realize quick time-to-value and where there's no significant ROI in building, and you end up building the things that nobody has solved for you, right? That's how I think about buy versus build; it's usually a hybrid. In terms of team skills, it's going to be a combination of data scientists, traditional software engineers, DevOps, and user experience experts, then traditional engineering managers, product managers, et cetera. And if you need to solve domain-specific problems, you may have to hire domain-specific experts, or you can contract for those services. In the case of contracts, you may want to consider adding legal personnel to your team, right?

Venky Naganathan 00:55:26 UX is a big part of this; people often underestimate how important UX is. UX is a very critical part of building a good virtual assistant, because you need to blend your traditional user interfaces with the chatbot interface, and unlike traditional interfaces, where users are already subconsciously pre-trained to go look for things in a certain manner, there's no such thing for chatbots. So you need to figure out a way to educate the user without being too invasive. You cannot show a very complex UI with hundreds of knobs for the user to learn about; you need to do that in a very subtle manner. So UX plays a big role in this. And then finally, like we talked about earlier, when it comes to testing, you need to crowd-source the effort and involve multiple people from different disciplines, so you need to consider them as well as part of the larger team that you're building.

Kanchan Shringi 00:56:25 Yeah, thank you for answering the question about the UX design; that was going to be a specific area I drilled into further. A key example I have from my recent experience is that if you don't do that, you'll probably end up with raw JSON somewhere in your chat, and that's a very basic thing you don't want to do. You certainly want to format your responses, if nothing else.

Venky Naganathan 00:56:52 Absolutely. In addition, let's say you are serving users who are choosing food on an airline, right? There are choices of food that you have to model appropriately and deliver, as in a traditional user interface, as a card, right? So the user can view the image of the food, and then the price, the nutritional information, et cetera. How do you deliver all of this in chatbots? It has to be a combination of the traditional user interface with the chatbot-like stuff. I mean, this problem is fairly well solved, but yeah, UX helps you build these things very creatively and deliver a superior end-user experience.

Kanchan Shringi 00:57:38 Say one of our listeners would like to start a new company where a conversational interface, or maybe just NLP/NLU, is at the heart of the experience. How would you advise them to get started?

Venky Naganathan 00:57:52 I would say start with your business problem and think about how best to deliver a solution to address that business problem. Then think about all of the interfaces that you need to deliver that value, and then think about how chatbots will enhance that user experience. Don't start with the notion that, hey, you need to build a chatbot company; that's not how you should think about it. Always start with the business problem. Once you do that, you will have a clear understanding of how a chatbot fits into the overall puzzle and enhances the value of your product. And once you've understood what use cases to deliver and how to deliver them, then you can think about all of the things we talked about early on in this segment, right? Where to place the UI, what NLU engines to choose, whether you want to build or buy your dialogue orchestrator, how best to build the bot, et cetera. All of those considerations come into the picture.

Kanchan Shringi 00:58:52 Are there any open-source technologies or commercial technologies that you think people should evaluate?

Venky Naganathan 00:58:59 Yeah, there are tons. In fact, nobody writes everything from scratch in this domain, right? You typically leverage a lot of open-source work that has already been done for you, and then you build on that. If you are solving the traditional chatbot problem, there is Microsoft LUIS, there is IBM Watson, there is Google Dialogflow that you can start with; you probably want to augment that with your own proprietary algorithms and so on. And if you're solving different kinds of NLP problems, general-purpose NLP problems or contract-language understanding, that type of thing, you can start with the language models that have been built for you. There you have ELMo and GPT-2 and GPT-3 and so on, and then you can see how to adapt these to solve the specific problems you are solving.

Kanchan Shringi 00:59:47 Thank you. So, starting to wind down now: what use cases or applications are you thinking about for the platform you built, beyond conversational UI?

Venky Naganathan 00:59:58 Yeah. So I talked about conversational UI and alerting. In addition, you can think of it as a search engine. How do you search on Google? You pretty much type things in natural language, right, and then it brings back the information that you want. So it's a useful interface for a search engine. And it can be a good document analysis tool. There are different kinds of documents that exist in our world; I talked a lot about contracts, and there are lots of different problems you can solve with contracts, like auto-summarization, semantic similarity, and many of the other problems that we discussed early on. Those could be solved with Max-like technologies.

Kanchan Shringi 01:00:45 Is there anything we missed today that you would like to cover?

Venky Naganathan 01:00:49 Nothing in particular; I think we've pretty much covered everything exhaustively. I mean, I'm a humble little student in this space and I have a lot to learn. If you've got good, productive ideas and you want to reach out to me, I'll be happy to share whatever little I know, and I'm ready to learn what you can offer me. Let's do that offline if you want to chat further.

Kanchan Shringi 01:01:13 And how can people contact you?

Venky Naganathan 01:01:15 So I'll leave my email and other information with you, and you can share it with our listeners.

Kanchan Shringi 01:01:22 We'll put some contact information in the show notes. Thanks again, Venky; this was a great session. I hope our listeners learned from it just like I did. This is Kanchan Shringi for Software Engineering Radio. Thanks for listening.

Venky Naganathan 01:01:37 Thank you very much. I really loved the chat and hope to connect with our listeners and learn more from them. Thank you.

SE Radio 01:01:45 Thanks for listening to SE Radio, an educational program brought to you by IEEE Software magazine. For more about the podcast, including other episodes, visit our [email protected]. To provide feedback, you can comment on each episode on the website, or reach us on LinkedIn, Facebook, Twitter, or through our Slack [email protected]. You can also email [email protected]. This and all other episodes of SE Radio are licensed under the Creative Commons 2.5 license. Thanks for listening.

[End of Audio]


SE Radio theme: “Broken Reality” by Kevin MacLeod (incompetech.com — Licensed under Creative Commons: By Attribution 3.0)
