
Seeking a position where I can contribute my skills to the organization's success and keep pace with new technology while being resourceful, innovative, and flexible. A technology enthusiast and enterprising individual with a strong educational background and 3+ years of experience as an AI Engineer working on traceable projects.
AI Engineer, F.R.O.M
AI ML Engineer, Smartsense Consulting Solutions
AI/ML Engineer Intern, smartSense Consulting Solutions
Software Developer Intern, Leeway Soft-Tech Pvt. Ltd.
Google Colab
GitLab
GitHub
Git
Postman
Microsoft Power BI
Jupyter Notebook
Visual Studio Code
Anaconda
REST API
Skype
Microsoft Teams
Slack
Zoom
AWS (Amazon Web Services)
MS Excel
Spreadsheets
Airflow
Docker
Hello, everyone. I have done my MBA at Ahmedabad University, and I have been working in this field for more than two and a half years. I have worked on different technologies for NLP and machine learning: Hugging Face Transformers, generative AI, LLMs, fine-tuning pre-trained models, and prompt engineering with LLMs, OpenAI, and so on. I have been a good performer in my company and have always been appreciated by my coworkers, peers, and seniors. I am very hardworking and sincere in my work. I like to work with different technologies, try new things, and am always enthusiastic about learning upcoming technologies.
To prevent overfitting in a chatbot built on a neural network, we can increase the training data and add dropout layers so the chatbot gives proper, relevant replies. We can also use L1 and L2 regularization techniques to reduce the overfitting.
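A minimal sketch of dropout plus L1/L2 regularization in Keras, assuming a simple intent-classification chatbot; the input size, class count, layer widths, and rates are all illustrative:

import tensorflow as tf
from tensorflow.keras import layers, regularizers

# Hypothetical intent classifier: 500-dim bag-of-words in, 20 intents out.
model = tf.keras.Sequential([
    layers.Input(shape=(500,)),
    layers.Dense(128, activation="relu",
                 kernel_regularizer=regularizers.l2(1e-4)),  # L2 penalty shrinks weights
    layers.Dropout(0.5),                                     # randomly drops units during training
    layers.Dense(64, activation="relu",
                 kernel_regularizer=regularizers.l1_l2(l1=1e-5, l2=1e-4)),
    layers.Dropout(0.3),
    layers.Dense(20, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])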
For real-time chatbot message handling we can use different design patterns. We can store the chats in a NoSQL database, which handles unstructured data. Other than that, we can train the model for pattern recognition or intent identification, to identify what exactly the client or user wants. If the chatbot is used to extract information from an API, we can make real-time API calls to get the current, latest data. For example, if we are developing a chatbot for weather forecasting and the user says, "I want to know today's weather forecast for a particular region," then we pass that query to the API through the chatbot interface, and the current real-time data is given to the user. Those are design patterns we can use to build the chatbot. We can also use voice-based patterns, text-to-speech and speech-to-text, so the chatbot has multimodal functionality; if a user is not able to write, or cannot type quickly, they can still use it.
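As a sketch of the intent-plus-real-time-API pattern (the endpoint, key, and response fields below are hypothetical stand-ins for a real weather API):

import requests

# Hypothetical endpoint and key; any weather API with a similar shape would do.
WEATHER_URL = "https://api.example.com/v1/forecast"
API_KEY = "YOUR_API_KEY"

def detect_intent(message: str) -> str:
    """Toy intent identification; a real bot would use a trained classifier."""
    if "weather" in message.lower() or "forecast" in message.lower():
        return "weather_forecast"
    return "fallback"

def handle_message(message: str, region: str) -> str:
    if detect_intent(message) == "weather_forecast":
        # Real-time API call so the reply reflects current data, not stale training data.
        resp = requests.get(WEATHER_URL, params={"q": region, "key": API_KEY}, timeout=5)
        resp.raise_for_status()
        data = resp.json()
        return f"Today's forecast for {region}: {data.get('summary', 'n/a')}"
    return "Sorry, I didn't understand that."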
For performance monitoring, there are different KPIs we can use to measure the chatbot's performance: how many users are visiting the bot and how much it is being used; the fallback rate; the frequency of questions asked; and collected user feedback, which also helps with performance monitoring. We can look at what type of customers are visiting the chatbot and how they feel (sentiment analysis of the impression we are giving the user), the bounce rate, what type of questions they are asking (frequently asked questions), user ratings, the conversion rate, the conversation duration (how long they stay with the chatbot), the number of sessions per channel, and so on.
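A small sketch of computing a few of these KPIs from session logs; the Session fields here are hypothetical stand-ins for whatever the chat platform actually records:

from dataclasses import dataclass

@dataclass
class Session:          # hypothetical session record from chat logs
    messages: int
    fallbacks: int      # times the bot replied "I don't understand"
    duration_sec: float
    converted: bool     # e.g. user completed a purchase or booking

def kpis(sessions: list[Session]) -> dict:
    total = len(sessions)
    msgs = sum(s.messages for s in sessions)
    return {
        "fallback_rate": sum(s.fallbacks for s in sessions) / max(msgs, 1),
        "bounce_rate": sum(1 for s in sessions if s.messages <= 1) / max(total, 1),
        "conversion_rate": sum(s.converted for s in sessions) / max(total, 1),
        "avg_duration_sec": sum(s.duration_sec for s in sessions) / max(total, 1),
    }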
For this type of thing, we have to train the chatbot so it can identify whether words are slang or not. For example, the latest advanced GPT models can identify which words are slang and which are not. Other than that, we can train a machine learning algorithm or a deep learning neural network so it can identify what type of words it is receiving and, based on that, filter those words out. We can also keep a database where the slang or non-standard language is stored, and based on Elasticsearch queries or text analysis we can detect that something is disallowed or bad language, filter out those questions, or report the slang as well.
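A toy version of the lookup-plus-filter idea; the blocklist and replacements are illustrative, and in production they might live in a database or an Elasticsearch index queried per message:

import re

# Hypothetical slang/non-standard terms the bot should catch.
SLANG_TERMS = {"gonna", "wanna", "lol", "brb"}

def normalize(message: str) -> str:
    """Replace known slang terms before the bot processes the text."""
    replacements = {"gonna": "going to", "wanna": "want to"}
    tokens = re.findall(r"\w+|\W+", message)  # keep punctuation/spacing intact
    return "".join(replacements.get(t.lower(), t) for t in tokens)

def contains_slang(message: str) -> bool:
    """Flag a message for filtering or reporting."""
    words = set(re.findall(r"\w+", message.lower()))
    return bool(words & SLANG_TERMS)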
SQL aggregation functions can be used to build the chatbot's response from multiple tables based on the user query. For example, if the user is seeking product details that are distributed across multiple tables, then by analyzing the user query with entity extraction we can identify which entities are present in it. Say the user asks, "I want the product details about soaps." The details and descriptions for soaps may be distributed, for example ratings in one table and pricing in another. We can then join the multiple tables and combine the values with SQL aggregation functions to get the response. After that, we can use NLP techniques to give the answer in natural language to the user, so that it does not feel machine-generated and reads better as natural language.
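A sketch with sqlite3, assuming an illustrative schema where products, ratings, and prices sit in separate tables:

import sqlite3

# Illustrative schema: products, ratings, and prices in separate tables.
QUERY = """
SELECT p.name,
       AVG(r.stars)  AS avg_rating,   -- aggregation over the ratings table
       MIN(pr.price) AS lowest_price  -- aggregation over the prices table
FROM products p
JOIN ratings  r  ON r.product_id  = p.id
JOIN prices   pr ON pr.product_id = p.id
WHERE p.category = ?                  -- entity extracted from the user query, e.g. 'soap'
GROUP BY p.id, p.name;
"""

def product_summary(db_path: str, category: str):
    with sqlite3.connect(db_path) as conn:
        return conn.execute(QUERY, (category,)).fetchall()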
Here, the code blocks while waiting for the user input, and that is the mistake: until the user input arrives, nothing else can run. There is no async/await written into this, so it is not doing any asynchronous handling.
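One way to fix it, sketched with Python's asyncio: run the blocking input() call on a worker thread via asyncio.to_thread so the event loop stays free for other work:

import asyncio

async def heartbeat():
    # Other work the bot can keep doing while waiting for input.
    while True:
        await asyncio.sleep(1)
        print("still responsive...")

async def main():
    task = asyncio.create_task(heartbeat())
    # asyncio.to_thread runs the blocking input() off the event loop.
    user_text = await asyncio.to_thread(input, "You: ")
    print(f"Bot received: {user_text}")
    task.cancel()

asyncio.run(main())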
Here, if the database has a large amount of data, this loop will require more computation, and it takes too many iterations to identify and fetch the data. If we run this for-loop over batches instead, the performance can be improved.
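A minimal sketch of the batched version, assuming a sqlite3 connection and an illustrative chats table:

import sqlite3

def fetch_in_batches(conn: sqlite3.Connection, batch_size: int = 1000):
    """Iterate over a large table in fixed-size batches instead of row by row."""
    cursor = conn.execute("SELECT id, message FROM chats")
    while True:
        batch = cursor.fetchmany(batch_size)  # one fetch per batch, not per row
        if not batch:
            break
        yield batch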
When scaling an AI chatbot to handle millions of users, we have to make sure it can handle all the users at a time. It should have async/await functionality so that millions of users can be handled asynchronously, no user has to wait a long time, and the system is scalable. Other than that, we can use Docker and keep the containers very lightweight so they can be handled effectively, and we should choose a suitable deployment platform: we have to ensure that the server where the chatbot is deployed is running fine and able to handle the many users. We have to use effective chatbot algorithms that can handle multiple requests at a time, with fast processing and effective responses. The data volumes where the data is stored should also be fast and quick enough to serve responses and fetch the data efficiently. We can also use real-time APIs to fetch records, and we have to ensure those APIs are not taking too much time and give quick replies as well.
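A minimal async-endpoint sketch with FastAPI (a hypothetical /chat route; a real deployment would also run several replicas of the container behind a load balancer):

import asyncio
from fastapi import FastAPI

app = FastAPI()

async def generate_reply(message: str) -> str:
    # Placeholder for the model or API call; awaiting it frees the event loop
    # to serve other users instead of blocking a worker per request.
    await asyncio.sleep(0.05)
    return f"echo: {message}"

@app.post("/chat")
async def chat(payload: dict):
    reply = await generate_reply(payload.get("message", ""))
    return {"reply": reply}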
If we have a graph database, we can find the responses effectively and quickly. For example, with a node-based structure, if we want to find the details of candidates who are applying, there can be one node with skills, another node with years of experience, and another node with education; and if different candidates have the same skills or the same years of experience, we can link them to the shared node and access the information quickly instead of increasing the search time. On top of that, it gives explicit and complete control over the answers provided by the chatbot and allows us to avoid hallucination. And if we are using a knowledge graph, it captures the concepts, structures, and entities, so from there we can easily identify the answers and respond to the user.
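A toy sketch of the shared-node idea using networkx (the node names are illustrative; a production system would more likely use a graph database such as Neo4j):

import networkx as nx

# Toy candidate knowledge graph: candidates link to shared skill/experience nodes.
G = nx.Graph()
G.add_edge("candidate:asha", "skill:python")
G.add_edge("candidate:asha", "experience:3y")
G.add_edge("candidate:ravi", "skill:python")
G.add_edge("candidate:ravi", "education:MBA")

def candidates_with(attribute: str) -> list[str]:
    """Answer 'who has X?' by walking the shared node's neighbors."""
    if attribute not in G:
        return []
    return [n for n in G.neighbors(attribute) if n.startswith("candidate:")]

print(candidates_with("skill:python"))  # ['candidate:asha', 'candidate:ravi']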
Data visualization tools like Tableau or Power BI can be integrated using APIs. Other than that, we can also generate graphs or pie charts for dynamic response generation, so that the tabular data or numerical data analysis the chatbot produces for a response is easily interpretable and the user can understand the results. We can also use heat maps so that the user gets to know which items are important and which are not much required.
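As one simple version of the dynamic-chart idea, sketched with matplotlib rather than a Tableau or Power BI integration (the data and title are illustrative):

import matplotlib.pyplot as plt

def chart_reply(labels, values, path="reply_chart.png"):
    """Render a numeric chatbot answer as a bar chart the bot can attach."""
    fig, ax = plt.subplots()
    ax.bar(labels, values)
    ax.set_title("Sales by region")  # illustrative title
    fig.savefig(path)
    plt.close(fig)
    return path

chart_reply(["North", "South", "East"], [120, 80, 95])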