As a dedicated and innovative AI Bot Developer with 3.10 years of hands-on experience in Python, I specialize in designing and implementing intelligent conversational agents and automated systems. My expertise lies in leveraging advanced machine learning techniques and natural language processing to create dynamic and responsive bots that enhance user experiences and streamline business operations. With a strong foundation in Python and a passion for AI-driven solutions, I am committed to driving technological advancements and delivering high-quality, scalable AI applications. I am eager to bring my technical skills and creative problem-solving abilities to a forward-thinking team that values innovation and excellence.
Full Stack Software Developer, Markytics Consulting Pvt Ltd
Software Developer, Kintan Systech
Software Developer, LA Esfera Multiservice LLP
MySQL
React
PyCharm
Jupyter notebook
Git
VSCode
XAMPP
REST API
Python
PostgreSQL
Jira
Postman
ClickUp
AnyDesk
GitHub
Microsoft SQL Server
UiPath
Skills Required:
1) Python: Expert
2) LLM: Expert
3) Prompt Engineering: Expert
4) PostgreSQL: Good
Details -
Created AI Bot which can help -
Skills Required:
1) Python: Expert
2) Django: Expert
3) XGBoost: Good
4) Data Science: Expert
Details -
Skills Required:
1) Python: Expert
2) Django: Expert
3) PostgreSQL: Good
4) BigQuery: Good
5) Data Science: Good
6) Prompt Engineering: Expert
7) LLM Models: Expert
Details -
Skills Required:
1) Python: Expert
2) LLM: Good
3) Prompt Engineering: Expert
Details -
Skills Required:
1) Python: Expert
2) Pandas: Expert
3) Flask: Good
4) MySQL: Good
Details -
Skills Required:
1) Automation Edge: Good
2) PostgreSQL: Good
Details -
Could you help me to understand more about your background by giving a brief introduction on yourself? My name is Himali Doria, and I have worked on multiple projects across data analysis, data science, and automation. Currently I'm working on a data analysis project where I'm helping a client decide whom to give a loan to and whom not to, based on risk analysis: what the risk is and what types of loans they can disburse. Apart from that, I also do web scraping from various websites: I scrape the data, clean it, put it into the required format, and pass it on to the client or end customer. I have also developed chatbots that are entirely LLM-based, and I have reached accuracy of up to 80 to 90%.
I would prefer to monitor the time the bot takes to give an answer, because when I created a chatbot for my client, it was taking a long time, approximately 20 to 28 seconds, which is not good at all. So I would monitor how accurate the answers the bot is generating are, and also how quickly it responds to whoever is chatting with it. To optimize performance, I would work first on its speed and then on its accuracy, because those are the two parameters that matter the most.
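The latency monitoring described above can be sketched as a small timing decorator. This is a minimal illustration, not the candidate's actual code; the `answer` function is a hypothetical stand-in for the real model call.

```python
import time
from functools import wraps

def track_latency(history):
    """Decorator that records how long each bot reply takes, in seconds."""
    def decorator(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            result = fn(*args, **kwargs)
            history.append(time.perf_counter() - start)
            return result
        return wrapper
    return decorator

latencies = []

@track_latency(latencies)
def answer(question):
    # Stand-in for the real LLM/database call being measured.
    return f"echo: {question}"

reply = answer("hello")
```

Collected `latencies` can then be aggregated (mean, p95) to decide whether a 20-plus-second response time is a model problem or an infrastructure problem.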
Optimize an SQL query that aggregates data across multiple tables for a chatbot response. To optimize SQL queries, I prefer not to use more than one database, and I try to avoid it. In many scenarios it is not possible, but I still try, because creating multiple joins across tables that sit in different databases takes a lot of time. If there is a situation where I have to pull data from different databases or sources, I will create a view, write one query against it, and fetch answers from that view rather than the underlying tables, because that optimizes the response. Suppose a query is taking around two or three minutes: I would minimize the multiple joins and multiple databases, reducing or avoiding them as much as possible, and I would try to write the simplest queries, like selecting a few columns from a single table with the needed conditions. I would keep all the queries simple and optimized rather than applying multiple joins.
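The view-based approach described above can be sketched with Python's built-in `sqlite3` module. The schema and data here are hypothetical; the point is that the join and aggregation are defined once in the view, so the chatbot's runtime query stays a simple single-table select.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE messages (user_id INTEGER, body TEXT);
    INSERT INTO users VALUES (1, 'alice');
    INSERT INTO messages VALUES (1, 'hi'), (1, 'bye');

    -- Define the join/aggregation once, so the hot-path query is simple.
    CREATE VIEW user_message_counts AS
        SELECT u.name, COUNT(m.body) AS n
        FROM users u JOIN messages m ON m.user_id = u.id
        GROUP BY u.name;
""")

# The chatbot only ever runs this trivial query at response time.
row = conn.execute("SELECT name, n FROM user_message_counts").fetchone()
```

In PostgreSQL the same idea can go further with a materialized view, which caches the aggregation result instead of recomputing it per query.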
How would you refactor a chatbot codebase to meet the SOLID principles? I would first go through the entire codebase, and then try to reduce the code as much as possible. I would find the most generic or common logic that occurs repeatedly; factoring that out improves my time and space complexity and adheres to the SOLID principles. I would reuse code as much as possible in the most optimized way, and reduce heavy looping and nested conditions, say anything beyond two levels of nesting. I would also prefer to use the more optimized functions currently available and keep them up to date.
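A concrete SOLID-oriented refactoring, which the answer above gestures at, might look like the following sketch. The responder classes are hypothetical examples: the chatbot depends on an abstraction (dependency inversion), each class has one job (single responsibility), and new responders can be added without editing existing code (open/closed).

```python
from abc import ABC, abstractmethod

class Responder(ABC):
    """Abstraction the chatbot depends on, instead of concrete classes."""
    @abstractmethod
    def respond(self, text: str) -> str: ...

class GreetingResponder(Responder):
    def respond(self, text: str) -> str:
        return "Hello!"

class EchoResponder(Responder):
    def respond(self, text: str) -> str:
        return text

class ChatBot:
    """Single responsibility: route a message to the right responder."""
    def __init__(self, responders: dict):
        self.responders = responders

    def handle(self, kind: str, text: str) -> str:
        return self.responders[kind].respond(text)

bot = ChatBot({"greet": GreetingResponder(), "echo": EchoResponder()})
result = bot.handle("echo", "hi")
```

Adding, say, an `LLMResponder` later means writing one new class and one new dict entry, with no changes to `ChatBot` itself.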
Name a design pattern that would be suitable for real-time chatbot message handling, and explain why in depth. For a real-time chatbot, for example something like ChatGPT, I would prefer a pattern where, as soon as any message arrives on a thread, I generate an answer based on that thread. For common questions, the most frequent or generic ones, I would not fetch the answer from the database each time; if it is kept handy, the bot gives a response very quickly. If I have a good server, I would store those few generic questions in a JSON file or an Excel sheet. If there are lots of questions, I would prefer the fastest database possible. And as soon as a question arrives, I would remove the punctuation and correct the grammar so that my bot can understand it easily, and provide the most suitable, accurate prompt for my bot to answer it. For my current chatbot I used prompt engineering, and the more accurate my prompt was, the better and quicker the answers I got, so I would keep working on the prompting for my bot.
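The answer above does not name a specific pattern, but the "as soon as a message arrives, react to it" behavior it describes is commonly implemented with the Observer (publish/subscribe) pattern. A minimal sketch, with hypothetical handler names:

```python
class MessageBus:
    """Observer pattern: handlers subscribe once; each incoming
    message is fanned out to every subscriber."""
    def __init__(self):
        self._handlers = []

    def subscribe(self, handler):
        self._handlers.append(handler)

    def publish(self, message):
        for handler in self._handlers:
            handler(message)

received = []
bus = MessageBus()
bus.subscribe(received.append)           # e.g. the answering pipeline
bus.subscribe(lambda m: None)            # e.g. a logging/metrics hook
bus.publish("how do I reset my password?")
```

The benefit for a real-time chatbot is decoupling: preprocessing, answering, caching, and logging can each subscribe independently without the message source knowing about any of them.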
I would like to use a classification method, because slang and non-standard language require a lot of classification: is it slang or not, and what exactly does it mean? For those scenarios I would prefer a classification approach with proper labeling, marking that a given token is slang and means such-and-such. Apart from this, I would also use prompting here, because understanding these lines requires giving some description to the bot, and I can take those descriptions from my database. As for the NLP model, I would use classification most often, and I would use an RNN, because I think an RNN would be the most suitable NLP model for understanding this text data.
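The labeling idea in the answer above, tagging each token as slang with a known meaning, can be sketched as a lexicon-based normalization step that runs before any classifier or LLM sees the text. The lexicon entries here are illustrative, not a real dataset.

```python
# Hypothetical labeled slang lexicon: token -> standard form.
SLANG_LEXICON = {"gonna": "going to", "u": "you", "brb": "be right back"}

def normalize(text):
    """Replace known slang tokens and report which ones were found."""
    found, out = [], []
    for token in text.lower().split():
        if token in SLANG_LEXICON:
            found.append(token)
            out.append(SLANG_LEXICON[token])
        else:
            out.append(token)
    return " ".join(out), found

clean, slang = normalize("u gonna come")
```

Tokens not in the lexicon pass through unchanged, so unknown slang can be routed to the heavier model (RNN or LLM prompt) the answer mentions.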
Review this JavaScript code used in a chatbot, and identify and explain the mistakes in how asynchronous behavior is handled. I'm not so aware of how the asynchronous handling works here, because this is completely in JavaScript, but if I'm not wrong there should be an async function. There is a function getUserInput, then a setTimeout, that is fine, and it returns await userInput.value. I'm not so sure.
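The original JavaScript snippet is not shown, but the class of bug being probed (returning from an asynchronous function without awaiting it) has a direct Python analogue. This sketch, with hypothetical function names, shows the correct pattern:

```python
import asyncio

async def get_user_input():
    # Simulates waiting for input. Calling this WITHOUT 'await'
    # would yield a coroutine object, not the string -- the
    # classic async bug in both Python and JavaScript.
    await asyncio.sleep(0)
    return "hello"

async def main():
    text = await get_user_input()  # the 'await' is the crucial part
    return text

result = asyncio.run(main())
```

In the JavaScript version the equivalent fix is awaiting the promise (or using its `.then` callback) instead of reading a value before `setTimeout` has fired.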
A Java function for NLP has a logical bug; identify the mistake in the stemming process. As per me, there is an issue with `stems.add(word.substring(0, word.length() - 3))`; there is a logical error there.
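The likely bug in a line like `word.substring(0, word.length() - 3)` is slicing off a fixed-length suffix with no length check, which breaks (or empties) short words. A minimal corrected version of that logic, sketched in Python:

```python
def crude_stem(word, suffix_len=3):
    """Strip a fixed-length suffix, but only when the word is long
    enough -- slicing without this guard is the bug described above."""
    if len(word) > suffix_len:
        return word[:-suffix_len]
    return word

long_word = crude_stem("running")   # long enough: suffix removed
short_word = crude_stem("go")       # too short: returned unchanged
```

In Java, `substring(0, word.length() - 3)` on a two-letter word would throw `StringIndexOutOfBoundsException`, so the equivalent guard is needed there too.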
What are the critical aspects to consider when scaling an AI chatbot to handle millions of users? The most critical aspect, as per me, is the hardware we are deploying all of this on, because if the systems we are using are not sufficient, they can easily crash. While I was working on a chatbot, the system was not up to the mark required to handle millions of users: it used to crash and hang, and it caused us to lose code multiple times. So system requirements should be the top priority. Apart from that, all the systems should have proper IT security, so there is no kind of threat. The next thing is the database: I would keep all the data in the most structured format possible, because the more structured the data, the quicker the answers, which reduces user wait time. Users never like to wait, so the most critical aspect while scaling the chatbot is that it should not compromise the speed and quality it provides. If either of those is compromised, the user experience suffers and it affects the company very badly. It should maintain or improve its accuracy and speed, but they should not decrease.
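One standard technique for the speed concern raised above is caching answers to repeated questions so the expensive model or database call runs only once per distinct query. A minimal sketch using the standard library, with a hypothetical `answer` function:

```python
from functools import lru_cache

calls = {"n": 0}

@lru_cache(maxsize=1024)
def answer(question):
    # Expensive model/database work happens only on a cache miss.
    calls["n"] += 1
    return f"reply to {question}"

answer("faq: pricing")
answer("faq: pricing")   # second call is served from the cache
```

At real scale the same idea moves out of process, for example into Redis shared across chatbot replicas, so millions of users asking the same FAQ hit the model only once.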
Discuss how you would implement voice recognition and processing capabilities in a chatbot. For voice recognition there are dedicated libraries I would like to use, for example Librosa. I would prefer to do it in Python because I'm good with it, and I have already worked on voice recognition, speech-to-text, text-to-speech, and all of that processing, so I would prefer those tools. While processing, there will be noise, so the first thing, as soon as I read the audio, is to remove the noise. After that, for whatever data I'm getting, I would perform sampling and other cleansing steps. Once all the cleansing is done, I would perform whatever is required, depending on the problem statement and what exactly it wants me to achieve: whether it wants me to understand that particular recording, predict something from the voices, or determine something else. Depending on the requirement, as soon as my data is clean, I would proceed.
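The "remove the noise first" step described above can be illustrated with a toy noise gate: samples below an amplitude floor are treated as background noise and zeroed out. The samples and threshold here are made up for illustration; real pipelines would use a library like Librosa on actual audio arrays.

```python
def noise_gate(samples, threshold=0.1):
    """Zero out samples whose amplitude is below the noise floor."""
    return [s if abs(s) >= threshold else 0.0 for s in samples]

noisy = [0.02, 0.5, -0.03, -0.7, 0.05]   # hypothetical audio samples
clean = noise_gate(noisy)
```

Real noise reduction is usually spectral (filtering in the frequency domain) rather than a simple amplitude gate, but the pipeline order is the same: denoise, resample, then hand the cleaned signal to the speech-to-text model.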
How would an understanding of graph databases benefit the development of an AI chatbot? Mainly, graphical representations are better for understanding than plain text data or plain communication, because graphs allow us to view things from different perspectives: the client's perspective, the company's perspective, and the end user's perspective. With a graph database, we can see how many users dislike our product or have complaints, which features they like the most and which they hate the most, where they are getting stuck and where to improvise, where our costs are being affected, what is taking a lot of time, and what is giving inaccurate answers. In all those things, a graph database helps a lot.
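The relationship queries sketched above (which users complain about which feature, and so on) are exactly what a graph database answers in a single traversal. A minimal in-memory sketch, with entirely hypothetical data, using edges that carry a relation label the way a property graph does:

```python
# Hypothetical property-graph edges: (source, relation, target).
edges = [
    ("alice", "likes", "search"),
    ("bob", "complains_about", "login"),
    ("carol", "complains_about", "login"),
]

def neighbors(relation, target):
    """All source nodes connected to `target` by `relation` --
    a one-hop traversal, the basic graph-database operation."""
    return [src for src, rel, dst in edges if rel == relation and dst == target]

complainers = neighbors("complains_about", "login")
```

In a real graph database such as Neo4j this would be a one-line Cypher query, and multi-hop questions ("users who complain about features their friends like") stay cheap, which is where graphs beat relational joins.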