
Rudri Jani

Vetted Talent

Seeking a position where I can contribute my skills to the organization's success and keep pace with new technology while remaining resourceful, innovative, and flexible. A technology enthusiast and enterprising individual with a strong educational background and 3+ years of experience as an AI Engineer working on traceable projects.

  • Role

    AI Chatbot Developer

  • Years of Experience

    3 years

  • Professional Portfolio

    View here

Skillsets

  • Flask - 3 Years
  • Chatbots - 3 Years
  • Sklearn - 2 Years
  • spaCy - 2 Years
  • NLTK - 3 Years
  • LangChain - 1 Year
  • MongoDB - 1 Year
  • LLM - 1 Year
  • NLP - 3 Years
  • OpenAI - 1 Year
  • FastAPI - 3 Years
  • Seaborn - 3 Years
  • Matplotlib - 3 Years
  • Python - 3 Years
  • TensorFlow - 3 Years
  • Keras - 3 Years
  • pandas - 3 Years
  • NumPy - 3 Years
  • Elasticsearch - 2 Years
  • Postman - 3 Years
  • Git - 3 Years
  • Docker - 1 Year
  • MySQL - 2 Years
  • Python - 4 Years
  • MongoDB - 3 Years

Vetted For

10 Skills
  • Role: AI Chatbot Developer (Remote) - AI Screening
  • Result: 63%
  • Skills assessed: CI/CD, AI Chatbot, Natural Language Processing (NLP), AWS, Azure, Docker, Google Cloud Platform, Kubernetes, Machine Learning, TypeScript
  • Score: 57/90

Professional Summary

3 Years
  • Jan, 2024 - Present (2 yr)

    AI Engineer

    F.R.O.M
  • Aug, 2022 - Aug, 2023 (1 yr)

    AI ML Engineer

    smartSense Consulting Solutions
  • Aug, 2021 - Aug, 2022 (1 yr)

    AI/ML Engineer Intern

    smartSense Consulting Solutions
  • Dec, 2019 - Jun, 2020 (6 months)

    Software Developer Intern

    Leeway Soft-Tech Pvt. Ltd.

Applications & Tools Known

  • Google Colab
  • GitLab
  • GitHub
  • Git
  • Postman
  • Microsoft Power BI
  • Jupyter Notebook
  • Visual Studio Code
  • Anaconda
  • REST API
  • Skype
  • Microsoft Teams
  • Slack
  • Zoom
  • AWS (Amazon Web Services)
  • MS Excel
  • Spreadsheets
  • Airflow
  • Docker

Work History

3 Years

AI Engineer

F.R.O.M
Jan, 2024 - Present (2 yr)
    • Constructed a chatbot to gather client needs using generative AI, LLMs, and OpenAI models; this sped up project development by 70% by eliminating the project manager's need to gather requirements manually.
    • Created several LLM chains using OpenAI generative models, applying prompt engineering to convert client requirements into technical specifications, create a sprint schedule for the developers, and assign tasks according to their roles and responsibilities.
    • Developed OpenAI assistants with customized features that use open APIs to offer functionalities such as flight and hotel searches, job searches, weather forecasts, and so forth, depending on user queries posed in natural language.
    • Implemented custom LangChain agents on top of the existing OpenAI assistants that automatically determine which custom tools to use to answer user queries (see the sketch after this list).
    • Designed technical documentation for the application of OpenAI's function calling, retrieval, and code interpretation techniques.
    • Developed a Django application using transformers and Hugging Face models to extract significant information from unstructured paragraphs into sorted and unordered lists.

AI ML Engineer

smartSense Consulting Solutions
Aug, 2022 - Aug, 2023 (1 yr)
    • Worked on different ML and NLP tasks such as named entity recognition, question-answering systems, and search engines.
    • Handled data collection, data cleaning, and data annotation tools, and fine-tuned Hugging Face transformer models using Python and APIs.
    • Fetched and saved data to NoSQL databases such as MongoDB and SQL databases such as MySQL.
    • Have a good understanding of Elasticsearch index creation and search queries.
    • Have experience with containerization tools such as Docker.

AI/ML Engineer Intern

smartSense Consulting Solutions
Aug, 2021 - Aug, 2022 (1 yr)
    • Created a natural language processing chatbot for an online pet store using Keras, NLTK, named entity recognition, Elasticsearch, and MongoDB.
    • Wrote an M.Tech thesis under the supervision of Mr. Mayur Pabari on a deep-learning-based natural language processing e-commerce chatbot.

Software Developer Intern

Leeway Soft-Tech Pvt. Ltd.
Dec, 2019 - Jun, 2020 (6 months)
    • Built a rule-based, pattern-recognition chatbot for Leeway Soft-Tech Pvt. Ltd.
    • Created a chatbot for a bank using Python and deployed it on the bank's website.
    • Developed a WhatsApp chatbot for the company's website using Google Dialogflow and the WhatsApp API.

Achievements

  • Publication: https://isrdo.org/journal/SRJSET/currentissue/deep-learning-based-natural-language-processing-e-commerce-chatbot-1

Major Projects

6 Projects

Foster Search (Elasticsearch, Python)

SmartSense Consulting Solutions
Aug, 2022 - Dec, 2023 (1 yr 4 months)
    • This is a feature of the live web portal that connects students, schools/colleges, employers, and available jobs.
    • To make the search effective, a complex boolean Elasticsearch query was created using custom mappings and settings (see the sketch after this list).
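A minimal sketch of the kind of custom-mapped index and boolean query this bullet refers to, assuming the elasticsearch-py 8.x client and a locally running cluster; the index name and field names are hypothetical.

    # Minimal sketch of a custom-mapped index and a boolean search, assuming the
    # elasticsearch-py 8.x client; the index name and fields are hypothetical.
    from elasticsearch import Elasticsearch

    es = Elasticsearch("http://localhost:9200")

    # Custom mappings/settings so free text is analyzed while facets stay exact.
    es.indices.create(
        index="foster-profiles",
        settings={"analysis": {"analyzer": {"default": {"type": "standard"}}}},
        mappings={"properties": {
            "name": {"type": "text"},
            "degree": {"type": "keyword"},
            "location": {"type": "keyword"},
            "about": {"type": "text"},
        }},
    )

    # Boolean query: full-text match on the profile text, exact filters on facets.
    results = es.search(
        index="foster-profiles",
        query={"bool": {
            "must": [{"match": {"about": "machine learning internship"}}],
            "filter": [
                {"term": {"degree": "B.E."}},
                {"term": {"location": "Ahmedabad"}},
            ],
        }},
    )
    for hit in results["hits"]["hits"]:
        print(hit["_source"]["name"], hit["_score"])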

Foster Advance Search (NLP, Elasticsearch, NER, Python)

Aug, 2022 - Dec, 2023 (1 yr 4 months)
    • This is the component of the web portal that connects students, schools/colleges, companies, and career opportunities.
    • It extracts the degree, education program, college/university, stream, etc. from the search query and shows the results accordingly (a small extraction sketch follows this list).
    • Added multiple features and created modules for the project.
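A small, hedged sketch of rule-based entity extraction from a search query, assuming spaCy with the en_core_web_sm model installed; the labels and patterns are illustrative, not the production rules.

    # Minimal sketch of extracting search facets from a query with spaCy.
    # Assumption: en_core_web_sm has been downloaded; patterns are hypothetical.
    import spacy

    nlp = spacy.load("en_core_web_sm")
    ruler = nlp.add_pipe("entity_ruler", before="ner")
    ruler.add_patterns([
        {"label": "DEGREE", "pattern": [{"LOWER": "m.tech"}]},
        {"label": "DEGREE", "pattern": [{"LOWER": "b.tech"}]},
        {"label": "STREAM", "pattern": [{"LOWER": "computer"}, {"LOWER": "science"}]},
    ])

    query = "M.Tech computer science colleges in Gujarat"
    doc = nlp(query)

    # Collect the extracted facets, then use them as filters in the search backend.
    facets = {ent.label_: ent.text for ent in doc.ents}
    print(facets)  # e.g. {'DEGREE': 'M.Tech', 'STREAM': 'computer science', 'GPE': 'Gujarat'}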

Deep Learning Based NLP E-Commerce Chatbot (NER, Elasticsearch, Python, NoSQL)

Aug, 2022 - Dec, 2023 (1 yr 4 months)
    • The objective of the project was to guide customers to pet products using an NLP-based, AI-driven chatbot.
    • It performs entity recognition on the user query and finds matching products in Elasticsearch and a MongoDB database.
    • The chatbot was containerized with Docker.

Document Search Engine (Data Preparation, NLP, Deep Learning, Elasticsearch)

Aug, 2022 - Dec, 2023 (1 yr 4 months)
    • As a product, this project provides a question-answering utility across the organization using a retriever-ranker-reader pipeline over documents such as resumes, ISO regulations, etc. (a minimal pipeline sketch follows this list).
    • It can answer questions about particular candidates' resumes, such as personal details, education, work experience, and interests.
    • Deployed the solution, integrated it with the AI Catalog, and designed a custom interface using the Streamlit framework in Python.
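A minimal sketch of a retriever-reader step like the pipeline mentioned above, assuming the sentence-transformers and transformers libraries (models download on first use); the documents and model choices are illustrative.

    # Minimal sketch of a retriever-reader question-answering step.
    # Assumptions: sentence-transformers and transformers installed; the two
    # documents below are hypothetical stand-ins for the real corpus.
    from sentence_transformers import SentenceTransformer, util
    from transformers import pipeline

    documents = [
        "The candidate has three years of experience as an AI engineer.",
        "The ISO regulation requires an annual review of quality records.",
    ]

    # Retriever: embed the corpus and the question, keep the best-matching document.
    retriever = SentenceTransformer("all-MiniLM-L6-v2")
    doc_emb = retriever.encode(documents, convert_to_tensor=True)

    question = "How many years of experience does the candidate have?"
    q_emb = retriever.encode(question, convert_to_tensor=True)
    best = int(util.cos_sim(q_emb, doc_emb).argmax())

    # Reader: extract the answer span from the retrieved document.
    reader = pipeline("question-answering", model="distilbert-base-cased-distilled-squad")
    print(reader(question=question, context=documents[best]))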

Classifying Exoplanets from the NASA Database (Machine Learning, Data Analysis, Statistical Analysis, Cloud Deployment)

Jan, 2021 - May, 2021 (4 months)
    • Led a team of 4 to create a machine learning pipeline that classifies planets from NASA's exoplanet archive using an XGBoost decision-tree model; a working web interface is deployed on Heroku (a training sketch follows this list).
    • The dataset has around 9,560 examples and 52 features.
    • The working model gave 96.16% accuracy on unseen data.
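A hedged sketch of the kind of XGBoost classification pipeline described above, with synthetic data standing in for the NASA exoplanet archive; sizes and hyperparameters are illustrative assumptions.

    # Minimal sketch of an XGBoost classification pipeline; synthetic data is used
    # here in place of the real exoplanet CSV, and hyperparameters are illustrative.
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score
    from xgboost import XGBClassifier

    # Stand-in for the ~9,560 x 52 exoplanet feature matrix.
    X, y = make_classification(n_samples=9560, n_features=52, n_informative=20, random_state=42)
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

    model = XGBClassifier(n_estimators=300, max_depth=6, learning_rate=0.1, eval_metric="logloss")
    model.fit(X_train, y_train)

    print("accuracy on unseen data:", accuracy_score(y_test, model.predict(X_test)))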

YouTube Data Analysis (Data Analysis and Visualization, Machine Learning, PySpark)

Jan, 2021 - May, 2021 (4 months)
    • The project focuses on data analysis and visualization of YouTube videos using different parameters.
    • The data consists of around 158,090 examples and 17 columns. Based on views and comment counts, it also predicts likes using PySpark machine learning algorithms (see the sketch after this list).
    • The regression model gave around 85.84% accuracy on unseen data.
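A hedged sketch of a PySpark likes-prediction regression like the one described, with a small synthetic DataFrame standing in for the ~158,090-row dataset; the column names follow the project description, everything else is assumed.

    # Minimal sketch of predicting likes from views and comment counts in PySpark.
    # Assumption: pyspark is installed; the synthetic rows replace the real dataset.
    from pyspark.sql import SparkSession
    from pyspark.ml.feature import VectorAssembler
    from pyspark.ml.regression import LinearRegression
    from pyspark.ml.evaluation import RegressionEvaluator

    spark = SparkSession.builder.appName("youtube-likes").getOrCreate()

    # Synthetic (views, comment_count, likes) rows standing in for the CSV data.
    rows = [(v, v // 40, int(v * 0.08)) for v in range(1_000, 201_000, 1_000)]
    df = spark.createDataFrame(rows, ["views", "comment_count", "likes"])

    # Assemble the numeric predictors into a single feature vector column.
    data = VectorAssembler(inputCols=["views", "comment_count"], outputCol="features") \
        .transform(df).select("features", "likes")

    train, test = data.randomSplit([0.8, 0.2], seed=42)
    model = LinearRegression(featuresCol="features", labelCol="likes").fit(train)

    preds = model.transform(test)
    print("R^2 on held-out rows:", RegressionEvaluator(labelCol="likes", metricName="r2").evaluate(preds))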

Education

  • M.Tech in Computer Science and Engineering

    Ahmedabad University, Ahmedabad (2022)
  • B.E in Computer Science and Engineering

    Gujarat Technological University, Gujarat (2019)

Certifications

  • Managing Big Data in Clusters and Cloud Storage using Hive and Impala - Coursera (Apr, 2022)
  • Mathematics with Machine Learning - Coursera (Feb, 2021)
  • The Complete Python Data Analysis and Visualization - Udemy
  • Introduction to Machine Learning for Data Science - Udemy

Interests

  • Painting
  • Drawing
  • Listening to Music

AI-Interview Questions & Answers

Hello, everyone. I'm Rudri Jani. I have done my M.Tech from Ahmedabad University, and I have been working in this field for more than two and a half years. I have worked on different technologies for NLP and machine learning: Hugging Face transformers, generative AI, LLMs, fine-tuning pre-trained models, prompt engineering, OpenAI, etcetera. I have been a good performer in my company and have always been appreciated by my coworkers, peers, and seniors. I am hardworking and sincere, I like to work with different technologies and try new things, and I am always enthusiastic to learn upcoming technologies.

To prevent overfitting in the chatbot's neural network, we can adjust the number of layers and increase the training data. We can also add dropout so that the chatbot gives proper, relevant replies, and we can use L1 and L2 regularization techniques to reduce overfitting.
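As a concrete illustration of the dropout and L1/L2 regularization mentioned in this answer, here is a minimal Keras sketch; the layer sizes and intent count are hypothetical.

    # Minimal sketch of dropout and L2 regularization in a small Keras intent
    # classifier, assuming TensorFlow/Keras; sizes are illustrative assumptions.
    from tensorflow import keras
    from tensorflow.keras import layers, regularizers

    num_features, num_intents = 500, 12   # hypothetical bag-of-words size / intent count

    model = keras.Sequential([
        layers.Input(shape=(num_features,)),
        layers.Dense(128, activation="relu",
                     kernel_regularizer=regularizers.l2(1e-4)),   # L2 penalty on weights
        layers.Dropout(0.3),                                      # randomly drop 30% of units
        layers.Dense(64, activation="relu",
                     kernel_regularizer=regularizers.l2(1e-4)),
        layers.Dropout(0.3),
        layers.Dense(num_intents, activation="softmax"),
    ])
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
    model.summary()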

For real-time chatbot message handling, we can use different design patterns. We can store the chats in a NoSQL database, which stores unstructured data. We can also train the model with pattern recognition or intent identification to identify what exactly the client or user wants. If the chatbot needs to extract information from an API, we can make real-time API calls to get the current, latest data; for example, if we are developing a chatbot for weather forecasting and the user asks for today's forecast in a particular region, we pass that query to the API through the chatbot interface, and the current real-time data is returned to the user. We can also use voice-based patterns, with text-to-speech and speech-to-text, so the chatbot supports multimodal interaction; then even users who cannot type, or cannot type quickly, can still use it.
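A minimal sketch of two of the patterns mentioned here, intent identification plus storing chats in a NoSQL database, assuming pymongo and a local MongoDB instance; the keyword lists and canned replies are hypothetical.

    # Minimal sketch of intent identification plus NoSQL chat logging.
    # Assumptions: pymongo installed, MongoDB running locally; intents are hypothetical.
    from datetime import datetime, timezone
    from pymongo import MongoClient

    chats = MongoClient("mongodb://localhost:27017")["chatbot"]["messages"]

    INTENT_KEYWORDS = {
        "weather": ["weather", "forecast", "temperature"],
        "greeting": ["hello", "hi", "hey"],
    }

    def identify_intent(text: str) -> str:
        lowered = text.lower()
        for intent, words in INTENT_KEYWORDS.items():
            if any(word in lowered for word in words):
                return intent
        return "fallback"

    def handle_message(user_id: str, text: str) -> str:
        intent = identify_intent(text)
        reply = {"weather": "Fetching the latest forecast...",   # a real bot would call a live API here
                 "greeting": "Hello! How can I help you today?",
                 "fallback": "Sorry, could you rephrase that?"}[intent]
        # Persist the unstructured conversation turn for context and analytics.
        chats.insert_one({"user": user_id, "text": text, "intent": intent,
                          "reply": reply, "ts": datetime.now(timezone.utc)})
        return reply

    print(handle_message("u123", "What is the weather forecast for Ahmedabad?"))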

To measure the chatbot's performance, we can use different KPI metrics: how many users are visiting the bot and how much it is being used, the fallback rate, and the frequency of questions asked. We can also collect user feedback, which helps with performance monitoring, and look at what type of customers are visiting the chatbot and how they are feeling through sentiment analysis, that is, what kind of impression we are giving the user. Then there are the bounce rate, the type of questions asked (frequently asked questions), user ratings, conversion rate, conversation duration, how long users stay with the chatbot, number of sessions per channel, etcetera; all of these help monitor performance.

For this type of thing, we have to train the chatbot so it can identify whether words are slang or not. For example, if we are using the latest advanced GPT models, they can identify which words are slang and which are not. Other than that, we can train a machine learning algorithm or a deep learning neural network to identify which type of words it is receiving and, based on that, filter those words out. We can also keep a database where slang or non-standard language is stored and, using Elasticsearch queries or text analysis, detect that the language is not allowed and filter out or report those questions as well.
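A small sketch of the database-backed filtering idea in this answer, with an in-memory set standing in for the stored word list; the words and messages are placeholders.

    # Minimal sketch of a word-list filter for non-standard language; the
    # in-memory BLOCKED_WORDS set stands in for a stored word list or index.
    import re

    BLOCKED_WORDS = {"darn", "heck"}   # hypothetical placeholder word list

    def contains_blocked_language(text: str) -> bool:
        tokens = re.findall(r"[a-z']+", text.lower())
        return any(token in BLOCKED_WORDS for token in tokens)

    def moderate(text: str) -> str:
        if contains_blocked_language(text):
            return "Please rephrase your message using appropriate language."
        return text

    print(moderate("heck, this product is late"))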

SQL aggregation functions can be used to build the chatbot response from multiple tables based on the user query. For example, if the user is asking for product details and those details are distributed across multiple tables, then by analyzing the query, for instance with entity extraction, we can identify which entities are present in it. If the user says, "I want the product details about soaps," and the descriptions, ratings, and prices sit in different tables, we can join the tables and aggregate them with SQL aggregation functions to get the response. We can then use NLP to phrase the answer in natural language for the user, so it does not feel machine-generated.
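A runnable sketch of the join-and-aggregate idea in this answer, using an in-memory SQLite schema that is purely illustrative.

    # Minimal sketch of answering a product query by joining and aggregating
    # multiple tables; the in-memory SQLite schema and rows are hypothetical.
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.executescript("""
        CREATE TABLE products (id INTEGER PRIMARY KEY, name TEXT);
        CREATE TABLE prices   (product_id INTEGER, price REAL);
        CREATE TABLE ratings  (product_id INTEGER, stars INTEGER);
        INSERT INTO products VALUES (1, 'lavender soap'), (2, 'neem soap');
        INSERT INTO prices   VALUES (1, 3.5), (2, 2.0);
        INSERT INTO ratings  VALUES (1, 5), (1, 4), (2, 3);
    """)

    # Aggregate the ratings and join everything the chatbot needs in one query.
    rows = con.execute("""
        SELECT p.name, pr.price, AVG(r.stars) AS avg_rating
        FROM products p
        JOIN prices  pr ON pr.product_id = p.id
        JOIN ratings r  ON r.product_id  = p.id
        WHERE p.name LIKE '%soap%'
        GROUP BY p.id
    """).fetchall()

    for name, price, avg_rating in rows:
        print(f"{name}: ${price:.2f}, rated {avg_rating:.1f}/5")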

Here it is set for a time, but it is waiting for the user input, and that is the mistake: it will keep waiting until the user input arrives. There is no async/await in this code, so it is not doing any asynchronous handling.
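Since the snippet under review is not reproduced here, this is only a generic sketch of how a blocking wait for user input could be made asynchronous with asyncio; the names and the timeout are assumptions.

    # Generic sketch (the reviewed snippet is not shown here) of avoiding a
    # blocking wait by reading user input on an executor with a bounded timeout.
    import asyncio

    async def get_user_input(prompt: str, timeout: float = 5.0) -> str | None:
        loop = asyncio.get_running_loop()
        try:
            # input() blocks, so run it in the default executor and bound the wait.
            return await asyncio.wait_for(loop.run_in_executor(None, input, prompt), timeout)
        except asyncio.TimeoutError:
            return None

    async def main() -> None:
        reply = await get_user_input("Ask the chatbot something: ")
        print("No input received." if reply is None else f"You asked: {reply}")

    asyncio.run(main())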

Here, if there is a database with a large amount of data, this loop will do more computation and take too many iterations to identify and fetch the data. If we run this for loop in batches, the performance can be improved.
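Again, the reviewed loop is not shown here, so this is a generic sketch of processing database rows in batches with a DB-API cursor's fetchmany() rather than row by row; the table and batch size are illustrative.

    # Generic sketch of batched row processing with fetchmany(); the tiny
    # in-memory table is a stand-in for the large production database.
    import sqlite3

    BATCH_SIZE = 1000

    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE events (id INTEGER, payload TEXT)")
    con.executemany("INSERT INTO events VALUES (?, ?)",
                    [(i, f"event-{i}") for i in range(5000)])

    def process(row) -> None:
        pass  # placeholder for the per-row work

    cur = con.execute("SELECT * FROM events")
    while True:
        batch = cur.fetchmany(BATCH_SIZE)  # one fetch per batch, not per row
        if not batch:
            break
        for row in batch:
            process(row)
    print("done")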

When scaling an AI chatbot to handle millions of users, we have to make sure it can serve all users at a time. It should use async/await so that millions of users can be handled asynchronously, no user has to wait for a long time, and the system is scalable. Other than that, we can use Docker and keep the containers lightweight so they can be handled effectively, and choose a suitable deployment platform; we have to ensure the server where the chatbot is deployed is running fine and able to handle many users. We have to use efficient chatbot algorithms that can handle multiple requests at a time, process them quickly, and return responses effectively. The data volumes where the data is stored should also be fast enough to fetch the data efficiently and give quick responses. We can also use real-time APIs to fetch records, and we have to ensure those APIs are not taking too much time and reply quickly as well.

If we are using graph databases, we can find the responses quickly and effectively. For example, if we use a node-based structure to find the details of candidates who are applying, it improves the search time: there will be one node with a candidate's skills, another node with their years of experience, and another node with their education, and if different candidates have the same skills or the same years of experience, we can link them and access the information quickly. It also gives explicit and complete control over the answers provided by the chatbot and helps avoid hallucination. If we are using a knowledge graph, it captures the concepts, structures, entities, and relationships, and from there we can easily identify the answers and respond to the user.

Data visualization tools like Tableau or Power BI can be integrated using APIs. Other than that, we can also generate graphs or pie charts for dynamic response generation, so that when the chatbot's response involves tabular data or numerical analysis, the results are easily interpretable and understandable for the user. We can also use heat maps so the user can see which parts are important and which are not.