Vetted Talent

Kurapati Venkata Krishna Gopinadh


I am a trustworthy and hardworking person who takes things positively. I am not a self-centered person: I take others' viewpoints and opinions into account, and I respect their ideologies.

When I take on work seriously, I complete it within hours. I believe in myself and learn from my mistakes. I like to help and motivate others so that they can get more out of what they want to do. I have good communication and listening skills. Simply put, my name denotes "a good and obedient person".

  • Role

    Data analyst

  • Years of Experience

    4 years

  • Professional Portfolio

    View here

Skillsets

  • Python - 4 Years
  • Power BI - 2 Years
  • SQL
  • Hadoop
  • PySpark
  • Natural Language Processing (NLP) - 3 Years
  • Deep Learning - 3 Years
  • PyTorch - 3 Years
  • TensorFlow - 3 Years
  • Matplotlib

Vetted For

10 Skills
  • AI Chatbot Developer (Remote) - AI Screening
  • Result: 57%
  • Skills assessed: CI/CD, AI Chatbot, Natural Language Processing (NLP), AWS, Azure, Docker, Google Cloud Platform, Kubernetes, Machine Learning, TypeScript
  • Score: 51/90

Professional Summary

4 Years
  • Aug 2020 - Present (5 yr 1 month)

    Associate

    Cognizant Technology Solutions
  • Aug 2020 - Present (5 yr 1 month)

    Software Developer

    Cognizant Technology Solutions India Private Ltd.

Applications & Tools Known

  • Python
  • Microsoft Power BI
  • Microsoft Azure SQL Database
  • SQL
  • PySpark
  • Hadoop
  • Java
  • Matplotlib
  • PyTorch
  • PeopleSoft
  • Jupyter Notebook
  • Pandas
  • NumPy

Work History

4 Years

Associate

Cognizant Technology Solutions
Aug 2020 - Present (5 yr 1 month)

    Led data collection and cleansing efforts, ensuring integrity and accuracy for analysis.

    Developed structured fact and dimension tables to organize data efficiently, facilitating streamlined analysis processes.

    Collaborated with team members to create customized views aligned with client requirements, enabling tailored insights.

    Utilized Power BI to transform raw data into visually engaging reports and dashboards, empowering stakeholders with actionable insights.

    Worked closely with a team of five members to coordinate data analysis efforts and ensure seamless project execution.

    Proactively engaged with clients to gather feedback and refine deliverables, ensuring alignment with evolving business needs.

    Applied analytical skills to address data challenges and optimize data processing workflows, enhancing overall project efficiency.

Software Developer

Cognizant Technology Solutions India Private Ltd.
Aug 2020 - Present (5 yr 1 month)
    Involved in various projects, including NLP-based chatbots, skin cancer prediction models, and payroll systems for different countries. Duties included production support, development, and analyzing and resolving production issues using SQL queries.

Achievements

  • Developed an NLP-based chatbot for managing Expense module inquiries
  • Developed an LSTM-based skin cancer prediction model for a corporate hospital
  • Implemented New Zealand and China payroll systems
  • Created many PeopleSoft objects during code changes and major developments

Major Projects

3 Projects

Image Captions using CNN+LSTM

    Developed a model that generates captions for images using deep learning models CNN and LSTM, including data preprocessing, transformation, and feature extraction.

House Price Prediction Using Machine Learning Models

    Developed a Jupyter Notebook in Python to predict house prices from a provided dataset. Conducted extensive data pre-processing, cleaning, and feature engineering using Pandas and NumPy libraries. Implemented data encoding techniques and built machine learning models for prediction. Successfully trained and deployed the model to predict house prices based on various features.

Financial Sentiment Analysis Using Natural Language Processing

    Developed an NLP model to analyze sentiment (positive, negative, neutral) in financial datasets. Implemented pre-processing techniques including punctuation removal, stop word removal, and POS tagging. Utilized advanced NLP methods for model training and prediction, achieving precise sentiment analysis results. Demonstrated expertise in data-driven decision-making through accurate sentiment predictions in financial contexts.
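The pre-processing steps described (punctuation and stop-word removal) could be sketched as follows; the stop-word list here is a small illustrative one, not the list actually used in the project:

```python
import string

# Small illustrative stop-word list; a real pipeline would use a fuller
# list (e.g. NLTK's English stop words).
STOP_WORDS = {"the", "a", "an", "is", "are", "of", "for", "in", "to", "and"}

def preprocess(text: str) -> list[str]:
    """Lowercase, strip punctuation, and drop stop words."""
    # Remove punctuation characters.
    cleaned = text.translate(str.maketrans("", "", string.punctuation))
    # Tokenize on whitespace and filter out stop words.
    return [tok for tok in cleaned.lower().split() if tok not in STOP_WORDS]

tokens = preprocess("The company's profits are rising, beating estimates!")
```

POS tagging, also mentioned above, would typically come after this tokenization step using an NLP library.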

Education

  • M.Tech, Data Science & Computer Science Engineering

    Birla Institute of Technology & Science (2023)
  • B.Tech, Electronics and Communication Engineering

    Lovely Professional University (2020)

Certifications

  • ChatGPT Prompt Engineering for Developers (short course, DeepLearning.AI)

  • LangChain for LLM Application Development (short course, DeepLearning.AI)

Interests

  • Watching Movies
  • Travelling
  • Driving
  • Badminton
  • Chess
  • Games
  • Cricket
  • Cooking

AI-Interview Questions & Answers

    Hi, my name is Gopinadh, and I currently work at Cognizant Technology Solutions as a data analyst. Our client is a banking and insurance company based outside North America, so we deal with banking data: we create visualization charts using Power BI and do data modelling on SQL Server. I completed my M.Tech at BITS Pilani in October 2023 and my B.Tech at Lovely Professional University in 2020. I have good knowledge of Python, ML, AI, NLP, deep learning, Power BI, and SQL Server technologies. That is a short introduction to my background.

    A chatbot's core principles are that it needs to give an accurate result for the prompt the user supplies, stay relevant to the user's current situation, and respond immediately. It should not repeat itself in a way that irritates the user, it should give appropriate solutions, and it helps if it can also offer good suggestions. Those are the solid principles.

    To optimize a SQL query that aggregates data across multiple tables for a chatbot response, we can use several techniques. We can normalize the schema (first, second, and third normal form), join the tables we need, and use CTEs (common table expressions) to structure multi-table logic. We can also create procedures or views that pre-aggregate the data. For the aggregation itself, we join the tables and then use GROUP BY with aggregate functions such as SUM, COUNT, MIN, and MAX to roll the data up from multiple tables in a single query.
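As a sketch of the join-then-aggregate approach with a CTE and GROUP BY, using an in-memory SQLite database and illustrative table names (not any real schema):

```python
import sqlite3

# In-memory demo tables; the names and data are made up for illustration.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE users(id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE orders(id INTEGER PRIMARY KEY, user_id INTEGER, amount REAL);
INSERT INTO users VALUES (1, 'Asha'), (2, 'Ravi');
INSERT INTO orders VALUES (1, 1, 10.0), (2, 1, 15.0), (3, 2, 7.5);
""")

# A CTE joins the tables once; GROUP BY then aggregates per user,
# instead of issuing one query per user.
rows = conn.execute("""
WITH user_orders AS (
    SELECT u.name, o.amount
    FROM users u JOIN orders o ON o.user_id = u.id
)
SELECT name, COUNT(*) AS n_orders, SUM(amount) AS total
FROM user_orders
GROUP BY name
ORDER BY name
""").fetchall()
```

A view or stored procedure wrapping the same SELECT would let the chatbot reuse the aggregation without repeating the query text.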

    To check a chatbot's performance, we first need to check its accuracy. The response quality mainly depends on the NLP techniques used: we take the sentence, vectorize it, send the vectorized input to the model, and predict the required output. For evaluation we can use a sentence analyzer and metrics such as precision, which tells us whether the required answer is actually being returned. For the best results, it also helps to use up-to-date techniques in the chatbot.
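Precision, mentioned above, can be computed directly; the labels below are invented for illustration (1 marks a correctly resolved response):

```python
def precision(y_true, y_pred, positive=1):
    """Precision = true positives / all predicted positives."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if p == positive and t == positive)
    predicted_pos = sum(1 for p in y_pred if p == positive)
    # Guard against division by zero when nothing was predicted positive.
    return tp / predicted_pos if predicted_pos else 0.0

# Hypothetical evaluation labels: 1 = "required answer returned", 0 = not.
y_true = [1, 0, 1, 1, 0]
y_pred = [1, 1, 1, 0, 0]
p = precision(y_true, y_pred)  # 2 of the 3 positive predictions are correct
```

Recall and F1 follow the same pattern and are usually reported alongside precision.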

    We can handle concurrency by using cloud technologies with load balancing, keeping each interactive session in its own cluster so that it does not get mixed up with other conversations. Using cluster or server load-balancing technologies, the chatbot can serve multiple users at once while maintaining a separate chat for each user, so the experience does not become difficult for anyone.

    We can handle overfitting with several techniques. We can apply regularization, and we can try different optimizers, such as Adam, chosen according to the data and the model, which helps prevent the model from overfitting. We can also standardize the data so that it stays consistent across all rows (centered and scaled around the mean), which likewise reduces overfitting. Finally, we can experiment with different learning rates and optimizer settings for the neural network to find a configuration that avoids overfitting.
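A minimal, dependency-free illustration of L2 regularization, one of the techniques mentioned: the penalty term shrinks the fitted weight relative to the unregularized fit. The data and learning rate are made up for illustration:

```python
def fit_line(xs, ys, l2=0.0, lr=0.01, steps=2000):
    """Fit y = w*x by gradient descent, with an optional L2 penalty on w."""
    w = 0.0
    n = len(xs)
    for _ in range(steps):
        # Gradient of mean squared error plus the derivative of l2 * w**2.
        grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / n + 2 * l2 * w
        w -= lr * grad
    return w

xs, ys = [1.0, 2.0, 3.0], [2.0, 4.0, 6.0]  # exact fit is w = 2
w_plain = fit_line(xs, ys)        # converges close to 2.0
w_reg = fit_line(xs, ys, l2=1.0)  # pulled toward 0 by the penalty
```

The same idea appears in deep learning frameworks as the optimizer's weight-decay setting.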

    The performance issue in this code is that it queries the database once per record: for each entry it checks whether the database and the record table are present, runs a query against that table, and only then appends the result to a list of dictionaries with "data" as the key. Checking every database and every record this way means many separate round trips, which takes time, so the chatbot responds slowly to the user.
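A minimal sketch of the per-record query pattern and its fix, using an in-memory SQLite table with hypothetical names:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE records(id INTEGER PRIMARY KEY, data TEXT);
INSERT INTO records VALUES (1, 'a'), (2, 'b'), (3, 'c');
""")

# Slow pattern (as in the answer above): one query per record id.
ids = [r[0] for r in conn.execute("SELECT id FROM records")]
slow = [{"data": conn.execute("SELECT data FROM records WHERE id = ?",
                              (i,)).fetchone()[0]}
        for i in ids]

# Faster: fetch everything in a single query, one round trip instead of N.
fast = [{"data": d} for (d,) in conn.execute("SELECT data FROM records ORDER BY id")]

assert slow == fast  # same result either way
```

Over a network connection the per-record version's latency grows linearly with the number of records, which is what makes the chatbot feel slow.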

    This code takes a list of words and, for each word ending in "ing", removes the suffix by taking the substring from zero to the word's length minus three, then adds the result to an ArrayList, which the stem_words function returns. There appear to be two problems. First, the ArrayList declaration is missing its type or initialization, which is likely the cause of the compilation error. Second, there is a logic bug: only the words that had "ing" removed are added to the list, while words without the suffix are dropped entirely. Since the intended behavior is presumably to return all the words, with "ing" stripped where present, the words without the suffix also need to be appended.
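A corrected version of the stemming logic, sketched here in Python rather than the original Java: every word is kept, and the "ing" suffix is stripped where present.

```python
def stem_words(words: list[str]) -> list[str]:
    """Strip a trailing 'ing' where present, but keep every word."""
    result = []
    for word in words:
        if word.endswith("ing"):
            result.append(word[:-3])  # drop the three-character suffix
        else:
            result.append(word)       # the buggy version skipped these words
    return result

stems = stem_words(["running", "cat", "coding"])
```

Note this is naive suffix stripping, matching the code under discussion; a real stemmer (e.g. Porter) handles far more cases.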

    For a caching strategy, we can use different caching methods. I am not deeply familiar with all of them, but the basic idea is to store prompts and their responses, possibly in the cloud, so that when a user sends a prompt of a type we have seen before, the chatbot automatically returns the stored response instead of regenerating it. That is what I know about caching here.
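One simple way to realize the prompt-response store described above is an in-memory cache; the model call below is a hypothetical stand-in, not a real API:

```python
from functools import lru_cache

calls = []  # tracks how often the (hypothetical) model is actually invoked

def _generate_response(prompt: str) -> str:
    # Stand-in for a real LLM or retrieval call.
    calls.append(prompt)
    return f"answer to: {prompt}"

@lru_cache(maxsize=1024)
def cached_response(prompt: str) -> str:
    """Serve repeated prompts from the cache instead of regenerating."""
    return _generate_response(prompt)

a = cached_response("What is NLP?")
b = cached_response("What is NLP?")  # cache hit: the model is not called again
```

A production chatbot would typically use a shared cache (e.g. a cloud key-value store) rather than per-process memory, so all instances benefit.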

    Sentiment analysis is used to find whether a person is in a positive or negative mood. We take the sentence, vectorize it, and then look for negative keywords such as "not" and "no". If such keywords are present, we treat it as a negative sentence, conclude the user is in a bad mood, and have the chatbot respond with positive, calming messages. In general, we vectorize first and then compare keyword counts: if positive keywords are more frequent, the positive percentage is high; if negative keywords are more frequent, the negative percentage is high. Whichever percentage is higher determines the label, and the chatbot responds accordingly so the user stays comfortable and engaged until they get the result they need.
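The keyword-counting scheme described above might look like this in Python; the keyword sets are small illustrative examples, not production lexicons:

```python
NEGATIVE = {"not", "no", "never", "bad", "worse"}       # illustrative only
POSITIVE = {"good", "great", "happy", "profit", "gain"}  # illustrative only

def keyword_sentiment(text: str) -> str:
    """Label text by comparing counts of positive and negative keywords."""
    tokens = text.lower().split()
    pos = sum(tok in POSITIVE for tok in tokens)
    neg = sum(tok in NEGATIVE for tok in tokens)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

label = keyword_sentiment("this is not a good day no it is bad")
```

Keyword counting misses negation scope ("not good" counts one of each side here), which is why trained models are preferred in practice.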

    Continuous integration and deployment are very important because technologies, models, and user requirements keep growing. We need to detect any new bugs in the code, resolve them, and continuously integrate and deploy the updates to the chatbot so that it responds to users without lag or delay. Since a chatbot interacts directly with users query by query, it needs continuous monitoring, and any errors should be resolved quickly, within 5 to 6 hours, which is why continuous integration and deployment matter so much for a chatbot.