
Manish Choudhary

Vetted Talent

I am passionate about mathematics, backend development, data science, and product development.

  • Role

    Sr. Python Engineer

  • Years of Experience

    6 years

  • Professional Portfolio

    View here

Skillsets

  • Scikit-learn
  • Linux
  • Manjaro
  • Matplotlib
  • MongoDB
  • Neural-networks
  • NLP
  • NoSQL
  • PCA
  • Plotly
  • Putty
  • PyCharm
  • PySpark
  • Regex
  • Keras
  • Scrapy
  • Seaborn
  • Selenium
  • spaCy
  • Spark
  • Spyder
  • TensorFlow
  • Ubuntu
  • VS Code
  • Windows
  • AWS
  • EC2
  • S3
  • ReactJs
  • Python - 6 Years
  • SQL - 5 Years
  • JavaScript - 2 Years
  • HTML
  • Azure
  • Docker
  • NumPy
  • pandas
  • Data Visualization
  • Web Scraping
  • Data Wrangling
  • Deep Learning
  • Machine Learning
  • Django - 4 Years
  • Anaconda
  • Azure DevOps
  • Azure Functions
  • Bash scripting
  • Beautiful Soup
  • Convolutional neural network
  • CSS
  • Dash
  • Data Analysis
  • Flask
  • Git
  • IBM Watson
  • Jupyter Notebook

Vetted For

9 Skills
  • Role: Python Cloud ETL Engineer (Remote), AI Screening
  • Result: 44%
  • Skills assessed: SFMC, Streamlit, API, AWS, ETL, JavaScript, Python, React Js, SQL
  • Score: 40/90

Professional Summary

6 Years
  • Feb, 2025 - Aug, 2025 6 months

    Sr. Python Engineer

    Objectwin Technologies
  • May, 2024 - Sep, 2024 4 months

    Sr. Software Engineer

    Bizmetric
  • Mar, 2023 - Apr, 2024 1 yr 1 month

    Sr. Python Developer

    Lancesoft
  • Aug, 2021 - Mar, 2022 7 months

    Data Scientist

    NTT Data Business Solutions
  • Apr, 2022 - Oct, 2022 6 months

    Sr. Software Engineer

    Ness Digital Engineering
  • Oct, 2022 - Feb, 2023 4 months

    Python developer

    Inecta
  • Dec, 2018 - Aug, 2021 2 yr 8 months

    Data Scientist

    TCS
  • Sep, 2018 - Nov, 2018 2 months

    Python Developer intern

    LHD

Applications & Tools Known

  • Django
  • Django REST framework
  • VS Code
  • Git
  • Jupyter Notebook
  • Putty
  • PyCharm
  • Spyder
  • Anaconda
  • IBM Watson
  • Linux
  • Windows

Work History

6 Years

Sr. Python Engineer

Objectwin Technologies
Feb, 2025 - Aug, 2025 6 months
    Worked as a back-end developer. Dockerized applications and provided solutions for handling large volumes of data. Tools/Technologies: Python, VS Code, Git, Azure DevOps, Docker, pandas, NumPy, MS SQL, SSMS.

Sr. Software Engineer

Bizmetric
May, 2024 - Sep, 2024 4 months
    Worked as a Python backend developer. Made heavy use of regular expressions (regex) in Python to extract data from PDFs, and used the Azure Form Recognizer service to pull data from PDFs. Tools/Technologies: Python, VS Code, Git, Django, Django REST Framework, Azure DevOps, pandas, NumPy, Regex, PostgreSQL, Azure Form Recognizer.

Sr. Python Developer

Lancesoft
Mar, 2023 - Apr, 2024 1 yr 1 month
    Worked as a Python backend developer. Involved in many POCs and development work to fulfill business requirements. Developed and maintained REST APIs. Worked on web servers, service-oriented architectures, web services (REST), microservices architecture, security best practices, and database technologies. Wrote SQL queries against the Postgres database to make changes as per the client's requirements. Contributed to project documentation. Tools/Technologies: Python, PostgreSQL, VS Code, Git, Django, Django REST Framework, Azure DevOps, pandas, NoSQL, NumPy, AWS, SonarQube, AWS EC2 instances, AWS S3 buckets.

Python developer

Inecta
Oct, 2022 - Feb, 2023 4 months
    Developed automation frameworks to update work items from Azure DevOps to Smartsheet and vice versa. Developed a Django-based web app. Developed Dash- and Plotly-based dashboards for interactive visualization in an ERP-based application. Tools/Technologies: Python, VS Code, Git, Dash, Plotly, Django, HTML, CSS, JavaScript, Azure DevOps, Smartsheet, pandas, NumPy, SQL.

Sr. Software Engineer

Ness Digital Engineering
Apr, 2022 - Oct, 2022 6 months
    Developed a web scraping framework to scrape data from top rental websites and ran large-scale web scrapes. Used a Postgres database to store the data during data cleansing and after preprocessing. Designed and developed web crawlers to scrape data and URLs with the Scrapy framework using Python. Cleaned the scraped data to make it ingestible into the database through APIs. Utilized scraping frameworks including Scrapy, BeautifulSoup, Selenium, and WebHarvest, among others. Built scripts to retrieve data and created proof of concepts for new crawlers/scrapers. Manipulated data through text processing, regular expressions, etc. Used Azure Functions and Azure VMs to host the framework through Azure CI/CD pipelines, and Azure Blob Storage to store CSV and JSON files during data cleaning. Tools/Technologies: VS Code, Python, SQL, PyCharm, Git, Scrapy, Beautiful Soup, Selenium, Requests, JSON, Azure DevOps, pandas, NumPy, Azure Functions, DevOps, Jupyter Notebooks, Data Analysis.

Data Scientist

NTT Data Business Solutions
Aug, 2021 - Mar, 2022 7 months
    Worked as a Python data science developer. Developed data analytics platforms for interactive visualization of datasets, helping automotive clients derive insights from their data. Used Python as the primary language along with Flask, Plotly, and Dash, and developed interactive dashboards for data visualization in the automotive domain. Tools/Technologies: Python, SQL, PyCharm, Flask, Git, Requests, JSON, Azure DevOps, pandas, NumPy, Jupyter Notebook, Data Analysis, Dash, Plotly.

Data Scientist

TCS
Dec, 2018 - Aug, 2021 2 yr 8 months
    Worked on data science projects using Python. Performed various data manipulation tasks and customized Jupyter Notebooks. Developed templates and an AutoML library for data scientists to make exploratory data analysis and machine learning tasks easier. Developed machine learning models for regression and classification. Tools/Technologies: VS Code, PyCharm, Python, SQL, Git, Azure DevOps, Scikit-Learn, pandas, NumPy, Data Science, Machine Learning, Seaborn, Matplotlib, Jupyter Notebooks, Data Analytics.

Python Developer intern

LHD
Sep, 2018 - Nov, 2018 2 months
    Developed a restaurant website using Python, Django, and Bootstrap. Tools/Technologies: Python, Django, HTML, CSS, JavaScript, PyCharm.

Achievements

  • Received appreciation letters from the unit TD head at TCS.
  • Received the Fresco Play Miles learning award from TCS 28 times.
  • Scored 100 out of 100 in mathematics in the 10th standard.

Major Projects

2 Projects

Task Manager

    A micro app using Flask for task management hosted on Azure.

Dashboard for Stock

    A Heroku app displaying live stock prices with a comparison feature.

Education

  • PG Certificate in Capital Markets and Risk Management

    IIM (2025)
  • Bachelor of Engineering in Computer Science & Engineering

    SIRT (2018)

Certifications

  • Introduction to Databases

    Coursera (Jan, 2024)

    Credential URL: Click here to view
  • Resume

    Credential URL: Click here to view
  • Machine Learning by Stanford
  • Using Python to Interact with the Operating System
  • Data Visualization with Python by IBM
  • Databases & SQL for Data Science by IBM
  • Neural Networks and Deep Learning by deeplearning.ai

Interests

  • Software development
  • Coding
  • Programming
  • Travelling

AI-Interview Questions & Answers

    Hi, this is Manish. I have been working in the IT industry as a Python developer for the past 5 years and 6 months. So far, I have worked on data science and machine learning, and I am currently working on back-end development. In my recent project, I am working on API development and maintenance, doing some data transformation, and creating some integrations. That's a brief introduction.

    As we all know, the with statement is commonly used for file operations, and we can use it for database interactions as well. Whenever we connect to a database, the with statement automatically cleans up the connection after the database operation, commit, or query execution finishes.
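    A minimal sketch of that idea using the standard-library sqlite3 module (the database file name is a placeholder); sqlite3's own context manager handles the transaction, so contextlib.closing is used here to also close the connection:

        import sqlite3
        from contextlib import closing

        # closing() guarantees conn.close() on exit, even if a query raises;
        # the inner `with conn:` commits on success and rolls back on error.
        with closing(sqlite3.connect("example.db")) as conn:
            with conn:
                conn.execute(
                    "CREATE TABLE IF NOT EXISTS tasks (id INTEGER PRIMARY KEY, name TEXT)"
                )
                conn.execute("INSERT INTO tasks (name) VALUES (?)", ("demo",))
        # The connection is closed here and the insert has been committed.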

    For Python ETL, we can build pipelines on top of Spark, which is very helpful for dealing with large datasets. Apart from that, the pandas library can be used to transform data in chunks, and Dask is also popular nowadays. We can also use Python caching mechanisms, and multiprocessing or multithreading, to deal with big datasets.
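    A small sketch of the chunked-pandas approach mentioned above; the file names, column names, and chunk size are hypothetical:

        import pandas as pd

        CHUNK_SIZE = 100_000  # rows per chunk, so the full file never sits in memory

        first = True
        for chunk in pd.read_csv("sales_raw.csv", chunksize=CHUNK_SIZE):
            # example transformation: derive revenue from units and unit price
            chunk["revenue"] = chunk["units"] * chunk["unit_price"]
            chunk.to_csv(
                "sales_clean.csv",
                mode="w" if first else "a",
                header=first,
                index=False,
            )
            first = False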

    In SQL, we can work with objects directly: we can do data selections and deselections and reuse those objects whenever we need them. So SQL can be very helpful for using objects directly from the database.

    AWS Lambda is very helpful here because it is a serverless service, so we do not need to manage any servers. We just need to run our scripts on a schedule, and that makes it very useful for running pipelines.
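    A minimal sketch of a scheduled ETL step written as a Lambda handler; in practice it would be triggered by an EventBridge schedule rule, and the return payload here is a placeholder:

        import json

        def lambda_handler(event, context):
            # extract / transform / load work would go here
            processed_rows = 0  # placeholder result
            return {
                "statusCode": 200,
                "body": json.dumps({"processed_rows": processed_rows}),
            }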

    When it comes to operations on databases, or connecting to databases, we usually get integrity errors, which we can handle with exception handling. There can also be formatting issues with the query or data type errors, and we can handle those the same way.
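    A short sketch of catching an integrity error with sqlite3; the table and rows are illustrative only:

        import sqlite3
        from contextlib import closing

        with closing(sqlite3.connect(":memory:")) as conn:
            conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT UNIQUE)")
            conn.execute("INSERT INTO users VALUES (1, 'a@example.com')")
            try:
                # second insert reuses the primary key, so it violates the constraint
                conn.execute("INSERT INTO users VALUES (1, 'b@example.com')")
            except sqlite3.IntegrityError as exc:
                print(f"Integrity error, skipping row: {exc}")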

    The question shows a simplified Python Lambda function, and there appears to be an oversight that could lead to errors or unexpected behavior. Here we are not using any exception handling, so after line number 4, in the for loop where we invoke the boto3 Lambda client, we can add exception handling so that we are sure we are able to connect through the Lambda client and invoke the Lambda function.
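    A sketch of the fix described above, wrapping each boto3 Lambda invocation in try/except so that one failed call does not break the loop; the function name and payloads are hypothetical:

        import json

        import boto3
        from botocore.exceptions import BotoCoreError, ClientError

        lambda_client = boto3.client("lambda")
        payloads = [{"batch": 1}, {"batch": 2}]

        for payload in payloads:
            try:
                response = lambda_client.invoke(
                    FunctionName="etl-worker",  # hypothetical function name
                    Payload=json.dumps(payload).encode(),
                )
                print(response["StatusCode"])
            except (BotoCoreError, ClientError) as exc:
                # log and continue instead of letting one failure stop the loop
                print(f"Invocation failed for {payload}: {exc}")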

    This is a SQL snippet that is meant to select all the rows from the sales data where the revenue is higher than the previous month's revenue. It looks like we are directly comparing the revenue in the WHERE condition and reading the result. Instead of doing this, we can use a self join to get the desired output.
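    A sketch of the self-join idea, comparing each month's revenue with the previous month's revenue; the table and column names are hypothetical, and the query string can be run with any DB-API cursor:

        MONTH_OVER_MONTH_QUERY = """
        SELECT cur.product_id,
               cur.month,
               cur.revenue
        FROM   sales AS cur
        JOIN   sales AS prev
          ON   prev.product_id = cur.product_id
         AND   prev.month = cur.month - 1
        WHERE  cur.revenue > prev.revenue;
        """
        # e.g. cursor.execute(MONTH_OVER_MONTH_QUERY)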

    The best way to debug Python applications around SQL transactions or transformations is to add as much exception handling as possible and do proper logging of those exceptions, such as integrity errors or failures to connect to the database. So the best approach is exception handling combined with logging.
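    A small sketch of that exception-handling-plus-logging approach; the database file and query are placeholders:

        import logging
        import sqlite3
        from contextlib import closing

        logging.basicConfig(level=logging.INFO)
        logger = logging.getLogger("etl")

        def run_query(sql: str) -> bool:
            """Run a statement, logging the full traceback if it fails."""
            try:
                with closing(sqlite3.connect("example.db")) as conn:
                    with conn:
                        conn.execute(sql)
                return True
            except sqlite3.Error:
                logger.exception("Query failed: %s", sql)
                return False

        run_query("UPDATE tasks SET done = 1 WHERE id = 42")  # logs an error if the table is missing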

    Not sure about

    Can you discuss an approach to manage state effectively in a Dash application working with streaming data and a Python backend? Not sure again.