
Snehal Lodhe

Vetted Talent

An experienced Backend Developer with 5+ years of professional experience and a strong background in designing, developing, and maintaining robust, scalable backend systems for web applications. My expertise lies in building RESTful APIs, implementing data models, optimising database queries, and ensuring high performance and reliability of backend services. My technical skills include, but are not limited to:

Python, Java, Go, Node.js, JavaScript, AWS and GCP, GitHub CI/CD pipelines, Kafka, Redis, REST APIs, FastAPI, Django, MongoDB, PostgreSQL, Docker, Kubernetes, Pub/Sub.

Proficient in multithreading and multiprocessing in Java/Core Java and Spring.

  • Role

    Senior Software Engineer

  • Years of Experience

    8 years

Skillsets

  • Hadoop
  • TypeScript
  • Vue.js
  • API Gateway
  • AWS
  • CI/CD
  • CloudFront
  • Computer Vision
  • Deep Learning
  • Firebase
  • GitHub
  • Spring Boot
  • Message queue
  • NoSQL
  • OAuth2
  • Redux
  • REST
  • S3
  • Serverless
  • SQL
  • SQS
  • Stripe
  • GCP
  • Angular
  • Go
  • Java
  • JavaScript
  • React
  • Apache Beam
  • Apache Spark
  • Django
  • Docker
  • Flask
  • Python - 7 Years
  • GraphQL
  • Kafka
  • Kubernetes
  • Microservices
  • MongoDB
  • Next.js
  • Node.js
  • PostgreSQL
  • Redis

Vetted For

6 Skills
  • Backend Python Developer (AI Screening)
  • Skills assessed: MongoDB, AWS RDS, MySQL, Django, Python, REST API
  • Score: 45/100

Professional Summary

8 Years
  • Oct, 2024 - Present 1 yr 2 months

    Senior Software Engineer

    Value Labs
  • Feb, 2024 - May, 2024 3 months

    Senior Software Engineer

    Zinios Edge
  • Nov, 2020 - Dec, 2022 2 yr 1 month

    Senior Software Developer

    Lincode Labs
  • Feb, 2019 - Nov, 2020 1 yr 9 months

    Data Scientist | Backend Developer

    Aventior Digital
  • Feb, 2018 - Jan, 2019 11 months

    Machine Learning Engineer | Backend Developer

    Advanced Risk Analytics
  • Jun, 2017 - Dec, 2017 6 months

    Backend Developer

    NTT DATA

Applications & Tools Known

  • Python
  • MongoDB
  • AWS Cloud
  • Apache Beam
  • Google Cloud Platform (GCP)
  • Elasticsearch
  • AWS S3
  • AWS SQS
  • JavaScript
  • GitHub
  • CI/CD pipelines
  • Kafka
  • Redis
  • REST API
  • Django
  • Node.js
  • AWS API Gateway
  • Docker
  • Celery
  • Hibernate
  • Spring
  • Struts
  • Kubernetes
  • Flask
  • Next.js
  • pytest
  • GraphQL
  • Material UI
  • SQL
  • DynamoDB
  • FAISS
  • OpenAI
  • Hugging Face Transformers
  • Pinecone
  • FastAPI

Work History

8 Years

Senior Software Engineer

Value Labs
Oct, 2024 - Present 1 yr 2 months
    • Developed a scalable chatbot using Java and Node.js with Rasa for NLP, deployed via AWS Lambda and managed through AWS API Gateway. Integrated Kafka for messaging and Redis for session storage (see the sketch below), containerized services with Docker, orchestrated them via Kubernetes, handled CI/CD through GitHub Actions, and built the frontend in React.
    • Deployed an ML inference pipeline using Flask APIs and Spring Boot microservices, with Docker containers orchestrated through Kubernetes. Integrated OAuth2-secured APIs via AWS API Gateway, used S3 for model storage, set up CI/CD with GitHub Actions, and implemented performance monitoring with AWS CloudWatch.
    • Engineered an e-commerce site using Spring Boot, Node.js, MongoDB, React, Redux, Stripe, AWS S3, Kafka, and Redis, deployed on AWS ECS.
    • Developed an online learning platform with a Vue.js frontend and a Spring Boot backend, integrating PostgreSQL and AWS S3/CloudFront.
    • Designed a social media analytics dashboard with an Angular frontend and a Java + Node.js backend, processed API data using Apache Spark, visualized insights, deployed with Docker, and scaled with Kubernetes.
    • Created a health and fitness app with React Native, Node.js, and a Java backend, integrating Firebase and PostgreSQL, with real-time sync via Redis, event logging with Kafka, AWS S3 for media, and user-management microservices, deployed with Docker and Kubernetes.
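
The Redis-backed session storage mentioned above is not detailed in the profile; the following is an illustrative sketch only, assuming the redis-py client, with hypothetical key names and helpers (save_session/load_session) that do not come from the resume.

    import json

    import redis  # assumes the redis-py client is installed

    # Hypothetical connection settings; a real deployment would read these
    # from configuration rather than hard-coding them.
    r = redis.Redis(host="localhost", port=6379, decode_responses=True)

    SESSION_TTL_SECONDS = 30 * 60  # drop idle chat sessions after 30 minutes


    def save_session(session_id: str, context: dict) -> None:
        """Persist the chatbot's conversation context for one session."""
        r.set(f"chat:session:{session_id}", json.dumps(context), ex=SESSION_TTL_SECONDS)


    def load_session(session_id: str) -> dict:
        """Fetch the stored context, or start fresh if the session expired."""
        raw = r.get(f"chat:session:{session_id}")
        return json.loads(raw) if raw else {}


    if __name__ == "__main__":
        save_session("abc123", {"last_intent": "order_status", "turns": 3})
        print(load_session("abc123"))

Storing the whole context as one JSON value with a TTL keeps the session store simple; a per-field hash would be an alternative if individual fields needed independent updates.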

Senior Software Engineer

Zinios Edge
Feb, 2024 - May, 2024 3 months
    Designed and developed complex software applications using Java, Node.js, React, Spring Integration, Jersey, Spring Boot, annotation-based Spring configuration, Spring Data, and Jest. Wrote optimized frontends using React and Vue.js. Collaborated with cross-functional teams to understand business requirements and design solutions deployed on AWS and GCP. Implemented and managed CI/CD pipelines to automate deployment of code changes to production environments.

Senior Software Developer

Lincode Labs
Nov, 2020 - Dec, 2022 2 yr 1 month
    Responsible for building and deploying a no-code platform for automating quality inspection in manufacturing, reducing inspection time by at least 50%. Built a scalable, distributed backend architecture for large volumes of data. Developed microservices and serverless architectures using AWS Lambda. Skills: AWS serverless architecture, AWS S3, AWS SQS, Java, Spring Boot, REST APIs, SQL, NoSQL, message queues, Apache Kafka, Docker, GCP/AWS cloud deployments.
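
The serverless side of this work is only named above; as an illustration, here is a minimal sketch of an SQS-triggered AWS Lambda handler in Python. The event shape follows the standard SQS-to-Lambda integration, while the inspection-job fields (image_id, station) are hypothetical.

    import json

    # Illustrative handler for an SQS-triggered inspection job.
    def handler(event, context):
        results = []
        for record in event.get("Records", []):  # standard SQS event shape
            job = json.loads(record["body"])     # e.g. {"image_id": "123", "station": "line-4"}
            # A real handler would kick off the inspection pipeline here.
            results.append({"image_id": job.get("image_id"), "status": "queued"})
        return {"processed": len(results), "results": results}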

Data Scientist | Backend Developer

Aventior Digital
Feb, 2019 - Nov, 2020 1 yr 9 months
    Architected Deep Learning system functionality and components, derived algorithm specifications. Built end-to-end applications for Vehicle Detection, Building Detection, and Facility Detection using deep learning frameworks and computer vision. Designed optimized architecture for predictive model intelligence.

Machine Learning Engineer | Backend Developer

Advanced Risk Analytics
Feb, 2018 - Jan, 2019 11 months
    Developed semantic segmentation model for satellite data to predict buildings and oil tanks for insurance value calculation and geo-coordinates. Worked in classification, pattern recognition, segmentation, and semantics. Designed and implemented pipelines for feature detection and prediction. Created strategy for independent model layer training via deep neural networks.

Backend Developer

NTT DATA
Jun, 2017 - Dec, 2017 6 months
    Backend internship focused on server-side development. Worked with server infrastructure, databases, and APIs. Responsibilities included development, maintenance, and optimization. Gained hands-on experience with programming languages and tools.

Achievements

  • Honors & Awards: Selected for the DCT INSPIRE Science Camp organized by the Indian Institute of Science Education and Research (IISER), Pune, and the National Chemical Laboratory (NCL), Pune. Selected for a workshop at Red Hat, Pune, on the Django Python framework in India.
  • Reduced the time taken for the quality inspection process by at least 50% in Lincode Labs

Major Projects

18 Projects

AI-Powered E-commerce Website

    Enhanced a MERN stack-based e-commerce platform by integrating an AI-powered recommendation system.

Health and Fitness App

    Created a cross-platform React Native app with Node.js and Java backend, integrating Firebase and PostgreSQL.

Online Learning Platform

    Developed an education platform using Vue.js frontend and Spring Boot backend, integrating PostgreSQL and AWS S3/CloudFront for secure video streaming.

E-commerce Platform

    Engineered a full-fledged e-commerce site using Spring Boot, Node.js, and MongoDB.

Chatbot for Customer Support

    Developed a scalable chatbot using Java and Node.js with Rasa for natural language processing, deployed via AWS Lambda and managed using AWS API Gateway. Integrated Kafka for asynchronous messaging and Redis for session storage. Containerized services with Docker and orchestrated via Kubernetes. CI/CD handled through GitHub Actions. Frontend built using React for real-time interaction.
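
The asynchronous messaging piece could look roughly like the sketch below; this is illustrative only, assuming the kafka-python client, a hypothetical chat-events topic, and a local broker address.

    import json

    from kafka import KafkaProducer  # assumes the kafka-python client

    producer = KafkaProducer(
        bootstrap_servers="localhost:9092",  # hypothetical broker address
        value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    )


    def publish_chat_event(session_id: str, text: str) -> None:
        """Publish one chat message for downstream consumers (analytics, logging)."""
        producer.send("chat-events", {"session_id": session_id, "text": text})
        producer.flush()  # wait until the broker acknowledges the batch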

Property Management System with AI Notifications

    Developed a property management system using Laravel, Vue.js, and MySQL.

Health and Fitness App with AI

    Built a React Native mobile app with Firebase backend and AI-based fitness tracking features.

Social Media Analytics Dashboard

    Designed a dashboard with Angular frontend and Java + Node.js backend stack.

Online Learning Platform with AI

    Built a learning platform using Django and Vue.js, incorporating RAG pipelines.

Machine Learning Model Deployment

    Deployed an ML inference pipeline using Flask APIs and Spring Boot microservices with Docker containers orchestrated through Kubernetes. Integrated OAuth2-secured APIs via AWS API Gateway and used S3 for model storage. Set up CI/CD with GitHub Actions and implemented performance monitoring with AWS CloudWatch. Kafka and Redis supported a low-latency prediction flow.
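
The profile names the building blocks but no code; a minimal sketch of a Flask inference endpoint that pulls a model artifact from S3 might look like this, assuming boto3 and joblib, with the bucket, key, and /predict route being hypothetical.

    import boto3
    import joblib
    from flask import Flask, jsonify, request

    app = Flask(__name__)

    # Hypothetical bucket/key; the resume only says models are stored in S3.
    MODEL_BUCKET = "example-model-bucket"
    MODEL_KEY = "models/latest/model.joblib"
    LOCAL_PATH = "/tmp/model.joblib"

    boto3.client("s3").download_file(MODEL_BUCKET, MODEL_KEY, LOCAL_PATH)
    model = joblib.load(LOCAL_PATH)


    @app.route("/predict", methods=["POST"])
    def predict():
        features = request.get_json()["features"]  # e.g. [[1.2, 3.4, 5.6]]
        return jsonify({"prediction": model.predict(features).tolist()})


    if __name__ == "__main__":
        app.run(host="0.0.0.0", port=8080)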

IoT Sensor Data Logger

    Designed a scalable data logger for IoT devices using Python and MQTT.
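
As a rough illustration of the approach (not the project's actual code), a Python MQTT subscriber that appends readings to a newline-delimited JSON file could be sketched as follows, assuming the paho-mqtt 1.x client API and a hypothetical broker address and topic layout.

    import json

    import paho.mqtt.client as mqtt  # assumes the paho-mqtt 1.x client API


    def on_message(client, userdata, msg):
        """Append each sensor reading to a newline-delimited JSON log file."""
        reading = {"topic": msg.topic, "payload": msg.payload.decode("utf-8")}
        with open("sensor_log.jsonl", "a") as f:
            f.write(json.dumps(reading) + "\n")


    client = mqtt.Client()
    client.on_message = on_message
    client.connect("localhost", 1883)  # hypothetical broker address
    client.subscribe("sensors/#")      # all device topics
    client.loop_forever()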

AI Chatbot for Customer Support

Oct, 2023 - Present 2 yr 2 months
    Developed an intelligent customer support chatbot using Python, Rasa, and Quasar.

Loan management and credit systems

AIML
Mar, 2023 - Apr, 2023 1 month

    The loan management and credit application project involves several tasks, including data collection, real-time data processing, API development, containerization, security and compliance, integration with external systems, reporting and notifications, and scalability and performance optimization. Relevant borrower information, credit history, and financial documents are collected for loan assessment. Apache Beam and GCP Dataflow are used for real-time data processing and analysis. Flask APIs are developed for loan application submission and tracking. Docker ensures easy deployment and scalability. Security measures and compliance with regulations are implemented. Integration with external systems enables data exchange and verification. Reporting and notifications provide updates on loan status. Scalability and performance optimization ensure efficient loan processing. The project aims to streamline loan management, automate credit assessments, and provide efficient credit application services in the financial industry.
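
The real pipeline ran on GCP Dataflow; as a toy stand-in only, here is a batch Apache Beam sketch over a couple of hypothetical applications (the income/loan_amount fields and the ratio-based risk rule are assumptions, not the project's actual scoring logic).

    import apache_beam as beam

    # Hypothetical loan applications standing in for the streamed data.
    applications = [
        {"id": "A1", "income": 52000, "loan_amount": 10000},
        {"id": "A2", "income": 31000, "loan_amount": 25000},
    ]


    def score(app):
        """Attach a crude risk label based on the loan-to-income ratio."""
        ratio = app["loan_amount"] / max(app["income"], 1)
        return {**app, "risk": "high" if ratio > 0.5 else "low"}


    with beam.Pipeline() as p:
        (
            p
            | "Read" >> beam.Create(applications)
            | "Score" >> beam.Map(score)
            | "Print" >> beam.Map(print)
        )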

No-code Platform for Quality Inspection

Nov, 2020 - Dec, 2022 2 yr 1 month
    Responsible for building and deployment of a No-code platform for automating quality inspection in the manufacturing industry. The platform helped reduce the time taken for the quality inspection process by at least 50% compared to manual inspection.

Cybersecurity and Threat Analysis

Lincode Labs
Mar, 2022 - Dec, 2022 9 months

    The Cybersecurity and Threat Analysis project involves several key tasks:
    • API development using Django and Node.js to create robust, secure APIs for data collection, threat analysis, and reporting (a sketch of such an endpoint follows below).
    • Data storage through MongoDB and SQL databases, managing cybersecurity-related data such as logs, network traffic, and security events.
    • Containerization with Docker for easy deployment, scalability, and portability of applications and services across environments.
    • Kubernetes orchestration for efficient management of containerized applications, providing scalability and automated deployment.
    • CI/CD pipelines, implemented with tools like Jenkins or GitLab CI/CD, to automate the build, testing, and deployment processes for rapid and reliable software delivery.
    • MLOps integration, incorporating machine learning models and algorithms for threat detection and analysis, with MLOps practices for model deployment, monitoring, and optimization.
    • Advanced analytics and algorithms for threat analysis and reporting, generating actionable insights for stakeholders.
    • Real-time security monitoring tools and techniques to promptly detect and respond to cybersecurity incidents, ensuring proactive threat mitigation.
    Overall, the project aims to enhance cybersecurity defenses, provide real-time threat analysis, automate security operations, and deliver actionable insights to improve the overall cybersecurity posture.
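
The sketch below shows what the event-ingestion endpoint mentioned in the first bullet might look like inside an existing Django project with Django REST Framework installed; the SecurityEvent fields, the view name, and the URL wiring are assumptions, not taken from the project.

    from rest_framework import serializers, status
    from rest_framework.response import Response
    from rest_framework.views import APIView


    class SecurityEventSerializer(serializers.Serializer):
        source_ip = serializers.IPAddressField()
        event_type = serializers.CharField(max_length=64)
        severity = serializers.IntegerField(min_value=0, max_value=10)


    class SecurityEventIngestView(APIView):
        def post(self, request):
            serializer = SecurityEventSerializer(data=request.data)
            serializer.is_valid(raise_exception=True)
            # A real view would persist the validated event (MongoDB/SQL)
            # and hand it off to the threat-analysis pipeline.
            return Response(serializer.validated_data, status=status.HTTP_201_CREATED)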

Quality Inspection

Lincode Labs
Nov, 2020 - Feb, 2022 1 yr 3 months
    1. Data Collection: Gather manufacturing data from sensors, IoT devices, and production databases, including relevant parameters, measurements, and attributes.
    2. Real-time Analysis: Process the collected data in real time using Apache Beam and GCP Dataflow. Perform analysis, anomaly detection, and statistical modeling to identify deviations from quality standards (a sketch of such a check appears below).
    3. Automated Inspection: Apply advanced algorithms and machine learning techniques to automatically inspect manufactured products. Utilize the processed data to identify defects, measure quality metrics, and classify products based on predefined criteria.
    4. Integration with Manufacturing Systems: Develop Flask APIs to seamlessly integrate with existing manufacturing systems such as production management, inventory, and reporting. Enable comprehensive insight, traceability, and data-driven decision-making.
    5. Reporting and Visualization: Generate comprehensive reports, real-time dashboards, and visualizations to provide insights into inspection results. Allow stakeholders to monitor quality status and take prompt actions based on the provided information.
    6. Scalability and Flexibility: Utilize Docker and GCP Dataflow to ensure scalability and flexibility. Handle large data volumes, accommodate additional data sources, and adapt to changing manufacturing requirements, improving operational efficiency.

    By performing these tasks, the quality inspection solution aims to streamline processes, enhance quality control, and improve operational efficiency in the manufacturing industry.
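
As a toy illustration of the anomaly check in step 2 above (the z-score rule, threshold, and sample measurements are assumptions for the sketch, not the platform's actual statistics):

    from statistics import mean, stdev


    def is_anomalous(history, new_value, z_threshold=3.0):
        """Flag a reading whose z-score against recent history exceeds the threshold."""
        if len(history) < 2:
            return False
        mu, sigma = mean(history), stdev(history)
        if sigma == 0:
            return new_value != mu
        return abs(new_value - mu) / sigma > z_threshold


    recent = [10.1, 10.2, 9.9, 10.0, 10.3, 10.1]
    print(is_anomalous(recent, 12.5))  # True: far outside recent variation
    print(is_anomalous(recent, 10.2))  # False: within normal spread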

Deep Learning System for Vehicle Detection, Building Detection and Facility Detection

Feb, 2019 - Nov, 2020 1 yr 9 months
    Architected the complete Deep Learning system functionality and its components and derived algorithm specifications. Built the end-to-end application for Vehicle Detection, Building Detection, and Facility Detection using deep learning frameworks and computer vision.

Semantic Segmentation Model

Feb, 2018 - Jan, 2019 11 months
    Developed a semantic segmentation model that predicts buildings and oil tanks in satellite data using deep learning methodologies, in order to calculate the total insurance value and geo-coordinates of the desired locations.

Education

  • Bachelor of Engineering in Electronics and Telecommunication

    Savitribai Phule Pune University

Certifications

  • Certification on securing the cloud from Microsoft Virtual Academy (MVA)

AI Interview Questions & Answers

Could you help me understand more about your background? Hi, this is Snehal. I have several years of experience in Python full-stack development. My tech stack is Python and Node.js, and I am comfortable with both SQL and NoSQL databases. In AWS, I have worked on various services such as AWS Lambda, Kinesis, and SQS. I also have experience with Docker, Kubernetes orchestration, and other AWS services, and I am familiar with GCP. I have also worked in the financial, manufacturing, and aerospace industries as a full-stack developer.

How can SOLID principles be effectively implemented in Python? Yes, you can implement the SOLID principles in Python by designing around object classes. While loading objects, you have to keep the garbage collector in mind, and the object instances should be mapped and referenced. In this way, you can effectively implement them in Python.

How would you resolve issues with real-time data processing in Python, particularly in a finance product? For finance, we have the data in the DB. If the data is SQL data, then we can have indexes for mapping, and there are other parsing formats such as JSON and XML to pass the data around. If we are on NoSQL, we can write wrappers around the data and implement indexes for lookups. We can also create helper classes to manage the finance data processing.

What scalability challenges could you foresee? How would you work around them? There are various challenges. One such challenge is latency, since it involves operations across the backend, the core logic, and the database. We also have to make sure services do not fail, in a microservices architecture as well as in a monolithic architecture. Apart from that, we have to take care of the data coming in via parsing. For scalability there are other parameters, say rate limiting of requests, as well as parameters such as the alphas and betas for the rate limiters. For scalability challenges, we have the option of mapping instances across various regions, but the availability and reliability of the servers can still be improved via architecture. These are the challenges in scaling RESTful APIs.

How do you ensure data consistency in PostgreSQL or any other database when integrating various data sources into a unified system? For ensuring data consistency in PostgreSQL or any other database, integrating the various data sources is very important. For that, third-party libraries and wrappers are available. The third-party libraries expose API endpoints, and via those endpoints we can integrate the third-party data sources, whether by writing wrappers around them, importing tables, or integrating the third-party libraries' queries.

If you had to build a high-performance API in Python, what key considerations would you keep in mind during design? If I have to build a high-performance API in Python, the key points will be latency, rate limiting, optimization, and how long it takes to execute an API call. Those are the important points to take care of while building a high-performance API in Python.

Assuming we are trying to implement the singleton pattern, what changes would you recommend, and why would you use this design pattern? In this scenario, while creating a DB connection, it has the DB host, DB user, and DB password, and an execute-query method. I would make this execute-query an abstract method; having it as an abstract method, we create an instance of the class and extend the method into the particular instances to override it completely. In this way we can implement the singleton design pattern, and creating abstract methods and extending them with the client class is the solution for this design pattern.
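
For reference alongside the answer above, a conventional Python singleton for a DB connection usually controls instance creation itself (for example via __new__); the sketch below is illustrative only, reusing the db_host/db_user/db_password/execute_query names from the question and inventing everything else.

    class DBConnection:
        """Singleton: every instantiation returns the same shared connection object."""

        _instance = None

        def __new__(cls, db_host=None, db_user=None, db_password=None):
            if cls._instance is None:
                cls._instance = super().__new__(cls)
                cls._instance.db_host = db_host
                cls._instance.db_user = db_user
                cls._instance.db_password = db_password
            return cls._instance

        def execute_query(self, query):
            # Placeholder: a real implementation would run the query over the
            # underlying driver connection.
            return f"executing on {self.db_host}: {query}"


    a = DBConnection("db.example.com", "app", "secret")
    b = DBConnection()
    print(a is b)  # True: both names point at the single shared instance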

Given the following Python function, explain what it does and point out any issues you see within it. It has no breaking point as far as I can see, and it has no iterator; the function needs some iterator to fall back on. In this entire function there is no iterator to iterate over the end or go through the values. And since it is a recursive function, there is no end point or starting point to the recursion.

What Python web frameworks do you prefer for server-side logic and why? How do they ensure high responsiveness of a web application? Among Python frameworks, I prefer Django, Flask, and FastAPI, but I prefer Django for its large community support and because it can scale the entire backend with low latency and make the whole system highly reliable. Since the Django framework is lightweight, it is highly responsive for RESTful operations. There is another framework called FastAPI; its community support is not there yet, but it is a highly responsive framework to implement. The entire Instagram backend relies on Django.

What are some best practices for building and managing server-side logic in Node.js and web applications? The best practices are to come up with a design pattern, then create a blueprint of the class, create object methods, and follow the design pattern. In this way, you can have server-side logic in Node.js and web applications.