Rushabh Doshi

Vetted Talent

Results-driven Python developer with 7 years of experience developing web applications using Docker, Nginx, MongoDB, SQLite, MySQL, Python, and Django. Demonstrated ability to build high-performance applications using advanced programming techniques.

  • Role

    Python Backend Developer

  • Years of Experience

    8.5 years

  • Professional Portfolio

    View here

Skillsets

  • NoSQL - 5 Years
  • Django REST Framework - 3 Years
  • Scrapy - 2 Years
  • Selenium - 2 Years
  • Beautiful Soup - 2 Years
  • RESTful APIs - 4 Years
  • Nuxt.js - 2 Years
  • Vue.js - 3 Years
  • FastAPI - 3 Years
  • PostgreSQL - 4 Years
  • ETL - 2 Years
  • Quality Assurance - 2 Years
  • Active Listening
  • Adaptability
  • Problem Solving
  • Teamwork
  • SQLite - 2 Years
  • MySQL - 7 Years
  • Python - 7 Years
  • Django - 4 Years
  • PHP - 3 Years
  • GCP - 3 Years
  • Redis - 2 Years
  • Kubernetes - 3 Years
  • Docker - 3 Years
  • Databases - 7 Years
  • MongoDB - 4 Years
  • Apache Airflow - 1 Year
  • JavaScript - 4 Years
  • AWS - 4 Years
  • SQL - 7 Years
  • Flask - 1 Year
  • Go - 4 Years

Vetted For

8 Skills

  • Backend Python Developer (Remote) - AI Screening
  • Result: 73%
  • Skills assessed: Web Frameworks, AWS, Django, MySQL, Node.js, PostgreSQL, Python, REST APIs
  • Score: 66/90

Professional Summary

8.5 Years
  • Jan 2023 - Present (2 yr 9 months)

    Senior Back-end Engineer

    Log Binary
  • Jan 2021 - Dec 2022 (1 yr 11 months)

    Software Engineer

    Improwised Technologies PVT Ltd
  • Jun 2018 - Dec 2019 (1 yr 6 months)

    Part Time Developer

    Cranai Technology LLP
  • Aug 2011 - Dec 2020 (9 yr 4 months)

    Lecturer

    V.V.P. Engineering College
  • Jun 2016 - Jun 2018 (2 yr)

    Freelancer

    Freelancer

Applications & Tools Known

  • Python
  • NPM
  • MySQL
  • PHP
  • Laravel
  • Vue.js
  • Asana
  • Slack
  • Jira
  • Visual Studio Code
  • MongoDB
  • PostgreSQL
  • Django
  • Scrapy
  • Docker
  • Kubernetes
  • GCP
  • Nuxt.js
  • Apache Airflow
  • Redis
  • Metabase
  • Selenium
  • Flask
  • AWS SQS
  • GKE
  • CI/CD
  • Flux
  • Beautiful Soup
  • FastAPI
  • NLP
  • Celery
  • Apache Kafka
  • Swagger
  • SQLite

Work History

8.5 Years

Senior Back-end Engineer

Log Binary
Jan 2023 - Present (2 yr 9 months)
    Spearheaded backend development projects, optimizing performance and scalability.

Software Engineer

Improwised Technologies PVT Ltd
Jan 2021 - Dec 2022 (1 yr 11 months)
    Delivered innovative solutions and maintained high-quality codebases.

Part Time Developer

Cranai Technology LLP
Jun 2018 - Dec 2019 (1 yr 6 months)
    Designed and implemented multiple web solutions for clients, enhancing operational efficiency.

Freelancer

Freelancer
Jun 2016 - Jun 2018 (2 yr)
    Built Node.js and Python scripts, a venue-booking website in Laravel, and Windows desktop applications; designed games in Unity (C#).

Lecturer

V.V.P. Engineering College
Aug 2011 - Dec 2020 (9 yr 4 months)
    Taught and mentored undergraduate students in computer science disciplines.

Achievements

  • Successfully transitioned from a lecturer to a senior backend engineer role.
  • Developed various tools for metadata extraction, sentiment analysis, and ETL processes.

Major Projects

16 Projects

Tabless

Log Binary
    Streamlined restaurant order operations with a single-tablet solution. Implemented key modules for deposits, restaurant hours, and scrapers. Built core modules using Django REST Framework, PostgreSQL, and MongoDB. Implemented Selenium for real-time data scraping. Managed WebSocket connections to push real-time backend data to the UI and mobile apps.

Listing Management

    Maintained and enhanced a comprehensive solution for validating, formatting, and publishing business details. A comprehensive system using PHP Laravel for the backend and MySQL for database management. Utilized AWS Pub/Sub and deployed on GCP using GKE, with CI/CD via Flux.

Tempest

    Provided Wikipedia infobox information for search terms in a private-browser web application. Enhanced efficient parsing functions using Beautiful Soup to extract data from Wikipedia pages' infoboxes. Built on the FastAPI framework.

Automatic Metadata Extraction & Summarization Tools

    Automated summarization of digital documents with metadata extraction. Developed the entire backend using FastAPI and the frontend with Vue.js, with Python NLP support. Ensured the project was Docker-ready with Kubernetes configuration in place.

Top Headlines News

    Displayed top headlines with date-filter options. Built the backend in Django REST Framework and the frontend in Vue.js with WebSocket. Automated fetching news headlines from NEWSAPI.org every 2 hours with Celery and Redis.
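
    A minimal sketch of that schedule, assuming Celery with a Redis broker; the app and task names and the NEWSAPI request parameters are illustrative, not the project's actual code.

        # Celery app whose beat schedule fetches top headlines from
        # NEWSAPI.org every 2 hours.
        import os

        import requests
        from celery import Celery

        app = Celery("news", broker="redis://localhost:6379/0")

        app.conf.beat_schedule = {
            "fetch-headlines-every-2-hours": {
                "task": "news.fetch_headlines",
                "schedule": 2 * 60 * 60,  # seconds
            },
        }

        @app.task(name="news.fetch_headlines")
        def fetch_headlines():
            # Pull the current top headlines; the API key comes from the env.
            resp = requests.get(
                "https://newsapi.org/v2/top-headlines",
                params={"country": "us", "apiKey": os.environ["NEWSAPI_KEY"]},
                timeout=10,
            )
            resp.raise_for_status()
            return resp.json()["articles"]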

Historical Stock Price Analysis ETL

    Periodically scraped and loaded stock prices into Metabase for data visualization. Wrote ETL scripts using the Python Scrapy library to scrape historical stock prices and load them into PostgreSQL. Automated the whole process with Apache Airflow.
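
    A minimal sketch of that orchestration, assuming a recent Airflow (2.4+); the DAG id, spider name, and loader script path are illustrative.

        # Daily DAG: run the Scrapy spider, then load its output into PostgreSQL.
        from datetime import datetime

        from airflow import DAG
        from airflow.operators.bash import BashOperator

        with DAG(
            dag_id="historical_stock_prices_etl",
            start_date=datetime(2024, 1, 1),
            schedule="@daily",
            catchup=False,
        ) as dag:
            # Assumed spider that writes raw prices to a staging file.
            scrape = BashOperator(
                task_id="scrape_prices",
                bash_command="scrapy crawl stock_prices -O /tmp/prices.json",
            )
            # Assumed loader script that inserts the staging file into Postgres.
            load = BashOperator(
                task_id="load_prices",
                bash_command="python /opt/etl/load_prices.py /tmp/prices.json",
            )
            scrape >> load  # load runs only after the scrape succeeds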

Acuity Health Care

Arihant Healthcare
Aug 2024 - Present (1 yr 2 months)

    Outcome: Track and update patients' medical information during provider visits, including BP, pulse, lab results, medications, conditions, vitals, and diagnoses.

    Role: Back-End Developer

    • Created backend APIs in Django REST Framework and designed the database (PostgreSQL). Contributed to GCP deployment.
    • Developed API test cases for quality assurance.

CityTempTrack

    Implemented a pipeline to fetch city temperatures by latitude/longitude from an API and save the data in a database. Created producer and consumer scripts using Apache Kafka to fetch and store city temperature data.
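
    A minimal sketch of that producer/consumer pair, assuming the kafka-python client; the topic name and storage helper are illustrative.

        import json

        from kafka import KafkaConsumer, KafkaProducer

        TOPIC = "city-temperatures"  # assumed topic name

        def publish_temperature(lat: float, lon: float, temp_c: float) -> None:
            # Serialize each reading as JSON and publish it to the topic.
            producer = KafkaProducer(
                bootstrap_servers="localhost:9092",
                value_serializer=lambda v: json.dumps(v).encode("utf-8"),
            )
            producer.send(TOPIC, {"lat": lat, "lon": lon, "temp_c": temp_c})
            producer.flush()

        def store_temperatures() -> None:
            # Read readings off the topic and hand each one to the DB writer.
            consumer = KafkaConsumer(
                TOPIC,
                bootstrap_servers="localhost:9092",
                value_deserializer=lambda b: json.loads(b.decode("utf-8")),
            )
            for message in consumer:
                save_to_db(message.value)

        def save_to_db(record: dict) -> None:
            print("would persist:", record)  # placeholder for the real DB insert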

Go User Module

    Ready-made user module for CRUD operations and a login API. Created the user module using Gorilla Mux, persisted data in PostgreSQL, and provided API documentation in Swagger.

Go Fiber CRM Project

    Developed a Lead CRUD application with Go Fiber and SQLite, and provided API documentation in Swagger.

Venue Booking Project

Cranai Technology LLP
    Developed a website for customers to book different locations of the same venue for varying durations, enabling cost savings. Built the application using PHP Laravel and MySQL for database management.

Publishers Reviews

    Maintained and enhanced a seamless review-scraping backend, enhancing user functionality. A comprehensive system using PHP Laravel for the backend and MySQL for database management. Maintained around 40 different spiders and built 5 from scratch. Used Flask for address-based searches and Go-Colly to scrape one publisher's site. Pulled batches from AWS SQS for processing.

Acuity-Healthcare

    Tracked and updated patients' medical information during provider visits, including BP, pulse, lab results, medications, conditions, vitals, and diagnoses. Created backend APIs in Django REST Framework and designed the database (PostgreSQL). Contributed to GCP deployment. Developed API test cases for quality assurance.

Thryv Leads Project

    Sustained and enhanced an automated system for handling business details with precision, ensuring timely publication. A comprehensive system using PHP Laravel for the backend and MySQL for database management. Utilized AWS Pub/Sub and deployed on GCP using GKE, with CI/CD via Flux. Built the UI with Nuxt.js and ensured quality with test cases for both backend and frontend. Managed and enhanced multiple cron jobs, with the ability to run them manually in case of failure.

Publisher review

Improwised PVT LTD
Jan 2021 - Dec 2022 (1 yr 11 months)

    Outcome: Maintained and enhanced seamless review scraping backend, enhancing user functionality.

    Role: Back-End Development

    • A comprehensive system using PHP Laravel for the backend and MySQL for database management; maintained around 40 different spiders and built 5 from scratch.
    • Used Flask for address-based searches and Go-Colly to scrape one publisher's site.
    • Pulled batches from AWS SQS for processing.

Thryv Leads

Improwised PVT LTD
Aug 2021 - Dec 2022 (1 yr 4 months)

    Outcome: Sustained and enhanced an automated system for handling business details with precision, ensuring timely publication.

    Role: Back-End Development

    • A comprehensive system using PHP Laravel for the backend and MySQL for database management.
    • Utilized AWS Pub/Sub and deployed on GCP using GKE, with CI/CD via Flux.
    • Built the UI with Nuxt.js and ensured quality with test cases for both backend and frontend.
    • Managed and enhanced multiple cron jobs, with the ability to run them manually in case of failure.

Education

  • Bachelor Of Engineering (Computer Engineering)

    Saurashtra University, Rajkot (2011)
  • Master Of Engineering (Computer Engineering)

    Gujarat Technological University, Ahmedabad (2014)

Interests

  • Exploring
  • Watching Movies

AI-interview Questions & Answers

    My name is Rushabh Doshi. I completed my graduation in 2011. After that I prepared for the GATE examination for a year, cleared it in 2012, and went on to a master's, which I completed in 2014. I then joined V.V.P. Engineering College as a contract-based lecturer, alongside freelancing work that I did from 2016 to December 2019. During my freelancing journey I built projects in Laravel, Windows applications, Python scripts, and Node.js scripts. In the COVID era I joined Improwised as a software intern, where I learned Node.js, Vue.js, Nuxt.js, Golang, Python's Scrapy framework, GitLab, GitHub, Dockerization with Kubernetes configs, and CI/CD pipelines with build, test, and deploy configuration. In January 2021 I joined Improwised as a full-time software engineer, where I handled and maintained around five PHP Laravel applications. One of those projects also used Python Scrapy to scrape reviews from around 30 to 35 different providers, and I deployed several spiders for it. I also worked on Tempest, a custom-browsing project, where I used FastAPI with Dockerization and customized Wikipedia parsing for the custom browser application; I did several Kubernetes-related tasks in LiveCode; and I used Go-Colly to scrape data from one publisher's website. In January 2023 I joined Log Binary as a senior back-end engineer, where I work on Tabless, a Django REST Framework project with multiple databases, multithreading, and a Selenium environment, shipped as Docker images; there I mostly work on API integrations, restaurant-hours modules, and deposit modules. I have also worked on Acuity for Arihant Healthcare, a Django REST Framework application that lets organizations assign patients.

    To set up a secure endpoint, I will talk about my live project, Acuity Healthcare, where we used an API gateway. The API gateway authenticates the user with a JSON Web Token, verified against Firebase. Once the user is authorized, the gateway injects the application key (secret key) and the user's UUID into the request headers and forwards the request to the Django API. This is how we secure the REST API: the gateway authenticates the user and then adds the app key and user UID, which also lets us trace exactly which user made each API request.
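
    A minimal sketch of that gateway-style check on the Django side, assuming PyJWT; the header layout, claim names, and shared secret are illustrative, not the actual Acuity Healthcare configuration.

        # Hypothetical Django middleware that verifies the gateway-issued JWT
        # and exposes the caller's UUID to downstream views.
        import jwt
        from django.http import JsonResponse

        SECRET = "change-me"  # assumed signing key shared with the gateway

        class GatewayJWTMiddleware:
            def __init__(self, get_response):
                self.get_response = get_response

            def __call__(self, request):
                header = request.headers.get("Authorization", "")
                if not header.startswith("Bearer "):
                    return JsonResponse({"detail": "Missing token"}, status=401)
                try:
                    # Verifies the signature and expiry of the upstream token.
                    claims = jwt.decode(
                        header.split(" ", 1)[1], SECRET, algorithms=["HS256"]
                    )
                except jwt.InvalidTokenError:
                    return JsonResponse({"detail": "Invalid token"}, status=401)
                # Attach the caller's UUID so each request is attributable.
                request.user_uuid = claims.get("sub")
                return self.get_response(request)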

    To optimize a Django application under a high volume of Postgres requests: in my live project Tabless, we used NoSQL, namely MongoDB, alongside PostgreSQL to reduce the read load on Postgres. Data that triggered frequent reads was mirrored into MongoDB, and whenever it needed to be read we served it from MongoDB instead of hitting PostgreSQL. We also maintained a maximum-connections limit and relied on the Postgres client library's default connection policies; if connection issues appeared, we closed the idle connections and reconnected.
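
    A minimal sketch of that read-offload pattern, assuming pymongo; the database, collection, and loader names are illustrative.

        # Serve hot reads from MongoDB; fall back to PostgreSQL on a miss and
        # mirror the result so the next read skips Postgres entirely.
        from pymongo import MongoClient

        mongo = MongoClient("mongodb://localhost:27017")
        cache = mongo["tabless"]["order_cache"]  # assumed database/collection

        def get_order(order_id: int, load_from_postgres) -> dict:
            doc = cache.find_one({"_id": order_id})
            if doc is not None:
                return doc  # served without touching PostgreSQL
            row = load_from_postgres(order_id)  # the expensive Postgres read
            doc = {"_id": order_id, **row}
            cache.replace_one({"_id": order_id}, doc, upsert=True)
            return doc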

    For real-time data processing with Python services on AWS, there are mainly two or three building blocks: Lambda functions and Step Functions. We can use CloudFormation to deploy the whole service architecture, with SQS for message transfer, SNS for notifications, and Lambda functions triggered whenever data becomes available. Step Functions run on trigger events or on a scheduled workflow, like a cron job. For example, we can design a Python Lambda architecture for real-time data where uploading a file to an S3 bucket triggers a Lambda function that converts the file into several different resolutions and saves them to another bucket, so the right resolution can be served to each display device.
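
    A minimal sketch of that S3-triggered resize flow, assuming boto3 and Pillow are packaged with the function; the target resolutions and output bucket name are illustrative.

        # Lambda handler: read the uploaded object named in the S3 event,
        # resize it to several resolutions, and write each to another bucket.
        import io

        import boto3
        from PIL import Image

        s3 = boto3.client("s3")
        SIZES = [(1920, 1080), (1280, 720), (640, 360)]  # assumed targets
        DEST_BUCKET = "media-resized"  # assumed output bucket

        def handler(event, context):
            # S3 put events carry the bucket and key of the uploaded file.
            record = event["Records"][0]["s3"]
            bucket = record["bucket"]["name"]
            key = record["object"]["key"]
            body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
            for width, height in SIZES:
                img = Image.open(io.BytesIO(body))
                img.thumbnail((width, height))  # resize, keeping aspect ratio
                out = io.BytesIO()
                img.save(out, format="JPEG")
                s3.put_object(
                    Bucket=DEST_BUCKET,
                    Key=f"{width}x{height}/{key}",
                    Body=out.getvalue(),
                )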

    For managing database transactions in Django, we are mostly dealing with this in a multithreaded environment. For that we can use the atomic transaction decorator, which guarantees that the database operations inside a function either all commit or all roll back; combined with row locking, it keeps concurrent threads from interleaving updates on the same record. So the transaction.atomic decorator is the beneficial tool whenever we're dealing with database-related activity in a multithreaded environment.
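
    A minimal sketch of that pattern; the Deposit model and field names are illustrative. transaction.atomic gives the all-or-nothing commit, and select_for_update adds the row lock that actually serializes concurrent writers.

        from django.db import transaction

        @transaction.atomic
        def apply_deposit(deposit_id: int, amount: int) -> None:
            from myapp.models import Deposit  # assumed model

            # Lock this row until the transaction commits or rolls back,
            # so no other thread can update it concurrently.
            deposit = Deposit.objects.select_for_update().get(pk=deposit_id)
            deposit.balance += amount
            deposit.save()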

    Regarding considerations when scaling a Python-based API horizontally: if we use Lambda functions or Step Functions, those are serverless, so infrastructure scaling is maintained by AWS itself and we don't need to worry about it. CloudWatch analytics observe the incoming data and incoming requests and scale accordingly. In the EC2 case, horizontal scaling deploys new EC2 instances whenever necessary and scales them back down when there are no requests left to serve.

    Here we can leverage Python's built-in functools.lru_cache decorator, but with a maximum size. In the given code the size is not specified, which can cause a memory leak or an out-of-memory condition. We also need a timeout for each cached entry: store a timestamp alongside each cached value, and when the value is requested again, check whether it is in the cache and still fresh, say less than five minutes or five seconds old. If it is stale, recompute the value and replace the cached entry, so we respond with the updated value instead of the old one; if it is still fresh, just return the cached value. So two fixes are needed: a timestamp mechanism on each cached value, and a cache size limit, for example no more than 64 entries, so we don't run out of memory. When the limit is reached, we discard the least recently used value, which is exactly the LRU eviction mechanism.
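
    A minimal sketch of both fixes combined, in plain Python; the size and TTL values are illustrative. functools.lru_cache alone handles eviction but not expiry, so this version tracks a timestamp per entry itself.

        import time
        from collections import OrderedDict
        from functools import wraps

        def ttl_lru_cache(maxsize: int = 64, ttl_seconds: float = 300.0):
            def decorator(func):
                cache: OrderedDict = OrderedDict()  # key -> (timestamp, value)

                @wraps(func)
                def wrapper(*args):
                    now = time.monotonic()
                    if args in cache:
                        stamp, value = cache[args]
                        if now - stamp < ttl_seconds:
                            cache.move_to_end(args)  # mark as recently used
                            return value
                        del cache[args]  # stale entry: recompute below
                    value = func(*args)
                    cache[args] = (now, value)
                    if len(cache) > maxsize:
                        cache.popitem(last=False)  # evict least recently used
                    return value

                return wrapper
            return decorator

        @ttl_lru_cache(maxsize=64, ttl_seconds=5.0)
        def expensive(n: int) -> int:
            return n * n  # stand-in for a costly computation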

    Looking at this Python code responsible for connecting to a Postgres database using the psycopg2 library, there is a potential issue in how the connections are being handled. If this function is called multiple times, multiple connections will be opened, and eventually we can hit connection-refused errors, because there is no mechanism to reuse an existing open connection. While executing the cursor, if we were inside Django we should use the Django QuerySet API instead of running a raw query directly; either way, SQL injection is possible here, so we need to parameterize the query to avoid it. Regarding the connection itself, we should reuse the existing open connection when one exists; the connection takes the DB name, user, password, and host, but if any connection-security mechanism such as TLS is required, this code would be a problem, and there is no retry mechanism if the connection fails. Finally, the connection does get closed at the end, but the resource handling should be made deterministic.
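
    A minimal sketch of those fixes, assuming psycopg2; the credentials and table are illustrative. It reuses one connection across calls, parameterizes the query, and commits or rolls back deterministically (retry logic is left out for brevity).

        import psycopg2

        _conn = None  # module-level connection reused across calls

        def get_connection():
            global _conn
            # Open a connection only if none exists or the old one was closed.
            if _conn is None or _conn.closed:
                _conn = psycopg2.connect(
                    dbname="app", user="app", password="secret", host="localhost"
                )
            return _conn

        def fetch_user(user_id: int):
            conn = get_connection()
            # "with conn" commits on success and rolls back on error; the
            # cursor context manager closes the cursor automatically.
            with conn, conn.cursor() as cur:
                # The %s placeholder lets psycopg2 escape the value,
                # which blocks SQL injection.
                cur.execute("SELECT id, name FROM users WHERE id = %s", (user_id,))
                return cur.fetchone()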

    We can use AWS's managed Kubernetes offering, EKS (Elastic Kubernetes Service), which continuously checks the app status with health checks, with a minimum of two to three pods running the same Django application. If one of the Django pods crashes, EKS immediately deploys a new pod, and we check the CloudWatch logs to see what caused the Django app to crash and fix it. For the zero-downtime deployment strategy, we roll changes out gradually, in partitions: if four pods are already running the Django application, we deploy two pods with the new implementation while the other two remain as they are; once the two new pods are fully up, we take down the old two and replace them with new ones. That is the zero-downtime deployment strategy as far as I have experienced it.

    For a Python AWS Lambda function, I need to identify the event triggers and set a proper IAM role for the Lambda function, so that no more permissions than necessary go to the Lambda function. With a Lambda deployment we don't need to think about scalability and reliability; all of that infrastructure is handled by AWS itself. Only the number of requests made to the Lambda function and its computation time are considered in the cost calculation, so financially it is much cheaper compared to deploying an ECS or EC2 instance for the same workload.

    The advantage of using a build pipeline for Django is that we can automatically publish every new, updated Django image. For that we can define the build in a YAML spec such as appspec.yml and drive the pipeline with CloudFormation, which declares all the necessary services that need to be taken care of. Whenever any commit is made to the Django application from GitHub or AWS CodeCommit, a trigger fires automatically and generates the Docker image, and the Docker image is then deployed onto the nodes using EKS, the managed Kubernetes service. If five nodes are running, it deploys two nodes at a time while the other three remain on the old deployment; once those two are fully running, it moves on to the next two, and so on. In that way we maintain zero downtime. The automated build pipeline can use CloudFormation with a cloud build service, and we can also use Jenkins or AWS CodeBuild, which take care of the whole continuous integration and continuous deployment flow.