Immediate Joiner: Backend Software Developer with 5+ Years of Experience | Ex-Amazon & Salesforce | Expert in Developing & Deploying Web Applications, Databases, APIs | Proven Track Record in Optimizing Backend Systems for Performance & Reliability
Freelance Software Developer
Freelancing
Member of Technical Staff
Salesforce
Senior Software Development Engineer
SmartQ - Compass Group Company
Software Development Intern
MyAdvo
Software Development Engineer (Contract)
Flipkart
Software & Automation Specialist
Amazon
Selenium
MySQL
Git
CSS3
HTML5
REST API
Python
MongoDB
PostgreSQL
Slack
Asana
Jira
Skype
Visual Studio Code
Postman
Tailwind CSS
SaaS
Airtable
Microsoft Teams
Zoho
AWS (Amazon Web Services)
AWS Cloud
AWS CloudWatch
Confluence
Zoom
Django
Django REST framework
Flask
Web API
WebSocket
Amazon Redshift
Redis Stack
SQLAlchemy
MySQL Workbench
Google Cloud Platform
Serverless
Firebase
Cloud Firestore
Celery
Swagger
FastAPI
Next.js
Docker
Kubernetes
Gunicorn
BigQuery
pandas
Amazon S3
Google Cloud SQL
Bash
Kali Linux
Ontic
- Implemented REST and WebSocket APIs for real-time data acquisition.
- Optimized time-sensitive data storage using Redis, enabling efficient data retrieval through a user-friendly interface.
- Utilized cfscrape, a Python library, to effectively circumvent Cloudflare's anti-bot measures.
Grata Inc.
- Developed web scraping solutions compliant with CCPA and GDPR, targeting company websites and data aggregators.
- Designed and implemented a Lambda function, scheduled via a CloudWatch Rule, to ingest collected data into DynamoDB.
Young Alfred
- Created an automation tool for insurance quote submissions using Selenium and raw HTTP requests, reducing form processing time by over 90% through website reverse engineering.
- Enabled end-user account switching by collaborating across teams, leveraging Salesforce's multi-tenant architecture for efficiency and isolation.
- Enabled account deletion for users in the Mobile Publisher app, improving compliance metrics.
- Implemented right-to-left (RTL) language layout for Experience Builder sites, expanding their accessibility.
- Led Agile sprints & Scrum ceremonies.
- Resolved test failures & flaky tests, increasing code coverage to 77% and reducing post-deployment bugs.
- Wrote spike & release-planning docs.
- Mentored a team of junior developers, improving team productivity & reducing onboarding time by 50%.
- Led the development of the Midday Meals Dashboard for regional schools, contributing to a 40% increase in meal plan subscriptions.
- Created a Special Diet Portal for students with dietary requirements.
- Implemented real-time order status tracking, reducing order-related inquiries by 60%.
- Built a pipeline to register university students, increasing enrolment efficiency by over 300%.
- Managed & optimized GCP resource usage, achieving a 20% cost reduction in cloud services.
- Refactored legacy code & implemented significant optimizations that reduced server response times by over 30%.
- Implemented real-time ranking of service providers for 16 services across 13 marketplaces, improving performance metrics by 40%.
- Implemented role-based access control for service providers to manage seller requests.
- Piloted an Early Warning System, aiding a 35% increase in seller retention & a reduction in churn.
- Scaled the authentication service for Big Billion Day & flash sales.
- Automated load & performance testing for the launch of the 2GUD platform on Big Billion Day.
- Built a tool to help clients transfer hearings between courts for timely decisions.
- Automated CAPTCHA bypassing & web scraping of court decision data from court websites.
- Created an internal dashboard to track metrics for each court case, which helped the support team.
Web platform for sharing personal textile stories via text & audio on a map, created for the 14th Annual Design X Design exhibition.
Link: https://textile-memories.pinkpyjama.com
Link: https://chromewebstore.google.com/detail/ytloop/fgcnicbbpgekbpfhjalafepfgnecaglj?hl=en-GB
Chrome extension which lets the user loop specific sections of YouTube videos, setting custom start and end times and loop counts, with settings stored per video.
Enabled seamless account switching, enhancing user experience.
Developed the back-end for a mobile application facilitating cafeteria food orders.
Link: https://play.google.com/store/apps/details?id=com.thesmartq.compass.foodbook
Represented our college in the Inter-Collegiate Programming Competition Regionals, securing the All-India 38th rank.
Link: https://drive.google.com/file/d/12oRK9R7aJcjldpQT3IMRgA_kJl6JjrQL/view?usp=drive_link
Could you help me understand more about it? You can introduce yourself. Hello. My name is …. I'm from the city of Jaipur in India. I've been working as a software developer for 5 years and have worked mostly on back-end and a few front-end solutions. Currently, I'm working as a freelancer; I have worked with a few startups, project-wise, majorly on their back end and data scraping. Before that, I was working as a Member of Technical Staff at Salesforce, where my team worked on Experience Cloud features for their web …
Can you …? In order to minimize resource usage, we should be able to identify the maximum data load our ETL pipeline can work with. To do that, we can keep increasing the load incrementally until the ETL pipeline starts giving us error messages or warnings. To pin down that limit exactly, we can essentially do a binary search to find the precise data load our ETL pipeline can handle.
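A minimal sketch of that binary search, assuming failures are monotonic in the load size; run_pipeline and PipelineOverloadError are hypothetical stand-ins for the real job:

```python
class PipelineOverloadError(Exception):
    """Raised by the (hypothetical) pipeline when a batch is too large."""

def run_pipeline(batch_size: int) -> None:
    ...  # submit a test batch of `batch_size` records to the ETL job

def max_supported_load(lo: int = 1, hi: int = 1_000_000) -> int:
    """Largest batch size in [lo, hi] that completes without overload."""
    best = 0
    while lo <= hi:
        mid = (lo + hi) // 2
        try:
            run_pipeline(mid)
            best, lo = mid, mid + 1   # mid succeeded; probe a larger load
        except PipelineOverloadError:
            hi = mid - 1              # mid failed; probe a smaller load
    return best
```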
Can you discuss a way to efficiently paginate API requests in a Python script for ETL purposes? Sure. If we are working with the Django framework, it allows us to paginate our API requests, and there are several pagination styles we can configure in our application's settings. One way is page-number pagination: we read a page parameter telling us which page the client (in our case, the ETL job) is on, and return the results for that page. Another is cursor-style pagination: the client tells us "I am at this record," and we return the next batch of records after it in the next response.
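A short sketch of both styles in Django REST framework; the page size and the `created` ordering field are assumptions:

```python
# settings.py: page-number pagination ("which page is the client on?")
REST_FRAMEWORK = {
    "DEFAULT_PAGINATION_CLASS": "rest_framework.pagination.PageNumberPagination",
    "PAGE_SIZE": 100,  # records returned per request (assumed value)
}

# Cursor pagination ("give me the records after this one"), which is more
# stable for ETL reads because new inserts don't shift page boundaries.
from rest_framework.pagination import CursorPagination

class ETLCursorPagination(CursorPagination):
    page_size = 100
    ordering = "-created"  # assumes the model has a `created` timestamp field
```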
How would you monitor and log a Python interface with AWS services and a SQL database to ensure reliability? So, while loading the data through our ETL, we can make sure the Python script is logging correctly. AWS provides a solution for logging (CloudWatch), and there we can keep a timestamp of what data we are fetching. Logging also supports different levels: debug logs, if we want to keep those for our production server, plus warning logs and error logs. We should be watching the error logs specifically; whenever an error occurs, we should mitigate it as soon as possible. Warning logs just let us know something is off, but we don't have to mitigate it in a hurry; we can take time, since we at least know what the warning is about.
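A minimal sketch of that leveled, timestamped logging using only the standard logging module; in a Lambda function these records are picked up by CloudWatch Logs automatically, and the function and message details here are illustrative:

```python
import logging

logging.basicConfig(
    level=logging.INFO,  # keep DEBUG out of production output
    format="%(asctime)s %(levelname)s %(name)s: %(message)s",  # timestamped
)
log = logging.getLogger("etl")

def load_batch(rows):
    log.info("fetching %d rows", len(rows))      # what data we are fetching
    if not rows:
        log.warning("empty batch received")      # investigate, but not urgent
        return
    try:
        ...  # write rows to the SQL database
    except Exception:
        log.error("batch load failed", exc_info=True)  # mitigate immediately
        raise
```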
In short, how can you ensure data integrity when performing transformations in Python ETL processes? To ensure data integrity, we make sure we are not modifying the source data; we are only analyzing it. Even where we do transform it, for example for a recommendation engine that only accepts values in a certain format, and we receive fields with null or empty values, we can replace those values in the transformed output, but without tampering with the data as it was provided.
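A tiny pandas sketch of that idea, with made-up column names: transform a copy for the downstream consumer and leave the source frame untouched:

```python
import pandas as pd

source = pd.DataFrame({"user_id": [1, 2], "rating": [4.5, None]})

transformed = source.copy()  # never mutate the data as provided
transformed["rating"] = transformed["rating"].fillna(0.0)  # engine's format

assert source["rating"].isna().any()  # the original nulls are still intact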
Would you combine Python asynchronous programming with API calls to improve the performance of an ETL pipeline? Python does provide libraries for this, for example aiohttp, and httpx, a newer library, to run requests asynchronously and concurrently. By making asynchronous API calls, we keep utilizing our processing power in parallel for the ETL pipeline instead of blocking on each request.
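A minimal sketch with httpx and asyncio; the URLs are placeholders:

```python
import asyncio
import httpx

async def fetch_all(urls: list[str]) -> list[httpx.Response]:
    async with httpx.AsyncClient(timeout=30) as client:
        # issue all requests concurrently instead of awaiting each in turn
        return await asyncio.gather(*(client.get(u) for u in urls))

urls = [f"https://api.example.com/records?page={i}" for i in range(10)]
responses = asyncio.run(fetch_all(urls))
```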
[Shown a code snippet that loops over messages and invokes a Lambda function to send each one.] There appears to be an oversight that could lead to errors. What is the potential issue, and how would you test to confirm your suspicions? Okay, give me a second, please. So, as we run the loop over all the messages we need to send, we invoke the Lambda client for each one, but after getting the response we are not checking what the Lambda client is responding with. We can introduce error handling here. Also, we can invoke the Lambda asynchronously and send these messages concurrently, which will optimize the whole process.
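A sketch of the fix being described: check each Invoke response instead of firing and forgetting. The function name and payload shape are hypothetical:

```python
import json
import boto3

lambda_client = boto3.client("lambda")

def send_messages(messages):
    failed = []
    for msg in messages:
        resp = lambda_client.invoke(
            FunctionName="send-message",       # hypothetical function
            InvocationType="RequestResponse",  # wait for the result
            Payload=json.dumps(msg).encode(),
        )
        # A non-200 status or a FunctionError field means the send failed;
        # record it instead of silently moving on.
        if resp["StatusCode"] != 200 or "FunctionError" in resp:
            failed.append(msg)
    return failed
```

Switching InvocationType to "Event" (and checking for a 202 status) would queue the sends asynchronously, matching the optimization mentioned in the answer.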
Given this SQL query snippet, which is meant to select all the customers whose revenue is higher than the previous month's revenue: identify the flaw and how you would debug it. Okay. So here we are ordering the sales data by month. Since we need to consider the last month, we should be sorting these months in descending order. Also, there may be data present for multiple years, which will create wrong output: a month value of December could belong to last year while April belongs to this year, so comparing by month alone leads to errors. To debug it, we should look at the data and also consider the year: for the current month we need the previous month, and if it's January, for example, we should be looking at December of the previous year to compute the previous month's revenue correctly.
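A sketch of the corrected query, written here as a Python string; the monthly_sales table and its columns are assumed from the discussion. Ordering the LAG window by year and month makes the January-versus-December comparison come out right:

```python
# Assumed schema: monthly_sales(customer_id, year, month, revenue)
QUERY = """
WITH with_prev AS (
    SELECT customer_id, year, month, revenue,
           LAG(revenue) OVER (
               PARTITION BY customer_id
               ORDER BY year, month          -- order by year AND month
           ) AS prev_revenue
    FROM monthly_sales
)
SELECT customer_id
FROM with_prev
WHERE year = 2024 AND month = 1              -- January correctly compares
  AND revenue > prev_revenue;                -- against December 2023
"""
```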
What would be a secure method to manage sensitive information, such as API keys and database credentials, in a Python ETL code base? Various cloud solutions provide a secure method to store these API keys: we can make an API call to that service and fetch the key from there for production. Another method, if we are not using any cloud service provider, is to store these API keys in environment variables rather than hard-coding the values in our code; then we can read them from the environment directly.
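A small sketch combining both suggestions; the variable and secret names are made up:

```python
import json
import os
import boto3

def get_db_password() -> str:
    # Simplest option: injected through the environment, never hard-coded.
    if "DB_PASSWORD" in os.environ:
        return os.environ["DB_PASSWORD"]
    # Cloud option: fetch at runtime from AWS Secrets Manager.
    sm = boto3.client("secretsmanager")
    secret = sm.get_secret_value(SecretId="prod/etl/db")  # hypothetical name
    return json.loads(secret["SecretString"])["password"]
```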
What technique would you recommend for troubleshooting and debugging a Salesforce Marketing Cloud integration within a Python ETL pipeline? I have not worked with Salesforce Marketing Cloud integrations; at Salesforce I was a Member of Technical Staff working on features for its website builder. Since I have not worked with that integration before, I won't be able to answer.
Can Streamlit be integrated with React components to enhance user-interface capabilities in a Python web application? I have not worked with React; I have only worked with Next.js, so I won't be able to answer.