Abhilash Jain is a seasoned software engineer with 6 years of experience specializing in the design and implementation of scalable systems. His expertise spans Node.js architectural design, cloud technologies on AWS (including CloudWatch), and containerization with Docker and Kubernetes, including EKS. He is skilled in developing and managing Node.js microservices, leveraging MongoDB for robust data management, and using JavaScript for both frontend and backend development. Abhilash also has a proven track record in team management, leading development teams to deliver high-quality, efficient solutions. His blend of technical and leadership skills makes him a valuable asset in driving complex projects to success.
Senior Backend Developer
TekiesTech - Senior Backend Developer (L5)
Times Internet - Senior Software Developer 2
ByjusExamPrep (aka GradeUp) - Backend Developer
BirdEye
PostgreSQL
Kafka
AWS
GCP
REST
GraphQL
Cassandra
Elasticsearch
MongoDB
BigQuery
Docker
Sequelize
Pandas
Puppeteer
DynamoDB
DESCRIPTION: A platform designed to streamline marketing analytics and data management processes. Leveraging Node.js and Python for backend development and PostgreSQL and ClickHouse for robust data storage, the tool empowers marketers to efficiently store, analyze, and visualize key user, product, and event data.
• PostgreSQL and ClickHouse databases are used to securely store large volumes of user, product, and webhook event data.
• Seamless webhook integration for real-time data ingestion, ensuring that marketers have access to the latest insights.
• BullMQ sits in front of the webhooks to throttle the storage requests flowing into ClickHouse (a simplified sketch follows the tools list below).
• Python retrieves data from ClickHouse, leveraging its efficient querying capabilities, and Pandas handles data processing, enabling advanced analytics, manipulation, and transformation of raw data before it is displayed on the dashboard.
• Marketers can create custom reports and dashboards tailored to their specific needs, enabling them to gain deep insights into the performance of their marketing campaigns.
• The tool was built from scratch to handle 10 million requests per minute.
TOOLS: Node.js & Express (webhooks), Sequelize (ORM), Python & Pandas (reports), ClickHouse, Docker, BullMQ
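A minimal sketch of the BullMQ throttling layer described above, assuming a hypothetical `webhook-events` queue, local Redis and ClickHouse connection settings, a `webhook_events` table, and an illustrative rate limit; none of these names or numbers are the production values.

```typescript
import { Queue, Worker } from 'bullmq';
import { createClient } from '@clickhouse/client';

// Redis connection used by BullMQ (illustrative settings).
const connection = { host: 'localhost', port: 6379 };

// Webhook handlers only enqueue; nothing writes to ClickHouse directly.
const eventQueue = new Queue('webhook-events', { connection });

export async function enqueueWebhookEvent(payload: Record<string, unknown>) {
  await eventQueue.add('store-event', payload, { removeOnComplete: true });
}

// Hypothetical ClickHouse client and table name.
const clickhouse = createClient({ url: 'http://localhost:8123' });

// The worker drains the queue at a bounded rate, shielding ClickHouse
// from bursts of incoming webhook traffic.
new Worker(
  'webhook-events',
  async (job) => {
    await clickhouse.insert({
      table: 'webhook_events',
      values: [job.data],
      format: 'JSONEachRow',
    });
  },
  {
    connection,
    // Illustrative throttle: at most 500 jobs processed per second.
    limiter: { max: 500, duration: 1000 },
  }
);
```

In practice the worker would also batch rows before inserting; the limiter only caps how fast jobs are pulled off the queue.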
DESCRIPTION: The project was to rebuild an existing social media platform (Longwalks). A soft launch was delivered in just over 2 months, but the project was shut down due to the funding winter. I designed the backend architecture and mentored a team of ~3 professionals across backend and frontend.
TOOLS: Python, MongoDB, Docker, GCP (Compute Engine)
DESCRIPTION: The project involves building a Loan Management Microservice using Python, Postgres, Kafka, and AWS S3. It securely stores KYC data, establishes connections with partner banks, and sends EMI notifications. Security measures safeguard sensitive customer information, and integration with partner banks is handled through APIs to ensure seamless communication. EMI notifications are delivered via multiple channels for timely updates. Through extensive changes to the integration and async data storage methods, I reduced missing data and missed EMI notifications by approximately 20% and made the whole process half a day faster.
TOOLS: Python, Postgres, Kafka (In-house), AWS S3
DESCRIPTION: The project involved migrating Apache Airflow from version 1 to version 2, including the migration of over 200 Python 2.7 scripts to Python 3. The transition required careful planning and execution to ensure compatibility and maintain functionality, with thorough testing and debugging to resolve compatibility issues and deliver a seamless cutover.
TOOLS: Airflow, Python, GCP BigQuery
DESCRIPTION: Optimized table sizes and queries, cutting monthly BigQuery expenses by around 30%. Incorporated SMS delivery tracking through Gupshup, loading it into BigQuery with SQS used for throttling. This aids in detecting unresponsive campaign users, further improving cost efficiency.
TOOLS: Python, GCP, BigQuery, AWS SQS
DESCRIPTION: Leadsquared, a powerful sales and marketing automation platform, plays a pivotal role in generating high-quality leads for BDA (Business Development Activities).
LIMITATION: The API allows the creation or update of up to 25 user profile records per call, with a maximum of 3 concurrent API calls and an overall rate of 10 API calls per second.
PROBLEM:
• We encountered several errors, such as hitting the API's maximum limits and profile merge issues, prompting us to implement a 5-second delay before sending the next batch of messages (a throttled sender along these lines is sketched after the tools list below).
• We experienced errors during the creation of new leads and activities.
AFTER IMPROVEMENT:
• Our system operates without encountering daily errors and efficiently processes approximately 1500 messages per minute.
• After full development, the system was able to handle 3 million (30 lakh) messages per day.
TOOLS: Node.js, Kafka, AWS Lambda, BigQuery, PostgreSQL, AWS MSK, Microservices
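A minimal sketch of the throttled sender pattern described above. The batch size (25), concurrency cap (3), and 5-second delay come from the limits and fix noted above; the endpoint, header, and payload shape are hypothetical placeholders, not the actual Leadsquared API.

```typescript
// Illustrative throttled sender for a bulk profile-update API limited to
// 25 records per call, 3 concurrent calls, and 10 calls per second.
const BATCH_SIZE = 25;
const MAX_CONCURRENT = 3;
const DELAY_MS = 5000; // 5-second pause between rounds of calls

const sleep = (ms: number) => new Promise((resolve) => setTimeout(resolve, ms));

// Hypothetical endpoint and auth header; real integration details differ.
async function pushBatch(records: object[]): Promise<void> {
  const res = await fetch('https://api.example.com/leads/bulk-update', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'x-api-key': process.env.CRM_API_KEY ?? '',
    },
    body: JSON.stringify({ records }),
  });
  if (!res.ok) throw new Error(`Bulk update failed with status ${res.status}`);
}

export async function syncProfiles(records: object[]): Promise<void> {
  // Split the incoming records into batches of at most 25.
  const batches: object[][] = [];
  for (let i = 0; i < records.length; i += BATCH_SIZE) {
    batches.push(records.slice(i, i + BATCH_SIZE));
  }

  // Send at most 3 batches in parallel, then pause before the next round
  // so the integration stays well under the per-second call limit.
  for (let i = 0; i < batches.length; i += MAX_CONCURRENT) {
    await Promise.all(batches.slice(i, i + MAX_CONCURRENT).map(pushBatch));
    if (i + MAX_CONCURRENT < batches.length) await sleep(DELAY_MS);
  }
}
```

In the source system the messages presumably flow in from Kafka (per the tools list) rather than an in-memory array, but the throttling idea is the same.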
DESCRIPTION: Created a dynamic user funnel with an algorithmic approach, providing personalized discounts through customized banners. Efficiently processed 5 million hourly data points and offered diverse features like multiple coupons, customized pricing, lightning deals, and flexible discounts.
TOOLS: Node.js, GraphQL, Postgres, Cassandra, Redis, Kafka, AWS Lambda, Microservices
DESCRIPTION: After reviewing the product requirements, I broke the project down into actionable tasks, prioritized by complexity and dependencies. This structured approach enabled efficient planning and task delegation, leading to the successful development of the Super Card Membership feature.
TOOLS: Node.js, GraphQL, Postgres, Redis, Microservices
DESCRIPTION: Collaboration with cross-functional teams ensured smooth data flow and system stability. Noteworthy achievements include reducing data processing time by 30% through Kafka integration and enhancing data extraction accuracy by 20% via Puppeteer optimizations, contributing to the successful delivery of a scalable reviews scraping platform (a simplified sketch of the scraping pipeline follows the tools list below).
TOOLS: Node.js, Kafka, Puppeteer, DynamoDB, Microservices
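A minimal sketch of how a Kafka-driven Puppeteer scraper along these lines might be wired together, assuming a hypothetical `review-scrape-jobs` topic, broker address, and CSS selector; these are illustrative, not the platform's actual configuration.

```typescript
import { Kafka } from 'kafkajs';
import puppeteer from 'puppeteer';

// Hypothetical broker address, client id, and topic name.
const kafka = new Kafka({ clientId: 'reviews-scraper', brokers: ['localhost:9092'] });
const consumer = kafka.consumer({ groupId: 'review-scrapers' });

// Scrape review text from a single page; real pages need per-source selectors.
async function scrapeReviews(url: string): Promise<string[]> {
  const browser = await puppeteer.launch({ headless: true });
  try {
    const page = await browser.newPage();
    await page.goto(url, { waitUntil: 'networkidle2' });
    return await page.$$eval('.review-text', (nodes) =>
      nodes.map((node) => node.textContent?.trim() ?? '')
    );
  } finally {
    await browser.close();
  }
}

async function run(): Promise<void> {
  await consumer.connect();
  await consumer.subscribe({ topic: 'review-scrape-jobs', fromBeginning: false });

  // Each Kafka message carries a URL to scrape; persisting the results
  // (e.g. to DynamoDB) would be a separate step in the pipeline.
  await consumer.run({
    eachMessage: async ({ message }) => {
      const { url } = JSON.parse(message.value?.toString() ?? '{}');
      if (!url) return;
      const reviews = await scrapeReviews(url);
      console.log(`Scraped ${reviews.length} reviews from ${url}`);
    },
  });
}

run().catch(console.error);
```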