Vetted Talent

Kartikaya Sharma

I’m a backend engineer with 4+ years of experience in Erlang, Elixir, Python, and Node.js, specializing in distributed systems and scalable infrastructure. I’ve built high-performance solutions in fintech and AI projects, with a focus on reliability, observability, and system design.

  • Role

    Quantitative Researcher & Developer

  • Years of Experience

    4.1 years

  • Professional Portfolio

    View here

Skillsets

  • Compute Engine
  • React
  • Scikit-learn
  • App Engine
  • C
  • C++
  • Cloud Functions
  • Cloud Pub/Sub
  • Cloud Storage
  • Python
  • GitHub Actions
  • Google Cloud Platform
  • Mixpanel
  • MySQL
  • Prometheus
  • SciPy
  • Seaborn
  • Docker
  • PostgreSQL
  • pandas
  • NumPy
  • Node.js
  • New Relic
  • MongoDB
  • Kubernetes
  • JavaScript
  • Java
  • Go
  • Git
  • Firestore
  • Express.js
  • Erlang
  • Elixir

Vetted For

6 Skills
  • Role: Senior Backend Developer (AI Screening)
  • Result: 53%
  • Skills assessed: DevOps, Problem Solving Skills, API Designing, Cloud Platform, Golang, Java
  • Score: 42/80

Professional Summary

4.1 Years
  • Jan, 2025 - Present 10 months

    Quantitative Researcher & Developer

    Self-Directed
  • Jul, 2021 - Feb, 2023 1 yr 7 months

    Backend Engineer

    OkCredit

Applications & Tools Known

  • JavaScript
  • React
  • Python
  • Node.js
  • Elixir
  • Java
  • MongoDB
  • Docker
  • C++
  • Erlang

Work History

4.1 Years

Quantitative Researcher & Developer

Self-Directed
Jan, 2025 - Present 10 months
    • Developed adaptive trend-following and mean-reversion strategies using custom EMAs, Keltner channels, Z-score models, the OU process, and a Kalman filter.
    • Integrated a regime-detection indicator with dynamic signal interpretation and volatility-based filtering.
    • Ensured robustness through walk-forward validation, parameter-sensitivity testing, and diversification.
    • Conducted 5-year backtests on BTCUSDT: Sharpe 4.61, Sortino 9.39, Calmar 6.97, max drawdown -3.21%.
    • Built a modular Python backtesting engine and a production-grade Go-based execution system.
    • Deployed a secure React + Plotly dashboard for live strategy monitoring.
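As an illustration of the mean-reversion leg described above, here is a minimal rolling Z-score signal sketch. The window and the entry/exit thresholds are hypothetical placeholders for illustration, not the strategy's actual parameters, and the real system also layered in regime detection and volatility filtering that this omits:

```python
import numpy as np
import pandas as pd

def zscore_signal(prices: pd.Series, window: int = 50,
                  entry: float = 2.0, exit_: float = 0.5) -> pd.Series:
    """Return -1/0/+1 positions from a rolling Z-score of price.

    Enter short when price is `entry` standard deviations above its
    rolling mean, enter long when it is `entry` below, and flatten
    once the Z-score reverts inside +/- `exit_`.
    """
    mean = prices.rolling(window).mean()
    std = prices.rolling(window).std()
    z = (prices - mean) / std

    pos = pd.Series(0.0, index=prices.index)
    state = 0.0
    for i, zi in enumerate(z):
        if np.isnan(zi):
            pos.iloc[i] = 0.0  # warm-up period: no signal yet
            continue
        if state == 0.0:
            if zi > entry:
                state = -1.0   # fade the upside deviation
            elif zi < -entry:
                state = 1.0    # fade the downside deviation
        elif abs(zi) < exit_:
            state = 0.0        # mean has reverted: flatten
        pos.iloc[i] = state
    return pos
```

A production version would vectorize the state machine and add the volatility-based filtering mentioned above; the loop form is kept here for readability.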

Backend Engineer

OkCredit
Jul, 2021 - Feb, 2023 1 yr 7 months
    • Designed and built customer-engagement features to improve retention and reactivation.
    • Computed credit-risk scores and implemented a defaulter-marking system.
    • Analyzed user funnel data via Mixpanel.
    • Built and maintained a real-time message delivery system using a Pub/Sub architecture over MQTT with the EMQ X broker.
    • Developed scalable Golang APIs and implemented production observability using New Relic and Prometheus.
    • Contributed to backend infrastructure improvements.
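The Pub/Sub fan-out at the heart of the message delivery system above can be sketched with a toy in-memory broker. The real system used an MQTT broker (EMQ X) over the network; this sketch only illustrates the topic-to-subscriber pattern, and the topic names are invented for the example:

```python
from collections import defaultdict
from typing import Callable, DefaultDict, List

class MiniBroker:
    """Toy topic-based pub/sub broker.

    Illustrates the fan-out pattern only; a production system would
    use a real broker (e.g. EMQ X over MQTT) rather than in-process
    callbacks.
    """

    def __init__(self) -> None:
        self._subs: DefaultDict[str, List[Callable[[str], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[str], None]) -> None:
        """Register a handler to receive every message on `topic`."""
        self._subs[topic].append(handler)

    def publish(self, topic: str, message: str) -> None:
        """Deliver `message` to every subscriber of `topic`."""
        for handler in self._subs[topic]:
            handler(message)

# Usage: one subscriber collecting its own messages.
broker = MiniBroker()
inbox: List[str] = []
broker.subscribe("chat/user42", inbox.append)
broker.publish("chat/user42", "payment reminder")
broker.publish("chat/other", "unrelated")  # no subscriber, silently dropped
```

The design choice worth noting is that publishers never know who the subscribers are; the broker owns the routing, which is what lets such systems scale fan-out independently of the producers.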

Major Projects

1 Project

Volatility Prediction ETH Implied Vol (t+10s)

    Refined and organized ETH 1s order-book and OHLCV data with peer-asset alignment and outlier capping; engineered microstructure and realized-volatility signals with session/regime flags and cross-asset spillovers; trained a voting ensemble of LightGBM and XGBoost, achieving R² = 0.4896 and Pearson r = 0.704.
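A minimal sketch of one of the realized-volatility features mentioned above, assuming 1-second close prices. The 60-sample window and the synthetic data are illustrative choices, not the project's actual configuration:

```python
import numpy as np
import pandas as pd

def realized_vol(close: pd.Series, window: int = 60) -> pd.Series:
    """Rolling realized volatility from 1-second close prices.

    Computed as the rolling standard deviation of log returns;
    annualization is omitted since the target here is short-horizon
    (t+10s) implied volatility.
    """
    log_ret = np.log(close).diff()
    return log_ret.rolling(window).std()

# Usage on synthetic 1s closes (a geometric random walk).
rng = np.random.default_rng(0)
close = pd.Series(100.0 * np.exp(np.cumsum(rng.normal(0.0, 1e-4, 300))))
rv = realized_vol(close)
```

In the actual pipeline a feature like this would be one column among many (microstructure signals, session/regime flags, spillovers) fed to the LightGBM/XGBoost ensemble.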

Education

  • Bachelor of Engineering (Hons.) Computer Science | Minor in Finance

    Birla Institute of Technology and Science Pilani (2022)

AI-interview Questions & Answers

I have a habit of writing the problem down before I start solving it. Writing it out gives me a clear picture of what the problem is and how to solve it, and using this technique I keep the code quality up and deliver on time.

When I was working as a backend engineer at OkCredit, the main stack there was Erlang. I built the chat feature for the OkCredit app. Earlier there had been timing and message-delivery issues, and those were fixed by the changes I made; latency came down to about 125 milliseconds, and all of that at a very low cost, I would say.

What API security practices do I use? The main issue I faced during my time at OkCredit was a DDoS attack that happened before I joined, so my first task as a backend engineer there was to harden the public API system. Other than that, we ran vulnerability checks from time to time to keep the APIs secure.

Yes, I have used both AWS and GCP, in my personal projects and in my professional experience. For instance, I built a trading bot for my quant project. It is a very low-cost deployment, the uptime of the trading model is 100%, and latency matters because it is trade-related, so it is important to fill orders as soon as possible; the server and the bot all run efficiently. For that I used Compute Engine on GCP.

This is infrastructure as code. I have used Kubernetes on GCP, as well as Cloud Run and Cloud Functions, both at OkCredit and in my personal projects. Even the chat feature I mentioned earlier is based on that.

For designing a REST API, I would start with efficient package management, because that is step one. Second, I would set up a boilerplate, because we cannot just come back to that later. Then I would separate the server-side calls and the handlers into different packages and files, so the code does not get cluttered. As for handling high throughput, the gorilla/mux router handles that very well, and if the throughput is really high, we can even switch to server-side-optimized languages like Erlang or Elixir.

Redis is a very quick database. PostgreSQL is slower, but it is cheaper than Redis, because Redis is mainly in-memory. So it is about cost versus time: I would choose Redis over PostgreSQL when I need the data quickly and there is a time-to-live on it. An example would be a JWT token, or caching data that we will be using again shortly, say payment-related information that should stay on the server until the payment goes through.
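The time-to-live pattern described in this answer can be sketched with a tiny in-memory cache. Redis provides this natively via the EXPIRE/SETEX commands, so this is only an illustration of the expiry semantics; the key name and TTL are invented for the example:

```python
import time
from typing import Any, Dict, Optional, Tuple

class TTLCache:
    """Minimal in-memory key-value store with per-key time-to-live.

    Illustrates Redis-style expiry (e.g. holding pending-payment state
    until the payment settles); Redis itself provides this via the
    EXPIRE and SETEX commands.
    """

    def __init__(self) -> None:
        self._store: Dict[str, Tuple[Any, float]] = {}

    def set(self, key: str, value: Any, ttl_seconds: float) -> None:
        """Store `value` under `key`, expiring after `ttl_seconds`."""
        self._store[key] = (value, time.monotonic() + ttl_seconds)

    def get(self, key: str) -> Optional[Any]:
        """Return the live value for `key`, or None if missing/expired."""
        item = self._store.get(key)
        if item is None:
            return None
        value, expires_at = item
        if time.monotonic() >= expires_at:
            del self._store[key]  # lazy eviction on read
            return None
        return value

# Usage: short-lived payment state, gone once the TTL elapses.
cache = TTLCache()
cache.set("payment:123", {"status": "pending"}, ttl_seconds=0.1)
pending = cache.get("payment:123")
```

Note the lazy eviction: expired keys are only removed when read, which keeps writes O(1); Redis combines this with periodic active expiry.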