I'm a technology enthusiast with 3+ years of hands-on experience in the tech landscape, particularly in backend development. Armed with a Bachelor of Engineering (B.E.) in Computer Science, my skill set spans React.js, HTML, CSS, JavaScript, Node.js, MongoDB, MySQL, and Java. My professional journey is marked by a steadfast commitment to developing scalable, efficient solutions.
My professional journey commenced at IndiaNIC Infotech Ltd, where I delved deep into backend development. This experience was instrumental in sharpening my expertise in JavaScript and Node.js, along with the MERN stack, allowing me to contribute significantly to our projects' success. My tenure there not only honed my technical skills but also instilled a profound appreciation for teamwork, adaptability, and the pursuit of excellence in a fast-paced industry.
Software Engineer - The Briminc Softech
Software Engineer - IndiaNIC Infotech Ltd
Sr. Associate, Backend - Amazon
Git
Agile
BitBucket
Linux
Windows
S3
Node.js
Express
MongoDB
CSS
React.js
AWS
JWT
Jira
CloudWatch
Hi, my name is Shushang Shekar, and I'm a software developer with over 3 years of experience. I specialize in developing the back end using Node.js and JavaScript. I have used Redis as a cache-based memory option for optimizing ongoing project applications and internal tools as well, and I lean toward continuously developing my knowledge. In my current organization I have built the server side of e-commerce websites, implementing all the database schemas and ER diagrams according to a clean architecture. Before this organization I was working at Amazon as a Senior Associate, where I was part of a small team in which I had to design APIs for internal requirements: data had to be fetched, and we had to manage all the email templates, providing edit functions for them and storing the data for future reference, so that if any error happened later we could track whether a person had used a template or not, including while editing and sending a page. I keep up with new trends in technology, and since Node.js is a widely used technology, I have preferred to go with Node.js so far. Thank you.
How would you handle real-time chat features in a Node.js app with Socket.IO for efficient memory use? Socket.IO is basically a layer on top of WebSockets, and when implementing such features we can keep memory use efficient in several ways:
- Use namespaces and rooms. Socket.IO supports namespaces and rooms to divide users into different groups, reducing the broadcast scope and unnecessary data transmission.
- Limit stored messages. Implement a mechanism to limit the number of chat messages kept in memory to avoid consuming too much RAM, and offload older messages to the database.
- Clean up on disconnect. Make sure that when a client disconnects, the socket's disconnect handler cleans up its state; it is essential to release memory when a user leaves.
- Use Redis for scaling. With the Redis pub/sub adapter, when the app is scaled across multiple servers, messages are broadcast across instances and distributed without overloading the memory of any individual server. Redis can also hold session storage, which offloads memory used by the Node.js process.
- Optimize data transmission. Throttle message sending, implement rate limiting on message routing to prevent spamming, and make sure each socket event transmits only the required data, reducing payload size and memory use.
- Prevent memory leaks. Avoid circular references and use monitoring tools such as Clinic.js, memwatch, or the Node inspector to monitor memory use and detect leaks early in development.
- Use lazy loading and cleanup. Fetch user details, messages, and other related data only when necessary, rather than loading everything on connection.
Combining these strategies ensures that our real-time chat features can scale efficiently while keeping memory usage under control.
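The "limit stored messages" point above can be sketched as a small, dependency-free helper. This is a minimal illustration, not Socket.IO's API: the `MAX_MESSAGES` cap and the `flush` callback (standing in for a database write) are assumptions.

```javascript
// Capped per-room message buffer: keeps at most MAX_MESSAGES recent
// messages in RAM and hands older ones off to persistent storage.
const MAX_MESSAGES = 100; // assumed limit, tune per deployment

class RoomBuffer {
  constructor(flush = () => {}) {
    this.messages = [];
    this.flush = flush; // e.g. a function that writes a batch to MongoDB
  }

  add(msg) {
    this.messages.push(msg);
    if (this.messages.length > MAX_MESSAGES) {
      // Offload the oldest messages instead of letting RAM grow.
      const overflow = this.messages.splice(0, this.messages.length - MAX_MESSAGES);
      this.flush(overflow);
    }
  }

  clear() {
    // Call this from the socket's 'disconnect' handler to release memory.
    this.flush(this.messages);
    this.messages = [];
  }
}
```

A room's buffer would be created on first join and `clear()`ed in the disconnect handler once the last member leaves.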
Can you describe a scenario where using Twilio's API within a Node.js backend might enhance user experience? I haven't used the Twilio API all that frequently, but I remember one scenario from a project I worked on with some colleagues. Suppose we are building a healthcare platform for booking appointments with doctors. To improve the user experience and reduce missed appointments, we want an automated system that sends appointment reminders by SMS or voice call; this is exactly where Twilio's API within a Node.js backend enhances the experience. Using Twilio's Programmable SMS API from the Node.js backend, we can automatically send personalized SMS reminders, and voice-call reminders are available too. Twilio also supports two-way SMS communication, meaning a patient can respond to a reminder with a question or a confirmation, and the Node.js backend can parse the response and update the appointment status accordingly. On top of that there are emergency notifications and Twilio's multilanguage support. By integrating Twilio into our Node.js backend, we can build an effective communication system that keeps users informed, reduces no-shows, and enhances the overall experience with timely reminders and flexible communication options.
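A minimal sketch of the reminder flow described above. The Twilio call needs real credentials and the `twilio` package, so it is shown commented out for reference; the `buildReminder` helper and the environment-variable names are illustrative assumptions, not part of Twilio's API.

```javascript
// Build the reminder text for a patient (illustrative helper).
function buildReminder(name, time) {
  return `Hi ${name}, reminder: your appointment is at ${time}. Reply C to confirm.`;
}

// Sending via Twilio's Programmable SMS API (requires the `twilio`
// package and real account credentials, so shown for reference only):
//
// const twilio = require('twilio');
// const client = twilio(process.env.TWILIO_ACCOUNT_SID,
//                       process.env.TWILIO_AUTH_TOKEN);
// await client.messages.create({
//   body: buildReminder(patient.name, appointment.time),
//   from: process.env.TWILIO_FROM_NUMBER,
//   to: patient.phone,
// });
```

The two-way part would be a webhook route that Twilio calls with the patient's reply, which the backend parses to update the appointment status.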
Can you suggest a method to encrypt sensitive data before saving it to a MySQL database using Node.js? One clear case is passwords: when a user enters a password for the first time during sign-up (or later at login), that data needs to be protected before it is stored in the database, for example in an RDBMS like MySQL. One of the standard measures is bcrypt. With bcrypt we hash the password using salted hashing: we implement a hashPassword function, pass it the password and a number of salt rounds, and it returns a hashed password. Strictly speaking this is one-way hashing rather than reversible encryption, which is exactly what we want for passwords, since nobody should be able to recover the original. Once we have the hashed password in a variable, only that hash is stored in the database, never the actual password. This is a clean and concise way to store passwords, and bcrypt is one standard method for protecting sensitive data before saving it to a MySQL database using Node.js.
What is the best approach to manage and apply configuration changes across multiple environments in a Node.js application? When a Node.js application runs in multiple environments, we can use environment variables. We create a .env file in the project's root directory for local development, but we do not commit it to version control, for security. We install dotenv and store all the required key-value pairs in the .env file: for example the port, the DB host, the DB user and password, and, if we are authenticating users, the JWT secret key as well. For production, these values are set at the system level or through the hosting provider (for example AWS) rather than in a .env file, avoiding the risk of accidental exposure. We can also use a configuration-management library such as config, which lets us define configuration for different environments like development and production in files such as default.json and production.json, and automatically detects the environment from the environment variables. Environment-specific YAML or JSON configuration files are another option.
These are the key steps for managing and applying configuration changes across multiple environments in a Node.js application. Lastly, if the application is containerized with Docker, it is recommended to pass environment variables in through Docker rather than baking them into the image. Best practices include: avoid hard-coding configuration; keep secrets in a local .env file, in environment variables, or in a secrets-management tool; use the NODE_ENV variable, set to development, production, or test, to differentiate between environments (in the test/development phase NODE_ENV should be development, and in production it should be production); keep environment-specific config files in version control while excluding secrets, with the .env file listed in .gitignore; and monitor and rotate keys and secrets to track configuration changes. This way we can achieve a sound approach to managing and applying configuration changes.
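The NODE_ENV-based selection described above can be sketched as follows. The per-environment values here are placeholders; in a real application the production secrets would come from environment variables injected by the platform, never from source code.

```javascript
// Pick a config object based on NODE_ENV, falling back to development.
const configs = {
  development: { port: 3000, dbHost: 'localhost', logLevel: 'debug' },
  test:        { port: 3001, dbHost: 'localhost', logLevel: 'silent' },
  production: {
    port: Number(process.env.PORT) || 8080,
    dbHost: process.env.DB_HOST, // injected by the platform, not committed
    logLevel: 'info',
  },
};

function loadConfig(env = process.env.NODE_ENV || 'development') {
  const config = configs[env];
  if (!config) throw new Error(`Unknown environment: ${env}`);
  return config;
}
```

The config library automates exactly this pattern, merging default.json with an environment-specific file chosen by NODE_ENV.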
What techniques would you apply to ensure ACID compliance when performing concurrent transactions in MySQL? ACID consists of atomicity, consistency, isolation, and durability. When dealing with ACID compliance in a MySQL system:
- Use transactions. MySQL transactions allow us to execute a series of operations as a single atomic unit of work; by wrapping the operations in a transaction we ensure that either all operations succeed or all fail, ensuring atomicity.
- Use proper isolation levels. MySQL supports different isolation levels that control how transactions are isolated from one another, ensuring consistency and isolation. We should choose the appropriate level based on the application's requirements: READ UNCOMMITTED, READ COMMITTED, REPEATABLE READ, or SERIALIZABLE.
- Optimistic versus pessimistic locking. To avoid conflicts between concurrent transactions, we can apply either optimistic or pessimistic locking techniques.
- Handle deadlocks. This is one of the most important points: in concurrent transactions, deadlocks can occur when two transactions are each waiting for the other to release locks. MySQL detects deadlocks and automatically rolls back one of the transactions; to handle them gracefully, retry transactions that fail due to a deadlock. Use proper indexing to minimize lock contention and avoid unnecessary locking.
- Durability is provided by InnoDB's logging.
Applying these cumulatively gives ACID compliance: use transactions to ensure atomicity, set the appropriate isolation level, apply locking where needed, ensure durability, and use proper indexing to reduce lock contention.
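The "retry on deadlock" point above can be sketched as a small wrapper. The `ER_LOCK_DEADLOCK` code matches what MySQL drivers such as mysql2 report on a deadlock rollback; the `runTransaction` callback (which would do BEGIN ... COMMIT against the pool) is an assumption.

```javascript
// Retry a transactional function when it fails with a MySQL deadlock.
async function withDeadlockRetry(runTransaction, maxRetries = 3) {
  for (let attempt = 1; ; attempt++) {
    try {
      return await runTransaction(); // e.g. BEGIN ... COMMIT via mysql2
    } catch (err) {
      // InnoDB rolled one transaction back; retry a few times, then give up.
      if (err.code !== 'ER_LOCK_DEADLOCK' || attempt >= maxRetries) throw err;
    }
  }
}
```

Because InnoDB rolls the victim transaction back completely, re-running the whole callback is safe as long as the callback itself is side-effect-free outside the transaction.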
Given this JavaScript code for real-time communication using Socket.IO, can you spot any potential for improvement and suggest optimizations? The snippet requires socket.io, which is fine, but it tries to make the socket itself listen on port 3000 without creating a server, and that is not the right way: it will not work. We need to create an app, create an HTTP server, pass the app to it, attach Socket.IO to that server, and have the server listen on port 3000. The rest is correct: io.on('connection') receives the client socket, client.on listens for a message event with data and handles it (broadcasting if needed), and client.on('disconnect') cleans up when the client leaves. Reconstructed, the corrected setup looks like this:

```javascript
const express = require('express');
const http = require('http');
const { Server } = require('socket.io');

const app = express();
const server = http.createServer(app); // create an HTTP server from the app
const io = new Server(server);         // attach Socket.IO to that server

io.on('connection', (client) => {
  client.on('message', (data) => {
    // handle the event, e.g. broadcast to the other clients
    client.broadcast.emit('message', data);
  });
  client.on('disconnect', () => {
    // clean up any per-client state here
  });
});

server.listen(3000); // the server, not the socket, listens on the port
```
Can you suggest a strategy for implementing role-based access control in a Node.js API using Passport? When implementing role-based access with Passport in a Node.js application, we first define roles on the user model in the database; this defines the type of each user, for example admin, moderator, or a regular user. Next we set up Passport JWT authentication: we import the passport and passport-jwt modules and generate a token with JWT, and whenever a user logs in, the token is verified. Then we create the role model for access control: once the user is authenticated, we create custom middleware to check their role and grant or deny access based on it. For example, to check whether a person is an admin, a moderator, or a regular user, we create a role middleware that verifies the provided token and checks in the database whether the role covers that particular user. If someone accesses a protected API without the right role, we respond through the middleware with a message such as "Access denied" or "You do not have permission for this route". Finally we protect the routes using Passport plus the role middleware: we apply the role middleware with app.use before the route handlers of any role-restricted routes, so for example the admin role middleware comes first, ahead of all the admin routes. This is how we can implement role-based access with Passport and JWT authentication.
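The role middleware described above can be sketched as an Express-style factory. It assumes `req.user` has already been populated by Passport's JWT strategy; the route in the comment is an illustrative example, not from the source.

```javascript
// Express-style middleware factory: allow only the listed roles.
function requireRole(...allowedRoles) {
  return (req, res, next) => {
    if (req.user && allowedRoles.includes(req.user.role)) {
      return next(); // authenticated and authorized
    }
    res.status(403).json({ error: 'Access denied' });
  };
}

// Typical wiring (passport.authenticate verifies the JWT first):
// app.get('/admin/users',
//   passport.authenticate('jwt', { session: false }),
//   requireRole('admin'),
//   (req, res) => res.json({ ok: true }));
```

Ordering matters: the JWT check must run before the role check, since the role middleware reads the user Passport attaches to the request.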
What is a reliable way to establish a two-factor authentication system using Twilio in a Node.js backend? To implement two-factor authentication with Twilio, we first install the required packages, basically twilio, express, body-parser, and jsonwebtoken. We use a Twilio account for sending the messages, with its account SID and auth token, and set up the Node.js backend: we store all the secrets in our .env file and configure the Twilio client using the account SID and auth token. After user registration we send an OTP for the two-factor step: when the user tries to log in and the password validates, we send the OTP to their phone using Twilio. So the login flow is: the user submits their username and password, which triggers sending the OTP; then we verify the OTP and complete the login. For the verification we create a function that passes the Twilio Verify service SID to the verification check, which takes the phone number and the OTP. Once the OTP is verified, we generate a JWT to complete the login process and secure the protected routes; middleware then routes requests to the required APIs. We can also limit requests, using request timestamps in a database or cache, and deny requests if they exceed a certain threshold.
These are the steps to establish a two-factor authentication system with Twilio in a Node.js backend.
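The rate-limiting step at the end can be sketched with a simple in-memory sliding window. The `windowMs` and `maxAttempts` values are assumptions; a production system would keep these counters in Redis or a database, as the answer notes, so they survive restarts and work across instances.

```javascript
// Allow at most maxAttempts OTP requests per phone number per window.
function createRateLimiter({ windowMs = 60000, maxAttempts = 3 } = {}) {
  const attempts = new Map(); // phone -> array of request timestamps

  return function allow(phone, now = Date.now()) {
    // Keep only timestamps still inside the window.
    const recent = (attempts.get(phone) || []).filter(t => now - t < windowMs);
    if (recent.length >= maxAttempts) {
      attempts.set(phone, recent);
      return false; // deny: threshold exceeded
    }
    recent.push(now);
    attempts.set(phone, recent);
    return true;
  };
}
```

The OTP-sending route would call `allow(phone)` first and respond with HTTP 429 when it returns false.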
What method do you recommend for implementing custom validation logic in TypeORM that is not supported out of the box? I'm not able to recall the exact method for custom validation logic beyond what TypeORM supports out of the box. What I can recall is custom validation with class-validator: defining custom validation decorators, that is, custom class validators. That is as far as I know; I'm not completely sure about it.
How would you apply TypeScript decorators in a Node.js project to enhance modularity and readability of code? First we have to understand what decorators are used for: for example logging, validation, authorization, dependency injection, and caching. To use them in a Node.js project with TypeScript, we enable experimental decorators by modifying the tsconfig.json file. Then we create custom decorators: a logging decorator, authentication and validation decorators, and decorators for dependency injection or for route management in Express. If we are building an application with Express, we can use decorators to define routes and improve readability, for example declaring that a function only accepts input in the form of a string and what its return type is. Used this way, TypeScript decorators separate cross-cutting concerns from the core functionality of a class, leading to cleaner code: reusability, readability, and modularity all improve. TypeScript decorators provide a robust mechanism to enhance the modularity and readability of our Node.js applications.