I'm Alak Modak, a Full-Stack Engineer from Tripura, India. My journey into web development started unexpectedly. I was looking for a great website to boost my affiliate marketing game. When I couldn't find a template that clicked, I thought, "Why not build my own?" That decision kicked off an exciting journey that led me to where I am today.

As I dug deeper into web development, learning the ins and outs, I fell head over heels for this field. It was like discovering a hidden passion I never knew I had! This newfound love led me to Masai School, where I specialized in the MERN stack and honed my JavaScript, HTML, CSS, and MongoDB skills.

Fast-forward to today: I'm all about crafting innovative web applications that are visually stunning and functionally rock-solid. While the MERN stack is my bread and butter, I've also expanded my toolkit to include React Native, Next.js, Nest.js, and PostgreSQL. This diverse skill set allows me to tackle all sorts of development challenges. Oh, and I'm comfortable working with cloud-based infrastructure on the Google Cloud Platform, too!
Senior Software Engineer, The Narayana Group
Software Developer Engineer, Stigen Martech Private Limited
Software Developer Engineer - 1, StrategyWerks
Instructional Associate, Masai School
React
React Native
Redux
Redux-Thunk
NodeJS
MongoDB
JS
Git
WebSocket
GCP
AWS
GPT-3.5
NextJS
PostgreSQL
GPT-4
Angular
BigQuery
AWS S3
Hi, I'm Alak. I'm 25 years old, from Tripura in Northeast India, and I have around three years of experience in MERN stack development. Apart from those core technologies, I'm also proficient with the surrounding stack: React Native for mobile application development, Next.js for server-side rendering, Nest.js for creating scalable API endpoints, and Postgres as a relational database. I also have experience with cloud-based infrastructure on Google Cloud Platform, along with some expertise in AWS S3, EC2, and AWS Cognito. Currently, I'm leading the development of SpringHill at Stigen Martech Private Limited, where the front end uses Next.js for server-side rendering and the back end uses Nest.js for creating scalable API endpoints. To manage the data, I use Postgres as the relational database; the front end and back end are hosted on GCP App Engine, and the database on GCP Cloud SQL.
Yeah, the main issue there is that you're iterating up to the length of the array inclusively. Since array indexing is zero-based, valid indices run from 0 up to length - 1, but the loop includes length as well, and the element at that index is undefined. So you'll get an error, or at least unexpected behavior, at that particular iteration: whenever i equals arr.length, there is nothing at arr[arr.length].
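The off-by-one bug described above can be sketched like this (the array name and values are illustrative):

```javascript
const nums = [10, 20, 30];

// Buggy: `i <= nums.length` includes index 3, which is out of bounds.
const buggy = [];
for (let i = 0; i <= nums.length; i++) {
  buggy.push(nums[i]); // last iteration pushes undefined
}

// Fixed: strict `<` stops at the last valid index, length - 1.
const fixed = [];
for (let i = 0; i < nums.length; i++) {
  fixed.push(nums[i]);
}

console.log(buggy); // [10, 20, 30, undefined]
console.log(fixed); // [10, 20, 30]
```

Reading past the end doesn't throw in JavaScript, which makes this bug easy to miss until the `undefined` value blows up somewhere downstream.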
See, by using transactions in my back-end application on top of Node.js, we can get atomic operations in MongoDB. We can also achieve atomicity with MongoDB's atomic update operators, or by structuring the query appropriately as per the requirement. That's how we can deal with such scenarios.
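As a rough sketch of what a MongoDB transaction looks like from Node.js: the function below uses the official `mongodb` driver's session and `withTransaction` APIs, with the database, collection, and field names all made up for illustration. It is only defined here, not executed, since a real run needs a MongoDB replica set.

```javascript
// Hypothetical atomic balance transfer between two accounts.
// `client` is assumed to be a connected MongoClient from the `mongodb` package.
async function transfer(client, fromId, toId, amount) {
  const session = client.startSession();
  try {
    await session.withTransaction(async () => {
      const accounts = client.db('bank').collection('accounts');
      // Each $inc is applied atomically on the server; the surrounding
      // transaction makes the pair succeed or fail together.
      await accounts.updateOne(
        { _id: fromId },
        { $inc: { balance: -amount } },
        { session },
      );
      await accounts.updateOne(
        { _id: toId },
        { $inc: { balance: amount } },
        { session },
      );
    });
  } finally {
    await session.endSession();
  }
}
```

For a single-document change, the `$inc` operator alone is already atomic, so a transaction is only needed when several documents must change together.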
Well, on the front-end side we'd definitely go with lazy loading, and also the observer pattern, so to speak: only when a component or a piece of data is actually needed do we render that particular component. For example, when implementing scrollable functionality, only when the user hits the very end of the container do we fetch more data and append it to the previous data, and we repeat that again and again. So we send data in terms of pages: whenever the user hits the bottom of the scrollable container, the back end sends a particular section of data to the front end, and we append the new data to the existing data and render it accordingly. This really helps with performance when working with a large or significant data volume.
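The "append a page on scroll" idea above can be sketched framework-free, with the back end faked as an in-memory array (all names are illustrative):

```javascript
// Fake dataset standing in for the back end.
const allRows = Array.from({ length: 120 }, (_, i) => `row-${i + 1}`);
const PAGE_SIZE = 50;

// Stand-in for the paginated API endpoint: returns one page of data.
function fetchPage(page) {
  const start = (page - 1) * PAGE_SIZE;
  return allRows.slice(start, start + PAGE_SIZE);
}

// Front-end state: rows rendered so far, grown one page at a time.
let rendered = [];
let currentPage = 0;

// Called when the user scrolls near the bottom of the container.
function onScrollNearBottom() {
  currentPage += 1;
  rendered = rendered.concat(fetchPage(currentPage));
}

onScrollNearBottom(); // page 1 -> 50 rows rendered
onScrollNearBottom(); // page 2 -> 100 rows rendered
onScrollNearBottom(); // page 3 -> all 120 rows rendered
```

In a real app, `onScrollNearBottom` would be wired to a scroll event or an IntersectionObserver on a sentinel element at the bottom of the list, and `fetchPage` would be an HTTP call.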
Well, we need to follow the component lifecycle there: componentDidMount, componentDidUpdate, and componentWillUnmount. We need to cancel out all the stuff that was set up or created while the component was mounted, resolving all of those resources when the component unmounts, so that there's no chaos. With hooks, we can achieve that with useEffect: after implementing the required functionality in the useEffect callback, we tear down everything we created whenever the component unmounts, by returning a cleanup callback from the effect. Inside that cleanup function we can cancel any pending external API call, clear any setTimeout or setInterval, and generally remove the garbage when the component unmounts.
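In React the cleanup lives in the function returned from the useEffect callback. The same pattern can be shown without the framework, here as a hedged sketch where a hypothetical "effect" subscribes to an event source and returns its own teardown:

```javascript
// Stand-in for some external resource, e.g. a WebSocket's listener registry.
const listeners = new Set();

// The "effect": acquire the resource, and return a cleanup function,
// exactly like returning a callback from useEffect in React.
function subscribeEffect(handler) {
  listeners.add(handler);                 // effect body: subscribe
  return () => listeners.delete(handler); // cleanup: unsubscribe on unmount
}

const onMessage = (msg) => console.log('got', msg);
const cleanup = subscribeEffect(onMessage);
console.log(listeners.size); // 1 -- the effect is active

cleanup(); // what React runs when the component unmounts
console.log(listeners.size); // 0 -- nothing leaked
```

Forgetting the returned cleanup is how intervals, subscriptions, and listeners pile up across re-renders and cause memory leaks.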
Well, to manage or efficiently execute complex read and write queries in Node: for reads, we can go with indexing, since good indexing optimizes the performance of fetching data and lets a query execute as fast as possible. For writes, say a PUT or a PATCH request, we first need to find that particular document by its ID and then update it, and the convention should be that wherever we're updating something, we update only the particular columns, or the particular keys in the MongoDB object, that are actually changing, rather than rewriting the whole document, which takes more time.
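The "update only the changed keys" convention can be sketched as a small diff helper that builds a `$set`-style patch from the incoming payload (the document shape and field names here are made up for illustration):

```javascript
// Compare the incoming payload against the stored document and keep
// only the fields whose values actually differ.
function buildPatch(stored, incoming) {
  const patch = {};
  for (const [key, value] of Object.entries(incoming)) {
    if (stored[key] !== value) patch[key] = value;
  }
  return patch;
}

const stored = { name: 'Alak', role: 'engineer', city: 'Agartala' };
const incoming = { name: 'Alak', role: 'senior engineer', city: 'Agartala' };

const patch = buildPatch(stored, incoming);
console.log(patch); // { role: 'senior engineer' }
// In MongoDB this would feed into something like:
//   collection.updateOne({ _id }, { $set: patch })
```

Sending only the delta keeps the write small and avoids clobbering fields that another request may have updated concurrently.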
Concerning the state management approach, can you explain the potential issues, and how it could be improved with respect to best practices? Well, we should not call any function unconditionally in the body of a React component, before returning its JSX. The component re-renders whenever any state changes, so it would automatically call that particular function again and again, which generally causes performance issues. We should instead call those functions, or update state, from some event trigger. Also, if we want to update state or set some default value when the component first renders, we can put that inside a useEffect to manage it; but we should not leave a bare function call sitting in the component body.
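Why an unconditional state update during render loops forever can be shown with a toy, framework-free simulation (the render cap stands in for React's "Too many re-renders" error; everything here is illustrative):

```javascript
let renderCount = 0;

// Toy "component": rendering it immediately sets state again.
function render(state, setState) {
  renderCount += 1;
  if (renderCount > 5) return; // stand-in for React bailing out with an error
  setState(state + 1); // ANTI-PATTERN: unconditional setState during render
}

// Toy "setState": a state change schedules another render.
function setStateAndRerender(next) {
  render(next, setStateAndRerender);
}

render(0, setStateAndRerender);
console.log(renderCount); // 6 -- the loop only stopped because of the cap
```

Moving the `setState` call into an event handler, or into a `useEffect` with a proper dependency array, breaks the render-triggers-render cycle.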
For a dashboard that needs to display a large dataset while keeping rendering smooth and performance efficient: we need to fetch the data in chunks rather than fetching all 10,000 rows at once. To fetch data in chunks, we'd first fetch around 25 or 50 rows by putting a limit on the query, and add pagination on top of that, such that page 1 consists of only 50 rows. Whenever the user scrolls down to around the 45th row, we call for the next page of data, that is rows 51 to 100, and so on. We really need to divide the whole dataset into multiple chunks and assign page numbers to them, and then we can definitely manage those 10,000 rows smoothly in the application. If instead we fetched all the data at once, the application might freeze at one particular instant; and getting 10,000 rows at once from the API endpoint would take a lot of time, greatly increasing the execution time, which definitely causes a performance issue. So: definitely divide it into multiple chunks and fetch them one after another, for example with infinite scrolling.
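The server side of that chunking can be sketched with a limit/offset page function, with the 10,000-row dataset faked in memory (all names are illustrative):

```javascript
const LIMIT = 50;

// One page of results, the way a paginated endpoint would compute it.
// In SQL this corresponds to: SELECT ... LIMIT ? OFFSET ?
function getPage(rows, page, limit = LIMIT) {
  const offset = (page - 1) * limit;
  return {
    data: rows.slice(offset, offset + limit),
    page,
    totalPages: Math.ceil(rows.length / limit),
  };
}

// Fake 10,000-row dataset standing in for the database.
const dataset = Array.from({ length: 10000 }, (_, i) => ({ id: i + 1 }));

const first = getPage(dataset, 1);
console.log(first.data.length, first.totalPages); // 50 200
console.log(first.data[0].id, first.data[49].id); // 1 50

const second = getPage(dataset, 2);
console.log(second.data[0].id); // 51
```

The client then only ever holds a few pages at a time; combined with the infinite-scroll trigger described above, the UI never has to render 10,000 rows in one pass.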