Software Engineer 2 - GLIDEWELL SOFTWARE SERVICES & TECHNOLOGY CENTER
Javascript Developer - SRIJAN TECHNOLOGIES (A MATERIAL+ COMPANY)
Full Stack Developer - KOCO SCHOOLS (ATHENA & TULLY PTE)
Associate Developer - ARTISANS SOFT
Associate Software Developer - RUBICO IT PVT LTD
.NET
TypeScript
MongoDB
AWS Lambda
Docker
React
Node.js
Nest.js
PostgreSQL
REST APIs
Swagger
MySQL
PHP
Laravel
C#
Google API
GraphQL
ClickUp
GitHub Actions
GitLab CI/CD
JMeter
NGINX
AWS
CI/CD
Bitbucket
S3
Confluence
Git
Jenkins
AWS ECS
AWS EC2
AWS RDS
AWS VPC
API Gateway
CloudWatch
SNS
SQS
Jest
Postman
Kubernetes
ELK Stack
EKS
Kafka
Hi, my name is Akash Kendra. I have been working as a software professional for over five years, across back-end, full-stack, and front-end roles. On the front end I have worked with technologies such as React. On the back end I have worked with Node.js, both with the newer Nest.js framework and with Express.js, and with tooling such as RabbitMQ for message queues. On one project I worked with Elasticsearch, and I have also worked with C#. I have worked mostly with AWS cloud services, including CloudWatch, Lambda, and DynamoDB, and a little with MSK, which is newer, so I have not used it as much; I have also set up pipeline steps so that whatever we push from our end is deployed automatically. I use Git as a version control system, and for CI/CD I have worked mostly with GitLab, Bitbucket, and GitHub. On the database side I have worked with MongoDB, and on the SQL/RDBMS side with PostgreSQL. For testing I have used Jest and Mocha, mostly following a test-driven development approach, writing the tests first with Jest. On the Agile side I have always worked with Scrum, also following Kanban, and the tool I have mainly used throughout my career is Jira. So, yes, that is about me. Thank you.
In React, what method would you say is essential for properly unmounting components that involve ongoing API requests? Basically, properly unmounting a component with an ongoing API request can be done with the help of an AbortController to cancel the fetch request. We create a new AbortController instance to handle request cancellation, pass controller.signal to the fetch request so the request is linked to the controller, and then calling controller.abort() aborts the in-flight call. So in the effect's cleanup, which runs when the component unmounts, we return a function that calls controller.abort(). That is how we can handle unmounting while an API request is still in flight.
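The pattern described above can be sketched as follows. This is a minimal illustration, not the exact code from the interview: in a real component the function body would live inside a `useEffect`, and `fetchFn` would simply be the global `fetch` (it is injected here only so the pattern is self-contained); the `/api/users` URL is a placeholder.

```javascript
// Sketch of cancelling an in-flight request on unmount with AbortController.
// In React, this body would sit inside useEffect and the returned function
// would be the effect's cleanup.
function startLoad(fetchFn, url, onData) {
  const controller = new AbortController();

  fetchFn(url, { signal: controller.signal })
    .then((res) => res.json())
    .then(onData)
    .catch((err) => {
      // An aborted request rejects with an AbortError; that is expected
      // on unmount, so we swallow only that case.
      if (err.name !== 'AbortError') throw err;
    });

  // The cleanup function React would call when the component unmounts.
  return () => controller.abort();
}
```

Calling the returned cleanup flips `signal.aborted` to true, which causes `fetch` to reject and stop the request.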
Describe a scenario where an atomic operation in MongoDB is critical within a Node.js application, and how would you achieve it? The scenario would be something payment related. Suppose I am an end user searching for a product online, and multiple other users are doing the same thing. Say there are 100 items in stock and 50 people are logged in looking at the same product at the same time; that is where atomic operations come in. Once I add an item to the cart, it should be decremented from the available quantity. We can achieve this using multi-document transactions, which work on a replica set or a sharded cluster. I create a session, start a transaction on it, and then perform the operations in that transaction context: find and update the product data and update the order status together. If the operations succeed, I call session.commitTransaction(); if not, session.abortTransaction(); and finally session.endSession(). That is how it can be achieved.
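The checkout scenario above might be sketched like this with the official MongoDB Node driver's transaction API. It is a hedged illustration, not the interviewee's actual code: `client` is assumed to be an already-connected `MongoClient`, and the database, collection, and field names (`shop`, `products`, `orders`, `stock`) are made up for the example.

```javascript
// Atomically decrement stock and record an order inside one transaction.
async function placeOrder(client, productId, userId, qty) {
  const session = client.startSession();
  try {
    await session.withTransaction(async () => {
      const products = client.db('shop').collection('products');
      const orders = client.db('shop').collection('orders');

      // The filter plus $inc makes "check stock and decrement it" a single
      // atomic operation: it only matches when enough stock remains.
      const res = await products.updateOne(
        { _id: productId, stock: { $gte: qty } },
        { $inc: { stock: -qty } },
        { session }
      );
      if (res.modifiedCount === 0) throw new Error('insufficient stock');

      await orders.insertOne(
        { productId, userId, qty, status: 'placed' },
        { session }
      );
    });
  } finally {
    await session.endSession();
  }
}
```

`withTransaction` commits on success and aborts if the callback throws, which matches the commit/abort flow described in the answer.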
You are building a dashboard that needs to display a large dataset, say 10,000 rows. How do you ensure smooth rendering and efficient performance in React? For a 10,000-row dataset I can do this either with built-in features or with external libraries. One library is react-virtualized, which makes it easy to implement windowed rendering: only a small subset of rows is rendered, and as I scroll, more rows are rendered into the window, which keeps rendering and the user experience smooth. It can also be done with pagination, fetching data in chunks of, say, 100 or 300 rows at a time and fetching more as the user scrolls, so it never feels awkward to the user; the same effect can be achieved with lazy loading or infinite scrolling. To avoid unnecessary re-renders, I can use memoization with React.memo or the useMemo hook. And while searching or fetching data, I can also debounce the input for this purpose.
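The windowed-rendering idea above boils down to a small calculation, roughly what libraries like react-virtualized or react-window do internally. This sketch is an assumption for illustration (fixed row height, an `overscan` buffer of extra rows), not the API of either library.

```javascript
// Given the scroll position, compute which slice of rows should actually
// be rendered; everything outside [first, last] stays out of the DOM.
function visibleRange(scrollTop, viewportHeight, rowHeight, totalRows, overscan = 3) {
  const first = Math.max(0, Math.floor(scrollTop / rowHeight) - overscan);
  const last = Math.min(
    totalRows - 1,
    Math.ceil((scrollTop + viewportHeight) / rowHeight) + overscan
  );
  return { first, last };
}
```

With 10,000 rows, a 500 px viewport, and 25 px rows, only a couple of dozen rows are mounted at any time instead of all 10,000.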
Can you explain the potential issues in this class component, where incrementCount calls this.setState to increment the count? One issue I can see is a binding issue in the incrementCount part: this.setState({ count: this.state.count + 1 }) will increment starting from 1, then 2, and so on, but the method has to be bound to this, otherwise this is undefined when the handler fires. Also, setState is not synchronous; it basically works asynchronously, so updates that depend on the previous state should account for that. And we are using a class here; nowadays it is not preferred to use class-based components. So two things I found that can be improved: first, we can use a function-based component instead; and second, if we keep the class, we can bind incrementCount properly in the constructor, like this.incrementCount = this.incrementCount.bind(this), so that this can be used directly inside the method. There are multiple ways to do it, including hooks, but primarily: use a functional component, or bind the incrementCount method properly.
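The binding problem discussed above can be shown in plain JavaScript, with no React required. This is a minimal sketch (the `Counter` class is invented for the example): a class method handed off as a callback loses its `this` unless it is bound, which is exactly what happens when a React class component passes an unbound method as an event handler.

```javascript
// A class method loses `this` when detached; binding once in the
// constructor fixes it (the same fix described for incrementCount).
class Counter {
  constructor() {
    this.count = 0;
    // Without this line, calling a detached `increment` would not see
    // the instance and `this.count` would fail or hit the wrong object.
    this.increment = this.increment.bind(this);
  }

  increment() {
    this.count += 1;
  }
}
```

In modern React the same component would instead be a function component using `useState`, which sidesteps `this` entirely.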
Your React app's performance has degraded significantly as data volume grew. What steps would you take, using React DevTools and MongoDB profiling, to identify and solve the issue? For the MongoDB profiling part, we can enable the profiler with db.setProfilingLevel. At level 1, it logs queries taking longer than the slow threshold, say 100 milliseconds. Then we can find those slow queries in the db.system.profile collection, filtering for operations that took more than 100 milliseconds and sorting them by duration in decreasing order. To overcome the slow queries, we can basically add indexes in the database. For the React app's performance, using React DevTools we can use the Profiler tab to see which components re-render and which take the longest. Then, as I said for large data sets, we can use external libraries like react-virtualized or react-window for windowed rendering, along with lazy loading and pagination.
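The profiler steps described above might look like this against the Node driver. It is a hedged sketch: the 100 ms threshold and the top-10 limit are assumptions, and `db` is assumed to be an already-obtained `Db` handle.

```javascript
// Enable the MongoDB profiler and pull the slowest recorded operations.
async function findSlowQueries(db) {
  // Profiling level 1 records only operations slower than slowms.
  await db.command({ profile: 1, slowms: 100 });

  // The profiler writes entries to system.profile; sort slowest first.
  return db
    .collection('system.profile')
    .find({ millis: { $gt: 100 } })
    .sort({ millis: -1 })
    .limit(10)
    .toArray();
}
```

Each returned document includes the query shape and its duration, which points directly at the collections that need indexes.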
Examine this JavaScript function that is intended to return a new array; can you find any logical error? The loop is written as for (var i = 0; i <= arr.length; i++). Arrays are zero-indexed, so the last valid index is arr.length - 1. With i <= arr.length the loop runs one iteration too far and reads an element that does not exist. So either change the condition to i < arr.length, or keep the equals sign and loop to arr.length - 1; since we are not doing the latter, removing the equals sign so it reads i < arr.length is the fix here. The other issue is var: var is function-scoped, so i can later be used anywhere in the function; instead we should use the block-scoped let to avoid scope-related issues. Those are the two things that need to be fixed; other than that, it is good to go.
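A corrected version of the loop might look like this. The original snippet is not shown in the transcript, so the doubling transform here is a stand-in assumption; the two actual fixes are the ones named above: `i < arr.length` instead of `i <= arr.length`, and block-scoped `let` instead of `var`.

```javascript
// Build a new array from the input; the loop condition stops at the last
// valid index (arr.length - 1), so no undefined element is ever read.
function doubleAll(arr) {
  const result = [];
  for (let i = 0; i < arr.length; i++) {
    result.push(arr[i] * 2);
  }
  return result;
}
```

With the original `<=` condition, the final iteration would push `undefined * 2` (`NaN`) into the result.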
Explain a method to efficiently execute complex queries in MongoDB that need to read and write data in a Node.js application. We can do this with the aggregation framework: first match on the relevant attributes, then group, sort, and limit whatever values we are fetching, so the heavy work happens server-side in one pipeline. Or we can use bulk operations: suppose I need to update one document, delete one, and insert one; we can do all of that in a single bulkWrite against the given IDs, using $set for the updates, which makes the writes efficient. The third thing is what I mentioned earlier about transactions: for atomicity, start a session and a transaction, run the operations on the collections, commit the transaction if it succeeds, abort it if not, and then end the session. And efficient execution also comes down to indexing: index optimization gives faster query execution.
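The first two techniques above can be sketched as plain data structures, which is how the Node driver consumes them. The collection and field names (`orders`, `status`, `customerId`, `amount`) are illustrative assumptions, not from the transcript.

```javascript
// 1) Aggregation pipeline: match, group, sort, and limit run server-side
//    in a single pass instead of shipping every document to Node.
const pipeline = [
  { $match: { status: 'completed' } },
  { $group: { _id: '$customerId', total: { $sum: '$amount' } } },
  { $sort: { total: -1 } },
  { $limit: 10 },
];
// Usage: db.collection('orders').aggregate(pipeline).toArray()

// 2) bulkWrite: batch mixed writes (update, insert, delete) into one
//    round trip to the server.
const ops = [
  { updateOne: { filter: { _id: 1 }, update: { $set: { status: 'shipped' } } } },
  { insertOne: { document: { _id: 2, status: 'new' } } },
  { deleteOne: { filter: { _id: 3 } } },
];
// Usage: db.collection('orders').bulkWrite(ops)
```

Both shapes match the driver's documented input format, so they can be passed to `aggregate` and `bulkWrite` as-is once a collection handle exists.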
What is your process for identifying and preventing potential security threats in a web application built with Node.js and React? There are multiple things, because we rely a lot on logging as well as on the queries we run. We can identify threats using threat modeling, and we have to deal with CSRF, cross-site request forgery; with cross-site scripting; and, the third one, SQL injection. We have to be sure about input validation and sanitization, including in the HTML, because if you are using React it renders the components for you, but where there is a form we should avoid putting raw user input into the HTML to prevent cross-site scripting vulnerabilities. Also, for the .env files on the back end, we do not commit any sensitive information: we ignore the file via .gitignore and load it with the dotenv package. Other than that, authentication and authorization are already in place. Those are the ways potential security threats can be identified and prevented.
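The sanitization point above can be illustrated with a tiny escaping helper. This is a minimal sketch of the idea for raw templates; React's JSX performs this escaping automatically, and production code would normally lean on a vetted library rather than a hand-rolled function.

```javascript
// Escape user-supplied text before it is ever placed into HTML, so input
// like "<script>..." renders as text instead of executing.
function escapeHtml(input) {
  return String(input)
    .replace(/&/g, '&amp;')   // must run first, or later entities get re-escaped
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;')
    .replace(/"/g, '&quot;')
    .replace(/'/g, '&#39;');
}
```

The ampersand replacement has to come first; otherwise the `&` produced by `&lt;` and friends would be escaped a second time.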
You are tasked with creating a feature that allows users to upload and process large data files in a taxi fleet management system. Describe your approach to handling the file processing in a scalable way using Node.js and React. If we are using AWS or other cloud services, we would definitely use asynchronous uploads. Error handling should be there, so we use try/catch blocks with proper reporting of what the error is at each step. Suppose there is a big file and I have a restriction of, say, a 20 MB maximum: that can be handled with chunked uploads, splitting the file into chunks and uploading them piece by piece. Scalability while uploading can be handled on the Node.js back end: we can use S3 or Google Cloud Storage for the storage, and a temporary folder for temporary staging. We also need to create an endpoint for this on the back end, so that can be done.
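The chunked-upload idea above can be sketched as a small helper that splits a buffer into fixed-size pieces, each of which can then be uploaded (and retried) independently. The 5 MB default mirrors common multipart-upload part-size minimums, but it is an assumption for the example, not a value from the transcript.

```javascript
// Split a buffer into fixed-size chunks for piece-by-piece upload.
function splitIntoChunks(buffer, chunkSize = 5 * 1024 * 1024) {
  const chunks = [];
  for (let offset = 0; offset < buffer.length; offset += chunkSize) {
    // subarray is a zero-copy view, so large files are not duplicated in memory.
    chunks.push(buffer.subarray(offset, offset + chunkSize));
  }
  return chunks;
}
```

On the server side each chunk would be streamed to object storage (for example as a multipart-upload part), and only after the last part arrives is the file assembled and processed.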
Sorry, I have not done this one, so I believe I should not answer it, because I have not worked with it.