Vetted Talent

Pushpak Agarwal

More than 9 years of IT experience in software testing, automation, quality assurance, and DevOps, with strong experience in continuous integration and continuous delivery. Team-oriented, with strong analytical and leadership qualities; served as POC for several critical projects. A good team player with excellent communication skills, able to work efficiently and effectively under tight deadlines and pressure. Sound domain knowledge in e-commerce, banking, health insurance, retail, e-governance, crypto, and security. Experienced in designing and executing test cases and scenarios, and well versed in defect tracking and bug reporting. Extensively involved in user acceptance, feature, ad hoc, and regression testing. Experienced with the TestRail, Jira, and Quality Center test management tools.
  • Role

    Automation QA Engineer

  • Years of Experience

    10.5 years

Skillsets

  • Groovy - 5 Years
  • AWS - 5 Years
  • Selenium - 9 Years
  • API - 7 Years
  • Project Management - 5 Years
  • BDD - 6 Years
  • Regression Testing - 10 Years
  • Usability Testing - 8 Years
  • Agile - 9 Years
  • Jenkins - 8 Years
  • Maven - 6 Years
  • Scrum - 6 Years
  • Eclipse - 7 Years
  • EC2 - 3 Years
  • TestNG - 8 Years
  • QA - 10 Years
  • Leadership - 5 Years
  • API Testing - 9 Years
  • Functional Testing - 10 Years
  • Github - 4 Years
  • Cloud - 5 Years
  • Jira - 9 Years
  • GitLab - 7 Years
  • Java - 9 Years
  • QA Automation - 9 Years
  • Quality Assurance - 10 Years
  • CI/CD - 8 Years
  • Python - 5 Years
  • Automation Testing - 9 Years
  • REST API - 9 Years
  • Cucumber - 6 Years
  • DynamoDB - 2 Years
  • Design patterns - 3 Years
  • AWS Cloud Formation - 4 Years
  • Security Groups - 3 Years
  • Strong analytical - 4 Years
  • DevOps - 2 Years

Vetted For

3 Skills
  • Roles & Skills
  • Results
  • Details
  • Senior QA Analyst (AI Screening)
  • 71%
  • Skills assessed: Quality Analyst, Automation Testing, Quality Assurance
  • Score: 71/100

Professional Summary

10.5 Years
  • May, 2023 - Dec, 2023 (7 months)

    Sr. SDET

    Netskope Inc. Taiwan
  • Sep, 2020 - Apr, 2023 (2 yr 7 months)

    Sr. SDET

    Binance inc. Taiwan
  • Feb, 2017 - Aug, 2020 (3 yr 6 months)

    Sr. SDET

    Innova Solutions Taiwan
  • Nov, 2015 - Oct, 2016 (11 months)

    QA Automation Engineer

    Xpanxion UST Global
  • Oct, 2013 - Nov, 2015 (2 yr 1 month)

    QA Automation Engineer

    Cybage Software Pvt. Ltd.

Applications & Tools Known

  • Git
  • REST API
  • Python
  • Jira
  • Visual Studio Code
  • Slack
  • Figma
  • Postman
  • Microsoft Teams
  • AWS (Amazon Web Services)
  • Zephyr
  • Java
  • GitLab
  • Jenkins
  • Selenium
  • Groovy
  • SoapUI
  • TestRail
  • Cucumber
  • SpecFlow
  • REST Assured
  • pytest
  • C#

Work History

10.5 Years

Sr. SDET

Netskope Inc. Taiwan
May, 2023 - Dec, 2023 (7 months)
    • Working as a team member and Scrum Master for the API Protection team.
    • Actively participating in requirement-gathering meetings with the Product Owner.
    • Involved in project management activities such as metrics collection and daily status reporting.
    • Analyzing functional requirements and documenting them.
    • Preparing training and induction plans for the entire team.
    • Training new team members on the process and functionality of the project.
    • Resolving team queries and issues during test authoring and execution.
    • Helping the team prepare test data.
    • Communicating defects to the on-site counterpart.
    • Keeping track of repository and CC audits.
    • Creating and updating the automation framework and scripts when requirements change.
    • Writing test cases and creating test plans.
    • Automating scenarios using Python, REST APIs, Selenium, and pytest.
    • Updating test case results in TestRail; creating test runs and uploading or removing test cases in the tool.
    • Involved in refactoring the framework for UI and back-end testing.
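
A minimal sketch of the pytest-style back-end check described above, mocking an upstream response rather than calling a live service. The function and endpoint names (`get_scan_verdict`, `/v1/apps/...`) are illustrative, not Netskope's real API:

```python
import json

# Hypothetical client under test: it fetches a policy-scan verdict
# from an upstream service via an injected `fetch` callable.
def get_scan_verdict(fetch, app_id):
    raw = fetch(f"/v1/apps/{app_id}/scan")
    return json.loads(raw)["verdict"]

# Mocked upstream: a canned response instead of a call to the real service.
def fake_fetch(url):
    assert url == "/v1/apps/app-42/scan"
    return json.dumps({"verdict": "violation"})

def test_scan_verdict_flags_violation():
    # Compare actual behavior against the expected (mocked) behavior.
    assert get_scan_verdict(fake_fetch, "app-42") == "violation"
```

Injecting the fetcher keeps the test independent of flaky upstream services, which is the point of mocking in this workflow.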

Sr. SDET

Binance inc. Taiwan
Sep, 2020 - Apr, 2023 (2 yr 7 months)
    • Worked as a team member and Scrum Master for the Channel Integration team.
    • Actively participated in requirement-gathering meetings with the Product Owner.
    • Involved in project management activities such as metrics collection and daily status reporting.
    • Analyzed functional requirements and documented them.
    • Prepared training and induction plans for the entire team.
    • Trained new team members on the process and functionality of the project.
    • Resolved team queries and issues during test authoring and execution.
    • Helped the team prepare test data.
    • Communicated defects to the on-site counterpart.
    • Kept track of repository and CC audits.
    • Created and updated the automation framework and scripts when requirements changed.
    • Wrote test cases and created test plans.
    • Automated scenarios using Java and REST APIs.
    • Updated test case results in QCenter; created test runs and uploaded or removed test cases in the tool.
    • Designed and developed an automation framework for web application back-end and DB testing using J2SE and REST APIs.

Sr. SDET

Innova Solutions Taiwan
Feb, 2017 - Aug, 2020 (3 yr 6 months)
    • Worked as a team member on the Next-Gen team.
    • Involved in project management activities such as metrics collection and daily status reporting.
    • Coordinated with the on-site team on deliverables related to functionality and implementation.
    • Analyzed functional requirements and documented them.
    • Prepared training and induction plans for the entire team.
    • Trained new team members on the process and functionality of the project.
    • Resolved team queries and issues during test authoring and execution.
    • Helped the team prepare test data.
    • Communicated defects to the on-site counterpart.
    • Kept track of repository and CC audits.
    • Created and updated the automation framework and scripts when requirements changed.
    • Automated scenarios in SoapUI using Groovy and J2SE.
    • Automated scenarios in BDD using Cucumber and Gherkin when requested by the client.
    • Updated test case results in TestRail; created test runs and uploaded or removed test cases in the tool.
    • Designed and developed an automation framework for web application UI testing using J2SE, TestNG, Maven, and Selenium, with the Page Object Model design pattern.
    • Handled DevOps tasks for AWS CloudFormation.
    • Involved in creating DynamoDB tables for the certification and production environments.
    • Worked with the TOSCA tool to automate UI, API, and DB testing, and used TOSCA for E2E testing.
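
The Page Object Model mentioned above keeps locators and page actions in one class so tests never touch raw selectors. The original framework used J2SE/TestNG/Selenium; this sketch is in Python for brevity, with a stub driver standing in for a Selenium WebDriver so it runs without a browser (all names illustrative):

```python
# Page object: locators and actions for the login page live here.
class LoginPage:
    USERNAME = ("id", "username")
    PASSWORD = ("id", "password")
    SUBMIT = ("id", "login-btn")

    def __init__(self, driver):
        self.driver = driver

    def login(self, user, password):
        self.driver.type(self.USERNAME, user)
        self.driver.type(self.PASSWORD, password)
        self.driver.click(self.SUBMIT)
        return self.driver.current_page

class StubDriver:
    """Stand-in for a real WebDriver, for illustration only."""
    def __init__(self):
        self.current_page = "login"
        self.typed = {}

    def type(self, locator, text):
        self.typed[locator[1]] = text

    def click(self, locator):
        # Pretend a filled-in form navigates to the account page.
        if locator == LoginPage.SUBMIT and self.typed.get("password"):
            self.current_page = "account"

def test_login_navigates_to_account():
    assert LoginPage(StubDriver()).login("alice", "s3cret") == "account"
```

If a selector changes, only the page class is edited; every test that uses `LoginPage` stays untouched, which is the main payoff of the pattern.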

QA Automation Engineer

Xpanxion UST global
Nov, 2015 - Oct, 2016 (11 months)
    • Worked as a team member on the CivicPlus team.
    • Involved in project management activities such as daily status reporting.
    • Coordinated with the on-site team on deliverables related to functionality and implementation.
    • Analyzed functional requirements and documented them.
    • Prepared training and induction plans for the entire team.
    • Trained new team members on the process and functionality of the project.
    • Resolved team queries and issues during test authoring and execution.
    • Helped the team prepare test data.
    • Kept track of repository and CC audits.
    • Enhanced the framework and automation scripts when requirements changed.
    • Automated scenarios in Visual Studio 2015 using Selenium and C#.
    • Worked with the Page Object Model design pattern.

QA Automation Engineer

Cybage Software Pvt. Ltd.,
Oct, 2013 - Nov, 2015 (2 yr 1 month)
    • Worked as a team member on the IDP team.
    • Involved in project management activities such as metrics collection and daily status reporting.
    • Coordinated with the on-site team on deliverables related to functionality and implementation.
    • Analyzed functional requirements and documented them.
    • Prepared training and induction plans for the entire team.
    • Trained new team members on the process and functionality of the project.
    • Resolved team queries and issues during test authoring and execution.
    • Helped the team prepare test data.
    • Communicated defects to the on-site counterpart.
    • Kept track of repository and CC audits.
    • Created and updated the automation framework and scripts when requirements changed.
    • Automated scenarios in SoapUI using Groovy and J2SE.
    • Automated scenarios in BDD using Cucumber and Gherkin when requested by the client.
    • Updated test case results in TestRail; created test runs and uploaded or removed test cases in the tool.

Achievements

  • Top Performer award at Cybage Software Pvt. Ltd. for two consecutive years; QMS Award at Innova Solutions; promotion and Best QA Award at Binance for two years.

Testimonial

Innova Solutions

Change Healthcare

Best QA in the company.

Technical skills are outstanding.

Innovative and creative.

Scrum and Agile knowledge is very good.

Major Projects

6 Projects

API Protection Team

Netskope Inc
May, 2023 - Present (2 yr 5 months)
    • I work on the API Protection team, which is responsible for testing SaaS apps and checking for any policy violations that occur while scanning them. We perform web UI automation for our dashboard using Python, Selenium, and pytest, and API automation for end-to-end cases using Python. For back-end automation, we mock the API responses we get from upstream services and compare actual behavior against expected behavior.
    • Project Name: API Protection
    • Client Name: Own product
    • Domain: Security
    • Team Size: 14
    • Duration: May 2023 - present
    • Role: Sr. QA Automation Engineer/Sr. SDET/QL

Fiat Channel Integration (Binance Inc)

Binance Inc.
Sep, 2020 - Apr, 2023 (2 yr 7 months)
    • I was engaged in channel integration for the Fiat team. Channel integration means onboarding channels (payment platforms) for currencies and countries across the globe, which lets consumers deposit and withdraw funds through these payment platforms based on their location. Channels differ in currency, country, platform fees, transaction time, etc.
    • Project Name: Channel Integration Fiat
    • Client Name: Own product
    • Domain: Payment
    • Team Size: 32
    • Duration: Sep 2020 - April 2023
    • Role: Sr. QA Automation Engineer/Sr. SDET/QL

Next-Gen Payment Solutions

Innova Solutions
Feb, 2017 - Aug, 2020 (3 yr 6 months)
    • Next-Gen is a payment-solution web application used mainly by customers in the United States; it gives customers a platform to pay their medical insurance bills and premiums. For this application I handled API automation testing for the PPOL module; my role was framework design and automation scripting in SoapUI, using Groovy and J2SE as development languages. The APIs are REST-based; some of the Next-Gen APIs are products/Token, Payment/Request, Claims/Post, etc.
    • Project Name: Next-Gen Payment Solutions
    • Client Name: Change HealthCare
    • Domain: Finance and Health Care Insurance
    • Team Size: 16
    • Duration: Feb 2017 - Aug 2020
    • Role: QA Automation Engineer/SDET/QL

CivicPlus (Government Content Management System)

Xpanxion UST Global
Nov, 2015 - Oct, 2016 (11 months)
    • CivicPlus is a GCMS (Government Content Management System) project for US government employees; the application provides municipal website design and simple-to-use local government management tools for cities and counties of all populations.
    • Project Name: CivicPlus
    • Client Name: CivicPlus
    • Domain: CMS
    • Team Size: 11
    • Duration: Nov 2015 - October 2016
    • Role: SDET/QL

Rakuten IDP (Ichiba Development Platform)

Cybage Software Pvt. Ltd.
Jun, 2014 - Oct, 2015 (1 yr 4 months)
    • Rakuten is an e-commerce web application used mainly by customers in Malaysia, Indonesia, the UK, Singapore, Spain, Germany, and Taiwan. It deals in commodities for daily use, much like Flipkart. For this web application I handled API automation testing for the IDP (Ichiba Development Platform) module; my role was framework design and automation scripting in SoapUI, using Groovy and J2SE as development languages. The APIs are RESTful; some of the IDP APIs are products/add, Order/get, Inventory/Update, etc.
    • Project Name: IDP (Ichiba Development Platform)
    • Client Name: Rakuten
    • Domain: E-Commerce
    • Team Size: 32
    • Duration: June 2014 - October 2015
    • Role: QA Automation Engineer/QL

USAP (US Auto Parts)

Cybage Software Pvt. Ltd.
Oct, 2013 - Jun, 2014 (8 months)
    • USAP, i.e., US Auto Parts, is an e-commerce web application used mainly by customers in the United States, dealing chiefly in automobile spare parts. The application consists of several modules; I handled a module named Manager, whose main functionality is to track the orders placed by customers and ensure that their payments are handled and securely credited to the client's account.
    • Project Name: USAP (US Auto Parts)
    • Client Name: USAP
    • Domain: E-Commerce
    • Team Size: 45
    • Duration: October 2013 - June 2014
    • Role: QA Automation Engineer/QL

Education

  • M.Tech in Computer Science

    Himalayan University (2023)
  • CDAC Diploma in Advanced Computing

    Sunbeam Institute of Information Technology (2023)
  • B. Tech in Information Technology

    Rajiv Gandhi Technical University (2023)

Certifications

  • Successfully completed several AWS internal certifications.

  • Certified Scrum Master

    Scrum Alliance (Feb, 2019)
  • Certified Scrum Product Owner

    Scrum Alliance (Feb, 2019)

AI-interview Questions & Answers

What is the difference between white-box and black-box testing? In white-box testing you can see the code, the design, and the requirements; nothing is hidden, so you test with full knowledge of how the system works internally. In black-box testing the internals of the system are not revealed to you: the documentation says that a piece of code will behave in a certain way, but you have no access to the code, so you test by assuming the functionality works as described. For example, say there is a system that takes two numbers and returns their sum. In black-box testing we don't know how it calculates the sum, only that it performs this function; in white-box testing we can look at the code to verify the logic by which the two numbers are added. The main definition: if you know the internals of a system and have code access, it is white-box testing; otherwise it is black-box testing.

What is regression testing and why is it important? Regression testing means that when we deploy new code to an environment, whether staging, production, or anywhere else, we make sure the older functionality still works. Whenever we develop a new functionality or design and deploy it to a QA environment, before testing the new functionality we verify the older functionality against the new code; that is regression testing. It is important because new changes must not break existing behavior. For example, suppose there is a function that adds, and I add a new function that multiplies, but by mistake I change the addition logic. When I test addition it will fail, and I will know that my new code change affected the old one. That is why we need regression testing for every new code deployment, to make sure the previous functionality still works.
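The add/multiply example from the answer above can be written as a tiny suite: the old "add" test is the regression check that must keep passing after the new "multiply" feature lands.

```python
def add(a, b):
    return a + b          # existing functionality

def multiply(a, b):
    return a * b          # newly added functionality

def test_add_still_works():       # regression test: guards the old behavior
    assert add(2, 3) == 5

def test_multiply_new_feature():  # feature test: checks the new behavior
    assert multiply(2, 3) == 6
```

If a change to `multiply` accidentally touched the addition logic, `test_add_still_works` would fail and flag the regression immediately.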

Can you describe a defect tracking system you have used in the past? I have used Jira. The QA engineer first writes test cases based on the business requirements received from the product manager or product owner, then tests the system against them. When they find a difference between the actual requirement and the implementation, they log a defect, i.e., a ticket, in Jira: click the Create button, then fill in the description, priority, severity, summary, steps to reproduce, and the URL and credentials, and briefly explain what was expected versus what actually happened. The defect is assigned to the developer who built that functionality, the time is added, and the ticket is created. A notification is then sent to the assigned developer, who investigates first. If the QA logged it by mistake and the developer finds it is not a defect, they add a comment and move the status from Open to Not a Bug. If they consider it a valid defect, they change it to In Progress, fix it, and submit a PR for the fix, assigning the defect back to the QA who logged it and to a senior developer for code review. After retesting, the QA marks it fixed and adds a comment.
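The defect lifecycle described above can be sketched as a tiny state machine. The transitions are simplified for illustration; real Jira workflows are configurable per project.

```python
# Allowed workflow transitions (simplified Jira-style lifecycle).
TRANSITIONS = {
    "Open": {"In Progress", "Not a Bug"},
    "In Progress": {"Fixed"},
    "Fixed": {"Closed", "Reopened"},
    "Reopened": {"In Progress"},
}

def move(status, new_status):
    """Apply one workflow transition, rejecting illegal ones."""
    if new_status not in TRANSITIONS.get(status, set()):
        raise ValueError(f"illegal transition: {status} -> {new_status}")
    return new_status

# Happy path: developer fixes the defect, QA verifies and closes it.
status = "Open"
for step in ("In Progress", "Fixed", "Closed"):
    status = move(status, step)
```

Encoding the lifecycle this way makes it easy to check that no ticket jumps straight from Open to Closed without going through a fix and a retest.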

What steps would you take to validate the functionality and performance of a software product? First I read the design document and the requirement (SRS) document for the product. Take, for example, testing just the login functionality. First I check how the UI looks: that it has the username and password fields; the create-account, login, and sign-up buttons; and options like "log in with Google" and "log in with Facebook". Then I check the validations: if I leave the password empty and click login, it should not proceed and should show an error for the password, and the same for the username; if I enter wrong credentials, it should show an error like "invalid credentials"; and if I provide the right username and password, it should log in to my account page. That covers the main functionality of the login page. For performance, I do load and stress testing on top of that: with a tool like JMeter I generate traffic on the login page of around 500 or 1,000 users and check how the system performs, that everybody gets logged in, and that those users can see the account page. I check the graphs to see how long the login API takes to respond before the user is logged in, and I verify the UI does not crash. These are the things I would check for the functionality and performance of the software.
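The field validations above fit naturally into a table-driven check. `validate_login` is a stand-in for the real login endpoint; the accepted credentials and error messages are illustrative.

```python
# Stand-in validator mirroring the login checks described in the answer.
def validate_login(username, password):
    if not username:
        return "username required"
    if not password:
        return "password required"
    if (username, password) != ("alice", "s3cret"):
        return "invalid credentials"
    return "ok"

# One row per validation case from the answer.
CASES = [
    ("", "s3cret", "username required"),
    ("alice", "", "password required"),
    ("alice", "wrong", "invalid credentials"),
    ("alice", "s3cret", "ok"),
]

for user, pw, expected in CASES:
    assert validate_login(user, pw) == expected
```

Keeping the cases in a table means adding a new validation rule is a one-line change, and the same table can later drive a parametrized pytest.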

How have you worked with cross-functional teams with differing requirements and objectives in the past? When we get a new functionality or product, the product managers take the overview and it is divided among different teams. Let me give the example of my previous company, Binance. I was on the channel integration team: for every channel we integrate, there are two flows, withdraw and deposit, and one PM takes care of the withdraw flow while another takes care of the deposit flow. When we test the whole system end to end, my team (handling withdraw) needs to coordinate with the other team to make sure the messaging queues and messages are exchanged correctly over the distributed system, so that deposit, withdraw, and the end-to-end flow all succeed. Our deposit and withdraw flows also depend on services owned by other teams, such as the asset service and the KYC service, both specific to the user, and all of this comes together in the business flow. So for implementing any new channel we always coordinate across teams: we hold regular meetings and calls, run explanatory sessions, show our progress and ask about theirs, check the integration, and then do demos together to make sure we understand the whole system properly. That is how I have worked with cross-functional teams in the past.

Can you provide an example of a time when you provided constructive feedback to your team? Every year we have a performance and bonus review. Since I lead my team, I provide feedback to its members: I set up one-to-one meetings with them and, based on their work and whether they achieved their goals, I give them feedback; I also pass my team's feedback up to my managers for the performance review. It is not only about the review cycle, though. In regular work, if I notice a glitch in one of my teammates' work, or if I find someone doing their work in a very effective manner, I give that feedback in our daily stand-ups, appreciating them in front of the team. So roughly weekly, or once or twice a month, I provide constructive feedback to my teammates.

How would you ensure that our product meets regulatory compliance standards? The compliance team creates documentation for every product, and before we launch or ship we need to make sure all the compliance requirements are fulfilled. We take one of two approaches. If our timelines are very tight, we send the product documentation to the compliance team; they check their points and give us a sign-off, noting what is lacking and what is not, and we change the product accordingly. Otherwise, we have a meeting with the compliance manager and sit with them, checking everything in their document against ours to confirm that the product meets both the compliance criteria and our own product requirements. Once there is no further change requested from the compliance side, we can say the product is compliance-ready and follows the compliance standards.

In what ways have you used test automation to improve efficiency in your previous job? In my previous and current jobs we do two types of automation, UI and API, with one framework for each. For UI automation we use Python with Selenium, with pytest as the test framework to run and execute the tests. On the API side we do end-to-end testing; since our upstream APIs do not always respond, we use mocking: we mock the response with what we expect, send a request to the mock server, get the response back, and compare; if the response matches, the test passes. We added both automation suites to our Jenkins server, so whenever new code is deployed to any environment these steps run automatically. If every test passes, the step passes and the code is deployed to the next environment; if any test fails, that step in the Jenkins pipeline fails, the code cannot move further, and we check the logs to see which test is failing, fix it, and redeploy. In this way the regression testing for both UI and back end runs without human intervention, which really increases efficiency because no human effort is required to retest scenarios we have already covered.

How would you approach the task of building an automated test framework from scratch? First I check what the framework is for: UI testing or API testing. Take an API automation framework as an example. I start by reading the documentation for the APIs we need to automate: are they REST APIs or SOAP web services, what request types do they use, do they require authorization or authentication, do they need any special headers. Once the API requirements are clear, I discuss with my team which language and which tools best fit these APIs. When that is finalized, we make a design: where the interfaces will live, where the test data comes from, and how we send requests to the web server. Sometimes the upstream APIs do not respond reliably, so I check whether we need mocking or can hit the upstream APIs directly. API testing also requires database validation, so we match the actual response against the database: for a POST, we check that the new entries were added to the database tables; for a GET, we check that the entries coming from the database tables match the response from the API. Once the design is finalized, we start implementing: I divide the framework work into smaller tasks across the team, including myself, we build the framework in parallel, integrate it, and then start testing.
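The GET check described above can be sketched as follows: the rows returned by the API must match what is in the database. Both sides are faked here for illustration, with SQLite standing in for the real database and a canned list standing in for the API response.

```python
import sqlite3

def fetch_users_from_api():
    # Stand-in for the real API call (e.g. a GET /users request).
    return [{"id": 1, "name": "alice"}, {"id": 2, "name": "bob"}]

def fetch_users_from_db(conn):
    # Read the same records straight from the database.
    rows = conn.execute("SELECT id, name FROM users ORDER BY id").fetchall()
    return [{"id": r[0], "name": r[1]} for r in rows]

# Seed an in-memory database with the expected records.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO users VALUES (?, ?)", [(1, "alice"), (2, "bob")])

# The GET check: the API response and the DB contents must agree.
assert fetch_users_from_api() == fetch_users_from_db(conn)
```

The POST check is the mirror image: issue the request, then query the table and assert the new row is present.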

Can you describe a situation where you had to rapidly adjust your testing plans and strategies due to a sudden change in product requirements? Yes, this has happened to me. When the product plan was finalized, the client gave us the SRS document and agreed with the product manager on the functionalities to deliver: say they asked for seven, and after discussion they agreed that five would be delivered in the first phase and the remaining two as add-ons in the second. But at the last moment they said no, they wanted all seven; we would need to extend the timeline, increase our bandwidth, and take holidays later, but test all seven. We had already written the test cases and started testing, with our developers building the system in parallel while we tested alongside them. The timeline was very tight because the client changed the requirement at the end, but we adjusted and delivered without bugs; the client was very happy and we received a lot of appreciation emails. We did convey to our VP that we had delivered this time, but that in future we need ample time, because we don't want any product shipped with defects or misbehaving in production; we want enough time to make sure all functionality is well tested and well developed before it is delivered to the client.

Have you ever introduced a tool or process that significantly improved the QA process at your company? Yes. As the acting Scrum Master of my team, I check in with all the QAs and developers on their tasks: what they are doing, what they have to do, and any problems they are facing in communication or collaboration. On the process side, I established practices around task completion and deadlines; this worked effectively in my team, and everyone's performance increased by around 70-80% after these processes were introduced, helped by regular meetings to keep everybody on the same page. On the tooling side, we built frameworks to make automation testing reliable and to run all the tests without human intervention, increasing both reliability and speed. One tool I introduced at my company is TestRail: it has a lot of APIs you can integrate into your framework, so you can push all the test cases to it and update the results, letting everybody see how the build is doing and whether all the regression tests are passing. You can create charts and get a visual view of the testing. This improved performance and the QA process in our team after I introduced it.

What experience, if any, do you have with continuous integration and delivery pipelines? All of our automation suites for existing, stable functionality run through Jenkins. We created Jenkins pipelines that move the code through different environments: in our company we use environments like dev, QA, pre-prod, and prod, and every environment has a pipeline. Each pipeline has two or three steps related to QA: first UI automation, then API automation. In these steps we configure the Git repository for both the UI and the API suites. For UI we use Python and pytest, so we give the Git repository, the pipeline clones the master branch, and then it runs a command line starting with pytest followed by the test suite name and so on. When execution starts in the pipeline, it runs all the priority-1 UI automation test cases to make sure the UI works. For API automation we again use Python with some libraries to test the APIs; a pytest plugin provides the HTTP methods such as POST and GET, and we have tailored those methods with our custom tokens and so on. Here too we give a Git repository and a command like pytest with the suite name, and it runs all the API tests. When both test steps pass, both steps in the pipeline succeed and the code deploys to the next environment. This helps with regression and makes sure the older functionality is not broken.