Vetted Talent

Pushpak Agarwal

More than 9 years of IT experience in software testing, automation quality assurance, and DevOps, with good experience in continuous integration and continuous delivery. Team oriented, with strong analytical and leadership qualities; served as POC for various critical projects. Good team player with excellent communication skills and the ability to work efficiently and effectively under tight deadlines and pressure. Sound domain knowledge in e-commerce, banking, health insurance, retail, e-governance, crypto, and security. Experienced in the design and execution of test cases and scenarios, and well versed in defect tracking and bug reporting. Extensively involved in user acceptance, feature, ad hoc, and regression testing. Experienced with the TestRail, Jira, and Quality Center test management tools.
  • Role

    Automation QA Engineer

  • Years of Experience

    10.5 years

Skillsets

  • Groovy - 5 Years
  • AWS - 5 Years
  • Selenium - 9 Years
  • API - 7 Years
  • Project Management - 5 Years
  • BDD - 6 Years
  • Regression Testing - 10 Years
  • Usability Testing - 8 Years
  • Agile - 9 Years
  • Jenkins - 8 Years
  • Maven - 6 Years
  • Scrum - 6 Years
  • Eclipse - 7 Years
  • EC2 - 3 Years
  • TestNG - 8 Years
  • QA - 10 Years
  • Leadership - 5 Years
  • API Testing - 9 Years
  • Functional Testing - 10 Years
  • Github - 4 Years
  • Cloud - 5 Years
  • Jira - 9 Years
  • GitLab - 7 Years
  • Java - 9 Years
  • QA Automation - 9 Years
  • Quality Assurance - 10 Years
  • CI/CD - 8 Years
  • Python - 5 Years
  • Automation Testing - 9 Years
  • REST API - 9 Years
  • Cucumber - 6 Years
  • DynamoDB - 2 Years
  • Design patterns - 3 Years
  • AWS Cloud Formation - 4 Years
  • Security Groups - 3 Years
  • Analytical Skills - 4 Years
  • DevOps - 2 Years

Vetted For

3 Skills
  • Senior QA Analyst (AI Screening)
  • Skills assessed: Quality Analyst, Automation Testing, Quality Assurance
  • Score: 71/100

Professional Summary

10.5 Years
  • May, 2023 - Dec, 2023 7 months

    Sr. SDET

    Netskope Inc. Taiwan
  • Sep, 2020 - Apr, 2023 2 yr 7 months

    Sr. SDET

    Binance inc. Taiwan
  • Feb, 2017 - Aug, 2020 3 yr 6 months

    Sr. SDET

    Innova Solutions Taiwan
  • Nov, 2015 - Oct, 2016 11 months

    QA Automation Engineer

    Xpanxion UST Global
  • Oct, 2013 - Nov, 2015 2 yr 1 month

    QA Automation Engineer

    Cybage Software Pvt. Ltd.

Applications & Tools Known

  • Git
  • REST API
  • Python
  • Jira
  • Visual Studio Code
  • Slack
  • Figma
  • Postman
  • Microsoft Teams
  • AWS (Amazon Web Services)
  • Zephyr
  • Java
  • GitLab
  • Jenkins
  • Selenium
  • Groovy
  • SoapUI
  • TestRail
  • Cucumber
  • SpecFlow
  • REST Assured
  • pytest
  • C#

Work History

10.5 Years

Sr. SDET

Netskope Inc. Taiwan
May, 2023 - Dec, 2023 7 months
    • Worked as a team member and Scrum Master for the API Protection team.
    • Actively participated in requirement-gathering meetings with the Product Owner.
    • Involved in project management activities such as metrics collection and daily status reporting.
    • Analyzed and documented the functional requirements.
    • Prepared training and induction plans for the entire team.
    • Trained new team members on the process and functionality of the project.
    • Resolved team queries and issues during test authoring and execution.
    • Helped the team prepare test data.
    • Communicated defects with the on-site counterpart.
    • Kept track of repository and CC audits.
    • Created and updated the automation framework when requirements changed.
    • Wrote test cases and created test plans.
    • Automated scenarios using Python, REST API, Selenium, and pytest.
    • Updated test case results in TestRail; created test runs and uploaded or removed test cases in the tool.
    • Involved in refactoring the framework for UI and back end.
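The UI framework work above follows the Page Object Model. A minimal sketch of that pattern in Python is below; the page, locators, and fake driver are illustrative stand-ins (not the actual framework), and a stub replaces the real Selenium WebDriver so the sketch runs without a browser.

```python
# Page Object Model sketch: the page object wraps locators and actions
# so tests stay readable when the UI changes. Names are hypothetical.

class LoginPage:
    """Page object for a hypothetical login screen."""

    USERNAME = ("id", "username")               # locator strategy + value
    PASSWORD = ("id", "password")
    SUBMIT = ("css selector", "button[type=submit]")

    def __init__(self, driver):
        self.driver = driver

    def login(self, user, password):
        # With a real Selenium driver these calls would type into the
        # fields and click the button; the page object hides the details.
        self.driver.find_element(*self.USERNAME).send_keys(user)
        self.driver.find_element(*self.PASSWORD).send_keys(password)
        self.driver.find_element(*self.SUBMIT).click()
        return self


class FakeElement:
    """Stand-in for a Selenium WebElement, so the sketch needs no browser."""
    def __init__(self, log):
        self.log = log
    def send_keys(self, text):
        self.log.append(("type", text))
    def click(self):
        self.log.append(("click",))


class FakeDriver:
    """Stand-in for a Selenium WebDriver; records every interaction."""
    def __init__(self):
        self.log = []
    def find_element(self, by, value):
        self.log.append(("find", by, value))
        return FakeElement(self.log)


driver = FakeDriver()
LoginPage(driver).login("alice", "s3cret")
actions = [entry[0] for entry in driver.log]
print(actions)  # three find/type-or-click interactions in order
```

In a real test suite the same `LoginPage` class would be constructed with an actual `selenium.webdriver` instance, so tests change only when locators do.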

Sr. SDET

Binance inc. Taiwan
Sep, 2020 - Apr, 2023 2 yr 7 months
    • Worked as a team member and Scrum Master for the Channel Integration team.
    • Actively participated in requirement-gathering meetings with the Product Owner.
    • Involved in project management activities such as metrics collection and daily status reporting.
    • Analyzed and documented the functional requirements.
    • Prepared training and induction plans for the entire team.
    • Trained new team members on the process and functionality of the project.
    • Resolved team queries and issues during test authoring and execution.
    • Helped the team prepare test data.
    • Communicated defects with the on-site counterpart.
    • Kept track of repository and CC audits.
    • Created and updated the automation framework when requirements changed.
    • Wrote test cases and created test plans.
    • Automated scenarios using Java and REST API.
    • Updated test case results in QCenter; created test runs and uploaded or removed test cases in the tool.
    • Designed and developed an automation framework for web application back-end and DB testing using J2SE and REST API.

Sr. SDET

Innova Solutions Taiwan
Feb, 2017 - Aug, 2020 3 yr 6 months
    • Worked as a team member for the Next-Gen team.
    • Involved in project management activities such as metrics collection and daily status reporting.
    • Coordinated with the on-site team on various outputs related to functionality and implementation.
    • Analyzed and documented the functional requirements.
    • Prepared training and induction plans for the entire team.
    • Trained new team members on the process and functionality of the project.
    • Resolved team queries and issues during test authoring and execution.
    • Helped the team prepare test data.
    • Communicated defects with the on-site counterpart.
    • Kept track of repository and CC audits.
    • Created and updated the automation framework when requirements changed.
    • Automated scenarios in SoapUI using Groovy and J2SE.
    • Automated scenarios in BDD using Cucumber and Gherkin when requested by the client.
    • Updated test case results in TestRail; created test runs and uploaded or removed test cases in the tool.
    • Designed and developed an automation framework for web application UI testing using J2SE, TestNG, Maven, and Selenium, with the Page Object Model design pattern.
    • Handled DevOps tasks for AWS CloudFormation.
    • Involved in creating DynamoDB tables for certification and production environments.
    • Worked with the TOSCA tool to automate UI, API, and DB testing, and used TOSCA for E2E testing.

QA Automation Engineer

Xpanxion UST Global
Nov, 2015 - Oct, 2016 11 months
    • Worked as a team member for the CivicPlus team.
    • Involved in project management activities such as daily status reporting.
    • Coordinated with the on-site team on various outputs related to functionality and implementation.
    • Analyzed and documented the functional requirements.
    • Prepared training and induction plans for the entire team.
    • Trained new team members on the process and functionality of the project.
    • Resolved team queries and issues during test authoring and execution.
    • Helped the team prepare test data.
    • Kept track of repository and CC audits.
    • Enhanced the automation framework when requirements changed.
    • Automated scenarios in Visual Studio 2015 using Selenium and C#.
    • Worked with the Page Object Model design pattern.

QA Automation Engineer

Cybage Software Pvt. Ltd.,
Oct, 2013 - Nov, 2015 2 yr 1 month
    • Worked as a team member for the IDP team.
    • Involved in project management activities such as metrics collection and daily status reporting.
    • Coordinated with the on-site team on various outputs related to functionality and implementation.
    • Analyzed and documented the functional requirements.
    • Prepared training and induction plans for the entire team.
    • Trained new team members on the process and functionality of the project.
    • Resolved team queries and issues during test authoring and execution.
    • Helped the team prepare test data.
    • Communicated defects with the on-site counterpart.
    • Kept track of repository and CC audits.
    • Created and updated the automation framework when requirements changed.
    • Automated scenarios in SoapUI using Groovy and J2SE.
    • Automated scenarios in BDD using Cucumber and Gherkin when requested by the client.
    • Updated test case results in TestRail; created test runs and uploaded or removed test cases in the tool.

Achievements

  • Achieved the Top Performer award at Cybage Software Pvt. Ltd. for two consecutive years.
  • Achieved the QMS Award at Innova Solutions.
  • Achieved a promotion and the Best QA Award at Binance for two years.

Testimonial

Innova Solutions

Change Healthcare

Best QA in the company.

Technical skills are outstanding.

Innovative and creative.

Scrum and Agile knowledge is very good.

Major Projects

6 Projects

API Protection Team

Netskope Inc
May, 2023 - Present
    • I work with the API Protection team, which is responsible for testing SaaS apps to check for any policy violations that occur while scanning them. We perform web UI automation for our dashboard using Python + Selenium + pytest, and API automation for end-to-end cases using Python. For back-end automation, we mock the API responses we get from upstream services and compare actual against expected behavior.
    • Project Name: API Protection
    • Client Name: Own product
    • Domain: Security
    • Team Size: 14
    • Duration: May 2023 - Present
    • Role: Sr. QA Automation Engineer / Sr. SDET / QL

Fiat Channel Integration (Binance Inc)

Binance Inc.
Sep, 2020 - Apr, 2023 2 yr 7 months
    • I was engaged in channel integration for the Fiat team. Channel integration means onboarding channels (payment platforms) for currencies and countries across the globe, which lets consumers deposit and withdraw funds through these payment platforms based on their location. Channels differ in currency, country, platform fees, transaction time, etc.
    • Project Name: Channel Integration Fiat
    • Client Name: Own product
    • Domain: Payment
    • Team Size: 32
    • Duration: Sep 2020 - Apr 2023
    • Role: Sr. QA Automation Engineer / Sr. SDET / QL

Next-Gen Payment Solutions

Innova Solutions
Feb, 2017 - Aug, 2020 3 yr 6 months
    • Next-Gen is a payment-solutions web application used mainly by customers in the United States; it gives customers a platform to pay their medical insurance bills and premiums. For this web application I handled API automation testing for the PPOL module. My role was framework design and automation scripting in SoapUI, using Groovy and J2SE as development languages. The API is REST based; some of the Next-Gen APIs are products/Token, Payment/Request, Claims/Post, etc.
    • Project Name: Next-Gen Payment Solutions
    • Client Name: Change Healthcare
    • Domain: Finance and Health Care Insurance
    • Team Size: 16
    • Duration: Feb 2017 - Aug 2020
    • Role: QA Automation Engineer / SDET / QL

CivicPlus (Government Content Management System)

Xpanxion UST Global
Nov, 2015 - Oct, 2016 11 months
    • CivicPlus is a GCMS (Government Content Management System) project for US government employees; the application provides municipal website design and simple-to-use local government management tools for cities and counties of all populations.
    • Project Name: CivicPlus
    • Client Name: CivicPlus
    • Domain: CMS
    • Team Size: 11
    • Duration: Nov 2015 - Oct 2016
    • Role: SDET/QL

Rakuten IDP (Ichiba Development Platform)

Cybage Software Pvt. Ltd.
Jun, 2014 - Oct, 2015 1 yr 4 months
    • Rakuten is an e-commerce web application used mainly by customers in Malaysia, Indonesia, the UK, Singapore, Spain, Germany, and Taiwan. This e-commerce web application deals in commodities for daily use, much like Flipkart. For this web application I handled API automation testing for the IDP (Ichiba Development Platform) module. My role was framework design and automation scripting in SoapUI, using Groovy and J2SE as development languages. The API is RESTful; some of the IDP APIs are products/add, Order/get, Inventory/Update, etc.
    • Project Name: IDP (Ichiba Development Platform)
    • Client Name: Rakuten
    • Domain: E-Commerce
    • Team Size: 32
    • Duration: Jun 2014 - Oct 2015
    • Role: QA Automation Engineer/QL

USAP (US Auto Parts)

Cybage Software Pvt. Ltd.
Oct, 2013 - Jun, 2014 8 months
    • USAP, i.e., US Auto Parts, is an e-commerce web application used mainly by customers in the United States, dealing mainly in automobile spare parts. The application consists of several modules. I handled a module named Manager; its main functionality is to keep track of the orders placed by customers and to ensure that their payments are handled and securely credited to the client's account.
    • Project Name: USAP (US Auto Parts)
    • Client Name: USAP
    • Domain: E-Commerce
    • Team Size: 45
    • Duration: Oct 2013 - Jun 2014
    • Role: QA Automation Engineer/QL

Education

  • M.Tech in Computer Science

    Himalayan University (2023)
  • CDAC Diploma in Advanced Computing

Sunbeam Institute of Information Technology (2023)
  • B. Tech in Information Technology

    Rajiv Gandhi Technical University (2023)

Certifications

  • Successfully completed several AWS internal certifications.

  • Certified Scrum Master

    Scrum Alliance (Feb, 2019)
  • Certified Scrum Product Owner

    Scrum Alliance (Feb, 2019)

AI-interview Questions & Answers

So white box testing and black box testing, what is the difference? So white box testing is, you can see the code, you can see the design, you can see the requirement and everything. That is white box testing. White box means everything is visible to you; there's nothing hidden that you need to assume. And black box testing is that there's a box, a system where the details are not revealed to you. It means that in a document it is written that this piece of code will work like this, but you don't have access to that code. So you need to test by assuming that the functionality will work according to the document, so that I can test in this way. For example, let's say there is a system which will calculate: if you send two numbers to it, it will calculate and return you the sum. So in black box testing, we don't know how it is calculating the sum, but we know that it is doing this functionality. And in white box, you can see the code to make sure that the two numbers are added by this function or this approach or this logic. So this is the main difference between white box and black box testing. The main definition is: if you know the internals of a system, if you have code access, it is white box testing, and otherwise it is black box testing.
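The "sum service" example from this answer can be sketched in a few lines; the `calculate` function here is a made-up stand-in for the opaque system under test.

```python
# Black-box view of the "sum service" example: the test only knows the
# documented behavior (two numbers in, their sum out), not the body.

def calculate(a, b):
    # Imagine this is hidden behind an API; a black-box tester
    # never sees this implementation.
    return a + b

# Black-box tests: derived purely from the documented spec.
assert calculate(2, 3) == 5
assert calculate(-1, 1) == 0

# A white-box tester, by contrast, could read the body above and write
# tests targeting specific statements or branches in the code.
print("black-box checks passed")
```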

What is regression testing and why is it important? Okay, so regression testing is mainly when we deploy new code to an environment, which can be staging or production, and we need to make sure that the older functionality works fine. So whenever we are developing a new functionality or a new design and deploying it to a QA environment, before we test the new functionality, we need to make sure that the older functionality is working fine. Testing the older functionality against the newer code on the environment is called regression testing. And it is important because we need to make sure that the new changes do not affect the older functionality. For example, there's a function which will add. I am adding a new capability so that the function will multiply also, but by mistake I change the logic of addition. So when I test addition, it will fail, and I will get to know that my new code change affected the older one. For every new code deployment, we need to do regression testing to make sure that the previous functionality is working fine. That is why regression testing is important.
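The add/multiply example above can be shown as a tiny regression suite; the function and suite names are illustrative.

```python
# A regression suite re-runs the old addition tests after the new
# multiply feature lands, so a mistake in the old logic is caught
# immediately on the next deployment.

def add(a, b):
    return a + b

def multiply(a, b):  # the newly added functionality
    return a * b

def regression_suite():
    # Old tests, kept and re-run on every deployment.
    assert add(2, 2) == 4
    assert add(0, 7) == 7
    return "regression passed"

def new_feature_suite():
    # Tests written for the new functionality.
    assert multiply(3, 4) == 12
    return "new feature passed"

print(regression_suite())
print(new_feature_suite())
```

If a change to `add` broke its logic, `regression_suite()` would fail before the new code ever reached the next environment.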

I have used a defect tracking system in the past. Specifically, I have used Jira. And in Jira, what we do is first, the QA or the engineer, the quality assurance engineer tests the system. After testing the system according to the business requirement, which comes from the product manager or product owner, the quality assurance engineer writes the test cases and tests the system based on those test cases. Once the quality assurance engineer finds a difference between the actual requirement and the implementation, they log a defect, which means they create a ticket in Jira. To create it, they need to click on the create button, select the description, priority, severity, summary of the defect, steps to reproduce, and URL credentials, and then briefly explain what is expected and what is actual. They will assign the defect to a developer who implemented the functionality, add the time and other details, and then create the ticket. Once the ticket has been created, a notification is sent to the developer who is assigned to it, and they will first investigate. If the developer finds that the defect is not valid, for example, if the QA by mistake added a defect, they will add a comment saying NAB, change the status from open to NAB, and close the ticket. If they think it's a valid defect, they will change the status to work in progress or in progress, fix it, submit a PR, assign the defect to the QA who logged it, and assign it to a senior developer for code review. The QA will then test it, add a comment, and mark it fixed.
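The ticket fields described above map onto Jira's REST "create issue" endpoint (`POST /rest/api/2/issue`). This sketch only builds the JSON payload; the project key, priority name, and assignee are hypothetical examples, and nothing is actually sent.

```python
# Build a Jira create-issue payload for a defect, mirroring the fields
# the QA fills in: summary, description (steps to reproduce, expected
# vs actual), priority, and the developer the defect is assigned to.

def build_defect_payload(summary, description, priority, assignee):
    return {
        "fields": {
            "project": {"key": "QA"},        # hypothetical project key
            "issuetype": {"name": "Bug"},
            "summary": summary,
            "description": description,      # steps, expected vs actual, URL, credentials
            "priority": {"name": priority},
            "assignee": {"name": assignee},  # developer who owns the functionality
        }
    }

payload = build_defect_payload(
    summary="Login fails with valid credentials",
    description="Steps: 1) open login page 2) enter valid user. Expected: dashboard. Actual: error.",
    priority="High",
    assignee="dev.alice",
)
print(payload["fields"]["issuetype"]["name"])  # Bug
```

A real integration would POST this payload with an authenticated HTTP client; Jira then notifies the assignee, matching the workflow described above.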

What steps would you take to validate the functionality and performance of a software product? Okay, so first, I will read the requirement, the design document, and the SRS document for that product. For example, let's say I need to test only the login functionality. First, I will check how the UI looks. I will check all the fields, including the username, password, create account button, login and sign-up buttons, and then log in with Google, log in with Facebook, and so on. Then I will check the validation. For example, if I enter only the username and click on login, it will not work; it will give an error for the password, and the same for the username as well. If I give wrong credentials, it will give me an error that says "invalid credentials," something like that. And then, if I provide the right username and password, it will log me into my account page. That covers the main functionality of the login page. And for performance testing, I will do load testing on top of that. I will do stress testing, which means I will use a tool like JMeter to generate traffic on the login page with around 500,000 to 1,000,000 users, and I will check the performance. I will validate that users logged in with these credentials can see the account page, and I will check the graphs to confirm that the logins succeeded. I will check the time taken for the login API to return a response and for the user to get logged in. I will also check that the UI is not crashing. So these are the things I will check for the functionality and performance of the software.
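The functional login checks above can be expressed as a standalone sketch. The `validate_login` function, its error messages, and the user store are hypothetical; the test cases mirror the ones in the answer (empty fields, wrong credentials, happy path).

```python
# Functional validation of a login flow, as described above. The user
# store and messages are illustrative assumptions, not a real system.

USERS = {"alice": "s3cret"}  # stand-in for the real user store

def validate_login(username, password):
    if not username:
        return "username is required"
    if not password:
        return "password is required"
    if USERS.get(username) != password:
        return "invalid credentials"
    return "ok"

# Test cases derived directly from the requirements.
assert validate_login("", "x") == "username is required"
assert validate_login("alice", "") == "password is required"
assert validate_login("alice", "wrong") == "invalid credentials"
assert validate_login("alice", "s3cret") == "ok"
print("login functional checks passed")
```

Performance testing would then hit the same login endpoint concurrently (e.g. with JMeter) and measure response times under load, as the answer describes.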

How have you worked with cross-functional teams in the past with differing requirements and objectives? So sometimes, when we get a new functionality or a new product, our product managers get an overview and it is divided among different teams. I'll give you the example of my previous company, Binance. I was in the channel integration team, and for every channel we integrate, we need to handle two flows, withdraw and deposit. What happens is that one PM takes care of the withdraw flow, and another PM takes care of the deposit flow. When we are testing the whole system, say during end-to-end testing, we need to talk with cross-functional teams. For example, my team is doing withdraw, so I need to coordinate with the other team to make sure that the messaging queue and the messages are coming through correctly and are exchanged over the distributed system, so that the deposit is successful, the withdrawal is successful, and the endpoint is successful. In our deposit and withdraw flows we also check services that depend on different teams, like the asset service and the KYC service for particular users. This all comes into the business flow. For implementing any new channel, we always need to coordinate with cross teams. We hold meetings and calls regularly, and explanatory sessions with them. We show our progress, ask about their progress, check the integration, and then do demos together to make sure we understand the whole system properly. In this way, I have worked with cross-functional teams in the past.

Every year, we get a performance bonus review. So, I lead my team, and I need to provide feedback to the team. First, I align one-on-one meetings with them, and depending on their work and everything, I determine whether they've achieved their goals or not. I give them feedback, and I also need to give feedback on my team's performance to my managers and higher managers for the performance review. Sometimes, it's not about the performance review cycle. But sometimes, in regular working, if I find a glitch in a teammate's work or if I see someone doing their job exceptionally well and effectively, I need to provide feedback to them during our daily stand-ups. I appreciate them in front of the team and say that these people are doing it that way. It's very good, like that. So, for me, it happens weekly or one to two times a month, I always provide constructive feedback to my teammates.

How would you ensure that our product meets regulatory compliance standards? There is documentation created by the compliance team for every product. Before we launch or ship our product, we need to make sure that all compliance requirements are fulfilled. We take two approaches. One approach, if our timelines are very tight, is to send the product documentation to the compliance team; they check their points and give us a sign-off that it's compliant, or tell us what is lacking, and we change it accordingly. Otherwise, we have a meeting with the compliance manager, and I sit with them and check each point in their document against our document, ensuring that everything meets the compliance criteria and the product requirements. If there's no further requirement from the compliance side, then we can say that our product is compliant and follows the compliance standard.

In what way have you used test automation to improve efficiency in your previous job? Okay, so in my previous and current jobs, we do two types of automation: UI automation and API automation. In my previous company, we used two frameworks, one for UI and one for API, and in this company also we use two. For UI automation, we use Python with Selenium and pytest as our test framework to run and execute the tests. For the API, we do end-to-end testing. As our upstream APIs are not always responsive, we do mocking: we mock the response with what we are expecting, send a request to the mock server, get the response, and match it. If the response matches successfully, we say our test has passed. We added both automation test suites to our Jenkins server, so whenever new code is deployed to any environment, this test step runs automatically. If every test passes, the step passes and the code is deployed to the next environment. But if any test fails, that step in the Jenkins pipeline fails, we cannot move further, and we check the logs to see which test is failing, fix it, and redeploy. So, in this way, without human intervention, we are doing regression testing for all automation, UI and backend, and it really increases efficiency because no human effort is required for re-testing the scenarios we already tested.
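The mocking approach described above can be sketched in a few lines. In the real framework this would use pytest fixtures or `unittest.mock`; here the upstream call is injected as a parameter so the sketch runs deterministically with no network, and all names (`fetch_policy`, `check_violation`) are illustrative.

```python
# Mocking an unreliable upstream API so an end-to-end check runs
# deterministically, as described in the answer.

def fetch_policy(app_id):
    """Pretend this calls an unreliable upstream service over HTTP."""
    raise RuntimeError("upstream not reachable in tests")

def check_violation(app_id, fetch=fetch_policy):
    # The logic under test: interpret the upstream policy response.
    policy = fetch(app_id)
    return policy["status"] == "violation"

# Test doubles: canned upstream responses, matching the expected shape.
mocked_violation = lambda app_id: {"status": "violation"}
mocked_clean = lambda app_id: {"status": "clean"}

assert check_violation("saas-app-42", fetch=mocked_violation) is True
assert check_violation("saas-app-7", fetch=mocked_clean) is False
print("mocked end-to-end checks passed")
```

Wired into a Jenkins step, a suite of such tests gates each deployment: if any assertion fails, the pipeline stage fails and the build does not promote.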

How would you approach the task of building an automated test framework from scratch? Okay, so first, I will check what the automation framework is for: is it doing UI testing or API testing? Then I will check the documents, for example the documents of the APIs we need to automate for an API automation framework. What are the APIs we need to automate? Is it a REST API or a SOAP web service? What types of requests does it have? Does it have any authorization, authentication, or special headers? First, I will read the requirements of all the APIs. Once the API requirements are understood, I will discuss with my team which language is more suitable for this approach or for these APIs, and which tool is better to use. Once all of that is finalized, we make a design: where our interfaces will be saved, where we get the test data from, and how we send requests to the web server. Sometimes our upstream APIs do not respond properly, so we need to mock the responses; I will check whether we need mocking or not. If we do, we include it in our design, because API testing requires database testing as well. We need to match the actual response with the database: if it's a POST, we need to check whether new entries are added to the database tables; if it's a GET, we need to check whether the entries coming from the database table match the entries in the API response. Once the design is finalized, we start implementing. I will divide the framework tasks into smaller pieces for my team and me, and then we build the automation framework in parallel, integrate it, and start testing.
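The "POST, then verify the database" step described above can be sketched with an in-memory SQLite table standing in for the real database; the table, endpoint, and function names are made up for illustration.

```python
# API-vs-database validation sketch: after a POST, confirm the new
# entry actually landed in the table and matches the API response.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, item TEXT)")

def post_order(item):
    """Stand-in for calling POST /orders on the API under test."""
    cur = conn.execute("INSERT INTO orders (item) VALUES (?)", (item,))
    conn.commit()
    return {"id": cur.lastrowid, "item": item}  # the "API response"

# Step 1: send the POST and capture the response.
response = post_order("widget")

# Step 2: the DB-validation step from the answer - the row the API
# claims to have created must exist and match the response.
row = conn.execute(
    "SELECT id, item FROM orders WHERE id = ?", (response["id"],)
).fetchone()
assert row == (response["id"], "widget")
print("POST response matches database row")
```

A GET would be validated the other way around: query the table first, then assert the API response contains the same entries.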

Yes, it has happened to me. When the product plan is finalized, the SRS document is given to us by the client, and the client and the product manager agree on delivering certain functionalities. Let's say they have given us 7, but we talked with them and they agreed: okay, 5 will be delivered in the 1st phase, and the 2nd phase will have the other 2 and more add-ons. But at the last moment, they said no, we want all 7: you need to extend the timeline and increase the bandwidth; you can take holidays later, but you need to test all 7. So we wrote the test cases for that and started testing. Our developers were working in parallel on building the system, and we were testing in parallel with them. The timeline was very tight because the client changed the requirement at the end. It happens to us, but we adjusted and somehow delivered without bugs. Our client was very happy, and we got a lot of appreciation emails. We conveyed to our VP that we had delivered, but that we need to make sure we have an ample amount of time, because we don't want any product to be delivered with defects, and we don't want our product not working fine in production. We want ample time to make sure that all the functionality is well tested, well developed, and delivered to the client.

Have you ever introduced a tool or process that significantly improved the QA process at your company? So, actually, I have been actively managing my team. I check with all the QAs and developers to ensure they understand their tasks, what they're doing and what they need to do, as well as any problems they're facing in communication and collaboration. If we talk about processes, I established some processes related to work, to finishing tasks, to deadlines, and so on. That worked effectively in my team, and everyone's performance increased by 70-80% after establishing these processes. We also hold timely meetings to ensure that everybody's on the same page. If we talk about tools, we've built some frameworks for doing automation testing reliably, you could say without human intervention, running all the tests to increase reliability and speed. And there's one tool which I introduced in my company, which is TestRail. TestRail has a lot of APIs that you can integrate into your framework: you can send all the test cases to it and update the results, so that everyone can see how a build is doing and whether all the regression tests are passing. You can create charts and so on, which gives you a visual view of the testing. This increased performance and improved the QA process in our team after I introduced TestRail.
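The TestRail integration mentioned above posts results through TestRail's REST API (`add_result_for_case`). This sketch only builds the URL and payload; the instance URL, run id, and case id are placeholders, and nothing is sent over the network.

```python
# Build a TestRail add_result_for_case request, as a framework would
# after each automated test finishes. Base URL and ids are hypothetical.

TESTRAIL_BASE = "https://example.testrail.io"  # placeholder instance

STATUS = {"passed": 1, "failed": 5}            # TestRail status ids

def build_result_request(run_id, case_id, status, comment=""):
    url = (f"{TESTRAIL_BASE}/index.php?"
           f"/api/v2/add_result_for_case/{run_id}/{case_id}")
    payload = {"status_id": STATUS[status], "comment": comment}
    return url, payload

url, payload = build_result_request(
    run_id=12, case_id=345, status="passed",
    comment="Automated regression run",
)
print(payload["status_id"])  # 1
```

A real framework would POST this payload with basic-auth credentials after every test, which is what makes the per-build pass/fail charts in TestRail possible.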

Continuous integration and delivery pipeline, yes. All of our automation for existing functionality that is very stable runs through Jenkins. We have created Jenkins pipelines for our code to be moved through different environments. Let's say there are environments like dev and QA, then we have pre-prod and prod; every environment has a pipeline. In that pipeline, we have 2 or 3 steps that are related to QA: the first is UI automation, the second is API automation. In these steps, we give the Git repository address for both UI and API. For UI, we use Python and pytest, so we give the Git repository; the pipeline clones the code from there, from the master branch, and then runs a command line starting with pytest, to which you give the test suite name and so on. When execution starts in that pipeline, it runs all the priority-1 UI automation test cases to make sure that the UI is working fine. Then for API automation, we again use Python with some libraries to test the APIs. We have a plugin for pytest that gives us methods like post and get, and we have tailored those methods with our custom tokens and so on. Here also we give a Git repository and then a pytest command, similar to the UI step, and it runs all the API tests. When both test steps pass, it means both steps in the pipeline worked, and the code is deployed to the next environment. This helps in doing regression and making sure that the older functionality is not broken.