Dedicated and results-driven DevOps professional with 3+ years of hands-on experience in optimizing
and streamlining IT operations. Proven ability to design and manage Azure environments and automate
tasks with Terraform and scripting. Expertise in containerization with Docker and Kubernetes. Seeking a
challenging DevOps Engineer role to leverage cloud and DevOps skills within a dynamic team.
Azure Analyst
Sopra Steria (UK client - TESCO)

Azure Analyst
Sopra Steria, Chennai (UK client - STW)

Azure Support Engineer
Sopra Steria, Hyderabad

Azure
Docker
Kubernetes
Terraform
Shell Scripting
Git
GitHub
Azure Monitor
Grafana
Prometheus
Linux
Windows
GitHub Actions
Nexus
Azure Storage
Hi, my name is Sisi Pawan. I'm a DevOps and cloud engineer with over three years of hands-on experience in optimizing IT operations. I have a strong background in designing and maintaining Azure environments, automating tasks with Terraform and PowerShell scripting, and containerization using Docker and Kubernetes. Coming to my professional experience, I currently work as an Azure Analyst at Sopra Steria, focusing mainly on UK clients such as Tesco. In this role I design and implement CI/CD pipeline workflows using GitHub Actions to automate application deployments, I manage Kubernetes clusters for high availability and scalability, and I have optimized container resource allocation, reducing infrastructure cost by 15%. To conclude, I'm passionate about leveraging my skills in a DevOps Engineer role and contributing them to a dynamic team. Thanks for your time.
We can use Python in many ways. There are many Python libraries we can import to work with Azure resources, and using those libraries we can automate the deployment of Azure resources while staying compliant with the Azure Policy constraints.
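A rough sketch of what that could look like, assuming the azure-identity and azure-mgmt-resource packages, a placeholder subscription ID, and a hypothetical tag policy the deployment has to satisfy:

# pip install azure-identity azure-mgmt-resource
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

SUBSCRIPTION_ID = "<subscription-id>"   # placeholder
credential = DefaultAzureCredential()
client = ResourceManagementClient(credential, SUBSCRIPTION_ID)

# Tags required by an assumed "require-tags" Azure Policy assignment;
# supplying them up front keeps the deployment compliant.
required_tags = {"environment": "dev", "owner": "devops-team"}

# Create (or update) a resource group with the tags the policy expects.
rg = client.resource_groups.create_or_update(
    "rg-automation-demo",
    {"location": "uksouth", "tags": required_tags},
)
print(f"Deployed {rg.name} in {rg.location} with tags {rg.tags}")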
So in Terraform, how will we manage secrets required for cloud resource deployments? To manage secrets for cloud resource deployments in Terraform, we use a key vault - Azure Key Vault, or HashiCorp Vault from the Terraform ecosystem itself. We store our secrets and any certificates in the vault, where they are kept in encrypted form, and Terraform fetches them from the vault at deploy time instead of keeping them in the configuration files.
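One way to wire this up from a deployment script - a minimal sketch assuming the azure-keyvault-secrets package, a placeholder vault URL, and a Terraform input variable named db_password:

# pip install azure-identity azure-keyvault-secrets
import os
import subprocess
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

VAULT_URL = "https://<your-vault-name>.vault.azure.net"   # placeholder
client = SecretClient(vault_url=VAULT_URL, credential=DefaultAzureCredential())

# Read the secret from Key Vault at deploy time; it never lives in the .tf files.
db_password = client.get_secret("db-password").value

# Terraform picks up TF_VAR_<name> environment variables as input variables,
# so the sensitive value reaches the run without being written to disk.
env = dict(os.environ, TF_VAR_db_password=db_password)
subprocess.run(["terraform", "apply", "-auto-approve"], env=env, check=True)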
Here we use Terraform workspaces. If we want a multi-stage deployment strategy, we create a workspace per environment and declare different variable files in Terraform, so that when we apply the configuration, resources are created in multiple environments. Using Terraform workspaces, we can also integrate these Terraform modules with Azure DevOps pipelines: directly in the workflow we select the workspace for each environment, so the same modules deploy into multiple environments.
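A small pipeline-style sketch of that flow, assuming per-environment variable files named dev.tfvars, test.tfvars and prod.tfvars:

import subprocess

def deploy(environment: str) -> None:
    """Select (or create) the Terraform workspace for an environment and apply it
    with that environment's variable file, e.g. dev.tfvars or prod.tfvars."""
    # Try to select the workspace; create it if it does not exist yet.
    result = subprocess.run(["terraform", "workspace", "select", environment])
    if result.returncode != 0:
        subprocess.run(["terraform", "workspace", "new", environment], check=True)
    subprocess.run(
        ["terraform", "apply", f"-var-file={environment}.tfvars", "-auto-approve"],
        check=True,
    )

for env in ["dev", "test", "prod"]:
    deploy(env)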
Idempotency. We have to make sure our automation script, whether PowerShell or Python, is optimized and completes in a predictable time, and that running it again gives the same end state instead of duplicating work. We should build the script from small chunks where each step checks the current state before making a change; that way, even when the script is triggered repeatedly, for example from a cron job, it stays idempotent.
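A minimal idempotent step in Python, assuming the azure-mgmt-resource package and a placeholder resource group - the check-then-act pattern is what keeps repeated runs harmless:

# pip install azure-identity azure-mgmt-resource
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

SUBSCRIPTION_ID = "<subscription-id>"   # placeholder
client = ResourceManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

def ensure_resource_group(name: str, location: str) -> None:
    """Idempotent step: check the current state first, act only if needed,
    so re-running the script (e.g. from a cron job) gives the same result."""
    if client.resource_groups.check_existence(name):
        print(f"{name} already exists - nothing to do")
        return
    client.resource_groups.create_or_update(name, {"location": location})
    print(f"Created {name}")

ensure_resource_group("rg-automation-demo", "uksouth")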
Initially, we can load the policy's JSON file into Python and modify it there. To change an Azure Policy configuration, there is also an option directly in the Azure portal: in the Automation blade, the JSON definition for every resource in the solution is available, so we can export that JSON, modify it with Python using the built-in json module, and pass the updated definition back.
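A short sketch of the modify step, assuming a policy definition exported to require-tags-policy.json (the file name and the parameter path are illustrative):

import json

# Assumed file exported from the portal or CLI; names are illustrative.
with open("require-tags-policy.json") as f:
    policy = json.load(f)

# Tighten the policy: change the effect from "audit" to "deny".
policy["properties"]["policyRule"]["then"]["effect"] = "deny"

with open("require-tags-policy.json", "w") as f:
    json.dump(policy, f, indent=2)

# The updated file can then be pushed back, e.g. with the Azure CLI:
#   az policy definition update --name require-tags --rules require-tags-policy.json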
Here, in the for loop - for vm in vms_to_start - the problem is a logical error in that line's body. Inside the loop they pass only one parameter, the VM to start, and they haven't used the compute client. For each VM, the start operation has to go through the compute client, so the loop body needs the compute client as well; only then will each VM be started separately.
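What the corrected loop might look like, assuming the azure-mgmt-compute SDK and hypothetical resource group and VM names:

# pip install azure-identity azure-mgmt-compute
from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

SUBSCRIPTION_ID = "<subscription-id>"     # placeholder
RESOURCE_GROUP = "rg-automation-demo"     # placeholder

compute_client = ComputeManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)
vms_to_start = ["vm-web-01", "vm-web-02"]  # hypothetical VM names

# Each VM is started through the compute client, not just by its bare name.
for vm_name in vms_to_start:
    poller = compute_client.virtual_machines.begin_start(RESOURCE_GROUP, vm_name)
    poller.result()  # wait for the start operation to finish
    print(f"Started {vm_name}")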
Here, this Azure policy assignment will definitely fail, because the HCL syntax is declared incorrectly, so we can't apply it. To resolve this we need to fix the syntax in the azurerm_policy_assignment block: the arguments need double quotes where only a single quote is provided, and the malformed syntax on each of the other lines needs to be corrected in the same way.
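As a guard against this kind of failure, one option is to run Terraform's own syntax checks before the apply stage - a small wrapper sketch:

import subprocess

# Catch HCL mistakes (missing quotes, malformed blocks) before any apply:
# 'terraform fmt -check' flags badly formatted or unparsable files, and
# 'terraform validate' checks the configuration is syntactically valid.
for cmd in (["terraform", "fmt", "-check", "-recursive"],
            ["terraform", "validate"]):
    result = subprocess.run(cmd)
    if result.returncode != 0:
        raise SystemExit(f"{' '.join(cmd)} failed - fix the HCL before applying")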
While we are creating Azure resources using Terraform modules, to bring in the reliability and security pillars of the Azure Well-Architected Framework we need to use RBAC and other security policies. We should assign only limited roles to users, so they have only limited access, and configure role-based access control for each resource created with the Terraform modules. For reliability, we have to ensure each resource is backed up, so the workload can be recovered. For security, we can also apply Azure resource locks: if we lock particular resources while deploying them with the Terraform modules, no one will have access to delete them.
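To sanity-check the RBAC side after the modules run, one option is to list the role assignments at the resource-group scope and review that only the limited roles we intended are present - a sketch assuming the azure-mgmt-authorization package and placeholder names:

# pip install azure-identity azure-mgmt-authorization
from azure.identity import DefaultAzureCredential
from azure.mgmt.authorization import AuthorizationManagementClient

SUBSCRIPTION_ID = "<subscription-id>"   # placeholder
RESOURCE_GROUP = "rg-automation-demo"   # placeholder
scope = f"/subscriptions/{SUBSCRIPTION_ID}/resourceGroups/{RESOURCE_GROUP}"

auth_client = AuthorizationManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# List every role assignment at the resource-group scope so we can review
# that only the limited roles we intended were actually granted.
for assignment in auth_client.role_assignments.list_for_scope(scope):
    print(assignment.principal_id, assignment.role_definition_id)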
We can use a multi-cloud environment with AWS and Azure in Terraform, and we can organise and optimise it by declaring the configuration with Terraform workspaces and Terraform modules as well.
First, we have to use encryption to secure sensitive information used in Terraform scripts. If any secrets are there, we can encode them into Base64 first and decode them later where they are needed. We also have to store the sensitive information, like login details or tokens, as secrets: they should be kept in a key vault, so we can fetch the values from the key vault later instead of hard-coding them in the scripts.
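A small illustration of both pieces mentioned here, assuming the azure-keyvault-secrets package, a placeholder vault URL, and an illustrative secret name - keeping in mind that Base64 is a reversible encoding rather than encryption, so the key vault is what actually protects the value:

# pip install azure-identity azure-keyvault-secrets
import base64
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

# Base64 is reversible encoding, useful for transport, but it is not encryption.
token = "example-api-token"                        # illustrative value only
encoded = base64.b64encode(token.encode()).decode()
decoded = base64.b64decode(encoded).decode()
assert decoded == token

# The real protection comes from storing the value in Key Vault, where it is
# encrypted at rest, and fetching it at deploy time instead of hard-coding it.
vault = SecretClient(
    vault_url="https://<your-vault-name>.vault.azure.net",  # placeholder
    credential=DefaultAzureCredential(),
)
vault.set_secret("deployment-token", token)
fetched = vault.get_secret("deployment-token").value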