I have around 11+ years of experience in the field of infrastructure management and administration. Of that, I have around 7+ years of experience with AWS Cloud service implementation, management, and administration.
I have worked across different kinds of domains, including government organizations, fintech firms, and healthcare firms, which gave me a wide range of exposure to tools and services. I also have hands-on experience with DevOps tools and technologies, and I have individually implemented projects in Azure DevOps, including creating and managing CI/CD pipelines and product releases. I also have some experience with migrating on-premises data and VMs to the AWS Cloud.
AWS Administrator
Quest InfoSoft Private Limited

System Operation Engineer (AWS)
The ADSOM Global Tech Private Limited

System Administrator (AWS)
The Beast Apps

Technical Assistant (IT-Computer)
Gujarat Council of Science City

Technical Assistant (IT) at NRS Hall
Gujarat University

Network Administrator at NRS Hall
Gujarat University
AWS Cloud
AWS (Amazon Web Services)
AWS CloudFormation

AWS CloudWatch

AWS Lambda

AWS Certificate Manager

AWS CloudTrail

AWS Secrets Manager

AWS Identity and Access Management

Windows Server

CentOS

Ubuntu

Active Directory

IIS

Apache

Nginx

Python

PowerShell

Bash

Microsoft Teams

Microsoft Office 365

Azure Active Directory
Azure DevOps Server

Git
Jenkins

Redmine

Microsoft Windows

Microsoft Azure

Amazon CloudFront

Amazon EC2

Amazon RDS

Amazon S3

Amazon SQS

Amazon CloudWatch

Amazon WorkSpaces

Amazon Route 53

Amazon SES

Amazon VPC

Amazon SNS

Linux Admin

phpMyAdmin

Apache HTTP Server

SQL Server Management Studio 2022
Hi, my name is Satyam Gajjar. I have completed my MSc IT, and I have around 11 years of experience in IT infrastructure management, including around 7 years of experience in AWS cloud implementation, administration, and management. I have worked with a government organization as well as a fintech firm, and my last company was in the healthcare domain, so I have a wide variety of experience across different domains, which helped me learn the tools and technologies relating to each of them.

My last job was with Quest InfoSoft Pvt Ltd, where I worked as an AWS administrator, managing both the AWS infrastructure and the local infrastructure. They were working for Bora Wound Care Pvt Ltd, a US-based firm, for whom we were developing a software application, a mobile application, and a website. All the infrastructure was hosted in the AWS cloud, and they were also using Azure for some parts. My daily duties included managing that infrastructure: when a change was needed, the infrastructure administrator would assign me a task, and I would provide the resolution accordingly. I also managed deployments and implemented CI/CD pipelines for some of the projects, including a web application and a Windows application. I handled release management as well: the team would schedule a meeting and prepare a document, and I would perform the release according to that document. I also managed onboarding and offboarding when people joined or left the organization, and I supported offices located remotely in Bangalore, Delhi, Trichy, and Ahmedabad.

I also managed a three-person team consisting of two junior system administrators and one junior AWS administrator. That's a brief introduction to give you an idea about me. That's all I have, and thank you so much.
Okay. For a multi-account AWS environment, we have an organization account, and under it we manage the different AWS accounts. AWS Security Hub checks all the accounts for vulnerabilities, exploits, malware, or data breaches, gathers all the details in the hub, and gives us alerts and notifications there. With the use of Lambda, if a finding involves PII or PHI information, we can automatically secure that data through Lambda code. So in the multi-tenant environment, Security Hub is scanning all the accounts for vulnerabilities, aggregating the findings, and notifying us, for example by email, about each one. When such an event is generated, then according to the event ID or event description we can trigger a Lambda function, which runs a basic script, performs the particular remediation operation, and mitigates that risk.
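A minimal sketch of the Lambda side of that flow, assuming Security Hub findings reach Lambda through an EventBridge rule. The event field names follow the AWS Security Finding Format; the remediation itself is left as a placeholder:

```python
# Sketch of a Security Hub auto-remediation Lambda handler.
# Assumes an EventBridge rule forwards "Security Hub Findings - Imported"
# events; the actual remediation call is a placeholder comment.
def handle_finding(event):
    remediated = []
    for finding in event.get("detail", {}).get("findings", []):
        severity = finding.get("Severity", {}).get("Label", "INFORMATIONAL")
        if severity not in ("HIGH", "CRITICAL"):
            continue  # only auto-remediate the serious findings
        for resource in finding.get("Resources", []):
            # Placeholder: e.g. tighten a security group or quarantine a bucket
            remediated.append(resource.get("Id"))
    return {"remediated": remediated}
```

In a real deployment the placeholder would call boto3 against the affected resource, and the function would be subscribed to the EventBridge rule as its target.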
Okay. Basically, Lambda is a serverless service: we just put our code inside Lambda, and Lambda executes that code and produces a result. But for a stateful application, we need to use another service, for example DynamoDB. Let's say Lambda is processing a file, and after processing, Lambda generates some kind of data or result; that result must be stored inside DynamoDB, in JSON or whatever format you require. Then, when we get the notification that Lambda has finished the task, Lambda has already uploaded the result to DynamoDB, so our application contacts DynamoDB, fetches the result, and displays it accordingly. So state persistence is achieved with the use of a data store such as RDS or DynamoDB.
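The persistence step can be sketched as follows. The `job_id` key schema and the injected `table` object are assumptions for illustration; in a real Lambda, `table` would be `boto3.resource("dynamodb").Table("results")`:

```python
import json

def store_result(table, job_id, result):
    """Persist a processing result so state survives across stateless
    Lambda invocations. `table` is a DynamoDB Table-like object,
    injected so the logic can be exercised without live AWS access.
    The result is serialized to JSON, as described above."""
    item = {"job_id": job_id, "result": json.dumps(result)}
    table.put_item(Item=item)
    return item
```

The reading side is symmetric: the application calls `table.get_item(Key={"job_id": ...})` and deserializes the `result` attribute.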
Regarding this, we just need to write a Lambda function that receives the ingested data and parses it according to our requirement; when the Lambda function finishes that processing, it returns the data through API Gateway. So we can use AWS API Gateway for this. When data ingestion arrives, Lambda triggers the parsing process, and when Lambda finishes, the parsed data moves out through API Gateway. API Gateway is basically a serverless service, so it easily scales up and down according to the request volume, which lets us handle the higher load accordingly.
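A sketch of such an ingestion handler, using the API Gateway Lambda proxy response shape (`statusCode`/`headers`/`body`). The CSV-style parsing here is just a stand-in for whatever parsing the actual requirement calls for:

```python
import json

def lambda_handler(event, context=None):
    """Ingestion handler behind API Gateway (Lambda proxy integration).
    Parses a CSV-style request body into records and returns them as
    JSON; the parsing logic is a placeholder for the real requirement."""
    body = event.get("body") or ""
    records = [line.split(",") for line in body.splitlines() if line.strip()]
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"records": records}),
    }
```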
The main purpose of using CloudFormation is to automate infrastructure tasks. For that, we need to create a CloudFormation template, and that template has each and every step inside; it's a file. That file defines everything: what kind of VPC, what subnets we require, which security groups, any IAM roles. Basically, we use the JSON form of the CloudFormation template. Inside the template, in JSON format, we identify which IAM roles we need to create and which application or service each role is assigned to, so that after the infrastructure is provisioned, the components can easily communicate with each other.
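An illustrative (not production) JSON template along those lines; the resource names such as `AppVpc` and the CIDR ranges are made up for the example:

```json
{
  "AWSTemplateFormatVersion": "2010-09-09",
  "Description": "Illustrative template: a VPC, a subnet, and a security group",
  "Resources": {
    "AppVpc": {
      "Type": "AWS::EC2::VPC",
      "Properties": { "CidrBlock": "10.0.0.0/16" }
    },
    "AppSubnet": {
      "Type": "AWS::EC2::Subnet",
      "Properties": {
        "VpcId": { "Ref": "AppVpc" },
        "CidrBlock": "10.0.1.0/24"
      }
    },
    "AppSecurityGroup": {
      "Type": "AWS::EC2::SecurityGroup",
      "Properties": {
        "GroupDescription": "Allow inbound HTTPS only",
        "VpcId": { "Ref": "AppVpc" },
        "SecurityGroupIngress": [
          { "IpProtocol": "tcp", "FromPort": 443, "ToPort": 443, "CidrIp": "0.0.0.0/0" }
        ]
      }
    }
  }
}
```

Deploying the file with `aws cloudformation deploy` (or the console) creates the whole set of resources as one stack, with the `Ref` links wiring them together.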
Okay, so if we want to manage secrets inside the AWS environment, basically for Kubernetes, we can use the AWS Secrets Manager service. Inside Secrets Manager, we create a secret, such as a username and password, and Secrets Manager also provides a facility to rotate the password after a particular time, like one week or one month, as per the organization's policy. The Kubernetes cluster, or rather its pods and containers, communicate with AWS Secrets Manager and fetch that username and password, or any secret information we store there, such as an API token, and with that they communicate with the database or other services. In the CI/CD pipeline flow, we can also use AWS Secrets Manager to automatically load the username and password secrets.
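The fetch step can be sketched like this. The secret is assumed to be stored as a JSON string with `username`/`password` keys, and the client is injected so the logic can be shown without live AWS access; normally it would be `boto3.client("secretsmanager")`:

```python
import json

def fetch_db_credentials(secrets_client, secret_id):
    """Fetch a database secret, e.g. from a Kubernetes pod or a CI/CD
    step. `secrets_client` stands in for boto3.client("secretsmanager");
    the secret is assumed to be a JSON document with "username" and
    "password" keys."""
    resp = secrets_client.get_secret_value(SecretId=secret_id)
    return json.loads(resp["SecretString"])
```

Because the pod fetches the credentials at runtime, a rotation in Secrets Manager is picked up on the next fetch without redeploying the workload.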
Oh, the here here the here the, uh, problem, uh, with the script is, like, uh, the last statement, hyphen hyphen policy hyphen document. There we have mentioned the file colon colon slash, uh, uh, my hyphen policy dot JSON. Uh, there we need to provide the, uh, like, a exit path or directly a filename if we are inside the directory or or folder. So we need to change that. Uh, we need to provide the absolute or relative path, uh, inside the policy document field. So it will, like, uh, bypass or, like, it will run correctly.
Okay, so for data security at rest and in transit inside S3. First, when data is at rest in S3, we can use S3 server-side encryption, with S3-managed keys or a customer managed key, which directly encrypts the data inside the S3 bucket using that key. For data in transit, if we are using a Python script, then before sending the results or storing the files in S3, we need to use an encryption mechanism inside the Python script. That will be very helpful: it encrypts the data and then stores it inside the S3 bucket.
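A sketch of the at-rest side using S3 server-side encryption parameters on `put_object`; in-transit protection additionally comes from boto3 talking to S3 over HTTPS by default. The client is injected for illustration, standing in for `boto3.client("s3")`; bucket and key names are placeholders:

```python
def upload_encrypted(s3_client, bucket, key, data, kms_key_id=None):
    """Upload an object with server-side encryption. With no kms_key_id,
    S3-managed keys are used (SSE-S3, AES256); with one, SSE-KMS with a
    customer managed key is used. `s3_client` stands in for
    boto3.client("s3")."""
    params = {"Bucket": bucket, "Key": key, "Body": data}
    if kms_key_id:
        params["ServerSideEncryption"] = "aws:kms"
        params["SSEKMSKeyId"] = kms_key_id
    else:
        params["ServerSideEncryption"] = "AES256"
    return s3_client.put_object(**params)
```

A default-encryption setting on the bucket achieves the same end without per-request parameters; the explicit form is shown here to match the script-level approach described above.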
Okay. Sorry, I don't have an answer for this one. I don't know about this and haven't tried it, so I'm skipping it.
Okay, regarding container security best practices when orchestrating deployments with Kubernetes on AWS: we only need to expose the ports we are actually using. Inside AWS, the underlying hardware for the Kubernetes cluster is managed by AWS, and AWS takes care of that layer of security because of the shared responsibility model. So when we create a cluster or any container service inside AWS, we just need to focus on the security and port exposure of our application itself.
Okay, let's take an example. When a user uploads a file into an S3 bucket, a Python script is triggered, and it tries to process that file. Because the file was encrypted with a customer managed key, the Python script obtains the decryption key from AWS KMS and uses it when it fetches the object from S3. After that, it processes the object and stores the data inside DynamoDB or RDS. That's how KMS is used inside Python.