
MLOPS/ML Engineer - Altimetrik India
Sr. Python Developer - Nityo Infotech
Sr. Python Developer - Euclid Innovation
Data Scientist - Capgemini India
Sr. Python Developer - Bristlecone India
Sr. Python Developer - CalSoft India
Python Developer and Automation - Oliveborad Comptech
Jenkins
AWS
Bitbucket
ServiceNow
Jupyter Notebook
Kubernetes
SQL
Apache Airflow
Scikit-learn
Docker
Beautiful Soup
Nmap
Wireshark
JSON
requests
Nessus
Qualys
Yeah. So, starting with my career: I began working at IS. There, I worked on a lot of projects. I worked on DNS log analysis, where I needed to parse and process the logs and then load them into SQL Server. For the parsing and processing, I used Python with related libraries like Scrapy and Beautiful Soup to extract the data from the log files.

Then came a malware classification problem, where I needed to collect the code-section data of malware files using the Python pefile library, generate fingerprints using the n-grams method, and prepare the dataset. I then fed this data to a machine learning algorithm, random forest, and trained and validated the model.

Next, I worked on the Mercedes-Benz project, where I needed to load the data using Python and train and validate the model using PySpark. I needed to apply transformations and then normalize the data using Python libraries.

Then I worked at JPMorgan, where I needed to develop an API to validate subnet data and also automate the build process. I used Python with Flask and REST to build the API, and I also used microservices to deploy our product to the cloud.
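As a rough sketch of the malware-classification step described above, here is a minimal scikit-learn example; synthetic data stands in for the n-gram fingerprint features, so the shapes and parameters are assumptions, not the original pipeline:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for n-gram fingerprint features extracted from
# malware code sections (the real features came from pefile + n-grams).
X, y = make_classification(n_samples=1000, n_features=50, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# Train the random forest, then validate on the held-out split.
model = RandomForestClassifier(n_estimators=100, random_state=42)
model.fit(X_train, y_train)
print("validation accuracy:", accuracy_score(y_test, model.predict(X_test)))
```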
So, for developing a high-performance API, first we need to check that the code is optimized, and then we apply a RESTful API to connect the servers and generate sessions, which also supports further security analysis. Then we create abstract classes so that we can reuse them across our development. We also use Python design patterns to develop the application, like the singleton, working across the creational, structural, and behavioral pattern families as they apply to the API. And we deploy our services as microservices so that all the APIs run together, and we can use Swagger to document the APIs and check their performance.
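A minimal sketch of the abstract-class-plus-Flask approach described above; the service and endpoint names are hypothetical:

```python
from abc import ABC, abstractmethod

from flask import Flask, jsonify


class BaseService(ABC):
    """Abstract contract that every concrete service must implement."""

    @abstractmethod
    def fetch(self, key: str) -> dict:
        ...


class SubnetService(BaseService):
    def fetch(self, key: str) -> dict:
        # Placeholder lookup; a real service would query a data source.
        return {"subnet": key, "valid": True}


app = Flask(__name__)
service = SubnetService()


@app.route("/subnet/<key>")
def get_subnet(key):
    # RESTful endpoint delegating to the abstract-class-backed service.
    return jsonify(service.fetch(key))


if __name__ == "__main__":
    app.run()
```

Keeping the route thin and putting the logic behind an abstract base class makes it easy to swap implementations or add new resources without touching the endpoints.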
So, for a scalable application, we first need to check the resources available for the application to use. Then we need to build a pipeline that we push our code to; from there, the cloud picks up the updated code, runs it, and scales the application's behavior according to what we have written.
So, for real-time data processing, first we can use one of the many open-source frameworks, like the Spark architecture, for real-time data processing. Then, for a finance product, we need to connect to our financial data center with the proper credentials so that we can get the real-time data, process and transform it, and then use it. We can also view the real-time updated data using Spark streaming on the Spark architecture.
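A minimal Spark Structured Streaming sketch of that idea; the socket source, host/port, and the comma-separated `symbol,price` record format are assumptions for illustration:

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, split

spark = SparkSession.builder.appName("realtime-finance").getOrCreate()

# Read a live feed; a socket source stands in for the real financial feed.
raw = (spark.readStream
       .format("socket")
       .option("host", "localhost")
       .option("port", 9999)
       .load())

# Transform each "symbol,price" line into typed columns.
ticks = raw.select(
    split(col("value"), ",").getItem(0).alias("symbol"),
    split(col("value"), ",").getItem(1).cast("double").alias("price"),
)

# Continuously write the transformed stream (console sink for the sketch).
query = ticks.writeStream.outputMode("append").format("console").start()
query.awaitTermination()
```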
So, in AWS, I have used services like Glue DataBrew, going S3 to S3 with DataBrew, so that we can process our data using different Python commands and apply all the visualization and transformation methods on the data to improve our performance on the cloud. And to enhance the application further, we can use multiple cluster nodes so that performance improves.
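DataBrew itself is a managed service configured in the AWS console, but the same S3-to-S3 transform can be sketched in plain Python with boto3 and pandas; the bucket, key, and column names here are hypothetical:

```python
from io import BytesIO

import boto3
import pandas as pd

s3 = boto3.client("s3")

# Pull the raw CSV from the source bucket.
obj = s3.get_object(Bucket="raw-data-bucket", Key="input/data.csv")
df = pd.read_csv(BytesIO(obj["Body"].read()))

# Example transformation: normalize a numeric column.
df["amount_norm"] = (df["amount"] - df["amount"].mean()) / df["amount"].std()

# Write the processed result back to the destination bucket.
buf = BytesIO()
df.to_csv(buf, index=False)
s3.put_object(Bucket="clean-data-bucket", Key="output/data.csv",
              Body=buf.getvalue())
```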
So, for analysis driven by business needs, we first need to gather the information. Then we need to apply the logical model and store it in the physical database. Then we apply our analysis logic to the dataset and perform our operations in the application so that we get a good analysis result.
So this is basically a recursive method for calculating the sum of n numbers. But here we can see that the base case is n less than or equal to 2, and that's an issue: when n equals 2, the sum should be 3, but the function returns 1. It should not return 1; it should return 3. So that's one issue there. But, yes, this is a recursive method implementation.
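The snippet under discussion isn't reproduced here, but a reconstruction of the bug being described, next to the fix, might look like this (the function name is an assumption):

```python
def rsum(n):
    """Recursive sum 1 + 2 + ... + n, with the base-case bug described above."""
    if n <= 2:
        return 1          # bug: rsum(2) returns 1, but 1 + 2 should be 3
    return n + rsum(n - 1)


def rsum_fixed(n):
    if n <= 1:
        return n          # correct base case: rsum_fixed(1) == 1
    return n + rsum_fixed(n - 1)


print(rsum(2), rsum_fixed(2))  # 1 3
```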
So, basically, we use the singleton pattern so that a single object is shared across the entire process. For example, here we connect to the database once, and we can then use that connection anywhere inside our program to perform operations on the database.
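A minimal sketch of that database-connection singleton in Python; sqlite3 is used here just to keep the example self-contained:

```python
import sqlite3


class DatabaseConnection:
    """Singleton: the whole program shares one connection object."""

    _instance = None

    def __new__(cls, path="app.db"):
        if cls._instance is None:
            cls._instance = super().__new__(cls)
            cls._instance.conn = sqlite3.connect(path)  # opened only once
        return cls._instance


# Every construction returns the same instance (and the same connection).
a = DatabaseConnection()
b = DatabaseConnection()
assert a is b
```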
So, basically, I used a Python-based application in my recent project where I needed to validate subnets. I needed to write the API and then deploy it onto our microservice cloud platform. Then we connected different data sources through the API, built as a RESTful API with Flask in Python. When we validate a subnet, we check whether it is present in the data frame or not. If it is not present, we return a "not found" response code; otherwise, we return a response code indicating that the subnet already exists.
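A minimal Flask sketch of that subnet-validation endpoint; the in-memory set stands in for the real data source, and the names and status codes are illustrative assumptions:

```python
import ipaddress

from flask import Flask, jsonify

app = Flask(__name__)

# Hypothetical in-memory store standing in for the real data frame / source.
KNOWN_SUBNETS = {"10.0.0.0/24", "192.168.1.0/24"}


@app.route("/subnet/<path:cidr>")
def validate_subnet(cidr):
    try:
        subnet = str(ipaddress.ip_network(cidr, strict=False))
    except ValueError:
        return jsonify({"error": "malformed subnet"}), 400
    if subnet in KNOWN_SUBNETS:
        return jsonify({"subnet": subnet, "status": "already exists"}), 200
    return jsonify({"subnet": subnet, "status": "not found"}), 404


if __name__ == "__main__":
    app.run()
```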
So, basically, using Python with PostgreSQL or any other SQL database, we can connect to the database and use Python libraries like pandas or NumPy or other data-frame libraries to perform all these operations, like group-by aggregation and transformation methods. Then we do our data processing and push the results back to the database. Using these transformations and aggregations, we can analyze the database, for example, how much data was used previously and what we need to plan for future application usage.
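A minimal pandas-plus-PostgreSQL sketch of that round trip; the connection string, table, and column names are hypothetical:

```python
import pandas as pd
from sqlalchemy import create_engine

# Hypothetical connection string for the sketch.
engine = create_engine("postgresql://user:password@localhost:5432/analytics")

# Pull the raw data out of the database.
df = pd.read_sql("SELECT * FROM usage_events", engine)

# Group-by aggregation, as described above.
summary = (df.groupby("user_id")
             .agg(total_usage=("usage", "sum"), events=("usage", "count"))
             .reset_index())

# Push the processed result back to the database.
summary.to_sql("usage_summary", engine, if_exists="replace", index=False)
```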