Project Lead
Mphasis Ltd
Technical Architect
Hexaware Technologies Ltd
Toad
SQL Developer
WinSCP
SQL Server Management Studio
ServiceNow
Control M
SonarQube
Hi, this is Karthik Hilain. I have 8 years of experience in data warehousing and SQL development, and I have experience in data migration as well. I have 3 years of experience in .NET technologies and 2 years in VB 6.0, where I worked on application software creation and maintenance. Throughout all of this I have good exposure to complex SQL query creation and analysis, along with experience in data warehousing, data migration, and data validation, and good exposure to web technologies too. My current project is a data migration project, the SIPI data migration project, where we migrate data from Sybase into Postgres using the Pentaho tool. We are creating the ETL program that loads the data from source into target; this is what we are working on right now. In addition, I have experience in database deployment; we deploy the database using slowcx. So I have experience in database creation, database design, database development, and database deployment. I also have exposure to complex SQL scripts, complex SSRS reports, and database integrations, and I have experience in data monitoring too.
For database capacity planning, the first thing is to design the schema so that it can accommodate a large volume of data. Likewise, we have to design the database according to the expected data loads: based on the data input load, we design the database.
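As an illustration of that idea, here is a minimal T-SQL sketch of a table partitioned by month so the design can absorb a large, growing input load. The table, function, and scheme names are assumptions for the example, not from the project.

    -- Hypothetical monthly partitioning for a large fact table.
    CREATE PARTITION FUNCTION pfMonthly (date)
        AS RANGE RIGHT FOR VALUES ('2024-01-01', '2024-02-01', '2024-03-01');

    CREATE PARTITION SCHEME psMonthly
        AS PARTITION pfMonthly ALL TO ([PRIMARY]);

    CREATE TABLE dbo.SalesFact
    (
        SaleId   bigint        NOT NULL,
        SaleDate date          NOT NULL,   -- partitioning column
        Amount   decimal(18,2) NOT NULL,
        CONSTRAINT PK_SalesFact PRIMARY KEY (SaleId, SaleDate)
    ) ON psMonthly (SaleDate);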
For a stored procedure that needs to handle simultaneous transactions in a thread-safe manner, what we do is create one parent stored procedure, and inside the parent stored procedure we call multiple sub-procedures. When the parent procedure is called, it executes the stored procedures within it as a single unit. This is how we structure the stored procedure to execute simultaneous transactions in a thread-safe manner.
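A hedged sketch of that pattern follows: a parent procedure that wraps its sub-procedures in one transaction and uses sp_getapplock so that concurrent callers are serialized. All procedure names here are hypothetical.

    CREATE PROCEDURE dbo.usp_ParentProcess
    AS
    BEGIN
        SET NOCOUNT ON;
        BEGIN TRANSACTION;
        BEGIN TRY
            -- Acquire an application lock; simultaneous callers wait here,
            -- which keeps the whole batch thread-safe.
            EXEC sp_getapplock @Resource = 'ParentProcess',
                               @LockMode = 'Exclusive',
                               @LockOwner = 'Transaction';

            EXEC dbo.usp_SubProcess1;   -- first sub-procedure
            EXEC dbo.usp_SubProcess2;   -- second sub-procedure

            COMMIT TRANSACTION;         -- releases the app lock as well
        END TRY
        BEGIN CATCH
            IF @@TRANCOUNT > 0 ROLLBACK TRANSACTION;
            THROW;                      -- surface the original error
        END CATCH
    END;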
Optimizing database performance, so what kind of things do we look into first? The WHERE clause: the filter columns should always have an index; if a filter column doesn't have an index, the query will take a long time to execute, so we analyze that first. Next, if we are using complex stored procedures or complex queries, we check which columns are actually necessary; we don't select all the columns, only the required ones. If we are using a DISTINCT statement, that also impacts performance, so we can use a UNION statement instead of DISTINCT. If any sub-queries are written inside the query, that also impacts performance, so we replace the sub-query. And if heavy-transaction tables are used in the query, that has a performance impact too and we have to address it. These are the kinds of performance checks we use to resolve such issues.
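For illustration, here is a before-and-after query along those lines. The table and column names (dbo.Orders and its columns) are assumptions for the example.

    -- Before: SELECT *, DISTINCT, and an unindexed filter column.
    SELECT DISTINCT *
    FROM   dbo.Orders
    WHERE  OrderDate >= '2024-01-01';

    -- After: support the filter column with an index and project only
    -- the required columns.
    CREATE INDEX IX_Orders_OrderDate
        ON dbo.Orders (OrderDate)
        INCLUDE (OrderId, CustomerId, Amount);

    SELECT OrderId, CustomerId, Amount
    FROM   dbo.Orders
    WHERE  OrderDate >= '2024-01-01';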
For complex queries that need to join several tables and functions, we first analyze which tables and which functions are involved. If we are using multiple tables, we have to analyze the data load, meaning the size of each table, before we join them, so table size is something we have to concentrate on. As for functions, a function returns a value, and calling a function in the WHERE clause has an impact on performance, so instead we add the function into the join clause in order to improve performance. And if a table we need to join is very large, we can split the table and use a UNION ALL statement instead, to improve performance, as in the sketch below.
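A sketch of both points under assumed names: joining against an inline table-valued function instead of calling a function per row, and recombining a split table with UNION ALL.

    -- Hypothetical inline table-valued function, so it can be joined
    -- set-based rather than evaluated row by row in a filter.
    CREATE FUNCTION dbo.fn_CustomerTotals (@CustomerId int)
    RETURNS TABLE
    AS RETURN
    (
        SELECT SUM(Amount) AS TotalAmount
        FROM   dbo.OrderDetail
        WHERE  CustomerId = @CustomerId
    );

    SELECT c.CustomerId, t.TotalAmount
    FROM   dbo.Customer AS c
    CROSS APPLY dbo.fn_CustomerTotals(c.CustomerId) AS t;

    -- A very large table split into yearly tables, recombined with UNION ALL:
    SELECT OrderId, Amount FROM dbo.Orders_2023
    UNION ALL
    SELECT OrderId, Amount FROM dbo.Orders_2024;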
And now I'm setting this up.
In this while loop we are deleting records from the table. While deleting the records, we have to update the row count: the count of records that were deleted has to be written back into the variable that drives the loop, because the while loop's condition checks that count. Once we delete the records, the count must change as well; otherwise we can never break out of the loop. So we have to update the count inside the loop, and only then can we end the process. Also, the statement that closes the while loop is missing here and has to be added.
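A minimal reconstruction of that exercise in T-SQL, with the two fixes applied; the table and column names are assumed for the example.

    DECLARE @cnt int =
        (SELECT COUNT(*) FROM dbo.StagingTable WHERE IsExpired = 1);

    WHILE @cnt > 0
    BEGIN
        DELETE TOP (1000) FROM dbo.StagingTable WHERE IsExpired = 1;

        -- Update the row count after each delete; without this the
        -- condition never changes and the loop never exits.
        SET @cnt =
            (SELECT COUNT(*) FROM dbo.StagingTable WHERE IsExpired = 1);
    END;   -- the END closing the loop was the missing statement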
Database recovery strategy: we use the data replication concept for database recovery. In a critical application, if we need database recovery, we can recover from the replicated database; so database replication is one strategy. Automatic database backup is the second strategy, where we back up the database regularly in order to keep critical applications safe. So we use two strategies: one is data replication and the second is automatic database backup.
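As a sketch of the second strategy, here is a regular full backup in T-SQL; the database name and file path are placeholders, not from the project.

    -- Scheduled full backup of a critical application's database.
    BACKUP DATABASE CriticalAppDb
    TO DISK = N'D:\Backups\CriticalAppDb_Full.bak'
    WITH COMPRESSION, CHECKSUM, INIT;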
SQL code documentation: we create a mapping sheet where we list the tables we are using for the database design. For the database design script, that is the DDL (Data Definition Language) script, we keep it in a Word document and check that document into either TFS or an SVN server. So we create the database code and check the file into TFS; that is one way of documenting the DDL script. The second way is to export the database objects from the database and store each one as a .sql file, then check the .sql files into the TFS or SVN server. That is how we keep our SQL code: the database design document is prepared in Word or Excel and stored in the TFS server, and for the database code we create .sql files and store them in the TFS server.
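For illustration, one such exported .sql file might look like this; the header fields and object are hypothetical, shown only to make the convention concrete.

    ------------------------------------------------------------------
    -- Object : dbo.Customer (DDL)
    -- Author : <name>
    -- Source : checked into TFS/SVN alongside the mapping sheet
    ------------------------------------------------------------------
    CREATE TABLE dbo.Customer
    (
        CustomerId int          NOT NULL PRIMARY KEY,
        FullName   varchar(100) NOT NULL
    );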
SQL Server Analysis Services: SSAS is used to analyze the data in a database. We create cubes for this, keeping the daily, weekly, and monthly data in the cubes, and we use SSAS to analyze the data. Analysis services means data analysis: once the data is analyzed, we display it on a dashboard, and we send that report to the management team for decisions.
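SSAS cubes themselves are defined in Visual Studio/XMLA rather than T-SQL, so as a stand-in here is a T-SQL sketch of the daily/weekly/monthly rollup such a cube would serve; the table and columns are assumptions carried over from the earlier example.

    -- Yearly / monthly / weekly aggregates of the kind a cube exposes.
    SELECT DATEPART(year,  SaleDate) AS SaleYear,
           DATEPART(month, SaleDate) AS SaleMonth,
           DATEPART(week,  SaleDate) AS SaleWeek,
           SUM(Amount)               AS TotalAmount
    FROM   dbo.SalesFact
    GROUP BY ROLLUP (DATEPART(year,  SaleDate),
                     DATEPART(month, SaleDate),
                     DATEPART(week,  SaleDate));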