Vetted Talent

Venkata Chaitanya Jonnalagadda

Seeking challenging and progressive work with a professional organization where I can utilize my potential to the fullest, polish my interpersonal skills, and develop my strengths in line with the organization's goals and objectives.
  • Role

    Tech Lead

  • Years of Experience

    8.4 years

Skillsets

  • SSIS - 4 Years
  • SQL Server
  • Oracle PL/SQL
  • Test Management
  • Informatica
  • SAP Callidus
  • Toad Developer

Vetted For

4 Skills
  • Senior SQL Developer (AI Screening)
  • Skills assessed: T-SQL, Data Analysis, Problem Solving Attitude, SQL
  • Score: 59/100

Professional Summary

8.4 Years
  • Senior Software Engineer

    Legato Health Technologies
    Dec 2019 - Feb 2024 (4 yr 2 months)
  • Tech Lead

    Carelon Global Solutions
  • Business Technology Analyst

    Deloitte Consulting US India
  • Associate Software Engineer

    Tech Mahindra

Applications & Tools Known

  • T-SQL
  • SQL Server Integration Services
  • Informatica
  • Oracle DB
  • Control-M
  • Jira
  • SAP Crystal Reports
  • PL/SQL

Work History

8.4 Years

Senior Software Engineer

Legato Health Technologies
    Analyze code defects for compensation and commission issues for broker agents; create and update workflows in Informatica; perform Data Change Requests (DCRs).

Tech Lead

Carelon Global Solutions
Dec 2019 - Feb 2024 (4 yr 2 months)

    1. Understand and analyze the business requirements, which come in the form of PORs/SSCRs (Plan of Records/Small System Change Requests).

    2. Analyze and fix code defects in compensation and commission calculations for broker agents.

    3. Develop ETL objects in Informatica and SSIS to move data from the source Anthem preprocessor system to the SAP Callidus Cloud system.

    4. Perform Data Change Requests (DCRs) to update commission details for specific agents per ad-hoc client requests.

    5. Mentor the team on technical tasks, conduct deployment meetings, and review the tech approach and unit test documents for defects and user stories implemented during release deployments.

Business Technology Analyst

Deloitte Consulting US India
    Part of the 'Helix Data Migration Team'. Analyzed business requirements and migrated PHI from Eurocare to Mednext; created and deployed SSIS packages for data migration; fixed data issues in end-user profiles using PL/SQL.

Associate Software Engineer

Tech Mahindra
    Understand and analyze the business logic and data flow; fix data issues in SSIS by analyzing the SQL code and resolving bugs and code issues; understand the existing logic and develop ETL jobs, stored procedures, and functions.

Achievements

  • Experience in ETL tools
  • Certifications in Oracle Cloud Infrastructure and Healthcare Management
  • Experience in Test management tools

Major Projects

4 Projects

Enterprise Sales Compensation System (ESCS)

    Worked for Anthem, Inc. Analyzed business requirements; analyzed and fixed code defects; created and updated workflows in Informatica; performed Data Change Requests (DCRs).

Elevance Health

Carelon Global Solutions
Dec 2019 - Feb 2024 (4 yr 2 months)

    1. Understand and analyze the business requirements, which come in the form of PORs/SSCRs (Plan of Records/Small System Change Requests).

    2. Analyze and fix code defects in compensation and commission calculations for broker agents.

    3. Develop ETL objects in Informatica and SSIS to move data from the source Anthem preprocessor system to the SAP Callidus Cloud system.

    4. Perform Data Change Requests (DCRs) to update commission details for specific agents per ad-hoc client requests.

    5. Mentor the team on technical tasks, conduct deployment meetings, and review the tech approach and unit test documents for defects and user stories implemented during release deployments.

Anthem Inc

Deloitte Consulting
Apr 2017 - Dec 2019 (2 yr 8 months)

    1. Fix data issues in end-user profiles by analyzing the PL/SQL code.

    2. Perform change requests whenever a new job or major changes to job scripts are required.

    3. Provided KTs/walkthroughs on the WEM/E3 system to Legato Health Technologies (rebranded as Carelon Global Solutions).

Applied Materials

Tech Mahindra
Nov 2015 - Dec 2016 (1 yr 1 month)

    1. Understand and analyze the business logic and data flow.

    2. Fix data issues in SSIS by analyzing the SQL code and resolving bugs and code issues.

    3. Understand the existing logic and develop ETLs, stored procedures, and functions.

Education

  • Bachelor of Technology in Information Technology

    Raghu Institute of Technology (2015)
  • Intermediate M.P.C

    NRI Academy (2011)

Certifications

  • AHM

    AHIP (Oct, 2020)
  • Oracle Cloud Infrastructure

    Oracle (Dec, 2021)
    Credential ID : 1Z0-1085-21
  • SQL

    TestDome (Sep, 2021)

  • Certified Professional, Academy for Health Care Management (PAHM) in Healthcare Management: An Introduction (AHM250)

  • Oracle Cloud Infrastructure Foundations 2021 Certified Associate

Interests

  • Internet Surfing
  • Technology Research
  • Watching Movies
  • Exercise
  • Exploring Places
  • Gymming

AI-Interview Questions & Answers

    Hi, my name is [inaudible]. I work with SQL, Lambda, Python, and other related technologies.

    I utilized T-SQL for complex SQL queries by first trying to understand the ask, then forming a solution in my mind and implementing it with T-SQL. That could mean incorporating functions or different procedures while solving complex problems.

    To optimize a SQL query that is running too long, I would first look at where exactly the query is spending its cost. I would run the stats to see where the cost is highest, then gather stats and fine-tune the query by removing redundant checks, filtering only the records that are needed instead of retrieving everything with SELECT *, and pulling only the required fields. I would also use joins instead of IN where appropriate, because IN can take a lot of time, and I focus wherever the cost, or the time taken, is highest.
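    That tuning loop can be sketched at a small scale with Python's built-in sqlite3 module. The agents table, its columns, and the index name below are hypothetical, invented purely for illustration:

```python
import sqlite3

# Hypothetical schema, made up for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE agents (agent_id INTEGER PRIMARY KEY, region TEXT, commission REAL)")
conn.executemany("INSERT INTO agents (region, commission) VALUES (?, ?)",
                 [("east", 100.0), ("west", 250.0), ("east", 75.0)])

# Anti-pattern shown for contrast: SELECT * pulls every column.
slow = "SELECT * FROM agents"
# Instead, pull only the required fields and filter early.
fast = "SELECT agent_id, commission FROM agents WHERE region = ?"

# EXPLAIN QUERY PLAN shows where the cost is: a full table scan here,
# which an index on the filtered column turns into an index search.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + fast, ("east",)).fetchall()
conn.execute("CREATE INDEX idx_agents_region ON agents(region)")
plan_after = conn.execute("EXPLAIN QUERY PLAN " + fast, ("east",)).fetchall()

print(plan_before[0][3])  # a full scan of agents
print(plan_after[0][3])   # an index search using idx_agents_region
```

    The same workflow applies to T-SQL with `SET STATISTICS IO/TIME ON` and the graphical execution plan, only the inspection commands differ.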

    For retrieving data from multiple tables in the database, I would do a join on the tables that have the data, and I would pull only the required fields that need to be populated in the business report. This is how I handle data extraction: collating data from different table sources and pulling it out into the report. I can do it with an SQL query using a join, or with ETL by joining two sources into a flat file, which serves as the file source for the report.
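    A minimal sketch of that join-based extraction, again with sqlite3; the agents/commissions schema is hypothetical, not taken from the projects above:

```python
import sqlite3

# Hypothetical two-table schema for a commission report.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE agents (agent_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE commissions (agent_id INTEGER, amount REAL);
INSERT INTO agents VALUES (1, 'Alice'), (2, 'Bob');
INSERT INTO commissions VALUES (1, 120.0), (1, 80.0), (2, 50.0);
""")

# Join the tables that hold the data, pulling only the fields the report needs.
rows = conn.execute("""
    SELECT a.name, SUM(c.amount) AS total_commission
    FROM agents a
    JOIN commissions c ON c.agent_id = a.agent_id
    GROUP BY a.name
    ORDER BY a.name
""").fetchall()
print(rows)  # [('Alice', 200.0), ('Bob', 50.0)]
```

    The same result set could equally be produced by an ETL flow that joins the two sources and writes a flat file for the report.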

    When designing a SQL database for an application that needs to handle large-scale data with high read/write demands, I would first consider concurrency control. If there are going to be multiple read/write operations, I would allow concurrency so that multiple sessions, from the same or different users, can perform reads and writes simultaneously without any stoppage or blocker in the way. I would also design a highly scalable database by taking advantage of offerings that help queries perform faster, incorporate tuning when creating the tables themselves, and optimize runtime by writing efficient scripts for creating the tables and other objects within the database.
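    A small-scale analogue of that concurrency control, using SQLite's WAL journal mode as a stand-in for a full server engine (the table and file names are hypothetical): readers proceed against the last committed snapshot even while a writer has an open transaction.

```python
import sqlite3, tempfile, os

# WAL mode lets readers run concurrently with a writer.
path = os.path.join(tempfile.mkdtemp(), "app.db")
writer = sqlite3.connect(path)
writer.execute("PRAGMA journal_mode=WAL")
writer.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, payload TEXT)")
writer.execute("INSERT INTO events (payload) VALUES ('first')")
writer.commit()

reader = sqlite3.connect(path)
# Writer opens a transaction and inserts, but does not commit yet.
writer.execute("INSERT INTO events (payload) VALUES ('second')")

# The reader is not blocked and sees only the committed snapshot.
count = reader.execute("SELECT COUNT(*) FROM events").fetchone()[0]
writer.commit()
print(count)  # 1
```

    Server engines like SQL Server achieve the same effect with row versioning (e.g. snapshot isolation), but the design goal, reads that do not block on writes, is the one described above.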

    Some of the issues we might encounter while designing and implementing a SQL database are understanding the scalability of the database, how much memory it can use, and how many users or sessions it can handle. These are the issues that can arise when we start designing and implementing a SQL database. I would resolve them by creating database objects that are efficient, using as little memory as possible, and staying cost-effective when running SQL queries. This can be done by writing efficient scripts for the tables and other database objects, such as stored procedures and functions.

    We can optimize the query by doing a SELECT COUNT of a particular single field instead of *, which will equate to the same count we are looking for. For example: SELECT COUNT(field_name) FROM customers WHERE city = 'New York'. By eliminating the *, we process less data while getting the same count as COUNT(*). Alternatively, we can use COUNT(1); that will solve the problem.
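    A quick check of the COUNT variants mentioned above, on a hypothetical customers table. One caveat the answer glosses over: COUNT(column) skips NULLs, so it matches COUNT(*) only when the counted column has no NULLs (a primary key is a safe choice):

```python
import sqlite3

# Hypothetical customers table; customer_id is NOT NULL by construction.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (customer_id INTEGER PRIMARY KEY, city TEXT)")
conn.executemany("INSERT INTO customers (city) VALUES (?)",
                 [("New York",), ("New York",), ("Boston",)])

count_star = conn.execute(
    "SELECT COUNT(*) FROM customers WHERE city = 'New York'").fetchone()[0]
count_one = conn.execute(
    "SELECT COUNT(1) FROM customers WHERE city = 'New York'").fetchone()[0]
# COUNT(column) matches COUNT(*) only because customer_id is never NULL.
count_col = conn.execute(
    "SELECT COUNT(customer_id) FROM customers WHERE city = 'New York'").fetchone()[0]

print(count_star, count_one, count_col)  # 2 2 2
```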

    An improvement can be made: when doing a SELECT * FROM orders filtered on a particular order ID, we can pull the particular fields instead of giving *, which would help improve the snippet and thereby its performance.

    The process I would take to migrate a large SQL database from one system to another is to do a cyclic check while transferring the database. Basically, we can employ a pluggable database, which ensures we can transfer the data with minimal downtime: the original database we are transferring from stays up, with no downtime and no restart of the services. So using a pluggable database can solve the problem.
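    Pluggable databases are an Oracle feature, but the overall shape of the process, copying into a target while the source stays online, can be sketched with SQLite's ATTACH. All paths, table names, and data below are hypothetical:

```python
import sqlite3, tempfile, os

# Hypothetical source and target database files.
tmp = tempfile.mkdtemp()
src_path, dst_path = os.path.join(tmp, "src.db"), os.path.join(tmp, "dst.db")

src = sqlite3.connect(src_path)
src.execute("CREATE TABLE orders (order_id INTEGER PRIMARY KEY, total REAL)")
src.executemany("INSERT INTO orders (total) VALUES (?)", [(9.5,), (12.0,)])
src.commit()

# Attach the target and copy; the source database stays readable throughout.
src.execute("ATTACH DATABASE ? AS dst", (dst_path,))
src.execute("CREATE TABLE dst.orders AS SELECT * FROM orders")
src.execute("DETACH DATABASE dst")

# Verify the data arrived in the target database.
migrated = sqlite3.connect(dst_path).execute(
    "SELECT COUNT(*) FROM orders").fetchone()[0]
print(migrated)  # 2
```

    A production migration would add a catch-up pass for rows written during the copy, which is the role the "cyclic check" in the answer plays.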

    In the event of a SQL database failure, the steps I would take to restore the system while minimizing data loss and downtime are to take backups and ensure the data transfer is done beforehand. Basically, when we deploy a pluggable database, we simultaneously take a full backup on the backup drive, which helps us restore the data in case of any unexpected failures and minimizes data loss and downtime.
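    The backup-then-restore step can be sketched with sqlite3's online backup API, which copies a live database without taking it down; the file names and payments table are hypothetical:

```python
import sqlite3, tempfile, os

# Hypothetical live database with some committed data.
tmp = tempfile.mkdtemp()
live = sqlite3.connect(os.path.join(tmp, "live.db"))
live.execute("CREATE TABLE payments (id INTEGER PRIMARY KEY, amount REAL)")
live.execute("INSERT INTO payments (amount) VALUES (42.0)")
live.commit()

# Full backup taken while the source connection is still open and usable.
backup_path = os.path.join(tmp, "backup.db")
backup = sqlite3.connect(backup_path)
live.backup(backup)  # online copy, no downtime for the live connection
backup.close()

# "Restore" by opening the backup copy and verifying the data survived.
restored = sqlite3.connect(backup_path)
amount = restored.execute("SELECT amount FROM payments").fetchone()[0]
print(amount)  # 42.0
```

    In SQL Server terms this corresponds to full backups plus transaction log backups, which bound both the data loss (recovery point) and the downtime (recovery time).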