Vetted Talent

SANTHOSHBABU K


Hardworking, highly motivated professional eager to apply combined knowledge and skills to enhance business performance. Operates well both individually and in teams, leveraging a seasoned work ethic to adapt quickly to different processes and drive company objectives. Resourceful and results-driven, with a passion for growth and efficiency that meets company needs and increases service value.

  • Role

    Power BI Developer

  • Years of Experience

    13 years

Skillsets

  • Data Analysis - 5 Years
  • SQL - 5 Years
  • Python Programming
  • VBA
  • Tracking KPIs and SLAs
  • Report Preparation
  • R Programming

Vetted For

12 Skills
  • Roles & Skills
  • Results
  • Details
  • Power BI - Team Lead (AI Screening)
  • 53%
  • Skills assessed: Oracle, Performance Tuning, Queries, Stored Procedures, Data Warehouse, Database Structure, DAX, Indexing, Power BI, Data Modelling, PostgreSQL, SQL
  • Score: 48/90

Professional Summary

13 Years
  • Dec, 2013 - Present (12 yr 1 month)

    Senior Analyst

    Accenture Solutions Pvt Ltd
  • Dec, 2015 - Jun, 2018 (2 yr 6 months)

    Analyst

    Accenture Solutions Pvt Ltd
  • Dec, 2013 - Dec, 2015 (2 yr)

    Associate

    Accenture Solutions Pvt Ltd
  • Aug, 2010 - Dec, 2013 (3 yr 4 months)

    Senior Associate

    Wipro Ltd

Applications & Tools Known

  • POWER BI
  • VBA
  • Visio
  • PowerShell
  • SQL Server Integration Services
  • Excel
  • SharePoint

Work History

13 Years

Senior Analyst

Accenture Solutions Pvt Ltd
Dec, 2013 - Present (12 yr 1 month)
    Leading a two-member team, collaborating weekly with the client on monitoring suppliers' performance and developing automation tools.

Analyst

Accenture Solutions Pvt Ltd
Dec, 2015 - Jun, 2018 (2 yr 6 months)
    Moved to another vertical while providing continuous support during the Chennai flood. Acted as SME and buddy team lead for the entire team of 15 members, leading the huddle and keeping the team engaged with discussions on upcoming technologies and trends. Worked in an Agile workforce, developing 25 mini macros for the entire deal.

Associate

Accenture Solutions Pvt Ltd
Dec, 2013 - Dec, 2015 (2 yr)
    Part of Operations, sending clients daily, weekly, and monthly reports. Created an Outlook macro that serves as a second pair of eyes for the entire deal of 500 employees, ensuring emails go to the right person with the right attachment. Contributed to the team's performance-management data and initiated macros wherever necessary.

Senior Associate

Wipro Ltd
Aug, 2010 - Dec, 2013 (3 yr 4 months)
    Sending daily, weekly, and monthly reports of process data to the manager.

Achievements

  • Awarded multiple times as Numero Uno, Champion
  • Awarded Best collaborator for creating a formulated file for client's survey project by Global workshare team for saving the penalty paid to customer
  • Employee of the month - 2 times
  • Cost avoidance award - 1 time
  • Recognized by peer colleagues - 10 times

Major Projects

2 Projects

Analytics Tool for Aerospace Engine Manufacturer

    The client is a leading aerospace engine manufacturer. Developed an analytics tool that validates warranty claims submitted in the system against 25 checkpoints and gives the validator additional information to process the claim. Used SQL, R scripts, PowerShell commands, SQL Server Integration Services solutions, and Excel macros.

Dashboards Development

    Developed various kinds of reports for the client covering different aspects of the process, using Power BI and PowerPoint, including the Monthly Process Performance review, Monthly Client Performance review, real-time daily, weekly, and monthly performance dashboards, the Approval Cycle dashboard, and the SLA report.

Education

  • Master of Science in Information Technology

    St. Joseph's College (2010)
  • Bachelor of Science in Information Technology

    SKSS Arts College (2008)

AI-interview Questions & Answers

Hello, I'm Santhosh Babu. I have been working with Accenture Solutions Private Limited as a Senior Analyst for the last ten years. To speak about myself briefly: I work on an aerospace warranty-claims engagement where the client has an analytics tool that validates claims against 25 checks (out of a larger set of checks), and I have personally developed 10 of those checks. In simple terms, when a warranty claim is raised, the tool validates whether the claim is a duplicate, whether the customer is raising a valid claim, and whether the parts are legitimate. On the technical side, SQL Server maintains the data and SSIS acts as the ETL tool that validates it, with the validation logic built into the SSIS packages. From Excel VBA we call a batch file, the batch file executes the SSIS packages, any modifications happen within the SSIS packages themselves, and the final data is extracted into Excel; the data can also be validated in SQL Server directly. That is one part of my job. Beyond that, I publish Power BI reports for the client as well as internal leads, covering agents' performance, process performance, SLA and KPI data, monitoring and tracking of SLAs and KPIs on a daily and monthly basis, and patterns in incoming and outgoing claims. I am the only one taking care of the entire reporting and data management.
For the last six years I have been building Power BI dashboards with simple DAX expressions, and wherever an Excel VBA macro is required I am the first point of contact: I study the steps and the requirement, pitch in, and develop the macro. That is a short introduction to my time at Accenture. Before Accenture, I had three years of experience with Wipro, where I worked as an MIS analyst taking care of agents' performance and incentive calculation, including uploading the incentives into the payroll page.
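As a hedged illustration of the pipeline described above (the VBA macro calling a batch file that in turn runs the SSIS packages), the same launch step could be scripted as follows. `dtexec` is SSIS's standard command-line runner, but the package filename and helper names here are placeholders, not taken from the actual project:

```python
# Sketch: launching an SSIS package the way the batch file in the
# pipeline above does, via the dtexec command-line utility.
# Package paths and function names are illustrative placeholders.
import subprocess

def build_dtexec_command(package_path: str) -> list[str]:
    """Build the dtexec invocation; /f runs a package from a .dtsx file."""
    return ["dtexec", "/f", package_path]

def run_ssis_package(package_path: str) -> int:
    """Execute the package and return dtexec's exit code (0 = success).

    Only meaningful on a machine where SSIS/dtexec is installed.
    """
    return subprocess.run(build_dtexec_command(package_path)).returncode
```

In the described setup, Excel VBA shells out to a batch file that performs this same `dtexec` call, and the SSIS packages write their validation output back where Excel can pick it up.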

Implementing and testing application security layer models specific to Power BI integrations: I don't think I have enough experience to answer this in depth, but Power BI does have row-level security, which governs a user's attempts to access data in Power BI, and there is Power BI Report Server, which holds the data integrations and where we can monitor data usage and user logins. When it comes to deeper security layer models, I am not experienced enough to answer this question. Thank you.

When it comes to large datasets in Power BI, we can certainly bring in large sources, including Azure and AWS. But when we include large datasets, the performance of navigation and of the visuals may decrease. What I tend to do is use SQL to query only the necessary data, so that only that data goes into the visuals; I think that is the prominent answer from my side. Accuracy also matters, obviously. Databases handle gigabytes and terabytes of data, and Power BI is a front-end tool through which we see accurate information from the entire database, so I don't believe accuracy is compromised here. With large datasets, though, performance may be compromised because of the limits of the application and the system.
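The "query only the necessary data" idea can be sketched with a small, self-contained example; the `sales` table, its columns, and the values are invented for illustration, and SQLite stands in for whatever database the report actually uses:

```python
# Sketch: push the filter and aggregation to the database so the report
# receives one row per region for the requested year, instead of
# importing every transaction. Schema and data are invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, year INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("North", 2022, 100.0), ("North", 2023, 150.0),
     ("South", 2022, 80.0), ("South", 2023, 120.0)],
)

# Only the aggregated, filtered result crosses the wire to the visual.
rows = conn.execute(
    "SELECT region, SUM(amount) FROM sales "
    "WHERE year = ? GROUP BY region ORDER BY region",
    (2023,),
).fetchall()
print(rows)  # [('North', 150.0), ('South', 120.0)]
```

The same shape applies whether the query is a Power BI source query or a database view: the visual only needs the aggregated rows, not the raw history.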

When faced with a slow-running, complex DAX expression, how do you determine if it should be optimized or rewritten? Say we are using different tables, by which I mean fetching data from somewhere other than the table at hand: fetching data from a different table will obviously slow things down, and handling a large number of rows will make Power BI respond slowly as well. To optimize, I would bring the DAX expressions together into one table, so that this table's data can be called whenever another table requires it. Put differently: if I introduce a new calculated field, measure, or column using DAX or another built-in Power BI function, and that new field pulls from another table using that table's filters, I suggest bringing the necessary fields into the same table, so that we can fix the performance issue.

When would you choose to build a new table or view in the database for a Power BI report? My previous answer covers this as well: if we are looking for a specific set of data and fetching it affects the performance of the Power BI visuals, I think it is better to create a view or a new table in the database so that the Power BI report performs well.
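A minimal sketch of this approach, with an invented `claims` schema: the view encapsulates the report's filter, so Power BI can read a small, pre-shaped result set instead of the raw table.

```python
# Sketch: move report logic into a database view. In practice Power BI
# would connect to vw_open_claims rather than importing claims whole.
# Table, view, and column names are illustrative.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE claims (id INTEGER, status TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO claims VALUES (?, ?, ?)",
    [(1, "open", 50.0), (2, "closed", 75.0), (3, "open", 25.0)],
)

# The view carries the filter once, centrally, instead of in every report.
conn.execute("""
    CREATE VIEW vw_open_claims AS
    SELECT id, amount FROM claims WHERE status = 'open'
""")
rows = conn.execute(
    "SELECT id, amount FROM vw_open_claims ORDER BY id"
).fetchall()
print(rows)  # [(1, 50.0), (3, 25.0)]
```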

What steps would you take to ensure the security of dashboards viewed by multiple user groups? Power BI Report Server has built-in functions that help manage viewers' access, and we can draw on those. Beyond that, we can introduce row-level security in Power BI, which secures the data viewed by each user; RLS, or row-level security, is one of the prominent features in Power BI for handling multiple users.
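Conceptually, row-level security maps each user's role to a filter over the rows. This small sketch (with invented users, regions, and data) shows the idea outside Power BI; inside Power BI the same rule would be written as a DAX filter attached to a role:

```python
# Sketch of the row-level-security concept: each user maps to a row
# predicate, and unknown users are denied by default.
# Users, regions, and data are invented for illustration.
ROLE_FILTERS = {
    "alice": lambda row: row["region"] == "North",
    "bob": lambda row: row["region"] == "South",
}

def rows_visible_to(user: str, rows: list[dict]) -> list[dict]:
    """Return only the rows the user's role is allowed to see."""
    predicate = ROLE_FILTERS.get(user, lambda row: False)  # deny by default
    return [row for row in rows if predicate(row)]

data = [{"region": "North", "sales": 100},
        {"region": "South", "sales": 80}]
print(rows_visible_to("alice", data))  # [{'region': 'North', 'sales': 100}]
print(rows_visible_to("eve", data))    # [] -- no role, no rows
```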

Given this DAX formula snippet used in the Power BI report (a CALCULATE of a SUM over FILTER(ALL(Sales), ...) restricted to the year 2022), can you explain why performance might be impacted, and how could you optimize it? If we are introducing this as a value or a visual in Power BI, what I believe is that this expression exists only for one particular filter, the year 2022. So instead of using this DAX expression, I believe we can apply a filter directly in the Power BI visual and use that filter to determine the sales.

Analyzing the M snippet from Power BI's Query Editor, what would be the issue with how the table joins are being utilized, and what would be your recommendations for improvement? The source is an Excel workbook of sales data; from there a Sales table tab is used, along with a Sales-to-Customer join. To be very clear, I am new to this kind of code: I have been using Power BI to build dashboards, but I have never used the Query Editor to change or transform anything; I bring the visuals in place whenever required. To be very specific, I believe both come from the same table, so I don't think any improvement is needed.

How do you design a Power BI reporting solution that accurately reflects complex business logic in DAX without compromising report performance? One of my previous answers covers this as well. I use a separate table altogether if required; otherwise, I build DAX expressions that do not take data from any external table, which helps improve report performance. When it comes to complex business logic, consider a scenario with six or ten years of data, gigabytes altogether: if we have to analyze all of that data to derive one field, or one common field per line item, then I think it is better to write that logic in SQL and call the SQL script, instead of writing complex logic in DAX. That is how I would improve report performance.

Assuming historical data is needed, describe your approach to setting up and maintaining an OLAP cube and aggregation strategy that works well with Power BI. As I said, if we are working in a live environment, Power BI cannot hold all the data in it; if we load the entire history, the Power BI back end cannot handle all of that historical data. What I believe is that I can use SQL logic and scripting to fetch the data wherever required; otherwise it stays in the databases themselves. Instead of loading the entire data, I would use SQL, with the right access, to handle this kind of data and bring only the needed data or visual into Power BI.
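One way to realize the strategy above, sketched here with an invented schema, is to keep the raw history in the database and periodically rebuild a small summary table that the Power BI model imports:

```python
# Sketch: maintain a pre-aggregated summary table over raw history, so
# Power BI imports a handful of summary rows instead of years of detail.
# Schema and figures are invented for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales_history (year INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO sales_history VALUES (?, ?)",
    [(2021, 10.0), (2021, 20.0), (2022, 5.0), (2022, 15.0)],
)

# Refresh step: rebuild the yearly summary from the raw history.
conn.execute("DROP TABLE IF EXISTS sales_by_year")
conn.execute("""
    CREATE TABLE sales_by_year AS
    SELECT year, SUM(amount) AS total
    FROM sales_history
    GROUP BY year
""")
rows = conn.execute(
    "SELECT year, total FROM sales_by_year ORDER BY year"
).fetchall()
print(rows)  # [(2021, 30.0), (2022, 20.0)]
```

The refresh step would run on a schedule; Power BI then connects to `sales_by_year` (or an equivalent cube/aggregation layer) rather than `sales_history`.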

Can you outline the process of creating custom connectors in Power BI for a type of data source not natively supported? Power BI has built-in data sources, and I believe it supports a large number of them. If a source is not supported, then, hypothetically speaking, I believe we can load the data into a database that is already linked, and call that database directly; that is my suggestion. Since I have not used any data sources other than Excel, that is my answer for this question.