Vetted Talent

Pradip Nalawade

Dynamic and results-driven techno-functional professional with a proven track record of over 15 years in Business Analysis, Process Improvement, Data Modelling, and Visualization Technologies. Applies skills in Power BI Development and Operations Management to excel in challenging IT roles.

  • Role

    Business Analyst

  • Years of Experience

    15 years

Skillsets

  • Power BI - 6 Years
  • Project Management
  • Quality Assurance
  • Data Analysis
  • Stakeholder Management
  • Change Management
  • Process Improvement
  • Business Intelligence
  • Requirements Gathering
  • Business Analysis
  • Report Documentation
  • Operations Management
  • Risk Mitigation

Vetted For

12 Skills
  • Power BI - Team Lead (AI Screening)
  • Result: 52%
  • Score: 47/90
  • Skills assessed: Oracle, Performance Tuning, Queries, Stored Procedures, Data Warehouse, Database Structure, DAX, Indexing, Power BI, Data Modelling, PostgreSQL, SQL

Professional Summary

15 Years
  • Jun 2021 - Present (4 yr 3 months)

    Business Analyst

    Microsoft India (R&D) Pvt. Ltd.
  • Apr 2013 - May 2021 (8 yr 1 month)

    Finance Consultant

    AXA Business Services Pvt. Ltd.
  • Aug 2010 - Mar 2013 (2 yr 7 months)

    Sr. Business Analyst

    eClerx Services Ltd.
  • Jan 2008 - Aug 2010 (2 yr 7 months)

    MIS Executive

    Tata Teleservices Maharashtra Ltd.

Applications & Tools Known

  • Power BI
  • SQL
  • Microsoft Excel
  • Microsoft Access
  • Microsoft PowerPoint
  • ICM
  • VBA

Work History

15 Years

Business Analyst

Microsoft India (R&D) Pvt. Ltd.
Jun 2021 - Present (4 yr 3 months)
    Collecting, analyzing, and interpreting sales data from various sources to identify trends, patterns, and insights. Designing and developing comprehensive sales reports, dashboards, and performance metrics tailored to the needs of different stakeholders. Establishing data governance standards and ensuring adherence to data quality guidelines. Proactively identifying opportunities for innovation and optimization within the enterprise sales reporting function.

Finance Consultant

AXA Business Services Pvt. Ltd.
Apr 2013 - May 2021 (8 yr 1 month)
    Compiled monthly, quarterly, and semi-annual expense reports detailing Budget, Actuals, Forecast, Variance, and Head Count, presenting financial and performance metrics. Addressed and resolved inquiries from business units regarding budgetary and actual results. Ensured timely generation of quarterly information for publication purposes.

Sr. Business Analyst

eClerx Services Ltd.
Aug 2010 - Mar 2013 (2 yr 7 months)
    Designed and executed daily, weekly, monthly, quarterly, or periodic reports. Created portfolio goal tracking reports to monitor advancement towards intended objectives. Evaluated reports and aided business decision-making by appending comments and introduced new procedures and reports in alignment with business specifications.

MIS Executive

Tata Teleservices Maharashtra Ltd.
Jan 2008 - Aug 2010 (2 yr 7 months)

Achievements

  • Recognized as a standout performer during times of crisis in Q2 2020
  • Recipient of the Superstar Award in Q2 2018
  • Acknowledged for outstanding contributions to business process improvement with Best Business Process Improvement Ideas Awards in Q2 of FY2020-21, FY2015-16, and FY2014-15
  • Received the Brainwave award for automating over 7 ideas in FY 2014-2015
  • Honored with Best Business Process Improvement Ideas Awards in Q1 of FY2011-12 and Q2 of FY2010-11
  • Recognized for exceptional performance with spot awards in Q2 and Q3 of FY 2010-11, as well as Q1 of FY2011-12

Education

  • Bachelor's in Computer Science

    Shivaji University, Kolhapur (2006)

Certifications

  • Microsoft Certified: Power BI Data Analyst Associate (PL-300)

AI Interview Questions & Answers

Could you give a brief introduction to your background?

Good evening. My name is Pradip. I graduated from Shivaji University, Kolhapur, and I have 15 years of experience in reporting and analysis, stakeholder management, report automation, and process improvement across the sales, finance, and telecom domains. I have been working at Microsoft as a Business Analyst for the last three years. My primary responsibilities at Microsoft are understanding business requirements from the business stakeholders and senior management, preparing data models and implementing them in Power BI dashboards, validating the dashboard's data flow from source to destination, and monitoring performance periodically to keep the results accurate. Before Microsoft, I worked at AXA Business Services, where I handled two roles: one on the budgeting and reporting team, and one on a newly formed automation team. Thank you.

What are the key considerations you would communicate to your team when working with a complex DAX expression?

It is a very good question, because when working with DAX expressions we have to keep several things in mind. First, where possible, use measures instead of calculated columns. Second, use variables to avoid duplicating calculations: a variable can store an intermediate result once and be reused. Third, break a complex DAX formula down into smaller steps to minimize calculation. Fourth, use the most optimal DAX function available and avoid deeply nested DAX functions within an expression. Those are the four things I would ask the team to keep in mind. Thank you. I am moving to the next question.
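The variable pattern described above can be sketched as a measure; everything here is illustrative (the table and column names are assumptions):

```dax
-- Illustrative measure: each VAR stores an intermediate result once,
-- so nothing is computed twice and the logic reads top to bottom.
High Value Share :=
VAR TotalSales =
    SUM ( Sales[Amount] )
VAR HighValueSales =
    CALCULATE ( SUM ( Sales[Amount] ), Sales[Amount] > 1000 )
RETURN
    DIVIDE ( HighValueSales, TotalSales )
```

Using `DIVIDE` rather than `/` also handles the divide-by-zero case without extra branching.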

How do you balance the need for quick report generation against the performance impact on the database?

This is also a very good question. My understanding is that it relates to query folding. Query folding is a feature in Power BI where the steps defined in the query editor are translated into a native query that is executed by the data source server. It gives three benefits. First, we eliminate transferring the entire dataset from the server to Power BI; only the required rows come across. Second, performance: with query folding, all of the transformations run at the server level, so the work is optimized. Third, scalability: thanks to query folding, Power BI can handle large databases and complex transformations without degrading performance. In addition, if the data is not required at a granular level, we can aggregate it, for example to a monthly level, before loading. Please correct me if I have misunderstood the question. Thank you. I am moving to the next question.
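The query-folding benefit described above can be sketched as follows: when a filter step in Power Query folds, the data source receives a native query like the one below (table and column names are assumptions), so only the matching rows ever travel to Power BI.

```sql
-- Native query produced by a folded filter step: the WHERE clause
-- runs on the server, not inside Power BI.
SELECT OrderID, Region, Amount
FROM dbo.Sales
WHERE Region = 'East';
```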

When faced with a slow-running, complex DAX expression, how do you determine whether it should be optimized or rewritten?

This is also a very good question. To understand DAX expression performance, I would use the Performance Analyzer in Power BI; it shows which DAX expression is taking time to load data. Based on that, I evaluate the expression and, if a rewrite is required, I use a few approaches. If the expression has no variables, I introduce variables to store intermediate results. If the expression is very long, I break it down to reduce complexity. Where possible, I use measures instead of calculated columns, and I remove nested functions within the DAX. I also remove row-level iteration where possible: for example, instead of SUMX I use SUM when a plain aggregation is sufficient. Finally, I check the relationships between the tables, whether one-to-many, many-to-one, or many-to-many, and I try to avoid many-to-many relationships. Using these five or six approaches, I rebuild the DAX expression, and it gives a better result than before. Thank you. I am moving to the next question.
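The SUMX-to-SUM rewrite mentioned above, as a hedged sketch (the table and column names are assumptions):

```dax
-- Row-by-row iterator: the formula engine evaluates the expression
-- for every row of Sales.
Total Revenue (iterator) :=
SUMX ( Sales, Sales[Quantity] * Sales[Price] )

-- If the line total already exists as a column (Sales[LineAmount]
-- is assumed here), a plain aggregator lets the storage engine do
-- the work instead.
Total Revenue (aggregate) :=
SUM ( Sales[LineAmount] )
```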

Share a practical approach to implementing and testing an application security layer model specific to a Power BI integration.

If I am understanding correctly, this question relates to row-level security: how we create row-level security to prevent unauthorized access to data. I will use one example. Suppose we have sales data with four regions: East, West, South, and North. We load the sales data into Power BI Desktop, go to the Modeling tab, and create a separate role for each region, applying the proper DAX filter expression to each role. We then test from the modeling view using "View as" to confirm, for example, that the East role shows only East-region data. If that looks good, we save and close Power BI Desktop and publish the report to the Power BI Service. In the Service, we select the semantic model, go to Security, and assign users to the roles; for example, if the East region has four users, we assign those four users to the East role, and likewise for West, South, and North. Finally, we use a test user to validate that each region's users see only their own region's data. Using row-level security in this way, you can implement and test a security layer on the Power BI data. Thank you.
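The per-role filter mentioned above is itself a short DAX expression defined on the table; a minimal sketch (table and column names are assumptions):

```dax
-- Static role filter applied to the Sales table for the "East" role:
[Region] = "East"

-- Dynamic alternative: filter a user-to-region mapping table by the
-- signed-in account (the UserEmail column is assumed).
[UserEmail] = USERPRINCIPALNAME ()
```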

When would you choose to build a new table or view in a database for a Power BI report?

We would use this approach when the source table holds far more than the report needs. Suppose we have one table with a large number of columns and rows, but for our Power BI report we need only specific columns and specific rows matching certain criteria; for example, only six columns, and only the years 2022 and 2023. In that case I would create a new table or view in the database that selects just those columns, with a WHERE clause to remove the unnecessary rows, and point the Power BI report at that view. Thank you.
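A minimal sketch of such a view (the view, table, and column names are all assumptions):

```sql
-- Illustrative view: expose only the columns and years the report
-- needs, instead of importing the full table into Power BI.
CREATE VIEW dbo.vw_SalesForReport AS
SELECT OrderID, OrderDate, Region, ProductID, Quantity, Amount
FROM dbo.Sales
WHERE YEAR(OrderDate) IN (2022, 2023);
```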

Given the DAX formula snippet used in a Power BI report, CALCULATE ( SUM ( Sales[Amount] ), FILTER ( ALL ( Sales ), Sales[Year] = 2022 ) ), can you explain why the performance might be impacted and how it could be optimized?

The biggest issue here is the FILTER over ALL ( Sales ): it iterates the entire Sales table. To optimize it, I would remove the FILTER and the ALL and write the condition as a simple predicate instead: CALCULATE ( SUM ( Sales[Amount] ), Sales[Year] = 2022 ). If the intent really is to ignore existing filters, then the ALL should be applied deliberately; otherwise the simple predicate form is the one I would use. Thank you. I am moving on.
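The before-and-after rewrite, as a sketch (the Sales table and Year column follow the snippet in the question; the measure names are assumptions):

```dax
-- Original: FILTER ( ALL ( Sales ), ... ) materializes and iterates
-- the whole Sales table row by row.
Sales 2022 (original) :=
CALCULATE (
    SUM ( Sales[Amount] ),
    FILTER ( ALL ( Sales ), Sales[Year] = 2022 )
)

-- Rewrite: a simple predicate filter is pushed to the storage engine.
-- Note it also preserves existing filters on other Sales columns,
-- which is usually the intent.
Sales 2022 (optimized) :=
CALCULATE ( SUM ( Sales[Amount] ), Sales[Year] = 2022 )
```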

Read the SQL query below; there is a potential performance issue with it. Can you identify the problem and suggest how you would rewrite it? SELECT * FROM Orders INNER JOIN Customers ON Orders.CustomerID = Customers.CustomerID WHERE Customers.Country = 'Germany'.

The main problem is the SELECT *: it returns every column from both the Orders and Customers tables, even though the report will not need them all. I would rewrite the query to name only the required columns from Customers and Orders explicitly, keeping the INNER JOIN on CustomerID and the WHERE clause on Customers.Country = 'Germany'. I think this is a fairly easy question.
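A sketched rewrite of the query (the column list is an assumption; only the join and the filter come from the question):

```sql
-- Rewrite: name only the columns the report needs instead of
-- SELECT *, and keep the join and filter as before. Indexes on
-- CustomerID and Country help the join and the WHERE clause.
SELECT o.OrderID,
       o.OrderDate,
       o.Amount,
       c.CustomerID,
       c.CompanyName
FROM Orders AS o
INNER JOIN Customers AS c
       ON o.CustomerID = c.CustomerID
WHERE c.Country = 'Germany';
```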

What approach do you take for enabling self-service reporting that aligns with corporate governance and security policies?

I would use a few mechanisms. First, Azure Active Directory (AAD) for assigning user roles and distribution groups. Second, row-level security, which is one of the major security functions you can use. Third, data-level security policies applied to particular tables, as per the company's or organization's policy, so that unauthorized users cannot access the data. Fourth, least-privilege access: Power BI workspaces have four roles, Admin, Member, Contributor, and Viewer, and we assign end users the minimum role required, for example Viewer. Those are the four things I would use to comply with the data policy. I am moving to the next question. Thank you.

In migrating reports from a legacy system to Power BI, how do you ensure minimal disruption and maintain data consistency?

It is a very good question, because most large organizations are now moving from one BI system to another; for example, reports developed in Excel are being migrated to Power BI. I have good experience with this kind of data migration. First, create the data model as per the original requirements and build the same relationships in Power BI that existed in the legacy system. Then build the dashboard and validate the data from source to destination, applying in the Power BI report all of the transformations the legacy report applied. After the Power BI report is complete, apply the same filters to both the new report and the old report and check that the outputs of the dashboards match; this is one way to validate the new report against the legacy one. On the back end, we maintain the same source system: for example, if most users were connecting to data in SQL Server, we keep all of the data in SQL Server and connect Power BI to it, so the methodology stays the same. Thank you.
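The source-to-destination validation described above can be sketched as a reconciliation query (all table and column names are assumptions):

```sql
-- Illustrative reconciliation during migration: compare monthly
-- totals between the legacy source and the new source, and list any
-- month where they disagree or exist on only one side.
SELECT COALESCE(l.YearMonth, n.YearMonth) AS YearMonth,
       l.Total AS LegacyTotal,
       n.Total AS NewTotal
FROM (SELECT YearMonth, SUM(Amount) AS Total
      FROM LegacySales GROUP BY YearMonth) AS l
FULL OUTER JOIN
     (SELECT YearMonth, SUM(Amount) AS Total
      FROM NewSales GROUP BY YearMonth) AS n
       ON l.YearMonth = n.YearMonth
WHERE l.Total IS NULL
   OR n.Total IS NULL
   OR l.Total <> n.Total;
```

An empty result means the two sources agree month by month.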

Assuming historical reporting data is needed, describe your approach to setting up and maintaining an efficient data archive strategy that works well with Power BI.

The question relates to how you maintain historical data in Power BI when it is needed. For example, if we are in 2024 and we refer back to 2022 data or earlier, one approach is to keep a rolling window, say the last five years, of data in the Power BI model. For the oldest years in that window, if we do not need the full detail for analysis, we can aggregate the data, for example rolling 2020's sales up from the granular level, and store only the aggregated data in Power BI, removing the detailed rows. That keeps the model small while the history remains available. This is archiving rather than scheduling, and it is the approach I would start with, though I would want to review Power BI's archive options in more detail. Thank you.
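The roll-up-then-purge idea above, sketched in T-SQL (all table and column names are assumptions; a real strategy would also consider Power BI's incremental refresh feature):

```sql
-- Illustrative archival pass: roll detail older than the five-year
-- window up to a monthly aggregate table, then remove the detailed
-- rows from the reporting table.
INSERT INTO SalesMonthlyArchive (YearMonth, Region, TotalAmount)
SELECT FORMAT(OrderDate, 'yyyy-MM') AS YearMonth,
       Region,
       SUM(Amount)
FROM Sales
WHERE OrderDate < DATEADD(YEAR, -5, GETDATE())
GROUP BY FORMAT(OrderDate, 'yyyy-MM'), Region;

DELETE FROM Sales
WHERE OrderDate < DATEADD(YEAR, -5, GETDATE());
```

In practice the INSERT and DELETE would run inside one transaction so a failure cannot lose detail rows before they are archived.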