Priti Patil

Vetted Talent

Microsoft PL-300 certified Power BI developer with 8+ years of experience in data modeling, visualization, and analysis. Proven ability to transform raw data into actionable insights.

  • Role

    Power BI Developer

  • Years of Experience

    8 years

  • Professional Portfolio

    View here

Skillsets

  • Data Mining
  • Data Modeling
  • DAX Queries
  • ETL Processes
  • Performance Optimization
  • Power BI
  • Predictive Models
  • Risk Management
  • Statistical Models

Vetted For

12 Skills
  • Power BI - Team Lead (AI Screening)
  • Score: 60/90 (67%)
  • Skills assessed: Oracle, Performance Tuning, Queries, Stored Procedures, Data Warehouse, Database Structure, DAX, Indexing, Power BI, Data Modelling, PostgreSQL, SQL

Professional Summary

8 Years
  • May, 2024 - Present (1 yr 7 months)

    Senior Consultant

    Nice Software Solutions
  • Nov, 2023 - May, 2024 (6 months)

    Power BI Developer

    Ethiraj Associates
  • Jan, 2023 - Sep, 2023 (8 months)

    Data Analyst

    Ensono
  • Mar, 2022 - Jan, 2023 (10 months)

    Lead Counsellor

    Edu plus now
  • Dec, 2019 - Dec, 2021 (2 yrs)

    Lead Consultant

    Apex visas
  • Dec, 2014 - Mar, 2019 (4 yrs 3 months)

    Senior Executive

    Bajaj Allianz General Insurance

Applications & Tools Known

  • Tableau
  • SQL
  • Power Automate
  • MS Access
  • SSMS
  • SSIS
  • Advanced Excel
  • Informatica
  • Git
  • Python
  • Jira
  • MySQL
  • Zoho
  • Azure
  • REST API
  • Teamwork
  • Microsoft Power BI
  • AnyDesk
  • YouTube
  • Azure Data Lake Storage Gen2 (ADLS)
  • Visual Studio
  • Google Drive
  • Google Chrome
  • Power Query
  • Azure Databricks
  • Azure Synapse
  • Report Builder
  • Azure Analysis Services
  • Salesforce
  • AWS

Work History

8 Years

Senior Consultant

Nice Software Solutions
May, 2024 - Present (1 yr 7 months)
    Generated reports and provided analytics on accounts payable to senior management, resulting in improved budget forecasting. Prepared and presented accurate, timely financial reports to senior management and external stakeholders, providing insights into the organization's financial health. Designed an enterprise-wide BI system that enabled data-driven decision-making across the organization. Developed a data warehouse that integrated data from multiple sources, allowing for comprehensive reporting and analysis. Collaborated with cross-functional teams to identify and resolve data-related issues, resulting in a 92% improvement in data accuracy. Worked on Power Apps and Power Automate.

Power BI Developer

Ethiraj Associates
Nov, 2023 - May, 2024 (6 months)
    Developed ETL processes to integrate data from multiple sources (NetSuite, Salesforce, and AWS, via API) into a unified data warehouse. Developed dashboards and interactive reports that enabled business users to gain insights from data. Optimized the performance of existing BI solutions, resulting in a 20% decrease in query execution time. Automated the deployment of BI solutions, resulting in a 30% reduction in development time. Developed predictive models that accurately forecast customer churn and revenue growth. Collaborated with cross-functional teams to identify and resolve data-related issues. Developed a data warehouse architecture to support the scalability and performance of BI solutions.

Data Analyst

Ensono
Jan, 2023 - Sep, 2023 (8 months)
    Developed advanced statistical models to identify customer trends and predict future sales growth with 100% accuracy. Transformed raw data into meaningful insights and presented them to stakeholders in an intuitive way. Utilized data mining techniques to uncover hidden patterns and correlations in customer data, resulting in a 73% increase in revenue. Created a dashboard that visually represented key business metrics in real-time, allowing for quick and informed decision-making by management. Created an automated data pipeline that reduced data processing time by 24%, allowing for faster analysis and decision-making.

Lead Counsellor

Edu plus now
Mar, 2022 - Jan, 2023 (10 months)
    Developed and maintained a supportive and therapeutic environment that fostered trust and encouraged open communication. Developed culturally competent counseling techniques that addressed the unique needs of clients from diverse backgrounds.

Lead Consultant

Apex visas
Dec, 2019 - Dec, 2021 (2 yrs)
    Developed and implemented a risk management framework that minimized potential risks. Collaborated with cross-functional teams to identify customer needs and develop solutions to meet those needs. Negotiated contracts with vendors and suppliers, resulting in a 30% reduction in costs. Created customer satisfaction surveys to measure customer feedback and identify areas for improvement. Prepared presentations, reports, and other business documents to effectively communicate insights and recommendations to clients. Developed detailed project plans to ensure clients' projects are completed on time and within budget.

Senior Executive

Bajaj Allianz General Insurance
Dec, 2014 - Mar, 2019 (4 yrs 3 months)
    Created a sales dashboard with real-time analytics to track performance and trends. Developed and delivered compelling presentations to close large deals and increase customer loyalty. Generated over $2 million in revenue in a single quarter by developing and executing an effective sales strategy. Implemented a CRM system that improved data accuracy and reporting for the sales team.

Achievements

  • Received a certificate for the Skipping Rope challenge
  • Received Team Lead Certificate

Major Projects

8 Projects

LCM

    Collected all the data into shared folders, then cleaned and prepared the dataset. Created complex DAX queries, calculated columns, calculated tables, and measures.

Salesforce Opportunity and Lead Dashboard

    Objective: Optimize sales team performance through data-driven strategies for increased business sales. Deliverables: Developed a custom opportunity and lead dashboard using Tableau & Power BI.

ZOMATO - Food Orders Demand Analysis

    Leveraged data science to analyze food-order demand across Zomato's multiple locations.

Adventure Works Cycle (AWC) Sales Dashboard

    Developed a sales dashboard to manage and monitor sales performance.

Data Revenue Analysis

    Analyzed which data sources generate good revenue over a 245-day period, and how many clients enrolled in the Six Sigma and Data Science courses.

A CRM lead and opportunity dashboard

    A CRM lead and opportunity dashboard is a powerful tool that provides a comprehensive view of your sales pipeline. It visualizes key metrics, trends, and performance indicators to help you make data-driven decisions.

Data Warehouse Implementation

Jade Business Services
Nov, 2023 - May, 2024 (6 months)

    Challenges and Considerations

    Data warehouse implementation projects can be complex and challenging due to various factors:

      • Data Quality: Addressing data inconsistencies, errors, and missing values.
      • Data Volume: Handling large volumes of data efficiently.
      • Performance: Ensuring fast query response times.
      • Data Security: Protecting sensitive data.
      • Change Management: Managing organizational changes associated with the data warehouse implementation.

Finding Out Which Data Source Generates Good Revenue

Jan, 2022 - Aug, 2022 (7 months)
    Analyzed how many clients enrolled in the Six Sigma and Data Science courses.

Education

  • B.Tech/B.E.

    Visveswaraiah Technological University (VTU) (2014)

Certifications

  • Microsoft Certified PL-300 Power BI Developer

    Microsoft (Feb, 2024)

    Credential URL: Click here to view
  • Microsoft Certified PL-300

  • Microsoft Certified: Power BI Data Analyst Associate

Interests

  • Travelling
  • Learning
  • Watching Movies
  • Exercise

AI-Interview Questions & Answers

    Hi, my name is Priti Patil. I hold 8.2 years of total work experience, of which 6+ years are in Power BI. I have worked across many domains, for example insurance, immigration, patching and life-cycle management, and banking. I have created multiple dashboards. In healthcare, for instance, I built a dashboard showing the number of patients admitted, the treatments in progress, the cost of each treatment and of the medicines, the weekly average of admissions and the conditions they relate to, the amount pending per disease, and how much each doctor charges the business. And it is not only building the dashboard: I start from gathering the requirements and creating the wireframe, then decide how many pages and visuals the dashboard needs and which calculations are required, all the way through to delivering to the end users. On top of that, I implement row-level security (RLS), both static and dynamic: for static RLS I have used the DAX USERNAME() function, and for dynamic RLS, USERPRINCIPALNAME(). For modeling I have used a star schema on one project and a snowflake schema on others; mostly snowflake, but performance-wise, where possible, I would insist on going for a star schema instead of a snowflake. Apart from project work, I follow the new feature updates that come out month on month. For example, for sharing a report with end users there are now several options: through Teams, by email, or by creating and sharing a copy link. An additional feature is storytelling in PowerPoint: we can now share the report directly into a PPT. Earlier, you had to take screenshots of the dashboard and paste them into the PPT; now the visuals themselves are embedded in the PPT, where you can still slice and dice the data, for any dashboard that has been published to the Power BI service.

    What steps would you take to ensure security when a report is viewed by multiple users? First, consider workspace access. If end users only need to view the report and will not change anything, we can give them Viewer access. If, say, a five-member team within the organization works in a developer role and needs to change a specific dashboard, we can give them Member or Contributor access. Second, to restrict the data itself within the organization, we can implement RLS, row-level security. There are two types: static row-level security and dynamic row-level security. For static, we can use the DAX USERNAME() function, and with its help implement static row-level security. For dynamic, we can use USERPRINCIPALNAME(); otherwise, we have the DAX IF condition. I implemented this at the KBR client, where I worked on their analytics and ratings data. There were three different audiences: analytics, ratings, and ratings-and-analytics combined, each allowed to see data specific to analytics, specific to ratings, or both. In that case I did not use USERPRINCIPALNAME() alone; I used an IF formula, with whose help the row-level security was implemented.
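
    For illustration, a minimal sketch of the static and dynamic RLS patterns described above, written as DAX role-filter expressions; the table, the Email and LoginName columns, and the lead account are hypothetical names, not from the original:

        -- Static RLS: filter rows to the signed-in user's login name
        [LoginName] = USERNAME()

        -- Dynamic RLS: filter rows to the signed-in user's user principal name (UPN)
        [Email] = USERPRINCIPALNAME()

        -- IF-based variant, as in the KBR example: one audience sees everything,
        -- everyone else sees only their own rows
        IF (
            USERPRINCIPALNAME() = "lead@example.com",  -- hypothetical combined-access account
            TRUE (),
            [Email] = USERPRINCIPALNAME()
        )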

    How would you explain the benefit of using a star schema in a data warehouse to a junior developer? In a star schema there is one fact table and multiple dimension tables. Take sales data as an example: all the sales information sits in the fact table, and the dimensions hold product information, order information, and customer details. To a junior developer I would explain which fields belong in the fact table: fact-table records should not be duplicated, while dimension values can repeat on the fact side. The dimension acts as the parent table and the fact as the child, so when you build the relationship it is one-to-many. If bidirectional filtering would otherwise be needed, we can avoid it with the CROSSFILTER DAX function, or by creating a bridge table with an index between the two tables instead of a bidirectional cross-filter, which improves performance. As for data warehousing, I worked on one project for the client Informative, which had multiple data sources: NetSuite financial data, Salesforce, and AWS, with data stored across several databases (MCD, CD, and CBC). We created a landing zone in Azure, integrated all these sources into it through an ADLS Gen2 account, streamlined processing with Databricks, and created the pipeline. After that I built the dashboard in Power BI with Azure as the direct source, since Azure unified the multiple sources. There we used only a star schema; with its help we implemented the data warehouse.
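
    A small sketch of the CROSSFILTER approach mentioned above, assuming illustrative Sales and Product tables related one-to-many on a ProductKey column (all names are assumptions for the example):

        -- Enable bidirectional filtering only inside this one measure,
        -- rather than setting the relationship itself to 'Both' in the model
        Products Sold :=
        CALCULATE (
            DISTINCTCOUNT ( Product[ProductKey] ),
            CROSSFILTER ( Sales[ProductKey], Product[ProductKey], BOTH )
        )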

    A practical approach to implementing testing in application deployment: when you deploy a pipeline in Power BI there are three environments; first is development, then testing and production. If we have a separate source per environment, each of the three environments can be connected directly to its specific source. If we have only one source, then whatever is done in the development stage has to be published into testing, where the testers can verify the required behavior. On a few projects I was involved as a tester. For the KBR client, the requirements came as Jira tickets: for example, modify a paginated report; a measure is not working properly or the numbers are showing incorrectly; the alignment is off; a slicer or a visual needs to be modified. In those scenarios they give the requirement on the ticket, the developer develops the dashboard, and after that I do the testing part: I check properly whether the work has been done. If it is, I mark it as done; otherwise as in progress or not related. Testing of the reports was done on a weekly basis. There were 5 to 10 tickets per week, and a weekly report of how many tickets were done, not done, or in progress had to be created and shared with the VP and director, who communicated directly with the client. That was my role there; in other organizations it depends on whether the source environments are dev, test, or prod, and in that scenario we deploy through a Power BI deployment pipeline.

    What is your strategy for version control and deployment of Power BI reports in a team environment? Version control in the sense that, taking KBR as the example: every month or two they would decommission and re-version the dashboards. The first name was Pipeline Taxonomy Dashboard version 1; after two months, version 2, then version 3. So we name the PBIX file accordingly on the desktop and publish it into the cloud. In a team environment, if a dashboard is being decommissioned, we need to notify the organization: this dashboard is going to be decommissioned, and it will keep working for this duration. We create the new PBIX file and publish it to the cloud, and for whatever is decommissioned we can create a subscription: at the decommission time, an email goes out to the affected users saying this dashboard will be decommissioned and pointing to the upcoming replacement dashboard. Apart from that, we can create alerts. For alerts we need a Premium license, but if you go for a subscription, a Pro license is enough. With the help of these, this can be achieved.

    What method would you use to evaluate and enhance the efficiency of data retrieval in Power BI from various databases such as SQL Server and Oracle? One method: if we are at Power BI report-server level, then whatever number of sources there are, I will create a data mart and point the dashboard at that. The second method: if I have been allocated Premium licensing, I will use the Import method; with Premium, the PBIX file can go up to 100 GB. In that case I import all the data to the Power BI Desktop level and start creating the dashboard there. Why? Because with DirectQuery we cannot create calculated columns and measures over the source the way we can with imported data; with the Import method we can do the modifications at the data level instead of at the source level. As for verifying data retrieval, we can cross-verify. I have done this properly using SSMS: management had given me access to the various databases, CD (credit driver data), CBC, and MCD, plus NetSuite and Salesforce through API, and I cross-validated the numbers. I will give you one scenario. When I was at the client Informative, a developer had already built a Customer 360 View dashboard, but at the source level a field was changed entirely, from industry type to partner and partner ID. The SQL queries, and the calculations and measures based on them, broke the dashboard. I did the analysis properly in SSMS: what had changed and how many views needed to be modified. I modified the views within two working days, then fixed the calculations with the help of alias names, after which the numbers reflected properly on the dashboard, and that is what I delivered to management. Also, when you go for the Import method, we can do the cleaning and cleansing in the built-in Power Query Editor, through which we can merge, append, replace values, replace errors, and use first row as headers. This way we can do the transformations.

    Review the SQL query named below; there is a potential performance issue with it. Can you identify the problem, and how would you rewrite it to perform better? The query, as read out, was roughly:

        SELECT * FROM orders
        JOIN customers ON orders.customer_id = customers.customer_id
        WHERE customers.country = '...'

    In this case, instead of doing this join at the source level, what we can do is import the data into the Power BI Desktop level, and once there, use the Merge option in the Power Query Editor. Done at the source level, this takes much time performance-wise, but if you do it via the Import method, it will not take much time.

    A DAX formula uses CALCULATE to sum the sales amount, filtering all sales for a given year; given this formula is used in a Power BI report, can you explain why performance might be impacted, and how you could optimize it? In this case, instead of going with CALCULATE, we can use the function called SUMX. With SUMX, instead of the CALCULATE function, we select the table and give it the expression, and performance-wise it will improve compared to using CALCULATE for the specific year.
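
    For reference, a sketch of the two forms being compared, over an assumed Sales table with Amount and Year columns (all names illustrative; in practice, which form is faster depends on the model and the filter):

        -- CALCULATE form: sum of sales filtered to one year
        Sales 2023 :=
        CALCULATE ( SUM ( Sales[Amount] ), Sales[Year] = 2023 )

        -- SUMX form suggested in the answer: iterate a pre-filtered table
        Sales 2023 SUMX :=
        SUMX ( FILTER ( Sales, Sales[Year] = 2023 ), Sales[Amount] )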

    What approach do you take to enable self-service reporting that aligns with corporate-driven security policy? Self-service BI in the sense that, whatever the number of sources, we can integrate all of them, do the cleaning and cleansing part in the built-in Power Query Editor, and after that create a model, a star or snowflake schema. Once the model is ready, we can give it to the end users, who can just drag and drop to create what their requirements need. If they need more, for example cosmetic or aesthetic changes, alignment, or some extra calculation, then the developer can work on it. Security in the sense that we do not give access such as downloading or exporting the file; that access cannot be given. Just drag and drop, placing the visuals, any such changes; that much they can do.

    How do you incentivize your team to maintain code quality in their development of DAX expressions and SQL queries? In that case, first of all, understand what the requirement is. Instead of creating a calculated column, we can create a measure returning a scalar value; that would be the more beneficial way to go. The second method: if possible, if management has given access for source-level changes, then whatever transformations can be done at the source level, we do directly at the source level, and after that take the data into the Power BI Desktop level. If that is allowed, performance-wise it will improve. And when you speak about aggregate values, if you need a scalar value, a measure is the most useful method. As for what I have used, there are many DAX functions: the time-intelligence functions QTD, YTD, and MTD; and if you want to calculate the sales for the same period last year, SAMEPERIODLASTYEAR and PARALLELPERIOD. I have also used LOOKUPVALUE and RELATED, and the aggregate functions like SUM, SUMX, COUNT, COUNTA, and COUNTAX; for Boolean columns, if you want to count TRUE/FALSE values, we have to use the COUNTAX function. Then we have variables as well: we can assign variable 1 and variable 2 and apply IF logic with their help. These are the DAX functions I have used. See the sketch below.
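
    A brief sketch of the time-intelligence and variable patterns named above, assuming a marked date table Dates related to an illustrative Sales table (all names are assumptions for the example):

        -- Year-to-date sales over a marked date table
        Sales YTD :=
        TOTALYTD ( SUM ( Sales[Amount] ), Dates[Date] )

        -- Same period last year, with variables and IF logic as described
        Sales vs LY :=
        VAR CurrentSales = SUM ( Sales[Amount] )
        VAR LastYearSales =
            CALCULATE ( SUM ( Sales[Amount] ), SAMEPERIODLASTYEAR ( Dates[Date] ) )
        RETURN
            IF ( ISBLANK ( LastYearSales ), BLANK (), CurrentSales - LastYearSales )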

    What strategy would you apply to ensure Power BI reports remain accessible during database maintenance activity? If there is a maintenance activity, then scheduled refresh: we can go for the Import method instead of DirectQuery, so whatever data is there for the specific duration can still be shown on the dashboard with no issues, even while the source is under maintenance. We can also create subscriptions and alerts for when the data was last refreshed and when the maintenance windows are. And once the maintenance is over, we run the scheduled refresh so the dashboard is refreshed at that time. With the help of this strategy, the reports will remain accessible during maintenance.