
Sumit Saraswat

Vetted Talent

Experienced professional with a decade of expertise in the corporate sector, having made significant contributions at renowned companies including MetLife, Adobe, and Oracle. A natural leader known for fostering collaboration and guiding teams towards success. My skill set spans Business Analytics, Business Intelligence, Advanced Excel, SQL, and VBA. Passionate about leveraging data-driven insights to optimize decision-making and drive organizational growth. Committed to continuous learning and sharing knowledge to empower those around me. Let's connect and explore opportunities to create impact together.

  • Role

    Senior Business Analyst

  • Years of Experience

    12 years

Skillsets

  • Data Analysis
  • Python
  • PowerBI
  • Business Intelligence
  • Excel
  • OBIEE
  • ETL - 6 Years
  • SQL - 6 Years

Vetted For

12 Skills
  • Power BI - Team Lead (AI Screening): 51%
  • Skills assessed: Oracle, Performance Tuning, Queries, Stored Procedures, Data warehouse, Database structure, DAX, Indexing, PowerBI, Data Modelling, PostgreSQL, SQL
  • Score: 46/90

Professional Summary

12 Years
  • May, 2023 - Present (2 yr 7 months)

    Senior Business Analyst

    Oracle
  • Aug, 2019 - Apr, 2023 (3 yr 8 months)

    Business Analyst

    Adobe Systems
  • May, 2012 - Aug, 2019 (7 yr 3 months)

    Senior Business Analyst

    MetLife GOSC

Applications & Tools Known

  • Power BI
  • Advanced Excel
  • SQL
  • Python
  • Cognos
  • SQL Server
  • VBA
  • Macros
  • OBIEE
  • Microsoft SQL Server
  • Excel

Work History

12 Years

Senior Business Analyst

Oracle
May, 2023 - Present (2 yr 7 months)
  • Integral member of the Financial Business Intelligence team within Oracle Advertising, focused on enhancing the Oracle Data Cloud revenue management framework.
  • Led the creation and execution of ETL packages, efficiently loading data into reporting databases to support daily revenue calculations and payment distributions to partners.
  • Supported the production of global revenue reports, ensuring accuracy and timely availability of financial data for strategic decision-making.
  • Optimized and simplified data structures, enhancing performance and enabling more efficient data processing and analysis.
  • Extracted, integrated, and analyzed complex datasets from various sources, driving business excellence and supporting informed decision-making.
  • Designed and developed Power BI and Oracle Analytics Cloud dashboards for internal stakeholders, providing insights into revenue streams for advertisers, agencies, and platform usage.
  • Resolved user queries and technical issues through JIRA, addressing data mapping and reporting pipeline issues by writing and optimizing complex SQL queries.
  • Created and altered stored procedures and developed automated SQL jobs to streamline data processes and improve reporting efficiency (a sketch of this pattern follows the list below).
  • Collaborated with cross-functional teams to ensure the accuracy and integrity of financial data across the organization.
  • Provided technical support and guidance on data-related issues, contributing to the overall improvement of Oracle's financial data management practices.
  • Continuously improved data management processes, implementing best practices for data extraction, transformation, and loading (ETL).
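
A minimal sketch of the kind of automated load referenced above ("stored procedures" and "SQL automated jobs"), assuming hypothetical staging and reporting tables; the procedure, table, and column names are illustrative, not Oracle's actual schema:

    -- Hypothetical daily revenue load procedure (Oracle PL/SQL style, names are assumptions).
    -- Aggregates staged transactions into a reporting table consumed by dashboards.
    CREATE OR REPLACE PROCEDURE load_daily_revenue (p_load_date IN DATE) AS
    BEGIN
        -- Clear any rows already loaded for the target date so re-runs stay idempotent
        DELETE FROM rpt_daily_revenue
         WHERE revenue_date = p_load_date;

        -- Aggregate staged transactions by partner for the target date
        INSERT INTO rpt_daily_revenue (revenue_date, partner_id, gross_revenue, payout_amount)
        SELECT p_load_date,
               t.partner_id,
               SUM(t.amount)                   AS gross_revenue,
               SUM(t.amount * t.partner_share) AS payout_amount
          FROM stg_transactions t
         WHERE t.transaction_date = p_load_date
         GROUP BY t.partner_id;

        COMMIT;
    END load_daily_revenue;
    /

A procedure like this could then be scheduled (for example via DBMS_SCHEDULER) to run after the nightly staging load, feeding the revenue dashboards described above.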

Business Analyst

Adobe Systems
Aug, 2019 - Apr, 2023 (3 yr 8 months)
  • Led the design and maintenance of Power BI dashboards to track agent performance KPIs, including productivity, CSAT (Customer Satisfaction), and headcount metrics.
  • Developed and automated agent scorecard reporting using Power BI and Advanced Excel, providing detailed insights into individual and team performance (see the query sketch after this list).
  • Conducted in-depth analysis of CSAT scores, identifying trends and areas for improvement in customer service and agent performance.
  • Performed headcount validation and reconciliation, ensuring data accuracy and alignment with business forecasts and reporting requirements.
  • Utilized SQL Server to extract, transform, and load data into Power BI for comprehensive analysis and visualization.
  • Forecasted customer volume for monthly, quarterly, and annual planning, using historical data and predictive modeling techniques in Excel and Power BI.
  • Maintained existing Power BI dashboards, ensuring they accurately reflected real-time data on agent productivity and operational performance.
  • Collaborated closely with WFM and customer experience teams, translating their data needs into actionable insights and effective reporting solutions.
  • Enhanced data extraction processes from operational tools, ensuring seamless integration and accurate data flow into Power BI and other reporting platforms.
  • Streamlined reporting processes by automating repetitive tasks using Advanced Excel functions, VBA, and Power BI features, significantly reducing manual workload.
  • Provided technical support and training to team members and stakeholders, improving their understanding and utilization of Power BI and Excel for reporting purposes.
  • Managed ad-hoc data queries and reporting requests, delivering timely and accurate insights to support decision-making across the organization.
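
A rough, hedged example of the kind of SQL Server extract that could feed such an agent scorecard; the table and column names are assumptions for illustration, not Adobe's actual schema:

    -- Hypothetical agent scorecard aggregation (SQL Server / T-SQL, names are illustrative).
    -- Produces contacts handled, average CSAT, and handle time per agent per day for Power BI.
    SELECT a.agent_id,
           a.agent_name,
           CAST(c.contact_start AS DATE)                          AS contact_date,
           COUNT(*)                                               AS contacts_handled,
           AVG(CAST(c.csat_score AS DECIMAL(4, 2)))               AS avg_csat,
           SUM(DATEDIFF(MINUTE, c.contact_start, c.contact_end))  AS handle_minutes
      FROM contacts c
      JOIN agents a
        ON a.agent_id = c.agent_id
     WHERE c.contact_start >= DATEADD(DAY, -30, CAST(GETDATE() AS DATE))
     GROUP BY a.agent_id,
              a.agent_name,
              CAST(c.contact_start AS DATE);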

Senior Business Analyst

MetLife GOSC
May, 2012 - Aug, 2019 (7 yr 3 months)
  • Led the development of Power BI dashboards for financial and operational reporting, aligning with stakeholder requirements and business goals.
  • Conducted thorough data analysis of reinsurance contracts, translating findings into actionable insights for reporting.
  • Managed data extraction and ETL processes from SQL Server and other databases, supporting seamless data integration and reporting.
  • Developed and automated reports using Advanced Excel, VBA, and Power BI, improving reporting efficiency and accuracy.
  • Supported the design and maintenance of dashboards for the Ceded Re function, ensuring robust data visualization and reporting.
  • Documented all reporting processes, ensuring a clear audit trail and compliance with internal and external standards.
  • Maintained effective communication with onshore teams, managing expectations and delivering timely updates on reporting activities.
  • Identified and implemented process improvements, streamlining workflows and enhancing data accuracy and reporting capabilities.
  • Handled ad-hoc queries and managed department procedure manuals, ensuring comprehensive process documentation and consistency.

Achievements

  • Improved data accuracy and efficiency in revenue reporting, contributing to better financial decision-making.
  • Enhanced visibility into agent productivity and customer satisfaction, leading to data-driven improvements in service quality and operational performance.
  • Improved accuracy in headcount reporting and enabled proactive workforce planning, aligning staffing with forecasted demand.
  • Identified key drivers of customer satisfaction and informed strategies to enhance service quality and customer experience.

Major Projects

4 Projects

Revenue Management Dashboard - Oracle

    Objective: Streamline revenue calculations and payment distributions using Power BI.
    Role: Designed and developed Power BI dashboards, integrated ETL processes, and provided data analysis for revenue management.
    Achievements: Improved data accuracy and efficiency in revenue reporting, contributing to better financial decision-making.

Agent Performance Dashboard - Adobe

    Objective: Track and analyze agent performance metrics to improve operational efficiency and customer satisfaction.
    Role: Developed and maintained Power BI dashboards, automated scorecard reporting, and performed detailed data analysis.
    Achievements: Enhanced visibility into agent productivity and customer satisfaction, leading to data-driven improvements in service quality and operational performance.

Headcount Reconciliation & Forecasting - Adobe

    Objective: Ensure accurate headcount reporting and forecast future staffing needs based on historical trends and operational requirements.
    Role: Validated and reconciled headcount data, forecasted customer volume using Excel and Power BI, and supported strategic workforce planning.
    Achievements: Improved accuracy in headcount reporting and enabled proactive workforce planning, aligning staffing with forecasted demand.

Customer Satisfaction Analysis - Adobe

    Objective: Analyze CSAT data to identify areas for improvement and drive enhancements in customer experience.
    Role: Conducted in-depth analysis of CSAT scores, developed Power BI visualizations, and provided actionable insights to the customer experience team.
    Achievements: Identified key drivers of customer satisfaction and informed strategies to enhance service quality and customer experience.

Education

  • Bachelor of Science

    Dr Bhimrao Ambedkar University

Certifications

  • Oracle Certified Professional (Oracle Analytics Cloud)

AI-interview Questions & Answers

Sure. I started my career with MetLife, where I worked for about 7.5 years as a business analyst / data analyst. In 2019 I joined Adobe Systems as a business analyst in the customer experience domain, where we tracked the performance of the agents resolving customer queries over call or chat. At Adobe I created several Power BI dashboards: we pulled data from different sources such as SQL Server, Excel, and JSON files, loaded those datasets into Power BI, and built visualizations based on stakeholder requirements. Most of the work was on agent performance dashboards, with KPIs such as productivity percentage, online percentage, and contacts handled per day and per hour. I also created an AUX usage dashboard, built entirely in Power BI. We regularly received requests from managers across different teams for specific agent data, so we would fetch that data from the relevant servers and create reports based on the requirement. That covers Adobe. In 2023 I joined Oracle as a senior business analyst. Here my role includes maintaining Power BI and OAC dashboards. We also receive requests from stakeholders to prepare reports for particular audiences and partners, and we provide insights along with them. At Oracle the requests come in through Jira; when we review a Jira request, we work out which KPIs and requirements are needed and deliver the data accordingly. I guess that's it from my end.

It totally depends on the requirements whether we use DirectQuery or Import mode. If we want to improve the performance of a dashboard or report, we prefer Import, because all the data is stored in the data model; with DirectQuery, on the other hand, the data stays on the server side. So if there is no need for a very frequent or urgent refresh of the data, we prefer Import. Import mode creates a cache in the Power BI model and stores the data there, so retrieving the data is much faster compared to DirectQuery.

Yes, I have worked on restricting data with row-level security. It depends on the user: if users belong to different regions or countries, we apply RLS based on that criterion using DAX. For example, if a user belongs to a particular region, we apply a DAX filter such as region = <that user's region>, and based on the role we restrict them to that specific data. Row-level security is basically of two types: static and dynamic. In static RLS, we assign access based on the role, region, or dataset the user belongs to. In dynamic RLS, we resolve the user's identity with the USERPRINCIPALNAME() or USERNAME() DAX functions. Dynamic RLS helps when thousands of users are using the dashboard: we create an access table keyed on their email IDs, relate that table to our data model, and pass the username through DAX to enforce dynamic row-level security.
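
The dynamic setup described above hinges on a table mapping user identities to the rows they may see. A minimal, hypothetical sketch of such a mapping in SQL follows; in Power BI the actual enforcement would be a DAX role filtering this table with USERPRINCIPALNAME(), and all names below are illustrative assumptions:

    -- Hypothetical user-to-region mapping table backing dynamic row-level security.
    CREATE TABLE user_region_access (
        user_email  VARCHAR(255) NOT NULL,
        region_code VARCHAR(10)  NOT NULL,
        PRIMARY KEY (user_email, region_code)
    );

    INSERT INTO user_region_access (user_email, region_code)
    VALUES ('analyst.one@example.com', 'EMEA'),
           ('analyst.two@example.com', 'APAC');

    -- Rows a given user is allowed to see, resolved through the mapping table.
    -- In Power BI this join is replaced by a model relationship plus a role filter.
    SELECT s.*
      FROM sales s
      JOIN user_region_access u
        ON u.region_code = s.region_code
     WHERE u.user_email = 'analyst.one@example.com';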

No, I don't have an answer for that right now. We implement row-level security and object-level security, but I didn't completely understand this question.

What are the key considerations you would recommend? Most of the time, when working with complex DAX, we make sure the measure is concise and uses variables, and we avoid DAX expressions that consume most of the memory. If certain measures are heavy on memory or cache, we check them with the Performance Analyzer, and if they are taking too much memory we try to find alternative DAX expressions that use less memory than the others.

For a slow-running complex DAX measure, how do you determine whether it should be optimized or rewritten? If the DAX we have created is taking too much time to execute, we check its performance in the Performance Analyzer; if it is not performing well and is taking too much time, we try to replace it with better DAX.

Review the SQL query snippet below; there is a potential performance issue with it (the filter on customers.country). Here we are using a wildcard character to search for a specific country. We can replace that with an equality filter, country = 'Germany', which helps us find the rows where customers.country is Germany. Second, if needed, instead of SELECT * (the asterisk) we should fetch only the columns required in the report as per the requirement, for example product ID, customer ID, and revenue.
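
A small sketch of the rewrite described in this answer, using hypothetical table and column names (only customers.country and the 'Germany' filter come from the question itself):

    -- Original pattern implied by the question: SELECT * with a wildcard filter.
    -- SELECT * FROM customers WHERE customers.country LIKE '%Germany%';

    -- Rewritten as described: equality filter and only the required columns.
    -- Column names other than country are illustrative assumptions.
    SELECT c.customer_id,
           c.customer_name,
           c.country
      FROM customers c
     WHERE c.country = 'Germany';  -- an exact match can use an index on country,
                                   -- unlike a leading-wildcard LIKE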

I don't have an answer for that. Oh, let me try, I guess.

I don't have an answer for that one either.