Vetted Talent

Ankit Singh

Dedicated Senior Business Analyst with 9.6 years of experience in the eCommerce and Insurance industries, proficient in Python, SQL, Tableau, Power BI, and Excel. Proven track record of delivering actionable insights through data-driven analysis. Seeking opportunities to leverage my analytical skills and industry knowledge to drive business success.
  • Role

    Senior Business Analyst

  • Years of Experience

    9.6 years

Skillsets

  • C++
  • Power BI
  • Network Performance Analysis
  • MapReduce
  • JSON
  • JavaScript
  • Java
  • Hadoop
  • Excel
  • Data Visualization
  • Data Structures
  • Data Governance
  • C
  • Algorithms
  • Tableau - 8 Years
  • SQL - 8 Years
  • Python - 2 Years
  • Cost-Benefit Analysis
  • Data Cleaning
  • Statistical Analysis
  • Customer Segmentation
  • Predictive Modeling

Vetted For

9 Skills
  • Roles & Skills
  • Results
  • Details
  • Business Data Analyst (AI Screening)
  • 67%
  • Skills assessed: Power BI, SSRS, Business Intelligence, Problem Solving, Data Analysis, Project Management, Quality Assurance, SQL, Statistics
  • Score: 60/90

Professional Summary

9.6 Years
  • Jun, 2022 - Present (3 yr 9 months)

    Senior Business Analyst

    Saras Solutions India
  • Sep, 2017 - Jun, 2022 (4 yr 9 months)

    Application Consultant

    SSP India
  • Oct, 2015 - Jul, 2017 (1 yr 9 months)

    Assistant Systems Engineer

    Tata Consultancy Services

Applications & Tools Known

  • Python
  • SQL
  • Tableau
  • Power BI
  • Excel
  • Hadoop
  • JSON

Work History

9.6 Years

Senior Business Analyst

Saras Solutions India
Jun, 2022 - Present (3 yr 9 months)

  • Conducted exploratory data analysis to identify areas for improvement in network performance and customer experience.
  • Developed Python scripts for data cleaning, preprocessing, and statistical analysis.
  • Utilized Excel for ad-hoc analysis and reporting to assist senior management in strategic decision-making.
  • Assisted in the design and implementation of automated data pipelines to streamline data extraction and reporting processes.
  • Managed data governance practices to ensure data accuracy, consistency, and compliance with industry standards.
  • Presented data findings and recommendations to executives and key stakeholders in a clear and concise manner.
  • Created and automated interactive dashboards and reports using Tableau and Power BI to provide real-time performance insights.
  • Mentored junior analysts and conducted training sessions on data analysis techniques and tools.

Application Consultant

SSP India
Sep, 2017 - Jun, 2022 (4 yr 9 months)

  • Analyzed large datasets to extract meaningful insights, trends, and patterns that drive informed business decisions.
  • Developed and maintained SQL queries to extract, manipulate, and transform data from various sources.
  • Collaborated with cross-functional teams to identify key performance indicators (KPIs) and design custom metrics for measuring business performance.
  • Provided data-driven recommendations to optimize network performance, reduce churn, and improve customer satisfaction.

Assistant Systems Engineer

Tata Consultancy Services
Oct, 2015 - Jul, 2017 (1 yr 9 months)

  • Extracted, cleaned, and transformed data from various sources to support business intelligence initiatives.
  • Collaborated with cross-functional teams to understand business requirements and translate them into data analysis tasks.
  • Developed SQL queries and procedures to extract and manipulate data for reporting and analysis.
  • Created Excel-based dashboards and reports for tracking key performance metrics and monitoring business operations.
  • Assisted in the development of Tableau dashboards for visualizing network performance data.
  • Conducted statistical analysis to identify factors affecting network reliability and quality of service.

Achievements

  • Data Analysis in Excel
  • Analytics Problem Solving
  • Data Analysis using SQL
  • Introduction to Python
  • Programming in Python
  • Python for Data Science
  • Inferential Statistics
  • Hypothesis Testing
  • Introduction to Machine Learning and Linear Regression
  • Logistic Regression
  • Tree Models

Education

  • Executive PG Diploma in Data Science (Business Analytics)

    IIIT Bangalore & upGrad (2022)
  • B.E. in Computer Science & Engineering

    SSGMCE (2015)

Certifications

  • Executive PG Diploma in Data Science (Business Analytics)

AI-interview Questions & Answers

Could you help me understand more about your background? My name is Ankit Singh. I have 8 years of experience in the eCommerce and insurance domains. The most recent role is in eCommerce, in sales analytics, where I build dashboards on Tableau and Power BI for multiple US clients. The role covers everything from requirement gathering to delivery: we hold kickoff calls and business workshops with the client to discuss all the requirements and get sign-off on them, and we discuss the dashboard design as well. We build the data pipelines and then use them in the Tableau or Power BI dashboards as per the customer's requirements. The warehouse we used was Google BigQuery, which is queried much like standard SQL, and we used dbt Cloud for the ELT process. Before that, I spent about 5 years at SSP India Private Limited in the insurance domain, where we tracked customers' policies, renewals, and so on, and I worked on some Tableau dashboards. My earliest 2 years of experience were at Tata Consultancy Services, where, similarly, we handled changes assigned to us for multiple clients' Tableau dashboards. That is pretty much my background.

Implement a test plan to validate the accuracy of an ETL process. Take Amazon as an example: say we are validating Amazon sales data for a particular customer. We download the reports directly from the client's platform or from Amazon, manually calculate the numbers in Excel, and then check whether they match the final table in our warehouse. If the numbers do not match, there may be a mistake in the calculation or a coding error we introduced, which we then fix. This is one of the test plans we use to validate the accuracy of an ETL process.
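The reconciliation check described above can be sketched in a few lines of Python. This is a minimal illustration, assuming the exported report rows and the warehouse aggregate have already been pulled into Python; the figures are made up for the example.

```python
# Reconciliation check: compare the total computed from a manually
# downloaded report against the aggregate stored in the warehouse table.
report_rows = [120.50, 99.99, 310.00]   # rows exported from the client platform
warehouse_total = 530.49                # value returned by the warehouse query

# Recompute the total from the raw export, rounding to avoid float noise.
report_total = round(sum(report_rows), 2)

if report_total == warehouse_total:
    result = "PASS: warehouse matches source report"
else:
    result = f"FAIL: report={report_total} warehouse={warehouse_total}"
print(result)
```

In practice the mismatch branch would trigger a review of both the Excel calculation and the pipeline code, as the answer describes.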

Given a complex dataset, what steps do you take to prepare it for use in a Power BI dashboard? The first step is to check for unacceptable values: if we are expecting a string, whether we should expect nominal or null values there; if we are expecting integers, whether only integer values are present or some string values have crept in. So first we check all the columns for the types of values we are getting, run distinct-value checks, look at the range of values to spot outliers, and run null-value checks. After that we do the exploratory data analysis, transform the data according to our needs, and then it is ready to use in a Power BI dashboard.
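The column checks listed above (type validation, null counts, distinct values) can be sketched with plain Python. This is an illustrative profiling helper, not a real pipeline; the sample rows and column names are invented for the example.

```python
# Basic per-column data-quality checks before loading data into a dashboard:
# count nulls, count type violations, and summarize distinct values.
rows = [
    {"region": "US", "sales": 100},
    {"region": "EU", "sales": None},      # null value to be flagged
    {"region": "US", "sales": "n/a"},     # wrong type to be flagged
]

def profile(rows, column, expected_type):
    values = [r[column] for r in rows]
    nulls = sum(v is None for v in values)
    bad_type = sum(v is not None and not isinstance(v, expected_type)
                   for v in values)
    distinct = {v for v in values if v is not None}
    return {"nulls": nulls, "bad_type": bad_type, "distinct": len(distinct)}

summary = profile(rows, "sales", int)
print(summary)  # {'nulls': 1, 'bad_type': 1, 'distinct': 2}
```

A real workflow would run a check like this per column, then decide how to transform or drop the offending values before the data reaches Power BI.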

How do you coordinate with stakeholders to convert business requirements into technical specifications, particularly focused on data needs? We conduct business workshops to discuss their data needs, create the business requirement document, and get their sign-off. Essentially, what we want to know from the business stakeholders is what the final table, the presentation-layer table structure, should look like. We then transform the raw data into that final table format and use it in the Power BI dashboard. That is where our focus is.

How do you balance project coordination with hands-on technical work such as writing SQL queries and performing QA activities? These things go hand in hand. Project coordination means setting up meetings, daily huddles with the team, and weekly huddles with the customer to share progress, while writing SQL queries and performing QA activities continue every day alongside that. So there is no issue there.

If you found a performance problem in an SSRS report, how would you identify and resolve it? I haven't worked on SSRS reports, but I have faced a similar performance problem in one of my Tableau dashboards. I checked online how to resolve it and found that Tableau has a feature to record performance. I recorded it and saw, for example, that a manual sort takes more time. That is how we can assess performance issues in general: track which processes are taking more time, then work on those and resolve the issue.

Examining the QA test script written in pseudocode below, can you spot any shortcomings in the testing logic that might let bugs slip through? (Verify successful login: input correct username into username field, input correct password into password field, click login button, check if user is redirected to the dashboard page.) The shortcoming is that only the happy path is tested; we should also check for incorrect inputs. For example: click the login button without entering anything and see whether the user is still redirected to the dashboard; enter the correct password in the username field and the correct username in the password field and then click login; input incorrect usernames and passwords; or leave the fields blank. There are multiple negative cases we should test to confirm the login behaves correctly.
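The extra cases named in the answer can be written as a small parameterized test table. The `login` function below is a stand-in for the real authentication call, and the credentials are hypothetical.

```python
# Parameterized login tests covering the negative cases missing from the
# original script: wrong password, wrong username, swapped fields, blanks.
def login(username, password):
    # Stand-in for the real authentication call (hypothetical credentials).
    return username == "alice" and password == "s3cret"

cases = [
    ("alice",  "s3cret", True),    # happy path (the only case tested originally)
    ("alice",  "wrong",  False),   # wrong password
    ("bob",    "s3cret", False),   # wrong username
    ("s3cret", "alice",  False),   # username and password swapped
    ("",       "",       False),   # blank inputs
]

for user, pwd, expected in cases:
    assert login(user, pwd) is expected, (user, pwd)
print("all login cases passed")
```

Each row is one scenario; adding a new edge case is just another tuple in the table, which is what keeps this style of test easy to extend.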

Looking at this SQL query, can you spot the logical mistake and explain what impact it might have on the result set? (SELECT id, first_name, last_name FROM users WHERE is_active = 1 AND created_date > a 2023 date OR updated_date < a 2022 date.) From the query, I understand we are checking that is_active is 1 and the created date is greater than a 2023 date, or the updated date is less than a 2022 date. One logical mistake is that the created date cannot be greater than the updated date. Also, if the user is trying to filter one year of data, between 2022 and 2023, then AND should be used instead of OR; with the OR condition the query brings in every row matching either condition, which widens the result set. Otherwise I don't see a problem with the query itself; it depends on the intention of what the user wants to do.
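The OR-versus-AND effect described above can be demonstrated with an in-memory SQLite table. The table, the rows, and the cutoff dates here are illustrative, not the ones from the interview question.

```python
import sqlite3

# Demonstrates how OR instead of AND widens the result set in a
# WHERE clause like the one discussed above (data is illustrative).
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE users "
            "(id INT, is_active INT, created_date TEXT, updated_date TEXT)")
con.executemany("INSERT INTO users VALUES (?,?,?,?)", [
    (1, 1, "2023-06-01", "2023-07-01"),  # created after the 2023 cutoff
    (2, 1, "2021-01-01", "2021-06-01"),  # updated before the 2022 cutoff
    (3, 0, "2023-06-01", "2023-07-01"),  # inactive: excluded either way
])

or_rows = con.execute(
    "SELECT id FROM users WHERE is_active = 1 "
    "AND (created_date > '2023-01-01' OR updated_date < '2022-01-01')"
).fetchall()
and_rows = con.execute(
    "SELECT id FROM users WHERE is_active = 1 "
    "AND (created_date > '2023-01-01' AND updated_date < '2022-01-01')"
).fetchall()

print(len(or_rows), len(and_rows))  # 2 0
```

The OR form matches two rows, while the AND form matches none, because a row created after 2023 cannot also have been updated before 2022. That is exactly the contradiction the answer points out.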

How did you approach the design and execution of a full test plan for a large-scale web application project? I would use a top-down approach: first define the high-level targets, what the web application should be about, then break it down into the small functionalities it will have, and then write test cases for each functionality. Once those test cases pass, we can say the web application is ready.

What Power BI visualization techniques do you use to convey complex data patterns to nontechnical stakeholders? There are line charts, bar charts, and various other types of graphs we can show to the customer or business stakeholders to surface different data patterns. For example, where there is a spike in a line chart, we can explain to the business stakeholders why it is there. That way we can convey the patterns to a nontechnical audience.

Describe how you used ETL tools to enhance data analysis and reporting in your previous projects. In my previous company, we used dbt Cloud. We added null checks as an extra validation layer so that the data is not corrupt at the initial stage, which gives us better clarity downstream. That is what we implemented in our previous project.
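The null check mentioned above is expressed declaratively in dbt; as a rough illustration of what such a check does, here is the same idea in plain Python. The column name and rows are made up for the example.

```python
# A simple not-null check of the kind added in the ELT layer: flag any
# rows where a required column is missing (rows are illustrative).
rows = [{"order_id": 1}, {"order_id": None}, {"order_id": 3}]

null_count = sum(r["order_id"] is None for r in rows)
status = "FAIL" if null_count else "PASS"
print(f"not_null check on order_id: {status} ({null_count} nulls)")
```

A failing check like this would stop bad data before it reaches the reporting layer, which is the "better clarity" the answer refers to.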