Data Analyst Interview — SQL, Statistics, Stakeholder Communication
Data-analyst interviews are heavily SQL-driven, lighter on ML, and put significant weight on stakeholder communication and metric design. The role overlaps with product DS but typically demands less modelling and more business reasoning, dashboarding, and ad-hoc investigation.
SQL test (45–60 min). Often the longest round. Window functions, CTEs, anti-joins, percentile queries, cohort retention, sessionization.
Case / metric design (45 min). 'Design metrics for a new feature' or 'investigate why metric X moved'. Tests structured thinking and business judgement.
Tooling / dashboarding (30 min). Tableau/Looker/Mode familiarity, dashboard design principles, pitfalls of bad visualizations.
Behavioural (45 min). Stakeholder management, ambiguity, when analysis disagreed with leadership, when you killed a project.
Top Data Analyst technical questions
These are pulled from interview-debrief patterns we see most often across Data & Analytics roles. They are not memorization fodder — interviewers reword them constantly. Practice the underlying skill, not the wording.
Write a SQL query to find the second-highest salary per department.
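One common answer uses DENSE_RANK. Here is a sketch run against an in-memory SQLite database; the `employees(name, dept, salary)` schema and data are invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE employees (name TEXT, dept TEXT, salary INTEGER);
INSERT INTO employees VALUES
  ('Ada','eng',120), ('Bo','eng',110), ('Cy','eng',110),
  ('Di','sales',90), ('Ed','sales',80);
""")

# DENSE_RANK (not ROW_NUMBER) so a tie still makes the next *distinct*
# salary the "second-highest"; DISTINCT collapses tied rows in the output.
query = """
SELECT DISTINCT dept, salary
FROM (
  SELECT dept, salary,
         DENSE_RANK() OVER (PARTITION BY dept ORDER BY salary DESC) AS rk
  FROM employees
) ranked
WHERE rk = 2
ORDER BY dept
"""
rows = conn.execute(query).fetchall()
print(rows)  # → [('eng', 110), ('sales', 80)]
```

The same shape answers any "Nth-highest" variant: swap `rk = 2` for `rk = N`.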
Compute 30-day retention by signup cohort.
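Retention definitions vary by company, so state yours before writing SQL. One defensible sketch (cohort = signup month, "retained" = any event within 30 days after signup), with an invented `users`/`events` schema, run on SQLite:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE users  (user_id INT, signup_date TEXT);
CREATE TABLE events (user_id INT, event_date TEXT);
INSERT INTO users VALUES (1,'2024-01-05'), (2,'2024-01-20'), (3,'2024-02-03');
INSERT INTO events VALUES (1,'2024-01-25'), (2,'2024-03-01'), (3,'2024-02-10');
""")

# LEFT JOIN to the set of retained users, then AVG over a 0/1 flag
# gives the retained share per cohort directly.
query = """
SELECT strftime('%Y-%m', u.signup_date) AS cohort,
       ROUND(AVG(CASE WHEN r.user_id IS NOT NULL THEN 1.0 ELSE 0 END), 2)
         AS d30_retention
FROM users u
LEFT JOIN (
  SELECT DISTINCT u2.user_id
  FROM users u2
  JOIN events ev ON ev.user_id = u2.user_id
  WHERE julianday(ev.event_date) - julianday(u2.signup_date) BETWEEN 1 AND 30
) r ON r.user_id = u.user_id
GROUP BY cohort
ORDER BY cohort
"""
rows = conn.execute(query).fetchall()
print(rows)  # → [('2024-01', 0.5), ('2024-02', 1.0)]
```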
Find users whose first purchase was within 7 days of signup AND who came back in the next 30 days.
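This one combines an aggregate CTE (first purchase) with an EXISTS check (the return visit). A sketch on SQLite with invented tables, reading "came back" as any event 1–30 days after the first purchase:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE users     (user_id INT, signup_date TEXT);
CREATE TABLE purchases (user_id INT, purchase_date TEXT);
CREATE TABLE events    (user_id INT, event_date TEXT);
INSERT INTO users VALUES (1,'2024-01-01'), (2,'2024-01-01');
INSERT INTO purchases VALUES (1,'2024-01-04'), (2,'2024-01-03');
INSERT INTO events VALUES (1,'2024-01-20'), (2,'2024-03-15');
""")

query = """
WITH first_purchase AS (
  SELECT user_id, MIN(purchase_date) AS fp_date
  FROM purchases
  GROUP BY user_id
)
SELECT u.user_id
FROM users u
JOIN first_purchase fp ON fp.user_id = u.user_id
WHERE julianday(fp.fp_date) - julianday(u.signup_date) <= 7
  AND EXISTS (
    SELECT 1 FROM events e
    WHERE e.user_id = u.user_id
      AND julianday(e.event_date) - julianday(fp.fp_date) BETWEEN 1 AND 30
  )
"""
rows = conn.execute(query).fetchall()
print(rows)  # → [(1,)]  (user 2's return falls outside the 30-day window)
```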
Sessionize a clickstream where a session ends after 30 minutes of inactivity.
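The standard pattern is: LAG to find the gap to the previous click, flag gaps over 30 minutes as session starts, then a running SUM of the flags to number the sessions. A sketch on SQLite with an invented `clicks` table:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE clicks (user_id INT, ts TEXT);
INSERT INTO clicks VALUES
  (1,'2024-01-01 10:00:00'), (1,'2024-01-01 10:10:00'),
  (1,'2024-01-01 11:00:00'), (1,'2024-01-01 11:05:00');
""")

# julianday differences are in days, so 30 minutes is 30/1440.
# The first click per user (LAG is NULL) always opens a session.
query = """
WITH gaps AS (
  SELECT user_id, ts,
         CASE
           WHEN LAG(ts) OVER (PARTITION BY user_id ORDER BY ts) IS NULL THEN 1
           WHEN julianday(ts)
                - julianday(LAG(ts) OVER (PARTITION BY user_id ORDER BY ts))
                > 30.0 / 1440 THEN 1
           ELSE 0
         END AS new_session
  FROM clicks
)
SELECT user_id, ts,
       SUM(new_session) OVER (PARTITION BY user_id ORDER BY ts) AS session_id
FROM gaps
ORDER BY user_id, ts
"""
rows = conn.execute(query).fetchall()
print(rows)  # the 50-minute gap at 11:00 starts session 2
```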
Compute a 7-day rolling sum of daily active users.
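A window frame does this in one pass. Sketch on SQLite with an invented `daily_active` table; note the assumption baked into the frame, which interviewers often probe:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE daily_active (day TEXT, dau INT);
INSERT INTO daily_active VALUES
  ('2024-01-01',10),('2024-01-02',12),('2024-01-03',8),
  ('2024-01-04',15),('2024-01-05',11),('2024-01-06',9),
  ('2024-01-07',14),('2024-01-08',13);
""")

# ROWS BETWEEN 6 PRECEDING assumes exactly one row per calendar day;
# with missing dates you would first join to a date spine.
query = """
SELECT day,
       SUM(dau) OVER (ORDER BY day
                      ROWS BETWEEN 6 PRECEDING AND CURRENT ROW) AS rolling_7d
FROM daily_active
"""
rows = conn.execute(query).fetchall()
print(rows[-1])  # → ('2024-01-08', 82)
```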
When would you use HAVING vs. WHERE? Give a real example.
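The crisp answer: WHERE filters rows before aggregation, HAVING filters the aggregated groups. A minimal SQLite sketch with an invented `orders` table showing both in one query:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (customer TEXT, amount INT, status TEXT);
INSERT INTO orders VALUES
  ('a', 50, 'paid'), ('a', 70, 'paid'), ('a', 500, 'refunded'),
  ('b', 30, 'paid');
""")

# WHERE drops refunded rows before SUM ever sees them;
# HAVING then filters on the aggregate itself.
query = """
SELECT customer, SUM(amount) AS total
FROM orders
WHERE status = 'paid'
GROUP BY customer
HAVING SUM(amount) >= 100
"""
rows = conn.execute(query).fetchall()
print(rows)  # → [('a', 120)]  (the 500 refund never counted)
```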
Why would you use a self-join? Walk through one practical use case.
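A classic practical case is hierarchy lookup: an `employees` table that references itself via `manager_id`, aliased twice so one copy plays "report" and the other "manager". Sketch on SQLite with invented data:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE employees (id INT, name TEXT, manager_id INT);
INSERT INTO employees VALUES (1,'Ada',NULL), (2,'Bo',1), (3,'Cy',1);
""")

# Same table on both sides of the join; only the aliases differ.
query = """
SELECT e.name AS report, m.name AS manager
FROM employees e
JOIN employees m ON m.id = e.manager_id
ORDER BY e.name
"""
rows = conn.execute(query).fetchall()
print(rows)  # → [('Bo', 'Ada'), ('Cy', 'Ada')]
```

A LEFT JOIN variant would also keep Ada, who has no manager.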
We see DAU up 5% but revenue flat. Walk me through your investigation.
What's wrong with this dashboard? (interviewer shows a stacked-bar chart of sessions over time)
How would you measure success of a free-trial-to-paid funnel?
Explain Simpson's paradox with a concrete example.
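A concrete version you can do with mental arithmetic: made-up counts patterned on the classic kidney-stone study, where treatment A wins inside every subgroup yet loses overall because it was given mostly to the hard cases.

```python
# (successes, attempts) per treatment within each subgroup — invented
# numbers chosen to reproduce the reversal.
groups = {
    "small_stones": {"A": (81, 87),   "B": (234, 270)},
    "large_stones": {"A": (192, 263), "B": (55, 80)},
}

def rate(sn):
    return sn[0] / sn[1]

for name, g in groups.items():
    print(f"{name}: A={rate(g['A']):.0%} vs B={rate(g['B']):.0%}")  # A higher in both

totals = {t: (sum(g[t][0] for g in groups.values()),
              sum(g[t][1] for g in groups.values()))
          for t in ("A", "B")}
print(f"overall: A={rate(totals['A']):.0%} vs B={rate(totals['B']):.0%}")
# Overall, B looks better: the subgroup mix (case difficulty) is the confounder.
```

In an interview, naming the confounder and saying which level of aggregation answers the business question is what earns the points.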
How do you handle missing data when computing churn?
Behavioural questions
Tell me about an analysis whose result surprised your stakeholders. How did you communicate it?
Describe a time leadership wanted a different conclusion than what the data showed.
Walk me through the most painful data quality issue you've debugged.
When have you had to push back on an unrealistic ask?
Preparation tips for Data Analyst candidates
**Drill SQL until it's automatic.** This round is often pass/fail on its own. Window functions especially — LAG, LEAD, ROW_NUMBER, RANK, NTILE.
**Frame every case answer.** Even quick investigations should start with 'I'd first segment by X to rule out Y, then look at Z'. Hand-wavy answers fail the round.
**Sharpen your statistics intuition.** Selection bias, Simpson's paradox, regression to the mean, novelty effect — name them when relevant.
**Practice the 'what's wrong with this dashboard' question.** Interviewers love bad-chart cleanup as a stand-in for product judgement.
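One window-function drill worth internalizing is how the three ranking functions diverge on ties, since interviewers often probe exactly that. A minimal SQLite sketch with an invented `scores` table:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE scores (name TEXT, pts INT)")
conn.executemany("INSERT INTO scores VALUES (?, ?)",
                 [("a", 90), ("b", 90), ("c", 80)])

# The functions differ only at the tie on 90 points:
query = """
SELECT name,
       ROW_NUMBER() OVER (ORDER BY pts DESC) AS row_num,  -- 1, 2, 3 (arbitrary tie-break)
       RANK()       OVER (ORDER BY pts DESC) AS rnk,      -- 1, 1, 3 (gap after tie)
       DENSE_RANK() OVER (ORDER BY pts DESC) AS drnk      -- 1, 1, 2 (no gap)
FROM scores
ORDER BY pts DESC, name
"""
rows = conn.execute(query).fetchall()
print(rows)
```

Note ROW_NUMBER's tie-break is arbitrary unless the ORDER BY is made unique, which is itself a common follow-up question.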
Practice with the AI mock interviewer
Panor's AI Job Assistant runs voice-based mock interviews tuned to the Data Analyst role. It ad-libs follow-up questions, calls out red flags in your answers, and produces a transcript with rubric-graded feedback. Resume × JD matching is also included — paste a target job description and the assistant rewrites your bullets in STAR format with keyword alignment scoring.
Strong candidates with relevant experience generally need 4–6 weeks of focused prep for a competitive Data Analyst loop. Career switchers should plan on 8–12 weeks, weighted heavily toward data & analytics fundamentals.
Do I need to grind LeetCode?
For most Data Analyst loops in 2026, depth on a curated set of 60–80 problems beats grinding 400. Focus on the patterns the questions above test, not problem volume.
Is the format the same at startups vs Big Tech?
No. Big Tech tends to over-index on coding and system design; startups put more weight on judgement, speed, and 'will this person carry the team'. Read the JD and ask the recruiter for the explicit loop structure — they will tell you.