How to Stand Out as a Data Analyst: Focus on These 5 Key Themes

SQL, Python, and other technical skills are critical, but they are only half the battle

Abhi Sawhney
Towards Data Science



Introduction

What separates a standout data analyst from a good one? I often think about this as I reflect on my own experiences leading and learning from a number of analysts over the years. The answer always seems to tie back to specific themes around mindset, approach, and systems, as opposed to mastery in any one technical domain.

A strong technical skill set is significantly more effective when combined with habits across the themes described below. I will use a few examples to highlight how anyone, with a certain amount of planning and care, can incorporate similar behaviors into their own workflows. Some of these themes are related, but we will go through each of them separately for clarity. The examples are based on analytics in the music streaming domain, but the underlying takeaways are applicable across industries.

(1) Individual Systems and Processes

Given limited bandwidth and a seemingly endless supply of incoming requests, it is understandable that data analysts often find themselves in a constant cycle of jumping from one operational task to the next.

Standout data analysts are aware of this pattern and know when to jump out and focus on increasing their overall efficiency instead. They are not afraid to invest time to build systems and processes upfront if it saves them time in the future.

Let us say the executive team regularly asks you to explain why key business metrics moved up or down. On any given day, you could receive questions along the lines of:

  1. “It looks like our overall streams dropped a lot yesterday. Is that accurate? What happened?”
  2. “I was just looking at a view of our daily listener trend for the past year and noticed several extreme values scattered throughout the year. Could you explain what happened on each of those dates? Apologies for the short notice, but I need this for an upcoming meeting. Can you please get back to me by the end of the day tomorrow?”

Both of these questions are instances of a common underlying theme:

Explain why the value of a key metric changed more or less than expected (quickly, please!)

Because you are a savvy data analyst, you recognize this theme and decide to set up a couple of systems to assist you with this recurring question type in the future.

System #1

A simple spreadsheet, updated automatically each day, with the following information:

a) Daily values for the core business metrics

b) Day-over-day percentage change in each metric, along with z-scores (or another relevant significance check)

c) A “Notes” column for manual comments. You can use it to call out internal or external events that impacted one or more core metrics on a given date. For example, “Internal engineering issue: mobile app was down for 4 hours, negatively impacting daily listener and stream counts.”
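As a rough illustration, the change-and-significance columns in System #1 can be computed with a few lines of Python. Everything here is hypothetical: the metric values, the 2-sigma threshold, and the assumption that a scheduled job feeds in real warehouse data.

```python
from statistics import mean, stdev

# Hypothetical daily stream counts (most recent day last); in practice a
# scheduled job would pull these from your warehouse.
daily_streams = [1020, 1015, 1040, 1038, 1052, 1049, 1060, 980]

# Day-over-day percentage change for each day after the first.
pct_changes = [
    (curr - prev) / prev * 100
    for prev, curr in zip(daily_streams, daily_streams[1:])
]

# Z-score of the latest change relative to the history of earlier changes,
# flagging moves that are unusually large for this metric.
mu, sigma = mean(pct_changes[:-1]), stdev(pct_changes[:-1])
latest_z = (pct_changes[-1] - mu) / sigma

row = {
    "metric": "daily_streams",
    "latest_value": daily_streams[-1],
    "pct_change": round(pct_changes[-1], 2),
    "z_score": round(latest_z, 2),
    "significant": abs(latest_z) > 2,  # simple threshold; tune per metric
}
print(row)
```

With these sample numbers, the final drop from 1,060 to 980 streams is flagged as significant, which is exactly the kind of row you would annotate in the “Notes” column.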

System #2

You begin developing a simple process flow that you can repeatedly use to narrow down the potential reasons behind a metric drop. The process is made up of a number of standard questions. Your responses to these questions guide the next step.

The flow won’t be perfect for every situation, but it will serve as a good overall framework and speed up your analysis. You treat this system as a work in progress, iterating and improving it over time.

Here is what it might look like for a drop in the daily streams metric:

[Flowchart: a sample diagnostic flow for a drop in daily streams. Image by Author]
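A process flow like this can also be sketched in code, with each standard question becoming a check that routes you to a next step. The questions, field names, and wording below are illustrative, not a definitive runbook.

```python
# Hypothetical diagnostic flow for a drop in daily streams, encoded as an
# ordered sequence of checks; each answer narrows down the likely cause.
def diagnose_stream_drop(context: dict) -> str:
    """Walk through the standard questions and return a likely next step."""
    if context.get("known_tracking_issue"):
        return "Data quality: confirm with engineering before analyzing further."
    if context.get("drop_limited_to_one_platform"):
        return "Platform-specific: check recent releases or outages on that platform."
    if context.get("drop_limited_to_one_market"):
        return "Market-specific: check local holidays, outages, or competitor events."
    if context.get("seasonal_pattern_matches"):
        return "Seasonality: compare against the same period in previous years."
    return "Broad, unexplained drop: escalate and segment by listener cohort."

# Example walk-through: the drop only shows up on the mobile app.
print(diagnose_stream_drop({"drop_limited_to_one_platform": True}))
```

Encoding the flow as a function is one way to treat it as a work in progress: each time the framework falls short, you add or reorder a check.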

To end this section, I wanted to share a quote I recently came across in James Clear’s book, Atomic Habits. It articulates the importance of the “Individual Systems and Processes” theme better than I ever could:

“You do not rise to the level of your goals. You fall to the level of your systems.”

(2) Knowing Your 20

Most of us are familiar with the Pareto Principle, or the 80/20 rule. At its simplest, it highlights how a small number of items can have an outsized impact on the final objective or outcome. For example:

  • 80% of your streams will come from just 20% of your listeners
  • 80% of your revenue will come from just 20% of your clients

The 80 and 20 are approximations, but they help highlight the larger point. This principle is just as appropriate when evaluating the most impactful areas of your work. There will always be a small set of definitions, metrics, queries, or other items that are the most critical to know, inside out, at any given point in time.

Data analysts who are quick to identify what this 20 is for them, and take the time to learn it deeply, stand out. Let us assume that you recently began working with the Paid Subscriptions team. This team is responsible for growing the number of paying customers and the overall subscription revenue for the company. They rely heavily on your analytical opinion and actively ask you for data and guidance during team discussions.

To contribute meaningfully to these sessions, you realize that there are a core set of data points that you need to know at your fingertips. These data points are so fundamental to this domain that they come up in almost every discussion. You put together a list of what you think these data points are and block 20–30 minutes every day to study them. You keep repeating this process until the data and underlying insights come to you automatically.

The actual list of “the 20” will vary based on industry and team context but here is an example of what it might look like working with the Paid Subscriptions team:

  • Total number of paying subscribers and the one-year trend
  • Top five markets by paying subscribers
  • Top five subscription plans (e.g., Monthly Student Plan, Annual Family Plan)
  • Number of new, returning, and reactivated paying subscribers
  • Retention rate (or churn) across the different plans
  • Free Trial to Paid Subscriber conversion rate
  • Financial metrics such as CAC, LTV, and ARPU of a paying subscriber

At first glance, this may look like a lot of information to remember, but it becomes very doable with some attention, planning, and repetition. It is less important to know the exact numbers off the top of your head than it is to have a strong sense of their ballpark levels and relative rankings. Since the actual values for a lot of these metrics are unlikely to change drastically week over week, the time spent on learning them will serve you well for an extended period of time.
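If it helps to make the list concrete, many of these data points can be refreshed with a short script before each study session. A minimal sketch in plain Python, with a handful of hypothetical records standing in for a real subscriptions table:

```python
from collections import Counter

# Hypothetical subscriber records; field names and values are illustrative
# stand-ins for what you would pull from the subscriptions tables.
subscribers = [
    {"market": "US", "plan": "Annual Family",      "from_trial": True,  "active": True},
    {"market": "US", "plan": "Monthly Student",    "from_trial": False, "active": True},
    {"market": "IN", "plan": "Monthly Student",    "from_trial": True,  "active": False},
    {"market": "DE", "plan": "Annual Family",      "from_trial": True,  "active": True},
    {"market": "IN", "plan": "Monthly Individual", "from_trial": False, "active": True},
]

# Total paying subscribers, and top markets ranked by active subscribers.
total_paying = sum(s["active"] for s in subscribers)
top_markets = Counter(s["market"] for s in subscribers if s["active"]).most_common(2)

# Free Trial to Paid Subscriber conversion: share of trial users still paying.
trials = [s for s in subscribers if s["from_trial"]]
trial_conversion = sum(s["active"] for s in trials) / len(trials)

print(total_paying, top_markets, round(trial_conversion, 2))
```

The point is not the script itself but the habit: regenerating the same small summary until the ballpark levels and relative rankings come to you automatically.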

This is not to say that you should never refer to relevant dashboards or reports in real-time. Doing that on occasion is necessary. However, it is much harder to get ahead of the game and actively participate in conversations if you have to refer to a document for every data point. Knowing your 20 inside out will help you avoid this and be more present.

(3) Anticipation & Proactivity

As analysts and data scientists, we are often quite excited about building models and making predictions. However, we don’t spend nearly enough time exercising this predictive ability when planning our own days or weeks. It helps to have a pulse on the items that haven’t been asked for yet, but are likely to be.

The ability to anticipate and proactively share an analysis, model, or simply an email, can go a long way in earning trust and building credibility. Let us go over a straightforward example of this in practice. Suppose you created an automated email report that multiple teams rely on to track daily growth and engagement metrics. When a metric in this email drops significantly, several teams pause their ongoing work to understand what happened.

Now, let us assume that it is right around Christmas, and you remember from a previous analysis that there is a seasonal drop in the number of listeners in the first week of every year. To avoid anxious colleagues reaching out to you for an explanation after the fact, you could get ahead of it and proactively send out an email with the required context before the drop takes place.
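One lightweight way to operationalize this anticipation is a check against prior years’ data that flags upcoming weeks with a known seasonal dip, so the context email goes out before the questions arrive. A sketch with illustrative numbers (the week-1 dip and the 5% threshold are both hypothetical):

```python
# Hypothetical average weekly listener counts by ISO week, built from prior
# years' data; the week-1 dip mirrors the seasonal pattern described above.
historical_weekly_avg = {52: 1_000_000, 1: 870_000, 2: 960_000}
baseline = 1_000_000  # typical weekly listener count

def weeks_needing_heads_up(threshold: float = 0.05) -> list[int]:
    """Return ISO weeks whose historical average sits well below baseline."""
    return [
        week for week, avg in historical_weekly_avg.items()
        if (baseline - avg) / baseline > threshold
    ]

# If the upcoming week appears here, send the context email before the drop.
print(weeks_needing_heads_up())
```

With these sample figures, only week 1 clears the threshold, which is your cue to draft the heads-up email in late December rather than field anxious questions in January.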

Communicating this may seem obvious in hindsight, but it often doesn’t happen in practice. Most of us do a great job responding to incoming questions or requests but pay less attention to proactive delivery. A simple shift in the timing of an action can dramatically change how well it is received.

Anticipation and proactivity are two sides of the same coin. Being able to anticipate a need without taking any proactive action will have no external impact. It may help you validate your thought process but will do little else. Similarly, to be proactive in a manner that is useful and relevant, anticipation is key.

(4) Empathy

Given how much has already been written about the importance of communication as a data analyst, I want to focus our attention on a quality that is a prerequisite for effective communication — empathy.

The value of being empathetic as a data professional cannot be overstated. A deep understanding of your stakeholders’ priorities, pain points, strengths, and weaknesses will increase your effectiveness significantly.

Imagine you are a data analyst working with the Editorial team. This team is responsible for curating playlists across various genres, moods, and other topics. The head of this department reaches out to you with: “Hey, can you please share a list of all playlists by their number of weekly streams and listeners? Thank you!”

Instead of jumping right into a SQL query or a dashboard, you dig deeper to understand the underlying motivation behind the request. Through clarifying questions and further conversation, it becomes clear that the team is actually trying to determine which playlists have the most repeat listening and engagement, so they can create more similarly themed playlists.

With this deeper understanding of the team’s need, you decide to add playlist listener retention and skip rate metrics to your report as well. In addition, since each Editor focuses on curating playlists for a specific language, you also make the report filterable by language.

In this situation, simply providing what was initially asked for may have been satisfactory to the Editorial team, but it would have resulted in an inferior end product. Being empathetic to the team’s needs improved the quality and actionability of your work.

(5) Consistent Learning

I am sure we have all responded to or heard someone respond to an interview question with something along the lines of, “I am a quick learner” or “I don’t know that, but I can learn it.”

Most data analysts convey the desire to learn a new skill or build expertise in a new area. However, it is not the desire or curiosity to learn, but the approach to learning that is the key differentiator. Individuals who treat learning as an ongoing journey instead of an occasional ad-hoc task stand out over extended periods as their knowledge and skill set compound.

For many, the act of learning is completely dependent on whether there is a need for it at work. This, however, gives your work all the control over what and how much you learn. This is not a reliable long-term strategy. It may work well in the first few months of a new job where there is a lot to learn, but it will become less effective over time.

In contrast, an “always on” learning mindset would mean consistently spending a small amount of time daily or weekly getting better at something. This something may or may not be tied to what you are doing at work at the time. This makes the relationship between work and learning less rigid or transactional and more sustainable over the years. In certain cases, work will dictate what you need to learn, while in others, what you learn will influence your roadmap at work.

Conclusion

In summary, most of us tend to do well at staying up to speed on technical tools and skills that could benefit our careers. As important as that is, it is not enough in isolation. Combining a strong technical skill set with habits and behaviors across the five themes can be a game changer.

Thank you for reading!
