
From Darwin to Deep Work

Focus Strategies for Machine Learning Practitioners

In December 1831, the HMS Beagle set sail on a nearly five-year journey around the world. Among its crew was 22-year-old Charles Darwin, eager to explore the tropical regions of the globe. During this voyage, Darwin collected fossils and developed a growing interest in the natural world, particularly in plants and animals. The trip profoundly shaped Darwin’s thinking and laid the groundwork for his seminal work, On the Origin of Species, eventually published in 1859, 23 years after the voyage’s end.

Photo by Torsten Dederichs on Unsplash

Writing a book of such groundbreaking importance required clarity of thought. Darwin adhered to a strict daily schedule to stay productive. He would rise at 7 a.m. and dedicate the hours from 8 a.m. to noon to uninterrupted study and writing. These morning sessions were reserved for his most cognitively demanding work: analyzing his findings and refining his theories. After lunch, he would mentally work through challenges on long walks. Darwin’s disciplined routine and deep focus enabled him to tackle complex problems over extended periods – a testament to a concept called "Deep Work."

Deep Work

The term deep work was coined by computer science professor Cal Newport in his book of the same name. Newport defines deep work as the ability to focus intensely on a single, cognitively demanding task. For Machine Learning (ML) practitioners, deep work is an especially critical and valuable skill.

Deep Work for ML practitioners

Machine learning research involves tasks that are intellectually demanding. Examples include reading papers, solving mathematical problems, and programming algorithms. Each of these activities is cognitively taxing, requiring full attention over long spans of uninterrupted time.

Take reading research papers, for instance. For beginners, it can take three hours or more to read a single paper end-to-end. Following the authors’ arguments and understanding the underlying concepts is mentally exhausting. If you’re interrupted during this process – whether by an email notification or a colleague dropping by – it disrupts your train of thought. Research has shown that returning to the original point of focus after an interruption takes significant time and effort [1][2].

Reading a paper is only one aspect of machine learning; programming and mathematics demand equally sustained focus. Writing code to implement an algorithm is a creative and demanding task. Interruptions during coding sessions not only break your concentration but can lead to errors that are frustrating and time-consuming to debug. As any math student can attest, working through complex mathematical proofs or equations demands sustained focus; distractions make it nearly impossible to follow the logical flow.

The Modern Workplace

While the ability to work deeply is essential, especially in the field of machine learning, modern workplaces offer a wide array of distractions. Offices are built on the idea of communication, and thus email and instant messaging systems like Teams are commonly used. Although these tools make collaboration easier, they also encourage constant communication. Even when you are deep into coding a new algorithm, it’s easy to give a shiny icon a single quick click, fragmenting your concentration.

Statistics suggest that the average worker spends about 50% of their day on email alone [3]. It’s not the case, as Cal Newport describes in Deep Work, that you can instantly switch your attention to answering these emails and then go directly back to coding. Quite the contrary: attention residue means that part of your mental capacity remains caught in the previous task (answering messages) when you return to your initial task (coding) [2].

This residue is amplified when you cannot finish the interrupting task (e.g., when a message you saw asks you to prepare a presentation for some event X), leaving unfinished fragments to linger in your mind. It’s unlikely that Darwin would have been able to think clearly in such an environment.

Deep Work Strategies in ML

So, how can ML practitioners find the time to do what they are good at in a state of deep focus? At first glance, it is appealing to disconnect entirely from email and messaging apps and focus exclusively on reading papers and coding ML algorithms. However, this is neither practical nor advisable. Communication is crucial in any workplace, and many ML projects benefit heavily from exchange with colleagues. Ignoring this reality entirely could lead to professional consequences, such as projects being cancelled.

Thus, instead of eliminating communication, the goal should be to balance it with deep work.

Over the past six years of working in machine learning (see Further Reading at the end of this article), I’ve tried several strategies for integrating deep work into my ML practice. I’ve found that the best time to develop these habits is during your university or college years, when your schedule is more flexible. While lecture times might dictate parts of your day, you are still relatively free to structure your own studies.

Once you transition into the professional world, you’ll likely have less control over your schedule. However, even in these environments, you can still make time for deep work. Practically speaking, if deep work enables you to implement ML algorithms faster or develop better solutions, everyone benefits from your practice.

With that in mind, here are the three basic strategies I have found most practical for doing deep work on machine learning. Use them as a starting point for implementing your own deep work practice.

Schedule Deep Work Blocks

Pre-scheduling blocks of deep work is probably the fastest way to make time for deep, uninterrupted work. For example, set aside a few hours in the morning for tasks that require maximum focus, such as reading papers, coding, or developing theories. In my experience, three to four hours of uninterrupted concentration is the upper bound; after that, I usually need a (lunch) break to recharge. During these deep work blocks, I silence my notifications.

Having two blocks of deep work per day (e.g., one in the morning, one in the evening) is doable, but it might require practice, especially if you are not (or no longer) used to concentrating intensively over longer periods of time. For starters, I usually recommend two hours first thing in the morning, expanding the block as you gain experience. Bonus tip: schedule these blocks the evening before.
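If you live in a digital calendar, you can even automate this reservation. The following Python sketch is my own illustration, not something prescribed by Newport: it uses only the standard library to write a recurring weekday “Deep Work” event to an .ics file that most calendar apps can import. The 8 a.m. start time and three-hour duration are assumptions; adjust them to your own schedule.

    from datetime import datetime, timedelta, timezone

    # Minimal sketch: write a recurring weekday "Deep Work" block to an .ics file.
    # The start hour and duration are example values -- adjust to your schedule.
    BLOCK_START_HOUR = 8   # deep work starts at 8 a.m.
    BLOCK_HOURS = 3        # three hours of uninterrupted focus

    # First occurrence: tomorrow morning at the chosen start hour (local time).
    start = datetime.now().replace(
        hour=BLOCK_START_HOUR, minute=0, second=0, microsecond=0
    ) + timedelta(days=1)
    end = start + timedelta(hours=BLOCK_HOURS)

    event = "\r\n".join([
        "BEGIN:VCALENDAR",
        "VERSION:2.0",
        "PRODID:-//deep-work-sketch//EN",
        "BEGIN:VEVENT",
        f"UID:deep-work-{start:%Y%m%dT%H%M%S}@local",
        f"DTSTAMP:{datetime.now(timezone.utc):%Y%m%dT%H%M%SZ}",
        f"DTSTART:{start:%Y%m%dT%H%M%S}",
        f"DTEND:{end:%Y%m%dT%H%M%S}",
        "RRULE:FREQ=WEEKLY;BYDAY=MO,TU,WE,TH,FR",  # repeat every weekday
        "SUMMARY:Deep Work (notifications off)",
        "END:VEVENT",
        "END:VCALENDAR",
    ])

    with open("deep_work.ics", "w", newline="") as f:
        f.write(event + "\r\n")

    print("Wrote deep_work.ics - import it into your calendar app once.")

Importing the generated file once is enough; the RRULE line makes the block repeat every weekday, so the time stays visibly reserved on your calendar.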

Batch Communication

Pre-scheduling blocks of deep work allows you to focus intensively, but it will not handle communication for you. Inspired by Newport, I suggest batching communication. Depending on your circumstances, you might check mail four times daily: first thing in the morning, before lunch, after lunch, and late in the afternoon. I have found that this approach minimizes interruptions while ensuring that you are still available to colleagues.

To repeat, skipping communication altogether is not an option and won’t help you or your colleagues. To some degree, machine learning thrives on exchanging ideas and concerns – which is what conferences are made for.

Create a Distraction-free Environment

Once you have pre-scheduled times for deep work and for handling communication, it’s time to optimize your workspace for focus. In my experience, you have two levers: reducing sound and reducing visual clutter. To reduce sound, I use earplugs (or, alternatively, noise-canceling headphones) to block out noise. To reduce visual distractions (think: conference posters, coworkers walking by, etc.), I try to work from distraction-free booths. If you work from home at least part of the time, you can follow Darwin, who retreated to his study to work without distractions.


Charles Darwin’s disciplined routine while writing On the Origin of Species offers valuable lessons for ML practitioners. Achieving breakthroughs, whether in natural science or machine learning, requires time to think and work deeply. In this article, I presented the three strategies I have found most practical for building stretches of deep work into your machine learning practice: pre-scheduling blocks of deep work, batching communication, and reducing environmental distractions. Start with these core strategies and build your own deep work practice from there.


Acknowledgements: The introduction to Darwin is based on two Wikipedia entries. The story of Darwin’s routine is told in Deep Work by Cal Newport, another inspiration for this article.

Further reading:

How I’d Learn Machine Learning Again, After 6 Years

ML Beginners Should Read Papers

How (and Where) ML Beginners Can Find Papers

Deep Work: Rules for Focused Success in a Distracted World – Cal Newport

Sources:

