
In my day-to-day job, I work with Excel files quite a lot – taking Excel files, most of which contain multiple worksheets, and loading them into Python using Pandas for a variety of tasks. I found this process rather cumbersome until a fellow co-worker (thanks Gab Lin!) taught me a simple trick that lets me do it far more intuitively and efficiently. Allow me to pay it forward in this article!🙂
The code presented in this article can be found at this GitHub repo.
The Data
Before getting to the crux, let’s first look at the following Excel file, which I created and will be using for illustration.

It contains fictitious (and very simple) sales data across three consecutive months – January to March. The sales data for each month is shown in separate worksheets.
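If you don’t have the original file handy and want to follow along, here is a minimal sketch that creates a similar workbook. The filename, column names and values are made up for illustration, not taken from the original file:

```python
import pandas as pd

# Made-up sales data: the filename, columns and values below are
# placeholders for illustration only (writing .xlsx requires openpyxl)
months = {
    "January": pd.DataFrame({"Product": ["A", "B"], "Sales": [100, 150]}),
    "February": pd.DataFrame({"Product": ["A", "B"], "Sales": [120, 130]}),
    "March": pd.DataFrame({"Product": ["A", "B"], "Sales": [140, 160]}),
}

# Write each month's DataFrame to its own worksheet
with pd.ExcelWriter("sales_data.xlsx") as writer:
    for sheet_name, df in months.items():
        df.to_excel(writer, sheet_name=sheet_name, index=False)
```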
What I Did Previously And Its Limitations
Suppose my objective is to combine the data from all worksheets together into a single Pandas DataFrame. To achieve this task, I used to do the following:
- Get a list of the names of all worksheets, using either openpyxl or pandas.
- Iterate through the worksheets, parse each sheet as a Pandas DataFrame, and append each DataFrame to a list.
- Merge all of them into a single DataFrame using pd.concat.
My code would look something like this:

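The sketch below reconstructs that workflow; sales_data.xlsx is a placeholder filename standing in for whatever file you are working with:

```python
import pandas as pd

filename = "sales_data.xlsx"  # placeholder: point this at your own Excel file

# Step 1: get a list of the names of all worksheets
excel_file = pd.ExcelFile(filename)
sheet_names = excel_file.sheet_names

# Step 2: parse each worksheet into a DataFrame and collect them in a list
df_list = []
for sheet in sheet_names:
    df = pd.read_excel(filename, sheet_name=sheet)
    df_list.append(df)

# Step 3: merge everything into a single DataFrame
df_all = pd.concat(df_list, ignore_index=True)
```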
As you can imagine, there are a few issues with this approach:
- You have at least 8 lines of code to do a simple task, which is quite inefficient.
- You can’t just load the Excel file into Python without first knowing the names of the worksheets. You need a bit of reverse-engineering: you first have to figure out the worksheet names before you can specify the sheet_name argument accordingly in the pd.read_excel() function.
- You need to specify the exact name of the worksheet(s) you wish to load. This may be straightforward if you are loading data from a single worksheet or if the worksheet name is simple, but it is prone to errors if you have multiple worksheets or if the worksheet names are complicated.
The Trick
Now, to get around these issues, here’s the trick – simply specify the sheet_name argument as None in the pd.read_excel() function. Yep, as simple as that! Let’s walk through this.
By specifying sheet_name=None in the pd.read_excel() function, you get a dictionary of Pandas DataFrames, where the keys are the names of the worksheets and the values are the data in each worksheet as a Pandas DataFrame.

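For example (again using the placeholder filename sales_data.xlsx):

```python
import pandas as pd

# Load every worksheet in one go: sheet_name=None returns a dictionary
dfs = pd.read_excel("sales_data.xlsx", sheet_name=None)

print(type(dfs))   # a dict of DataFrames, keyed by worksheet name
print(dfs.keys())  # e.g. dict_keys(['January', 'February', 'March'])
```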
Now, if you wish to analyse data from a specific worksheet – say, the "February" worksheet – you can simply do this:

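Continuing with the dfs dictionary from the sketch above:

```python
# Pull out a single worksheet by using its name as the dictionary key
df_feb = dfs["February"]

print(type(df_feb))  # <class 'pandas.core.frame.DataFrame'>
print(df_feb.head())
```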
Notice that this returns a Pandas DataFrame object.
Or if you wish to combine data across multiple worksheets, you can do this:

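One way to do it, sticking with the same dfs dictionary:

```python
# Concatenate the DataFrames from every worksheet into a single DataFrame
df_all = pd.concat(dfs.values(), ignore_index=True)
print(df_all)
```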
Here, we specify ignore_index=True in pd.concat() so that the indices of the resulting DataFrame run in sequence.
TL;DR
In a nutshell, the full code snippet would look like this:
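Here is the whole thing as a sketch, with sales_data.xlsx again standing in for your own file:

```python
import pandas as pd

filename = "sales_data.xlsx"  # placeholder: substitute your own Excel file

# Load every worksheet into a dictionary of DataFrames
dfs = pd.read_excel(filename, sheet_name=None)

# Combine all worksheets into one DataFrame with a continuous index
df_all = pd.concat(dfs.values(), ignore_index=True)
```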
Now you have far fewer lines of code, which also translates to more concise and readable code.
Before I conclude, do take note of this minor difference in the pd.read_excel() function across Pandas versions:
- For Pandas versions ≥ 0.21.0: pd.read_excel(filename, sheet_name=None)
- For Pandas versions < 0.21.0: pd.read_excel(filename, sheetname=None)
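If you are not sure which version of Pandas you have installed, a quick check:

```python
import pandas as pd

# Print the installed Pandas version
print(pd.__version__)
```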
Conclusion
So, there you have it – a simple trick to load Excel files with multiple worksheets using Pandas! Depending on your use-cases and the type of Excel files you are working with, this approach may or may not be better than how you are used to doing this task. Nonetheless, I hope this article helps to shed light on this subtle, little-known trick and offers you an alternative method.
👇 ️This photo sums up how I felt after my co-worker taught this trick to me, so if you’ve found it useful too, feel free to give me your high-fives in the comments!

That’s it. Thanks for reading!
Before you go…
If this article has been of value and you wish to support me as a writer, do consider signing up for a Medium membership. As a member, you get unlimited access to stories published on Medium. If you sign up using this link, I’ll get to earn a small commission. Feel free to also join my email list if you wish to be notified whenever I publish.