10 simple Python tips to speed up your data analysis
Tips and tricks, especially in the programming world, can be very useful. Sometimes a little hack can be both time- and life-saving. A minor shortcut or add-on can prove to be a godsend and a real productivity booster. So, here are some of my favorite tips and tricks, compiled together in this article. Some may be fairly well known and some may be new, but I'm sure they'll come in pretty handy the next time you work on a data analysis project.
Profiling the ‘pandas’ dataframe
Profiling is a process that helps us understand our data, and Pandas Profiling is a Python package that does exactly that. It's a simple and fast way to perform exploratory data analysis on a pandas DataFrame. The pandas df.describe() and df.info() functions are normally used as a first step in the EDA process. However, they only give a very basic overview of the data and don't help much with large datasets. The Pandas Profiling function, on the other hand, extends the pandas DataFrame with df.profile_report() for quick data analysis. It displays a lot of information with a single line of code, in an interactive HTML report.
For a given dataset, the pandas profiling package computes statistics such as type inference, descriptive and quantile statistics, most frequent values, histograms, correlations, and missing-value counts.
Installation
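The package can be installed with pip; something like the following should work:

```bash
pip install pandas-profiling
```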
Usage
Let's use the age-old Titanic dataset to demonstrate the capabilities of this versatile Python profiler.
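A minimal sketch of what this could look like, assuming the Titanic CSV has been downloaded locally (the file path is illustrative):

```python
import pandas as pd
import pandas_profiling  # registers the .profile_report() accessor on DataFrames

# Load the Titanic dataset (file path is illustrative)
df = pd.read_csv('titanic/train.csv')

# Generate the profiling report inside the notebook
df.profile_report()
```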
A single call to df.profile_report() is all you need to display the data profiling report in a Jupyter notebook. The report is quite detailed, including charts wherever necessary.
The report can also be exported into an interactive HTML file with the following code.
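A sketch of the export step (the report title and output filename are illustrative):

```python
# Save the same report as a standalone, shareable HTML file
profile = df.profile_report(title='Titanic Profiling Report')
profile.to_file('titanic_report.html')
```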
Refer to the pandas-profiling documentation for more details and examples.
Bringing interactivity to pandas plots
Pandas has a built-in .plot() function as part of the DataFrame class. However, the visualizations rendered by this function aren't interactive, which makes them less appealing. At the same time, the ease of plotting charts with the pandas.DataFrame.plot() function can't be ignored. What if we could plot interactive, Plotly-like charts with pandas without making major modifications to the code? Well, you can actually do that with the help of the Cufflinks library.
The Cufflinks library binds the power of Plotly with the flexibility of pandas for easy plotting. Let's now see how we can install the library and get it working in pandas.
Installation
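Both Plotly and Cufflinks can be installed with pip; something like:

```bash
pip install plotly
pip install cufflinks
```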
Usage
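One common way to wire this up (exact configuration options may differ between cufflinks versions) is:

```python
import pandas as pd
import cufflinks as cf

# Run cufflinks/plotly in offline mode so charts render inside the notebook
cf.go_offline()
cf.set_config_file(offline=False, world_readable=True)
```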
Time to see the magic unfold with the Titanic dataset.
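For instance, comparing the two on a couple of numeric columns (the column selection is illustrative):

```python
# Interactive Plotly chart via cufflinks
df[['Fare', 'Age']].iplot()

# Static matplotlib chart via pandas
df[['Fare', 'Age']].plot()
```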
df.iplot() vs df.plot()
The visualization on the right shows the static chart, while the one on the left is interactive and more detailed, all without any major change in syntax.
Magic commands are a set of convenient functions in Jupyter Notebooks that are designed to solve some of the common problems in standard data analysis. You can see all available magics with the help of %lsmagic.
List of all available magic functions
Magic commands are of two kinds: line magics, which are prefixed by a single % character and operate on a single line of input, and cell magics, which are prefixed by a double %% and operate on multiple lines of input. Magic functions can also be called without typing the initial % if automagic is enabled (e.g. with %automagic 1).
Let’s look at some of them that might be useful in common data analysis tasks:
%pastebin
%pastebin uploads code to Pastebin and returns the URL. Pastebin is an online content-hosting service where we can store plain text, like source code snippets, and then share the URL with others. In fact, a GitHub Gist is also akin to a pastebin, albeit with version control.
Consider a Python script file.py with the following content:
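The original snippet isn't reproduced here, so as a stand-in, assume file.py holds something simple like:

```python
# file.py -- an illustrative stand-in for the original script
def foo(x):
    return x
```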
Using %pastebin in Jupyter Notebook generates a pastebin url.
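For example, in a notebook cell:

```python
# Upload the contents of file.py and print a shareable URL
%pastebin file.py
```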
%matplotlib notebook
The %matplotlib inline command is used to render static matplotlib plots within the Jupyter notebook. Try replacing the inline part with notebook to get zoomable and resizable plots, easily. Make sure the command is called before importing the matplotlib library.
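A minimal sketch (the data points are arbitrary):

```python
%matplotlib notebook
import matplotlib.pyplot as plt

plt.plot([1, 2, 3, 4], [10, 20, 25, 30])
plt.show()
```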
%matplotlib inline vs %matplotlib notebook
%run
The %run function runs a python script inside a notebook.
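For example (the script name is illustrative):

```python
# Execute the script in the notebook's namespace; its variables and
# functions become available in subsequent cells
%run file.py
```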
%%writefile
%%writefile writes the contents of a cell to a file. Here the code will be written to a file named foo.py and saved in the current directory.
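A sketch of such a cell (the function body is illustrative):

```python
%%writefile foo.py
def hello():
    print("Hello from foo.py")
```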
%%latex
The %%latex function renders the cell contents as LaTeX. It is useful for writing mathematical formulae and equations in a cell.
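For example (the equation is just a placeholder):

```latex
%%latex
$$ E = mc^{2} $$
```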
Finding and eliminating errors
The interactive debugger is also a magic function but I have given it a category of its own. If you get an exception while running the code cell, type %debug in a new line and run it. This opens an interactive debugging environment that brings you to the position where the exception has occurred. You can also check for the values of variables assigned in the program and also perform operations here. To exit the debugger hit q.
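As an illustration (the function below is hypothetical):

```python
# A cell that raises an exception
def divide(a, b):
    return a / b

divide(10, 0)  # ZeroDivisionError

# In the next cell, run:
# %debug
# Then inspect variables such as a and b at the ipdb prompt, and type q to quit.
```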
Printing can be pretty too
If you want to produce aesthetically pleasing representations of your data structures, pprint is the go-to module. It is especially useful when printing dictionaries or JSON data. Let’s have a look at an example which uses both print and pprint to display the output.
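A small sketch with made-up nested data:

```python
from pprint import pprint

# Illustrative nested structure
data = {
    'name': 'Titanic',
    'columns': ['PassengerId', 'Survived', 'Pclass', 'Name', 'Sex', 'Age'],
    'source': {'format': 'csv', 'rows': 891},
}

print(data)   # everything squeezed onto one long line
pprint(data)  # neatly indented, one key per line
```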
Making the notes stand out
We can use alert/note boxes in our Jupyter Notebooks to highlight something important or anything that needs to stand out. The color of the note depends on the type of alert that is specified. Just add any of the following alert types to a cell that needs to be highlighted (a markup sketch follows the list below).
Blue Alert Box: info
Yellow Alert Box: warning
Green Alert Box: success
Red Alert Box: danger
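These boxes are created by putting a small HTML snippet in a Markdown cell; a sketch for the blue 'info' style (swap alert-info for alert-warning, alert-success, or alert-danger to get the other colors):

```html
<div class="alert alert-block alert-info">
<b>Note:</b> Blue boxes are typically used for tips and notes.
</div>
```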
Printing all the outputs of a cell
Consider a cell of Jupyter Notebook containing the following lines of code:
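For instance (the expressions are arbitrary):

```python
10 + 5
7 * 3
12 * 2
```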
By default, only the output of the last expression in a cell gets printed; for the others, we need to add an explicit print() call. Well, it turns out that we can print all the outputs just by adding the following snippet at the top of the notebook.
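One way to do this (and likely what the original snippet looked like) is via IPython's ast_node_interactivity setting:

```python
from IPython.core.interactiveshell import InteractiveShell
InteractiveShell.ast_node_interactivity = "all"
```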
Now all the outputs get printed one after the other.
To revert to the original setting:
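The default value is "last_expr":

```python
InteractiveShell.ast_node_interactivity = "last_expr"
```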
Running Python scripts with the '-i' option
A typical way of running a Python script from the command line is python hello.py. However, if you add an additional -i while running the same script, e.g. python -i hello.py, it offers more advantages. Let's see how.
Firstly, once the end of the program is reached, Python doesn't exit the interpreter, so we can check the values of the variables and the correctness of the functions defined in our program.
Secondly, since we are still in the interpreter, we can easily invoke the Python debugger with:
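A minimal sketch using pdb's post-mortem mode:

```python
import pdb
pdb.pm()  # jump into the frame where the last exception was raised
```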
This will bring us to the position where the exception has occurred and we can then work upon the code.
Ctrl/Cmd + / automatically comments out the selected lines in a cell. Hitting the combination again uncomments the same lines.
To delete is human, to restore divine
Have you ever accidentally deleted a cell in a Jupyter Notebook? If yes then here is a shortcut that can undo that delete action.
In case you have deleted the contents of a cell, you can easily recover them by hitting CTRL/CMD + Z.
If you need to recover an entire deleted cell, hit ESC + Z or use Edit > Undo Delete Cells.
In this article, I’ve listed the main tips I have gathered while working with Python and Jupyter Notebooks. I’m sure these simple hacks will be of use to you at some point in your career. Till then, happy coding!