Using MATLAB in Jupyter notebooks on Windows

After using R notebooks for a while I found it really unintuitive to use MATLAB in its own IDE. I read that it’s possible to use MATLAB with IPython, but the instructions seemed a bit out of date. When I tried to follow them, I still could not run MATLAB with Jupyter (a spin-off from IPython).

I wanted to conduct analyses of electroencephalographic (EEG) activity and the best plug-ins to do it (EEGLAB and ERPLAB) were written in MATLAB. I still wanted to use a programming notebook so I had to combine Jupyter and MATLAB.

I spent a bit of time setting it all up so I thought it might be worthwhile to share the process. Initially, I had three versions of MATLAB (2011a, 2011b, and 2016b) and two versions of Python (2.7 and 3.3). This did not make my life easier on Windows 7.

Eventually, I only kept the installation of MATLAB 2016b to avoid problems with paths pointing to other versions. MATLAB’s Python engine works only with MATLAB 2014b or later so keeping the older versions could only cause problems.


  • Install Anaconda (2.7)
  • Install MATLAB (>=2014b) – if you are a student then it’s very likely that your university bought a license. There is also a free MATLAB-like language called Octave, but I have not used it with Jupyter. Apparently, it is possible to combine Octave with Jupyter, but I’m going to focus exclusively on MATLAB in this post.
  • Install MATLAB’s Python engine – run as admin and follow the steps on the official site.
  • Once the engine is installed, you can move on to installing metakernel, matlab_kernel, and pymatbridge. Go to the Anaconda prompt (run as admin) and run pip install metakernel
  • In the Anaconda prompt run pip install matlab_kernel – this will use the development version of the MATLAB kernel.
  • Run pip install pymatbridge to install a connector between Python and MATLAB.
  • … voilà!
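As a quick sanity check (a minimal stdlib sketch of mine, not part of the official instructions), you can confirm from Python that the three packages installed above are importable:

```python
import importlib.util

def missing_packages(names):
    """Return the subset of package names that cannot be found by the importer."""
    return [n for n in names if importlib.util.find_spec(n) is None]

# Package names taken from the installation steps above.
required = ["metakernel", "matlab_kernel", "pymatbridge"]
print(missing_packages(required))  # an empty list means everything is in place
```

If any name comes back in the list, re-run the corresponding pip install step from the same Anaconda prompt.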

    MATLAB should now be available in the list of available languages.
    Once you choose it, you can start using it in a Jupyter notebook:

Obviously, things were not always this smooth. Initially, I ran into problems with installing MATLAB’s Python engine. The official website suggested running the following code:
cd "matlabroot\extern\engines\python"
python setup.py install

    Which I did but it resulted in an error:

Luckily, the error message was clear: I had to point to the 64-bit version of Python. I double-checked my version with:
import platform
print(platform.architecture())

    Which returned 64-bit as expected:

    Using a command with full path to Python solved the problem:
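As an aside, the running interpreter can report its own full path and bitness, which makes this kind of mix-up easier to diagnose (a stdlib-only snippet of mine, not from the MATLAB instructions):

```python
import struct
import sys

# Full path of the interpreter that is currently running.
print(sys.executable)

# Pointer size in bits: 64 for a 64-bit build, 32 for a 32-bit one.
print(struct.calcsize("P") * 8)
```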

I hope this will be useful. I also ran into other issues which were pretty specific to my system, so I did not include them here. Hopefully, these instructions will be enough to make MATLAB work with Jupyter.

    PS: I have also tried using MATLAB with Jupyter on Ubuntu, but this topic probably deserves another blog post.

    How to add code coverage (codecov) to your R package?

During the development of another R package I wasted a bit of time figuring out how to add code coverage to it. I had the same problem last time, so I decided to write up the procedure step-by-step.

Provided that you’ve already written an R package, the next step is to create tests. Luckily, the devtools package makes setting up both testing and code coverage a breeze.

Let’s start by adding the test infrastructure with devtools:

Then add a test file for your_function() to your tests folder:

Then add the scaffolding for code coverage (codecov):
    use_coverage(pkg = ".", type = c("codecov"))

After running this code you will get a snippet that can be added to your README file to display a codecov badge. In my case it’s the following:
    [![Coverage Status](](

This will create a codecov.yml file that needs to be edited by adding:
comment: false

Your .travis.yml should also tell Travis to run covr after a successful build:
language: R
sudo: false
cache: packages
after_success:
  - Rscript -e 'covr::codecov()'

Now log in to codecov using your GitHub account. Give codecov access to the repository whose code you want to cover. This should take you to a screen showing a token, which needs to be copied:

Once this is completed, go back to R and run the following commands to use covr:

library(covr)
codecov(token = "YOUR_TOKEN_GOES_HERE")

    The last line will connect your package to codecov. If the whole process worked, you should be able to see a percentage of coverage in your badge, like this:

Click on it to see which functions are not fully covered and need more tests:

    I hope this will be useful and will save a lot of frustrations.

    Sending serial triggers from PsychoPy to ETG-4000 fNIRS

    In the process of designing my latest experiment in PsychoPy I realised that setting up the serial port connection is not the most obvious thing to do. I wrote this tutorial so that others (and future me) won’t have to waste time reinventing the wheel.

    After creating an initial version of my experiment (looping over a wav file) I tried to figure out how to send a relevant trigger to my fNIRS. Luckily, the PsychoPy tutorial has a section about using a serial port, but after reading this I still wasn’t sure how to use the code with my script. After a quick brainstorm with the lab technician, we figured out that a simple script (see below) is indeed sending triggers to the fNIRS:

    To better understand the arguments of the serial.Serial class please consult the pySerial documentation.
The first argument (‘COM1’) is the name of the serial port used. The second argument is the baud rate; it should be the same as the baud rate set in the Parameter/External settings of the ETG-4000:

The last line of the script sends the signal through the serial port. In this example, it is “A ” followed by Python’s string literal for a carriage return (“\r”). That was one of the strings expected by the ETG-4000, i.e. it was on the list previously set up in Parameter/Stim Measurement:
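To illustrate the trigger format, here is a minimal sketch of mine (send_trigger is a hypothetical helper name, not from the PsychoPy or Hitachi documentation); writing to an in-memory buffer stands in for the real serial.Serial port, so it can be tried without hardware:

```python
import io

def send_trigger(port, code):
    """Write a trigger code followed by a carriage return to a serial-like port.

    `port` can be a pyserial serial.Serial instance or any object
    with a write() method that accepts bytes.
    """
    port.write((code + "\r").encode("ascii"))

# Dry run against an in-memory buffer instead of real hardware.
fake_port = io.BytesIO()
send_trigger(fake_port, "A ")
print(fake_port.getvalue())  # b'A \r'
```

With pyserial, the same call works on a real port, e.g. serial.Serial('COM1', 9600) (the 9600 here is a placeholder; use whatever baud rate your ETG-4000 is set to).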

    The easiest way for me to test whether I was sending correct signals was to use the Communication Test in the External tab (see the second screenshot). Once the test is started, you can run the Python script to test whether the serial signals are coming through.

If the triggers work as expected, the code sections for serial triggers can be embedded in the experiment. It can be done via the GUI or the code editor. Here is where to add code using the GUI:

    The next step is adding the triggers for the beginning, each stimulus/block, and the end of recording.

    Here is an example of an experiment using serial port triggers to delimit blocks of stimuli and individual stimuli.

Due to the sampling rate, I had to add a delay between the triggers delimiting the blocks, otherwise they would not be captured accurately. The triggers for block sections had to be sent consecutively because the triggers cannot be interspersed (I’m not sure whether that’s because of my settings). For instance, AABBAA is fine, but ABABAB is not.
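The spacing described above can be handled with a small helper that pauses between writes. This is a sketch of mine: send_block_triggers and the 0.5 s default are placeholders, not values from the ETG-4000 documentation, and an in-memory buffer stands in for the real port:

```python
import io
import time

def send_block_triggers(port, codes, delay=0.5):
    """Send trigger codes one after another, pausing between writes.

    The pause (in seconds) keeps consecutive triggers far enough apart
    to be captured at the device's sampling rate; 0.5 s is a placeholder,
    not a recommended value.
    """
    for code in codes:
        port.write((code + "\r").encode("ascii"))
        time.sleep(delay)

# Dry run: consecutive (not interspersed) block triggers, as in AABB.
fake_port = io.BytesIO()
send_block_triggers(fake_port, ["A ", "A ", "B ", "B "], delay=0)
print(fake_port.getvalue())
```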

    fnirsr – An R package to analyse ETG-4000 fNIRS data

As I mentioned in my previous post, I am trying to get my head around analysing fNIRS data collected using Hitachi ETG-4000. The output of a recording session with ETG-4000 can be saved as a raw csv file (see the example). This file seems to be pretty straightforward to parse: the top section is a header, and the raw data starts at line 41.
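Based only on that layout (header on lines 1–40, data from line 41), the split can be sketched in Python with the standard library; the function name and the fabricated example data below are mine, not from the package:

```python
import csv

HEADER_LINES = 40  # per the ETG-4000 export layout: raw data starts at line 41

def split_hitachi_csv(text):
    """Split a raw ETG-4000 export into (header lines, parsed data rows)."""
    lines = text.splitlines()
    header = lines[:HEADER_LINES]
    data_rows = list(csv.reader(lines[HEADER_LINES:]))
    return header, data_rows

# Tiny fabricated example: 40 header lines, then two data rows.
fake_file = "\n".join(["Header,info"] * 40 + ["1,0.01,0.02", "2,0.03,0.04"])
header, rows = split_hitachi_csv(fake_file)
print(len(header), rows[0])  # 40 ['1', '0.01', '0.02']
```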

I created a set of basic R functions that can deal with the initial stages of the analysis and I wrapped them in an R package. It is still a very early alpha (or rather pre-alpha): the documentation is sparse and no unit tests have been written yet. I only have several raw csv files; they seemed to work fine with my functions, but I’m not sure how robust the functions are.
Anyway, I think it is worth releasing the package even at this early stage and improving the functions as time goes by.

    The package can be found on GitHub and it can be installed with the following command:


    A vignette (Rmd) is here.

    HTML vignette:

    I couldn’t find any other R packages that would deal with these files so feel free to contact me if you work(ed) on something similar. Pull requests are encouraged.

    Loading and plotting nirs data in R

    Recently I started to learn how to use Hitachi ETG-4000 functional near-infrared spectroscopy (fNIRS) for my research. Very quickly I found out that, as usual in neuroscience, the main data analysis packages are written in MATLAB.

I couldn’t find any script to analyse fNIRS data in R so I decided to write one myself. Apparently there are some Python options, like MNE or NinPy, so I will look into them in the future.

ETG-4000 records data in straightforward(ish) .csv files, but the most popular MATLAB package for fNIRS data analysis (HOMER2) expects .nirs files.

There is a ready-made script that transforms Hitachi data into the .nirs format, but it, too, is only available in MATLAB. I will skip the transformation step for now and work only with a .nirs file.

    The file I used (Simple_Probe.nirs) comes from the HOMER2 package. It is freely available in the package, but I uploaded it here to make the analysis easier to reproduce.

    My code is here:

    This will produce separate time series plots for each channel with overlapping triggers, e.g.:

    The entire analysis workflow:

    Other files:
    RMarkdown file
    html report

    I hope this helps.
    More to follow.

    Postcode and Geolocation API for the UK

While working with UK geographical data I often have to extract geolocation information about UK postcodes. A convenient way to do it in R is to use the geocode function from the ggmap package. This function provides latitude and longitude information using the Google Maps API. This is very useful for mapping data points but doesn’t provide information about UK-specific administrative divisions.
I got fed up with merging my list of postcodes with a long list of corresponding wards etc., so I looked for smarter ways of getting this info.
That’s how I came across postcodes.io, which is free, open source, and based solely on open data. This service is an API that geolocates UK postcodes and provides additional administrative information. The full documentation explains the many available options in detail. Among the geographic information you can pull using postcodes.io are:

      Westminster Parliamentary Constituency
      European Electoral Region (EER)
      Primary Care Trust (PCT)
      Parish (England)/ community (Wales)
      ONS/GSS Codes
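As an illustration of the API shape, the lookup URL for a single postcode can be built like this (a minimal Python sketch of mine; the /postcodes/:postcode pattern is the service’s documented single-postcode lookup, and the actual request is left out so the snippet runs offline):

```python
from urllib.parse import quote

API_BASE = "https://api.postcodes.io"

def lookup_url(postcode):
    """Build the postcodes.io lookup URL for a single postcode."""
    return f"{API_BASE}/postcodes/{quote(postcode)}"

print(lookup_url("EC1Y 8LX"))  # https://api.postcodes.io/postcodes/EC1Y%208LX
```

Fetching that URL (e.g. with urllib.request.urlopen) returns a JSON object whose "result" field holds the geographic information listed above.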

I conduct most of my analyses in R so I developed wrapper functions around the API. The development version of the PostcodesioR package can be found on GitHub and the documentation is here. It still doesn’t support all optional arguments but should do the job in most cases. A reference manual is here.

    A mini-vignette (more to follow) showing how to run a lookup on a postcode, turn the result into a data frame, and then create an interactive map with leaflet:

The code above produces a data frame with key information:

    > glimpse(pc_df)
    Observations: 1
    Variables: 28
    $ postcode EC1Y 8LX
    $ quality 1
    $ eastings 532544
    $ northings 182128
    $ country England
    $ nhs_ha London
    $ longitude -0.09092237
    $ latitude 51.52252
    $ parliamentary_constituency Islington South and Finsbury
    $ european_electoral_region London
    $ primary_care_trust Islington
    $ region London
    $ lsoa Islington 023D
    $ msoa Islington 023
    $ incode 8LX
    $ outcode EC1Y
    $ admin_district Islington
    $ parish Islington, unparished area
    $ admin_county NA
    $ admin_ward Bunhill
    $ ccg NHS Islington
    $ nuts Haringey and Islington
    $ admin_district E09000019
    $ admin_county E99999999
    $ admin_ward E05000367
    $ parish E43000209
    $ ccg E38000088
    $ nuts UKI43

    and an interactive map showing geocoded postcode as a blue dot:

    Using ESRI shapefiles to create maps in R

    R has a number of libraries that can be used for plotting. They can be combined with open GIS data to create custom maps.
    In this post I’ll demonstrate how to create several maps.

The first step is getting shapefiles that will be used to create maps. One of the sources could be this site, but any source with open .shp files will do.

    Here I’ll focus on country level (administrative) data for Poland.
    If you follow the link to diva-gis you should see the following screen:

I’ll plot voivodeships and powiats, which are the first- and second-level administrative subdivisions of Poland, respectively.

After downloading and unzipping the files into your working directory in R, you will be able to use the scripts below to recreate the maps.

The simplest map uses only the shapefiles, without any extra background.
Clearly, it’s not the most attractive map, but it’s still informative.
It was generated with the following code:

Nicer maps can be generated with the ggmap package. This package allows adding a shapefile overlay to Google Maps or OSM tiles. In this example I used the get_googlemap function, but if you want a different background you should use get_map with appropriate arguments.
    Code used to generate the map above:

And last, but not least, is my favourite: an interactive map created with leaflet.


    > sessionInfo()
    R version 3.2.4 Revised (2016-03-16 r70336)
    Platform: x86_64-w64-mingw32/x64 (64-bit)
    Running under: Windows 7 x64 (build 7601) Service Pack 1

locale:
[1] LC_COLLATE=English_United Kingdom.1252 LC_CTYPE=English_United Kingdom.1252
    [3] LC_MONETARY=English_United Kingdom.1252 LC_NUMERIC=C
    [5] LC_TIME=English_United Kingdom.1252

    attached base packages:
    [1] stats graphics grDevices utils datasets methods base

    other attached packages:
    [1] rgdal_1.1-7 ggmap_2.6.1 ggplot2_2.1.0 leaflet_1.0.1 maptools_0.8-39
    [6] sp_1.2-2

    loaded via a namespace (and not attached):
    [1] Rcpp_0.12.4 magrittr_1.5 maps_3.1.0 munsell_0.4.3
    [5] colorspace_1.2-6 geosphere_1.5-1 lattice_0.20-33 rjson_0.2.15
    [9] jpeg_0.1-8 stringr_1.0.0 plyr_1.8.3 tools_3.2.4
    [13] grid_3.2.4 gtable_0.2.0 png_0.1-7 htmltools_0.3.5
    [17] yaml_2.1.13 digest_0.6.9 RJSONIO_1.3-0 reshape2_1.4.1
    [21] mapproj_1.2-4 htmlwidgets_0.6 labeling_0.3 stringi_1.0-1
    [25] RgoogleMaps_1.2.0.7 scales_0.4.0 jsonlite_0.9.19 foreign_0.8-66
    [29] proto_0.3-10