More information about R Notebooks can be found in RStudio's blog post.
In the process of designing my latest experiment in PsychoPy I realised that setting up the serial port connection is not the most obvious thing to do. I wrote this tutorial so that others (and future me) won’t have to waste time reinventing the wheel.
After creating an initial version of my experiment (looping over a wav file), I tried to figure out how to send a relevant trigger to my fNIRS system. Luckily, the PsychoPy tutorial has a section on using a serial port, but after reading it I still wasn’t sure how to use the code with my script. After a quick brainstorm with the lab technician, we figured out that a simple script (see below) does indeed send triggers to the fNIRS:
To better understand the arguments of the serial.Serial class, please consult the pySerial documentation.
The first argument ('COM1') is the name of the serial port. The second argument is the baud rate; it must match the baud rate set in the Parameter/External settings of the ETG-4000:
The last line of the script sends the signal through the serial port. In this example, it is “A ” followed by Python’s string literal for a carriage return. That is one of the strings expected by the ETG-4000, i.e. it is on the list previously set up in Parameter/Stim Measurement:
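Putting these pieces together, a minimal sketch of such a trigger script might look like the following. The port name 'COM1' and the baud rate of 9600 are placeholders — use the values from your own Parameter/External settings:

```python
def make_trigger(letter):
    """Build the byte string the ETG-4000 expects:
    the trigger letter, a space, and a carriage return."""
    return (letter + ' \r').encode('ascii')

def send_trigger(letter, port_name='COM1', baud=9600):
    """Open the serial port, send one trigger, and close the port.

    Requires pySerial; the port name and baud rate above are
    placeholders, not values your setup necessarily uses.
    """
    import serial  # pySerial
    with serial.Serial(port_name, baud) as port:
        port.write(make_trigger(letter))
```

For example, `send_trigger('A')` would send the “A ” trigger described above.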
The easiest way for me to test whether I was sending correct signals was to use the Communication Test in the External tab (see the second screenshot). Once the test is started, you can run the Python script to test whether the serial signals are coming through.
If the triggers work as expected, the code sections for serial triggers can be embedded in the experiment. This can be done via the GUI or the code editor. Here is where to add code using the GUI:
The next step is adding the triggers for the beginning, each stimulus/block, and the end of recording.
Here is an example of an experiment using serial port triggers to delimit blocks of stimuli and individual stimuli.
Due to the sampling rate, I had to add a delay between the triggers delimiting the blocks; otherwise they would not be captured accurately. The triggers for block sections also had to be sent consecutively, because triggers cannot be interspersed (I am not sure whether that is due to my settings). For instance, AABBAA is fine, but ABABAB is not.
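To illustrate, here is a sketch of the consecutive-trigger pattern with an explicit gap. The 0.1 s delay is only a guess — tune it to your sampling rate — and `send` stands in for whatever function writes one trigger to the serial port:

```python
import time

def send_block_markers(send, letter, n=2, gap=0.1):
    """Send the same trigger letter n times in a row, pausing
    between sends so each one can be captured at the device's
    sampling rate. The gap value here is only illustrative."""
    for _ in range(n):
        send(letter)
        time.sleep(gap)

# AABBAA: identical triggers sent consecutively around each block,
# never interleaved as in ABABAB.
```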
As I mentioned in my previous post, I am trying to get my head around analysing fNIRS data collected with a Hitachi ETG-4000. The output of a recording session with the ETG-4000 can be saved as a raw CSV file (see the example). This file is fairly straightforward to parse: the top section is a header, and the raw data start at line 41.
I created a set of basic R functions that deal with the initial stages of the analysis and wrapped them in an R package. It is still a very early alpha (or rather pre-alpha): the documentation is sparse and there are no unit tests yet. I only have access to a handful of raw CSV files; they work fine with my functions, but I am not sure how robust the functions are.
Still, I think it is useful to release the package even at this early stage and improve the functions as time goes by.
The package can be found on GitHub and it can be installed with the following command:
A vignette (Rmd) is here.
I couldn’t find any other R packages that would deal with these files so feel free to contact me if you work(ed) on something similar. Pull requests are encouraged.
Recently I started to learn how to use Hitachi ETG-4000 functional near-infrared spectroscopy (fNIRS) for my research. Very quickly I found out that, as usual in neuroscience, the main data analysis packages are written in MATLAB.
I couldn’t find any script for analysing fNIRS data in R, so I decided to write one myself. Apparently there are some Python options, such as MNE or NinPy, so I will look into them in the future.
There is a ready-made script that transforms Hitachi data into the .nirs format, but it is only available in MATLAB. I will skip the transformation step for now and work directly with a .nirs file.
The file I used (Simple_Probe.nirs) comes from the HOMER2 package. It is freely available in the package, but I uploaded it here to make the analysis easier to reproduce.
My code is here:
The entire analysis workflow:
I hope this helps.
More to follow.
While working with UK geographical data, I often have to extract geolocation information about UK postcodes. A convenient way to do this in R is the geocode function from the ggmap package, which retrieves latitude and longitude via the Google Maps API. This is very useful for mapping data points, but it provides no information about UK-specific administrative divisions.
I got fed up with merging my list of postcodes with a long list of corresponding wards and so on, so I looked for a smarter way of getting this information.
That is how I came across http://postcodes.io/, which is free, open source, and based solely on open data. This service is an API that geolocates UK postcodes and provides additional administrative information. The full documentation explains the many available options in detail. Among the geographic information you can pull using postcodes.io are:
Westminster Parliamentary Constituency
European Electoral Region (EER)
Primary Care Trust (PCT)
Parish (England)/ community (Wales)
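Since postcodes.io is a plain HTTP API, it can be queried from any language. A minimal Python sketch using only the standard library (the /postcodes/ lookup endpoint is taken from the postcodes.io documentation):

```python
import json
from urllib.request import urlopen

API = 'https://api.postcodes.io'

def postcode_url(postcode):
    """Build the lookup URL for a single postcode; spaces in the
    postcode are dropped, as the API ignores them anyway."""
    return API + '/postcodes/' + postcode.replace(' ', '')

def lookup(postcode):
    """Fetch the geolocation and administrative fields for one
    postcode and return the 'result' part of the JSON response."""
    with urlopen(postcode_url(postcode)) as resp:
        return json.load(resp)['result']
```

The returned dictionary contains, among other fields, the latitude, longitude, ward, and parliamentary constituency for the postcode.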
I conduct most of my analyses in R, so I developed wrapper functions around the API. The beta version of the PostcodesioR package can be found on GitHub. It is still a bit buggy and does not support optional arguments, but it should do the job in most cases. The beta reference manual is here.
A mini-vignette (more to follow) showing how to run a lookup on a postcode, turn the result into a data frame, and then create an interactive map with leaflet:
The code above produces a data frame with key information
and an interactive map showing geocoded postcode as a blue dot:
R has a number of libraries that can be used for plotting. They can be combined with open GIS data to create custom maps.
In this post I’ll demonstrate how to create several maps.
The first step is getting the shapefiles used to create the maps. One source is this site, but any source of open .shp files will do.
Here I’ll focus on country level (administrative) data for Poland.
If you follow the link to diva-gis you should see the following screen:
After downloading and unzipping POL_adm.zip into your R working directory, you will be able to use the scripts below to recreate the maps.
Nicer maps can be generated with the ggmap package, which allows adding a shapefile overlay onto Google Maps or OSM. In this example I used the get_googlemap function, but if you want a different background you should use get_map with appropriate arguments.
Code used to generate the map above:
And last, but not least, is my favourite: an interactive map created with leaflet.
R version 3.2.4 Revised (2016-03-16 r70336)
Platform: x86_64-w64-mingw32/x64 (64-bit)
Running under: Windows 7 x64 (build 7601) Service Pack 1
 LC_COLLATE=English_United Kingdom.1252 LC_CTYPE=English_United Kingdom.1252
 LC_MONETARY=English_United Kingdom.1252 LC_NUMERIC=C
 LC_TIME=English_United Kingdom.1252
attached base packages:
 stats graphics grDevices utils datasets methods base
other attached packages:
 rgdal_1.1-7 ggmap_2.6.1 ggplot2_2.1.0 leaflet_1.0.1 maptools_0.8-39
loaded via a namespace (and not attached):
 Rcpp_0.12.4 magrittr_1.5 maps_3.1.0 munsell_0.4.3
 colorspace_1.2-6 geosphere_1.5-1 lattice_0.20-33 rjson_0.2.15
 jpeg_0.1-8 stringr_1.0.0 plyr_1.8.3 tools_3.2.4
 grid_3.2.4 gtable_0.2.0 png_0.1-7 htmltools_0.3.5
 yaml_2.1.13 digest_0.6.9 RJSONIO_1.3-0 reshape2_1.4.1
 mapproj_1.2-4 htmlwidgets_0.6 labeling_0.3 stringi_1.0-1
 RgoogleMaps_1.2.0.7 scales_0.4.0 jsonlite_0.9.19 foreign_0.8-66
Praat is a great tool for analysing speech data, but lately I came across a frustrating problem. While trying to open a txt file (a vector of numbers) in Praat, I would get the following error message:
File not recognized. File not finished.
After consulting my fellow PhD students, I discovered that I was missing a header that lets Praat recognise txt files.
The simplest way to fix this error is to add the following header to a text file using your favourite text editor:
However, if you want to automate the process, scripting can save you a lot of time. That is why I created a function (txt2praat.R) that prepends this header to the original text file and saves the output as a new text file.
You can use the function in the following way:
txtfile <- file.choose()
These commands should create a txt file (testfile - modified) with the short header prepended. The new file can then be opened in Praat without the error message.
Tableau Desktop 9.1 is out, and Web Data Connectors are available as a new data source in this version. Luckily, Tableau released several working connectors to popular web data sources, and Google Sheets is one of them. Today I tried to connect to Google Sheets, but I couldn’t find a step-by-step description of setting it up, and the official thread lacked details about configuring the WDC. It took me a while to figure out how to start using Web Data Connectors, so I decided to write this tutorial, which will hopefully help others get started. It describes how to use Tableau web connectors hosted locally.
Before you can use Web Data Connectors, you need to activate Internet Information Services (IIS) (more information is available in this Stack Overflow thread).
On Windows 10, this can be done by pressing the Windows key and typing Windows Features.
Then go to Turn Windows Features On or Off, and tick the box next to Internet Information Services.
Now you should be able to see IIS in Control Panel > Administrative Tools.
Open IIS Manager to check the values in Binding and Path columns. Binding should be set to
*:8888 (http). Path points to a folder where the Tableau Connector files should be stored.
If Binding is set to a value other than 8888 (mine was set to 80 by default), go to Actions and click on Bindings.
Then change the Port number to 8888 and click OK.
Now your IIS Manager configuration should look like this
Check the value of the Path column. By default it is set to
%SystemDrive%\inetpub\wwwroot, which in most cases means
To start using Google Sheets as a data source, you first need to download the Web Data Connector SDK.
Unzip the zip file and go to the Examples folder.
Copy all files from the Examples folder to the Path specified in IIS. In my case it was
If you copied the example files provided by Tableau to your IIS path then type the following into the address bar of the WDC pop-up window:
You should see the following window
All you need to do now is provide a link to the Google Sheet you want to use. You might also need to log into your Google account if the Sheet is private.
Now you can start using Google Sheets as a data source. Hooray!
If you need to use other connectors, just change the path in the address bar of the pop-up window.