Hi. My name is Laurens, and I'm an *approximately* (time() - 500246280) / (60 * 60 * 24 * 365.25) years old Dutchman currently living and working in Berlin.

I spend most of my time at the Technische Universität, where I pursue a PhD in brain-computer interfacing and neuroadaptive technology. When I'm not there, chances are I'm travelling, reading, listening to music, or feeding ducks.

Noctifer (Latin, "evening star") is where I communicate to the world those little things I sometimes do that might interest some people other than myself. For the kind of communication that's supposed to go in the other direction, please feel free to contact me.



  • Krol, L. R., & Zander, T. O. (2018). Towards a conceptual framework for cognitive probing. In J. Ham, A. Spagnolli, B. Blankertz, L. Gamberini, & G. Jacucci (Eds.), Symbiotic interaction (pp. 74–78). Cham: Springer International Publishing. DOI PDF
  • Krol, L. R., Andreessen, L. M., Podgorska, A., Makarov, N., & Zander, T. O. (2018). Passive Brain-Computer Interfacing in the Museum of Stillness. In Proceedings of the Artistic BCI Workshop at the SIGCHI Conference on Human Factors in Computing Systems (CHI), Montréal, Canada. PDF
  • Krol, L. R., Andreessen, L. M., & Zander, T. O. (2018). Passive Brain-Computer Interfaces: A Perspective on Increased Interactivity. In C. S. Nam, A. Nijholt, & F. Lotte (Eds.), Brain-Computer Interfaces Handbook: Technological and Theoretical Advances (pp. 69–86). Boca Raton, FL, USA: CRC Press. Link PDF



  • Krol, L. R., Freytag, S.-C., & Zander, T. O. (2017). Meyendtris: A hands-free, multimodal Tetris clone using eye tracking and passive BCI for intuitive neuroadaptive gaming. In Proceedings of the 19th ACM International Conference on Multimodal Interaction (pp. 433–437). New York, NY, USA: ACM. DOI PDF
  • Krol, L. R., & Zander, T. O. (2017). Passive BCI-Based Neuroadaptive Systems. In Proceedings of the 7th Graz Brain-Computer Interface Conference 2017 (pp. 248–253). DOI PDF
  • Krol, L. R., Zander, T. O., Jaswa, M., Flascher, O., Snelting, A., & Brouwer, A.-M. (2017). Online-capable cleaning of highly artefactual EEG data recorded during real driving. In Proceedings of the 7th Graz Brain-Computer Interface Conference 2017 (pp. 254–259). DOI PDF
  • Zander, T. O., & Krol, L. R. (2017). Team PhyPA: Brain-computer interfacing for everyday human-computer interaction. Periodica Polytechnica Electrical Engineering and Computer Science, 61(2), 209–216. DOI PDF
  • Zander, T. O., Andreessen, L. M., Berg, A., Bleuel, M., Pawlitzki, J., Zawallich, L., Krol, L. R., & Gramann, K. (2017). Evaluation of a dry EEG system for application of passive brain-computer interfaces in autonomous driving. Frontiers in Human Neuroscience, 11, 78. DOI PDF
  • Zander, T. O., Shetty, K., Lorenz, R., Leff, D. R., Krol, L. R., Darzi, A. W., ... Yang, G.-Z. (2017). Automated task load detection with electroencephalography: Towards passive brain-computer interfacing in robotic surgery. Journal of Medical Robotics Research, 2(1), 1750003. DOI PDF
  • Brouwer, A.-M., Snelting, A., Jaswa, M., Flascher, O., Krol, L. R., & Zander, T. O. (2017). Physiological effects of adaptive cruise control behaviour in real driving. In Proceedings of the 2017 ACM Workshop on An Application-Oriented Approach to BCI out of the Laboratory (pp. 15-19). New York, NY, USA: ACM. DOI


≤ 2016

  • Krol, L. R., Zander, T. O., Birbaumer, N. P., & Gramann, K. (2016). Neuroadaptive technology enables implicit cursor control based on medial prefrontal cortex activity. Proceedings of the National Academy of Sciences, 113(52), 14898–14903. DOI PDF
  • Krol, L. R., Freytag, S.-C., Fleck, M., Gramann, K., & Zander, T. O. (2016). A task-independent workload classifier for neuroadaptive technology: Preliminary data. In 2016 IEEE International Conference on Systems Man and Cybernetics (SMC), (pp. 003171–003174). DOI PDF
  • Stivers, J. M., Krol, L. R., de Sa, V. R., & Zander, T. O. (2016). Spelling with cursor movements modified by implicit user response. In Proceedings of the 6th International Brain-Computer Interface Meeting (p. 59). Graz, Austria: Verlag der TU Graz. DOI PDF
  • Zander, T. O., Brönstrup, J., Lorenz, R., & Krol, L. R. (2014). Towards BCI-based Implicit Control in Human-Computer Interaction. In S. H. Fairclough & K. Gilleade (Eds.), Advances in Physiological Computing (pp. 67–90). Berlin, Germany: Springer. DOI PDF
  • Krol, L. R., Aliakseyeu, D., & Subramanian, S. (2009). Haptic feedback in remote pointing. In CHI '09 Extended Abstracts on Human Factors in Computing Systems (pp. 3763–3768). DOI PDF



SEREEGA is a modular, general-purpose, open-source MATLAB-based toolbox to generate simulated data mimicking event-related electroencephalography (EEG) recordings.

Starting with a forward model obtained from a head model or pre-generated lead field, dipolar brain components can be defined. Each component has a specified position and orientation in the brain. Different activation signals can be assigned to these components. EEG data is simulated by projecting all activation signals from all components onto the scalp and summing them together.
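The projection-and-sum step amounts to a linear mixing model: scalp data is the product of a leadfield matrix (one projection pattern per source component) and the matrix of source activations. A minimal Python sketch of this idea, using a random stand-in leadfield rather than one derived from an actual head model:

```python
import numpy as np

rng = np.random.default_rng(0)

n_channels, n_sources, n_samples = 64, 3, 1000

# leadfield: projection of each source (position + orientation) onto the
# channels; random here for illustration, in practice obtained from a head model
leadfield = rng.standard_normal((n_channels, n_sources))

# one activation signal per source component
activations = rng.standard_normal((n_sources, n_samples))

# scalp EEG = sum of all source activations projected onto the channels
scalp = leadfield @ activations

assert scalp.shape == (n_channels, n_samples)
```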

SEREEGA is modular in that different head models and lead fields are supported, as well as different activation signals. Currently, SEREEGA supports the New York Head (ICBM-NY) pre-generated lead field, the Pediatric Head Atlases pre-generated lead fields, and FieldTrip custom lead field generation. Four types of activation signals are provided: ERP (event-related potential, i.e. systematic deflections in the time domain), ERSP (event-related spectral perturbation, i.e. systematic modulations of oscillatory activations), noise (stochastic processes in the time domain), and signals based on an autoregressive model. A final data class allows the inclusion of any already existing time series as an activation signal.
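As an illustration of one of these signal types: brown noise can be obtained by integrating white noise, yielding a 1/f² power spectrum. The sketch below shows the general idea in Python; it is not SEREEGA's actual implementation:

```python
import numpy as np

rng = np.random.default_rng(42)

# brown (red) noise has a 1/f^2 power spectrum; one simple way to obtain it
# is to cumulatively sum (integrate) white Gaussian noise
white = rng.standard_normal(1000)
brown = np.cumsum(white)

# normalise to a fixed peak amplitude
brown /= np.max(np.abs(brown))
```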

SEREEGA is intended as a tool to generate data with a known ground truth in order to evaluate neuroscientific and signal processing methods, e.g. blind source separation, source localisation, connectivity measures, brain-computer interface classifier accuracy, derivative EEG measures, et cetera.

As an example, the following code produces an EEGLAB dataset containing 100 epochs of 64-channel EEG data, each consisting of the summed activity of 64 simulated sources spread across the brain. The final dataset includes a ground-truth ICA decomposition that can be used to verify the accuracy of newly calculated decompositions.

% configuring for 100 epochs of 1000 ms at 1000 Hz
config      = struct('n', 100, 'srate', 1000, 'length', 1000);

% obtaining a 64-channel lead field from ICBM-NY
leadfield   = lf_generate_fromnyhead('montage', 'S64');

% obtaining 64 random source locations, at least 2.5 cm apart
sourcelocs  = lf_get_source_spaced(leadfield, 64, 25);

% each source will get the same type of activation: brown coloured noise
signal      = struct('type', 'noise', 'color', 'brown', 'amplitude', 1);

% packing source locations and activations together into components
components  = utl_create_component(sourcelocs, signal, leadfield);

% simulating data
data        = generate_scalpdata(components, leadfield, config);

% converting to EEGLAB dataset format, adding ICA weights
EEG         = utl_create_eeglabdataset(data, config, leadfield);
EEG         = utl_add_icaweights_toeeglabdataset(EEG, components, leadfield);
SEREEGA manuscript preprint on bioRxiv


plot_erp (MATLAB)

plot_erp is a MATLAB-based script to plot event-related potentials (ERPs) from epoched datasets (in EEGLAB format), for a single channel. Each ERP curve can be based on any number of datasets. The script can optionally calculate and plot a difference wave, standard errors, and permutation-based statistics (using permutationTest). Mean curves and statistics can be calculated either within or between datasets.
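The quantities involved can be sketched generically in Python (a toy illustration of an ERP and its standard error on made-up data, not plot_erp's actual code):

```python
import numpy as np

rng = np.random.default_rng(1)

# fake single-channel data: 30 epochs x 500 samples, a signal plus noise
n_epochs, n_samples = 30, 500
signal = np.sin(np.linspace(0, np.pi, n_samples))
epochs = signal + rng.standard_normal((n_epochs, n_samples))

# the ERP is the mean across epochs; the standard error of the mean
# quantifies its uncertainty
erp = epochs.mean(axis=0)
sem = epochs.std(axis=0, ddof=1) / np.sqrt(n_epochs)

# a difference wave is simply the difference between two such ERPs
```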

Sample screenshots:

  • Two ERPs plus their difference, standard error of the mean for all curves, and statistics highlighting significant differences in grey.

  • ERPs for eight groups of 19 datasets each with custom colour code, labels, and x scale indicator position.

plot_erp on GitHub


permutationTest (MATLAB)

A permutation test (aka randomisation test) for MATLAB, supporting one- and two-tailed tests and capable of visualising the outcome using a histogram. Provides a p-value, the observed difference, and the effect size.
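The logic of a permutation test can be sketched generically in Python (an illustration of the procedure, not the MATLAB implementation): pool both samples, repeatedly reshuffle the pooled values, and count how often a random relabelling produces a difference at least as extreme as the observed one.

```python
import numpy as np

def permutation_test(sample1, sample2, n_permutations=10000, seed=0):
    """Two-tailed permutation test on the difference between sample means."""
    rng = np.random.default_rng(seed)
    observed = np.mean(sample1) - np.mean(sample2)
    pooled = np.concatenate([sample1, sample2])
    n1 = len(sample1)
    count = 0
    for _ in range(n_permutations):
        rng.shuffle(pooled)  # random relabelling of the pooled values
        diff = np.mean(pooled[:n1]) - np.mean(pooled[n1:])
        if abs(diff) >= abs(observed):
            count += 1
    # p-value: fraction of permutations at least as extreme as observed
    return observed, count / n_permutations

rng = np.random.default_rng(1)
obs, p = permutation_test(rng.standard_normal(500),
                          rng.standard_normal(500) + 1, 1000)
```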

For example, the following code tests two samples (n = 5000 each) drawn from normal distributions against each other, one shifted .1 to the right (two-tailed, 10000 permutations). It produces the plot shown below.

sample1 = randn(1,5000);
sample2 = randn(1,5000) + .1;
permutationTest(sample1, sample2, 10000, 'plot', 1);

permutationTest on GitHub
permutationTest on MATLAB Central


maptorange (MATLAB)

A MATLAB script to map values from one range onto another, either linearly or along a given exponential curve. It also supports inverted ranges, and can extrapolate.
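The underlying mapping is straightforward: normalise the value within the source range, optionally apply an exponent, and scale the result into the target range (which may be inverted). A generic Python sketch of this idea (not maptorange's actual code):

```python
def map_to_range(value, source, target, exponent=1):
    """Map value from source range (a, b) onto target range (c, d),
    optionally along an exponential curve. Target may be inverted (d < c)."""
    a, b = source
    c, d = target
    normalised = (value - a) / (b - a)  # position within the source range, 0..1
    normalised **= exponent             # exponential falloff if exponent != 1
    return c + normalised * (d - c)     # scale into the target range

# 0 maps to .5 and 1 maps to 0, with a power-5 curve in between
print(map_to_range(0, (0, 1), (.5, 0), 5))   # 0.5
print(map_to_range(1, (0, 1), (.5, 0), 5))   # 0.0
```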

For example, we may map the values 0:.01:1 to their corresponding values between 0 and .5, where 0 becomes .5 and 1 becomes 0, with an exponential curve falloff using the power of 5.

plot(0:.01:1, maptorange(0:.01:1, [0 1], [.5 0], 'exp', 5));

maptorange on GitHub
maptorange on MATLAB Central


Multi-Dimensional Image Browser (Python)

A Python script (Python 2.7, using PIL and Tkinter) that allows image browsing across multiple dimensions indicated in the images' file names.

This has made the analysis of some visual data a little easier for me. Rather than browsing through image files in alphabetical order, this script allows them to be browsed through in a multi-dimensional fashion. Take for example the following files:

apple-brown.jpg     banana-brown.jpg      pear-brown.jpg
apple-green.jpg     banana-green.jpg      pear-green.jpg
apple-yellow.jpg    banana-yellow.jpg     pear-yellow.jpg

Regular alphabetical ordering would have you browse first through all apples, colour by colour, then through the bananas, colour by colour, and then the pears. These files, however, contain two dimensions: fruit type, and fruit colour. This script makes it possible to jump from the green banana to the green apple with one button, and from the green apple to the yellow apple with another.

Simply supply it with a directory and a file pattern, and the script automatically extracts and orders all dimension values.

File names should be consistent except for the variable parts, and the variable parts should be separated somehow. A variable part of the file names representing one dimension is indicated using an asterisk (*). For the above example, the pattern would be:

*-*.jpg

The script then allows you to browse through the available dimensions using the keyboard. The keys for each dimension sit in the column below that dimension's number: on a QWERTY keyboard, "q" moves one up in the first dimension and "a" one down; "w" and "s" increase and decrease the second dimension, respectively; "o" and "l" control the ninth. The first two dimensions can additionally be controlled by the arrow keys.

Dimensions are numbered in the order that they appear in the file names.
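The dimension extraction can be sketched generically in Python (an illustration of the idea, not the script's actual code): turn each asterisk into a regular-expression capture group and collect the unique values found per dimension.

```python
import re

def extract_dimensions(filenames, pattern):
    """Given file names and a pattern with * marking the variable parts,
    return the sorted unique values found for each dimension."""
    # turn the pattern into a regex: escape everything,
    # then make each escaped asterisk a non-greedy capture group
    regex = re.compile(re.escape(pattern).replace(r'\*', '(.*?)') + '$')
    dimensions = None
    for name in filenames:
        match = regex.match(name)
        if match is None:
            continue
        if dimensions is None:
            dimensions = [set() for _ in match.groups()]
        for i, value in enumerate(match.groups()):
            dimensions[i].add(value)
    return [sorted(d) for d in dimensions]

files = ['apple-brown.jpg', 'apple-green.jpg', 'banana-brown.jpg',
         'banana-green.jpg', 'pear-brown.jpg', 'pear-green.jpg']
print(extract_dimensions(files, '*-*.jpg'))
# [['apple', 'banana', 'pear'], ['brown', 'green']]
```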

Keyboard shortcuts include shift+D to set the directory, and shift+P for the pattern. Enter applies, escape exits.

Multi-Dimensional Image Browser on GitHub


Noctifer Gallery (PHP)

A small, single-file PHP script that handles everything you need to turn a directory into an online image gallery. It contains both a thumbnail browser and an image viewer. The browser shows thumbnails for supported image files (currently jpg, png, gif), and lists subdirectories for navigation. It automatically creates thumbnails of new files (given appropriate write permissions). The image viewer can show images at 100% of their size, or fit them to the screen, and can be controlled by keyboard. Two colour schemes are included, which can be easily customised.

This online example uses the dark colour scheme.

Noctifer Gallery on GitHub


Photography Gallery

A selection of pictures I've taken over the years.
Noctifer Photography Gallery