Welcome. This is my portfolio site showcasing my scientific work, other projects, and photography. I’m originally from the Netherlands but currently live and work as a PhD student at Justus Liebig University Giessen in Germany, where I study the perception of liquids and other deformable materials using psychophysical techniques. Outside of work I have a great interest in photography; I mostly find time to pick up my camera while traveling, as you can see in the photography section.
I have a background in computer science and a bit of computer/electrical engineering. After following mostly HCI (human–computer interaction) oriented courses I ended up in the field of psychophysics. After my PhD research, however, I plan to steer my career back toward HCI-oriented work, where I can combine my knowledge of psychophysics and engineering. Ultimately I am interested in creating digital extensions of our physical world, enabling experiences beyond the capabilities of our regular senses.
If you have more specific questions, please don’t hesitate to contact me at mail [at] janjaap [dot] info.
Currently I work as a PhD student in the department of Experimental Psychology at the Justus-Liebig-Universität Gießen, Germany, in the lab of Roland Fleming. I study the perception of liquids, and in particular the perception of viscosity. This research was part of an EU-funded Marie Curie Initial Training Network called PRISM (Perceptual Representation of Illumination, Shape & Material). The last year of my PhD research is funded by a Google Faculty Research Award, in collaboration with Google DeepMind.
Fluids and other deformable materials have highly mutable shapes, which are visibly influenced by both intrinsic properties (e.g. viscosity, velocity) and extrinsic forces (e.g. gravity, object interactions). Little is known about how we perceive liquids so consistently despite their ever-changing shapes and volumes. I try to map some of the perceptual dimensions and physical cues we use to perceive liquids. We have looked at perceived viscosity constancy over time and the effect of optical material appearance on liquid properties, and we are currently investigating the specific shape cues we use to infer intrinsic properties. This is done in cooperation with Inria in Bordeaux, with whom we are building a toolbox to measure both local and global 3D shape features; for liquids we use particle data, mesh vertices, and voxel analysis.
During my PhD I spent quite some time setting up a technical pipeline to generate stable, precise, and realistic liquid stimuli. To prepare for this I completed a three-month internship at Next Limit in Madrid, where I learned to work with RealFlow, their particle simulation program made for the VFX industry, and to render the simulated liquids realistically with Maxwell Render. The computational costs of these stimuli are quite high, so I wrote scripts to distribute the calculations over various systems and the university cluster. Some of these stimulus sets took over two months to compute. Over 50,000 images were generated, showing liquids over time with different viscosities and different optical material appearances. The video shows an overview of some of the stimuli used in my studies.
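The distribution scripts themselves aren’t included here, but the core idea can be sketched as a simple round-robin split of render frames over machines (the function name and chunking scheme are my own illustration, not the actual scripts):

```python
# Illustrative sketch only; the actual distribution scripts are not shown here.

def split_frames(frames, n_workers):
    """Divide a list of frame numbers round-robin over n_workers machines."""
    chunks = [[] for _ in range(n_workers)]
    for i, frame in enumerate(frames):
        chunks[i % n_workers].append(frame)
    return chunks

# 100 animation frames spread over 4 render machines
jobs = split_frames(list(range(1, 101)), 4)
```

Each chunk can then be submitted as a separate render job on a machine or the cluster’s queue.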
For my Master’s studies at Leiden University in the Netherlands I did a graduation project on gloss perception. The actual research was done in the Perceptual Intelligence Lab at Delft University of Technology, where Sylvia Pont and Maarten Wijntjes were my external supervisors. This was the first time I came into contact with psychophysical research.
We investigated the influence of the spatial structure of illumination on gloss perception. The inspiration came from various artworks, such as the paintings of Vermeer, where much simpler highlight shapes are used than in real-world scenes. We performed three experiments: two using digital photographs on a computer monitor and one with real spheres in a light box. The observers had to compare and rate glossiness. The results show that more complex highlight shapes were perceived to produce a less glossy appearance than simple highlight shapes such as a disk or square. This shows that, contrary to some beliefs, the complexity of a highlight shape’s spatial structure alone is not the main criterion for increasing perceived glossiness.
A diffuse light box combined with differently shaped masks was used to produce a set of six simple and more complex highlight shapes. In the box we placed spherical stimuli painted in six degrees of glossiness, resulting in a stimulus set of 6 highlight shapes × 6 gloss levels, a total of 36 stimuli. Observers were asked to rate glossiness while looking at the real scene in the light box, but we also performed experiments with photographs of the stimuli. The figure below shows a subset of the stimuli. Six masks were chosen to represent common shapes used in studio photography, paintings, and cartoons, like the disk, square, window, and ring shape. The other shapes were chosen to test whether highlight shape complexity increases perceived glossiness.
2017 Vivian C. Paulun, Filipp Schmidt, Jan Jaap R. van Assen, Roland W. Fleming. Shape, motion, and optical cues to stiffness of elastic objects. Journal of Vision, 17(1):20, 1–22, doi: 10.1167/17.1.20. [PDF]
2016 Jan Jaap R. van Assen, Maarten W. A. Wijntjes, and Sylvia C. Pont. Highlight shapes and perception of gloss for real and photographed objects. Journal of Vision, 16(6):6, 1–14, doi: 10.1167/16.6.6. [PDF]
2016 Jan Jaap R. van Assen, Pascal Barla and Roland W. Fleming. Identifying shape features underlying liquid perception. European Conference on Visual Perception (ECVP) Annual Meeting 2016.
2015 Jan Jaap R. van Assen and Roland W. Fleming. The influence of optical material appearance on the perception of liquids and their properties. Vision Sciences Society (VSS) Annual Meeting 2015.
2016 Filipp Schmidt, Vivian C. Paulun, Jan Jaap R. van Assen and Roland W. Fleming. Optical and shape cues to the visual perception of softness. Annual Meeting of the Collaborative Research Center Cardinal Mechanisms of Perception (SFB/TRR 135) 2016, Rauischholzhausen, Germany. Poster
2016 Vivian C. Paulun, Jan Jaap R. van Assen and Roland W. Fleming. Visual cues to stiffness of elastic objects. Vision Sciences Society (VSS) Annual Meeting 2016. Poster
2014 Jan Jaap R. van Assen, Vivian C. Paulun and Roland W. Fleming. Constancy of visually perceived fluid viscosity. Conference of Experimental Psychologists (TeaP) Annual Meeting 2014. Abstract|Poster.
2013 Jan Jaap R. van Assen, Sylvia C. Pont and Maarten W. A. Wijntjes. Highlight shape influences gloss perception. European Conference on Visual Perception (ECVP) Annual Meeting 2013. Abstract|Poster.
2015 Elsevier/Vision Research Travel Award
2013 ECVP Travel Grant Award
A selection of projects I worked on during my Bachelor’s and Master’s studies.
One elective course I followed was Light Architecture at Delft University of Technology. Seven students with different backgrounds in industrial design and architecture worked on this project. We had to design a lighting plan emphasising the architecture of, in our case, the H. Antonius Abt church in Scheveningen. This church was built in 1927 with a very sober exterior and a very rich interior, a contrast we wanted to translate into the lighting design. The church was interested in a free, detailed lighting plan and therefore provided building schematics and took part in the conceptual process. After the requirements were defined we started to make the lighting plan.
To express the design we used 3D renderings. Google SketchUp was used to build the model from the building schematics, and the V-Ray render engine was used to render the light. At the time, the newest version of V-Ray had just added support for IES profiles. An IES profile captures the specific light characteristics of an illuminant, measured under lab conditions, and is used to simulate very realistic representations of an existing light source. We used lights made by ACDC lighting and implemented their IES files in our design, so the renderings represented the light emitted by their actual products and gave a very good impression of the lighting design.
I have always been interested in ferrofluids, and during my Master’s studies I got a chance to experiment with them in the course Hardware & Physical Computing. We didn’t have many resources at our disposal, such as a good electromagnet, which required some creative solutions. A circuit was designed using TIP122 transistors, 3 A diodes, and a modified transformer (cut in half) combined with a permanent magnet.
The power source was a modified PSU (computer power supply) that delivered 6 amps at 12 volts. Two LDRs were used as sensors to control the ferrofluid. The entire circuit was controlled by an Arduino: one LDR controlled the amplitude of the magnetic field and the other the frequency of a pulse-width-modulated sinusoid. This modulation made it look as if the ferrofluid was breathing.
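The original Arduino sketch isn’t shown here, so the exact mapping is a guess, but the control logic can be sketched as follows: the two raw LDR readings scale the amplitude and the frequency of a sinusoid whose value becomes the PWM duty cycle driving the electromagnet (the frequency range is an assumption chosen to give a slow “breathing” effect):

```python
import math

def coil_duty(t, ldr_amp, ldr_freq):
    """Duty cycle (0..1) of the PWM signal driving the electromagnet.

    ldr_amp and ldr_freq are raw 10-bit Arduino readings (0..1023):
    one LDR scales the amplitude of the sinusoid, the other its frequency.
    Hypothetical reconstruction; the original sketch is not shown.
    """
    amplitude = ldr_amp / 1023.0               # 0..1
    freq_hz = 0.1 + 2.0 * ldr_freq / 1023.0   # slow "breathing" range (assumed)
    # Offset the sinusoid so the duty cycle never goes negative
    return 0.5 * amplitude * (1.0 + math.sin(2 * math.pi * freq_hz * t))
```

On the Arduino itself the result would be scaled to 0–255 and written out with `analogWrite`.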
This crazy piece of machinery was built to let you perceive the visual world around you in a more dynamic way by altering the visual field. The eyes are replaced by two motorized webcams, each sending its own signal to one of the eyes. These motorized effects are really strange to perceive, since the ‘new’ eyes can turn to far greater extremes than a normal eye. A small 10-inch netbook is built into the helmet, and its display shows the two webcam feeds. On top there is an Arduino with three servo motors. To keep the helmet wireless, it needs onboard power.
A 9 V battery powers the Arduino, and four heavy ‘D’ batteries power the servos. We used Max/MSP Jitter to display the webcam feeds on the screen and the accelerometer of an Android phone to control the servos. The data is sent over a Wi-Fi network set up locally on the netbook, using the OSC protocol to pass the accelerometer readings to Max/MSP. The main idea was that the phone could also be fixed to the helmet, so that your head movements would also control your eyes. The result, The Desyncapitator, is shown in the following video:
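The patch itself isn’t shown, but this is roughly what an OSC packet looks like on the wire: a null-padded address, a type tag string, and big-endian float arguments. The `/accel` address and the values are made up for illustration:

```python
import struct

def osc_pad(b: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, as OSC requires."""
    b += b"\x00"
    while len(b) % 4:
        b += b"\x00"
    return b

def osc_message(address: str, *args: float) -> bytes:
    """Encode an OSC message with float32 arguments (big-endian)."""
    data = osc_pad(address.encode("ascii"))
    data += osc_pad(("," + "f" * len(args)).encode("ascii"))
    for a in args:
        data += struct.pack(">f", a)
    return data

# e.g. three accelerometer axes sent to Max/MSP (address made up):
packet = osc_message("/accel", 0.0, 9.81, 0.12)
```

The resulting bytes are sent as a UDP datagram; Max/MSP’s `udpreceive` object decodes them back into a message.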
T3 Touch Table
T3, or Tastbare Taal Tafel (Tangible Language Table), is an interactive touch table with finger and object recognition. The table was designed for primary schools, where children could write, tell, and draw stories on it. This was part of the HCI (Human Computer Interaction) semester during my Mediatechnology studies at the University of Applied Sciences of Utrecht. reacTIVision was used for fiducial recognition and Community Core Vision for multitouch finger recognition, with a Flash application providing the storytelling interface.
There are some additional tools: physical objects that can be placed on or removed from the screen. The size of the screen is somewhat limited by the height of the table; the users are children between 6 and 9 years old, so the table can’t be too high. A projector is placed inside the table with a mirror to extend the projection length, and therefore the screen size. A webcam with an infrared pass-through filter and six infrared LED spots handle the finger and object recognition. The video below shows the design and functions (in Dutch).
Footsteps of Mutation
Are the possibilities of delivering a meaningful sentence endless? Can sentence meaning transform into a higher level of poetic art through language? Experience the mutations of sentence meaning as Dutch sentences take their first footsteps to mutation. We needed to build an installation that depicted the main theme ‘obsolete’ and the sub-theme ‘mutation’. We chose to put this in the context of language, where the mutation of language results in meaning becoming obsolete. Our installation showed this by projecting mutated sentences on a canvas. For the mutation of language we used the Delilah system from the Leiden University Centre for Linguistics.
This system can semantically parse and generate Dutch. The user enters a sentence, Delilah parses it, and from the parsed meaning a new sentence is generated and projected on the canvas. By repeating this, sentences mutate, and if they mutate too much their meaning becomes obsolete. We used an old typewriter to enter the sentences, because typewriters too became obsolete as digital devices took over. Some technical hurdles had to be overcome to integrate a 1930s typewriter, Delilah running on SICStus Prolog, and the projection of the sentences on a canvas using Processing and Java. The final installation was very promising, even though it could have used a bit more refinement.
Dot Matrix Organ
The assignment for the New Media & New Technologies course was to make a tribute to an old new-media carrier. We (my classmate Erik and I) chose the dot matrix printer, which was very important in its time because, in combination with a PC, it completely replaced the typewriter in the average home. We tried to connect various stepper motors from different printers to an Arduino. This took some work, since stepper motors only work by sending synchronized signals to the different magnets in the motor. Luckily there are some useful libraries available. The plan was to connect the printer to a keyboard and generate different pitches by pressing different keys.
My parents had a matrix printer from 20 years ago hidden somewhere in the attic that we could use for the final product. The experience gained exploring the other parts proved useful when connecting the Epson. The stepper motors used in the Epson turned out to be more advanced, with a bipolar instead of a unipolar design. We had to change some circuit plans involving the ULN2003 chip, which boosts the current on each line using a Darlington array. After figuring out the right circuit we only needed to fine-tune the software and connect the keyboard, which was much easier. The result, although not musically fine-tuned, worked perfectly.
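As an illustration of what those “synchronized signals” look like (a sketch only; the actual Arduino code relied on a stepper library), a full-step drive cycles the two coils of a bipolar motor through four polarity states, and the step rate sets the pitch of the motor’s whine:

```python
# Full-step sequence for a bipolar stepper's two coils (A, B).
# Each tuple gives the polarity of coil A and coil B; cycling through
# the four states in order advances the rotor one step at a time.
# Illustrative only; the actual sketch used a stepper library.
FULL_STEP = [(+1, +1), (-1, +1), (-1, -1), (+1, -1)]

def step_sequence(n_steps, direction=1):
    """Yield the coil states for n_steps steps; direction=-1 reverses."""
    for i in range(n_steps):
        yield FULL_STEP[(i * direction) % 4]

def step_delay(freq_hz):
    """Delay between steps (in seconds) for a desired pitch in Hz."""
    return 1.0 / freq_hz
```

Pressing a key would simply select a frequency and step the motor with the corresponding delay, turning the printer mechanism into a crude musical instrument.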