Tim Cornelissen
Experience Prototyper | Creative Technologist
Experimental Psychology PhD
Coming up!
(Updating this page is a work in progress for a few more days.)
Apple Inc.
Cupertino, CA, USA
Scene Grammar Lab, Goethe University
Frankfurt, Germany
Humanities Laboratory, Lund University
Lund, Sweden
Utrecht University
Utrecht, The Netherlands
Cornelissen, T., Sassenhagen, J., & Võ, M. L.-H. (2019). Improving free-viewing fixation-related EEG potentials with continuous-time regression. Journal of Neuroscience Methods.
Niehorster, D.*, Cornelissen, T.*, Holmqvist, K., & Hooge, I. (2018). Searching with and against each other: spatiotemporal coordination of visual search behavior in collaborative and competitive settings. Attention, Perception, & Psychophysics.
Cornelissen, T., & Võ, M. L.-H. (2016). Stuck on semantics: Processing of irrelevant object-scene inconsistencies modulates ongoing gaze behavior. Attention, Perception, & Psychophysics.
* Authors contributed equally to the paper.
Lauer, T., Cornelissen, T.H.W., Draschkow, D., Willenbockel, V., & Võ, M. L.-H. (2018). The Role of Scene Summary Statistics in Object Recognition. Scientific Reports.
Hessels, R.S., Benjamins, J.S., Cornelissen, T.H.W., & Hooge, I.T.C. (2018). A validation of automatically-generated Areas-of-Interest in videos of a face for eye-tracking research. Frontiers in Psychology.
Hessels, R.S., Holleman, G.A., Cornelissen, T.H.W., Hooge, I.T.C. & Kemner, C. (2018). Eye contact takes two: autistic and social anxiety traits predict gaze behavior in dyadic interaction. Journal of Experimental Psychopathology.
Niehorster, D., Cornelissen, T., Holmqvist, K., Hooge, I., & Hessels, R. (2017). What to expect from your remote eye-tracker when participants are unrestrained. Behavior Research Methods.
Hessels, R.S., Cornelissen, T.H.W., Hooge, I.T.C. & Kemner, C. (2017). Gaze Behavior to Faces During Dyadic Interaction. Canadian Journal of Experimental Psychology.
Nyström, M., Niehorster, D., Cornelissen, T., & Garde, H. (2016). Real-time sharing of gaze data between multiple eye trackers - evaluation, tools, and advice. Behavior Research Methods.
Hooge, I., Nyström, M., Cornelissen, T., & Holmqvist, K. (2015). The art of braking: post saccadic oscillations in the eye tracker signal decrease with increasing saccade size. Vision Research.
Hessels, R.S., Cornelissen, T.H.W., Kemner, C. & Hooge, I.T.C. (2014). Qualitative tests of remote eyetracker recovery and performance during head rotation. Behavior Research Methods.
Dalmaijer, E.S., Van der Stigchel, S., Nijboer, T.C.W., Cornelissen, T.H.W., & Husain, M. (2014). CancellationTools: All-in-one software for administration and analysis of cancellation tasks. Behavior Research Methods.
I'd really love to tell you... but most of what happens inside the spaceship in Cupertino stays inside the spaceship for a few more years.
Screen Distance can help children engage in healthy viewing habits that can lower
their risk of myopia, and can give people of all ages the opportunity to reduce digital eye strain.
Viewing a device or book too closely for an extended period of time
can increase eye strain and the risk of myopia. The Screen Distance feature detects
when you hold your iPhone closer than 12 inches for an extended period, and encourages
you to move it farther away.
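Apple's actual implementation is not public, but the behavior described above can be sketched as a simple state check over a stream of distance samples. Everything here is an assumption for illustration: the sample format, the `should_prompt` helper, and the 30-second duration are all hypothetical; only the 12-inch threshold comes from the feature description.

```python
THRESHOLD_INCHES = 12.0   # from the feature description: "closer than 12 inches"
SUSTAINED_SECONDS = 30.0  # hypothetical value for "an extended period"

def should_prompt(samples):
    """Return True once the device has been held too close for a sustained stretch.

    samples: iterable of (timestamp_seconds, distance_inches) pairs, in time order.
    """
    too_close_since = None
    for t, d in samples:
        if d < THRESHOLD_INCHES:
            if too_close_since is None:
                too_close_since = t  # start of a too-close stretch
            if t - too_close_since >= SUSTAINED_SECONDS:
                return True
        else:
            too_close_since = None  # moved far enough away; reset the timer
    return False
```

The key design point is the reset branch: briefly glancing at the phone up close should not trigger a prompt, only a sustained stretch below the threshold.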
I was deeply involved in guiding this project from its earliest stages. My tasks included:
prototyping initial experiences, demonstrating technical feasibility, setting specifications and requirements,
and advocating for the feature. Of course, nothing like this happens without very talented
clinicians, scientists, and developers around you.
“You navigate by simply using your eyes, hands, and voice”
Developed an eye movement classification algorithm that contributes to all Vision Pro interactions.
Provided eye movement expertise for model architecture and data labeling instructions. Created metrics to evaluate ML models against ground truth.
Collaborated with vision scientists, machine learning experts, and developers to make this a reality.
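To give a flavor of what eye movement classification involves (without revealing anything about the Vision Pro algorithm, which is not public), here is a minimal velocity-threshold classifier, known in the eye-tracking literature as I-VT. The 30 deg/s cutoff is a commonly used default, not a Vision Pro parameter, and the `classify_ivt` function is purely illustrative.

```python
import math

VELOCITY_THRESHOLD_DEG_S = 30.0  # common fixation/saccade cutoff in the literature

def classify_ivt(timestamps, x_deg, y_deg, threshold=VELOCITY_THRESHOLD_DEG_S):
    """Label each inter-sample interval as 'fixation' or 'saccade'.

    timestamps: sample times in seconds; x_deg, y_deg: gaze angles in degrees.
    Computes angular velocity between consecutive samples and compares it
    to the threshold.
    """
    labels = []
    for i in range(1, len(timestamps)):
        dt = timestamps[i] - timestamps[i - 1]
        dist = math.hypot(x_deg[i] - x_deg[i - 1], y_deg[i] - y_deg[i - 1])
        velocity = dist / dt if dt > 0 else 0.0
        labels.append("saccade" if velocity > threshold else "fixation")
    return labels
```

Real classifiers for head-mounted displays are far more involved (noise filtering, smooth pursuit, vestibulo-ocular compensation), but the fixation/saccade distinction above is the conceptual starting point, and per-sample labels like these are also what get compared against human-labeled ground truth when evaluating ML models.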