Welcome to my website!

About me

My name is Philipp Kratzer. I am a machine learning and robotics researcher within the International Max Planck Research School for Intelligent Systems (IMPRS-IS) and the University of Stuttgart. Currently I am working on my PhD topic: using human motion prediction to improve close-distance human-robot interaction.


For a full list of my publications, check out my Google Scholar Profile.

Recent publications:

Prediction of Human Full-Body Movements with Motion Optimization and Recurrent Neural Networks (ICRA 2020; Best Paper award finalist)

Human movement prediction is difficult because humans naturally exhibit complex behaviors that can change drastically from one environment to the next. In this work we investigate encoding short-term dynamics in a recurrent neural network, while accounting for environmental constraints, such as obstacle avoidance, using gradient-based trajectory optimization. Our experiments demonstrate that our framework improves predictions and can even be used to coordinate motions with a human partner!
presentation slides
@inproceedings{kratzer2020prediction,
  title={Prediction of Human Full-Body Movements with Motion Optimization and Recurrent Neural Networks},
  author={Kratzer, Philipp and Toussaint, Marc and Mainprice, Jim},
  booktitle={2020 IEEE International Conference on Robotics and Automation (ICRA)},
  year={2020}
}

MoGaze: A Dataset of Full-Body Motions that Includes Workspace Geometry and Eye-Gaze (IEEE Robotics and Automation Letters; 2020)

As robots become more present in open human environments, it will become crucial for robotic systems to understand and predict human motion. Such capabilities depend heavily on the quality and availability of motion capture data. Here we present a novel dataset that includes 1) long sequences of manipulation tasks, 2) the 3D model of the workspace geometry, and 3) eye-gaze. The dataset comprises 180 minutes of motion capture data covering 1627 pick-and-place actions.
@article{kratzer2020mogaze,
  title={MoGaze: A Dataset of Full-Body Motions that Includes Workspace Geometry and Eye-Gaze},
  author={Kratzer, Philipp and Bihlmaier, Simon and Midlagajni, Niteesh Balachandra and Prakash, Rohit and Toussaint, Marc and Mainprice, Jim},
  journal={IEEE Robotics and Automation Letters},
  year={2020}
}

Anticipating Human Intention for Full-Body Motion Prediction in Object Grasping and Placing Tasks (RO-MAN; 2020)

In this work, we focus on manipulation movements in environments such as homes, workplaces, or restaurants, where the overall task and environment can be leveraged to produce accurate motion predictions. For these cases we propose an algorithmic framework that accounts explicitly for the environment geometry, based on a model of affordances and a model of short-term human dynamics, both trained on motion capture data. The predicted grasp and placement probability densities are used by a constraint-based trajectory optimizer to produce a full-body motion prediction over the entire horizon.
@inproceedings{kratzer2020anticipating,
  title={Anticipating Human Intention for Full-Body Motion Prediction in Object Grasping and Placing Tasks},
  author={Kratzer, Philipp and Midlagajni, Niteesh Balachandra and Toussaint, Marc and Mainprice, Jim},
  booktitle={29th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN)},
  year={2020}
}