pkratzer.net

Welcome to my website!

About me

My name is Philipp Kratzer. I am a machine learning and robotics researcher at the Machine Learning and Robotics Lab of the University of Stuttgart. I am currently working on my PhD topic: using human motion prediction to improve close-distance human-robot interaction.

Publications

For a full list of my publications, check out my Google Scholar Profile.

Recent publications:

Prediction of Human Full-Body Movements with Motion Optimization and Recurrent Neural Networks (ICRA 2020; Best Paper Award finalist)

Human movement prediction is difficult as humans naturally exhibit complex behaviors that can change drastically from one environment to the next. In this work we investigate encoding short-term dynamics in a recurrent neural network, while accounting for environmental constraints, such as obstacle avoidance, using gradient-based trajectory optimization. Our experiments demonstrate that our framework improves predictions and can even be used to coordinate motions with a human partner!
paper
presentation slides
Bibtex
@inproceedings{kratzer2020prediction,
  title={Prediction of human full-body movements with motion optimization and 
  recurrent neural networks},
  author={Kratzer, Philipp and Toussaint, Marc and Mainprice, Jim},
  booktitle={2020 IEEE International Conference on Robotics and Automation (ICRA)},
  pages={1792--1798},
  year={2020}
}
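
Below is a minimal, illustrative Python sketch of the two-stage idea summarized in the abstract above: a recurrent network produces a short-term motion prediction, which is then refined by gradient-based trajectory optimization with an obstacle-avoidance cost. This is not the published implementation; the network size, cost weights, the spherical obstacle and all tensor shapes are made-up placeholders.

import torch
import torch.nn as nn

class ShortTermRNN(nn.Module):
    """Predicts the next `horizon` poses from a short history of past poses."""
    def __init__(self, pose_dim=6, hidden=64, horizon=10):
        super().__init__()
        self.horizon = horizon
        self.gru = nn.GRU(pose_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, pose_dim * horizon)

    def forward(self, history):                      # history: (B, T, pose_dim)
        _, h = self.gru(history)                     # h: (1, B, hidden)
        out = self.head(h[-1])                       # (B, pose_dim * horizon)
        return out.view(-1, self.horizon, history.shape[-1])

def refine_with_obstacle(prediction, obstacle_center, radius=0.3,
                         w_smooth=1.0, w_obst=10.0, steps=50, lr=0.05):
    """Gradient-based refinement of the RNN output: stay close to the
    data-driven prediction, keep the trajectory smooth, and push it out of
    a (placeholder) spherical obstacle."""
    prediction = prediction.detach()
    traj = prediction.clone().requires_grad_(True)
    opt = torch.optim.Adam([traj], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        # stay close to the network prediction
        data_cost = ((traj - prediction) ** 2).mean()
        # finite-difference acceleration as a smoothness term
        acc = traj[:, 2:] - 2 * traj[:, 1:-1] + traj[:, :-2]
        smooth_cost = (acc ** 2).mean()
        # penalize the first 3 pose dimensions (assumed xyz) entering the sphere
        dist = torch.norm(traj[:, :, :3] - obstacle_center, dim=-1)
        obst_cost = torch.relu(radius - dist).mean()
        (data_cost + w_smooth * smooth_cost + w_obst * obst_cost).backward()
        opt.step()
    return traj.detach()

if __name__ == "__main__":
    rnn = ShortTermRNN()
    history = torch.randn(1, 20, 6)                  # fake 20-frame pose history
    pred = rnn(history)
    refined = refine_with_obstacle(pred, obstacle_center=torch.tensor([0.0, 0.0, 0.0]))
    print(refined.shape)                             # torch.Size([1, 10, 6])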

MoGaze: A Dataset of Full-Body Motions that Includes Workspace Geometry and Eye-Gaze (IEEE Robotics and Automation Letters; 2020)

As robots become more present in open human environments, it will become crucial for robotic systems to understand and predict human motion. Such capabilities depend heavily on the quality and availability of motion capture data. Here we present a novel dataset that includes 1) long sequences of manipulation tasks, 2) the 3D model of the workspace geometry, and 3) eye-gaze. The dataset comprises 180 minutes of motion capture data with 1627 pick-and-place actions.
paper
website
Bibtex
@article{kratzer2020mogaze,
  title={MoGaze: A Dataset of Full-Body Motions that Includes 
  Workspace Geometry and Eye-Gaze},
  author={Kratzer, Philipp and Bihlmaier, Simon and Midlagajni, Niteesh 
  Balachandra and Prakash, Rohit and Toussaint, Marc and Mainprice, Jim},
  journal={IEEE Robotics and Automation Letters},
  year={2020}
}
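
The sketch below only illustrates how one might iterate over a motion-capture dataset of this kind in Python. The file name and the HDF5 keys ("poses", "gaze") as well as the segment annotations are placeholders, not the actual MoGaze file layout; please refer to the dataset website for the real format and loading tools.

import h5py
import numpy as np

def load_recording(path):
    """Load full-body poses and eye-gaze from a (hypothetical) HDF5 layout."""
    with h5py.File(path, "r") as f:
        poses = np.asarray(f["poses"])      # (num_frames, num_joints * dof)
        gaze = np.asarray(f["gaze"])        # (num_frames, 3) gaze direction
    return poses, gaze

def split_pick_and_place(poses, segments):
    """Cut a long recording into per-action clips given (start, end) frames."""
    return [poses[start:end] for start, end in segments]

if __name__ == "__main__":
    # write a tiny synthetic file so this sketch runs stand-alone
    with h5py.File("recording_placeholder.hdf5", "w") as f:
        f["poses"] = np.zeros((300, 66))
        f["gaze"] = np.zeros((300, 3))
    poses, gaze = load_recording("recording_placeholder.hdf5")
    # segments would be annotated frame ranges of individual pick/place actions
    clips = split_pick_and_place(poses, segments=[(0, 120), (120, 260)])
    print(len(clips), "clips,", poses.shape[0], "frames total")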

Anticipating Human Intention for Full-Body Motion Prediction in Object Grasping and Placing Tasks (RO-MAN; 2020)

In this work, we focus on manipulation movements in environments such as homes, workplaces or restaurants, where the overall task and environment can be leveraged to produce accurate motion prediction. For these cases we propose an algorithmic framework that accounts explicitly for the environment geometry, based on a model of affordances and a model of short-term human dynamics, both trained on motion capture data. The predicted grasp and placement probability densities are used by a constraint-based trajectory optimizer to produce a full-body motion prediction over the entire horizon.
paper
Bibtex
@inproceedings{kratzer2020anticipating,
  title={Anticipating Human Intention for Full-Body Motion Prediction in Object Grasping and 
  Placing Tasks},
  author={Kratzer, Philipp and Midlagajni, Niteesh Balachandra and Toussaint, Marc and 
  Mainprice, Jim},
  booktitle={29th IEEE International Conference on Robot and Human Interactive 
  Communication (RO-MAN)},
  pages={1157--1163},
  year={2020}
}
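
To give a rough idea of the pipeline described above, here is an illustrative Python sketch: a toy probability density over candidate grasp locations provides a goal, which is then used as a soft constraint in a gradient-based, full-horizon trajectory optimization. All numbers, the mixture model and the cost weights are invented placeholders, not the published system.

import torch

def most_likely_grasp(means, weights):
    """Pick the mode of a toy mixture over candidate grasp positions."""
    return means[torch.argmax(weights)]

def optimize_trajectory(start, goal, horizon=30, steps=200, lr=0.05,
                        w_goal=50.0, w_smooth=1.0):
    """Optimize a 3D trajectory from start towards the predicted grasp goal,
    trading off smoothness against start/goal constraints over the horizon."""
    # initialize with a straight-line interpolation between start and goal
    alphas = torch.linspace(0, 1, horizon).unsqueeze(1)
    traj = (start * (1 - alphas) + goal * alphas).clone().requires_grad_(True)
    opt = torch.optim.Adam([traj], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        acc = traj[2:] - 2 * traj[1:-1] + traj[:-2]          # smoothness term
        goal_cost = ((traj[-1] - goal) ** 2).sum()           # soft goal constraint
        start_cost = ((traj[0] - start) ** 2).sum()          # pin the start pose
        loss = w_smooth * (acc ** 2).sum() + w_goal * (goal_cost + start_cost)
        loss.backward()
        opt.step()
    return traj.detach()

if __name__ == "__main__":
    # toy density over two candidate grasp locations on a table
    grasp_means = torch.tensor([[0.5, 0.2, 0.8], [0.3, -0.4, 0.8]])
    grasp_weights = torch.tensor([0.7, 0.3])
    goal = most_likely_grasp(grasp_means, grasp_weights)
    traj = optimize_trajectory(start=torch.tensor([0.0, 0.0, 1.0]), goal=goal)
    print(traj.shape)   # torch.Size([30, 3])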

Teaching

I am involved in teaching courses at the University of Stuttgart.