This PhD topic is part of the SHARED research project, funded by the French National Research Agency (ANR). The net salary will be approximately 1500 Euros per month (comfortable for living in France).

Context: Shape registration and analysis of 3D surface datasets of people has become mainstream, gradually replacing conventional methods based on 2-dimensional images. Across a variety of disciplines ranging from anthropometry, computer-aided design (CAD), and computer graphics to psychology, the practice of using 3D laser scanners for surface shape capture and building statistical models from sets of registered surface data is now widely accepted. While a large body of research exists on static datasets, with a proliferation of algorithms and a solid theoretical foundation, the same cannot be said of dynamic, time-varying datasets, owing to the limited accessibility of dynamic surface data. In most shape capture sessions, the person is required to remain motionless during scanning. Consequently, current registration techniques (and therefore shape analysis techniques) handle the geometric features of static datasets, while the dynamic behavior of human skin remains largely unexplored. This is unfortunate, since dynamic features cannot be captured by geometric features alone when the target subjects undergo deformation. Although the use of geometric features based on anatomical knowledge is still the gold standard, it clearly offers only limited capability for reliable correspondence computation, because commonly observed subjects such as the human body are highly mobile and drastically change not only their spatial arrangement but also their geometric features over time.
PhD advisors:
– Hyewon SEO: https://lsiit.u-strasbg.fr/igg-fr/index.php/Hyewon_Seo
– Dominique Bechmann: http://lsiit.u-strasbg.fr/igg-en/index.php/Dominique_Bechmann

Requirements:
– Master in Computer Science, Electrical Engineering, or Mathematics
– Good programming and communication skills
– A good level of English is mandatory