1 parent fab86f5 · commit ce8f710
Showing 1 changed file with 116 additions and 0 deletions:
_teaching/2025-SS-Seminar-Human-Performance-Capture.html
---
title: Seminar on Human-Performance Capture
ref: shpc2025
description:
semester: Spring 2025
number: <a href="https://www.vorlesungen.ethz.ch/Vorlesungsverzeichnis/lerneinheit.view?lerneinheitId=188654&semkez=2025S&lang=en">263-3712-00L</a>
lecturer: Juan Zarate
head-ta:
ta:
assistants:
edoz:
session-time-place: Tue 14:15-16:00, in-person at <a href="https://www.rauminfo.ethz.ch/Rauminfo/grundrissplan.gif?gebaeude=STD&geschoss=G&raumNr=1&lang=en" title="STD G1">STD G1</a>. Virtual option for external students.
links:
credits: 2
---

<h3>Overview</h3>
<p>
Performance Capture refers to the technology and processes involved in digitally recreating the movements, poses, gestures, appearance, and expressions of a human performer and their surroundings.
Methods to do so typically use body-worn sensors, RGB and depth cameras, and advanced computer vision algorithms to record the intricate details of a performer's movements.
</p>

<br/>
<img src="/assets/teaching/teaser_performance_capture.png" alt="Teaser image for Human-Performance Capture course" width="660" />
<br/>

<h3>Objective</h3>
<p>
The goal of the seminar is not only to familiarize students with exciting new research topics in this important area, but also to teach basic scientific writing and oral presentation skills.
</p>

<h3>Content</h3>
<p>
Performance Capture refers to the technology and process involved in capturing and recreating the movements, gestures, and expressions of a human performer and their surroundings.
It is relevant to a wide range of applications, including animation, virtual reality, and video games.
It involves using advanced sensors, cameras, and computer vision algorithms to record the intricate details of a performer's movements, typically focusing on their face, body, clothing, and interactions.
We will also explore the technology and applications behind high-end, multi-view capture stages, such as the recently opened
<a href="https://ethz.ch/staffnet/en/news-and-events/internal-news/archive/2022/06/holograms-at-the-touch-of-a-button.html" title="Volumetric Capture Lab">Volumetric Capture Lab</a>.
</p>

<p>
Papers from scientific venues such as CVPR, ICCV, ECCV, and SIGGRAPH will be examined in depth to explore the latest research in the field.
Students present and discuss the papers to extract techniques and insights that can be applied to software and hardware projects.
</p>

<h3>Format</h3>
<p>
The seminar is structured to encourage critical reading and discussion of the papers, and attendance at the weekly meetings is mandatory.
We use a case-study format where all students read the same paper each week but fulfill different roles and hence prepare with different viewpoints in mind ("Presenter", "PhD student", and "Journalist").
The final deliverables include:
<ol>
<li> Once: a 20-minute presentation as <b>Presenter</b>, giving a <i>short</i> presentation about the paper that you prepare in depth. </li>
<!-- <li> Once: a 5-minute presentation as <b>Historian</b>, explaining how the paper sits in the context of the related work. </li> -->
<li> Once: an A4 research proposal as <b>PhD student</b>, proposing a follow-up project for your own research based on the paper. </li>
<li> Once: an A4 summary of the discussion as <b>Journalist</b>, writing an article about the paper that can be understood by the general public. </li>
<li> Every week: <b>All students</b> come up with an alternative title and three questions/comments about the paper before the start of the seminar. </li>
</ol></p>

<h3>Slides</h3>
<p>(use ETH credentials to access the file)
</p>

<!-- <a class="anchor" id="schedule"></a>
<h3>Schedule</h3>
T.B.D. -->

<!-- possible link-icon-classes:
.a-ext (Link to external Website)
.a-pdf (PDF)
.a-cod (Code)
.a-bib (BibTeX)
.a-vid (Video)
.a-dem (Demo)
.a-ppt (PowerPoint presentation)
.a-zip (Zipped Files)
-->
<!--<b>We will assign students to topics and roles during or after the first seminar on Wed 21/9/2016.</b>-->
<!--
<table><tbody>
<tr><th>Wk.</th> <th>Date</th> <th>TA</th> <th>Presenter</th> <th>Historian</th> <th>PhD student</th> <th>Journalist</th> <th>Paper</th>
<tr><td>1</td> <td>19.02.2019</td> <td><h5>--</h5> <td><h5></h5> <td><h5></h5> <td><h5></h5> <td><h5></h5> <td> --- Introduction --- </td></tr>
<tr><td>2</td> <td>26.02.2019</td> <td><h5> -- </h5> <td><h5></h5> <td><h5></h5> <td><h5></h5> <td><h5></h5> <td> </td></tr>
<tr><td>3</td> <td>05.03.2019</td> <td><h5> -- </h5> <td><h5></h5> <td><h5></h5> <td><h5></h5> <td><h5></h5> <td> </td></tr>
<tr><td>4</td> <td>12.03.2019</td> <td><h5>Thomas Langerack</h5> <td><h5>A.B.</h5> <td><h5>P.B.</h5> <td><h5>Y.B.</h5> <td><h5>J.L.</h5> <td><a href="https://www.gillesbailly.fr/publis/BAILLY_MenuOptimizer.pdf">Bailly - MenuOptimizer: Interactive Optimization of Menu Systems (UIST 2013)</a></td></tr>
<tr><td>5</td> <td>19.03.2019</td> <td><h5>Christoph Gebhardt</h5> <td><h5>R.B.</h5> <td><h5>J.L.</h5> <td><h5>P.B.</h5> <td><h5>A.B.</h5> <td><a href="https://ait.ethz.ch/projects/2018/adam/downloads/park2018chi.pdf">Park - AdaM: Adapting Multi-User Interfaces for Collaborative Environments in Real-Time (CHI 2018)</a></td></tr>
<tr><td>6</td> <td>26.03.2019</td> <td><h5>Anna Feit</h5> <td><h5>Y.B.</h5> <td><h5>A.B.</h5> <td><h5>R.B.</h5> <td><h5>P.B.</h5> <td><a href="http://users.comnet.aalto.fi/oulasvir/pubs/May-AI-CHI2019.pdf">Koch - May AI? Design Ideation with Cooperative Contextual Bandits (CHI 2019)</a></td></tr>
<tr><td>7</td> <td>02.04.2019</td> <td><h5>David Lindlbauer</h5> <td><h5>P.B.</h5> <td><h5>R.B.</h5> <td><h5>Y.B.</h5> <td><h5>J.L.</h5> <td><a href="https://www.microsoft.com/en-us/research/wp-content/uploads/2016/02/layoutopt_short_optimized.pdf">Gal - FLARE: Fast Layout for Augmented Reality Application (ISMAR 2014)</a></td></tr>
<tr><td>8</td> <td>09.04.2019</td> <td><h5>tbd.</h5> <td><h5></h5> <td><h5></h5> <td><h5></h5> <td><h5></h5> <td>tbd.</td></tr>
<tr><td>9</td> <td>16.04.2019</td> <td><h5>Velko Vechev</h5> <td><h5>J.L.</h5> <td><h5>Y.B.</h5> <td><h5>A.B.</h5> <td><h5>R.B.</h5> <td><a href="https://dl.acm.org/citation.cfm?id=3126594.3126662">Habib - DreamSketch: Early Stage 3D Design Explorations with Sketching and Generative Design (UIST 2017)</a></td></tr>
<tr><td>10</td> <td>23.04.2019</td> <td colspan="5"><h5>--- Easter week: no Seminar ---</h5></td> <td></td></tr>
<tr><td>11</td> <td>30.04.2019</td> <td><h5>David Lindlbauer</h5> <td><h5></h5> <td><h5></h5> <td><h5></h5> <td><h5></h5> <td><a href="https://www.microsoft.com/en-us/research/publication/foveated-3d-graphics/">Guenter - Foveated 3D Graphics (ACM SIGGRAPH Asia 2012)</a></td></tr>
<tr><td>12</td> <td>07.05.2019</td> <td><h5>--</h5> <td><h5></h5> <td><h5></h5> <td><h5></h5> <td><h5></h5> <td></td></tr>
<tr><td>13</td> <td>14.05.2019</td> <td><h5>Thomas Langerack</h5> <td><h5></h5> <td><h5></h5> <td><h5></h5> <td><h5></h5> <td><a href="http://www.oxfordscholarship.com/view/10.1093/oso/9780198799603.001.0001/oso-9780198799603-chapter-11">Howes - Interaction as an Emergent Property of a Partially Observable Markov Decision Process (Computational Interaction, 2018)</a></td></tr>
<tr><td>14</td> <td>21.05.2019</td> <td><h5>--</h5> <td><h5></h5> <td><h5></h5> <td><h5></h5> <td><h5></h5> <td></td></tr>
<tr><td>15</td> <td>28.05.2019</td> <td><h5>Anna Feit</h5> <td><h5></h5> <td><h5></h5> <td><h5></h5> <td><h5></h5> <td><a href="https://userinterfaces.aalto.fi/ability-based-optimization/">Sarcar - Ability-Based Optimization of Touchscreen Interactions (IEEE Pervasive Computing, 2018)</a></td></tr>
</tbody></table> -->