Continuity Editing for 3D Animation

Authors: Quentin Galvane, Rémi Ronfard, Christophe Lino, Marc Christie

AAAI 2015

Reproducibility Variable | Result | LLM Response
Research Type | Experimental | 'We also show that our method can generate movies with different editing rhythms and validate the results through a user study.' and '(iii) a validation of our model through a user evaluation comparing the original edit of an existing movie with our optimal edit and with degraded approaches.'
Researcher Affiliation | Academia | Quentin Galvane, INRIA, Univ. Grenoble Alpes & LJK; Rémi Ronfard, INRIA, Univ. Grenoble Alpes & LJK; Christophe Lino, INRIA, Univ. Grenoble Alpes & LJK; Marc Christie, IRISA, University of Rennes I
Pseudocode | No | The paper describes algorithms and mathematical models but does not include structured pseudocode or explicitly labeled algorithm blocks.
Open Source Code | No | The paper states: 'For evaluation purposes, we are making our experimental data (including rushes and their annotations) and our experimental results publicly available' (https://team.inria.fr/imagine/continuity-editing/). This refers to the data and results, not explicitly to the source code for the methodology.
Open Datasets | Yes | 'For evaluation purposes, we are making our experimental data (including rushes and their annotations) and our experimental results publicly available' (https://team.inria.fr/imagine/continuity-editing/)
Dataset Splits | No | The paper uses a single recreated scene for validation through a user study, but does not describe explicit training, validation, and test dataset splits in the conventional machine-learning sense.
Hardware Specification | No | The paper does not provide specific hardware details (e.g., GPU/CPU models, memory amounts) used for running the experiments.
Software Dependencies | No | The paper mentions general tools and concepts such as '3D animation packages' or the 'graphics pipeline' but does not specify any software dependencies with version numbers (e.g., libraries, frameworks).
Experiment Setup | Yes | 'Rather than making automatic decisions, our system is designed to let the user/director choose the average shot length (ASL) which dictates the rhythm of editing, and hence the editing style. To enforce those values, we compute a cost measuring, for each shot s_j of duration d_j, the deviation of its duration from the log-normal distribution: C_R(d_j) = (log d_j − µ)² / (2σ²) + log d_j.' and 'To improve our system efficiency, we further restrain the search in our algorithm to a constant horizon H (longest allowable shot duration) of 30 seconds.' and 'Twenty-five cameras were manually placed for the whole duration of the sequence (sixteen of them closely approximating the actual cameras from the original movie, and nine providing alternative angles).' and 'We prepared 5 stimuli (a fully edited version of 80 seconds per method). Participants were asked to rank the global film-making quality on a discrete scale ranging from 0 (very bad) to 10 (very good).'
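
To make the quoted rhythm term concrete, here is a minimal Python sketch of the log-normal shot-duration cost C_R(d_j) = (log d_j − µ)² / (2σ²) + log d_j. The way µ is tied to the director-chosen ASL, the value of σ, and the sample durations are assumptions made for illustration, not values taken from the paper.

```python
import math

def rhythm_cost(d_j, mu, sigma):
    """Rhythm cost quoted in the paper:
    C_R(d_j) = (log d_j - mu)^2 / (2*sigma^2) + log d_j,
    i.e. the negative log-density of a log-normal shot-duration model."""
    return (math.log(d_j) - mu) ** 2 / (2.0 * sigma ** 2) + math.log(d_j)

# Assumed, illustrative parameters: the paper lets the director pick the
# average shot length (ASL); tying mu to log(ASL) and the value of sigma
# below are assumptions of this sketch, not values from the paper.
asl = 4.0            # director-chosen average shot length, in seconds
sigma = 0.9          # assumed spread of shot durations
mu = math.log(asl)   # assumed log-normal location parameter

# Evaluate the cost for a few candidate shot durations, up to the
# 30-second horizon H mentioned in the quoted experiment setup.
for d in (1.0, 4.0, 12.0, 30.0):
    print(f"shot of {d:4.1f}s -> rhythm cost {rhythm_cost(d, mu, sigma):.3f}")
```

Under these assumed parameters the cost is lowest for durations near the chosen ASL and grows for very short or very long shots, which is how the quoted setup lets the ASL dictate the editing rhythm.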