Information
- Publication Type: Journal Paper with Conference Talk
- Workgroup(s)/Project(s):
- Date: August 2018
- Journal: ACM Transactions on Graphics (SIGGRAPH 2018)
- Volume: 37
- Open Access: yes
- Number: 4
- Location: Vancouver, Canada
- Lecturer: Károly Zsolnai-Fehér
- ISSN: 0730-0301
- Event: SIGGRAPH 2018
- DOI: 10.1145/3197517.3201307
- Conference date: 6 August 2018 – 12 August 2018
- Pages: 76:1 – 76:14
- Keywords: gaussian material synthesis, neural rendering
Abstract
We present a learning-based system for rapid mass-scale material synthesis that is useful for novice and expert users alike. The user preferences are learned via Gaussian Process Regression and can be easily sampled for new recommendations. Typically, each recommendation takes 40-60 seconds to render with global illumination, which makes this process impracticable for real-world workflows. Our neural network eliminates this bottleneck by providing high-quality image predictions in real time, after which it is possible to pick the desired materials from a gallery and assign them to a scene in an intuitive manner. Workflow timings against Disney’s “principled” shader reveal that our system scales well with the number of sought materials, thus empowering even novice users to generate hundreds of high-quality material models without any expertise in material modeling. Similarly, expert users experience a significant decrease in the total modeling time when populating a scene with materials. Furthermore, our proposed solution also offers controllable recommendations and a novel latent space variant generation step to enable the real-time fine-tuning of materials without requiring any domain expertise.
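The preference-learning step described in the abstract can be illustrated with a minimal sketch: a Gaussian Process posterior mean is fitted on a handful of user-rated material parameter vectors, then a large pool of candidate materials is ranked by predicted score. This is a toy NumPy illustration under assumed conventions, not the paper's actual implementation — the parameter dimensionality, kernel length scale, and random stand-in scores are all hypothetical.

```python
import numpy as np

def rbf(A, B, length_scale=0.5):
    # Squared-exponential kernel between the row vectors of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / length_scale**2)

rng = np.random.default_rng(0)
n_rated, n_params = 30, 8                      # hypothetical: 8 shader parameters
X = rng.uniform(size=(n_rated, n_params))      # rated material parameter vectors
y = rng.uniform(0, 10, size=n_rated)           # stand-in user preference scores

# Fit the GP: solve (K + jitter I) alpha = y - mean(y).
K = rbf(X, X) + 1e-6 * np.eye(n_rated)
alpha = np.linalg.solve(K, y - y.mean())

# Sample many candidate materials and rank them by the GP posterior mean.
candidates = rng.uniform(size=(1000, n_params))
predicted = rbf(candidates, X) @ alpha + y.mean()
top5 = candidates[np.argsort(predicted)[::-1][:5]]
print(top5.shape)  # → (5, 8): five recommended parameter vectors
```

In the system described above, such recommendations would then be rendered for the gallery by the neural network rather than by a slow global-illumination renderer.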
BibTeX
@article{zsolnai-2018-gms,
  title    = "Gaussian Material Synthesis",
  author   = "Karoly Zsolnai-Feh\'{e}r and Peter Wonka and Michael Wimmer",
  year     = "2018",
  abstract = "We present a learning-based system for rapid mass-scale material synthesis that is useful for novice and expert users alike. The user preferences are learned via Gaussian Process Regression and can be easily sampled for new recommendations. Typically, each recommendation takes 40-60 seconds to render with global illumination, which makes this process impracticable for real-world workflows. Our neural network eliminates this bottleneck by providing high-quality image predictions in real time, after which it is possible to pick the desired materials from a gallery and assign them to a scene in an intuitive manner. Workflow timings against Disney's \"principled\" shader reveal that our system scales well with the number of sought materials, thus empowering even novice users to generate hundreds of high-quality material models without any expertise in material modeling. Similarly, expert users experience a significant decrease in the total modeling time when populating a scene with materials. Furthermore, our proposed solution also offers controllable recommendations and a novel latent space variant generation step to enable the real-time fine-tuning of materials without requiring any domain expertise.",
  month    = aug,
  journal  = "ACM Transactions on Graphics (SIGGRAPH 2018)",
  volume   = "37",
  number   = "4",
  issn     = "0730-0301",
  doi      = "10.1145/3197517.3201307",
  pages    = "76:1--76:14",
  keywords = "gaussian material synthesis, neural rendering",
  URL      = "https://www.cg.tuwien.ac.at/research/publications/2018/zsolnai-2018-gms/",
}