Sketch-to-Layout: Sketch-Guided Multimodal Layout Generation

31 October 2025
Riccardo Brioschi, Aleksandr Alekseev, Emanuele Nevali, Berkay Döner, Omar El Malki, B. Mitrevski, Leandro Kieliger, Mark Collier, Andrii Maksai, Jesse Berent, C. Musat, Efi Kokiopoulou
arXiv (abs) · PDF · HTML · GitHub
Main: 6 pages · Bibliography: 3 pages · Appendix: 19 pages · 19 figures · 6 tables
Abstract

Graphic layout generation is a growing research area focused on generating aesthetically pleasing layouts, ranging from poster designs to documents. While recent research has explored ways to incorporate user constraints to guide layout generation, these constraints often require complex specifications, which reduces usability. We introduce an approach that exploits user-provided sketches as intuitive constraints, and we empirically demonstrate the effectiveness of this new guidance method, establishing the currently under-explored sketch-to-layout problem as a promising research direction. To tackle the sketch-to-layout problem, we propose a multimodal transformer-based solution that takes the sketch and the content assets as inputs and produces high-quality layouts. Since collecting sketch training data from human annotators is very costly, we introduce a novel and efficient method to synthetically generate training sketches at scale. We train and evaluate our model on three publicly available datasets: PubLayNet, DocLayNet and SlidesVQA, demonstrating that it outperforms state-of-the-art constraint-based methods while offering a more intuitive design experience. To facilitate future sketch-to-layout research, we release O(200k) synthetically generated sketches for the public datasets above. The datasets are available at this https URL.
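
The abstract mentions synthetically generating training sketches at scale from existing layout annotations. As a rough illustration only, not the paper's actual pipeline, the following minimal Python sketch (assuming Pillow is installed; the Gaussian-jitter stroke model and all parameters are illustrative assumptions) renders hand-drawn-looking rectangles from ground-truth layout bounding boxes:

    # Hypothetical illustration of deriving a synthetic "user sketch" from
    # ground-truth layout boxes. The jitter model below is an assumption,
    # not the method described in the paper.
    import random
    from PIL import Image, ImageDraw

    def jitter(p, sigma=3.0):
        """Perturb a point to mimic an unsteady hand."""
        return (p[0] + random.gauss(0, sigma), p[1] + random.gauss(0, sigma))

    def draw_rough_box(draw, box, sigma=3.0, width=2):
        """Draw a rectangle as four slightly wobbly strokes."""
        x0, y0, x1, y1 = box
        corners = [(x0, y0), (x1, y0), (x1, y1), (x0, y1)]
        for a, b in zip(corners, corners[1:] + corners[:1]):
            draw.line([jitter(a, sigma), jitter(b, sigma)], fill=0, width=width)

    def layout_to_sketch(boxes, size=(512, 512)):
        """Render a synthetic sketch image from layout bounding boxes."""
        img = Image.new("L", size, 255)  # white grayscale canvas
        draw = ImageDraw.Draw(img)
        for box in boxes:
            draw_rough_box(draw, box)
        return img

    # Example: a simple title-plus-two-column document layout.
    layout = [(40, 40, 470, 90),     # title block
              (40, 110, 250, 470),   # left column
              (260, 110, 470, 470)]  # right column
    layout_to_sketch(layout).save("synthetic_sketch.png")

The paper's actual generation method and stroke model may differ; this only conveys the idea of producing sketch supervision cheaply from layout ground truth instead of collecting it from human annotators.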
