An Unsupervised Deep Learning Model to Discover Visual Similarity Between Sketches for Visual Analogy Support
Abstract Visual analogy has been recognized as an important cognitive process in engineering design. Human free-hand sketches provide a useful data source for facilitating visual analogy. Although there has been research on the roles of sketching and the impact of visual analogy in design, little work has been done to develop computational tools and methods that support visual analogy from sketches. In this paper, we propose a computational method to discover visual similarity between sketches, motivated by the following practical application: given a sketch that reflects a designer’s rough idea, our goal is to identify sketches of similar shape that can stimulate the designer to make more and better visual analogies. The first challenge is to discover similar shape features embedded in sketches from various categories. To address this challenge, we propose a deep clustering model that learns a latent space revealing underlying shape features across multiple sketch categories while simultaneously clustering the sketches. We carry out an extensive evaluation of the clustering performance of the proposed method under different configurations. The results show that the proposed method can discover sketches with similar appearance, provides useful explanations of the visual relationships between different sketch categories, and has the potential to generate visual stimuli that enhance designers’ visual imagery.
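To make the abstract's key idea concrete, the sketch below shows one common way a deep clustering model can learn a latent space and cluster assignments jointly, in the style of Deep Embedded Clustering (DEC; Xie et al., 2016). It is a minimal PyTorch illustration, not the paper's actual architecture: the flattened 28x28 sketch rasters, the MLP autoencoder sizes, the cluster count K=10, and the loss weighting are all illustrative assumptions.

```python
# Minimal DEC-style deep clustering sketch (PyTorch).
# Assumptions (not from the paper): flattened 28x28 sketch rasters,
# a simple MLP autoencoder, K=10 clusters, and a 0.1 loss weight.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DeepClusterer(nn.Module):
    def __init__(self, input_dim=28 * 28, latent_dim=32, n_clusters=10):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(input_dim, 256), nn.ReLU(),
            nn.Linear(256, latent_dim),
        )
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 256), nn.ReLU(),
            nn.Linear(256, input_dim),
        )
        # Learnable cluster centroids in the latent space.
        self.centroids = nn.Parameter(torch.randn(n_clusters, latent_dim))

    def forward(self, x):
        z = self.encoder(x)
        x_hat = self.decoder(z)
        # Soft assignment q_ij: Student's t kernel between z_i and centroid mu_j.
        dist_sq = torch.cdist(z, self.centroids).pow(2)
        q = (1.0 + dist_sq).reciprocal()
        q = q / q.sum(dim=1, keepdim=True)
        return z, x_hat, q

def target_distribution(q):
    # Sharpened target p used in the KL clustering loss (as in DEC).
    weight = q.pow(2) / q.sum(dim=0)
    return weight / weight.sum(dim=1, keepdim=True)

# One joint training step: the reconstruction loss keeps the latent space
# informative, while KL(p || q) pulls points toward confident clusters.
model = DeepClusterer()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.rand(64, 28 * 28)          # a batch of flattened sketch rasters
opt.zero_grad()
z, x_hat, q = model(x)
p = target_distribution(q).detach()
loss = F.mse_loss(x_hat, x) + 0.1 * F.kl_div(q.log(), p, reduction="batchmean")
loss.backward()
opt.step()
```

In DEC-style training the autoencoder is usually pretrained on reconstruction alone and the centroids are initialized by k-means on the latent codes before the joint phase; the model evaluated in this paper may differ in these details.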