Real-world size is automatically encoded in preschoolers’ object representations
When adults see a picture of an object, they automatically process how big the object typically is in the real world (Konkle & Oliva, 2012a). How much life experience is needed for this automatic size processing to emerge? Here, we ask whether preschoolers show this same signature of automatic size processing. We showed 3- and 4-year-olds displays with two pictures of objects and asked them to touch the picture that was smaller on the screen. Critically, the relative visual sizes of the objects could either be congruent with their relative real-world sizes (e.g., a small picture of a shoe next to a big picture of a car) or incongruent with their relative real-world sizes (e.g., a big picture of a shoe next to a small picture of a car). Across two experiments, we found that preschoolers were worse at making visual size judgments on incongruent trials, suggesting that real-world size was automatically activated and interfered with their performance. In a third experiment, we found that 4-year-olds and adults showed similar item-pair effects (i.e., both groups showed larger Size-Stroop effects for the same pairs of items, relative to other pairs). Furthermore, the magnitude of the item-pair Stroop effects in 4-year-olds did not depend on whether they could recognize the pictured objects, suggesting that the perceptual features of these objects were sufficient to trigger the processing of real-world size information. These results indicate that, by 3–4 years of age, children automatically extract real-world size information from depicted objects.