VERT Test
Download the current version of the VERT kit
VERT is continuously updated and versioned; the latest version is always available for download on this page, clearly marked with its version number. This analog, user-friendly kit is designed to integrate seamlessly into design workflows and to be accessible to a broad audience without requiring prior training.

Observation: A 'gap' in visual design testing
In the related literature review, I describe a significant lack of effective testing methods tailored specifically to visual design, in both educational and professional settings. Students often struggle to obtain actionable feedback on their visual projects, while practitioners in the design industry face similar challenges when trying to validate creative decisions. Existing frameworks for testing visuals, such as "think-aloud" tests, often produce oversimplified feedback like "I like it" or "I don’t like it." This lack of nuance limits opportunities for deeper analysis and improvement. Is the reason that test participants feel insecure, shy, or inadequate when it comes to verbalizing their experience? It is certainly difficult for test participants to articulate their responses to visual elements, which further compounds the issue.
I have observed that this absence of robust methodologies often leads to stagnation in design workflows, a dead end in the design process. The need for a tool that facilitates structured and meaningful feedback on visual designs became apparent, driving the development of the VERT test.
From BERT to VERT: A conceptual evolution
The Visual, Emotional, Rational Test (VERT) builds upon the principles of the BERT test, a heuristic tool for understanding user emotions in UX design. However, my research revealed limited academic resources validating or detailing BERT’s origins, with most information stemming from practical sources such as blog-style introductions on UX for the Masses by Neil Turner (A), UX Design by Sherwin Pollack (B), and Clearleft by Harry Brignull (C). While these resources provide valuable insights into BERT’s application, they lack a rigorous theoretical foundation.
Inspired by George Kelly’s Repertory Grid Technique (1955) (1) and J.A. Russell’s Circumplex Model of Affect (2), I developed the VERT kit. Kelly’s framework, which captures subjective evaluations using bipolar scales, aligns closely with BERT’s approach to measuring user sentiment. Similarly, Russell’s circumplex model maps emotional states across dimensions of arousal and valence, providing a nuanced way to analyze affective responses. These theoretical underpinnings informed the structure and metrics of VERT, enabling it to evaluate both emotional and rational dimensions of visual design.
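To make the link between these theories and VERT's metrics concrete, the sketch below shows one possible way bipolar ratings could be summarized as a point in Russell's valence–arousal plane. The adjective pairs, their dimension tags, and the -3 to +3 scale are illustrative assumptions, not the actual VERT axes.

    # Illustrative sketch (Python): summarizing bipolar ratings as a point in the
    # valence–arousal plane of Russell's circumplex model. The adjective pairs,
    # their dimension tags, and the -3..+3 scale are assumptions for illustration.

    AXES = {
        ("unpleasant", "pleasant"): "valence",
        ("boring", "exciting"): "arousal",
        ("tense", "relaxed"): "valence",
        ("sleepy", "energetic"): "arousal",
    }

    def circumplex_point(ratings):
        """Average the ratings of each dimension into a (valence, arousal) point."""
        sums = {"valence": [], "arousal": []}
        for pair, score in ratings.items():
            sums[AXES[pair]].append(score)
        return (
            sum(sums["valence"]) / len(sums["valence"]),
            sum(sums["arousal"]) / len(sums["arousal"]),
        )

    # One participant's impulsive reaction to a visual stimulus, on a -3..+3 scale.
    print(circumplex_point({
        ("unpleasant", "pleasant"): 2,
        ("boring", "exciting"): -1,
        ("tense", "relaxed"): 3,
        ("sleepy", "energetic"): 0,
    }))  # -> (2.5, -0.5): pleasant but low-arousal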
VERT is a structured test designed to assess visual designs by integrating visual, emotional, and rational evaluation metrics. It is built to 'measure' and gather data from users' immediate, impulsive reactions to visual stimuli, whether an interface, a layout, a color palette, or other visual elements, and it provides a systematic approach to understanding how design elements resonate with users. By combining scientific rigor with practical usability, VERT serves as a versatile tool for students, educators, and professional designers alike.
By focusing on these instinctual responses, VERT provides a framework for understanding how design choices resonate on both an emotional and rational level. The kit employs a series of bipolar scales and open-ended prompts, encouraging participants to articulate their perceptions and emotional reactions. This process not only captures the initial impact of the visual design but also enables designers to identify areas where the product aligns or diverges from intended goals.
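As an illustration of how such a session could be recorded, the following sketch captures one participant's bipolar ratings and open-ended comments in a simple data structure. The axis labels, the -3 to +3 scale, and the field names are hypothetical, chosen only to show the kind of data a VERT session produces.

    # Minimal sketch (Python) of recording a single VERT-style response.
    # Axis labels, the -3..+3 scale, and field names are hypothetical.
    from dataclasses import dataclass, field

    @dataclass
    class BipolarRating:
        negative_pole: str   # e.g. "cold"
        positive_pole: str   # e.g. "warm"
        score: int           # assumed scale: -3 (negative pole) .. +3 (positive pole)

    @dataclass
    class VertResponse:
        participant: str
        stimulus: str
        ratings: list = field(default_factory=list)
        open_comments: list = field(default_factory=list)  # answers to open prompts

    # Example: an impulsive reaction to a colour palette.
    response = VertResponse(
        participant="P01",
        stimulus="colour palette A",
        ratings=[
            BipolarRating("cold", "warm", +2),
            BipolarRating("chaotic", "calm", -1),
            BipolarRating("dated", "modern", +3),
        ],
        open_comments=["Reminds me of a summer evening."],
    )

    print({f"{r.negative_pole}/{r.positive_pole}": r.score for r in response.ratings})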
The structured yet flexible nature of VERT allows it to be applied across a range of design contexts, from evaluating branding elements to fine-tuning user interfaces. By anchoring the evaluation in both theory and practical application, VERT aims to provide actionable insights that support iterative design processes, making it a valuable tool for designers seeking to refine their visual products.
Scientific methods in developing VERT kit
The development of the VERT Kit was rooted in three primary scientific methods, all aligned with an explorative design approach.
First, a pilot study was conducted with two design students who tested an initial version of the kit in a project on visual identity. The design process leading up to this pilot study was grounded in my literature review and in insights derived from The Circle Model, and I initially dedicated time to further research and interviews. On that basis I developed a prototype of the VERT Kit, ensuring that it had sufficient value and practical application to be tested in depth. Pilot studies, as emphasized by Van Teijlingen and Hundley (3), are critical for identifying potential challenges, refining research tools, and enhancing feasibility. I therefore facilitated this pilot study as a qualitative test session before developing VERT further. This approach provided direct input from relevant stakeholders and allowed me to witness how the tool was applied in practice. While the design showed potential, there were numerous areas for improvement and critical feedback points. One of the key objectives of the pilot study was to gather early-stage feedback that could shape and refine the product. Observations and feedback from this study informed initial adjustments to the kit, particularly in terms of usability and clarity, ensuring that it was better aligned with the needs of its users. Even though I received a lot of critical feedback, I also immediately observed the potential and the core value of VERT.
Second, a session utilizing cultural probes offered valuable qualitative insights. Further developed versions of VERT were distributed to 18 groups of design students, who were asked to use the tools freely and document their usage. Each group received a short manual, the test pages, and a set of markers. As Gaver, Dunne, and Pacenti (4) outline, cultural probes facilitate open-ended exploration and user-driven feedback, making them ideal for capturing diverse perspectives on tool usability. This approach revealed key areas for improvement, such as enhancing the kit's flexibility, neutrality, and graphical design, while also generating critical oral and written feedback. I received the completed VERT kits back from the students and could thereby analyze how VERT had been used. I also facilitated a feedback session with all the students and collected data in a survey.
Third, a structured comparative testing session was conducted with two classes of design students, offering an opportunity to directly evaluate the effectiveness of the VERT Kit against traditional testing methods. The students were tasked with testing identical visual products, including a color scheme and website designs, first with a conventional "think-aloud" method and then with the VERT Kit. The think-aloud protocol, as described by Ericsson and Simon (5), is widely used to capture participants' verbalized thought processes during a task. However, this project's thesis and expectation was that its application in visual testing often results in overly simplistic feedback, such as "I like it" or "I don’t like it," without delving into the underlying rationale or emotional responses to design elements. In contrast, the VERT Kit enabled more detailed, structured feedback, promoting richer conversations between designers and test participants. By incorporating bipolar axes and targeted prompts, the kit guided participants to evaluate visual elements systematically. This approach helped designers capture specific insights about emotional resonance, rational evaluation, and the effectiveness of their visual choices. It also helped the test participants develop a rapid, impulsive language for the visual stimuli, which streamlined the dialogue and enhanced the quality of feedback.
Critiques of VERT during these sessions also offered valuable insights for its refinement. For instance, some students expressed concerns that the structured axes in the kit limited test participants' ability to articulate their full emotional and linguistic responses. What if a relevant emotion was not included among the axes? Did the axes not guide the test participant in a certain direction? To address these concerns, guidelines on how to facilitate the test were added to the kit's manual, allowing space for open-ended feedback where test participants could freely express their thoughts and feelings without constraint. I also added three empty axes for test participants to fill out with their own adjectives. Additionally, the feedback highlighted the importance of ensuring the objectivity of test sessions, avoiding any influence from designers that could bias participants' responses.
The comparative testing session also included a deliberate design to demonstrate the practical benefits of the VERT Kit. Participants first tested the visual products using the think-aloud method, followed by a second round using the VERT Kit. This sequencing allowed students to directly compare the outcomes and reflect on how the VERT Kit either enhanced or hindered their testing process. Many students reported that the structured nature of the VERT Kit made the testing process easier and more efficient, enabling them to gather richer, more actionable data. Furthermore, designers noted that the tool facilitated more meaningful discussions with test participants, fostering a collaborative exploration of the visual design.
However, the session also revealed areas for improvement. For example, some students observed that the kit's structured approach could inadvertently restrict the spontaneity of participants' responses. These critiques informed subsequent iterations of the VERT Kit, including the addition of more flexible components and a recommendation to incorporate unstructured feedback sessions alongside the formal evaluation framework. These adjustments were instrumental in creating a tool that balances structure with openness, ensuring nuanced and objective test results while maintaining the kit’s utility for systematic analysis.
Through these sessions, the iterative and exploratory nature of the VERT Kit’s development became evident, reflecting the principles of explorative design as described by Sørensen, Mattsson, and Sundbo (6): cycles of design, observation, reflection, and refinement. By integrating structured comparative testing with participant feedback, the VERT Kit evolved into a versatile and robust tool for visual design evaluation, addressing both the need for detailed analysis and the flexibility required for nuanced feedback. This process parallels the IDEO Design Thinking framework (D), which emphasizes user-centric, iterative innovation to solve complex problems. By continually iterating on the kit, the research integrated empirical insights, interdisciplinary methods, and practical adjustments.
Digitalization
Although VERT is currently a manual, analog tool, its potential for digital transformation is significant. A digital version could incorporate advanced analytics, real-time feedback, and dynamic interfaces to enhance its application. However, in its current form, VERT remains an essential foundational tool for bridging the gap in visual design testing, empowering designers to make informed and impactful creative decisions.
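As a rough indication of what such a digital version could add, the sketch below aggregates several participants' scores per bipolar axis into a mean and spread, so that converging and diverging reactions become visible at a glance. The axis names, the -3 to +3 scale, and the sample scores are invented for illustration and are not part of the current analog kit.

    # Rough sketch (Python) of simple analytics a digital VERT could offer.
    # Axis names, the -3..+3 scale, and the scores below are invented examples.
    from statistics import mean, pstdev

    # Scores per bipolar axis, collected from four participants for one visual product.
    sessions = {
        "cold/warm":    [2, 3, 1, 2],
        "chaotic/calm": [-1, 2, -2, 0],
        "dated/modern": [3, 3, 2, 3],
    }

    for axis, scores in sessions.items():
        print(f"{axis:15s} mean={mean(scores):+.2f}  spread={pstdev(scores):.2f}")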
Aligning VERT with broader themes
VERT was developed with color and color palettes as the primary test objects, though it can be used for testing most visual products: compositions, layouts, printed matter, websites, typography, interfaces, and visual identities. The development directly supports the themes discussed in the literature review, particularly the gaps in communication and articulation highlighted in the sections on Color & Language and Tacit Knowledge.
Sources
Online Articles and Resources
A) Neil Turner, UX for the Masses. Last accessed: 10.12.2024.
B) Sherwin Pollack, UX Design. Last accessed: 10.12.2024.
C) Harry Brignull, Clearleft. Last accessed: 10.12.2024.
D) IDEO. Design Thinking. IDEO Design Kit: https://www.designkit.org/.
Scholarly and Literary References
1) Kelly, G. (1955). The Psychology of Personal Constructs. Norton.
2) Russell, J. A. (1980). A Circumplex Model of Affect. American Psychological Association.
3) Van Teijlingen, E. R., & Hundley, V. (2001). The Importance of Pilot Studies. Social Research Update, University of Surrey.
4) Gaver, W., Dunne, T., & Pacenti, E. (1999). Cultural Probes. Interactions, 6(1), 21–29.
5) Ericsson, K. A., & Simon, H. A. (1993). Protocol Analysis: Verbal Reports as Data. MIT Press.
6) Sørensen, F., Mattsson, J., & Sundbo, J. (2009). ICE and the Experiment Method. ICE-Project Working Paper, 2009:01, Roskilde University.