From Visual Perception to Deep Empathy: An Automated Assessment Framework for House-Tree-Person Drawings Using Multimodal LLMs and Multi-Agent Collaboration
Published: Dec 23, 2025 09:26 • 1 min read • ArXiv
Analysis
This article describes a research paper exploring the use of Large Language Models (LLMs) and multi-agent systems to automatically assess House-Tree-Person (HTP) drawings. The focus is on moving beyond simple visual perception toward inferring deeper psychological states, with empathy as a central target. The use of multimodal LLMs suggests that visual and textual information are integrated for a more comprehensive analysis, and the multi-agent collaboration likely assigns specialized AI agents to different facets of the drawing assessment. The source, ArXiv, indicates this is a pre-print and has not yet been peer-reviewed.
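As a rough illustration of what such a multi-agent division of labor could look like, the sketch below models each specialist as a function over features that a visual-perception stage might extract, with a coordinator merging their findings. All agent names, feature keys, and heuristics here are hypothetical; the paper's actual agents, prompts, and interpretation rules are not specified in this summary.

```python
# Hypothetical sketch of a multi-agent HTP assessment pipeline.
# Agents, feature keys, and heuristics are illustrative only,
# not taken from the paper.

def house_agent(features: dict) -> dict:
    """Interpret house-related cues (e.g., a missing door)."""
    return {"security": "low" if features.get("door") == "missing" else "typical"}

def tree_agent(features: dict) -> dict:
    """Interpret tree-related cues (e.g., bare branches)."""
    return {"growth": "constrained" if features.get("branches") == "bare" else "typical"}

def person_agent(features: dict) -> dict:
    """Interpret person-related cues (e.g., facial detail)."""
    return {"affect": "withdrawn" if features.get("face") == "blank" else "typical"}

def coordinator(features: dict) -> dict:
    """Merge each specialist agent's findings into one assessment."""
    report = {}
    for agent in (house_agent, tree_agent, person_agent):
        report.update(agent(features))
    return report

# Example: features a visual-perception stage might have extracted.
drawing = {"door": "missing", "branches": "bare", "face": "detailed"}
assessment = coordinator(drawing)
```

In a real system, each agent would presumably be a multimodal LLM prompted for one aspect of the drawing rather than a hand-written rule, but the coordinator pattern of merging specialist outputs is the same.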
Key Takeaways
- The research explores automated psychological assessment using AI.
- It utilizes multimodal LLMs for a more comprehensive analysis.
- Multi-agent collaboration is employed for drawing evaluation.
- The work is currently a pre-print.
Reference
“The article focuses on automated assessment of House-Tree-Person drawings using multimodal LLMs and multi-agent collaboration.”