The primary objective of this research is to realize an adaptable online architectural virtual reality (VR) model whose color attributes change dynamically according to the identified emotional state of the user. We believe that the current approach to developing electronically based design environments is fundamentally deficient in its support for multi-person, multi-modal design interactions. This paper outlines new facilities within ubiquitous media spaces that support embodied interaction between human emotion and computation. It also addresses how to capture a specific user emotion through the web and use it to modify an architectural VR model, chiefly through color adaptation. The adaptation process consists of three phases: 1) identification of the user's emotional state as projected onto selected paintings; 2) translation of the extracted emotional keywords into a pertinent set of colors; and 3) automated color adaptation of the given VR model. We introduce a method that uses well-known paintings and their variations to derive the online viewer's emotional state, which is then used to find a new color-coordination scheme reflecting the identified emotion. This color-harmony scheme provides useful information for dynamic color adaptation of the objects embedded in the given VR model. The outcome of this study could enable an interactive and dynamic architectural VR model supporting emotion-responsive interior-design simulations, or the realization of an architectural environment whose interior colors change according to the captured mood of the occupant.
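The three-phase process described above can be sketched in code. The sketch below is purely illustrative: the emotion keywords, the keyword-to-color mapping, and all function and object names (`EMOTION_COLORS`, `translate_emotion`, `adapt_model_colors`) are hypothetical assumptions, not the implementation reported in this paper.

```python
# Hypothetical sketch of the three-phase adaptation pipeline.
# Phase 1 (identifying the viewer's emotion from selected paintings) is
# assumed to have already produced a list of emotion keywords.

# Phase 2: translate extracted emotional keywords into a set of colors.
# This keyword-to-color table is a toy example, not the paper's scheme.
EMOTION_COLORS = {
    "calm":   ["#a8c8e0", "#d0e4f0", "#f0f4f8"],  # cool, desaturated blues
    "joyful": ["#f5c542", "#f58442", "#fff2cc"],  # warm yellows and oranges
    "somber": ["#4a4a5a", "#6b6b7a", "#9a9aa8"],  # muted grays
}

def translate_emotion(keywords):
    """Return the color set for the first recognized emotion keyword."""
    for kw in keywords:
        if kw in EMOTION_COLORS:
            return EMOTION_COLORS[kw]
    return EMOTION_COLORS["calm"]  # neutral fallback when nothing matches

# Phase 3: apply the color set to the objects embedded in a (much
# simplified) VR model, here just a list of named surfaces.
def adapt_model_colors(model, palette):
    """Assign palette colors to model objects round-robin."""
    return {name: palette[i % len(palette)] for i, name in enumerate(model)}

if __name__ == "__main__":
    keywords = ["somber", "quiet"]            # output of phase 1 (assumed)
    model = ["wall_north", "wall_south", "ceiling", "floor"]
    palette = translate_emotion(keywords)     # phase 2
    recolored = adapt_model_colors(model, palette)  # phase 3
    print(recolored["wall_north"])            # → #4a4a5a
```

In a real system the round-robin assignment would be replaced by the color-harmony scheme the paper derives, and the recolored values would be pushed to the online VR model's material attributes.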