In face-to-face communication, the occasional need for intentional lies is something with which everyone can identify. For example, when we get mad, circumstances may force us to put on a big smile instead of expressing our anger; when we feel miserable, good manners may dictate that we greet others warmly. In short, to abide by social norms, we consciously lie. On the other hand, if we consider the signs that our bodies express as communication (body language), we can say that the body does not lie even while the mind does.
Considering this phenomenon, we propose a means of "touching the heart" in a somewhat Japanese way by measuring the heartbeat of the "honest" body and using other technologies to reveal a new code of non-verbal communication from a hidden dimension in society. We call this "techno-healing art."
Two computer-generated mermaids function as individual agents for the two viewers. Each mermaid agent moves in sync with the heart rate detected by an electrode attached to its viewer's collarbone. A synchronization interaction model, computed on a personal computer from the two viewers' heart rates, then lets the two mermaids express hidden non-verbal communication. A camera picks up the viewers' hand gestures, a personal computer analyzes the images, and the synchronization interaction model determines each mermaid's behavior: at a high degree of synchrony the agents mimic their viewers' hand gestures, while at a low degree of synchrony the agents swim away. If one mermaid agent touches the other, a pseudo-touch can be felt through the use of a vibration device. For background sound, the heart sounds of the viewers are picked up by an electronic stethoscope and processed for output on a personal computer.
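The mimic-or-flee behavior can be sketched as a simple per-frame update rule. The synchrony threshold and step size below are illustrative assumptions, not the installation's actual parameters.

```python
def agent_step(agent_pos, hand_pos, synchrony, threshold=0.5, step=0.1):
    """Move a mermaid agent toward the viewer's hand when synchrony is
    high (mimicry) and away from it when synchrony is low (fleeing).
    Positions are (x, y) pairs; synchrony is in [0, 1]."""
    dx = hand_pos[0] - agent_pos[0]
    dy = hand_pos[1] - agent_pos[1]
    direction = 1.0 if synchrony >= threshold else -1.0  # follow vs. flee
    return (agent_pos[0] + direction * step * dx,
            agent_pos[1] + direction * step * dy)
```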
Synchronization interaction model
The relax-strain value calculated from the heart rate and the interest value calculated from the variation of the heart rate are mapped onto the model. The synchronization interaction model reveals the communication codes in the hidden dimension that do not appear in our superficial communication.
When both people are in the domain where they are highly relaxed and highly interested, they are considered synchronized. An animation is generated in which, for example, their CG embodiments join hands in brotherhood or enjoy friendly actions.
When both people are in a situation where they are highly strained and less interested, unfriendly communication is generated. An animation is generated in which, for example, their CG embodiments quarrel with each other.
When both people are in the domain where they are highly relaxed and less interested, they are considered to be "going their own ways". An animation is generated in which, for example, their CG embodiments do not interfere with each other.
When both people are in a situation where they are highly strained and highly interested, they are assumed to feel stress and shyness. An animation is generated in which, for example, their CG embodiments behave shyly.
In this way, new codes of non-verbal communication that can't be seen in face-to-face communication are found through the CG of the embodiments.
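The four domains above amount to a quadrant lookup on the two mapped values. A minimal sketch, assuming both axes are normalized to [0, 1] with the boundary at 0.5 (the actual thresholds are not specified in the text):

```python
def interaction_mode(relax, interest):
    """Classify a pair's shared state on the synchronization interaction
    model.  `relax` runs from 0 (strained) to 1 (relaxed); `interest`
    from 0 (less interested) to 1 (interested)."""
    relaxed = relax >= 0.5
    interested = interest >= 0.5
    if relaxed and interested:
        return "synchronized"          # friendly actions, joining hands
    if not relaxed and not interested:
        return "unfriendly"            # embodiments quarrel
    if relaxed and not interested:
        return "going their own ways"  # embodiments do not interfere
    return "shy"                       # strained but interested
```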
Hand recognition using a range sensor
A person's hand is recognized using a range sensor. The range data of an image are generated from two images input from two cameras. The contrast of the hand image on the display represents its distance from the cameras: white means the distance is short, and black means it is long.
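The near-is-white rendering can be sketched as a linear mapping from range to an 8-bit gray level. The near and far bounds below are illustrative assumptions; the sensor's actual working range is not given.

```python
def range_to_gray(distance_m, near=0.3, far=2.0):
    """Map a distance in meters to a gray level: white (255) for near
    points, black (0) for far points, linear in between, clamped
    outside the assumed working range [near, far]."""
    t = (distance_m - near) / (far - near)
    t = min(max(t, 0.0), 1.0)
    return int(round(255 * (1.0 - t)))
```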
A person's heart rate is measured by placing the electrocardiograph's electrodes on the body. The heart rate is sent to a PC connected to the electrocardiograph via RS-232C and is mapped onto the synchronization interaction model.
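One way to derive the two model coordinates from a short window of heart-rate samples is sketched below: the heart-rate level drives the relax-strain axis, and its variation drives the interest axis, as the model section describes. The resting-rate reference and normalization constants are illustrative assumptions; the actual mapping used in the installation is not specified.

```python
def map_heart_rate(samples_bpm, resting_bpm=65.0):
    """Map a window of heart-rate samples (beats per minute) onto the
    synchronization interaction model.

    relax:    1.0 near the assumed resting rate, falling toward 0.0 as
              the rate climbs above it (strain).
    interest: grows with the variation of the heart rate in the window.
    Both axes are clamped to [0, 1]."""
    mean = sum(samples_bpm) / len(samples_bpm)
    var = sum((s - mean) ** 2 for s in samples_bpm) / len(samples_bpm)
    relax = 1.0 - (mean - resting_bpm) / 40.0  # ~40 bpm above rest = fully strained
    interest = (var ** 0.5) / 10.0             # ~10 bpm std-dev = fully interested
    clamp = lambda x: min(max(x, 0.0), 1.0)
    return clamp(relax), clamp(interest)
```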
Heart Sound Wave Analysis analyzes the input data and sends it to Event Control as event data and to Sound Processing as MIDI commands. Event Control sends commands to CG Generate when the CG needs to change depending on the heart-sound data. CG Generate creates CG based on these commands and outputs the CG. Sound Processing processes the sound data as required and then outputs it. Image Analysis analyzes the image data fed from a camera, and the relational information between the hand and the displayed CG is sent to Event Control. Event Control sends commands to CG Generate when the CG needs to change depending on these data; when necessary, it also sends changes of the sound data as MIDI commands to Sound Processing or operates the vibrators.
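The routing between the modules named above can be sketched as a small dispatcher. The event names and command strings are illustrative assumptions, and outputs are collected in lists here instead of driving real CG, MIDI, or vibrator hardware.

```python
class EventControl:
    """Route events from the analysis modules to the output modules."""

    def __init__(self):
        self.cg_commands = []    # would be sent on to CG Generate
        self.midi_commands = []  # would be sent on to Sound Processing
        self.vibrator_on = False

    def on_heart_sound(self, event):
        # Heart-sound event data may require the CG to change.
        self.cg_commands.append(("heart", event))

    def on_image(self, hand_touches_partner_cg):
        # Relational information between a hand and the displayed CG,
        # as produced by Image Analysis.
        self.cg_commands.append(("hand", hand_touches_partner_cg))
        if hand_touches_partner_cg:
            self.midi_commands.append("touch")  # sound change as a MIDI command
            self.vibrator_on = True             # simulated feeling of touch
```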
For the installation, a space four meters wide, four meters deep, and three meters high is required. A dark and quiet space is preferable. Interactive actions are displayed on one main screen and two Japanese "shoji" screens. A Japanese "hinoki" wooden bucket one meter in diameter and filled with water is placed in the center of the installation. Two persons, each fitted with a stethoscope, experience non-verbal communication by touching their CG embodiments in the bucket. The synchronization based on the heart rates from the electrocardiograph electrodes is calculated by the PC, and the PC expresses the resulting feeling in CG form. The hand movements of the two persons are captured by an installed camera, and an image analysis of the data is performed. In accordance with the synchronization interaction model, each CG embodiment either follows the movement of the partner's hand under high synchronization or moves away from it under low synchronization. When one person touches the partner's CG embodiment, a vibrator gives a simulated feeling of touch. The stethoscope picks up the heart sound, which is processed by the PC and output.