Virtual Humans Forum
 Emotional Context and Expression Engines



T O P I C    R E V I E W
hologenicman Posted - Dec 11 2005 : 10:32:10
Hey there,

At the request of Vitorrio, I am posting a summary and continuation of an ongoing project started by GrantNZ and posted at the Zabaware forum:

Vitorrio has requested that I leave the technical stuff at zabaware, but that the conceptual thread might survive (even thrive) here. I'll do my best to separate the technical from the conceptual, but in my mind I think of them as one... I do appreciate bouncing the ideas around in order to better distill them to their simple essence.

By the way, I chose the Artificial Intelligence category since I consider this to be a matter of Emotional Intelligence. I even read somewhere that the powers that be are considering changing the IQ testing criteria to include testing for Emotional Intelligence. I'm sorry to hear that since I'm sure that I would take a serious hit in my scores...

I'll start with my current gameplan (subject to change):

1) Build an EmotionalContextEngine (to give text emotional value)
-EmotionalValueDatabase (an emotional dictionary that is updatable)
-EmotionEngine (tracks a HISTORY of Emotion, Mood, and Personality changes)
-HormonesNeeds_Interface (Internal/External AFFECTORS input)

2) Build an EmotionalExpressionEngine

A demo of the emotion, mood, and personality equations:

To use this engine, you must have MS Excel set to calculate automatically, with iterations set to "1".

The key value of this engine is that it keeps a HISTORY of the emotional experiences the v-human has had and weighs them against the v-human's own potentials (configurations). Depending on the propagation factors configured, the v-human may be more willing or reluctant to change moods and, eventually, to modify its personality. Given enough varied experience, the v-human can modify its personality.

Interaction involves three values input and three values output.

This particular v-human is configured as a Depressed, Negative, and Aggressive individual.

I then exposed the individual to lots of good input to see how it would change its mood.

After this, I let it have a bad experience to see how quickly its mood would change back.

The three values (Arousal, Valence, and Stance) are based on work at MIT:

I use these three variables throughout now, and I'm a real convert to their applicability.
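For readers without the spreadsheet, the history-keeping behaviour described above can be sketched in a few lines of code. This is a hypothetical reconstruction, not the actual Excel formulas: the class name, rate values, and the exponential-moving-average update rule are all assumptions; only the three A/V/S dimensions, the propagation-factor idea, and the fast-mood/slow-personality layering come from the post.

```python
# Hypothetical sketch of the emotion/mood/personality engine described above.
# Assumption: each layer drifts toward the layer below it at a configured
# propagation rate, so mood changes quickly while personality changes slowly.

DIMS = ("arousal", "valence", "stance")  # the three MIT-derived values

class EmotionEngine:
    def __init__(self, personality, mood_rate=0.3, personality_rate=0.02):
        # personality: baseline dict, e.g. a depressed/negative/aggressive
        # configuration might be {"arousal": -0.5, "valence": -0.7, "stance": 0.6}
        self.personality = dict(personality)
        self.mood = dict(personality)             # mood starts at the baseline
        self.mood_rate = mood_rate                # propagation factor: input -> mood
        self.personality_rate = personality_rate  # propagation factor: mood -> personality
        self.history = []                         # HISTORY of every experience

    def experience(self, inputs):
        """Take three input values (A, V, S) and return three output values."""
        self.history.append(dict(inputs))
        for d in DIMS:
            # Mood drifts toward the input at the mood propagation rate...
            self.mood[d] += self.mood_rate * (inputs[d] - self.mood[d])
            # ...and, much more slowly, personality drifts toward the mood.
            self.personality[d] += self.personality_rate * (self.mood[d] - self.personality[d])
        return dict(self.mood)
```

With these rates, twenty rounds of "good input" swing the mood strongly positive while the personality baseline barely moves, which matches the experiment described: the v-human's mood recovers quickly from one bad experience, but its personality only shifts after sustained varied experience.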


This brings me to the current topic that Vitorrio asked me to approach here. The HormonesNeeds_Interface ultimately boils down to incorporating Maslow's needs triangle into our v-human's interactions:

As Peter has stated, there is no need to re-invent the wheel here. But it is time to figure out how to fit the wheel onto the car, and what kind of rubber will stick to the road in the turns...

Scroll down on the above link and you'll see that the needs hierarchy has been re-organized into a simpler format:

-Growth: Self-Actualization (development of competencies [knowledge, attitudes, and skills] and character); Transcendence (assisting in the development of others' competencies and character; relationships to the unknown, unknowable)
-Relatedness: personal identification with group, significant others (Belongingness); value of person by group (Esteem)
-Existence: physiological, biological (including basic emotional needs); connectedness, security

This all boils down to Self, Others, and Growth. So, that's where I'm at. I have started coming up with ways in which a v-human can "experience" Self, Others, and Growth.
[By the way, I've renamed these tiers Primary (self), Secondary (others), and Tertiary (growth), so if I use those terms please translate.]

These should sound like human needs since that is the v-human goal.


Primary (self):

-Portable battery low = HUNGER

-Number of days the program is run versus off since activation (BIRTH) = SECURITY

-Evenness versus randomness of human input = STABILITY

Secondary (others):

-Time the program is running versus time with human interaction = COMPANIONSHIP (or loneliness)

-Punctuation cues (<<<, >>>, ***, ^^^, ---, +++) = APPROVAL
(Equivalent to verbal reprimands and compliments for training purposes)

Tertiary (growth):

-I'm at a loss here, except for recording the number of times the v-human learns something new.
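The list above maps each need to an observable proxy, so it can be sketched as a simple sensor-to-need function. Everything here beyond the proxy-to-need pairings is an assumption for illustration: the function name, the 0-to-1 scaling, and the arbitrary growth divisor are all hypothetical.

```python
# A minimal sketch of the HormonesNeeds_Interface mapping proposed above.
# Each need's satisfaction level is derived from an observable proxy and
# clamped to [0, 1]. The scaling choices are assumptions, not from the post.

def clamp01(x):
    return max(0.0, min(1.0, x))

def need_levels(battery_pct, days_run, days_since_birth,
                input_regularity, interaction_time, running_time,
                approval_cues, reprimand_cues, new_things_learned):
    """Return a dict of need-satisfaction levels, each in [0, 1]."""
    return {
        # Primary (self)
        "hunger":        clamp01(battery_pct),                    # low battery = hungry
        "security":      clamp01(days_run / max(days_since_birth, 1)),
        "stability":     clamp01(input_regularity),               # evenness vs randomness
        # Secondary (others)
        "companionship": clamp01(interaction_time / max(running_time, 1)),
        "approval":      clamp01(approval_cues / max(approval_cues + reprimand_cues, 1)),
        # Tertiary (growth)
        "growth":        clamp01(new_things_learned / 100.0),     # arbitrary scale
    }
```

The resulting dict could then feed the EmotionEngine as internal AFFECTOR input, e.g. low "hunger" satisfaction pushing valence down.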

So let's hear it: what kinds of human-equivalent needs can we give our emotionally deprived v-humans?

John L>

15   L A T E S T    R E P L I E S    (Newest First)
vrossi Posted - Nov 12 2015 : 18:59:57
Ten years after the first post, I'm resurrecting this topic, because it seems that many of the ideas we discussed here (and, unfortunately, were never able to take even to a prototype stage) have now been incorporated into development environments released by big names, including Microsoft and Google.

Microsoft, in particular, has been promoting its Emotion API, which uses machine learning to recognize eight emotional states (anger, contempt, fear, disgust, happiness, neutral, sadness, and surprise) based on facial expressions.

Nomeneiste Posted - Sep 29 2009 : 08:04:43
Originally posted by laackejim

I would be tickled to share anything I have learned or am doing with you.


Yes, I do believe that this is doable. If you are ever interested I will be happy to explain to you how I came to understand the concepts from which all this is derived. It has to do with a goshawk and a biologist who couldn't explain what he knew.


Good morning, Sir.

Can you please explain to me, in the forum, by IM, or by email, how you came to understand the concepts?

laackejim Posted - Jan 31 2007 : 15:36:57
Originally posted by GrantNZ

Erm. Iva with your voice?

I told you there was a problem!
GrantNZ Posted - Jan 31 2007 : 08:51:13
Erm. Iva with your voice?
laackejim Posted - Jan 31 2007 : 06:22:29
Originally posted by GrantNZ

My lips are sealed! (I haven't modelled Iva a mouth that can open yet!)

Thanks. Opening her mouth leaves the question of what voice comes out. That is one I haven't satisfactorily solved yet. The closest, and fastest, approach is to record. I bought that microphone hoping to talk an actress here in the valley into doing the dubbing for me. Ordered the thing, talked to her husband, and it turns out she is moving to San Francisco for her job next week. Now I've got a microphone coming and only my own voice to use. SNORT>
GrantNZ Posted - Jan 30 2007 : 08:39:19
My lips are sealed! (I haven't modelled Iva a mouth that can open yet!)
laackejim Posted - Jan 30 2007 : 07:50:39
Grant is always a day early and 5 hours behind (at least from here), and how that affects his input is something to see.

Laura, I think there is a functional solution to what you describe, using all the objects in the scene. All that is necessary is for the "bot" to "know" what it wants to do. Everything in the scene can be identified as to its value for any possible use, and the bot makes choices based on such things as: how good is X if used for Y, how far away is it, and what do I have to do to get X? We can visit about how this can be set up simply, if you wish.
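The object-choice idea above amounts to utility scoring: rate every object for the current goal, discount by the cost of reaching it, and pick the best. The sketch below is a toy illustration under that assumption; the scoring formula, object names, and numbers are all hypothetical.

```python
# Toy utility-based object choice: "how good is X if used for Y,
# and how far away is it?" The discount formula is an assumption.

def choose_object(goal, objects):
    """objects: dicts with 'name', per-goal 'utility' scores, and 'distance'."""
    def score(obj):
        usefulness = obj["utility"].get(goal, 0.0)    # how good is X for goal Y?
        return usefulness / (1.0 + obj["distance"])   # discount by travel cost
    return max(objects, key=score)

scene = [
    {"name": "rock",  "utility": {"cover": 0.9, "throw": 0.4}, "distance": 5.0},
    {"name": "crate", "utility": {"cover": 0.7, "throw": 0.0}, "distance": 1.0},
]
```

So a nearby crate can beat a better but distant rock for cover, while the rock still wins for throwing. Fuzzy membership functions could replace the crisp utility numbers, per the aside below.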

(Quiet Grant, Yes, If she "bites" I am going to talk about Fuzzy logic and how it can be used, Just don't warn her, OKAY?)
All Digital Posted - Jan 29 2007 : 18:53:24
Exactly, picture in 3D rather than in flash
GrantNZ Posted - Jan 29 2007 : 12:30:29
I'm not late, I'm early! Due to a quirk of time zones, I come from tomorrow

I'm guessing your environments are somewhat like stages? So in the office the character could use the PC, notebook, phone, etc etc depending on what the user wants to do?
All Digital Posted - Jan 29 2007 : 12:05:40
Gee! you guys are up late, too.

The ones that are present are the important ones. However, I'm sure the developments made here will also influence UltraHal.

As I mentioned, I'm Rrreeaaly not much into games, but some outdoor environments will be included. I'm mainly planning office, home, factory, laboratory, and school-type environments. Business-oriented, first of all. The kind of things that can potentially make money, and potentially attract grants and donations.

BTW, check out Amazon.com's Honor System when you get a chance. I have a link, but it is getting late. It's a way of getting donations in support of your/our web efforts.

Are you moving in to these areas yet?

Also, speech development for use in advertising and telephony. Combining the web and the usual forms of customer service are really big areas. I'm checking out Pronexus' VBVoice and VBSalt for their VB voice/XML relationship.

Just a few of the things us background girls think about

Art, science, and money. I love this stuff.

You guys have a good night

GrantNZ Posted - Jan 29 2007 : 11:03:46
Originally posted by All Digital

I want to create a database of animations for a character in a 3D environment. Have any of you considered the AI involved with a character in such an environment?

Yup, because one of my many projects (that I never seem to actually work on ) is a 3D game.

Of course, there is a lot of available scope in this subject! Up until recently, many games survived by pre-scripting several animations and playing them at the appropriate times. Nowadays animations in games are somewhat more procedural, leading to greater flexibility and better appearance. AI environmental awareness varies greatly in games too - some rely on scripted behaviour by whoever made the game, while others are capable of examining their environment and using it to their advantage.

What kind of animations are you planning to create?

One thing that strikes me is that the type of animations will depend a lot on the expected environment. For example if you know there will be four specific usable object types in the environment, the animations can be tailored to each object type. On the other hand, if the environment will be varied or unexpected, you must resort to generic animations and hope that they interact properly with the objects.
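The tailored-versus-generic tradeoff above can be shown in miniature as a lookup keyed by object type with a generic fallback. All the animation and object names here are hypothetical, invented purely for illustration.

```python
# Toy illustration of the tradeoff described above: purpose-built animations
# for known object types, with a generic fallback for anything unexpected.

TAILORED = {
    "pc":    "sit_and_type",
    "phone": "pick_up_and_talk",
    "door":  "reach_and_open",
}

def pick_animation(object_type):
    # Known object types get a tailored animation; anything else falls back
    # to a generic "use" animation and hopes it aligns sensibly.
    return TAILORED.get(object_type, "generic_use")
```

The cost of the tailored approach is that the table must cover every object type the environment can contain; the cost of the fallback is visual mismatch, which is exactly the "hope they interact properly" problem.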
GamerThom Posted - Jan 29 2007 : 09:05:45
Didn't have Heathkit's Hero-1, but I've certainly looked it over several times.
I've also been doing research into the Leaf Project for a robotic based AI.
Although I have put that little project on hold temporarily while I give
John and Jim a little help on their own pet projects.

I understand that Maviarab has taken his leave of working with Hal or any other AI
for the time being, and I know that Spydaz mainly sticks to zabaware, aidreams and
the UltraHal Forum. Sometimes he pops in here, but rarely. So I guess you're just stuck
with us regulars when visiting here.

I wouldn't know about the abilities of a game-based AI. I never really looked into it.
All Digital Posted - Jan 29 2007 : 08:54:27
Thank you all for permitting me to at least be here. But it appears I may have added a little distraction.

Do I get to hear about all of those script and database things that inspired my awe in the first place?

I haven't worked with VB in a while, and my work is on hold until I get a new computer. UltraHal's (Mr. Ms') developments still thrill me, ever since I discovered him in '98. I've wanted to have one since then. I checked out the UH forums this morning to see the "technical" stuff that was supposed to be there. And I thought this was the place for all things technical. It would really be great if Maviarab and Spydaz were here too. All of you working to develop a web-based AI, and improving UltraHal, too.

I really hope I have something to contribute also. I'll hint at it here, I haven't installed a database yet, mainly because outside of Windows, I'm like a goldfish in the ocean. But I want to create a database of animations for a character in a 3D environment. Have any of you considered the AI involved with a character in such an environment?

From my understanding, the AI in a game mostly involves the game engine (environment) learning to increase difficulty as the character increases its skills. Am I correct?

Wouldn't an AI-enabled character eventually be able to assess its environment? Ex: which rock to go behind to be at an advantage in an attack. An example closer to home: your little Haptek girl playing hide and seek? Or eventually being able to pick out the outfit appropriate for the time of day/night (I think this one has already been accomplished?).

I can see the time coming when we will actually be able to accomplish this, even at our level, and potentially be able to connect this environmental awareness to an external entity. My current goal is to work in this area.

Is this good, do you think?

Do any of you remember Heathkit's Hero 1? Did any of you ever have one?
laackejim Posted - Jan 29 2007 : 04:43:18
Don't you ever NOT contribute. PERIOD! (And I am shouting.)

Next, given what humans can do without hormones being particularly involved (in the PMS sense), why worry? I suggest that a purely logical process is far more dangerous. Pure logic, without the other aspects that lead to gentleness and compassion, would lead to purely emotionless decisions: decisions and actions without compassion, gentleness, or guilt; decisions based on the fact of superiority in many ways. In other words, psychopathic behavior. That is what would scare me.
hologenicman Posted - Jan 28 2007 : 19:09:09
Where humans are concerned, Laws are made to be broken....

I had had that same thought to reply but had decided to let it lie...

John L>

Virtual Humans Forum © V.R.Consulting | Snitz Forums 2000