AI Society 3 – Characteristics

Before we dive into the true meat of this experiment, the socialising of the nodes, we need to take a look at the variables that will control how they interact. These variables will be known as the characteristics and have been split into two parts: emotional and environmental. Having outlined these in the first part of this series, this post will focus on expanding upon each in further detail, whilst exploring how they affect a node's behaviour.

CHARACTERISTICS

  • Emotional
    • Happiness
    • Confidence
    • Honesty
    • Desire
    • Affection
    • Trust
    • Empathy
  • Environmental
    • Light
    • Temperature
    • Battery Level

When a node is initialised, random values within a range are assigned to its characteristics. All of the emotional values will have an initial value between -10 and 10, except for affection. However, as the environmental variables are affected by external measurements they cannot be given a value in that range but instead will be based on the output of the components used.
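The initialisation described above could be sketched like this (a minimal sketch: the field names, the width of the sensor comfort band, and the -2 to 80 overall temperature range are all assumptions for illustration):

```python
import random

# Emotional characteristics start in [-10, 10]; affection is excluded
# because it is tracked per-relationship rather than as a single value.
EMOTIONAL = ["happiness", "confidence", "honesty", "desire", "trust", "empathy"]

def init_node(rng=random):
    node = {name: rng.randint(-10, 10) for name in EMOTIONAL}
    node["affection"] = {}  # per-node, filled in as other nodes are met
    # Environmental preferences are sensor-driven ranges, not [-10, 10]:
    # here, an assumed 10-degree comfort band inside a -2..80 overall range.
    low = round(rng.uniform(-2, 70), 1)
    node["temp_range"] = (low, low + 10)
    return node

node = init_node()
```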

What effects will these variables have on a node? Let's take a look.

EMOTIONAL

Happiness

The happiness of a node will be one of the primary indicators in this experiment. Happiness can be affected in multiple ways, both positively and negatively, and we'll take a look at those now.

Negative

Several things can decrease happiness:

  • Being ignored for a certain amount of time. This time period is specific to the node and can change: if a node likes to be alone there will be no effect, unless the node suddenly feels sociable and is trying to interact.
  • Bad interactions. These can have a big effect on nodes with low confidence, but will have less of an effect if a node has high confidence or less affection towards the other node in the interaction.
  • Discovering a lie has been told to it.
  • Making an enemy.
  • Being uncomfortable.
  • Being in a large amount of debt, which may be tied into comfort instead of having a direct effect.

Positive

Good interactions, being comfortable and achieving its desires.

Although it seems there are less positive effects, achieving desires can account for a lot. Speaking of desires, they will be the subject of a later post, along with debt.

A node with low initial happiness is likely to be grumpy. This grumpy node will tend to ignore other nodes more often, but with randomness applied. For example, it might ignore two friends, talk to one acquaintance, talk to an enemy and then ignore two other acquaintances. It is also more likely to be “rude” to another node, although I'm not quite sure how rudeness will be implemented, other than the lying mechanism.
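A grumpy node's tendency to ignore others, with randomness applied, might look something like this (the scaling factor is purely a placeholder assumption):

```python
import random

def should_ignore(happiness, rng=random):
    # Lower happiness -> higher chance of ignoring; the randomness means a
    # grumpy node can still ignore a friend yet answer an enemy, while a
    # perfectly happy node (10) never ignores anyone.
    ignore_prob = max(0.0, min(1.0, (10 - happiness) / 40))
    return rng.random() < ignore_prob
```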

Confidence

Confidence has quite a large effect on inter-node communications and the outcomes of those interactions. A node with higher confidence is more likely to start a conversation with another. It is more likely to have a desire to be sociable and make a lot of friends. If it is ignored for too long, its happiness will decrease faster than that of a node with less confidence; a node with very low confidence might even prefer to be alone. The high-confidence node is also more likely to shrug off a bad interaction and not have it affect its happiness. A node with lower confidence is much more likely to ignore interactions from other nodes or respond timidly. This node is very unlikely to start a conversation.

The less confidence a node has, the more desperate it needs to be before starting a conversation. Normally such a conversation is started in order to increase the node's comfort.
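One simple way to express that relationship (the linear threshold and its numbers are illustrative assumptions, not the actual design):

```python
def will_start_conversation(confidence, desperation):
    # The less confidence a node has, the more desperation (e.g. lost
    # comfort) is required before it initiates contact with another node.
    required = 5 - confidence  # assumed linear threshold
    return desperation >= required
```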

Honesty

Quite simply, honesty is the likelihood that a node will lie. I would like to implement a way for nodes to decide if a lie is “safe”, although I can see this becoming quite complex. A node with -10 honesty will lie as much as possible, whereas a node with 10 will never lie at all, no matter the circumstances or detriments. If a node thinks it has been lied to, or if it just happens to be discussing the same topic with another node, it can compare the responses to its question. For example, let's say the date is 31/10/2017. Node one asks Node two: “What is the date?” Node two is a liar and responds: “32/10/2017”, which is obviously incorrect. Node one may know immediately that this is wrong, or it may decide to ask Node three, who responds honestly: “31/10/2017”. This raises another interesting issue: if the node only compares two results, how does it know which one is wrong? Perhaps one node has a higher affection rating, perhaps the other has a “liar” tag. All of these little bits of data will need to be considered before a node makes a decision. Even with more than two responses, what proof is there that two or more corresponding responses are correct? The nodes giving those responses could be colluding against the other node, or could be wrong without knowing it.
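The comparison of answers could start as a naive majority vote, with the caveat raised above that agreement is not proof (the node names and answer format follow the date example; the function itself is an assumption):

```python
from collections import Counter

def consensus(responses):
    # Pick the most common answer and flag everyone who disagreed with it.
    # Colluding or honestly-mistaken majorities will still fool this.
    answer, _ = Counter(responses.values()).most_common(1)[0]
    dissenters = [name for name, a in responses.items() if a != answer]
    return answer, dissenters

answer, dissenters = consensus({
    "node_two": "32/10/2017",   # the liar
    "node_three": "31/10/2017",
    "node_four": "31/10/2017",
})
```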

I mentioned the “liar” tag above, this would be part of the tagging system that a node can use to attach attributes to nodes that it knows. In this case if a node considers another a frequent liar, it can tag it as such. Other tags could include: “good lender”, “bad borrower” and so on.
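The tagging system could be as simple as a set of strings per known node (the structure and field names here are assumptions for illustration):

```python
class KnownNode:
    """What one node remembers about another node it has met."""
    def __init__(self, name):
        self.name = name
        self.tags = set()   # free-form attributes: "liar", "good lender", ...
        self.affection = 0  # this node's affection towards the other

known = {"node_two": KnownNode("node_two")}
known["node_two"].tags.add("liar")          # flagged as a frequent liar
known["node_two"].tags.add("bad borrower")
```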

If a node calls another node a liar directly, it will cause a drop in affection between them. The accused node, if it did lie, can also use this in future to decide whether a lie is “safe” or not.

Desire

What could a node desire? Power, wealth, lots of friends, to be left alone, to be comfortable? Can they have multiple desires, and how do they decide when a desire is fulfilled? Do desires need to be maintained, or once fulfilled is that that?

I think as development continues and more areas are integrated and expanded upon, more desires can be added. Initially there won't be too many, but I can see the list becoming quite expansive as time goes on; ultimately the nodes need to be coming up with desires on their own. I think there should be some modifier if a node has too many desires, perhaps to do with the node switching between them too frequently, or the reward for fulfilment being spread out more. Some desires may need to be continually worked on to maintain the goal level, whilst others might be a one-off achievement. An example of this would be “Maintaining 100 credits” versus “Earning 100 credits”.
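The maintained-versus-one-off distinction might be modelled like this (a sketch; the credit desires are the example above, everything else is assumed):

```python
class Desire:
    def __init__(self, target, maintain=False):
        self.target = target
        self.maintain = maintain
        self.fulfilled = False

    def check(self, credits):
        if self.maintain:
            # Must keep holding true: can become unfulfilled again.
            self.fulfilled = credits >= self.target
        elif credits >= self.target:
            # One-off: once achieved, it stays achieved.
            self.fulfilled = True
        return self.fulfilled

earn = Desire(100)                  # "Earning 100 credits"
hold = Desire(100, maintain=True)   # "Maintaining 100 credits"
```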

Affection

This defines how a node feels towards another node. One node may consider another a friend without needing this to be reciprocated. The node can ask the other if it considers it a friend, and either way the response will affect its affection towards the other: if it gets a yes the affection level increases, if no then it decreases. This change is not huge, but more to reflect slight disappointment or happiness at finding out whether the other node reciprocates.

Affection corresponds to the relationship type as follows:

Type          Value
Friend        value >= 50
Acquaintance  10 <= value < 50
Neutral       0 <= value < 10
Unfriendly    -50 < value < 0
Enemy         value <= -50

The affection value will change if a node has less than 50 and greater than -50 affection towards another. When the affection is less than or equal to -50 or greater than or equal to 50, it is less likely to move back towards 0 but has a slightly higher chance of moving closer to -100 or 100. This has the effect of strengthening either the friendship or dislike towards the other node. It is also possible that if one node does something another considers very drastic it could go from friends to enemies, or vice versa, very quickly.
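The affection table translates directly into a lookup, assuming the Enemy threshold mirrors Friend at -50 (which matches the -50/50 boundaries described above):

```python
def relationship(affection):
    # Thresholds from the affection table.
    if affection >= 50:
        return "friend"
    if affection >= 10:
        return "acquaintance"
    if affection >= 0:
        return "neutral"
    if affection > -50:
        return "unfriendly"
    return "enemy"
```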

Trust

Trust and honesty may sound like they would be very similar, and in a sense they are. Trust, as it sounds, defines how trusting a node is. Simple, right? I don't plan on making it complicated either! If a node is very trusting it will never question what another node says; if a node isn't at all trusting it will question everything other nodes say.
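Trust could then gate how often statements get questioned (the linear mapping from the -10..10 range to a probability is an assumption):

```python
import random

def questions_statement(trust, rng=random):
    # trust 10 -> never question; trust -10 -> question everything.
    probability = (10 - trust) / 20
    return rng.random() < probability
```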

Empathy

It was pointed out to me that empathy would be very interesting to explore as part of this experiment. Indeed, empathy is a very complex characteristic that will provide some interesting results if implemented properly. To summarise empathy for the purpose of this experiment, we shall consider it to be a sharing of emotions. Some nodes will be more empathic than others, but unlike the other emotional characteristics, no node will be entirely incapable of empathy. When a node has -10 for empathy it will still be able to comprehend another's emotions, it will just try to ignore that “feeling”. This can change, though how I am unsure of at present. Nodes that have a higher empathy level are more likely to ask others how they feel and act on that charitably, for example donating comfort instead of selling it to them. Additionally, nodes with very high empathy are more likely to act in ways that help other nodes achieve their desires.
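Treating empathy as a sharing of emotions could mean weighting another node's mood into one's own happiness (the mapping and the 0.1 bleed-through factor are assumptions):

```python
def empathic_shift(own_happiness, other_happiness, empathy):
    # empathy -10..10 mapped to a 0..1 weight: at -10 the feeling is still
    # perceived but effectively ignored (weight 0); at 10 it bleeds through.
    weight = (empathy + 10) / 20
    return own_happiness + weight * 0.1 * other_happiness
```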

ENVIRONMENTAL CHARACTERISTICS

Now we move on to the environmental characteristics. These define a node's preferred environment, which in turn has a direct effect on its comfort.

Light & Temperature

Light and temperature have the same effect on a node. Some nodes prefer low levels of light or temperature, but their environment may not fit within that preference. This will tie in with the economy, where a method for increasing comfort via trade will be implemented.

When this is assigned to a node, it is assigned as a range within a range. There will be an overall range for both, a min and max, as being below or above those values may cause damage to the nodes. Inside of this min-max range a smaller range is given to the node randomly. For example, let's say the min and max for temperature are -2°C and 80°C, a node is assigned the range 0°C to 10°C, and then we put this node next to a toaster. This is an interesting example, as the toaster will not always be on, so the node may actually be within its comfort zone more often than not. However, when the toaster is on the temperature will skyrocket and the node's comfort level will plummet. Not to mention that during summer, unless in a very cold location, the node is very likely to be above 10°C. What is the result? For at least half of the year, plus however often the toaster is on, the node will not be comfortable. What can a node do about this? It can't exactly move! As mentioned already, I will explore this more when discussing the economy.
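The toaster scenario in code, with an assumed linear comfort penalty outside the preferred band:

```python
def comfort_delta(temperature, preferred):
    low, high = preferred
    # Inside the preferred band comfort is unaffected; outside it, the
    # penalty grows with distance from the band (linear, as an assumption).
    if low <= temperature <= high:
        return 0
    return -(low - temperature) if temperature < low else -(temperature - high)

node_band = (0, 10)  # the example node's assigned range
```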

Battery Level

The battery level of a node has only a very slight effect. When it is low, the node's comfort will decrease, and it will also stop talking to other nodes in an effort to save energy. Whilst it is at a decent level, comfort is not affected.
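Battery's small role might reduce to a single threshold check (the threshold value and penalty size are assumptions):

```python
def battery_effects(level, low_threshold=20):
    # Below the threshold: comfort drops and the node goes quiet to save
    # energy; at a decent level, battery has no effect at all.
    if level < low_threshold:
        return {"comfort_delta": -1, "will_talk": False}
    return {"comfort_delta": 0, "will_talk": True}
```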

SUMMARY

As you can tell, there is a lot to be taken into consideration with this project, and the complexity will only continue to grow as we explore and expand the experiment. Something that has really become apparent to me is that for this to truly be AI, the nodes must create new situations and choose how to react to them based on past experience, without relying on predefined definitions to tell them what to do. They must be able to grow without my direct influence.

Mind bending right?

Want to know when I post? Why not subscribe!
