*Define Initial Population*

The initial population in the final version of Alphabet Soup will likely contain one of each available keyboard character. For now, for simplicity’s sake, let’s begin with four characters: A, B, a, b. In our game, all four characters decide to run into each other at once.

First we will convert the 16×16 characters into binary, 0 for gray and 1 for yellow:

A:
0000000000000000
0000000100000000
0000001110000000
0000001010000000
0000011011000000
0000010001000000
0000110001100000
0000100000100000
0000100000100000
0001111111110000
0001111111110000
0001000000010000
0011000000011000
0010000000001000
0110000000001100
0000000000000000

A, Binary String: 0000000000000000000000010000000000000011100000000000001010000000000001101100000000000100010000000000110001100000000010000010000000001000001000000001111111110000000111111111000000010000000100000011000000011000001000000000100001100000000011000000000000000000
A, Binary-Decimal conversion: 95786086615602871731981918389383308042547593781186608

B:
0000000000000000
0000000000000000
0001111111000000
0001111111100000
0001000000110000
0001000000010000
0001000000010000
0001000000110000
0001111111100000
0001111111110000
0001000000010000
0001000000010000
0001000000110000
0001111111100000
0001111111000000
0000000000000000

B, Binary String: 0000000000000000000000000000000000011111110000000001111111100000000100000011000000010000000100000001000000010000000100000011000000011111111000000001111111110000000100000001000000010000000100000001000000110000000111111110000000011111110000000000000000000000
B, Binary-Decimal conversion: 46403387827017775033653233413204161160929512263696

a:
0000000000000000
0000000000000000
0000000000000000
0000001111100000
0000011111110000
0000110000110000
0000000000010000
0000000111110000
0000011111010000
0000110000010000
0000110000110000
0000111001110000
0000011111011000
0000000000000000
0000000000000000
0000000000000000

a, Binary String: 0000000000000000000000000000000000000000000000000000001111100000000001111111000000001100001100000000000000010000000000011111000000000111110100000000110000010000000011000011000000001110011100000000011111011000000000000000000000000000000000000000000000000000
a, Binary-Decimal conversion: 86418088698874722540000455600782178239016967

b:
0000000000000000
0000100000000000
0000100000000000
0000100000000000
0000101111000000
0000111111100000
0000110000110000
0000100000010000
0000100000010000
0000100000010000
0000100000010000
0000110000110000
0000111011100000
0000101111000000
0000000000000000
0000000000000000

b, Binary String: 0000000000000000000010000000000000001000000000000000100000000000000010111100000000001111111000000000110000110000000010000001000000001000000100000000100000010000000010000001000000001100001100000000111011100000000010111100000000000000000000000000000000000000
b, Binary-Decimal conversion: 766259462624453036391004127445265907255673543052767246

So, our Initial Population (POPinit) consists of the following four characters and their decimal representation:
1. (A) 95786086615602871731981918389383308042547593781186608
2. (B) 46403387827017775033653233413204161160929512263696
3. (a) 86418088698874722540000455600782178239016967
4. (b) 766259462624453036391004127445265907255673543052767246
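The bitmap-to-number conversion above can be sketched in code. What follows is a minimal sketch of the straightforward approach, reading the concatenated rows as one base-2 integer, demonstrated on a tiny 4×4 glyph so the arithmetic stays legible (the real letters would use the full 16×16 grids):

```python
def bitmap_to_int(rows):
    """Concatenate the rows of a pixel bitmap (strings of '0'/'1')
    into one bit string and interpret it as a base-2 integer."""
    bits = "".join(rows)
    return int(bits, 2)

# A tiny 4x4 "glyph" keeps the arithmetic checkable by hand:
glyph = [
    "0000",
    "0110",
    "0110",
    "0000",
]
# Bit string: 0000011001100000 (binary) = 1632 (decimal)
print(bitmap_to_int(glyph))
```

The same function handles the 16×16 letters unchanged; the bit string just gets 256 characters long and the decimal values get huge.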

*Define fitness of POPinit*

First, we establish the Total, the Best, the Worst and the Average based on their decimal representations:
Total: (A+B+a+b) 862091952714301014596894421608062875060164244585234517
Best: (b) 766259462624453036391004127445265907255673543052767246
Worst: (a) 86418088698874722540000455600782178239016967
Average: 215522988178575253649223605402015718765041061146308629.25

Based on these findings, we can calculate the chance of Reproduction / Survival for each character.
A: 0.1111088977388315456010566688830799217993208329796695901253… = 11% chance of reproducing
B: 0.0000538264945878643991068686448919016779835612586246604953… = <1% chance of reproducing
a: 0.0000000001002423099146058828565205588056220034311729841463397… = MUCH <1% chance of reproducing
b: 0.8888372756663382800852305796155076177170736023305327652329… = 88% chance of reproducing
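The reproduction chances above are classic fitness-proportionate (roulette-wheel) selection: each character's probability of reproducing is its fitness divided by the population total. A minimal sketch, with small stand-in fitness values in place of the giant decimals:

```python
def selection_probabilities(fitness):
    """Map each individual's fitness to its share of the population
    total (fitness-proportionate / roulette-wheel selection)."""
    total = sum(fitness.values())
    return {name: f / total for name, f in fitness.items()}

# Stand-in fitness values; the real game would use the decimal
# representations of the letter bitmaps.
pop = {"A": 11.0, "B": 0.5, "a": 0.1, "b": 88.4}
probs = selection_probabilities(pop)
print(max(probs, key=probs.get))  # "b" dominates, as in the walkthrough
```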

*Perform Reproduction Function*

As you can tell from the fitness of each character, the character “b” has by far the best chance at survival, while character “A” is the only other letter with much of a chance at all. In the actual game, the reproduction function will decide who actually survives reproduction so it is impossible to say for sure, but based on these percentages we will go with three b’s and one A surviving to population generation 0.

So, the result of the reproduction function should look something like:

1. (b) 766259462624453036391004127445265907255673543052767246
2. (b) 766259462624453036391004127445265907255673543052767246
3. (b) 766259462624453036391004127445265907255673543052767246
4. (A) 95786086615602871731981918389383308042547593781186608

*Define fitness of gen0*

Again, we determine the total, best, worst and average fitness of the new population, gen0:

Total, better: (b+b+b+A) 2394564474488961980904994300725181029809568222939488346
Best, same: (b) 766259462624453036391004127445265907255673543052767246
Worst, better: (A) 95786086615602871731981918389383308042547593781186608
Average, better: 598641118622240495226248575181295257452392055734872086.5

Note that the Total, Worst, and Average fitness of the population all improved, an expected result of natural selection. The Best fitness, meanwhile, stays the same–and it has no chance of improving until we run the next stage of the GA…

*Perform genetic crossover operation on gen0*

Start with two parents. Let’s use 3 & 4 as an example with an arbitrary crossover point (we’ll say 170 bits in–though in the actual game this would be determined by the randomly generated genetic operation):

3. (b) 766259462624453036391004127445265907255673543052767246 =
00000000000000000000100000000000000010000000000000001000000000000000101111000000000011111110000000001100001100000000100000010000000010000001000000001000000100000000100000 | (crossover point) 01000000001100001100000000111011100000000010111100000000000000000000000000000000000000

4. (A) 95786086615602871731981918389383308042547593781186608 =
00000000000000000000000100000000000000111000000000000010100000000000011011000000000001000100000000001100011000000000100000100000000010000010000000011111111100000001111111 | (crossover point) 11000000010000000100000011000000011000001000000000100001100000000011000000000000000000

From this crossover, we get 2 new offspring characters:

offspring1 (o1):
0000000000000000
0000100000000000
0000100000000000
0000100000000000
0000101111000000
0000111111100000
0000110000110000
0000100000010000
0000100000010000
0000100000010000
0000100000110000
0001000000010000
0011000000011000
0010000000001000
0110000000001100
0000000000000000

o1, Binary String: 0000000000000000000010000000000000001000000000000000100000000000000010111100000000001111111000000000110000110000000010000001000000001000000100000000100000010000000010000011000000010000000100000011000000011000001000000000100001100000000011000000000000000000
o1, Binary-Decimal conversion: 766259462624453036391004127445265907255673543589892144

offspring2 (o2):
0000000000000000
0000000100000000
0000001110000000
0000001010000000
0000011011000000
0000010001000000
0000110001100000
0000100000100000
0000100000100000
0001111111110000
0001111111010000
0000110000110000
0000111011100000
0000101111000000
0000000000000000
0000000000000000

o2, Binary String: 0000000000000000000000010000000000000011100000000000001010000000000001101100000000000100010000000000110001100000000010000010000000001000001000000001111111110000000111111101000000001100001100000000111011100000000010111100000000000000000000000000000000000000
o2, Binary-Decimal conversion: 95786086615602871731981918389383308042547593244061710

So gen1 looks like:
1,2 are the same as before; 3,4 are the offspring of crossover:
1. (b) 766259462624453036391004127445265907255673543052767246
2. (b) 766259462624453036391004127445265907255673543052767246
3. (o1) 766259462624453036391004127445265907255673543589892144
4. (o2) 95786086615602871731981918389383308042547593244061710
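The crossover step itself–cut two parent bit strings at the same point and swap the tails–can be sketched as follows. Short strings are used here for legibility; the game would cut 256-bit strings at a randomly chosen point (170 in the example above):

```python
def one_point_crossover(parent1, parent2, point):
    """Swap everything after `point` between two equal-length
    bit strings, producing two offspring."""
    assert len(parent1) == len(parent2)
    child1 = parent1[:point] + parent2[point:]
    child2 = parent2[:point] + parent1[point:]
    return child1, child2

o1, o2 = one_point_crossover("11110000", "00001111", 4)
print(o1, o2)  # 11111111 00000000
```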

*Convert gen1 offspring to valid characters*

Now, 3 and 4 are new shapes that do not correspond to any of our 4 letters. The BBC (Bitmap-to-Binary Converter) will be used to “read” these bitmaps and find the closest letter in the space of possible letters, which we will define. So, the offspring will turn into something like “h” and “8”, because those are the closest possibilities within the alphabet.

I was going to include a range in decimal numbers and apply that to the nearest alphabet character within that range, but I don’t think that will work because more often than not the offspring will just end up reverting to one of the two characters from which it was created. From this perspective, the BBC may be a very useful portion of the AS program.
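One plausible way for the BBC to “read” an offspring is a nearest-neighbor search over the defined alphabet using Hamming distance (the count of differing pixels), which sidesteps the reverting problem of decimal ranges. This is only a sketch of one candidate approach, shown on toy 8-bit bitmaps, not a settled design:

```python
def hamming(a, b):
    """Count positions where two equal-length bit strings differ."""
    return sum(x != y for x, y in zip(a, b))

def closest_letter(offspring, alphabet):
    """Return the alphabet entry whose bitmap differs from the
    offspring in the fewest pixels."""
    return min(alphabet, key=lambda name: hamming(offspring, alphabet[name]))

# Toy 8-bit "bitmaps" standing in for the 256-bit letters:
alphabet = {"h": "11001100", "8": "00111100"}
print(closest_letter("11001110", alphabet))  # differs from "h" by 1 pixel
```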

So, o1 becomes “h” and o2 becomes “8”, converting the gen1 characters to:

o1=h:
0000000000000000
0000110000000000
0000110000000000
0000110000000000
0000110111000000
0000111111100000
0000111000110000
0000110000110000
0000110000110000
0000110000110000
0000110000110000
0000110000110000
0000110000110000
0000110000110000
0000000000000000
0000000000000000

h, Binary String: 0000000000000000000011000000000000001100000000000000110000000000000011011100000000001111111000000000111000110000000011000011000000001100001100000000110000110000000011000011000000001100001100000000110000110000000011000011000000000000000000000000000000000000
h, Binary-Decimal conversion: 1149389193936678235951120191876555791760811195370319884

o2=8:
0000001111000000
0000011111100000
0000110000110000
0000100000010000
0000100000010000
0000110000110000
0000011111000000
0000011111100000
0000110000110000
0000100000010000
0000100000010000
0000100000010000
0000110000110000
0000011111100000
0000001111000000
0000000000000000

8, Binary String: 0000001111000000000001111110000000001100001100000000100000010000000010000001000000001100001100000000011111000000000001111110000000001100001100000000100000010000000010000001000000001000000100000000110000110000000001111110000000000011110000000000000000000000
8, Binary-Decimal conversion: 23539885800661303801528815919744719668529798326595472592908

So the gen1 now looks like:
1. (b) 766259462624453036391004127445265907255673543052767246
2. (b) 766259462624453036391004127445265907255673543052767246
3. (h) 1149389193936678235951120191876555791760811195370319884
4. (8) 23539885800661303801528815919744719668529798326595472592908

From which we evaluate the fitness again (“8” clearly will be the new dominant gene) and so on, generation after generation…

*Conclusion*

This is the basic gist of Alphabet Soup by GA. The final aspect of using a GA is to incorporate mutations: the random changing of various pixels within the letters, which, much like crossover, will add new variety to the population of letters. Mutations will only be applied to a small portion of the entire population in each generation, just like in nature. With only 4 characters to choose from, let’s say there are no mutations in this particular generation. That gives us the current pool of characters, including the initial population, gen0 and the current generation (gen1):

[POPinit][gen0] [gen1]
[A,B,a,b][b,b,b,A][b,b,h,8]

So, we have 6 “b”s, 2 “A”s, 1 “a”, 1 “h”, 1 “B” and 1 “8” floating around. Granted this is an example run, but it plays out pretty realistically: note that the dominant gene (b) is taking up quite a bit of the population, while a and B have not grown at all in population. “h” and “8” are the new kids on the block, and they will have a direct impact on the future generations, again based on their fitness for reproduction and crossover–with a little bit of mutation thrown in here and there.
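Pixel mutation as described–flip each bit of a letter's bit string with a small probability–might look like this (the rates here are arbitrary stand-ins):

```python
import random

def mutate(bits, rate=0.01, rng=random):
    """Flip each pixel bit independently with probability `rate`."""
    return "".join(
        ("1" if b == "0" else "0") if rng.random() < rate else b
        for b in bits
    )

random.seed(42)  # reproducible runs while debugging
print(mutate("0000111100001111", rate=0.05))
```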

One of the core questions remains: How do we make the shapes of the characters matter? In other words, how do we specify the importance of various pixels? This, I believe, can also be accomplished through the BBC, and I have some ideas for that. But more on that later.

So we have a situation where we have 256 bits that represent the locations of pixels in a 16×16 grid. I’m getting the feeling this is going to be surprisingly like IP addresses with their subnet masks, only eight times larger.

Each pixel will have a one or a zero in the 256 bits. In most of my examples I’ll be using 8 bits because it is demonstrative of the same information, only legible to the hu-mons.

The first eight pixels when blank are [00000000]. This is pretty simple. But it would improve most of the processes to build a little encoder to make these character-to-binary conversions simpler, rather than hard-coding all the characters once and every time there needs to be a change. Plus it would add the functionality to allow the user to make their own shapes in a free-form mode.

Now if a letter with the [00000011] code bred with one that had [11000000], it would produce the Base Subnet of [11000011]. The child could contain any shape within the range and possibly none of them. If we choose to add dominant and recessive traits—which we should—then we’ll need to add another tag to show which pixels will retain dominance. Simply enough, this can be accomplished with another two subnets.
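The Base Subnet combination described here is a bitwise OR of the two parents. In code, using Python integers and 8-bit formatting to mirror the bracket notation:

```python
def base_subnet(parent1, parent2, width=8):
    """OR two parents' bit patterns: the child's Base Subnet marks
    every pixel that is on in either parent."""
    combined = int(parent1, 2) | int(parent2, 2)
    return format(combined, f"0{width}b")

print(base_subnet("00000011", "11000000"))  # 11000011, as above
```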

Back up and let’s look at the parent classes again. [00000011] now needs another three addresses to properly pass down its heritage. [00000011] has a Base Subnet of [00000011] because he’s never once bred. And he has a Dominant Subnet of [00000011] because we want him to retain his shape as a letter. His Recessive Subnet should be [00000000] just to allow the option to change without the likelihood. And [11000000] will have similar characteristics. (BS-[11000000] DS-[11000000] RS-[00000000])

When these two parents meet and attempt to breed, the BS will become [11000011] with a DS of [XX0000XX] (the ‘X’s are random bits) and an RS of [????????]. I don’t quite know how the RS will be generated. Maybe the opposite of the BS that isn’t changed from the RS? Like if the DS becomes [10000010], then we have the unchanged bits [1X00001X], and making them the opposite of the BS would give [10000010]. I’d have to see this done a few more times further down the line to make sure the DS and the RS don’t always stay the same. I wouldn’t really see the point in just randomizing the RS, so it has to be derived from some other stat.

The child would be made from these ranges. At this point, other than the dominant and recessive context I’ve been using, there’s nothing GP about this. Fitness hasn’t become a factor in these combinations. It’s just a baseline for breeding and how it affects the appearance of the alphabet–and from appearance, its reactions to the environments.

I think I’ll drop some pictures of a character ‘A’ in a 16×16 bitmap, expanded by its traits.



The black is what it is. The red is the BS traits. The blue is the DS traits. And the block that looks blank is in actuality the RS traits.



Same here.



The child needs a lot of explaining. The cells are one through seven from left to right.

Ignore the first cell on the left. That comes last.

The second is the BS which is the potential, all the things the child could be.

The third is an overlay showing in green where the ‘A’ in light-blue and the ‘B’ in dark-blue overlap.

The fourth is just the bits that overlap. This represents the bits that are common to both and are to be forced dominant.

The fifth is the bits that could become dominant depending on randomness.

The sixth is if the bottom half was somehow all flipped-on making them dominant.

The seventh is the top half that was not flipped on and became recessive.

And now the first is the product. The dark-red and green are manifested because of the dominance and the plum color is now the recessive and does not show itself.



A problem I can already foresee with this system is that after only a couple generations of crossover, everything will become either recessive or dominant, and soon everything will manifest, making every character a big black block.

Maybe take only one side of the family’s genes in step five’s dominance check, and make the other family’s un-flipped bits recessive.

Now this is something. Here are all the same steps, but with the families jumbled between the four possibilities I made by cutting them in half (if it were random like I was hoping, there would be hundreds of possibilities). The problem is they make some cool-looking half-breed letters. But they wouldn’t look like that with only the dominant side (dark-red) and green manifested. Maybe the whole dominant/recessive thing needs to be reevaluated? Maybe when the dominant bits are chosen it’ll be different. I don’t know. But it also makes the BS pointless. Maybe I should scrap it and start over.


Over six months have passed since The Project began, and it was long before that the ideas were seeded. Pariahpism published his version of The Project origins right away. While this was an accurate description, I feel it is time to post my view on where it all started, what the past six months have involved, and where we might be going with The Project.

It began with an article published in 2007 by Alan Bellows on DamnInteresting.com about hardware evolution, “On the Origin of Circuits”. A comment on this article led me to Kevin Kelly’s book, Out of Control, which is fully readable online. Now normally in my internet browsing I am not going to follow a link to a book and end up reading it in its entirety, but that is exactly what happened with Out of Control. The book starts out in Biosphere 2 in Arizona, and I thought, “Hey, I’ve been there!” And it only got more interesting as it went. Who would have thought ecology, philosophy, technology, economies and evolution had so much in common?

Turns out Kevin Kelly is no scientist or engineer, but he is a hub of information, finding new meaning and understanding in many subjects, especially technology, with direct access to the top thinkers of our time. His blog The Technium continues to amaze and inspire me, while helping me think outside the box.

Out of Control covered a broad range of subjects, but some information seemed to stick out from the rest. I distinctly recall being impressed by the following, from Chapter 2:

In the film Batman Returns a horde of large black bats swarmed through flooded tunnels into downtown Gotham. The bats were computer generated. A single bat was created and given leeway to automatically flap its wings. The one bat was copied by the dozens until the animators had a mob. Then each bat was instructed to move about on its own on the screen following only a few simple rules encoded into an algorithm: don’t bump into another bat, keep up with your neighbors, and don’t stray too far away. When the algorithmic bats were run, they flocked like real bats.

It all came down to simple mathematical algorithms. Throw enough math into the software and, TADA! Biology plays out on your computer screen! Then, Chapter 15 dealt specifically with Artificial Evolution, while Chapter 17 “An Open Universe” expounded on that. Tom Ray and his game Tierra were highlighted, as was the GP master John R. Koza. These would soon become household names to The Project.

Fast-forward to July 2009. Pariahpism had taken to game journaling, and while not interested in gaming myself, I was surprisingly enthralled by his stories. My experience with software engineering, combined with the memories of “On the Origin of Circuits” and Out of Control, prompted a sudden desire to try to develop something of the same substance, but on a different scale. Imagine gaming journals about a self-evolving, unpredictable game! Immediately ideas started to take hold. We could provide the goals and the necessary environments, but let the games do the work!

After brief but furious discussions on the matter, Pariahpism proceeded to write about a number of game ideas brewing in our heads:

Alphabet Soup
The Field of Blobs
The Arti-Field of Blobs – The Field of Blobs Side-Quest
Game Untitled

And then the serious research began to see how to go about bringing such ideas to fruition. Little did we know everything that was already out there. Computational Intelligence is a huge realm under AI which includes, among others, Genetic Programming, Genetic Algorithms, A-life, Neural Networks, and so on. Books, websites and whole development environments are devoted to these subjects, prompting us to back up a step and begin collecting information at a rapid pace.

And here we are—still collecting. I have decided to completely overhaul the old website and devote it to The Project, and video gaming in general. Why dailyvillain.com? It started in 2004 as an inside joke, turned nickname, turned URL. It even stood as an acronym on the website for a while—something like Vagabond Interpreting Life’s Lessons and Idiosyncratic Nuances—but that was way too long. Anyway, the web space was already there and now it exists and persists as The Project’s home.

So where is The Project headed? Our original goals still stand. Early versions of Alphabet Soup and various programming languages are the focus of most of our attention for the moment, but concerning the future, anything is fair game at this point. The nice thing is that the interwebs allow for unprecedented sharing of information, letting us build on what others have already created. The beauty of Open Source is that it is more than just convenient; it allows for the acceleration of information sharing and, of course, innovation. That’s what The Project is all about.

For a tad more organization… This blog will help to track our thoughts and progress, while the gaming blog remains the home for the aforementioned gaming journals, other game reviews and pretty much anything game-related.


(Originally published January 17, 2010)

As we attempt to become the premier resource for everything relating to computational intelligence, The Project website continues to expand. The Grand Link Library continues to be updated, and now we have a growing Computational Intelligence Hierarchy to help organize all facets of The Project into something legible. Thanks to Pariahpism for putting together the hierarchy; now I am integrating it into the website and soon will be tying it together with the Grand Link Library. These will simply form the backbone of our work for The Project. The ultimate goals will be realized later, through the details.

Here’s V 1.0:


[Hierarchy diagram, v1.0. Node labels: CS, AI, Old/GOFAI, Sub/BU, Alife, CI, EC, Fuz, NN, AE, EA, GA, GP, EP, ES, GEP, MEP, SI, Secret Passage]


(Originally published January 10, 2010)

Time to get some thoughts down on digital paper concerning the actual code for Alphabet Soup. For simplicity’s sake, here are some more specific ideas:

To stay close to the code in MSDN’s ant program, we can start with a similar “ant” grid, allowing only up/down and left/right movements. This is not our “floating letters” environment yet, just think of it as a first step toward that.

We also have to redefine what the classes are and where they belong. The “ant” class becomes our “alphabet” class. “Food” in our case should be redefined to individual letter “species.” Let’s start with Alpha, Beta and Charlie to keep it simple. So, while an ant in MSDN’s program will encounter food along its path, any alphabet character will be encountering alphabet subclasses (formerly food) along its path. What the code decides to do determines whether the encountered character is devoured (or eaten, as in the original program), morphed, defended against or something else. So, the individual letters must obey the general rules of the alphabet class, but what they do individually depends on the evolutionary code that is run every time a collision occurs–and what happens should depend upon which letters are involved. I call this the ContactReaction() function.

Here is a general framework:

The Alphabet class stays with the MSDN example, with adjustments:
public void Init(Cell[,] grid, int stepLimit, int survivalGoal)

The Move() function includes basic movements and the results of contact between letters, calling on the evolution code:
public void Move()
{
    if (NotFinished())
    {
        GetFacingCellCoord(out xLocation, out yLocation);
        mTrail.Add(new TrailSpace(xLocation, yLocation));
        mTotalSteps++;
        if (mGrid[xLocation, yLocation].HasAlpha())
        {
            // Calls the function for what happens when the character
            // encounters Alpha. This is where the evolutionary choice
            // takes place--well, part of it (see below).
            ContactReaction(Alpha);
        }
        if (mGrid[xLocation, yLocation].HasBeta())
        {
            ContactReaction(Beta);
        }
        if (mGrid[xLocation, yLocation].HasCharlie())
        {
            ContactReaction(Charlie);
        }
    }
}

Here we will place the evolution stuff, to be included within the ContactReaction() function:
{
    // Define the basic function here with its evolutionary aspect,
    // including specifics for each letter (see below)

    if (result == Devour)
    {
        // As if this letter eats the other character
        mGrid[xLocation, yLocation].SetEmpty();
    }
    else
    {
        // (OTHER OPTIONS/TRANSFORMATIONS/EVOLUTION) -- run Generator code
    }
}

The ContactReaction() function will involve a lot of tweaking in combination with MSDN’s Generator class. It will call on the Generator class and expression tree constructors to perform the genetic crossover algorithms after every collision. The code for these can be found at http://msdn.microsoft.com/en-us/magazine/cc163934.aspx#S5

An important question…

Generator and Expression Tree from MSDN look great for our purposes, but should we include separate generator/expression tree specs for each of the different letters? More specifically, how will the letter’s shape have an effect on what happens in a collision? Should we define this aspect of it within the generator/expression tree? I think the best solution is to define those specs in our ContactReaction() function, meaning that function would have to be tailored to each individual character–but that way the generator/expression tree class can remain one and just be referenced by the ContactReaction() function–as I have shown above. So the evolutionary choice will still take place in the MSDN-defined classes, but each of the collisions will involve a “lean” toward reaction based on letter shape, as we define in ContactReaction(). This may or may not be the best solution, but it is a start.
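Tailoring ContactReaction() per character while keeping one shared generator could be as simple as a lookup table of shape-based “leans” consulted alongside the evolved choice. A sketch of the idea, in Python for brevity; all names and weights here are hypothetical:

```python
# Hypothetical per-pair bias toward each reaction; the evolved
# expression tree still proposes a choice, and the shape-derived
# lean can override a weakly supported one.
REACTION_BIAS = {
    ("b", "A"): {"Devour": 0.7, "Avoid": 0.3},
    ("A", "b"): {"Avoid": 0.6, "Devour": 0.4},
}

def contact_reaction(mover, other, evolved_choice):
    """Combine the shared evolved choice with a per-pair lean."""
    bias = REACTION_BIAS.get((mover, other))
    if bias is None:
        return evolved_choice  # no lean defined for this pair
    best = max(bias, key=bias.get)
    # Override only when the lean away from the evolved choice is strong:
    return best if bias.get(evolved_choice, 0.0) < 0.5 else evolved_choice

print(contact_reaction("b", "A", "Avoid"))  # lean pushes toward Devour
```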


The Master Link List

(Originally published December 30, 2009)

Today’s project: categorizing, alphabetizing and updating the master link list for The Project. Based on the links we have collected so far, the best categories I could divide everything into included:
Blogs
Books Online
Games / Examples / Source Code
Potentially useful link dumps
Tutorials / Articles

Check it out.


(Originally published December 29, 2009)

The Project still requires an immense amount of reading and research before it can produce much fruit, but there is nothing to keep us from starting with the basics (except of course the risk of extinction like everyone who has ventured before us).

We obviously cannot do this completely from scratch, so to find out where to begin, we look to simple GP and GA examples. Pariahpism discovered MSDN’s example program using ants to find the most efficient way to follow a trail of food in a 32×32 square matrix. The program uses C# in the .NET framework, along with Microsoft’s System.CodeDom Namespace.

Full source code for the example program is available here.

Now, to apply the principles of the Ant/Food program to Alphabet soup, we begin by breaking the program into the comparable classes. I consider the User Interface a separate aspect of the program to be built later, but additionally there are two necessary sets of high level classes.

The first set is the problem set, which includes the customized goals and limitations. Basically, these are the details specific to our program (letter class, number class, case class, goals, movements, starting population, and so on). While the article’s program describes an ant following a path of food, Alphabet Soup will involve alphabet characters reacting to collisions with one another. A secondary but important aspect of this set of classes is a private method (or methods) included to record what happens within the program–the generational “history” which is saved to an array so the array instances can be compared for fitness level. This allows the program to use natural selection for determining “who” survives to the next generation–and, in the case of Alphabet Soup–what the future holds for those letters which collide, based in part on the fitness of past results. For the early versions of our program, the problem set and UI will take up the bulk of our customized coding.

The GP framework, on the other hand, already exists more or less. It does still have to be customized as far as the variable names are concerned, but the basic set of code instructions will be very similar to what the article defines–especially if we create our first program in C#. In other words, the GP framework is a standardized code including the necessary algorithms for mutations and inheritance.

At the core of the GP framework, the Ant/Food program uses the Generator class, which the article refers to as the “kernel” of the GP implementation. It does, after all, perform: “selection, breeding, code generation, execution, and result sorting.”

Something I’ve been seeing over and over again in GP is Koza’s “tree” full of terminals and functions. For the ant program, functions include “FacingFood” and terminals include “Left” and “Right.” For Alphabet Soup, let’s use the examples of function: “CollidingLetter” with terminals “Avoid” and “Attack.” Random expressions are created by the algorithm by combining available functions and terminals from the tree. For instance, “if CollidingLetter, then Attack else Avoid.” These random combinations are weighed against a fitness measure (defined elsewhere in the program), which helps determine the outcome of the future generations. In following traditional GP, the selected alphabet characters (a certain percentage) are copied for crossover and reproduction while the rest are discarded–depending on how exactly we choose to do it. Again, this code should be easily modified from MSDN’s version to what we need for Alphabet Soup.
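A bare-bones version of these Koza-style random expressions, using the example function and terminals from the text, might look like this (the tuple representation of the tree is a simplification for illustration):

```python
import random

TERMINALS = ["Avoid", "Attack"]

def random_expression(rng=random):
    """Build a tiny 'if CollidingLetter then X else Y' expression
    by combining the function with two randomly chosen terminals."""
    return ("CollidingLetter", rng.choice(TERMINALS), rng.choice(TERMINALS))

def evaluate(expr, colliding):
    """Run the expression against the current world state."""
    _, if_true, if_false = expr
    return if_true if colliding else if_false

expr = ("CollidingLetter", "Attack", "Avoid")
print(evaluate(expr, colliding=True))   # Attack
print(evaluate(expr, colliding=False))  # Avoid
```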

The article also addresses the role of mutation in the ant program. Mutation will be crucial to how Alphabet Soup plays out:

“[Mutation] involves making a random change in a small number of offspring in order to maintain diversity in the population. Mutation is essential because otherwise the population becomes completely dominated by a few good bloodlines, which makes it difficult to continue progress beyond a certain point. [The Ant/Food program] performs a simple mutation step during the crossover operation to introduce randomness. When a terminal node is copied, a random terminal is substituted two percent of the time.”
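The terminal-substitution step the quote describes–when copying a terminal node, swap in a random terminal two percent of the time–might look like:

```python
import random

TERMINALS = ["Left", "Right"]  # terminals from the ant example

def copy_terminal(terminal, rate=0.02, rng=random):
    """Copy a terminal node, substituting a random terminal
    `rate` of the time to keep the population diverse."""
    if rng.random() < rate:
        return rng.choice(TERMINALS)
    return terminal

random.seed(7)  # reproducible runs while debugging
copies = [copy_terminal("Left") for _ in range(1000)]
print(copies.count("Left"))  # most copies survive unchanged
```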

So these are the basics of the basics to get us started. Another note to keep in mind is the usefulness of XML. The Ant program copies the ant path to an XML file which can later be reloaded. This may not have much use for the application itself, but it can be interesting for reviewing past generations and possible patterns/flaws/suggestions in the code.

Also, check out another simple example in C.


Getting Organized

(Originally published December 22, 2009)

Introducing: SniderWebLLC.com!

Here is where the SniderWeb project begins: platform for web hosting as well as the legitimization of The Project.

Meanwhile, prepare for the dailyvillain.com revamp, where The Project continues to take form. And here is our master index of useful information, to evolve with The Project.


Game Untitled

(Originally published December 10, 2009)

So my latest idea for a game came to me in the shower, like all good ideas. I was trying to be less similar to the last few ideas, to do something in the same vein but exactly contrary. Mostly our idea is to use artificial evolution against the player and watch him struggle and die in varying degrees of difficulty. So my thought led me to attempt to think of a way for the player to be the one evolving.
I’ve seen plenty of games that fake evolution. You can buy Evolution Upgrades that either improve your carapace or sharpen your pincers with evolution points. I want a game where the engine for evolution is your play-style.

Let’s say there are 10,000 in your herd of The DeathDinkles. They all do what you do in 10,000 similar situations. The only variable is individual strengths and weaknesses. Your avatar only has the strengths; the best of everyone.

You see a bear. Do you fight it, jump over it, climb a tree, outrun it, hide from it, etc.? What you do determines which of your species lives or dies. Half of them have the strength to fight the bear and survive, and only half of those have the fortitude to stave off infection from the wounds inflicted. You could quarter your herd in one simple choice, but the resulting herd would carry the traits needed to survive similar situations. The offspring of this herd would have children with a higher likelihood of producing mutations of increased strength, stamina, fortitude and disease tolerance. Pretty soon bears become your easy prey and staple food, as was naturally selected (by you). The bears undergo the same evolutionary changes and maybe become faster and stronger, or maybe become lithe, agile and harder to see. Or maybe they fail as a species and go extinct.

So your herd of tough brutes is having increasing trouble with the new agile b’ougars (bear-cougars) and has started to lose population. There are some mountain goats in the west mountain range that seem like easy prey. The high jumps of the mountains split your herd between those that have good balance, equilibrium and jumping strength and those that don’t. The split doesn’t kill the other half; it simply divides them into another species variant, The PrairieDinkles, while you become The MountainDinkles. The PrairieDinkles will continue to survive unassisted with an emulation of your behavior prior to the split. They may someday target you as prey while being completely unrecognizable as having a common ancestor.

The mountain goats’ high rate of reproduction makes them a good sustainable resource. The cold mountain winds and winters weed out those with thin pelts and make your herd hardier in cold weather. Layers of fat to protect against the cold have made your herd a bit slower and not as agile. You, the player, continue to force your herd to areas of increasing agility and cold; usually contradictory elements. Your herd thins dramatically and is split many times. The few that can survive your demanding play-style become smaller and wiry, almost rodent-like. No longer can you feed on goats unless they are skeletonized via an en masse assault where many are crushed or hurled off into the abyss. It’s a grim choice, but a better one than the losses expected in changing the diet to the local poisonous fauna (perhaps 100%). The massive regular losses from both the climate and behavior make a high birth rate your only saving grace. Birthing twins leads to triplets, which leads to litters of pups in higher and higher numbers.

I could just go on forever writing this. I think the same could be said for playing it. I can see some cool 2D models for this. There would be different segments for each moving part that would be randomized and passed down each generation. Say you grew one arm larger than the other, giving it greater mass, strength and therefore hitting power. And it became so favorable that it was almost exaggerated into one giant club arm with another small arm on the side for delicate work. Maybe back-jointed legs? Longer necks? Nothing would be default except perhaps the parameters. Things would be designed to be malleable through chaos and natural selection.

You can’t say, “I want a tiger with a unicorn horn,” and expect it to happen. You’ve got no choice in the way your herd looks, just how it acts.

There could be a farmer option where favorable traits and looks are grown purposefully with genetic engineering. But that would be an easy little side add-on to the main quest part of the game.

It’d be cool if it was a persistent world. Say you got tired of the RodentDinkles and started a new herd. You’d be going along, learn of the PrairieDinkles, and make them the new prey of your venomous snakes or whatever.


(Originally published December 8, 2009)

Finding meaning:
Chapter VI of “Godel, Escher, Bach” begins to delve into trying to pin down the location of meaning. It discusses the recursive enumerable set: something being defined in terms of simpler versions of itself (such as Fibonacci numbers, or cells in a body).

Small, simple parts make up complexity, out of which meaning is drawn. Levels of complication increase until the recursive system might be “strong enough to break out of any predetermined patterns… Instead of just considering programs composed of procedures which can recursively call themselves, why not get really sophisticated, and invent programs which can modify themselves—programs which can act on programs, extending them, improving them, generalizing them, fixing them, and so on? This kind of ‘tangled recursion’ lies at the heart of intelligence.” (p. 152).

“…meaning is part of an object to the extent that it acts upon intelligence in a predictable way” (p. 165) – In context, deciphering may take more or less effort, but the eventual result is in essence inevitable: the intended interpretation remains the same no matter what. We are just “drawing out” the information that is inherent in the message. (Consider elemental DNA being chemically drawn out to establish the “who” of the overall person.) It is like the principle that “information is in the air,” waiting to be grasped, as it inevitably will be when the right condition, place, time and intelligence exist to grasp it. The Project should be able to grasp such ideas using the power of evolutionary computation and memristors beyond human brain capability. So, a little more about grasping meaning:

According to GEB, 3 mechanisms are needed to draw meaning from any message (p. 167):

Message: 1+1=2
(just the “meaningless” figures themselves) / GEB’s FRAME MESSAGE

Core message: 1+1=2
(meaning: start with one, add one, end up with two) / GEB’s INNER MESSAGE

Trigger: the recognition that 1+1=2 is a statement to be interpreted through the use of mathematics / GEB’s OUTER MESSAGE

The trigger is the interesting part. The message can be anything. So the question for the trigger becomes, “How do I interpret the message? What method do I use to accomplish this?” This can require a lot of information, and it acts as the “schematic,” the barebones algorithm used to draw meaning from the meaningless. Where does it come from, how is it determined?

The way this triggering occurs may be a universal dialect (algorithm?) which can be applied to biology, technology, and so on. From GEB: “We could ascribe meanings… of a message to the message itself, because of the fact that decipher mechanisms are themselves universal—that is, they are fundamental forms of nature which arise in the same way in diverse contexts” (p. 171).

But when we really try to pin it down, no meaning is absolute; at its core, meaning is accepted by what makes the most sense and there can be no absolute certainty. “…one can never give an ultimate, absolute proof that a proof in some system is correct. Of course, one can give a proof of a proof, or a proof of a proof of a proof—but the validity of the outermost system always remains an unproven assumption, accepted on faith.” (pp. 192-193).

We as humans recognize these patterns subconsciously; our intelligence rises above the inherent contradictions and endless downward spiral of circular reasoning. Machines need to be trained to do the same in order to draw meaning effectively from patterns in data. Computers already have more raw processing capability than our brains; we simply need to learn how to mold those capabilities.

Methods:
Here are some methods for extracting data in a meaningful way, mostly drawing from what we know about the human brain, as consolidated in Kurzweil’s “AI toolkit” (found here):

Expert systems: Using human-like decision making with not only explicit logic, but also the underlying logic which our biological brains take for granted (something that can be spelled out in the Propositional Calculus [GEB, chapter VII]). Basically, this determines things based on small bits of logic combined into large decision-making systems.
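
A minimal sketch of the idea, using invented animal-identification rules: each rule is a set of premises and a conclusion, and firing rules adds new facts until the system settles (forward chaining).

```python
# Hypothetical rules: (premises, conclusion). Small bits of logic that
# chain together into a larger decision-making system.
rules = [
    ({"has_fur", "gives_milk"}, "mammal"),
    ({"mammal", "eats_meat"}, "carnivore"),
    ({"carnivore", "has_stripes"}, "tiger"),
]

def infer(facts):
    """Forward-chain: fire every applicable rule until no new facts appear."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)   # a fired rule contributes a new fact
                changed = True
    return facts

print(infer({"has_fur", "gives_milk", "eats_meat", "has_stripes"}))
```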

Bayesian Nets: A self-training system, learning from past experience. “Many expert systems based on Bayesian techniques gather data from experience in an ongoing fashion, thereby continually learning and improving their decision making.” (e.g., spam filters).
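
A toy sketch of the Bayesian idea behind such spam filters (the tiny training corpora are invented): count words in labeled messages, then pick the class that makes a new message most probable, a naive Bayes classifier rather than a full Bayesian net.

```python
from collections import Counter
import math

# Invented labeled corpora for the sketch.
spam_docs = ["buy cheap pills now", "cheap pills buy buy"]
ham_docs  = ["project meeting notes", "notes for the team meeting"]

def train(docs):
    counts = Counter()
    for doc in docs:
        counts.update(doc.split())
    return counts

spam, ham = train(spam_docs), train(ham_docs)
vocab = set(spam) | set(ham)

def log_prob(msg, counts):
    """Log P(words | class) with Laplace smoothing so unseen words don't zero out."""
    total = sum(counts.values())
    return sum(math.log((counts[w] + 1) / (total + len(vocab)))
               for w in msg.split())

def classify(msg):
    return "spam" if log_prob(msg, spam) > log_prob(msg, ham) else "ham"

print(classify("buy cheap pills"))
```

Feeding each newly labeled message back through `train` is the “gathering data from experience in an ongoing fashion” part.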

Markov Models: Useful for application in speech recognition, including word patterns and grammar rules discovered through machines “listening to” and analyzing human speech.
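
The hidden Markov models used in real speech recognition are far more elaborate, but the core word-pattern idea can be sketched as a first-order (bigram) model trained on an invented miniature corpus:

```python
from collections import defaultdict, Counter

# Invented corpus; each adjacent pair trains the transition table.
corpus = "the cat sat on the mat the cat ran".split()
model = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    model[prev][nxt] += 1   # count how often `nxt` follows `prev`

def next_word(word):
    """Return the most likely successor observed after `word`."""
    return model[word].most_common(1)[0][0]

print(next_word("the"))   # "cat": seen twice after "the", versus "mat" once
```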

Neural Nets: Many dumb parts (neurons) organized in layers and patterned configurations, meant to replicate the neurons in a brain. Their initial ignorance is improved through “coaching” (experience) from one or more sources, until eventually the neural net makes its own choices based on experience. “A powerful, well-taught neural net can emulate a wide range of human pattern-recognition faculties. Systems using multilayer neural nets have shown impressive results in a wide variety of pattern-recognition tasks, including recognizing handwriting, human faces, fraud in commercial transactions such as credit-card charges, and many others. In my own experience in using neural nets in such contexts, the most challenging engineering task is not coding the nets but in providing automated lessons for them to learn their subject matter.”
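
A minimal sketch of the “coaching” idea, a single artificial neuron (perceptron) rather than a multilayer net: after each wrong answer, the error signal nudges the weights until the examples (here, logical AND) are learned.

```python
def train_perceptron(samples, epochs=20, lr=0.1):
    """Coach a single neuron: adjust weights by the error on each example."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - out          # the coaching signal
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b    += lr * err
    return w, b

# Lessons: the truth table of logical AND.
AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(AND)
predict = lambda x1, x2: 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
print([predict(x1, x2) for (x1, x2), _ in AND])   # [0, 0, 0, 1]
```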

Genetic Algorithms: Given a goal, the GA makes use of mutations and sexual reproduction (inheriting parts of two chosen parents) to feel out the best paths towards this goal while discarding bad paths along the way. Basically, natural selection in code. “Like neural nets GAs are a way to harness the subtle but profound patterns that exist in chaotic data. A key requirement for their success is a valid way of evaluating each possible solution. This evaluation needs to be fast because it must take account of many thousands of possible solutions for each generation of simulated evolution.”
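
As a toy sketch of the technique, using the standard “OneMax” exercise of evolving a 16-bit string toward all ones rather than Alphabet Soup itself; note the fitness function is deliberately cheap, per the quoted requirement:

```python
import random

def evolve(length=16, pop_size=30, generations=60):
    """Toy GA: selection, single-point crossover, and occasional mutation."""
    fitness = lambda ind: sum(ind)   # cheap evaluation: count the ones
    pop = [[random.randint(0, 1) for _ in range(length)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]            # selection: keep the fittest half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)     # sexual reproduction: two parents
            cut = random.randrange(1, length)
            child = a[:cut] + b[cut:]            # single-point crossover
            if random.random() < 0.1:            # occasional mutation
                i = random.randrange(length)
                child[i] ^= 1
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

random.seed(42)
best = evolve()
print(sum(best), "of 16 bits set")
```

For Alphabet Soup the individuals would be expression trees instead of bit strings, but the select/crossover/mutate loop is the same shape.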

Recursive Search: This involves analyzing a tree of every possible outcome, and narrowing the search by eliminating bad or unlikely branches to more quickly find adequate solutions. This recursive algorithm was used by Deep Blue and Deep Fritz for analyzing chess positions. “Key to the success of the recursive formula is pruning this huge tree of possibilities and ultimately stopping its growth.”
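
Deep Blue’s evaluation is of course far more elaborate; this sketch shows only the pruning idea (alpha-beta cutoffs) on a hand-made tree of nested lists, where numbers stand in for position evaluations:

```python
def alphabeta(node, alpha=-float("inf"), beta=float("inf"), maximizing=True):
    """Recursively search a game tree, skipping branches that cannot matter."""
    if not isinstance(node, list):   # leaf: a static evaluation score
        return node
    if maximizing:
        value = -float("inf")
        for child in node:
            value = max(value, alphabeta(child, alpha, beta, False))
            alpha = max(alpha, value)
            if alpha >= beta:
                break                # prune: the opponent will never allow this line
        return value
    value = float("inf")
    for child in node:
        value = min(value, alphabeta(child, alpha, beta, True))
        beta = min(beta, value)
        if alpha >= beta:
            break                    # prune from the minimizing side
    return value

# Invented tree: nested lists are positions, numbers are evaluations.
tree = [[3, 5], [6, [9, 2]], [1, 2]]
print(alphabeta(tree))   # 6
```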

So what is the best AI methodology? Use a combination of all of the above, depending on the application. Generally, the more varied the approach, the more robust the processing and analyzing power, along with better potential for evolutionary growth. And this still doesn’t touch on the use of memristors (et cetera). There are more tools to come as the workings of the brain become more understood.
