
Thrive is a game about the universe - we want to let you play however you want, make whatever you want, and live however you like. Thrive will be a free, open-source game created by an online community of dedicated volunteers. Our team seeks to accomplish two major goals: create engaging, compelling gameplay that respects our players' intelligence, and remain as accurate as possible in our depiction of known scientific theory.

Programming News

Our lead programmer has revealed some of his genius, and would like to bring you some information about how our creature engine will work.


After a long absence, Bahinerox, our favorite ModDB find, has come back with a whole load of information on our creature engine. For those of you who are interested in the technical aspects of our game, this is definitely a good read for you.

Someone wrote: In the meantime, I've been working on the two most important parts of the game. The first is how to visualise creatures in the creature engine. I've tried out many different approaches, and have finally settled on what I hope will be as close as possible to realistic-looking organic constructs.

First, let's look at Spore. People may go "omg it's such a crappy system blah blah blah", but in reality, what they did is actually quite complex and impressive. They allowed anyone with absolutely no talent whatsoever to create their very own creature with minimal effort.

And to do that took a lot of work.

They used the very common combination of metaballs and marching cubes in a smart way. This part is important, as it forms the basis for our visualisation engine:

(The following section of this post serves two purposes.)

(A) -> To show just how complicated it is to program stuff like this. This is getting into reasonably advanced maths, just to draw stuff to the screen. We're talking trig, matrix maths, calculus, rotation using quaternions; it's all pretty insane.
Programming is easy. Programming something useful is difficult. Just because I know a programming language doesn't mean I always know what to say to the computer. I talk code, not calculus :/

(B) -> To show what I'm currently actually doing, for the curious.

A metaball is a function whereby

sum(i = 0 to n) { metaball_i(x, y, z) } <= threshold

One of the most common such functions is:

f(x,y,z) = 1 / ((x - x0)^2 + (y - y0)^2 + (z - z0)^2)

where (x0,y0,z0) is the centre of the metaball

Which, for those who aren't mathematically inclined, is a "fancy" way of saying that the density is highest at the centre of the metaball and falls off with the inverse square of the distance. An alternative falloff is:

f(r) = (1 − r^2)^2

where r is the distance to the metaball's centre (x0, y0, z0)
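As a quick sketch of the two falloffs above (a hypothetical Python rendering, not Thrive's actual code):

```python
def metaball_density(x, y, z, cx=0.0, cy=0.0, cz=0.0):
    # Inverse-square falloff: 1 / ((x - x0)^2 + (y - y0)^2 + (z - z0)^2).
    d2 = (x - cx)**2 + (y - cy)**2 + (z - cz)**2
    return float("inf") if d2 == 0 else 1.0 / d2

def soft_falloff(r):
    # Polynomial falloff f(r) = (1 - r^2)^2: 1 at the centre, 0 at r >= 1.
    return (1 - r*r)**2 if r < 1 else 0.0
```

Clamping the polynomial version to zero beyond r = 1 keeps each metaball's influence local, which matters once you start summing many of them.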

Anyway, the "<= threshold" part is the boundary condition. In order to actually render the metaball, we need to take this boundary condition and turn it into polygons.

In order to do that, we use marching cubes.

Marching cubes samples the metaball field at various points on a cubic grid in 3D space.

This is where things get a little more complicated.

The grid is divided into cubes (which share edges), and each of the cube's 8 corners is sampled and compared against the threshold value (in the case of metaballs, this is usually 0.5),
which means that each corner is clamped to a boolean: either 1 (the value was higher than the threshold) or 0 (the value was lower than the threshold).

This creates a total of 2^8 = 256 combinations of values.

To make this faster, all of these combinations are stored as a giant lookup table in memory.

Each combination corresponds to a polygon configuration, which is rendered into that cubic space, and then the next cube in the grid is computed. This generates the mesh of the metaball.
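Sketched in Python (assuming the usual 0.5 threshold mentioned above; the 256-entry polygon lookup table itself is omitted), the per-cube classification step looks roughly like this:

```python
def cube_case_index(corner_values, threshold=0.5):
    # Clamp each of the 8 corner samples to a bit (1 if above the
    # threshold, 0 otherwise) and pack them into an index 0..255,
    # which would key into the 256-entry polygon lookup table.
    index = 0
    for bit, value in enumerate(corner_values):
        if value > threshold:
            index |= 1 << bit
    return index
```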

If we want to put two metaballs on the screen, we add the two metaball functions at each point in space. This is where it gets interesting. A metaball by itself is essentially a sphere; however, when two metaballs are in close proximity, the fact that they are represented by density fields that decrease with distance means that at a point between them, where each metaball's density alone is below the threshold, the two combined may be above the threshold.

This gives a melded, organic appearance which many people will recognise as "looking like mercury".
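The blending effect can be seen with a one-dimensional toy version of the summed field (the names and the 0.5 threshold here are illustrative):

```python
def summed_field(x, centres):
    # Add the (1 - r^2)^2 falloff of every metaball at sample point x.
    total = 0.0
    for c in centres:
        r = abs(x - c)
        total += (1 - r*r)**2 if r < 1 else 0.0
    return total

# At x = 0.6, a lone metaball at 0.0 is below a 0.5 threshold,
# but two metaballs at 0.0 and 1.2 together push the sum above it:
# the "melding" between them.
```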

Where is this all going with Spore?

Spore had a metaball for each point on the creature's spine.

Scrolling the mouse wheel on a point of the spine changed that metaball's initial density.

(Instead of starting at 1, each metaball has a variable initial density. So instead of:

f(r) = (1 − r^2)^2

We have

f(r) = (d − r^2)^2

where d is the initial density.)

Which, of course, changed the size of the creature's body.
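A sketch of the variable-density falloff above (the clamp to zero once r^2 exceeds d is my addition, to keep the field local):

```python
def falloff(r, d=1.0):
    # f(r) = (d - r^2)^2, with d the initial density; a larger d
    # means a denser, and therefore bigger, metaball.
    return (d - r*r)**2 if r*r < d else 0.0
```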

So, what about Thrive?

Thrive's Solution: PARMFIELDOBJ: the new metaball. (Or rather, the extended metaball.)

It stands for PARaMetrised density FIELD OBJect,

or Parametrised Density Field Object.

I spent a lot of time racking my brain on how to further improve the hulling of the popular metaball method. Spline conversion to polygonised surfaces was great until you get to hulling said surfaces with a "skin".

The answer came to me when converting the rendering engine from OpenGL to DirectX:

Lighting. Lighting is essentially a density field system.

You basically take the intensity of the light at every pixel of a polygon, and scale it by the polygon's normal (the direction that polygon is facing) as well as the camera's angle in order to determine the final pixel intensity.
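For the curious, the standard diffuse (Lambert) term works like this; this is textbook lighting as a sketch, not necessarily the exact shading code in question:

```python
def lambert(normal, light_dir, intensity):
    # Diffuse lighting: scale the light's intensity by the dot product
    # of the surface normal and the direction to the light, clamped so
    # that back-facing surfaces receive zero light.
    dot = sum(n * l for n, l in zip(normal, light_dir))
    return intensity * max(0.0, dot)
```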

Now, lighting systems usually have more than just an omnidirectional point light (which is basically the same as a metaball function; it even uses the inverse-square law to calculate falloff).

The most important type, and the one that sparked the Parametrised Density Field Object idea, is the cylindrical light.

You have a line segment, and you calculate the light intensity by getting the distance to the nearest point on that segment. This nearest point happens to lie at a right angle to the line whenever the sample point sits between the two endpoints; if the sample point lies beyond either end, you take the distance to the nearest endpoint of the segment instead.

This creates a sort of capsule-shaped density field. A bone, as it were.
It can also be extended to an arc instead of a line, as detailed below.
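The nearest-point-on-a-segment calculation described above can be sketched as follows (a minimal version, using my own names):

```python
def segment_distance(p, a, b):
    # Distance from sample point p to the nearest point on segment a-b.
    # The nearest point is the perpendicular projection when p lies
    # between the endpoints, otherwise the nearest endpoint.
    ax, ay, az = a; bx, by, bz = b; px, py, pz = p
    abx, aby, abz = bx - ax, by - ay, bz - az
    ab2 = abx*abx + aby*aby + abz*abz
    t = 0.0
    if ab2 > 0:
        t = ((px-ax)*abx + (py-ay)*aby + (pz-az)*abz) / ab2
        t = max(0.0, min(1.0, t))  # clamp the projection to the segment
    cx, cy, cz = ax + t*abx, ay + t*aby, az + t*abz
    return ((px-cx)**2 + (py-cy)**2 + (pz-cz)**2) ** 0.5
```

Feeding this distance into a falloff function yields the capsule-shaped "bone" field.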

And then you have a horn:

A line segment or arc (a circular arc, that is: an arc that, if extended, would form a perfect circle; complex spiral shapes would be built up from many arc primitives) that has a different density value at each end. The density is interpolated (linearly or otherwise) along the line or arc, to create a "bone" with a varying shape.

Calculating the closest point on an arc is quite simple: you find the centrepoint of the circle representing the arc (which you already have if you are creating the arc programmatically) and get the intersection point by drawing a line from the circle's centrepoint through the sample point. Mathematically this takes some creativity, involving dot and cross products to get a vector representing that line. Of course, if the angle falls outside the arc's endpoints, you take the distance to the nearest endpoint instead.
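A 2D sketch of the arc case (using atan2 rather than the dot/cross construction described above, and ignoring angle wrap-around; it assumes the arc's angular range sits inside atan2's (-pi, pi] output range):

```python
import math

def arc_distance(p, centre, radius, a0, a1):
    # Distance from 2D point p to a circular arc spanning angles a0..a1.
    dx, dy = p[0] - centre[0], p[1] - centre[1]
    ang = math.atan2(dy, dx)
    if a0 <= ang <= a1:
        # The nearest point lies on the arc, along the centre -> p line.
        return abs(math.hypot(dx, dy) - radius)
    # Otherwise the nearest point is one of the arc's endpoints.
    ends = [(centre[0] + radius*math.cos(a), centre[1] + radius*math.sin(a))
            for a in (a0, a1)]
    return min(math.hypot(p[0]-ex, p[1]-ey) for ex, ey in ends)
```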

Anyway, a third type would be muscle,
whereby the density is set at the centrepoint of an arc and decreases to zero towards the endpoints.
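As a toy parameterisation of that (t running from 0 at one endpoint to 1 at the other; the linear interpolation is my assumption):

```python
def muscle_density(t, peak):
    # Density peaks at the arc's midpoint (t = 0.5) and falls
    # linearly to zero at both endpoints (t = 0 and t = 1).
    return peak * (1.0 - abs(2.0*t - 1.0))
```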

Then you take the density values and run marching cubes. (Actually, I'm using marching tetrahedrons; it's almost the same, but it operates on tetrahedra instead of cubes.)

Now, we have a bunch of parametrised constructs, and we order them in a hierarchy so that we can move them around at the joins (inverse kinematics, om nom nom). This is called rigging.

But how do we move the polygonised hull around the rig?

Yet again, good ol' Spore figured out a way.

I accidentally reverse-engineered this in my head, by the way. Yes, I accidentally reverse engineer game technology in my head.

For every vertex created, you assign metadata to the vertex containing the percentage (clamped to a value between zero and one, one being 100%) of the density that each density-generating object contributed towards bringing that vertex to the threshold. Then, when moving the parts of the rig (the representation of the parametrised objects) around, you multiply the distance each object moved by its contribution (a technique similar to the standard rigging process known as weighting), add up the weighted movement vectors for each object, and you get the new position of the vertex.

This means that if two rig items move in opposite directions, a vertex with a 50% contribution (weighting) from each object will not move at all. Vertices offset slightly from that point will move slightly, with the movement increasing for vertices that originate further from the shared density space.
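The weighting scheme just described, as a minimal sketch (my names, not the engine's):

```python
def skin_vertex(vertex, weights, movements):
    # New position = old position + sum over rig objects of
    # (object's movement vector * its density contribution weight).
    x, y, z = vertex
    for w, (dx, dy, dz) in zip(weights, movements):
        x += w * dx
        y += w * dy
        z += w * dz
    return (x, y, z)
```

With 50/50 weights and opposite movements, the vertex stays put, exactly as described above.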

Currently, I am working on constructing the parametrised objects. The mathematics gets quite complicated, so it'll take a while.

Comments
Callinstead09

GREAT NEWS! I know there is still a long way to go, but this is awesome work so far!!

DELTΔ

Win my friend, just win :)

timstro59

I couldn't understand half of what you said.

but good job.

bikkebakke

... I'll just redirect my thoughts of being a game designer into being a web designer or something... still great to see some updates :D

madcat1030

You had me at quaternion.

sciocont (Author)

Thanks for the responses!

explorer13

I wish this was updated more, but either way great post!
