====== 3D Game Development ======
----
3D Space
- x, y, and z
- Right-hand system
- Open your right hand
- Stick out the thumb
- Point your thumb along one of the axes in a positive direction
- Curl your fingers around the axis
- A positive rotation around that axis will be in the same direction that your fingers curl
- Left-hand system (the same procedure with the left hand; with +x right and +y up, +z then points into the screen)
3D Primitives
- Points
(x, y, z)
- Lines
- Line strip (connected)
- Line loop
- Triangles
- Triangle strip
- Triangle fan
- Quads (Quadrilaterals)
- Quad strip (series of connected quadrilaterals)
- Polygon (a face with an arbitrary number of vertices)
Transformations
- Translation - moving a point from one coordinate in 3D space to another
- Scaling
- Rotation
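- A minimal sketch (assuming LWJGL 2's fixed-function GL11 API; the numeric values are arbitrary examples) of how these transformations map onto OpenGL calls:
<code java>
import org.lwjgl.opengl.GL11;

public class TransformSketch {
    // Applies translation, rotation, and scaling to the current modelview
    // matrix before an object is drawn. The values are arbitrary examples.
    public static void placeObject() {
        GL11.glPushMatrix();                      // save the current matrix
        GL11.glTranslatef(1.0f, 0.0f, -5.0f);     // translation: move the object in 3D space
        GL11.glRotatef(45.0f, 0.0f, 1.0f, 0.0f);  // rotation: 45 degrees about the y-axis
        GL11.glScalef(2.0f, 2.0f, 2.0f);          // scaling: double the object's size
        // ... draw the object here ...
        GL11.glPopMatrix();                       // restore the matrix for the next object
    }
}
</code>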
Projections
- Display part of the 3D world onto a 2D display
- "Synthetic camera"
- Projection plane - the camera lens of the 3D world
- Parallel projection:
- Often used by CAD engineers
- Orthogonal projections: top, front, and side views
- Not very realistic; depth information is lost
- Perspective projection:
- Object's size in the view depends on the object's distance from the viewer
- Need to define the location of the center of projection
- Focus distance - the distance between the viewer's eye and the projection plane
- More realistic (what we are used to)
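- A minimal sketch contrasting the two projections, assuming LWJGL 2 (where GLU lives in org.lwjgl.util.glu); the clipping-volume values are arbitrary examples:
<code java>
import org.lwjgl.opengl.GL11;
import org.lwjgl.util.glu.GLU;   // LWJGL 2's OpenGL Utility Library binding

public class ProjectionSketch {
    // Parallel (orthographic) projection: no foreshortening, depth information is lost
    public static void setParallel() {
        GL11.glMatrixMode(GL11.GL_PROJECTION);
        GL11.glLoadIdentity();
        GL11.glOrtho(-10.0, 10.0, -10.0, 10.0, 1.0, 100.0); // left, right, bottom, top, near, far
        GL11.glMatrixMode(GL11.GL_MODELVIEW);
    }

    // Perspective projection: an object's size in the view depends on its distance
    public static void setPerspective(float width, float height) {
        GL11.glMatrixMode(GL11.GL_PROJECTION);
        GL11.glLoadIdentity();
        GLU.gluPerspective(50.0f, width / height, 1.0f, 100.0f); // fovy, aspect, zNear, zFar
        GL11.glMatrixMode(GL11.GL_MODELVIEW);
    }
}
</code>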
Lighting and Shading
- The 3D sphere problem
- Diffuse light - Light comes from a particular direction, and is reflected evenly across a surface (see sunlight)
- Ambient light - light that has been scattered around the scene so much that its rays no longer have a single direction; it fills the room evenly
- Specular light - light that is heavily reflected in a particular direction; creates a bright highlight on the surface it hits
- Shading - the amount of light on a surface is computed from the angle at which the light strikes the surface
- Add color to your object
- Define the color of the light
- Set a direction for the light
Ray Tracing
- Realistic rendering of 3D graphics with light interactions
- Simulates the way real light rays work
- Looks nice but can be computationally intensive. Examples:
- Reflection
- Refraction
- Ambient lighting
- Shadows
- Transparency
Texture Mapping
- Texture - Any pattern or drawing, including actual images.
- Texture mapping - Applying a texture onto some surface
- Primarily use 2D textures (e.g., marble, brick, glass, wood)
- Set up your object for textures
- Set the texture coordinates
- Load the texture, and set the appearance of the object
What is Wrong With Java for Game Development (Standard Edition)?
- Deep, and rather confusing, library
- Java Sound API is very low-level and buggy
- Java 2D's coordinate system (origin at the top-left corner, y-axis pointing down) can be tricky to get used to
- No standard binding to the industry 3D standard -- OpenGL
- Java3D is the de facto standard for 3D programming in Java -- and it is deficient
- Why teach 2D using the Java SDK anyway? It is still relevant in the real world, especially for application development. In addition, you can create good games using just the SDK as is, with no additional libraries.
Introducing the Lightweight Java Game Library (LWJGL)
- Designed specifically for game development, fixing many of Java's problems in that area
- Provides access to controllers such as gamepads, steering wheels, and joysticks
- Provides Java bindings to OpenGL (for 2D and 3D graphics) and OpenAL (for 3D sound)
- Relatively small (hence, lightweight), simple, and stable API
- The project's goals:
- Speed
- Simplicity
- Ubiquity
- Small size
- Security
- Robustness
- Minimalism
- Think small, as in Java 2 Mobile Edition (J2ME)
- Problems:
- Does not support AWT or Swing
- Lack of readily available documentation and examples
- Uses a Cartesian coordinate system (the y-axis points up, unlike Java 2D's y-down system)
- http://www.lwjgl.org
- Classes:
org.lwjgl.*
User Interfaces in LWJGL
- Display (very easy to do fullscreen -- without the usual platform-specific glitches)
- Keyboard
- Mouse
- Controllers
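- A minimal sketch of an LWJGL 2-style display and input loop (this assumes LWJGL 2, where the Keyboard and Mouse are created along with the Display):
<code java>
import org.lwjgl.LWJGLException;
import org.lwjgl.input.Keyboard;
import org.lwjgl.input.Mouse;
import org.lwjgl.opengl.Display;
import org.lwjgl.opengl.DisplayMode;

public class DisplaySketch {
    public static void main(String[] args) throws LWJGLException {
        Display.setDisplayMode(new DisplayMode(800, 600));
        Display.setTitle("LWJGL sketch");
        Display.create();                          // open the window (or go fullscreen)

        while (!Display.isCloseRequested()
                && !Keyboard.isKeyDown(Keyboard.KEY_ESCAPE)) {
            int mouseX = Mouse.getX();             // poll the mouse;
            int mouseY = Mouse.getY();             // note: origin is the bottom-left corner

            // ... update game state and render with GL11 calls here ...

            Display.update();                      // swap buffers and process window messages
            Display.sync(60);                      // cap the frame rate at 60 fps
        }
        Display.destroy();
    }
}
</code>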
OpenGL
- Open Graphics Library
- Developed by Silicon Graphics, Inc. (SGI) in 1992
- The industry's most widely used and supported 2D and 3D graphics application programming interface (API); hardware and software independent; vendor neutral
- Provides direct access to the hardware (graphics card) -- designed using a client/server paradigm, allowing the client application and the graphics server controlling the display hardware to exist on the same or separate machines.
- Windowing system independent, and therefore contains no windowing operations or mechanisms for user input.
- Uses a right-handed coordinate system for the viewing transformations (hence, +x to the right, +y up, +z out of the screen)
- OpenGL Utility Library (GLU) - library that is provided with OpenGL - contains several routines that use lower-level OpenGL commands to perform such tasks as setting up matrices for specific viewing orientations and projections, and rendering surfaces.
- Does not provide direct support for complex geometrical shapes, such as cubes or spheres -- must be built up from supported primitives
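- A minimal sketch of building shapes out of the supported primitives (immediate-mode GL11 calls):
<code java>
import org.lwjgl.opengl.GL11;

public class PrimitiveSketch {
    // A single quad built directly from four vertices; complex shapes such as
    // cubes are assembled from faces like this one.
    public static void drawQuad() {
        GL11.glBegin(GL11.GL_QUADS);
        GL11.glColor3f(1.0f, 0.0f, 0.0f);        // a red face
        GL11.glVertex3f(-1.0f, -1.0f, 0.0f);
        GL11.glVertex3f( 1.0f, -1.0f, 0.0f);
        GL11.glVertex3f( 1.0f,  1.0f, 0.0f);
        GL11.glVertex3f(-1.0f,  1.0f, 0.0f);
        GL11.glEnd();
    }

    // A triangle fan approximating a disc -- the same idea scales up to spheres.
    public static void drawDisc(int segments) {
        GL11.glBegin(GL11.GL_TRIANGLE_FAN);
        GL11.glVertex3f(0.0f, 0.0f, 0.0f);       // center of the fan
        for (int i = 0; i <= segments; i++) {
            double angle = 2.0 * Math.PI * i / segments;
            GL11.glVertex3f((float) Math.cos(angle), (float) Math.sin(angle), 0.0f);
        }
        GL11.glEnd();
    }
}
</code>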
OpenGL Functions and Demos
Early Example OpenGL Game
- Battalion: http://evlweb.eecs.uic.edu/aej/AndyBattalion.html
Camera and Viewing
GLU.gluPerspective(fovy, aspect, zNear, zFar)
- fovy - Field of view angle in the y direction (between 0 and 180, exclusive)
- aspect - Aspect ratio (width divided by height), which determines the field of view in the x direction
- zNear - Distance from viewer to near clipping plane
- zFar - Distance from viewer to far clipping plane
GLU.gluLookAt({eye}, {center}, {up; usually (0.0f, 1.0f, 0.0f)})
- The proper calling process (must be followed in order):
GL11.glMatrixMode(GL11.GL_PROJECTION);
GL11.glLoadIdentity();
GLU.gluPerspective(50.0f, 1.0f, 3.0f, 7.0f);
GL11.glMatrixMode(GL11.GL_MODELVIEW);
GL11.glLoadIdentity();
GLU.gluLookAt(0.0f, 0.0f, 5.0f, 0.0f, 0.0f, 0.0f, 0.0f, 1.0f, 0.0f);
- Read http://www.opengl.org/resources/faq/technical/viewing.htm for more information
Lighting
- The following tuple defines the position of a light source; LWJGL expects a direct FloatBuffer, so build it with BufferUtils rather than FloatBuffer.wrap():
FloatBuffer pos = BufferUtils.createFloatBuffer(4).put(new float[] {-0.5f, 1.0f, 1.0f, 0.0f});
pos.flip();
- Light positioning:
GL11.glLight(GL11.GL_LIGHT0, GL11.GL_POSITION, pos);
- Light intensities:
- Use GL11.glLight() with a tuple that defines the intensity for GL_AMBIENT, GL_DIFFUSE, or GL_SPECULAR
- Spotlight:
- Must define a light position
- Need a few specific parameters: the spot cutoff, the spotlight's direction, and the spotlight's focus (all three are necessary)
- The spot cutoff (GL_SPOT_CUTOFF) defines the angle between the edge of the light cone and the cone's axis. Thus, GL11.glLightf(GL11.GL_LIGHT0, GL11.GL_SPOT_CUTOFF, 15.0f) creates a 30-degree (15.0f * 2) light cone.
- By default, the spotlight's direction is (0.0, 0.0, -1.0), down the negative z-axis; change it with GL11.glLight(GL11.GL_LIGHT0, GL11.GL_SPOT_DIRECTION, direction tuple)
- The focus of the spotlight: GL11.glLightf(GL11.GL_LIGHT0, GL11.GL_SPOT_EXPONENT, a floating point value that controls how concentrated the beam is)
- Material:
- Define the reflectance of the material on the object (via a color tuple):
GL11.glMaterial(GL11.GL_FRONT, GL11.{GL_AMBIENT, GL_DIFFUSE, GL_SPECULAR, GL_EMISSION}, your color tuple);
- Shininess (specular exponent):
GL11.glMaterialf(GL11.GL_FRONT, GL11.GL_SHININESS, a floating point value that defines the shininess);
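- A minimal sketch pulling the calls above together (assuming LWJGL 2; the colors, angles, and exponents are arbitrary examples):
<code java>
import java.nio.FloatBuffer;
import org.lwjgl.BufferUtils;
import org.lwjgl.opengl.GL11;

public class LightingSketch {
    // Wraps four floats in a direct buffer, which is what LWJGL expects.
    private static FloatBuffer tuple(float a, float b, float c, float d) {
        FloatBuffer buf = BufferUtils.createFloatBuffer(4);
        buf.put(a).put(b).put(c).put(d).flip();
        return buf;
    }

    public static void setupLightAndMaterial() {
        GL11.glEnable(GL11.GL_LIGHTING);
        GL11.glEnable(GL11.GL_LIGHT0);

        // Position (w = 1.0f makes this a positional light, usable as a spotlight)
        GL11.glLight(GL11.GL_LIGHT0, GL11.GL_POSITION, tuple(-0.5f, 1.0f, 1.0f, 1.0f));

        // Intensities
        GL11.glLight(GL11.GL_LIGHT0, GL11.GL_AMBIENT,  tuple(0.2f, 0.2f, 0.2f, 1.0f));
        GL11.glLight(GL11.GL_LIGHT0, GL11.GL_DIFFUSE,  tuple(1.0f, 1.0f, 1.0f, 1.0f));
        GL11.glLight(GL11.GL_LIGHT0, GL11.GL_SPECULAR, tuple(1.0f, 1.0f, 1.0f, 1.0f));

        // Spotlight: a 30-degree cone (cutoff 15.0f), pointing down the negative z-axis
        GL11.glLightf(GL11.GL_LIGHT0, GL11.GL_SPOT_CUTOFF, 15.0f);
        GL11.glLight(GL11.GL_LIGHT0, GL11.GL_SPOT_DIRECTION, tuple(0.0f, 0.0f, -1.0f, 0.0f));
        GL11.glLightf(GL11.GL_LIGHT0, GL11.GL_SPOT_EXPONENT, 10.0f);

        // Material reflectance and shininess for the front faces of the object
        GL11.glMaterial(GL11.GL_FRONT, GL11.GL_DIFFUSE,  tuple(0.8f, 0.1f, 0.1f, 1.0f));
        GL11.glMaterial(GL11.GL_FRONT, GL11.GL_SPECULAR, tuple(1.0f, 1.0f, 1.0f, 1.0f));
        GL11.glMaterialf(GL11.GL_FRONT, GL11.GL_SHININESS, 50.0f);
    }
}
</code>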
Texture Mapping
- The process:
- Enable texture mapping in your init() routine via GL11.glEnable(GL11.GL_TEXTURE_2D)
- Load the image
- Create the texture in OpenGL: GL11.glGenTextures()
- Bind (make current) the texture: GL11.glBindTexture()
- Determine the texture filter: GL11.glTexParameteri()
- Load the image data into texture memory: GL11.glTexImage2D()
- When drawing your object, specify texture coordinates, e.g., GL11.glTexCoord2f(0.0f, 0.0f)
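- A minimal sketch of the process above, assuming LWJGL 2 and loading the image through ImageIO (a real game would more likely use a dedicated texture-loading library):
<code java>
import java.awt.image.BufferedImage;
import java.io.File;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.IntBuffer;
import javax.imageio.ImageIO;
import org.lwjgl.BufferUtils;
import org.lwjgl.opengl.GL11;

public class TextureSketch {
    // Loads an image, uploads it as a 2D texture, and returns the texture id.
    public static int loadTexture(String path) throws IOException {
        BufferedImage img = ImageIO.read(new File(path));
        int w = img.getWidth(), h = img.getHeight();

        // Convert the pixels into a direct RGBA byte buffer for glTexImage2D
        ByteBuffer pixels = BufferUtils.createByteBuffer(w * h * 4);
        for (int y = 0; y < h; y++) {
            for (int x = 0; x < w; x++) {
                int argb = img.getRGB(x, y);
                pixels.put((byte) ((argb >> 16) & 0xFF));  // red
                pixels.put((byte) ((argb >> 8) & 0xFF));   // green
                pixels.put((byte) (argb & 0xFF));          // blue
                pixels.put((byte) ((argb >> 24) & 0xFF));  // alpha
            }
        }
        pixels.flip();

        // Create, bind, filter, and fill the texture
        IntBuffer ids = BufferUtils.createIntBuffer(1);
        GL11.glGenTextures(ids);
        int textureId = ids.get(0);
        GL11.glBindTexture(GL11.GL_TEXTURE_2D, textureId);
        GL11.glTexParameteri(GL11.GL_TEXTURE_2D, GL11.GL_TEXTURE_MIN_FILTER, GL11.GL_LINEAR);
        GL11.glTexParameteri(GL11.GL_TEXTURE_2D, GL11.GL_TEXTURE_MAG_FILTER, GL11.GL_LINEAR);
        GL11.glTexImage2D(GL11.GL_TEXTURE_2D, 0, GL11.GL_RGBA, w, h, 0,
                GL11.GL_RGBA, GL11.GL_UNSIGNED_BYTE, pixels);
        return textureId;
    }

    // When drawing the object, pair each vertex with a texture coordinate.
    public static void drawTexturedQuad(int textureId) {
        GL11.glEnable(GL11.GL_TEXTURE_2D);
        GL11.glBindTexture(GL11.GL_TEXTURE_2D, textureId);
        GL11.glBegin(GL11.GL_QUADS);
        GL11.glTexCoord2f(0.0f, 0.0f); GL11.glVertex3f(-1.0f, -1.0f, 0.0f);
        GL11.glTexCoord2f(1.0f, 0.0f); GL11.glVertex3f( 1.0f, -1.0f, 0.0f);
        GL11.glTexCoord2f(1.0f, 1.0f); GL11.glVertex3f( 1.0f,  1.0f, 0.0f);
        GL11.glTexCoord2f(0.0f, 1.0f); GL11.glVertex3f(-1.0f,  1.0f, 0.0f);
        GL11.glEnd();
    }
}
</code>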
Terrains
- Heightfield terrains
- Land (greens) and water
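- A minimal sketch of rendering a heightfield with GL11 triangle strips (the heights array and cell size are assumed inputs, e.g. read from a heightmap image):
<code java>
import org.lwjgl.opengl.GL11;

public class HeightfieldSketch {
    // heights[z][x] holds the terrain elevation at each grid point;
    // each pair of neighbouring grid rows becomes one triangle strip.
    public static void drawTerrain(float[][] heights, float cellSize) {
        int depth = heights.length;
        int width = heights[0].length;
        for (int z = 0; z < depth - 1; z++) {
            GL11.glBegin(GL11.GL_TRIANGLE_STRIP);
            for (int x = 0; x < width; x++) {
                // a fuller version would also set colors/textures here
                // (e.g., green for land, blue for water below some height)
                GL11.glVertex3f(x * cellSize, heights[z][x],     z * cellSize);
                GL11.glVertex3f(x * cellSize, heights[z + 1][x], (z + 1) * cellSize);
            }
            GL11.glEnd();
        }
    }
}
</code>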
OpenGL Loose Ends
- Color buffer
- Store color of each pixel of current frame
- RGBA mode
- Depth buffer (a.k.a., z-buffer)
- Store depth information for each pixel of current frame
- Associated with GL_DEPTH_TEST, which compares the depth already stored in the depth buffer with the depth of the new pixel to be drawn
- A common problem people run into: which objects end up in front and which behind? Without the depth test, that is determined only by the order in which the objects are rendered.
- Depending on the test, some pixels may not be drawn (i.e., hidden surface removal)
- See example of depth buffer enabled and disabled at http://www.zeuscmd.com/tutorials/opengl/11-Depth.php
- Blending
- Combine the color of a given pixel that is about to be drawn with the pixel that is already on the screen
- Use alpha value in RGBA sequence
- If the alpha value is 0 => transparent; 1 => opaque
- Enabled via GL11.glEnable(GL11.GL_BLEND)
- Process: specify the blending function / equation for translucency (glBlendFunc()) => turn off depth buffer writes => render the transparent object (see the sketch at the end of this section)
- Example: water and particle system - Terrain3DSpecial.java
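- A minimal sketch of the blending process above (assuming LWJGL 2; the glBlendFunc arguments shown are the standard choice for translucency):
<code java>
import org.lwjgl.opengl.GL11;

public class BlendingSketch {
    // Draws a translucent object over whatever is already in the color buffer.
    public static void drawTranslucent() {
        GL11.glEnable(GL11.GL_BLEND);
        // Blend the incoming pixel's alpha against the pixel already on screen
        GL11.glBlendFunc(GL11.GL_SRC_ALPHA, GL11.GL_ONE_MINUS_SRC_ALPHA);
        GL11.glDepthMask(false);                 // stop the transparent pass from writing depth

        GL11.glColor4f(0.2f, 0.4f, 1.0f, 0.5f);  // alpha 0.5 => half transparent
        // ... render the transparent geometry (e.g., a water quad) here ...

        GL11.glDepthMask(true);                  // restore depth writes for opaque geometry
        GL11.glDisable(GL11.GL_BLEND);
    }
}
</code>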
3D Model Loading via OpenGL / LWJGL
- 3DS model: created using 3D Studio Max
- There are no such things as OpenGL models. The procedure to load a model into an OpenGL program is: read the model file => capture the coordinates of the vertices, texture mapping coordinates, etc. => render via OpenGL primitives
- A little about the OBJ format:
- Simple format
- List all vertices
- Texture coordinates
- Normals
- Additional file: a .mtl file, which defines the material and texture images
- A little about a 3DS file:
- GLModel - a wrapper written by Mark Napier to render a model. Process: instantiate a GLModel with the name of the 3DS model file, and then call its render() routine
- Creating a 3DS or OBJ model in Blender: create your model and then File > Export
- Another popular 3D model format: MD2 (Quake II). Java libraries available.
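- A minimal sketch of reading the vertex-related records of an OBJ file; a real loader must also handle the face ("f") records, the .mtl file, and the format's many variations:
<code java>
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

public class ObjSketch {
    public final List<float[]> vertices  = new ArrayList<float[]>();  // "v x y z" records
    public final List<float[]> texCoords = new ArrayList<float[]>();  // "vt u v" records
    public final List<float[]> normals   = new ArrayList<float[]>();  // "vn x y z" records

    // Reads only the vertex data; faces, materials, and groups are ignored.
    public void load(String path) throws IOException {
        BufferedReader in = new BufferedReader(new FileReader(path));
        String line;
        while ((line = in.readLine()) != null) {
            String[] p = line.trim().split("\\s+");
            if (p[0].equals("v")) {
                vertices.add(new float[] {Float.parseFloat(p[1]),
                        Float.parseFloat(p[2]), Float.parseFloat(p[3])});
            } else if (p[0].equals("vt")) {
                texCoords.add(new float[] {Float.parseFloat(p[1]), Float.parseFloat(p[2])});
            } else if (p[0].equals("vn")) {
                normals.add(new float[] {Float.parseFloat(p[1]),
                        Float.parseFloat(p[2]), Float.parseFloat(p[3])});
            }
        }
        in.close();
    }
}
</code>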
Particle Systems
- A collection of individual elements, or particles; importantly, each particle acts autonomously, without depending on other particles
- Each particle has individual attributes including velocity, gravity, direction, color, etc.
- A particle system controls the set of particles
- Particle system management items:
- Position
- Emission rate
- Forces and gravity
- Ranges
- Blending
- State
- Special effects created from particle systems: fireworks, firecracker, flame, hair, flowing water, smoke, star fields, snow
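- A minimal sketch of a particle and the system that manages it (plain Java; the emission count, gravity, and lifetime values are arbitrary examples):
<code java>
import java.util.ArrayList;
import java.util.List;
import java.util.Random;

public class ParticleSystemSketch {
    // Each particle updates from its own state only -- no dependence on other particles.
    static class Particle {
        float x, y, z;        // position
        float vx, vy, vz;     // velocity
        float life;           // remaining lifetime in seconds

        void update(float dt, float gravity) {
            vy += gravity * dt;                       // gravity acts on the velocity
            x += vx * dt; y += vy * dt; z += vz * dt; // velocity moves the particle
            life -= dt;
        }
    }

    private final List<Particle> particles = new ArrayList<Particle>();
    private final Random random = new Random();

    // The system handles emission, forces, and removal of dead particles.
    public void update(float dt, int emitPerFrame, float gravity) {
        for (int i = 0; i < emitPerFrame; i++) {
            Particle p = new Particle();
            p.vx = random.nextFloat() - 0.5f;   // random horizontal spread
            p.vy = random.nextFloat() * 2.0f;   // initial upward velocity
            p.vz = random.nextFloat() - 0.5f;
            p.life = 2.0f;                      // live for two seconds
            particles.add(p);
        }
        for (int i = particles.size() - 1; i >= 0; i--) {
            Particle p = particles.get(i);
            p.update(dt, gravity);
            if (p.life <= 0.0f) {
                particles.remove(i);            // dead particles leave the system
            }
        }
    }
}
</code>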
3D Modeling Using Blender 3D
Putting It All Together (3D): The Game Engine
- A complex topic
- The core of a game: it handles everything from input to graphics rendering
- The point: reusability
- Half-Life, Counter-Strike, Natural Selection, Opposing Force, and Blue Shift are essentially the same game: most of their source code is almost exactly the same, because they all use the Half-Life game engine. Likewise, Hexen was built on the DOOM game engine.
- The game itself IS NOT THE SAME as the game engine. Example: a car vs. the engine
- A typical game engine design, the components (a skeletal sketch appears at the end of this section):
- Input
- Game logic
- World database
- Audio subsystem
- World objects
- Texture handling
- Physics subsystem
- Particle subsystem
- Game engine design principles:
- Should be as cross-platform as possible (i.e., on the PC platform, not consoles)
- Should be reusable
- Should support cross-platform networking (i.e., games on Linux can network with games on Windows, and so on)
- Should be simple to add new components or replace existing ones on a project-by-project basis
- Should run at adequate speeds on a computer with the "bare minimum" hardware
- Should be designed in a way that is fairly simple to understand and teach
- Professional game engines (that people can use and afford; all in C++):
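- To make the components and design principles above concrete, here is a skeletal (and entirely hypothetical) engine loop in plain Java; the subsystem names simply mirror the component list:
<code java>
// A skeletal game engine: each subsystem is replaceable on a per-project basis.
public class GameEngineSketch {

    interface Subsystem {
        void update(float dt);     // called once per frame
    }

    // Hypothetical subsystem fields mirroring the component list above
    private Subsystem input, gameLogic, audio, physics, particles, renderer;
    private boolean running = true;

    public void run() {
        long last = System.nanoTime();
        while (running) {
            long now = System.nanoTime();
            float dt = (now - last) / 1e9f;   // seconds since the last frame
            last = now;

            input.update(dt);       // read controllers, keyboard, mouse
            gameLogic.update(dt);   // advance the world database / world objects
            physics.update(dt);
            particles.update(dt);
            audio.update(dt);
            renderer.update(dt);    // draw the frame (textures, models, terrain)
        }
    }
}
</code>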
jMonkey Engine (jME)
- Open source, Java-based 3D game engine -- BSD licensed
- Started in 2003 by Mark Powell (a.k.a., MojoMonkey)
- Meant to be a complete solution for game development
- Architecture:
- Communicates natively with the platform's hardware via DisplaySystem and Renderer
- Provides real-time rendering based on OpenGL
- Scene graph-based
- Requires LWJGL
- jME architecture
- Capabilities include model loading, particle system, and sound
- Documentation still rather weak
- Games developed using jME:
- Installation instructions via Eclipse and SVN (Subversion)
jME Scene Graphs
- Tree structure (with a root node, parent and child / leaf nodes, branches)
- In jME, there are two types of nodes:
- Internal - can contain children. Examples: LightNode and CameraNode
- Geometry - leaf nodes (no children); contain the geometric information about the object (e.g., rendering properties, points, colors, texture coordinates, normals). Example: TriMesh
- Spatial representation of a graphical scene; the branches of the tree are grouped based on location in the game world
- Placing an item in one branch will affect the children of that branch
- Why a scene graph? Many optimizations can be made in the render pipeline.
- If a node cannot be seen, neither can its children, so they are irrelevant and are automatically not rendered.
- Large sections of the game data can be removed from processing.
- Culling - A process that determines if any part of an object is visible in the scene from the observer's perspective. If it is not, it will not be rendered.
- Most objects such as 3D models are represented in a tree structure (e.g., 3DS)
- Easy to implement and describe scene graphs in XML, and can be loaded into jME
- Visual example:
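- The following is an illustrative plain-Java sketch of the idea (it is not jME's actual API): internal nodes hold children, geometry leaves draw, and an invisible node prunes its whole branch from rendering:
<code java>
import java.util.ArrayList;
import java.util.List;

public class SceneGraphSketch {
    // A node can hold children; a change here (transform, light, visibility)
    // affects everything beneath it in the tree.
    static class Node {
        final String name;
        final List<Node> children = new ArrayList<Node>();

        Node(String name) { this.name = name; }

        void attachChild(Node child) { children.add(child); }

        // Culling: if this node is not visible, none of its children are
        // processed, so whole branches drop out of the render pass.
        void render() {
            if (!isVisible()) {
                return;
            }
            draw();
            for (Node child : children) {
                child.render();
            }
        }

        boolean isVisible() { return true; }  // a real engine tests bounding volumes here
        void draw() { /* geometry (leaf) nodes would issue the actual GL calls */ }
    }

    public static void main(String[] args) {
        Node root = new Node("root");
        Node lights = new Node("lights");       // plays the role of an internal node
        Node playerMesh = new Node("player");   // plays the role of a geometry leaf
        root.attachChild(lights);
        root.attachChild(playerMesh);
        root.render();                          // renders the whole visible scene
    }
}
</code>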
jME Development
jME Tools
----