Semantic Variables in Mathematics

November 20th, 2012

I think a lot about how to teach people programming and mathematics. Particularly since beginning to use Python a couple years ago, I have focused a lot on teaching people to produce readable code.

A perennially confusing topic when teaching programming is the variables i and j commonly used in 2D iteration. Without fail, you will one day be working [at 3am/in a new language/drunk] and forget which of these is the row of your matrix and which is the column. There is a very simple solution to this problem, which I stress to anyone I teach: Always use semantic names for your variables. I have doubly stressed the word always because I mean it. In the case of matrices, the variable names row and col are still pretty short but communicate infinitely more information. I no longer even use i when doing simple loops, instead preferring semantic names like step. This is good practice for anyone, but isn't a new idea when writing code.
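Here's a toy comparison in Python (the matrix is made up; the pattern is the point):

matrix = [[1, 2, 3],
          [4, 5, 6]]

# Ambiguous: is i the row or the column? One day you will forget.
total = 0
for i in range(len(matrix)):
    for j in range(len(matrix[0])):
        total += matrix[i][j]

# Semantic: the names document the loop structure for free.
n_rows, n_cols = len(matrix), len(matrix[0])
total = 0
for row in range(n_rows):
    for col in range(n_cols):
        total += matrix[row][col]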

A couple days ago, I was helping my girlfriend learn how to program numerical solutions to partial differential equations using the finite difference method. We were working on the canonical example, the 2D time-dependent heat equation:

\frac{\partial^2 h}{\partial x^2} + \frac{\partial^2 h}{\partial y^2} = \frac{\partial h}{\partial t}

To discretize this equation, you usually split up your x,y plane into a matrix of points separated by some step size, Δx. You then consider the amount of heat at one point i,j at time n, and how it evolves to time n+1. The usual notation is something like this (assuming that the step size in x and y is the same):

\frac{h_{i-1,j}^{n} + h_{i+1,j}^{n} + h_{i,j-1}^{n} + h_{i,j+1}^{n} - 4h_{i,j}^{n}}{\Delta x^2} = \frac{h_{i,j}^{n+1} - h_{i,j}^{n}}{\Delta t}

Horrifying! Here are all these i's and j's again. What is their semantic meaning? A careful reading will show that they are referring to contributions from the neighboring points in our gridded domain. The i,j tuple here in fact refers to a single point, just in two dimensions, and the +/-1 refers to its neighbor in each direction. Suppose we applied the semantic naming idea here, and didn't limit ourselves to only letters:

\frac{h_{p \leftarrow}^{now} + h_{p \rightarrow}^{now} + h_{p \uparrow}^{now} + h_{p \downarrow}^{now} - 4h_{p}^{now}}{\Delta x^2} = \frac{h_{p}^{next} - h_{p}^{now}}{\Delta t}

I think this goes a long way toward better communicating what is actually going on in the discretization equation. The symbols now help us instead of confusing us, because they carry semantic meaning. We are used to understanding p to mean a point in space (and we could even call it pt), and the arrows symbolically indicate that we're looking at neighbors. It's about as compact as the original when written by hand, and only slightly longer when marked up in LaTeX. You do still have to explain the notation, but that was also true of i, j, and n -- they just didn't carry any semantic meaning to help you remember their purpose.
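To make the discretization concrete, here's a minimal NumPy sketch of one explicit time step (the grid size, step sizes, and array names are my own choices for illustration):

import numpy as np

h_now = np.zeros((50, 50))      # heat at each grid point at time "now"
h_now[25, 25] = 100.0           # a hot spot in the middle

dx = 1.0                        # spatial step (same in x and y)
dt = 0.2                        # time step (needs dt <= dx**2 / 4 for stability)

# One explicit step: each interior point moves toward the average of
# its four neighbors, exactly as in the equation above.
h_next = h_now.copy()
h_next[1:-1, 1:-1] = h_now[1:-1, 1:-1] + (dt / dx**2) * (
    h_now[:-2, 1:-1]     # neighbor up
    + h_now[2:, 1:-1]    # neighbor down
    + h_now[1:-1, :-2]   # neighbor left
    + h_now[1:-1, 2:]    # neighbor right
    - 4 * h_now[1:-1, 1:-1]
)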

Unity3D Surface Shader Tutorial

April 24th, 2012

I've been meaning to learn about Unity's Surface Shader paradigm since I saw Shawn White's talk at Unite 2010 (newer 2011 version here). Shawn was our shader and visuals programmer during the Blurst period at Flashbang, so I rarely messed with shaders myself; he became something of an expert. I finally got around to trying this out 2 years later and figured I would share the knowledge I gained. Here's the final version of what I made, a model of Earth using NASA's Blue Marble Next Generation dataset. The textures are big, so it takes a bit to load. Click and move the mouse to rotate the camera. Take a moment to watch how lighting interacts with water (especially rivers), the transition from night lights to daylight along the terminator, and the way that clouds scatter light around the edge. I'll explain how each of these effects is achieved below. The full shader files are linked at the bottom of the post.

Blue Marble Surface Shader Demo

Normally shaders are written as two related programs in a single file – a vertex program that modifies vertex data (like position, normals, etc.), followed by a fragment or pixel program that takes interpolated vertex data for a single pixel and converts it into the final output color. This pixel program includes everything that normally makes a shader pretty – texture combines, lighting, etc. It's also usually a horrible mess of maths, which can be daunting if you don't have a degree in linear algebra. Speaking of which, here's a tip when thinking about shaders, which use lots of dot products – for two vectors of length 1, the dot product is just the cosine of the angle between them: a•b = cos(theta). So a•b = 1 => theta = 0 degrees, a•b = 0 => theta = 90 degrees, a•b = -1 => theta = 180 degrees.
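A quick numeric check of that identity, in Python for concreteness:

import numpy as np

def unit(v):
    return np.asarray(v, dtype=float) / np.linalg.norm(v)

a = unit([1, 0, 0])
print(np.dot(a, unit([1, 0, 0])))    #  1.0 -> theta =   0 degrees
print(np.dot(a, unit([0, 1, 0])))    #  0.0 -> theta =  90 degrees
print(np.dot(a, unit([-1, 0, 0])))   # -1.0 -> theta = 180 degrees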

The surface shader paradigm abstracts the pixel program into two steps – a surface program that modifies the per-pixel properties of the surface (color, alpha value, surface normal, emissive light color, etc.), and a lighting program that applies the lights to each pixel. This abstraction means that whenever you calculate a surface property, you should think about whether it depends on lighting. If yes, the calculation should go in the lighting program. If no, it can go in the surface program.

Picking Inspirations

A good way to learn shader programming is to try to emulate an effect you've seen somewhere else. I was interested in making a set of shaders that produced realistic-looking planets, but I also specifically liked rim-lighting effects and specular reflections on rivers.

Property and Program Declarations

Unity shaders start out with property declarations. These are the properties that will be exposed to the Material inspector and can be modified on a per-material basis. Here are the property declarations for the planet shader:

Shader "Custom/Earth" {
	Properties {
		_MainTex ("Diffuse(RGB) Spec(A)", 2D) = "white" {}
		_BumpMap ("Bumpmap", 2D) = "bump" {}
		_EmissionMap("Night Lights Map", 2D) = "white" {}
		_EmissionStr("Night Lights Strength", Range(0,1)) = 0.5
		_EmissionColor("Night Lights Color", Color) = (1.0, 1.0, 1.0, 1.0)
		_SpecColor ("Specular Color", Color) = (0.5,0.5,0.5,1)
		_Shininess ("Shininess", Range (0.01, 1)) = 0.078125
		_SpecPower ("Specular Power", Float) = 48.0
	}
	SubShader {
		Tags { "RenderType" = "Opaque" }
		CGPROGRAM
		#pragma surface surf PlanetSpecular noambient
		...
		ENDCG
	}
	Fallback "Diffuse"
}

Each property has a name (which is the variable name in the Cg code, as well as the material property name in Unity), a description, a type (2D for textures, Range for sliders, Float for type-in values, etc.), and a default value. The first three in the example above are textures supplied to the shader; the others are colors or floating-point values that give us knobs to turn to change the shader's look. Here's what the inspector for this shader ends up as:

The actual Cg shader code goes within the SubShader{} block, between CGPROGRAM and ENDCG. The separate Pass{} blocks, which are normally used when writing fragment shaders, are generated automatically by the shader compiler – that's the beauty of this higher abstraction level. The #pragma surface ... line tells the compiler that we're writing a surface shader, and then gives it the names of our surface function (surf) and lighting model (PlanetSpecular). There are two built-in lighting models, Lambert (Diffuse) and BlinnPhong (Specular). If we give any other name, the surface shader will expect us to write that lighting function as well. The noambient part at the end is an optional parameter that tells it not to apply scene-based ambient lighting. There are several other optional parameters, which are listed in the Surface Shader documentation. The Fallback "Diffuse" at the end of the shader tells it which shader to use if the current platform doesn't support the one we wrote. In fact, simple Diffuse is not a great fallback for this shader, since it lacks specular highlighting; I should later change it to one that includes it.

Surface Program

struct Input {
	float2 uv_MainTex;
	float2 uv_BumpMap;
	float2 uv_EmissionMap;
};
sampler2D _MainTex;
sampler2D _BumpMap;
sampler2D _EmissionMap;
 
void surf (Input IN, inout SurfaceOutput o) {
	o.Albedo = tex2D (_MainTex, IN.uv_MainTex).rgb;
	o.Specular = tex2D (_MainTex, IN.uv_MainTex).a;
	o.Normal = UnpackNormal (tex2D (_BumpMap, IN.uv_BumpMap));
	o.Alpha = length(tex2D (_EmissionMap, IN.uv_EmissionMap).rgb);
}

This is the surface program, which comes after the #pragma surface ... line above. It has three parts – definition of the input structure (i.e. what data we want per-pixel), variable declarations of the material-wide (non per-pixel) properties we want to use, and the surface function (surf), which we named in the #pragma surface ... line.

The input structure is any set of values that we want on a per-pixel basis. Usually this should include the uv coordinates of your input textures, named uv (or uv2 for the second coordinate set) followed by the texture name from the property definitions. The input structure can also contain some built-in values such as viewDir (the direction to the camera). The full list is again on the Surface Shader documentation page.

Next come variable declarations for per-material data. This includes the texture samplers for texture properties (remember, the shader loads the texture only once, and then samples it using the uv coordinates per-pixel), which are named the same as the properties. It also includes any per-material constants that will be applied before lighting, such as thresholding values.

The surface function is where the actual per-pixel, pre-lighting surface property calculations are done, and stored in the output structure. The output structure is the data that will be passed to the lighting model. The default structure is:

struct SurfaceOutput {
    half3 Albedo;
    half3 Normal;
    half3 Emission;
    half Specular;
    half Gloss;
    half Alpha;
};

While each of these has a specific meaning if you use the standard lighting models, you can think of them as simply three 3-vectors and three numbers where data can be stored. Some of them have specific semantic meanings in certain cases – for instance, Emission always appears to be combined with other lights before being passed to the lighting model (you can still access its value in your custom lighting function, but its value will also be added to other lights in the scene). Likewise, Alpha has a specific meaning for alpha-blended shaders (like the Clouds shader below).

In the surface function for the Earth shader, I simply store the surface color in the Albedo variable, the surface specular reflectance (alpha channel of _MainTex) in the Specular variable, the surface normal vector in the Normal variable, and the emissive light (city night lights) in the Alpha variable. Two things to note here:
1) I stored the surface emissive light in Alpha instead of Emission because I can then control the strength of the surface lights based on other scene lights – i.e. I can turn off the night lights when a pixel is illuminated by the Sun. If I had stored the emissive light in Emission, it would have been combined with the scene lights before being passed to the lighting model, and I would not have been able to control the two independently.
2) Right now, the shader uses three texture samplers – _MainTex for albedo and specularity, _BumpMap for surface normals, and _EmissionMap for the city lights. This is wasteful, since I only use the grayscale value of _EmissionMap. Instead, it might be smarter to pack the city lights into the alpha channel of _BumpMap. This would reduce the shader to only two texture samplers, using both the rgb and a values of each.
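That repacking would be a one-time preprocessing step on the textures. A minimal Pillow sketch, assuming hypothetical file names (and ignoring any interaction with Unity's normal-map import settings):

from PIL import Image

# Hypothetical inputs: the normal map and the grayscale city-lights map.
bump = Image.open("earth_bump.png").convert("RGB")
lights = Image.open("earth_lights.png").convert("L")

# Pack the lights into the alpha channel of the bump map.
r, g, b = bump.split()
Image.merge("RGBA", (r, g, b, lights)).save("earth_bump_lights.png")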

Lighting Program

float _Shininess;
float _SpecPower;
float _EmissionStr;
half4 _EmissionColor;
 
half4 LightingPlanetSpecular (SurfaceOutput s, half3 lightDir, half3 viewDir, half atten) {
	half diffuse = max (0, dot (s.Normal, lightDir));
	half3 lightView = normalize (lightDir + viewDir);
	float specStr = max (0, dot(s.Normal, lightView));
	float spec = pow (specStr, _SpecPower);
	half4 c;
	c.rgb = _LightColor0.rgb * (atten * 2) * (s.Albedo * diffuse +
		spec * s.Specular * _Shininess * _SpecColor) +
		(saturate(1.0-2*diffuse) * s.Alpha * _EmissionStr * _EmissionColor);
	c.a = s.Specular;
	return c;
}

For this shader, this is where the magic happens. The first thing I do is declare the material property variables. In this case that's three floats – two that control the specular reflection (_Shininess and _SpecPower) and one that controls the intensity of the night lights (_EmissionStr) – plus the night-light color (_EmissionColor).

The lighting function is named after the lighting model name you supply in the #pragma surface ... line: Lighting<ModelName>, so in this case LightingPlanetSpecular. There are variants of this function signature based on whether or not the lighting model is view-direction dependent (specular reflection is), as well as whether or not the shader works with deferred lighting or needs to handle directional lightmaps. These are detailed on the Surface Shader Lighting Models page.

The per-pixel inputs to the lighting model function are the surface property structure from the surface program (SurfaceOutput s), the vector direction to the light (half3 lightDir), the vector view direction (half3 viewDir), and the lighting attenuation factor (half atten). The output of the lighting model is a half4 vector – the final RGBA color of the rendered pixel.

In the PlanetSpecular lighting model above, the first line of the function body calculates the per-pixel diffuse lighting factor, which ranges from 0 to 1. The next three lines calculate the per-pixel specular factor using the Blinn-Phong algorithm; it again ranges from 0 to 1.

The c.rgb assignment is where I set the final output color for the pixel. It is composed of the diffuse color term (_LightColor0 * 2 * atten * diffuse * s.Albedo), the specular color term (_LightColor0 * 2 * atten * spec * _Shininess * _SpecColor), and an emissive color term (saturate(1 - 2 * diffuse) * s.Alpha * _EmissionStr * _EmissionColor) for the night lights. Again, the diffuse and specular terms are totally standard Blinn-Phong. That's it!
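If the shader syntax obscures the math, here is the same per-pixel lighting computation written out in plain NumPy (the vectors and constants are made-up sample values, not anything from the scene):

import numpy as np

def normalize(v):
    return v / np.linalg.norm(v)

normal = normalize(np.array([0.0, 0.0, 1.0]))      # surface normal
light_dir = normalize(np.array([0.3, 0.2, 1.0]))   # direction to the light
view_dir = normalize(np.array([0.0, 0.5, 1.0]))    # direction to the camera
spec_power = 48.0

diffuse = max(0.0, np.dot(normal, light_dir))      # Lambert diffuse factor, 0..1
half_vec = normalize(light_dir + view_dir)         # Blinn-Phong half vector (lightView above)
spec = max(0.0, np.dot(normal, half_vec)) ** spec_power  # specular factor, 0..1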

Night Lights

For the night lights, saturate(1 - 2 * diffuse) serves as the strength of the lights for the pixel, where saturate() is a Cg function that clamps a value between 0 and 1. It's necessary so that you don't get negative emissive lighting (black cities) when the diffuse term is 0.5-1. I used 1 - 2 * diffuse because I wanted the lights to turn on gradually as a city neared the terminator between daylight and night. Using only 1 - diffuse meant that lights would begin to turn on as soon as a city passed local noon, which looked unnatural (actually the 2 should probably be a tunable material property). By the time a city reaches the terminator, the diffuse term is 0 and the lights are fully on.
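A few sample values make the behavior obvious (quick sketch):

def saturate(x):
    return min(1.0, max(0.0, x))

for diffuse in (1.0, 0.75, 0.5, 0.25, 0.1, 0.0):
    print(diffuse, saturate(1.0 - 2.0 * diffuse))
# diffuse 1.0 through 0.5 (daylight)  -> 0.0, lights off
# diffuse 0.25 (late afternoon)       -> 0.5, lights half on
# diffuse 0.0 (at the terminator)     -> 1.0, lights fully on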

Night lights slowly fade as cities pass the day/night boundary

Emphasizing River Specular Reflection

I loved the look of the Amazon in the Flight404 example above, so I played around with the specular map (stored in the alpha channel of _MainTex), until I got an effect I liked. I eventually settled on selecting water using a false color map, the same one used for the Flight404 example. I used Photoshop's Select Color Range tool to pick the water color, with the fuzziness cranked all the way up. I then ran an edge detection filter to emphasize the rivers. This also had the side effect of emphasizing shorelines, a subtle but really cool effect. I picked a silver-blue color for _SpecColor that contrasts well with the dark.

River specular reflection on the Amazon. It's more subtle than the Flight404 example, but still stunning. The shoreline is also emphasized thanks to the edge detection filter I applied to the spec map.
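The same spec-map workflow could be approximated in code instead of Photoshop. A rough Pillow/NumPy sketch, with made-up file names and a made-up water color (the distance threshold would need tuning by eye, just like the fuzziness slider):

import numpy as np
from PIL import Image, ImageFilter

false_color = Image.open("earth_false_color.png").convert("RGB")
pixels = np.asarray(false_color, dtype=float)

# "Select Color Range" with high fuzziness: distance from a hand-picked water color.
water_color = np.array([20.0, 40.0, 90.0])
dist = np.linalg.norm(pixels - water_color, axis=-1)
water_mask = Image.fromarray((255 * (dist < 80.0)).astype(np.uint8), mode="L")

# Edge detection emphasizes rivers and shorelines.
water_mask.filter(ImageFilter.FIND_EDGES).save("earth_spec.png")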

Cloud Shader

I wanted the cloud shader to approximate subsurface scattering in the atmosphere, like in the rim lighting planet shader in the inspirations section above. I didn't like two things about basic rim lighting though – it is applied equally around the entire rim, and for shaders with bump maps (like the example one I linked), it looks ugly as all hell. So in the spirit of learning, I decided to write a custom shader for this, too. Here it is:

Shader "Custom/Clouds" {
	Properties {
		_MainTex ("Alpha (A)", 2D) = "white" {}
		_RimColor ("Rim Color", Color) = (0.26,0.19,0.16,0.0)
		_RimPower ("Rim Power", Float) = 3.0
	}
	SubShader {
		Tags { "RenderType"="Transparent" }
 
		CGPROGRAM
		#pragma surface surf WrapLambert alpha
 
		struct Input {
			float2 uv_MainTex;
			float3 viewDir;
		};
		sampler2D _MainTex;
		float _RimPower;
 
		void surf (Input IN, inout SurfaceOutput o) {
			half rim = 1.0 - saturate(dot (normalize(IN.viewDir), o.Normal));
			o.Specular = pow (rim, _RimPower);
			o.Alpha = tex2D (_MainTex, IN.uv_MainTex).a + o.Specular;
		}
		half4 _RimColor;
 
		half4 LightingWrapLambert (SurfaceOutput s, half3 lightDir, half atten) {
			half NdotL = dot (s.Normal, lightDir);
			half diffuse = max(0, NdotL* 0.9 + 0.1);
			half4 c;
			c.rgb = (atten * 2) * _LightColor0.rgb * diffuse * (1.0 - s.Specular)
				+ (diffuse * s.Specular * _RimColor);
			c.a = s.Alpha;
			return c;
		}
		ENDCG
	}
	FallBack "Diffuse"
}

And here is the inspector:

There are a few specific interesting things about this shader.

First, it's an alpha blended shader (#pragma surface ... alpha). In surface shader parlance, this means that all we have to do is fill o.Alpha with the proper alpha value for the pixel and Unity will handle the actual blending. Cool, but it does mean we have to be careful about what we put in that variable.

Second, the rim-lighting calculation. The two lines computing rim and o.Specular in surf are the standard way to do this – the effect is based on the view direction (it always appears around the edges from where we look), and grows stronger as the surface normal points sideways from us, hence the 1.0 - dot(IN.viewDir, o.Normal). I store the value in o.Specular because I want to use lighting to modify its visibility later, so there isn't rim light around the entire planet.

Third, the alpha value. I get the alpha value from the supplied _MainTex, but then I add o.Specular, because I want to make the rim lighting visible even if there aren't clouds in a pixel (i.e. the alpha from _MainTex is 0).

Fourth, the lighting model is not the standard Lambert (diffuse) model, but is based on Wrapped Lambert, a modified diffuse shader where light is allowed to wrap around the edges of a surface. This again helps to fake subsurface scattering, because it lets the sunlight from the scene wrap slightly around Earth's limb. The diffuse line in LightingWrapLambert is where this happens: we basically just take 90% of the normal diffuse factor, then add 10%. When we actually apply the lighting, we have a diffuse term (2 * atten * _LightColor0.rgb * diffuse * (1.0 - s.Specular)) and an emissive term (diffuse * s.Specular * _RimColor). By multiplying the rim factor (s.Specular) by the diffuse factor, I ensure that we only apply rim lighting where there is normally light; we just add some sky blue for scattered light if we happen to be looking through the atmosphere sideways. The diffuse term gets multiplied by 1.0 - s.Specular so that rim lighting always takes precedence – otherwise we would just get white light when rim lighting was added to a brightly lit limb.
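In plain numbers, the wrap means the terminator (NdotL = 0) still receives 10% light, and the falloff only reaches zero a bit past the geometric edge (a quick sketch):

for n_dot_l in (1.0, 0.5, 0.0, -0.1, -0.2):
    diffuse = max(0.0, n_dot_l * 0.9 + 0.1)
    print(n_dot_l, diffuse)
# 1.0 -> 1.00 (fully lit), 0.0 -> 0.10 (light wraps past the edge),
# -0.2 -> 0.00 (fully dark just past the limb)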

Rim lighting on the cloud shader, to fake atmospheric scattering.

The end result of the rim lighting is decent, but I don't like the hard edge between the limb and black space. This could be fixed by using a ramp texture for the rim lighting, where it slowly increases in intensity from center to limb, then falls off abruptly to simulate thinning atmosphere. That will be a test for another day though!

Update! Using a Ramp Texture For Rim Lighting

As I mentioned above, I wasn't totally pleased with how the rim lighting looked. Since its strength was 100% at the edge, it looked too stark against the black of space. Earth's real atmosphere is very thin and quickly fades to nothingness if you look at the horizon from space. So I replaced the clouds surface function's power-law falloff with one that uses a "ramp texture" to map viewDir•normal to actual opacity. This lets the artist (me!) pick any gradient that looks good, rather than just changing the power-law index.

...
Properties {
	_MainTex ("Alpha (A)", 2D) = "white" {}
	_AtmosRamp ("Atmosphere Ramp (RG or B)", 2D) = "black" {}
	_RimColor ("Rim Color", Color) = (0.26,0.19,0.16,0.0)
}
...
struct Input {
	float2 uv_MainTex;
	float3 viewDir;
};
sampler2D _MainTex;
sampler2D _AtmosRamp;
 
void surf (Input IN, inout SurfaceOutput o) {
	float2 uv_Ramp = 1.0 - saturate(dot (normalize(IN.viewDir), o.Normal));
	o.Specular = tex2D (_AtmosRamp, uv_Ramp).r;
	o.Alpha = tex2D (_MainTex, IN.uv_MainTex).a + o.Specular;
}
...

The uv_Ramp line generates the UV coordinate where we sample the ramp texture (this syntax sets both the u and v coordinates to the same value, but the texture is identical along the y direction anyway). Here's the inspector:

Inspector for the new ramp texture atmosphere and clouds shader.
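For reference, a ramp texture like this can also be generated procedurally. A hedged NumPy/Pillow sketch with a gradient that rises toward the limb and then cuts off abruptly (the exponent and cutoff are just starting points to tune):

import numpy as np
from PIL import Image

width = 256
rim = np.linspace(0.0, 1.0, width)     # rim = 1 - viewDir.normal: 0 at center, 1 at limb
intensity = rim ** 2.0                  # slow rise toward the limb
intensity[rim > 0.95] = 0.0             # abrupt falloff to fake the thinning atmosphere

rows = np.tile(intensity, (4, 1))       # a few identical rows; the shader ignores v anyway
Image.fromarray((rows * 255).astype(np.uint8), mode="L").save("atmos_ramp.png")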

This looks much, much nicer, and can be tuned easily by simply using a different gradient in the texture. I also gave the sphere model an upgrade with more triangles, so it doesn't look so ugly on the edge! Here's a comparison to the above picture that used just power-law rim lighting:

Atmosphere using a ramp texture with an abrupt falloff at the edge

Full Planet Shader
Full Cloud Shader

Useful Surface Shader Learning Links

Unity Manual: Writing Surface Shaders – Includes all the #pragma surface optional flags

Unity Manual: Surface Shader Lighting – Very brief, but gives naming conventions for lighting functions

Unity Manual: Surface Shader Examples – Some good examples but on a rather boring model

Unity Manual: Surface Shader Lighting Examples – Good examples for lighting models (toon ramp for instance)

Unity Manual: ShaderLab Builtin Values – some predefined variables you can use in Cg programs

Cg Tutorial: Standard Library Functions – Cg functions you can use in your shaders. Note that noise() doesn't work in Unity?

Cg Tutorial: Lighting and Lighting Models – For more information on lighting models

Flight 404 – Rad visualization porn for inspiration

Decision to defund JWST^H^H^H^H The Afghan War

July 15th, 2011

Putting into perspective the House's recent recommendation to defund the James Webb Space Telescope.

The ~~James Webb Space Telescope (JWST)~~ Afghan War (AW) Independent Comprehensive Review Panel revealed chronic and deeply rooted management problems in the ~~JWST~~ AW project. These issues led to the project cost being underestimated by as much as ~~$1,400,000,000~~ trillions of dollars relative to the most recent baseline, and the budget could continue to rise depending on the final ~~launch~~ withdrawal date determination. Although ~~JWST~~ the AW is a particularly serious example, significant cost overruns are commonplace at ~~NASA~~ DoD, and the Committee believes that the underlying causes will never be fully addressed if the Congress does not establish clear consequences for failing to meet budget and schedule expectations. The Committee recommendation provides no funding for ~~JWST~~ AW in fiscal year 2012.

The Committee believes that this step will ultimately benefit ~~NASA~~ DoD by setting a cost discipline example for other projects and by relieving the enormous pressure that ~~JWST~~ AW was placing on ~~NASA's~~ DoD's ability to pursue other ~~science~~ national security missions.

Accuracy in Labeling -- Supernovae

March 23rd, 2010

Last September I started a PhD in Astrophysics at Arizona State. I really enjoy doing public outreach and engaging in informal education, so as I learn new and awe-inspiring things I spend a lot of time thinking, "What's so cool about this, and how do I explain it to my mother?" I think Carl Sagan expressed the motivation best in The Demon-Haunted World: "Not explaining science seems to me perverse. When you're in love, you want to tell the world." What set Carl apart, however, was his unique ability to articulate this love in a way that expressed his enthusiasm and was understandable to a wide audience. Your mother probably doesn't want to sit through a stuffy lecture, even if the contents are astounding.

One of the most amazing discoveries of modern astrophysics is that almost all of the chemical elements we see around us were produced in supernovae -- energetic explosions that typically mark the death throes of massive stars. Elements heavier than oxygen are disseminated mostly through supernovae, and elements heavier than iron come almost exclusively from supernovae. This means that literally everything around you is full of atoms that were originally created in massive stars that exploded and sent those elements flying into interstellar space, where they eventually coalesced into dust and became you and the Earth you're standing on.

Being a big fan of stickers you can put anywhere, I thought a sticker campaign would be the perfect cheeky way to engage in some informal education! I modeled them after the labels warning of cancer risk that you find on household chemicals, furniture, and almost every building in the state of California. Both kinds of labels are factually correct, but while knowing that everything causes cancer is a buzzkill, knowing that everything came from supernovae is awesome.

Almost all elements around us on Earth originated in supernovae! So cool!

The image I chose is one of my favorite Hubble Space Telescope pictures, the Antennae Galaxies. I've photoshopped the fake supernovae over the top. As always, there's a Flickr set that will continue growing. Here are the source files so you can print some yourself -- they're intended for 2.5" x 2.5" sticker backs.
Master psd
UBahn font
Patagonian font (with bonus dinosaurs!)

Power Glove Updates, Maker Faire Bay Area

May 24th, 2009

Over the past few weeks I've been working on some improvements and extensions to my Power Glove 20th Anniversary Edition. On the tech side of things, I replaced the ugly 9V battery I was using with a low-profile, rechargeable Lithium-Polymer battery. I've updated the steps in the Instructable with new pictures and instructions.

I also re-wrote my Java-Unity bridge using a UDP socket. This is a lot more elegant than the text file approach I had been using before. Now the Java program acts as a server, reading in serial data from Bluetooth and broadcasting each line as a UDP packet. The Unity input manager then reads the UDP packets and parses the actual sensor values. This should reduce disk writes, and is more reliable, so I don't have to reset the Java bridge as often. I've updated the code bundle with the new Java and Unity source code.
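The bridge pattern itself is tiny. Here's a rough sketch of the sending side in Python rather than Java (the port, host, and serial device name are made up), just to show the shape of it:

import socket
import serial  # pyserial

PORT = 6000  # hypothetical; any port both sides agree on works
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
glove = serial.Serial("/dev/tty.PowerGlove", 9600)  # hypothetical Bluetooth serial port

while True:
    line = glove.readline()                    # one line of raw sensor values
    sock.sendto(line, ("127.0.0.1", PORT))     # send it to the listener (Unity) as a UDP packet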

See me at Maker Faire!

The other big news is that I'm going to be exhibiting at Maker Faire Bay Area! Maker Faire is one of my favorite gatherings -- a fantastic nexus of creative people making wonderful things. If you're in the Bay Area, you can come try the Power Glove out for yourself this weekend, May 30-31, at the San Mateo County Expo Center!

As a bonus for Maker Faire attendees, I've finished adding Power Glove support to our most popular Blurst game, Off-Road Velociraptor Safari! I recorded a demo video to show it off:

Make the Future You Imagined: The Power Glove -- 20th Anniversary Edition

April 3rd, 2009

I always loved the Nintendo Power Glove. Not because it was a fun or useful peripheral -- it wasn't. In fact it wasn't "bad," as Lucas asserted -- it was absolutely terrible. Only two games were ever made to work with it -- Super Glove Ball and Bad Street Brawler. You could use it with other NES games of course, but it was just an obfuscated controller. Plus, it was horribly imprecise, and since it required a sensor bar to find its orientation, you had to hold your hand at shoulder level all the time. No, I loved the Power Glove for what it represented -- a precursor to virtual reality, a way for humans to directly manipulate computers, like an artifact from some sort of alternate-future Earth.

I realized one day that we're actually living in that future. It doesn't look the same as we imagined it, but the necessary elements are all there. It's been 20 years now since Mattel released the Power Glove in 1989. Especially in the last few years, the availability of sophisticated sensing equipment to hardware hackers has grown by leaps and bounds. Technologies like programmable microcontrollers, accelerometers, and Bluetooth are readily available -- and cheap. In short, the time is ripe to re-make the Power Glove -- and make it right.

Over the past month, I've done just that. I stripped the guts out of an original Power Glove, replaced the ultrasonic sensors with an accelerometer, the proprietary microcontroller with an open-source Arduino, and the wired connection with Bluetooth. I wrote an input manager to get the data into Unity, and hooked it up to the boxing game Adam and I are making for iPhone, Touch KO. What's more, I've documented the whole process so that you can make your own!

I have a video, photos, and an instructable of the build process, and have the schematic, Arduino, and Unity code available for download. You can read the data in any way you like, but since many software packages don't have direct access to serial ports (Unity included), I've also written a small Java program that takes the input and dumps it directly to a text file.

Side note: Since my last post I tried and now totally dig twitter. Follow me.

Cable Wrangling -- Making Your Own Ribbon Cables

February 20th, 2009

I'm working on a larger-scope project right now that involves collecting a lot of sensor data with an Arduino to transmit over Bluetooth. For past projects, I've relied mostly on individually cut wires, but even if you use heat-shrink tubing or the like to bundle them up, they're still a pain to keep in order. Consumer electronics often use ribbon cables for this sort of thing when they can't just run traces on a board. I realized that I could make custom cables using the ribbon cables from old floppy and CD drives. With a steady hand and a utility knife, slice off the number of wires you need, cut them to length, and split and strip the ends. Voila -- custom-sized ribbon cables, and a great way to recycle old computer parts!

3.5 inch floppy drive ribbon cable
Voila - custom ribbon cable!

Integrating Cocoa With Unity iPhone

January 23rd, 2009

I just finished writing up a post for the Flashbang Technology Blog about integrating custom Cocoa content with Unity iPhone projects. I spent about two and a half weeks in December developing a system that would work for all of our Unity iPhone projects. The goal was to allow me to develop all of our menus and other non-gameplay content using Apple's super-slick UI development application, Interface Builder. I used this in !Rebolt! and managed to finish all the menus in a couple days. Here's a snippet:

So I set a goal: Make an easily extensible Cocoa frontend for Unity iPhone that supports Blurst logins and supports any menus we might want. It should work for any project we add it to, so we don’t have to do tons of custom code for every game. Further, it should require changing as little of ReJ’s existing Objective-C AppController code as possible, in the event that it changed in a later build. Finally, I wanted an easy way to add my additional files to the XCode project once I created a build. This is particularly important because, to maintain rapid iteration times, there must be a minimal amount we have to do in XCode between creating a build and installing that build on the phone.

You can read my full article at technology.blurst.com.

!Rebolt! nearing completion

December 6th, 2008

Just a quick update on !Rebolt!, the excellent little robot-combat game Adam and I have been working on the last couple months. We're just about done, having spent a good deal of time optimizing the game for the phone. I've been pretty much eating, sleeping, and breathing iPhone since we got Unity's first beta. We both love how it's been coming along, and I've just spent the last couple weeks integrating login for Flashbang's Blurst.com accounts. You'll be able to save your high scores and track your achievements online using the same kickass system (and account!) we developed for Off-road Velociraptor Safari. Here's the final !Rebolt! trailer, which Adam whipped up today:

We're also releasing another iPhone game, Raptor Copter, simultaneously, along with our new web game, Minotaur China Shop. That's three games being released in one month -- oi, am I gonna need some vacation in December!

Accuracy in Labeling -- Property of the Bavarian Illuminati

October 16th, 2008

One of my favorite authors is the late Robert Anton Wilson. His particular brand of absurdism resonates with my own and has given me inspiration for all manner of crazy things! My favorite works tend to be his fiction -- especially the Illuminatus! trilogy and its successor, the Schrödinger's Cat trilogy. Illuminatus! in particular tells the story of the most fantastically absurd far-reaching conspiracy ever dreamed up in fiction, or even in reality. In short, it's pretty much got to have some truth to it.

In Illuminatus!, it's noted that you can tell where the Illuminati are exerting their influence by watching for the subtle and recondite symbols they use -- the numbers 17 and 23, the images of the Ouroboros and the Eye in the Pyramid, and the phrase "Property of the Bavarian Illuminati! Ewige Blumenkraft!" Now, 17 and 23 I see everywhere. The Eye in the Pyramid is on the one-dollar note. But I've noticed a disproportionately small number of property claims by the Illuminati, given their clearly far-reaching influence. In the spirit of accurate labeling, such as "antibacterial," "all-natural," and "Now SLOWER and with MORE BUGS!," I've made these "Property of the Bavarian Illuminati! Ewige Blumenkraft!" stickers. They should be placed wherever the Illuminati's influence is painfully obvious, yet conspicuously undeclared!

Illuminati-owned cat food bowl
Illuminati-owned fake surveillance camera?
Illuminati-owned Moleskine
Illuminati-owned parking meter
Illuminati-owned national wildlife refuge

There's a developing Flickr set where those came from! If anyone wants to modify them or make their own, I've posted the source psd. I'll also have gobs at Maker Faire Austin this weekend (look for a gentleman in a top hat).

Ewige Blumenkraft!
O Hail Eris!

