Hand-it: Motion-tracking Rhythm Game
Research Slides: PDF
We decided to create a project that takes the positive aspects of the games in its genre and combines them into one interactive rhythm game we call "Hand It". Instead of an interface with plain notes sliding across the screen, we decided the game should have a metaphor, and chose a space portal theme built around the creation of galaxies. There are four separate scoring systems, one for each type of astronomical object: planets, moons, meteors, and stars. The more the player succeeds in capturing one type of note, the more of that object appears in the galaxy.
The interface is designed to make its purpose as clear as possible. There are no distractions in the background, and it is always clear where the hands are on the screen and which on-screen hand is right and which is left (blue and red). Notes appear in the center of the screen and travel towards one of four portals in the corners. The portals are placed in the corners rather than at the top, bottom, left, or right because the hands move into those quadrants more easily. Aesthetically, the game looks like a vortex in space spewing out materials, and with their hands the player snatches up these materials to a rhythm. The score for each material is displayed neatly in its respective corner, giving feedback on how well or poorly the player is doing in each quadrant; the lower the score in a quadrant, the more attention should be paid there.
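The per-quadrant scoring described above can be sketched roughly as follows. This is an illustrative sketch only: the class, the point value, and the quadrant numbering are ours, not taken from the actual patch.

```python
# Hypothetical sketch of the per-quadrant scoring: each corner portal
# has its own object type and score counter, and a captured note
# credits that quadrant's object.

QUADRANTS = {0: "planets", 1: "moons", 2: "meteors", 3: "stars"}

class GalaxyScore:
    def __init__(self):
        self.scores = {name: 0 for name in QUADRANTS.values()}

    def capture(self, quadrant, points=10):
        """Credit a captured note to its corner's object type."""
        obj = QUADRANTS[quadrant]
        self.scores[obj] += points
        return self.scores[obj]

    def weakest_quadrant(self):
        """The object type whose corner needs the most attention."""
        return min(self.scores, key=self.scores.get)

score = GalaxyScore()
score.capture(0)   # a planet note captured
score.capture(3)   # two star notes captured
score.capture(3)
print(score.weakest_quadrant())  # "moons" (still at zero)
```

The `weakest_quadrant` helper mirrors the on-screen feedback: the lowest-scoring corner is where the player should look next.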
We drew on many inspirations for this interactive rhythm game. We tried to mix the best ideas from Guitar Hero, Rock Band, Para Para, and above all the EyeToy for the PlayStation 2 into a completely original take on the rhythm genre. The EyeToy already uses the player's body movements to produce results on screen, so we decided to take that further and build it into a rhythm game. Combining these genres was a good way to go because it takes elements from two particularly popular and fun kinds of interactive games and expands on their main idea.
Video Documentation:
Friday, April 17, 2009
Tuesday, March 10, 2009
Week 8
Sketch Three - Week 1 / 5
Team:
Alfred Darakjian - 301041807
Zac Bush - 301040794
Bryan Ottho - 200108250
Thomas Cheung - 301053272
"Hand It!"
We chose to design and program a rhythm game built around user-to-interface interactivity. The main idea is that the screen is divided into four sections. In each section, a thin rectangle moves from the bottom of the screen to the top, and the user must make contact with the rectangle in the specified area to acquire points. Contact is made with the user's hands: each hand wears a glove marked with infrared tape, which a Nintendo Wii controller detects so that MAX/MSP can produce a result on screen. To avoid cheating, a hand must be moved out of a section before it can score in the next one; this eliminates points gained by simply waving the gloves wildly to catch every moving rectangle. Our inspiration came from the Japanese rhythm game "Para Para Paradise", in which dots move up the screen towards different areas, and the player moves a hand into the specified area and then removes it to activate it.
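The anti-cheat rule can be sketched as a small piece of state per hand: a hand scores only on a fresh entry into a section, never while lingering. The names and structure below are ours, for illustration; the real logic lives in the MAX/MSP patch.

```python
# Illustrative sketch of the anti-cheat rule: a hand only scores on
# ENTERING a section, so waving a glove inside one section cannot
# catch every rectangle passing through it.

class SectionTracker:
    def __init__(self):
        self.inside = None  # section this hand currently occupies (None = outside all)

    def update(self, section):
        """Return True only on a fresh entry into a section."""
        if section == self.inside:
            return False       # still in the same section: no points
        self.inside = section
        return section is not None

tracker = SectionTracker()
hits = [tracker.update(s) for s in [1, 1, None, 1, 2]]
# Entering section 1 scores; lingering there doesn't; re-entering
# after leaving scores again, as does moving to section 2.
print(hits)  # [True, False, False, True, True]
```

One tracker per glove is enough, since the rule is about where each individual hand has been, not about the pair.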
Timeline:
Week 10: (Present) Proposal and planning
Week 11: Interface testing
Week 12: Gameplay testing
Week 13: Project presentation
Week 14: Documentation
Week 15: Show
Monday, March 9, 2009
Week 7
Sketch Two Revisions:
Please refer to this post for how Sketch Two works.
For our Sketch Two revisions we increased the number of thresholds from three to nine. This gives a smoother progression as more energy is exerted, and it provides better feedback to the user about what their movement is actually affecting.
This is with 0% energy:
This is with 100% energy:
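The nine-threshold revision amounts to bucketing the summed motion value into one of nine images instead of three. A minimal sketch of that mapping follows; the threshold values here are placeholders, not the ones in our patch.

```python
# Rough sketch of the nine-bucket energy mapping: eight cut points
# divide the motion sum into nine image indices, 0 (0% energy)
# through 8 (100% energy). Values are placeholders.

THRESHOLDS = [100, 200, 400, 800, 1600, 3200, 6400, 12800]

def image_index(motion_sum):
    """Map the white-pixel sum to an image index in 0..8."""
    for i, cut in enumerate(THRESHOLDS):
        if motion_sum < cut:
            return i
    return len(THRESHOLDS)  # at or above the top cut: maximum energy

print(image_index(0))      # 0 -> the 0% energy image
print(image_index(50000))  # 8 -> the 100% energy image
```

With three thresholds the player jumped between a few coarse states; nine cuts make each increment of effort visibly change the output.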
Some of the research we added this week came from Christa Sommerer and the media-art installations she has done involving plant growth.
Sunday, March 1, 2009
Week 6
Sketch Two Completion
Sketch Two was completed this week. Our project is titled 'Energy Over Camera': depending on how much movement is in front of the camera, a different image is displayed. The meaning of the project can vary widely. One idea we have is to show how human activity can harm the environment, with a tree dying in proportion to the amount of movement detected. On the positive side, we plan to add a reason for inputting the energy in the first place; for example, with enough movement and energy, a seed could grow into a flourishing plant.
How it works:
-The camera feed is run through a difference threshold. The frame-to-frame difference is summed by counting the white pixels and storing the total in an integer, and a different image is displayed depending on that integer. The sum is calculated with a third-party library called cv.jit.
What it needs:
-A more meaningful reason to input the energy.
-Feedback for the users so they know their movement is having an effect (done via a simple integer or percentage displayed on the screen).
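For readers unfamiliar with cv.jit, here is a rough stand-in in plain Python for what the patch computes. The real patch thresholds and sums Jitter matrices; this sketch just counts sufficiently-changed pixels between two frames represented as flat lists of grayscale values.

```python
# Rough stand-in for the patch's motion measurement: threshold the
# per-pixel difference between two frames, then count the "white"
# (changed) pixels into a single integer that drives which image
# is displayed.

def motion_sum(prev_frame, curr_frame, threshold=30):
    """Count pixels whose change between frames exceeds the threshold."""
    return sum(
        1
        for a, b in zip(prev_frame, curr_frame)
        if abs(a - b) > threshold
    )

still = [10, 10, 10, 10]
moved = [10, 90, 95, 10]
print(motion_sum(still, moved))  # 2 pixels changed enough to count
```

That single integer is exactly the value the "What it needs" list proposes showing back to the user as a number or percentage.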
Slides: PDF
Patches for PC and Mac: RAR
Sunday, February 22, 2009
Week 5
Learning MaxMSP & Jitter
This was our first week learning Max/MSP, and I found it to be quite an interesting programming environment. I unfortunately started a little late, since my health left me out of commission over reading break, but one meet-up with Greg was the most beneficial part of my week and got me back on track. I initially started with Max 4.7 but soon moved up to Max 5, which allowed me to work with Jitter. I also installed a library Greg suggested called cv.jit.
Sketch Two: Brainstorming
Our project is almost complete. Briefly: the webcam takes input by detecting the frame difference (shown as changes in white pixels). The white pixels are summed into an integer, and a different output image is displayed according to that integer. For example, low movement shows an image of a man standing still, medium movement shows him smiling, and heavy movement shows him laughing. We can think of a number of practical applications for this program, and we will show different examples when we present next week.
Saturday, February 7, 2009
Week 4
Sketch One: The Scouter
For details regarding the Scouter, please see the previous week's post or the attached PDF.
Building the Scouter proved to be a difficult but rewarding experience. Mounting a light-blocking shell to the outside of a pair of sunglasses and wiring LEDs and sensors to its inside and outside wasn't as easy as we thought. In the end the wires became a little messy and the Arduino board became unstable. The distance and proximity sensors worked as we had desired, however, and the device performed exactly as intended (with the exception of one LED not being as bright as we wanted).
Arduino Code:
int sensorPin = 2;   // analog input for proximity sensor 1
int sensorPin2 = 3;  // analog input for proximity sensor 2
int ledPin = 13;     // LED driven by sensor 1
int ledPin2 = 12;    // LED driven by sensor 2
int val = 0;         // latest reading from sensor 1
int val2 = 0;        // latest reading from sensor 2

void setup() {
  pinMode(ledPin, OUTPUT);   // declare both LED pins as outputs
  pinMode(ledPin2, OUTPUT);
  Serial.begin(9600);
  Serial.println("Systems On");
}

void loop() {
  val = analogRead(sensorPin);    // read proximity sensor 1
  Serial.println(val);
  val2 = analogRead(sensorPin2);  // read proximity sensor 2
  Serial.println(val2);

  if (val > 100) {
    Serial.println("Object Detected on Sensor 1");
    digitalWrite(ledPin, HIGH);   // turn sensor 1's LED on
  } else {
    digitalWrite(ledPin, LOW);    // turn it off
  }

  if (val2 > 100) {
    Serial.println("Object Detected on Sensor 2");
    digitalWrite(ledPin2, HIGH);  // turn sensor 2's LED on
  } else {
    digitalWrite(ledPin2, LOW);   // turn it off
  }
}
Slides:
SketchOneSlides.pdf
Photos:
Tuesday, January 27, 2009
Week 3
Sketch One Progress Report:
Our idea for Sketch One is a headset to aid blind people who are not 100% blind and can still detect minute changes in light. The headset is stylish and can take on many different appearances, but it is also capable of being discreet.
Inside the visor are lights which illuminate the INSIDE of the visor (while also showing on the outside that they are lit); on the outside are sensors which are triggered by proximity to other objects (these could be light sensors as well).
The simplest explanation: when an obstruction is detected, the lights in the visor turn on, indicating to the wearer that an obstruction is ahead. This is a hands-free aid for blind people that builds on the changes in light they can already detect. Buttons on the outside of the visor can turn the lights on and off voluntarily. We are still brainstorming ideas as to why this would be a beneficial addition.
The visor is built out of lightweight foam and the display is filtered through red cellophane. This prototype design draws on video game and comic characters such as Cyclops from X-Men or Godot from Phoenix Wright.
Similar media devices to aid the blind will be researched.