Final DECO2606 Blogpost

November 13, 2011

The final blogpost. Here we go!

——————————————————————————–

Process . . .

Odd to think it’s been a semester since the concept of ‘Stomping Grounds’ grabbed ahold of me. It’s fun to imagine the infinite paths this project could’ve taken, but I’m grateful for the steps taken along the way.

Throughout this course, I have maintained this blog documenting my progress. (Quite a rewarding commitment).

From first flight:

to Strings of Ribbon:

and all the spaces:

and inspirations in between.

A final brief took form, generating starfields, fractals, and other odd things.

And it has all since been culminating:

to reach a nice casual game.

——————————————————————————–

. a casual immersive game .

~

This casual immersive game is designed for play in social spaces. The simulations demonstrated here would be set up in an open public area, with one portion (the 3D Kinect view) displayed as a large, immersive (potentially panoramic) projection. Adjacent to this would be a multi-touch table which, like the Kinect, can also promote collaborative gameplay.

These platforms can encourage strangers, both passers-by and veterans, to mutually engage with their environment in a casual immersive game.

~

( Want to poke around at the Java source code? Download here. )

——————————————————————————–

. Яeflection

DECO2606 Real Time Multimedia is an awesome class. Learnt a lot. One of the few projects I’ve thought could realistically be extended after my degree concludes. Perhaps as a tablet game, or an avenue of research into collaborative interactions in social spaces.

Though this is the last DECO2606 blogpost, it is certainly not the end of the project. After improvements and demos, the components and knowledge gained will be reused extensively for other projects (recursive elements like fractals are looking more and more interesting by the day as my next venture).

Overall a very fun and enjoyable class.

Cheers.

———————

P.S. Also, please check out my other classmates’ sites; there’s some pretty amazing work!

Kevin Chan, Sanjay Dutt, Kendrick Khoo, Stella Kim, Mark C Mitchell, Ashleigh Sutton, Garry Taulu, Spencer Walden, David Wallis, William Yong, Adam Younis, David Zheng, Tim Zapevalov }


Two Connected Laptops

November 13, 2011

So, this is to demonstrate that I got UDP working, synchronizing the states of two components of a casual game I’ve been working on.
One would run on a multitouch table, the other on a large projector screen with input from a Kinect.
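For the curious, the gist of the syncing looks something like the minimal sketch below, assuming the hypermedia.net UDP library for Processing; the ports, names, and message format are illustrative rather than the project’s actual code.

```java
// Minimal state-sync sketch (illustrative; the real project code differs).
// Requires the 'UDP' library (hypermedia.net) for Processing.
import hypermedia.net.*;

UDP udp;
String peerIp   = "127.0.0.1";  // the other laptop (localhost while simulating)
int    peerPort = 6100;         // port the other sketch listens on

float peerX, peerY;             // last known state received from the peer

void setup() {
  size(400, 400);
  udp = new UDP(this, 6000);    // this sketch listens on port 6000
  udp.listen(true);
}

void draw() {
  background(0);
  // send our state every frame; UDP is fire-and-forget, so lost packets are tolerated
  udp.send("POS," + mouseX + "," + mouseY, peerIp, peerPort);
  ellipse(peerX, peerY, 20, 20);  // draw the peer's last known position
}

// called by the UDP library whenever a datagram arrives
void receive(byte[] data, String ip, int port) {
  String[] parts = split(new String(data), ',');
  if (parts.length == 3 && parts[0].equals("POS")) {
    peerX = float(parts[1]);
    peerY = float(parts[2]);
  }
}
```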

( To any readers: Unfortunately, it doesn’t look like I’ll be able to access the facilities to showcase these apps running in their intended environment, so I will make do with simulations for now and upload one when I get a chance. )

Table Fail. Kinect View.

November 12, 2011

Ok, for the past few weeks I’ve been trying to get the touch table working with the program again, but haven’t had the chance for a proper sit-down to figure it out. I just spent three hours with it and have narrowed down the problem space. I’ve ruled out driver and code faults. I don’t believe it’s the USB connecting the laptop to the touch overlay (messages are still being delivered, albeit at default null / empty values, even when it’s powered off). I believe it to be the power supply’s fault, primarily because the current behaviour is the same ‘with’ and without power. However, this may be contrary to the stories of other people who have gotten the table working for them in the past week. For now, I will be ignoring the table due to time constraints, though it’d be nice to put the program back on it sometime in the future.

Meanwhile, I have also been working on the Kinect view of the application. The strings of boids seen in the previous post have now been transformed into their 3-dimensional representations, as have their ‘shootable’ enemies!

Currently the network runs off the same computer, but it should work nicely (perhaps with fewer elements) over two when required (this has also been tested in smaller cases).

Next on the agenda:

  • Cleaning up the code so we can ‘shoot’ enemies properly in 3D!
    (Individual identification of entities such as Enemies is required).
  • Shooting enemies with the Kinect.
  • Aesthetic touchups.

UDP Network Established

November 11, 2011

After trialling a few different methods of networking in Processing, a UDP library was used.

UDP was used over TCP here as it is faster. It was found that any data set should ideally be encoded to minimize errors. However, errors and missed data still occur with UDP (which is something I will have to adjust for).
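As a rough illustration of what ‘encoding the data set’ can mean here (not the project’s actual format): pack one frame of boid positions into a single delimited string, and discard any field that doesn’t parse back cleanly on the receiving end.

```java
// Illustrative encoding/decoding helpers for boid positions sent over UDP.
// One frame is packed as "x:y;x:y;..."; corrupt fields are simply dropped.

String encodeBoids(ArrayList<PVector> boids) {
  StringBuilder sb = new StringBuilder();
  for (PVector b : boids) {
    sb.append((int) b.x).append(':').append((int) b.y).append(';');
  }
  return sb.toString();
}

ArrayList<PVector> decodeBoids(String packet) {
  ArrayList<PVector> out = new ArrayList<PVector>();
  for (String field : split(packet, ';')) {
    String[] xy = split(field, ':');
    if (xy.length != 2) continue;                             // malformed field
    float x = float(xy[0]);                                   // float() gives NaN on bad input
    float y = float(xy[1]);
    if (Float.isNaN(x) || Float.isNaN(y)) continue;
    if (x < 0 || y < 0 || x > width || y > height) continue;  // out-of-range junk
    out.add(new PVector(x, y));
  }
  return out;
}
```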

Here are some screenshots of a second program drawing the data it received from the original multitouch table program, to nice effect.

 ( Drawing boid positions. )

( You can see the errors in UDP transmission, and hence in the interpretation of the encoded co-ordinate data, via the white dots stuck at the top of the screen. )

( Showing enemy positions over time as well, getting gobbled up by the boids. )

Shooting in 3D!

November 11, 2011

So shooting in 3D has always been a problem for me. I have always thought these types of problems (anything to do with transposing between multiple dimensions) had to be approached manually. Luckily for me, Processing can translate its 3D co-ordinates to screen co-ordinates with screenX() and screenY()!

It was then a simple matter of detecting objects close to the cursor by their relative viewability and z-depth. As such, direct selection of objects could be obtained, as well as the ability to ‘auto-target’ them.
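Roughly, the selection logic boils down to the sketch below (illustrative, not the project’s actual code): project each object’s position with screenX()/screenY() while the scene’s 3D transforms are still applied, then pick whatever lands within a small radius of the cursor, preferring the closest in depth.

```java
// Illustrative 3D picking via screenX()/screenY()/screenZ(). Call this inside
// draw(), with the same transforms applied that were used to draw the objects.
int pickNearest(PVector[] targets, float radius) {
  int   best  = -1;
  float bestZ = Float.MAX_VALUE;
  for (int i = 0; i < targets.length; i++) {
    PVector t  = targets[i];
    float   sx = screenX(t.x, t.y, t.z);  // projected screen coordinates
    float   sy = screenY(t.x, t.y, t.z);
    float   sz = screenZ(t.x, t.y, t.z);  // depth, used to prefer the closest hit
    if (dist(mouseX, mouseY, sx, sy) < radius && sz < bestZ) {
      bestZ = sz;
      best  = i;
    }
  }
  return best;  // index of the picked object, or -1 for no hit
}
```

A small radius gives direct selection; a larger one gives the ‘auto-target’ behaviour.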

 ( Auto-selection (Green) of two objects due to their positions and relative buffer sizes. )

 

Kinect Basic Implementation Demo

November 11, 2011

Did a quick technical demo of how I had set up the Kinect. (This date was also the first time that I couldn’t figure out how to get the multitouch table to work.)

Anyway, here’s a (clickable) animated GIF of the reactions.

It was a simple technical demo, basically allowing users to sync in and ‘catch’ fireflies with their hands.
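For reference, listening to OSCeleton from Processing looks roughly like the sketch below. It assumes the oscP5 library, OSCeleton’s default /joint messages on port 7110, and its default normalised 0–1 coordinates, so treat the details as assumptions rather than the demo’s actual code.

```java
// Illustrative OSCeleton listener (assumed defaults: oscP5, port 7110,
// "/joint" messages carrying: joint name, user id, x, y, z).
import oscP5.*;
import netP5.*;

OscP5 osc;
PVector leftHand  = new PVector();
PVector rightHand = new PVector();

void setup() {
  size(640, 480);
  osc = new OscP5(this, 7110);  // OSCeleton's default output port
}

void draw() {
  background(0);
  // map the (assumed) normalised joint coordinates to the window;
  // a firefly 'catch' test would compare these against firefly positions
  ellipse(leftHand.x  * width, leftHand.y  * height, 20, 20);
  ellipse(rightHand.x * width, rightHand.y * height, 20, 20);
}

void oscEvent(OscMessage m) {
  if (!m.checkAddrPattern("/joint")) return;
  String joint = m.get(0).stringValue();
  PVector p = new PVector(m.get(2).floatValue(), m.get(3).floatValue(), m.get(4).floatValue());
  if (joint.equals("l_hand")) leftHand  = p;
  if (joint.equals("r_hand")) rightHand = p;
}
```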

The demo revealed that the implementation did not scale well for an environment populated by many people (I did do some initial testing, but this was only up to four people, and I believed I had caught all the exceptions in Java). However, the error comes from OSCeleton crashing when there are too many objects to track (producing a “segmentation fault” message).

Oliver previously had this problem and transitioned to SimpleOpenNI as it was more stable. I may do the same, given time.

Fractal Experimentation

November 11, 2011

So I’ve been experimenting with simulating moving fractals, zooming into them infinitely, in a faux 3D space.

I found the Diamond-Square algorithm, implemented it, and adjusted it so that a faux sense of Z-depth would be generated (by having the variance values increase closer to the centre of the screen). The following images were generated.
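For anyone wanting to poke at it, a bare-bones Diamond-Square sketch in Processing looks like the one below (illustrative only; the version described above additionally boosts the variance towards the centre of the screen to fake the Z-depth).

```java
// Minimal Diamond-Square heightmap (illustrative, not the original project code).
// Grid must be (2^k)+1 on a side; 'rough' controls how quickly the random
// variance shrinks with each subdivision.
int N = 257;
float[][] h = new float[N][N];
float rough = 0.55f;

void setup() {
  size(257, 257);
  // seed the four corners
  h[0][0] = h[0][N-1] = h[N-1][0] = h[N-1][N-1] = 0.5f;
  float range = 0.5f;
  for (int step = N - 1; step > 1; step /= 2) {
    int half = step / 2;
    // diamond step: centre of each square = average of its 4 corners + noise
    for (int y = half; y < N; y += step)
      for (int x = half; x < N; x += step)
        h[x][y] = (h[x-half][y-half] + h[x+half][y-half]
                 + h[x-half][y+half] + h[x+half][y+half]) / 4.0f + random(-range, range);
    // square step: midpoint of each edge = average of its in-grid neighbours + noise
    for (int y = 0; y < N; y += half)
      for (int x = (y + half) % step; x < N; x += step)
        h[x][y] = avgNeighbours(x, y, half) + random(-range, range);
    range *= rough;  // shrink the variance each pass
  }
  // draw the heightmap as greyscale
  loadPixels();
  for (int y = 0; y < N; y++)
    for (int x = 0; x < N; x++)
      pixels[y * width + x] = color(constrain(h[x][y], 0, 1) * 255);
  updatePixels();
}

// average the up/down/left/right neighbours that fall inside the grid
float avgNeighbours(int x, int y, int half) {
  float sum = 0;
  int   n   = 0;
  if (x - half >= 0) { sum += h[x-half][y]; n++; }
  if (x + half <  N) { sum += h[x+half][y]; n++; }
  if (y - half >= 0) { sum += h[x][y-half]; n++; }
  if (y + half <  N) { sum += h[x][y+half]; n++; }
  return sum / n;
}
```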

Unfortunately, the goal was a constantly moving, animated fractal, which this method wouldn’t work for (especially at this resolution): the generation took too long and could not be controlled for a consistent-looking animation.

As such, the diamond-square algorithm was re-implemented with an attempt to move 2D pixels outwards inverse-exponentially to simulate 3D movement. However, the contrast in this is not consistent.

Use of animated noise was also attempted, but the results were too ‘smooth’, looking more like hippy metal rings than gases.
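The animated-noise attempt amounts to something like this tiny sketch (illustrative only): Perlin noise sampled with time as a third dimension, which is exactly why the result drifts so smoothly.

```java
// Minimal animated Perlin-noise field (illustrative only).
float t = 0;

void setup() {
  size(256, 256);
}

void draw() {
  loadPixels();
  for (int y = 0; y < height; y++)
    for (int x = 0; x < width; x++)
      pixels[y * width + x] = color(noise(x * 0.02f, y * 0.02f, t) * 255);
  updatePixels();
  t += 0.02f;  // advance the animation through the noise field
}
```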

For now I have pushed fractal experimentation aside, but it’ll be useful for future projects.

 

 
