All joints are now being saved, and I included a “dashboard” exposing everything I expect the conductor to be able to do (change colors and point size, select joints, and so on). It is now possible to do all of that without a second Kinect, making this a viable standalone application (not to mention a great help in debugging). We can toggle visibility on all joints and change their size and color individually. There’s a lot more we can do, of course! Plus, there’s a “conductor mode” checkbox (to be implemented next) that toggles whether a conductor is present or not.
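A minimal sketch of how per-joint dashboard state could be kept (the joint names and default values here are hypothetical, not the actual implementation):

```python
# Per-joint display settings, one entry per tracked joint.
# Joint names below are assumed examples, not the real SDK names.
DEFAULT = {"visible": True, "size": 5, "color": (255, 255, 255)}

def make_dashboard(joint_names):
    """One independent settings entry per joint, starting from the defaults."""
    return {name: dict(DEFAULT) for name in joint_names}

def toggle_visibility(dashboard, joint):
    """Flip a single joint's visibility without touching the others."""
    dashboard[joint]["visible"] = not dashboard[joint]["visible"]

dash = make_dashboard(["head", "hand_left", "hand_right"])
toggle_visibility(dash, "head")
```

Keeping each joint's settings in its own dict is what makes individual size/color changes cheap: the UI just writes into one entry.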
Here is the new UI!
I’m uploading the new version on Dropbox right now.
I have just merged my code into Danilo’s, so we now have a couple of improvements. Data is saved as a continuous stream (rather than being limited to what is currently on screen), and the filtering script also outputs all of the joints as separate *.obj files, which can be imported into Blender and the like!
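The per-joint export could look something like this sketch: each joint's positions over time are written as a Wavefront .obj point cloud (one `v x y z` vertex line per frame). The function name and data layout are assumptions for illustration, not the actual filtering script:

```python
import os

def export_joints_as_obj(joint_tracks, out_dir):
    """Write each joint's position track as a separate .obj file.

    joint_tracks: dict mapping a joint name to a list of (x, y, z) tuples.
    Each file contains only vertex lines, which Blender imports as a
    point cloud (hypothetical sketch of the export step).
    """
    os.makedirs(out_dir, exist_ok=True)
    for name, positions in joint_tracks.items():
        path = os.path.join(out_dir, f"{name}.obj")
        with open(path, "w") as f:
            for x, y, z in positions:
                f.write(f"v {x} {y} {z}\n")

# Example with made-up data: two frames of one joint.
tracks = {"hand_right": [(0.1, 0.2, 0.3), (0.15, 0.25, 0.3)]}
export_joints_as_obj(tracks, "joints_obj")
```

One file per joint keeps each track independent, so a single joint can be pulled into Blender without parsing the whole recording.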
We’re testing the first prototype of MotionDraw with our new friends Abbie and Tony from the Dance Department.
Troels and I went on a mission to interview UCSD dancers about MotionDraw. We asked the following questions:
1) What are some examples of dance that used technology that you liked? Disliked?
2) Have you ever tried working with computing and projection with dance?
3) Could you see benefits to interacting with a computer while dancing?
a. In real time vs. in post-production
4) Would you prefer to control a device that changed a projection, or would you prefer a second “conductor” to alter the atmosphere?
a. In general, as a dancer, do you prefer set choreography or more open improvisation?
We learned a lot and will post answers soon!!
I’m testing some ways of drawing using a 2D canvas with some effects.
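One classic 2D-canvas effect is a fading trail: each drawn point loses opacity every frame and is dropped once invisible. This framework-agnostic sketch shows the bookkeeping only (the fade rate is an assumed value; a real version would render the points via Processing, an HTML5 canvas, or similar):

```python
# Fading-trail effect: points decay in alpha each frame.
FADE = 25  # alpha lost per frame (assumed value for illustration)

def add_point(trail, x, y):
    """Register a newly drawn point at full opacity."""
    trail.append({"x": x, "y": y, "alpha": 255})

def fade_step(trail):
    """Advance one frame: reduce every point's alpha, keep only visible ones."""
    for p in trail:
        p["alpha"] -= FADE
    return [p for p in trail if p["alpha"] > 0]

trail = []
add_point(trail, 10, 20)
for _ in range(3):
    trail = fade_step(trail)
# after 3 frames: alpha = 255 - 3*25 = 180
```

Storing alpha with each point (rather than clearing the canvas) lets several effects share the same point list, which is handy when experimenting with different looks.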