06 August 2011

This Blog is on Vacation Until Further Notice

Dear Quantum Cineastes,

I have decided to stop updating this blog for the foreseeable future. While I may revive posting here at some point, now that I have a blog on my personal website it feels redundant to keep posting in both places. This decision is primarily motivated by convenience.

I am still passionately committed to continuing the theoretical and practical dialogue about the future of cinema, so be sure to subscribe to gabrielshalom.com for the latest posts about Quantum Cinema.

Thanks for reading!
Gabriel

23 November 2010

3D Video Capture with Kinect vs Neato Robotic Vacuum



Let's see this done with two Kinects and some kind of interpolation algorithm, so that we get full-on volumetrics! That, or trump the whole Kinect thing by hacking the XV-11 Lidar!

14 November 2010

Object Recognition using Kinect on the PC



If this kind of thing is possible with a live video stream, seamless hardware and software integration should be possible in the future: either the camera recognizes objects and writes semantic metadata into the video while recording, or the footage could be processed after it has been recorded.
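To make that concrete, here is a rough sketch of what time-coded semantic metadata might look like and how downstream tools could query it. The field names are my own invention for illustration; they don't correspond to any existing standard.

    // Hypothetical time-coded object metadata, whether written by the camera
    // at record time or generated by post-processing the footage.
    var objectTrack = [
      { label: "chair",  start: 0.0, end: 4.2,  confidence: 0.91 },
      { label: "person", start: 1.5, end: 12.0, confidence: 0.87 }
    ];

    // Any tool reading the metadata can ask which objects are visible at a
    // given moment in the video, regardless of how the track was produced.
    function objectsAt(track, time) {
      return track.filter(function (o) {
        return o.start <= time && time <= o.end;
      });
    }

    console.log(objectsAt(objectTrack, 2.0)); // both the chair and the person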

09 November 2010

One Step Closer to Universal EDL



Last week I attended the Mozilla Drumbeat Festival in Barcelona. It gave me the opportunity to collaborate with an amazing ad hoc team of people in the context of the Open Video Lab, chaordinated by Brett Gaylor and David Humphrey. Over the course of a two-day sprint, a big team of us collaborated on a demo of the popcorn.js javascript library that really shows off the potential beauty of web made movies. The vimeo video above is just a screen capture; for the live demo visit this page.

Photo by Samuel Huron BY-NC-ND
It was a very rewarding experience to contribute to the aesthetic and conceptual process. I enjoyed the challenge of conducting interviews in languages I don't speak, and of collaborating with the multilingual Xabier Cid on the editing process. I was honored to be able to address the audience at the "BEST of the FEST closing variety slam showcase" about the need for new approaches to film school in the face of scrum/agile approaches to storytelling.



What is great about the demo is how it uses time-coded metadata to retrieve live content from flickr and twitter in real time. It shows how, as we move towards an object-oriented moving image, we will continue to redefine what cinema is and what editing means. The tweets are aggregated from the #futureofeducation hashtag. The flickr photos that appear in the demo are retrieved based on timeline metadata that I approximated by putting dummy content (the blue events in the screenshot above) on the timeline to get a sense of a rough rhythm. I then gave a rough approximation of that timecode information to Berto Yáñez, the programmer who did much of the heavy lifting on the demo. Oscar Otero helped with the design of the page. Oscar, Berto and Xabier all work together at the Galician web company A navalla suíza.
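For anyone curious what the wiring looks like, here is a minimal sketch of the popcorn.js pattern the demo is built on. It assumes a video element with id "movie" plus empty containers with ids "photos" and "tweets" on the page, and that popcorn.js and its flickr and twitter plugins are loaded; the plugin option names are from memory of the early plugins, so check the popcorn.js documentation for the exact signatures.

    // Attach popcorn.js to the video element.
    var pop = Popcorn("#movie");

    // Time-coded metadata: between 10s and 20s, pull flickr photos by tag
    // into the #photos container.
    pop.flickr({
      start: 10,
      end: 20,
      tags: "barcelona,drumbeat",
      numberofimages: 4,
      target: "photos"
    });

    // Between 20s and 40s, show tweets aggregated from the hashtag in the
    // #tweets container.
    pop.twitter({
      start: 20,
      end: 40,
      title: "#futureofeducation",
      src: "#futureofeducation",
      target: "tweets"
    });

    pop.play();

The important thing is that the editorial decisions live as start/end timecodes in plain javascript, which is exactly the kind of data the dummy events on my timeline approximated.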

Photo by Homardpayette
This process, which involved swapping lots of data across computers via USB sticks, underscored the need for a Universal Edit Decision List (EDL). This was something I identified about a year ago as part of my rubric for open source cinema. The Universal EDL got discussed quite a bit during the video lab, and together with the amazing work that has already been done creating a web-based timeline interface with Universal Subtitles, it seems the seed of inspiration to take things a step further has been planted. I am very excited to have contributed to these developments towards an object-oriented open source cinema!
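To give a sense of what I mean, here is a purely hypothetical sketch of what a single entry in a Universal EDL could carry. None of these field names belong to any real standard; the point is only that an edit decision list expressed as plain, web-friendly data could travel between tools instead of between USB sticks.

    // One hypothetical Universal EDL event: where the source media lives,
    // where it lands on the record timeline, and the time-coded metadata
    // that should travel with the edit rather than with any one application.
    var edl = {
      project: "Open Video Lab demo",
      timebase: 25, // frames per second
      events: [
        {
          track: "V1",
          source: {
            url: "http://example.com/interview-clip.webm",
            "in": "00:01:12:05",
            "out": "00:01:19:20"
          },
          record: { "in": "00:00:00:00", "out": "00:00:07:15" },
          metadata: { hashtag: "#futureofeducation", flickrTags: "barcelona,drumbeat" }
        }
      ]
    };

    // Because it is plain data, the same list could be read by a browser-based
    // timeline, a desktop editing application, or a script.
    console.log(JSON.stringify(edl, null, 2));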

It would be great to see the names of all the participants in the workshop added to the demo. During the demo Laura Hilliger, David Humphrey and I put together a nice cloud-based credit concept for solving the dilemma of crediting multiple parties, each with multiple credits. Laura should have a rough list of names and roles, and anyone who is missing can use the #drumbeat #videolab hashtags on twitter to identify themselves, or comment on the video, so we can round everybody up.