week.7

MIDTERM

PRODUCT

1.) Out of ContextCam >>> Capture triggered by sound. A small camera attached to my face captures video triggered by a sound threshold. Talk and it records. Shut up and it stops. I am interested here in capturing meaningful (or not) moments throughout a day. Compile the footage and we see and hear one side of a conversation completely out of context. What are the implications for generating creative content? Will Richard Foreman want one? What other applications does this system lend itself to? I’ve experimented with the camera trained just on the mouth, as well as looking outward from the face. The video shows a compiled example of each.
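The trigger logic itself is simple: read the microphone level every frame and save frames only while it stays above a threshold. Here is a minimal Processing (Java) sketch of that idea, assuming the standard video Capture library and the Minim audio library; the threshold value and output folder are placeholders, not the actual Out of ContextCam code.

```java
// Minimal sketch of sound-triggered capture.
// Assumes Processing's video library and the Minim audio library;
// the threshold and output path are placeholders.
import processing.video.*;
import ddf.minim.*;

Capture cam;
Minim minim;
AudioInput mic;

float threshold = 0.05f;   // RMS mic level above which we treat the wearer as talking

void setup() {
  size(640, 480);
  cam = new Capture(this, 640, 480);
  cam.start();
  minim = new Minim(this);
  mic = minim.getLineIn(Minim.MONO, 512);
}

void draw() {
  if (cam.available()) cam.read();
  image(cam, 0, 0);

  // Gate recording on the microphone's RMS level:
  // talk and it records, stop talking and it stops.
  if (mic.mix.level() > threshold) {
    saveFrame("outofcontext/frame-######.png");  // frames get compiled into video later
  }
}
```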

2.) Requiem JumpCut >>> Using the JumpCutCam in Java, which inherits from Dan O’Sullivan’s Motion Detector Cam, I’ve algorithmically edited the film “Requiem for a Dream” down to only the first frames following a cut. Certain patterns emerge, such as strobing between two characters in particular scenes, as well as the film’s quick-cutting technique for portraying drug use. I hope to further refine the Cam using “seed planting” video-tracking techniques, which are much less computationally expensive. I will also explore the juxtaposition between the rapid, constant cutting in this film and a slower-paced film such as Roy Andersson’s “Songs from the Second Floor.”
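The cut detection underneath can be approximated with plain frame differencing: when the average pixel difference between consecutive frames spikes past a threshold, treat it as a cut and keep that first frame. Below is a rough Processing (Java) sketch of that approach, not the actual JumpCutCam code; the movie file name and threshold are placeholders.

```java
// Rough sketch of jump-cut detection by frame differencing.
// Assumes Processing's video library; the movie file and threshold are placeholders.
import processing.video.*;

Movie film;
int[] prevPixels;
float cutThreshold = 60;  // mean per-pixel brightness difference that counts as a cut
int cutCount = 0;

void setup() {
  size(640, 360);
  film = new Movie(this, "requiem.mov");  // placeholder file name
  film.play();
}

void movieEvent(Movie m) {
  m.read();
}

void draw() {
  image(film, 0, 0);
  film.loadPixels();

  if (prevPixels != null && film.pixels.length == prevPixels.length) {
    // Average absolute brightness difference against the previous frame.
    float diff = 0;
    for (int i = 0; i < film.pixels.length; i++) {
      diff += abs(brightness(film.pixels[i]) - brightness(prevPixels[i]));
    }
    diff /= film.pixels.length;

    // A spike in difference is treated as a cut; keep only that first frame after it.
    if (diff > cutThreshold) {
      saveFrame("jumpcut/cut-" + nf(cutCount++, 4) + ".png");
    }
  }
  prevPixels = film.pixels.clone();
}
```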