Project 1 – Place

When preparing for the project about ‘place’ at Fitzroy Gardens, I could only bring an underwater camera; my sister was using the proper camera for the day.

I hoped there would be ponds, and there were! I wanted to look at this public space and find the bits that were a bit more private, a bit more secluded, the parts unreachable for the everyday person (unless you read your book in a pond). I was looking for a perspective that wasn’t often seen.

I was also specifically looking for wildlife (fish, bugs, tadpoles), and I saw none, which made me very sad. But I persevered and looked for places with interesting shadows and diverse textures.

Once I got to a computer and could properly see the footage, I realised most of it was out of focus, which happens when your arms are under the water trying to film something you can’t see.

I still liked the quality of the videos; it kept an otherworldly feeling and showed cool micro things.

I didn’t have time to edit the videos properly before the group tutorial. It ended up being just a chronological run of my favourite bits I had filmed. Not what I wanted to present, but it was all I was able to do at that time. Still, I got some good observations:

-the shaky camera made it seem naturalistic and raw

-the soft focus shots highlighted changes in tone throughout the clip

These are interesting observations, but I can’t help but think my peers were just being kind. However, I do agree with them… sort of.

Entire footage captured on that day and what I presented to the group tutorial

Because I couldn’t record any sound underwater, I wanted to create a soundtrack to create a more cohesive and emotional theme between the clips. I wanted something a bit plunkier; I usually do dramatic drones that change over long periods.

So I went with an organic-sounding instrument in the first half, almost to ease the listener in, then changed it up in the second half by adding a synthesizer to make it feel more alien and otherworldly.

I then cut up my favourite clips from the day and separated them by above water and below water. I matched the shots with the music, changing shot lengths to create variety and keep pace, especially in the second half, where the intent was that the viewer felt like they were thrown into this otherworld. A place that is difficult to see, shrouded in obscurity, visible only to those brave enough to delve into the depths and murk of a pond.

I was delighted with the outcome, and the music was better than I expected of myself.

What I think was successful.

  • The music was catchy and had a melody that made sense with the footage.
  • The music had diverse sounds that still worked (mostly) with each other. Which is something I have struggled with in my music practice.
  • Varying the length of the shots meant that the repetition of similar shots wasn’t too boring, and it kept a pace that matched the music in a different way.

What could be improved upon.

  • The shakiness of the shots meant that they were often moving in sporadic directions that were hard to control with the quick cuts.
  • The out-of-focus footage meant that a lot of exciting shapes were lost.
  • The colours of the shots in the second half were very same-samey. If I could have shot in different ponds with different plant types, it would have helped. It was also probably because it was a very overcast day, so not much light was getting into the pond.
  • I just repeated the melody twice, which I am mostly okay with because it’s okay to repeat when you’re onto something good. I could’ve added chords underneath it to create more musical movement and emotional swelling, but that is a gap in my music education. So I will have to do some YouTubing.

Project 2

Materiality

I was given ‘Cats and Dogs’ and ‘Through the Past, Darkly’ by the Rolling Stones.

My original intent with this video was to take it apart and reorder it using only clips with yellow and/or blue, destroying the linearity and breaking the footage up like a paint palette. I intended to make a seamless transition from blue to yellow. After putting them all together, the theme didn’t come through clearly enough. After talking with Sofi, she suggested I look at thematic elements and mentioned that polar bears are quite a powerful symbol. So I attempted to use that to talk about animal cruelty in everyday life, and how it is not as distant as a zoo in China (where the second polar bear clip is from). This video is as close to that concept as I could manage, and I didn’t like it at all; it was incredibly boring, disjointed, and without a clear theme. I needed to rethink it. I needed a…

BRainSTorM

  • I could make the animals look like video game characters and give them different statistics and abilities.
  • I could make animations of the pets running.
  • I could print these frames on paper, cut them out and take a photo with a different, real-world background. (I didn’t do this because cutting all those frames out of paper would take too long).

Plan of action

I decided that cutting the images out was an interesting take on materiality, transforming the material back into something tangible and then back to film. I then decided that literally cutting them out of paper and taking photos would take a bit too much time, but cutting them out virtually was a possibility.

The theme of removing the characters from the original footage and transforming them digitally still held the concept of altering the essence of the material and the director’s original intent, which I thought was cool.

I took screenshots of each frame and then sent them to my iPad, where I cut out each frame using Procreate. I then layered different backgrounds behind them.

I chose for the cat to go through the Australian bush into Minecraft, into the nether (an area in Minecraft that you can only access by going through a portal) and out again.

I added the cut-out cat back into the original film and added the Rolling Stones music back into it to further the strangeness.

What I feel was successful.

  • The cat coming out and transforming was really unexpected, and I feel it throws the linearity of the piece away.
  • I think how I cut out the cat with chunky legs looked good.

What I could work on

  • The timing of the music was really jolting, and I think it ruined the piece, to be honest. After I presented the piece, I figured out how to loop the cat, which was the reason for the original weird timing.
  • The length of the shots for the backgrounds was probably too quick; I could also fix this by looping the cat to make the shot longer.
  • I also would have re-recorded the VHS, because I think all the fast-forwarding was distracting and didn’t necessarily help with the concept. Unless I moved the cat through the fast-forwarding; that would be cool.

Project 3 – Colour

I was very lucky to be given dark blue as my colour! It is one of my favourite colours, a colour I intend to use in my art, and it is historically potent! Something you need to know about me is I love whales.

[GIF: giant underwater whale]

I recently watched a documentary about how intelligent sperm whales specifically are, and how they have a very complex language system. Young whales babble for the first three years of their life and then, with their mother’s guidance, start singing the same songs as their family. I thought this was awesome. And apparently, researchers are trying to create an AI (similar to ChatGPT) that will help with decoding what they are saying, by collecting thousands of hours of sounds, collating them into a usable format, and then looking at usage rates of words across languages. For example, every language has its version of the word mum, which is used about the same amount across all human cultures. They intend to extrapolate these statistics to get as close to understanding sperm whales as we can.

I love the idea of using AI, this new and technological instrument, to discover the secrets of whales and of language deciphering itself. We are using an intelligence we created to try and understand the intelligence of a different species.

This is why I chose to make my video using AI.

How I did it

  • Step 1: Download the file locator and converter from Hugging Face (this took about 6 hours)
  • Step 2: Download the free Stable Diffusion code (a visual AI system)
  • Step 3: Upload both into Colab, Google’s browser-based coding notebook
  • Step 4: Fill out all the system settings, including frame rate, aspect ratio and camera movement
  • Step 5: Trial a prompt. I used ‘sperm whale, in the distance, underwater, swimming towards camera, ocean’
  • Step 6: Wait 8 seconds for each frame to render.

First frame

Last frame

  • This was not how I wanted it to look. I got the aspect ratio wrong, and the imagery was weird and all above water, which is not what I wanted.
  • I found out that when you put the word prompt in at the start, the imagery slowly devolves, and because each frame infers the next, it still looks cohesive. I think this inherent entropy is weird but also beautiful.
  • Step 7: Rewrite the word prompt. I removed ‘ocean’ and the ‘sperm’ from sperm whale because I wanted it to be underwater, and I think the sperm bit made the whales look weird and unrecognisable. Then I rendered it again.
  • Step 8: The photos looked great. I then uploaded all 500 of them to iMovie and timed them so the frame rate would work. I also matched them to the music.
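For a rough sense of scale, the render and playback times implied by the steps above can be sanity-checked with some quick arithmetic. The 500 frames and the roughly 8 seconds of rendering per frame come from the steps themselves; the playback frame rate here is just an assumed placeholder, since the actual rate I used in iMovie isn’t stated:

```python
# Back-of-envelope timing for the render described above.
# 500 frames and ~8 s/frame come from the steps; the 12 fps
# playback rate is an assumption for illustration only.

FRAMES = 500
RENDER_SECONDS_PER_FRAME = 8
PLAYBACK_FPS = 12  # hypothetical playback rate

render_minutes = FRAMES * RENDER_SECONDS_PER_FRAME / 60
video_seconds = FRAMES / PLAYBACK_FPS

print(f"total render time: ~{render_minutes:.0f} minutes")
print(f"video length at {PLAYBACK_FPS} fps: ~{video_seconds:.0f} seconds")
```

So even at 8 seconds a frame, the full 500-frame render ties up the machine for over an hour, which matters once you start re-rendering after every prompt tweak.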

What I think was successful.

  • The whole piece was cohesive, and I think the music matched well with the pace.
  • I love how, as the film went along, the whale imagery distorted into swaths of colour.

What could be improved upon.

  • I could’ve put different word prompts timed to different frames so that the images would change whilst still referencing the initial prompt. For example: the whale looks left, the whale blinks, the whale spins around. This would’ve taken hours of rendering and all the space on my hard drive, but it would have looked great. I also would have lost that entropy in the last half: if there are more prompts in a shorter timeframe, there is less time for each frame to devolve.
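The timed-prompt idea above maps roughly onto how Deforum-style Stable Diffusion notebooks schedule prompts: a mapping from a start frame to a prompt, where each prompt stays active until the next keyframe. Here is a minimal sketch of that lookup; the frame numbers and prompt wordings are made up for illustration, not taken from my actual render:

```python
# Deforum-style prompt schedule: start frame -> prompt.
# Each prompt stays active until the next scheduled keyframe.
# Frame numbers and prompts below are illustrative only.
animation_prompts = {
    0: "whale, in the distance, underwater, swimming towards camera",
    150: "whale looks left, underwater",
    300: "whale blinks, underwater",
    400: "whale spins around, underwater",
}

def prompt_for_frame(frame: int, schedule: dict[int, str]) -> str:
    """Return the prompt active at the given frame number."""
    active = ""
    for start in sorted(schedule):
        if start <= frame:
            active = schedule[start]
        else:
            break
    return active

print(prompt_for_frame(200, animation_prompts))  # second prompt is active
```

This also shows why more keyframes would suppress the entropy I liked: each new prompt re-anchors the imagery, so the frames between keyframes have less time to drift away from the text.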