CONTEXTUAL LANDSCAPES

I’m interested in how we can elevate our connection with a landscape through people’s contextual information. I’m exploring this by using people’s emotional and physiological data from an important moment in their life as a lens to create a visual effect on the landscape around them. The works use this data to visually interact with the landscape, merging human context with the natural setting.

I use accessible tools for data acquisition, including heart rate, movement and skin conductance from mobile phones and smartwatches, as well as brain data and emotion detection through EEG brain-scanning headsets and emotion-AI cameras.

The works are generally a collaboration with the subject and their data, capturing a moment in time that is special to them. These have included an EEG brain scan of someone practising their wedding vows, the heart rate data of two people at the moment of their marriage proposal, and the emotion tracking of a child listening to The Avalanches’ “Frontier Psychiatrist” for the first time.

The physiological data is converted to a flight path for a drone to fly within the landscape “stage”. The drone is fitted with lighting that paints a line through the space and illuminates the surrounding environment in response to the physiological data. Long exposure photography captures the result as a contextual and artistic landscape. The data adds unexpected layers of context to our perspective of the natural setting and the subject affecting it.

Process

Creating these works is a ridiculous challenge, and I freaken love it. It’s a combination of digital photography, drone piloting and biometric data capture. On top of this is the requirement to get away from city lights, meaning four-wheel driving to remote beaches. Just terrible, really.

I use a range of data from people, from technically challenging tools like EEG brain-scanning headsets to simpler tools like heart-rate monitoring watches (e.g. the Apple Watch), facial tracking, and even voice waveform data.

For EEG, I use Emotiv’s brain-scanning sensors and software to measure people’s brain activity. I then export this data in a simplified CSV format. From here, I’m able to convert the brain wave activity into X and Y coordinates. The coordinates are then used to plot a flight path for a Phantom 4 drone, using a drone piloting application that allows the drone to fly the plotted line autonomously. The drone is fitted with two Lume Cube lights, controlled via an iPhone app, to help visualise the path taken. The drone is then launched at a beach and the flight path captured using a Nikon D750 with a 25-second exposure to record the path over time.
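The CSV-to-coordinates step above can be sketched roughly as follows. This is a minimal illustration, not the actual pipeline: the column name (“alpha”) and the scaling parameters are hypothetical, as Emotiv’s real export format has many channels and bands. The idea is simply that sample order becomes X and normalised signal amplitude becomes Y.

```python
import csv
import io

def eeg_to_waypoints(csv_text, x_scale=1.0, y_scale=1.0):
    """Map each EEG sample to an (x, y) waypoint:
    the sample index becomes x, and the signal value,
    normalised to 0..1 across the recording, becomes y."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    values = [float(r["alpha"]) for r in rows]  # hypothetical column name
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0  # avoid dividing by zero on a flat signal
    return [
        (i * x_scale, ((v - lo) / span) * y_scale)
        for i, v in enumerate(values)
    ]

# Tiny made-up recording: four samples of a single band.
sample = "alpha\n4.2\n5.0\n3.8\n4.6\n"
path = eeg_to_waypoints(sample, x_scale=2.0, y_scale=10.0)
```

The resulting list of (x, y) pairs would then be scaled into real-world distances and handed to the autonomous flight app as waypoints.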

LEARNINGS
Technology is amazing and frustrating. I’ve had some serious challenges with Emotiv’s Insight brain scanner and eventually upgraded to the Emotiv Epoc unit, which was much more reliable in data display and device connection.
Drones are hard, especially when you’re adding extra gadgets to them – the first one just flew away, leaving its master dumbstruck and bruised. The second drone had a violent altercation with a tree. The third drone is a serious upgrade with obstacle avoidance and much better range. Attaching lights, external brackets and cameras to a drone is still a huge risk, especially when you’re flying it over the ocean on an autonomous path based on brain wave activity. Photographing this work is one of the greater challenges I’ve had. Managing a 25-second window, difficult focus profiles, remote drone lighting and ambient lighting together is a great stretch. Doing all of this while flying the drone, in the dark, is hard!
