Media Research Symposium

Video documentation, the Keynote exported as QuickTime, and the ‘script’ of a presentation at the Media Research Symposium, MMU, on 3rd May 2011.

 

Lewis Sykes – Media Research Symposium, MMU – 03-05-11 from Lewis Sykes on Vimeo.

Presentation of The Augmented Tonoscope by Lewis Sykes at the Media Research Symposium, MMU on 3rd May 2011.

 

Lewis Sykes – Research Media Symposium, MMU – Keynote HD from Lewis Sykes on Vimeo.

Keynote for a presentation of The Augmented Tonoscope by Lewis Sykes at the Media Research Symposium, MMU on 3rd May 2011.

 

This ‘script’ is an overly long expansion on my RD1 (I think I got to Slide 8 before abandoning it), which is unreferenced and prosaic – but I’m including it in my journal because it provides the best structured overview and ‘snapshot’ of my research to date.

Slide 1 – Splash

I’m Lewis Sykes.

I’m going to present my PhD practice as research project – The Augmented Tonoscope – which is a study into the aesthetics of sound and vibration. But you can probably tell that from my opening slide 😉

The SketchUp model gives an impression of what The Augmented Tonoscope might look like exhibited in a gallery context… and although I’m not going to explain it now, hopefully all will become clear by the end.

I’m going to read a prepared ‘script’ from my iPhone rather than just talk to slides. I’ve structured this presentation by describing in turn the motivation, context, inspiration, research question, field map, aims and objectives, timeline, method and knowledge development within my research.

Slide 2 – Motivation

For most of the 90s I was a semi-professional bass player performing and recording with several ‘barely notable’ bands. I also ran a small, underground, independent record label. When that went belly-up in the late 90s I wondered what to do next – and decided to align my interests in music, graphics, experimental film and technology.

I enrolled on a two-year, part-time MA in Hypermedia Studies at the University of Westminster – and through a module in interactive media design I discovered programming with code, specifically Lingo in Director. My practical project for that module, the Groovatron, applied visual feedback very similar in principle to the dub reggae-style delay I used in my music studio.

Fast forward to 2005 and I’m a musician and performer with The Sancho Plan – an audiovisual collective who perform live, triggering both the music and the animated visuals in real-time via electronic drum pads. We also created interactive installations, some of which were featured as permanent exhibits in the Ars Electronica Center, Linz.

In Monomatic, my current collaboration with Nick Rothwell, we create physical works that investigate rich and sonorous musical traditions – such as PEAL: a virtual campanile – an interactive sound installation that models the layout and operation of a traditional English church bell tower.

It’s only natural then that my PhD should build on this existing work in exploring:

  • acoustic and visual interplay;
  • real-time, musical interactions;
  • design and fabrication of musical devices.

Slide 3 – Context

From Isaac Newton’s colour circle correlating hues with musical notes, which inspired a succession of colour organs and Lumia, to the synthetic sound production film experiments of Oskar Fischinger and Norman McLaren, there is a long and rich history of scientific and artistic study into the aesthetic and theoretical relationships between the aural and visual.

This evolving practice is broadly defined by the term Visual Music – and my research is located within this context. Current definitions of Visual Music include:

  • methods or devices which translate sounds or music into a related visual presentation – possibly including the translation of music to painting;

This was the original definition of the term, as coined by art critic Roger Fry in 1912 to describe the work of Kandinsky – such as this 1911 oil on canvas entitled Composition IV.

Around the same time Leopold Survage produced hundreds of sequential paintings for an abstract film, Rythme Coloré, which he hoped to film in one of the new multicolour film processes then being developed. The onset of World War I prevented that, and he subsequently sold a number of the paintings so that they became widely dispersed. They have still never been filmed.

  • the use of musical structures in visual imagery;

Of all of the possible correspondences between the elements of colour (hue, saturation and value) and those of sound (pitch, amplitude and tone), the most often proposed mapping is of hue to pitch.
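As a rough illustration of that hue-to-pitch idea – a minimal sketch of my own, not a reconstruction of any of the historic schemes described here – the snippet below maps the twelve pitch classes of an equal-tempered octave evenly around the hue circle; the note ordering and the even spacing are assumptions made purely for the example.

```python
import colorsys

# Hypothetical, evenly spaced mapping of the 12 pitch classes to hue:
# C -> 0 degrees (red), C# -> 30 degrees, ... B -> 330 degrees.
NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def pitch_class_to_rgb(note):
    """Return an (r, g, b) tuple in 0-255 for a named pitch class."""
    hue = NOTE_NAMES.index(note) / 12.0            # fraction of the hue circle
    r, g, b = colorsys.hsv_to_rgb(hue, 1.0, 1.0)   # full saturation and brightness
    return tuple(round(c * 255) for c in (r, g, b))

for note in NOTE_NAMES:
    print(note, pitch_class_to_rgb(note))
```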

While Isaac Newton’s colour circle is the first in a series of colour mappings, many people subsequently built instruments, usually called ‘colour organs’, to display modulated coloured light in some kind of fluid fashion comparable to music. One notable example is the Colour Organ, invented in 1893 by British painter Alexander Wallace Rimington.

In the 1920s, Danish-born Thomas Wilfred created the Clavilux and by 1930 had produced numerous household Clavilux Junior units. Significantly, these instruments were designed to project coloured imagery, not just fields of coloured light, and were sold with sets of glass disks bearing art. Wilfred coined the word Lumia to describe the form.

While this is all very interesting, my research has a clear focus on the last of these definitions:

  • systems which convert music or sound directly into visual forms (and vice versa) by means of a mechanical instrument, an artist’s interpretation, or a computer.

This is especially true of my research, since I plan to utilise all three of the suggested means – a mechanical instrument, artistic interpretation and a computer – within The Augmented Tonoscope.

Filmmakers such as Oskar Fischinger and Norman McLaren worked in this latter tradition of creating a visual music comparable to auditory music in their animated abstract films – literally converting images to sound by drawing objects and figures onto a film’s soundtrack, in a technique known as drawn or graphical sound.

Oskar Fischinger believed his synthetic sound production experiments held extraordinary potential for the future of musical composition and sound analysis: “Now control of every fine gradation and nuance is granted to the music-painting artist, who bases everything exclusively on the primary fundamental of music, namely the wave – vibration or oscillation in and of itself.”

In broader terms, he also held that his abstract imagery contained qualities parallel to those found in music, with the actual soundtrack used as a means to draw the audience’s attention to this.

Norman McLaren was a pioneer in a number of areas of animation and filmmaking, including drawn-on-film animation, visual music, abstract film, pixilation and graphical sound. McLaren was attracted to synthetic music for his films because he could compose the music frame by frame, and its rhythmic, percussive sound was also ideal for frame-by-frame motion; hence McLaren called his music “animated sound”.

The use of the computer in experimental filmmaking has a rich history which reached a peak in the late 1960s, but stemmed from the early approaches and experimental 16mm work of key figures John and James Whitney in the late 1940s. For John Whitney music was visual, imagery musical, and digital computers offered the possibility of algorithmically melding the two – as described in his 1981 book Digital Harmony: On the Complementarity of Music and Visual Art.

I see an artistic lineage for my research in the work of these early experimental filmmakers and computer artists and their experiments in Visual Music using the technologies and media of their time. I plan to investigate their experimental approaches and methodologies, and the critical consideration of their work, to inform and guide my own practice.

Slide 4 – Inspiration

The term Cymatics (from the Greek κῦμα, “wave”) was coined by the researcher Dr Hans Jenny, who studied this subset of modal wave phenomena using a device of his own design – the ‘tonoscope’.

 

Contemporary Cymatics researcher John Telfer explains “Sound can induce visible pattern. When physical matter is vibrated with sound it adopts geometric formations.”

 

This understated video from a 1960s documentary of Jenny’s research is a source of inspiration to me. If Jenny could achieve such a direct and responsive link between sound and image almost 50 years ago by simply singing down a cardboard tube with sand sprinkled on to a rubber membrane stretched over one end – what could I realise with the computational power, fabrication facilities and access to a global network of knowledge, materials and components at my disposal now?

Slide 5 – Research Question

Which segues neatly into my research question:

How far can artistic investigation into visible sound and vibration – also known as Cymatics – contribute towards a deeper understanding of the aesthetics and interplay of sound and image in Visual Music?

Slide 6 – Field Map

 

To highlight the need for my own research I’ve summarised existing literature and artwork in the field, indicated how my work builds on past theory and practice and, most importantly, shown how it fills an ‘absence’.

The study of visible modal wave phenomena has a long and rich history and a significant body of empirical evidence, theory and practical application – most notably:

  • The aforementioned Dr Hans Jenny – Cymatics: A Study of Wave Phenomena and Vibration Vol. 1 (1967) & Vol. 2 (1972). Jenny emphasises the “triadic nature” of Cymatics – hear the sound, see the pattern, feel the vibration – highlighting three essential aspects and ways of viewing a unitary phenomenon and suggesting to me the intriguing potential for developing a multimodal sensory instrument – to simultaneously hear, see and feel its output;
  • Ernst Chladni – Discoveries in the Theory of Sound (1787) – Chladni’s Law describes the frequencies of the modes of vibration of flat circular surfaces (the formula is sketched after this list), providing the essential mathematics for modelling the virtual physics of the emulated tonoscope;
  • and Margaret Watts-Hughes, inventor of the Eidophone (1885) – in Visible Sound, her article in an 1891 edition of The Century Magazine, she observes “…the extreme sensitiveness of the eidophone as a test for musical sounds detecting and revealing to the eye… what the ear fails to perceive”. This suggests to me that an instrument of modern design could demonstrate particular sensitivity and subtlety.
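As a note on that underpinning mathematics – given here only in its textbook form, not as a model of Jenny’s or Chladni’s actual apparatus – Chladni’s law approximates the resonant frequencies of a flat circular plate as

$$ f_{m,n} \approx C\,(m + 2n)^{2} $$

where $m$ is the number of nodal diameters, $n$ the number of nodal circles and $C$ a constant that depends on the properties of the plate.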

The work of these pioneers has inspired contemporary research into Cymatics as a means to reveal a deeper understanding into areas such as harmonicism, acoustics and spatial form:

  • The aforementioned John Telfer – Cymatic Music (2011) – an audiovisual science and music project investigating the possibilities of creating a system of visual, or rather visible, music. Telfer favours manufacturing acoustic musical instruments to interpret the harmonicism within his Lambdoma matrix but recognises – and to my mind points to – the possibility of an electronic (and digital) approach, which I plan to adopt.
  • John Stuart Reid – CymaScope (2000) – arguably the current world leader in scientific cymatic research, such as interpreting dolphin communication, and co-inventor of the patented CymaScope – an instrument that makes sound visible. This contrasts with my experimental artistic approach and ‘open source’ development strategy;
  • and Benlloyd Goldstein – Cymatica (2009) – an architectural thesis investigation exploring the synthesis of spatial proportion and form generated from sound. Goldstein cites Goethe – “Architecture is the frozen music; music is the flowing architecture…” – as a rationale for his research, and this seems to connect, in an interdisciplinary fashion, with my own art and design practice-as-research investigation into the aesthetics of cymatic pattern and form as a means to visualise music.

Inspiration and insights can be gleaned from the realisation, process and methodology of a wide range of contemporary artistic output exploring Cymatics:

  • Gary James Joynes/Clinker – Frequency Paintings: 12 Tones (2011); Jan Meinema and Dan Blore – Cymatics: Liquids and Sound (2009); Carsten Nicolai – Milch (2000); Thomas McIntosh with Mikko Hynninen and Emmanuel Madan – Ondulation (2002); and others… [Anthony Hall – Coffee Cup Oscillator (2003); John Richards and Tim Wright – Cymatic Controller (2008); Arnold Marko – Kymatika (2009); Jodina Meehan – Cymatic “paintings” (2009)]

as well as software and ‘open source’ creative coding modelling its effects and manifestations:

  • Graham Wakefield – Chladni 2D & 3D MaxMSP patches (2009); Flight404 – Robert Hodgin’s Processing & Flash experiments in ferro/cymatic liquids (1995-2009); and others… [Paul Falstad’s Math and Physics Java applets (2005-2009); Software Tonoscope (2008-2010); Karsten Schmidt – Toxi & PostSpectacular (2000-2010) – interactive artworks & using code as primary design.] A simple example of this kind of modelling is sketched below.
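The sketch below is a minimal example of my own, written under simplifying assumptions rather than taken from any of the projects listed above. It uses the standing-wave approximation common in creative-coding Chladni simulations: sum two plate modes and treat the near-zero regions as the nodal lines where sand would settle.

```python
import math

def chladni(x, y, m, n, a=1.0, b=1.0):
    """Simplified standing-wave amplitude for a square plate.

    x, y are normalised plate coordinates in [0, 1]; m, n are mode numbers.
    Nodal lines - where sand collects - occur where this returns ~0.
    """
    return (a * math.sin(math.pi * n * x) * math.sin(math.pi * m * y)
            + b * math.sin(math.pi * m * x) * math.sin(math.pi * n * y))

# Crude ASCII rendering: '.' marks near-nodal regions.
m, n, size = 3, 5, 41
for j in range(size):
    row = ""
    for i in range(size):
        value = chladni(i / (size - 1), j / (size - 1), m, n)
        row += "." if abs(value) < 0.1 else " "
    print(row)
```

Swapping the mode numbers m and n, or weighting the two terms differently, gives the family of symmetrical figures that these patches and applets explore in far more sophisticated ways.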

Slide 7 – Aims

I aim to develop this line of research by exploring the aesthetics of cymatic patterns and forms. Can the inherent geometries within sound provide a meaningful basis for Visual Music? Will augmenting these physical effects with virtual simulations realise a real-time correlation between the visual and the musical? Is it possible to develop a cymatic visual equivalence to the auditory intricacies of melody, harmony and rhythm?

Slide 8 – Objectives

The focus of the research is to design, fabricate and craft a sonically and visually responsive hybrid analogue/digital instrument – a contemporary version of Jenny’s sound visualisation tool – The Augmented Tonoscope.

The physical device will be a minimalist ‘objet d’art’ – inspired in part by the aesthetics of high-end audiophile components. The analogue cymatic patterns it produces will be filmed and analysed and then projected and superimposed with digital cymatic forms derived from a secondary but integrated emulated tonoscope. In combination they will form an augmented device where real and virtual outputs interplay and are artistically analysed and treated.

I then intend to play, record and interact with the instrument to produce a series of artistic works for live performance, screening and installation. These works will be the main evidence I submit, supported by a thesis and an ‘evidence box’ of materials.

Slide 8 – Timeline

I see the project developing through the following stages – with several running in parallel:

Year One

  • devise key personal frameworks to guide the research – an artistic experimental method, an informed design process, the historical evolution of the instrument, tools for critical reflection, an ‘open’ journal and documentation;
  • map the field and establish a solid foundation of complementary writing – locate my praxis in a lineage of similar practices and relate and reference my specific inquiry to broader contemporary debate;
  • build a series of experimental analogue tonoscopes – iteratively refine the design to extend responsiveness and define form;
  • develop a blueprint for the virtual system – summarise the physical laws and underpinning mathematics that describe the effects of sound and vibration and collate and explore existing software and open source code that models this behaviour;

Year Two

  • collate and critique relevant artistic and aesthetic influences – draw on these to inform the artistic and aesthetic qualities of the virtual output;
  • code a series of prototype digital tonoscopes – initially to respond closely to the analogue device and then to augment the analogue patterns with supplementary information, metadata and interpretative content;
  • integrate the virtual system into the physical device – produce artistic visualisations of sound where the digital output is extended, twisted and abstracted in ways the analogue never could;

Year Three

  • practice with the Augmented Tonoscope and produce new work – develop dynamic and aesthetic visual music and works for performance, screening and installation;
  • produce supporting materials which make the research context manifest – a 30-40k word thesis and an ‘evidence box’ drawn from my complementary writing and documentation, which is deemed to assist the sub-panel.

Slide 9 – Research Methods

 

I plan to draw significantly on the model of research methods and critical approaches developed through the Practice as Research (PaR) initiative.

By specifying a clear research enquiry at the outset, succeeded by attention to and dialogue with the process of my art practice, I aim to encourage and reveal research imperatives and significance, original insight, and distinct and discrete approaches to the curation and dissemination of knowledge.

Specifically, I plan to devise a working ‘artistic experimental method’ to guide my creative practice and drive it forward – a personal response to the established ‘scientific experimental method’ of gathering observable, empirical and measurable evidence subject to specific principles of reasoning. I’m sourcing definitions for a set of artistic paradigms – beauty, aesthetics, authorship, process, serendipity and technology misuse – and will apply these measures systematically to gauge, reflect on and draw effective conclusions from the outputs of my rolling series of artistic experiments.

I’m keeping an ‘open’ research journal via a WordPress blog – structuring how the various aspects of my study inform one another through the ‘categories’ of:

  • complementary writing – locating my praxis in a lineage of similar practices and relating and referencing my specific inquiry to broader contemporary debate;
  • critical reflection – making my embodied ‘performer knowledge’ explicit – comparing and contrasting other work, finding resonance between my research and contemporary debates, offering new insights into the conceptual framework and theory implicated within my practice, capturing moments of insight and happy accident;
  • documentation of process – recording evidence of my ongoing practical, experimental and iterative design including tool sets, methodology and outputs;
  • artistic outputs – demonstrating rigour in respect to the imaginative creation, thoughtful composition, meticulous editing and professional production of new artwork;
  • review and feedback – presenting evidence of professional peer review and limited data gathering through structured interviews with select audiences.

I’ll be using the following documentation methods over the course of the project:

 

Slide 10 – Developing Knowledge

 

This research project is unique in the field in combining the analogue and digital domains. I believe there is significant potential in the real-time, dynamic and aesthetic interplay between audio and augmented cymatic visual outputs to provide new insights and understanding into the relationship between sound and image – which can only be addressed by designing and building a new hybrid device.

I plan to share my knowledge and insights with the research and wider communities through a decidedly ‘open source’ modus operandi – making my own evolving tool set, methodology, code and software, electronic and design schematics, documentation and outputs freely available under a Creative Commons Attribution-NonCommercial-ShareAlike licence.

Hankins & Silverman (1995), in Instruments and the Imagination, assert: “Instruments have a life of their own; they do not merely follow theory, often they determine theory, because instruments determine what is possible, and what is possible determines to a large extent what can be thought.”

While I hope that I may contribute new thinking and an emerging terminology to future practice-as-research initiatives through my research methods, I aspire to demonstrate that a decidedly artistic experimental method can indeed help to “determine what is possible and… what can be thought” – helping to validate artistic practice, alongside the scientific experimental method, as a means to enhance insight and develop knowledge.

Slide 11 – Where To Find Me

If you’d like to find out more about my research and keep up to date with developments:

Research Journal  – http://www.augmentedtonoscope.net

Digital Sketchbook – http://tumblr.augmentedtonioscope.net

Email – lewis@augmentedtonoscope.net

Thank you.