Jeff Rona Interview

INSIGHTS: Interview with Jeff Rona

Using a Project Studio for scoring feature films and TV projects

Interviewed in November 2004 by Mel Lambert

Jeff Rona started his musical career as a synthesist, and his extensive scores for film, television and records are noted for unique qualities of color and texture. His diverse background includes composition for theater, dance, records and orchestral works; his influences stem from classical, ambient, electronic and minimalist music, rock, and several forms of World Music. Rona's film scores include Ridley Scott's "White Squall," recorded with the London Symphony Orchestra; the independent features "Trading Favors" with Rosanna Arquette; "Lipstick Camera;" and "Black Cat Run," written and produced by Frank Darabont.
   He contributed music to such films as "Assassins," "The Net," "Toys," "Kafka" and "The Fan." His television work includes scores for Barry Levinson's critically acclaimed "Homicide: Life On The Street;" David E. Kelley's "Chicago Hope;" Steven Spielberg's "High Incident;" "The Critic;" "Profiler;" and the main title music to "L.A. Doctors," "Sleepwalkers" and "Teen Angel." His more recent work includes a two-hour score for the ABC miniseries "Tom Clancy's NetForce," directed by Rob Lieberman.
   Rona has also scored several documentary films, including the award-winning "The Art Of Survival," and his music has been heard in several prominent commercials. He has recorded, collaborated, performed and arranged music with such artists as Mark Isham, Philip Glass, Hans Zimmer, Jon Hassell, Brian Eno, Earth Wind & Fire, Basil Poledouris, Cliff Martinez, and many others.


Jeff Rona
What role does a Project Studio play in your day-to-day productions?
Most everything I do starts and ends in this room. One-stop shopping! I started off not really thinking about being a full-time composer. I just went out and bought musical instruments that I had a personal affinity for. But then, once I started getting professional work as a composer, I knew immediately that the process would be a key element of my musical product. Which meant that I needed to have all my creative options open- and they needed to be available simultaneously.
   I knew that I wanted a more "virtual" MIDI Studio. Intrinsically, I'm kind of a lazy person. I didn't want to be loading sounds into samplers and having to imagine [how they would sound against other, yet-to-be-recorded elements]. I wanted to close my eyes and have every sound in front of me. I didn't want to look at one part of my canvas and then another, because you lose perspective.
   I slowly built a studio that was capable of realizing my musical vision in real time, as a creative process. I started off with a little [Electro-Voice] TAPCO mixer that was barely good enough for live sound! Then I got my first Soundcraft 200B 24-track console, and then another. They had lots of inputs- I'm always collecting instruments, samplers and more synths. Unlike conventional studios, in a Project Studio I don't want a split-format console; I want one with inputs that are all the same and active [simultaneously].
   Working with my three Soundcraft boards was great, but it also brought up a real problem. When working on albums, I typically worked on one project at a time. But in a film or TV score I may need to write 20 to 30 pieces of music that, collectively, are a project, but each can be a musical world unto itself. Because I was a sound designer before I became a composer, it's important to be adventurous with sound as part of the composition process. Which meant that I was held back by [working] with a generic console setup. Automation was the key. For me, the advent of affordable boards with total recall was a great leap forward. I replaced my Soundcrafts, one at a time, with these three Yamaha 02Rs.

The 02R's automation allows you to keep every mixer setting with the appropriate cue?
That's the main thing. I work in my [Opcode Studio Vision MIDI] sequencer, with one file [holding] all my [MIDI-based] cues. I like to be able to compare and listen back to things I've already written and use one cue as the inspiration or starting point for the next. I think about what makes that cue work, and then how to approach that with the next cue. I [send] a MIDI patch change at the beginning of each cue to the 02Rs that recalls my snapshot.
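
[As a rough sketch of that technique- not Rona's actual rig- the following sends a MIDI program change at the top of a cue so that a console recalls the matching scene snapshot. The port name, cue names and scene numbers are hypothetical, and it assumes the python-mido library.]

    # Recall a console scene snapshot at the start of a cue via a MIDI program change.
    # Cue names, scene numbers and the port name are all hypothetical.
    import mido

    CUE_SCENES = {
        "1m1_main_title": 0,   # cue name -> console scene number
        "1m2_chase": 1,
        "2m3_love_theme": 2,
    }

    def recall_scene(port, cue_name, channel=0):
        """Send the program change that recalls the console scene for this cue."""
        port.send(mido.Message("program_change", channel=channel,
                               program=CUE_SCENES[cue_name]))

    with mido.open_output("Studio MIDI Out") as port:   # port name is an assumption
        recall_scene(port, "1m2_chase")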

Give us an example of how the Project Studio environment enhances your creativity. You have just finished composing the music for the ABC miniseries "Tom Clancy's NetForce."
The cool thing about that project was that [the producers] had heard some of my music and when I went to the spotting session it had been temped almost entirely with my compositions. There were some pieces I did for the movie "Assassins," something from "The Fan," some TV projects from "Profiler," "High Incident," etc. I had about 110 minutes of music to do for that project in three and a half weeks.
   The most important thing when starting a project is developing a sound palette and a theme. I go through all my samples and choose the things that evoke what I think I want to do. The theme doesn't always come first, not for me, because I'm going to use the sounds to inspire the theme. I knew that I wanted to try some experiments with some extended, weird guitar [elements] and noise loops. I wanted to use some kinds of percussion that I hadn't used before.
   I watched [the work print] a couple of times and tried to understand the pacing, and what they were trying to do with the temp score- to get into the head of a director you've never worked for before. You learn what they're trying to do. You sit with the director and talk about books, what movies they have seen lately, and what music they like. I asked him to describe what it is about each piece in the temp score that made him choose it to communicate the emotion of a scene.
   Then I came to my studio and it was time to get going. I spent a good two or three days just thinking about how the score should sound. Do I want an orchestra? Do I want percussion? What kind of percussion? What live instruments am I going to use in addition to samples and synthesizers? Do I need to have samples of those live elements? What sort of motifs- sounds that will be a recurring element within the score- might I do sonically? It could be something as simple as a solo trumpet, or it could be scratching a gong, or something purely sonic that becomes a signature. Eventually that builds into an entire palette. I spent a few days sampling sounds and doing extensive processing. I often go back through my library and regenerate the samples with something fresh. I know what I want, but I end up experimenting.

How much sound design is involved at this stage?
It varies. Sometimes there's one sound that I know will be great, so I'll use that. I build my palette and then start writing. Sometimes I'll build a rhythmic bed and then I'll start writing music on top of that. Sometimes I'll sit with a piano sound and come up with a melodic idea. I'll take time to get that initial melodic idea that's going to take me through the film. It really is important to come up with a really good tune; the main melodic theme. Again, a theme to me can mean a melody and a chord progression, or just a chord progression, or a simple tonal motif. It runs the spectrum from sound design to very traditional composition. But once you've created something that's solid and expresses what you think the movie is all about, you're halfway there, because your art now becomes [a matter of] how to exploit that theme in these different ways. Here it is in a quiet way; here it is in an aggressive way; here it is in a dark way; here it is in a mysterious way. This project had a main theme, a "heroic" theme and a "love" theme. They are completely unique themes, yet there are ties that bind them together.

How big was your sound palette for "NetForce"?
I created a pretty huge palette with all sorts of noises from orchestral strings, brass and percussion, to distorted noises, weird evolving sounds, interesting processed rhythmic loops, some unusual ethnic instruments and massive percussion sounds- but I didn't use all of them.
   The three 02Rs have about 120 inputs in all. I maxed out here and am planning some scaling up in my studio. I could probably do with some more inputs. Isn't that sad? [Chuckles.] It's not really about the fact that I'm going to write a piece of music with 150 sounds. I doubt that I would ever do that, because I'm a bit of a minimalist; the biggest musical idea I would probably have might be 10 or 20 [elements] at a time. But within an entire score it can become extremely complex. I wrote a 15-minute cue for the last reel [of "NetForce"], which went from suspense to action to romance to heroism to final resolution. Everything that happened in the story coalesced and culminated in this magnum opus. So I pulled out all the stops.

How important is random-access video for you while composing here in your Project Studio?
It has become essential. I did some research and I found a [Macintosh-compatible PCI] card called the MiroMotion DC30 made by Pinnacle Systems. It has video and stereo audio in and out. I have my VCR going to the card and the card [output] going to my video monitor- I use Adobe Premiere for the QuickTime video capture, and Studio Vision has built-in video playback capabilities. It's fantastic. I needed to get a SCSI-II accelerator card and some fast [Seagate] 9 Gbyte Cheetah drives for the video, which easily hold a four-hour miniseries.
   It saves so much time. I'm doing an action score, which means I'm going to be hitting a lot of specific moments. The ability to play through a section over and over again while fine-tuning the music, without stopping to rewind video, is very liberating. It took me about a week to stop reaching for the jog wheel on the VCR! Every time I hit play in the sequencer the video and dialog are just there in perfect sync. Plus, when I'm writing under important dialog, it gives me the ability to keep the notes away from the words more easily.
   I don't use MIDI Time Code for synchronization, since the video is now inside the sequencer. MIDI Clock is very important to me because I use it for all my arpeggiators in the synthesizers, as well as to synchronize all my audio effects and DDLs. I spent an awfully long time searching for the best delay line that wouldn't glitch when you changed tempos in the sequence. The Lexicon PCM-80 has a "glide" function, which creates smooth ramping of tempos so they don't click. I have five or so delay patches I use, such as dotted-quarter, half-note, dotted-eighth (which is my favorite), etc. To change it, I just send a MIDI patch change to the DDL from the sequencer, and never worry about BPMs or millisecond delay times.
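
[For context, this is the arithmetic a tempo-synced delay patch spares you: one quarter note lasts 60,000/BPM milliseconds, and the note values Rona mentions are simple multiples of that. The sketch below is purely illustrative- the names and factors are assumptions, not any device's API.]

    # Delay times in milliseconds for tempo-synced note values.
    # Purely illustrative; note names and factors are assumptions.
    NOTE_FACTORS = {
        "half_note": 2.0,        # two quarter notes
        "dotted_quarter": 1.5,
        "quarter": 1.0,
        "dotted_eighth": 0.75,   # three sixteenth notes
    }

    def delay_ms(bpm: float, note: str) -> float:
        """Delay time in milliseconds for a note value at a given tempo."""
        quarter_ms = 60_000.0 / bpm
        return quarter_ms * NOTE_FACTORS[note]

    for note in NOTE_FACTORS:
        print(f"{note}: {delay_ms(120, note):.1f} ms at 120 BPM")

[At 120 BPM, for example, that works out to 750 ms for a dotted-quarter and 375 ms for a dotted-eighth.]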

How processed are the tracks that leave here?
I deliver my music spread out over stems; I do all the compression, EQ, and processing that I think is necessary. Effects are part of the composition. Having different sonic elements spread out onto many tracks gives the dubbing mixers some flexibility without losing my concept of the mix.

How do you make a decision between using electronic/sampled and live sources?
If I need to record a large ensemble, I'll go somewhere else. The same is true for instruments that I can't record in my room: drums, percussion, really loud brass. When I go to another studio I'll put down a demo of my synth tracks, a click track on a third track and record the players onto an 8-track [MDM], like a DA-88. Or I've done sessions where I've just printed my sequence to a Pro Tools or even a Studio Vision file. With my sequencer set up at a scoring stage I can record parts and do punches directly on any desired bar lines. It saves a huge amount of time. Then I can bring the files right back into my sequencer in my studio for final editing and mixing. For projects I record in my studio, everything goes directly to hard disk in Studio Vision.
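
[A quick illustration of why punching on bar lines from the sequencer is straightforward: at a constant tempo, any bar number converts directly to a time position. The helper below assumes a fixed tempo and 4/4 meter and is not part of Rona's toolchain.]

    # Convert a bar number to a punch-in time, assuming a constant tempo and 4/4.
    # Illustrative only; real sequencers also handle meter and tempo changes.
    def bar_start_seconds(bar: int, bpm: float, beats_per_bar: int = 4) -> float:
        """Start time (in seconds) of a 1-indexed bar at a constant tempo."""
        seconds_per_beat = 60.0 / bpm
        return (bar - 1) * beats_per_bar * seconds_per_beat

    print(bar_start_seconds(17, bpm=96))  # bar 17 at 96 BPM starts at 40.0 seconds
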
   I prefer not to use electronic textures as soloists, unless I'm going for a very specific synthesized sound. You cannot replace that musicality, that legato sensibility, with samples. My musical training was on woodwinds and flutes- whenever there are flute parts I do all my own parts right here. I hit record and play my part right into the sequencer, straight to hard disk. I'll then go in and mercilessly hack it up, fix the pitch if needed, and shift it around.
   As beautiful as you can make a line with a sample or synth, you do remove a level of musicality that just diminishes your score by that much. If you're doing a very low-budget score, it's unbelievable how much emotion you can pick up by bringing in one or two musicians to lay on top of your lovely expansive orchestra of samples and synthesizers. It makes all the difference in the world, even if you just bring in a guitar player. Ain't nothin' like the real thing!
   Hard-disk recording on the sequencer has become an integral part of my process for both composed and improvised lines. I will often have musicians sit right behind me and read notes on my computer monitor while they play or improvise. I love that sense of collaboration to add textures, lines and colors. My singers stand over there, as far from the hard drives as possible.
   It's a physical thing; we're interacting. The digital picture is up, digital audio is running, I'm flailing my hands around to conduct them as they perform. We'll do three or four takes, then I stop being a composer and become a producer. I chop, hack and come up with things that don't exist any other way. You really can't do these things in a recording studio; you don't have the time and you don't have the interactivity.

What do you use for monitoring?
The bigger speakers, Yamaha NS-40Ms, [either side of the keyboard and sequencer screens] are for when I'm composing, and the smaller speakers [above the trio of 02Rs] are for when I'm mixing. They are Audix A1s. I bought them when I was working on a TV series, and they were the speakers being used on the dub stage. They replaced my ubiquitous NS10s; for the same money they actually sound much better. I'm very happy with them.

What item of hardware would dramatically improve your efficiency?
A good question. I have a Pro Tools system which gives me eight ins and 16 outs. You know what I want? I want a Pro Tools this big! [Extends his arms to embrace his entire synth rack, which is currently housed in five floor-to-ceiling racks.] I want to be able to take every instrument I own, every piece of outboard sampling and synthesis, and have everything go into audio inputs that have total processing- basically, have it all happening in one domain. I want real-time signal processing on all channels. I want within the 02Rs the same kind of power I have in the Pro Tools and Studio Vision plug-ins: pitch-shift, normalizing and all the rest.


