Matt Sim is a Grammy-nominated mixing engineer who has amassed a long list of credits over the last decade, both in the West and the East. He’s one of the lead mixers for Warner Music Asia, as well as a former assistant and engineer at Germano Studios in New York. We decided to have a chat with him after learning that zplane’s FENNEK had become a regular part of his workflow, and we ended up learning some unexpected things about the mastering industry as a result. Enjoy the read below.
– Hi Matt. Thanks for talking to us about your use of zplane plugins. Can we start by asking about your background?
Sure. I attended Berklee College of Music where I studied audio engineering and production. After graduating in 2014, I worked as a freelancer in the Boston area before moving to New York where I got an assistant job at Germano Studios. I eventually worked my way up to being their full-time recording engineer whilst doing mixing jobs on the side, and I now work as a full-time freelance mixer for both stereo and immersive formats.
– How did you become involved with immersive mixing?
Starting last year, Apple Music began requesting that record labels submit Dolby Atmos mixes. There weren’t many engineers who worked with that format, so Warner Music Asia signed me up for Dolby’s training program and made me their in-house mixer once I got certified. I currently oversee all of Asia for immersive mixing, and I’m pre-approved by Universal and Sony to do their Atmos mixes as well.
– What are some of the bigger records that you’ve mixed?
I did some Atmos mixes for Jackson Wang, who’s a big K-Pop star in Asia. Aya Nakamura is one of the biggest female artists in France and I mixed her single with Oboy called “Je m’en tape”. I’ve also worked with jazz artists like Dee Dee Bridgewater, trumpet players like Theo Croker and Italian pop singers like Rose Villain. On the recording side, I’ve done sessions with the likes of Nicki Minaj and Big Sean.
– That sounds great. And how were you able to integrate FENNEK into your workflow?
I mainly use FENNEK on my master buss when mixing for Apple Music since it’s important to follow their guidelines if you want an “Apple Digital Master”. The peaks in my tracks have to fall under a certain loudness threshold, so FENNEK’s history graph is really helpful for seeing when the music gets too loud. I’ve also found myself using FENNEK for mastering albums, as it helps me achieve a consistent level for all the songs.
To be honest, I think FENNEK is one of the best and most comprehensive metering plugins, even rivaling iZotope Insight. Granted, Insight has windows for things like Sound Field and Stereo Imaging, but I prefer FENNEK because of its simplicity; the GUI is easy to use and clearly presents the information I need to see, like LUFS levels. Also, the real-time history graph is something Insight doesn’t provide. I can even run FENNEK whilst rendering from Pro Tools – I just bounce the music, be it five minutes or two hours long, and FENNEK’s readings are available right away without needing extra time for the loudness measurement. That saves me a lot of time.
– Have you used FENNEK in your Dolby Atmos work at all?
Not yet. The current Dolby infrastructure doesn’t allow for the use of external plugins on the master bus, plus it has its own built-in metering software. So even though FENNEK would be helpful, I can’t use it for immersive mixing just yet.
– So for mixing and mastering, would you say FENNEK has become your go-to metering plugin?
It’s definitely my go-to. FENNEK is the only thing I use now, as I don’t really need iZotope Insight anymore.
– That sounds great. As you probably know, FENNEK offers a number of presets that match the loudness standards of platforms like Spotify and Tidal. Have you found those to be useful?
Mastering engineers rarely pay attention to the loudness standards on streaming platforms. I know it’s common for people to talk about -14 LUFS, but most engineers ignore those guidelines because the music would sound too quiet otherwise. Here’s how it works: when you upload music to a streaming service, there’s a soundcheck function that examines your levels to see if they’re above or below -14 LUFS. It then adjusts the volume accordingly. But the soundcheck feature is entirely optional, and most people don’t even use it because doing so results in a quieter track. So no one follows those guidelines, which is something I learned from my engineer friend at Sterling Sound. The only exceptions are immersive mixes and Apple Music – Dolby Atmos has strict guidelines that you have to follow or your track will get rejected, and the “Apple Digital Master” badge also has its own criteria, so we watch the LUFS for those. But when it comes to Spotify and Tidal, we just master as loud as we can. If you listen to a Justin Bieber or Dua Lipa record, it gets pretty loud, and everyone is competing with that.
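For readers curious about the mechanics Matt describes, the normalization step can be sketched as a simple gain calculation. This is an illustrative sketch, not any platform’s actual implementation: the -14 LUFS target and the choice not to boost quiet tracks are assumptions based on commonly cited behaviour, and details vary between services.

```python
TARGET_LUFS = -14.0  # commonly cited streaming reference level (assumption)

def normalization_gain_db(measured_lufs: float, boost_quiet: bool = False) -> float:
    """Gain in dB a player might apply to bring a track toward the target.

    Tracks louder than the target are turned down by the difference;
    quieter tracks are often left untouched (hence boost_quiet=False).
    """
    gain = TARGET_LUFS - measured_lufs
    if gain > 0 and not boost_quiet:
        return 0.0  # don't turn quiet tracks up by default
    return gain

# A loud modern master at -7 LUFS integrated gets turned down 7 dB:
print(normalization_gain_db(-7.0))   # -7.0
# A quiet track at -18 LUFS is left alone under this model:
print(normalization_gain_db(-18.0))  # 0.0
```

Under this model, a master pushed to -7 LUFS and one delivered at -14 LUFS end up at roughly the same playback loudness when normalization is on, which is why the “louder is better” incentive only applies to listeners who have it switched off.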
– So if a master recording was audibly distorting, Spotify and Tidal would still accept the upload?
Yep. They would let it pass through, even if it sounds terrible.