Power Field Studio

Wednesday, May 17, 2017

Pre-Production Before Entering the Studio to Record

Pre-production for the recording studio


You want to make the most of your time in the practice room before your session in a recording studio. The importance of rehearsing before making a professional studio recording cannot be overstated. Since there is no audience, the energy of the song must be self-generated, and any lack of it is far more obvious on a recording.

The difference between a professional recording and an amateur one is mostly in the preparation. If a song is not properly rehearsed, minor annoyances can create confusion and frustration. Any advantage gained before entering the recording studio will pay dividends no matter the level of the facility. A simple suggestion like making sure the drummer changes the heads could easily save hours: new drumheads stretch and lose their pitch quickly at first, so allowing adequate time for them to fully stretch will make the drum sound more consistent and the engineer’s job much easier.

Prepare for your recording
The process of rehearsing can be an effective way of preparing a vocalist who is singing on a programmed or produced recording. A vocalist should be coached on the technical aspects of a performance such as pitch, timing, phrasing and enunciation. If there are difficult parts that are tongue twisters or stretch the range of the artist, they can be worked on and strengthened before entering the studio.
If you get what you want from the rhythm section in the band rehearsals, record them to use when you rehearse the vocalist or other musicians. This way you will not burn out your rhythm section by making them play the piece repeatedly. You can bring everyone in together for a final rehearsal before the recording if necessary.

Rehearse, rehearse, rehearse
The performance process is very different in the recording studio than anywhere else. Rehearse until your songs are second nature; no part or passage should come as an overwhelming challenge when you press record. The maze of microphones and cables can make any musician feel confined or restricted. Isolation booths that separate musicians, headphones and a lack of clear sight lines can diminish the visual cues that usher in transitions between sections of a song. The unfamiliar studio environment, the listening setup, and the sound engineer sitting behind the console all add stress to the situation, so you don’t want to mess up when more preparation could have saved the day.
Communication
Setting up a short video conference or phone call ahead of the recording can prepare the musician for what to expect in the studio. They can prepare ideas and rehearse in their own time. Since so many musicians have recording setups, have them record and send ideas back to you. This will help you sort out the best of what they have to offer and fashion it into a part before the recording.
Recording live
If you’ve done proper preparation, tracking separately may not be necessary. You might be able to get everyone to record together, live, at the same time, with the feel a live band recording makes possible. Or maybe the drummer nailed the take and everyone else is going to layer their parts. Some songs benefit from recording to a click track; some songs suffer. You might not know which approach is best until you get into the studio and record a couple of takes. For this reason, as a band, rehearse every single song both ways.
Involve your sound recording engineer
Once you have sorted through all the performance and part issues in the band rehearsals, it is usually a good idea to bring in the engineer who will be recording the band. By seeing the setup, meeting the musicians and hearing the music, they will be able to better prepare the studio. A good engineer will be able to make suggestions regarding sounds, what resources are available at the studio and what to expect on the day of the session.
Attitude and feeling
Most people never bought a record because the artist’s pitch, timing and tone were perfect. They buy tracks because the attitude, feeling or emotion struck a chord. If you are well rehearsed, your engineer or producer can focus on the more important aspects of the vocal performance, like the expression and the continuity of the song from section to section. These are the things the listener relates to, and they are what persuade them to buy your song. It is at this stage that you will need to decide where you want to record. If you have a producer, they will help you select a recording studio.
Selecting a recording studio
Factors to think about include previous notable clients that represent your sound, the equipment, size of its live room and what backline is included. Perhaps it has a live room famed for its drum sound? Perhaps you want to record drums at one studio and guitars at another? Pre-production on professional recordings will often focus on details such as testing out microphones that best suit the vocalist. 
Sourcing reference tracks by other artists can help everyone involved agree on and understand the sound you are trying to achieve. For acts still starting out, budget will also be a factor. If your budget is tight, it’s always best to do fewer songs of a higher quality than to attempt more songs and leave them unfinished. Most studios will offer advice on how to get the most value for your money.
Above all be ready for all eyes to be on you while you’re recording.

Tuesday, May 16, 2017

The MP3 Era Is Over!

The Music Ends for MP3 as Its Developer Discontinues the Once Pioneering Sound Format

The age of the MP3 is officially over. The developer of the early digital audio codec announced this week that it has ended its licensing agreement, ceding to more versatile formats as the new standard for audio files.

Gizmodo reports that the Fraunhofer Institute, which licensed MP3 patents to software developers, said newer MPEG codecs such as AAC “can deliver more features and a higher audio quality at much lower bitrates compared to MP3.”
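
If you want to hear the quality gap Fraunhofer is describing, one approach is to encode the same lossless source at the same bitrate in both codecs and compare by ear. Below is a minimal sketch in Python that drives the ffmpeg command-line tool; it assumes ffmpeg (with its libmp3lame and built-in AAC encoders) is installed, and the file names are placeholders.

    import subprocess

    SOURCE = "master.wav"   # placeholder: any lossless source file
    BITRATE = "128k"        # same bitrate for both codecs, to keep the comparison fair

    # Encode to MP3 using the LAME encoder bundled with ffmpeg
    subprocess.run(
        ["ffmpeg", "-y", "-i", SOURCE, "-c:a", "libmp3lame", "-b:a", BITRATE, "test_mp3.mp3"],
        check=True,
    )

    # Encode to AAC using ffmpeg's built-in AAC encoder
    subprocess.run(
        ["ffmpeg", "-y", "-i", SOURCE, "-c:a", "aac", "-b:a", BITRATE, "test_aac.m4a"],
        check=True,
    )

Listening tests generally favor the AAC file at a bitrate this low, which is essentially the point the Fraunhofer Institute is making.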

The decision likely won’t have a major impact on the digital audio industry, as most streaming and other services already use newer formats. But while the MP3 has faded from use, it had an enormous and lasting impact on the world of digital sound.

The format was among the first to make music easily downloadable, and helped Apple become a dominant force in music devices and distribution. The iPod and iTunes propelled the company to the top of the industry.

The MP3 was in some ways a revolutionary innovation that made audio compact and shareable, but due to its poor-quality compression it is unlikely to enjoy the vintage revivals of older analog formats such as vinyl.

Musical Inspiration - How to Have That Spark

Musical Inspiration: How to Catch Lightning in a Bottle

First of all, thanks to Dan Hulsman for this article.

We’ve all heard the stories: Some composer or inventor or some other creative person is in the shower, unwittingly working up a fine shampoo lather when suddenly out of nowhere – BAM! The most glorious idea magically strikes them, and they race out of the shower – still dripping water with a headful of shampoo – to the piano/notebook/easel/etc. to pour out and record this amazing new gift from the creative ether before it passes by and is forever lost in the river of time.

Recently, someone asked me a question via email about the “lightning bolt” ideas that just “come to people.” This aspiring composer explained that he had read and heard interviews with composers he admires and kept hearing about that moment when they’re in the shower or doing something mundane and then – BAM! A melody strikes them out of nowhere and they suddenly have an amazing new piece of music. He went on to express a troubling question: (A) Do you have to be born great in order for those “lightning bolt” ideas to strike you, or (B) can you compose without that seemingly magical source of musical inspiration?

I answered his email with some of my thoughts below, but I wanted to dig a bit deeper.  My answer?  Secret option “C”.

Eureka!  Deconstructing the Magic of “Divine Inspiration”

Sometimes, impactful creative works or brilliant ideas are attributed to a moment of pure creative inspiration that seems to come out of nowhere.  This phenomenon is often referred to as “divine inspiration,” which is the concept of a supernatural external force causing a person or people to experience a creative desire.  The concept is thousands of years old and most often attributes the source of creativity to some deity or another.

My personal opinion: For 99.9% of the creative population, I’m gonna go ahead and call “bullshit” on that one. I actually think there are 3 different processes that go into engineering these moments of creative bliss, and I’m going to break them down for you now and describe them to the best of my ability:

I believe the three main steps of artistic creation are the following:
  1. Building competence
  2. Generating ideas
  3. Developing ideas
Let’s look at each one in turn, shall we?

Step 1: Building Competence


When was the last time you heard about an influential piece of music composed by a person without any musical training or experience?  While I look forward to whatever random examples internet trolls drum up about a pastry chef who suddenly became an inventor or whatever, I think we can all agree that almost all great musical works are created by (spoiler!) musicians.

I know, I know. It sounds crazy, am I right? All jokes aside, there’s a reason that musicians experience musical inspiration and carpenters experience… uh… carpenterial inspiration? If you spend hundreds or thousands of hours of your life practicing, playing, experiencing, and listening to something, your brain is going to be wired for that activity. Every music teacher who brings their high school choir to competition listens to the other competitors and thinks, “Why, if I had 5 minutes with that choir I could fix X, Y, and Z.” They’re listening analytically. Basketball players watch basketball games through a lens of understanding and kinesthetic appreciation, while I’m just impressed that nobody ever seems to get hit in the face by a chest pass gone wrong. I don’t have the same level of competence as a basketball player, which limits my ability to understand and enjoy what’s going on.

“So, is that a piano?”


If you’re a musician or composer, you’re listening on a completely different level than a person who has never played a note in his life.  You understand what’s going on, even if you just intuitively understand certain things and don’t have formal theory knowledge.  You can probably anticipate some of the notes that come next in the melody as you listen to a new song for the first time, as an example.  This is because you are competent at music, and without a baseline level of competency you really don’t have the tools to recognize, interpret, and record a musical idea regardless of when and how it comes to you.

This came as a bit of a shock to me during college when my father and I were driving somewhere and listening to/talking about music.  He’s an engineer and a very smart guy, and although he grew up listening to Rodgers & Hammerstein records as a kid and classic rock as a teen and into adulthood, he’s never had any real music education or personal experience.  This gap between his musical competence and my own became painfully clear during that car ride when an instrument began a solo and he asked me, “So, is that a piano?”  I remember being so surprised by the question that I wasn’t sure if he was making a joke or not, but he went on to explain that he legitimately couldn’t distinguish very well between different instruments just by hearing them.  If he saw them, he could probably tell you which was making which sound with the visual aid – but he couldn’t pick out even common musical instruments purely by their sound.  Whoa.

Bringing it Back to the Creative Process

Whether you’re uniquely gifted or trying to force creativity to happen through sheer willpower, if you have no idea what the hell you’re talking about then you’re probably not coming up with many ideas. I teach music to children, and when I ask most 3rd graders to improvise a drum solo they stare blankly at me or look panicked before shrugging their shoulders and saying something like “I can’t do it!” Alternatively, if I ask them to use quarter notes and eighth notes to come up with their own patterns, suddenly they’re improvising within the field of their own comfort and competence. If I tell them to use both high sounds and low sounds on their drum on their second try, they are magically empowered to make even more sophisticated solos. Any musician who is capable of basic decision-making is able to create within the limitations of their musical comfort zones.

If you want to compose music, you need to ask yourself if you have some basic competency before putting any pressure on yourself to create good music.  Can you read music?  Can you notate music?  Can you transcribe what you hear?  Can you play an instrument?  Can you sing?  Do you know basic music theory concepts? Have you analyzed any of the music that you like?  The more of these answers you can give a “Yes!” to, the more comfortable you’re going to be with creating new music out of thin air and the more building blocks you’ll have at your disposal to do so.

My point here is that all of these “geniuses” were immersed and well-versed in their craft. They spoke the language of their craft, and you have to be able to as well. A good idea can hit you, but you won’t even know what a good idea sounds like if you aren’t proficient.

Step 2: Generating (and Recording) Ideas

Here we are at the part of the process where – in my opinion – most people think they have a problem, totally psych themselves out, and get stuck.  Little do those folks know that the next step is where the real work begins!  But for now, let’s focus on idea generation.

The legendary Koji Kondo, composer of the Super Mario theme and countless other Nintendo gems, has said in many interviews that he’ll be showering or walking or something mundane when a melody idea comes out of nowhere.  This is a very common description given by someone who is both competent and experienced with composing.  HOWEVER, if you check out this excerpt from a lengthy interview he did back in 2001 for Game Maestros Vol. 3, he describes his process a little differently:

Interviewer: What was the pace like for writing songs? How long did each one take?
Kondo: It depended on the song, but on the long side, maybe about a week. A short one might be done in a matter of minutes. (laughs) That doesn’t necessarily take into account all the time spent at home trying to come up with a good idea, though.

This distinction is huge!  Aside from the fact that he’s probably been asked the same interview questions a million times, his process is probably so internalized that he himself may fail to see the distinction between idea generation and the next step: idea development.  Good Ol’ Dan caught him on it, though: Koji Kondo clearly describes that idea generation is a separate process that he completes before actively working on an idea through to completion.

Musical ideas can take many different forms: an unusual instrumentation, a chord progression, a short piece of a melody, etc. But where do they come from? Just as the writer fears the dreaded blinking cursor on a blank white screen, the composer fears the empty staff paper. So is there a way to avoid it or bust through? Sort of.

Ideas are Improvisations

When you get right down to it, “inspired” musical ideas are spontaneous pieces or concepts of a work while improvisation is the spontaneous creation of music.  It’s the same damn thing.  Don’t get me wrong; you don’t have to become a jazz expert or anything crazy like that.  But let’s get some things straight, here:

If you’re reading this, you have an interest in creating music.  If you are that interested in music, you have a discerning ear that can tell – probably better than you give yourself credit for – when something sounds good VS. when something sounds bad.  If you’re reading this, you probably have fingers and can type with at least one of them.  Ergo, you have everything you need to peck around on a GD keyboard randomly or otherwise until you accidentally come up with something that sounds good and – like everything else in this existence – the more you do it the better you’ll get at it.

So let’s talk about how to get better at it.

Creating Time & Space

I strongly believe that it’s 100% vital to deliberately set aside time, space, and energy for idea generation, improvisation, or composition (however you’d like to describe it to yourself). This doesn’t mean that you aren’t allowed to be spontaneous ever, but it does mean that you have to prioritize music creation by giving this part of the process the physical and mental space that it requires and deserves. This is a skill, and skills require disciplined practice and lots of room for mistakes to happen.

For me, I like to come up with new ideas at the piano away from the computer.  Most of the time, I use a full-sized digital piano that is not hooked up to my computer and I do this on purpose because OH LOOK FACEBOOK!  Know what I mean?  You should also be mindful of and leverage times of the day when you feel mentally relaxed and have some energy to spare.  Finally, until generating ideas is second nature to you it is a habit that you need to build for yourself.  Come to grips with the fact that some days you’ll generate nothing, others a bunch of crap, and on some you’ll come up with some great stuff.

Recording Your Ideas

I have 3 different ways of recording these ideas: the voice memos app on my iPhone, staff paper, or – my personal favorite – whiteboard slates with music staves on them that I later take photos of.  Eventually, everything ends up in my staff paper notebook or my iPhone this way and I can transfer it to a DAW or notation program later, but I do this to keep production and development separate from the task of coming up with new tune ideas.  Otherwise, the pressure would be too high or I’d get hung up/distracted by crafting the sound as I’m concurrently coming up with and recording ideas.  Some people do both simultaneously, but I either cannot or do not want to and that’s A-OK with me.

The Search for Fragments

I like to think about idea generation like I’m digging around in the dirt for precious fragments. I like the word “fragment” because it implies that the idea is incomplete or small, which takes a great deal of pressure off of me and allows me to recognize little treasures as I’m improvising. Also, it sounds cool to say “I came up with this beautiful melody fragment today.” I’ll often start by playing a chord or a couple of chords and just peck around until I hear something small that catches my ear. When I catch something like this, I’ll massage it a bit and try a few different variations of the idea to see if it sparks my interest further. If it does, I’ll either dive right into the next phase (developing the idea) or just write down/record the fragment and move on to the next search.

These fragments – these precious little diamonds in the rough – can be much, much smaller than most people expect, I think.  Several of the ideas that I initially captured in my iPhone voice memos clock in at around 30 seconds or less with my shortest fragment coming in at a whopping 9 seconds long.  That’s it!  The beautiful thing is that these little fragments can be enough.  Some of them are definitely longer, and I did go from idea-to-sketch with a lot of them… but some of them are just pretty little fragments waiting for their day in the sun.

Food for thought: Nobuo Uematsu’s One Winged Angel from Final Fantasy VII is often admired, performed, requested, and otherwise put up on a pedestal – and for good reason.  Aside from being revolutionary for its time, it’s a great piece.  How did Uematsu-san conceive of such a beautifully dreadful piece of music?  He brainstormed a bunch of 4-bar ideas and then figured out a way to stitch them together.  If you listen to it, almost all of the singable musical ideas are 4 bars long.
 

Step 3: Developing Ideas

This is the sneaky one. I think that many people incorrectly imagine this phase as a magical pouring-out of one’s idea until it is fully realized, but I personally find that this almost never happens, and putting that kind of pressure on yourself is paralyzing. What do I mean by developing ideas, exactly? Basically, I mean taking a very short snippet of a melody or a hook or a short chord progression or even just a single chord or WHATEVER you have to start with, and deliberately and strategically adding new material, structure, and more to expand your idea into a completed work.

This step has a couple of major pitfalls: one is an old problem, one is a new problem.

Pitfall #1: No Tools in the Toolbox (the old problem)

There are a LOT of options when it comes to fleshing out a new musical idea.  Do you know what some of them are?

Let’s use Beethoven’s 5th symphony as an example. Everyone knows the famous four-note beginning, and it’s the perfect example of a musical fragment that needs expansion and development. What’s the first thing that happens in Beethoven’s 5th? He repeats the four-note motif a bit lower using the same instrument (a technique called sequencing), repeats the motif in other voices to create harmonies (a technique called imitation), and uses sonata form to give the first movement structure, just to name a few examples. Not to diminish his brilliance, but he used a lot of very standard techniques that we can also use! No magical inspiration: Beethoven had an idea and then he got to work developing that idea with the full breadth of his compositional toolbox and vast musical knowledge.
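
To make those two techniques concrete, here is a small sketch using the music21 Python library. The opening G-G-G-E-flat pitches are the familiar ones; the answering voice, its entry point, and the exact pitch levels are illustrative choices rather than a transcription of the score.

    # pip install music21
    from music21 import note, stream

    def motif(pitches, short=0.5, held=2.0):
        """Three short notes and one long note on the given pitches."""
        frag = stream.Stream()
        for p in pitches[:3]:
            frag.append(note.Note(p, quarterLength=short))
        frag.append(note.Note(pitches[3], quarterLength=held))
        return frag

    upper = stream.Part()
    # Statement of the motif, then a sequence: the same shape restated a step lower.
    for pitches in (["G4", "G4", "G4", "E-4"], ["F4", "F4", "F4", "D4"]):
        for n in motif(pitches).notes:
            upper.append(n)

    # Imitation: a second voice answers with the same motif at another pitch level.
    lower = stream.Part()
    rest = note.Rest()
    rest.quarterLength = 3.5  # let the first statement finish before the answer enters
    lower.append(rest)
    for n in motif(["E-4", "E-4", "E-4", "C4"]).notes:
        lower.append(n)

    score = stream.Score([upper, lower])
    score.show("text")  # prints the structure; score.show() opens your notation software

Swap in your own fragment and pitch levels and you have a quick way to audition a sequence or an imitative entry before committing it to a sketch.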

Pitfall #2: Combining Too Many Compositional Tasks


A lot of people load up a blank project in their DAW and get to work “composing” until – hopefully – they have a finished product. This might work for some people, but I would strongly caution against this approach until you feel confident in all three of the stages I outlined in this article, and then some. Why? Because when you compose directly into a DAW, you’re blurring the lines, overlapping, and oftentimes combining the following processes:
  1. Generating Ideas
  2. Developing those Ideas
  3. Producing
That’s a lot to be doing all at the same time.  Some people can totally do that, but those people are usually well-experienced and confident in their abilities.  Personally, I’m not a confident producer but I AM comfortable with generating ideas and developing those ideas.  Because I have an uncomfortable process awaiting me, I separate it out completely so that it doesn’t muddle up the two parts that are actually working for me.  I improvise at the piano until I have an idea and – when appropriate – I’ll immediately latch on to that idea and work on developing it into a piece.  Once I have a sketch, I can always lay it into a DAW later as an outline.  When I interviewed Hitoshi Sakimoto last year, he explained that he basically does the same thing: he’ll come up with an idea and sketch it out, record it in the DAW, and use that recording as a placeholder outline as he produces and re-records over it.

Closing Thoughts & Big Take-Aways

Here’s my abbreviated advice: If you’re uncomfortable with (1) generating ideas, (2) developing ideas, or (3) music production I would separate these processes from one another.  It will increase your efficiency and allow you to focus on the easy parts without getting bogged down and the hard parts without getting distracted.  Some people do these one at a time, which does take longer overall.  Some people do all three at the same time.  You do what works for you, and be mindful of which part of the process you’re in and which tools you have at your disposal.  If you’re attacking these parts separately and you find yourself stuck or struggling, you’ll be able to identify which part of the compositional process you need to become more competent in instead of just chalking it up to a talent shortcoming or lack of divine inspiration.

13 Embarrassing Guitar Fails


Let’s start off the week with a satisfying batch of Guitar Fails! From Slash to Lil Wayne to some guy from Australian Idol, we rounded up 13 Embarrassing Guitar Fails that’ll make you laugh while you cringe.

Shred legends like Slash and Eddie Van Halen seem to do no wrong while wielding a guitar, but they are, in fact, human. You’d be hard pressed to find a Van Halen fan who isn’t interested in witnessing an Eddie solo, but this one particular show left the crowd wishing VH had just played “How Many Say I” instead. We suppose you can’t fully blame Slash for his multiple mess-ups of “Sweet Child O’ Mine” either, as the substance consumption of the world’s most dangerous band was super heroic back in the day.

Let’s all agree on one thing: Lil Wayne should never touch a guitar again… ever. The Internet has destroyed the successful rapper multiple times for busting out an axe during concerts. Eventually, it seemed like the sound guys pulled a Sid Vicious and just unplugged the guy, as seen in the infamous “tapping” clip. For this compilation, however, we stuck to the classics, using Lil Wayne’s epic two-note guitar solo.


We threw some non-famous “guitarists” on this list too, though they’ve become infamous for having the gall to play an instrument while cameras were rolling. On an episode of Australian Idol, one of the greatest all-time fails took place. When you leave an audience genuinely questioning whether you’ve ever picked up a guitar before, you know something went wrong. After butchering John Lennon’s “Imagine” (which is played entirely on piano, you dummy!) this guy brought the house down with a rendition of “Somewhere Over the Rainbow” that somehow sounded exactly like the Lennon attempt.

Check out these 13 Embarrassing Guitar Fails in the Loud List above!


The Documentary "The Last Songwriter" Discusses Writing Music in the Digital Age

Documentary Film ‘The Last Songwriter’ Discusses Songwriting in a Digital Age


The Last Songwriter is a new documentary about the backbone of the music industry, featuring Garth Brooks, Emmylou Harris, Jason Isbell and Jim Lauderdale, that was showcased at the Nashville Film Festival in April 2017. It is directed by the award-winning filmmaker Mark Barger Elliott. It’s a thoughtful look at the potential degradation of song quality in the age of streaming music.
According to country legend Garth Brooks, ‘Writing a song is the most important step in music.’ His concern, shared by many in the industry, is the devastating effect the digitalisation of music has had on full-time songwriters. Many say it is the death knell of a treasured craft that is responsible for so many inspiring and superior songs throughout the years. The Nashville Songwriters Association International has reported an 80 percent decline in the number of full-time songwriters.
The producers and writers see the film’s purpose as shedding light on the plight of the up-and-coming songwriter, who may watch their chance to grow into the trade disappear before they have the opportunity to find their feet in the industry, because of the radical change streaming music has brought.
Songwriters are governed by long-standing copyright laws that were in place before streaming music became an issue. Unlike artists and record labels, songwriters, and their publishers as well, have their royalties regulated by the US government and do not have the leverage to change that, however unfair it may seem now that the digital age has transformed earning potential in the industry.
With that monetary value decreased, it has become increasingly difficult for a songwriter to be employed full time by a publisher and invested in as an apprentice. It stands to reason that their development will be diminished, or at the very least delayed, when devotion to their songwriting skills has to take a backseat to earning a living elsewhere. Their full attention cannot be on songwriting when there are day-to-day concerns about securing an income.
Historically, a young songwriter would come to Nashville and hone their skills under the watchful eye of a publishing company. In the current market, that is becoming less and less possible, even for the most talented young writers. That is the issue the film seeks to bring to the fore, in the hope that some change can be made before the art of songwriting is fatally damaged.
Songs have the power to delight, empower and relieve pain, and to have that art diminished in any way would be a tragedy whose loss we would all feel, the filmmakers say.
The Last Songwriter asks: ‘In the future, who will write the soundtrack of our lives: the songs we dance to at our wedding, sing at sporting events, and play at a loved one’s funeral?’ This is a film for everyone who has been moved by a song to step out onto a dance floor, to give it one more try, or to fall in love with the stranger across the room. The Last Songwriter is for those who love music and care about its future.
Credits:
thelastsongwriter.com

Monday, May 15, 2017

Marketing Tools That Record Labels Use Are Now Available on CD Baby

Music marketing tools the major labels use — now available to all CD Baby artists

First of all, thanks to the original author for this article.


Get free access to the same marketing tools used by Nettwerk Music Group, Sony, Universal, The Orchard, and more.

Do you want to build your Spotify following, grow your email list, and boost engagement on platforms like YouTube and SoundCloud? Do you want to spend less money or time on promotion, but have a greater effect? Does the sun rise in the east?
Last year CD Baby purchased Show.co, a company that makes elegant marketing tools used by major label artists and their marketing teams (such as Ignition Records, who uses Show.co to run contests and build lists for the band Oasis).
I’m excited to say that CD Baby is making these tools available to all our clients FOR FREE!
Show.co: pro music marketing tools

To market your music through Show.co:

  1. Log into your CD Baby member account
  2. Look to the righthand side of the account dashboard
  3. Click on “Free Marketing Tools!”

As a CD Baby artist, you can have two of the following campaign types active online at any time:

  • Email-for-Download — Build your email list.
  • Social Unlock — Grow your following on platforms like Spotify, YouTube, Twitter and SoundCloud.
  • SoundCloud — Boost streams and fan conversions.
  • YouTube — Increase views and fan conversions.

How are artists and labels using Show.co?

Last December I used the Social Unlock tool to grow my Spotify following and email list, and I had great results. The campaign was easy to build — it probably took 5 minutes — and yielded an almost 50% conversion rate (if you’re not up on marketing stuff, that’s pretty good), helping me capture both Spotify followers and email addresses in exchange for unlocking a holiday playlist I’d created.
When Ignition Records uses the Email-for-Download or Social Unlock tools, they often see even more impressive conversion rates (in the 80-90% range) — particularly when running a contest to give away concert tickets, box sets, or meet-and-greets.

One of the best features of Show.co is that you can easily reward fans with almost any kind of “unlocked” content:

  • a download
  • a PDF
  • an unlisted video link
  • a hidden URL
  • a playlist
  • a coupon code
  • or whatever else your audience will be interested in
In exchange, your fans take a specific action: follow you on Spotify, SoundCloud, or Twitter, subscribe to your YouTube channel or email list, etc.
Show.co gives you an easy and fun way to turn casual fan interest into real results. Try it and you’ll see for yourself.
If you’re a CD Baby client and you already have a Show.co account, contact us and we’ll switch your account to FREE.

A few user tips to make your marketing campaigns as effective as possible:

  1. Choose an image that is very light or very dark in the center — When you design a campaign, you’ll upload an image that is displayed as the background for the campaign URL. The text for your campaign (for instance, “Follow us on Spotify to hear our new album before it’s released”) is displayed in the center of the page OVER that image. You can set the text to be white or black, but in order for it to POP on the page, you want the text to contrast with the portion of the image directly behind it.
  2. Play with the “blur” settings on the image — To add clarity to your messaging, you can blur the image behind your text, just for an extra bit of design control.
  3. Start by testing a Social Unlock campaign for Spotify — This is an easy way to get a feel for Show.co, and you can use a single campaign to build both your Spotify following AND your email list. Check it out and you’ll see what I mean (just be sure to enable email capture).

Create a Show.co campaign to market your music today! 


Google AI Invents Sounds That No Human Has Heard Before

Google’s AI Invents Sounds Humans Have Never Heard Before


Jesse Engel is playing an instrument that’s somewhere between a clavichord and a Hammond organ: 18th-century classical crossed with 20th-century rhythm and blues. Then he drags a marker across his laptop screen. Suddenly, the instrument is somewhere else between a clavichord and a Hammond. Before, it was, say, 15 percent clavichord. Now, it’s closer to 75 percent. Then he drags the marker back and forth as quickly as he can, careening through all the sounds between these two very different instruments.
“This is not like playing the two at the same time,” says one of Engel’s colleagues, Cinjon Resnick, from across the room. And that’s worth saying. The machine and its software aren’t layering the sounds of a clavichord atop those of a Hammond. They’re producing entirely new sounds using the mathematical characteristics of the notes that emerge from the two. And they can do this with about a thousand different instruments—from violins to balafons—creating countless new sounds from those we already have, thanks to artificial intelligence. 

Engel and Resnick are part of Google Magenta (a small team of AI researchers inside the internet giant building computer systems that can make their own art), and this is their latest project. It’s called NSynth, and the team will publicly demonstrate the technology later this week at Moogfest, the annual art, music, and technology festival, held this year in Durham, North Carolina.
The idea is that NSynth, which Google first discussed in a blog post last month, will provide musicians with an entirely new range of tools for making music. Critic Marc Weidenbaum points out that the approach isn’t that far removed from what orchestral conductors have done for ages (“the blending of instruments is nothing new,” he says), but he also believes that Google’s technology could push this age-old practice into new places. “Artistically, it could yield some cool stuff, and because it’s Google, people will follow their lead,” he says.

The Boundaries of Sound

Magenta is part of Google Brain, the company’s central AI lab, where a small army of researchers is exploring the limits of neural networks and other forms of machine learning. Neural networks are complex mathematical systems that can learn tasks by analyzing large amounts of data, and in recent years they’ve proven to be an enormously effective way of recognizing objects and faces in photos, identifying commands spoken into smartphones, and translating from one language to another, among other tasks. Now, the Magenta team is turning this idea on its head, using neural networks as a way of teaching machines to make new kinds of music and other art.
NSynth begins with a massive database of sounds. Engel and the team collected a wide range of notes from about a thousand different instruments and then fed them into a neural network. By analyzing the notes, the neural net (several layers of calculus run across a network of computer chips) learned the audible characteristics of each instrument. Then it created a mathematical “vector” for each one. Using these vectors, a machine can mimic the sound of each instrument, a Hammond organ or a clavichord, say, but it can also combine the sounds of the two.
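Conceptually, the slider Engel drags is doing linear interpolation between two of those learned vectors before a neural decoder turns the result back into sound. The toy sketch below, in Python with NumPy, only illustrates that idea; it is not Magenta’s actual NSynth code, the embeddings are random stand-ins, and the decoder is a placeholder.

    import numpy as np

    # Stand-in embeddings: in NSynth these come from the trained encoder,
    # one vector per instrument note. Here they are random placeholders.
    rng = np.random.default_rng(0)
    z_clavichord = rng.normal(size=16)
    z_hammond = rng.normal(size=16)

    def blend(z_clav, z_ham, clav_share):
        """Linear interpolation: clav_share=0.15 means 15 percent clavichord."""
        return clav_share * z_clav + (1.0 - clav_share) * z_ham

    def decode(z):
        """Placeholder for the WaveNet-style decoder that turns a vector back into audio."""
        return z

    # Dragging the slider corresponds to sweeping clav_share between 0 and 1.
    # A two-dimensional pad does the same thing with four vectors instead of two.
    for clav_share in (0.15, 0.50, 0.75):
        audio = decode(blend(z_clavichord, z_hammond, clav_share))
        print(f"{int(clav_share * 100)}% clavichord -> first values: {audio[:3]}")

The point is that the new sound is computed from the mathematical characteristics of both instruments rather than by layering their recordings, which is the distinction Resnick draws above.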
In addition to the NSynth “slider” that Engel recently demonstrated at Google headquarters, the team has also built a two-dimensional interface that lets you explore the audible space between four different instruments at once. And the team is intent on taking the idea further still, exploring the boundaries of artistic creation. A second neural network, for instance, could learn new ways of mimicking and combining the sounds from all those instruments. AI could work in tandem with AI. 
The team has also created a new playground for AI researchers and other computer scientists. They’ve released a research paper describing the NSynth algorithms, and anyone can download and use their database of sounds. For Douglas Eck, who oversees the Magenta team, the hope is that researchers will generate a much wider array of tools for any artist, not just musicians. But not too wide. Art without constraints ceases to be art. The trick will lie in finding the balance between here and the infinite.