Advice for first-time amp buyers
Buy a really good guitar, then worry about the amp. You connect physically with the guitar, so that's far more important than the amp. Especially when you learn how to get the sounds you want with your hands rather than effects...
The choice of the "right" instrument (and this includes amps as well as guitars) depends upon the player's preferences, skills, musical objectives, budget, etc. I'll try to cover some of the important considerations in this article.
First, let's talk about power vs. loudness. A 1-watt amp through a typical guitar speaker will put out about 95 to 100 dB of sound one meter in front of the amp. If you want to double the perceived loudness, you'll need ten times as much power, which is an increase of 10 dB. So a ten-watt amp will push 105 to 110 dB through the same speaker. A hundred-watt amp will push 115 to 120 dB.
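The watts-to-dB relationship above is just a logarithm. Here's a quick sketch of the arithmetic, assuming a speaker sensitivity of 97 dB at 1 watt / 1 meter (a made-up middle value from the 95-100 dB range quoted above; real speakers vary):

```python
import math

def spl_at_1m(watts, sensitivity_db=97.0):
    """Approximate SPL one meter in front of the speaker.

    sensitivity_db is the speaker's 1 W / 1 m rating (assumed here);
    every 10x increase in power adds 10 dB.
    """
    return sensitivity_db + 10.0 * math.log10(watts)
```

So `spl_at_1m(10)` lands 10 dB above `spl_at_1m(1)`, and `spl_at_1m(100)` lands 20 dB above, matching the figures in the paragraph.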
At 95 dB, you can listen continuously for 4 hours in a 24-hour period without risking hearing loss (according to OSHA). The safe exposure time is halved for every 5 dB increase in loudness. Thus 100 dB is safe for two hours' exposure, 105 dB for one hour, 110 dB for a half-hour, 115 dB for fifteen minutes, and so on. If you use ear plugs, just subtract their dB reduction. For example, 15 dB ear plugs (typical of "musician" ear plugs) would reduce 115 dB in the room down to 100 dB at the ear, thereby increasing your safe exposure time from fifteen minutes to two hours.
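The halving rule is easy to express directly. A minimal sketch of the exposure-time arithmetic, using the 4-hours-at-95-dB anchor from the OSHA figures above:

```python
def safe_hours(spl_db, earplug_reduction_db=0.0):
    """OSHA-style exposure limit: 4 hours at 95 dB, halved per +5 dB.

    Ear plugs simply subtract their rated reduction from the room level.
    """
    effective_db = spl_db - earplug_reduction_db
    return 4.0 * 2.0 ** ((95.0 - effective_db) / 5.0)
```

For example, `safe_hours(115)` gives 0.25 hours (fifteen minutes), and `safe_hours(115, earplug_reduction_db=15)` gives 2 hours, matching the ear-plug example above.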
So what's a typical stage volume? As you know, all but the smallest venues tend to mic every instrument and use the PA to drive the room. The room level in most rock venues is at least 110 dB; close to the stage, it can exceed 120 dB. Other venues take a slightly saner approach, but it's rare to hear live music indoors at less than about 100 dB. To keep stage sound from bleeding into the house mix, your stage volume needs to be at least 10 dB quieter than the house. So figure about 90 to 95 dB on stage.
Yah, look up above... Go ahead, I'll wait. A one-watt amp will (theoretically) give you plenty of volume to play any venue that has a PA. So why do people use bigger amps? This is where it gets interesting (and complicated).
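You can also run the arithmetic backwards: how many watts does a given stage volume actually require? A sketch, again assuming a hypothetical 97 dB 1 W / 1 m speaker sensitivity:

```python
def watts_for_spl(target_db, sensitivity_db=97.0):
    """Power needed to hit target_db at 1 m, inverting the 10*log10 rule.

    sensitivity_db is an assumed speaker rating, not a spec for any
    particular speaker.
    """
    return 10.0 ** ((target_db - sensitivity_db) / 10.0)
```

With these assumptions, a 95 dB stage volume needs well under one watt, which is why the one-watt claim above holds on paper.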
A good guitar amp will begin to distort long before it reaches its rated power output. I'm not talking about obvious, Santana-like distortion. It's the low levels of distortion that help to give the electric guitar some warmth and character. As you push the amp harder, the distortion will increase. This phenomenon is called "touch sensitivity": you play harder to get more distortion. Alternatively (or in addition to touch) you can roll back the guitar's volume to clean up your tone.
What that means is that you'll find a "sweet spot" where the amplifier sounds "right": loud enough when you're playing clean, but not too loud when you're playing dirty. That sweet spot depends upon you (technique and preferences) and your guitar as much as it depends upon the amp itself. But wait, there's more...
All guitar amps made since the late 1950s have an inherent midrange scoop. Leo Fender's people added this scoop to reduce the amount of energy contributed by the guitar's fundamental frequencies. It's a neat trick, giving the guitar a bit more depth and shimmer while allowing it to play clean at higher volumes. The frequency and depth of that midrange scoop play a huge role in determining the character of the amp. Fender, Marshall, Vox and Mesa/Boogie amps all have different tonal characters because of this. And then there's other "voicing" that happens in different parts of the amp to affect both frequency and touch response. And then the speaker adds its own distortions and tonal coloration. If all that's not enough, the amp has a dynamic response character that depends upon various sub-audio time constants in the amp circuit and the construction of the transformers.
So what's an appropriate power range? Most players nowadays use 30- to 50-watt amps. These usually give you enough headroom to play clean along with a drummer who doesn't hit too hard, and enough warmth to sound good when you dig in with your picking hand. Some blues players prefer even smaller amps so they can push them harder and get more of that singing quality to their lead lines. Metal players tend to prefer high-powered amps that have master volume controls, allowing them to set distortion levels independent of volume.
Master volume controls seem like a brilliant idea. If you really have to have a lot of distortion over a wide range of volumes, then it's a reasonable approach. But so is a distortion pedal. The problem with master volume amps is that they get a lot of their distortion from the preamp. The power stage is left to run mostly clean, bringing the preamp distortion up to whatever level you like. And master volume amps don't really have any dynamic behavior to speak of. By way of contrast, a good non-master volume amp (like an old Fender, Marshall, or Vox, or a modern reissue of an old design) is designed so that all the parts of the amp contribute to the distortion and the dynamic behavior, making the sound and response of the amp seem "multi-dimensional". It's hard to explain, but easy to hear and feel if you do a side-by-side playing comparison.
None of the above really applies to straight-ahead jazz guitar. Traditional jazz guitarists tend to look for clinically clean amps that don't distort under any circumstances. They also use speakers that color the sound less and have a lower efficiency than guitar speakers, so it takes more power to get comparable loudness. In many ways, a jazz amplifier rig is a lot closer in concept to a bass rig than a guitar rig. Also, jazz guitarists seem to like to travel light, so they often choose special lightweight solid-state amps and high-tech speaker cabinets.
Long story short: It's complicated, and decades away from even coming close to being a science. The good news is that if you pay attention to the sound and the feel of an amp while you're playing at performance volumes, you'll have all the information you need.
Modeling amps attempt to replicate real amps in software. So far, they haven't done a great job. Modeling amps are best suited for low-volume playing. They're the proper tool for apartment dwellers and for dads who'd like to play while the baby sleeps. For everyone who plays at actual performance volumes (except jazzers), a tube amp is the way to go.
Don't be put off by the price of tube amps. At the high end, you can spend several thousand dollars on an amp that's hand-wired the way they used to do it back in the '50s and '60s. There are tangible benefits to these high-end amps, but a tube amp built using modern production methods will get you 95% of the benefits at 10% to 20% of the cost. Fender, Marshall, Peavey and Carvin all make decent tube amps at affordable prices. And if you know what to look for, there are a lot of "vintage" amps from the '70s that can be bought at reasonable prices and brought up to spec for another couple hundred dollars. A recapped vintage amp will be good for another thirty years or so.
Maintenance on tube amps is also not a big deal. Stock a few spare tubes (at a cost of under $100) and you'll be able to perform virtually all of the maintenance yourself. Most tube-amp failures are failures of the tubes themselves. Tubes are meant to be replaced by the consumer: ordinary people did it all the time right up through the 1960s.
You'll hear a lot about the "necessity" of biasing your amp when you replace power tubes. There are a lot of amp techs and tube vendors out there looking for income, so take their advice with a grain of salt. Most amps ship with the bias set conservatively. Manufacturers want their amps to sound good, yet they don't want to be replacing failed tubes under warranty. If you replace the power tubes with the same kind and grade¹ of tube from the same vendor, you shouldn't need to have a tech adjust the bias. If you want to experiment with different tubes from other vendors, then you may need to get the bias checked and possibly adjusted.
Now, there's nothing inherently wrong with adjusting the bias on your amp. The problem is that there's so much bad advice on biasing amplifiers, and a lot of techs don't really know what they're doing or why.
The bias setting on your amp determines how much heat your power tubes dissipate when the amp isn't doing anything. In addition to heat generation, bias also has an effect on the sound. It's very popular to bias amps hot - hotter than they come from the factory - because it helps the amp to open up and "breathe" more at lower volumes². This approach has several problems. First, the amp may not perform as well at stage volumes. Second, modern tubes don't always meet published specs; it's prudent to operate them conservatively unless you want to be replacing them every six months to a year. Third, some techs don't fully understand what biasing does and are likely to misadjust the bias. Fourth, biasing the tubes hotter than normal makes it more likely that the next replacement set, despite having been chosen from the same type, grade and vendor, will run too hot and self-destruct. That's not something you want to worry about if you have to swap a bad tube during sound check and don't have a tech in your entourage.
¹ Tube vendors sell their tubes according to a grading system (often denoted by a color code) that sorts tubes according to their operating points. All the tubes of the same kind and grade from the same vendor have approximately the same operating characteristics, and are safely interchangeable so long as the amp bias is still set at factory spec.
² Low volumes?! Tube amps are not designed to operate at low volumes. If you really want good tube-amp sound at volumes lower than normal stage volumes, get a modeling amp.