Category: Geek

Vinyl Records

Vinyl has become trendy again, and record pressing plants are pumping out new records as fast as they can make them.  Some plants are even expanding.

Vinyl records have a mythology around them promulgated by audiophiles.  It is said that they are analog (they are), and thus more accurately reproduce the original audio than the digital “stair steps” (they don’t), and that, somehow, music heard via vinyl is “purer” than digital music.  Almost exactly the opposite is true.

I hate to break it to you, but vinyl is a terrible medium for reproducing audio, and its various deficiencies require countermeasures that significantly change the audio.  Tom Scholz, the leader/recording engineer for the rock group Boston, supposedly tried to get the first Boston album recalled when he heard what his mixes sounded like on vinyl.  Tom Scholz’s experience aside, many of the countermeasures make changes to the audio that audiences can find pleasing.

These countermeasures were implemented in the process of “mastering”.  Originally, mastering was just creating a master disk, from which the pressing plates for the vinyl records would be made.  The mastering setup was simply a cutting lathe that created the sound groove in a metal plate.

One of the physical properties of a vinyl record is that the width of the groove is determined by the volume of the bass frequencies.  When music started being recorded with electric bass, mastering engineers found they could often get only five or ten minutes of audio per side of a long-playing record, instead of the normal 15-20 minutes, because the grooves were too wide.  As a result, they added devices to the mastering setup to compress and limit the bass frequencies.  The same measures are required for classical music with lots of timpani and/or low brass, and for jazz with a prominent bass part.

Another issue with vinyl is that it does not reproduce high frequencies well, and midrange frequencies tend to be prominent.  Mastering engineers added equalization to their mastering setups to partially compensate, and recording engineers would often boost high frequencies in their mixes to help them be audible on the record.  Even with these measures, high frequencies on records gradually disappear toward the top of our hearing range.

The dynamic range of vinyl–the range in loudness from the quiet background hiss of the record to the loudest sound it can produce–is much smaller than that of our ears.  On vinyl it is about 70-80 dB, while our ears have a range of about 120 dB.  Every 3 dB represents a doubling of sound power, so the extra range can be pretty important.  Music that goes from quiet to very loud can exceed vinyl’s limits, so the quiet parts are buried in the background hiss.  To deal with this issue, vinyl mastering engineers compress the entire mix (in addition to the extra compression and limiting on the bass frequencies), which reduces the dynamic range.  This technique is used on all types of music, but it is most important on classical recordings because they often have wider dynamic ranges.
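For perspective, here is a quick illustrative calculation converting those dB figures into ratios of sound power; the 70 dB and 120 dB numbers are the ones quoted above, and the code is just arithmetic, not an audio measurement.

```python
def db_to_power_ratio(db):
    """Convert a decibel figure to a ratio of sound power (10 dB = 10x)."""
    return 10 ** (db / 10)

# Vinyl's ~70 dB dynamic range vs. the ear's ~120 dB, per the figures above.
vinyl_ratio = db_to_power_ratio(70)   # 10 million to 1
ear_ratio = db_to_power_ratio(120)    # a trillion to 1

# The 50 dB gap means the ear spans a power range 100,000x wider than vinyl.
print(f"gap: {ear_ratio / vinyl_ratio:.0e}")
```

That five-orders-of-magnitude gap is exactly what the overall compression in mastering has to squeeze away.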

There are other, more arcane, measures taken in mastering, but many listeners find the ones I’ve described add a quality pleasing to the ear.  Overall compression makes it easier to hear all the parts, bass compression often makes the bass sound better, and the rolling off of high frequencies results in a sound many describe as “smooth” or “warm”.

At least part of the blame for the vinyl mythology has to do with a shortcut record companies took.  When Compact Discs first came out, the record companies believed that they didn’t need to do any mastering for digital because digital didn’t have vinyl’s limitations.  They sent the master tapes to CD manufacturers with no mastering, and the CDs that were produced did not sound anywhere near as good as vinyl.  They didn’t have any compression (or only what the recording engineer used), and because the high frequencies were boosted for vinyl, they sounded “harsh” or “tinny”.

These problems were caused by a lack of mastering, not, as audiophiles believed, an inherent flaw in digital audio technology.  It took a few years for the record companies and engineers to figure out that, in order to sound good, a similar mastering process was required for digital media.  CDs manufactured in the early 1980s often have these sonic problems, while later “remastered” versions mostly sound better (to my ears) than the vinyl, or at least more similar to the original master tape.

Today, great tools exist for mastering digital recordings, and pretty much every digital recording, whatever the medium, gets mastered.  Mastering engineers have built on the vinyl techniques to create a large bag of tricks that make recordings sound better to listeners.  Over time, the ears of audiences have adjusted to hearing high frequencies without cringing, so they accept recordings where you can hear what cymbals really sound like.  As a friend of mine who is a mastering engineer said to me yesterday, even an mp3, if it has a reasonable bit rate, will sound much closer to the original than vinyl will.

If you love the sound of vinyl, please enjoy it with my blessing.  Apart from the sonic aspects, I find the 15-20 minute album side a more satisfying chunk to listen to than a 3-minute mp3.  Just let go of the idea that you are hearing what the recording engineer heard when he was mixing.

Now that I’ve rained pretty hard on the vinyl parade, do I have an alternative?  Is there a different technology that I think will serve listeners even better?  Stay tuned for Giving Good Audio for Music Part II: 24-bit Audio.

Many have taken the position that pure Net Neutrality is essential for an open Internet.  Today the FCC announced that they will not be requiring a pure Net Neutrality solution, but what they will require is not clear.  And, to quote Ross Perot, the devil is in the details.

Traditionally, on the Internet there has been the concept of “peering”.  This means that if AOL and Hotmail were sending each other a fairly balanced amount of traffic, they wouldn’t owe each other any money.  But if a site was sending a lot more traffic into your site than you were sending to it, that site would owe you “peering fees”.

Imagine this.  A small city builds a set of roads that is adequate for its normal traffic.  The normal traffic of its citizens traveling to other cities is balanced by citizens visiting from other cities.  At some point, another city starts sending a massive number of trucks into the small city, jamming the roads so the normal traffic can’t get through.  Traditionally on the Internet, the other city would help pay for the small city to widen and maintain its roads, since the other city is making money selling furniture (or whatever) to the citizens of the small city.

This system worked reasonably well when the “cities” were distinct in purpose; there were residential cities (access providers like AT&T and Comcast) and commercial cities (Netflix, Amazon, Google, etc.).  But now the residential cities want to be providers of stuff as well, and they want to use the peering fees, and the sanctions for not paying them, to disadvantage the commercial cities.  As a result, sites like Netflix want to stop paying peering fees.

Pure Net Neutrality advocates think we should require that access providers never give preferential access to any site, nor charge any other site for the demands its traffic puts on their network.  That, in effect, means they must provide whatever level of bandwidth is required for any arbitrary application on the Internet.  This requirement seems overreaching to me.

When Netflix came online, the bandwidth used at many access providers increased to more than a thousand times what it was before.  Streaming movies involve many orders of magnitude more data than email or normal websites like Facebook and Google.  And that was after YouTube had already greatly increased the bandwidth people were using.  These increases required access providers to do massive upgrades to prevent the streaming movies from slowing down all the other traffic, and/or to restrict how much bandwidth Netflix and YouTube were using.  And Netflix is not the last Internet application that will require an increase in bandwidth.  I suspect that an understanding of these factors has made the FCC uncomfortable with a pure Net Neutrality position.
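To see why streaming was such a shock to access networks, here is a rough back-of-envelope sketch.  The 50 KB email and 5 Mbit/s stream are my own illustrative assumptions, not measured figures.

```python
# Back-of-envelope comparison of one email vs. one streamed movie.
# Both sizes are illustrative assumptions, not measurements.
email_bytes = 50 * 1024                   # a ~50 KB email
movie_bytes = 2 * 3600 * 5_000_000 // 8   # 2 hours at 5 Mbit/s = 4.5 GB

ratio = movie_bytes / email_bytes
print(f"One movie = ~{ratio:,.0f} emails' worth of data")
```

Under these assumptions one movie carries the data of tens of thousands of emails, which is the "orders of magnitude" jump the access providers had to absorb.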
That said, we need to do something.  For example, I have AT&T U-verse as my Internet access provider.  AT&T wants me to buy movies from them rather than getting them from Netflix.  They should not be able to use the fact that I get my access from them to disadvantage Netflix or other sites, but they will if they get the chance, as any competitive company would.  Netflix should help pay for the extra bandwidth, but they shouldn’t be taken advantage of.  I’m not sure there’s a good way for the FCC to balance this.

It’s a thorny problem.  I don’t think a naïve pure Net Neutrality approach is the right solution, but we need something.  A decent solution might be to re-regulate the former phone companies and other access providers, banning them from providing commercial services but guaranteeing them a good rate of return.  I’m aware, however, that that will never happen.

Netduino vs Arduino


The Netduino and Arduino are inexpensive (about $30-$35) small single-board computers that have allowed lots of regular people to create devices containing an embedded computer.  If you’ve never heard of them, you probably don’t care about the rest of this article.

I recently got one of the new Netduinos, and have been playing with it.  I’d previously done half a dozen Arduino projects, so I was interested in the differences.  I have to say, I was very impressed with it, but there are differences you should know about before you jump into using a Netduino.

Before We Even Start

The slugline for the Netduino is that it is like an Arduino, only using C# and .NET for programming.  That’s accurate, but there’s more to it.  Current Arduinos are built around the ATmega328 (on the Uno, a second chip, the ATmega8U2, handles USB).  These are fairly simple 8-bit processors running at 16MHz.  The Netduino has an Atmel 32-bit ARM7 processor running at 48MHz, similar to the processors in many cell phones.  It has a much larger program space (128K, not including the .NET runtime, vs. 32K for everything on the Arduino), and much more RAM (60K vs. 2K).  The Netduino itself (the schematic, layout and code) is entirely open-source.

First Look

The Netduino board is the same size and shape as an Arduino board.  It has the same sockets for shields, labeled the same way, the same power connector, and a USB connector.  The USB connector is the mini size many cell phones use, rather than the full-size one on the Arduino Duemilanove and Uno.  (This is an improvement, as shields rest dangerously on top of the metal USB connector.)  The USB connector is in the center of that end of the board rather than on the left edge.  Like the Arduino, the Netduino has a reset button in the same spot; its power LED (bright white) and the LED on digital output 13 (bright blue) are in different board locations than on the Arduino.  There is a place to install a 6-pin header at the back of the board, in the same spot Arduinos have a similar header, though no header is installed.  The TX and RX monitor LEDs that Arduinos have do not exist on the Netduino.

Development Environment

First, as you might expect, the development environment only runs under Windows.  It requires Vista or Windows 7.  Like the Arduino, you can set up a complete development environment for free.  Unlike the Arduino, it is not all open source, and in order to be legitimate, you will need to register for one component.  There are three components you need to install (one is open-source), but once you do that, it works well.  You don’t even need to install a device driver (this is done as part of the other installs).  You will be working in the Visual Studio 2010 environment, which is pretty bug-free and easy to use, once you get used to it. 

Something I don’t hear mentioned is that this setup provides a far superior debugging environment.  You can do both emulation and in-circuit debugging, unlike the Arduino environment, which currently doesn’t do either.  When I told Visual Studio to debug my program, it downloaded my code onto the board and started running it.  I was greatly surprised that when I clicked next to a line of code to set a breakpoint, the code running on the board immediately stopped at the breakpoint, and I could single-step through it, then set other breakpoints and proceed.

I have been programming the Netduino in C#, which is similar to Java in many ways, but you may be able to use Visual Basic as well.  Once I got used to doing embedded development in it, I liked C# better than the Arduino language.  The Arduino language is a simplified version of C, but almost anyone who uses it ends up needing regular C constructs (like sizeof()), so you get code that is a mix of Arduino and C.  C#, like Java, has many constructs that make the code more elegant and easier to read than C.  And the .NET Micro Framework library is more extensive for some functions than the Arduino standard library.  Also, C# delegates are a much cleaner way of setting up handlers for events, which is a lot of what your code will likely be doing.

A Drop-In Replacement?

The Netduino is not a drop-in replacement.  If you will only be doing digital I/O at low current, you can probably get away with using it that way, but there are a variety of differences you need to be aware of.  Some of these differences may make it a better fit, and some may make it a worse fit.  In any case, you don’t want to plug a Danger Shield (for example) into it and turn it on (its analog voltages are too high for the Netduino).

Chip power: Internally, the CPU runs at 3.3V, not 5V like the Arduino, though it uses the same power sources.
Digital I/Os: Go from 0V to 3.3V, not 5V.  This will work with most 5V logic circuits, input and output.
Analog Inputs: Must not go higher than 3.3V!
PWM Outputs: PWM is often used like an analog output.  Since 100% averages to 3.3V instead of 5V, circuits may work differently.
Libraries: None of the Arduino libraries, which are C and C++ code, will work on the Netduino without modification.  If you use a board-specific library, you may have to rewrite it.
USB Connector: Uses a cell-phone-type mini USB connector.
I/O Current: The pins on the CPU can drive a maximum of 8mA, less than the Arduino's 40mA maximum.
CPU: 32-bit Atmel ARM, instead of 8-bit ATmega.
Speed: 48MHz instead of 16MHz.
Program Memory: 128K instead of 32K.
RAM: 60K instead of 2K.
EEPROM: The Netduino has none.
In-circuit debugging: The Netduino has it.
Emulation: The Netduino development environment has it.
Price: As of this writing, the Arduino Uno has a street price of about $30, while the Netduino goes for about $35.
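As an example of where the 8mA limit bites, consider driving an LED directly from a pin.  Here is a quick Ohm’s-law check; the 2.0V forward drop is an assumed typical value for a red LED, not a Netduino spec.

```python
# Series resistor needed to keep an LED under the Netduino pin's 8mA limit.
# The 2.0V forward drop is an assumed typical value for a red LED.
supply_v = 3.3          # Netduino digital-out high level
led_forward_v = 2.0     # assumed LED forward voltage
max_current_a = 0.008   # 8mA pin limit

min_resistance = (supply_v - led_forward_v) / max_current_a
print(f"Use at least {min_resistance:.1f} ohms")
```

That works out to about 162 ohms, so a standard 180 or 220 ohm resistor is a safe choice; the same LED on a 5V Arduino pin could tolerate a brighter (higher-current) setup.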

Beyond Netduino

Something interesting you will find if you look at the schematic of the Netduino is that a lot of processor pins aren’t connected to anything!  The processor has a lot more I/O capability than it can connect through the standard Arduino footprint.  For that reason, the Netduino guys are working on the Netduino Plus.  It still has the Arduino footprint, but the Netduino Plus board adds an Ethernet connector and a micro SD card slot.  (It suddenly becomes clear why they moved the USB connector.)  As of this writing, the Netduino Plus is in beta and not generally available.

If that is not enough for you, there are currently 21 separate development boards you can buy that are based on the .NET micro framework.  Most are available from Mouser. 
.NET Micro Framework Hardware


If you want to write a more serious program that is larger, requires a faster processor and you want a better debugging environment, the Netduino has a lot to recommend it, and a variety of options if you outgrow it.  If you want maximum compatibility with existing Arduino shields and libraries, the Netduino may not be your best option.

Getting Started

Here are some links to get you started on Netduino:

Netduino Site
Netduino Getting Started PDF
Atmel Microcontroller Data
Atmel Microcontroller Full Datasheet
Netduino Schematic
Netduino Forums

Development Software
Microsoft Visual C# Express 2010
.NET Micro Framework SDK v4.1
Netduino SDK v4.1 (32-bit)
Netduino SDK v4.1 (64-bit)
.NET Micro Framework Reference


Addendum

One piece of information that I missed including in my original post is that the USB port works a bit differently on the Netduino than it does on the Arduino.

The Arduino lets you treat the USB port as a simple serial port, and it is very easy to write code that communicates across it. The Arduino has sorted out how to differentiate between the communication to download a new program and normal communication, and for most applications, it just works the way you want it to.

The Netduino works differently.  Its USB port carries the much more complicated communications that allow you to run debug commands and break into a running program, so it cannot be treated as a simple serial port.  You can recompile the download package without the debug monitor, which should allow simple serial communication (I have not tried it), but it is more trouble than working with the Arduino for applications where this is important.  That said, working with an in-circuit debugger is pretty useful if your program is longer than a few lines of code.

The Netduino Plus has become available in the interim (about $60), and the addition of an Ethernet port and a microSD card slot on the same size board makes it appropriate for a broader range of applications.  You can get free shipping if you buy it from Secret Labs through the Amazon storefront.

Addendum #2

It’s been great to see all the response to this article.  Here are some additional book resources you might be interested in.

Expert .NET Micro Framework  A couple of years old (2009), with nothing specifically about the Netduino, but a very thorough exploration of software development and the framework on similar devices.

Embedded Programming with the Microsoft .NET Micro Framework  Even older (2007), this is Microsoft’s official book on the subject.

Getting Started With Netduino  This book is not quite out yet as I write this.  It is Make Magazine’s book on the Netduino.  It looks to be less deeply technical than the other books, more hobbyist-friendly, and is geared specifically at the Netduino with examples you can do right away.


If you have a significant geek factor, you may have more than one computer in a room at home.  Sometimes you have your old computer plus your new computer, or your home computer plus your laptop from work, or a large stack of machines tracing your computer history over the last decade.

If you find yourself in this situation, you might find a use for a device I have never seen in any computer store or swap meet.  Fortunately, with very minimal soldering skill, you can build it in an evening very cheaply.

The problem this solves is what to do with the audio from both (or all) of those computers.  With this computer audio mixer, you can use one set of powered speakers and have the audio from all of your machines come through them. 

Note: this only works for powered speakers; the mixer will not drive unpowered speakers.

For my setup, I decided to have four inputs, but you can use the same approach for however many inputs you need.  Here’s the schematic:

Here’s what the circuit board looks like assembled:


 You can use either 1/4 watt or 1/8 watt resistors.  Here’s what the board looks like from the other side, with the locations of resistors shown:


 Here it is built into a box:

 I used some parts I had around the house, but you can build it from the following parts from Radio Shack:

Name Part Number Quantity
Proto Board 276-158 1
10K Resistors 271-1335 2
1/8 inch Stereo Jack 274-246 5
Box 270-1805 1
1/8 inch Stereo Cable 42-2387 4

Just use the stereo cables to connect the speaker outputs of your computers to the inputs of the box.  Then plug the powered speakers into the output.
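If you are curious how much each computer’s signal gets attenuated, here is a quick model.  It assumes the common passive-summing topology (each input feeding the output node through its own 10K resistor, with the other computers’ outputs looking like near-zero impedances), which is my reading of the circuit, so treat it as a sketch rather than a measurement.

```python
import math

def summing_gain(n_inputs, r=10_000):
    """Voltage gain seen by one input of a passive resistive summing mixer.

    Assumes each input feeds the output node through its own resistor r,
    and the other sources look like near-zero impedances (AC grounds).
    """
    r_others = r / (n_inputs - 1)     # other resistors in parallel to ground
    return r_others / (r + r_others)  # simple voltage divider

g = summing_gain(4)
print(f"Each input: x{g:.2f}, or {20 * math.log10(g):.1f} dB")
```

With four inputs each signal arrives at about one quarter of its original level (roughly -12 dB), which is why this trick only works with powered speakers that have their own volume control to make up the loss.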

In the 1970s, the first personal computers did not seem to be very important.  Arguably, it took far more time and energy to get them to do something than was ever saved by using them.  Nonetheless, tinkerers all over the place talked about how important they were going to be. 

By the mid-1980s, they actually started being useful, and by the 1990s, they had begun transforming our lives.  Secretarial pools, travel agents, newspaper classified ads, and letters sent through postal mail have largely become anachronisms, and the technology has changed almost every area of our lives.

But when we look at popular technologies that have come along since, they have been evolutionary, not revolutionary, despite what the ads say.  iPods, iPhones and iPads have just made some functions of personal computers available when you are not in front of a traditional computer.

But this week I got to play with a technology I believe may be truly transformative.  The device in question is called a 3D printer.  Calling it a printer is a bit misleading: it creates three-dimensional objects (in the case of the one I played with, out of ABS plastic) from a 3D model downloaded to it from a computer.

Professional 3D printers exist, but they are expensive, starting somewhere around $25K-$30K.  But some NYC hackers created a kit to build one for less than $1,000, popularly known as a MakerBot (the actual name is the Cupcake; MakerBot Industries is the name of their company), and they’ve been selling out each production run months ahead of time for the last year.  That’s what I got to play with.  You can see an object I printed below:

 MakerBot CS logo

The triangle is the actual object, the logo of Crash Space,
a hackerspace in Los Angeles. The frame underneath is
called a “raft” and is there to prevent curling as it cools.
The raft is peeled off and discarded.

The Cupcake is a machine you do a fair amount of futzing with in order to get it tuned in perfectly.  Once you do that, it runs well, but getting there can take a bit of work.  And the parts it creates sometimes are less polished than ones molded in a factory.  But if the technology evolves over the next decade in any way similar to how personal computers did in the late 1970s and early 1980s, the world will be a different place.  Here’s an example:

Twelve-year-old Julie buys a new cell phone.  They ship her the guts of it (a circuit board with a display and keyboard attached), and expect her to get the case for it separately.  She looks online, and finds a design she likes.  She edits the design, adding her name and a butterfly embossed on it.  She pays a license fee for the design, and then either prints out the case on the family 3D printer,  or goes to Ginko’s Copy Shop and has them print it for her.  She prints it in bright pink plastic that matches her room. 

Meanwhile, a fitting has broken on the dishwasher.  Her father downloads the part data from the appliance manufacturer’s website and prints out the part.  Julie’s birthday party is the following weekend, and her mother prints out personalized party favors shaped like butterflies (Julie likes butterflies). 

Does this sound fantastic?  Well, a MakerBot 3D printer was used about a month ago to create a replacement part for a dishwasher, though the manufacturer did not have a 3D model or even the plans available online.  At least for MakerBot tinkerers, this vision of the future is already becoming a reality.

I’m currently designing brackets to mount my cell phone and iPod in the car, which I’m hoping to print out soon (yes, I do know how geeky this sounds).  Meanwhile, the MakerBot folks announced the successor to the Cupcake this week, called the Thing-O-Matic.  It can print slightly larger objects, has more accuracy, and has a small conveyor belt to move completed objects off the printing surface, so the next object can be printed without interruption.  If you are interested in learning more, you can read their press release or watch a MakerBot in action.  Get ready for the future, here it comes!



Preamp Heaven

Musicians and audio people tend to go through a progression.  First they are fascinated with musical instruments.  Next they are fascinated with microphones.  But eventually they become fascinated with preamps.  It sounds like it should be a very minor accessory, something like a cup holder on an SUV, but over time one comes to realize that the preamp may be the determining factor in how your music sounds, whether you are playing live, recording, or just building a great stereo system.

A confusing array of preamps is available, running from under $100 to many thousands of dollars.  The factor I was really looking for in a preamp (and one many musicians look for) is referred to as “warmth”.  This is an imperfection (referred to by engineers as a “nonlinearity”) that was common in early tube-based audio equipment.  It tended to make louder sounds harmonically richer (adding even-order harmonics) and made them less different in volume from the quieter sounds.  Eventually, engineers built chips (called operational amplifiers) that did not have this imperfection (our engineer would say they are perfectly linear), but when people heard the new, perfect sound, it did not please them.  They said it was “cold” and lacking the warmth of tubes. 
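You can see this even-order behavior numerically.  The sketch below is a toy model, not a simulation of any real tube or FET: it passes a pure sine through a gentle square-law curve and measures the harmonics with a single-bin DFT.

```python
import cmath, math

def harmonic_amplitude(samples, k):
    """Amplitude of the k-th harmonic via a single DFT bin."""
    n = len(samples)
    s = sum(x * cmath.exp(-2j * math.pi * k * i / n)
            for i, x in enumerate(samples))
    return 2 * abs(s) / n

# A pure sine: 5 cycles over 1000 samples.
n, cycles = 1000, 5
sine = [math.sin(2 * math.pi * cycles * i / n) for i in range(n)]

# A gentle square-law nonlinearity, a toy stand-in for tube/FET "warmth".
warm = [x + 0.2 * x * x for x in sine]

print(f"fundamental:  {harmonic_amplitude(warm, cycles):.3f}")
print(f"2nd harmonic: {harmonic_amplitude(warm, 2 * cycles):.3f}")
```

The squared term turns part of the sine into a tone at exactly twice the frequency (sin² = (1 - cos 2x)/2), which is precisely the even-order harmonic content people describe as warm.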

This led to a fetish for vintage equipment from the era before the new chips started being used.  But equipment does not have to be vintage to have the warmth that makes it pleasing to the ear.  It also does not have to have tubes in order to have warmth.  There is a type of transistor called a FET (field effect transistor) that has a similar nonlinearity to what preamp tubes have.  With all this in mind, I decided to create my perfect preamp.

This week I built the first model for actual use.  There were lots of prototypes over the last year, and I created a couple of different printed circuit boards along the way.  The preamp has two channels, and each channel has three stages.  The first stage uses a FET, the second stage has tone controls mimicking those on a Fender Twin Reverb guitar amplifier, and the third stage is a gain stage that uses one of those operational amplifier chips, which will perfectly reproduce the warm sound of the FETs.

Below are pictures of the preamp.  I have used it with my Chapman Stick, electronic keyboards, and a Taylor acoustic guitar (with a mic and a piezo pickup) with good results on all of them.

Preamp Front

Gloster Preamp Open

Gloster Preamp Circuit Board

A Visit to Hackerland

Over the last few years hackerspaces have sprung up all over the U.S.  These are places where members chip in some money every month to rent a space and stock it with exotic tools and projects.  The members tend to have an eclectic mix of technical skills.  Often in any group of 5 you will find experience in writing computer programs, digital circuit design, analog circuit design, mechanical design, use of tools like CNC mills and laser cutter/etchers, art school, user interface design, musicianship, rocketry and building exotic radio equipment.  Plus a love of science fiction, comic books and techno music.  Whenever I encounter a hackerspace, I have the sense that I have found my tribe.

A local hackerspace in the Los Angeles area that is particularly active (there are a couple more around town) is called Crash Space.  I visited them a few times and joined their email list.  When I went to a meeting about a month ago, the club had just been selected to participate in the VIMBY/Scion Hackerspace Challenge.  Several hackerspaces around the country were given $3,000 each and asked to build something cool with it over the course of a couple of weeks.  Though I am not a member of Crash Space (yet, anyway), they let me be part of the team.

The project they decided to do was to put an array of ultrasonic sensors along the front of the storefront that is Crash Space.  As people walked in front of the building, they triggered the various sensors, which triggered different noise-making devices, many of them consisting of a relay driving a “thwacker” that banged a bottle or flower pot.  There was even an “Easter egg”: if you ran back and forth along the entire length of the building several times, it would start playing the Close Encounters 5-note signature tune (you know, Boo-boo-boo-bowww-boooo). 
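I don’t know how Crash Space actually implemented the Easter egg, but one plausible sketch (with hypothetical names, just to show the idea) counts full end-to-end traversals of the sensor array and fires the tune after enough of them:

```python
# Hypothetical sketch of the Easter-egg logic: count full traversals
# between the two end sensors, and fire after several of them.
# This is my guess at the approach, not Crash Space's actual code.
class EasterEgg:
    def __init__(self, num_sensors, traversals_needed=4):
        self.num_sensors = num_sensors
        self.traversals_needed = traversals_needed
        self.last_end = None   # which end sensor was reached last
        self.count = 0

    def sensor_triggered(self, index):
        """Feed in each sensor trigger; returns True when the tune should play."""
        if index == 0:
            end = "left"
        elif index == self.num_sensors - 1:
            end = "right"
        else:
            return False  # middle sensors don't mark a completed pass
        if self.last_end is not None and end != self.last_end:
            self.count += 1  # completed a full pass to the opposite end
        self.last_end = end
        if self.count >= self.traversals_needed:
            self.count = 0
            return True
        return False

egg = EasterEgg(num_sensors=8)
# Running back and forth: reaching 0 -> 7 -> 0 -> 7 -> 0 is four traversals.
hits = [egg.sensor_triggered(i) for i in (0, 3, 7, 5, 0, 7, 2, 0)]
print(hits[-1])  # True on the fourth traversal
```

The nice property of this scheme is that ordinary passers-by trip individual sensors (and thwackers) without ever completing enough full-length runs to trigger the tune.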

I built a stereo audio mixer with phantom power for the project, wired up ultrasonic sensors and helped to assemble the thwackers.  You can see pictures of the unveiling here:

 On a personal note, if you look at the 11th picture in the series, you can see the mixer I built on the lower level, just to the right of the Memory Man.

 Anyway, the sponsors are VIMBY, a competitor to YouTube, and Scion, the car company.  A videographer filmed the entire process, and there will be a video of it at some point on the VIMBY site.  And some of that footage will make its way into a Scion car commercial after that.  Also, they will choose a winner from among the hackerspaces that competed.  We are up against the legendary NYC Resistor hackerspace (in New York City).  But all that is in the future.  For now, it was great to be part of such an interesting and (to their neighbors) unbelievably cryptic project.