

In the Forum: Horn-Loaded Speakers
In the Thread: The tapped horns: cons, pros and Sound
Post Subject: I'm not convinced
Posted by JoshK on: 4/12/2011
I know the standard logic is that everything needs to be time-aligned, but on paper, at least, the lower the frequency, the less critical time-alignment should be.  I've heard of studies suggesting that at low frequencies the human ear cannot register a tone until a full wavelength has been played.  If that is true, then ponder this:

A 6 ms period corresponds to 167 Hz, or a wavelength of ~2 m; at 100 Hz the wavelength is roughly 3.4 m.  If you didn't hear that frequency until a full wavelength had passed, the tone would have to traverse 3.4 m before you could detect it.  So is moving a driver 10" one way or the other going to matter much?
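For concreteness, here is that arithmetic in a few lines of Python (assuming a speed of sound of ~343 m/s at room temperature; the 10" figure is just the driver offset used above):

```python
# Back-of-the-envelope check of the numbers above.
c = 343.0                 # speed of sound, m/s

f = 1 / 0.006             # a 6 ms period -> ~167 Hz
print(f, c / f)           # 166.7 Hz, wavelength ~2.06 m

print(c / 100)            # wavelength at 100 Hz: ~3.43 m

offset = 10 * 0.0254      # a 10-inch driver offset, in metres
print(offset / c * 1000)  # extra travel time: ~0.74 ms
```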

Now, I do not know whether 100 Hz is low enough to be considered "low frequency" for the study above; in fact, I am sure it isn't.  But the point is that our hearing acuity drops with the inverse of frequency down low.  So it seems to follow that time alignment at 100 Hz is less critical than at 200 Hz, and at 20 Hz far less critical than at 100 Hz, as the numbers below illustrate.
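To make that scaling concrete, here is the same fixed 10" offset expressed as a fraction of a wavelength at a few frequencies (again assuming ~343 m/s; the choice of frequencies is just for illustration):

```python
# A fixed 0.254 m (10") offset is a smaller and smaller fraction of a
# wavelength as frequency drops, so the phase error it causes shrinks.
c, offset = 343.0, 0.254
for f in (200, 100, 20):
    wavelength = c / f
    print(f"{f} Hz: {offset / wavelength:.3f} wavelengths "
          f"({360 * offset / wavelength:.0f} deg of phase)")
# 200 Hz: 0.148 wavelengths (53 deg)
# 100 Hz: 0.074 wavelengths (27 deg)
# 20 Hz:  0.015 wavelengths (5 deg)
```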

The question that follows is: how critical is time-alignment at 100 Hz?

I am just not convinced by dogmatic theorizing.  I'd be more swayed by someone's empirical analysis (which I have not done myself, which is why I am skeptical rather than decided), but one needs to be very careful that any perceived difference comes from time alignment and not from room modes.  I actually think it might be easier to test this idea with headphones and a computer-generated delay, along the lines of the sketch below.
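Something like the following would do as a starting point.  This is only a minimal sketch of the headphone idea, not a worked-out protocol: the test frequencies, delay values, and file names are all assumptions for illustration.

```python
# Generate WAV files for an A/B headphone test: a 1 kHz tone mixed
# with a 100 Hz tone, with the low tone delayed by varying amounts.
import numpy as np
from scipy.io import wavfile

FS = 48000                          # sample rate, Hz
t = np.arange(int(FS * 3.0)) / FS   # 3 seconds of samples

def test_signal(delay_ms):
    """Mix a 1 kHz tone with a 100 Hz tone delayed by delay_ms."""
    delay_s = delay_ms / 1000.0
    high = 0.3 * np.sin(2 * np.pi * 1000 * t)
    low = 0.3 * np.sin(2 * np.pi * 100 * (t - delay_s))
    low[t < delay_s] = 0.0          # silence the low tone before its onset
    return (high + low).astype(np.float32)

# 0 ms is the "time-aligned" reference; 0.74 ms is roughly a 10" path
# difference; the larger values bracket the full-wavelength argument.
for delay in (0.0, 0.74, 3.0, 6.0):
    wavfile.write(f"align_{delay}ms.wav", FS, test_signal(delay))
```

Listening over headphones takes the room out of the equation entirely, so any audible difference between the files can be attributed to the delay itself rather than to room modes.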
