Hi all,
I've been attempting to run a simulation of Fletcher's band-widening
experiment, in which the threshold amplitude of a pure tone is measured
as a function of the bandwidth of a bandpassed noise masker centered on
the pure-tone frequency (2000 Hz in this case). It's essential for the
experiment that the noise power density is held constant across the
different masker bandwidths.
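To be explicit about what that requirement entails: the power density
N0 is the total noise power per unit bandwidth,

    N0 = P / BW    i.e.    P = N0 * BW

so holding N0 constant means that the total power P, and with it the
overall RMS amplitude (which goes as sqrt(P)), has to grow with the
masker bandwidth BW.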
I have been using the Butterworth bandpass opcode to filter
'rand'-generated white noise, and in this basic form the max amp in a
segment (as indicated by Csound) increases with bandwidth. I've then
tried using 'balance' to match the filtered noise samples to a
reference RMS value, and/or multiplying the instr's 'out' signal by a
fraction chosen so that the max amp values for all bandwidths come out
approximately equal.
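In skeleton form the instrument is something like the sketch below (the
amplitude values, the p-field carrying the bandwidth, and the f-table
number are placeholders for illustration rather than my exact
orchestra):

    sr      =        44100
    kr      =        4410
    ksmps   =        10
    nchnls  =        1

            instr 1
    ibw     =        p4                 ; masker bandwidth in Hz, from the score
    anoise  rand     10000              ; raw white noise, arbitrary peak amplitude
    afilt   butterbp anoise, 2000, ibw  ; bandpass centered on the 2000 Hz probe tone
    aref    oscili   10000, 2000, 1     ; reference tone with a known, fixed RMS
    abal    balance  afilt, aref        ; match the filtered noise to the reference RMS
            out      abal               ; (optionally scaled by a per-bandwidth fraction)
            endin

driven by a score along the lines of 'f1 0 8192 10 1' for the sine
table and one 'i1' statement per masker bandwidth in p4.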
I need some more information, or reassurance, that this approach is
indeed producing a more or less constant power density across
frequencies in the noise bands. When I look at the wave files with a
digital frequency analyser (Wave SE), the noise power density seems to
decrease with bandwidth, even though all the max amp values are now the
same. Is this just an artifact of the processing in the frequency
analyser, have I actually got decreasing noise power density, or both?
Any help appreciated...
Thanks in anticipation,
Pete Kearton