Electrical neuroimaging in humans identified the speed and the spatiotemporal brain mechanisms whereby sounds of living and man-made objects are discriminated. Subjects performed an "oddball" target detection task, selectively responding to sounds of either living or man-made objects on alternating blocks; the sound stimuli were controlled across categories for their spectrograms and harmonics-to-noise ratios. Analyses were conducted on 64-channel auditory evoked potentials (AEPs) from nontarget trials. Comparing responses to sounds of living versus man-made objects, these analyses tested for modulations in local AEP waveforms, in global response strength, and in the topography of the electric field at the scalp. In addition, the local autoregressive average distributed linear inverse solution was applied over periods of observed modulations. Just 70 ms after stimulus onset, a common network of brain regions within the auditory "what" processing stream responded more strongly to sounds of man-made than of living objects, with differential activity within the right temporal and left inferior frontal cortices. Over the 155-257 ms period, the duration of activity of a brain network including bilateral temporal and premotor cortices differed between categories of sounds: responses to sounds of living objects peaked approximately 12 ms later, and the activity of the brain network active over this period was prolonged, relative to responses to sounds of man-made objects. The earliest task-related effects were observed at approximately 100 ms poststimulus onset, placing an upper limit on the speed of cortical auditory object discrimination. These results provide critical temporal constraints on human auditory object recognition and semantic discrimination processes.
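The measures of global response strength and scalp topography mentioned above are commonly operationalized in electrical neuroimaging as global field power (GFP, the spatial standard deviation across electrodes at each time point) and global map dissimilarity (GMD, the difference between strength-normalized scalp maps of two conditions). The sketch below illustrates these two standard quantities on synthetic data; it is a minimal illustration of the general measures, not the authors' analysis pipeline, and all array shapes and variable names are assumptions.

```python
import numpy as np

def global_field_power(eeg):
    """GFP: spatial standard deviation across electrodes at each
    time point, for an average-referenced (channels x time) array."""
    return eeg.std(axis=0)

def global_map_dissimilarity(eeg_a, eeg_b):
    """GMD: root-mean-square difference between the two conditions'
    scalp maps after average-referencing and normalizing each map
    by its GFP, so only topography (not strength) contributes."""
    def normalize(eeg):
        eeg = eeg - eeg.mean(axis=0, keepdims=True)  # average reference
        return eeg / eeg.std(axis=0, keepdims=True)  # unit GFP per time point
    diff = normalize(eeg_a) - normalize(eeg_b)
    return np.sqrt((diff ** 2).mean(axis=0))

# Synthetic example: 64 channels, 300 time samples per condition
rng = np.random.default_rng(0)
aep_living = rng.standard_normal((64, 300))
aep_manmade = rng.standard_normal((64, 300))

gfp = global_field_power(aep_living)
gmd = global_map_dissimilarity(aep_living, aep_manmade)
print(gfp.shape, gmd.shape)  # one value per time point for each measure
```

By construction, GMD ranges from 0 (identical topographies) to 2 (inverted topographies), independently of response strength, which is why topographic and strength modulations can be assessed separately as described above.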