Mitchell Day

Mitch Day, portrait
Associate Professor
Life Sciences Building 241


Ph.D., New York University

Auditory Neurophysiology Laboratory

Courses Taught

  • BIOS 3430/5430: Principles of Physiology
  • BIOS 3450: Human Physiology
  • BIOS 4130/5130: Neuroscience
  • BIOS 4140/5140: Molecular and Cellular Neuroscience

Research Interests

  • Lab: Life Sciences Building 220

Everything we experience in the sensory world (i.e., vision, hearing, touch, smell, taste and proprioception) is translated into electrical bursts of activity—called spikes—in the neurons of our brains. I have a general interest in how sensory information is encoded in trains of spikes across neurons, and how higher-level areas of the brain decode the spike trains of lower-level neurons to form a percept of the sensory world. In investigating sensorineural coding, I focus on binaural hearing—i.e., the specific type of hearing conferred by having two ears. Binaural hearing underlies our ability to pinpoint the location of a sound source, identify different reverberative environments (e.g., shower stall vs. cathedral), and segregate a sound source of interest in a world of constant, competing sources. My lab uses neurophysiological approaches to measure the spiking of neurons in auditory areas of the brain and psychophysical approaches to measure perceptual abilities, both in response to binaural stimuli. We use sophisticated mathematical analyses to quantify information in neural data, and develop computational models to explore the function of neural circuits.
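As a toy illustration of this kind of information-theoretic analysis (a minimal sketch, not the lab's actual code or data), one can estimate the mutual information between a discrete sound-source location and a neuron's spike count from paired observations:

```python
import math
from collections import Counter

# Hypothetical example: estimate mutual information (in bits) between a
# stimulus variable (sound-source location) and a neural response variable
# (spike count), given paired (location, spike_count) observations.
def mutual_information(pairs):
    n = len(pairs)
    joint = Counter(pairs)                      # P(location, count)
    locs = Counter(loc for loc, _ in pairs)     # P(location)
    counts = Counter(c for _, c in pairs)       # P(count)
    mi = 0.0
    for (loc, c), nj in joint.items():
        p_joint = nj / n
        p_loc = locs[loc] / n
        p_c = counts[c] / n
        mi += p_joint * math.log2(p_joint / (p_loc * p_c))
    return mi

# Toy data: a neuron that fires more spikes for sounds on the right.
data = [("left", 1), ("left", 2), ("left", 1), ("left", 2),
        ("right", 5), ("right", 6), ("right", 5), ("right", 6)]
print(mutual_information(data))  # 1.0 bit: location fully recoverable
```

In this contrived case the spike-count distributions for the two locations do not overlap, so the count carries the full 1 bit needed to identify the location; real neural data require larger samples and bias corrections.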

Current questions being addressed in the lab are:

  1. How does the neural encoding of sound source location change in the face of changes in other aspects of an auditory stimulus, such as sound level, frequency spectrum and amplitude modulation?
  2. How does the central auditory system encode the location of a particular sound source in the presence of many competing sound sources?
  3. How does noise-induced hearing loss (affecting about 26 million Americans; NIDCD, 08/2015) affect the ability to both localize a single sound source and detect the separation of two concurrent sources? Further, how does hearing loss affect the neural encoding of binaural stimuli?


I earned my B.S. in Physics and Mathematics at the University of Iowa in 2002. There, I developed an interest in computational modeling in the biological sciences, particularly neuroscience. After college, I spent a year teaching algebra to eighth and ninth graders in the Boston public school system. I then did my graduate studies in Neural Science at New York University, first working with Dr. John Rinzel, investigating the effect of potassium channels on computational models of neurons in the auditory brainstem, which perform some of the fastest computations in the brain. Later with my adviser, Dr. Malcolm Semple, I investigated the underpinnings of binaural hearing by measuring sound-evoked spikes in neurons of the medial superior olive, which are the first neurons along the auditory pathway whose spiking activity is sensitive to differences in sound at the two ears. After earning my Ph.D. in 2009, I did postdoctoral research in the lab of Dr. Bertrand Delgutte at the Eaton-Peabody Laboratories of Massachusetts Eye and Ear and Harvard Medical School. There, I continued work on binaural hearing, measuring how the locations of two concurrent sound sources are encoded in spike trains of neurons in the inferior colliculus. I then joined the Department of Biological Sciences at Ohio University in 2015, where I continue to measure sound-evoked responses in auditory brain areas.

Representative Publications

Day, M.L. (2024) Head-related transfer functions of rabbits within the front horizontal plane. Hear Res 441:108924.

Haragopal, H., Dorkoski, R., Pollard, A.R., Whaley, G.A., Wohl, T.R., Stroud, N.C. and Day, M.L. (2020) Specific loss of neural sensitivity to interaural time difference of unmodulated noise stimuli following noise-induced hearing loss. J Neurophysiol 124:1165-1182.

Dorkoski, R., Hancock, K.E., Whaley, G.A., Wohl, T.R., Stroud, N.C. and Day, M.L. (2020) Stimulus-frequency-dependent dominance of sound localization cues across the cochleotopic map of the inferior colliculus. J Neurophysiol 123:1791-1807.

Haragopal, H., Dorkoski, R., Johnson, H.M., Berryman, M.A., Tanda, S. and Day, M.L. (2020) Paired measurements of cochlear function and hair cell count in Dutch-belted rabbits with noise-induced hearing loss. Hear Res 385:107845.

Day, M.L. and Delgutte, B. (2016) Neural population encoding and decoding of sound source location across sound level in the rabbit inferior colliculus. J Neurophysiol 115:193-207.

Day, M.L. and Delgutte, B. (2013) Decoding sound source location and separation using neural population activity patterns. J Neurosci 33:15837-15847.

Day, M.L., Koka, K. and Delgutte, B. (2012) Neural encoding of sound source location in the presence of a concurrent, spatially-separated source. J Neurophysiol 108:2612-2628.

Day, M.L. and Semple, M.N. (2011) Frequency-dependent interaural delays in the medial superior olive: implications for interaural cochlear delays. J Neurophysiol 106:1985-1999.

Day, M.L., Doiron, B. and Rinzel, J. (2008) Subthreshold K+ channel dynamics interact with stimulus spectrum to influence temporal coding in an auditory brain stem model. J Neurophysiol 99:534-544.