-
Text
A resource consisting primarily of words for reading. Examples include books, letters, dissertations, poems, newspapers, articles, archives of mailing lists. Note that facsimiles or images of texts are still of the genre Text.
URL Address
<a href="http://doi.org/10.3389/fnbeh.2016.00038" target="_blank" rel="noreferrer noopener">http://doi.org/10.3389/fnbeh.2016.00038</a>
Pages
38
Volume
10
Dublin Core
The Dublin Core metadata element set is common to all Omeka records, including items, files, and collections. For more information, see http://dublincore.org/documents/dces/.
Title
A name given to the resource
Contextual Modulation of Vocal Behavior in Mouse: Newly Identified 12 kHz "Mid-Frequency" Vocalization Emitted during Restraint.
Publisher
An entity responsible for making the resource available
Frontiers in behavioral neuroscience
Date
A point or period of time associated with an event in the lifecycle of the resource
2016
Subject
The topic of the resource
mouse; context; isolation; restraint; stress; vocalization
Creator
An entity primarily responsible for making the resource
Grimsley Jasmine M S; Sheth Saloni; Vallabh Neil; Grimsley Calum A; Bhattal Jyoti; Latsko Maeson; Jasnow Aaron; Wenstrup Jeffrey J
Description
An account of the resource
While several studies have investigated mouse ultrasonic vocalizations (USVs) emitted by isolated pups or by males in mating contexts, studies of behavioral contexts other than mating and vocalization categories other than USVs have been limited. By improving our understanding of the vocalizations emitted by mice across behavioral contexts, we will better understand the natural vocal behavior of mice and better interpret vocalizations from mouse models of disease. Hypothesizing that mouse vocal behavior would differ depending on behavioral context, we recorded vocalizations from male CBA/CaJ mice across three behavioral contexts: mating, isolation, and restraint. We found that brief restraint elevated blood corticosterone levels of mice, indicating increased stress relative to isolation. Further, after 3 days of brief restraint, mice displayed behavioral changes indicative of stress. These changes persisted for at least 2 days after restraint. Contextual differences in mouse vocal behavior were striking and robust across animals. Thus, while USVs were the most common vocalization type across contexts, the spectrotemporal features of USVs were context-dependent. Compared to the mating context, vocalizations during isolation and restraint displayed a broader frequency range, with a greater emphasis on frequencies below 50 kHz. These contexts also included more non-USV vocal categories and different vocal patterns. We identified a new Mid-Frequency Vocalization, a tonal vocalization with fundamental frequencies below 18 kHz, which was almost exclusively emitted by mice undergoing restraint stress. These differences combine to form vocal behavior that is grossly different among behavioral contexts and may reflect the level of anxiety in these contexts.
Identifier
An unambiguous reference to the resource within a given context
<a href="http://doi.org/10.3389/fnbeh.2016.00038" target="_blank" rel="noreferrer noopener">10.3389/fnbeh.2016.00038</a>
Rights
Information about rights held in and over the resource
Article information provided for research and reference use only. All rights are retained by the journal listed under publisher and/or the creator(s).
2016
Bhattal Jyoti
College of Anatomy & Neurobiology
context
Department of Anatomy & Neurobiology
Frontiers in behavioral neuroscience
Grimsley Calum A
Grimsley Jasmine M S
isolation
Jasnow Aaron
Latsko Maeson
mouse
NEOMED College of Medicine
Restraint
Sheth Saloni
Stress
Vallabh Neil
Vocalization
Wenstrup Jeffrey J
-
Text
A resource consisting primarily of words for reading. Examples include books, letters, dissertations, poems, newspapers, articles, archives of mailing lists. Note that facsimiles or images of texts are still of the genre Text.
URL Address
<a href="http://doi.org/10.3389/fnbeh.2012.00089" target="_blank" rel="noreferrer noopener">http://doi.org/10.3389/fnbeh.2012.00089</a>
Pages
89
Volume
6
Dublin Core
The Dublin Core metadata element set is common to all Omeka records, including items, files, and collections. For more information, see http://dublincore.org/documents/dces/.
Title
A name given to the resource
Automated classification of mouse pup isolation syllables: from cluster analysis to an Excel-based "mouse pup syllable classification calculator".
Publisher
An entity responsible for making the resource available
Frontiers in behavioral neuroscience
Date
A point or period of time associated with an event in the lifecycle of the resource
2012
Subject
The topic of the resource
vocalization; cluster analysis; communication call; isolation calls; mouse pup calls; mouse song
Creator
An entity primarily responsible for making the resource
Grimsley Jasmine M S; Gadziola Marie A; Wenstrup Jeffrey J
Description
An account of the resource
Mouse pups vocalize at high rates when they are cold or isolated from the nest. The proportions of each syllable type produced carry information about disease state and are being used as behavioral markers for the internal state of animals. Manual classifications of these vocalizations identified 10 syllable types based on their spectro-temporal features. However, manual classification of mouse syllables is time consuming and vulnerable to experimenter bias. This study uses an automated cluster analysis to identify acoustically distinct syllable types produced by CBA/CaJ mouse pups, and then compares the results to prior manual classification methods. The cluster analysis identified two syllable types, based on their frequency bands, that have continuous frequency-time structure, and two syllable types featuring abrupt frequency transitions. Although cluster analysis computed fewer syllable types than manual classification, the clusters represented well the probability distributions of the acoustic features within syllables. These probability distributions indicate that some of the manually classified syllable types are not statistically distinct. The characteristics of the four classified clusters were used to generate a Microsoft Excel-based mouse syllable classifier that rapidly categorizes syllables into the syllable types determined by cluster analysis, with over 90% agreement.
Identifier
An unambiguous reference to the resource within a given context
<a href="http://doi.org/10.3389/fnbeh.2012.00089" target="_blank" rel="noreferrer noopener">10.3389/fnbeh.2012.00089</a>
Rights
Information about rights held in and over the resource
Article information provided for research and reference use only. All rights are retained by the journal listed under publisher and/or the creator(s).
2012
Cluster Analysis
College of Anatomy & Neurobiology
communication call
Department of Anatomy & Neurobiology
Frontiers in behavioral neuroscience
Gadziola Marie A
Grimsley Jasmine M S
isolation calls
mouse pup calls
mouse song
NEOMED College of Medicine
Vocalization
Wenstrup Jeffrey J
-
Text
A resource consisting primarily of words for reading. Examples include books, letters, dissertations, poems, newspapers, articles, archives of mailing lists. Note that facsimiles or images of texts are still of the genre Text.
URL Address
<a href="http://doi.org/10.1523/JNEUROSCI.2205-13.2013" target="_blank" rel="noreferrer noopener">http://doi.org/10.1523/JNEUROSCI.2205-13.2013</a>
Pages
17538–17548
Issue
44
Volume
33
Dublin Core
The Dublin Core metadata element set is common to all Omeka records, including items, files, and collections. For more information, see http://dublincore.org/documents/dces/.
Title
A name given to the resource
Coding the meaning of sounds: contextual modulation of auditory responses in the basolateral amygdala.
Publisher
An entity responsible for making the resource available
The Journal of neuroscience : the official journal of the Society for Neuroscience
Date
A point or period of time associated with an event in the lifecycle of the resource
2013
2013-10
Subject
The topic of the resource
Female; Male; Animals; Mice; Acoustic Stimulation/*methods; Auditory Perception/*physiology; Action Potentials/*physiology; Amygdala/*physiology; Cats; Animal/*physiology; Inbred CBA; Vocalization
Creator
An entity primarily responsible for making the resource
Grimsley Jasmine M S; Hazlett Emily G; Wenstrup Jeffrey J
Description
An account of the resource
Female mice emit a low-frequency harmonic (LFH) call in association with distinct behavioral contexts: mating and physical threat or pain. Here we report the results of acoustic, behavioral, and neurophysiological studies of the contextual analysis of these calls in CBA/CaJ mice. We first show that the acoustical features of the LFH call do not differ between contexts. We then show that male mice avoid the LFH call in the presence of a predator cue (cat fur) but are more attracted to the same exemplar of the call in the presence of a mating cue (female urine). The males thus use nonauditory cues to determine the meaning of the LFH call, but these cues do not generalize to noncommunication sounds, such as noise bursts. We then characterized neural correlates of contextual meaning of the LFH call in responses of basolateral amygdala (BLA) neurons from awake, freely moving mice. There were two major findings. First, BLA neurons typically displayed early excitation to all tested behaviorally aversive stimuli. Second, the nonauditory context modulates the BLA population response to the LFH call but not to the noncommunication sound. These results suggest that the meaning of communication calls is reflected in the spike discharge patterns of BLA neurons.
Identifier
An unambiguous reference to the resource within a given context
<a href="http://doi.org/10.1523/JNEUROSCI.2205-13.2013" target="_blank" rel="noreferrer noopener">10.1523/JNEUROSCI.2205-13.2013</a>
Rights
Information about rights held in and over the resource
Article information provided for research and reference use only. All rights are retained by the journal listed under publisher and/or the creator(s).
2013
Acoustic Stimulation/*methods
Action Potentials/*physiology
Amygdala/*physiology
Animal/*physiology
Animals
Auditory Perception/*physiology
Cats
College of Anatomy & Neurobiology
Department of Anatomy & Neurobiology
Female
Grimsley Jasmine M S
Hazlett Emily G
Inbred CBA
Male
Mice
NEOMED College of Medicine
The Journal of neuroscience : the official journal of the Society for Neuroscience
Vocalization
Wenstrup Jeffrey J
-
Text
A resource consisting primarily of words for reading. Examples include books, letters, dissertations, poems, newspapers, articles, archives of mailing lists. Note that facsimiles or images of texts are still of the genre Text.
URL Address
<a href="http://doi.org/10.1371/journal.pone.0194091" target="_blank" rel="noreferrer noopener">http://doi.org/10.1371/journal.pone.0194091</a>
Pages
e0194091
Issue
3
Volume
13
Dublin Core
The Dublin Core metadata element set is common to all Omeka records, including items, files, and collections. For more information see, http://dublincore.org/documents/dces/.
Title
A name given to the resource
Communication calls produced by electrical stimulation of four structures in the guinea pig brain.
Publisher
An entity responsible for making the resource available
PloS one
Date
A point or period of time associated with an event in the lifecycle of the resource
2018
Subject
The topic of the resource
Female; Male; Animals; Acoustic Stimulation/methods; Auditory Perception/physiology; Brain/*physiology; Electric Stimulation/methods; Guinea Pigs; Neurons/physiology; Animal/physiology; Vocalization
Creator
An entity primarily responsible for making the resource
Green David B; Shackleton Trevor M; Grimsley Jasmine M S; Zobay Oliver; Palmer Alan R; Wallace Mark N
Description
An account of the resource
One of the main central processes affecting the cortical representation of conspecific vocalizations is the collateral output from the extended motor system for call generation. Before starting to study this interaction, we sought to compare the characteristics of calls produced by stimulating four different parts of the brain in guinea pigs (Cavia porcellus). By using anaesthetised animals, we were able to reposition electrodes without distressing the animals. Trains of 100 electrical pulses were used to stimulate the midbrain periaqueductal grey (PAG), hypothalamus, amygdala, and anterior cingulate cortex (ACC). Each structure produced a similar range of calls, but in significantly different proportions. Two of the spontaneous calls (chirrup and purr) were never produced by electrical stimulation, and although we identified versions of chutter, durr and tooth chatter, they differed significantly from our natural call templates. However, we were routinely able to elicit seven other identifiable calls. All seven calls were produced both during the 1.6 s period of stimulation and subsequently in a period which could last for more than a minute. A single stimulation site could produce four or five different calls, but the amygdala was much less likely to produce a scream, whistle or rising whistle than any of the other structures. These three high-frequency calls were more likely to be produced by females than males. There were also differences in the timing of the call production, with the amygdala primarily producing calls during the electrical stimulation and the hypothalamus mainly producing calls after the electrical stimulation. For all four structures, a significantly higher stimulation current was required in males than females. We conclude that all four structures can be stimulated to produce fictive vocalizations that should be useful in studying the relationship between the vocal motor system and cortical sensory representation.
Identifier
An unambiguous reference to the resource within a given context
<a href="http://doi.org/10.1371/journal.pone.0194091" target="_blank" rel="noreferrer noopener">10.1371/journal.pone.0194091</a>
Rights
Information about rights held in and over the resource
Article information provided for research and reference use only. All rights are retained by the journal listed under publisher and/or the creator(s).
2018
Acoustic Stimulation/methods
Animal/physiology
Animals
Auditory Perception/physiology
Brain/*physiology
Electric Stimulation/methods
Female
Green David B
Grimsley Jasmine M S
Guinea Pigs
Male
Neurons/physiology
Palmer Alan R
PloS one
Shackleton Trevor M
Vocalization
Wallace Mark N
Zobay Oliver
-
Text
A resource consisting primarily of words for reading. Examples include books, letters, dissertations, poems, newspapers, articles, archives of mailing lists. Note that facsimiles or images of texts are still of the genre Text.
URL Address
<a href="http://doi.org/10.1371/journal.pone.0044550" target="_blank" rel="noreferrer noopener">http://doi.org/10.1371/journal.pone.0044550</a>
Pages
e44550
Issue
9
Volume
7
Dublin Core
The Dublin Core metadata element set is common to all Omeka records, including items, files, and collections. For more information, see http://dublincore.org/documents/dces/.
Title
A name given to the resource
Social vocalizations of big brown bats vary with behavioral context.
Publisher
An entity responsible for making the resource available
PloS one
Date
A point or period of time associated with an event in the lifecycle of the resource
2012
Subject
The topic of the resource
Animals; *Animal Communication; Chiroptera/*physiology; Electrocardiography; Acoustics; *Behavior; Animal
Creator
An entity primarily responsible for making the resource
Gadziola Marie A; Grimsley Jasmine M S; Faure Paul A; Wenstrup Jeffrey J
Description
An account of the resource
Bats are among the most gregarious and vocal mammals, with some species demonstrating a diverse repertoire of syllables under a variety of behavioral contexts. Despite extensive characterization of big brown bat (Eptesicus fuscus) biosonar signals, there have been no detailed studies of adult social vocalizations. We recorded and analyzed social vocalizations and associated behaviors of captive big brown bats under four behavioral contexts: low aggression, medium aggression, high aggression, and appeasement. Even limited to these contexts, big brown bats possess a rich repertoire of social vocalizations, with 18 distinct syllable types automatically classified using a spectrogram cross-correlation procedure. For each behavioral context, we describe vocalizations in terms of syllable acoustics, temporal emission patterns, and typical syllable sequences. Emotion-related acoustic cues are evident within the call structure by context-specific syllable types or variations in the temporal emission pattern. We designed a paradigm that could evoke aggressive vocalizations while monitoring heart rate as an objective measure of internal physiological state. Changes in the magnitude and duration of elevated heart rate scaled to the level of evoked aggression, confirming the behavioral state classifications assessed by vocalizations and behavioral displays. These results reveal a complex acoustic communication system among big brown bats in which acoustic cues and call structure signal the emotional state of a caller.
Identifier
An unambiguous reference to the resource within a given context
<a href="http://doi.org/10.1371/journal.pone.0044550" target="_blank" rel="noreferrer noopener">10.1371/journal.pone.0044550</a>
Rights
Information about rights held in and over the resource
Article information provided for research and reference use only. All rights are retained by the journal listed under publisher and/or the creator(s).
*Animal Communication
*Behavior
2012
Acoustics
Animal
Animals
Chiroptera/*physiology
College of Anatomy & Neurobiology
Department of Anatomy & Neurobiology
Electrocardiography
Faure Paul A
Gadziola Marie A
Grimsley Jasmine M S
NEOMED College of Medicine
PloS one
Wenstrup Jeffrey J
-
Text
A resource consisting primarily of words for reading. Examples include books, letters, dissertations, poems, newspapers, articles, archives of mailing lists. Note that facsimiles or images of texts are still of the genre Text.
URL Address
<a href="http://doi.org/10.1371/journal.pone.0017460" target="_blank" rel="noreferrer noopener">http://doi.org/10.1371/journal.pone.0017460</a>
Pages
e17460
Issue
3
Volume
6
Dublin Core
The Dublin Core metadata element set is common to all Omeka records, including items, files, and collections. For more information, see http://dublincore.org/documents/dces/.
Title
A name given to the resource
Development of social vocalizations in mice.
Publisher
An entity responsible for making the resource available
PloS one
Date
A point or period of time associated with an event in the lifecycle of the resource
2011
2011-03
Subject
The topic of the resource
Female; Male; Animals; Mice; *Social Behavior; Acoustics; Aging/physiology; Phonetics; Sound Spectrography; Nonlinear Dynamics; Vocal Cords/physiology; Newborn; Animal/*physiology; Vocalization
Creator
An entity primarily responsible for making the resource
Grimsley Jasmine M S; Monaghan Jessica J M; Wenstrup Jeffrey J
Description
An account of the resource
Adult mice are highly vocal animals, with both males and females vocalizing in same sex and cross sex social encounters. Mouse pups are also highly vocal, producing isolation vocalizations when they are cold or removed from the nest. This study examined patterns in the development of pup isolation vocalizations, and compared these to adult vocalizations. In three litters of CBA/CaJ mice, we recorded isolation vocalizations at ages postnatal day 5 (p5), p7, p9, p11, and p13. Adult vocalizations were obtained in a variety of social situations. Altogether, 28,384 discrete vocal signals were recorded using high-frequency-sensitive equipment and analyzed for syllable type, spectral and temporal features, and the temporal sequencing within bouts. We found that pups produced all but one of the 11 syllable types recorded from adults. The proportions of syllable types changed developmentally, but even the youngest pups produced complex syllables with frequency-time variations. When all syllable types were pooled together for analysis, changes in the peak frequency or the duration of syllables were small, although significant, from p5 through p13. However, individual syllable types showed different, large patterns of change over development, requiring analysis of each syllable type separately. Most adult syllables were substantially lower in frequency and shorter in duration. As pups aged, the complexity of vocal bouts increased, with a greater tendency to switch between syllable types. Vocal bouts from older animals, p13 and adult, had significantly more sequential structure than those from younger mice. Overall, these results demonstrate substantial changes in social vocalizations with age. Future studies are required to identify whether these changes result from developmental processes affecting the vocal tract or control of vocalization, or from vocal learning. To provide a tool for further research, we developed a MATLAB program that generates bouts of vocalizations that correspond to mice of different ages.
Identifier
An unambiguous reference to the resource within a given context
<a href="http://doi.org/10.1371/journal.pone.0017460" target="_blank" rel="noreferrer noopener">10.1371/journal.pone.0017460</a>
Rights
Information about rights held in and over the resource
Article information provided for research and reference use only. All rights are retained by the journal listed under publisher and/or the creator(s).
*Social Behavior
2011
Acoustics
Aging/physiology
Animal/*physiology
Animals
College of Anatomy & Neurobiology
Department of Anatomy & Neurobiology
Female
Grimsley Jasmine M S
Male
Mice
Monaghan Jessica J M
NEOMED College of Medicine
Newborn
Nonlinear Dynamics
Phonetics
PloS one
Sound Spectrography
Vocal Cords/physiology
Vocalization
Wenstrup Jeffrey J
-
Text
A resource consisting primarily of words for reading. Examples include books, letters, dissertations, poems, newspapers, articles, archives of mailing lists. Note that facsimiles or images of texts are still of the genre Text.
URL Address
<a href="http://doi.org/10.1152/jn.00422.2011" target="_blank" rel="noreferrer noopener">http://doi.org/10.1152/jn.00422.2011</a>
Pages
1047–1057
Issue
4
Volume
107
Dublin Core
The Dublin Core metadata element set is common to all Omeka records, including items, files, and collections. For more information, see http://dublincore.org/documents/dces/.
Title
A name given to the resource
A novel coding mechanism for social vocalizations in the lateral amygdala.
Publisher
An entity responsible for making the resource available
Journal of neurophysiology
Date
A point or period of time associated with an event in the lifecycle of the resource
2012
2012-02
Subject
The topic of the resource
*Social Behavior; Acoustic Stimulation; Action Potentials/*physiology; Amygdala/*cytology/physiology; Animal/*physiology; Animals; Auditory Pathways/*physiology; Chiroptera; Dextrans/metabolism; Echolocation/physiology; Female; Male; Neurons/*physiology; Reaction Time/physiology; Rhodamines/metabolism; Time Factors; Vocalization
Creator
An entity primarily responsible for making the resource
Gadziola Marie A; Grimsley Jasmine M S; Shanbhag Sharad J; Wenstrup Jeffrey J
Description
An account of the resource
The amygdala plays a central role in evaluating the significance of acoustic signals and coordinating the appropriate behavioral responses. To understand how amygdalar responses modulate auditory processing and drive emotional expression, we assessed how neurons respond to and encode information that is carried within complex acoustic stimuli. We characterized responses of single neurons in the lateral nucleus of the amygdala to social vocalizations and synthetic acoustic stimuli in awake big brown bats. Neurons typically responded to most of the social vocalizations presented (mean = 9 of 11 vocalizations) but differentially modulated both firing rate and response duration. Surprisingly, response duration provided substantially more information about vocalizations than did spike rate. In most neurons, variation in response duration depended, in part, on persistent excitatory discharge that extended beyond stimulus duration. Information in persistent firing duration was significantly greater than in spike rate, and the majority of neurons displayed more information in persistent firing, which was more likely to be observed in response to aggressive vocalizations (64%) than appeasement vocalizations (25%), suggesting that persistent firing may relate to the behavioral context of vocalizations. These findings suggest that the amygdala uses a novel coding strategy for discriminating among vocalizations and underscore the importance of persistent firing in the general functioning of the amygdala.
Identifier
An unambiguous reference to the resource within a given context
<a href="http://doi.org/10.1152/jn.00422.2011" target="_blank" rel="noreferrer noopener">10.1152/jn.00422.2011</a>
Rights
Information about rights held in and over the resource
Article information provided for research and reference use only. All rights are retained by the journal listed under publisher and/or the creator(s).
*Social Behavior
2012
Acoustic Stimulation
Action Potentials/*physiology
Amygdala/*cytology/physiology
Animal/*physiology
Animals
Auditory Pathways/*physiology
Chiroptera
College of Anatomy & Neurobiology
Department of Anatomy & Neurobiology
Dextrans/metabolism
Echolocation/physiology
Female
Gadziola Marie A
Grimsley Jasmine M S
Journal of neurophysiology
Male
NEOMED College of Medicine
Neurons/*physiology
Reaction Time/physiology
Rhodamines/metabolism
Shanbhag Sharad J
Time Factors
Vocalization
Wenstrup Jeffrey J
-
Text
A resource consisting primarily of words for reading. Examples include books, letters, dissertations, poems, newspapers, articles, archives of mailing lists. Note that facsimiles or images of texts are still of the genre Text.
URL Address
<a href="http://doi.org/10.1016/j.psyneuen.2016.11.003" target="_blank" rel="noreferrer noopener">http://doi.org/10.1016/j.psyneuen.2016.11.003</a>
Pages
203–212
Volume
75
Dublin Core
The Dublin Core metadata element set is common to all Omeka records, including items, files, and collections. For more information, see http://dublincore.org/documents/dces/.
Title
A name given to the resource
Prenatal stress changes courtship vocalizations and bone mineral density in mice.
Publisher
An entity responsible for making the resource available
Psychoneuroendocrinology
Date
A point or period of time associated with an event in the lifecycle of the resource
2017
2017-01
Subject
The topic of the resource
*Bone mineral density; *Corticosterone; *Courtship; *GR; *Prenatal stress; *Testosterone; *Ultrasonic vocalizations; Animal/*physiology; Animals; Bone Density/*physiology; Female; Inbred C57BL; Male; Mice; Pregnancy; Prenatal Exposure Delayed Effects/*metabolism/*physiopathology; Psychological/*metabolism/*physiopathology; Stress; Vocalization
Creator
An entity primarily responsible for making the resource
Schmidt Michaela; Lapert Florian; Brandwein Christiane; Deuschle Michael; Kasperk Christian; Grimsley Jasmine M S; Gass Peter
Description
An account of the resource
Stress during the prenatal period has various effects on social and sexual behavior in both human and animal offspring. The present study examines the effects of chronic restraint stress in the second vs. third trimester of pregnancy, and of glucocorticoid receptor (GR) heterozygous mutation, on C57BL/6N male offspring's vocal courtship behavior in adulthood, using a novel analysis method. Finally, corticosterone and testosterone levels as well as bone mineral density were measured. Prenatal stress in the third, but not in the second trimester caused a significant qualitative change in males' courtship vocalizations, independent of their GR genotype. Bone mineral density was likewise decreased by prenatal stress exclusively in the third trimester, in both GR mutant and wildtype mice, and, in contrast to corticosterone and testosterone, was highly correlated with courtship vocalizations. In Gr(+/-) males, corticosterone serum levels were significantly increased in animals that had experienced prenatal stress in the third trimester. Testosterone serum levels tended to be increased in Gr(+/-) males in comparison to wildtypes, whereas prenatal stress had no influence. Prenatal stress alters adult males' courtship vocalizations exclusively when applied in the third trimester, with closely related changes in bone mineral density. Bone mineral density seems to reflect best the complex neuroendocrine mechanisms underlying the production of courtship vocalizations. In addition, we demonstrated for the first time elevated basal corticosterone levels in Gr(+/-) males after prenatal stress, which suggests that the Gr(+/-) mouse model of depression might also serve as a model of prenatal stress in male offspring.
Identifier
An unambiguous reference to the resource within a given context
<a href="http://doi.org/10.1016/j.psyneuen.2016.11.003" target="_blank" rel="noreferrer noopener">10.1016/j.psyneuen.2016.11.003</a>
Rights
Information about rights held in and over the resource
Article information provided for research and reference use only. All rights are retained by the journal listed under publisher and/or the creator(s).
*Bone mineral density
*Corticosterone
*Courtship
*GR
*Prenatal stress
*Testosterone
*Ultrasonic vocalizations
2017
Animal/*physiology
Animals
Bone Density/*physiology
Brandwein Christiane
Deuschle Michael
Female
Gass Peter
Grimsley Jasmine M S
Inbred C57BL
Kasperk Christian
Lapert Florian
Male
Mice
Pregnancy
Prenatal Exposure Delayed Effects/*metabolism/*physiopathology
Psychological/*metabolism/*physiopathology
Psychoneuroendocrinology
Schmidt Michaela
Stress
Vocalization
-
Text
A resource consisting primarily of words for reading. Examples include books, letters, dissertations, poems, newspapers, articles, archives of mailing lists. Note that facsimiles or images of texts are still of the genre Text.
URL Address
<a href="http://doi.org/10.1016/j.jneumeth.2015.07.001" target="_blank" rel="noreferrer noopener">http://doi.org/10.1016/j.jneumeth.2015.07.001</a>
Pages
206–217
Volume
253
Dublin Core
The Dublin Core metadata element set is common to all Omeka records, including items, files, and collections. For more information, see http://dublincore.org/documents/dces/.
Title
A name given to the resource
An improved approach to separating startle data from noise.
Publisher
An entity responsible for making the resource available
Journal of neuroscience methods
Date
A point or period of time associated with an event in the lifecycle of the resource
2015
2015-09
Subject
The topic of the resource
*Electronic Data Processing; *Noise; Acoustic startle reflex; Acoustic Stimulation/methods; Analysis of Variance; Animal locomotion; Animals; Auditory/*physiology; Automated classification; Evoked Potentials; Inbred CBA; Male; Mice; Reflex; Startle waveform analysis; Startle/*physiology; Time Factors; Video Recording
Creator
An entity primarily responsible for making the resource
Grimsley Calum A; Longenecker Ryan J; Rosen Merri J; Young Jesse W; Grimsley Jasmine M S; Galazyuk Alexander V
Description
An account of the resource
BACKGROUND: The acoustic startle reflex (ASR) is a rapid, involuntary movement to sound, found in many species. The ASR can be modulated by external stimuli and internal state, making it a useful tool in many disciplines. ASR data collection and interpretation vary greatly across laboratories, making comparisons a challenge. NEW METHOD: Here we investigate the animal movement associated with a startle in the mouse (CBA/CaJ). Movements were simultaneously captured with high-speed video and a piezoelectric startle plate. We also use simple mathematical extrapolations to convert startle data (force) into center of mass displacement ("height"), which incorporates the animal's mass. RESULTS: Startle plate force data revealed a stereotyped waveform associated with a startle that contained three distinct peaks. This waveform allowed researchers to separate trials into 'startles' and 'no-startles' (termed 'manual classification'). Fleiss' kappa and Krippendorff's alpha (0.865 for both) indicate very good levels of agreement between researchers. Further work uses this waveform to develop an automated startle classifier. The automated classifier compares favorably with manual classification. A two-way ANOVA reveals no significant difference in the magnitude of the three peaks as classified by the manual and automated methods (P1: p=0.526, N1: p=0.488, P2: p=0.529). COMPARISON WITH EXISTING METHOD(S): The ability of the automated classifier was compared with three other commonly used classification methods; the automated classifier far outperformed these methods. CONCLUSIONS: The improvements made allow researchers to automatically separate startle data from noise, and to normalize for an individual animal's mass. These steps ease inter-animal and inter-laboratory comparisons of startle data.
Identifier
An unambiguous reference to the resource within a given context
<a href="http://doi.org/10.1016/j.jneumeth.2015.07.001" target="_blank" rel="noreferrer noopener">10.1016/j.jneumeth.2015.07.001</a>
Rights
Information about rights held in and over the resource
Article information provided for research and reference use only. All rights are retained by the journal listed under publisher and/or the creator(s).
*Electronic Data Processing
*Noise
2015
Acoustic startle reflex
Acoustic Stimulation/methods
Analysis of Variance
Animal locomotion
Animals
Auditory/*physiology
Automated classification
Department of Anatomy & Neurobiology
Evoked Potentials
Galazyuk Alexander V
Grimsley Calum A
Grimsley Jasmine M S
Inbred CBA
Journal of neuroscience methods
Longenecker Ryan J
Male
Mice
NEOMED College of Medicine
Reflex
Rosen Merri J
Startle waveform analysis
Startle/*physiology
Time Factors
Video Recording
Young Jesse W