openHSU – Research Showcase
- 4694 Research outputs
- 849 People
- 140 Organizational Units
- 109 Projects
- 35 Conferences
- 17 Journals
- Publication (Metadata only): Transient, sustained, and task-specific non-perceptual conflict adjustment (Pabst Science Publishers, 2012)
- Publication (Metadata only): A functional MRI study of happy and sad emotions in music with and without lyrics (Frontiers Research Foundation, 2011)
  Musical emotions, such as happiness and sadness, have been investigated using instrumental music devoid of linguistic content. However, pop and rock, the most common musical genres, utilize lyrics for conveying emotions. Using participants' self-selected musical excerpts, we studied their behavior and brain responses to elucidate how lyrics interact with musical emotion processing, as reflected by emotion recognition and activation of limbic areas involved in affective experience. We extracted samples from subjects' selections of sad and happy pieces and sorted them according to the presence of lyrics. Acoustic feature analysis showed that music with lyrics differed from music without lyrics in spectral centroid, a feature related to perceptual brightness, whereas sad music with lyrics did not diverge from happy music without lyrics, indicating the role of other factors in emotion classification. Behavioral ratings revealed that happy music without lyrics induced stronger positive emotions than happy music with lyrics. We also acquired functional magnetic resonance imaging data while subjects performed affective tasks regarding the music. First, using ecological and acoustically variable stimuli, we broadened previous findings about the brain processing of musical emotions and of songs versus instrumental music. Additionally, contrasts between sad music with versus without lyrics recruited the parahippocampal gyrus, the amygdala, the claustrum, the putamen, the precentral gyrus, the medial and inferior frontal gyri (including Broca's area), and the auditory cortex, while the reverse contrast produced no activations. Happy music without lyrics activated structures of the limbic system and the right pars opercularis of the inferior frontal gyrus, whereas auditory regions alone responded to happy music with lyrics. These findings point to the role of acoustic cues for the experience of happiness in music and to the importance of lyrics for sad musical emotions.
- Publication (Metadata only): Fluid-structure interaction simulations of wind gusts impacting a hyperbolic paraboloid tensile structure (AIP Publishing, 2024-10)
  The paper focuses on fluid–structure interactions (FSI) between a turbulent, gusty fluid flow and a membrane structure. Lightweight structures are particularly vulnerable to wind gusts and can be completely destroyed by them, making it essential to develop and evaluate numerical simulation methods suited for these types of problems. In this study, a thin-walled membrane in the shape of a hyperbolic paraboloid (hypar) is analyzed as a real-scale example. The membrane structure is subjected to discrete wind gusts of varying strength from two different directions. A partitioned FSI approach is employed, utilizing a finite-volume flow solver based on the large-eddy simulation technique and a finite-element solver developed for shell and membrane structures. A recently proposed source-term formulation enables the injection of discrete wind gusts within the fluid domain in front of the structure. In a step-by-step analysis, first the fluid flow around the structure, initially assumed to be rigid, is investigated, including a grid sensitivity analysis. This is followed by examining the two-way coupled FSI system, taking the flexibility of the membrane into account. Finally, the study aims to assess the impact of wind gusts on the resulting deformations and the induced stresses in the tensile material, with a particular focus on the influence of different wind directions. (An illustrative sketch of such a partitioned coupling loop follows after this list.)
- Publication (Metadata only): Selective suppression of self-initiated sounds in an auditory stream (Wiley-Blackwell, 2011)
  Numerous studies have shown that the N1 event-related potential (ERP) response is attenuated when it is elicited by self-initiated sounds. This N1 suppression effect is generally interpreted to reflect an internal prediction mechanism, which enables the discrimination of the sensory consequences of our own actions and those of others. The blocked design used in the forerunner studies (i.e., self- and externally initiated sounds presented in different blocks) seriously limits the relevance of these findings, because the N1 effect can simply be explained by contextual task differences. In the present study, self- and externally initiated sounds were mixed within blocks. N1 suppression was found, and its magnitude was even larger than that observed in a traditional blocked condition. This result supports the involvement of an internal prediction mechanism in the discrimination of the sensory consequences of one's own actions and those of others.
- Publication (Metadata only): Misleading phonetic information due to stimulus cross-slicing is processed automatically (Wiley-Blackwell, 2011)
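For readers unfamiliar with the partitioned fluid–structure interaction approach mentioned in the hypar-membrane abstract above, the following is a minimal, illustrative Python sketch of a two-way coupling loop under simplifying assumptions. The FluidSolver and StructureSolver classes are hypothetical placeholders, not the authors' finite-volume LES solver or finite-element membrane solver, and all numerical values are arbitrary.

```python
"""Minimal sketch of a partitioned two-way FSI coupling loop (illustrative only)."""


class FluidSolver:
    """Placeholder for a finite-volume flow solver (e.g., LES with a gust source term)."""

    def solve(self, displacements, dt):
        # In a real solver: deform the mesh, advance the flow by dt, and return
        # aerodynamic surface loads on the membrane. Here: a dummy load field.
        return [0.1 * (1.0 + d) for d in displacements]


class StructureSolver:
    """Placeholder for a finite-element shell/membrane solver."""

    def solve(self, loads, dt):
        # In a real solver: apply the loads and compute the new membrane
        # deformation. Here: a dummy proportional response.
        return [0.5 * f for f in loads]


def partitioned_fsi(fluid, structure, dt, n_steps, max_iters=20, tol=1e-8):
    """Advance the coupled system; sub-iterate each time step until the
    interface displacements converge (implicit two-way coupling)."""
    disp = [0.0, 0.0, 0.0]  # toy interface displacement field
    for _ in range(n_steps):
        for _ in range(max_iters):
            loads = fluid.solve(disp, dt)           # CFD step: flow over deformed membrane
            new_disp = structure.solve(loads, dt)   # CSD step: membrane response to loads
            residual = max(abs(a - b) for a, b in zip(new_disp, disp))
            disp = new_disp
            if residual < tol:
                break
    return disp


if __name__ == "__main__":
    print(partitioned_fsi(FluidSolver(), StructureSolver(), dt=0.01, n_steps=5))
```

Sub-iterating within each time step until the interface state converges gives a strongly coupled (implicit) scheme; dropping the inner loop would recover a loose, explicit coupling, which is typically less robust for lightweight, flexible structures.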