jarecki32
04.12.06, 21:39
"Cybernetics" as a term has already lost its shine, but for lack of anything better that is what I called this thread.
There is a journal called Proceedings of the IEEE. Not long ago, two years back, they ran a special
issue on the connections between electrical engineering, in the broad sense of the word, and music.
I find it interesting reading, worth recommending to others. As encouragement I am attaching two
abstracts. The full texts can be read in any larger university library.
I think these electronic liaisons may bear ever more interesting fruit, because it will soon be hard
to tell synthetic music from natural music, provided we manage to program in emotion, imprecision,
noise, and so on. And we are not far from that.
Musical instrument classification and duet analysis employing music information retrieval techniques
Kostek, B.
Dept. of Multimedia Syst., Gdansk Univ. of Technol., Poland;
This paper appears in: Proceedings of the IEEE
Publication Date: April 2004
Volume: 92, Issue: 4
On page(s): 712-729
ISSN: 0018-9219
INSPEC Accession Number: 7952114
Digital Object Identifier: 10.1109/JPROC.2004.825903
Posted online: 2004-11-08 17:41:12.0
Abstract
The aim of this paper is to present solutions related to identifying musical data. These are
discussed mainly on the basis of experiments carried out at the Multimedia Systems Department,
Gdansk University of Technology, Gdansk, Poland. The topics presented in this paper include
automatic recognition of musical instruments and separation of duet sounds. The classification
process is shown as a three-layer process consisting of pitch extraction, parametrization, and
pattern recognition. These three stages are discussed on the basis of experimental examples.
Artificial neural networks (ANNs) are employed as a decision system, and they are trained with a
set of feature vectors (FVs) extracted from musical sounds recorded at the Multimedia Systems
Department. The frequency envelope distribution (FED) algorithm, introduced for musical duet
separation, is presented. To check the efficiency of the FED algorithm, ANNs are also used; they
are tested on FVs derived from musical sounds after the separation process is performed. The
experimental results are shown and discussed.
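For readers curious what the abstract's three-stage pipeline (pitch extraction, parametrization,
pattern recognition) looks like in practice, here is a toy sketch. It is not the paper's method:
the autocorrelation pitch detector, the three hand-picked features, and the nearest-neighbour
classifier standing in for the ANN decision system are all my own simplifications for illustration.

```python
import math

def extract_pitch(signal, sample_rate):
    """Stage 1: crude pitch estimate from the strongest autocorrelation peak.
    (Illustrative only; the paper's actual pitch extractor is more elaborate.)"""
    best_lag, best_corr = 0, 0.0
    for lag in range(20, len(signal) // 2):
        corr = sum(signal[i] * signal[i + lag] for i in range(len(signal) - lag))
        if corr > best_corr:
            best_corr, best_lag = corr, lag
    return sample_rate / best_lag if best_lag else 0.0

def parametrize(signal, pitch):
    """Stage 2: build a small feature vector (FV): pitch, mean energy,
    and zero-crossing rate. Real FVs would carry many more descriptors."""
    energy = sum(s * s for s in signal) / len(signal)
    crossings = sum(1 for a, b in zip(signal, signal[1:]) if (a < 0) != (b < 0))
    return [pitch, energy, crossings / len(signal)]

def classify(fv, reference_fvs):
    """Stage 3: nearest-neighbour decision, a stand-in for the trained ANN.
    reference_fvs maps an instrument label to a prototype FV."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(reference_fvs, key=lambda label: dist(fv, reference_fvs[label]))

# Usage: a 220 Hz sine at 8 kHz sampling is assigned to the lower-pitched class.
tone = [math.sin(2 * math.pi * 220 * i / 8000) for i in range(800)]
pitch = extract_pitch(tone, 8000)
fv = parametrize(tone, pitch)
label = classify(fv, {"low": [100.0, 0.3, 0.02], "high": [2000.0, 0.3, 0.3]})
```

The point of the sketch is the division of labour: each stage consumes the previous stage's
output, so any of the three blocks can be swapped out (e.g., the nearest-neighbour rule for a
trained ANN) without touching the others.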
Music via motion: transdomain mapping of motion and sound for interactive performances
Ng, K.C.
Interdisciplinary Centre for Sci. Res. in Music, Univ. of Leeds, UK;
This paper appears in: Proceedings of the IEEE
Publication Date: April 2004
Volume: 92, Issue: 4
On page(s): 645-655
ISSN: 0018-9219
INSPEC Accession Number: 7952109
Digital Object Identifier: 10.1109/JPROC.2004.825885
Posted online: 2004-03-30 08:19:12.0
Abstract
This paper presents a framework called Music via Motion (MvM) designed for the transdomain mapping
between physical movements of the performer(s) and multimedia events, translating activities from
one creative domain to another, for example from physical gesture to audio output. After a brief
background of this domain and its prototype designs, the paper describes a number of inter- and
multidisciplinary collaborative works for interactive multimedia performances. These include a
virtual musical instrument interface, which explores video-based tracking technology to provide an
intuitive and nonintrusive musical interface, and sensor-based augmented instrument designs. The
paper also describes a distributed multimedia-mapping server which allows multiplatform and
multisensory integrations, and presents a sample application which integrates a real-time face
tracking system. Ongoing developments and plausible future explorations of stage augmentation with
virtual and augmented realities, as well as gesture analysis of the correlations between musical
gesture and physical gesture, are also discussed.
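The core MvM idea, mapping a tracked position in one domain onto sound parameters in another, can
be illustrated with a minimal hypothetical mapping. The frame size, the pitch range, and the
position-to-parameter assignments below are my own assumptions; MvM's actual mappings are
configurable and far richer than this.

```python
def map_motion_to_sound(x, y, frame_width=640, frame_height=480):
    """Map a tracked 2D position (e.g. a performer's hand in a video frame)
    to a MIDI-style note number and velocity.
    Horizontal position controls pitch; raising the hand increases loudness.
    (A hypothetical mapping, not the one used in the MvM prototypes.)"""
    pitch = 48 + round((x / frame_width) * 36)        # span C3..C6
    velocity = round((1 - y / frame_height) * 127)    # top of frame = loudest
    return pitch, max(0, min(127, velocity))

# Usage: bottom-left corner is silent and low; top-right is loud and high.
low = map_motion_to_sound(0, 480)     # (48, 0)
high = map_motion_to_sound(640, 0)    # (84, 127)
```

Because the mapping is just a pure function from tracked coordinates to sound parameters, the same
gesture stream could be re-routed to lighting or video events, which is what "transdomain" refers
to in the abstract.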