Online music and video will soon start to be tagged so that search engines and other software can gather more information about the sound and "meaning" within the file itself. You may soon be able to search for songs that sound like other songs you enjoy, or search for audio files that sound like, say, a cheetah and get back files that meet those criteria. Or, if you want, you could search for video that evokes the same emotions as listening to Beethoven's 5th.
This new technology could revolutionize sound-effects generation and open up a whole new realm within the digital music and musical instrument industries, as well as finally provide a way to search audio and video content properly and in a standardized way. The ability to tag exact points within audio and video files, to reference moving objects, and to attach emotional and spatial meaning to them will provide a whole new generation of tools for professionals and everyday users of multimedia content. Sites like YouTube and tools such as iTunes stand to benefit greatly from this technology.
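To make that idea of tagging a point inside a file a little more concrete, here is a minimal sketch in Python (standard library only) of what such a description might look like: it marks a single time point in a hypothetical video file and attaches a free-text "mood" annotation to it. The element names only approximate the MPEG-7 description schemes, so treat this as an illustration rather than a schema-valid MPEG-7 document.

# Minimal sketch of an MPEG-7-style description: one time point in a
# hypothetical video, annotated with free text. Element names are an
# approximation of the MPEG-7 schemes (ISO/IEC 15938), not the real schema.
import xml.etree.ElementTree as ET

mpeg7 = ET.Element("Mpeg7")
description = ET.SubElement(mpeg7, "Description")
content = ET.SubElement(description, "MultimediaContent")
video = ET.SubElement(content, "Video")

# Point at a specific moment inside the file (here: 1 minute 23 seconds in).
media_time = ET.SubElement(video, "MediaTime")
ET.SubElement(media_time, "MediaTimePoint").text = "T00:01:23"

# Attach an annotation that search tools could index against that moment.
annotation = ET.SubElement(video, "TextAnnotation")
ET.SubElement(annotation, "FreeTextAnnotation").text = "triumphant orchestral climax"

print(ET.tostring(mpeg7, encoding="unicode"))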
A few search examples, taken from the link at the bottom of this post:
· Play a few notes on a keyboard and retrieve a list of musical pieces similar to the tune you played, or images matching the notes in some way, e.g. in terms of emotions (a rough sketch of this kind of similarity matching follows this list).
· Draw a few lines on a screen and find a set of images containing similar graphics, logos, ideograms,...
· Define objects, including color patches or textures and retrieve examples among which you select the interesting objects to compose your design.
· On a given set of multimedia objects, describe movements and relations between objects and so search for animations fulfilling the described temporal and spatial relations.
· Describe actions and get a list of scenarios containing such actions.
· Using an excerpt of Pavarotti’s voice, obtain a list of Pavarotti’s records, video clips where Pavarotti is singing, and photographic material portraying Pavarotti.
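The first example above (play a few notes, get back similar pieces) hints at how a search engine might use such descriptions once content carries them. The sketch below is only an illustration: MPEG-7 standardizes the descriptions themselves, not the search algorithm, so the nearest-neighbour matching, the file names, and the feature vectors here are all invented for the example.

# Toy query-by-example search over per-file feature descriptors. The matching
# strategy (Euclidean nearest neighbours) and all data are illustrative only;
# a real system would use the standardized audio descriptors and its own
# matching engine.
import math

library = {
    "cheetah_growl.wav": [0.82, 0.10, 0.55],
    "violin_phrase.wav": [0.20, 0.75, 0.40],
    "keyboard_hum.wav":  [0.25, 0.70, 0.45],
}

def distance(a, b):
    # Plain Euclidean distance between two feature vectors.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def search(query_features, top_n=2):
    # Rank every described file by how close its descriptor is to the query.
    ranked = sorted(library.items(), key=lambda item: distance(query_features, item[1]))
    return [name for name, _ in ranked[:top_n]]

# "Play a few notes" -> extract features from the query -> find similar pieces.
print(search([0.22, 0.72, 0.43]))  # likely: keyboard_hum.wav, violin_phrase.wav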
Technical Information:
MPEG-7 Overview