Volume 14 / Issue 10

DOI: 10.3217/jucs-014-10-1792

 

Imagesemantics: User-Generated Metadata, Content Based Retrieval & Beyond

Marc Spaniol (RWTH Aachen University, Germany)

Ralf Klamma (RWTH Aachen University, Germany)

Mathias Lux (Klagenfurt University, Austria)

Abstract: With the advent of Web 2.0 technologies, a new attitude towards processing content on the Internet has emerged. Nowadays it is much easier to create, share and retrieve multimedia content on the Web. However, with the increasing amount of content, retrieval becomes more challenging and often leads to inadequate search results. One main reason is that image clustering and retrieval approaches usually rely solely on either the images' low-level features or their user-generated tags (high-level features). This is frequently inappropriate, since the "real" semantics of an image can only be derived from the combination of low-level and high-level features. Consequently, we investigated a more holistic view of image semantics based on a system called Imagesemantics. This system combines MPEG-7 descriptions of low-level content-based retrieval features with MPEG-7 keywords through a machine learning approach that produces joint OWL rules. The rule base is used in Imagesemantics to improve retrieval results.
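To illustrate the underlying idea of blending low-level and high-level features, the following minimal sketch ranks images by a weighted combination of a colour-histogram distance and tag overlap. It is not the authors' Imagesemantics pipeline, which relies on MPEG-7 descriptors and learned OWL rules; the field names ("hist", "tags") and the fixed weight alpha are assumptions for illustration only.

# Minimal sketch (not the Imagesemantics implementation): combine a
# low-level feature distance with tag overlap to rank candidate images.
import math

def histogram_distance(h1, h2):
    """Euclidean distance between two equal-length feature vectors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(h1, h2)))

def tag_similarity(tags1, tags2):
    """Jaccard overlap between two tag sets (high-level features)."""
    union = tags1 | tags2
    return len(tags1 & tags2) / len(union) if union else 0.0

def combined_score(query, candidate, alpha=0.5):
    """Blend low-level similarity and tag similarity into one score."""
    # Turn the distance into a similarity value in (0, 1].
    low_level = 1.0 / (1.0 + histogram_distance(query["hist"], candidate["hist"]))
    high_level = tag_similarity(query["tags"], candidate["tags"])
    return alpha * low_level + (1 - alpha) * high_level

# Hypothetical example images: a 4-bin colour histogram plus user tags.
query = {"hist": [0.4, 0.3, 0.2, 0.1], "tags": {"beach", "sunset"}}
collection = [
    {"id": "img1", "hist": [0.35, 0.3, 0.25, 0.1], "tags": {"beach", "sea"}},
    {"id": "img2", "hist": [0.1, 0.1, 0.4, 0.4], "tags": {"city", "night"}},
]
ranked = sorted(collection, key=lambda img: combined_score(query, img), reverse=True)
print([img["id"] for img in ranked])

In this sketch the trade-off between content-based and tag-based evidence is a fixed weight; in the approach described above, the combination of low-level and high-level features is instead captured by OWL rules produced through machine learning.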

Keywords: MPEG-7, Web 2.0, social media platform, user-generated content

Categories: H.3.3, H.3.4, H.3.5, H.5.1