Session Program


  • 11 July 2017
  • 08:00AM - 10:00AM
  • Room: Catalana
  • Chairs: Radko Mesiar, Javier Montero, Irina Perfilieva, Humberto Bustince

Applications of aggregation functions I

Abstract - Morphological operators have been used extensively when dealing with binary or grayscale images, but there is no general-purpose approach for multivariate images that meets the expectations of practitioners under different circumstances. Although several approaches have been proposed, state-of-the-art applications tend to process channels independently, ignoring interchannel correlation. In this work, we introduce a new definition of erosion and dilation that can handle images with any number of channels, study their theoretical properties and analyse their behaviour. It is based on Fuzzy Mathematical Morphology, from which it inherits essential theoretical properties. Our operators consider the first channel to evaluate a pixel's importance, but handle all channels to generate coherent outputs. They successfully process natural images in the L*a*b* space and can also avoid the creation of new chromatic values, which is especially important for hyperspectral imagery. We thus provide a general and well-founded framework to process color images with morphological operators.
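A minimal sketch of the first-channel-guided erosion described above, assuming a NumPy image of shape (H, W, C) with values in [0, 1] and a flat structuring element, for which the fuzzy erosion reduces to a windowed minimum ranked by the first channel; the function name and windowing details are illustrative, not the authors' implementation.

```python
import numpy as np

def guided_erosion(img, se_radius=1):
    """Erosion of a multichannel image (H, W, C) with values in [0, 1].

    The first channel decides which pixel in each window is the
    'smallest'; the full pixel vector at that location is copied to the
    output, so no new channel combinations (chromatic values) appear.
    Illustrative sketch only: flat structuring element, clamped borders.
    """
    h, w, c = img.shape
    out = np.empty_like(img)
    r = se_radius
    for i in range(h):
        for j in range(w):
            i0, i1 = max(0, i - r), min(h, i + r + 1)
            j0, j1 = max(0, j - r), min(w, j + r + 1)
            window = img[i0:i1, j0:j1, :].reshape(-1, c)
            k = np.argmin(window[:, 0])   # rank pixels by the first channel only
            out[i, j, :] = window[k, :]   # keep the whole pixel vector
    return out
```

The corresponding dilation would use `np.argmax` in place of `np.argmin`.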
Abstract - We present an image comparison method based on the greatest solution of a system of bilinear fuzzy relation equations A*B = B*x, where "*" is the max-min composition, A and B are the compared images, normalized to [0,1] and considered as fuzzy relations, and x is an unknown vector. Due to symmetry, A (resp. B) can be the original image and B (resp. A) a modified version of A (resp. B), for instance a noised or watermarked one. Our index is more robust than two other comparison indices already known in the literature.
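A minimal sketch of the max-min composition and of the classical greatest-solution operator for equations of the form A∘x = b, the machinery this kind of method builds on; the construction for the bilinear system A*B = B*x itself follows the paper and is not reproduced here, and the function names are illustrative.

```python
import numpy as np

def maxmin(A, B):
    """Max-min composition of two fuzzy relations (matrices in [0, 1]):
    (A o B)[i, k] = max_j min(A[i, j], B[j, k])."""
    return np.max(np.minimum(A[:, :, None], B[None, :, :]), axis=1)

def greatest_solution(A, b):
    """Greatest x with A o x = b under max-min composition, if solvable.

    Uses the Godel implication a -> b = 1 if a <= b else b:
        x[j] = min_i (A[i, j] -> b[i]).
    If substituting x back does not reproduce b, the system has no solution.
    """
    impl = np.where(A <= b[:, None], 1.0, np.broadcast_to(b[:, None], A.shape))
    return impl.min(axis=0)
```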
Abstract - This article investigates image filtering and smoothing from the perspective of a recent generalisation of the notion of aggregation functions in fuzzy systems, called pre-aggregation functions. Mixture functions describing a broad class of robust spatial-tonal filters and smoothers are derived using penalty-based methods. Several existing filters are re-derived using this approach and several novel filters are proposed, which are able to better handle filtering in contexts where the pixel to be filtered is itself an outlier in the local neighbourhood. The proposed class of Robust Bilateral Filters formalises and generalises a recent result of Chaudhury, who noted that using a filtered version of an image to compute tonal weights for a Bilateral Filter gave more robust denoising. Filter performance is validated using standard test images and quantified using peak signal-to-noise ratio and visual similarity, finding novel filters that exceed the performance of the standard Bilateral Filter.
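A minimal sketch of the guided tonal-weight idea recalled above: the range weights are computed from a pre-smoothed version of the image rather than from the noisy image itself. It assumes a grayscale image in [0, 1]; the penalty-based derivation and the exact filters of the paper are not reproduced, and the parameter names and the Gaussian guide are assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def robust_bilateral(img, sigma_s=2.0, sigma_r=0.1, radius=4):
    """Bilateral-type filter whose tonal weights come from a pre-smoothed
    guide image instead of the (possibly outlying) centre pixel."""
    guide = gaussian_filter(img, sigma=1.0)        # crude robust estimate
    h, w = img.shape
    out = np.zeros_like(img)
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(xs**2 + ys**2) / (2 * sigma_s**2))
    imgp = np.pad(img, radius, mode='reflect')
    gdep = np.pad(guide, radius, mode='reflect')
    for i in range(h):
        for j in range(w):
            patch = imgp[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            gpatch = gdep[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            tonal = np.exp(-(gpatch - guide[i, j])**2 / (2 * sigma_r**2))
            wgt = spatial * tonal
            out[i, j] = np.sum(wgt * patch) / np.sum(wgt)
    return out
```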
Abstract - We propose a new hybrid image compression algorithm which combines the F-transform and JPEG. First, we apply the direct F-transform and then the JPEG compression. Conversely, the JPEG decompression is followed by the inverse F-transform to obtain the decompressed image. This scheme brings three benefits: (i) the direct F-transform filters out high frequencies so that JPEG can reach a higher compression ratio; (ii) the JPEG color quantization can be omitted in order to achieve greater decompressed image quality; (iii) the JPEG-decompressed image is processed by the inverse F-transform w.r.t. the adjoint partition almost losslessly. The paper justifies the proposed hybrid algorithm by benchmarks which show that the hybrid algorithm achieves significantly higher decompressed image quality than JPEG.
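A minimal sketch of the direct and inverse F-transform with a uniform triangular fuzzy partition, the two building blocks that the hybrid scheme wraps around the standard JPEG codec. The JPEG step itself (e.g. via an image library) and the adjoint-partition reconstruction of the paper are omitted, and the partition construction shown is an illustrative simplification.

```python
import numpy as np

def triangular_partition(length, n):
    """Uniform triangular fuzzy partition A_1..A_n over `length` samples (n >= 2)."""
    nodes = np.linspace(0, length - 1, n)
    h = nodes[1] - nodes[0]
    x = np.arange(length)
    return np.maximum(0.0, 1.0 - np.abs(x[None, :] - nodes[:, None]) / h)  # (n, length)

def direct_ft(f, A):
    """Direct F-transform components of a 1-D signal f:
    F_k = sum_i f(x_i) A_k(x_i) / sum_i A_k(x_i)."""
    return (A @ f) / A.sum(axis=1)

def inverse_ft(F, A):
    """Inverse F-transform: f_hat(x) = sum_k F_k A_k(x).  The normalisation
    is a no-op for a Ruspini partition but kept for numerical safety."""
    return (F @ A) / A.sum(axis=0)
```

In the hybrid pipeline, the component array produced by `direct_ft` (applied row- and column-wise for an image) would be fed to the JPEG encoder, and `inverse_ft` would be applied after JPEG decoding.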
Abstract - In many computer vision applications, vignetting is an undesirable effect which must be removed in a pre-processing step. Recently, an algorithm for image vignetting correction has been presented by means of a minimization of log-intensity entropy. This method relies on an increase of the entropy of the image when it is affected by vignetting. In this paper, we propose a novel algorithm to reduce image vignetting via a maximization of the fuzzy entropy of the image. Fuzzy entropy quantifies the fuzziness degree of a fuzzy set, and its value is also modified by the presence of vignetting. The experimental results show that this novel algorithm outperforms the algorithm based on the minimization of log-intensity entropy in most cases, from both the qualitative and the quantitative point of view.
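A minimal sketch of vignetting correction driven by maximizing a fuzzy entropy, assuming a grayscale image in [0, 1], a De Luca-Termini entropy with intensities taken directly as memberships, a two-parameter radial gain model and a grid search; all of these modelling choices are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

def fuzzy_entropy(img):
    """De Luca-Termini fuzzy entropy of an image in [0, 1]
    (intensities are used directly as membership degrees)."""
    mu = np.clip(img, 1e-6, 1 - 1e-6)
    return -np.mean(mu * np.log(mu) + (1 - mu) * np.log(1 - mu))

def correct_vignetting(img, grid=np.linspace(-0.5, 0.5, 11)):
    """Grid-search a radial gain g(r) = 1 + a*r^2 + b*r^4 and keep the
    corrected image that maximises the fuzzy entropy."""
    h, w = img.shape
    yy, xx = np.mgrid[0:h, 0:w]
    r = np.hypot(yy - h / 2, xx - w / 2)
    r /= r.max()
    best, best_img = -np.inf, img
    for a in grid:
        for b in grid:
            gain = 1.0 + a * r**2 + b * r**4
            if np.any(gain <= 0):
                continue                      # skip non-physical gains
            cand = np.clip(img / gain, 0.0, 1.0)
            e = fuzzy_entropy(cand)
            if e > best:
                best, best_img = e, cand
    return best_img
```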
Abstract - The reduction of the size of L-fuzzy concept lattices has been studied in the past. The number of objects and attributes of the L-fuzzy context is one of the most important factors that influence the size of the L-fuzzy concept lattice. In this paper, we tackle the problem of reducing the L-fuzzy context size. To do so, Choquet integrals are a good tool for aggregating values. The possibility of assigning importance not only to individual observations but also to groups of them provides an appropriate way to address the problem.
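A minimal sketch of the discrete Choquet integral with respect to a fuzzy measure, the aggregation tool mentioned above; the measure and the example values are hypothetical, and how the aggregated values are used to merge objects or attributes of the L-fuzzy context follows the paper.

```python
import numpy as np

def choquet(x, mu):
    """Discrete Choquet integral of values x w.r.t. a fuzzy measure.

    `mu` maps frozensets of indices to [0, 1], with mu(empty set) = 0 and
    mu(all indices) = 1.  C(x) = sum_k (x_(k) - x_(k-1)) * mu(A_(k)),
    where x_(1) <= ... <= x_(n) and A_(k) collects the indices of the
    n - k + 1 largest values.
    """
    x = np.asarray(x, dtype=float)
    order = np.argsort(x)                      # ascending permutation
    total, prev = 0.0, 0.0
    for k, idx in enumerate(order):
        coalition = frozenset(int(i) for i in order[k:])
        total += (x[idx] - prev) * mu[coalition]
        prev = x[idx]
    return total

# Hypothetical example: aggregate three attribute degrees of one object
# with a measure that rewards the pair {0, 1} beyond additivity.
mu = {frozenset(): 0.0, frozenset({0}): 0.3, frozenset({1}): 0.3,
      frozenset({2}): 0.3, frozenset({0, 1}): 0.8, frozenset({0, 2}): 0.6,
      frozenset({1, 2}): 0.6, frozenset({0, 1, 2}): 1.0}
print(choquet([0.2, 0.9, 0.5], mu))            # 0.5
```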