History and Prehistory of Data Analysis
Recently, invariant aggregation operators, i.e., aggregation operators that do not depend on a given scale of measurement, have become a very topical theme. One type of invariance is homogeneity, which means that an aggregation operator is invariant with respect to multiplication by a constant. We present here a complete characterization of homogeneous aggregation operators. We discuss the relationship between homogeneity, the kernel property and shift-invariance...
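For reference, the two invariance properties named above have the following standard forms (stated here for clarity; the multiplicative constant is usually taken positive):

\[ A(c x_1, \dots, c x_n) = c\,A(x_1, \dots, x_n) \quad \text{for all } c > 0 \quad \text{(homogeneity)} \]
\[ A(x_1 + b, \dots, x_n + b) = A(x_1, \dots, x_n) + b \quad \text{(shift-invariance)} \]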
Although the first rule-based systems were created as early as thirty years ago, this methodology of expert-system design still proves useful. It becomes especially important in medical applications, where evidence is processed in electronic format. Constructing the knowledge base of a rule-based system, especially of a system with uncertainty, is a difficult task because of the size of the base as well as its heterogeneous character. The base consists of facts, ordinary rules...
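The abstract does not specify the uncertainty formalism; the following minimal Python sketch assumes MYCIN-style certainty factors and a toy forward-chaining loop, purely to illustrate how facts and rules with uncertainty fit together in such a base (all names are hypothetical):

    from dataclasses import dataclass

    @dataclass
    class Rule:
        premises: tuple   # names of facts that must all hold
        conclusion: str   # name of the derived fact
        cf: float         # certainty factor attached to the rule, in (0, 1]

    def forward_chain(facts, rules):
        """facts maps fact name -> certainty factor; returns the closure."""
        cfs = dict(facts)
        changed = True
        while changed:
            changed = False
            for r in rules:
                if all(p in cfs for p in r.premises):
                    # combined premise CF = min over premises, scaled by rule CF
                    new = min(cfs[p] for p in r.premises) * r.cf
                    if new > cfs.get(r.conclusion, 0.0):
                        cfs[r.conclusion] = new
                        changed = True
        return cfs

    facts = {"fever": 0.9, "cough": 0.7}
    rules = [Rule(("fever", "cough"), "flu", 0.8)]
    print(forward_chain(facts, rules))   # flu receives CF 0.7 * 0.8 = 0.56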
This work is devoted to finding and studying possible idempotent operators on a finite chain L. In particular, all idempotent operators on L that are associative, commutative and non-decreasing in each argument are characterized. By adding one smoothness condition, all these operators reduce to special combinations of Minimum and Maximum.
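The properties involved are easy to test exhaustively on a small chain. The sketch below illustrates the definitions only, not the paper's characterization; the threshold form of the mixed operator is assumed for the example:

    from itertools import product

    def satisfies(op, n):
        # Check the four properties on the chain L = {0, ..., n}.
        L = range(n + 1)
        idem = all(op(x, x) == x for x in L)
        comm = all(op(x, y) == op(y, x) for x, y in product(L, repeat=2))
        assoc = all(op(op(x, y), z) == op(x, op(y, z))
                    for x, y, z in product(L, repeat=3))
        # Monotonicity in the first place; commutativity then covers the second.
        mono = all(op(x, y) <= op(x2, y)
                   for x, x2, y in product(L, repeat=3) if x <= x2)
        return idem and comm and assoc and mono

    print(satisfies(min, 4), satisfies(max, 4))   # True True

    # A mixed operator: Maximum above a threshold k, Minimum elsewhere.
    k = 2
    mixed = lambda x, y: max(x, y) if min(x, y) >= k else min(x, y)
    print(satisfies(mixed, 4))                    # True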
Machine learning is an appealing and useful approach to creating vehicle control algorithms, both for simulated and real vehicles. One commonly applicable learning scenario is learning by imitation, in which the behavior of an exemplary driver provides training instances for a supervised learning algorithm. This article follows this approach in the domain of simulated car racing, using the TORCS simulator. In contrast to most prior work on imitation learning, a symbolic decision...
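As a hedged illustration of the imitation-learning setup (the feature names, data and logging step are hypothetical; the actual TORCS sensor interface is not reproduced here), a symbolic controller can be fitted to logged state/action pairs like this:

    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    # Hypothetical states logged from the exemplary driver.
    # Columns: [speed_kmh, track_position, angle_to_track_axis]
    X = np.array([[50.0, -0.20,  0.05],
                  [80.0,  0.10, -0.10],
                  [30.0,  0.00,  0.00]])
    y = np.array(["steer_left", "steer_right", "straight"])  # driver's actions

    controller = DecisionTreeClassifier(max_depth=3).fit(X, y)
    print(controller.predict([[60.0, -0.10, 0.02]]))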
Low-complexity realizations of Least Mean Square (LMS) error Generalized Sidelobe Cancellers (GSCs) applied to adaptive beamforming are considered. The GSC method provides a simple way to implement adaptive Linearly Constrained Minimum Variance (LCMV) beamformers. Low-complexity realizations of adaptive GSCs are of great importance for the design of high-sampling-rate, small-size and low-power adaptive beamforming systems. The LMS algorithm and its Transform Domain (TD-LMS) counterpart...
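A minimal real-valued NumPy sketch of the GSC structure with an LMS update is given below. The quiescent weights and blocking matrix are assumed (a broadside look direction with an all-ones steering vector); a practical LCMV design would derive them from the actual constraints:

    import numpy as np

    rng = np.random.default_rng(0)
    M, n_snapshots, mu = 4, 2000, 0.01   # sensors, snapshots, LMS step size

    w_q = np.ones(M) / M                 # quiescent weights (assumed look direction)
    B = np.zeros((M, M - 1))             # blocking matrix: columns orthogonal
    for i in range(M - 1):               # to the all-ones steering vector
        B[i, i], B[i + 1, i] = 1.0, -1.0

    w_a = np.zeros(M - 1)                # adaptive (unconstrained) weights
    for _ in range(n_snapshots):
        x = 0.5 + rng.normal(size=M)     # signal common to all sensors + noise
        d = w_q @ x                      # upper (constrained) branch
        z = B.T @ x                      # lower branch: desired signal blocked
        e = d - w_a @ z                  # GSC output doubles as the LMS error
        w_a += mu * e * z                # LMS weight update

    print("adapted weights:", np.round(w_a, 3))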
Applying the generalised extension principle within the area of Computing with Words typically leads to complex maximisation problems. If distributed quantities are considered, such as size distributions within human populations, density functions representing these distributions become involved. Very often the optimising density functions do not resemble those found in nature; for instance, an optimising density function could consist of two single Dirac pulses positioned near the opposite...
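The maximisation problems in question stem from the extension principle itself; in its standard form, for a fuzzy set A and a mapping f,

\[ \mu_{f(A)}(y) \;=\; \sup_{x \,:\, f(x) = y} \mu_A(x). \]

When x ranges over density functions and f is a functional of the density (for example a mean value), the supremum becomes an optimisation over an infinite-dimensional space, which is why unconstrained optimisers can degenerate into the Dirac-pulse solutions mentioned above.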
The purpose of feature selection in machine learning is at least two-fold: saving measurement acquisition costs, and reducing the negative effects of the curse of dimensionality with the aim of improving the accuracy of models and the classification rate of classifiers on previously unknown data. Yet it has recently been shown that the process of feature selection itself can be negatively affected by the very same curse of dimensionality: feature selection methods may easily over-fit...
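The over-fitting effect is easy to reproduce: on pure-noise data, selecting features on the full dataset before cross-validation reports optimistic accuracy, while selection nested inside each fold stays at chance level. A small sketch (scikit-learn; dataset sizes are arbitrary):

    import numpy as np
    from sklearn.feature_selection import SelectKBest, f_classif
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(0)
    X = rng.normal(size=(50, 1000))    # 50 samples, 1000 pure-noise features
    y = rng.integers(0, 2, size=50)    # random labels: no real signal

    # Biased protocol: select features on all data, then cross-validate.
    X_sel = SelectKBest(f_classif, k=10).fit_transform(X, y)
    biased = cross_val_score(LogisticRegression(), X_sel, y, cv=5).mean()

    # Honest protocol: selection happens inside each training fold.
    pipe = make_pipeline(SelectKBest(f_classif, k=10), LogisticRegression())
    honest = cross_val_score(pipe, X, y, cv=5).mean()

    # Expect biased well above 0.5 and honest near 0.5 (chance level).
    print(f"biased: {biased:.2f}  honest: {honest:.2f}")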
A new learning method tolerant of imprecision is introduced and used in neuro-fuzzy modelling. The proposed method makes it possible to eliminate an intrinsic inconsistency of neuro-fuzzy modelling, in which zero-tolerance learning is used to obtain a fuzzy model that is tolerant of imprecision. The new method can be called ε-insensitive learning: in order to fit the fuzzy model to real data, the ε-insensitive loss function is used. ε-insensitive learning leads to a model with minimal Vapnik-Chervonenkis...
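The ε-insensitive loss referred to here has the standard form known from support vector regression (stated for reference; the abstract does not spell it out):

\[ L_\varepsilon(e) \;=\; \max(0,\, |e| - \varepsilon), \]

so residuals within the tolerance band |e| ≤ ε incur no penalty, which is what makes the fitted fuzzy model tolerant of imprecision.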
A nearlattice is a join semilattice such that every principal filter is a lattice with respect to the induced order. Hickman, and later Chajda et al. independently, showed that nearlattices can be treated as varieties of algebras with a ternary operation satisfying certain axioms. Our main result is that the variety of nearlattices is 2-based, and we exhibit an explicit system of two independent identities. We also show that the original axiom systems of Hickman as well as that of Chajda et al. are...
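The ternary operation commonly used in this setting (assumed here for illustration; the abstract does not state it explicitly) is

\[ m(x, y, z) \;=\; (x \vee z) \wedge (y \vee z), \]

which is well defined because both x ∨ z and y ∨ z lie in the principal filter generated by z, and that filter is a lattice by assumption.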
Mechanization of inductive reasoning is an exciting research area in artificial intelligence and automated reasoning, with many challenges. We present an overview of our work on mechanizing inductive reasoning based on the cover set method, which generates induction schemes from terminating recursive function definitions, and on the use of decision procedures. The paper focuses in particular on recent work on integrating induction into decision procedures without compromising their automation.
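As a textbook illustration of the cover set method (the example is generic, not taken from the paper): from the terminating recursive definition

\[ plus(0, y) = y, \qquad plus(s(x), y) = s(plus(x, y)), \]

the cover set \{0, s(x)\} of the recursion argument yields the induction scheme: prove P(0, y), and prove that P(x, y) implies P(s(x), y); conclude P(x, y) for all x, y. The scheme mirrors the case analysis and the recursive call of the definition.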
An important field of probability logic is the investigation of inference rules that propagate point probabilities or, more generally, interval probabilities from premises to conclusions. Conditional probability logic (CPL) interprets common-sense expressions of the form “if ..., then ...” by conditional probabilities, not by the probability of the material implication. An inference rule is probabilistically informative if the coherent probability interval of its conclusion is not necessarily...
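A standard instance of such propagation is probabilistic modus ponens (given for illustration; not taken from the abstract): from the premises P(B|A) = x and P(A) = y, the coherent interval for the conclusion is

\[ P(B) \;\in\; [\,xy,\; xy + 1 - y\,], \]

an interval of width 1 − y, hence a proper subinterval of [0, 1] whenever y > 0; the rule is therefore probabilistically informative.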