Mechanized experiment planning in automaton-environment systems
In this article, a technique called Meta-Optimization is used to enhance the effectiveness of bio-inspired algorithms that solve antenna array synthesis problems. This technique consists of a second optimization layer that finds the best behavioral parameters for a given algorithm, which makes it possible to achieve better results. Bio-inspired computational methods are useful for solving complex multidimensional problems such as the design of antenna arrays. However, their performance depends heavily on the...
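A minimal sketch of this two-layer idea, assuming a particle swarm optimizer as the inner bio-inspired algorithm, a random search as the outer meta-optimization layer, and a stand-in sphere function in place of the antenna-array cost; all of these choices are illustrative, not taken from the article:

import random

def inner_pso(objective, dim, w, c1, c2, iters=100, swarm=20):
    """Minimal particle swarm optimizer whose behaviour depends on (w, c1, c2)."""
    pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(swarm)]
    vel = [[0.0] * dim for _ in range(swarm)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=objective)[:]
    for _ in range(iters):
        for i in range(swarm):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * random.random() * (pbest[i][d] - pos[i][d])
                             + c2 * random.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if objective(pos[i]) < objective(pbest[i]):
                pbest[i] = pos[i][:]
        gbest = min(pbest + [gbest], key=objective)[:]
    return objective(gbest)

def sphere(x):  # stand-in for an antenna-array cost function
    return sum(v * v for v in x)

# Meta-optimization layer: search for the behavioural parameters (w, c1, c2)
# that make the inner algorithm perform best on the target problem.
best = None
for _ in range(50):
    params = (random.uniform(0.1, 1.0), random.uniform(0.5, 2.5), random.uniform(0.5, 2.5))
    score = inner_pso(sphere, dim=10, w=params[0], c1=params[1], c2=params[2])
    if best is None or score < best[0]:
        best = (score, params)
print("best behavioural parameters found:", best[1])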
Firstly, we present a geometric interpretation of interval-valued fuzzy sets. Secondly, we apply the method of least squares to the fuzzy inference rules when working with these sets. We begin by approximating the lower and upper extremes of the membership intervals with functions of the form ax^b by means of the method of least squares. Then we analyze a technique for evaluating the conclusion of the generalized modus ponens, and we verify that the axioms of Fukami et al. [9] are fulfilled.
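A minimal sketch of this least-squares step, assuming the ax^b-type functions above are power functions fitted after a log transform; the sampled membership intervals below are invented for illustration:

import numpy as np

# Hypothetical sampled membership intervals [lower(x), upper(x)] of an
# interval-valued fuzzy set on a few support points x.
x = np.array([0.2, 0.4, 0.6, 0.8, 1.0])
lower = np.array([0.05, 0.18, 0.36, 0.60, 0.88])
upper = np.array([0.10, 0.28, 0.50, 0.75, 1.00])

def fit_power(x, y):
    """Least-squares fit of y ~ a * x**b via the log-linearised model
    log y = log a + b log x."""
    b, log_a = np.polyfit(np.log(x), np.log(y), 1)
    return np.exp(log_a), b

a_low, b_low = fit_power(x, lower)
a_up, b_up = fit_power(x, upper)
print(f"lower bound: a = {a_low:.3f}, b = {b_low:.3f}")
print(f"upper bound: a = {a_up:.3f}, b = {b_up:.3f}")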
Many algorithms for extracting rules from data have been proposed on the basis of rough set theory. Decision rules can be obtained directly from a database. Some condition values may be unnecessary in a decision rule produced directly from the database; such values can then be eliminated to create a more comprehensible (minimal) rule. Most of the algorithms proposed to compute minimal rules are based on rough set theory or machine learning. In our approach, in a post-processing stage,...
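A minimal sketch of the value-elimination idea: a condition value is dropped whenever the shortened rule still determines the same decision on the data. The toy decision table, attribute names, and the greedy elimination order are assumptions of ours, not taken from the paper:

# Toy decision table: each row maps condition attributes to a decision.
rows = [
    ({"temp": "high", "wind": "weak",   "humidity": "high"},   "no"),
    ({"temp": "high", "wind": "strong", "humidity": "high"},   "no"),
    ({"temp": "mild", "wind": "weak",   "humidity": "normal"}, "yes"),
    ({"temp": "cool", "wind": "weak",   "humidity": "normal"}, "yes"),
]

def consistent(conds, decision, rows):
    """A shortened rule is acceptable only if every object matching its
    conditions has the same decision value."""
    return all(dec == decision
               for obj, dec in rows
               if all(obj[a] == v for a, v in conds.items()))

def minimize_rule(conds, decision, rows):
    """Greedily drop condition values whose removal keeps the rule consistent."""
    conds = dict(conds)
    for attr in list(conds):
        reduced = {a: v for a, v in conds.items() if a != attr}
        if reduced and consistent(reduced, decision, rows):
            conds = reduced
    return conds

# Rule read directly from the first object of the table:
print(minimize_rule({"temp": "high", "wind": "weak", "humidity": "high"}, "no", rows))
# -> {'humidity': 'high'}: a shorter rule that is still consistent with the table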
Classical association rules, here called “direct”, reflect relationships between items that relatively often co-occur in common transactions. In the web domain, items correspond to pages and transactions to user sessions. The main idea of the new approach presented here is to discover indirect associations between pages that rarely occur together but for which there exist other, “third” pages, called transitive, with which both appear relatively frequently. Two types of indirect association rules...
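A minimal sketch of how such indirect associations could be mined; the sessions, page names, and thresholds are illustrative assumptions, not taken from the paper:

from itertools import combinations

# Hypothetical user sessions, each a set of visited pages.
sessions = [
    {"home", "news", "sport"},
    {"home", "news", "weather"},
    {"home", "sport"},
    {"home", "weather"},
    {"home", "news"},
]

def support(pages):
    return sum(pages <= s for s in sessions) / len(sessions)

MIN_PAIR = 0.4    # how often a page must co-occur with the transitive page
MAX_DIRECT = 0.1  # pages this rarely seen together are candidates for indirect rules

pages = set().union(*sessions)
for a, b in combinations(sorted(pages), 2):
    if support({a, b}) <= MAX_DIRECT:
        transitive = [c for c in pages - {a, b}
                      if support({a, c}) >= MIN_PAIR and support({b, c}) >= MIN_PAIR]
        if transitive:
            print(f"indirect association {a} ~ {b} via {transitive}")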
In the present paper we investigate the life cycles of formalized theories that appear in decision-making instruments and in science. In brief, mixed theories are built in the following steps: initially, a small collection of facts forms the kernel of the theory, and to express these facts we construct a special formalized language. When the collection grows, we add some inference rules, and thus some axioms, to compress the knowledge. The next step is to generalize these rules to all expressions of the formalized language....
Recently, a new and interesting neural network architecture called “mixture of experts” has been proposed as a tool for real multivariate approximation or prediction. We show that the underlying problem is closely related to approximating the joint probability density of the involved variables by a finite mixture. In particular, assuming normal mixtures, we can explicitly write the conditional expectation formula, which can be interpreted as a mixture-of-experts network. In this way the related optimization...
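The connection can be made explicit with the standard identity for normal mixtures (the block notation below is ours, not quoted from the paper). If the joint density of $(x, y)$ is $\sum_k \pi_k \, \mathcal{N}\!\left((x,y); \mu_k, \Sigma_k\right)$ with each mean and covariance partitioned into $x$- and $y$-blocks, then
\[
E[y \mid x] \;=\; \sum_{k=1}^{K} g_k(x)\,\Bigl[\mu_{y,k} + \Sigma_{yx,k}\,\Sigma_{xx,k}^{-1}\,(x - \mu_{x,k})\Bigr],
\qquad
g_k(x) \;=\; \frac{\pi_k\,\mathcal{N}(x;\mu_{x,k},\Sigma_{xx,k})}{\sum_{j=1}^{K}\pi_j\,\mathcal{N}(x;\mu_{x,j},\Sigma_{xx,j})},
\]
so the weights $g_k(x)$ play the role of the gating network and the component-wise linear regressions play the role of the experts.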
Information retrieval in information systems (IS) with large amounts of data is not only a matter of an effective IS architecture and design and of the technical parameters of the computer technology used to operate the IS, but also of easy and intuitive orientation among the many offers and pieces of information the IS provides. Such retrieval, however, is frequently carried out with indeterminate information, which calls for other models of orientation in the IS environment.
Updating probabilities with information about only one hypothesis, thereby ignoring alternative hypotheses, is not only biased but also leads to progressively imprecise conclusions. In psychology this phenomenon has been studied in experiments with the “pseudodiagnosticity task”. In probability logic, the phenomenon that additional premises increase the imprecision of a conclusion is known as “degradation”. The present contribution investigates degradation in the context of second-order probability distributions....
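A small numeric illustration of the point (the prior, the likelihoods, and the assumption that the evidence is conditionally independent are ours, not taken from the contribution): when only the likelihoods under the focal hypothesis H are known, the posterior is confined to an interval rather than a point, and each further one-sided update widens that interval.

from functools import reduce

prior = 0.5
# Likelihoods of successive pieces of evidence under the focal hypothesis H only;
# the likelihoods under the alternative are left completely unspecified (in [0, 1]).
likelihoods_H = [0.8, 0.7, 0.9, 0.6]

def posterior_interval(likelihoods_H, prior):
    """Bounds on P(H | E1,...,En) when P(Ei | not-H) is unknown.
    Lower bound: every unknown likelihood takes its worst value 1.
    Upper bound: some unknown likelihood equals 0, forcing P(H | E) = 1."""
    lh = reduce(lambda a, b: a * b, likelihoods_H, 1.0)
    lower = lh * prior / (lh * prior + 1.0 * (1 - prior))
    return lower, 1.0

for n in range(1, len(likelihoods_H) + 1):
    lo, hi = posterior_interval(likelihoods_H[:n], prior)
    print(f"after {n} one-sided updates: P(H|E) in [{lo:.3f}, {hi:.3f}]")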
A new approach to the control of an omnidirectional mobile manipulator is developed. The robot is considered an individual agent aimed at performing robotic tasks described in terms of a displacement and a force interaction with the environment. A reactive architecture and impedance control are used to ensure reliable task execution in response to environmental stimuli. The mechanical structure of our holonomic mobile manipulator is built of two joint manipulators mounted on a holonomic vehicle. The...
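A minimal one-degree-of-freedom sketch of the impedance-control idea used for the force-interaction part of such tasks; the gains, the reference step, and the wall model are illustrative assumptions, not parameters of the robot described here:

# 1-DOF impedance control sketch: the controller imposes the target dynamics
#   M*(x_dd - x_dd_ref) + D*(x_d - x_d_ref) + K*(x - x_ref) = F_ext
# so the end effector behaves like a mass-spring-damper around the reference.
M, D, K = 2.0, 20.0, 100.0                   # desired inertia, damping, stiffness (illustrative)
dt = 0.001

x, x_d = 0.0, 0.0                            # actual position and velocity
x_ref, x_d_ref, x_dd_ref = 0.05, 0.0, 0.0    # reference trajectory (a 5 cm step)

def external_force(x):
    """Stiff wall at x = 0.03 m standing in for the environment."""
    return -5000.0 * (x - 0.03) if x > 0.03 else 0.0

for step in range(3000):
    F_ext = external_force(x)
    # Acceleration commanded by the impedance law:
    x_dd = x_dd_ref + (F_ext - D * (x_d - x_d_ref) - K * (x - x_ref)) / M
    x_d += x_dd * dt
    x += x_d * dt

print(f"steady contact position: {x:.4f} m, contact force: {external_force(x):.1f} N")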
Let G be an undirected graph with n vertices. Assume that a robot is placed on one vertex and n − 2 obstacles are placed on the other vertices. A vertex on which neither the robot nor an obstacle is placed is said to have a hole. Consider a single-player game in which the robot or an obstacle can be moved to an adjacent vertex if that vertex has a hole. The objective is to take the robot to a fixed destination vertex using the minimum number of moves. In general, it is not necessary that the robot will take a shortest path...
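Because exactly one vertex has a hole, a configuration is fully described by the pair (robot vertex, hole vertex), so the optimal number of moves can be found by breadth-first search over these pairs. A minimal sketch on an illustrative 2 x 3 grid; the graph and the start positions are ours, not from the paper:

from collections import deque

# Illustrative graph: a 2 x 3 grid, adjacency list keyed by vertex id.
graph = {
    0: [1, 3], 1: [0, 2, 4], 2: [1, 5],
    3: [0, 4], 4: [1, 3, 5], 5: [2, 4],
}

def min_moves(graph, robot, hole, target):
    """BFS over configurations (robot, hole); every other vertex holds an obstacle.
    A move slides the robot or an obstacle into the adjacent hole."""
    start = (robot, hole)
    dist = {start: 0}
    queue = deque([start])
    while queue:
        r, h = queue.popleft()
        if r == target:
            return dist[(r, h)]
        for v in graph[h]:                         # anything adjacent to the hole may slide into it
            nxt = (h, v) if v == r else (r, v)     # the robot moves, or an obstacle moves
            if nxt not in dist:
                dist[nxt] = dist[(r, h)] + 1
                queue.append(nxt)
    return None

# Robot at 0, hole at 5, destination 2: the robot may need more moves than its
# shortest path, because obstacles must first be shuffled out of the way.
print(min_moves(graph, robot=0, hole=5, target=2))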
Multifractal analysis is known as a useful tool in signal analysis. However, the methods are often used without methodological validation. In this study, we present multidimensional models in order to validate multifractal analysis methods.
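One common way to carry out such a validation is to run an estimator on a synthetic model whose multifractal spectrum is known in closed form. The sketch below uses a one-dimensional binomial cascade as the reference model and compares the estimated scaling exponents tau(q) with their analytic values; this choice of model and its parameters are ours, not necessarily the multidimensional models used in the study:

import numpy as np

# Binomial multiplicative cascade: a synthetic measure whose multifractal
# scaling exponents are known analytically, tau(q) = -log2(p**q + (1-p)**q).
p, levels = 0.7, 14
measure = np.array([1.0])
for _ in range(levels):
    measure = np.concatenate([measure * p, measure * (1 - p)])

def tau_estimate(measure, q, scales):
    """Slope of the log partition function Z(q, s) = sum_i mu_i(s)**q versus log s."""
    logs, logZ = [], []
    for s in scales:                              # box size in samples
        boxes = measure.reshape(-1, s).sum(axis=1)
        logs.append(np.log(s / measure.size))
        logZ.append(np.log(np.sum(boxes ** q)))
    slope, _ = np.polyfit(logs, logZ, 1)
    return slope

scales = [2 ** k for k in range(2, 8)]
for q in (-2, 1, 2, 3):
    est = tau_estimate(measure, q, scales)
    exact = -np.log2(p ** q + (1 - p) ** q)
    print(f"q={q:+d}: estimated tau={est:.3f}, analytic tau={exact:.3f}")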
A framework for multi-label classification extended by Error Correcting Output Codes (ECOCs) is introduced and empirically examined in this article. The solution treats the base multi-label classifiers as a noisy channel and applies ECOCs in order to recover from the classification errors made by the individual classifiers. The framework was examined through exhaustive studies over combinations of three distinct classification algorithms and four ECOC methods employed in multi-label classification...
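A minimal sketch of the noisy-channel view: the multi-label vector is encoded with an error-correcting code, each code bit would be predicted by its own base classifier, and decoding to the nearest codeword recovers isolated prediction errors. The (7,4) Hamming code, the four-label setting, and the simulated bit flip below are illustrative assumptions, not the codes or classifiers studied in the article:

from itertools import product

# Systematic generator matrix of a (7,4) Hamming code: 4 label bits -> 7 code bits.
G = [
    [1, 0, 0, 0, 0, 1, 1],
    [0, 1, 0, 0, 1, 0, 1],
    [0, 0, 1, 0, 1, 1, 0],
    [0, 0, 0, 1, 1, 1, 1],
]

def encode(labels):
    """Map a 4-bit multi-label vector to its 7-bit codeword (mod-2 arithmetic)."""
    return tuple(sum(l * g for l, g in zip(labels, col)) % 2 for col in zip(*G))

CODEBOOK = {encode(w): w for w in product([0, 1], repeat=4)}

def decode(bits):
    """Nearest-codeword decoding: pick the codeword with minimum Hamming distance."""
    best = min(CODEBOOK, key=lambda c: sum(a != b for a, b in zip(c, bits)))
    return CODEBOOK[best]

# A document with the label set {1, 3} -> label vector (0, 1, 0, 1).
true_labels = (0, 1, 0, 1)
codeword = encode(true_labels)

# Each of the 7 code bits would be predicted by its own base classifier;
# here we simulate the "noisy channel" by flipping one predicted bit.
noisy = list(codeword)
noisy[2] ^= 1

print("true labels    :", true_labels)
print("noisy channel  :", tuple(noisy))
print("decoded labels :", decode(noisy))   # the single error is corrected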