Exploring the gap between linear and classical logic.
Post-training rounding, also known as quantization, of estimated parameters is a widely adopted technique for reducing the energy consumption and latency of machine learning models. This theoretical work examines the impact of rounding estimated parameters in key regression methods from statistics and machine learning. The proposed approach models the perturbation of parameters as an additive error taking values in a specified interval. This...
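The abstract is truncated, so the exact rounding scheme is not specified; as a minimal illustrative sketch (not the paper's method), the Python snippet below rounds ordinary-least-squares coefficients to a uniform grid of step delta, so that the induced perturbation is an additive error confined to [-delta/2, delta/2]. The synthetic data, the grid step, and all variable names are assumptions for illustration only.

```python
# Sketch: post-training rounding (quantization) of estimated regression
# parameters, modelled as a bounded additive perturbation.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic regression problem (illustrative only).
n, p = 200, 5
X = rng.standard_normal((n, p))
beta_true = rng.uniform(-2.0, 2.0, size=p)
y = X @ beta_true + 0.1 * rng.standard_normal(n)

# Ordinary least-squares estimate of the parameters.
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)

# Post-training rounding to a uniform grid of step delta.
delta = 0.05
beta_q = np.round(beta_hat / delta) * delta

# The rounding error is an additive perturbation in [-delta/2, delta/2].
err = beta_q - beta_hat
assert np.all(np.abs(err) <= delta / 2 + 1e-12)

rmse = lambda b: np.sqrt(np.mean((y - X @ b) ** 2))
print("max |perturbation|:", np.abs(err).max())
print("RMSE increase after rounding:", rmse(beta_q) - rmse(beta_hat))
```

Rounding to a grid of step delta is just one way to realize an additive error with values in a prescribed interval; any perturbation bounded by the same interval fits the same analysis.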
We define H- and EH-expressions as extensions of regular expressions obtained by adding homomorphic and iterated homomorphic replacement, respectively, as new operations. The definition is analogous to the extension given by Gruska to characterize the context-free languages. We compare the families of languages obtained by these extensions with the families of regular, linear context-free, context-free, and EDT0L languages. Moreover, relations to language families based on patterns, multi-patterns,...
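The truncated abstract does not spell out the replacement operation; the sketch below assumes the uniform-replacement reading (every occurrence of a marked symbol in a word is replaced by one and the same word from the argument language, as in pattern languages) and bounds the iteration so that the example stays finite. The function names, the marker symbol X, and the depth parameter are illustrative, not taken from the paper.

```python
# Sketch of homomorphic and (bounded) iterated homomorphic replacement
# over finite languages, under the uniform-replacement assumption.

def homomorphic_replacement(L, symbol, M):
    """Words obtained from words of L by replacing every occurrence of
    `symbol` by the same word of M (uniform replacement)."""
    result = set()
    for w in L:
        if symbol not in w:
            result.add(w)
            continue
        for m in M:
            result.add(w.replace(symbol, m))
    return result


def iterated_homomorphic_replacement(L, symbol, M, depth):
    """One plausible reading of iteration: repeatedly feed the result back
    as the argument language; `depth` keeps the example finite."""
    current = set(M)
    for _ in range(depth):
        current |= homomorphic_replacement(L, symbol, current)
    return current


# Example: replace X in {aXb} by words from {ab}, then iterate.
L = {"aXb"}
M = {"ab"}
print(sorted(homomorphic_replacement(L, "X", M)))              # ['aabb']
print(sorted(iterated_homomorphic_replacement(L, "X", M, 3)))  # ab, aabb, aaabbb, aaaabbbb
```

With L = {aXb} and M = {ab}, the iterated replacement grows toward {a^n b^n : n >= 1}, a linear context-free but non-regular language, which hints at why such extensions are compared with the linear context-free and context-free families.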