Displaying similar documents to “A branch and bound algorithm for the two-machine flowshop problem with unit-time operations and time delays”

On the power of randomization for job shop scheduling with k-units length tasks

Tobias Mömke (2009)

RAIRO - Theoretical Informatics and Applications - Informatique Théorique et Applications

Similarity:

In the job shop scheduling problem k-units-Jₘ, there are m machines and each machine has an integer processing time of at most k time units. Each job consists of a permutation of m tasks corresponding to all machines, and thus all jobs have an identical dilation D. The contributions of this paper are the following results: (i) for d = o(D) jobs and every fixed k, the makespan of an optimal schedule is at most D + o(D), which extends the result of [3] for k = 1; (ii) a randomized on-line approximation algorithm...

On the weighted Euclidean matching problem in ℝ^d

Birgit Anthes, Ludger Rüschendorf (2001)

Applicationes Mathematicae

Similarity:

A partitioning algorithm for the Euclidean matching problem in ℝ^d is introduced and analyzed in a probabilistic model. The algorithm uses elements from the fixed dissection algorithm of Karp and Steele (1985) and the Zig-Zag algorithm of Halton and Terada (1982) for the traveling salesman problem. The algorithm runs in expected time n(log n)^(p−1) and approximates the optimal matching in the probabilistic sense.
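The fixed-dissection idea behind such partitioning algorithms can be sketched as follows: cut the unit square into a grid of cells, pair points locally within each cell, and pair the per-cell leftovers afterwards. This is a minimal illustration of the dissection principle, not the authors' algorithm; the grid size and the leftover-pairing rule here are arbitrary choices.

```python
import random

def grid_matching(points, k):
    """Partition the unit square into a k x k grid, greedily pair points
    within each cell, then pair the leftover points across cells.
    A sketch of the fixed-dissection idea only."""
    cells = {}
    for idx, (x, y) in enumerate(points):
        cell = (min(int(x * k), k - 1), min(int(y * k), k - 1))
        cells.setdefault(cell, []).append(idx)
    matching, leftovers = [], []
    for members in cells.values():
        while len(members) >= 2:          # local pairing inside one cell
            matching.append((members.pop(), members.pop()))
        leftovers.extend(members)          # at most one leftover per cell
    while len(leftovers) >= 2:             # pair leftovers globally
        matching.append((leftovers.pop(), leftovers.pop()))
    return matching

random.seed(0)
pts = [(random.random(), random.random()) for _ in range(100)]
m = grid_matching(pts, 5)
```

For an even number of points this always returns a perfect matching; the quality (total edge length) is what the probabilistic analysis in the paper addresses.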

Implicitization of Parametric Hypersurfaces via Points

Ferruccio Orecchia, Isabella Ramella (2018)

Rendiconto dell’Accademia delle Scienze Fisiche e Matematiche

Similarity:

Given a parametric polynomial representation of an algebraic hypersurface 𝐒 in the projective space, we give a new algorithm for finding the implicit Cartesian equation of 𝐒. The algorithm is based on finding a suitable finite number of points on 𝐒 and computing, by linear algebra, the equation of the hypersurface of least degree that passes through the points. In particular, the algorithm works for plane curves and surfaces in ordinary three-dimensional space. Using C++ the algorithm...
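The point-based linear-algebra step can be illustrated for a plane curve: evaluate all monomials up to the target degree at sampled points and take a null vector of the resulting matrix as the coefficient vector of the implicit equation. The circle example and the monomial ordering below are illustrative choices, not taken from the paper.

```python
import numpy as np

def implicit_coeffs(points, degree):
    """Implicit equation through sampled plane points: build the matrix of
    monomials x^i y^j (i + j <= degree) evaluated at each point and take a
    null vector via SVD."""
    monomials = [(i, j) for i in range(degree + 1)
                 for j in range(degree + 1 - i)]
    A = np.array([[x**i * y**j for (i, j) in monomials]
                  for (x, y) in points])
    # The right singular vector of the smallest singular value spans the kernel.
    _, _, vt = np.linalg.svd(A)
    return monomials, vt[-1]

# Hypothetical example: sample the unit circle x = cos t, y = sin t.
ts = np.linspace(0.0, 2 * np.pi, 12, endpoint=False)
pts = [(np.cos(t), np.sin(t)) for t in ts]
monos, c = implicit_coeffs(pts, 2)

# The recovered equation should vanish on fresh points of the circle.
t = 0.3
residual = sum(cf * np.cos(t)**i * np.sin(t)**j
               for cf, (i, j) in zip(c, monos))
```

With enough sample points the kernel is one-dimensional and the null vector is proportional to the coefficients of x² + y² − 1.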

An improvement of Euclid's algorithm

Zítko, Jan, Kuřátko, Jan

Similarity:

The paper presents the calculation of a greatest common divisor of two univariate polynomials. Euclid’s algorithm can be easily simulated by the reduction of the Sylvester matrix to an upper triangular form. This is performed by using c-s transformations and QR-factorization methods. Both procedures are described and numerically compared. Computations are performed in the floating point environment.
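The QR route can be sketched in a few lines: after triangularizing the Sylvester matrix, the last row of non-negligible norm carries the coefficients of a greatest common divisor. The example polynomials and the tolerance are illustrative choices, assuming exact-degree coefficient lists (highest degree first).

```python
import numpy as np

def sylvester(p, q):
    """Sylvester matrix of polynomials p, q given as coefficient lists,
    highest degree first."""
    m, n = len(p) - 1, len(q) - 1
    s = np.zeros((m + n, m + n))
    for i in range(n):                     # n shifted copies of p
        s[i, i:i + m + 1] = p
    for i in range(m):                     # m shifted copies of q
        s[n + i, i:i + n + 1] = q
    return s

def gcd_via_qr(p, q, tol=1e-10):
    """Reduce the Sylvester matrix to upper triangular form by
    QR-factorization; the last row of R with non-negligible norm gives
    the coefficients of a greatest common divisor."""
    _, r = np.linalg.qr(sylvester(p, q))
    rows = [row for row in r if np.linalg.norm(row) > tol]
    g = rows[-1]
    lead = int(np.argmax(np.abs(g) > tol))  # strip leading (near-)zeros
    g = g[lead:]
    return g / g[0]                         # normalize to a monic GCD

# p = (x-1)(x-2), q = (x-1)(x+3): the GCD should be x - 1.
g = gcd_via_qr([1.0, -3.0, 2.0], [1.0, 2.0, -3.0])
```

The orthogonal row operations of QR preserve the row space of the Sylvester matrix, which is why the last nonzero row of the triangular factor represents the lowest-degree polynomial combination, i.e. the GCD.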

Uniform convergence of the greedy algorithm with respect to the Walsh system

Martin Grigoryan (2010)

Studia Mathematica

Similarity:

For any 0 < ϵ < 1, p ≥ 1 and each function f ∈ L^p[0,1] one can find a function g ∈ L^∞[0,1) with mes{x ∈ [0,1): g ≠ f} < ϵ such that its greedy algorithm with respect to the Walsh system converges uniformly on [0,1) and the sequence {|cₖ(g)|: k ∈ spec(g)} is decreasing, where {cₖ(g)} is the sequence of Fourier coefficients of g with respect to the Walsh system.

Greedy approximation and the multivariate Haar system

A. Kamont, V. N. Temlyakov (2004)

Studia Mathematica

Similarity:

We study nonlinear m-term approximation in a Banach space with regard to a basis. It is known that in the case of a greedy basis (like the Haar basis in L^p([0,1]), 1 < p < ∞) a greedy type algorithm realizes nearly best m-term approximation for any individual function. In this paper we generalize this result in two directions. First, instead of a greedy algorithm we consider a weak greedy algorithm. Second, we study in detail unconditional nongreedy bases (like the multivariate Haar basis...
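In the discrete orthonormal setting the greedy m-term algorithm is simply: expand in the basis and keep the m coefficients of largest absolute value. The sketch below uses a one-dimensional discrete Haar basis (the paper's setting is the function-space, multivariate case); the test vector is an arbitrary example.

```python
import numpy as np

def haar_matrix(n):
    """Orthonormal discrete Haar basis on n = 2^k points, built recursively:
    averages on top, differences below, rows normalized."""
    if n == 1:
        return np.array([[1.0]])
    h = haar_matrix(n // 2)
    top = np.kron(h, [1.0, 1.0])
    bottom = np.kron(np.eye(n // 2), [1.0, -1.0])
    m = np.vstack([top, bottom])
    return m / np.linalg.norm(m, axis=1, keepdims=True)

def greedy_m_term(f, basis, m):
    """Greedy m-term approximation: keep the m basis coefficients of
    largest absolute value and reconstruct."""
    coeffs = basis @ f
    keep = np.argsort(-np.abs(coeffs))[:m]
    approx = np.zeros_like(coeffs)
    approx[keep] = coeffs[keep]
    return basis.T @ approx

B = haar_matrix(8)
f = np.array([1.0, 1.0, 1.0, 1.0, 0.0, 0.0, 2.0, 2.0])
```

Since the basis is orthonormal, the error is the norm of the dropped coefficients, so it is nonincreasing in m, and keeping all coefficients reconstructs f exactly.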

Distributed H∞ estimation for moving target under switching multi-agent network

Hu Chen, Qin Weiwei, He Bing, Liu Gang (2015)

Kybernetika

Similarity:

In this paper, the distributed H∞ estimation problem is investigated for a moving target with local communication and switching topology. Based on the solution of the algebraic Riccati equation, a recursive algorithm is proposed using a constant gain. The stability of the proposed algorithm is analysed by using the Lyapunov method, and a lower bound for the estimation errors is obtained for the proposed common H∞ filter. Moreover, a bound for the H∞ parameter is obtained by means of the solution...
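The shape of a constant-gain recursive estimator can be sketched generically: the state estimate is propagated through the model and corrected by the innovation scaled by a fixed gain. The system matrices and the gain below are hypothetical; the paper derives its gain from an algebraic Riccati equation, which this sketch does not reproduce.

```python
import numpy as np

# Minimal constant-gain recursive estimator (observer-style sketch).
A = np.array([[0.9, 0.1], [0.0, 0.8]])   # hypothetical system matrix
C = np.array([[1.0, 0.0]])               # hypothetical measurement matrix
L = np.array([[0.5], [0.2]])             # constant gain, chosen by hand here

x = np.array([[1.0], [-1.0]])            # true state
xh = np.zeros((2, 1))                    # estimate, initialized at zero
for _ in range(50):
    y = C @ x                            # measurement of the true state
    xh = A @ xh + L @ (y - C @ xh)       # recursive update with constant gain
    x = A @ x                            # true dynamics
err = np.linalg.norm(x - xh)
```

The estimation error obeys e_{k+1} = (A − LC)e_k, so it decays whenever the spectral radius of A − LC is below one, as it is for these example values.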

Diagonalization in proof complexity

Jan Krajíček (2004)

Fundamenta Mathematicae

Similarity:

We study diagonalization in the context of implicit proofs of [10]. We prove that at least one of the following three conjectures is true: ∙ There is a function f: {0,1}* → {0,1} computable in E that has circuit complexity 2^Ω(n). ∙ NP ≠ coNP. ∙ There is no p-optimal propositional proof system. We note that a variant of the statement (either NE ≠ coNE or NE ∩ coNE contains a function 2^Ω(n) hard on average) seems to have a bearing on the existence of good proof complexity generators. In particular, we prove that...

On the Kaczmarz algorithm of approximation in infinite-dimensional spaces

Stanisław Kwapień, Jan Mycielski (2001)

Studia Mathematica

Similarity:

The Kaczmarz algorithm of successive projections suggests the following concept. A sequence (eₖ) of unit vectors in a Hilbert space is said to be effective if for each vector x in the space the sequence (xₙ) converges to x, where (xₙ) is defined inductively: x₀ = 0 and xₙ = xₙ₋₁ + αₙeₙ, where αₙ = ⟨x − xₙ₋₁, eₙ⟩. We prove the effectivity of some sequences in Hilbert spaces. We generalize the concept of effectivity to sequences of vectors in Banach spaces and we prove some results for this more general concept.
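The inductive scheme above can be run directly in a finite-dimensional Hilbert space. As a simple effective case (my choice of example, not the paper's), cycling through the standard orthonormal basis of ℝ³ recovers x after one sweep, since each step fixes one coordinate without disturbing the others.

```python
import numpy as np

# Kaczmarz-style successive projections: x0 = 0 and
# x_n = x_{n-1} + <x - x_{n-1}, e_n> e_n.
rng = np.random.default_rng(1)
x = rng.standard_normal(3)              # target vector
basis = np.eye(3)                       # e_1, e_2, e_3, cycled periodically

xn = np.zeros(3)
for n in range(30):
    e = basis[n % 3]
    xn = xn + np.dot(x - xn, e) * e     # one projection step
err = np.linalg.norm(x - xn)
```

For non-orthogonal effective sequences the convergence is only in the limit; the interesting cases studied in the paper are infinite-dimensional.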

Seasonal time-series imputation of gap missing algorithm (STIGMA)

Eduardo Rangel-Heras, Pavel Zuniga, Alma Y. Alanis, Esteban A. Hernandez-Vargas, Oscar D. Sanchez (2023)

Kybernetika

Similarity:

This work presents a new approach for the imputation of missing data in weather time series from a seasonal pattern: the seasonal time-series imputation of gap missing algorithm (STIGMA). The algorithm takes advantage of a seasonal pattern to impute unknown data by averaging available data. We test the algorithm using data measured every 10 minutes over a period of 365 days during the year 2010; the variables include global irradiance, diffuse irradiance, ultraviolet irradiance,...
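The core seasonal-averaging idea can be sketched as follows: a missing value is replaced by the mean of the available observations that share its seasonal phase (position modulo the period). This is only the averaging step, with a toy period-3 series of my own; the full STIGMA procedure is described in the paper.

```python
import numpy as np

def seasonal_impute(series, period):
    """Impute NaNs by averaging the available observations at the same
    seasonal phase (index modulo the period)."""
    out = np.asarray(series, dtype=float).copy()
    for phase in range(period):
        slot = out[phase::period]             # view of one seasonal phase
        known = slot[~np.isnan(slot)]
        if known.size:
            slot[np.isnan(slot)] = known.mean()
    return out

# Toy series with period 3; the gap at index 4 shares its phase with
# the observations 20.0 and 22.0, so it is filled with their mean.
data = [10.0, 20.0, 30.0, 11.0, np.nan, 29.0, 12.0, 22.0, 31.0]
filled = seasonal_impute(data, period=3)
```

Writing through the strided view keeps the implementation to a single pass per phase.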