CGC Colloquium, Summer 2001
Time: Mondays, 14:15 - 15:45
Location: IFW A 32 (unless stated otherwise)
Organizers: CGC faculty at ETH Zurich
Contact: Emo Welzl
The CGC colloquium does not take place on April 16 (Easter Monday), April 23 (Sechseläuten), May 14 (CGC Workshop at Monte Verita), and June 4 (Whit Monday).
April 09, 2001
Shmuel Onn
Technion - Israel Institute of Technology, Haifa, Israel

April 30, 2001
Micha Sharir
Tel Aviv University, Israel (and ETH Zurich, Summer 2001)
The Union of Geometric Objects and Generalized Voronoi Diagrams

May 07, 2001
Jürg Nievergelt
ETH Zurich
Exploring the potential of raw computing power

May 28, 2001
Martin Dietzfelbinger
University of Technology Ilmenau, Germany

Leonid Khachiyan
Rutgers University, New Jersey, U.S.A.

Heribert Vollmer
University of Würzburg, Germany
A Polynomial-Time Approximation Scheme for Base Station Positioning in UMTS-Networks

Volker Diekert
University of Stuttgart, Germany
Monday, April 09, 2001, 14:15 - 15:45, IFW A 32
Shmuel Onn
Technion - Israel Institute of Technology, Haifa, Israel
The partition problem concerns the partitioning of a set of vectors in d-space into p parts so as to maximize an objective function which is convex on the sum of vectors in each part, subject to constraints on the shape of admissible partitions. The problem has broad expressive power and arises in a variety of areas ranging from economics to symbolic computation.

The problem can be treated by maximizing the same objective over a suitable partition polytope in d x p matrix space, and its computational complexity is intimately related to the vertex complexity of such polytopes.

My talk will provide an introduction to the subject and an overview of our ongoing ISF project on vector partitions, drawing from several articles with various coauthors including Alon, Aviran, Hwang, Rothblum, Schulman and Sturmfels.
In particular, I will describe recent polynomial upper and lower bounds on the complexity of the problem with no shape restrictions, based on zonotopal refinements of partition polytopes, on the introduction of Momentopes, and on certain polynomial realizations of Davenport-Schinzel type sequences.
If time permits I will mention an application to symbolic computation - an efficient algorithm for constructing universal Gröbner bases of vanishing ideals of configurations by partition-realization of their state polytopes.
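To make the problem statement concrete, here is a tiny brute-force sketch, a hypothetical illustration that is not from the talk or the ISF project. It partitions n vectors in d-space into p parts (with no shape restrictions) and maximizes one standard convex-on-part-sums objective, the sum of squared Euclidean norms of the part sums; the function names and the objective are my own choices for the example.

```python
from itertools import product

def part_sums(vectors, assignment, p):
    # Sum the vectors assigned to each of the p parts.
    d = len(vectors[0])
    sums = [[0.0] * d for _ in range(p)]
    for v, j in zip(vectors, assignment):
        for k in range(d):
            sums[j][k] += v[k]
    return sums

def best_partition(vectors, p):
    # Brute force over all p^n assignments; the objective
    # f(s_1, ..., s_p) = sum_j ||s_j||^2 is convex in the part sums,
    # so an optimum is attained at a vertex of the partition polytope.
    best_val, best_assign = float("-inf"), None
    for assignment in product(range(p), repeat=len(vectors)):
        sums = part_sums(vectors, assignment, p)
        val = sum(sum(x * x for x in s) for s in sums)
        if val > best_val:
            best_val, best_assign = val, assignment
    return best_val, best_assign

# Two loose clusters of vectors in the plane; the optimum separates them.
vectors = [(1.0, 0.0), (0.9, 0.1), (-1.0, 0.2), (-0.8, -0.1)]
val, assign = best_partition(vectors, p=2)
```

Brute force is only feasible for tiny n; the point of the polyhedral theory sketched in the abstract is precisely to do better by exploiting the vertex structure of partition polytopes.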
Monday, April 30, 2001, 14:15 - 15:45, IFW A 32
Micha Sharir
Tel Aviv University, Israel (and ETH Zurich, Summer 2001)
The Union of Geometric Objects and Generalized Voronoi Diagrams

Let C be a collection of n objects in d-space, and let U denote the union of C. The combinatorial complexity of U is the number of faces, of all dimensions, of the arrangement of the boundary surfaces of the objects in C that lie on the boundary of U (e.g., a vertex is an intersection of d boundaries). If each of the objects in C has 'constant description complexity' then the complexity of U is O(n^d), and in general this bound is tight.

We analyze the complexity of U in several special cases where one can obtain a smaller complexity bound. There are many such instances in the 2-dimensional case (pseudo-disks, fat objects), but the problem is quite hard in 3 and more dimensions. The cases that we consider include convex polyhedra in 3 dimensions, axis-parallel cubes in any dimension, infinite congruent cylinders in 3 dimensions, fat dihedral wedges and congruent (arbitrarily oriented) cubes in 3 dimensions, and more.

An important special case is the union of Minkowski sums, where the objects are of the form { A_i + B }, i = 1..n, where the A_i's are pairwise-disjoint convex objects, B is another convex object, and X + Y denotes the Minkowski sum of X and Y. The boundary of such a union can be regarded as a cross-section or 'level set' of the generalized Voronoi diagram with the A_i's as sites and with the metric (or convex distance function) induced by B as a 'unit ball': the union boundary is the locus of points whose B-distance from the nearest site is exactly 1.

We discuss a few cases where sharper bounds are known, either for the complexity of the union of Minkowski sums, as above, or for the complexity of the associated Voronoi diagram (a much harder problem).
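The union/Voronoi connection in the abstract can be checked numerically in a degenerate toy setting. The sketch below (my own hypothetical illustration, not from the talk) takes the sites A_i to be single points and B to be the unit L-infinity ball, so the convex distance function induced by B is the Chebyshev distance: a point lies in the union of the Minkowski sums A_i + B exactly when its B-distance to the nearest site is at most 1, and on the union's boundary exactly when that distance equals 1.

```python
# Sites A_i are points, B is the unit L-infinity ball (an axis-parallel
# square), so each Minkowski sum A_i + B is a unit "square" around a site
# and the B-induced distance is the Chebyshev (max-coordinate) distance.

def chebyshev(x, a):
    return max(abs(xi - ai) for xi, ai in zip(x, a))

def b_distance_to_nearest_site(x, sites):
    # The 'level' of x in the generalized Voronoi diagram of the sites.
    return min(chebyshev(x, a) for a in sites)

def in_union(x, sites):
    # x is in the union of the A_i + B iff its B-distance to the
    # nearest site is at most 1; equality means x is on the boundary.
    return b_distance_to_nearest_site(x, sites) <= 1.0

sites = [(0.0, 0.0), (3.0, 0.5)]
print(in_union((0.8, 0.9), sites))                    # inside the first square
print(in_union((2.0, 2.0), sites))                    # outside both squares
print(b_distance_to_nearest_site((1.0, 0.3), sites))  # level 1: on the boundary
```

Replacing the point sites by disjoint convex sets and B by another convex body gives the general setting of the abstract; the level-1 set of the same distance function is then the union boundary whose combinatorial complexity the talk bounds.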
Monday, May 07, 2001, 14:15 - 15:45, IFW A 32
Jürg Nievergelt
ETH Zurich
Exploring the potential of raw computing power

For half a century since computers came into existence, the goal of finding elegant and efficient algorithms to solve "simple" (well-defined and well-structured) problems has dominated algorithm design. Over the same time period, both the processing and storage capacity of computers have increased by roughly a factor of a million. The next few decades may well give us a similar rate of growth in raw computing power, due to various factors such as continuing miniaturization and parallel and distributed computing. If a quantitative change of orders of magnitude leads to qualitative advances, where will the latter take place? Only empirical research can answer this question.

Asymptotic complexity theory has emerged as a surprisingly effective tool for predicting the run times of polynomial-time algorithms. For NP-hard problems, on the other hand, it yields overly pessimistic bounds: it asserts the non-existence of algorithms that are efficient across an entire problem class, but ignores the fact that many instances, perhaps including those of interest, can be solved efficiently. For such cases we need a complexity measure that applies to problem instances rather than to over-sized problem classes.

Combinatorial optimization and enumeration problems are modeled by state spaces that usually lack any regular structure. Exhaustive search is often the only way to handle such "combinatorial chaos". Several general-purpose search algorithms are used under different circumstances. We describe reverse search and illustrate this technique on a case study of enumerative optimization: enumerating the k shortest Euclidean spanning trees.

(Pages 18-35 in "SOFSEM 2000 - Theory and Practice of Informatics", V. Hlavac, K.G. Jeffery and J. Wiedermann (eds.), Springer LNCS Vol. 1963, 2000)
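Reverse search, the technique named in the abstract, can be shown on a much smaller state space than spanning trees. The toy sketch below (my own illustration, not the talk's k-shortest-spanning-trees algorithm) enumerates all subsets of {0, ..., n-1}: one fixes a root object and a "parent" rule (a local-search step toward the root), and enumeration proceeds by inverting that rule, generating each object exactly once without storing the set of visited objects.

```python
def parent(subset):
    # Local-search step toward the root (): drop the largest element.
    return subset[:-1]

def children(subset, n):
    # Invert the parent rule: a child appends one element larger than
    # the current maximum, so parent(child) == subset for every child.
    start = subset[-1] + 1 if subset else 0
    return [subset + (j,) for j in range(start, n)]

def reverse_search(n):
    # Traverse the implicit tree defined by the parent rule, starting
    # at the root; no "already visited" bookkeeping is needed because
    # every object has exactly one parent path to the root.
    out, stack = [], [()]
    while stack:
        s = stack.pop()
        out.append(s)
        stack.extend(children(s, n))
    return out

subsets = reverse_search(3)   # all 8 subsets of {0, 1, 2}, each exactly once
```

The memory footprint is the depth of the tree rather than the size of the search space, which is what makes reverse search attractive for large-scale enumeration problems like the spanning-tree case study.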
Monday, May 28, 2001, 14:15 - 15:45, IFW A 32
Martin Dietzfelbinger
University of Technology Ilmenau, Germany

Since it was introduced by Carter and Wegman in 1979, the general and elegant concept of a universal class of hash functions has found numerous applications in a wide variety of areas of Theoretical Computer Science, ranging from the construction of data structures over the fundamentals of parallel and distributed computing all the way to computational complexity theory.

The basic approach is as follows: Let U and M be fixed (finite) sets. Consider sets H of functions from U to M that can be evaluated efficiently and such that the values of a function chosen at random from H behave in some way similar to a random function from U to M.

The classical example is Carter and Wegman's concept of "c-universality": for x, y in U with x != y we have Prob_{h in H}(h(x) = h(y)) <= c / |M| for a constant c.

Many other concepts formulating stronger requirements have since been introduced, in particular "k-wise independence", for any k >= 2: for every choice of k distinct elements x_1, ..., x_k in U, the random values h(x_1), ..., h(x_k) are independent and uniformly distributed in M.

In the lecture, we will look at some interesting constructions of such classes, and at methods from probability theory, algebra, and combinatorics that allow us to establish positive and negative properties of these classes. The role of these properties in selected applications will be described, in particular in recent constructions of "perfect" hash functions, i.e., functions with a very simple structure that are one-to-one on a given set S in U with |S| = |M|.
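The classical Carter-Wegman construction of such a class can be verified exhaustively for a small prime. With p prime, U = {0, ..., p-1} and M = {0, ..., m-1}, the functions h_{a,b}(x) = ((a*x + b) mod p) mod m, for a in {1, ..., p-1} and b in {0, ..., p-1}, form a universal class with collision probability at most 1/m (so c = 1 in the definition above). The sketch below is an illustrative check, not code from the lecture:

```python
p, m = 17, 4  # p prime, |M| = m

def h(a, b, x):
    # A member of the Carter-Wegman class H = { h_{a,b} : a != 0 }.
    return ((a * x + b) % p) % m

def collision_probability(x, y):
    # Prob over a uniformly random h in H that h(x) == h(y).
    hits = sum(1 for a in range(1, p) for b in range(p)
               if h(a, b, x) == h(a, b, y))
    return hits / ((p - 1) * p)

# Exhaustively check c-universality with c = 1 over all distinct pairs.
worst = max(collision_probability(x, y)
            for x in range(p) for y in range(x + 1, p))
```

The proof behind the bound is that (a*x + b) mod p and (a*y + b) mod p range uniformly over all ordered pairs of distinct values mod p, and at most a 1/m fraction of those pairs collide after the final mod m.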