Conference
Communications
International
Conferences
- Ya.D. Sergeyev,
Infinity Computer and Calculus
(invited plenary lecture), in Proc. of the
3rd International Conference of Applied
Mathematics TICAM, Plovdiv (Bulgaria), August 12-18, 2006,
Bainov D. and Nenov S., eds.,
University of Plovdiv Press, Vol. 2, pp. 246-247.
- Ya.D. Sergeyev,
Infinity Computer and Calculus
(invited plenary lecture), in Proc. of the
Conference on Applied Optimization and
Metaheuristic Innovations, Yalta (Ukraine), July 19-21, 2006,
Sergienko I. V., ed., pp. 40-41.
- G. Zanghirati, L. Zanni,
Some properties of gradient-based methods with application to
machine learning
(invited talk),
EuroXXI - 21st European Conference on Operational Research,
Reykjavik (Iceland), July 2-5, 2006.
- G. Zanghirati, L. Zanni,
Large-scale Support Vector Machines: Decomposition and Cascade Approaches
(invited talk),
EuroXXI - 21st European Conference on Operational Research,
Reykjavik (Iceland), July 2-5, 2006.
- G. Spaletta,
Modelling the Thyroid Geometry,
8th International Mathematica Symposium,
Palais des Papes, Avignon (France), June 19-23, 2006.
- M. Sofroniou,
Mathematica's Numerics
(series of invited talks),
8th International Mathematica Symposium,
Palais des Papes, Avignon (France), June 19-23, 2006.
- G. Spaletta, M. Sofroniou,
Matrix Polynomials and the Matrix Exponential
(invited talk),
Castellon Conference on Geometric Integration,
University Jaume I, September 18-22, 2006.
- E. Galligani, V. Ruggiero, G. Zanghirati,
Splitting methods for nonlinear diffusion filtering,
3rd IASC World Conference on
"Computational Statistics and Data Analysis",
Limassol (Cyprus), October 28-31, 2005.
- T. Serafini, G. Zanghirati, L. Zanni,
Gradient Projection-Type Quadratic Solvers
in Parallel Decomposition Techniques for Support Vector Machines,
3rd IASC World Conference on
"Computational Statistics and Data Analysis",
Limassol (Cyprus), October 28-31, 2005.
- T. Serafini, G. Zanghirati, L. Zanni,
Some Improvements to a Parallel
Decomposition Technique for
Training Support Vector Machines,
EURO PVM MPI 2005, 12th European Parallel Virtual Machine and
Message Passing Interface Conference,
Sorrento (Naples, Italy), September 18-21, 2005.
- T. Serafini, G. Zanghirati, L. Zanni,
Decomposition techniques and gradient projection methods in
training Support Vector Machines,
SIAM Conference on Optimization 2005,
Stockholm (Sweden), May 15-19, 2005.
- T. Serafini, G. Zanghirati, L. Zanni,
On gradient projection-based decomposition techniques for training
SVMs on parallel architectures,
PASCAL Workshop,
Thurnau (Germany), March 16-18, 2005.
- Ya.D. Sergeyev, D.E. Kvasov,
Diagonal global search based on a set of possible
Lipschitz constants, In Proc. of the
International Workshop on Global Optimization GO05,
Almería (Spain), September 18-22, 2005, pp. 219-224.
- Ya.D. Sergeyev,
Global search based on efficient diagonal partitions
(invited plenary lecture),
International Conference on Complementarity, Duality, and Global
Optimization in Science and Engineering CDGO2005,
Blacksburg (Virginia, USA),
August 15-17, 2005.
- V.A. Grishagin, Ya.D. Sergeyev,
Parallel algorithms for multidimensional
global optimization with non-convex constraints, in Proc. of the
5th International Workshop "High-Performance Computing on Clusters",
Nizhni Novgorod "Lobachevsky"
University, Nizhni Novgorod (Russia),
November 22-25, 2005,
R.G. Strongin, Ed., NNGU Press, pp. 74-83.
- F. Zama,
Image Processing in FEMLAB using Diffusion Filters,
Femlab Conference 2005,
Stockholm (Sweden), October 3-5, 2005.
- Ya.D. Sergeyev,
Infinity Computer: Principles of work
and applications (invited plenary lecture),
in Proc. of the International Conference on Selected
Problems of Modern Mathematics,
Ishanov S.I., Ed., Kaliningrad
(Russia), April 4-8, 2005, pp. 211-213.
- G. Landi, The Lagrange Method for the Regularization of Discrete
Ill-Posed Problems,
22nd IFIP TC 7 Conference on System Modeling and Optimization,
Turin (Italy), July 18-22, 2005.
- V.A. Grishagin, Ya.D. Sergeyev
(2004), Efficiency of
parallelization of characteristic
global optimization algorithms in
the framework of the nested
optimization scheme, Proceedings
of the 4th International Workshop
"Parallel Computations on
Clusters", Samara, Russia,
September 30 - October 2, 2004,
Soyfer V. ed., RC RAN Press, pp.
70-74.
- Ya.D. Sergeyev (2004), A
new computational paradigm:
Mathematical model and applications,
Proceedings of the VI-th
International Congress on "Mathematical
Modeling", Nizhni Novgorod
State University, Nizhni Novgorod,
Russia, September 20-26, p. 27.
- Ya.D. Sergeyev, D. E. Kvasov
(2004), Lipschitzian global
optimization without the Lipschitz
constant based on adaptive diagonal
curves, Proceedings of the VI-th
International Congress on "Mathematical
Modeling", Nizhni Novgorod
State University, Nizhni Novgorod,
Russia, September 20-26, p. 121.
- M. Sofroniou, G. Spaletta (invited), The Matrix
Exponential: Efficient Computation and Error Analysis,
SIMAI XIV, September
20-24, 2004, Venice (Italy).
- G. Spaletta (invited), Numerical
Assessments in the Work of Vito
Volterra, ICMMB XIV, September
16-18, 2004, Bologna (Italy).
- E. Loli Piccolomini, A descent
method for computing the Tikhonov
regularized solution of linear
inverse problems,
SPIE Annual Meeting "Optical
Science and Technology",
Denver, Colorado (USA), August 2-6,
2004.
- G. Landi, A Total Variation based
Regularization strategy in Magnetic
Resonance Imaging,
SPIE Annual Meeting "Optical
Science and Technology",
Denver, Colorado (USA), August 2-6,
2004.
- Y. Sergeyev, Global
optimization methods and classes of
test functions, First
International Conference on
Continuous Optimization ICCOPT-I.
Troy (NY), USA, August 2-4, 2004.
- Y. Sergeyev, Algorithms
and partition strategies for
Lipschitzian global optimization,
40th Workshop Large Scale Nonlinear
Optimization, Erice (TP), Italy,
June 22 - July 1, 2004.
- T. Serafini, G. Zanghirati, L.
Zanni, Recent improvements to
gradient projection-based
decomposition techniques for Support
Vector Machines, International
Conference "MML2004 -
Mathematical Methods for Learning",
June 21-24, 2004, Como (Italy).
- G. Spaletta, M. Sofroniou, Solving
Linear Systems Accurately,
Workshop on Dynamical Systems on
Matrix Manifolds, May 27-28, 2004,
Bari (Italy).
- S. Bonettini, V. Ruggiero, E.
Galligani, Some iterative methods
for the solution of a reduced
symmetric indefinite KKT system,
International Conference
"HPSNO2004 - High Performance
Algorithms and Software for
Nonlinear Optimization", May
18-20, 2004, Ischia (Naples, Italy).
- T. Serafini, G. Zanghirati, L.
Zanni, A Gradient
Projection-based Decomposition
Software for Large Quadratic
Programs in Training Support Vector
Machines, International
Conference "HPSNO2004 - High
Performance Algorithms and Software
for Nonlinear Optimization",
May 18-20, 2004, Ischia (Naples,
Italy).
- S. Bonettini, V. Ruggiero, E.
Galligani, Interior point method
as inexact Newton method for KKT
systems,
CORS/INFORMS Joint Int. Meeting, May 16-19, 2004, Banff,
Alberta (Canada).
- G. Spaletta, M. Sofroniou, Efficient
Matrix Polynomial Computation and
Application to the Matrix
Exponential, IWTAM II, April
1-3, 2004, Montecatini (Italy).
- V. Ruggiero (invited), Interior
point method as inexact Newton
method for KKT systems, Second
International Workshop on the
Technological Aspects of Mathematics,
April 1-3, 2004, Montecatini (Italy).
Communications
at the Scientific Meeting "Numerical
Methods for Local and Global Optimization:
Sequential and Parallel Algorithms",
INdAM Conference in Cortona (Italy), July
14-20, 2003.
- G.
Landi, E. Loli Piccolomini
(2003), The Total Variation
Regularization in Dynamic MR Imaging.
Abstract.
The Total Variation (TV)
regularization method, proposed by
Rudin, Osher and Fatemi in [Physica
D., 1992], has recently become a
very popular technique in image
restoration problems. The motivation
for its success is that the TV
regularization performs well for
denoising and deblurring while
preserving sharp discontinuities.
The good performance of the TV model
makes it particularly attractive for
biomedical imaging. We study the use
of TV regularization technique in
the reconstruction of dynamic
Magnetic Resonance images [Z.P.
Liang, P.C. Lauterbur, 1994].
Several iterative schemes have been
proposed in the literature for
solving the TV minimization problem.
We compare the performance of a
Newton and a fixed-point method
[C.R. Vogel, M.E. Oman, 1996] on
both test problems and real MR data.
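The lagged-diffusivity fixed-point iteration of Vogel and Oman that the abstract compares against can be sketched in a few lines. The following 1D denoising routine is an illustrative reconstruction under our own choice of names and parameters, not the authors' implementation.

```python
import numpy as np

def tv_denoise_1d(f, lam=0.5, beta=1e-6, iters=30):
    """Lagged-diffusivity fixed-point iteration for the smoothed TV
    functional  0.5*||u - f||^2 + lam * sum_i sqrt((Du)_i^2 + beta),
    where D is the forward-difference operator."""
    n = len(f)
    D = np.diff(np.eye(n), axis=0)          # (n-1) x n forward differences
    u = f.astype(float).copy()
    for _ in range(iters):
        # Diffusivity weights from the previous iterate ("lagged").
        w = 1.0 / np.sqrt(np.diff(u) ** 2 + beta)
        # Solve the linearized optimality system for the new iterate.
        u = np.linalg.solve(np.eye(n) + lam * D.T @ (w[:, None] * D), f)
    return u
```

On a noisy step signal this removes most of the noise while keeping the jump sharp, which is the edge-preserving behaviour the abstract exploits.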
- F.
Zama, E. Loli Piccolomini, Truncated
Conjugate Gradient Iterations for
Solving Ill-Posed Problems.
Abstract.
A large variety of applications
naturally gives rise to ill-posed
problems, whenever the underlying
physical or technical problem is
modelled by an integral equation of
the first kind with a smooth kernel.
The data usually stem from
measurements with a limited
precision, i.e., only perturbed data
are available. The inverse problem
is ill-posed and requires
regularization methods. In this work
we describe an iterative algorithm
for finding the solution and the
regularization parameter. Truncated
Conjugate Gradients Iterations are
implemented for computing the
solution, while the value of the
regularization parameter is determined
in order to obtain decreasing values of
the objective functional. We develop
a stopping criterion for the
CG-iterates which is linked to the
noise level and the current value of
the regularization parameter.
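A minimal version of such a truncated iteration can be written as conjugate gradients on the Tikhonov normal equations, stopped once the residual falls below a tolerance tied to the noise level. This is a sketch under assumed names and a simple discrepancy-style rule, not the authors' algorithm, which additionally updates the regularization parameter.

```python
import numpy as np

def tikhonov_cg(A, b, alpha, noise_level, max_iter=200):
    """Conjugate gradients on (A^T A + alpha*I) x = A^T b, truncated
    when the residual norm drops below `noise_level`."""
    x = np.zeros(A.shape[1])
    r = A.T @ b                  # residual of the normal equations at x = 0
    p = r.copy()
    rs = r @ r
    for _ in range(max_iter):
        if np.sqrt(rs) <= noise_level:
            break                # truncation: stop at the noise level
        Ap = A.T @ (A @ p) + alpha * p
        step = rs / (p @ Ap)
        x += step * p
        r -= step * Ap
        rs, rs_old = r @ r, rs
        p = r + (rs / rs_old) * p
    return x
```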
- G.
Spaletta, M. Sofroniou, Symmetric
Composition of Symmetric Numerical
Integration Methods.
Abstract.
This work focuses on the derivation
of composition methods for the
numerical integration of ordinary
differential equations. In contrast
to the Aitken-Neville algorithm used
in extrapolation, composition can
conserve desirable geometric
properties of a base integration
method, such as symplecticity [E.
Hairer, Ch. Lubich, G. Wanner,
2002]. We survey existing
composition methods and describe
results of a numerical search for
new methods [R.I. McLachlan, G.R.W.
Quispel, 2002; H. Yoshida, 1990].
The optimization procedure that has
been adopted [P.E. Gill, W. Murray,
M.H. Wright, 1984] can be very
intensive, especially when deriving
high order composition schemes. To
overcome this, we make use of
parallel computation. Numerical
examples indicate that the new
methods perform better than
previously known methods.
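The idea the survey builds on can be illustrated with the classical example: Yoshida's symmetric composition of three Stormer-Verlet substeps, which raises the order from two to four while preserving symplecticity. A minimal sketch with our own function names:

```python
def leapfrog(q, p, h, grad_V):
    """One step of the symmetric, second-order Stormer-Verlet scheme
    for  q' = p,  p' = -grad_V(q)."""
    p = p - 0.5 * h * grad_V(q)
    q = q + h * p
    p = p - 0.5 * h * grad_V(q)
    return q, p

# Yoshida's weights: three Verlet substeps (w1, w0, w1) give order 4.
_c = 2.0 ** (1.0 / 3.0)
W1 = 1.0 / (2.0 - _c)
W0 = -_c / (2.0 - _c)

def yoshida4(q, p, h, grad_V):
    """Symmetric composition of the base method with itself."""
    for w in (W1, W0, W1):
        q, p = leapfrog(q, p, w * h, grad_V)
    return q, p
```

The negative middle substep `W0 * h` is characteristic of such compositions; for a harmonic oscillator the energy error stays bounded over long integrations.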
- F.
Tinti, Numerical Solution of
Pseudomonotone Variational
Inequalities Problems by
Extragradient Methods.
Abstract.
In this work we have analyzed from
the numerical point of view the
class of projection methods for
solving variational inequality
problems. We focus on some specific
extragradient methods making use of
different choices of the steplengths.
Subsequently we have analyzed the
hyperplane projection methods in
which we construct an appropriate
hyperplane which strictly separates
the current iterate from the
solutions of the problem. Finally we
have included a collection of test
problems implemented in Matlab to
illustrate the effectiveness of the
proposed methods.
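The basic extragradient step analyzed above (Korpelevich's method: a projected prediction step followed by a projected correction using the operator evaluated at the prediction) can be sketched as follows; the function name and the fixed steplength are our own illustrative choices, whereas the communication studies several steplength rules.

```python
import numpy as np

def extragradient(F, project, x0, tau, iters=200):
    """Korpelevich's extragradient method for the variational inequality
        find x* in C such that F(x*)^T (x - x*) >= 0 for all x in C,
    with the set C given implicitly through its projection operator."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        y = project(x - tau * F(x))   # prediction step
        x = project(x - tau * F(y))   # correction step, F at the prediction
    return x
```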
- S.
Bonettini, A Nonmonotone
Inexact Newton Method.
Abstract.
In this work we describe a variant
of the inexact Newton method for
solving nonlinear systems of
equations. We define a nonmonotone
inexact Newton step and a
nonmonotone backtracking strategy.
For this nonmonotone scheme we
present the convergence theorems.
Finally, we show how these strategies
can be applied to the inexact Newton
interior-point method and we present
some numerical examples.
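As a rough sketch of the nonmonotone idea: accept a Newton step (computed exactly here, rather than inexactly) whenever the merit ||F|| improves on the maximum over the last M iterates instead of only the last one. Function names and constants are our own, and the interior-point application is omitted.

```python
import numpy as np

def nonmonotone_newton(F, J, x0, M=5, gamma=1e-4, tol=1e-10, max_iter=100):
    """Newton's method for F(x) = 0 with a nonmonotone backtracking
    line search: a step is accepted when ||F|| is sufficiently smaller
    than the maximum merit over the last M iterates (Grippo-style rule),
    which allows occasional increases of the merit function."""
    x = np.asarray(x0, dtype=float)
    hist = [np.linalg.norm(F(x))]
    for _ in range(max_iter):
        if hist[-1] <= tol:
            break
        d = np.linalg.solve(J(x), -F(x))      # (here: exact) Newton step
        t = 1.0
        ref = max(hist[-M:])                  # nonmonotone reference value
        while np.linalg.norm(F(x + t * d)) > (1.0 - gamma * t) * ref and t > 1e-12:
            t *= 0.5                          # backtracking
        x = x + t * d
        hist.append(np.linalg.norm(F(x)))
    return x
```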
- T.
Serafini, G. Zanghirati, L.
Zanni, Parallel Training of
Support Vector Machines.
Abstract.
We present a parallel approach for
the solution of quadratic
programming (QP) problems with box
constraints and a single linear
constraint, arising in the training
of Support Vector Machines. In this
kind of application the problem is
dense and generally large-sized (larger
than 10^4). An iterative
decomposition technique has been
presented in [T. Joachims, 1998],
which solves the problem by
splitting it into a sequence of very
small QP (inner) subproblems (generally
with size less than 10^2).
The approach proposed in [G.
Zanghirati, L. Zanni, 2003] follows
this decomposition idea, but it is
designed to split the whole problem
into QP subproblems of sufficiently
large size (> 10^3),
so that they can be efficiently
solved by special gradient
projection methods [T. Serafini, G.
Zanghirati, L. Zanni, 2003]. On
scalar architectures this new
technique achieves performance
comparable with that of the
algorithm in [T.
Joachims, 1998],
but it is much more suited for
parallel implementations. In fact,
parallel versions of the gradient
projection methods can be applied to
solve the large QP inner subproblems
and the other expensive tasks of
each decomposition iteration can be
easily distributed among the
available processors. We present
several improvements of this
approach and we evaluate their
effectiveness on large-scale
benchmark problems both in scalar
and parallel environments.
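The inner subproblems above are QPs with box constraints and a single linear constraint. A gradient projection step on that feasible set reduces to the projection below, computable by bisection on the multiplier of the linear constraint; this is a minimal sketch with our own names and a fixed steplength, not the special gradient projection solvers cited in the abstract.

```python
import numpy as np

def project_box_plane(z, a, b, lo, hi, tol=1e-12):
    """Projection of z onto {x : lo <= x <= hi, a^T x = b} (assumed
    nonempty): the projection equals clip(z - t*a, lo, hi) for the
    multiplier t enforcing the linear constraint, found by bisection
    (the constraint residual phi is nonincreasing in t)."""
    phi = lambda t: a @ np.clip(z - t * a, lo, hi) - b
    t_lo, t_hi = -1e6, 1e6
    while t_hi - t_lo > tol:
        t_mid = 0.5 * (t_lo + t_hi)
        if phi(t_mid) > 0.0:
            t_lo = t_mid
        else:
            t_hi = t_mid
    return np.clip(z - 0.5 * (t_lo + t_hi) * a, lo, hi)

def gp_qp(Q, c, a, b, lo, hi, iters=500):
    """Fixed-steplength gradient projection for
       min 0.5 x^T Q x + c^T x  s.t.  lo <= x <= hi,  a^T x = b."""
    x = project_box_plane(np.zeros_like(c), a, b, lo, hi)
    step = 1.0 / np.linalg.norm(Q, 2)     # 1/L, L = largest eigenvalue of Q
    for _ in range(iters):
        x = project_box_plane(x - step * (Q @ x + c), a, b, lo, hi)
    return x
```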
- M.
Gaviano, Complexity analysis
in minimization problems.
Abstract.
The investigation of the numerical
complexity of algorithms that
minimize functions from R^n
into R is a very difficult
issue. In recent years new results
have been found giving useful
guidelines for a better
understanding of the algorithms'
behaviour and even for improving
their performance. Many results
established for general or specific
minimization problems are reviewed.
The case of algorithms for global
minimization is also considered.
- D.
Lera, Stochastic Global
Optimization Methods.
Abstract. In
this seminar we will present some
stochastic techniques for solving
global optimization problems.
Stochastic methods are techniques
that contain some stochastic
elements. This means that either the
outcome of the method is itself a
random variable (i.e. algorithms in
which the placement of observations
is based on the generation of random
points in the domain of the
objective function), or we consider
the objective function to be a
realization of a stochastic process.
Excellent surveys on the subject are
in [B. Betrò, 1991; C.G.E. Boender,
H.E. Romeijn, 1995; F. Schoen 1991;
A.A. Torn, A. Zilinskas, 1989]. Here
we will discuss, in particular, the
so-called two-phase methods, i.e.
methods in which both a global phase
and a local phase can be
distinguished. In the global phase,
the function is evaluated at a
number of randomly sampled points.
During the local phase the sample
points are "manipulated"
in order to yield a candidate global
minimum. We will give a brief
presentation of clustering
techniques and finally we will show
random search methods and Simulated
Annealing algorithms.
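The two-phase structure described above can be sketched in a few lines: a global phase of uniform random sampling followed by a local phase of plain gradient descent from the most promising samples. The name, the one-dimensional setting, and the descent scheme are our own simplifications; clustering and Simulated Annealing variants are not shown.

```python
import numpy as np

def multistart(f, grad, lo, hi, n_samples=200, n_best=5, lr=0.01,
               steps=500, seed=0):
    """Two-phase global search (1D for simplicity): uniform random
    sampling of [lo, hi] (global phase), then projected gradient
    descent from the best samples (local phase)."""
    rng = np.random.default_rng(seed)
    pts = rng.uniform(lo, hi, size=n_samples)
    starts = sorted(pts, key=f)[:n_best]   # most promising sample points
    best = starts[0]
    for x in starts:
        for _ in range(steps):
            x = np.clip(x - lr * grad(x), lo, hi)   # local refinement
        if f(x) < f(best):
            best = x
    return best
```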
- D.E.
Kvasov, M. Gaviano, D. Lera,
Ya.D. Sergeyev, GKLS-Generator
for Testing Global Optimization
Algorithms.
Abstract.
Development of efficient global
optimization algorithms is
impossible without examination of
their behaviour on sets of
sophisticated test problems. The
lack of complete information (such
as number of local minima, their
locations, attraction regions, local
and global values, etc.) describing
global optimization tests taken from
real-life applications does not
allow them to be used for studying and
verifying the validity of the algorithms.
That is why a significant effort is
made to construct test functions
[C.A. Floudas et al, 1994; C.A.
Floudas et al., 1999; M.
Gaviano, D. Lera, 1998; J. Pintér,
2002]. In this communication, a
procedure for generating three types
(non-differentiable, continuously
differentiable, and twice
continuously differentiable) of
classes of multidimensional and
multiextremal test functions with
known local and global minima is
presented. The procedure consists of
defining a convex quadratic function
systematically distorted by
polynomials. Each test class
provided by the GKLS-generator
consists of 100 functions and is
defined by the following parameters:
(i) problem dimension, (ii) number
of local minima, (iii) global
minimum value, (iv) radius of the
attraction region of the global
minimizer, (v) distance from the
global minimizer to the quadratic
function vertex. The other necessary
parameters are chosen randomly by
the generator for each test function
of the class. A special notebook
with a complete description of all
functions is supplied to the user.
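The flavour of the construction can be conveyed by a much-simplified sketch: a convex quadratic with vertex t, distorted inside a ball of radius r around a chosen point m by a smooth polynomial cutoff. Unlike the real GKLS generator, which matches derivatives so that the global minimizer sits exactly at m with a prescribed value, this toy version only guarantees that the global minimizer lies inside the distortion ball; all names are ours.

```python
import numpy as np

def make_test_function(t, m, r, depth):
    """Toy GKLS-flavoured test function: the convex quadratic
    ||x - t||^2 is distorted inside the ball of radius r around m by
    subtracting a polynomial cutoff of height `depth`, which creates
    a known region containing the global minimizer when `depth` is
    large enough."""
    t = np.asarray(t, dtype=float)
    m = np.asarray(m, dtype=float)
    def f(x):
        x = np.asarray(x, dtype=float)
        d2 = np.sum((x - m) ** 2)
        bump = depth * max(0.0, 1.0 - d2 / r ** 2) ** 2  # zero outside ball
        return float(np.sum((x - t) ** 2) - bump)
    return f
```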
- L.
Zanni, Gradient Projection
Methods for Quadratic Programs.
Abstract.
Gradient projection methods for
quadratic programming problems with
simple constraints are considered.
Starting from the analysis of the
classical versions of these schemes,
some recent gradient projection
algorithms are introduced and the
importance of using appropriate
linesearch strategies and steplength
selection rules is stressed.
Linesearch techniques based on both
limited minimization rules and
nonmonotone strategies are
considered. For the steplength
selection, the Barzilai-Borwein
spectral rules are discussed and
some new suggestions are presented.
Numerical results showing the
behaviours of the proposed
approaches are reported.
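A minimal box-constrained instance of the schemes discussed, using the first Barzilai-Borwein rule for the steplength; the names, the safeguards, and the absence of a linesearch are our own simplifications of the approaches surveyed.

```python
import numpy as np

def bb_gradient_projection(Q, c, lo, hi, x0, iters=200):
    """Gradient projection for  min 0.5 x^T Q x + c^T x  on the box
    [lo, hi], with the first Barzilai-Borwein steplength
    alpha = (s^T s)/(s^T y),  s = x_k - x_{k-1},  y = g_k - g_{k-1},
    safeguarded into a fixed interval."""
    x = np.clip(np.asarray(x0, dtype=float), lo, hi)
    g = Q @ x + c
    alpha = 1.0
    for _ in range(iters):
        x_new = np.clip(x - alpha * g, lo, hi)   # projected gradient step
        g_new = Q @ x_new + c
        s, y = x_new - x, g_new - g
        sy = s @ y
        alpha = (s @ s) / sy if sy > 1e-12 else 1.0
        alpha = min(max(alpha, 1e-6), 1e6)       # safeguard the steplength
        x, g = x_new, g_new
    return x
```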
Communications
at the 4th International Conference
on Global Optimization,
Santorini (Greece),
June 8-12, 2003.
- Ya.D.
Sergeyev, Lipschitz Global
Optimization and Local Information,
Proc. of the 4th Inter. Conf. on
Frontiers in Global Optimization,
Aegean Conferences Series 10,
p. 21.
- D.E.
Kvasov, M. Gaviano, D. Lera,
Ya.D. Sergeyev, Generator of
Classes of Test Functions for Global
Optimization Algorithms, Proc. of
the 4th Inter. Conf. on Frontiers in
Global Optimization, Aegean
Conferences Series 10, p. 55.
- L.G.
Casado, J.A. Martinez, I. Garcia,
Ya.D. Sergeyev, B. Toth, Efficient
Use of Gradient Information in
Multidimensional Interval Global
Optimization Algorithms, Proc. of
the 4th Inter. Conf. on Frontiers in
Global Optimization, Aegean
Conferences Series 10, p. 61.
Other
talks at International Conferences
- G. Spaletta, M. Sofroniou, Solving
orthogonal matrix differential
systems in Mathematica, ICCS'02,
April 21-24, 2002, Amsterdam (The
Netherlands).
- M.
Sofroniou, G. Spaletta, P.C. Moan,
G.R.W. Quispel, A generalization
of Runge-Kutta methods, CSC
2002, Geneva University, Mathematics
Section, June 26-29, 2002, Geneva (Switzerland).
- T. Serafini, G. Zanghirati, L.
Zanni, Adaptive Steplength
Selections in Gradient Projection
Methods for QP, "NA03 -
20th Biennial Conference on
Numerical Analysis", June
24-27, 2003, Dundee (Scotland).
- T.
Serafini, G. Zanghirati, L. Zanni, Parallel
Decomposition Approaches for
Training SVMs, International
Conference "ParCo2003",
September 2-5, 2003, Dresden (Germany).
- G. Zanghirati, L. Zanni, Decomposition
Techniques in Training Support
Vector Machines: Inner QP Solvers
and Parallel Approaches,
International Workshop on "Mathematical
Diagnostics", June 17-26, 2002,
Erice (Italy).
- G.
Zanghirati, L. Zanni, Variable
Projection Methods for Quadratic
Programs in Training Support Vector
Machines with Gaussian Kernels,
International Conference "SIAM
Meeting on Optimization 2002",
May 20-22, 2002, Toronto (Canada).
National
Conferences
- G. Spaletta,
Some invariance theorems for one-step integration methods
(invited talk),
Structural Dynamical Systems: Computational Aspects (SDS 2006),
Monopoli (Bari, Italy), June 13-16, 2006.
- Ya.D. Sergeyev,
Infinity Computer and Calculus,
Proceedings of "SIMAI 2006 - VII Congress of the Italian Society for Applied
and Industrial Mathematics",
Baia Samuele (Ragusa, Italy), May 22-26, 2006,
Puccio L. et al., eds., University
of Messina, 2006, p. 230.
- G. Spaletta,
Concepts of Geometric Shape and Integration:
an Introduction to the Approximation of Experimental Data
(invited keynote lecture),
University of Parma, March 29, 2006.
- T. Serafini, G. Zanghirati, L. Zanni,
Numerical Topics on SVMs
Classification, workshop "ASTAA Project Meeting 2005",
Genoa (Italy), June 9-10, 2005.
- V. Ruggiero (invited), Optimization
and Parallel Computing, Workshop
honoring Alfonso Laratta, October
13, 2004, Modena (Italy).
- G. Spaletta (invited), M.
Sofroniou, The Matrix Exponential:
Efficient Computation and Error
Analysis, Congresso SIMAI VII,
September 20-24, 2004, Venice (Italy).
- T. Serafini, G. Zanghirati, L.
Zanni, Adaptive rules for
linesearch and steplength
selection in gradient projection
methods for nonlinear
optimization, Convegno GNCS,
February 9-11, 2004, Montecatini.
- E. Galligani, V. Ruggiero, S.
Bonettini, A Perturbed-Damped
Newton Method for Large-Scale
Constrained Optimization, National
Congress "Analisi Numerica:
Stato dell'Arte", September
2002, Rende (Cosenza).
- S.
Bonettini, E. Galligani, V.
Ruggiero, Analysis of the Interior-Point
Newton Method, XVII
Congresso UMI, September 8-13, 2003,
Milan.
- S.
Bonettini, E. Galligani, V.
Ruggiero, On the Newton
Interior-Point Method for Nonlinear
Programming, AIRO Conference,
September 2-5, 2003, Venice.
- G.
Zanghirati, L. Zanni, A
Parallel Solver for Large Quadratic
Programs in Training Support Vector
Machines, National Conference
"SIMAI 2002", May 26-31,
2002, Chia (Cagliari).
- G.
Zanghirati, L. Zanni, Decomposition
Techniques for Large Quadratic
Programs in Training Support Vector
Machines, International
Conference "APMOD 2002",
June 17-21, 2002, Milan.
- T.
Serafini, G. Zanghirati, L. Zanni, Convergence
Acceleration in Gradient Projection
Methods for Quadratic
Programming Problems, XVII
Congresso UMI, September 8-13, 2003,
Milan.
- T.
Serafini, G. Zanghirati, L. Zanni, Variable
Projection Decomposition Techniques
for Large-Scale Support Vector
Machines, National Congress
"Analisi Numerica: Stato
dell'Arte", September 2002,
Rende (Cosenza).
- T.
Serafini, G. Zanghirati, L. Zanni, Steplength
Selections in Gradient Projection
Methods for Large-Scale Quadratic
Programs, AIRO
Conference, September 2-5, 2003,
Venice.
- G. Spaletta, Symplectic
Elementary Differential Runge-Kutta
Methods, June 22-25, 2003,
Monopoli (Bari).
- Y. Sergeyev, Infinity in
mathematics, physics and philosophy,
Pisa, Italy. March 26, 2004.
- Y. Sergeyev, First Workshop of DEIS,
Cetraro (CS), Italy, July 6-8, 2004.
Summer
Schools
- Numerical Methods for Evolution Equations,
Prof. A. Ostermann (Innsbruck
University), Prof. J.G. Verwer (CWI,
The Netherlands),
Dobbiaco (Italy), June 28 - July 2,
2004.