Conference Communications

International Conferences
 Ya.D. Sergeyev, Infinity Computer and Calculus (invited plenary lecture), in Proc. of the 3rd International Conference of Applied Mathematics TICAM, Plovdiv (Bulgaria), August 12-18, 2006, Bainov D. and Nenov S., eds., University of Plovdiv Press, Vol. 2, pp. 246-247.
 Ya.D. Sergeyev, Infinity Computer and Calculus (invited plenary lecture), in Proc. of the Conference on Applied Optimization and Metaheuristic Innovations, Yalta (Ukraine), July 19-21, 2006, Sergienko I.V., ed., pp. 40-41.
 G. Zanghirati, L. Zanni, Some properties of gradient-based methods with application to machine learning (invited talk), EuroXXI - 21st European Conference on Operational Research, Reykjavik (Iceland), July 2-5, 2006.
 G. Zanghirati, L. Zanni, Large-scale Support Vector Machines: Decomposition and Cascade Approaches (invited talk), EuroXXI - 21st European Conference on Operational Research, Reykjavik (Iceland), July 2-5, 2006.
 G. Spaletta, Modelling the Thyroid Geometry, 8th International Mathematica Symposium, Palais des Papes, Avignon (France), June 19-23, 2006.
 M. Sofroniou, Mathematica's Numerics (series of invited talks), 8th International Mathematica Symposium, Palais des Papes, Avignon (France), June 19-23, 2006.
 G. Spaletta, M. Sofroniou, Matrix Polynomials and the Matrix Exponential (invited talk), Castellon Conference on Geometric Integration, University Jaume I, September 18-22, 2006.
 E. Galligani, V. Ruggiero, G. Zanghirati, Splitting methods for nonlinear diffusion filtering, 3rd IASC World Conference on "Computational Statistics and Data Analysis", Limassol (Cyprus), October 28-31, 2005.
 T. Serafini, G. Zanghirati, L. Zanni, Gradient Projection-Type Quadratic Solvers in Parallel Decomposition Techniques for Support Vector Machines, 3rd IASC World Conference on "Computational Statistics and Data Analysis", Limassol (Cyprus), October 28-31, 2005.
 T. Serafini, G. Zanghirati, L. Zanni, Some Improvements to a Parallel Decomposition Technique for Training Support Vector Machines, EURO PVM/MPI 2005, 12th European Parallel Virtual Machine and Message Passing Interface Conference, Sorrento (Naples, Italy), September 18-21, 2005.
 T. Serafini, G. Zanghirati, L. Zanni, Decomposition techniques and gradient projection methods in training Support Vector Machines, SIAM Conference on Optimization 2005, Stockholm (Sweden), May 15-19, 2005.
 T. Serafini, G. Zanghirati, L. Zanni, On gradient projection-based decomposition techniques for training SVMs on parallel architectures, PASCAL Workshop, Thurnau (Germany), March 16-18, 2005.
 Ya.D. Sergeyev, D.E. Kvasov, Diagonal global search based on a set of possible Lipschitz constants, in Proc. of the International Workshop on Global Optimization GO05, Almería (Spain), September 18-22, 2005, pp. 219-224.
 Ya.D. Sergeyev, Global search based on efficient diagonal partitions (invited plenary lecture), International Conference on Complementarity, Duality, and Global Optimization in Science and Engineering CDGO2005, Blacksburg (Virginia, USA), August 15-17, 2005.
 V.A. Grishagin, Ya.D. Sergeyev, Parallel algorithms for multidimensional global optimization with nonconvex constraints, in Proc. of the 5th International Workshop "High-Performance Computing on Clusters", Nizhni Novgorod "Lobachevsky" University, Nizhni Novgorod (Russia), November 22-25, 2005, R.G. Strongin, ed., NNGU Press, pp. 74-83.
 F. Zama, Image Processing in FEMLAB using Diffusion Filters, FEMLAB Conference 2005, Stockholm (Sweden), October 3-5, 2005.
 Ya.D. Sergeyev, Infinity Computer: Principles of work and applications (invited plenary lecture), in Proc. of the International Conference on Selected Problems of Modern Mathematics, Ishanov S.I., ed., Kaliningrad (Russia), April 4-8, 2005, pp. 211-213.
 G. Landi, The Lagrange Method for the Regularization of Discrete Ill-Posed Problems, 22nd International Federation for Information Processing (IFIP) TC7 Conference on System Modeling and Optimization, Turin (Italy), July 18-22, 2005.
 V.A. Grishagin, Ya.D. Sergeyev (2004), Efficiency of parallelization of characteristic global optimization algorithms in the framework of the nested optimization scheme, Proceedings of the 4th International Workshop "Parallel Computations on Clusters", Samara, Russia, September 30 - October 2, 2004, Soyfer V., ed., RC RAN Press, pp. 70-74.
 Ya.D. Sergeyev (2004), A new computational paradigm: Mathematical model and applications, Proceedings of the VIth International Congress on "Mathematical Modeling", Nizhni Novgorod State University, Nizhni Novgorod, Russia, September 20-26, p. 27.
 Ya.D. Sergeyev, D.E. Kvasov (2004), Lipschitzian global optimization without the Lipschitz constant based on adaptive diagonal curves, Proceedings of the VIth International Congress on "Mathematical Modeling", Nizhni Novgorod State University, Nizhni Novgorod, Russia, September 20-26, p. 121.
 M. Sofroniou, G. Spaletta (invited), The Matrix Exponential: Efficient Computation and Error Analysis, SIMAI XIV, September 20-24, 2004, Venice (Italy).
 G. Spaletta (invited), Numerical Assessments in the Work of Vito Volterra, ICMMB XIV, September 16-18, 2004, Bologna (Italy).
 E. Loli Piccolomini, A descent method for computing the Tikhonov regularized solution of linear inverse problems, SPIE Annual Meeting "Optical Science and Technology", Denver, Colorado (USA), August 2-6, 2004.
 G. Landi, A Total Variation based Regularization strategy in Magnetic Resonance Imaging, SPIE Annual Meeting "Optical Science and Technology", Denver, Colorado (USA), August 2-6, 2004.
 Y. Sergeyev, Global optimization methods and classes of test functions, First International Conference on Continuous Optimization ICCOPT I, Troy (NY), USA, August 2-4, 2004.
 Y. Sergeyev, Algorithms and partition strategies for Lipschitzian global optimization, 40th Workshop "Large Scale Nonlinear Optimization", Erice (TP), Italy, June 22 - July 1, 2004.
 T. Serafini, G. Zanghirati, L. Zanni, Recent improvements to gradient projection-based decomposition techniques for Support Vector Machines, International Conference "MML2004 - Mathematical Methods for Learning", June 21-24, 2004, Como (Italy).
 G. Spaletta, M. Sofroniou, Solving Linear Systems Accurately, Workshop on Dynamical Systems on Matrix Manifolds, May 27-28, 2004, Bari (Italy).
 S. Bonettini, V. Ruggiero, E. Galligani, Some iterative methods for the solution of a reduced symmetric indefinite KKT system, International Conference "HPSNO2004 - High Performance Algorithms and Software for Nonlinear Optimization", May 18-20, 2004, Ischia (Naples, Italy).
 T. Serafini, G. Zanghirati, L. Zanni, A Gradient Projection-based Decomposition Software for Large Quadratic Programs in Training Support Vector Machines, International Conference "HPSNO2004 - High Performance Algorithms and Software for Nonlinear Optimization", May 18-20, 2004, Ischia (Naples, Italy).
 S. Bonettini, V. Ruggiero, E. Galligani, Interior point method as inexact Newton method for KKT systems, CORS/INFORMS Joint International Meeting, May 16-19, 2004, Banff, Alberta (Canada).
 G. Spaletta, M. Sofroniou, Efficient Matrix Polynomial Computation and Application to the Matrix Exponential, IWTAM II, April 13, 2004, Montecatini (Italy).
 V. Ruggiero (invited), Interior point method as inexact Newton method for KKT systems, Second International Workshop on the Technological Aspects of Mathematics, April 13, 2004, Montecatini (Italy).

Communications at the Scientific Meeting "Numerical Methods for Local and Global Optimization: Sequential and Parallel Algorithms", INdAM Conference, Cortona (Italy), July 14-20, 2003.
 G. Landi, E. Loli Piccolomini (2003), The Total Variation Regularization in Dynamic MR Imaging.
Abstract. The Total Variation (TV) regularization method, proposed by Rudin, Osher and Fatemi in [Physica D., 1992], has recently become a very popular technique in image restoration problems. The motivation for its success is that TV regularization performs well for denoising and deblurring while preserving sharp discontinuities. The good performance of the TV model makes it particularly attractive for biomedical imaging. We study the use of the TV regularization technique in the reconstruction of dynamic Magnetic Resonance images [Z.P. Liang, P.C. Lauterbur, 1994]. Several iterative schemes have been proposed in the literature for solving the TV minimization problem. We compare the performance of a Newton method and a fixed-point method [C.R. Vogel, M.E. Oman, 1996] on both test problems and real MR data.
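The lagged-diffusivity fixed-point iteration of Vogel and Oman, cited in the abstract, can be sketched in one dimension as follows (a minimal illustration, not the authors' code; the function name and parameter defaults are our own assumptions):

```python
import numpy as np

def tv_denoise_1d(f, lam=0.2, beta=1e-6, iters=50):
    """Lagged-diffusivity fixed-point iteration for 1D TV denoising:
    min_u 0.5*||u - f||^2 + lam * sum_i sqrt((u_{i+1}-u_i)^2 + beta),
    where beta is a small smoothing parameter for the TV term."""
    n = f.size
    u = f.astype(float).copy()
    for _ in range(iters):
        du = np.diff(u)                    # forward differences of current iterate
        w = lam / np.sqrt(du ** 2 + beta)  # "lagged" diffusion coefficients
        # assemble the tridiagonal system (I + D^T W D) u = f and solve it
        main = np.ones(n)
        main[:-1] += w
        main[1:] += w
        A = np.diag(main) + np.diag(-w, 1) + np.diag(-w, -1)
        u = np.linalg.solve(A, f)
    return u
```

Each sweep freezes the nonlinear diffusion coefficients at the previous iterate, so only a linear (here tridiagonal) system is solved per iteration; this is what distinguishes the fixed-point scheme from a full Newton step on the TV functional.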
 F. Zama, E. Loli Piccolomini, Truncated Conjugate Gradient Iterations for Solving Ill-Posed Problems.
Abstract. A large variety of applications naturally gives rise to ill-posed problems, whenever the underlying physical or technical problem is modelled by an integral equation of the first kind with a smooth kernel. The data usually stem from measurements with a limited precision, i.e., only perturbed data are available. The inverse problem is ill-posed and requires regularization methods. In this work we describe an iterative algorithm for finding the solution and the regularization parameter. Truncated Conjugate Gradient iterations are used for computing the solution, while the value of the regularization parameter is determined so as to obtain decreasing values of the objective functional. We develop a stopping criterion for the CG iterates which is linked to the noise level and the current value of the regularization parameter.
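The regularizing effect of truncation can be illustrated with a small CGLS-type routine (a sketch under our own naming; for simplicity it uses a plain discrepancy-style stopping rule based only on the noise level, not the authors' criterion involving the regularization parameter):

```python
import numpy as np

def cgls(A, b, noise_level, max_iter=100):
    """Conjugate gradient on the normal equations A^T A x = A^T b,
    truncated early: iteration stops as soon as ||A x - b|| drops to
    the estimated noise level, so the iteration count itself acts as
    the regularization parameter."""
    x = np.zeros(A.shape[1])
    r = b.astype(float).copy()
    s = A.T @ r
    p = s.copy()
    gamma = s @ s
    for _ in range(max_iter):
        q = A @ p
        denom = q @ q
        if denom == 0.0:          # search direction exhausted
            break
        alpha = gamma / denom
        x += alpha * p
        r -= alpha * q
        if np.linalg.norm(r) <= noise_level:   # truncation = regularization
            break
        s = A.T @ r
        gamma_new = s @ s
        p = s + (gamma_new / gamma) * p
        gamma = gamma_new
    return x
```

Stopping when the residual reaches the noise level prevents the later CG iterations from fitting the data error, which is exactly the behaviour the abstract's stopping criterion is designed to control.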
 G. Spaletta, M. Sofroniou, Symmetric Composition of Symmetric Numerical Integration Methods.
Abstract. This work focuses on the derivation of composition methods for the numerical integration of ordinary differential equations. In contrast to the Aitken-Neville algorithm used in extrapolation, composition can conserve desirable geometric properties of a base integration method, such as symplecticity [E. Hairer, Ch. Lubich, G. Wanner, 2002]. We survey existing composition methods and describe results of a numerical search for new methods [R.I. McLachlan, G.R.W. Quispel, 2002; H. Yoshida, 1990]. The optimization procedure that has been adopted [P.E. Gill, W. Murray, M.H. Wright, 1984] can be very intensive, especially when deriving high-order composition schemes. To overcome this, we make use of parallel computation. Numerical examples indicate that the new methods perform better than previously known methods.
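The simplest concrete instance of symmetric composition is Yoshida's "triple jump", which lifts the order-2 Störmer-Verlet scheme to order 4 (an illustrative sketch with our own function names; the coefficients are the classical ones from [H. Yoshida, 1990], not the new methods of the talk):

```python
def verlet(q, p, h, force):
    """One step of the Stormer-Verlet scheme (symmetric, order 2)."""
    p = p + 0.5 * h * force(q)
    q = q + h * p
    p = p + 0.5 * h * force(q)
    return q, p

def yoshida4(q, p, h, force):
    """Symmetric 'triple jump' composition of Verlet steps with
    Yoshida's coefficients, raising the order from 2 to 4 while
    keeping symmetry and symplecticity of the base method."""
    w1 = 1.0 / (2.0 - 2.0 ** (1.0 / 3.0))
    w0 = 1.0 - 2.0 * w1          # note w0 < 0: one substep goes backwards
    for w in (w1, w0, w1):
        q, p = verlet(q, p, w * h, force)
    return q, p
```

Because the three substeps are arranged palindromically, the composed map is again symmetric, so the same trick can be nested to reach order 6, 8, and beyond; the numerical search described in the abstract looks for coefficient sets that do this with smaller error constants.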
 F. Tinti, Numerical Solution of Pseudomonotone Variational Inequality Problems by Extragradient Methods.
Abstract. In this work we analyze, from the numerical point of view, the class of projection methods for solving variational inequality problems. We focus on some specific extragradient methods that make use of different choices of the steplength. We then analyze hyperplane projection methods, in which an appropriate hyperplane is constructed that strictly separates the current iterate from the solutions of the problem. Finally, we include a collection of test problems implemented in Matlab to illustrate the effectiveness of the proposed methods.
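The basic extragradient scheme (Korpelevich's predictor-corrector iteration) behind the methods discussed above can be sketched as follows — a minimal fixed-steplength version in Python rather than the talk's Matlab, with names and parameters of our own choosing:

```python
import numpy as np

def extragradient(F, project, x0, tau=0.5, iters=500):
    """Korpelevich's extragradient method for the variational inequality:
    find x* in C such that <F(x*), x - x*> >= 0 for all x in C.
    `project` must be the Euclidean projection onto C; tau is a fixed
    steplength (the methods in the talk vary it adaptively)."""
    x = x0.astype(float).copy()
    for _ in range(iters):
        y = project(x - tau * F(x))    # predictor: the "extra" gradient step
        x = project(x - tau * F(y))    # corrector: re-step from x using F(y)
    return x
```

The extra evaluation of F at the predicted point y is what allows convergence under mere (pseudo)monotonicity, where a plain projection iteration can cycle.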
 S. Bonettini, A Nonmonotone Inexact Newton Method.
Abstract. In this work we describe a variant of the inexact Newton method for solving nonlinear systems of equations. We define a nonmonotone inexact Newton step and a nonmonotone backtracking strategy, and we present convergence theorems for this nonmonotone scheme. Finally, we show how these strategies can be applied to the inexact Newton interior-point method, and we present some numerical examples.
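The nonmonotone acceptance idea can be sketched as follows — a simplified illustration of our own (Grippo-Lampariello-Lucidi-style reference value; the Newton system is solved exactly here, whereas the talk's method solves it inexactly to a forcing tolerance):

```python
import numpy as np

def nonmonotone_newton(F, J, x0, M=5, tol=1e-8, max_iter=50):
    """Newton's method for F(x) = 0 with a nonmonotone backtracking
    line search: a trial step is accepted when the residual norm falls
    below the maximum residual over the last M iterates, instead of
    requiring a decrease at every single iteration."""
    x = x0.astype(float)
    hist = [np.linalg.norm(F(x))]
    for _ in range(max_iter):
        if hist[-1] <= tol:
            break
        d = np.linalg.solve(J(x), -F(x))   # exact Newton direction (sketch)
        ref = max(hist[-M:])               # nonmonotone reference value
        alpha, sigma = 1.0, 1e-4
        while np.linalg.norm(F(x + alpha * d)) > (1.0 - sigma * alpha) * ref:
            alpha *= 0.5                   # backtrack against the reference
        x = x + alpha * d
        hist.append(np.linalg.norm(F(x)))
    return x
```

Comparing against the worst of the last M residuals lets the iteration accept full Newton steps that temporarily increase the residual, which often avoids the slow creeping behaviour of strictly monotone line searches.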
 T. Serafini, G. Zanghirati, L. Zanni, Parallel Training of Support Vector Machines.
Abstract. We present a parallel approach for the solution of quadratic programming (QP) problems with box constraints and a single linear constraint, arising in the training of Support Vector Machines. In this kind of application the problem is dense and generally large-sized (larger than 10^4). An iterative decomposition technique has been presented in [T. Joachims, 1998], which solves the problem by splitting it into a sequence of very small (inner) QP subproblems, generally of size less than 10^2. The approach proposed in [G. Zanghirati, L. Zanni, 2003] follows this decomposition idea, but it is designed to split the whole problem into QP subproblems of sufficiently large size (> 10^3), so that they can be efficiently solved by special gradient projection methods [T. Serafini, G. Zanghirati, L. Zanni, 2003]. On scalar architectures this new technique achieves performance comparable to that of the algorithm in [T. Joachims, 1998], but it is much better suited to parallel implementation: parallel versions of the gradient projection methods can be applied to solve the large inner QP subproblems, and the other expensive tasks of each decomposition iteration can be easily distributed among the available processors. We present several improvements of this approach and we evaluate their effectiveness on large-scale benchmark problems in both scalar and parallel environments.
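The structure exploited here — a quadratic over a box intersected with a single linear constraint — is what makes gradient projection attractive: projecting onto the feasible set reduces to a one-dimensional search for the multiplier of the equality constraint. A minimal single-process sketch of our own (fixed steplength; the cited solvers use adaptive steplengths, linesearches, and parallel kernel evaluations):

```python
import numpy as np

def project(z, a, b, C, tol=1e-10):
    """Euclidean projection onto the SVM feasible set
    {x : 0 <= x <= C, a^T x = b}, with labels a_i in {-1, +1},
    by bisection on the multiplier of the equality constraint."""
    def x_of(lam):
        return np.clip(z - lam * a, 0.0, C)
    lo, hi = -1e3, 1e3
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if a @ x_of(mid) > b:
            lo = mid          # a^T x_of(lam) is nonincreasing in lam
        else:
            hi = mid
    return x_of(0.5 * (lo + hi))

def gradient_projection_qp(Q, c, a, b, C, tau=0.5, iters=200):
    """Fixed-steplength gradient projection for the dense subproblem
    min 0.5 x'Qx + c'x over the set handled by `project` above."""
    x = project(np.zeros_like(c), a, b, C)
    for _ in range(iters):
        x = project(x - tau * (Q @ x + c), a, b, C)
    return x
```

In the decomposition setting, `Q` would be the kernel submatrix of the current working set, so each outer iteration solves one such medium-sized subproblem while the remaining variables stay fixed.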
 M. Gaviano, Complexity analysis in minimization problems.
Abstract. The investigation of the numerical complexity of algorithms that minimize functions from R^n into R is a very difficult issue. In recent years new results have been found that give useful guidelines for a better understanding of algorithm behaviour and even for improving performance. Many results established for general or specific minimization problems are reviewed. The case of algorithms for global minimization is also considered.
 D. Lera, Stochastic Global Optimization Methods.
Abstract. In this seminar we present some stochastic techniques for solving global optimization problems. Stochastic methods are techniques that contain some stochastic elements: either the outcome of the method is itself a random variable (i.e., algorithms in which the placement of observations is based on the generation of random points in the domain of the objective function), or the objective function itself is considered to be a realization of a stochastic process. Excellent surveys on the subject are in [B. Betrò, 1991; C.G.E. Boender, H.E. Romeijn, 1995; F. Schoen, 1991; A.A. Törn, A. Žilinskas, 1989]. Here we discuss, in particular, the so-called two-phase methods, i.e., methods in which both a global phase and a local phase can be distinguished. In the global phase, the function is evaluated at a number of randomly sampled points. During the local phase the sample points are "manipulated" in order to yield a candidate global minimum. We give a brief presentation of clustering techniques, and finally we show random search methods and Simulated Annealing algorithms.
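A toy version of the two-phase scheme — random global sampling followed by local refinement of the most promising points — can be sketched as follows (a multistart-flavoured illustration with our own naming, using plain projected gradient descent as the local phase instead of the clustering techniques of the talk):

```python
import numpy as np

def two_phase_minimize(f, grad, lo, hi, n_samples=50, n_starts=5, seed=0):
    """Two-phase stochastic global optimization (multistart flavour):
    global phase = uniform random sampling of [lo, hi]; local phase =
    gradient descent started from the best sampled points."""
    rng = np.random.default_rng(seed)
    pts = rng.uniform(lo, hi, n_samples)          # global phase
    best_pts = pts[np.argsort([f(x) for x in pts])][:n_starts]

    def local_descent(x, step=1e-2, iters=2000):  # local phase
        for _ in range(iters):
            x = np.clip(x - step * grad(x), lo, hi)
        return x

    candidates = [local_descent(x) for x in best_pts]
    return min(candidates, key=f)                 # candidate global minimum
```

The split is visible in the code: the random sample explores the whole domain, while the deterministic descent only polishes; clustering methods refine this by starting at most one local search per basin of attraction.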
 D.E. Kvasov, M. Gaviano, D. Lera, Ya.D. Sergeyev, GKLS Generator for Testing Global Optimization Algorithms.
Abstract. Development of efficient global optimization algorithms is impossible without examination of their behaviour on sets of sophisticated test problems. The lack of complete information (such as the number of local minima, their locations, attraction regions, local and global values, etc.) describing global optimization tests taken from real-life applications does not allow one to use them for studying and verifying the validity of the algorithms. That is why a significant effort is made to construct test functions [C.A. Floudas et al., 1994; C.A. Floudas et al., 1999; M. Gaviano, D. Lera, 1998; J. Pintér, 2002]. In this communication, a procedure is presented for generating three types of classes (non-differentiable, continuously differentiable, and twice continuously differentiable) of multidimensional and multiextremal test functions with known local and global minima. The procedure consists of defining a convex quadratic function systematically distorted by polynomials. Each test class provided by the GKLS generator consists of 100 functions and is defined by the following parameters: (i) problem dimension, (ii) number of local minima, (iii) global minimum value, (iv) radius of the attraction region of the global minimizer, (v) distance from the global minimizer to the vertex of the quadratic function. The other necessary parameters are chosen randomly by the generator for each test function of the class. A special notebook with a complete description of all functions is supplied to the user.
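The distorted-quadratic idea can be conveyed by a drastically simplified one-dimensional construction (our own sketch, not the actual GKLS generator: a single distortion ball, with parameters mirroring (iii)-(v) above; like the ND class, the result can lose differentiability at the ball boundary):

```python
def make_test_function(T, t_star, M, r, f_star):
    """Simplified GKLS-style test function in 1D: a convex quadratic
    g(x) = (x - T)^2 + t_star is distorted by a polynomial inside the
    interval |x - M| < r, so that M becomes the global minimizer with
    the *known* value f_star < t_star, while g is untouched outside."""
    assert f_star < t_star and abs(M - T) > r

    def g(x):
        return (x - T) ** 2 + t_star

    def f(x):
        d = abs(x - M)
        if d >= r:
            return g(x)
        # inside the ball: quartic distortion, equal to g on the boundary,
        # nonnegative gap (d/r)^2 * (g - f_star) guarantees min = f_star at M
        return f_star + (d / r) ** 2 * (g(x) - f_star)

    return f
```

Since g >= t_star > f_star everywhere, the distorted part never dips below f_star except at M itself, so the global minimum location and value are known by construction — exactly the property that makes generated classes usable for systematic benchmarking.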
 L. Zanni, Gradient Projection Methods for Quadratic Programs.
Abstract. Gradient projection methods for quadratic programming problems with simple constraints are considered. Starting from the analysis of the classical versions of these schemes, some recent gradient projection algorithms are introduced and the importance of using appropriate linesearch strategies and steplength selection rules is stressed. Linesearch techniques based on both limited minimization rules and nonmonotone strategies are considered. For the steplength selection, the Barzilai-Borwein spectral rules are discussed and some new suggestions are presented. Numerical results showing the behaviour of the proposed approaches are reported.
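The flavour of spectral steplength selection can be conveyed by a bare-bones projected-gradient loop with the first Barzilai-Borwein rule (a sketch with our own naming and a box-constrained problem; the linesearch safeguards discussed in the talk are reduced here to simple clamping of the steplength):

```python
import numpy as np

def bb_gradient_projection(Q, c, lo, hi, x0, iters=200):
    """Gradient projection for min 0.5 x'Qx + c'x s.t. lo <= x <= hi,
    with the first Barzilai-Borwein spectral steplength
    alpha = (s's)/(s'y), clamped into [1e-6, 1e6] as a safeguard."""
    x = np.clip(x0.astype(float), lo, hi)
    g = Q @ x + c
    alpha = 1.0
    for _ in range(iters):
        x_new = np.clip(x - alpha * g, lo, hi)   # projected gradient step
        g_new = Q @ x_new + c
        s, y = x_new - x, g_new - g
        sy = s @ y
        # BB1 rule: approximates the inverse of an "average" curvature
        alpha = (s @ s) / sy if sy > 1e-12 else 1.0
        alpha = min(max(alpha, 1e-6), 1e6)
        x, g = x_new, g_new
    return x
```

The resulting iteration is typically nonmonotone in the objective, which is precisely why the nonmonotone linesearch strategies mentioned in the abstract pair naturally with spectral steplengths.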
Communications at the 4th International Conference "Frontiers in Global Optimization", Santorini (Greece), June 8-12, 2003.
 Ya.D. Sergeyev, Lipschitz Global Optimization and Local Information, Proc. of the 4th Inter. Conf. on Frontiers in Global Optimization, Aegean Conferences Series 10, p. 21.
 D.E. Kvasov, M. Gaviano, D. Lera, Ya.D. Sergeyev, Generator of Classes of Test Functions for Global Optimization Algorithms, Proc. of the 4th Inter. Conf. on Frontiers in Global Optimization, Aegean Conferences Series 10, p. 55.
 L.G. Casado, J.A. Martinez, I. Garcia, Ya.D. Sergeyev, B. Toth, Efficient Use of Gradient Information in Multidimensional Interval Global Optimization Algorithms, Proc. of the 4th Inter. Conf. on Frontiers in Global Optimization, Aegean Conferences Series 10, p. 61.
Other talks at International Conferences
 G. Spaletta, M. Sofroniou, Solving orthogonal matrix differential systems in Mathematica, ICCS'02, April 21-24, 2002, Amsterdam (The Netherlands).
 M. Sofroniou, G. Spaletta, P.C. Moan, G.R.W. Quispel, A generalization of Runge-Kutta methods, CSC 2002, Geneva University, Mathematics Section, June 26-29, 2002, Geneva (Switzerland).
 T. Serafini, G. Zanghirati, L. Zanni, Adaptive Steplength Selections in Gradient Projection Methods for QP, "NA03 - 20th Biennial Conference on Numerical Analysis", June 24-27, 2003, Dundee (Scotland).
 T. Serafini, G. Zanghirati, L. Zanni, Parallel Decomposition Approaches for Training SVMs, International Conference "ParCo2003", September 2-5, 2003, Dresden (Germany).
 G. Zanghirati, L. Zanni, Decomposition Techniques in Training Support Vector Machines: Inner QP Solvers and Parallel Approaches, International Workshop on "Mathematical Diagnostics", June 17-26, 2002, Erice (Italy).
 G. Zanghirati, L. Zanni, Variable Projection Methods for Quadratic Programs in Training Support Vector Machines with Gaussian Kernels, International Conference "SIAM Meeting on Optimization 2002", May 20-22, 2002, Toronto (Canada).
National Conferences
 G. Spaletta, Some invariance theorems for one-step integration methods (invited talk), Structural Dynamical Systems: Computational Aspects (SDS 2006), Monopoli (Bari, Italy), June 13-16, 2006.
 Ya.D. Sergeyev, Infinity Computer and Calculus, in Proceedings of "SIMAI 2006 - VII Congress of the Italian Society for Applied and Industrial Mathematics", Baia Samuele (Ragusa, Italy), May 22-26, 2006, Puccio L. et al., eds., Università degli Studi di Messina, 2006, p. 230.
 G. Spaletta, Concetti di Forma Geometrica ed Integrazione: introduzione alla Approssimazione di Dati Sperimentali (invited keynote lecture), Università di Parma, March 29, 2006.
 T. Serafini, G. Zanghirati, L. Zanni, Numerical Topics on SVMs Classification, workshop "ASTAA Project Meeting 2005", Genoa (Italy), June 9-10, 2005.
 V. Ruggiero (invited), Ottimizzazione e Calcolo Parallelo, Workshop honoring Alfonso Laratta, October 13, 2004, Modena (Italy).
 G. Spaletta (invited), M. Sofroniou, The Matrix Exponential: Efficient Computation and Error Analysis, Congresso SIMAI VII, September 20-24, 2004, Venice (Italy).
 T. Serafini, G. Zanghirati, L. Zanni, Regole adattative per linesearch e selezione del passo in metodi del gradiente proiettato per l'ottimizzazione non lineare, Convegno GNCS, February 9-11, 2004, Montecatini (Italy).
 E. Galligani, V. Ruggiero, S. Bonettini, A Perturbed-Damped Newton Method for Large-Scale Constrained Optimization, National Congress "Analisi Numerica: Stato dell'Arte", September 2002, Rende (Cosenza, Italy).
 S. Bonettini, E. Galligani, V. Ruggiero, Analisi del Metodo di Newton del Punto Interno, XVII Congresso UMI, September 8-13, 2003, Milan (Italy).
 S. Bonettini, E. Galligani, V. Ruggiero, On the Newton Interior-Point Method for Nonlinear Programming, AIRO Conference, September 2-5, 2003, Venice (Italy).
 G. Zanghirati, L. Zanni, A Parallel Solver for Large Quadratic Programs in Training Support Vector Machines, National Conference "SIMAI 2002", May 26-31, 2002, Chia (Cagliari, Italy).
 G. Zanghirati, L. Zanni, Decomposition Techniques for Large Quadratic Programs in Training Support Vector Machines, International Conference "APMOD 2002", June 17-21, 2002, Milan (Italy).
 T. Serafini, G. Zanghirati, L. Zanni, Accelerazione della Convergenza in metodi del Gradiente Proiettato per Problemi di Programmazione Quadratica, XVII Congresso UMI, September 8-13, 2003, Milan (Italy).
 T. Serafini, G. Zanghirati, L. Zanni, Variable Projection Decomposition Techniques for Large-Scale Support Vector Machines, National Congress "Analisi Numerica: Stato dell'Arte", September 2002, Rende (Cosenza, Italy).
 T. Serafini, G. Zanghirati, L. Zanni, Steplength Selections in Gradient Projection Methods for Large-Scale Quadratic Programs, AIRO Conference, September 2-5, 2003, Venice (Italy).
 G. Spaletta, Symplectic Elementary Differential Runge-Kutta Methods, June 22-25, 2003, Monopoli (Bari, Italy).
 Y. Sergeyev, L'infinito in matematica, fisica e filosofia, Pisa (Italy), March 26, 2004.
 Y. Sergeyev, First Workshop of DEIS, Cetraro (CS), Italy, July 6-8, 2004.
Summer Schools
 Metodi Numerici per Equazioni di Evoluzione, Prof. A. Ostermann (Innsbruck University), Prof. J.G. Verwer (CWI, The Netherlands), Dobbiaco (Italy), June 28 - July 2, 2004.