Optimization Seminars 2010-2
The seminars of the Optimization Group of IME, UFG, in the second semester of 2010 will take place on Fridays, from 14:00 to 15:30. Venue: IME, room 110.
Access here the seminars held in the first semester of 2010.
Date | Speaker | Title

20 August | José Yunier Bello Cruz | A strongly convergent direct method for monotone variational inequalities in Hilbert spaces
Abstract: We introduce a two-step direct method, like Korpelevich's, for solving monotone variational inequalities. The advantage of our method over the former is that it converges strongly in Hilbert spaces, while only weak convergence has been proved for Korpelevich's algorithm. Our method also has the following desirable property: the sequence converges to the solution of the problem which lies closest to the initial iterate.
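As context, here is a minimal Python sketch of the classical Korpelevich (extragradient) iteration that the talk's method is compared against; the operator F, the ball constraint, the step size, and the iteration count are illustrative assumptions, and the additional steps that yield strong convergence are not reproduced here.

```python
import numpy as np

# Illustrative only: classical extragradient step for the variational inequality
# VI(F, C): find x* in C with <F(x*), x - x*> >= 0 for all x in C.

def project_ball(x, radius=1.0):
    """Euclidean projection onto the ball of the given radius (our choice of C)."""
    norm = np.linalg.norm(x)
    return x if norm <= radius else radius * x / norm

def extragradient(F, x0, step=0.1, iters=200, proj=project_ball):
    """Two-step iteration: predictor y_k, then corrector x_{k+1} using F at y_k."""
    x = x0
    for _ in range(iters):
        y = proj(x - step * F(x))   # predictor step
        x = proj(x - step * F(y))   # corrector step
    return x

# Toy monotone operator F(x) = A x + b (symmetric part of A is positive semidefinite).
A = np.array([[2.0, 1.0], [-1.0, 2.0]])
b = np.array([-1.0, 0.5])
print(extragradient(lambda x: A @ x + b, x0=np.zeros(2)))
```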
27 August | José Yunier Bello Cruz | A strongly convergent method for nonsmooth convex minimization in Hilbert spaces
Abstract: In this paper we propose a strongly convergent variant of the projected subgradient method for constrained convex minimization problems in Hilbert spaces. The advantage of the proposed method is that it converges strongly whenever the problem has a solution, without additional assumptions. Furthermore, the method can be applied even when the problem has no solution; in this case the generated sequence diverges. The method also has the following desirable property: the sequence converges to the solution of the problem which lies closest to the initial iterate. Several motivations for and applications of the new method are discussed, justifying the modifications of the projected subgradient method made with the aim of improving the convergence results in infinite-dimensional Hilbert spaces.
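For reference, a minimal sketch of the classical projected subgradient iteration that the proposed variant builds on; the objective, the constraint set, and the step-size rule are illustrative assumptions, not the choices made in the talk.

```python
import numpy as np

# Illustrative only: projected subgradient method with a diminishing step size.
# Example problem (our choice): minimize f(x) = ||x - c||_1 subject to ||x|| <= 1.

def project_ball(x, radius=1.0):
    """Euclidean projection onto the ball of the given radius."""
    norm = np.linalg.norm(x)
    return x if norm <= radius else radius * x / norm

def projected_subgradient(subgrad, x0, proj, iters=500):
    x = x0
    for k in range(1, iters + 1):
        g = subgrad(x)          # any subgradient of f at x
        x = proj(x - g / k)     # diminishing step alpha_k = 1/k, then project
    return x

c = np.array([2.0, -0.5])
subgrad = lambda x: np.sign(x - c)   # a subgradient of the l1 distance to c
print(projected_subgradient(subgrad, np.zeros(2), project_ball))
```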
03 September | Carlos Rubianes Silva | Conditional Subgradient Optimization
Abstract: We generalize the subgradient optimization method for nondifferentiable convex programs to utilize conditional subgradients. We derive the new method and establish its convergence by generalizing convergence results for traditional subgradient optimization.
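To fix notation, the conditional subdifferential augments the usual subdifferential with the normal cone of the feasible set, and the method projects after a step along a conditional subgradient; this is our own summary following the usual definition in the literature, not material from the talk.

```latex
% Conditional subdifferential of f at x relative to the feasible set C,
% and the resulting projected step (N_C(x) is the normal cone to C at x).
\[
  \partial_C f(x) \;=\; \partial f(x) + N_C(x), \qquad
  x^{k+1} \;=\; P_C\bigl(x^k - \alpha_k s^k\bigr), \quad s^k \in \partial_C f(x^k).
\]
```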
10 September | Jurandir de Oliveira Lopes, UFPI | Algorithms for Quasiconvex Minimization
Abstract: We show two algorithms for the minimization of quasiconvex functions. The first is a subgradient-type algorithm. In the second, we consider the steepest descent method with Armijo's rule. In both of them we use elements from Plastria's lower subdifferential. We prove that, under certain conditions, the sequence generated by these algorithms converges to a global minimizer. We give a counter-example showing that the choice of the minus-gradient direction does not assure the global convergence of the descent method to the global minimizer. This counter-example is related to a mistake in a proof of the technical note "Dussault, J.P., Convergence of Implementable Descent Algorithms for Unconstrained Optimization, JOTA, Vol. 104, No. 3, pp. 739-745, 2000". We also point out the mistake in that proof.
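As a companion to the second algorithm, here is a minimal sketch of a steepest descent step with Armijo's rule on a smooth test function; the machinery based on Plastria's lower subdifferential is not reproduced, and the test function and constants are illustrative assumptions.

```python
import numpy as np

# Illustrative only: steepest descent with Armijo backtracking on a smooth function.
# The talk's algorithms use directions tied to Plastria's lower subdifferential;
# here we simply use the negative gradient to show the line-search rule itself.

def armijo_step(f, x, g, beta=0.5, sigma=1e-4, t0=1.0):
    """Return t satisfying f(x - t*g) <= f(x) - sigma * t * ||g||^2."""
    t = t0
    while f(x - t * g) > f(x) - sigma * t * np.dot(g, g):
        t *= beta               # shrink the step until the Armijo test holds
    return t

def steepest_descent(f, grad, x0, iters=100):
    x = x0
    for _ in range(iters):
        g = grad(x)
        x = x - armijo_step(f, x, g) * g
    return x

# Smooth quasiconvex test function: log of a convex quadratic.
f = lambda x: np.log(1.0 + (x[0] - 1.0) ** 2 + x[1] ** 2)
grad = lambda x: np.array([2 * (x[0] - 1.0), 2 * x[1]]) / (1.0 + (x[0] - 1.0) ** 2 + x[1] ** 2)
print(steepest_descent(f, grad, np.array([3.0, -2.0])))
```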
17 September | Orizon Pereira Ferreira | Convexity of Sets on the Sphere
Abstract: Convexity of sets on the sphere will be studied in this seminar. First of all, some important properties of the Riemannian distance on the sphere will be presented, in particular the spectral decomposition of its Hessian. Some properties and characterizations of convex sets on the sphere and of the projection onto convex sets will also be studied.
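For reference, the standard formulas behind the discussion, stated in our own notation rather than the speaker's: the Riemannian distance between points of the unit sphere and the metric projection onto a closed convex subset A of the sphere.

```latex
% Riemannian (geodesic) distance on the unit sphere S^n = {x in R^{n+1} : ||x|| = 1},
% and the metric projection of a point p onto a closed convex subset A of S^n.
\[
  d(x, y) \;=\; \arccos\langle x, y\rangle, \qquad
  P_A(p) \;=\; \operatorname*{arg\,min}_{q \in A} \, d(p, q).
\]
```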
24 September | Orizon Pereira Ferreira | Convexity of Functions on the Sphere
Abstract: Convexity of functions on the sphere will be studied in this seminar. First of all, the convexity of the Riemannian distance on the sphere will be investigated. Some properties and characterizations of convex functions on the sphere and optimality conditions for convex problems will also be studied.
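To fix terminology, a standard definition stated in our own notation: a function f on a convex subset S of the sphere is called convex when its restriction to every minimal geodesic segment of S is convex in the usual sense, that is,

```latex
% Convexity of f : S -> R along geodesics of the sphere (standard definition),
% required for every minimal geodesic segment gamma : [0,1] -> S.
\[
  f\bigl(\gamma(t)\bigr) \;\le\; (1 - t)\, f\bigl(\gamma(0)\bigr) + t\, f\bigl(\gamma(1)\bigr),
  \qquad t \in [0, 1].
\]
```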
01 October | Jefferson G. Melo | Duality and Abstract Convexity
Abstract: In this seminar we present an abstract notion of convexity (convexity with respect to a coupling function). We will see how abstract convexity can be used to construct the dual of a problem that is not necessarily convex or differentiable. We will discuss how to obtain the absence of a duality gap between the primal problem and the dual problem.
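For readers unfamiliar with the framework, these are the usual c-conjugation formulas behind this kind of duality; the notation is ours, not the speaker's.

```latex
% Coupling function c : X x Y -> R, c-conjugate and c-biconjugate of f
% (standard abstract-convexity notation).
\[
  f^{c}(y) \;=\; \sup_{x \in X}\bigl\{ c(x, y) - f(x) \bigr\}, \qquad
  f^{cc}(x) \;=\; \sup_{y \in Y}\bigl\{ c(x, y) - f^{c}(y) \bigr\} \;\le\; f(x),
\]
% and absence of a duality gap at a point corresponds to f^{cc} coinciding with f there.
```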
08 October | Bryon Richard Hall | The Traffic Equilibrium Problem as a Variational Inequality Problem
Abstract: We present the Traffic Equilibrium Problem, a simple minimization problem when certain restrictions are imposed. In the absence of those restrictions, it can be solved by a sequence of minimizations of a problem whose parameters change at each iteration. It can also be viewed as a variational inequality, which opens up other ways of finding a solution. A preliminary version has already been implemented and is now being extended.
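In the standard formulation, summarized here in our own notation, a feasible flow x* is a traffic equilibrium exactly when it solves the variational inequality below.

```latex
% Variational-inequality formulation of traffic equilibrium: K is the set of feasible
% flows (nonnegative path flows meeting the travel demands) and C(x) is the travel-cost map.
\[
  \langle C(x^{*}),\, x - x^{*} \rangle \;\ge\; 0 \qquad \text{for all } x \in K .
\]
```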
15 October
22 October | Luis Roman Lucambio Pérez | Full convergence of a projected gradient method for quasiconvex multiobjective optimization
Abstract: We consider a projected gradient type method to solve the problem of finding a Pareto optimum of a quasiconvex multiobjective function restricted to a closed convex set. We show full convergence of the sequence generated by the algorithm to a stationary point. Furthermore, when the function is pseudoconvex, we obtain that the unique limit point is a weakly efficient solution.
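One standard form of the projected gradient step for multiobjective problems is recalled below for context; the exact subproblem and step-size rule used in the talk may differ.

```latex
% One standard projected gradient step for minimizing F = (f_1, ..., f_m) over a
% closed convex set C; gamma_k is chosen by an Armijo-type rule in this scheme.
\[
  v^{k} \;=\; \operatorname*{arg\,min}_{x^{k} + v \,\in\, C}
      \Bigl\{ \max_{1 \le i \le m} \langle \nabla f_i(x^{k}), v \rangle
              + \tfrac{1}{2}\lVert v \rVert^{2} \Bigr\}, \qquad
  x^{k+1} \;=\; x^{k} + \gamma_{k}\, v^{k}.
\]
```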
29 October | Luis Roman Lucambio Pérez | Full convergence of a projected gradient method for quasiconvex multiobjective optimization
Abstract: We consider a projected gradient type method to solve the problem of finding a Pareto optimum of a quasiconvex multiobjective function restricted to a closed convex set. We show full convergence of the sequence generated by the algorithm to a stationary point. Furthermore, when the function is pseudoconvex, we obtain that the unique limit point is a weakly efficient solution.
05 November | Carlos Rubianes Silva | Approximate Primal Solutions for the Dual Subgradient Method
Abstract: We analyze primal solutions obtained by the subgradient method when solving the Lagrangian dual subproblem of a (possibly nondifferentiable) convex optimization problem. We present the subgradient method with ergodic averages to generate approximate primal solutions. We provide bounds on the amount of feasibility violation and upper and lower bounds on the primal objective values at the approximate solutions.
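For context, the usual dual subgradient scheme with ergodic primal averaging, summarized in standard notation; the exact averaging weights used in the talk may differ.

```latex
% Dual subgradient step with ergodic primal averaging. L(x,u) = f(x) + <u, g(x)> is the
% Lagrangian of  min f(x) s.t. g(x) <= 0, x in X,  and x^k minimizes L(., u^k) over X.
\[
  x^{k} \in \operatorname*{arg\,min}_{x \in X} L(x, u^{k}), \qquad
  u^{k+1} = \bigl[\, u^{k} + \alpha_{k}\, g(x^{k}) \,\bigr]_{+}, \qquad
  \bar{x}^{k} = \frac{\sum_{t=1}^{k} \alpha_{t}\, x^{t}}{\sum_{t=1}^{k} \alpha_{t}} .
\]
```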
12 November | Max Leandro Nobre Gonçalves | Convergence analysis of the Gauss-Newton method for convex composite optimization under majorant conditions
Abstract: In this paper, we present a convergence analysis of the Gauss-Newton method for solving convex composite optimization problems. Under the hypothesis that the derivative of the function associated with the problem satisfies a majorant condition, we obtain that the method is well defined and converges. Our analysis provides a clear relationship between the majorant function and the function associated with the convex composite optimization problem. It also allows us to obtain an estimate of the convergence ball for the method and to cover some important special cases.
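For context, one standard form of the Gauss-Newton step for convex composite problems, summarized in our own notation; the precise formulation and the role of the majorant condition are as presented in the talk, not here.

```latex
% Gauss-Newton step for the convex composite problem  min_x h(F(x)),  with h convex and
% F smooth; one standard form uses a radius Delta on the trial step.
\[
  d^{k} \;\in\; \operatorname*{arg\,min}_{\lVert d \rVert \le \Delta}\,
      h\bigl(F(x^{k}) + F'(x^{k})\, d\bigr), \qquad
  x^{k+1} \;=\; x^{k} + d^{k}.
\]
```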
19 November | Max Leandro Nobre Gonçalves | Convergence analysis of the Gauss-Newton method for convex composite optimization under majorant conditions
Abstract: In this paper, we present a convergence analysis of the Gauss-Newton method for solving convex composite optimization problems. Under the hypothesis that the derivative of the function associated with the problem satisfies a majorant condition, we obtain that the method is well defined and converges. Our analysis provides a clear relationship between the majorant function and the function associated with the convex composite optimization problem. It also allows us to obtain an estimate of the convergence ball for the method and to cover some important special cases.
26 November | Ole Peter Smith | Truss Topology Optimization
03 December
10 December
Source: Grupo de Otimistas do IME