Nonlinear Optimization
-   In this paper we propose the use of damped techniques within Nonlinear Conjugate Gradient (NCG) methods. Damped techniques were introduced by Powell and recently reproposed by Al-Baali; until now they have been applied only in the framework of quasi-Newton methods. We extend their use to NCG methods in large...
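Since the abstract is truncated, the exact damping rule used in the NCG setting is not stated; as background, the sketch below illustrates Powell's classical damping of the curvature pair (s, y), which replaces y by a convex combination with Bs whenever the raw curvature sᵀy is too small. The function name and the default threshold sigma = 0.2 are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def powell_damped_y(s, y, Bs, sigma=0.2):
    """Powell's damping of the curvature pair (s, y).

    Returns y_hat = theta*y + (1 - theta)*Bs, with theta chosen so that
    s @ y_hat >= sigma * (s @ Bs).  This keeps curvature-based updates
    well defined even when s @ y is small or negative.
    (sigma = 0.2 is a common illustrative choice, not from the paper.)
    """
    sBs = s @ Bs
    sy = s @ y
    if sy >= sigma * sBs:
        theta = 1.0  # curvature already acceptable: no damping
    else:
        theta = (1.0 - sigma) * sBs / (sBs - sy)
    return theta * y + (1.0 - theta) * Bs
```

With B = I and a step of negative curvature (s = (1, 0), y = (-1, 0)), the damped pair satisfies s @ y_hat = sigma * (s @ Bs) exactly, restoring the positivity needed by the update.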
-   Starting from the paper by Nash and Sofer (1990), we propose a heuristic adaptive truncation criterion for the inner iterations within linesearch-based truncated Newton methods. Our aim is to possibly avoid "over-solving" of the Newton equation, based on a comparison between the predicted...
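The abstract's adaptive criterion is not spelled out in this excerpt; for context, the sketch below shows the conventional inner CG loop of a truncated Newton method with the classical residual-based truncation test ||r|| ≤ η||g||, which the proposed heuristic is meant to refine to avoid over-solving. The function name and the default η are illustrative assumptions.

```python
import numpy as np

def truncated_newton_dir(H, g, eta=0.5, max_iter=50):
    """Inner conjugate-gradient loop of a truncated Newton method.

    Approximately solves the Newton equation H d = -g, stopping as soon
    as the residual satisfies ||r|| <= eta * ||g|| (the classical test;
    the adaptive criterion of the abstract is not reproduced here).
    Assumes H is symmetric positive definite.
    """
    d = np.zeros_like(g)
    r = -g.copy()                      # residual of H d = -g at d = 0
    p = r.copy()
    gnorm = np.linalg.norm(g)
    for _ in range(max_iter):
        if np.linalg.norm(r) <= eta * gnorm:
            break                      # truncate: stop "solving" early
        Hp = H @ p
        alpha = (r @ r) / (p @ Hp)
        d = d + alpha * p
        r_new = r - alpha * Hp
        beta = (r_new @ r_new) / (r @ r)
        p = r_new + beta * p
        r = r_new
    return d
```

Driving η toward zero recovers the exact Newton direction; a looser η saves inner iterations at the price of a less accurate direction, which is precisely the trade-off an adaptive truncation criterion tries to balance.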
-   Speaker: Tommaso Colombo
    Title: Recurrent Neural Networks: why do LSTM networks perform so well in time series prediction?
    (Joint work with Alberto De Santis and Stefano Lucidi)
    Abstract: Long Short-Term Memory (LSTM)...
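As background for the talk's subject, the sketch below implements one forward step of a standard LSTM cell (input, forget, candidate, and output gates) in plain numpy; the parameter stacking convention and function name are illustrative assumptions, not taken from the talk.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One forward step of a standard LSTM cell.

    W (4n, d), U (4n, n), b (4n,) stack the parameters of the four
    gates (input i, forget f, candidate g, output o) for hidden size n.
    """
    n = h_prev.shape[0]
    z = W @ x + U @ h_prev + b
    i = sigmoid(z[:n])          # input gate: how much new content to write
    f = sigmoid(z[n:2*n])       # forget gate: how much old memory to keep
    g = np.tanh(z[2*n:3*n])     # candidate cell content
    o = sigmoid(z[3*n:])        # output gate: how much state to expose
    c = f * c_prev + i * g      # additive cell-state update
    h = o * np.tanh(c)          # hidden state / output
    return h, c
```

The additive update of the cell state c is the usual explanation for why LSTMs retain information over long horizons: gradients flow through c without repeated squashing, unlike in a plain recurrent layer.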
