Location: National University of Colombia, Bogotá, Colombia.
Dates: Oct. 4-5, 2019.
Location: UPC, Barcelona, Spain.
Dates: May 23-24, 2019.
Location: BCAM, Bilbao, Spain.
Dates: May 24-25, 2018.
Location: Valparaíso, Chile.
Dates: January 17-18, 2019.
Professor: Jay Gopalakrishnan (Portland State University, USA)
Place and time: BCAM, Bilbao, Spain (Sala Beta 2), June 4-8, 2018.
Professor: Victor M. Calo (Curtin University, Australia)
Place and time: BCAM, Bilbao, Spain (Sala Beta 2), May 18-22, 2020, 10:30-12:30 (10-hour course).
Description: Prof. Victor M. Calo will discuss several discretizations that build on the ideas of finite elements and residual minimization to develop stable and adaptive numerical approximations. The course uses the mathematical framework developed by the discontinuous Galerkin community to build a new class of stabilized finite elements. Several model problems in 2D and 3D will serve to validate the theoretical results presented.
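As background, a minimal sketch of the residual-minimization idea follows; the notation is generic and not taken from the course material. Given an abstract problem $Bu = f$ posed in the dual $V'$ of a test space $V$, the discrete solution is sought as the minimizer of the residual in the dual norm:

\[
u_h \;=\; \underset{w_h \in U_h}{\arg\min}\; \tfrac{1}{2}\,\| f - B\,w_h \|_{V'}^2 ,
\]
which is equivalent to the saddle-point problem: find $(r, u_h) \in V \times U_h$ such that
\[
(r, v)_V + b(u_h, v) = f(v) \quad \forall v \in V, \qquad b(w_h, r) = 0 \quad \forall w_h \in U_h ,
\]
where $r$ is the residual representative. Replacing $V$ by a discrete, broken (DG-type) test space yields stabilized finite element formulations of the kind discussed in the course.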
Professor: Artzai Picon (Tecnalia, Spain)
Place and time: BCAM, Bilbao, Spain (Sala Beta 2), Dec. 18th, 2019, 10:00-14:00 (4-hour course).
Description: Prof. Artzai Picon will discuss several aspects of TensorFlow 2.0 (TF2.0). He will show how to implement Deep Neural Networks (DNN) in TF2.0 for solving computational mechanics problems.
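As a hedged illustration of the kind of TF2.0 workflow such a course may cover (the model, data, and hyperparameters below are invented for illustration, not taken from the course material), here is a minimal sketch of a dense network trained with eager execution and tf.GradientTape to fit a smooth 1D function, a toy stand-in for a computational mechanics dataset:

import numpy as np
import tensorflow as tf  # TensorFlow 2.x

# Synthetic 1D regression data (illustrative stand-in for a mechanics problem).
x = np.linspace(-1.0, 1.0, 200, dtype=np.float32).reshape(-1, 1)
y = np.sin(np.pi * x)

# Small fully connected network.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(1,)),
    tf.keras.layers.Dense(32, activation="tanh"),
    tf.keras.layers.Dense(32, activation="tanh"),
    tf.keras.layers.Dense(1),
])

optimizer = tf.keras.optimizers.Adam(learning_rate=1e-2)

# Custom training loop using eager execution and automatic differentiation.
for epoch in range(500):
    with tf.GradientTape() as tape:
        y_pred = model(x, training=True)
        loss = tf.reduce_mean(tf.square(y_pred - y))
    grads = tape.gradient(loss, model.trainable_variables)
    optimizer.apply_gradients(zip(grads, model.trainable_variables))

print("final mean-squared error:", float(loss))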
Professor: Dr. Adrián Galdrán (INESC TEC, Portugal)
Place and time: BCAM, Bilbao, Spain (Sala Beta 2), Wednesday, June 13th, 2018, 14:00-20:00 (6-hour course).
Description: Dr. Galdrán will give a short, hands-on course on how to install a deep learning framework and train deep neural networks with it. The course will focus on a specific library, Keras. Examples of how to use Keras for different problems will be reproduced and analyzed.
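As a hedged illustration of the kind of hands-on material such a course typically covers (the dataset and model below are illustrative assumptions, not the actual course examples), a minimal Keras workflow using the high-level compile/fit/evaluate API might look as follows; Keras ships with TensorFlow, installable via "pip install tensorflow":

import tensorflow as tf
from tensorflow import keras

# Load the MNIST handwritten-digit dataset bundled with Keras.
(x_train, y_train), (x_test, y_test) = keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0  # scale pixels to [0, 1]

# A small fully connected classifier.
model = keras.Sequential([
    keras.layers.Flatten(input_shape=(28, 28)),
    keras.layers.Dense(128, activation="relu"),
    keras.layers.Dense(10, activation="softmax"),
])

# High-level Keras API: compile, fit, evaluate.
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=3, batch_size=64, validation_split=0.1)
test_loss, test_acc = model.evaluate(x_test, y_test)
print("test accuracy:", test_acc)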
Professor: Dr. Adrián Galdrán (INESC TEC, Portugal).
Location: BCAM, Bilbao, Spain. Dates: Jan 24-26, 2018.
Description: The aim of this course is to provide an introduction to deep learning algorithms and their use in computational mechanics. The instructor will introduce concepts such as convolutional deep neural networks, supervised and unsupervised algorithms, activation functions that introduce nonlinearities (e.g., ReLU), data augmentation to incorporate known physical principles, the use of existing libraries (e.g., those released by Google and Facebook), residual neural networks, and transfer learning.
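To make the last concept concrete, here is a minimal, hedged sketch of transfer learning with a publicly available pretrained network (MobileNetV2 via tf.keras.applications is used only as an example; the number of classes, input size, and training data are placeholders, not part of the course material):

import tensorflow as tf

NUM_CLASSES = 5  # placeholder: number of classes in the target problem

# Pretrained convolutional backbone (ImageNet weights), used as a frozen feature extractor.
base = tf.keras.applications.MobileNetV2(input_shape=(224, 224, 3),
                                         include_top=False,
                                         weights="imagenet")
base.trainable = False  # freeze the pretrained weights (transfer learning)

# New classification head to be trained on the target data.
model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(train_images, train_labels, epochs=5)  # train on the (hypothetical) target dataset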
Kick-off meeting: March 2018
Second meeting: March 28th, 2019