Reading Club

Discussion of machine learning, its applications, and related ideas

A brown-bag (bring your own lunch) reading club at the TReNDS Center, held mainly for cross-pollination of ideas and "out of the comfort zone" fun. We read and discuss papers, tutorials, or packs of papers and ideas at once. The topics span all areas of machine learning, with a focus on Deep Learning, but also venture into brain imaging and theories of how the brain works. The goal is both to learn and refine foundational concepts and to stay current with the literature.

Please sign up for the mailing list or, if you're part of the TReNDS Center, join the #reading_group channel on our Slack workspace.

A list of topics with associated materials can be found on our GitHub page.

Nonlinear Dimensionality Reduction methods and a generalization: November 22nd, 2019

This time we will talk about visualization and exploratory analysis of data and results. In his talk, Deb will give an overview and detailed explanations of how LLE, MDS, and t-SNE work. If time permits, he will also cover Isomap. We will discuss alternative approaches to the problem of data embedding and briefly touch on Divide and Concur, a method for solving constraint satisfaction problems, and how it can be used in this domain.
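For those who want to experiment before the meeting, all four methods are available in scikit-learn's `manifold` module. A minimal sketch (the dataset and parameter choices below are illustrative, not what the talk will use), mapping 64-dimensional digit images down to 2-D:

```python
# Illustrative comparison of the nonlinear embedding methods discussed above,
# using scikit-learn's manifold module on a small toy dataset.
from sklearn.datasets import load_digits
from sklearn.manifold import MDS, TSNE, LocallyLinearEmbedding, Isomap

X, y = load_digits(return_X_y=True)
X, y = X[:100], y[:100]  # small subset so every method runs quickly

methods = {
    "LLE": LocallyLinearEmbedding(n_components=2, n_neighbors=10),
    "MDS": MDS(n_components=2),
    "t-SNE": TSNE(n_components=2, perplexity=30.0, init="pca"),
    "Isomap": Isomap(n_components=2, n_neighbors=10),
}

# Each method embeds the 64-D inputs as 2-D points suitable for plotting.
results = {name: model.fit_transform(X) for name, model in methods.items()}
for name, Z in results.items():
    print(name, Z.shape)
```

Each embedding can then be scatter-plotted (colored by the digit label `y`) to compare how well the methods separate the classes.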

Upcoming meetings (subject to change)

12/06/2019 – “Function Approximation with Neural Networks”

12/06/2019 – Meta-learning for neuroimaging

12/13/2019 – What you wanted to know about ICA but were afraid to ask

Normalizing flows
Time-varying Neural Networks
Contemporary Ethical Issues in Data Science/Neuroimaging
Information Theory for Deep Learning
Word Embeddings (Word2Vec, GloVe, etc.)

Past meetings

11/08/2019 – Stop Explaining Black Box Machine Learning Models for High Stakes Decisions and Use Interpretable Models Instead

11/01/2019 – Synthesizing Programs for Images using Reinforced Adversarial Learning

10/25/2019 – Introduction to Deep Q-Network

10/18/2019 – Hidden stratification

10/11/2019 – Neural Ordinary Differential Equations

10/04/2019 – Model Utility

09/27/2019 – Wasserstein GANs and other easier-to-train models

09/20/2019 – Generative Adversarial Networks (Intro)

09/13/2019 – Predictive coding theory of the mind: part II

09/06/2019 – Predictive coding theory of the mind: Intro

08/30/2019 – Confidence and accuracy: On Calibration of Modern Neural Networks

08/23/2019 – Deep learning model introspection

08/16/2019 – Deep learning trends from the deep learning summer school

08/09/2019 – Variational Autoencoders: part II

08/02/2019 – Variational Autoencoders: part I

07/26/2019 – Attention in deep learning models: part II

07/19/2019 – Attention in deep learning models: part I

07/12/2019 – NLP successes of 2018: transfer learning
