Yesterday, Amazon released a new open-source project, Chronos - a family of pre-trained time series forecasting models based on language model architectures.
It is introductory and very basic, but I'm still a bit nervous since it's not my field; I've only used bits and bobs from it here and there. Hopefully there are no CS students rolling their eyes 😅
(1/2) Foundation Models & Generative AI Course - MIT Course 🚀
The course by Rickard Brüel Gabrielsson is a crash course on foundation models. This is the second version of the course, and it covers topics such as:
✅ Introduction to foundation models
✅ Different models (ChatGPT, Stable Diffusion & DALL-E)
✅ Supervised learning
✅ Neural networks
✅ Reinforcement learning
✅ Self-supervised learning
✅ Auto-encoders
ML for Beginners is a course by Microsoft that covers the foundations of machine learning. As the name implies, this 12-week course is aimed at beginners and provides an intro to the following topics:
✅ Fairness and machine learning
✅ Regression and classification
✅ Clustering
✅ Natural language processing
✅ Time series forecasting
✅ Reinforcement learning
Applications for students and TAs are open for all Neuromatch Academy courses - Computational Neuroscience, Deep Learning, & NeuroAI! You don't want to miss this!
➡️ Student apps close March 24
➡️ TA apps close March 17
(1/2) Apple open-sourced a new Python simulation framework for accelerating research in Private Federated Learning.
The library, pfl, is a Python framework developed at Apple that lets researchers run efficient simulations with privacy-preserving federated learning (FL) and share the results of their FL research.
I spent my Sunday morning reading Distributed Machine Learning Patterns by Yuan Tang. The book, as its name implies, focuses on machine learning at scale using tools such as TensorFlow, Kubernetes, Argo, etc. It covers the following topics:
✅ Handling large datasets
✅ Approaches for training ML models with distributed machines
✅ ML workflow and operation design
✅ Building and deploying ML pipelines
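One of the core patterns behind training with distributed machines is data parallelism: each worker computes a gradient on its own shard of the data, and the gradients are averaged before updating the shared weights. Here's a toy NumPy sketch of that idea (the function names and the linear-model example are mine, not the book's):

```python
import numpy as np

def local_gradient(w, X, y):
    """Mean-squared-error gradient for a linear model on one worker's shard."""
    return 2 * X.T @ (X @ w - y) / len(y)

def data_parallel_step(w, shards, lr=0.1):
    """One synchronous data-parallel SGD step: every worker computes a
    gradient on its shard, the gradients are averaged (an all-reduce in a
    real cluster), and the averaged gradient updates the shared weights."""
    grads = [local_gradient(w, X, y) for X, y in shards]
    return w - lr * np.mean(grads, axis=0)
```

With equal-sized shards, the averaged gradient matches the full-batch gradient, so the workers stay in sync with what a single machine would compute.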
The Carpentries run two-day workshops to teach basic programming skills to people who've never touched the command line, never used version control, and never written a 'for' loop. Who's running the workshops to teach programmers similarly basic concepts about the human sciences and the humanities so that, if nothing else, they'll understand why managers, teachers, and therapists can't "just" be replaced with AI? If we don't teach them, it's not their fault they don't know. 1/2
@gvwilson This was basically the concept behind the day workshop that I taught as part of the University of Cologne's Deep Learning for Language Analysis Summer School. My workshop featured absolutely no #DeepLearning but solely focused on basic concepts of #linguistics and why these matter whenever language is analysed (whatever the methods).
Enrollment is now open for teaching assistant positions at Climatematch Academy 2024 😀! Join us for Computational Tools for Climate Science 2024 from July 15th to 26th 💻.
TA positions available 📣.
Enroll on the neuromatch.io website before March 24th.
Are you interested in learning advanced techniques in Deep Learning and applying them ethically to advance science? Don't miss out on this opportunity to dive deep into the world of Deep Learning with the guidance of our expert instructors and teaching assistants!
Whether you’re a seasoned data scientist or just starting out, our course provides a comprehensive curriculum that covers all the core topics you need to know to become a proficient deep learning practitioner.
Student Applications Close Sunday, March 24 midnight in the last time zone on Earth.
Just spent at least two hours deleting all of my work from Tumblr, before their AI scraping shit hits the fan, although it's probably too late. In that case, the deletion functions as a gesture of protest.
This shameless large-scale intellectual property theft by greedy tech business assholes everywhere is starting to make the internet pretty annoying. 😖
This lecture focuses on the following topics:
✅ Optimized Matrix Multiplication
✅ Shared Memory Techniques for CUDA
✅ Implementing Shared Memory Optimization
✅ Translating Python to CUDA and Performance Considerations
✅ Numba: Bringing Python and CUDA Together
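The tiling idea behind shared-memory matrix multiplication can be sketched in plain NumPy (a simplified illustration of the concept, not the lecture's actual CUDA or Numba code): each block of the output is built up by looping over tiles of the shared dimension, just as a CUDA thread block stages tiles of the inputs in shared memory before accumulating.

```python
import numpy as np

def tiled_matmul(A, B, tile=16):
    """Block-wise matrix multiply mimicking a shared-memory CUDA kernel:
    each (i, j) output block loops over tiles of the inner dimension,
    the way a thread block loops over shared-memory tiles."""
    n, k = A.shape
    k2, m = B.shape
    assert k == k2, "inner dimensions must match"
    C = np.zeros((n, m), dtype=np.result_type(A, B))
    for i in range(0, n, tile):          # rows of the output block
        for j in range(0, m, tile):      # cols of the output block
            for p in range(0, k, tile):  # tiles along the shared dimension
                # In CUDA, these two slices would be copied into shared
                # memory once per tile and reused by the whole block.
                C[i:i+tile, j:j+tile] += A[i:i+tile, p:p+tile] @ B[p:p+tile, j:j+tile]
    return C
```

The payoff on a GPU is data reuse: each tile of A and B is loaded from global memory once per block instead of once per thread, which is what the "Shared Memory Optimization" step in the lecture delivers.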