JOSS publishes articles about open source research software. It is a free, open-source, community-driven, and developer-friendly online journal. JOSS reviews involve downloading and installing the software, and inspecting the repository and submitted paper for key elements.
Please reach out if you are interested in reviewing this paper or know someone who could review it.
Fourteen years after Alan Turing's death, an unpublished manuscript emerged in which he proposed the idea of a "disordered" computer, anticipating the rise of connectionism.
Are you interested in cortico-basal ganglia networks and would like to model them, but only have a basic proficiency in Python or computational modeling in general?
Well then, I’m happy to announce the release of CBGTPy, a software package for running biologically realistic simulations of cortico-basal ganglia-thalamic (CBGT) networks across a dynamic range of tasks. It is the latest tool from our Exploratory Intelligence group at CMU, the University of Pittsburgh, and the University of the Balearic Islands (Spain).
Pleased to share my latest research, "Zero-shot counting with a dual-stream neural network model", about a glimpsing neural network model that learns visual structure (here, number) in a way that generalises to new visual contents. The model replicates several neural and behavioural hallmarks of numerical cognition.
Quite interesting but confusing, as I come from #backpropagation DL.
If I got it right, the authors focus on showing how and why biological neural networks would benefit from being Energy Based Models for Predictive Coding, instead of Feedforward Networks employing backpropagation.
It took me a while to reach the part where they explain how to optimize a ConvNet in PyTorch as an EB model, but they do explain it: there is an algorithm and formulae. Still, I'm curious how long and how stable training is, and whether it all generalizes to typical computer vision architectures (ResNets, MobileNets, ViTs, ...).
Code is also #opensource at https://github.com/YuhangSong/Prospective-Configuration
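To get a feel for the energy-based view, here is a minimal, hypothetical sketch of predictive coding with NumPy (toy shapes and learning rates of my own choosing, not the paper's prospective-configuration algorithm): a latent activity settles by gradient descent on a prediction-error energy, and only then do the weights receive a local, Hebbian-like update.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy one-latent-layer predictive-coding network (illustrative only):
# latent z predicts the input x through generative weights W, and instead
# of backpropagating an output error we relax z to minimise the energy
#   E = 0.5 * ||x - W @ z||^2
W = rng.normal(scale=0.1, size=(8, 4))   # generative weights
x = rng.normal(size=8)                   # observed input
z = np.zeros(4)                          # latent activity

for _ in range(50):                      # inference: settle the activities
    eps = x - W @ z                      # prediction error
    z += 0.1 * (W.T @ eps)               # gradient descent on E w.r.t. z

eps = x - W @ z                          # residual error after settling
W += 0.01 * np.outer(eps, z)             # learning: purely local update

print(float(0.5 * eps @ eps))            # energy after settling
```

The point of the sketch is the ordering: activities relax first, weights move second, and every update uses only locally available signals, which is the contrast the authors draw with backpropagation.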
I would like to sit at my laptop for a few hours and try to understand this better, but I think in the coming days I will move on to Modern #HopfieldNetworks. These too are energy-based, and there is an energy function that is optimised by the #transformer's dot-product attention.
I think I understand what attention does in Transformers, so I'm quite curious to see in what sense it is equivalent to consolidating/retrieving patterns in a Dense Associative Memory. In general, I think we are treating memory wrong in our deep neural networks. I see most of them as sensory processing, a shortcut to "reasoning" without any surrogate for short- or long-term memory, though I can see how some current features may serve similar purposes...
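The attention connection is easy to see in code. Below is a toy sketch (my own pattern sizes and inverse temperature, not from any paper) of one modern dense Hopfield retrieval step, which has the same form as dot-product attention with the cue as query and the stored patterns as both keys and values.

```python
import numpy as np

def softmax(v):
    v = v - v.max()          # numerical stability
    e = np.exp(v)
    return e / e.sum()

# Modern (dense) Hopfield retrieval:
#   xi_new = X @ softmax(beta * X.T @ xi)
# i.e. attention with query = cue xi, keys = values = stored patterns X.
rng = np.random.default_rng(1)
X = rng.normal(size=(64, 5))               # 5 stored patterns, dim 64
beta = 4.0                                 # inverse temperature (toy value)

xi = X[:, 2] + 0.1 * rng.normal(size=64)   # noisy cue for pattern 2
xi = X @ softmax(beta * (X.T @ xi))        # one attention-like update
```

With well-separated patterns and a large enough beta, a single update step already lands essentially on the stored pattern, which is the "one-step retrieval" property of dense associative memories.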
The Machine Learning with Graphs course by Prof. 𝐉𝐮𝐫𝐞 𝐋𝐞𝐬𝐤𝐨𝐯𝐞𝐜 from Stanford University (CS224W) focuses on different methods for analyzing massive graphs and complex networks and extracting insights using machine learning models and data mining techniques. 🧵🧶👇🏼
(2/3) The course includes 47 lectures, and it covers topics such as:
✅ ML applications for graphs
✅ Graph neural networks (GNN)
✅ Knowledge graph completion
✅ Recommendation with GNN
✅ Geometric deep learning
✅ Link prediction and causality
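To make the GNN idea concrete, here is a minimal, hypothetical one-layer GCN-style update in NumPy (a toy graph and random weights of my own, not course code): each node aggregates its neighbours' features (plus its own, via self-loops), normalises by degree, and applies a learned linear map with a nonlinearity.

```python
import numpy as np

A = np.array([[0, 1, 0, 0],               # adjacency of a 4-node path graph
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
A_hat = A + np.eye(4)                      # add self-loops
D_inv = np.diag(1.0 / A_hat.sum(axis=1))   # row-wise degree normalisation
H = np.eye(4)                              # one-hot node features
W = np.random.default_rng(2).normal(size=(4, 3))  # learnable weights

# One round of message passing: average neighbourhood, transform, ReLU.
H_next = np.maximum(0.0, D_inv @ A_hat @ H @ W)
print(H_next.shape)                        # (4, 3)
```

Stacking a few such layers lets information propagate over multi-hop neighbourhoods, which is the core mechanism behind most of the course topics listed above.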
Very nice picture shared by Ronald van Loon on X. You can debate whether the categories are complete and correct, but it illustrates that the field of AI is much more than just transformers/LLMs. #AI #MachineLearning #NeuralNetworks #DeepLearning #LLM #Transformers
Researchers grow bio-inspired polymer brains for artificial neural networks (phys.org)
The development of neural networks to create artificial intelligence in computers was originally inspired by how biological systems work. These "neuromorphic" networks, however, run on hardware that looks nothing like a biological brain, which limits performance.