Last week, we went over some basics of Artificial Intelligence (AI) using Ollama, Llama3, and some custom code. AI encompasses a broad range of technologies designed to enable machines to perform tasks that typically require human intelligence, such as understanding spoken or written language, recognizing visual patterns, making decisions, and providing recommendations. Machine learning (ML) is a specialized subset of AI focused on systems that improve their performance over time without being explicitly programmed. Instead, ML algorithms analyze large datasets to identify patterns and make decisions based on those insights. This learning process allows ML models to make increasingly accurate predictions or decisions as they are exposed to more data.
A few months ago, I added Liner to the resource page of my website. It lets you easily train an ML model for image, text, audio, or video classification, object detection, image segmentation, or pose classification. I created “Is this Joe or Not Joe?” using that tool. TensorFlow.js runs client-side with a model trained on a half dozen photos that are Joe and a half dozen photos that are not Joe. You can supply a photo and get a prediction of whether Joe is in the image. You can always retrain the existing model with more examples. That is an example of machine learning.
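That workflow — train on a handful of labeled examples, then predict on new input, then optionally retrain with more data — can be sketched in plain Python with a toy nearest-neighbor classifier. The 2-D feature vectors and labels below are made up purely for illustration; the real “Is this Joe or Not Joe?” model uses TensorFlow.js on actual image data:

```python
import math

# Toy training set: pretend each photo has been reduced to a 2-D feature vector.
training_data = [
    ((0.9, 0.8), "Joe"),
    ((0.8, 0.9), "Joe"),
    ((0.1, 0.2), "not Joe"),
    ((0.2, 0.1), "not Joe"),
]

def predict(features):
    """Classify by the single nearest training example (1-nearest-neighbor)."""
    def distance(example):
        vec, _label = example
        return math.dist(vec, features)
    _vec, label = min(training_data, key=distance)
    return label

print(predict((0.85, 0.75)))  # → Joe
print(predict((0.15, 0.25)))  # → not Joe

# "Retraining with more examples" is just adding to the training data.
training_data.append(((0.5, 0.5), "Joe"))
```

More examples shift the decision boundary — the same idea, in miniature, as retraining the real model with more photos.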
So, you can think of ML as a subset of AI and Deep Learning (DL) as a subset of ML.
Have any questions, comments, etc? Please feel free to drop a comment, below.
The course below, by Dhaval Patel, is a beginner-level course on deep learning in Python with TensorFlow 2.0 and Keras. It covers the foundations of neural networks and deep learning, including the following topics: 🧵👇🏼
For a hackathon this weekend, we built a small application in #Kotlin, #Javalin and #Lit that uses #Tensorflow to detect if you're about to upload something you might want to reconsider, and then allows stripping Exif metadata for privacy.
We also looked at distorting the image to make it unusable for training an #AI. In one day we only managed to garble the image beyond human recognition; a better option would be integrating #Glaze, which distorts it for AI but not for the human eye.
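The Exif-stripping part needs no imaging library at all: Exif metadata lives in the JPEG APP1 segment, so you can walk the marker segments and drop it. A minimal sketch in Python (the hackathon project itself was Kotlin), assuming a well-formed baseline JPEG and ignoring edge cases like padding bytes:

```python
def strip_exif(jpeg_bytes: bytes) -> bytes:
    """Remove APP1 (Exif/XMP) segments from a baseline JPEG byte string."""
    assert jpeg_bytes[:2] == b"\xff\xd8", "not a JPEG (missing SOI marker)"
    out = bytearray(b"\xff\xd8")
    i = 2
    while i + 4 <= len(jpeg_bytes):
        marker = jpeg_bytes[i:i + 2]
        if marker == b"\xff\xda":          # SOS: entropy-coded image data follows,
            out += jpeg_bytes[i:]          # so copy the rest verbatim and stop
            break
        # Segment length is big-endian and includes its own two bytes.
        length = int.from_bytes(jpeg_bytes[i + 2:i + 4], "big")
        segment = jpeg_bytes[i:i + 2 + length]
        if marker != b"\xff\xe1":          # drop APP1 (Exif), keep everything else
            out += segment
        i += 2 + length
    return bytes(out)

# Fabricated minimal "JPEG": SOI, one APP1 segment, then an SOS marker.
fake = b"\xff\xd8" + b"\xff\xe1\x00\x06Exif" + b"\xff\xda\x00\x02"
assert b"Exif" not in strip_exif(fake)
```

A production version would also preserve ICC profiles and handle multi-segment APP1 data, but the marker walk above is the core of the technique.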
TensorFlow has the same modus operandi: update, and you can't open models trained (for days) in the previous version, replete with cryptic error messages.
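One mitigation (not a fix) is pinning the exact framework version a model was trained with, so a routine environment upgrade can't silently orphan saved models. The version number below is just an example:

```
# requirements.txt — pin the exact version the models were trained with
tensorflow==2.15.1
```

Loading the old model under the pinned version and re-exporting it is then a deliberate migration step rather than a surprise.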
Anybody out there looking for an ML or software engineer with >30 years total experience and ~20 years in the industry?
I have extensive experience with #Python and #ML frameworks, particularly #TensorFlow, and I've worked on #NLP and #ImageProcessing both in the workplace and in personal open source projects. My resume is available here:
A new crash course for getting started with #CUDA with #Python by Jeremy Howard 🚀. CUDA is NVIDIA's programming model for parallel computing on GPUs. CUDA is used by tools such as #PyTorch, #TensorFlow, and other #deeplearning and LLM frameworks to speed up calculations. The course covers the following topics:
✅ Setting up CUDA
✅ CUDA foundation
✅ Working with kernels
✅ CUDA with PyTorch
JOSS publishes articles about open-source research software. It is a free, open-source, community-driven, and developer-friendly online journal. JOSS reviews involve downloading and installing the software and inspecting the repository and submitted paper for key elements.
Please reach out if you are interested in reviewing this paper or know someone who could review it.
I want to build something and I need to say my plans out loud, but I don't have any friends let alone any that use #Mastodon or #JavaScript so I'm just gonna info dump here:
🔥 #AMDlabnotes presents another new article - this time to help data scientists and ML practitioners get their #PyTorch or #TensorFlow environment up and running on #AMD #GPUs 🔥
(1/2) I always enjoy seeing my LinkedIn friends publishing books, particularly on MLOps topics ❤️.
Enjoy reading Yuan Tang's Distributed Machine Learning Patterns. As its name implies, the book focuses on machine learning at scale using tools such as #TensorFlow, #Kubernetes, #Argo, #Python, etc.
As I'm working on a live camera object detection and tracking application like #OpenDataCam, I've been running my typical #TensorFlow #JavaScript benchmark in Chrome and Safari.
It being TensorFlow and all, I expected the Pixel 6a to be faster, but the actual numbers are:
iPhone 11: ~15fps
Pixel 6a: ~5fps
Hence my initial question: What the hell Google? 🤨
The crybabies who freak out about The Communist Manifesto appearing on university curricula clearly never read it - chapter one is basically a long hymn to capitalism's flexibility and inventiveness, its ability to change form and adapt itself to everything the world throws at it and come out on top:
And the source-code is licensed under a homebrewed license cooked up by Meta's lawyers, a license that only glancingly resembles anything from the #OpenSourceDefinition:
Core to Big Tech companies' "open AI" offerings are tools, like Meta's #PyTorch and Google's #TensorFlow. These tools are indeed "open source," licensed under real OSS terms.
Without going into too much detail, my thesis was criticised for developing a web service in C++. It was questioned why I didn't use #NodeJS or #Java for the web service. "It's not performance critical," said the professor.
Dude, have you used the internet lately?
EVERYTHING is performance critical!
This sort of teaching explains why most apps/websites run like absolute dogshit.
The point of these exercises is to show that it's too simplistic to say that programming language X is better or faster than programming language Y.
The last time I had such discussions was as a teenager. 😉
Typically you use and combine various programming languages, e.g. #python (for high-level code) with #cpp (for performance critical low-level code). Textbook sample: #numpy. Or #tensorflow.
On a more serious note: as discussed by various people in various other posts, the performance of a program depends on many things; the language itself is only one of many factors. And in reality, non-trivial software is often a combination of multiple programming languages anyway, like #python for complex high-level code and #cpp for specific performance-critical code. As done with #numpy or #tensorflow.
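That Python-plus-native-core split is visible even without NumPy: a pure-Python loop and the C-implemented built-in `sum` compute the same result, but the second delegates the hot loop to compiled code — a toy stand-in for how #numpy and #tensorflow push inner loops into C/C++ (the dataset size and timings here are arbitrary):

```python
import timeit

data = list(range(1_000_000))

def python_loop(xs):
    """Pure-Python hot loop: every iteration runs in the interpreter."""
    total = 0
    for x in xs:
        total += x
    return total

# Built-in sum() is implemented in C; Python only orchestrates the call.
assert python_loop(data) == sum(data)

interpreted = timeit.timeit(lambda: python_loop(data), number=5)
compiled = timeit.timeit(lambda: sum(data), number=5)
print(f"interpreted: {interpreted:.3f}s  C-backed: {compiled:.3f}s")
```

Same answer either way; the difference is purely in which language runs the inner loop, which is exactly the division of labor the post describes.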