>“[#OpenSource #AI] is really fundamental because it allows everyone to seize the technology, to diminish the fear of limited understanding or of not being qualified to use AI,” says Remi Cadene, head of robotics at #HuggingFace in Paris.
>Open-source AI firms are meanwhile offering a better alternative to #SiliconValley.
You can now host #Quarto sites on Hugging Face using the new Quarto Space template.
#HuggingFace is the most used open platform for AI. We think that #QuartoPub is one of the best frameworks for scientific writing, making it a perfect tool for the Hugging Face community.
We’re very proud to support Hugging Face’s effort to build AI models under #OpenSource licenses and we can’t wait to see what people build! 🤝
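If you want to try the new template programmatically, here is a minimal sketch using the `huggingface_hub` client. The template id below is a placeholder, since the posts above don't name the exact Space; substitute the real Quarto template Space id from the Hub.

```python
# Minimal sketch: duplicate a Quarto Space template into your own account.
# Requires `pip install huggingface_hub` and a logged-in token (`huggingface-cli login`).
# NOTE: "posit/quarto-template" is a placeholder id, not confirmed by the posts above.
from huggingface_hub import duplicate_space

repo = duplicate_space(
    from_id="posit/quarto-template",        # hypothetical template Space id
    to_id="your-username/my-quarto-site",   # destination Space in your namespace
)
print("New Space created at:", repo)
```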
I recently got thrown into the world of AI and LLMs. My role suddenly flipped from working on Search products for Microsoft Office to being right at the forefront of Microsoft Copilot exploration.
So, here is my Hugging Face profile... because that's a thing. I don't know how I'll use my account yet, but if you want to follow me, there I am.
After creating a simple natural language to SQL translator with the OpenAI API, this weekend I extended my POC to open-source models with the Hugging Face API. The good news is that the API is straightforward to work with and gives decent output. On the other hand, running the model locally is very slow. Any tips, articles, or examples on how to improve the model's performance when running it locally? Thx!
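One common way to speed up local runs is to load the model in half precision and let Accelerate place it on the GPU. Here is a minimal sketch under that assumption; the model id is a placeholder, since the post doesn't say which open model was used.

```python
# Minimal sketch: faster local text-to-SQL generation via half precision + automatic GPU placement.
# Requires `pip install transformers accelerate` and a CUDA-capable GPU.
# The model id is a placeholder; the original post does not name the model.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-org/your-text2sql-model"  # placeholder, substitute a real Hub model id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision roughly halves memory use and speeds up GPU inference
    device_map="auto",          # let Accelerate place layers on the available GPU(s)
)

prompt = "Translate to SQL: list all customers who placed an order in 2023"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```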
Stanford has released a crash course for Hugging Face 🤗 as part of the update for the NLP with Deep Learning course, taught by Eric Frankel. The course focuses on basic applications of transformers using the Hugging Face Transformers Python library 🚀.
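In the spirit of that course's focus on basic Transformers usage, here is a minimal sketch of the library's high-level pipeline API; the default sentiment model it downloads is just the library's own default, not material from the course.

```python
# Minimal sketch: the high-level pipeline API from the Hugging Face Transformers library.
from transformers import pipeline

# Downloads a default sentiment-analysis model on first run (not tied to the Stanford course).
classifier = pipeline("sentiment-analysis")
print(classifier("Hugging Face makes transformers easy to use."))
# -> [{'label': 'POSITIVE', 'score': ...}]
```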
Okay #haters, we've enabled #description #text on our #images, courtesy of #huggingface #ai. We run the #LLM locally, because AI is that consumable now. I hope this helps those who need such descriptions.
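For anyone curious how a setup like that can work, here is a minimal sketch of generating an image description locally with an open model; the post doesn't name the model it actually runs, so the BLIP captioning checkpoint below is just one example.

```python
# Minimal sketch: generate an image description locally with an open captioning model.
# The post above doesn't say which model it runs; BLIP is just an example checkpoint.
from transformers import pipeline

captioner = pipeline("image-to-text", model="Salesforce/blip-image-captioning-base")
result = captioner("photo.jpg")  # local path or URL to an image
print(result[0]["generated_text"])
```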
Everybody’s talking about Mistral, an upstart French challenger to OpenAI (arstechnica.com)
On Monday, Mistral AI announced a new AI language model called Mixtral 8x7B, a "mixture of experts" (MoE) model with open weights that reportedly truly matches OpenAI's GPT-3.5 in performance—an achievement that has been claimed by others in the past but is being taken seriously by AI heavyweights such as OpenAI's Andrej...