Setting Up Ollama With Docker

You can verify the container is running with `docker ps`. Method 2: Running Ollama with Docker Compose. Ollama exposes an API on http://localhost:11434, allowing other tools to connect and interact with it. That was when I got hooked…
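As a sketch of the Docker Compose method, a minimal `docker-compose.yml` for Ollama might look like the following. The volume name and paths are illustrative assumptions, not taken from the article:

```yaml
services:
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"        # expose the Ollama API on the host
    volumes:
      - ollama_data:/root/.ollama   # persist downloaded models across restarts

volumes:
  ollama_data:
```

After `docker compose up -d`, you can confirm the API is reachable with `curl http://localhost:11434/api/tags`, which returns the models available locally.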

I Ran the Famed SmolLM on Raspberry Pi

Their compact nature makes them well-suited for various applications, particularly in scenarios where local processing is crucial. As the industry shifts towards local deployment of AI technologies, the advantages of…

What is Hugging Face?

There are 900,000+ models on the platform, and you can easily use each of them on your system according to its usage instructions and license requirements. A type of deep-learning…