πŸ¦™ Shiv's Custom Local LLM (Ollama Ready)

Welcome to ollama-model-shiv, a custom-built local language model designed to run completely offline using Ollama. This repo packages everything β€” from Modelfile to model blobs β€” ready for Docker-based deployment and local inference.


Link to model - https://huggingface.co/shiv1119/ollama-model-shiv

πŸ“ Directory Structure

OllamaModelBuild/
β”œβ”€β”€ model_test.py           # Sample Python script to interact with the model
β”œβ”€β”€ .gitattributes          # Git LFS or text handling config
└── ollama_clean/
    β”œβ”€β”€ docker-compose.yml  # Docker config to run Ollama with this model
    β”œβ”€β”€ Modelfile           # Ollama build instructions
    └── models/             # Model weights & metadata
        β”œβ”€β”€ blobs/          # Binary blobs of the model
        └── manifests/      # Manifest describing model structure


πŸš€ Features

  • βœ… 100% Offline β€” No internet or API key needed
  • πŸ‹ Docker-ready with docker-compose.yml
  • ⚑ Works with Ollama CLI
  • πŸ” Full model reproducibility via blobs and manifests
  • 🧠 Based on LLaMA2 / open-weight LLM architecture

πŸ› οΈ Getting Started

πŸ”§ 1. Install Prerequisites

You will need Docker and Docker Compose installed. The Ollama CLI is optional if you only run the model through the Docker container.

πŸ‹ 2. Run the Model Using Docker

In the root of the project:

cd ollama_clean
docker-compose up --build

This builds and runs the model container locally with your custom blobs and Modelfile.

πŸ§ͺ 3. Test the Model (Optional)

docker-compose exec ollama ollama run tinyllama "Hello"

Use the included Python script to test interaction:

python model_test.py
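If you want to see what such a script can look like, here is a minimal sketch that calls Ollama's HTTP generate endpoint. The model name `tinyllama` and the default port 11434 are taken from the steps above; the helper names and the use of `stream: False` (to get a single JSON response instead of chunks) are assumptions, not necessarily what model_test.py does:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"


def build_payload(prompt, model="tinyllama"):
    # Request body for Ollama's /api/generate endpoint.
    # stream=False asks the server for one complete JSON response.
    return {"model": model, "prompt": prompt, "stream": False}


def generate(prompt, model="tinyllama"):
    # Send the prompt to the local Ollama server and return the generated text.
    data = json.dumps(build_payload(prompt, model)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


if __name__ == "__main__":
    print(generate("Hello"))
```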

Customize it to query your local Ollama model running at http://localhost:11434.

🧰 Model Components

Modelfile: Blueprint for Ollama to build the model

blobs/: Raw model weights

manifests/: Metadata describing model format/version

docker-compose.yml: Encapsulates build/run config
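For reference, a Modelfile for a setup like this is usually only a few lines. The exact contents of this repo's Modelfile may differ; the following is a hypothetical minimal example using standard Ollama Modelfile directives (`FROM`, `PARAMETER`, `SYSTEM`):

```
FROM tinyllama
PARAMETER temperature 0.7
SYSTEM "You are a helpful assistant running fully offline."
```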

🧠 About Ollama

Ollama makes it simple to run LLMs locally on your own machine β€” private, fast, and API-free.

πŸ“¦ Repo Purpose

This repository was created to:

Host a working local LLM solution

Enable offline inference

Serve as a template for packaging custom models with Ollama

πŸ“œ License

This repo is for educational/research purposes only. Please ensure you comply with the license of any base models used (e.g., LLaMA2, Mistral, etc.).

πŸ™Œ Credits

Crafted with ❀️ by Shiv Nandan Verma
