# 📦 Shiv's Custom Local LLM (Ollama Ready)
Welcome to `ollama-model-shiv`, a custom-built local language model designed to run completely offline using Ollama. This repo packages everything, from the `Modelfile` to the model blobs, ready for Docker-based deployment and local inference.
Link to the model: https://huggingface.co/shiv1119/ollama-model-shiv
## 📁 Directory Structure
```
OllamaModelBuild/
├── model_test.py            # Sample Python script to interact with the model
├── .gitattributes           # Git LFS or text handling config
└── ollama_clean/
    ├── docker-compose.yml   # Docker config to run Ollama with this model
    ├── Modelfile            # Ollama build instructions
    └── models/              # Model weights & metadata
        ├── blobs/           # Binary blobs of the model
        └── manifests/       # Manifest describing model structure
```
## 🚀 Features
- ✅ 100% offline: no internet connection or API key needed
- 🐳 Docker-ready with `docker-compose.yml`
- ⚡ Works with the Ollama CLI
- 🔁 Full model reproducibility via blobs and manifests
- 🧠 Based on LLaMA2 / an open-weight LLM architecture
## 🛠️ Getting Started
### 🔧 1. Install Prerequisites
- Install Docker
- (Optional) Install the Ollama CLI, used during the build
### 🚀 2. Run the Model Using Docker
In the root of the project:

```bash
cd ollama_clean
docker-compose up --build
```
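The `docker-compose.yml` itself is not reproduced in this README; a minimal sketch of what such a file typically contains is shown below. The image tag, port, and volume path here are assumptions based on common Ollama setups, not the repo's actual values:

```yaml
services:
  ollama:
    image: ollama/ollama            # official Ollama image (assumed)
    ports:
      - "11434:11434"               # default Ollama API port
    volumes:
      - ./models:/root/.ollama/models  # mount the packaged blobs/manifests
```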
This builds and runs the model container locally with your custom blobs and Modelfile.

### 🧪 Test the Model (Optional)
```bash
docker-compose exec ollama ollama run tinyllama "Hello"
```
Use the included Python script to test interaction:

```bash
python model_test.py
```
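The contents of `model_test.py` are not shown in this README; a minimal sketch of such a script, assuming the default Ollama REST endpoint at `http://localhost:11434/api/generate` and the `tinyllama` model name used above, could look like:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # default Ollama endpoint
MODEL_NAME = "tinyllama"  # assumed model name; match the name in your Modelfile


def build_payload(prompt: str, model: str = MODEL_NAME) -> dict:
    """Build the request body for Ollama's /api/generate endpoint."""
    # stream=False asks Ollama for a single JSON response instead of chunks
    return {"model": model, "prompt": prompt, "stream": False}


def query_ollama(prompt: str) -> str:
    """Send a prompt to the local Ollama server and return the response text."""
    data = json.dumps(build_payload(prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


if __name__ == "__main__":
    print(query_ollama("Hello"))
```

Adjust the model name and endpoint to match your local setup.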
Customize it to query your local Ollama model running at http://localhost:11434.

## 🧰 Model Components
- `Modelfile`: Blueprint for Ollama to build the model
- `blobs/`: Raw model weights
- `manifests/`: Metadata describing the model format and version
- `docker-compose.yml`: Encapsulates the build/run configuration
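The repo's actual `Modelfile` is not shown here; a hypothetical minimal example following the usual Ollama Modelfile syntax (the base model name, parameter, and system prompt below are assumptions) might look like:

```
# Hypothetical Modelfile sketch; the real file in ollama_clean/ may differ
FROM tinyllama
PARAMETER temperature 0.7
SYSTEM "You are a helpful offline assistant."
```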
## 🧠 About Ollama
Ollama makes it simple to run LLMs locally on your own machine: private, fast, and with no API key required.
## 📦 Repo Purpose
This repository was created to:

- Host a working local LLM solution
- Enable offline inference
- Serve as a template for packaging custom models with Ollama
## 📄 License

This repo is for educational and research purposes only. Please ensure you comply with the licenses of any base models used (e.g., LLaMA2, Mistral).

## 🙌 Credits
Crafted with ❤️ by Shiv Nandan Verma
## Model tree for shiv1119/ollama-model-shiv

Base model: meta-llama/Llama-3.1-8B