(base) PS D:\anaconda_win> ollama run llava-phi3:latest
pulling manifest
pulling 377876be20ba: 100% ▕█████████████████████████████████████▏ 2.3 GB
pulling 004fc0969720: 100% ▕█████████████████████████████████████▏ 607 MB
pulling c608dc615584: 100% ▕█████████████████████████████████████▏ 149 B
pulling cadf483f03b5: 100% ▕█████████████████████████████████████▏ 155 B
pulling 3921dac664c1: 100% ▕█████████████████████████████████████▏ 492 B
verifying sha256 digest
writing manifest
success
>>> What kind of LLM model are you?
I am a large language model based on the transformer architecture. I have been trained using machine learning
techniques to process and generate natural language text, allowing me to understand and respond to various types
of questions and prompts. Some examples of other transformer-based models include BERT, GPT-3, and T5.
>>> /bye
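The interactive prompt above can also be driven programmatically: Ollama serves a local HTTP API (default `http://localhost:11434`), and its `/api/generate` endpoint takes a model name and a prompt. A minimal stdlib-only sketch, assuming the default server address; the `ask` helper name is ours, not part of Ollama:

```python
import json
import urllib.request


def build_request(model: str, prompt: str) -> dict:
    # stream=False asks Ollama for one JSON object instead of chunked lines
    return {"model": model, "prompt": prompt, "stream": False}


def ask(model: str, prompt: str, host: str = "http://localhost:11434") -> str:
    """Send a prompt to a locally running Ollama server and return the reply text."""
    payload = json.dumps(build_request(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        f"{host}/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # the non-streaming response carries the completion under "response"
        return json.loads(resp.read())["response"]
```

With the server running, `ask("llava-phi3:latest", "What kind of LLM model are you?")` should return a reply like the one in the session above.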
(base) PS D:\anaconda_win> ollama ls
NAME                                                                     ID              SIZE      MODIFIED
llava-phi3:latest                                                        c7edd7b87593    2.9 GB    2 minutes ago
gemma3:12b                                                               f4031aab637d    8.1 GB    6 weeks ago
hf.co/tatsuyaaaaaaa/DeepSeek-R1-Distill-Qwen-14B-Japanese-gguf:latest    64961a571e20    6.9 GB    4 months ago
deepseek-r1:latest                                                       0a8c26691023    4.7 GB    6 months ago
llama3.2:latest                                                          a80c4f17acd5    2.0 GB    6 months ago
gemma3:latest                                                            a2af6cc3eb7f    3.3 GB    6 months ago
(base) PS D:\anaconda_win>
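If the `ollama ls` output above needs to be consumed by a script, the table can be parsed directly: columns are separated by runs of two or more spaces, while values inside a field (e.g. "2.9 GB", "2 minutes ago") use single spaces. A small sketch of that parsing, assuming this column convention holds; the function name is ours:

```python
import re


def parse_ollama_ls(output: str) -> list[dict]:
    """Parse the whitespace-aligned table printed by `ollama ls` into dicts.

    Splitting on runs of two or more spaces keeps multi-word fields such as
    '2.9 GB' and '2 minutes ago' intact.
    """
    lines = [line for line in output.splitlines() if line.strip()]
    headers = re.split(r"\s{2,}", lines[0].strip())
    return [
        dict(zip(headers, re.split(r"\s{2,}", line.strip())))
        for line in lines[1:]
    ]
```

Alternatively, Ollama's local HTTP API exposes the same inventory as JSON at `GET /api/tags`, which avoids text parsing entirely.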