Port details
- ollama Run Llama 2, Mistral, and other large language models
- Version: 0.3.6_2, category misc (0.3.6_1 is the version of this port present on the latest quarterly branch)
- Maintainer: yuri@FreeBSD.org
- Port Added: 2024-08-06 10:06:06
- Last Update: 2024-11-08 20:58:46
- Commit Hash: 79271d5
- People watching this port also watch: drm-61-kmod, firefox, ffmpeg, pipewire, lapce
- License: MIT
- WWW:
- https://ollama.com/
- Description:
- Ollama allows you to get up and running with large language models.
Ollama supports a list of models available on ollama.com/library.
- Manual pages:
- FreshPorts has no man page information for this port.
- pkg-plist: as obtained via: make generate-plist
- Dependency lines:
  - ollama>0:misc/ollama
- Conflicts:
- CONFLICTS_BUILD:
- To install the port:
- cd /usr/ports/misc/ollama/ && make install clean
- To add the package, run one of these commands:
- pkg install misc/ollama
- pkg install ollama
  NOTE: If this package has multiple flavors (see below), then use one of them instead of the name specified above.
- PKGNAME: ollama
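  A minimal sketch of adding the package and sanity-checking the install (assumes your configured pkg(8) repository already carries the package; run pkg install as root):
    pkg install misc/ollama
    pkg info ollama         # confirm the installed version and origin
    ollama --version        # the client binary should now be on the PATH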
- Flavors: there is no flavor information for this port.
- ONLY_FOR_ARCHS: amd64
- distinfo:
- TIMESTAMP = 1724010094
SHA256 (go/misc_ollama/ollama-v0.3.6/v0.3.6.mod) = 16c078d8f0b29f84598fb04e3979acf86da41eb41bf4ff8363548e490f38b54e
SIZE (go/misc_ollama/ollama-v0.3.6/v0.3.6.mod) = 2992
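  When building from the ports tree, the checksum and size above are verified automatically; a hand-run sketch, assuming the ports tree is checked out under /usr/ports:
    cd /usr/ports/misc/ollama
    make fetch       # download the distfiles
    make checksum    # verify them against distinfo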
- Dependencies
- NOTE: FreshPorts displays only information on required and default dependencies. Optional dependencies are not covered.
- Build dependencies:
- bash : shells/bash
- cmake : devel/cmake-core
- glslc : graphics/shaderc
- vulkan-headers>0 : graphics/vulkan-headers
- go122 : lang/go122
- pkgconf>=1.3.0_1 : devel/pkgconf
- Library dependencies:
- libvulkan.so : graphics/vulkan-loader
- Fetch dependencies:
- go122 : lang/go122
- ca_root_nss>0 : security/ca_root_nss
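  To see how these dependency lists resolve on your own machine, a small sketch using standard ports targets and pkg(8) (assumes the ports tree at /usr/ports and, for the last command, an installed package):
    cd /usr/ports/misc/ollama
    make build-depends-list    # ports needed to build
    make run-depends-list      # ports needed at run time
    pkg info -d ollama         # packages the installed ollama depends on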
- This port is required by:
  - for Run: misc/alpaca
- Configuration Options:
- No options to configure
- Options name:
- misc_ollama
- USES:
- go:1.22,modules pkgconfig zip
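  These values are taken from the port's Makefile; a sketch of querying them directly with make(1) variable expansion (OPTIONS_NAME is a standard ports-framework variable):
    cd /usr/ports/misc/ollama
    make -V USES            # should print: go:1.22,modules pkgconfig zip
    make -V OPTIONS_NAME    # should print: misc_ollama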
- pkg-message:
- For install:
- You installed ollama: the AI model runner.
To run ollama, please open 2 terminals.
1. In the first terminal, please run:
$ OLLAMA_NUM_PARALLEL=1 OLLAMA_DEBUG=1 LLAMA_DEBUG=1 ollama start
2. In the second terminal, please run:
$ ollama run mistral
This will download and run the AI model "mistral".
You will be able to interact with it in plain English.
Please see https://ollama.com/library for the list
of all supported models.
The command "ollama list" lists all models downloaded
into your system.
When the model fails to load into your GPU, please use
the provided ollama-limit-gpu-layers script to create
model flavors with different num_gpu parameters.
ollama uses many gigabytes of disk space in your home directory,
because advanced AI models are often very large.
Please symlink ~/.ollama to a large disk if needed.
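A minimal sketch of that last step, assuming a hypothetical large filesystem mounted at /bigdisk and no model downloads in progress:
  mv ~/.ollama /bigdisk/ollama-models       # relocate the model store (path is illustrative)
  ln -s /bigdisk/ollama-models ~/.ollama    # ollama keeps using ~/.ollama transparently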
- Master Sites:
Commit History (may be incomplete; for full details, see links to the repositories near the top of the page):
Commit | Date | Credits | Log message
0.3.6_2 | 08 Nov 2024 20:58:46 | Ashish SHUKLA (ashish) | all: Bump after lang/go122 update (PR: 281842)
0.3.6_1 | 27 Aug 2024 19:44:05 | Yuri Victorovich (yuri) | misc/ollama: Remove unnecessary paragraph from pkg-message
0.3.6_1 | 27 Aug 2024 17:44:27 | Yuri Victorovich (yuri) | misc/ollama: Add environment variables to 'ollama start' to work around memory allocation issues
0.3.6_1 | 19 Aug 2024 01:12:09 | Yuri Victorovich (yuri) | misc/ollama: Improve pkg-message
0.3.6 | 18 Aug 2024 20:44:06 | Yuri Victorovich (yuri) | misc/ollama: update 0.3.4 → 0.3.6
0.3.4_4 | 10 Aug 2024 07:07:35 | Yuri Victorovich (yuri) | misc/ollama: add CONFLICTS_BUILD
0.3.4_4 | 09 Aug 2024 06:24:09 | Ashish SHUKLA (ashish) | all: Bump after lang/go122 update
0.3.4_3 | 09 Aug 2024 05:03:35 | Yuri Victorovich (yuri) | misc/ollama: Fix Vulkan compatibility
0.3.4_2 | 08 Aug 2024 20:01:10 | Yuri Victorovich (yuri) | misc/ollama: Fix inference; Add ONLY_FOR_ARCHxx lines; Add pkg-message
0.3.4_1 | 07 Aug 2024 18:33:34 | Yuri Victorovich (yuri) | misc/ollama: Add llama-cpp as dependency
0.3.4 | 06 Aug 2024 22:32:55 | Yuri Victorovich (yuri) | misc/ollama: Remove one unnecessary architecture-specific place in scripts
0.3.4 | 06 Aug 2024 10:04:44 | Yuri Victorovich (yuri) | misc/ollama: New port: Run Llama 2, Mistral, and other large language models