Commit d8c2c85

Native compilation and Homebrew instructions. (#18)
* chore: Added `nuitka` as a candidate native code compiler system.
* chore: Bumped patch version.
* chore: Simple improvement with list-models displaying a sorted list.
* chore: Added sorting in all listing and tags. HF model sorting is per page only.
  chore: Upgraded dependencies.
* chore: Code refactoring.
  chore: Case-insensitive sorting of models and tags.
  chore: Minor version bumped.
* feat: Added Nuitka-based native compilation script.
  chore: Updated README.
  chore: Upgraded packages.
* chore: Added a `version` command.
* chore: Updated README to reflect the `version` command.
* chore: Updated README to reflect Homebrew-based installation.
  chore: Upgraded dependencies.
1 parent 1b362b0 commit d8c2c85

File tree

16 files changed (+242 −91 lines)


.gitignore

Lines changed: 3 additions & 0 deletions

```diff
@@ -210,3 +210,6 @@ __marimo__/
 conf/
 # Pytest profiling
 prof/
+
+# Native binaries created by Nuitka
+od-native*
```
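The `od-native*` entry above is a shell-style glob, so it ignores both the bare UNIX binary and a Windows `.exe`. A quick illustration using Python's stdlib `fnmatch` (the file names here are hypothetical build outputs, not taken from the repository):

```python
from fnmatch import fnmatch

# Hypothetical file names a Nuitka build might produce; the .gitignore
# glob should cover the bare binary as well as platform-specific suffixes.
pattern = "od-native*"
for name in ["od-native", "od-native.exe", "od-native.bin"]:
    print(name, fnmatch(name, pattern))  # True for all three
```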

README.md

Lines changed: 51 additions & 18 deletions

````diff
@@ -115,27 +115,44 @@ Usage: od [OPTIONS] COMMAND [ARGS]...
 
 A command-line interface for the Ollama downloader.
 
-╭─ Options ───────────────────────────────────────────────────────────────────────────────────╮
-│ --help Show this message and exit. │
-╰─────────────────────────────────────────────────────────────────────────────────────────────╯
-╭─ Commands ──────────────────────────────────────────────────────────────────────────────────╮
-│ show-config Shows the application configuration as JSON. │
-│ auto-config Displays an automatically inferred configuration. │
-│ list-models Lists all available models in the Ollama library. If pagination options │
-│ are not provided, all models will be listed. │
-│ list-tags Lists all tags for a specific model. │
-│ model-download Downloads a specific Ollama model with the given tag. │
-│ hf-list-models Lists available models from Hugging Face that can be downloaded into │
-│ Ollama. │
-│ hf-list-tags Lists all available quantisations as tags for a Hugging Face model that │
-│ can be downloaded into Ollama. Note that these are NOT the same as │
-│ Hugging Face model tags. │
-│ hf-model-download Downloads a specified Hugging Face model. │
-╰─────────────────────────────────────────────────────────────────────────────────────────────╯
+╭─ Options ────────────────────────────────────────────────────────────────────────────────╮
+│ --help Show this message and exit. │
+╰──────────────────────────────────────────────────────────────────────────────────────────╯
+╭─ Commands ───────────────────────────────────────────────────────────────────────────────╮
+│ version Shows the app version of Ollama downloader. │
+│ show-config Shows the application configuration as JSON. │
+│ auto-config Displays an automatically inferred configuration. │
+│ list-models Lists all available models in the Ollama library. If pagination │
+│ options are not provided, all models will be listed. │
+│ list-tags Lists all tags for a specific model. │
+│ model-download Downloads a specific Ollama model with the given tag. │
+│ hf-list-models Lists available models from Hugging Face that can be downloaded into │
+│ Ollama. │
+│ hf-list-tags Lists all available quantisations as tags for a Hugging Face model │
+│ that can be downloaded into Ollama. Note that these are NOT the same │
+│ as Hugging Face model tags. │
+│ hf-model-download Downloads a specified Hugging Face model. │
+╰──────────────────────────────────────────────────────────────────────────────────────────╯
 ```
 
 You can also use `--help` on each command to see command-specific help.
 
+### `version`
+
+The `version` command displays the application version.
+
+Running `uv run od version --help` displays the following.
+
+```bash
+Usage: od version [OPTIONS]
+
+Shows the app version of Ollama downloader.
+
+╭─ Options ────────────────────────────────────────────────────────────────────────────────╮
+│ --help Show this message and exit. │
+╰──────────────────────────────────────────────────────────────────────────────────────────╯
+```
+
 ### `show-config`
 
 The `show-config` command simply displays the current configuration from the settings file in the configurations directory, if it exists. If it does not exist, it creates that file with the default settings and shows the content of that file.
@@ -372,11 +389,27 @@ tests/test_typer.py 62 0 100
 TOTAL 655 95 85%
 ```
 
-There is a handy script for running tests `run-tests.sh` in _WD_. It can accept any parameters to be passed to `pytest`. Thus, tests can be filtered using the `-k` to specify tests to run or not. Likewise, profiling can be done by calling `./run-tests.sh --profile`. The resulting profile information will be generated and saved in _WD_`/prof`. A SVG of the complete profile can be generated by calling `./run-tests.sh --profile --profile-svg`.
+There is a handy script for running tests `run_tests.sh` in _WD_. It can accept any parameters to be passed to `pytest`. Thus, tests can be filtered using the `-k` to specify tests to run or not. Likewise, profiling can be done by calling `./run_tests.sh --profile`. The resulting profile information will be generated and saved in _WD_`/prof`. A SVG of the complete profile can be generated by calling `./run_tests.sh --profile --profile-svg`.
 
 Profile information can also be filtered and a SVG of the filtered profile generated by editing the somewhat hacky script _WD_`/tests/filter_profile_data.py`.
 The script can be run using `uv` as `uv run tests/filter_profile_data.py`.
 
+## Native compilation and execution
+
+This is an _experimental feature_ by which Ollama downloader can be compiled into a single executable binary file -- `od-native` -- using [Nuitka](https://nuitka.net/). To compile the native binary on your platform, run the script `./compile_native.sh`. Notice that you must have the `dev` group dependencies of the project installed.
+
+_Note that having a `.env` file may cause the natively compiled binary to crash, with an error message `OSError: Starting path not found`._ Should you want to pass any of the environment variables to the executable, do so using the command line interface.
+
+Once the native executable has been created, run it from the command line interface as `./od-native` on UNIX/Linux systems and simply `od-native` on Windows. The native executable may have executable file type extension (`.exe`) on Windows.
+
+_Note that the natively compiled binary is unlikely to be significantly faster than the Python code that you can execute using `uv`_. After all, the bottleneck in most of the operations in Ollama downloader is more likely to be the network speed as opposed to code execution speed.
+
+## Installation on macOS and Linux using Homebrew
+
+Ollama downloader can be installed on macOS and Linux using Homebrew so that the installation of Python and the management of a virtual environment is all done by Homebrew leaving a single command `ollama-downloader` available on the command line interface.
+
+To do so, add the new tap by running: `brew tap anirbanbasu/tap`. Then, Ollama downloader can be installed using `brew install ollama-downloader`.
+
 ## Contributing
 
 Install [`pre-commit`](https://pre-commit.com/) for Git by using the `--all-groups` flag for `uv sync`.
````
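The pagination behaviour described for `list-models` (all models are listed unless page options are given) maps a 1-based page number onto list slices. A minimal stdlib-only sketch — `paginate` is an illustrative helper, not a function in the package:

```python
from typing import List, Optional


def paginate(items: List[str], page: Optional[int] = None,
             page_size: Optional[int] = None) -> List[str]:
    """Return one page of items; without pagination options, return everything."""
    if not (page and page_size):
        return items
    start = (page - 1) * page_size  # pages are 1-based on the CLI
    return items[start:start + page_size]


models = [f"model-{i}" for i in range(1, 8)]
print(paginate(models))                        # all seven models
print(paginate(models, page=2, page_size=3))   # ['model-4', 'model-5', 'model-6']
```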

compile_native.sh

Lines changed: 20 additions & 0 deletions

```diff
@@ -0,0 +1,20 @@
+#!/bin/bash
+NATIVE_FILE="od-native"
+PRODUCT_NAME="ollama-downloader"
+# Extract version from pyproject.toml but only the numerical parts, e.g., "1.2.3.rc1" -> "1.2.3"
+PRODUCT_VERSION=$(sed -n 's/^version = "\([0-9.]*\).*/\1/p' pyproject.toml | head -n 1)
+rm -f "$NATIVE_FILE"  # Remove the existing native file, if any
+uv run python -m nuitka \
+    --onefile \
+    --standalone \
+    --clean-cache="all" \
+    --disable-cache="all" \
+    --remove-output \
+    --static-libpython="yes" \
+    --noinclude-pytest-mode="nofollow" \
+    --product-name="$PRODUCT_NAME" \
+    --product-version="$PRODUCT_VERSION" \
+    --output-file="$NATIVE_FILE" \
+    --macos-prohibit-multiple-instances \
+    --macos-app-name="$PRODUCT_NAME" \
+    src/ollama_downloader/cli.py
```
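The `sed` expression in the script keeps only the leading digits and dots of the `version` value in `pyproject.toml`. A Python sketch of the equivalent regex (note that `[0-9.]*` is greedy, so a pre-release suffix such as `.rc1` leaves a trailing dot behind, slightly contrary to the comment's `"1.2.3.rc1" -> "1.2.3"` example):

```python
import re
from typing import Optional

# Python equivalent of the script's sed expression:
#   sed -n 's/^version = "\([0-9.]*\).*/\1/p'
VERSION_RE = re.compile(r'^version = "([0-9.]*).*')


def extract_version(line: str) -> Optional[str]:
    """Return the numeric prefix of a version line, or None if it does not match."""
    m = VERSION_RE.match(line)
    return m.group(1) if m else None


print(extract_version('version = "0.2.0"'))      # 0.2.0
print(extract_version('version = "1.2.3.rc1"'))  # 1.2.3.  (trailing dot: the dot before "rc" is in the class)
```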

pyproject.toml

Lines changed: 2 additions & 1 deletion

```diff
@@ -1,6 +1,6 @@
 [project]
 name = "ollama-downloader"
-version = "0.1.1.post1"
+version = "0.2.0"
 description = "A library and Hugging Face model downloader for Ollama."
 readme = "README.md"
 license = "MIT"
@@ -61,6 +61,7 @@ build-backend = "hatchling.build"
 [dependency-groups]
 dev = [
     "icecream>=2.1.7",
+    "nuitka>=2.7.16",
     "pre-commit>=4.3.0",
 ]
 test = [
```
File renamed without changes.
File renamed without changes.

src/ollama_downloader/__init__.py

Lines changed: 21 additions & 4 deletions

```diff
@@ -1,18 +1,35 @@
 import logging
-from environs import env
+import os
+from environs import Env
+from marshmallow.validate import OneOf
 from rich.logging import RichHandler
 
-from ollama_downloader.common import EnvVar
-
 try:
     from icecream import ic
 
     ic.configureOutput(includeContext=True)
 except ImportError:  # Graceful fallback if IceCream isn't installed.
     ic = lambda *a: None if not a else (a[0] if len(a) == 1 else a)  # noqa
 
+env = Env()
+if os.path.exists(".env"):  # This check is only necessary for Nuitka-compiled binaries.
+    env.read_env()  # Read .env file, if it exists
+
+
+class EnvVar:
+    LOG_LEVEL = env.str(
+        "LOG_LEVEL",
+        default="info",
+        validate=OneOf(["NOTSET", "DEBUG", "INFO", "WARNING", "ERROR", "CRITICAL"]),
+    ).upper()
+
+    OD_UA_NAME_VER = env.str("OD_UA_NAME_VER", default="ollama-downloader/0.1.1")
+
+    OD_SETTINGS_FILE = env.str("OD_SETTINGS_FILE", default="conf/settings.json")
+
+
 logging.basicConfig(
-    level=env.str(EnvVar.LOG_LEVEL, default=EnvVar.DEFAULT__LOG_LEVEL).upper(),
+    level=EnvVar.LOG_LEVEL,
     format="%(message)s",
     datefmt="[%X]",
     handlers=[
```
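The new `EnvVar` class centralises environment lookups through `environs` with a default and an `OneOf` validator. The same read-with-default-and-validate pattern can be sketched with only the stdlib (`environs`/`marshmallow` swapped for `os.environ`; `read_log_level` is a hypothetical helper, and unlike the diff's `OneOf`, it validates after upper-casing, so lowercase values pass):

```python
import os

VALID_LEVELS = {"NOTSET", "DEBUG", "INFO", "WARNING", "ERROR", "CRITICAL"}


def read_log_level(environ=os.environ) -> str:
    """Read LOG_LEVEL with a default of 'info', upper-case it, then validate."""
    value = environ.get("LOG_LEVEL", "info").upper()
    if value not in VALID_LEVELS:
        raise ValueError(f"Invalid LOG_LEVEL: {value!r}")
    return value


print(read_log_level({}))                      # INFO (default applied)
print(read_log_level({"LOG_LEVEL": "debug"}))  # DEBUG
```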

src/ollama_downloader/cli.py

Lines changed: 31 additions & 19 deletions

```diff
@@ -14,8 +14,9 @@
 from rich import print as print
 from rich import print_json
 import psutil
+from importlib.metadata import version as metadata_version
 
-from ollama_downloader.common import OllamaSystemInfo
+from ollama_downloader.sysinfo import OllamaSystemInfo
 from ollama_downloader.data.data_models import AppSettings
 from ollama_downloader.downloader.ollama_model_downloader import OllamaModelDownloader
 from ollama_downloader.downloader.hf_model_downloader import HuggingFaceModelDownloader
@@ -64,6 +65,24 @@ def _cleanup(self):
 
         logger.debug("Cleanup completed.")
 
+    async def _version(self):
+        package_name = "ollama-downloader"
+        name_splits = package_name.split("-")
+        if len(name_splits) != 2:
+            abbreviation = package_name
+        else:
+            abbreviation = f"{name_splits[0]}{name_splits[1][0]}"
+        return (
+            f"{package_name} ({abbreviation}) version {metadata_version(package_name)}"
+        )
+
+    async def run_version(self):
+        try:
+            result = await self._version()
+            print(result)
+        except Exception as e:
+            logger.error(f"Error in getting version. {e}")
+
     async def _show_config(self):
         return self._model_downloader.settings.model_dump_json()
 
@@ -308,25 +327,11 @@ async def run_list_models(
     ):
         try:
             self._initialize()
-            result = await self._list_models()
-            filtered_result = result
-            if page_size and page:
-                # Adjust page number for 0-based index
-                start_index = (page - 1) * page_size
-                end_index = start_index + page_size
-                filtered_result = result[start_index:end_index]
-                if len(filtered_result) == 0:
-                    logger.warning(
-                        f"No models found for the specified page {page} and page size {page_size}. Showing all models instead."
-                    )
-                    filtered_result = result
-                    page = None
-            if page:
-                print(
-                    f"Model identifiers: ({len(filtered_result)}, page {page}): {filtered_result}"
-                )
+            result = await self._list_models(page=page, page_size=page_size)
+            if page and page_size and page_size >= len(result):
+                print(f"Model identifiers: ({len(result)}, page {page}): {result}")
             else:
-                print(f"Model identifiers: ({len(filtered_result)}): {filtered_result}")
+                print(f"Model identifiers: ({len(result)}): {result}")
         except Exception as e:
             logger.error(f"Error in listing models. {e}")
         finally:
@@ -407,6 +412,13 @@ async def run_hf_model_download(self, user_repo_quant: str):
         self._cleanup()
 
 
+@app.command()
+def version():
+    """Shows the app version of Ollama downloader."""
+    app_handler = OllamaDownloaderCLIApp()
+    asyncio.run(app_handler.run_version())
+
+
 @app.command()
 def show_config():
     """Shows the application configuration as JSON."""
```
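The new `_version` method builds a short alias from a hyphenated package name: the first word plus the initial of the second (so `ollama-downloader` becomes `ollamad`). A stand-alone sketch of that logic, lifted out of the class for illustration:

```python
def abbreviate(package_name: str) -> str:
    """First word plus the initial of the second, as in cli.py's _version."""
    name_splits = package_name.split("-")
    if len(name_splits) != 2:
        return package_name  # not a two-part name: leave it unchanged
    return f"{name_splits[0]}{name_splits[1][0]}"


print(abbreviate("ollama-downloader"))  # ollamad
print(abbreviate("uv"))                 # uv (no hyphen, returned as-is)
```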

src/ollama_downloader/data/data_models.py

Lines changed: 4 additions & 9 deletions

```diff
@@ -4,9 +4,8 @@
 from pydantic import AfterValidator, BaseModel, Field, HttpUrl
 from typing import Annotated, ClassVar, List, Optional, Tuple
 
-from environs import env
 
-from ollama_downloader.common import EnvVar
+from ollama_downloader import EnvVar
 
 logger = logging.getLogger(__name__)
 
@@ -96,7 +95,7 @@ def __new__(cls: type["AppSettings"]) -> "AppSettings":
 
     @staticmethod
     def load_or_create_default(
-        settings_file: str = EnvVar.DEFAULT__OD_SETTINGS_FILE,
+        settings_file: str = EnvVar.OD_SETTINGS_FILE,
     ) -> "AppSettings | None":
         """
         Load settings from the configuration file, or create default settings if the file does not exist.
@@ -115,9 +114,7 @@ def load_or_create_default(
 
     @staticmethod
     def load_settings(
-        settings_file: str = env.str(
-            EnvVar.OD_SETTINGS_FILE, default=EnvVar.DEFAULT__OD_SETTINGS_FILE
-        ),
+        settings_file: str = EnvVar.OD_SETTINGS_FILE,
     ) -> "AppSettings | None":
         """
         Load settings from the configuration file.
@@ -140,9 +137,7 @@ def load_settings(
     @staticmethod
     def save_settings(
         settings: "AppSettings",
-        settings_file: str = env.str(
-            EnvVar.OD_SETTINGS_FILE, default=EnvVar.DEFAULT__OD_SETTINGS_FILE
-        ),
+        settings_file: str = EnvVar.OD_SETTINGS_FILE,
     ) -> bool:
         """
         Save the application settings to the configuration file.
```
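The `load_or_create_default` flow referenced above (load the settings file if present, otherwise write defaults and return them) can be sketched with stdlib `json` alone — `DEFAULTS` and the file path here are illustrative, not the package's actual settings schema:

```python
import json
import os
import tempfile

# Hypothetical default settings, standing in for AppSettings' defaults.
DEFAULTS = {"registry": "https://registry.ollama.ai", "timeout": 30}


def load_or_create_default(settings_file: str) -> dict:
    """Load JSON settings, creating the file with defaults when it is missing."""
    if os.path.exists(settings_file):
        with open(settings_file) as f:
            return json.load(f)
    os.makedirs(os.path.dirname(settings_file) or ".", exist_ok=True)
    with open(settings_file, "w") as f:
        json.dump(DEFAULTS, f, indent=2)
    return dict(DEFAULTS)


with tempfile.TemporaryDirectory() as tmp:
    path = os.path.join(tmp, "conf", "settings.json")
    first = load_or_create_default(path)   # creates conf/settings.json
    second = load_or_create_default(path)  # loads it back
    print(first == second == DEFAULTS)     # True
```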

src/ollama_downloader/downloader/hf_model_downloader.py

Lines changed: 5 additions & 4 deletions

```diff
@@ -4,10 +4,8 @@
 from urllib.parse import urlparse
 
 
-from environs import env
 from pydantic import Field
 
-from ollama_downloader.common import EnvVar
 from ollama_downloader.data.data_models import ImageManifest
 from ollama_downloader.downloader.model_downloader import ModelDownloader, ModelSource
 
@@ -18,7 +16,6 @@
 
 # Initialize the logger
 logger = logging.getLogger(__name__)
-logger.setLevel(env.str(EnvVar.LOG_LEVEL, default=EnvVar.DEFAULT__LOG_LEVEL).upper())
 
 
 class HuggingFaceModelDownloader(ModelDownloader):
@@ -164,7 +161,10 @@ def list_available_models(
             model_identifiers = [
                 model["modelId"] for model in list(models_response.json())
             ]
-
+            logger.warning(
+                "HuggingFace models are sorted in the context of the selected page only."
+            )
+            model_identifiers.sort(key=lambda s: s.lower())
             return model_identifiers
 
     def list_model_tags(self, model_identifier: str) -> List[str]:
@@ -189,4 +189,5 @@ def list_model_tags(self, model_identifier: str) -> List[str]:
             raise RuntimeError(
                 f"The model {model_identifier} has no support for Ollama."
             )
+        tags.sort(key=lambda s: s.lower())
         return tags
```
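The `sort(key=lambda s: s.lower())` calls added above give a case-insensitive ordering; without the key, Python's default string sort would place all uppercase identifiers before lowercase ones. A quick comparison (the model names are illustrative):

```python
models = ["zephyr", "Llama-3", "aya", "Mistral"]

# Default sort is by code point, so uppercase letters sort first.
print(sorted(models))                 # ['Llama-3', 'Mistral', 'aya', 'zephyr']
# The key used in the diff ignores case, as str.lower does here.
print(sorted(models, key=str.lower))  # ['aya', 'Llama-3', 'Mistral', 'zephyr']
```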
