Modified the _config_get function to properly handle nested YAML values. #845

Merged: 7 commits merged into zyddnys:main on Feb 11, 2025

Conversation

@SamuelWN (Contributor) commented Feb 9, 2025

OmegaConf's `get` function does not appear to handle nested YAML values as expected.
Instead of interpreting the value to the left of `.` as the parent key, it seems to interpret the key literally (e.g. it searches for a key named `chatgpt.chat_system_template`).

The `select` function is able to handle nested values (and searches top-level keys with `.` if `_CONFIG_KEY` is empty). However, it will not search upward for the key and instead returns `None`, so `get` is still required.
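A minimal sketch of the behaviour described above (OmegaConf is the actual library; the config contents here are illustrative):

```python
from omegaconf import OmegaConf

cfg = OmegaConf.create({"chatgpt": {"chat_system_template": "You are a translator."}})

# DictConfig.get() treats the whole string as a literal key name, so this misses:
print(cfg.get("chatgpt.chat_system_template"))                # -> None

# OmegaConf.select() interprets the dots as nesting and finds the value:
print(OmegaConf.select(cfg, "chatgpt.chat_system_template"))  # -> You are a translator.

# ...but select() returns None when the key is absent rather than falling back
# to a parent/top-level key, which is why a get-style wrapper is still needed:
print(OmegaConf.select(cfg, "ollama.chat_system_template"))   # -> None
```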

To do: move `gpt_config` handling to a parent class

…lues.

@todo: Move config import functions to parent class to consolidate code.
…onfigGPT).

    - Added a `rgx_capture` variable to the config file. Providing a regex string with capture group(s) gives a customizable way to filter the GPT response (multiple capture groups are concatenated, if provided; default: capture everything). See the sketch after this list.
    - Improved `_config_get` to handle arbitrarily nested keys (e.g. `ollama.deepseek.rgx_capture`).
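As a rough illustration of the `rgx_capture` idea (the helper name and the example pattern are hypothetical, not the PR's code), a capture-group regex is applied to the raw response and all capture groups are concatenated, with a capture-everything default:

```python
import re

def filter_response(response: str, rgx_capture: str = r"(.*)") -> str:
    # Hypothetical helper: apply the configured regex and join all capture
    # groups; with the default pattern the response passes through unchanged.
    match = re.search(rgx_capture, response, flags=re.DOTALL)
    if match is None:
        return response
    return "".join(group for group in match.groups() if group)

# e.g. keep only what follows a reasoning model's </think> tag:
print(filter_response("<think>...</think>Translated text",
                      rgx_capture=r"</think>\s*(.*)"))  # -> Translated text
```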

Revamp of `ollama.py`:
    - Removed esoteric translation-refusal handling and similar special-case fixes.
    - Made it a child class of `ConfigGPT`.
    - Uses `rgx_capture` to filter output.
    - Added an `OLLAMA_MODEL_CONF` variable to the imported keys.
        - Because `ollama` is an interface rather than a specific model, different config options are desirable depending on the model used. This additional key lets one specify a sub-set of configs (e.g. `ollama.deepseek`, `ollama.llama`, etc.); see the sketch below.
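A sketch of how such a model-specific sub-config could be looked up (the lookup order is an assumption based on the description above, and the config contents are illustrative):

```python
from omegaconf import OmegaConf

cfg = OmegaConf.create({
    "ollama": {
        "chat_system_template": "Generic template",
        "deepseek": {"rgx_capture": r"</think>\s*(.*)"},
    }
})

model_conf = "ollama.deepseek"  # the sub-config that OLLAMA_MODEL_CONF would point at

# Prefer the model-specific key, then fall back to the plain interface key:
value = (OmegaConf.select(cfg, f"{model_conf}.rgx_capture")
         or OmegaConf.select(cfg, "ollama.rgx_capture"))
print(value)  # -> </think>\s*(.*)
```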
@zyddnys zyddnys merged commit 4a6e26c into zyddnys:main Feb 11, 2025
0 of 2 checks passed