
Modified the _config_get function to properly handle nested YAML values. #845


Merged — 7 commits, Feb 11, 2025

Conversation

SamuelWN
Contributor

@SamuelWN SamuelWN commented Feb 9, 2025

OmegaConf's `get` function does not appear to handle nested YAML keys as expected.
Instead of interpreting the value to the left of `.` as the parent key, it treats the whole string literally (e.g. it searches for a single key named `chatgpt.chat_system_template`).

The `select` function does handle nested values (and searches top-level keys containing `.` if `_CONFIG_KEY` is empty). However, it will not search upward for the key and instead returns `None`, so `get` is still required.
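To illustrate the lookup behavior described above, here is a minimal pure-Python sketch of a dotted-key config walk. The function name `nested_get` and the sample config are illustrative only, not the project's actual `_config_get` implementation:

```python
# Illustrative sketch of dotted-key config lookup (not the project's
# actual _config_get implementation).
def nested_get(cfg: dict, dotted_key: str, default=None):
    """Walk a nested dict using '.'-separated keys,
    e.g. 'chatgpt.chat_system_template'."""
    node = cfg
    for part in dotted_key.split('.'):
        if isinstance(node, dict) and part in node:
            node = node[part]
        else:
            return default
    return node

cfg = {'chatgpt': {'chat_system_template': 'You are a translator.'}}

# A literal lookup of the full dotted string fails (the behavior described above):
assert cfg.get('chatgpt.chat_system_template') is None

# Splitting on '.' and walking the tree finds the nested value:
assert nested_get(cfg, 'chatgpt.chat_system_template') == 'You are a translator.'
```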

ToDo: Move gpt_config stuff to a parent class

…lues.

@todo: Move config import functions to parent class to consolidate code.
…onfigGPT).

    - Added `rgx_capture` variable to the config file. Providing a regex string with capture group(s) gives a customizable way to filter the GPT response (multiple capture groups are concatenated, if provided). (Default: capture all.)
    - Improved `_config_get` to handle `n`-nested keys (e.g. `ollama.deepseek.rgx_capture`)

Revamping of `ollama.py`
    - Removed esoteric translation-refusal handling and other such fixes.
    - Made child class of ConfigGPT
    - Use rgx_capture to filter output
    - Added `OLLAMA_MODEL_CONF` variable to imported keys
        - Because `ollama` is an interface rather than a specific model, different config options are desirable depending on the model used. This additional key lets one specify a subset of configs (e.g. `ollama.deepseek`, `ollama.llama`, etc.)
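The `rgx_capture` filtering described above can be sketched as follows. The function name `filter_response` and the sample pattern are illustrative assumptions, not the PR's exact code; the sketch only shows the idea of applying a user-supplied regex and concatenating its capture groups:

```python
import re

# Sketch of the `rgx_capture` idea: apply a configurable regex to the raw
# model output and concatenate all capture groups (names are illustrative).
def filter_response(raw: str, rgx_capture: str = r'(.*)') -> str:
    match = re.search(rgx_capture, raw, flags=re.DOTALL)
    if match is None:
        return raw  # no match: fall back to the unfiltered response
    groups = match.groups()
    # Concatenate multiple capture groups if provided; the default
    # pattern has a single group capturing everything.
    return ''.join(g for g in groups if g is not None) if groups else match.group(0)

# e.g. for a reasoning model, keep only the text after its think block:
raw = '<think>chain of thought...</think>Hello world'
assert filter_response(raw, r'</think>(.*)') == 'Hello world'
assert filter_response('plain text') == 'plain text'  # default: capture all
```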
@zyddnys zyddnys merged commit 4a6e26c into zyddnys:main Feb 11, 2025
0 of 2 checks passed
@popcion
Contributor

popcion commented Apr 4, 2025

@SamuelWN
After this PR was merged, every subsequent version prints "INFO: Could not find files for the given pattern(s)" in server mode, though it doesn't affect the actual running of the program. Do you have any idea what causes this?

@SamuelWN
Contributor Author

SamuelWN commented Apr 4, 2025

@popcion I have never actually used the web mode and have never gotten that message when using the command line.

I searched for that error message, and the most likely culprit I found was:

    def get_translator(key: Translator, *args, **kwargs) -> CommonTranslator:
        if key not in TRANSLATORS:
            raise ValueError(f'Could not find translator for: "{key}". Choose from the following: %s' % ','.join(TRANSLATORS))

Though it was introduced only in a later merge (not this one), I could imagine that warning being triggered if the user passes openai as the translator (rather than "chatgpt"). While the alias was intended as a convenient way to allow alternate names, I could imagine it confusing other code:

    # Map 'openai' and any translator starting with 'gpt'* to 'chatgpt'
    @classmethod
    def _missing_(cls, value):
        if value.startswith('gpt') or value == 'openai':
            return cls.chatgpt
        raise ValueError(f"{value} is not a valid {cls.__name__}")

If changing the translator to chatgpt gets rid of the warning, then that is probably the cause.
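For context, Python's `Enum._missing_` hook is invoked when a lookup value matches no member, which is how the alias mapping above works. Here is a self-contained toy reproduction (the member names and values are illustrative, not the project's full `Translator` enum):

```python
from enum import Enum

# Toy reproduction of the alias mapping via Enum._missing_
# (member names/values are illustrative only).
class Translator(Enum):
    chatgpt = 'chatgpt'
    ollama = 'ollama'

    @classmethod
    def _missing_(cls, value):
        # Called only when `value` matches no member; map 'openai'
        # and any 'gpt*' name to the chatgpt member.
        if isinstance(value, str) and (value.startswith('gpt') or value == 'openai'):
            return cls.chatgpt
        raise ValueError(f"{value} is not a valid {cls.__name__}")

assert Translator('openai') is Translator.chatgpt
assert Translator('gpt-4') is Translator.chatgpt
assert Translator('ollama') is Translator.ollama
```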

@popcion
Contributor

popcion commented Apr 4, 2025

Thank you for your analysis, but after trying that, the INFO message still occurs. I also saw this issue reported on a Discord server, and it still exists.
Before this PR was merged:
[screenshot]

After merging:
[screenshot]

However, when I tested with Python 3.10 on Linux, the message didn't show up. I'm using Python 3.11.4 on Windows, and I'm not sure yet whether it's a version issue.
