
Releases: stratusadv/dandy

v2.2.0

07 Mar 04:47


Features

  • Diligence System (Experimental)
    • The llm service class has a new sub-service called diligence.
    • You can activate different diligence systems through Bot().llm.diligence.
      • Note: diligence can have both positive and negative effects and should be used with caution.
      • Example: Bot().llm.diligence.stop_word_removal.activate() will activate stop word removal in the LLM connector.
      • Other available systems include Bot().llm.diligence.vowel_removal and Bot().llm.diligence.second_pass.
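Toggling one of these diligence systems might look like the sketch below. Only the attribute and method names (Bot().llm.diligence.<system>.activate()) come from the notes above; the stub classes are stand-ins that mimic the documented call shape, not dandy's implementation.

```python
# Stand-in classes mimicking the documented call shape
# Bot().llm.diligence.<system>.activate() -- NOT dandy's code,
# only an illustration of the toggle pattern.

class DiligenceSystem:
    def __init__(self, name: str):
        self.name = name
        self.active = False

    def activate(self) -> None:
        self.active = True

    def deactivate(self) -> None:
        self.active = False


class Diligence:
    def __init__(self):
        # The three systems named in the release notes.
        self.stop_word_removal = DiligenceSystem("stop_word_removal")
        self.vowel_removal = DiligenceSystem("vowel_removal")
        self.second_pass = DiligenceSystem("second_pass")


class Llm:
    def __init__(self):
        self.diligence = Diligence()


class Bot:
    def __init__(self):
        self.llm = Llm()


bot = Bot()
bot.llm.diligence.stop_word_removal.activate()
assert bot.llm.diligence.stop_word_removal.active
assert not bot.llm.diligence.vowel_removal.active  # others stay off
```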

v2.1.0

04 Mar 15:27


Features

  • CLI !!!
    • Use dandy to access the new command line interface.
    • All commands have basic help and autocomplete.
    • The default (no command) mode will answer questions about the CLI.
  • Diligence System (Experimental)
    • The Bot class has a new attribute called diligence, which defaults to 1.0.
    • You can adjust diligence between 0.0 (almost no effort) and 2.0 (maximum effort).
    • Adjusting the diligence level lets you control how much processing effort any LLM model puts into a request.
    • This feature is experimental.
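A sketch of how this attribute might be used is shown below. Only the diligence name, its 1.0 default, and the 0.0–2.0 range come from the notes; the stub class and the range check are assumptions, not dandy's implementation.

```python
# Illustrative stub of the documented diligence attribute: defaults
# to 1.0 and is meaningful between 0.0 (almost no effort) and 2.0
# (maximum effort). The validation below is an assumption.

class Bot:
    def __init__(self, diligence: float = 1.0):
        if not 0.0 <= diligence <= 2.0:
            raise ValueError("diligence must be between 0.0 and 2.0")
        self.diligence = diligence


careful_bot = Bot(diligence=1.8)   # near-maximum effort
quick_bot = Bot(diligence=0.2)     # minimal effort
default_bot = Bot()                # defaults to 1.0
```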

Fixes

  • Updated dandy.conf.settings to be much more flexible across different use cases and systems.
  • Lots of refactoring, typing, and code cleanup to match the new ruff and ty configuration.

v2.0.0

09 Feb 23:42


Major Release

  • This is a major release, and we highly recommend reading our documentation before upgrading.

Breaking

  • Ollama API support has been removed from this project, which now defaults to the OpenAI API standard.
  • Since Ollama supports the OpenAI API standard, you can continue to use Dandy with Ollama.
  • Removed calculator module.
  • Removed processor module and BaseProcessor class; Bot is now the primary entry point.
  • Adapted Decoder from standalone processor into a service usable via the Bot module.
    • Bot().llm.decoder
  • Removed Agent module.
  • Removed all LLM_DEFAULT_* settings; these values must now be set inside LLM_CONFIGS for each model.
    • When unset, the defaults of the LLM endpoint are used.
  • All exceptions that were suffixed Exception are now suffixed Error.
    • Example: DandyCriticalException is now DandyCriticalError.
  • The example project has moved to tests.example_project and has been added as a required test.
  • Removed the PromptOrStr and PromptOrStrOrNone type aliases.
  • Removed toolbox module (functionality replaced by the new CLI).
  • Removed makefile.
  • All Prompt methods have had the argument triple_quote changed to triple_backtick.
  • All attributes on Bot that were prefixed with llm_ have had their prefix removed (except llm_config).
    • Example: Bot().llm_task is now Bot().task
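For codebases that still catch the old *Exception names, a temporary compatibility shim like the sketch below can ease the rename. Only DandyCriticalError follows the example in the notes; the shim itself is a suggestion, and the class defined here is a stand-in, not dandy's real exception.

```python
# Migration shim: alias the old *Exception name to the new *Error
# class so existing except-clauses keep working during a transition.
# DandyCriticalError below is a stand-in for dandy's real class.

class DandyCriticalError(Exception):
    """Stand-in for the renamed dandy exception class."""


# Old name kept as an alias; remove once all call sites are updated.
DandyCriticalException = DandyCriticalError

try:
    raise DandyCriticalError("something went wrong")
except DandyCriticalException:
    # An old-style handler still catches the renamed class.
    caught = True
```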

Changes

  • All options in LLM_CONFIGS must now be nested inside an OPTIONS key and use lowercase keys.
    • Example: OPTIONS: {'temperature': 1.4, 'top_p': 0.7, 'frequency_penalty': 0.2, 'presence_penalty': 0.1}.
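A full settings entry following the new layout might look like the fragment below. The OPTIONS values are taken from the example above; the surrounding DEFAULT, HOST, PORT, and MODEL keys and their values are placeholders for illustration, not taken from the release.

```python
# Hypothetical LLM_CONFIGS entry showing the new nested OPTIONS key
# with lowercase option names (non-OPTIONS keys and all values are
# placeholders, not from the release notes).
LLM_CONFIGS = {
    'DEFAULT': {
        'HOST': 'http://localhost',   # placeholder endpoint
        'PORT': 11434,                # placeholder port
        'MODEL': 'llama3.1:8b',       # placeholder model name
        'OPTIONS': {
            'temperature': 1.4,
            'top_p': 0.7,
            'frequency_penalty': 0.2,
            'presence_penalty': 0.1,
        },
    },
}
```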

Features

  • FileService is now available on Bot via Bot().file to make it easy to manipulate and work with files.
  • HttpService is now available on Bot via Bot().http for making HTTP requests.
  • IntelService is now available on Bot via Bot().intel for manipulating Intel classes and objects.
  • The Bot().llm.prompt_to_intel method now supports Vision and Audio.
  • The command line interface is back and better than ever, check it out by typing dandy in your terminal.
    • Type into the input to learn more about the features.
    • Use / to run a command.
  • BaseListIntel is a new base class for creating Intel objects that primarily wrap a list of items, providing list-like access.
  • DefaultIntel is a simple Intel class with a text field for quick use.
  • The BaseIntel has some new convenience methods save_to_file and create_from_file for easy long-term storage.
  • Configuring LLM options can now be done through Bot().llm.options and Bot().llm.decoder.options.
    • Example: new_bot = Bot() ... new_bot.llm.options.temperature = 1.4
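The save_to_file and create_from_file round trip mentioned above might look like the sketch below. Only the method names and the DefaultIntel text field come from the notes; the JSON-backed stand-in class is an illustration of the round trip, not dandy's implementation.

```python
import json
import tempfile
from pathlib import Path


# Stand-in mimicking the documented BaseIntel convenience methods
# save_to_file / create_from_file; NOT dandy's implementation.
class DefaultIntel:
    def __init__(self, text: str = ""):
        self.text = text

    def save_to_file(self, path: str) -> None:
        # Persist the intel's fields as JSON for long-term storage.
        Path(path).write_text(json.dumps({"text": self.text}))

    @classmethod
    def create_from_file(cls, path: str) -> "DefaultIntel":
        # Rebuild the intel object from a previously saved file.
        return cls(**json.loads(Path(path).read_text()))


with tempfile.TemporaryDirectory() as tmp:
    path = str(Path(tmp) / "intel.json")
    intel = DefaultIntel(text="remember this")
    intel.save_to_file(path)
    restored = DefaultIntel.create_from_file(path)
```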

Fixes

  • Fixed a bug with the directory_list method on the Prompt class when a file has no extension.
  • dandy.llm.conf.LlmConfigs are now checked during usage to allow better control when loading environments.
  • dandy.conf.settings now reloads its current state at the time of attribute access instead of once during init.
  • Fixed many issues with customizing all Service and Processor subclasses.
  • Decoupled many of the LLM modules using the new LlmConnector to allow for better maintainability and testing.
  • Added 2026_roadmap.md to track future development plans.
  • Reorganized project structure to be more service-oriented.

v2.0.0.alpha.0

02 Feb 05:14


Pre-release

Major Release

  • This is a major release, and we highly recommend reading our documentation before upgrading.

Breaking

  • Ollama API support has been removed from this project, which now defaults to the OpenAI API standard.
  • Since Ollama supports the OpenAI API standard, you can continue to use Dandy with Ollama.
  • Removed calculator module.
  • Removed processor module and BaseProcessor class; Bot is now the primary entry point.
  • Adapted Decoder from standalone processor into a service usable via the Bot module.
    • Bot().llm.decoder
  • Removed Agent module.
  • Removed all LLM_DEFAULT_* settings; these values must now be set inside LLM_CONFIGS for each model.
    • When unset, the defaults of the LLM endpoint are used.
  • All exceptions that were suffixed Exception are now suffixed Error.
    • Example: DandyCriticalException is now DandyCriticalError.
  • The example project has been removed.
  • Removed the PromptOrStr and PromptOrStrOrNone type aliases.
  • Removed toolbox module (functionality replaced by the new CLI).
  • Removed makefile.
  • All Prompt methods have had the argument triple_quote changed to triple_backtick.

Changes

  • All options in LLM_CONFIGS must now be nested inside an OPTIONS key and use lowercase keys.
    • Example: OPTIONS: {'temperature': 1.4, 'top_p': 0.7, 'frequency_penalty': 0.2, 'presence_penalty': 0.1}.

Features

  • FileService is now available on Bot via Bot().file to make it easy to manipulate and work with files.
  • HttpService is now available on Bot via Bot().http for making HTTP requests.
  • IntelService is now available on Bot via Bot().intel for manipulating Intel classes and objects.
  • The Bot().llm.prompt_to_intel method now supports Vision and Audio.
  • The command line interface is back and better than ever, check it out by typing dandy in your terminal.
    • Type into the input to learn more about the features.
    • Use / to run a command.
  • BaseListIntel is a new base class for creating Intel objects that primarily wrap a list of items, providing list-like access.
  • DefaultIntel is a simple Intel class with a text field for quick use.
  • The BaseIntel has some new convenience methods save_to_file and create_from_file for easy long-term storage.
  • Configuring LLM options can now be done through Bot().llm.options and Bot().llm.decoder.options.
    • Example: new_bot = Bot() ... new_bot.llm.options.temperature = 1.4

Fixes

  • Fixed a bug with the directory_list method on the Prompt class when a file has no extension.
  • dandy.llm.conf.LlmConfigs are now checked during usage to allow better control when loading environments.
  • dandy.conf.settings now reloads its current state at the time of attribute access instead of once during init.
  • Fixed many issues with customizing all Service and Processor subclasses.
  • Decoupled many of the LLM modules using the new LlmConnector to allow for better maintainability and testing.
  • Added 2026_roadmap.md to track future development plans.
  • Reorganized project structure to be more service-oriented.

v1.3.5

17 Nov 04:53


Fixes

  • Fixed the Decoder prompt and intel classes to properly match each other.

v1.3.4

16 Nov 17:52


Fixes

  • Cleaned up the default llm_role and llm_task instructions to be more generic and less intrusive.
  • Cleaned up the Decoder system prompt to be more accurate.
  • Cleaned the main llm service system prompt to improve results across Dandy.
  • Added the missing line break when adding a Prompt to a Prompt for better formatting.
  • Changed the formatting of the Decoder prompt to produce better results when there is no max_return_values.
  • Fixed a bug with decoder not rendering recorder events properly.
  • Corrected recorder failing to render OpenaiRequestMessage structure.
  • Fixed the problem with setting custom attributes on instantiating processors.

v1.3.3

12 Nov 18:29


Fixes

  • Fixed bug causing system_override_prompt to not be handled properly in the llm service.

v1.3.2

10 Nov 23:21


Fixes

  • Fixed the system_prompt_override to properly override the system prompt.
  • Fixed the Decoder prompt to properly show plural when there is no key limit.

v1.3.1

07 Nov 03:52


Fixes

  • Fixed an order-of-operations mistake when generating the llm service system prompt.

v1.3.0

03 Nov 21:30


Breaking

  • Service method llm.prompt_to_intel no longer takes postfix_system_prompt as an argument, as it is redundant.

Features

  • All Processor classes (Bot, Agent, and Decoder) now have a reset_services method.
    • This provides an easy and lightweight way to reset any persistent data in all the processor's services.
  • All Service classes (Llm, Http, and Intel) now have a reset_service method.
    • This allows finer control over persistent data.
    • There are other, more fine-grained reset methods; we recommend checking out the API reference.
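The relationship between the two methods can be sketched as follows. Only the reset_services and reset_service names come from the notes; the stand-in classes and the idea that the processor-level call simply delegates to each service are illustrative assumptions, not dandy's code.

```python
# Sketch of the reset pattern: a processor-level reset_services call
# delegating to each attached service's reset_service. Stand-in
# classes only; NOT dandy's implementation.

class LlmService:
    def __init__(self):
        self.message_history: list[str] = []

    def reset_service(self) -> None:
        # Fine-grained reset of this service's persistent data.
        self.message_history.clear()


class Bot:
    def __init__(self):
        self.llm = LlmService()

    def reset_services(self) -> None:
        # Lightweight reset of every persistent service at once.
        self.llm.reset_service()


bot = Bot()
bot.llm.message_history.append("hello")
bot.reset_services()
```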

Fixes

  • Fixed the signatures of the __init__ method on Agent, Bot, and Decoder.
  • Corrected the order of operations for when message history adds responses in the llm service.