Releases: stratusadv/dandy
v2.2.0
Features
- Diligence System (Experimental)
  - The `llm` service class has a new sub service called `diligence`.
  - You can activate different diligence systems through `Bot().llm.diligence`.
  - Note: diligence can have both positive and negative effects and should be used with caution.
  - Example: `Bot.llm.diligence.stop_word_removal.activate()` will activate stop word removal in the LLM connector.
  - Other diligence systems include `Bot.llm.diligence.vowel_removal` and `Bot.llm.diligence.second_pass`.
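To make the stop-word-removal idea concrete, here is a toy illustration of what such a diligence pass might do to a prompt before it reaches the connector. The word list and function below are hypothetical stand-ins, not Dandy's implementation, which lives inside its LLM connector:

```python
# Hypothetical sketch of a stop-word-removal diligence pass.
# Dandy's actual word list and behavior may differ.
STOP_WORDS = {"the", "a", "an", "of", "to", "is"}

def remove_stop_words(prompt: str) -> str:
    # Drop common filler words to shrink the prompt sent to the model.
    return " ".join(w for w in prompt.split() if w.lower() not in STOP_WORDS)

print(remove_stop_words("Summarize the contents of the report"))
# -> Summarize contents report
```

Passes like this trade token savings against possible meaning loss, which is presumably why the notes warn that diligence can have both positive and negative effects.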
v2.1.0
Features
- CLI !!!
  - Use `dandy` to access the new command line interface.
  - All commands have basic help provided, with autocomplete.
  - The default (no command) input will answer questions about the CLI.
- Diligence System (Experimental)
  - The `Bot` class has a new attribute called `diligence`, which defaults to `1.0`.
  - You can adjust `diligence` between `0.0` (almost no effort) and `2.0` (maximum effort).
  - Adjusting the diligence level allows you to control processing with any LLM.
  - This feature is experimental and works with any LLM model.
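The notes state that `diligence` defaults to `1.0` and is valid between `0.0` and `2.0`. A minimal sketch of validating a value against that documented range (the helper below is hypothetical, not part of Dandy's API):

```python
def validate_diligence(value: float) -> float:
    # Enforce the documented range: 0.0 (almost no effort)
    # through 2.0 (maximum effort); 1.0 is the documented default.
    if not 0.0 <= value <= 2.0:
        raise ValueError(f"diligence must be between 0.0 and 2.0, got {value}")
    return value

print(validate_diligence(1.0))  # the default -> 1.0
```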
Fixes
- Updated `dandy.conf.settings` to be much more flexible with different use cases and systems.
- Lots of refactoring, typing, and code cleanup for the new `ruff` and `ty` configuration.
v2.0.0
Major Release
- This is a major release, and we highly recommend reading our documentation before upgrading.
Breaking
- Ollama API support has been removed from this project; Dandy now defaults to the OpenAI API standard.
- Since Ollama supports the OpenAI API standard, you can continue to use Dandy with Ollama.
- Removed the `calculator` module.
- Removed the `processor` module and `BaseProcessor` class; `Bot` is now the primary entry point.
- Adapted `Decoder` from a standalone processor into a service usable via the `Bot` module: `Bot().llm.decoder`.
- Removed the `Agent` module.
- Removed all `LLM_DEFAULT_*` settings; they must now be set inside `LLM_CONFIGS` for each model.
  - By default, it uses the defaults on the LLM endpoint.
- All exceptions that were suffixed `Exception` are now suffixed `Error`.
  - Example: `DandyCriticalException` is now `DandyCriticalError`.
- The `example` project has moved to `tests.example_project` and has been added as a required test.
- Removed the `PromptOrStr` and `PromptOrStrOrNone` type aliases.
- Removed the `toolbox` module (functionality replaced by the new CLI).
- Removed the `makefile`.
- All `Prompt` methods have had the argument `triple_quote` changed to `triple_backtick`.
- All attributes on `Bot` that were prefixed with `llm_` have had their prefix removed (except `llm_config`).
  - Example: `Bot().llm_task` is now `Bot().task`.
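If downstream code still catches the old `Exception`-suffixed names, one migration option is a temporary alias shim. The shim below is a suggestion for your own code, not something Dandy ships:

```python
# Hypothetical compatibility shim for the Exception -> Error rename.
# In real code you would import DandyCriticalError from dandy instead
# of defining a stand-in like this.
class DandyCriticalError(Exception):
    pass

# Keep the old name as an alias so legacy `except` clauses keep working
# while you migrate call sites.
DandyCriticalException = DandyCriticalError

try:
    raise DandyCriticalError("boom")
except DandyCriticalException:
    print("caught via legacy alias")
```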
Changes
- All options in `LLM_CONFIGS` now need to be inside an `OPTIONS` key and are set as lowercase keys.
  - Example: `OPTIONS: {'temperature': 1.4, 'top_p': 0.7, 'frequency_penalty': 0.2, 'presence_penalty': 0.1}`.
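Put together, an `LLM_CONFIGS` entry following the new layout might look like the sketch below. Only the `OPTIONS` contents come from the example above; the `"DEFAULT"` config name is a placeholder:

```python
# Sketch of the new LLM_CONFIGS layout: model options now live under
# an OPTIONS key, with option names in lowercase. "DEFAULT" is a
# placeholder config name, not a documented requirement.
LLM_CONFIGS = {
    "DEFAULT": {
        "OPTIONS": {
            "temperature": 1.4,
            "top_p": 0.7,
            "frequency_penalty": 0.2,
            "presence_penalty": 0.1,
        },
    },
}

# All option keys are lowercase, per the release notes.
assert all(key.islower() for key in LLM_CONFIGS["DEFAULT"]["OPTIONS"])
```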
Features
- `FileService` is now available on `Bot` via `Bot().file` to make it easy to manipulate and work with files.
- `HttpService` is now available on `Bot` via `Bot().http` for making HTTP requests.
- `IntelService` is now available on `Bot` via `Bot().intel` for manipulating `Intel` classes and objects.
- The `Bot().llm.prompt_to_intel` method now supports Vision and Audio.
- The command line interface is back and better than ever; check it out by typing `dandy` in your terminal.
  - Type into the input to learn more about the features.
  - Use `/` to run a command.
- `BaseListIntel` is a new base class for creating Intel objects that primarily wrap a list of items, providing list-like access.
- `DefaultIntel` is a simple Intel class with a `text` field for quick use.
- `BaseIntel` has new convenience methods, `save_to_file` and `create_from_file`, for easy long-term storage.
- Configuring LLM options can now be done through `Bot().llm.options` and `Bot().llm.decoder.options`.
  - Example: `new_bot = Bot() ... new_bot.llm.options.temperature = 1.4`
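The `save_to_file` / `create_from_file` pair suggests a simple serialize-and-restore round trip. Here is a stand-in sketch of that pattern in plain Python; this is not Dandy's `BaseIntel`, whose field handling and serialization may differ:

```python
import json
from dataclasses import dataclass, asdict

# Stand-in illustrating the save_to_file / create_from_file round trip
# described above; Dandy's BaseIntel may serialize differently.
@dataclass
class TextIntel:
    text: str

    def save_to_file(self, path: str) -> None:
        # Persist the object's fields as JSON for long-term storage.
        with open(path, "w") as f:
            json.dump(asdict(self), f)

    @classmethod
    def create_from_file(cls, path: str) -> "TextIntel":
        # Rebuild an instance from a previously saved JSON file.
        with open(path) as f:
            return cls(**json.load(f))

intel = TextIntel(text="hello")
intel.save_to_file("intel.json")
print(TextIntel.create_from_file("intel.json").text)  # -> hello
```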
Fixes
- Fixed a bug with the `directory_list` method on the `Prompt` class when a file has no extension.
- `dandy.llm.conf.LlmConfigs` are now checked during usage to allow for better control when loading environments.
- `dandy.conf.settings` now reloads its current state at the time of attribute access instead of once during init.
- Fixed many issues with customizing all `Service` and `Processor` subclasses.
- Decoupled many of the LLM modules, using the new `LlmConnector`, to allow for better maintainability and testing.
- Added `2026_roadmap.md` to track future development plans.
- Reorganized the project structure to be more service-oriented.
v2.0.0.alpha.0
Major Release
- This is a major release, and we highly recommend reading our documentation before upgrading.
Breaking
- Ollama API support has been removed from this project; Dandy now defaults to the OpenAI API standard.
- Since Ollama supports the OpenAI API standard, you can continue to use Dandy with Ollama.
- Removed the `calculator` module.
- Removed the `processor` module and `BaseProcessor` class; `Bot` is now the primary entry point.
- Adapted `Decoder` from a standalone processor into a service usable via the `Bot` module: `Bot().llm.decoder`.
- Removed the `Agent` module.
- Removed all `LLM_DEFAULT_*` settings; they must now be set inside `LLM_CONFIGS` for each model.
  - By default, it uses the defaults on the LLM endpoint.
- All exceptions that were suffixed `Exception` are now suffixed `Error`.
  - Example: `DandyCriticalException` is now `DandyCriticalError`.
- The example project has been removed.
- Removed the `PromptOrStr` and `PromptOrStrOrNone` type aliases.
- Removed the `toolbox` module (functionality replaced by the new CLI).
- Removed the `makefile`.
- All `Prompt` methods have had the argument `triple_quote` changed to `triple_backtick`.
Changes
- All options in `LLM_CONFIGS` now need to be inside an `OPTIONS` key and are set as lowercase keys.
  - Example: `OPTIONS: {'temperature': 1.4, 'top_p': 0.7, 'frequency_penalty': 0.2, 'presence_penalty': 0.1}`.
Features
- `FileService` is now available on `Bot` via `Bot().file` to make it easy to manipulate and work with files.
- `HttpService` is now available on `Bot` via `Bot().http` for making HTTP requests.
- `IntelService` is now available on `Bot` via `Bot().intel` for manipulating `Intel` classes and objects.
- The `Bot().llm.prompt_to_intel` method now supports Vision and Audio.
- The command line interface is back and better than ever; check it out by typing `dandy` in your terminal.
  - Type into the input to learn more about the features.
  - Use `/` to run a command.
- `BaseListIntel` is a new base class for creating Intel objects that primarily wrap a list of items, providing list-like access.
- `DefaultIntel` is a simple Intel class with a `text` field for quick use.
- `BaseIntel` has new convenience methods, `save_to_file` and `create_from_file`, for easy long-term storage.
- Configuring LLM options can now be done through `Bot().llm.options` and `Bot().llm.decoder.options`.
  - Example: `new_bot = Bot() ... new_bot.llm.options.temperature = 1.4`
Fixes
- Fixed a bug with the `directory_list` method on the `Prompt` class when a file has no extension.
- `dandy.llm.conf.LlmConfigs` are now checked during usage to allow for better control when loading environments.
- `dandy.conf.settings` now reloads its current state at the time of attribute access instead of once during init.
- Fixed many issues with customizing all `Service` and `Processor` subclasses.
- Decoupled many of the LLM modules, using the new `LlmConnector`, to allow for better maintainability and testing.
- Added `2026_roadmap.md` to track future development plans.
- Reorganized the project structure to be more service-oriented.
v1.3.5
Fixes
- Fixed the `Decoder` prompt and intel classes to properly match each other.
v1.3.4
Fixes
- Cleaned up the default `llm_role` and `llm_task` instructions to be more generic and less intrusive.
- Cleaned up the `Decoder` system prompt to be more accurate.
- Cleaned up the main LLM service system prompt to improve results across Dandy.
- Added the missing line break when adding a `Prompt` to a `Prompt` for better formatting.
- Changed the formatting on the `Decoder` prompt to produce better results when there is no `max_return_values`.
- Fixed a bug with the decoder not rendering recorder events properly.
- Corrected the recorder failing to render the `OpenaiRequestMessage` structure.
- Fixed a problem with setting custom attributes when instantiating processors.
v1.3.3
Fixes
- Fixed a bug causing `system_override_prompt` to not be handled properly in the LLM service.
v1.3.2
Fixes
- Fixed `system_prompt_override` to properly override the system prompt.
- Fixed the `Decoder` prompt to properly show plurals when there is no key limit.
v1.3.1
Fixes
- Fixed an order-of-operations mistake with generating the LLM service system prompt.
v1.3.0
Breaking
- The service method `llm.prompt_to_intel` no longer takes `postfix_system_prompt` as an argument, as it was redundant.
Features
- All `Processor` classes (`Bot`, `Agent`, and `Decoder`) now have a `reset_services` method.
  - This provides an easy and lightweight way to reset any persistent data in all of the processor's services.
- All `Service` classes (`Llm`, `Http`, `Intel`) now have a `reset_service` method.
  - This is for finer control over persistent data.
  - There are other, more fine-grained reset methods; we recommend checking out the API reference.
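The relationship between `reset_services` and `reset_service` can be sketched as a simple delegation pattern. The classes below are hypothetical stand-ins, not Dandy's actual `Processor` or `Service` code:

```python
# Hypothetical stand-ins illustrating the reset pattern described above:
# the processor-level reset simply delegates to each service's reset.
class Service:
    def __init__(self) -> None:
        self.state: list = []  # persistent data, e.g. message history

    def reset_service(self) -> None:
        # Fine-grained reset of this one service's persistent data.
        self.state.clear()

class Processor:
    def __init__(self) -> None:
        self.llm = Service()
        self.http = Service()
        self.intel = Service()

    def reset_services(self) -> None:
        # Lightweight reset of persistent data in all services at once.
        for service in (self.llm, self.http, self.intel):
            service.reset_service()

bot = Processor()
bot.llm.state.append("message")
bot.reset_services()
print(bot.llm.state)  # -> []
```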
Fixes
- Fixed the signatures of the `__init__` method on `Agent`, `Bot`, and `Decoder`.
- Corrected the order of operations for when message history adds responses in the LLM service.