DeepThought Engine is an experimental Unreal Engine plugin for integrating and running a variety of machine learning model architectures. Inspired by the relationship between HuggingFace Transformers and PyTorch, it aims to support a diverse range of models rather than a single architecture.
- Integrate diverse machine learning model architectures within Unreal Engine.
- Easily interact with Transformer models.
- Utilize the power of LibTorch for deep learning operations.
- Seamlessly tokenize and preprocess data using Tokenizers-UE5.
- Future-proof design with planned support for multiple backends.
To get started with DeepThought Engine, ensure you have the necessary prerequisites and follow the installation instructions.
This plugin requires additional files that are not included in this repository; they can be downloaded from the official PyTorch website.
- OS: Windows - 64 bit
- UE Version: 5.0 - 5.3
- GPU Support: CUDA-only (for GPU acceleration)
- LibTorch-UE5 Plugin: Required (only backend available for now)
- Tokenizers-UE5 Plugin: Optional (for tokenization functionality)
- In your Unreal Engine project, create a `Plugins` folder if it doesn't already exist.
- Clone this repository inside the `Plugins` folder:

```
git clone https://github.com/P1ayer-1/DeepThought-Engine.git
```

DeepThought Engine currently supports LibTorch as the primary backend. Follow these steps to set up LibTorch:
- Navigate to the Releases page.
- Download the source code for a release of the LibTorch-UE5 plugin.
- Extract the source code archive to the `Plugins` folder.
- Download the CPU or GPU LibTorch distribution from the same release.

Alternatively, clone the repository and fetch LibTorch yourself:

- Clone the LibTorch-UE5 repository inside the `Plugins` folder:

```
git clone https://github.com/P1ayer-1/LibTorch-UE5.git
```

- Download the appropriate CPU or GPU LibTorch distribution from the official PyTorch website.
- Extract the `include` and `lib` folders from the downloaded LibTorch archive into `Plugins/LibTorch-UE5/Source/ThirdParty/LibTorch/Win64`.
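If you repeat this setup often, the extraction step can be scripted. The sketch below is an illustration, not part of the plugin: the function name `extract_libtorch` is mine, and it assumes the standard LibTorch zip layout from pytorch.org, where everything sits under a top-level `libtorch/` directory.

```python
import zipfile
from pathlib import Path

def extract_libtorch(archive: str, dest: str) -> None:
    """Copy only the include/ and lib/ trees from a LibTorch zip into dest.

    LibTorch zips from pytorch.org place everything under a top-level
    'libtorch/' directory, which is stripped while extracting.
    """
    dest_path = Path(dest)
    with zipfile.ZipFile(archive) as zf:
        for name in zf.namelist():
            parts = Path(name).parts
            # Keep only libtorch/include/** and libtorch/lib/**
            if len(parts) < 2 or parts[0] != "libtorch" \
                    or parts[1] not in ("include", "lib"):
                continue
            target = dest_path.joinpath(*parts[1:])
            if name.endswith("/"):
                target.mkdir(parents=True, exist_ok=True)
            else:
                target.parent.mkdir(parents=True, exist_ok=True)
                target.write_bytes(zf.read(name))

# Example (archive name and destination are assumptions; adjust to your layout):
# extract_libtorch("libtorch.zip",
#                  "Plugins/LibTorch-UE5/Source/ThirdParty/LibTorch/Win64")
```

Filtering on the first two path components keeps the rest of the distribution (docs, test binaries) out of the plugin's `ThirdParty` folder.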
After you have downloaded the required files, follow the steps below:

- Navigate to `Plugins/LibTorch-UE5/Source/ThirdParty/LibTorch/Win64`.
- Remove the existing `include` folder.
- Extract the `include` and `lib` folders from the downloaded LibTorch release into this folder.
- Open `include/torch/csrc/api/include/torch/enum.h` and find the following line of code:

```cpp
TORCH_API extern const enumtype::k##name k##name;
```

- Remove `TORCH_API extern` from that line.
- Open `include/torch/csrc/api/include/torch/nn/module.h` and find the following comment in the `Module` class:

```cpp
// Friend classes.
```

- Add the following line of code below the comment:

```cpp
friend class IAtumLayer;
```

To set up the optional Tokenizers-UE5 plugin:

- Navigate to the Releases page.
- Download the source code for the release you want to use.
- Extract the downloaded source code into the `Plugins` directory.
- Navigate to `Plugins/Tokenizers-UE5/Source/ThirdParty/TokenizersLibrary/Win64`.
- From the same release page, download `tokenizers_c.lib` and place it inside the `Win64` folder.
- Delete the placeholder file named `PLACE STATIC LIB HERE` from the `Win64` folder.
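The two LibTorch header edits described above (`enum.h` and `module.h`) are easy to get wrong by hand and must be reapplied after every fresh extraction, so you may prefer to script them. This is a minimal sketch; the function names are mine, and the search strings assume the headers contain exactly the lines quoted above.

```python
from pathlib import Path

def patch_enum_h(path: str) -> None:
    """Drop the 'TORCH_API extern' qualifier from the k##name declaration."""
    p = Path(path)
    text = p.read_text()
    old = "TORCH_API extern const enumtype::k##name k##name;"
    new = "const enumtype::k##name k##name;"
    if old in text:
        p.write_text(text.replace(old, new))

def patch_module_h(path: str) -> None:
    """Insert the IAtumLayer friend declaration after '// Friend classes.'."""
    p = Path(path)
    text = p.read_text()
    marker = "// Friend classes."
    # Skip if already patched, so the script is safe to run twice.
    if marker in text and "friend class IAtumLayer;" not in text:
        p.write_text(text.replace(marker, marker + "\n  friend class IAtumLayer;", 1))

# Example usage (the root path is an assumption; adjust to your layout):
# root = "Plugins/LibTorch-UE5/Source/ThirdParty/LibTorch/Win64/include"
# patch_enum_h(f"{root}/torch/csrc/api/include/torch/enum.h")
# patch_module_h(f"{root}/torch/csrc/api/include/torch/nn/module.h")
```

Both functions are no-ops when the expected text is absent, so a changed upstream header fails loudly (nothing is written) rather than being half-patched.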
Disclaimer: The tutorials were published before this plugin was created; they cover only Tokenizers-UE5 and LibTorch-UE5.
Interested in contributing? Great! Check out the contributing guidelines to get started. Contributors are also encouraged to join our community Discord server.
Distributed under the MIT License. See LICENSE for more information.