DeepThought Engine

Driving Open Source Innovation in Game Development with Advanced AI Integration


Table of Contents

  1. About the Plugin
  2. Getting Started
  3. Guides and Tutorials
  4. Contributing
  5. License

About the Plugin

DeepThought Engine is an experimental Unreal Engine plugin designed to facilitate the integration and use of various machine learning model architectures. Inspired by the relationship between HuggingFace Transformers and PyTorch, DeepThought Engine aims to go further by supporting a diverse range of models and, in time, multiple backends.

Features

  • Integrate diverse machine learning model architectures within Unreal Engine.
  • Easily interact with Transformer models.
  • Utilize the power of LibTorch for deep learning operations.
  • Seamlessly tokenize and preprocess data using Tokenizers-UE5.
  • Future-proof design with planned support for multiple backends.

(back to top)

Getting Started

To get started with DeepThought Engine, ensure you have the necessary prerequisites and follow the installation instructions.

Prerequisites

This plugin depends on additional files that are not included in this repository; they can be downloaded from the official PyTorch website.

  • OS: Windows (64-bit)
  • UE Version: 5.0 - 5.3
  • GPU Support: CUDA-only (for GPU acceleration)
  • LibTorch-UE5 Plugin: Required (only backend available for now)
  • Tokenizers-UE5 Plugin: Optional (for tokenization functionality)

Installation

  1. In your Unreal Engine project, create a Plugins folder if it doesn't already exist.
  2. Clone this repository inside the Plugins folder:
git clone https://github.com/P1ayer-1/DeepThought-Engine.git

Backend Setup

DeepThought Engine currently supports LibTorch as the primary backend. Follow these steps to set up LibTorch:

Method 1: LibTorch-UE5 Releases

  1. Navigate to the LibTorch-UE5 Releases page.
  2. Download the source code for a release of the LibTorch-UE5 plugin.
  3. Extract the source code archive to the Plugins folder.
  4. Download the CPU or GPU LibTorch distribution from the same release.

Method 2: LibTorch-UE5 Repo (For Developers/Contributors)

  1. Clone the LibTorch-UE5 repository inside the Plugins folder:
git clone https://github.com/P1ayer-1/LibTorch-UE5.git
  2. Download the appropriate CPU or GPU LibTorch distribution from the official PyTorch website.
  3. Extract the include and lib folders from the downloaded LibTorch archive into Plugins/LibTorch-UE5/Source/ThirdParty/LibTorch/Win64.
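Once the archive is extracted, the third-party folder should look roughly like this (a sketch based on the paths above; the exact file names inside include/ and lib/ vary by LibTorch version):

```
Plugins/
└── LibTorch-UE5/
    └── Source/
        └── ThirdParty/
            └── LibTorch/
                └── Win64/
                    ├── include/   <- LibTorch headers
                    └── lib/       <- LibTorch .dll and .lib files
```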

Common Steps for Both Methods

After you have downloaded the required files, follow the steps below:

  1. Navigate to Plugins/LibTorch-UE5/Source/ThirdParty/LibTorch/Win64.
  2. If an include folder is already present, remove it.
  3. Extract the include and lib folders from the downloaded LibTorch archive into this Win64 folder.
  4. Open include/torch/csrc/api/include/torch/enum.h.
  5. Find the following line of code:
TORCH_API extern const enumtype::k##name k##name;
  6. Remove TORCH_API extern from that line.
  7. Open include/torch/csrc/api/include/torch/nn/module.h.
  8. Find the following comment in the Module class:
// Friend classes.
  9. Add the following line of code below the comment:
friend class IAtumLayer;

Tokenizers Plugin Setup (Optional)

  1. Navigate to the Tokenizers-UE5 Releases page.
  2. Download the source code for the release you want to use.
  3. Extract the downloaded source code into the Plugins directory.
  4. Navigate to Plugins/Tokenizers-UE5/Source/ThirdParty/TokenizersLibrary/Win64.
  5. From the same release page, download tokenizers_c.lib and place it inside the Win64 folder.
  6. Delete the placeholder file named PLACE STATIC LIB HERE from the Win64 folder.
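After these steps, the third-party folder should contain the static library in place of the placeholder (a sketch based on the paths above):

```
Plugins/
└── Tokenizers-UE5/
    └── Source/
        └── ThirdParty/
            └── TokenizersLibrary/
                └── Win64/
                    └── tokenizers_c.lib   <- replaces "PLACE STATIC LIB HERE"
```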

(back to top)

Guides and Tutorials

Disclaimer: These tutorials were published before this plugin was created and cover only Tokenizers-UE5 and LibTorch-UE5.

YouTube Tutorials

Wiki

(back to top)

Contributing

Interested in contributing? Great! Check out the contributing guidelines to get started. Contributors are also encouraged to join our community Discord server.

(back to top)

License

Distributed under the MIT License. See LICENSE for more information.

(back to top)
