This project provides a Docker container setup optimized for NVIDIA GPU development with CUDA and Python. It offers a pre-configured environment with the essential tools and libraries for building machine learning and data science applications.
- CUDA Support: Built on NVIDIA's CUDA base image, enabling GPU acceleration for compute-intensive tasks.
- Python Environment: Comes with Python pre-installed, along with essential libraries for data science and machine learning.
- Poetry for Dependency Management: Utilizes Poetry to manage project dependencies, ensuring a clean and reproducible environment.
- Customizable: Easily modify the Dockerfile to add additional packages or configurations as needed.
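The layers described above can be sketched in a minimal Dockerfile. This is an illustrative assumption, not the exact file shipped in this repository: the CUDA base image tag, the Python install method, and the Poetry setup may all differ.

```dockerfile
# Illustrative sketch only; the real Dockerfile in this repo may differ.
FROM nvidia/cuda:12.2.0-devel-ubuntu22.04

# System Python (assumed setup)
RUN apt-get update && apt-get install -y --no-install-recommends \
        python3 python3-pip python3-venv \
    && rm -rf /var/lib/apt/lists/*

# Poetry for dependency management
RUN pip3 install poetry

WORKDIR /workspace

# Install project dependencies when a pyproject.toml is present
COPY pyproject.toml poetry.lock* ./
RUN poetry install --no-root

CMD ["/bin/bash"]
```

Extra system packages or Python tooling can be added as additional `RUN` layers.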
To get started with this project, follow these steps:

1. Clone the repository:

   ```bash
   git clone https://github.com/yourusername/nvidia-devcontainer-base.git
   cd nvidia-devcontainer-base
   ```

2. Build the Docker image:

   ```bash
   docker build -t nvidia-devcontainer-base .
   ```

3. Run the Docker container with GPU access:

   ```bash
   docker run --gpus all -it nvidia-devcontainer-base
   ```
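Since this image is intended as a dev container base, it can also be wired into a VS Code Dev Containers setup. A minimal sketch of a `.devcontainer/devcontainer.json`, assuming the Dockerfile sits at the repository root (the exact layout in this repo may differ):

```json
{
  "name": "nvidia-devcontainer-base",
  "build": { "dockerfile": "../Dockerfile" },
  "runArgs": ["--gpus", "all"]
}
```

The `runArgs` entry passes `--gpus all` to `docker run`, mirroring the manual command above.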
Once inside the container, you can start developing your application. The environment is set up with common data science libraries, and you can install additional packages using Poetry.
Contributions are welcome! Please open an issue or submit a pull request for any improvements or bug fixes.
This project is licensed under the MIT License - see the LICENSE file for details.