Move Your Hugging Face LLM to S3 Like a Pro (Without Wasting Local Space!) 🚀
Samuel Vazquez


So you've got this huge LLM (we're talking 100+ GB of pure AI magic), and you need to get it into AWS S3 without clogging up your disk?

Let's do it the smart way—download, upload, and free up your space in one go! 💪

Best part? This guide works inside an AWS SageMaker Jupyter notebook or on your local machine with the AWS CLI configured.

🔧 Prerequisites

Before we start, make sure you have:
👉 AWS CLI configured (if running locally; a quick credential check follows the install step below)
👉 SageMaker Notebook (if using AWS)
👉 huggingface_hub & boto3 installed:

!pip install huggingface_hub boto3
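
Optionally, confirm that boto3 can actually see your AWS credentials before moving anything. This is just a quick sanity check via STS, not part of the original workflow:

import boto3

# Ask AWS who we are; this raises an error if credentials aren't configured.
identity = boto3.client("sts").get_caller_identity()
print(f"Running as: {identity['Arn']}")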

Then, import the necessary libraries:

import os
import boto3
from huggingface_hub import hf_hub_download

🎯 The Mission

  • Download model files from Hugging Face.
  • Upload them to an AWS S3 bucket.
  • Free up local storage by removing the files after upload.
  • Clean up Hugging Face cache to reclaim disk space.

📥 Step 1: Set Up S3 and Model Paths

s3_client = boto3.client('s3')

# CHANGE this to your bucket
BUCKET_NAME = 'your-s3-bucket-name'

# CHANGE this to the folder name you want for the model inside the bucket
MODEL_PATH = "deepseek"

# Make sure you have enough space!
SAVE_DIR = "/home/sagemaker-user/"  

# CHANGE from your selected model at https://huggingface.co/
repo_id = "deepseek-ai/DeepSeek-R1-Distill-Llama-70B"

Next, list every file to grab from Hugging Face: the model shards plus the tokenizer/config files the model needs. In this case each safetensors shard is roughly 9 GB, and with 17 of them this model weighs in at about 153 GB total, which is far too much to download (and keep on disk) in one shot.

model_files = [
    "model-00001-of-000017.safetensors",
    "model-00002-of-000017.safetensors",
    # ... the remaining shards (3 through 17) ...
    "tokenizer.json",
    "tokenizer_config.json"
]
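
If you'd rather not type out all seventeen shard names by hand, huggingface_hub can list the repo's files for you. A minimal sketch that replaces the manual list above (the filename filter is an assumption; adjust it to the files your model actually ships):

from huggingface_hub import list_repo_files

# Pull the full file listing for the repo and keep only what we want to transfer.
all_files = list_repo_files(repo_id)
model_files = [
    f for f in all_files
    if f.endswith(".safetensors") or f.endswith(".json")
]
print(f"Found {len(model_files)} files to transfer")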

📄 Step 2: Upload to S3

Define one function to upload files to S3 and another to delete them locally:

def upload_to_s3(local_file_path, s3_key):
    s3_client.upload_file(local_file_path, BUCKET_NAME, s3_key)
    print(f"Uploaded {local_file_path} to S3 bucket {BUCKET_NAME} with key {s3_key}")


def remove_local_file(file_path):
    try:
        os.remove(file_path)
        print(f"Removed local file {file_path}")
    except Exception as e:
        print(f"Error removing file {file_path}: {str(e)}")
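
Optional tweak: for multi-gigabyte shards you may want to tune boto3's multipart upload behavior. This is only a sketch; the helper name and the 64 MB / 10-thread values below are assumptions, so adjust them for your network:

from boto3.s3.transfer import TransferConfig

# Use multipart uploads above 64 MB, with 64 MB parts and 10 parallel part uploads.
transfer_config = TransferConfig(
    multipart_threshold=64 * 1024 * 1024,
    multipart_chunksize=64 * 1024 * 1024,
    max_concurrency=10,
)

def upload_to_s3_tuned(local_file_path, s3_key):
    s3_client.upload_file(local_file_path, BUCKET_NAME, s3_key, Config=transfer_config)
    print(f"Uploaded {local_file_path} to s3://{BUCKET_NAME}/{s3_key}")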

Now download, upload, and locally delete each file, one at a time:

for file_name in model_files:
    local_file_path = hf_hub_download(repo_id=repo_id, filename=file_name, local_dir=SAVE_DIR)
    print(f"Downloaded {file_name} to {local_file_path}")

    s3_key = f"models/deepseek/{file_name}"
    upload_to_s3(local_file_path, s3_key)

    remove_local_file(local_file_path)
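
Big transfers occasionally get interrupted. If you want re-runs to skip files that already made it to S3, a small existence check with head_object does the trick. This is a sketch layered on top of the loop above (already_in_s3 is a hypothetical helper, not part of the original walkthrough):

from botocore.exceptions import ClientError

def already_in_s3(s3_key):
    """Return True if the object already exists in the bucket."""
    try:
        s3_client.head_object(Bucket=BUCKET_NAME, Key=s3_key)
        return True
    except ClientError:
        return False

for file_name in model_files:
    s3_key = f"models/{MODEL_PATH}/{file_name}"
    if already_in_s3(s3_key):
        print(f"Skipping {file_name}, already in S3")
        continue
    local_file_path = hf_hub_download(repo_id=repo_id, filename=file_name, local_dir=SAVE_DIR)
    upload_to_s3(local_file_path, s3_key)
    remove_local_file(local_file_path)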

🗑 Extra Step: Clear the Hugging Face Cache and Check Your Storage

import shutil

def clear_huggingface_cache(cache_dir):
    """Clear Hugging Face cache directory"""
    try:
        shutil.rmtree(cache_dir)
        print(f"Cleared Hugging Face cache at {cache_dir}")
    except Exception as e:
        print(f"Error clearing cache: {str(e)}")

clear_huggingface_cache("/home/sagemaker-user/.cache/huggingface")
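
If you'd rather inspect what the cache is holding before deleting the whole directory, huggingface_hub ships a cache scanner. A quick, optional sketch:

from huggingface_hub import scan_cache_dir

# Summarize what the Hugging Face cache is holding and how big each repo is.
cache_info = scan_cache_dir()
print(f"Cache size on disk: {cache_info.size_on_disk / 2**30:.2f} GB")
for repo in cache_info.repos:
    print(f"  {repo.repo_id}: {repo.size_on_disk / 2**30:.2f} GB")
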
def check_disk_space(directory):
    """Log disk space for a specific directory"""
    total, used, free = shutil.disk_usage(directory)
    gb = 2**30  # shutil.disk_usage returns bytes, so convert before labeling as GB
    print(f"Disk space for {directory} - Total: {total / gb:.1f} GB, Used: {used / gb:.1f} GB, Free: {free / gb:.1f} GB")

check_disk_space("/home/sagemaker-user/.cache")

And That's It!

✅ Your model is now safely stored in S3 (quick verification sketch below).
✅ Your local disk is free of clutter.
✅ You're ready to deploy or fine-tune without worrying about storage! 🚀
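
Want proof before you celebrate? Here's an optional check that lists everything stored under the prefix used above and totals it up:

# Optional sanity check: list the objects under the model prefix and sum their sizes.
paginator = s3_client.get_paginator("list_objects_v2")
total_bytes = 0
for page in paginator.paginate(Bucket=BUCKET_NAME, Prefix=f"models/{MODEL_PATH}/"):
    for obj in page.get("Contents", []):
        total_bytes += obj["Size"]
        print(f"  {obj['Key']}: {obj['Size'] / 2**30:.2f} GB")
print(f"Total stored: {total_bytes / 2**30:.2f} GB")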

💬 Got questions or improvements? Drop a comment below!
🔥 If this helped you, give it a ❤️ and share it with other LLM builders!

🔗 Connect With Me!

💼 LinkedIn: CodexMaker
📂 GitHub: CodexMaker
🎥 YouTube: CodexMaker


"Remember AI is poetry, words become numbers, numbers will shape the
future"
Codexmaker
