Data Migration Design in a CLI Tool: From Local JSON to Cloud Database
Chloe Zhou


Publish Date: Jul 5

Originally published on Medium.

Introduction

I built a simple CLI tool that lets users take notes from the terminal. In version 1, there was no login and no cloud — notes were just saved to a local JSON file on the user’s computer.

That setup worked fine at the beginning. But as the tool grew, I added basic authentication and moved storage to the cloud (using Supabase), so users could access their notes across machines.

That change introduced a new problem:

What about users who already had notes saved locally?

If I just switched systems without thinking it through, they’d open the new version and find their notes gone.

This post explains how I handled that: designing a migration system that moves notes from the local file to the cloud — safely, clearly, and without making a mess.

Migration Means Moving Houses

When I was trying to understand how to approach migration, I asked AI for help. It gave me a useful analogy:

“It’s like moving houses.”

You have stuff in your old home — your local JSON file — and you want to move it into a new one — your cloud database. That sounds simple, but moving isn’t just about copying boxes from one place to another. It’s about making sure everything still works in the new place, nothing important gets lost, and the move itself doesn’t break anything.

Here’s what that actually means.

It’s not just copying — it’s cleaning up

In version 1, all notes were saved in a single JSON file on the user’s machine. That file might contain empty notes, malformed entries, or slightly inconsistent formatting.

But version 2 uses a structured cloud database (Supabase), where everything needs to follow a fixed schema. So before I can move anything, I need to sanitize the data:

  • Removing notes that are clearly empty or invalid
  • Making sure each note is an object with a content field and a tags field that is an array

In other words, before I move in, I clean the data up.
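To make the cleanup concrete, here is a hypothetical before-and-after. The exact fields in the real v1 file may differ; this just illustrates the kind of mess that gets filtered out and the shape v2 expects:

```javascript
// Version 1: loose local JSON. Entries may be empty or malformed.
const v1Notes = [
  { content: "buy milk", tags: ["errands"] },
  { content: "   " },       // whitespace-only: dropped during migration
  { tags: "not-an-array" }, // missing content: dropped during migration
];

// Version 2: every note must match the cloud schema.
const v2Note = {
  content: "buy milk", // required, non-empty string
  tags: ["errands"],   // always an array, defaulting to []
};
```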

The data’s meaning also changes

In version 1, the tool assumed that one computer = one user. Notes were tied to the machine.

In version 2, notes are tied to a cloud user account. Now, you can log in from anywhere and access your notes. That’s a big shift in meaning:

  • The same note is now “owned” by a user, not a device
  • Access is controlled by authentication, not file access
  • Notes can now live across machines, not just one

So part of the migration is not just “moving files,” but changing what the data represents.

Users need to know what’s happening

You don’t move into a new house without telling your friends. Migration needs to be transparent too.

If I silently switch to the cloud without warning, users might open the new version and think all their notes are gone. That’s bad UX. So I designed the migration to include:

  • A check to see if old notes exist
  • A prompt letting users choose whether to migrate
  • Clear feedback on what was migrated and what wasn’t

Good UX means telling users what’s going on, not making them guess.

Bringing it together

Migration isn’t just a file transfer. It’s three things working together:

  • Data transformation: Cleaning and reshaping the notes so they can live in a database.
  • State change: Notes now belong to a cloud account, not just a local machine.
  • User experience: Letting users know what’s happening and giving them control.

That’s what it really means to move from version 1 to version 2. It’s not just a new backend — it’s a change in what data is, who owns it, and how users interact with it.

The Migration Strategy: Five Steps

Once I understood what this migration really meant, I broke the process down into five concrete steps. Each step solves a specific problem, and together, they ensure the user’s data moves safely from version 1 to version 2.

Detection — Does the user have old data?

Before running any migration, I need to know whether there’s anything to migrate.

The version 1 CLI tool stored notes locally in a known file path. So the first step is to check if that file exists on the user’s machine.

If the file isn’t there, then this user is either new or never saved anything — no migration needed.

Safety — Make a backup before touching anything

Even if migration is simple, data loss is never acceptable.

Before doing anything else, I make a full backup of the original JSON file. This way, if something goes wrong during migration — or if the user just wants to revert — they can recover their data.

This is a simple but important rule: never destroy the original source.

Translation — Clean up and reshape the data

The local notes file was flexible. In version 2, the data needs to follow a specific structure to be accepted by the cloud database.

So before inserting anything, I sanitize each note:

  • Remove empty notes (e.g. whitespace-only)
  • Skip notes with missing or malformed fields
  • Make sure each note has the expected shape

This is the “translation” step: same ideas, but restructured to fit the new format.
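The translation step above could look something like this. The schema (a non-empty content string plus a tags array) follows the description in this post; the real validation rules may be stricter:

```javascript
// Step 3: translation. Keep only notes that fit the v2 schema.
function sanitizeNotes(rawNotes) {
  if (!Array.isArray(rawNotes)) return [];
  return rawNotes
    .filter(
      (n) =>
        n !== null &&
        typeof n === "object" &&
        typeof n.content === "string" &&
        n.content.trim() !== "" // drop empty / whitespace-only notes
    )
    .map((n) => ({
      content: n.content.trim(),
      tags: Array.isArray(n.tags) ? n.tags : [], // default malformed tags to []
    }));
}
```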

Transfer — Save the cleaned notes to the cloud

After cleanup, I use the database utility functions I already built (like createNote) to send each note to Supabase. These functions are the same ones used by the app during normal use — so the migrated notes behave exactly like new ones.

If any note fails to save, I don’t crash the whole process. I log it, skip it, and continue.
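A sketch of that transfer loop. createNote here stands in for the app's own database helper mentioned above; its real signature may differ. The key design point is that one failed insert is recorded and skipped, never allowed to abort the whole run:

```javascript
// Step 4: transfer. Insert each note, tracking results for the summary.
async function transferNotes(notes, createNote) {
  const result = { migrated: 0, skipped: 0, errors: [] };
  for (const note of notes) {
    try {
      await createNote(note);
      result.migrated += 1;
    } catch (err) {
      // Log and continue: a single bad note must not block the rest.
      result.skipped += 1;
      result.errors.push({ note, message: err.message });
    }
  }
  return result;
}
```

The counters collected here are exactly what the verification step reports back to the user.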

Verification — Show the result to the user

Once the migration runs, I want to tell the user exactly what happened.

So I track:

  • How many notes were found
  • How many were skipped
  • How many were successfully migrated

The CLI shows a short summary at the end, so the user knows whether everything worked — or if they need to take a closer look.

Why this approach works

This strategy covers both data integrity and user experience:

  • Each step is small, focused, and testable
  • If anything fails, users don’t lose their data
  • Users are never left guessing what just happened

UX Considerations

I didn’t want the migration to interrupt users. If they don’t have old data, it should just be invisible.

If they do, they should get a clear path — but only when they need it.

It only checks once, during setup

When the user runs notes setup, the CLI checks if there’s a local JSON file from version 1.

If it exists, the CLI shows this:

Legacy notes detected!
You can run:
  notes migrate check   # See what can be migrated
  notes migrate         # Perform the migration

If there’s no local data, nothing happens. No prompt, no message.

This avoids showing unnecessary stuff to new users.

Output is simple and direct

When notes migrate runs, the CLI prints something like:

Found 8 local notes.
6 migrated successfully.
2 skipped (empty or invalid).

It doesn’t hide anything. If a note fails, it’s listed and skipped.

The point is to tell users exactly what was moved, and what wasn’t.

It won’t run again once migration is done

After a successful migration, the CLI deletes the old JSON file and archives a copy in a separate folder.

So if the user runs notes migrate again, the tool won’t find anything to migrate — it just says:

No legacy notes found. Nothing to migrate.

There’s no per-note tracking. Migration is a one-time thing.

If something goes wrong, the backup is still there.

Key Principles Behind Good Migrations

When I put together this migration logic, I mostly followed a few basic rules.

Back up first

Before doing anything, the CLI makes a copy of the notes file (e.g. db-backup.json) in the same folder. If something breaks, the original file is still there.

If something fails, keep going

One bad note shouldn’t block everything.

The CLI skips anything invalid, and finishes what it can.

No need to roll back or crash the whole thing.

Say what’s happening

This isn’t a silent background process.

If notes are being moved, the CLI tells you how many it found, how many migrated, and what got skipped.

Let the user choose

Migration only runs if the user decides to.

The CLI checks once during notes setup, but it’s up to the user whether to run notes migrate.

No auto-magic.

Safe to run more than once

If the migration already happened, running it again just says:

No legacy notes found. Nothing to migrate.

No side effects, no surprises.

Conclusion

Working on this migration helped me see that data migration is more than just a technical task. It touches multiple areas — technology, product design, and user experience.

Even for a lightweight CLI tool, data needs to be treated as a valuable user asset. Losing data means losing user trust, and that’s not easy to regain.

A good migration might run only once — but how it’s designed says a lot about how much you care about your users.
