When Code Collides: How to Prevent Data Loss in Node.js Apps with Cron Jobs and API Calls
Muhammad Yasir Rafique


Publish Date: Jul 21

Introduction
Have you ever built a system where both users and automated tasks need to update the same file at the same time? It sounds simple, but in reality, things can go wrong.
A while ago, I ran into a tricky situation. My app saved important info to a JSON file. On one side, a scheduled cron job updated this file every hour; on the other, users could change the same file at any moment through an API. Everything worked great until the day both tried to write at the exact same time. The result? Sometimes the file got corrupted, sometimes the latest changes were lost, and other times the app just threw a weird error.
This kind of “collision” between code is more common than we imagine. In this article, I’ll break down why this happens, how it can mess up your app, and most importantly - show you practical ways to fix it. Whether you’re building something big or just learning, these lessons can save you hours of debugging and a lot of headaches.

The Problem
In our project, we needed to make session data "live" for OpenActive. That meant generating and updating JSON feed files. Every time a user or client added or updated a session, such as a new event or a schedule change, we needed to update both the database and the JSON file so the data would be ready for OpenActive consumers.
To keep the data fresh, we also set up a cron job. This job would run every half hour, check for sessions that had expired, and then update the JSON feed file, marking those sessions as deleted or inactive.
At first, this setup looked simple. The API wrote to the file whenever users made changes, and the cron job did its cleanup work in the background. But pretty quickly, we realized both could try to write to the same JSON file, one from a user action and one from the automated job, and sometimes at the exact same time.
That’s when the real problems started to show up.

What Can Go Wrong?
When two parts of your app try to update the same JSON file at the same time, things can get messy fast. This is called a race condition, and the results aren't always easy to spot until something breaks.
Here’s what can actually go wrong:

Lost Data
Imagine the cron job and a user both update the feed file within seconds of each other. If they both read the file, make their changes, and then write it back, the last one to write “wins.” Any changes the other made are lost, with no warning.

Corrupted Files
Sometimes, if both try to write at the exact same moment, you can end up with a half-written or empty file. This means your JSON is broken, and when OpenActive or any other system tries to read it, errors are thrown.

Random Errors
These issues can be hard to catch. Your app might work fine for days, then suddenly throw weird errors. These bugs are unpredictable and can be frustrating to debug.
Here’s a sample code snippet showing the problem:

// ---- API Call Example ----
let feed = await fetchFeedsFromS3('feed.json');
feed.items.push(newFeedItem); // User adds a new session
await saveFeedsToS3('feed.json', feed);

// ---- Cron Job Example ----
let feed = await fetchFeedsFromS3('feed.json');
feed.items = feed.items.filter(item => !isExpired(item)); // Remove expired sessions
await saveFeedsToS3('feed.json', feed);
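The lost update is easy to reproduce. The sketch below simulates the two snippets above against an in-memory store; `fetchFeeds`, `saveFeeds`, and the timings are stand-ins for the real S3 helpers, not actual code from our project. Both writers read the same snapshot, and the slower write silently discards the other's change:

```javascript
// Simulate the lost-update race: two writers read the same snapshot,
// then each writes back its own version -- the slower write "wins".
const store = { 'feed.json': JSON.stringify({ items: [] }) };
const sleep = (ms) => new Promise((r) => setTimeout(r, ms));

// In-memory stand-ins for fetchFeedsFromS3 / saveFeedsToS3
async function fetchFeeds(key) {
  await sleep(10); // pretend network latency
  return JSON.parse(store[key]);
}
async function saveFeeds(key, feed) {
  await sleep(10);
  store[key] = JSON.stringify(feed);
}

async function apiCall() {
  const feed = await fetchFeeds('feed.json');
  feed.items.push({ id: 'new-session' }); // user adds a session
  await saveFeeds('feed.json', feed);
}

async function cronJob() {
  const feed = await fetchFeeds('feed.json'); // reads BEFORE the API writes
  feed.items = feed.items.filter((item) => !item.expired);
  await saveFeeds('feed.json', feed);
}

async function main() {
  await Promise.all([apiCall(), cronJob()]); // run concurrently
  const final = JSON.parse(store['feed.json']);
  console.log(final.items.length); // 0 -- the user's new session was lost
}
main();
```

The cron job never saw the user's new session, so its write overwrote it with no error at all, which is exactly why these bugs go unnoticed for so long.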

Solutions
So, how do you avoid losing data or breaking your JSON file when both your API and cron job need to update it? Here are some real solutions you can try:
1. File Locking (Mutex)
One way to make sure only one process writes to your file at a time is to use a software "lock." A simple approach is to create a lock file before writing (the creation itself must be atomic, or the check can race just like the file write), or to use a library that handles it for you.
Example using a lock file:

// Pseudocode. Note: an exists-then-write check has its own race window,
// so create the lock file atomically: the 'wx' flag fails if it exists.
try {
  const fd = fs.openSync('feed.lock', 'wx');
  fs.closeSync(fd);
  // Read, update, and save feed.json
  fs.unlinkSync('feed.lock');
} catch (err) {
  // Another process holds the lock -- wait or try again later
}

There are also libraries like proper-lockfile for Node.js that make this safer and easier, handling retries and stale locks for you.
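If the cron job and the API handlers happen to run in the same Node process (for example, when the job is scheduled inside the app itself), you don't even need a lock file: a small promise-based mutex serializes the writes. A minimal sketch; `createLock`, `withFeedLock`, and `addSession` are illustrative names, not code from our project:

```javascript
// Minimal in-process mutex: each critical section waits for the
// previous one to finish, so read-modify-write cycles never interleave.
// This only works when all writers share one Node process.
function createLock() {
  let last = Promise.resolve();
  return function withLock(fn) {
    const run = last.then(fn, fn); // start only after the previous holder
    last = run.catch(() => {});    // keep the chain alive if fn throws
    return run;
  };
}

// Hypothetical usage: every writer wraps its feed update in the lock.
const withFeedLock = createLock();
const feed = { items: [] };
const sleep = (ms) => new Promise((r) => setTimeout(r, ms));

async function addSession(item) {
  return withFeedLock(async () => {
    const snapshot = feed.items.slice(); // "read"
    await sleep(5);                      // simulate slow I/O
    snapshot.push(item);                 // "modify"
    feed.items = snapshot;               // "write"
  });
}
```

Call `addSession` twice concurrently and both items survive; without the lock, both calls would read the same snapshot and the first write would be lost, exactly as in the S3 example above.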
2. Use a Database for Syncing
Instead of relying on files, use your database as the single source of truth. The cron job and API both update the database, and only one process (for example, the cron job) generates the JSON feed when needed. Most databases handle concurrent updates safely.
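Here is a sketch of that split, with a hypothetical in-memory `db` standing in for a real database (a real one serializes concurrent updates for you via row locks or MVCC); all names are illustrative:

```javascript
// Hypothetical in-memory stand-in for a real database table.
const db = {
  sessions: new Map(),
  upsert(id, row) {
    this.sessions.set(id, { ...this.sessions.get(id), ...row });
  },
  all() { return [...this.sessions.values()]; },
};

// API handler: writes to the database only -- never to the file.
function apiUpsertSession(session) {
  db.upsert(session.id, session);
}

// Cron job: also writes to the database only.
function cronExpireSessions(now) {
  for (const s of db.all()) {
    if (s.endsAt < now) db.upsert(s.id, { state: 'deleted' });
  }
}

// A single place generates the JSON feed from the database on demand,
// so there is exactly one writer of the feed representation.
function generateFeed() {
  return JSON.stringify({ items: db.all() });
}
```

Because both paths go through the database, the "last write wins on the whole file" problem disappears: concurrent updates touch individual rows, and the feed is derived from a consistent snapshot.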
3. Queue the Updates
If your system gets lots of updates, consider using a message queue (like AWS SQS or RabbitMQ). Each change request is added to the queue, and a single worker handles updates to the JSON file in order.
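With a real broker, each API call or cron tick publishes a message and a single consumer applies them. The single-worker idea can be sketched in-process, with a plain array standing in for SQS/RabbitMQ (all names here are illustrative):

```javascript
// Sketch of the queue approach: producers push change requests, and a
// single worker applies them to the feed strictly in arrival order.
const queue = [];
const feed = { items: [] };
let draining = false;

function enqueue(update) {
  queue.push(update);
  drain(); // kick the worker if it's idle
}

async function drain() {
  if (draining) return; // only one worker ever runs
  draining = true;
  while (queue.length > 0) {
    const update = queue.shift();
    update(feed); // apply one change at a time -- no interleaving
    // await saveFeedsToS3('feed.json', feed); // persist after each change
  }
  draining = false;
}
```

Usage: the API enqueues `(f) => f.items.push(newItem)` while the cron job enqueues `(f) => { f.items = f.items.filter((i) => !i.expired); }`; because one worker drains the queue, neither update can clobber the other.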

Our Solution: Moving to API Endpoints
We eventually decided to skip file writing altogether. Instead of creating and updating JSON files, we built API endpoints that deliver JSON responses on demand. This approach satisfies the OpenActive requirements, completely avoids the risk of file conflicts, and keeps our data always up to date.
By serving JSON directly from the API, we made our system simpler, faster, and easier to maintain.

Lessons Learned / Conclusion
Handling data updates from both scheduled jobs and user actions might seem easy at first, but race conditions can sneak up and cause big problems. If you're working with JSON or any other kind of file, it's important to think about locking, or about using the database as your main source of truth.
For us, switching to API endpoints that return live JSON turned out to be the best solution. It keeps our data fresh, avoids file conflicts, and makes our system more reliable for everyone. Always go for the solution that best suits your requirements and environment.
The main lesson? Think ahead about how different parts of your app will interact with the same data. Even simple setups can run into trouble when things happen at the same time, but with a little planning, you can avoid the headaches and keep your project running smoothly.
