Engineering

Migrating 10,000 Jobs from EasyCron

4 min read · JustRun Team

One of the most requested features since we launched the beta has been migration support from EasyCron. It makes sense: EasyCron is one of the oldest cron-as-a-service providers, and many developers have dozens or even hundreds of jobs configured there. Switching services manually is painful enough that most people just never do it.

We wanted to make migration a one-click operation. Paste your EasyCron API key, hit import, and every job appears in JustRun with the correct schedule, headers, HTTP method, and request body. Here is what that took to build.

The 6-field problem

Standard cron uses five fields: minute, hour, day-of-month, month, and day-of-week. EasyCron uses a six-field format that adds seconds as the first field. This is a subtle difference that can silently break migrations if you do not handle it.

Our importer detects whether a cron expression has five or six fields. If it has six, we strip the leading seconds field and convert it to standard five-field format. If the seconds field was anything other than 0 or *, we flag the job for manual review, since sub-minute scheduling requires a different approach in our system.
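The detection logic above can be sketched roughly as follows. This is an illustrative version, not our production importer; the `normalizeCron` name and the result shape are assumptions for the example.

```typescript
// Normalize an EasyCron-style expression to standard 5-field cron.
// 6-field expressions carry a leading seconds field, which we strip;
// non-trivial seconds values are flagged for manual review instead.

type CronResult =
  | { kind: "ok"; expr: string }
  | { kind: "manual_review"; reason: string };

function normalizeCron(expr: string): CronResult {
  const fields = expr.trim().split(/\s+/);

  if (fields.length === 5) {
    // Already standard cron; pass through unchanged.
    return { kind: "ok", expr: fields.join(" ") };
  }

  if (fields.length === 6) {
    const [seconds, ...rest] = fields;
    // A seconds field other than 0 or * implies sub-minute scheduling,
    // which needs a different mechanism than plain cron.
    if (seconds !== "0" && seconds !== "*") {
      return {
        kind: "manual_review",
        reason: `seconds field "${seconds}" implies sub-minute scheduling`,
      };
    }
    return { kind: "ok", expr: rest.join(" ") };
  }

  return { kind: "manual_review", reason: `unexpected field count: ${fields.length}` };
}
```

Splitting on whitespace rather than a single space matters here: expressions copied out of config files often carry tabs or doubled spaces.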

Header preservation

Many EasyCron users configure custom HTTP headers for authentication, content types, or application-specific metadata. The EasyCron API returns headers as a flat key-value object, which maps cleanly to our schema. But there are edge cases: some users store bearer tokens directly in headers, and we need to encrypt those at rest using AES-256 before persisting them to our database.
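To make the encrypt-at-rest step concrete, here is a minimal sketch using AES-256-GCM from Node's built-in `crypto` module. The header-name heuristic, the key handling, and the wire format (IV + auth tag + ciphertext, base64-encoded) are assumptions for illustration, not our actual schema.

```typescript
import { randomBytes, createCipheriv, createDecipheriv } from "crypto";

// Header names whose values we treat as sensitive (illustrative list).
const SENSITIVE = new Set(["authorization", "x-api-key", "cookie"]);

// Encrypt one value with AES-256-GCM; the 12-byte IV and 16-byte auth
// tag are prepended so the value can be decrypted later.
function encryptValue(key: Buffer, plaintext: string): string {
  const iv = randomBytes(12);
  const cipher = createCipheriv("aes-256-gcm", key, iv);
  const ct = Buffer.concat([cipher.update(plaintext, "utf8"), cipher.final()]);
  return Buffer.concat([iv, cipher.getAuthTag(), ct]).toString("base64");
}

function decryptValue(key: Buffer, encoded: string): string {
  const buf = Buffer.from(encoded, "base64");
  const decipher = createDecipheriv("aes-256-gcm", key, buf.subarray(0, 12));
  decipher.setAuthTag(buf.subarray(12, 28));
  return Buffer.concat([decipher.update(buf.subarray(28)), decipher.final()]).toString("utf8");
}

// Walk a flat header object and encrypt only the sensitive values,
// leaving things like Content-Type readable for debugging.
function encryptSensitiveHeaders(
  key: Buffer,
  headers: Record<string, string>,
): Record<string, string> {
  const out: Record<string, string> = {};
  for (const [name, value] of Object.entries(headers)) {
    out[name] = SENSITIVE.has(name.toLowerCase()) ? encryptValue(key, value) : value;
  }
  return out;
}
```

GCM is a reasonable default here because the auth tag detects tampering at decrypt time, not just at the application layer.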

The import flow works like this: fetch all jobs from the EasyCron API, transform each job into our internal schema, encrypt any sensitive header values, validate the resulting cron expression, and insert everything in a single database transaction. If any job in the batch fails validation, we roll back the entire import and return a detailed error report showing which jobs failed and why.

Handling scale

When a user has 10,000 jobs, you cannot import them synchronously in a single API request. We paginate the EasyCron API at 100 jobs per page, process each page as a batch, and stream progress updates to the frontend via Server-Sent Events. The user sees a real-time progress bar with a count of imported, skipped, and failed jobs.
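The page-by-page loop looks roughly like this. `fetchPage`, `processBatch`, and `onProgress` are injected dependencies in this sketch, because the real versions wrap the EasyCron API and an SSE response stream; none of these names come from an actual API.

```typescript
interface Progress { imported: number; skipped: number; failed: number }

// Pull jobs one page at a time, process each page as a batch, and push
// a cumulative progress update after every page (in production this
// update is what gets written to the Server-Sent Events stream).
async function importAll(
  fetchPage: (page: number, perPage: number) => Promise<string[]>,
  processBatch: (jobs: string[]) => Promise<Progress>,
  onProgress: (p: Progress) => void,
  perPage = 100,
): Promise<Progress> {
  const total: Progress = { imported: 0, skipped: 0, failed: 0 };

  for (let page = 1; ; page++) {
    const jobs = await fetchPage(page, perPage);
    if (jobs.length === 0) break; // past the last page

    const p = await processBatch(jobs);
    total.imported += p.imported;
    total.skipped += p.skipped;
    total.failed += p.failed;
    onProgress({ ...total }); // snapshot, so later mutation can't race the stream

    if (jobs.length < perPage) break; // short page means we hit the end
  }
  return total;
}
```

Emitting a copy of the running totals per page, rather than per job, keeps the SSE stream to around a hundred events for a 10,000-job import instead of ten thousand.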

The entire import for 10,000 jobs takes about 90 seconds, which is mostly spent waiting on the EasyCron API rate limit. On our end, the database inserts run in under 3 seconds thanks to batch inserts with Drizzle ORM.

Lessons learned

  • Always validate cron expressions after transformation. A valid 6-field expression can become invalid when truncated to 5 fields.
  • Encrypt sensitive data before it hits the database, not after. This sounds obvious, but it is easy to miss in batch operations.
  • Stream progress for long operations. A spinning loader with no feedback is the fastest way to make users think something is broken.
  • Provide a dry-run mode. Let users preview what will be imported before committing. This caught dozens of edge cases during testing.

Start using JustRun

Set up your first cron job in 60 seconds. Free plan includes 10 jobs and AI diagnostics.

Get Started Free

No credit card required