
Building a Production ERP System with AI — Lessons from Schiffli

Apr 1, 2026 · 15 min read · Ankur Jain

ERPs are where ambitious developers go to learn humility. Every senior developer I respect has at least one ERP story -- usually involving late nights, unexpected edge cases in inventory math, and a hard-won respect for the complexity of business operations that look simple from the outside but are fractal in their detail.

Schiffli ERP is my contribution to the genre. It's a production system for textile manufacturing -- specifically for Schiffli embroidery operations. Built with Next.js 16, React 19, PostgreSQL, Drizzle ORM, and Better Auth. Shipped in three weeks. Running in production handling real business operations: purchase orders, inventory, job work tracking, invoicing, and multi-role access control.

This is the full story of building it, the decisions that mattered, the traps I fell into, and what AI made possible that would have been unreasonable without it.

The Domain: Textile Manufacturing Is Harder Than You Think

Schiffli embroidery is a specific type of textile manufacturing. A Schiffli machine is a massive loom that produces embroidered fabric in large panels. The business operations around it involve purchasing thread and fabric from suppliers, tracking inventory in meters and kilograms, sending materials out to job-work contractors and reconciling what comes back, GST-compliant invoicing, and role-based access for everyone from accountants to floor managers.

Most ERP vendors try to sell generic systems that "can be customized." In practice, the customization takes longer than building something domain-specific. Schiffli operations have enough unique workflows that a purpose-built system made more sense.

Tech Stack Decisions

Next.js 16 with React 19

This was a deliberate choice, not a default. Next.js 16 with React Server Components gave me the architecture I wanted: server-side data fetching for complex queries, client-side interactivity for forms and dashboards, and streaming for the heavier pages. React 19's improved form handling and Suspense support simplified the data mutation patterns significantly.

The alternative was a separate backend (Express or Fastify) with a React SPA frontend. I've built plenty of those. The problem is the duplication: you define types on the backend, then redefine them on the frontend. With Next.js, the type definitions live in one place and flow naturally from database to component.

PostgreSQL + Drizzle ORM

PostgreSQL was the only realistic option. The data model for an ERP is relational to its core: purchase orders reference suppliers, line items reference products, inventory movements reference both the source and destination. You need foreign keys, constraints, transactions, and complex joins. Document databases don't work here.

Drizzle over Prisma was a conviction choice. Drizzle gives you SQL-like control with TypeScript types. The query builder maps directly to SQL -- there's no query abstraction layer that hides what's happening. When you're debugging why an inventory reconciliation is off by 0.5 kilograms, you need to see the actual query, not a Prisma abstraction of it.

// Drizzle schema for inventory movements
// (products, warehouses, and users are defined in their own modules)
import { pgTable, uuid, text, numeric, timestamp } from "drizzle-orm/pg-core";

export const inventoryMovements = pgTable("inventory_movements", {
  id: uuid("id").defaultRandom().primaryKey(),
  productId: uuid("product_id").notNull()
    .references(() => products.id),
  warehouseId: uuid("warehouse_id").notNull()
    .references(() => warehouses.id),
  type: text("type", {
    enum: ["purchase", "sale", "job_work_out",
           "job_work_in", "adjustment", "transfer"]
  }).notNull(),
  quantity: numeric("quantity", {
    precision: 12, scale: 4
  }).notNull(),
  referenceType: text("reference_type"),
  referenceId: uuid("reference_id"),
  createdBy: uuid("created_by").notNull()
    .references(() => users.id),
  createdAt: timestamp("created_at").defaultNow(),
});

Note the numeric(12, 4) for quantity. Textile quantities are measured in meters and kilograms with decimal precision. Using floats for inventory math is a guaranteed source of rounding errors. Decimals with fixed precision are non-negotiable for any system that tracks physical quantities.

Better Auth

Authentication in ERPs is annoying because it's not just "logged in or not." It's role-based, with roles that have different permissions on different modules. Better Auth handled the core auth flow (sessions, tokens, password hashing) while I built the role-permission layer on top.

The permission model ended up being a simple matrix: each role has a list of permitted actions per module. Admin can do everything. Accountant can read/write financial modules but only read production data. Floor manager can read/write production but can't see financials. This pattern is simple to implement, simple to understand, and handles 95% of real-world access control requirements.
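As a concrete illustration, the matrix can be as small as a typed lookup table. This is a hedged sketch, not the production code -- the role, module, and action names here are my assumptions, and the real module list is larger:

```typescript
// Illustrative role-permission matrix; names are assumptions, not the real set.
type Action = "read" | "write";
type Module = "inventory" | "purchasing" | "job_work" | "invoicing";
type Role = "admin" | "accountant" | "floor_manager";

const permissions: Record<Role, Partial<Record<Module, Action[]>>> = {
  // Admin can do everything.
  admin: {
    inventory: ["read", "write"],
    purchasing: ["read", "write"],
    job_work: ["read", "write"],
    invoicing: ["read", "write"],
  },
  // Accountant: read/write financial modules, read-only production data.
  accountant: {
    invoicing: ["read", "write"],
    purchasing: ["read", "write"],
    inventory: ["read"],
    job_work: ["read"],
  },
  // Floor manager: read/write production, no access to financials.
  floor_manager: {
    inventory: ["read", "write"],
    job_work: ["read", "write"],
  },
};

function can(role: Role, module: Module, action: Action): boolean {
  return permissions[role][module]?.includes(action) ?? false;
}
```

The nice property of a static table like this is that a reviewer can audit the entire access-control policy in one screen.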

Domain-Driven Design for Manufacturing

The codebase is organized by domain, not by technical layer. There's no controllers/ folder with 40 files in it. Instead:

src/
  modules/
    inventory/
      schema.ts        # Drizzle tables
      queries.ts       # Read operations
      mutations.ts     # Write operations
      components/      # UI components
      types.ts         # Domain types
    purchasing/
      schema.ts
      queries.ts
      mutations.ts
      components/
    job-work/
      ...
    invoicing/
      ...
    auth/
      ...

Each module owns its schema, its queries, its mutations, and its UI. Cross-module interactions happen through well-defined interfaces. The inventory module exports a recordMovement() function. The purchasing module calls it when a purchase order is received. The job-work module calls it when materials are sent to or returned from a contractor.

This structure made AI-assisted development dramatically easier. When Claude Code is implementing a feature in the job-work module, it reads the module's files and understands the domain. It doesn't need to comprehend the entire codebase -- just the module it's working on plus the interfaces of modules it depends on.

The Tricky Parts

Inventory Tracking

Inventory is the heart of any manufacturing ERP, and it's where the complexity lives. The naive approach is to track a "current quantity" field on each product. Every purchase adds to it, every sale subtracts from it. Simple, right?

Wrong. This approach fails the moment someone needs to answer "where did my thread go?" You need a full movement ledger: every change to inventory is recorded as a movement with a type, quantity, reference, and timestamp. The current stock is a derived value -- the sum of all movements for that product in that warehouse.

If your inventory system can't explain the discrepancy between expected and actual stock by showing the exact trail of movements, it's not a real inventory system. It's a counter.

This ledger approach is more work to implement but saves hundreds of hours of manual reconciliation. When the warehouse count doesn't match the system, you can trace every movement and find exactly where the discrepancy occurred.
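The ledger idea can be sketched in a few lines of TypeScript. This is an illustration only -- the production system derives stock with a SQL SUM over the Drizzle table shown earlier, and stores quantities as fixed-precision decimals, not the plain numbers used here for brevity:

```typescript
// Sketch of the ledger principle: stock is never stored, only derived.
type MovementType =
  | "purchase" | "sale" | "job_work_out"
  | "job_work_in" | "adjustment" | "transfer";

interface Movement {
  productId: string;
  warehouseId: string;
  type: MovementType;
  quantity: number; // signed: positive = into the warehouse, negative = out
}

// Current stock is the sum of all movements for a product in a warehouse.
function currentStock(
  ledger: Movement[],
  productId: string,
  warehouseId: string,
): number {
  return ledger
    .filter(m => m.productId === productId && m.warehouseId === warehouseId)
    .reduce((sum, m) => sum + m.quantity, 0);
}

// Example trail: 100 units purchased, 40 sent to a contractor, 35 returned.
const ledger: Movement[] = [
  { productId: "thread-01", warehouseId: "main", type: "purchase", quantity: 100 },
  { productId: "thread-01", warehouseId: "main", type: "job_work_out", quantity: -40 },
  { productId: "thread-01", warehouseId: "main", type: "job_work_in", quantity: 35 },
];
```

Every unit of the resulting stock figure is explained by a row in the trail, which is exactly what a reconciliation needs.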

Job Work Flows

Job work is the most complex workflow in the system. The flow goes: create a job work order, allocate materials from inventory, record the dispatch to the contractor, track processing time, record the return (which might be a different quantity than what was sent -- some material is consumed in processing), and reconcile.

The state machine for a job work order has eight states: Draft, Approved, Material Allocated, Dispatched, In Process, Partially Received, Completed, and Cancelled. Each transition has validations. You can't dispatch without allocating materials. You can't mark an order as completed if the return quantity hasn't been recorded. And every transition needs to update inventory accordingly.

// Job work state transitions
type JobWorkState =
  | "draft" | "approved" | "material_allocated" | "dispatched"
  | "in_process" | "partially_received" | "completed" | "cancelled";

const transitions: Record<JobWorkState, JobWorkState[]> = {
  draft:              ["approved", "cancelled"],
  approved:           ["material_allocated", "cancelled"],
  material_allocated: ["dispatched", "cancelled"],
  dispatched:         ["in_process"],
  in_process:         ["partially_received", "completed"],
  partially_received: ["completed"],
  completed:          [],
  cancelled:          [],
};

function canTransition(
  from: JobWorkState,
  to: JobWorkState
): boolean {
  return transitions[from]?.includes(to) ?? false;
}
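For enforcement, a small guard can wrap any transition table like the one above. This generic helper is a sketch (with a deliberately miniature table for brevity), not the system's actual code:

```typescript
// Build a transition function that throws on any move the table forbids.
function makeTransitioner<S extends string>(table: Record<S, S[]>) {
  return (from: S, to: S): S => {
    if (!table[from].includes(to)) {
      throw new Error(`Invalid transition: ${from} -> ${to}`);
    }
    // The real system would also record the matching inventory movements
    // here (e.g. a job_work_out movement when dispatching materials).
    return to;
  };
}

// Miniature example table (the real job work machine has eight states):
const step = makeTransitioner<"draft" | "approved" | "dispatched">({
  draft: ["approved"],
  approved: ["dispatched"],
  dispatched: [],
});

step("draft", "approved");      // allowed
// step("draft", "dispatched"); // throws: can't skip approval
```

Routing every state change through one function means the validations can't be bypassed by a forgetful call site.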

Invoice Generation

Indian GST invoicing has specific requirements: CGST and SGST for intra-state transactions, IGST for inter-state, HSN codes for each line item, reverse charge mechanism for certain services, and specific formatting requirements. Getting this wrong means tax compliance issues, which is the fastest way to lose a client's trust.

I implemented the invoice engine as a pure function: input the line items, tax configuration, and party details, and get back a fully computed invoice with all tax breakdowns. No side effects, fully testable. The AI agent wrote the initial implementation and the test suite simultaneously, which caught three edge cases I would have missed: zero-quantity line items, mixed GST rates on a single invoice, and rounding behavior when splitting tax between CGST and SGST.
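The pure-function shape makes those edge cases easy to pin down in tests. The sketch below is a simplified illustration, not the production engine -- real GST computation also handles HSN codes, reverse charge, and per-rate rounding rules, and the integer-paise representation is my assumption:

```typescript
// Simplified GST computation as a pure function; money is integer paise
// to keep the arithmetic exact.
interface LineItem {
  taxableValue: number;   // integer paise
  gstRatePercent: number; // e.g. 5, 12, 18
}

interface TaxBreakdown {
  cgst: number;
  sgst: number;
  igst: number;
  grandTotal: number;
}

function computeGst(items: LineItem[], intraState: boolean): TaxBreakdown {
  let cgst = 0, sgst = 0, igst = 0, subtotal = 0;
  for (const item of items) {
    if (item.taxableValue === 0) continue; // zero-quantity lines carry no tax
    subtotal += item.taxableValue;
    const tax = Math.round((item.taxableValue * item.gstRatePercent) / 100);
    if (intraState) {
      // Splitting an odd paise amount between CGST and SGST is exactly the
      // kind of rounding edge case a test suite needs to pin down.
      const half = Math.floor(tax / 2);
      cgst += half;
      sgst += tax - half;
    } else {
      igst += tax; // inter-state: the whole tax goes to IGST
    }
  }
  return { cgst, sgst, igst, grandTotal: subtotal + cgst + sgst + igst };
}
```

Because the function has no side effects, each edge case is a one-line test: feed in line items, assert the breakdown.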

Multi-Role Authentication

The auth system had a subtle bug that Claude Code caught during a code review pass. I'd implemented the permission check correctly in the API layer, but the navigation component was rendering links based on a cached role that didn't update when an admin changed a user's role. The user would see menu items they no longer had access to. Clicking them would fail (the API check worked), but the UX was broken.

The fix was straightforward -- invalidate the client-side session when role changes are detected -- but it's the kind of bug that a human reviewer might not catch because the security boundary was correct. It was a UX issue masquerading as a security issue, and the AI's systematic review approach found it.

How AI Accelerated the Build

Three weeks for a production ERP is aggressive. Here's specifically where AI made the difference: scaffolding each module with the same schema/queries/mutations/components structure, which kept patterns consistent across the codebase; writing the invoice engine and its test suite simultaneously, which surfaced edge cases I would have missed; and the systematic code-review pass that caught the stale-role navigation bug described above.

Lessons Learned

1. ERPs Are the Ultimate Test of a Developer

ERPs touch everything: database design, complex queries, state machines, financial calculations, access control, document generation, data validation, reporting. If you can build a production ERP, you can build anything. If you struggle with ERPs, the gaps in your skills will be exposed ruthlessly.

2. Domain Knowledge Is the Bottleneck, Not Code

AI can write code faster than I can. But it can't understand why a textile manufacturer tracks thread by weight and fabric by meters. It can't understand why job work returns are sometimes less than dispatched quantities (processing loss). That domain knowledge came from hours of conversation with the client, watching their manual processes, and asking "why do you do it this way?" The code is the easy part. Understanding the business is the hard part.

3. Start With the Hard Parts

Most developers start with the login page and the dashboard. I started with the inventory movement ledger and the job work state machine. If the core business logic doesn't work, nothing else matters. If it does work, everything else is CRUD around it.

4. Decimal Precision Is Non-Negotiable

I've already mentioned this, but it bears repeating: never use floating-point numbers for quantities or money in business software. Use decimal types with explicit precision. The bugs from floating-point arithmetic are subtle, cumulative, and discovered by accountants who will question your competence.
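Two lines of arithmetic make the point. The scaled-integer representation below is just one way to get exactness; in Schiffli, the database's numeric(12, 4) columns do this job:

```typescript
// Binary floating point cannot represent 0.1 or 0.2 exactly:
const floatSum = 0.1 + 0.2;
console.log(floatSum === 0.3); // false -- floatSum is 0.30000000000000004

// Scaling to integers (BigInt ten-thousandths, matching a scale of 4)
// keeps the arithmetic exact:
const a = 1_000n; // 0.1000 kg stored as integer ten-thousandths
const b = 2_000n; // 0.2000 kg
console.log(a + b === 3_000n); // true -- exactly 0.3000 kg
```

A cumulative ledger of thousands of movements amplifies that per-operation drift, which is why it surfaces as an unexplainable discrepancy months later.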

5. Migration Order Matters

A hard lesson from this project: when adding a database constraint via migration, update the existing data before adding the constraint. I wrote a migration that added a CHECK constraint before the UPDATE statement that would make the existing data comply. The migration failed. The fix was trivial, but the downtime was embarrassing.

ERPs are where "it works on my machine" meets "the accountant found a discrepancy of 0.001 rupees and wants an explanation." Build accordingly.

Would I Build It Again With AI?

Absolutely. Without AI, this project would have been 8-12 weeks. With AI, it was 3 weeks. The quality is higher because the agent catches bugs during implementation that I would have found later in testing. The code is more consistent because the agent follows the same patterns across every module.

But AI didn't replace the hard work. I still spent hours understanding the domain. I still made architectural decisions that the agent couldn't make. I still reviewed every line of generated code. AI handled the translation from specification to implementation. I handled the translation from business reality to specification. Both are necessary. Neither is sufficient alone.


If you're in manufacturing and running your business on spreadsheets, or if you've been burned by generic ERP vendors who promised customization and delivered complexity, I build purpose-built systems that match your actual workflows. Not a generic platform you'll fight against -- a tool that works the way your business works.


Want me to build something like this?

I build production business systems -- ERPs, inventory tools, invoicing platforms. Purpose-built for your domain.

Let's Talk