Most labs inspect finished work. Top labs inspect at four points. The difference is not perfectionism: it is catching an error when it costs $12 to fix instead of $340 to redo.
Digital workflows did not eliminate errors. They moved them. And the errors that digital workflows create are harder to see, faster to compound, and more expensive to fix.
A single case now passes through intraoral scanner, CAD software, CAM machine, and sintering oven. Each handoff is a failure point. An analog workflow had three steps. A digital one can have eight. More steps without more inspection means more undetected errors.
Clinics no longer accept "sometimes good, sometimes not." They expect every case to match the prescription exactly. One remake loses a month of goodwill. Three remakes lose the account. QC is not quality improvement — it is client retention.
MDR in Europe. ISO 13485 alignment. GDPR for patient data. Regulators want documented processes, not promises. A checklist on paper is no longer sufficient — you need timestamped, auditable records that prove every case was inspected and by whom.
A 4-checkpoint QC system adds 12-15 minutes per case. A single remake consumes 3-5 hours of technician time, $150-400 in materials, and one production slot that could have generated revenue. The math is not close. See the full cost breakdown.
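The break-even math is easy to sanity-check yourself. A minimal sketch using the midpoints of the figures above; the weekly case volume is an assumed example, not a benchmark:

```python
# Illustrative break-even math using the midpoints quoted above.
QC_MINUTES_PER_CASE = 13.5   # midpoint of 12-15 min per case
REMAKE_HOURS = 4.0           # midpoint of 3-5 technician-hours per remake
cases_per_week = 60          # assumed example volume; plug in your own

weekly_qc_hours = cases_per_week * QC_MINUTES_PER_CASE / 60
remakes_to_break_even = weekly_qc_hours / REMAKE_HOURS

print(f"QC cost: {weekly_qc_hours:.1f} h/week")            # QC cost: 13.5 h/week
print(f"Break-even: {remakes_to_break_even:.2f} remakes")  # Break-even: 3.38 remakes
```

At 60 cases a week, preventing roughly three and a half remakes covers the entire inspection cost in technician time alone; the $150-400 of materials per remake pushes the real break-even point lower still.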
The uncomfortable truth: Most labs already do informal quality checks. The problem is not the absence of inspection — it is the absence of documented, consistent, accountable inspection. When QC lives in one technician's head, it leaves when they do.
Top-performing labs do not inspect more[1] — they inspect earlier. Here is the exact framework used by labs with sub-5% remake rates.
Checkpoint 1, incoming case review: before any production begins. Catches 40% of all errors at the cheapest fix point.
Checkpoint 2, design verification: after CAD design, before milling or printing. Catches design errors that are invisible in production.
Checkpoint 3, production inspection: after manufacturing, before finishing. Catches fit, finish, and color issues while correction is still cheap.
Checkpoint 4, final delivery check: before shipping. The last defense against avoidable remakes and incomplete documentation.
Why four checkpoints instead of one? A single end-of-line inspection catches errors — but only after all production time and materials have been spent. The 4-checkpoint approach catches errors when they are cheapest to fix. An incomplete prescription caught at Checkpoint 1 costs nothing. The same error caught at Checkpoint 4 costs the full remake.
If you can only implement one checkpoint, start with Checkpoint 1. It delivers the highest ROI of any single quality control measure because it prevents work from starting on flawed inputs. For a deeper look at how communication errors drive remakes, see our guide to reducing dental lab remakes.
Do not touch a bur, click a mouse, or open CAD software until every item on this list has a green check. This single habit eliminates the largest category of remakes.
Material specified. Shade selected. Design type defined. Tooth numbers confirmed. No blanks, no "same as last time," no assumptions.
STL files present and openable. No corrupted uploads, no missing arches. File format matches your CAD system requirements.
STL is manifold (watertight). No non-manifold edges, no holes, no inverted normals. A 30-second check that prevents hours of CAD headaches. Use our free STL checker.
Shade tab and number specified. Photos uncompressed with visible shade tab in-frame. No "A2-ish" or "match the neighbor." See shade photography standards.
Can you meet the requested delivery date given current workload? Accepting impossible deadlines guarantees rushed work and quality shortcuts. Flag unrealistic timelines now, not on delivery day.
You cannot design occlusion without the opposing arch. If it is missing, stop. Request it before starting design. This is the third most common cause of fit-related remakes.
Physical or digital bite registration present and usable. No ambiguous interocclusal records. If the bite looks questionable, request a new one rather than guessing.
In digital scans: margin lines are detectable and not obscured by tissue. In physical impressions: margins are tear-free and clearly defined. Margins you cannot see will produce margins that do not fit.
For implant cases: implant brand, platform, connection type, and abutment specifications documented. Generic "implant crown" is not a prescription. Missing specs cause the most expensive category of remakes.
Any verbal instructions from the dentist documented in the case file. Phone calls summarized in a note. Nothing left to memory. If it is not written down, it does not exist.
Implementation tip: Print this checklist and tape it to your intake station for the first two weeks. After that, move it into your case tracking system as mandatory fields that must be completed before the case can advance. The friction of the checklist is the point — it forces a pause before production begins.
Digital files fail silently. A corrupted STL will not crash your software — it will produce a restoration that does not fit. Here is what to check and what constitutes a pass or fail.
Check for non-manifold edges, holes, and inverted normals. These geometry errors cause CAD crashes, inaccurate margins, and unpredictable milling paths.
For CBCT data: verify slice thickness, field of view coverage, and absence of metal scatter artifacts that corrupt the scan volume.
Clinical photos sent via WhatsApp or email lose 60-85% of shade data through compression. Verify photos are original resolution with EXIF data intact.
Which scan is current? If the clinic sent three versions, which one reflects the latest preparation? No labeling system means your technician guesses.
The version control problem is more common than you think. A clinic rescans after adjusting a preparation. They send the new file via email. But the old file is already in your CAD software from the original upload. Your technician designs on the old scan. The restoration does not fit. Nobody is at fault — the system failed because there was no system.
Purpose-built platforms solve this by linking every file to its case with automatic version tracking. The latest file is always the one your technician sees. Use our free STL integrity checker to validate files before importing them into your workflow.
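Full manifold validation needs a mesh library, but the cheapest failure mode, a truncated or corrupted upload, can be caught with a few lines of standard-library code. A sketch for binary STL files; the 84-byte-header-plus-50-bytes-per-triangle layout is part of the STL format, while the function name is ours:

```python
import struct

def binary_stl_is_intact(data: bytes) -> bool:
    """Cheap integrity check for a *binary* STL upload: an 80-byte
    header, a 4-byte little-endian triangle count, then exactly
    50 bytes per triangle. A truncated or padded transfer fails
    this immediately. Manifold/watertight checks still need a
    mesh library (e.g. trimesh's is_watertight)."""
    if len(data) < 84:
        return False
    (n_triangles,) = struct.unpack_from("<I", data, 80)
    return len(data) == 84 + 50 * n_triangles
```

Note this sketch deliberately handles only the binary variant; ASCII STL files (which begin with the text `solid`) need a separate parser.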
Different materials fail differently.[2] A zirconia crown and a PFM bridge need different inspection criteria. Here are the pass/fail standards by material.
| Inspection Criteria | Zirconia | PFM | Removable |
|---|---|---|---|
| Fit accuracy | <50μm marginal gap | <80μm marginal gap | Uniform tissue contact, no rocking |
| Surface finish | Smooth glaze, no pitting or microcracks | Porcelain free of bubbles, no exposed metal | Denture base polished, no rough edges |
| Color match | Matches shade tab under D65 light | Porcelain shade matches, no gray-through | Tooth shade matches prescription |
| Material-specific check | Sintering temp verified (1450-1550°C logged) | Solder joints intact, metal substructure fit verified | Tooth positioning on ridge, balanced occlusion |
| Structural integrity | No chipping at connectors, min 9mm² cross-section | No porcelain delamination, metal thickness ≥0.3mm | Clasps functional, base thickness ≥2mm |
| Occlusion | Even contacts, no prematurities on articulator | Centric and excursive contacts verified | Bilateral balanced occlusion confirmed |
| Common failure mode | Translucency mismatch, shade shift from sintering | Porcelain fracture at thin areas, gray margins | Midline deviation, incorrect VDO |
A generic "does it look good?" inspection misses failure modes that are specific to each material. Zirconia can look perfect on the bench and shift shade after sintering at the wrong temperature. PFM bridges can pass a visual check but have a solder joint that will fail under occlusal load. Removable prosthetics can appear correct until the patient bites down and the VDO is off by 2mm.
The inspection must match the material. Train your QC team on the specific failure modes for each restoration type they handle. Generic training produces generic inspection — which misses the errors that actually cause remakes.
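Material-specific criteria are easy to encode so the inspection form itself refuses a generic answer. A minimal sketch using the marginal-gap limits from the table above; the structure and names are ours, not a TrazaLab API:

```python
# Marginal-gap pass/fail limits from the table above, in micrometres.
MARGINAL_GAP_LIMIT_UM = {"zirconia": 50, "pfm": 80}

def fit_check(material: str, measured_gap_um: float) -> bool:
    """True if the measured marginal gap passes for this material.
    Raises for materials with no numeric criterion (e.g. removables,
    which use a qualitative tissue-contact check), forcing an
    explicit decision instead of a generic 'looks good'."""
    limit = MARGINAL_GAP_LIMIT_UM.get(material.lower())
    if limit is None:
        raise ValueError(f"no numeric fit criterion defined for {material!r}")
    return measured_gap_um < limit
```

A gap of 60μm is a fail for zirconia but a pass for PFM, which is exactly the distinction a generic visual inspection cannot make.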
Quality control without documentation is quality guessing. If you cannot prove the inspection happened, it may as well not have.
At each checkpoint: what was inspected, what passed, what failed, who inspected it, and when. For failures: what was the issue, what action was taken, and was the clinic notified. No narrative needed — structured fields are faster and more searchable.
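Those structured fields map directly onto a record type. A sketch of what one checkpoint entry could look like; the field names are illustrative, not TrazaLab's schema:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass(frozen=True)
class CheckpointRecord:
    """One structured QC entry: what was inspected, by whom, when,
    pass/fail, plus the follow-up fields that matter on failure."""
    case_id: str
    checkpoint: int            # 1-4
    item: str                  # e.g. "margin visibility"
    passed: bool
    inspector: str
    issue: str = ""            # filled in when passed is False
    action_taken: str = ""
    clinic_notified: bool = False
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )
```

Because every field is typed and named, "searchable" comes free: filtering all failed Checkpoint 3 entries for last quarter is a one-line query rather than a pile of paper.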
Photograph every case at Checkpoints 3 and 4. Include: buccal, lingual, and occlusal views against a neutral gray background with shade tab in frame. Consistent lighting. No filters. These photos become evidence in disputes and training material for QC improvement.
For EU labs: patient-linked records require GDPR-compliant storage. Access controls, encryption, and audit logging are not optional. Paper logs fail on all three counts. Digital systems with role-based access and automatic timestamping satisfy regulatory requirements by default.
Paper checklists work for the first week. Then they pile up, get lost, and become unsearchable. When a clinic disputes a case from three months ago, you need to find the specific checklist, read the handwriting, and hope the initials are legible.
Digital QC logs solve every problem paper creates. They are timestamped automatically, searchable instantly, linked to the case permanently, and accessible by anyone with permission. When a dispute arises, you pull up the case, show the inspection photos, and the conversation ends.
TrazaLab builds the audit trail into the case workflow itself. Every checkpoint, every photo, every communication is automatically logged and linked to the case. There is no separate documentation step — the documentation happens as you work. See how case-linked documentation works.
The biggest objection to quality control is speed. Here is how to implement a 4-checkpoint system without losing production capacity.
Add only the incoming case review. This is 3-5 minutes per case and prevents the most expensive category of errors. Do not add any other checkpoints yet. Let the team get comfortable with one new habit before adding more. Assign one person to be the intake reviewer — this should not be the technician who will produce the case.
Add the final delivery check. Between Checkpoints 1 and 4, you now have quality gates at both ends of the workflow. These two checkpoints alone catch 70% of quality issues. Track every case that would have shipped with an error — this data builds the case for expanding QC to your team.
Once your team has absorbed the endpoint checks, add design verification and production inspection. By now, they have seen enough caught errors to understand the value. Resistance drops when people see the data. Frame QC as "catching problems before they become remakes" — not as "checking your work."
Compare your remake rate from the 90 days before QC to the 90 days after. Track which checkpoints catch the most errors. Adjust your checklists based on actual failure patterns — not theoretical ones. Hold a monthly 15-minute QC review meeting where the team reviews the top 3 error categories and discusses prevention. Start with a rework risk assessment to establish your baseline.
Use caught errors as training material. When Checkpoint 3 catches a shade mismatch, photograph it, document what went wrong, and share it in the next team meeting. Real examples from your own lab are ten times more effective than generic training. Build a library of caught errors — it becomes your most valuable training asset.
The speed objection, answered: A full 4-checkpoint QC cycle adds 12-15 minutes per case. A single remake consumes 3-5 hours. If your QC system catches even one remake per week, you save a net 2.5-4.5 hours weekly. Within 60 days, most labs report that they are producing faster than before QC because they spend less time redoing work.
Every checkpoint, every check item, in one summary. Pin this to your production floor or build it into your digital workflow.
28 inspection items across 4 checkpoints
TrazaLab builds these checkpoints into your case workflow — QC becomes automatic, not extra paperwork.
Top-performing labs use 4 checkpoints: incoming case review, design verification, production inspection, and final delivery check. Labs with fewer than 3 checkpoints miss errors that compound through the workflow. The incoming case review alone catches 40% of issues that would otherwise cause remakes, making it the single most impactful checkpoint to implement first.
Labs without structured quality control average a 20-25% remake or adjustment rate when all rework is counted — not just full remakes but shade corrections, fit adjustments, and re-fires. Labs that implement a documented 4-checkpoint QC system typically reduce this to 5-8% within 90 days. The improvement comes from catching errors early, not from improving technician skill.
The incoming case review should verify 10 items before any production begins: prescription completeness (material, shade, design specs), digital file integrity (STL manifold check, DICOM quality), photo quality (resolution, color accuracy, no compression artifacts), deadline feasibility, patient history notes, opposing arch data, bite registration clarity, preparation margin visibility, implant specifications if applicable, and special instructions documented in writing rather than verbally.
Start with Checkpoint 1 (incoming case review) only. This takes 3-5 minutes per case and prevents the most expensive errors. Once your team is comfortable — usually within 2 weeks — add Checkpoint 4 (final delivery). These two endpoints catch 70% of quality issues. Add Checkpoints 2 and 3 in month two. The initial slowdown is temporary. Within 60 days, labs report net time savings because they spend less time on remakes and corrections than they spend on inspections.
At minimum, document: the original prescription and any changes, photos at each checkpoint (incoming impressions or scans, post-design screenshots, production photos, final delivery images), pass/fail decisions with timestamps and inspector initials, any communication with the clinic about the case, and the final delivery confirmation. Digital records are preferable to paper because they are searchable, timestamped automatically, and cannot be backdated. For EU labs, GDPR requires that patient-related records be stored securely with access controls.
Software cannot replace human inspection of fit, shade, and finish — those require trained eyes and hands. But software can automate the framework around inspection: enforcing that all checklist items are completed before a case advances, logging who inspected what and when, flagging cases that skip a checkpoint, generating QC reports, and maintaining the audit trail. TrazaLab builds this into the case workflow so QC documentation happens as a natural part of production rather than as a separate paperwork task.
TrazaLab enforces checkpoint-based quality control as part of the case workflow. Every inspection is logged, every photo is linked, every decision is documented — automatically. Start your free 14-day trial and see what structured QC does to your remake rate.