Cranes Consulting Tampa, FL
Software & Tech · 9 min read · April 2026

The Shelfware Construction Software Audit

You're paying $6K to $12K a year for Procore, Buildertrend, or JobTread. Your team is using maybe 30% of it. Here's a 20-minute audit to figure out exactly where the gap is, and which gaps are actually worth closing.

The shelfware problem, quantified

Construction PM software at small-firm scale costs real money. Buildertrend runs a flat $499 to $999 per month for teams under 15 users. Call it $8K a year for the mid-tier. Procore starts around $375 per user per month on enterprise plans and only gets cost-effective at scale. JobTread, ServiceTitan, BuildOps, CoConstruct all cluster in the same range.

If your team is using 30% of the platform's capability, you're paying roughly $5,500/year for software that's sitting idle. That's not the worst part. The worst part is the process work that didn't get done because everyone assumed the software would do it: the RFI cycle that's still untracked, the change orders that still get billed late, the daily logs that live in someone's text messages, the reports that never get run.
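The idle-spend math is simple enough to sketch. A minimal example, using the article's illustrative figures ($8K/year mid-tier cost, 30% utilization), not data from any specific firm:

```python
def idle_spend(annual_cost: float, utilization: float) -> float:
    """Dollars per year paid for platform capability nobody uses."""
    return annual_cost * (1 - utilization)

# $8K/year at 30% utilization: roughly the $5,500 figure above.
print(round(idle_spend(8_000, 0.30)))  # 5600
```

Run it against your own invoice and your own honest utilization estimate; the audit below is how you get that estimate.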

Most small GCs know they're under-using their software. Almost none have actually measured the gap. That's what this audit is for.

How to run the audit

Block 20 minutes. Pull up your platform. Score each of the seven categories below honestly on a 1–5 scale:

  • 1. We don't use this feature at all.
  • 2. Started using it, abandoned it.
  • 3. Office uses it, field doesn't (or vice versa).
  • 4. Used inconsistently. Some PMs, some jobs.
  • 5. Standard practice, used on every project, every day.

Anything 3 or below is shelfware. A 5 is working. A 4 is on the bubble. Score against what you've actually observed in the last 30 days, not what you wish were true.

The seven categories

1. Daily logs and field reporting

The platform has a daily log feature. Foremen are supposed to fill it out from a phone at end-of-shift. Photos, weather, crew, hours, work performed.

Score 5 looks like: Every active job has a daily log every workday. Photos are tagged to the job. Weather is auto-pulled. The owner can pull a project history without calling anyone.

Score 1–2 looks like: Logs live in text messages, in someone's iPhone camera roll, in a Word doc on the office computer, or nowhere at all. That comes back to bite you on the first lawsuit, insurance claim, or change-order dispute.

2. RFIs and submittals

Requests for information and submittal tracking are the platform's bread and butter. Every RFI has a number, a due date, an owner, and a status. Same for submittals.

Score 5: Every RFI flows through the platform. Aging report runs weekly. Average RFI close time is tracked and trending down.

Score 1–2: RFIs are emails. The architect's response is buried in a thread. Nobody knows what's outstanding. The PM finds out an RFI was never answered when the foreman calls about it.

3. Change order management

Change requests get logged, priced, and signed inside the platform before work starts. Billing flows from the same record.

Score 5: "No signature, no work" is the rule. Every change has a written, priced, signed CO in the system before any hour gets billed.

Score 1–2: Change orders are written at invoicing, meaning you've already done the work and are now negotiating against a finished product. Industry data puts unbilled CO leakage at 2 to 4 percent of revenue when this is broken.

4. Schedule management

The platform has a schedule. The schedule reflects reality. Subs are notified through the platform when their slot moves.

Score 5: Every active project has a maintained schedule. Look-ahead is published weekly. Subs get automated notifications.

Score 1–2: The schedule was set up at project kickoff and hasn't been touched since. The real schedule is a whiteboard in the office and the PM's text messages.

5. Time tracking and labor costing

Crew clocks in via the platform's mobile app. Hours flow to job-cost reports automatically. You can see actual-vs-budget labor by job at any time.

Score 5: Daily labor cost per job is visible. PMs see overruns the week they happen, not the month after.

Score 1–2: Time gets handwritten on paper, dropped at the office Friday, entered into QuickBooks the following week. By the time you see a labor overrun, you can't do anything about it. Industry research puts labor time variance at ~$4,285/worker/year for firms without real-time visibility.

6. Document management (drawings, specs, photos)

Current drawings are in the platform. Markups happen in the platform. Photos are uploaded by job. There's one source of truth.

Score 5: Field crew opens the latest drawing on a phone or iPad on-site. Photo of any field issue gets snapped to the project record. Markups (Bluebeam or native) sync back.

Score 1–2: Drawings live in email attachments, iCloud, Dropbox, Google Drive, AND the platform, and nobody knows which version is current. Photos live in 14 different iPhones.

7. Reporting and dashboards

Someone runs a financial report on every job at least monthly. Job profitability is visible. Trends are visible. The owner makes decisions from data, not vibes.

Score 5: Owner reviews a profitability dashboard weekly. Slow-paying customers, over-budget jobs, and labor overruns surface without anyone asking.

Score 1–2: Reports exist in the platform but nobody runs them. The first time you find out a job lost money is at year-end with the accountant.

Scoring

Total your seven scores out of 35:

  • 30–35: You're getting your money's worth. Audit the gaps for incremental wins.
  • 20–29: You're using the platform, but inconsistently. Two or three high-impact features need real adoption work.
  • 10–19: You're paying for software you mostly don't use. The fix is a workflow re-design and a 60–90 day re-implementation, not a different platform.
  • Below 10: Either the platform is wrong for your operation, or it never got implemented in the first place. Don't switch yet. Figure out the underlying issue first.
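The whole audit, the 1–5 rubric plus the band cutoffs, fits in a short script. The category scores below are a hypothetical example; the cutoffs are the ones listed above:

```python
def band(total: int) -> str:
    """Map a seven-category total (7-35) to the four bands above."""
    if total >= 30:
        return "getting your money's worth"
    if total >= 20:
        return "inconsistent: push 2-3 high-impact features"
    if total >= 10:
        return "workflow re-design + 60-90 day re-implementation"
    return "wrong platform, or never implemented"

# Hypothetical firm's scores, one per category, on the 1-5 scale:
scores = {
    "daily logs": 2, "rfis/submittals": 1, "change orders": 3,
    "schedule": 2, "time tracking": 4, "documents": 2, "reporting": 1,
}
total = sum(scores.values())
print(total, "->", band(total))  # 15 -> workflow re-design + 60-90 day re-implementation
```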

The four root causes

Across dozens of these audits, the failure modes cluster into four causes, usually two or three of them stacked on top of each other:

1. The platform got bought, never implemented

Someone signed up, watched a 30-minute demo, and started entering data. Nobody designed a workflow, defined who owns which feature, or built training around the field roles. Software is a tool, not a system. Without the system, the tool just sits there.

2. No accountability for adoption

The PMs are "supposed to" use it. The foremen "should" upload daily logs. But there's no consequence when they don't, and no visible benefit when they do. People don't change behavior without a reason.

3. Field/office gap

The office team has the desktop app and uses it. Field crew has phones, hates the mobile UX, and reverts to texts and calls. Nothing flows. If the field doesn't adopt, the platform is half-broken by design.

4. No reporting layer

Data goes in. Data never comes out. People entering data don't see any payoff, so over time the data quality decays. Without reports, the platform becomes a paperwork chore rather than a decision tool.

What to fix yourself vs. what to bring help on

Fix it yourself if: Your audit score is 25 or above, you have a clear sense of which two or three features to push on, and you have a PM or office manager with the authority and time to drive adoption for 60 days. Most of the win is workflow design and a daily cadence, not technology.

Bring help if:

  • You've tried 2+ times to drive adoption and it didn't stick.
  • You're at $5M+ revenue and the lost margin from low utilization easily covers the cost of a re-implementation.
  • You're consolidating from 3+ tools (estimating, PM, accounting) and need integrations designed before you implement.
  • You're switching platforms. Don't repeat the failure mode of platform #1 on platform #2.

The real ROI math

A clean implementation typically gets a small GC from a 15/35 score to a 28/35 score in 60–90 days. The recurring impact:

  • Recovered software value: ~$5,500/year.
  • Reduced unbilled change-order leakage (1–2% of revenue): $20–40K/year for a $2M GC, $50–100K for a $5M GC.
  • Labor variance reduction at 15–25% improvement: $10–20K/year for a 20-person crew.
  • Better job-profitability visibility → better future bids → compounding margin.

The implementation pays for itself the first time a change order that would have gone unbilled gets signed and invoiced. Everything after is upside.
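The line items above can be sanity-checked with a short script. Every input is one of the article's illustrative figures (defaults use midpoints of the quoted ranges), not a benchmark for any specific firm:

```python
def annual_upside(revenue: float, crew_size: int,
                  recovered_software: float = 5_500,   # idle-spend figure above
                  co_leak_recovered: float = 0.015,    # midpoint of 1-2% of revenue
                  labor_variance: float = 4_285,       # $/worker/year, cited above
                  variance_improvement: float = 0.20) -> float:
    """Rough recurring upside of a clean re-implementation."""
    co_recovery = revenue * co_leak_recovered
    labor_recovery = crew_size * labor_variance * variance_improvement
    return recovered_software + co_recovery + labor_recovery

# A $2M GC with a 20-person crew:
print(round(annual_upside(2_000_000, 20)))  # 52640
```

At roughly $52K/year of recurring upside for a $2M firm, even a five-figure re-implementation clears its cost inside the first year.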

Tyler Alexander runs Cranes Consulting in Tampa, FL. If you ran the audit and want a second opinion on what to fix first, set up a 30-minute call. No deck, no pitch.