
Technology DD Report: What Buyers and Sellers Should Expect
A technology due diligence report arrives. It is forty pages long. It contains findings you do not fully understand, ratings you cannot contextualise, and a conclusion that is simultaneously alarming and vague. You are expected to make a commercial decision based on it.
This is the experience of most buyers and sellers encountering a tech DD report for the first time. The document is the output of weeks of specialist work, but without knowing how to read it (what to weight, what to probe, what to push back on), its value is limited.
This guide explains what a credible technology due diligence report should contain, how to interpret what it says, and how buyers and sellers should respond to its findings. It covers the structure and principles: the kind of report you should expect from any competent assessor. What it does not cover is the depth of analysis that experienced CTOs bring to each section, the pattern recognition built across hundreds of assessments, or the calibration that distinguishes a genuinely insightful report from a competent one. That is what you are paying for when you commission a professional assessment. For investors commissioning assessments (whether PE, VC, or private credit), our services for investors page covers how we work across different transaction types.
What a Credible Report Looks Like
Before interpreting the content, it is worth understanding the structure. A well-constructed technology DD report follows a consistent format. Deviations from this format, particularly reports that lead with technical detail rather than commercial implications, are worth questioning.
Executive Summary
Two pages, maximum. This is the section that matters to boards, investment committees, and senior management teams who need to make decisions based on the assessment without reading the full report. A good executive summary answers three questions: what is the overall technology position, what are the most material findings, and what is the commercial recommendation?
If the executive summary cannot answer these three questions clearly, either the assessment is incomplete or the assessor does not understand their audience. Summaries that lead with technical architecture descriptions rather than commercial implications are written for the wrong reader.
The 5P Assessment
The body of the report should be structured around the five assessment dimensions: Product, Platform, Process, Protection, and People. Each section should contain the findings for that pillar, the evidence on which they are based, a severity rating, and the commercial implication.
The evidence matters. Findings unsupported by specific evidence (code reviewed, interviews conducted, documentation examined) cannot be challenged, verified, or contextualised. A report that says "the architecture has scalability concerns" without specifying which components, under what load conditions, and based on what evidence is not a finding. It is an opinion.
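
To make that structure concrete, here is a minimal sketch of a single finding modelled as a record that carries its own evidence. The field names and example values are hypothetical illustrations, not a prescribed schema:

```python
from dataclasses import dataclass

@dataclass
class Finding:
    """One DD finding tied to its evidence. Hypothetical schema, for illustration only."""
    pillar: str                  # Product, Platform, Process, Protection, or People
    statement: str               # the specific, challengeable claim
    evidence: list[str]          # code reviewed, interviews conducted, documents examined
    severity: str                # per the assessor's own scale
    commercial_implication: str  # what this means for the deal

    def is_opinion(self) -> bool:
        # A finding with no traceable evidence is an opinion, not a finding.
        return not self.evidence

f = Finding(
    pillar="Platform",
    statement="Order service cannot scale past current peak load due to a single-writer database",
    evidence=["load test results", "schema review", "interview with lead engineer"],
    severity="serious",
    commercial_implication="Growth plan assumes 4x volume; remediation needed in year one",
)
assert not f.is_opinion()
```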
Key Findings Summary
A well-structured report includes a summary of the most important findings across all five pillars, typically presented as a structured table with numbered references that map back to the detailed sections. Each finding should be concise enough to read in a single line, with the severity immediately apparent. This is the section that boards and investment committees will look at first, so clarity matters more than comprehensiveness. The detail lives in the pillar sections; the summary gives the commercial reader what they need to act.
Recommendations
A report that identifies problems without proposing solutions requires the buyer to commission additional work. The recommendations section should translate findings into concrete actions, distinguishing what must be done from what should be done, with a sense of relative cost and urgency for each.
An undifferentiated list of twenty recommendations is not useful. A good report makes clear which actions are high-impact and time-sensitive, which are important but can wait, and which are longer-term improvements. How assessors structure this varies, but the principle is the same: give the reader a practical basis for planning, not just a list of problems.
Team Assessment
The technology findings describe the current state of the system. The team assessment describes the organisation's capacity to change that state. These are different questions and both matter.
We assess the CTO's technical depth and commercial acuity, the senior team's competence and stability, the engineering team's structure and seniority mix, and the hiring pipeline's ability to fill gaps. A business with significant technology debt and an exceptional team capable of addressing it is a different investment proposition from one with the same debt and a team without the capacity to fix it.
“The technology findings describe the current state of the system. The team assessment describes the organisation's capacity to change it. Both matter. The second is frequently more important than the first.”
How to Read the Findings
The most important thing to understand about DD findings is that severity alone does not tell you what to do. Two findings rated with equal severity can require completely different commercial responses.
A critical security gap might be closable in four weeks at minimal cost. A fundamental architecture constraint might take eighteen months of investment to resolve. A good DD report connects each finding to its commercial context: not just "this is a problem" but "this is what it means for the deal."
A simplified approach to reading findings might involve two questions for each one:
First: how serious is this? At the simplest level, is the finding good practice, something that needs improvement, or something that poses genuine risk? Most reports use some form of severity scale. The important thing is not how many levels the scale has; it is whether the rating connects to a commercial action. A finding rated as serious should come with a recommendation. A finding rated as acceptable should explain why.
Second: is this normal for the stage? This is where many reports fall short. A five-person seed-stage company with no penetration testing is in a fundamentally different position from a Series C company with the same gap. A report that applies the same absolute standard to both is not telling you something useful; it is applying a checklist without judgment. The best assessments calibrate to what you would expect for the company's size, sector, and trajectory, and flag the gaps that are genuinely unusual rather than simply listing everything that is not perfect.
If the report does not make these connections, if it lists findings without commercial context or applies a one-size-fits-all standard, the assessor has done only half the job. Our own methodology goes considerably deeper than this simplified framing, but these two questions are a good starting point for any reader evaluating a DD report.
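
As a rough illustration only, the two questions can be expressed as a simple triage function. The severity labels and stage baselines below are hypothetical, and a real assessment applies far more judgment than any lookup can capture:

```python
# Illustrative two-question triage for reading DD findings.
# Severity labels and stage baselines are invented examples, not any assessor's scale.

STAGE_BASELINE = {
    # Gaps commonly tolerated at each stage (hypothetical).
    "seed": {"no_pen_testing", "limited_documentation"},
    "series_c": set(),  # a later-stage company is held to a fuller standard
}

def triage(finding: str, severity: str, stage: str) -> str:
    """Apply the two reading questions to a single finding."""
    # Question two: is this normal for the stage?
    if finding in STAGE_BASELINE.get(stage, set()):
        return "note only: expected gap for this stage"
    # Question one: how serious is this?
    if severity == "serious":
        return "expect a recommendation; probe cost and timeline"
    if severity == "needs_improvement":
        return "weigh against the investment thesis"
    return "good practice: confirm the report explains why it is acceptable"

print(triage("no_pen_testing", "serious", "seed"))      # expected gap at seed stage
print(triage("no_pen_testing", "serious", "series_c"))  # demands a remediation plan
```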
For Buyers: How to Use the Report
The report is a tool for commercial decision-making. Here is how to use it effectively.
Distinguish Material from Immaterial Findings
Start with the key findings summary and identify which findings are connected to your investment thesis. A finding that would affect the value creation plan is material. A finding that would not is noted but does not drive commercial action.
The most common mistake buyers make is treating all findings with equal weight. A forty-page report with thirty findings creates a false impression of severity. The question to ask for each finding is: "If this is not addressed within the hold period, does it prevent us from realising the investment thesis?" If the answer is no, it is not a dealbreaker.
Price Adjustments vs Completion Conditions
Not every problematic finding warrants a price reduction. The appropriate commercial response depends on whether the finding affects current value (price adjustment), future value (earn-out or milestone structure), or transaction certainty (completion condition).
Security findings that represent existential risk (data breach exposure, regulatory compliance failures) are typically completion conditions: they must be addressed before the transaction completes. Architecture findings that affect the growth plan are typically price adjustments: the cost of remediation is reflected in the acquisition price. Team and process findings are typically post-completion workstreams with defined milestones.
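
Restating that paragraph as a lookup, purely for illustration (the category names are ours, and real deals involve more nuance than any mapping):

```python
# Simplified mapping from what a finding affects to the typical commercial response.
COMMERCIAL_RESPONSE = {
    "current_value": "price adjustment",
    "future_value": "earn-out or milestone structure",
    "transaction_certainty": "completion condition",
}

EXAMPLES = {
    "data breach exposure": "transaction_certainty",           # existential security risk
    "architecture constraint on the growth plan": "current_value",
    "process gaps in the engineering org": "future_value",     # post-completion workstream
}

for finding, affects in EXAMPLES.items():
    print(f"{finding} -> {COMMERCIAL_RESPONSE[affects]}")
```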
The Management Team's Response to Findings
Pay as much attention to how the management team responds to the findings as to the findings themselves. A team that engages constructively (acknowledging findings, providing context where relevant, and presenting a credible remediation plan) is demonstrating exactly the operational maturity that acquirers value. A team that disputes every finding defensively, without evidence, is telling you something important about how they operate under pressure.
The most effective seller response to difficult DD findings is a pre-prepared remediation roadmap: "We agree with this finding. Here is our plan to address it, here is the timeline, and here is the estimated cost." This reframes the commercial conversation from price negotiation to shared planning.
Validate Key Assumptions
DD reports are produced under time pressure with limited information. Some findings will be wrong, or will be correct but overstated, or will miss context that would change their severity rating. It is entirely reasonable to ask the assessor to explain their evidence for specific findings and to present additional information where relevant.
What is not reasonable is disputing findings without evidence. Assessors who are challenged with "we don't agree with this" without specific counter-evidence will (and should) maintain their position.
For Sellers: How to Prepare
The most effective thing a seller can do is commission their own technology assessment before going to market. Not to manage what the buyer's assessors find: a pre-sale assessment done by credible operators will find the same things a buy-side assessment will find. The value is in the twelve to eighteen months between the pre-sale assessment and the transaction, during which material findings can be addressed, the remediation can be documented, and the seller can present investors with a story of improvement rather than a list of problems.
What Pre-Sale Preparation Looks Like
The ideal position is to arrive at due diligence with three documents ready: the original pre-sale technology assessment, the remediation plan that was prepared in response to it, and evidence of progress against that plan. This framing changes the entire character of the due diligence process. Instead of responding to findings defensively, the seller is presenting a management team that identifies problems and fixes them, the quality that every acquirer values most.
This approach works because it addresses the fundamental information asymmetry in technology due diligence. The buyer's assessors are working with limited information in a compressed timeframe. The seller's team knows the technology intimately. When a seller can demonstrate that they have already thought carefully about the technology position and invested in improving it, the buyer's assessors spend less time discovering problems and more time validating the remediation, a significantly more positive dynamic.
Responding to Findings You Disagree With
Dispute findings you genuinely believe are incorrect, with evidence. Provide architecture documentation that demonstrates the scalability claim is wrong. Show test coverage data that contradicts the assessment. Present incident history that challenges the reliability finding.
Do not dispute findings simply because they are uncomfortable. Assessors who have conducted a hundred transactions have seen most patterns before. Pushing back without evidence on a finding that is accurate damages credibility and raises questions about what else the management team is not being straight about.
The Remediation Roadmap as a Commercial Tool
A seller who arrives at due diligence with a prepared remediation roadmap for anticipated findings is in a significantly stronger position than one who is responding reactively. The roadmap should acknowledge each likely finding, propose a specific action, assign ownership, provide a timeline, and estimate the cost. Presented alongside the DD findings, it reframes the commercial negotiation: instead of "how much do we reduce the price for these problems?", the conversation becomes "how do we structure the completion conditions and post-completion milestones to reflect the remediation plan?"
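
As a sketch of what one roadmap entry might look like, assuming a simple record per anticipated finding (the field names and example values are illustrative, not a required format):

```python
from dataclasses import dataclass

@dataclass
class RoadmapEntry:
    """One anticipated DD finding and the seller's prepared response.
    Hypothetical fields, for illustration only."""
    finding: str         # the finding as the seller expects it to be phrased
    action: str          # the specific remediation proposed
    owner: str           # named individual accountable for delivery
    timeline: str        # target completion, ideally tied to deal milestones
    estimated_cost: str  # order-of-magnitude is usually enough

entry = RoadmapEntry(
    finding="No documented disaster recovery plan",
    action="Author and test a DR runbook for the core platform",
    owner="Head of Platform",
    timeline="8 weeks from kick-off",
    estimated_cost="~3 engineer-weeks plus failover infrastructure",
)
```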
Red Flags in a DD Report
Not all technology due diligence reports are equal. These are the patterns that suggest the assessment is not as reliable as it should be.
Findings without evidence. Any finding that cannot be traced to a specific observation (code reviewed, interview conducted, document examined) is an opinion, not a finding. Reputable assessors document their evidence. If a report contains conclusions without supporting evidence, ask for it.
No positive findings. Every technology organisation has things it is doing well. A report with no positive findings has either not looked carefully enough or is not providing a balanced picture. Unrelenting negativity in a DD report is as unreliable as unrelenting positivity.
Missing remediation guidance. A report that identifies problems without proposing solutions is incomplete. The commercial value of technology due diligence is not in discovering problems; it is in understanding what those problems cost to fix.
Generic observations. "The team would benefit from improved documentation" is not a finding. "The payment processing system has no documented architecture decision records, and the engineer who built it left eighteen months ago" is a finding. The difference is specificity.
Pricing as a percentage of transaction value. An assessor whose fee scales with the deal size has a financial interest in the transaction completing and in larger transactions. Credible operators price by scope.
Related Reading
- Technology Due Diligence Checklist - the structured 5P Framework assessment guide
- What We Actually Assess in Technology Due Diligence - patterns from 100+ transactions
- You Failed Tech DD. Now What? - how to respond to difficult findings
- Fractional CTO Services - implementing DD recommendations with the assessor who identified them
- Technology Audit - pre-sale and buy-side technology due diligence
Preparing for technology due diligence?
We conduct pre-sale technology assessments and buy-side due diligence for PE firms, VC investors, and management teams. Structured, commercially connected, and delivered by operators who have done it across 100+ transactions.