
Delivering Knowledge Instead of Answers: How AI Should Solve Contract Issues for Construction Project Teams

Contributions by Trent Miskelly and Maegan Spivey


The best use of AI is when software answers questions you didn’t ask, reading between the lines of your prompts. Here’s how that could work in construction...


With the latest iterations of generative AI, like ChatGPT, we sometimes think of AI as a calculator: you input a prompt, and it gives you an answer. The ultimate use of AI, though, is when it answers questions you didn’t ask, reading between the lines of your queries.

We’re building Document Crunch as a tailored intelligence engine centered on knowledge engineering, not just prompt engineering.

Prompt engineering focuses on eliciting a directionally correct response using keywords and questions. Knowledge engineering, on the other hand, creates a framework of understanding, rules, and mechanisms so that the cumulative input of each prompt thoroughly answers the questions beneath the initial question, and keeps delivering those same meaningful answers in the future.

In building our tailored intelligence engine, we use finely tuned prompts crafted by subject matter experts in the construction industry; these prompts create a digital knowledge base and function as building blocks for the engine. All of this is to help you, and anyone in construction, get the answers you’re really looking for. Great prompts include an element of education, empowering the technology to find the right answer for you. Precision, structure, and patience are key. Let’s look at an example that could work within our Chat feature:

Suppose you type the prompt: “What are the tasks the general contractor must complete after the construction contract is executed?”

If you've been in construction, you bring inherent knowledge to drafting a prompt like this. The large language model (LLM) behind that AI does not have the same knowledge, so when building a prompt it's important to spend a little time explaining your underlying knowledge and assumptions to the AI. In the prompt, you’d want to include a description of what you want the answer to cover:

Updated Prompt: “What are the tasks the general contractor must complete after the construction contract is executed? Include tasks to be completed for the Owner or items that should be submitted to the Owner through the course of the project.”

After testing that, we may find the answer is missing information we hadn’t thought to ask for, like when each task should be completed:

Updated Prompt: “What are the tasks the general contractor must complete after the construction contract is executed? Include tasks to be completed for the Owner or items that should be submitted to the Owner through the course of the project. For each task or submission, indicate when it is due to or expected by the Owner.”
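To make the iteration concrete, here’s a minimal sketch in Python of how these three versions of the prompt could be run against a contract, using the OpenAI chat API as a stand-in. The model name, the contract file, and the system instruction are illustrative assumptions, not a description of how Document Crunch works under the hood.

```python
# Minimal sketch: running each refinement of the prompt against the contract text.
# Assumes the OpenAI Python SDK (openai>=1.0) is installed, OPENAI_API_KEY is set,
# and "contract.txt" is a plain-text export of the executed contract.
from openai import OpenAI

client = OpenAI()
contract_text = open("contract.txt", encoding="utf-8").read()

base = ("What are the tasks the general contractor must complete after the "
        "construction contract is executed?")
refinement_1 = (" Include tasks to be completed for the Owner or items that should "
                "be submitted to the Owner through the course of the project.")
refinement_2 = (" For each task or submission, indicate when it is due to or "
                "expected by the Owner.")

# Each pass adds knowledge a construction professional takes for granted.
for prompt in (base, base + refinement_1, base + refinement_1 + refinement_2):
    response = client.chat.completions.create(
        model="gpt-4o",  # illustrative model choice
        messages=[
            {"role": "system",
             "content": "You are reviewing a construction contract. "
                        "Answer only from the contract text provided."},
            {"role": "user",
             "content": f"CONTRACT:\n{contract_text}\n\nQUESTION:\n{prompt}"},
        ],
    )
    print(response.choices[0].message.content)
```

Comparing the three responses side by side makes it clear what each added instruction buys you.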

AI is fast, but it still requires patience as you teach and guide it toward the information you need. Systems built on knowledge engineering continuously learn from these inputs, so when a similar question is eventually asked, they automatically give you more information than you asked for, because they’re built on the context and assumptions behind what a user really wants to know.
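One simplified way to picture that accumulation: refinements learned from earlier prompts can be stored by topic and automatically attached to related questions later. The sketch below is only an illustration of the idea; the topics, keywords, and stored guidance are hypothetical, not Document Crunch’s knowledge base.

```python
# Illustrative sketch of accumulating prompt refinements as reusable knowledge.
KNOWLEDGE_BASE = {
    "post-execution obligations": {
        "keywords": ["contract is executed", "post-execution", "after execution"],
        "guidance": ("Include tasks to be completed for the Owner or items submitted "
                     "to the Owner during the project, and indicate when each is due "
                     "or expected."),
    },
    "notice requirements": {
        "keywords": ["notice", "delay", "unforeseen conditions"],
        "guidance": ("List each notice obligation, who must be notified, the deadline, "
                     "and the required form of notice, citing the contract section."),
    },
}

def enrich_prompt(question: str) -> str:
    """Attach stored guidance to a new question so the answer covers what the
    user really wants to know, even if they didn't ask for it explicitly."""
    matches = [entry["guidance"]
               for entry in KNOWLEDGE_BASE.values()
               if any(kw in question.lower() for kw in entry["keywords"])]
    return question if not matches else question + " " + " ".join(matches)

# A later, similar question automatically picks up the earlier refinements.
print(enrich_prompt("What must the general contractor do once the contract is executed?"))
```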

We believe our software, and any contract management software or technology you use to pull needed information from documents, should be able to predict and answer common project questions quickly and thoroughly for project management teams trying to abide by their construction contracts. Not only should those answers be thorough, they should also include a way to verify them quickly.

Without having to CTRL+F their way through a contract, general contractors and trade partners should be able to trust AI to find the answer and show the proof quickly.

For example, when it comes to delay remedies, notice requirements, and unforeseen conditions, contract AI software should provide succinct details, checklists, and timelines for any issue that requires Owner notification. At Document Crunch, we’re building a system that answers the questions behind the questions, giving contractors the details they didn't think to ask for and the data they didn't realize they had. We're delivering overviews, straight from the contract, of common timeline obstacles: how the project schedule should be submitted, what should be included in pay apps, potential remedies for flooding, and more. We’re also providing quicker answers to project teams, helping them avoid the headaches of delayed responses and of interpreting legalese.

Through building a tailored intelligence engine, we’ve also learned that the context of a contract is paramount. When AI software uses only publicly available data, you can scan in a full construction contract, but the software will still struggle to provide high-quality, accurate responses about that document. That’s because the LLM only has publicly available information plus whatever you’ve given it; it doesn’t understand standard construction-related concerns. Public LLMs also deliver lower answer accuracy because they don't know what information to prioritize when forming an answer.

To fully understand and prioritize the terms in your construction contract, the AI needs construction-based and user-based foundations layered on top of the LLM.
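As a rough picture of what grounding in the contract itself can look like, here’s a short Python sketch that keeps only the most relevant contract sections and instructs the model to answer from them with section citations. The retrieval heuristic, system prompt, and model name are assumptions made for illustration; they aren’t Document Crunch’s design.

```python
# Illustrative sketch of grounding answers in the contract rather than public data.
# Assumes the OpenAI Python SDK is installed and OPENAI_API_KEY is set.
from openai import OpenAI

client = OpenAI()

def relevant_sections(contract_text: str, question: str, limit: int = 5) -> list[str]:
    """Naive keyword retrieval: keep the contract sections that share the most
    words with the question, so the model prioritizes contract language."""
    sections = [s.strip() for s in contract_text.split("\n\n") if s.strip()]
    words = set(question.lower().split())
    ranked = sorted(sections, key=lambda s: len(words & set(s.lower().split())),
                    reverse=True)
    return ranked[:limit]

def ask(contract_text: str, question: str) -> str:
    excerpts = "\n\n".join(relevant_sections(contract_text, question))
    response = client.chat.completions.create(
        model="gpt-4o",  # illustrative model choice
        messages=[
            {"role": "system", "content": (
                "You review construction contracts for project teams. Answer only "
                "from the contract excerpts provided, cite the section for every "
                "statement, and say so if the excerpts do not cover the question.")},
            {"role": "user", "content": f"EXCERPTS:\n{excerpts}\n\nQUESTION:\n{question}"},
        ],
    )
    return response.choices[0].message.content
```

The citation requirement is what lets a project team verify the answer against the contract instead of taking the AI’s word for it.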

Generative AI is not a replacement for humans, but it can significantly augment human work when tailored toward the right context. However, we are still, and always, responsible for the decisions made from AI outputs, so we need to check its work. Some ask: why use AI if it’s not 100% correct? We’d point out that a response that gets us 95% of the answer still saves us 95% of the work. The remaining 5% often consists of referencing the “work” the AI has done to produce the answer and checking it. While that can feel tedious, it lets us sign our name to information we can confirm, and it still spares us 95% of the time spent searching a 200-page contract document.

At Document Crunch, we’re building this tailored intelligence as a foundation to assist users and give them superpowers.  We’ve also provided templates and prompt suggestions so users can start to build their own great queries and tailor this engine further to their own needs. Meanwhile, our team will continue to build on the knowledge from historical data, prompts, and industry experts to provide the most relevant, accurate answers for construction contract questions. 

If you’re interested in experiencing the difference between prompt engineering and knowledge engineering, we’d love to show you. Schedule a demo today.