In any large multidisciplinary project, reviewing the composite design requires a mix of subjective judgment and standard tests and analysis to identify conflicts among the elements incorporated from the integrated models.
The use of BIM models has revolutionized the way integrated project models are reviewed. Because elements from all models are merged objectively, every member of the design team sees and understands the same design. The ability of computers to automatically verify the spatial relationships between model elements allows clash detection tools to readily find conflicts between complex systems, eliminating the need for project members to spend countless hours over light tables manually searching for clashes with each model revision.
Reviewing and Marking Up the Composite Model
Autodesk® Navisworks® software provides four essential tools for reviewing and capturing feedback on the composite model. These tools enable you to:
- Measure and verify the placement and clearance between model elements.
- Redline to highlight and annotate potential problems and issues found during review.
- Tag and classify issues for follow-up.
- Comment and capture textual descriptions of the issues for later review.
As issues are found and tagged, it is helpful to classify and organize them for later retrieval and follow-up. This can be done with folders and careful naming of the saved viewpoints, so that team members can quickly sort and find the issues that pertain to their tasks.
Performing Clash Detection
Clash detection enables the effective identification, inspection, and reporting of interferences in the composite project model. It helps reduce the risk of human error and oversights during model reviews by automatically detecting model interferences. Clash detection can be used as a one-time sanity check for completed design work or as part of an ongoing project audit and quality control process.
Navisworks® Manage software’s Clash Detective tool enables teams to conduct clash tests between model elements by checking across the entire composite model or by checking specific subsets of the model elements. Clash checking can look for these types of conflicts:
- Hard—conflicts of elements in 3D space (if such a conflict is temporal, occurring only during a certain phase of the project, it is termed a “soft” clash).
- Clearance—instances of not meeting set clearances between pairs of objects.
- Duplicates—identical instances of the same geometry.
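The three test types above can be illustrated with a simplified sketch. A real clash engine such as Clash Detective tests full geometry; here, hypothetical elements are reduced to axis-aligned bounding boxes (AABBs) purely to show the logic of each check.

```python
# Illustrative sketch only: elements reduced to axis-aligned bounding boxes.
# Element names and geometry below are invented for demonstration.
import math
from dataclasses import dataclass

@dataclass(frozen=True)
class Element:
    name: str
    min_pt: tuple  # (x, y, z) lower corner of the bounding box
    max_pt: tuple  # (x, y, z) upper corner of the bounding box

def hard_clash(a: Element, b: Element) -> bool:
    """Hard clash: the two boxes overlap on all three axes."""
    return all(a.min_pt[i] < b.max_pt[i] and b.min_pt[i] < a.max_pt[i]
               for i in range(3))

def clearance_clash(a: Element, b: Element, clearance: float) -> bool:
    """Clearance clash: the gap between the boxes is less than the required clearance."""
    gaps = [max(a.min_pt[i] - b.max_pt[i], b.min_pt[i] - a.max_pt[i], 0.0)
            for i in range(3)]
    return math.sqrt(sum(g * g for g in gaps)) < clearance

def duplicate(a: Element, b: Element) -> bool:
    """Duplicate: two elements occupy an identical extent (likely copied geometry)."""
    return a.min_pt == b.min_pt and a.max_pt == b.max_pt

pipe = Element("pipe", (0, 0, 0), (1, 1, 1))
duct = Element("duct", (0.5, 0.5, 0.5), (2, 2, 2))
print(hard_clash(pipe, duct))  # overlapping boxes -> True
```

In practice the same pairwise logic runs against precise solid geometry and is accelerated with spatial indexing, but the classification of results into hard, clearance, and duplicate findings follows this pattern.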
It is important to recognize that not all clashes are truly problems. In fact, some clashes may have been intentional during the modeling process for the sake of simplifying the modeling task. Clash results need to be judged in the context of the level of detail included in each model, and this need underscores the importance of having an experienced model manager with a strong foundation of construction and design experience.
Teams can create batches of clash tests to be repeated with each model revision, and these batches can be exported and shared. Teams can also create a custom clash test suite for reuse on multiple projects. This approach provides an easy way to roll out a standardized set of tests across an organization, allowing everyone to benefit from the expertise of its most sophisticated model users.
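One way to picture a shareable test batch is as plain data that can be written to a file and imported on another project. Clash Detective exports test setups in its own file format; the field names and JSON layout below are invented for illustration only.

```python
# Hypothetical sketch: a reusable batch of clash tests as plain data.
# Test names, set names, and tolerances are made up for illustration.
import json

STANDARD_TESTS = [
    {"name": "Structure vs HVAC",     "left": "Structural", "right": "Mechanical",
     "type": "hard",       "tolerance": 0.01},
    {"name": "Pipes vs Ceilings",     "left": "Plumbing",   "right": "Ceilings",
     "type": "clearance",  "tolerance": 0.05},
    {"name": "Duplicate MEP objects", "left": "Mechanical", "right": "Mechanical",
     "type": "duplicates", "tolerance": 0.0},
]

def export_batch(tests, path):
    """Write the test batch to disk so other projects can reuse it."""
    with open(path, "w") as fh:
        json.dump(tests, fh, indent=2)

def import_batch(path):
    """Load a previously shared test batch."""
    with open(path) as fh:
        return json.load(fh)

export_batch(STANDARD_TESTS, "standard_clash_tests.json")
print([t["name"] for t in import_batch("standard_clash_tests.json")])
```

Because the batch is just data, it can be version-controlled and distributed alongside an organization's modeling standards.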
Clash tests can also be used as a way of implementing object intelligence. For example, a custom clash test could be created to check for compliance with a local building code based on object information and the properties defined in a particular model system.
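A property-based rule of this kind can be sketched as follows. The elements, property names, and the 1.0 m code minimum are all assumptions invented for illustration, not an actual code requirement.

```python
# Hypothetical sketch of a property-based "object intelligence" check:
# flag doors whose clear width falls below an assumed code minimum.
elements = [
    {"id": "door-01", "category": "Door", "clear_width_m": 0.85},
    {"id": "door-02", "category": "Door", "clear_width_m": 1.10},
    {"id": "duct-07", "category": "Duct", "clear_width_m": None},
]

MIN_DOOR_CLEAR_WIDTH_M = 1.0  # assumed code value, for illustration only

def code_violations(elems):
    """Return IDs of doors narrower than the assumed code minimum."""
    return [e["id"] for e in elems
            if e["category"] == "Door"
            and e["clear_width_m"] < MIN_DOOR_CLEAR_WIDTH_M]

print(code_violations(elements))  # -> ['door-01']
```

The point is that the "clash" here is between an object's properties and a rule, not between two pieces of geometry.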
Creating Digital Requests for Information
Issues that require resolution are typically documented, communicated, and tracked as requests for information (RFIs).
With Navisworks® Manage, RFIs can easily be created using the results of the clash detection process. These results can be saved as viewpoints with associated tags and reviewer comments, and then shared as fully formatted reports that contain both an image and the description of the problem.
Clash detection can be overwhelming if not approached in a systematic way. Defining clash tests too broadly can yield an enormous number of clashes, so experience and judgment are required to define meaningful tests. By carefully defining selection and search sets of model elements, the accuracy and effectiveness of the process can be greatly improved.
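The effect of a search set can be sketched in a few lines: instead of testing every element against every other, candidate pairs are drawn only from two property-filtered sets. The disciplines and categories below are hypothetical examples.

```python
# Sketch of how search sets narrow a clash test. Element properties
# and filter criteria are invented for illustration.
from itertools import product

model = [
    {"id": "b1", "discipline": "Structural", "category": "Beam"},
    {"id": "d1", "discipline": "Mechanical", "category": "Duct"},
    {"id": "p1", "discipline": "Plumbing",   "category": "Pipe"},
    {"id": "d2", "discipline": "Mechanical", "category": "Duct"},
]

def search_set(elems, **criteria):
    """Select elements whose properties match all given criteria."""
    return [e for e in elems
            if all(e.get(k) == v for k, v in criteria.items())]

structure = search_set(model, discipline="Structural")
ducts = search_set(model, category="Duct")
candidate_pairs = list(product(structure, ducts))
print(len(candidate_pairs))  # 1 beam x 2 ducts -> 2 pairs, not 6
```

Narrowing the test this way both reduces noise in the results and keeps each test's findings relevant to a specific coordination question.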
When clash results contain multiple clashes deriving from a single design issue, reviewers can organize the clashes into folders and subfolders to simplify reporting and tracking. After organizing and classifying the issues, images can be annotated and reports prepared with comments that synthesize the problems and communicate them effectively to the designers who must resolve them.
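The idea of folding many raw clashes into a single design issue can be sketched with a simple proximity grouping: clash points that fall within a grouping radius are treated as one issue. The radius and the greedy strategy here are assumptions chosen purely for illustration.

```python
# Sketch of collapsing many raw clashes into fewer design issues by
# grouping nearby clash points. Radius and points are hypothetical.
import math

def group_clashes(points, radius=1.0):
    """Greedy grouping: each point joins the first group whose seed point is nearby."""
    groups = []  # each group is a list of points; groups[i][0] is the seed
    for p in points:
        for g in groups:
            if math.dist(p, g[0]) <= radius:
                g.append(p)
                break
        else:
            groups.append([p])
    return groups

raw = [(0, 0, 0), (0.2, 0.1, 0), (5, 5, 0), (5.3, 5.1, 0)]
print(len(group_clashes(raw)))  # two clusters of clashes -> 2 issues
```

Four raw clashes become two issues, which mirrors the manual folder-and-subfolder organization: the designer receives one annotated, well-described issue rather than a pile of near-duplicate clash results.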