Data management Archives - AEC Magazine https://aecmag.com/data-management/

Vektor.io to bring visibility to Baltic States rail project https://aecmag.com/geospatial/vektor-io-to-bring-visibility-to-baltic-states-rail-project/ Fri, 07 Nov 2025 11:27:21 +0000 E.R.B. Rail JV PS will use digital platform for managing and visualising infrastructure design information

The post Vektor.io to bring visibility to Baltic States rail project appeared first on AEC Magazine.


E.R.B. Rail JV PS, the joint venture leading the construction of the Rail Baltica mainline in Latvia, one of the largest infrastructure projects in Europe, has chosen Vektor.io as its digital platform for managing and visualising infrastructure design information.

ERB will use the platform to bring together 2D plans, BIM models, GIS data, and other reference materials spread across many different formats and systems, directly in the browser, accessible both in the office and on site.

“We chose Vektor.io because it enables our team to access all design information in one platform, regardless of file format – to view 2D, 3D and survey data together,” said Agnis Mārtiņš Bērziņš, Regional BIM Coordinator of E.R.B. Rail JV PS.

“This helps all team members to work more efficiently by providing access to not only design, point clouds, orthophotos, and public maps with various data, but also to impressive measurement, sectioning, quantifying and analysing tools to ensure rapid and well-informed decision making.”

“We are proud to work alongside E.R.B. Rail on Rail Baltica,” added Teemu Nivell, Chief Commercial Officer, Vektor.io. “What excites us most is how technology supports their teams in daily work – from design reviews to measurements – turning complex data into a practical tool everyone can use.”

E.R.B. Rail JV PS is a partnership between Eiffage Génie Civil SAS (France), Budimex S.A. (Poland), and Rizzani de Eccher S.p.A. (Italy).

The consortium is delivering a high-speed rail corridor that will connect the Baltic States with the broader European network.


Discover what’s new in technology for architecture, engineering and construction — read the latest edition of AEC Magazine
👉 Subscribe FREE here

AI extracts knowledge from past projects https://aecmag.com/data-management/ai-extracts-knowledge-from-past-projects/ Wed, 05 Nov 2025 07:00:40 +0000 AI platform designed to turn know-how from past projects into “design intelligence”

The post AI extracts knowledge from past projects appeared first on AEC Magazine.

Tektome KnowledgeBuilder is designed to turn know-how from past projects into “design intelligence”

Tektome has unveiled KnowledgeBuilder, an AI-powered platform designed to automatically organise massive volumes of siloed project documents to help AEC teams make smarter decisions while avoiding repeated mistakes.

The software, already piloted by Takenaka Corporation, one of Japan’s largest construction firms, analyses and extracts key content from scattered files — including drawings, reports, photos, and handwritten notes on marked up PDFs — and consolidates the data into a central, structured, and “instantly searchable” knowledge base.

The platform enables architects and engineers to ask questions in plain language and quickly see how similar issues were handled in past projects, eliminating the need to “reinvent the wheel.”

According to the company, even non-IT staff can configure what to pull from drawings, proposals, photos or meeting minutes without coding or complex setup.

KnowledgeBuilder works across PDFs, CAD files (DWG and DXF), scanned images, handwritten markups, and more, with support for BIM files (RVT and IFC) coming soon.

Its search functionality includes two modes: combined filtering across project attributes, file-level attributes, and keywords for precise narrowing; and a semantic keyword search that understands context and synonyms and highlights matches inside documents and drawings.
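The two modes can be pictured with a toy sketch; the documents, field names and synonym table below are invented for illustration and are not Tektome’s actual data model:

```python
# Toy illustration of the two search modes. All documents, field
# names and the synonym table are invented for illustration.

DOCS = [
    {"project": "Tower A", "type": "drawing", "text": "as-built slab reinforcement layout"},
    {"project": "Tower A", "type": "minutes", "text": "meeting notes on facade glazing"},
    {"project": "Depot B", "type": "drawing", "text": "roof truss erection sequence"},
]

SYNONYMS = {"rebar": ["reinforcement"], "glass": ["glazing"]}

def combined_filter(docs, **attrs):
    """Mode 1: narrow by exact project- and file-level attributes."""
    return [d for d in docs if all(d.get(k) == v for k, v in attrs.items())]

def semantic_search(docs, keyword):
    """Mode 2: expand the keyword with synonyms, then match document text."""
    terms = [keyword] + SYNONYMS.get(keyword, [])
    return [d for d in docs if any(t in d["text"] for t in terms)]
```

Combined filtering narrows by exact attributes, while the semantic-style search still finds the reinforcement drawing for the query “rebar”, even though that word never appears in the text.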

According to Tektome, this natural query ability helps teams retrieve critical insights on demand, greatly improving decision-making speed and confidence.

For its KnowledgeBuilder implementation, Takenaka Corporation established multiple working groups. On-site team members took the lead in verifying how as-built drawing data could be structured and, after a three-month pilot programme, were able to set up their own custom extraction criteria tailored to their needs. This allowed them to quickly retrieve practical information such as past as-built drawings that matched specific project requirements, reference examples from similar projects, and summaries of relevant numerical data from previous designs.

Takenaka is now expanding the system by importing a larger volume of past as-built drawings and further enhancing search options and usability.

“The shift from manually searching hundreds of thousands of pages of drawings to instant access through natural language has revived veteran design expertise as actionable knowledge, allowing designers to focus on more creative tasks,” said Mr. Takaoka of the Structural Department, Design Division at Takenaka’s Tokyo headquarters.


Part3 to give architects control over construction drawings https://aecmag.com/data-management/part3-to-give-architects-control-over-construction-drawings/ Fri, 07 Nov 2025 08:31:55 +0000 ProjectFiles provides a single source of truth for drawings, feeding into submittals, RFIs, and field reports.

The post Part3 to give architects control over construction drawings appeared first on AEC Magazine.

ProjectFiles is designed to provide a single source of truth for drawings, feeding into submittals, RFIs, change documents, instructions, and field reports.

ProjectFiles from Part3 is a new construction drawing and documentation management system for architects, designed to help ensure the right drawings are always accessible on site, in real time, to everyone who needs them.

According to the company, unlike other tools that were built for contractors and retrofitted for everyone else, ProjectFiles was designed specifically with architects in mind.

ProjectFiles is a key element of Part3’s broader construction administration platform, and also connects drawings to the day-to-day management of submittals, RFIs, change documents, instructions, and field reports.



Automatic version tracking helps ensure the entire team is working from the most up-to-date drawings and documents. According to Part3, it’s designed to overcome problems such as walking onto site and finding contractors working from outdated drawings, or wasting time hunting through folders trying to find the current structural set before an RFI deadline.
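The underlying rule is simple: for each drawing number, only the highest revision is current. A minimal sketch of that logic (the field names and data are illustrative, not Part3’s schema):

```python
# Toy version of automatic version tracking: for each drawing number,
# only the highest revision is treated as current. Data is invented.

def current_set(files):
    """Map each drawing number to its latest revision."""
    latest = {}
    for number, revision in files:
        if number not in latest or revision > latest[number]:
            latest[number] = revision
    return latest

uploads = [("S-101", 1), ("S-101", 3), ("A-201", 2), ("S-101", 2)]
```

Here `current_set(uploads)` keeps S-101 revision 3 and A-201 revision 2, so superseded sets never surface as current.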

The software also features AI-assisted drawing detection, where files are automatically tagged with the correct drawing numbers, titles, and disciplines.
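Part3 doesn’t describe how its detection works, but conceptually even a simple pattern match over title block text can recover a drawing number and discipline. A hypothetical sketch (prefixes and pattern are invented; real title blocks vary widely):

```python
import re

# Hypothetical discipline prefixes; real title blocks vary widely.
DISCIPLINES = {"A": "Architectural", "S": "Structural", "M": "Mechanical"}

def tag_drawing(title_text):
    """Pull a drawing number such as 'S-101' out of title block text
    and map its prefix to a discipline; return None if nothing matches."""
    match = re.search(r"\b([ASM])-(\d{3})\b", title_text)
    if not match:
        return None
    return {"number": match.group(0), "discipline": DISCIPLINES[match.group(1)]}
```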


Meanwhile, learn more about Part3’s AI capabilities, along with tonnes of other AI-powered tools, in AEC Magazine’s AI Spotlight Directory.

Nemetschek and Takenaka form strategic partnership https://aecmag.com/bim/nemetschek-and-takenaka-form-strategic-partnership/ Thu, 06 Nov 2025 12:21:52 +0000 Memorandum of Understanding to help drive “digital transformation” and AI-driven solutions for AECO

The post Nemetschek and Takenaka form strategic partnership appeared first on AEC Magazine.


Nemetschek Group – the AECO software developer whose brands include Graphisoft, Vectorworks, Allplan and Bluebeam – and Takenaka, one of Japan’s largest construction companies, have signed a Memorandum of Understanding (MoU) to advance digital transformation and AI-driven solutions in the construction sector.

The MoU initiates a strategic partnership to develop and pilot AI-assisted, cloud-based, and open digital platforms that streamline and enhance collaborative workflows across planning, design, construction, and operation processes.

“This partnership with Takenaka, a true leader with deep expertise in the construction industry, is a pivotal step,” said Marc Nezet, chief strategy officer at the Nemetschek Group. “By combining their extensive, practical know-how with our advanced digital and AI capabilities, we are co-creating a more efficient, sustainable, and data-driven future for the entire AEC/O industry.



“We believe in empowering our partners and customers to combine human-centric AI innovations with sustainability across the building lifecycle.”

Key areas outlined within the agreement include a commitment to best practice exchange through regular knowledge-sharing sessions, methodologies, and operational insights.

Nemetschek and Takenaka will also focus on joint AI and digital platform innovation, working together to identify, prioritise, and develop cloud-based digital and AI solutions for the AECO sector.

Secure data sharing and validation form another cornerstone of the agreement, with governance models and technical safeguards established to enable data-driven transformation.

Finally, both parties reaffirm their commitment to data protection and compliance, ensuring adherence to privacy, security, and intellectual property standards in line with global best practices.

“This partnership embodies the forward-thinking spirit of our industry,” said Daniel Csillag, CEO of Graphisoft. “By partnering with Takenaka Corporation, we are laying the groundwork for truly collaborative, open, and data-driven workflows that benefit architects, engineers, and contractors worldwide. We are proud to contribute our expertise and technology towards this transformative journey, also building on an existing Enterprise Licensing and Service Agreement between Graphisoft and Takenaka Corporation.”

Nemetschek stated that the MoU serves as a foundation and guiding framework for future joint project-specific agreements. The agreement takes effect immediately and will remain in place for a period of five years.


Main image caption: From left to right: Mr Tetsuo Harada (Executive Managing Officer, Takenaka Corporation), Mr Daniel Csillag (CEO, Graphisoft), Mr Susumu Matsuo (General Manager, Digital Division Head Office, Takenaka Corporation).

Egnyte puts AEC AI Agents to work https://aecmag.com/ai/egnyte-puts-aec-ai-agents-to-work/ Mon, 15 Sep 2025 15:33:57 +0000 Agents extract details from specification files and deliver AI guidance for building code compliance

The post Egnyte puts AEC AI Agents to work appeared first on AEC Magazine.


Egnyte has embedded its first ‘secure, domain-specific’ AI agents within its platform to target some of the most time-consuming and costly parts of the AEC process, from bid to completion.

The Specifications Analyst and Building Code Analyst are designed to extract details from large specification files and quickly deliver AI guidance for building code compliance.

“These tools enable customers to take advantage of the power of AI without having to move their data and potentially expose it to security, compliance, and governance risks,” said Amrit Jassal, co-founder and CTO at Egnyte. “The AEC industry relies heavily on complex, content-intensive documents to make informed decisions throughout the project lifecycle, and a single error in a spec sheet or misinterpretation of a building code can lead to significant project delays and cost overruns. These AEC AI agents fundamentally reduce project risk and help firms to deliver better, more profitable outcomes.”


Find this article plus many more in the September / October 2025 Edition of AEC Magazine
👉 Subscribe FREE here 👈

According to Egnyte, the Specifications Analyst allows users to transform any size specification document or multiple documents into source data that delivers fast and useful answers. Users can apply smart filters, including table of contents and materials, to quickly locate key sections and aggregate extracted spec data across the spec divisions.

Meanwhile, the Building Code Analyst is designed to consolidate disparate codebooks (i.e. state, county, and municipality) into a unified source of truth. Egnyte explains that the agent enables users to quickly find, compare, and check code requirements across relevant codebooks and produce consistent, useful AI-powered answers.

The agent instantly surfaces key passages with links to the relevant source text and automatically flags overlapping or contradictory code provisions, even providing the ability to include previous clarifications to speed up the resolution of such issues.
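Egnyte doesn’t detail the mechanism, but the conflict-flagging step can be pictured as grouping the same provision across codebooks and surfacing disagreements. A toy sketch with invented provision IDs and values:

```python
# Toy conflict flagging: group each provision across codebooks and
# surface any whose required values disagree. All values are invented.

def flag_conflicts(codebooks):
    """Return provisions whose values differ between codebooks."""
    by_provision = {}
    for source, provisions in codebooks.items():
        for pid, value in provisions.items():
            by_provision.setdefault(pid, {})[source] = value
    return {pid: vals for pid, vals in by_provision.items()
            if len(set(vals.values())) > 1}

codebooks = {
    "state":  {"egress_width_mm": 900, "riser_max_mm": 180},
    "county": {"egress_width_mm": 1000, "riser_max_mm": 180},
}
```

Here only the egress width is flagged, because the state and county codebooks agree on the maximum riser height.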

Egnyte states that all of its AI agents have access to content in the Egnyte repository while preserving its security, compliance, and data governance.

The agents also have access to data sources on the internet. According to the company, this helps ensure their outputs reflect the latest updates and amendments to building codes and other relevant information without compromising data saved in Egnyte repositories.

KINDAutomatic to liberate Revit data https://aecmag.com/data-management/kindautomatic-extracts-data-from-revit/ Wed, 16 Jul 2025 06:47:29 +0000 New cloud platform designed to streamline flow of BIM data

The post KINDAutomatic to liberate Revit data appeared first on AEC Magazine.


KINDAutomatic, which launched in beta this month, aims to streamline BIM workflows by offering Revit users and AEC software developers a faster, more intuitive way to access data in Revit (RVT/RFA) and IFC files — bypassing the need for traditional desktop software.

The cloud platform delivers a ‘comprehensive set of tools’ for extracting, processing, and parsing BIM data.

Through API integration, data can be seamlessly fed into existing software, while custom workflows enable the automation of data extraction.

According to the company, its scalable cloud platform can extract data from Revit and IFC files up to 10x faster than conventional methods.

Users can extract views (PDFs and SVG), Geometry (IFC and glTF), and Metadata (JSON). Files can be viewed directly in the UI, with a full breakdown of geometry and metadata.

Meanwhile, with ‘advanced filtering’, users can filter and query BIM data without having to ‘wade through irrelevant information’.

Extracted data can also be pushed to third party tools such as Speckle (for real-time collaboration) or AWS IoT TwinMaker (digital twins).

Bluebeam expands third party integrations https://aecmag.com/collaboration/bluebeam-expands-third-party-integrations/ Mon, 07 Jul 2025 06:50:05 +0000 New Integrations Directory connects users with Procore, SharePoint, ACC and more

The post Bluebeam expands third party integrations appeared first on AEC Magazine.

New Integrations Directory connects users with Procore, SharePoint, Autodesk Construction Cloud, Vectorworks and more

AEC software developer Bluebeam has launched an Integrations Directory, designed to provide a centralised hub for Bluebeam users to connect with third-party tools such as Procore Documents, Microsoft SharePoint, Autodesk Construction Cloud, and Vectorworks.

Bluebeam Integrations cover a broad range of connection types—including plug-ins, API-driven integrations, and document management systems—that link Bluebeam with external applications to streamline workflows, eliminate manual processes, and enhance collaboration.

Bluebeam has also made several enhancements to Bluebeam Revu, its flagship tool for markup and collaboration on PDFs, drawings and documents.

The latest release, Revu 21.6, now runs natively and 30% faster on ARM-based devices such as Microsoft Surface tablets and via Parallels on Apple M-series Macs.

Revu 21.6 also introduces new features aimed at reducing the time it takes to communicate project issues and streamlining repetitive tasks. With Markups on Capture, users can now draw directly on photos taken in the field to pinpoint issues with greater clarity. Tool Chest enhancements add multi-select drag-and-drop, and customisable punch keys now give users more control and speed when using reusable markups.

“These updates reflect our commitment to solving real-world challenges faced by AEC professionals in the office, in the trailer and on their job sites,” said Jason Bonifay, Chief Technology Officer at Bluebeam. “Whether you’re syncing data across tools or just trying to close punch items faster, we’re delivering solutions that remove friction so teams can focus less on workarounds and more on building great projects.”

AI: Information Integrity https://aecmag.com/ai/ai-information-integrity/ Wed, 16 Apr 2025 05:00:12 +0000 How to harness the power of LLMs without losing sight of critical thinking

The post AI: Information Integrity appeared first on AEC Magazine.

As AI reshapes how we engage with information, Emma Hooper, head of information management strategy at RLB Digital, explores how we can refine large language models to improve accuracy, reduce bias, and uphold data integrity — without losing the essential human skill of critical thinking

In a world where AI is becoming an increasingly integral part of our everyday lives, the potential benefits are immense. However, as someone with a background in technology — having spent my career producing, managing or thinking about information — I continue to contemplate how AI will alter our relationship with information and how the integrity and quality of data will be managed.

Understanding LLMs

AI is a broad field focused on simulating human intelligence, enabling machines to learn from examples and apply this learning to new situations. As we delve deeper into its sub-types, we become more detached from the inner workings of these models, and the statistical patterns they use become increasingly complex. This is particularly relevant with large language models (LLMs), which generate new content based on training data and user instructions (prompts).

An LLM uses a transformer model, a specific type of neural network. These models learn patterns and connections between words and phrases, so the more examples they are fed, the more accurate they become. Consequently, they require vast amounts of data and significant computational power, which puts considerable pressure on the environment. These models power tools such as ChatGPT, Gemini, and Claude.


Find this article plus many more in the March / April 2025 Edition of AEC Magazine
👉 Subscribe FREE here 👈

The case of DeepSeek-R1

DeepSeek-R1, which has recently been in the news, demonstrates how constraints can drive innovation through good old-fashioned problem-solving. This open-source LLM uses rule-based reinforcement learning, making it cheaper and less compute-intensive to train compared to more established models.

However, since it is an LLM, it still faces limitations in output quality. When it comes to accuracy, LLMs are statistical models that operate on probabilities, so their responses are limited to what they’ve been trained on. They perform well when operating within their dataset, but if there are gaps or they go out of scope, inaccuracies or hallucinations can occur.

Inaccurate information is problematic when reliability is crucial, but trust in quality isn’t the only issue. General LLMs are trained on internet content, but much domain-specific knowledge isn’t captured online or is behind downloads/paywalls, so we’re missing out on a significant chunk of knowledge.

Training LLMs: the built environment

Training LLMs is resource-intensive and requires vast amounts of data. However, data sharing in the built environment is limited, and ownership is often debated. This raises several questions in my mind: Where does the training data come from? Do trainers have permission to use it? How can organisations ensure their models’ outputs are interoperable? Are SMEs disadvantaged due to limited data access? How can we reduce bias from proprietary terminology and data structures? Will the vast variation hinder the ability to spot correct patterns?

With my information manager hat on: without proper application and understanding, it’s not just rubbish in, rubbish out; it’s rubbish out on a huge scale, all of it artificial, that completely overwhelms us.

How do we improve the use of LLMs?

Techniques such as Retrieval Augmented Generation (RAG) use vector databases to retrieve relevant information from a specific knowledge base. This information is included in the LLM prompt to produce outputs that are much more relevant and up to date. Having more control over the knowledge base ensures the sources are known and reliable.
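The retrieval step can be sketched in miniature. Here a toy bag-of-words similarity stands in for a real embedding model and vector database, and the knowledge-base passages are invented:

```python
import math
import re
from collections import Counter

# Invented knowledge-base passages; a real system would store embeddings
# from a trained model in a vector database instead of word counts.
KNOWLEDGE_BASE = [
    "Fire doors on escape routes must be self-closing.",
    "Precast planks were used on the Riverside project to cut programme time.",
]

def vectorise(text):
    """Crude stand-in for an embedding: lower-cased word counts."""
    return Counter(re.findall(r"[a-z]+", text.lower()))

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(query, k=1):
    """The 'R' in RAG: rank passages by similarity to the query."""
    q = vectorise(query)
    return sorted(KNOWLEDGE_BASE, key=lambda d: cosine(q, vectorise(d)), reverse=True)[:k]
```

The top-ranked passages would then be prepended to the LLM prompt, so the model answers from known, reliable sources.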

This leads to an improvement, but the machine still doesn’t fully understand what it’s being asked. By introducing more context and meaning, we might achieve better outputs. This is where returning to information science and using knowledge graphs can help.

A knowledge graph is a collection of interlinked descriptions of things or concepts. It uses a graph-structured data model within a database to create connections – a web of facts. These graphs link many ideas into a cohesive whole, allowing computers to understand real world relationships much more quickly. They are underpinned by ontologies, which provide a domain-focused framework to give formal meaning. This meaning, or semantics, is key. The ontology organises information by defining relationships and concepts to help with reasoning and inference.

Knowledge graphs enhance the RAG process by providing structured information with defined relationships, creating more context-enriched prompts. Organisations across various industries are exploring how to integrate knowledge graphs into their enterprise data strategies. So much so that knowledge graphs have even made it onto the Gartner Hype Cycle, on the Slope of Enlightenment.
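A knowledge graph’s contribution to RAG can be pictured as a tiny triple store whose neighbours enrich the query before retrieval; the triples below are invented examples, not a real ontology:

```python
# Toy knowledge graph as subject-predicate-object triples (invented
# facts), used to add related concepts to a query before retrieval.

TRIPLES = [
    ("fire door", "is_a", "door"),
    ("fire door", "required_on", "escape route"),
    ("escape route", "part_of", "means of escape"),
]

def neighbours(entity):
    """Concepts directly linked from an entity by any relationship."""
    return {obj for subj, _pred, obj in TRIPLES if subj == entity}

def enrich_query(query, entity):
    """Context-enriched RAG: append graph neighbours to the query."""
    return query + " " + " ".join(sorted(neighbours(entity)))
```

The enriched query now carries “escape route” alongside “fire door”, so retrieval can surface passages that never mention the original term.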

The need for critical thinking

From an industry perspective, semantics is not just where the magic lies for AI; it is also crucial for sorting out the information chaos in the industry. The tools discussed can improve LLMs, but the results still depend on a backbone of good information management. This includes having strategies in place to ensure information meets the needs of its original purpose and implementing strong assurance processes to provide governance.

Therefore, before we move too far ahead, I believe it’s crucial for the industry to return to the theory and roots of information science. By understanding this, we can lay strong foundations that all stakeholders can work from, providing a common starting point and a sound base to meet AI halfway and derive the most value from it.

Above all, it’s important not to lose sight of the fact that this begins and ends with people, and one of the greatest things we can ever do is think critically and keep questioning!

Future of AEC Software: Special Report https://aecmag.com/bim/future-of-aec-software-special-report/ Mon, 22 Jul 2024 17:00:42 +0000 This must-read report details what the AEC industry wants from future design tools

The post Future of AEC Software: Special Report appeared first on AEC Magazine.

What the AEC industry wants from future design tools

Written by Aaron Perry, Head of Digital Design at Allford Hall Monaghan Morris, and Andy Watts, director of design technology at Grimshaw


This must-read report details what the AEC industry wants from future design tools, covering everything from data frameworks, context and scale, responsible design, and modular construction, to user experience, modelling capabilities, automation, intelligence, deliverables and more.



Watch the NXT DEV presentations from Aaron Perry and Andy Watts

NXT DEV 2023 – watch the video on NXTAEC.com

Aaron Perry, talking on behalf of a collective of medium-to-large AEC firms, gives a masterful presentation as he introduces the ‘Future Design Software Specification’.


NXT DEV 2024 – watch the video on NXTAEC.com

Andy Watts gives an important update on the specification, then hands over to Allister Lewis, ADDD, to talk about benchmarking software against the specification.


BHoM – addressing the interoperability challenge https://aecmag.com/data-management/bhom-addressing-the-interoperability-challenge/ Tue, 03 Dec 2024 08:00:39 +0000 This computational development project allows AEC teams to improve project collaboration and foster standardisation

The post BHoM – addressing the interoperability challenge appeared first on AEC Magazine.

The BHoM computational development project allows AEC teams to improve project collaboration, foster standardisation and develop advanced computational workflows, as Buro Happold’s Giorgio Albieri and Christopher Short explain

In Buro Happold’s structural engineering team, we’re constantly working on unique and challenging projects, from towering skyscrapers to expansive stadiums, intricate museums to impressive bridges.

Our approach is all about exploring multiple options, conducting detailed analyses, and generating 3D and BIM models to bring these projects to life. But this process comes with the major challenge of interoperability – the ability of different systems to exchange information.

Since we collaborate with multiple disciplines and design teams from all over the world, we regularly deal with data from various sources and formats, which can be a real challenge to manage.

The AEC industry often deals with this by creating ad-hoc tools as and when the need arises (such as complex spreadsheets or macros). But these tools often end up being one-offs, used by only a small group, so we end up reinventing the wheel again and again.

This is where the BHoM (Buildings and Habitats object Model) comes into play, a powerful open-source collaborative computational development project for the built environment supported by Buro Happold.

BHoM helps improve collaboration, foster standardisation and develop advanced computational workflows. Thanks to its central common language, it makes it possible to interoperate between many different programs.

Instead of creating translators between every possible combination of software applications, we just need to write one single translator between BHoM and a target software, to then connect to all the others.

One-to-one connections between software packages (top) vs direct connection to BHoM’s centralised, software-agnostic environment (above), highlighting the current collection of main BHoM adapters
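The hub-and-spoke idea (N adapters rather than N x N translators) can be sketched in a few lines; the central schema and converters here are invented for illustration and are not BHoM’s actual object model:

```python
# Hub-and-spoke sketch: each tool needs one converter pair to and from
# a central, software-agnostic schema. Schemas here are invented.

def from_tool_a(native):
    """Tool A's native object -> central schema (lengths in metres)."""
    return {"type": native["kind"], "length_m": native["len"]}

def to_tool_b(central):
    """Central schema -> Tool B's native object (lengths in mm)."""
    return {"category": central["type"], "length_mm": central["length_m"] * 1000}

beam = {"kind": "beam", "len": 6.0}
converted = to_tool_b(from_tool_a(beam))
```

With N software packages, only N such converter pairs are needed, rather than N x (N - 1) direct translators between every possible pair.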

The solution: The BHoM

The BHoM consists of a collection of schemas, functionalities and conversions with the following three main characteristics:

• It attempts to unify the “shape” of the data

• It is crafted as software-agnostic

• It is open source so that everyone can contribute and use it

Currently, the BHoM has over 1,200 object models with an extendable data dictionary and adapters to over 30 different software packages.

With the BHoM, we’ve refined and enhanced our approach to structural design.

Once the architectural model is received, using the BHoM we can quickly and precisely build several Finite Element Analysis (FEA) structural models for conducting structural analyses.

It’s possible to clean and rationalise the original geometries for specific purposes and assign/update attributes to all objects based on the results of both design and coordination with other disciplines.

Finally, the BIM model of the structure can be generated in an algorithmic manner.


BHoM algorithm for the computation and documentation of connection forces, with textual and graphical outputs

BHoM in practice

It’s often thought that computational and parametric design is only applicable to the very early stages of a project, or only to projects that rely on very complex geometry.

The reality is, computational design is greatly beneficial at every stage: from the conceptual feasibility study to the detailed design of steel connections.

At Buro Happold, we use the BHoM to help us address multiple stages throughout a project, as demonstrated in the following case study examples which focus on the re-development of a desalination plant in Saudi Arabia into a huge museum.


Find this article plus many more in the Nov / Dec 2024 Edition of AEC Magazine
👉 Subscribe FREE here 👈

Modelling the existing and the new

Let’s see how a computational workflow applies to the modelling and analysis of existing structures making use of the BHoM.

For the Saudi Arabian project, all we had was a set of scanned PDF drawings of the existing structures.

Within a couple of months, we had to build accurate Finite Element Models for each of them and run several feasibility studies against the new proposed loadings.

A parametric approach was vital. We therefore developed a computational workflow to create the geometric models of all the built assets in Rhino via Grasshopper by tracing the PDF drawings, assign metadata to them, and push them via the BHoM into Robot to carry out preliminary analyses and design checks.

This approach saved us considerable time and effort compared to a more traditional workflow.

Moving on to the next stage of the project, we needed to test many different options for the proposed structures very quickly, modifying grids, floor heights, and beam and column arrangements, as well as playing with the geometry of arched trusses and trussed mega-portals.

Again, a computational approach was the only way to meet the challenge, so we developed a large-scale algorithm in Grasshopper.

By pulling data from a live database in Excel and making use of an in-house library of clusters and textual scripts, this algorithm was able to leverage the capabilities of the BHoM to model the building parametrically in Rhino, push it to Robot for the FEA and finally generate the BIM model in Revit – all in a single parametric workflow.
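The parametric-generation step of such a workflow can be sketched as follows. This is illustrative Python, not the actual Grasshopper/BHoM implementation, and the parameter set is a hypothetical stand-in for the live Excel database: a handful of grid parameters drives the generation of every column centreline, so changing one number regenerates the whole option.

```python
# Hypothetical parameter set standing in for the live Excel database.
params = {"bays_x": 4, "bays_y": 3, "bay_width": 8.0,
          "floor_height": 4.5, "floors": 2}

def generate_columns(p):
    """Generate column centrelines for a simple rectangular grid.

    Each column is a pair of points ((x, y, z0), (x, y, z1)), one element
    per grid position per storey.
    """
    columns = []
    for i in range(p["bays_x"] + 1):
        for j in range(p["bays_y"] + 1):
            x, y = i * p["bay_width"], j * p["bay_width"]
            for k in range(p["floors"]):
                z0 = k * p["floor_height"]
                z1 = (k + 1) * p["floor_height"]
                columns.append(((x, y, z0), (x, y, z1)))
    return columns

cols = generate_columns(params)
# 5 x 4 grid positions x 2 storeys = 40 column elements
```

In the real workflow, the generated geometry would then be pushed through the BHoM adapters to the FEA package and, finally, to Revit; here the point is simply that the model is fully regenerated from its driving parameters.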

Managing data flow: BIM – FEA

As we move into the later stages of a project, we see more and more how computational workflows are beneficial not only for geometry generation but also for data management and design calculations.

At Stages 03 and 04 we needed to transfer and modify, very quickly, the huge sets of metadata assigned to every asset within our BIM models, while being able to test them from a design perspective in finite element software.

Again, we developed an algorithm in Grasshopper leveraging the BHoM to enable a circular data flow between BIM and analysis software – Revit and ETABS in this instance.

This made it possible to test and update all our models quickly and precisely, notwithstanding the sheer amount of data involved.

Interdisciplinary coordination

As usual, when moving forward in the project, coordination with MEP engineers starts to ramp up and when structures are big and complex, it becomes even more difficult. The challenge we had to face was intimidating. We had eight concrete cores, 45m tall, more than 9,000 Mechanical, Electrical and Plumbing (MEP) assets for the building and around 1,500 builderswork openings to be provided in the core walls to allow them to pass through.


Graphical representation of the algorithm for the automated creation of builderswork openings in the concrete cores of the building

On top of this, we needed to specify openings of different sizes depending on the type of MEP asset, as well as to group and cluster openings based on their relative distance and other design criteria.

Again, a high level of complexity and a huge amount of data to deal with. Indeed, a computational approach was needed.

Using Grasshopper, the BHoM and Rhino.Inside Revit, we developed an algorithm, graphically represented below.


Flowchart of typical BHoM-based computational workflow on projects

Through grouping operations, model laundry algorithms and the parametric modelling of the builderswork openings, we were able to parametrically generate the BIM model of the cores, complete with the required builderswork penetrations.

In parallel with this, the algorithm also generated the corresponding FE model of the core walls, so the structural feasibility of the penetrations could be checked before incorporating them in Revit.

The algorithm detected the intersections between pipes and walls, then generated an opening around each intersection, with size and colour varying according to the input criteria. A fine-tuned grouping algorithm then clustered and rationalised them into bigger openings, wrapping them together based on user-input criteria.
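The grouping step can be illustrated with a small sketch. This is a hypothetical Python approximation, not Buro Happold’s actual algorithm: openings are modelled as axis-aligned rectangles on a wall face and greedily merged into a single bounding opening whenever the gap between them falls below a user-input threshold.

```python
def gap(a, b):
    """Smallest axis-aligned gap between two rectangles (xmin, ymin, xmax, ymax);
    zero if they overlap."""
    dx = max(b[0] - a[2], a[0] - b[2], 0.0)
    dy = max(b[1] - a[3], a[1] - b[3], 0.0)
    return max(dx, dy)

def cluster_openings(rects, threshold):
    """Greedily merge openings whose gap is below the threshold into their
    common bounding rectangle; repeat until no more merges are possible."""
    rects = list(rects)
    merged = True
    while merged:
        merged = False
        for i in range(len(rects)):
            for j in range(i + 1, len(rects)):
                if gap(rects[i], rects[j]) <= threshold:
                    a, b = rects[i], rects[j]
                    rects[i] = (min(a[0], b[0]), min(a[1], b[1]),
                                max(a[2], b[2]), max(a[3], b[3]))
                    del rects[j]
                    merged = True
                    break
            if merged:
                break
    return rects

# Two openings 0.1 m apart plus one far away; with a 0.2 m threshold,
# the first two merge and the third stays separate.
openings = [(0, 0, 0.3, 0.3), (0.4, 0, 0.7, 0.3), (5.0, 0, 5.3, 0.3)]
grouped = cluster_openings(openings, threshold=0.2)
```

The real algorithm also accounts for opening type, structural criteria and the 3D wall geometry, but the core idea – proximity-driven rationalisation into fewer, larger penetrations – is the same.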

Finally, after testing the openings in the Finite Element software, the algorithm pushed them into Revit as Wall Hosted Families and a live connection between the Rhino and the Revit environment streamlined any update process in parallel.

Producing large data sets

Moving even further into detailed design, the amount of data to deal with on a project of such scale becomes more and more overwhelming.

This is what we had to face when dealing with the design of the connections. Although the design was subcontracted to another office, we faced the challenge of providing all the connection design forces in a consistent and comprehensive format, both in textual and graphical contexts.

Indeed, this is no easy task, especially when dealing with around 35,000 connections, 60 load combinations, 2,000 different frame inclinations and six design forces per connection, spread across three different finite element software packages (ETABS, Robot and Oasys GSA).

We had to deal with 12.6 million pieces of data, and we had to do it very quickly while being able to update them on the fly. Again, a computational workflow was required.

Via Grasshopper and the BHoM, we developed an algorithm to extract, post-process and format the connection forces from the Finite Element models of all the assets of the project, serialise them in JSON, save them in properly formatted Excel files and show them graphically in corresponding Rhino 3D models via tagging and attributes assignment.
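A simplified sketch of what one such record and its serialisation might look like (illustrative Python; the field names and units are assumptions, not the project’s actual schema):

```python
import json
from dataclasses import dataclass, asdict

# Hypothetical record for one connection under one load combination; the six
# design forces are assumed to be axial, two shears, torsion and two moments.
@dataclass
class ConnectionForces:
    connection_id: str
    combination: str
    N: float    # axial force (kN)
    Vy: float   # shear, minor axis (kN)
    Vz: float   # shear, major axis (kN)
    T: float    # torsion (kNm)
    My: float   # bending, minor axis (kNm)
    Mz: float   # bending, major axis (kNm)

records = [
    ConnectionForces("CN-0001", "ULS-01", 420.0, 12.5, 88.0, 1.2, 15.0, 96.0),
    ConnectionForces("CN-0001", "ULS-02", 395.0, 10.1, 91.5, 0.9, 13.2, 101.3),
]

# Serialise to JSON for downstream parties; the same records can equally be
# written to formatted Excel sheets or tagged onto Rhino geometry.
payload = json.dumps([asdict(r) for r in records], indent=2)

# Scale check: 35,000 connections x 60 combinations x 6 forces per connection.
total_values = 35_000 * 60 * 6
```

Keeping a single structured record type at the centre is what makes it practical to re-extract and re-issue all 12.6 million values on the fly whenever the analysis models change.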

All this information was sent out for the design to be carried out by other parties.

Conclusions

Applying a specialised approach, relying on algorithmic methodology and leveraging state-of-the-art computational tools such as the BHoM enables us, at Buro Happold, to deliver comprehensive and advanced structural solutions, ensuring efficiency, sustainability and optimal performance across all stages of a project.

