
Prerequisites

Before installing AI-BIM App, ensure your development environment meets these requirements:

Required Software

Node.js

Minimum version: Node.js 16.x or higher
Recommended: Node.js 18.x LTS or Node.js 20.x LTS

Check your version:
node --version
npm --version
Install Node.js:
# Using nvm
nvm install 20
nvm use 20
OpenAI API Key

An OpenAI API key is required for the AI querying functionality.

Get your API key:
  1. Sign up at platform.openai.com
  2. Navigate to API Keys section
  3. Create a new secret key
  4. Copy and save it securely
Keep your API key secret. Never commit it to version control or share it publicly.
Browser

Required browser support:
  • Chrome 90+
  • Firefox 88+
  • Safari 14+
  • Edge 90+
WebGL 2.0 support required for 3D rendering.
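Before starting the viewer, you can detect WebGL 2.0 support up front and show a friendly message instead of a blank canvas. The helper below is an illustrative sketch, not part of the app; it is typed structurally so it can also be exercised outside a browser with a stub document.

```typescript
// Hypothetical helper (not in the repo): checks whether the current
// environment can create a WebGL 2.0 rendering context.
type CanvasLike = { getContext(name: string): unknown };
type DocumentLike = { createElement(tag: string): CanvasLike };

function supportsWebGL2(doc: DocumentLike): boolean {
  try {
    // getContext returns null when the context type is unsupported
    return doc.createElement("canvas").getContext("webgl2") != null;
  } catch {
    // Some locked-down environments throw instead of returning null
    return false;
  }
}

// In the app you would call it with the real document:
// if (!supportsWebGL2(document)) { /* show an unsupported-browser message */ }
```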

Installation Steps

1. Clone the repository

git clone <your-repository-url>
cd bim-app
Project structure:
bim-app/
├── src/
│   ├── main.ts              # Application entry point
│   ├── bim-components/      # Custom BIM components
│   │   ├── ChatGpt/         # AI query component
│   │   ├── LoadIfc/         # IFC file loader
│   │   ├── ChartData/       # Analytics component
│   │   └── AppManager/      # Global state management
│   └── components/          # UI components
├── package.json             # Dependencies
├── tsconfig.json            # TypeScript configuration
├── vite.config.ts           # Vite build configuration
└── index.html               # HTML entry point
2. Install dependencies

npm install
This installs all required packages. See Dependencies Explained below for details.
3. Configure environment variables

Create a .env file in the project root:
touch .env
Add your OpenAI API key:
.env
VITE_OPENAI_API_KEY=sk-your-actual-api-key-here
The VITE_ prefix is required for Vite to expose environment variables to the browser.
Verify the .env file is gitignored:
cat .gitignore | grep .env
# Should output: .env
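Because a missing key only surfaces later as a 401, it can help to fail fast at startup. The following is a hypothetical sketch (the function name and messages are illustrative, not the app's actual code); the `sk-` prefix check is a sanity check only, since key formats can change.

```typescript
// Hypothetical startup check: fail fast with a clear message when
// VITE_OPENAI_API_KEY is missing, instead of a 401 at query time.
function validateOpenAiKey(key: string | undefined): string {
  if (!key || key.trim() === "") {
    throw new Error(
      "VITE_OPENAI_API_KEY is not set. Add it to .env and restart the dev server."
    );
  }
  if (!key.startsWith("sk-")) {
    // OpenAI secret keys currently start with "sk-"; warn but don't block
    console.warn("VITE_OPENAI_API_KEY does not look like an OpenAI secret key.");
  }
  return key;
}

// Usage sketch:
// const apiKey = validateOpenAiKey(import.meta.env.VITE_OPENAI_API_KEY);
```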
4. Start the development server

npm run dev
Expected output:
VITE v5.2.0  ready in 543 ms

➜  Local:   http://localhost:5173/
➜  Network: use --host to expose
➜  press h to show help
Open your browser and navigate to the provided URL (usually http://localhost:5173).

Dependencies Explained

The package.json defines all project dependencies:

Core Dependencies

package.json
{
  "dependencies": {
    "@thatopen/components": "~2.4.2",
    "@thatopen/components-front": "~2.4.2",
    "@thatopen/fragments": "~2.4.0",
    "@thatopen/ui": "~2.4.0",
    "@thatopen/ui-obc": "~2.4.0",
    "chart.js": "^4.4.8",
    "jszip": "3.10.1",
    "openai": "^4.89.0",
    "three": "0.160.1",
    "web-ifc": "~0.0.66"
  }
}
ThatOpen Components (@thatopen/*)

Purpose: BIM-specific component framework built on Three.js

Packages:
  • @thatopen/components - Core components (FragmentsManager, IfcLoader, Worlds, Classifier)
  • @thatopen/components-front - Frontend rendering (PostproductionRenderer, Highlighter, IfcStreamer)
  • @thatopen/fragments - Fragment-based geometry system for efficient BIM loading
  • @thatopen/ui - Base UI component system
  • @thatopen/ui-obc - BIM-specific UI components
Key features:
  • Fragment-based IFC loading (faster than traditional methods)
  • Built-in IFC relationship indexing
  • Spatial structure navigation
  • Post-processing effects (AO, custom lines)
Three.js (three)

Purpose: WebGL-based 3D graphics library

Used for:
  • 3D scene rendering
  • Camera controls (OrthoPerspectiveCamera)
  • Materials and shaders
  • Geometry management
Note: Version locked to 0.160.1 to match ThatOpen components compatibility.
web-ifc

Purpose: High-performance IFC parser compiled to WebAssembly

Features:
  • Parses IFC files directly in the browser
  • Extracts geometry and properties
  • Provides IFC entity type constants (IFCWALL, IFCSLAB, etc.)
import * as WEBIFC from "web-ifc";

// Entity type constants
WEBIFC.IFCWALL
WEBIFC.IFCWALLSTANDARDCASE
WEBIFC.IFCSLAB
WEBIFC.IFCBEAM
WEBIFC.IFCBUILDINGSTOREY
OpenAI (openai)

Purpose: Official OpenAI API client

Usage in app:
// src/bim-components/ChatGpt/index.ts
const response = await fetch("https://api.openai.com/v1/chat/completions", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    "Authorization": `Bearer ${apiKey}`
  },
  body: JSON.stringify({
    model: "gpt-3.5-turbo",
    messages: [...]
  })
});
Model used: GPT-3.5-turbo (cost-effective for IFC queries)
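The request body above grounds the model's answer in the loaded IFC data via the messages array. A hypothetical helper (names are illustrative, not the app's actual code) makes the shape explicit:

```typescript
// Hypothetical sketch: build the chat-completion request body so the
// model answers using the loaded IFC data as context.
type ChatMessage = { role: "system" | "user"; content: string };

function buildIfcQueryBody(question: string, ifcData: string) {
  const messages: ChatMessage[] = [
    {
      role: "system",
      content: "You answer questions about the following IFC model data:\n" + ifcData,
    },
    { role: "user", content: question },
  ];
  // This object becomes JSON.stringify(...) in the fetch body
  return { model: "gpt-3.5-turbo", messages };
}
```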
Chart.js (chart.js)

Purpose: Data visualization library

Used for:
  • Bar charts (element counts by type)
  • Pie charts (distribution by building storey)
  • Interactive data exploration
// src/bim-components/ChartData/index.ts
import Chart, { ChartType } from "chart.js/auto";

const chartDataset = {
  type: "bar" as ChartType,
  data: {
    labels: ["IfcWall", "IfcSlab", "IfcBeam"],
    datasets: [{ data: [120, 45, 78], ... }]
  }
};
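Building that dataset object from live model data is mostly a matter of reshaping a counts map into the labels/data arrays Chart.js expects. A hypothetical helper (illustrative, not the app's code):

```typescript
// Hypothetical helper: turn a map of IFC type -> element count
// into the structure Chart.js expects for a bar chart.
function toBarDataset(counts: Record<string, number>) {
  const labels = Object.keys(counts);
  return {
    type: "bar" as const,
    data: {
      labels,
      // Keep data in the same order as the labels array
      datasets: [{ label: "Element count", data: labels.map((l) => counts[l]) }],
    },
  };
}

// Usage sketch:
// new Chart(canvasElement, toBarDataset({ IfcWall: 120, IfcSlab: 45, IfcBeam: 78 }));
```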
JSZip (jszip)

Purpose: ZIP file handling for fragment files

Use case: Loading pre-converted fragment files, which are packaged as ZIP:
// src/components/Toolbars/Sections/Import.ts
import JSZip from "jszip";

const zip = new JSZip();
await zip.loadAsync(zipBuffer);

const geometry = await zip.file("geometry.frag")!.async("uint8array");
const properties = await zip.file("properties.json")!.async("string");
const relationsMap = await zip.file("relations-map.json")!.async("string");

Development Dependencies

{
  "devDependencies": {
    "typescript": "5.2.2",
    "vite": "5.2.0",
    "@types/three": "0.160.0",
    "eslint": "8.57.0",
    "prettier": "3.3.3"
  }
}
TypeScript 5.2.2 - Type safety and modern JavaScript features

Vite 5.2.0 - Fast build tool with:
  • Hot module replacement (HMR)
  • Optimized production builds
  • Native ES modules support

Build Process

Development Mode

npm run dev
What happens:
  1. Vite starts development server on port 5173
  2. TypeScript files are transpiled on-the-fly
  3. Hot Module Replacement (HMR) enables instant updates
  4. Source maps are generated for debugging
Vite configuration:
vite.config.ts
export default defineConfig({
  base: "./",
  esbuild: {
    supported: {
      "top-level-await": true, // Required for IFC loading
    },
  },
});

Production Build

npm run build
Build steps:
  1. TypeScript compilation (tsc)
  2. Vite optimization and bundling
  3. Output to dist/ directory
Preview production build:
npm run preview

TypeScript Configuration

tsconfig.json
{
  "compilerOptions": {
    "target": "ES2020",
    "module": "ESNext",
    "moduleResolution": "bundler",
    "strict": true,
    "skipLibCheck": true
  }
}

Environment Setup

OpenAI API Configuration

The API key is accessed in the ChatGpt component:
src/bim-components/ChatGpt/index.ts
const apiKey = import.meta.env.VITE_OPENAI_API_KEY;

const response = await fetch("https://api.openai.com/v1/chat/completions", {
  headers: {
    "Authorization": `Bearer ${apiKey}`
  },
  // ...
});
If the API key is missing or invalid, AI query features will fail with a 401 error.

Environment Variables

Available variables:
  • VITE_OPENAI_API_KEY (required) - OpenAI API key for GPT queries
Adding more variables: All environment variables must be prefixed with VITE_ to be exposed to the client:
.env
VITE_API_ENDPOINT=https://api.example.com
VITE_ENABLE_ANALYTICS=true
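Note that Vite exposes env values as strings, so a flag like VITE_ENABLE_ANALYTICS=true arrives as the string "true". A small normalizer (hypothetical, shown as a sketch) avoids accidental truthiness bugs:

```typescript
// Hypothetical helper: parse a string-valued env flag into a boolean.
function envFlag(value: string | undefined, fallback = false): boolean {
  if (value === undefined) return fallback;
  return ["1", "true", "yes", "on"].includes(value.trim().toLowerCase());
}

// Usage sketch:
// const analyticsOn = envFlag(import.meta.env.VITE_ENABLE_ANALYTICS);
```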

Troubleshooting Common Issues

AI query features fail with "API key not found"

Cause: Missing or incorrectly named environment variable

Solution:
  1. Verify .env file exists in project root
  2. Check variable is named exactly VITE_OPENAI_API_KEY
  3. Restart the dev server after creating/modifying .env
# Stop the server (Ctrl+C) and restart
npm run dev
3D viewer does not render

Cause: Browser doesn't support WebGL 2.0, or GPU acceleration is disabled

Check WebGL support: Visit webglreport.com

Enable GPU acceleration:
  • Chrome: chrome://settings → Advanced → System → Enable hardware acceleration
  • Firefox: about:config → set webgl.force-enabled = true

Update your graphics drivers if WebGL is still unavailable.
IFC file fails to load

Possible causes:

1. File too large
  • Files >50MB may cause browser memory issues
  • Solution: Use the Tiles streaming option for large models

2. Unsupported IFC version
  • Check the IFC schema version (should be IFC2x3, IFC4, or IFC4.3)
  • Open the file in a text editor and check the header: FILE_SCHEMA(('IFC4'));

3. Corrupted file

4. Browser console errors
  • Open DevTools (F12) and check the Console for specific error messages
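Checking the schema version can also be done programmatically: the IFC STEP header appears in the first few hundred bytes, so reading a small slice is enough. The helper below is a hypothetical sketch, not part of the app:

```typescript
// Hypothetical helper: read the FILE_SCHEMA entry from an IFC file's STEP
// header to check the schema version before attempting a full load.
function ifcSchemaOf(headerText: string): string | null {
  // Example header line: FILE_SCHEMA(('IFC4'));
  const match = headerText.match(/FILE_SCHEMA\s*\(\s*\(\s*'([^']+)'/);
  return match ? match[1] : null;
}

// Usage sketch in the browser:
// const head = await file.slice(0, 1024).text(); // header fits in the first KB
// console.log(ifcSchemaOf(head)); // e.g. "IFC4" or "IFC2X3"
```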
API errors

OpenAI API 401 Unauthorized:
  • Invalid API key → Check .env configuration
  • Expired API key → Generate new key from OpenAI platform
OpenAI API 429 Rate Limit:
  • Too many requests → Wait before retrying
  • Upgrade OpenAI plan for higher rate limits
No file data loaded:
  • Remember to use “Load to” tab to load IFC as text
  • Check browser console: console.log(chatGpt.fileData)
npm install fails

ERESOLVE dependency conflicts:
npm install --legacy-peer-deps
Node version incompatibility:
nvm use 20  # Switch to compatible Node version
npm install
Network/registry issues:
npm cache clean --force
npm install
TypeScript errors

Strict mode errors: The project uses "strict": true in tsconfig.json.

Common fixes:
// Null checking
if (fileData) {
  // Use fileData safely
}

// Type assertions when necessary
const element = e.target as BUI.Button;

// Optional chaining
const value = object?.property?.nestedProperty;

Performance Optimization

For Large IFC Files

1. Convert to fragments

Pre-convert IFC files to fragments format for faster loading:
// After first load, export fragments
const fragments = components.get(OBC.FragmentsManager);
// Save geometry, properties, and relations map as ZIP
2. Use streaming for massive models

For files >200MB, use BIM Tiles streaming:
const tilesLoader = components.get(OBF.IfcStreamer);
tilesLoader.culler.threshold = 10;
tilesLoader.culler.maxHiddenTime = 1000;
tilesLoader.culler.maxLostTime = 40000;
3. Enable culling

Uncomment culling in main.ts for better performance:
const culler = components.get(OBC.Cullers).create(world);
culler.threshold = 5; // Adjust based on model complexity

Browser Performance

  • Use Chrome or Edge for best WebGL performance
  • Close unused tabs to free GPU memory
  • Monitor DevTools Performance tab during interaction

Next Steps

Quickstart Guide

Load your first IFC model and make AI queries

Configuration

Advanced configuration options

Need Help?

GitHub Issues

Report bugs or request features

ThatOpen Docs

Learn more about ThatOpen Components at docs.thatopen.com
