OpenMind is a high-performance mind mapping application built exclusively with Apple technologies. The architecture prioritizes performance, native platform integration, and seamless cross-device synchronization while maintaining a clean, modular codebase.
- Language: Swift 6.0 with strict concurrency
- UI Framework: SwiftUI (primary) + UIKit (performance-critical paths)
- Rendering: Metal (GPU acceleration) + Core Graphics (fallback)
- Storage: SwiftData + CloudKit
- Architecture Pattern: MVVM with Combine
- Foundation: Core framework functionality
- UIKit/AppKit: Platform-specific UI components
- Core Graphics: 2D rendering and drawing
- Metal: GPU-accelerated rendering
- MetalKit: Metal integration helpers
- Core Animation: Smooth animations
- CloudKit: Cross-device sync
- PencilKit: Apple Pencil support
- PDFKit: PDF export
- AVFoundation: Audio notes
- Natural Language: Text processing
- Vision: OCR and image analysis
- StoreKit 2: In-app purchases
- WidgetKit: Home screen widgets
- Intents: Siri integration
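The stack above pairs MVVM with Combine; a minimal sketch of the pattern, using a closure-based observable as a framework-free stand-in for Combine's `@Published` (type and property names here are hypothetical, not the app's actual API):

```swift
import Foundation

// Minimal MVVM sketch. A closure-based observable stands in for
// Combine's publishers so the example stays framework-independent.
final class Observable<Value> {
    var value: Value {
        didSet { observers.forEach { $0(value) } }
    }
    private var observers: [(Value) -> Void] = []

    init(_ value: Value) { self.value = value }

    func observe(_ handler: @escaping (Value) -> Void) {
        observers.append(handler)
        handler(value)  // emit the current value immediately, like CurrentValueSubject
    }
}

// The ViewModel owns presentation state; the view layer only observes it.
final class MindMapViewModel {
    let title = Observable("Untitled Map")
    let nodeCount = Observable(0)

    func addNode() {
        nodeCount.value += 1
    }
}
```

In the real app the view layer would bind SwiftUI views to Combine publishers; the ownership direction (view observes ViewModel, never the reverse) is the same.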
┌─────────────────────────────────────────────────────────────┐
│ Presentation Layer │
│ ┌─────────────┐ ┌──────────────┐ ┌──────────────────┐ │
│ │ SwiftUI │ │ UIKit │ │ Platform │ │
│ │ Views │ │ Components │ │ Specific │ │
│ └─────────────┘ └──────────────┘ └──────────────────┘ │
└─────────────────────────────────────────────────────────────┘
│
┌─────────────────────────────────────────────────────────────┐
│ Business Logic Layer │
│ ┌─────────────┐ ┌──────────────┐ ┌──────────────────┐ │
│ │ ViewModels │ │ Services │ │ Repositories │ │
│ │ (MVVM) │ │ │ │ │ │
│ └─────────────┘ └──────────────┘ └──────────────────┘ │
└─────────────────────────────────────────────────────────────┘
│
┌─────────────────────────────────────────────────────────────┐
│ Rendering Pipeline │
│ ┌─────────────┐ ┌──────────────┐ ┌──────────────────┐ │
│ │ Metal │ │ Core Graphics│ │ Layout │ │
│ │ Engine │ │ Fallback │ │ Algorithms │ │
│ └─────────────┘ └──────────────┘ └──────────────────┘ │
└─────────────────────────────────────────────────────────────┘
│
┌─────────────────────────────────────────────────────────────┐
│ Data Layer │
│ ┌─────────────┐ ┌──────────────┐ ┌──────────────────┐ │
│ │ SwiftData │ │ CloudKit │ │ File System │ │
│ │ Models │ │ Sync │ │ Storage │ │
│ └─────────────┘ └──────────────┘ └──────────────────┘ │
└─────────────────────────────────────────────────────────────┘
The rendering engine uses a dual approach for maximum performance:
- Metal Engine (primary):
  - Handles 10,000+ nodes at 60fps
  - GPU-accelerated node rendering
  - Instanced rendering for similar nodes
  - Custom shaders for effects
  - Efficient culling and LOD
- Core Graphics Fallback:
  - Used on older devices or when Metal is unavailable
  - Optimized drawing paths
  - Intelligent caching
  - Progressive rendering
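The renderer choice can be sketched as a simple selection step; this is illustrative only (the protocol and type names are hypothetical), with availability injected as a parameter so the logic is shown without a device. The real pipeline would check `MTLCreateSystemDefaultDevice()` at startup:

```swift
import Foundation

// Sketch of dual-renderer selection. Metal is preferred when the
// device supports it; Core Graphics is the universal fallback.
protocol NodeRenderer {
    var name: String { get }
}

struct MetalRenderer: NodeRenderer { let name = "Metal" }
struct CoreGraphicsRenderer: NodeRenderer { let name = "CoreGraphics" }

func makeRenderer(metalAvailable: Bool) -> NodeRenderer {
    metalAvailable ? MetalRenderer() : CoreGraphicsRenderer()
}
```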
MindMapNode
├── id: UUID
├── text: String
├── position: CGPoint
├── children: [MindMapNode]
├── style: NodeStyle
└── metadata: NodeMetadata
MindMapConnection
├── fromNode: MindMapNode
├── toNode: MindMapNode
├── style: ConnectionStyle
└── path: BezierPath
- Local Storage: SwiftData with SQLite backend
- Cloud Sync: CloudKit with CKRecord
- File Format: Custom .mindmap (JSON-based)
- Conflict Resolution: Last-write-wins with branching
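The node tree above maps naturally onto a Codable model for the JSON-based .mindmap format. A reduced sketch, assuming the field names from the diagram (the style and metadata types are omitted, and CGPoint is flattened to two doubles for portable JSON; none of this is the app's actual schema):

```swift
import Foundation

// Reduced Codable sketch of the MindMapNode tree for .mindmap export.
struct Node: Codable, Equatable {
    var id: UUID
    var text: String
    var x: Double      // position.x, flattened for portable JSON
    var y: Double      // position.y
    var children: [Node]
}

func encodeMap(_ root: Node) throws -> Data {
    let encoder = JSONEncoder()
    encoder.outputFormatting = [.sortedKeys]  // stable output for diffs/sync
    return try encoder.encode(root)
}

func decodeMap(_ data: Data) throws -> Node {
    try JSONDecoder().decode(Node.self, from: data)
}
```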
- Radial Layout: Default, balanced distribution
- Tree Layout: Hierarchical, top-down or left-right
- Organic Layout: Force-directed, physics simulation
- Manual Layout: User-controlled with snapping
- Incremental layout updates
- Background calculation on separate thread
- Layout caching
- Viewport-based priority
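The default radial layout's core step can be sketched as pure geometry: children are spread evenly on a circle around their parent (angles in radians; the radius would come from styling, and the function name is hypothetical):

```swift
import Foundation

// Sketch of radial layout: place n children evenly on a circle
// around a center point. Angle step is 2π / n.
func radialPositions(childCount: Int,
                     around center: (x: Double, y: Double),
                     radius: Double) -> [(x: Double, y: Double)] {
    guard childCount > 0 else { return [] }
    let step = 2 * Double.pi / Double(childCount)
    return (0..<childCount).map { i in
        let angle = step * Double(i)
        return (center.x + radius * cos(angle),
                center.y + radius * sin(angle))
    }
}
```

Because each subtree's placement depends only on its parent, updates after an edit can stay incremental: only the edited branch is re-laid-out, which is what makes the background calculation and caching above practical.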
Touch Event → Gesture Recognizer → Canvas Coordinator → Action
↓ ↓ ↓ ↓
Apple Pencil Multi-touch Hit Testing State Update
- Tap: Selection
- Double Tap: Create node
- Pan: Move nodes/viewport
- Pinch: Zoom
- Long Press: Context menu
- Pencil Double Tap: Tool switch
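The hit-testing step in the pipeline above reduces to mapping a touch point to the topmost node whose bounds contain it. A sketch, assuming plain axis-aligned boxes in place of CGRect and a draw order where later elements sit on top (the types here are hypothetical):

```swift
import Foundation

// Sketch of hit testing: find the topmost node under a tap point.
struct NodeBounds {
    let id: Int
    let minX, minY, maxX, maxY: Double

    func contains(x: Double, y: Double) -> Bool {
        x >= minX && x <= maxX && y >= minY && y <= maxY
    }
}

func hitTest(x: Double, y: Double, nodes: [NodeBounds]) -> Int? {
    // Later elements are drawn on top, so search back to front.
    nodes.last(where: { $0.contains(x: x, y: y) })?.id
}
```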
Local Change → SwiftData → CloudKit → Remote Devices
↓ ↓ ↓
Local Cache CKRecord Push Notification
- Timestamp-based resolution
- User notification for conflicts
- Branching for major conflicts
- Automatic merge for compatible changes
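The timestamp-based (last-write-wins) path can be sketched for a single field; this is a simplified stand-in for comparing CKRecord modification dates, and the types are hypothetical:

```swift
import Foundation

// Sketch of last-write-wins resolution for one conflicting field.
struct Versioned<Value> {
    let value: Value
    let modifiedAt: Date
}

func resolve<Value>(local: Versioned<Value>,
                    remote: Versioned<Value>) -> Versioned<Value> {
    // Strictly newer remote wins; on a tie the local copy is kept,
    // avoiding a needless overwrite.
    remote.modifiedAt > local.modifiedAt ? remote : local
}
```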
- Culling: Only render visible nodes
- LOD: Reduce detail for distant nodes
- Batching: Group similar draw calls
- Caching: Reuse rendered content
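The culling step above is an axis-aligned intersection test against the viewport. A sketch with a plain box type standing in for CGRect (names are illustrative):

```swift
import Foundation

// Sketch of viewport culling: only nodes intersecting the visible
// rect are handed to the renderer.
struct Box {
    let minX, minY, maxX, maxY: Double

    func intersects(_ other: Box) -> Bool {
        minX <= other.maxX && maxX >= other.minX &&
        minY <= other.maxY && maxY >= other.minY
    }
}

func visibleNodes(viewport: Box, nodes: [Box]) -> [Box] {
    nodes.filter { $0.intersects(viewport) }
}
```

For 10K+ nodes a production implementation would back this with a spatial index (e.g. a quadtree) rather than a linear filter; the predicate stays the same.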
- Lazy Loading: Load content on demand
- Node Virtualization: Keep only visible nodes in memory
- Image Compression: Optimize embedded images
- Weak References: Prevent retain cycles
- Background Cleanup: Periodic memory optimization
- Main Thread Protection: UI updates only
- Background Queues: Heavy computation
- Debouncing: Limit update frequency
- Progressive Loading: Show content as available
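Debouncing can be sketched with a `DispatchWorkItem`: rapid calls coalesce into one action after a quiet interval. This is a simplified stand-in for Combine's `debounce` operator, and the class name is hypothetical:

```swift
import Foundation

// Sketch of debouncing: each call cancels the pending action and
// schedules a fresh one, so only the last call in a burst fires.
final class Debouncer {
    private let delay: TimeInterval
    private let queue: DispatchQueue
    private var pending: DispatchWorkItem?

    init(delay: TimeInterval, queue: DispatchQueue = .main) {
        self.delay = delay
        self.queue = queue
    }

    func call(_ action: @escaping () -> Void) {
        pending?.cancel()
        let item = DispatchWorkItem(block: action)
        pending = item
        queue.asyncAfter(deadline: .now() + delay, execute: item)
    }
}
```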
- Touch-first interface
- Apple Pencil optimization
- Split View support
- Drag and Drop
- Stage Manager
- Mouse and trackpad support
- Keyboard shortcuts
- Menu bar integration
- Multiple windows
- AppleScript support
- Encryption: On-device encryption
- CloudKit: End-to-end encryption
- Keychain: Secure credential storage
- App Transport Security: HTTPS only
- No Analytics: No third-party tracking
- Local Processing: Text analysis, OCR, and ML features run entirely on-device
- User Control: Full data export/delete
- Model logic validation
- Layout algorithm correctness
- Sync conflict resolution
- Performance benchmarks
- Gesture recognition
- Canvas interaction
- Cross-device sync
- Accessibility
- Render performance (10K nodes)
- Memory usage tracking
- Launch time optimization
- Battery impact
workflows:
  - name: "Production Build"
    triggers:
      - branch: main
    actions:
      - build
      - test
      - archive
      - distribute_testflight
- Automated testing via Xcode Cloud
- TestFlight beta distribution
- Phased App Store rollout
- Automatic crash reporting
- AR/VR support (visionOS)
- ML-powered layout suggestions
- Collaborative editing improvements
- Advanced automation (Shortcuts)
- Migrate remaining UIKit code to SwiftUI
- Implement more sophisticated merge algorithms
- Add offline-first capabilities
- Enhance accessibility features
This architecture provides a solid foundation for a professional-grade mind mapping application that fully leverages Apple's ecosystem while maintaining the performance and user experience standards expected by Apple users.