Introduction
When building large interactive walls or floors with multiple PoE LiDAR sensors, developers often face the challenge of overlapping or conflicting touch points. Without proper handling, these conflicts can cause jitter, duplicated interactions, or false triggers in Unity.
This article explains how to detect and compensate for multi-LiDAR touch conflicts in Unity, using practical methods that ensure accurate and stable multi-touch interaction.

Why Touch Point Conflicts Occur
In multi-sensor setups, each LiDAR scans from a different angle or position. When their detection areas overlap:
- The same physical touch may be reported multiple times.
- Slight coordinate mismatches cause “ghost” touches.
- Timing differences create inconsistent events.
To solve this, Unity developers need data fusion and conflict compensation strategies.

Key Methods to Handle Multi-LiDAR Touch Conflicts
1. Unified Coordinate System Conversion
Before merging touch data, ensure all LiDAR sensors map their inputs into a common global coordinate system. This eliminates mismatches caused by local sensor positioning.
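For a 2D wall, this conversion is a rotate-then-translate step per sensor. A minimal sketch in Python (the same math ports directly to a Unity C# script); the `sensor_pose` tuple and its calibration values are hypothetical placeholders for your own measured mounting positions:

```python
import math

def to_global(point, sensor_pose):
    """Map a point from a sensor's local frame into the shared wall frame.

    point:       (x, y) in the sensor's local coordinates.
    sensor_pose: (x, y, theta) — the sensor's position on the wall and its
                 mounting angle in radians (calibrated per installation).
    """
    px, py = point
    sx, sy, theta = sensor_pose
    cos_t, sin_t = math.cos(theta), math.sin(theta)
    # Rotate by the mounting angle, then translate by the sensor's position.
    gx = sx + px * cos_t - py * sin_t
    gy = sy + px * sin_t + py * cos_t
    return (gx, gy)
```

Once every sensor reports in the same frame, all later steps (merging, prioritization, filtering) can ignore which device a point came from.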
2. Touch Point De-duplication and Merging
If two or more LiDARs detect touches within a defined proximity threshold, treat them as a single touch point. This prevents duplicate triggers of the same interaction.
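A simple way to implement this is greedy proximity clustering: walk the combined point list and average any point into an existing cluster that lies within the threshold. A sketch, assuming points are already in global coordinates and the threshold is in the same units (e.g., metres); the 0.05 default is illustrative, not a recommended value:

```python
def merge_touches(points, threshold=0.05):
    """Merge touch points closer than `threshold` into a single averaged point.

    points: list of (x, y) tuples from all sensors, in global coordinates.
    Returns a de-duplicated list of (x, y) touch positions.
    """
    merged = []  # each entry: (cx, cy, count) — a running cluster mean
    for px, py in points:
        for i, (cx, cy, n) in enumerate(merged):
            if (px - cx) ** 2 + (py - cy) ** 2 <= threshold ** 2:
                # Fold the point into the cluster with an incremental mean.
                merged[i] = ((cx * n + px) / (n + 1),
                             (cy * n + py) / (n + 1),
                             n + 1)
                break
        else:
            merged.append((px, py, 1))
    return [(x, y) for x, y, _ in merged]
```

For the point counts typical of a touch wall (tens of points per frame), this O(n²) pass is cheap; a spatial grid would only be needed at much larger scales.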
3. Conflict Priority Rules
Not all touch points are equally reliable. Assign priority rules, for example:
- Prefer points with stronger signal quality.
- Use the closest point to the wall’s center.
- Favor the earliest timestamp.
This ensures the most accurate input is kept.
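The rules above can be combined into a single sort key. A sketch, assuming each candidate report carries hypothetical `quality`, `dist_to_center`, and `timestamp` fields (your sensor SDK may expose different names):

```python
def pick_best(candidates):
    """Choose one report of a physical touch using the priority rules:
    higher signal quality first, then closer to the wall's center,
    then the earliest timestamp."""
    return max(
        candidates,
        key=lambda c: (c['quality'], -c['dist_to_center'], -c['timestamp']),
    )
```

Ordering the tuple this way makes the rules strictly hierarchical: distance only breaks quality ties, and the timestamp only breaks distance ties.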
4. Time Synchronization and Filtering
Synchronize event timestamps across LiDARs. Apply filtering techniques (e.g., Kalman filter) to smooth noisy or jittery positions, producing a stable and responsive touch experience.
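As a lightweight stand-in for a full Kalman filter, an exponential moving average already removes most jitter. A per-touch smoother sketch; the `alpha=0.5` default is illustrative (values near 1 are more responsive, values near 0 are smoother but add lag):

```python
class SmoothedTouch:
    """Exponential moving average over a touch point's position."""

    def __init__(self, alpha=0.5):
        self.alpha = alpha
        self.pos = None  # last smoothed (x, y), None until first sample

    def update(self, raw):
        """Feed a raw (x, y) sample; return the smoothed position."""
        if self.pos is None:
            self.pos = raw  # first sample: no history to blend with
        else:
            a = self.alpha
            self.pos = (a * raw[0] + (1 - a) * self.pos[0],
                        a * raw[1] + (1 - a) * self.pos[1])
        return self.pos
```

Because each update is a couple of multiplications, this runs comfortably inside a per-frame Unity update loop; a Kalman filter becomes worthwhile when you also need velocity estimates for prediction.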
5. Unified Event Dispatching
Combine all validated touch points into a single event manager in Unity. This prevents multiple sensors from redundantly triggering the same gesture or action.
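The dispatcher's core job is tracking touch identity across frames so listeners see one coherent stream of down/move/up events regardless of which sensor contributed each point. A sketch of that frame-matching logic, with an illustrative `match_radius`; in Unity the same structure would live in a C# manager class raising events:

```python
class TouchDispatcher:
    """Funnel merged touch points from all sensors into one event stream."""

    def __init__(self, match_radius=0.08):
        self.match_radius = match_radius
        self.active = {}   # touch id -> last known (x, y)
        self.next_id = 0
        self.events = []

    def frame(self, points):
        """Process one frame of merged points; return ('down'|'move'|'up', id, pos) events."""
        self.events = []
        unmatched = dict(self.active)
        for p in points:
            # Match the point to the nearest active touch within the radius.
            best = None
            for tid, q in unmatched.items():
                d2 = (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2
                if d2 <= self.match_radius ** 2 and (best is None or d2 < best[1]):
                    best = (tid, d2)
            if best is not None:
                tid = best[0]
                del unmatched[tid]
                self.active[tid] = p
                self.events.append(('move', tid, p))
            else:
                tid = self.next_id
                self.next_id += 1
                self.active[tid] = p
                self.events.append(('down', tid, p))
        # Any active touch with no matching point this frame has lifted.
        for tid in unmatched:
            del self.active[tid]
            self.events.append(('up', tid, None))
        return self.events
```

With all points flowing through one dispatcher, gesture recognizers subscribe in a single place and can never be double-triggered by two sensors reporting the same finger.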
6. Debugging with Visualization Tools
During development, visualize all detected points in Unity. This allows real-time tuning of thresholds, filters, and merging rules for optimal performance.
Best Practices for Unity Developers
- Start small: test with two LiDARs before scaling up to a full wall.
- Use TUIO input plugins such as TouchScript for multi-touch support.
- Implement configurable thresholds for merging to adapt to different wall sizes.
- Continuously monitor system latency, as added filtering can introduce delays.
FAQs
Q1: Why do I need multiple LiDARs for a long wall?
A single LiDAR may leave blind spots on walls longer than 5–6 meters. Multiple LiDARs eliminate coverage gaps and improve accuracy.
Q2: How do I prevent duplicate touches?
Use distance-based merging rules to combine touches from overlapping LiDAR fields.
Q3: Does using more LiDARs increase latency?
Not necessarily. With proper synchronization and optimized code, multi-sensor setups can remain highly responsive.
Q4: Can Unity handle real-time filtering?
Yes, Unity scripts can implement smoothing algorithms such as Kalman or moving average filters without significant performance loss.
Q5: Is CPJROBOT PoE LiDAR compatible with Unity?
Yes, CPJROBOT PoE LiDAR supports TUIO and Windows multi-touch protocols, making it easy to integrate with Unity applications.
Conclusion
Managing multi-LiDAR touch conflicts in Unity is essential for creating reliable and immersive interactive walls. By applying coordinate unification, de-duplication, prioritization, and filtering, developers can deliver smooth, accurate, and professional experiences for users.
Looking to build a large-scale interactive projection wall or floor with stable multi-LiDAR touch support?
CPJROBOT specializes in PoE interactive LiDAR and reception/navigation robots, offering sensors designed for Unity integration and advanced interactive applications.
Contact CPJROBOT today to explore reliable LiDAR solutions for your next project.