I am working on a network closure problem where I have multiple nodes and their measured delays. Given a set of nodes and delay measurements, I need to ensure that each triangle of nodes satisfies closure: the total delay sum d_AB + d_BC + d_CA ≈ 0 (within a tolerance of ±2 ns).
If a triangle fails closure, I need to fix it, either by adjusting one of its baselines or, as a last resort, by removing a node (see Current Approach below).
My data is structured as:
nodes = ['A', 'B', 'C', 'D']
delays = {
    ('A','B'): (-13.52, 0.32),
    ('A','C'): (-29.88, 0.23),
    ('B','C'): (-1.16, 0.38),
    ('A','D'): (5.37, 0.14),
    ('D','B'): (-34.45, 0.38),
    ('D','C'): (-35.48, 0.29)
}
Each entry (A, B): (value, uncertainty) represents the delay and its measurement uncertainty. The delays are antisymmetric, so reversing a baseline flips the sign: d(A, B) = -d(B, A), and only one orientation of each pair is stored.
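For example, using the numbers above, the triangle (A, B, C) already fails closure:

d_AB = delays[('A', 'B')][0]   # -13.52
d_BC = delays[('B', 'C')][0]   #  -1.16
d_CA = -delays[('A', 'C')][0]  #  29.88 (sign flipped because ('A','C') is stored)
closure = d_AB + d_BC + d_CA   #  15.20 ns -> |15.20| > 2, so closure fails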
Current Approach
Identify all unique triangles in the network (a short driver sketch of these steps follows the list).
Check closure for each triangle:
If |d_AB + d_BC + d_CA| > 2 ns, the triangle fails closure.
Iterate over all edges in non-closure triangles:
Apply ±15 ns adjustments to an edge.
Score the adjustment based on how many triangles it fixes.
Apply the adjustment that maximizes the number of valid triangles.
If no adjustment fixes closure, remove the worst-performing node.
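For concreteness, a minimal driver sketch of these steps, assuming the check_closure and adjust_delays functions from the code snippet further down:

from itertools import combinations

# Enumerate all unique triangles and collect those that fail closure.
non_closure = []
for triangle in combinations(nodes, 3):
    ok, discrepancy = check_closure(triangle, delays)
    if not ok:
        non_closure.append((triangle, discrepancy))

# Try ±15 ns adjustments on the baselines of the failing triangles.
if non_closure:
    delays = adjust_delays(non_closure, delays, nodes)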
Issue
Despite applying ±15 ns adjustments, the script does not always preserve the maximum number of triangles. In some cases, it removes a node prematurely instead of adjusting the correct baseline.
Expected Behaviour
Instead of removing a node too early, the script should try all possible baseline adjustments. Adjustments should prioritize preserving the largest number of valid triangles before considering node removal.
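Roughly, the decision I am after looks like the sketch below; remove_worst_node is a hypothetical placeholder for the removal step, which is not shown in my code snippet:

# Accept an adjustment only if it closes more triangles than before;
# fall back to node removal only when no adjustment helps.
baseline_score = sum(check_closure(t, delays)[0] for t in combinations(nodes, 3))
trial = adjust_delays(non_closure, dict(delays), nodes)
trial_score = sum(check_closure(t, trial)[0] for t in combinations(nodes, 3))

if trial_score > baseline_score:
    delays = trial                      # an adjustment preserved more triangles
else:
    remove_worst_node(nodes, delays)    # hypothetical last-resort fallback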
Current Code Snippet
from itertools import combinations

def get_delay(delays, a, b):
    """Looks up the (value, uncertainty) of a baseline in either orientation, using d(A,B) = -d(B,A)."""
    if (a, b) in delays:
        return delays[(a, b)]
    value, sigma = delays[(b, a)]
    return (-value, sigma)

def check_closure(triangle, delays, tolerance=2):
    """Checks if a given triangle satisfies closure within the tolerance (ns)."""
    A, B, C = triangle
    d_AC = get_delay(delays, A, C)[0]
    d_AB = get_delay(delays, A, B)[0]
    d_BC = get_delay(delays, B, C)[0]
    discrepancy = d_AC - (d_AB + d_BC)  # equivalent to d_AB + d_BC + d_CA
    return abs(discrepancy) <= tolerance, discrepancy

def adjust_delays(non_closure_triangles, delays, nodes, tolerance=2):
    """
    Iterates over all baselines in non-closure triangles and applies ±15 ns adjustments.
    Selects the single adjustment that maximizes the number of closed triangles.
    """
    best_adjustment = None
    best_score = float('-inf')
    for triangle, discrepancy in non_closure_triangles:
        for edge in [(triangle[0], triangle[1]),
                     (triangle[1], triangle[2]),
                     (triangle[0], triangle[2])]:
            # Use whichever orientation of the edge is actually stored in the dictionary.
            key = edge if edge in delays else (edge[1], edge[0])
            for adjustment in (-15, +15):
                temp_delays = delays.copy()
                value, sigma = delays[key]
                temp_delays[key] = (value + adjustment, sigma)  # trial adjustment
                # Count how many triangles close after the trial adjustment.
                new_score = sum(check_closure(t, temp_delays, tolerance)[0]
                                for t in combinations(nodes, 3))
                if new_score > best_score:
                    best_score = new_score
                    best_adjustment = (key, adjustment)
    # Apply the best adjustment found.
    if best_adjustment:
        key, adjustment = best_adjustment
        value, sigma = delays[key]
        delays[key] = (value + adjustment, sigma)
    return delays
Questions
How can I ensure that my script always prioritizes preserving triangles over removing nodes?
Are there better heuristics for determining the best baseline adjustment?
Is there a more efficient way to iterate over baselines and adjustments to maximize closure?