Tuesday, July 1, 2025

 As an example of the drone formation transformation discussed in the previous article, the following code applies the Hungarian algorithm to determine the position allocation when changing formations.

#! /usr/bin/python

# pip install numpy scipy

import numpy as np
# SciPy's linear_sum_assignment implements the Hungarian algorithm
from scipy.optimize import linear_sum_assignment

# Source: drones in a 3×3 grid on the Z=0 plane
source_positions = [
    (x, y, 0)
    for y in range(3)
    for x in range(3)
]

# Target: drones in a single horizontal line (linear flight path), spaced 10 units apart
target_positions = [
    (i * 10, 0, 0) for i in range(9)
]

# Cost matrix: Euclidean distance from each source to each target
cost_matrix = np.array([
    [
        np.linalg.norm(np.array(src) - np.array(dst))
        for dst in target_positions
    ]
    for src in source_positions
])

# Run the Hungarian algorithm to get the minimum-cost one-to-one assignment
row_ind, col_ind = linear_sum_assignment(cost_matrix)

# Report matched pairs
for src_idx, dst_idx in zip(row_ind, col_ind):
    print(f"Drone {src_idx} → Target Position {dst_idx}: {target_positions[dst_idx]}")


The above does not take velocity and heading into consideration, but those can be folded into the cost function for a given trajectory.
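As a sketch of that adjustment, the distance cost can be augmented with heading- and speed-difference penalty terms before running the same assignment. The weights w_heading and w_velocity below are illustrative assumptions, not tuned values:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def assignment_cost(src, dst, w_heading=5.0, w_velocity=1.0):
    """Composite cost: distance plus penalties for heading and speed changes.
    src/dst are (x, y, z, yaw_radians, speed); the weights are illustrative."""
    spatial = np.linalg.norm(np.array(src[:3]) - np.array(dst[:3]))
    # Smallest absolute angle between the two headings
    heading = abs((dst[3] - src[3] + np.pi) % (2 * np.pi) - np.pi)
    velocity = abs(dst[4] - src[4])
    return spatial + w_heading * heading + w_velocity * velocity

# Three drones as (x, y, z, yaw, speed) and three target states
sources = [(0, 0, 0, 0.0, 5.0), (0, 10, 0, 0.0, 5.0), (0, 20, 0, 0.0, 5.0)]
targets = [(10, 0, 0, 0.0, 5.0), (10, 10, 0, 1.57, 5.0), (10, 20, 0, 0.0, 7.0)]

cost = np.array([[assignment_cost(s, t) for t in targets] for s in sources])
rows, cols = linear_sum_assignment(cost)
for r, c in zip(rows, cols):
    print(f"Drone {r} -> target {c} (cost {cost[r, c]:.2f})")
```

With these weights each drone keeps its straight-ahead slot, since the heading and speed penalties outweigh the shorter diagonal hops.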


#Codingexercise: https://1drv.ms/w/c/d609fb70e39b65c8/EYzGgu5Fc4dEoCUHWQYxMbUBSfvC36iKh8ESBaLtozvdqA?e=VfWGog

Monday, June 30, 2025

 Most drones don’t have radar. They merely have positions, which they change based on fully autonomous decisions or on instructions from a controller. In the former case, the waypoints and trajectory determine the flight path, and each drone independently tries to minimize deviations from that path, aligning itself using the least-squares method. The waypoints, and the velocity and ETA at each waypoint, are determined for each unit in a UAV swarm, with the ability to make up delays or adjust ETAs using conditional probability between the past and next waypoints while choosing the path of least resistance or conflict between the two. Usually a formation, say a matrix, already spreads out the units, and its center of mass is used to track the formation’s progress along the flight path. This article discusses a novel approach to minimizing conflicts and adhering to the path of least resistance.
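The least-squares alignment mentioned above can be sketched as follows — a minimal example, assuming the planned leg is a straight line in the XY plane and the drone fits it from its own noisy position samples (the noise level and sample counts are illustrative):

```python
import numpy as np

# Noisy position samples recorded while flying a nominally straight leg
rng = np.random.default_rng(42)
t = np.linspace(0, 10, 20)                     # sample times
true_path = np.stack([t * 3.0, t * 1.0], 1)    # true line has slope 1/3 in XY
samples = true_path + rng.normal(0, 0.2, true_path.shape)

# Least squares: fit y = m*x + b to the samples
A = np.stack([samples[:, 0], np.ones(len(samples))], 1)
(m, b), *_ = np.linalg.lstsq(A, samples[:, 1], rcond=None)

# Cross-track error of the latest sample relative to the fitted line
x0, y0 = samples[-1]
error = abs(m * x0 - y0 + b) / np.hypot(m, 1.0)
print(f"fitted slope {m:.3f}, intercept {b:.3f}, cross-track error {error:.3f}")
```

The drone would steer to reduce the cross-track error toward zero on each update.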

For example, to transform between an “Abreast” and a “Diamond” formation, any technique must demonstrate efficiency in minimizing transformation distance and maintaining formation coherence. Similarly, to transform from a matrix formation to flying linearly under a bridge between its piers, any technique must demonstrate a consensus-based, pre-determined order.

The approach included here defines a drone’s formation state with six parameters: time, 3D position, yaw angle (heading), and velocity. For a formation to be considered coherent, all drones must share the same heading and speed while maintaining relative positions—essential for realistic aerial maneuvers.
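That six-parameter state and the coherence condition can be captured in a small record — the class name and tolerances below are illustrative, not from the original:

```python
from dataclasses import dataclass
import math

@dataclass
class DroneState:
    """Six-parameter formation state: time, 3D position, heading, speed."""
    t: float
    x: float
    y: float
    z: float
    yaw: float    # heading in radians
    speed: float

def is_coherent(states, yaw_tol=0.01, speed_tol=0.01):
    """A formation is coherent when all drones share heading and speed."""
    ref = states[0]
    return all(
        math.isclose(s.yaw, ref.yaw, abs_tol=yaw_tol)
        and math.isclose(s.speed, ref.speed, abs_tol=speed_tol)
        for s in states
    )

# A line-abreast formation: same heading and speed, spaced positions
line = [DroneState(0.0, i * 10.0, 0.0, 0.0, 0.0, 5.0) for i in range(3)]
print(is_coherent(line))
```

Relative positions would be checked separately against the formation template; the predicate above covers the shared heading-and-speed part of the definition.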

The transformation itself consists of two steps: location assignment and path programming. First, to determine which drone should move to which position in the new formation, the Hungarian algorithm, a centralized optimization method, is used; in its absence, the greatest common denominator for the volume between two waypoints determines how many simultaneous paths to choose, and the matrix model assigns each drone to the nearest path. If there is only one path and no centralized controller, the units use the Paxos algorithm to reach consensus on the linear order. This first step evaluates the cost of moving each drone to each new position by considering spatial displacement, heading change, and velocity difference, so the assignment minimizes overall disruption and maneuvering effort.

Second, each drone calculates its own flight path to the newly assigned position using a Dubins path model, which generates the shortest possible route under a minimum turning radius constraint—a requirement for fixed-wing drones that can’t make sharp turns or hover. Positions alone do not guarantee compliance, and velocity adjustments for each unit must also be layered over the transition. The adjustment of velocities follows a Bayesian conditional probability along the unit’s assigned path. This involves computing acceleration and deceleration phases to fine-tune the duration and dynamics of the transition, with error corrections against deviations.
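A minimal sketch of those acceleration and deceleration phases, assuming a symmetric trapezoidal speed profile (the limits v_max and a_max below are illustrative, and the Dubins geometry itself is out of scope here):

```python
import math

def transition_time(distance, v_start, v_max=10.0, a_max=2.0):
    """Time to cover `distance` starting and ending at v_start:
    accelerate to at most v_max, cruise, then decelerate back.
    Assumes v_start <= v_max and symmetric accel/decel (illustrative)."""
    # Distance consumed by the two ramps v_start -> v_max -> v_start
    d_ramps = (v_max**2 - v_start**2) / a_max
    if d_ramps <= distance:
        t_ramps = 2 * (v_max - v_start) / a_max
        t_cruise = (distance - d_ramps) / v_max
        return t_ramps + t_cruise
    # Leg too short to reach v_max: triangular profile with a lower peak
    v_peak = math.sqrt(v_start**2 + a_max * distance)
    return 2 * (v_peak - v_start) / a_max

# A 100-unit transition leg entered at 5 units/s
print(f"{transition_time(100.0, 5.0):.2f} s")
```

Scheduling each drone's arrival then reduces to choosing v_max per unit so all transition times agree, which is where the error corrections against deviations would be applied.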

Overall, this provides a cohesive framework for in-flight drone formation reconfiguration that balances centralized planning with distributed execution. By encoding the physical constraints and states of each unit and classifying adherence, outliers can be handled by rotating them with other units, giving the formation a smooth overall progression and overcoming environmental factors such as turbulence with error corrections.

#Codingexercise: https://1drv.ms/w/c/d609fb70e39b65c8/Echlm-Nw-wkggNaVNQEAAAAB63QJqDjFIKM2Vwrg34NWVQ?e=yTCv5p

Sunday, June 29, 2025

 In addition to structured-query-operator queries on the metadata of aerial drone images, semantic and BM25 search on the text descriptions associated with the images, and vector queries comparing one image in the collection against the rest for similarity, we can also search the detected objects to find images containing a given object, with sample code as follows:

#! /usr/bin/python
import os
import sys
import numpy as np
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient

sys.path.insert(0, os.path.abspath(".."))
from visionprocessor.vectorizer import vectorize_image

search_endpoint = os.getenv("AZURE_SEARCH_SERVICE_ENDPOINT")
index_name = os.getenv("AZURE_SEARCH_INDEX_NAME")
api_version = os.getenv("AZURE_SEARCH_API_VERSION")
search_api_key = os.getenv("AZURE_SEARCH_ADMIN_KEY")
vision_api_key = os.getenv("AZURE_AI_VISION_API_KEY")
vision_api_version = os.getenv("AZURE_AI_VISION_API_VERSION")
vision_region = os.getenv("AZURE_AI_VISION_REGION")
vision_endpoint = os.getenv("AZURE_AI_VISION_ENDPOINT")
red_car_sas_url = os.getenv("AZURE_RED_CAR_SAS_URL").strip('"')
credential = AzureKeyCredential(search_api_key)

search_client = SearchClient(endpoint=search_endpoint, index_name=index_name, credential=credential)
vector = vectorize_image(red_car_sas_url, vision_api_key, vision_region)
# Pad the vision embedding out to the index's 1536-dimension vector field
vector = np.pad(vector, (0, 1536 - len(vector)), mode='constant')
vector_query = {
    "vector": vector.tolist(),
    "k": 1000,  # retrieve up to 1000 matches, adjust as needed
    "fields": "vector",
    "kind": "vector",
    "exhaustive": True
}
results = search_client.search( 
    search_text="", 
    vector_queries= [vector_query], 
    select=["id"], 
    include_total_count=True, 
    top=1000, 
) 
if results:
    if results.get_count() != 1000:
        print(f"Number of results: {results.get_count()}")
    items = []
    for result in results:
        if not result:
            continue
        image_id = int(result["id"])
        # Skip ids within 1000 of one already kept (near-duplicate frames)
        if any(abs(item - image_id) < 1000 for item in items):
            continue
        items.append(image_id)
    for item in items:
        print(f"{item:06d}")
""" 
Sample output: 
008812 # invalid match no red car, all others have red cars 
011214 
015644 
006475 
010111 
007526 
017760 
012911 
014182 
016672 # only red car, full precision 
004559 
003267 
000601 
001680 
"""