Thursday, August 7, 2025

 This explains why location services from public cloud providers are unreliable for aerial drone images unless they use custom models trained to recognize scene-level features.
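
For comparison, invoking such a custom model might look like the following (a minimal sketch, assuming a hypothetical Azure Custom Vision project trained on aerial scene features; the environment variable names, project ID, and published iteration name are all placeholders):

# Minimal sketch: querying a hypothetical custom-trained classifier
# (Azure Custom Vision); all identifiers below are placeholders.
import os
from azure.cognitiveservices.vision.customvision.prediction import CustomVisionPredictionClient
from msrest.authentication import ApiKeyCredentials

prediction_key = os.getenv("AZURE_CUSTOM_VISION_PREDICTION_KEY")
prediction_endpoint = os.getenv("AZURE_CUSTOM_VISION_ENDPOINT")
project_id = os.getenv("AZURE_CUSTOM_VISION_PROJECT_ID")
published_name = "aerial-scene-classifier"  # hypothetical published iteration

credentials = ApiKeyCredentials(in_headers={"Prediction-key": prediction_key})
predictor = CustomVisionPredictionClient(prediction_endpoint, credentials)

with open("frame5.jpg", "rb") as image_data:
    results = predictor.classify_image(project_id, published_name, image_data.read())

for prediction in results.predictions:
    print(f"{prediction.tag_name}: {prediction.probability:.2%}")

The stock tagging services, by contrast, return only generic tags, as the following example shows: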

import requests 
import os 
from azure.cognitiveservices.vision.computervision import ComputerVisionClient 
from msrest.authentication import CognitiveServicesCredentials 
from PIL import Image 
 
# === Azure Computer Vision credentials === 
vision_api_key = os.getenv("AZURE_AI_VISION_API_KEY") 
vision_endpoint = os.getenv("AZURE_AI_VISION_ENDPOINT") 
computervision_client = ComputerVisionClient(vision_endpoint, CognitiveServicesCredentials(vision_api_key)) 
 
# === Azure Maps credentials === 
azure_maps_key = os.getenv("AZURE_MAPS_SUBSCRIPTION_KEY") 
 
# === Load local image and get tags === 
image_path = "frame5.jpg" 
with open(image_path, "rb") as img_stream: 
    analysis = computervision_client.analyze_image_in_stream( 
        img_stream, 
        visual_features=["Tags"] 
    ) 
 
tags = [tag.name for tag in analysis.tags if tag.confidence > 0.5] 
 
# === Azure Maps Search API for landmark coordinates === 
def get_coordinates_from_azure_maps(landmark, azure_key): 
    url = f"https://atlas.microsoft.com/search/address/json" 
    params = { 
        "api-version": "1.0", 
        "subscription-key": azure_key, 
        "query": landmark 
    } 
    response = requests.get(url, params=params) 
    data = response.json() 
    results = data.get("results", []) 
    if results: 
        position = results[0]["position"] 
        return (position["lat"], position["lon"]) 
    return None 
tags = ["circular plaza"] 
# === Display matched coordinates === 
for tag in tags: 
    coords = get_coordinates_from_azure_maps(tag, azure_maps_key) 
    if coords: 
        print(f"Landmark: {tag}, Latitude: {coords[0]}, Longitude: {coords[1]}") 
    else: 
        print(f"No match found for tag: {tag}") 
 
""" 
Output: 
Landmark: outdoor, Latitude: 39.688359, Longitude: -84.235051 
Landmark: text, Latitude: 17.9739757, Longitude: -76.7856201 
Landmark: building, Latitude: 23.3531395, Longitude: -75.0597782 
Landmark: car, Latitude: 18.5366554, Longitude: -72.4020263 
Landmark: urban design, Latitude: 48.4732981, Longitude: 35.0019145 
Landmark: metropolitan area, Latitude: 55.6033166, Longitude: 13.0013362 
Landmark: urban area, Latitude: 8.448839, Longitude: -13.258005 
Landmark: neighbourhood, Latitude: 54.8811412, Longitude: -6.2779797 
Landmark: intersection, Latitude: 34.899284, Longitude: -83.392743 
Landmark: vehicle, Latitude: 38.6151446, Longitude: -121.273215 
Landmark: residential area, Latitude: 9.982962, Longitude: 76.2954466 
Landmark: city, Latitude: 19.4326773, Longitude: -99.1342112 
Landmark: traffic, Latitude: 23.5786896, Longitude: 87.1950397 
Landmark: street, Latitude: 51.1250213, Longitude: -2.7313088 
Landmark: aerial, Latitude: 34.95435, Longitude: -117.826011 
 
#  
# Not even close to the nearest neigbhorhood: https://www.google.com/maps?q=42.3736,-71.1097 
and when trying google cloud: 

gcloud ml vision detect-landmarks frame5.jpg 

{ 

  "responses": [ 

    {} 

  ] 

} 
 
import nyckel 
import os 
nyckel_client_id = os.getenv("NYCKEL_CLIENT_ID") 
nyckel_client_secret = os.getenv("NYCKEL_CLIENT_SECRET") 
credentials = nyckel.Credentials(nyckel_client_id, nyckel_client_secret) 
image_url = os.getenv("CIRCULAR_BUILDING_SAS_URL").strip('"') 
response = nyckel.invoke("landmark-identifier", image_url, credentials) 
print(response) 
# Output: 
# {'labelName': 'Yellowstone National Park', 'labelId': 'label_wottnvl9ole6ch4o', 'confidence': 0.02} 
""" 

 

Or the landmarks may not be detected at all: 
import requests 
import os 
from azure.cognitiveservices.vision.computervision import ComputerVisionClient 
from azure.cognitiveservices.vision.computervision.models import VisualFeatureTypes 
from msrest.authentication import CognitiveServicesCredentials 
from PIL import Image 
from pprint import pprint 
 
# === Azure Computer Vision credentials === 
vision_api_key = os.getenv("AZURE_AI_VISION_API_KEY") 
vision_endpoint = os.getenv("AZURE_AI_VISION_ENDPOINT") 
computervision_client = ComputerVisionClient(vision_endpoint, CognitiveServicesCredentials(vision_api_key)) 
scene_url = os.getenv("CIRCULAR_BUILDING_SAS_URL").strip('"') 
 
def get_landmark_info(image_path_or_url): 
    """ 
    Detects landmarks in an aerial image and returns detailed metadata. 
    Supports both local file paths and image URLs. 
    """ 
    visual_features = [VisualFeatureTypes.categories, VisualFeatureTypes.description, VisualFeatureTypes.tags] 
 
    if image_path_or_url.startswith("http"): 
        analysis = computervision_client.analyze_image(image_path_or_url, visual_features) 
    else: 
        with open(image_path_or_url, "rb") as image_stream: 
            analysis = computervision_client.analyze_image_in_stream(image_stream, visual_features) 
 
    # Extract landmark-related tags and descriptions 
    landmark_tags = [tag.name for tag in analysis.tags if "landmark" in tag.name.lower()] 
    description = analysis.description.captions[0].text if analysis.description.captions else "No description available" 
 
    result = { 
        "description": description, 
        "landmark_tags": landmark_tags, 
        "categories": [cat.name for cat in analysis.categories] 
    } 
 
    return result 
 
# Example usage 
if __name__ == "__main__": 
    landmark_data = get_landmark_info(scene_url) 
    pprint(landmark_data) 
 
 
# Output:
# {'categories': ['abstract_', 'others_', 'outdoor_', 'text_sign'],
#  'description': 'graphical user interface',
#  'landmark_tags': []}
#
# The actual location is 42.371305, -71.117339
# (Orthodox Minyan at Harvard Hillel, 52 Mt Auburn St, Cambridge, MA 02138).

 

The drone-provided GPS information is the most accurate in this regard. For example, known pixel and GPS bounds of the scene yield a pixel-to-GPS mapping:
import numpy as np

# Replace this with actual GPS bounds for the transformation.
# Example: top-left, top-right, bottom-right, bottom-left in pixel & GPS
pixel_bounds = np.array([[0, 0], [4096, 0], [4096, 4096], [0, 4096]])
gps_bounds = np.array([[39.735, -104.997], [39.735, -104.989],
                       [39.729, -104.989], [39.729, -104.997]])

# Compute the affine transform from pixel to GPS. Augmenting the pixel
# coordinates with a constant 1 lets the least-squares fit include the
# translation term, which a purely linear map cannot represent.
pixel_h = np.hstack([pixel_bounds, np.ones((len(pixel_bounds), 1))])
A, _, _, _ = np.linalg.lstsq(pixel_h, gps_bounds, rcond=None)

def pixel_to_gps(coord):
    """Map a pixel coordinate to GPS using the affine approximation."""
    x, y = coord
    return tuple(np.dot([x, y, 1.0], A))
 
def parse_json_gps(json_data): 
    gps_coords = [] 
    for frame in json_data: 
        if frame is None: 
            continue 
        frame_coords = [pixel_to_gps(coord) for coord in frame] 
        gps_coords.append(frame_coords) 
    return gps_coords 
 
# Example JSON input 
data = [None, [[3132, 4151], [3354, 2924], [4044, 3056], [3824, 4275]], 
              [[3095, 4164], [3318, 2939], [4006, 3073], [3787, 4289]]] 
 
gps_output = parse_json_gps(data) 
for i, frame in enumerate(gps_output): 
    print(f"Frame {i+1}:") 
    for lat, lon in frame: 
        print(f"Latitude: {lat:.6f}, Longitude: {lon:.6f}") 

Wednesday, August 6, 2025

 This is a summary of the book titled “Marketing Built by Love: A Human-Centered Foundation to Delight Your Customers, Increase Your Revenue, and Grow Your Business” written by Daniel Bussius and published by Greenleaf Book Group Press in 2023.

In his book, Daniel Bussius reimagines the foundations of marketing, proposing a human-centered framework that prioritizes empathy, trust, and long-term relationships over transactional tactics. In a marketplace crowded with impersonal strategies and outdated funnels, Bussius offers a compelling blueprint rooted in emotional resonance and connection.

He introduces the Marketing RAMP – a responsive, aligned master plan – as a reliable system built on four pillars: identifying ideal customers (“people you love”), expressing your “love language” through meaningful interactions, offering a clear value-driven proposal, and nurturing customer relationships through stages that mirror human connections. Bussius argues that successful marketing today must be built with the same care, patience, and attentiveness as nurturing a romantic relationship.

A major focus of the book is shifting the mindset from pure sales to genuine value delivery. Traditional approaches, Bussius warns, fall prey to five “fatal flaws”: being too transactional, ignoring instinctual emotional triggers, clinging to linear funnel systems, using outdated technology, and lacking cohesive organization. Instead, he urges brands to engage the “reptilian brain” – our pain-averse, emotion-driven decision center – by positioning themselves as protectors, alleviating customer stress through simple, empathetic messaging.

Understanding and segmenting customer pain points becomes central to building rapport. Marketers must deeply understand who their customers are and speak their language – whether informal or formal, humorous or serious – with consistent, story-driven messaging that builds tribal alignment and trust. By doing so, businesses connect through shared values and build what Bussius calls “tribes,” safe communities that foster loyalty.

He lays out a ten-stage journey that echoes human relationships: from first impressions to long-term love, each step is an invitation to deepen customer engagement. From the “honeymoon” phase to occasional fallouts, the emphasis is on continuous care—soliciting feedback, listening actively, and resolving concerns with dignity. Neglecting the post-purchase experience, Bussius argues, results in regret and disconnection; delighting customers after the sale is what builds lifetime value.

Recommitment is also a key theme. Brands shouldn’t assume happy customers will return unprompted. Instead, they must offer clear ascension paths, personalized solutions, and consistent touchpoints through strategic content planning. By tapping into customers’ emotional rhythms and rituals, businesses stay relevant and compelling.

Ultimately, Bussius positions love as a transformative force—not just for marketing, but for the entire organization. When companies treat customers with respect, sincerity, and generosity, the ripple effects strengthen team cohesion, stakeholder alignment, and overall business growth. By embracing this ethos, marketing becomes less about persuasion and more about authentic service—and the impact is felt far beyond the balance sheet.


Tuesday, August 5, 2025

 Regaining chronology in video indexing.

The previous articles examined various cloud service APIs and dedicated web services for extracting GPS coordinates (latitude, longitude), with varying degrees of success. This article describes how to recover the timestamps after video indexing so that each extracted frame can carry a time field in the document uploaded to a vector store.

import requests
import time

# Assumes video_indexer_endpoint, video_indexer_region, and
# video_indexer_account_id are configured elsewhere.

def repeat_video_index(access_token, video_id):
    """Trigger a re-index of a video by its ID, then return its insights."""
    url = f"{video_indexer_endpoint}/{video_indexer_region}/Accounts/{video_indexer_account_id}/Videos/{video_id}/ReIndex?accessToken={access_token}"
    response = requests.put(url)
    if response.status_code == 200:
        return response
    return get_video_insights(access_token, video_id)

def get_video_insights(access_token, video_id):
    """Poll the video index until it reaches the 'Processed' state."""
    url = f"{video_indexer_endpoint}/{video_indexer_region}/Accounts/{video_indexer_account_id}/Videos/{video_id}/Index?accessToken={access_token}"
    count = 0
    while True:
        response = requests.get(url)
        data = response.json()
        if "state" in data and data['state'] == 'Processed':
            return data
        count += 1
        if count % 10 == 0:
            print(data)
        print("Sleeping for ten seconds...")
        time.sleep(10)  # Wait 10 seconds before checking again

def get_timestamps(access_token, video_id):
    """Collect (start, end) intervals for the keyframes of the first shot."""
    insights = get_video_insights(access_token, video_id)
    timestamps = []
    for keyframe in insights['videos'][0]['insights']['shots'][0]['keyFrames']:
        timestamps += [(keyframe['instances'][0]['start'], keyframe['instances'][0]['end'])]
    print(timestamps)
    return timestamps

# Sample output:
# [('0:00:00.2219828', '0:00:00.2570043'), ('0:00:00.5030307', '0:00:00.5379849'), ('0:00:00.8780307', '0:00:00.9249731')]
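
With the intervals recovered, each extracted frame can carry its time field in the document that goes to the vector store. A minimal sketch, assuming frames are named by index (a hypothetical convention) and leaving the actual upload call to whichever vector store client is in use:

def to_seconds(ts):
    """Convert a Video Indexer timestamp such as '0:00:00.2219828' to seconds."""
    hours, minutes, seconds = ts.split(":")
    return int(hours) * 3600 + int(minutes) * 60 + float(seconds)

def build_frame_documents(timestamps, video_id):
    """Pair each keyframe interval with its frame and a numeric time field."""
    return [
        {
            "id": f"{video_id}_frame{i}",
            "frame_file": f"frame{i}.jpg",  # hypothetical naming convention
            "start_seconds": to_seconds(start),
            "end_seconds": to_seconds(end),
        }
        for i, (start, end) in enumerate(timestamps)
    ]

# Example with the sample output above:
timestamps = [('0:00:00.2219828', '0:00:00.2570043'),
              ('0:00:00.5030307', '0:00:00.5379849'),
              ('0:00:00.8780307', '0:00:00.9249731')]
print(build_frame_documents(timestamps, "video123"))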


Monday, August 4, 2025

 This is a summary of the book titled “Hope for Cynics: The Surprising Science of Human Goodness” written by psychologist Jamil Zaki and published by Grand Central in 2024. With faith in others and in institutions constantly eroding, the author argues that slipping easily into cynicism is an illness and suggests that a strong dose of “hopeful skepticism” is preferable. He reassures readers that “people are better than you probably think”. He draws on his own experience and on case studies to argue that cynicism is a social disease and that “preexisting conditions” breed cynics. An epidemic of loneliness is making it worse. Cynicism pulls people into an inescapable downward spiral, and “creative maladjustment” is needed to resist its pull. You must earn your hope because it cannot be borrowed.

In his book, he offers a compelling argument against the creeping influence of cynicism in modern society. With trust in institutions and one another steadily eroding, Zaki reframes cynicism as a “social disease” — contagious and debilitating — worsened by loneliness, inequality, and disillusionment. Instead of succumbing to this bleak worldview, he advocates for “hopeful skepticism”: an active, discerning mindset that challenges assumptions without abandoning faith in humanity.

Zaki draws from psychology experiments — like the trust-based investment game where most people act cooperatively — to demonstrate that despite cynics expecting betrayal, the majority of people still choose fairness. He shows that cynicism is not just a perspective; it is a predictor of unhappiness. Cynics tend to struggle more with addiction, relationships, and financial security. Ironically, their mistrust often isolates them even further, perpetuating the very conditions they fear.

The book explores the roots of cynicism, tracing its evolution from Ancient Greek philosopher Diogenes to today’s widespread pessimism. Diogenes may have challenged greed and power with biting humor and contempt, but modern cynics often distrust not just systems, but individuals themselves. Through vivid case studies — including contrasting fishing villages in Brazil where cooperation bred trust and stinginess bred suspicion — Zaki illustrates how environments and lived experiences shape people's capacity to trust. Cynicism, he argues, is learned and reinforced, not innate.

Zaki urges readers to nurture hope by connecting with core values, engaging with others, and resisting negativity bias. He cites surveys revealing that people underestimated acts of kindness during the COVID-19 pandemic — highlighting how media and selective attention distort perceptions. Leaders have immense influence in either encouraging cynicism or cultivating collaboration. Zaki contrasts Steve Ballmer’s rigid, mistrustful leadership at Microsoft with Satya Nadella’s people-first, innovation-driven culture shift — a transformation that revived morale and creativity.

Importantly, Zaki stresses that cynicism does not inspire change — it stifles it. Instead, what is needed is “creative maladjustment,” a concept inspired by Martin Luther King Jr., where people embrace moral discomfort and actively work toward solutions. This blend of hope and resolve is crucial in facing systemic issues like poverty and climate change. Zaki dismantles myths used to deflect collective responsibility, such as the fabricated “carbon footprint” campaign by BP and harmful stereotypes like the “welfare queen.”

Ultimately, Zaki’s message is clear: you cannot borrow hope — you must earn it. Trust, like love or creativity, flourishes not from blind optimism but from deliberate effort and collective belief. By embracing hopeful skepticism, individuals can reject fear-driven narratives and begin building a healthier, more connected society.


#Codingexercise: https://1drv.ms/w/c/d609fb70e39b65c8/EZGJFZKaQ_9An0czcGHmSdsB0ydSGQqr1f8sBdVE5Ua1Hg?e=OlrlFH