The previous article discussed ways to enhance the index in an Azure AI Search vector store by promoting extracted metadata to text fields that can be used in queries, along with a semantic configuration. The following, for instance, is an example of semantic search over the drone images that leverages the extracted metadata as text fields.
# Runs a semantic query (runs a BM25-ranked query and promotes the most relevant matches to the top)
# The endpoint, index name, and API key below are placeholders to be filled in.
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient

search_client = SearchClient(endpoint="<search-service-endpoint>",
                             index_name="<index-name>",
                             credential=AzureKeyCredential("<api-key>"))

results = search_client.search(query_type='semantic',
                               semantic_configuration_name='my-semantic-config',
                               search_text="Are there images that show red cars as parked?",
                               select='Id,Description,title,tags,bounding_box',
                               query_caption='extractive')
for result in results:
    print(result["@search.reranker_score"])
    print(result["Id"])
    print(f"Description: {result['Description']}")
    tags = result["tags"]
    if "red car" in tags:
        print(f"Title: {result['title']}\n")
And this doesn’t stop at query responses. Because everything is text, we can go beyond the embeddings model and leverage a GPT-4o chat LLM to generate appropriate answers to the queries.
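As a minimal sketch of that idea, the following passes the retrieved descriptions to a GPT-4o chat completion as grounding context. The Azure OpenAI endpoint, API key, deployment name, and the exact prompt wording are placeholders, not part of the original article.

from openai import AzureOpenAI

chat_client = AzureOpenAI(azure_endpoint="<azure-openai-endpoint>",  # placeholder
                          api_key="<api-key>",                       # placeholder
                          api_version="2024-02-01")

question = "Are there images that show red cars as parked?"

# Re-run the semantic query and materialize the results as a list.
docs = list(search_client.search(query_type='semantic',
                                 semantic_configuration_name='my-semantic-config',
                                 search_text=question,
                                 select='Id,Description,title,tags'))
context = "\n".join(d["Description"] for d in docs)

# Ask the chat model to answer strictly from the retrieved metadata.
response = chat_client.chat.completions.create(
    model="gpt-4o",  # Azure OpenAI deployment name; a placeholder
    messages=[
        {"role": "system", "content": "Answer using only the drone image metadata provided."},
        {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"}])
print(response.choices[0].message.content)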
Similarly, because queries are text, we are not limited to the semantic search shown above: we can also decompose queries to suit the ontology we derive from the metadata, including labels and tags. Composing lower-level queries into reusable higher-level queries helps build intelligent drone sensing applications, as sketched below.
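The sketch below illustrates one way such composition might look: a lower-level tag filter (using the OData collection-filter syntax, which assumes the tags field is marked filterable in the index) wrapped by a reusable higher-level query. The tag vocabulary and helper names are hypothetical.

def query_by_tag(tag, extra_filter=None):
    """Lower-level query: restrict matches to documents carrying a given tag."""
    flt = f"tags/any(t: t eq '{tag}')"
    if extra_filter:
        flt = f"({flt}) and ({extra_filter})"
    return search_client.search(search_text="*", filter=flt,
                                select='Id,Description,title,tags')

def find_parked_red_cars():
    """Higher-level query reusing the lower-level one: red cars in parking contexts."""
    return query_by_tag("red car", extra_filter="tags/any(t: t eq 'parking lot')")

for doc in find_parked_red_cars():
    print(doc["Id"], doc["title"])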