Latest Tech News

Stay updated with the latest in technology, AI, cybersecurity, and more


The messy reality of SIMD (vector) functions

We’ve discussed SIMD and vectorization extensively on this blog, and it was only a matter of time before SIMD (or vector) functions came up. In this post, we explore what SIMD functions are, when they are useful, and how to declare and use them effectively. A SIMD function is a function that processes more than one piece of data. Take, for example, a mathematical sin function: double sin(double angle); This function takes one double and returns one double. The vector version that processes four …
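The scalar/vector pairing described above can be sketched in C++ as follows. This is a minimal illustration, not the post's actual code: `my_sin4` is a hypothetical name, and a real compiler can generate such a variant automatically from the scalar function (e.g. via `#pragma omp declare simd`).

```cpp
#include <cmath>

// Scalar version: one double in, one double out.
double my_sin(double angle) { return std::sin(angle); }

// A hypothetical vector counterpart: four doubles per call.
// A vectorizing compiler maps the loop body to SIMD lanes,
// so all four sines can be computed in one pass.
void my_sin4(const double in[4], double out[4]) {
    for (int i = 0; i < 4; ++i)
        out[i] = my_sin(in[i]);
}
```

Calling `my_sin4` on an array `{0.0, 1.0, 2.0, 3.0}` fills `out` with the four corresponding sine values in a single call, which is the essence of a SIMD function's contract.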

Muvera: Making multi-vector retrieval as fast as single-vector search

Neural embedding models have become a cornerstone of modern information retrieval (IR). Given a query from a user (e.g., “How tall is Mt Everest?”), the goal of IR is to find information relevant to the query from a very large collection of data (e.g., the billions of documents, images, or videos on the Web). Embedding models transform each data point into a single-vector “embedding”, such that semantically similar data points are transformed into mathematically similar vectors. The embeddings are …
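The "mathematically similar vectors" notion above is typically measured with cosine similarity. A minimal C++ sketch with toy 4-dimensional vectors (real embeddings have hundreds of dimensions, and this is not Muvera's multi-vector scoring, just the standard single-vector measure):

```cpp
#include <array>
#include <cmath>

// Cosine similarity between two embedding vectors:
// dot product divided by the product of their lengths.
// Returns 1.0 for identical directions, 0.0 for orthogonal ones.
double cosine(const std::array<double, 4>& a,
              const std::array<double, 4>& b) {
    double dot = 0.0, na = 0.0, nb = 0.0;
    for (std::size_t i = 0; i < a.size(); ++i) {
        dot += a[i] * b[i];
        na  += a[i] * a[i];
        nb  += b[i] * b[i];
    }
    return dot / (std::sqrt(na) * std::sqrt(nb));
}
```

Single-vector retrieval then reduces to finding the stored embeddings with the highest cosine similarity to the query embedding.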

Neural Texture Compression demo shows it can do wonders for VRAM usage

Serving tech enthusiasts for over 25 years. TechSpot means tech analysis and advice you can trust. In context: Modern game engines can put severe strain on today's hardware. However, Nvidia's business decisions have left many GPUs with less VRAM than they should have. Fortunately, improved texture compression in games helps make the most of what's available. Neural Texture Compression (NTC) is a new technique that improves texture quality while reducing VRAM usage. It relies on a specialized neural network …

Building agents using streaming SQL queries

LLMs are general-purpose models created from huge bodies of publicly available data. However, many, if not most, AI agents for enterprise use cases require access to context such as internal data, resources, tools, and services. How can this be implemented when building an agentic system using Flink SQL? First, let’s consider the case of structured data, for instance details about a given customer stored in an external database. SQL is a natural fit for accessing that kind of data: Flink …
