AiSAQ: All-in-Storage ANNS with Product Quantization for DRAM-free Information Retrieval

Kento Tatsuno, Daisuke Miyashita, Taiga Ikeda, Kiyoshi Ishiyama, Kazunari Sumiyoshi, Jun Deguchi. arXiv 2024

[Paper]    
ARXIV Graph Quantisation

In approximate nearest neighbor search (ANNS) methods based on approximate proximity graphs, DiskANN achieves a good recall-speed balance for large-scale datasets by using both RAM and storage. Although it claims to save memory by keeping only vectors compressed with product quantization (PQ) in RAM, its memory usage still grows in proportion to the scale of the dataset. In this paper, we propose All-in-Storage ANNS with Product Quantization (AiSAQ), which offloads the compressed vectors to storage. Our method achieves \(\sim\)10 MB of memory usage during query search even on billion-scale datasets, with only minor performance degradation. AiSAQ also reduces the index load time before query search, which enables switching indices among multiple billion-scale datasets and significantly enhances the flexibility of retrieval-augmented generation (RAG). The method is applicable to all graph-based ANNS algorithms and can be combined with higher-spec ANNS methods in the future.
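To make the PQ mechanics concrete, below is a minimal sketch (not the authors' code) of product quantization with asymmetric distance computation (ADC), plus a memory-mapped code file to mimic reading compressed vectors from storage rather than DRAM. All function names (`train_pq`, `encode`, `adc`), the file name, and the toy parameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_pq(data, m, k, iters=10):
    """Split dimensions into m subspaces and run a tiny k-means in each,
    yielding k centroids per subspace (the PQ codebooks)."""
    n, d = data.shape
    ds = d // m
    codebooks = []
    for j in range(m):
        sub = data[:, j * ds:(j + 1) * ds]
        cent = sub[rng.choice(n, size=k, replace=False)].copy()
        for _ in range(iters):
            assign = np.argmin(((sub[:, None, :] - cent[None]) ** 2).sum(-1), axis=1)
            for c in range(k):
                members = sub[assign == c]
                if len(members):
                    cent[c] = members.mean(axis=0)
        codebooks.append(cent)
    return codebooks  # list of m arrays, each of shape (k, ds)

def encode(vecs, codebooks):
    """Compress each vector to m one-byte centroid indices."""
    ds = codebooks[0].shape[1]
    codes = np.empty((len(vecs), len(codebooks)), dtype=np.uint8)
    for j, cent in enumerate(codebooks):
        sub = vecs[:, j * ds:(j + 1) * ds]
        codes[:, j] = np.argmin(((sub[:, None, :] - cent[None]) ** 2).sum(-1), axis=1)
    return codes

def adc(query, codes, codebooks):
    """Asymmetric distance computation: precompute per-subspace
    query-to-centroid distance tables, then sum table lookups per code."""
    ds = codebooks[0].shape[1]
    tables = np.stack([
        ((cent - query[j * ds:(j + 1) * ds]) ** 2).sum(-1)
        for j, cent in enumerate(codebooks)
    ])  # shape (m, k)
    return tables[np.arange(len(codebooks)), codes].sum(axis=1)

# Toy run: 2000 base vectors of dim 32, m=4 subspaces, 256 centroids each.
base = rng.standard_normal((2000, 32)).astype(np.float32)
books = train_pq(base, m=4, k=256)
codes = encode(base, books)

# Mimic the "all-in-storage" idea: keep the PQ codes in a file and
# memory-map them, so query-time lookups read from storage, not DRAM.
codes.tofile("codes.bin")
codes_on_disk = np.memmap("codes.bin", dtype=np.uint8, mode="r",
                          shape=codes.shape)

q = rng.standard_normal(32).astype(np.float32)
approx = np.argmin(adc(q, codes_on_disk, books))
exact = np.argmin(((base - q) ** 2).sum(-1))
print("PQ nearest:", approx, "exact nearest:", exact)
```

Here the memory-mapped file only gestures at the key idea: the per-query distance tables stay tiny (a few KB), while the bulk of the compressed data is read from storage on demand, which is why memory usage can stay roughly constant as the dataset scales.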

Similar Work