Store fp16 vector blobs in sqlite. After a filter query, load the matching vectors into memory and do a matvec multiply to get similarity scores (this part stays fast as long as the library, e.g. numpy/torch, uses multithreading/BLAS/GPU). I'll migrate to the very based https://github.com/sqliteai/sqlite-vector once this becomes a bottleneck. In my case the filters on other features (e.g. date, location) already cut the candidate set way down. All of this sits behind an interface so I can swap the backend out later.
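Roughly what I mean, as a minimal sketch (the table layout, column names, dimension, and the plain dot-product scoring are all placeholders; normalize at insert time if you want cosine similarity):

```python
import sqlite3
import numpy as np

DIM = 384  # hypothetical embedding dimension

db = sqlite3.connect("docs.db")
db.execute("""
    CREATE TABLE IF NOT EXISTS docs (
        id INTEGER PRIMARY KEY,
        date TEXT,
        location TEXT,
        embedding BLOB   -- fp16 vector stored as raw bytes
    )
""")

def add_doc(doc_id, date, location, vec):
    # serialize the embedding as an fp16 blob
    blob = np.asarray(vec, dtype=np.float16).tobytes()
    db.execute(
        "INSERT INTO docs (id, date, location, embedding) VALUES (?, ?, ?, ?)",
        (doc_id, date, location, blob),
    )
    db.commit()

def search(query_vec, date_from, location, top_k=10):
    # 1. filter on metadata in SQL first -- this is what shrinks the candidate set
    rows = db.execute(
        "SELECT id, embedding FROM docs WHERE date >= ? AND location = ?",
        (date_from, location),
    ).fetchall()
    if not rows:
        return []
    ids = np.array([r[0] for r in rows])
    # 2. decode the fp16 blobs into an (n, DIM) matrix, upcast to fp32 for the matmul
    mat = (
        np.frombuffer(b"".join(r[1] for r in rows), dtype=np.float16)
        .reshape(len(rows), DIM)
        .astype(np.float32)
    )
    q = np.asarray(query_vec, dtype=np.float32)
    # 3. one matvec gives a score per candidate; BLAS handles the heavy lifting
    scores = mat @ q
    top = np.argsort(-scores)[:top_k]
    return list(zip(ids[top].tolist(), scores[top].tolist()))
```

Swapping in sqlite-vector (or anything else) later just means reimplementing `add_doc`/`search` behind the same interface.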