- Not pure Python
- Fixed columns: vw = db.getas("urls[url:S,filename:S,headers:S]")
- Writes the full dataset on commit
- Stores arbitrary JSON, auto-indexes all keys.
I wonder how well an API like the following would work for in-memory databases:
urls_dict = indexed({'http://example.org': {'filename': 'index.html'}}, index=['filename'])
urls_dict.index('index.html') -> single entry
urls_list = indexed([{'http://example.org': {'filename': 'index.html'}}], index=['filename'])
urls_list.index('index.html') -> multiple entries
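As a rough test of the idea, here is a minimal sketch of what such an `indexed()` wrapper might look like for a plain dict or a list of dicts. The `Indexed` class and its internals are my own assumptions, not an existing library, and it only covers building the index and looking values up, not updates or persistence.

```python
from collections import defaultdict


class Indexed:
    """Wrap a dict (or a list of dicts) and keep lookup tables for the
    record fields named in `index`."""

    def __init__(self, data, index=()):
        self.data = data
        self._fields = list(index)
        # field -> value -> [(key, record), ...]
        self._tables = {f: defaultdict(list) for f in self._fields}
        mappings = [data] if isinstance(data, dict) else data
        for mapping in mappings:
            for key, record in mapping.items():
                for field in self._fields:
                    if field in record:
                        self._tables[field][record[field]].append((key, record))

    def index(self, value):
        """Return the entry (or entries) whose indexed field holds `value`."""
        hits = [entry
                for field in self._fields
                for entry in self._tables[field].get(value, [])]
        return hits[0] if len(hits) == 1 else hits


def indexed(data, index=()):
    return Indexed(data, index=index)


urls_dict = indexed({'http://example.org': {'filename': 'index.html'}},
                    index=['filename'])
print(urls_dict.index('index.html'))
# ('http://example.org', {'filename': 'index.html'})
```

A dict-backed store yields at most one entry per key, so a lookup usually returns a single hit, while a list of dicts can repeat filenames and so returns several; whether the real API should always return a list is one of the questions this sketch leaves open.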