Discussion:
[leveldb] LevelDB large database memory consumption
Conner Hewitt
2016-06-08 21:32:57 UTC
Hi,

I'm using the Go implementation of LevelDB (
https://github.com/syndtr/goleveldb) to build a priority queue. I'm trying
to decide the best way to structure it based on the memory usage of LevelDB.

Would it be better to:

1. Prefix the priority to the key and store everything in one database
(which could grow to terabytes), then use an iterator to loop through the
priority levels?

2. Separate out each queue into its own database file, then loop through
each file?

My main concern is memory. Ideally, this queue should perform well (and
scale linearly as the data grows) on a low-end machine with very little
memory, even with terabytes of data, using an SSD for storage.

Does the memory usage of LevelDB in Go stay roughly constant regardless of
database size, with only read time increasing as it has to perform more
seeks?

Thanks!
--
You received this message because you are subscribed to the Google Groups "leveldb" group.
To unsubscribe from this group and stop receiving emails from it, send an email to leveldb+***@googlegroups.com.
For more options, visit https://groups.google.com/d/optout.