mirror of https://github.com/facebook/rocksdb.git
30a017feca
Summary: Previously, a BlobDB read issued one read for the blob value and then a second read to fetch the CRC checksum. This change combines the two reads into a single read.

readrandom db_bench against a 1G database with base DB size of 13M and value size 1k:
`./db_bench --db=/home/yiwu/tmp/db_bench --use_blob_db --value_size=1024 --num=1000000 --benchmarks=readrandom --use_existing_db --cache_size=32000000`

master: throughput 234MB/s, get micros p50 5.984 p95 9.998 p99 20.817 p100 787
this PR: throughput 261MB/s, get micros p50 5.157 p95 9.928 p99 20.724 p100 190

Closes https://github.com/facebook/rocksdb/pull/3301

Differential Revision: D6615950

Pulled By: yiwu-arbug

fbshipit-source-id: 052410c6d8539ec0cc305d53793bbc8f3616baa3
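The idea is simple to illustrate. The sketch below is a hedged approximation, not the actual BlobDB code: it assumes a record layout of value bytes followed by a 4-byte checksum, and the names `BlobFile`, `Checksum`, `GetBlobTwoReads`, and `GetBlobOneRead` are hypothetical helpers introduced only for illustration (the real implementation uses crc32c and RocksDB's file abstractions). The old path issues two reads per lookup, one for the value and one for the checksum; the new path issues a single read covering both and verifies the checksum in memory.

```cpp
// Minimal, self-contained sketch; all names and the checksum are placeholders.
#include <cassert>
#include <cstdint>
#include <cstring>
#include <string>
#include <vector>

// Toy stand-in for a blob file: a flat byte buffer we can read ranges from.
struct BlobFile {
  std::vector<char> bytes;
  bool ReadAt(uint64_t offset, size_t len, char* out) const {
    if (offset + len > bytes.size()) return false;
    std::memcpy(out, bytes.data() + offset, len);
    return true;
  }
};

// Placeholder checksum (FNV-1a); the real code uses crc32c. It exists only so
// the read pattern below is runnable.
uint32_t Checksum(const char* data, size_t len) {
  uint32_t h = 2166136261u;
  for (size_t i = 0; i < len; ++i) {
    h = (h ^ static_cast<uint8_t>(data[i])) * 16777619u;
  }
  return h;
}

// Before: two reads per lookup -- one for the value, one for the checksum.
bool GetBlobTwoReads(const BlobFile& file, uint64_t offset, size_t value_size,
                     std::string* value) {
  value->resize(value_size);
  if (!file.ReadAt(offset, value_size, &(*value)[0])) return false;
  uint32_t expected;
  if (!file.ReadAt(offset + value_size, sizeof(expected),
                   reinterpret_cast<char*>(&expected))) {
    return false;
  }
  return Checksum(value->data(), value_size) == expected;
}

// After: a single read covering value + checksum, verified in memory.
bool GetBlobOneRead(const BlobFile& file, uint64_t offset, size_t value_size,
                    std::string* value) {
  std::string buf(value_size + sizeof(uint32_t), '\0');
  if (!file.ReadAt(offset, buf.size(), &buf[0])) return false;
  uint32_t expected;
  std::memcpy(&expected, buf.data() + value_size, sizeof(expected));
  if (Checksum(buf.data(), value_size) != expected) return false;
  value->assign(buf.data(), value_size);
  return true;
}

int main() {
  // Build one toy record: value bytes followed by their checksum.
  std::string v = "hello blob";
  BlobFile file;
  file.bytes.assign(v.begin(), v.end());
  uint32_t crc = Checksum(v.data(), v.size());
  const char* p = reinterpret_cast<const char*>(&crc);
  file.bytes.insert(file.bytes.end(), p, p + sizeof(crc));

  std::string out1, out2;
  assert(GetBlobTwoReads(file, 0, v.size(), &out1) && out1 == v);
  assert(GetBlobOneRead(file, 0, v.size(), &out2) && out2 == v);
  return 0;
}
```

Both helpers return the same result; the second simply trades a second random read for one slightly larger read, which matches the latency and throughput gains reported above.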
backupable
blob_db
cassandra
checkpoint
compaction_filters
convenience
date_tiered
document
geodb
leveldb_options
lua
memory
merge_operators
option_change_migration
options
persistent_cache
redis
simulator_cache
spatialdb
table_properties_collectors
transactions
ttl
write_batch_with_index
col_buf_decoder.cc
col_buf_decoder.h
col_buf_encoder.cc
col_buf_encoder.h
column_aware_encoding_exp.cc
column_aware_encoding_test.cc
column_aware_encoding_util.cc
column_aware_encoding_util.h
debug.cc
env_librados.cc
env_librados.md
env_librados_test.cc
env_mirror.cc
env_mirror_test.cc
env_timed.cc
env_timed_test.cc
merge_operators.h
object_registry_test.cc
util_merge_operators_test.cc