README

This directory contains the HDFS extensions needed to make rocksdb store
files in HDFS.

It has been compiled and tested against CDH 4.4 (2.0.0+1475-1.cdh4.4.0.p0.23~precise-cdh4.4.0).

The configuration assumes that the packages libhdfs0 and libhdfs0-dev are
installed, which means that hdfs.h is in /usr/include and libhdfs is in /usr/lib.
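
On a Debian/Ubuntu host with the CDH package repositories configured, they can
typically be installed with something like the following (the exact command and
package source depend on your distribution):

   sudo apt-get install libhdfs0 libhdfs0-dev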

The env_hdfs.h file defines the rocksdb objects that are needed to talk to the
underlying HDFS filesystem.

If you want to compile rocksdb with hdfs support, set the following
environment variables appropriately (they are also defined in setup.sh for
convenience) and then build:
   USE_HDFS=1
   JAVA_HOME=/usr/local/jdk-7u79-64
   LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/local/jdk-7u79-64/jre/lib/amd64/server:/usr/local/jdk-7u79-64/jre/lib/amd64/:./snappy/libs
   make clean all db_bench
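
For example, in a bash shell the whole build might look like this (the JDK and
snappy locations below are just the sample paths from this README; adjust them
to match your installation):

   export USE_HDFS=1
   export JAVA_HOME=/usr/local/jdk-7u79-64
   export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$JAVA_HOME/jre/lib/amd64/server:$JAVA_HOME/jre/lib/amd64:./snappy/libs
   make clean all db_bench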

To run db_bench:
  set CLASSPATH to include the jars from your hadoop distribution, then run
  db_bench --hdfs="hdfs://hbaseudbperf001.snc1.facebook.com:9000"
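
For example, assuming a hadoop installation under /usr/lib/hadoop and a
namenode reachable at a placeholder address (both are assumptions here;
substitute the path and address for your own cluster):

  for f in `find /usr/lib/hadoop -name "*.jar"`; do export CLASSPATH=$CLASSPATH:$f; done
  ./db_bench --hdfs="hdfs://namenode.example.com:9000"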