Issue 17554

maps indexing throws cryptic error on startup

Reporter: omeyn
Type: Bug
Summary: maps indexing throws cryptic error on startup
Priority: Major
Resolution: WontFix
Status: Closed
Created: 2015-04-22 16:41:37.858
Updated: 2018-05-31 16:42:39.348
Resolved: 2018-05-31 16:42:39.316
Description: When a maps indexer receives its first message after starting up, it throws the following warning:

WARN  [2015-04-22 16:37:20,516+0200] [HBase DB flusher ] org.apache.hadoop.hbase.util.DynamicClassLoader: Failed to identify the fs of dir /tmp/hbase-crap/hbase/lib, ignored
java.io.IOException: No FileSystem for scheme: file
        at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2584) ~[metrics-cli.jar:na]
        at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2591) ~[metrics-cli.jar:na]
        at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:91) ~[metrics-cli.jar:na]
        at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2630) ~[metrics-cli.jar:na]
        at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2612) ~[metrics-cli.jar:na]
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:370) ~[metrics-cli.jar:na]
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:169) ~[metrics-cli.jar:na]
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:354) ~[metrics-cli.jar:na]
        at org.apache.hadoop.fs.Path.getFileSystem(Path.java:296) ~[metrics-cli.jar:na]
        at org.apache.hadoop.hbase.util.DynamicClassLoader.&lt;init&gt;(DynamicClassLoader.java:104) ~[metrics-cli.jar:na]
        at org.apache.hadoop.hbase.protobuf.ProtobufUtil.&lt;clinit&gt;(ProtobufUtil.java:221) [metrics-cli.jar:na]
        at org.apache.hadoop.hbase.ClusterId.parseFrom(ClusterId.java:64) [metrics-cli.jar:na]
        at org.apache.hadoop.hbase.zookeeper.ZKClusterId.readClusterIdZNode(ZKClusterId.java:69) [metrics-cli.jar:na]
        at org.apache.hadoop.hbase.client.ZooKeeperRegistry.getClusterId(ZooKeeperRegistry.java:83) [metrics-cli.jar:na]
        at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.retrieveClusterId(HConnectionManager.java:839) [metrics-cli.jar:na]
        at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.&lt;init&gt;(HConnectionManager.java:642) [metrics-cli.jar:na]
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) [na:1.7.0_75]
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57) [na:1.7.0_75]
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) [na:1.7.0_75]
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526) [na:1.7.0_75]
        at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:411) [metrics-cli.jar:na]
        at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:390) [metrics-cli.jar:na]
        at org.apache.hadoop.hbase.client.HConnectionManager.getConnection(HConnectionManager.java:271) [metrics-cli.jar:na]
        at org.apache.hadoop.hbase.client.HTable.&lt;init&gt;(HTable.java:197) [metrics-cli.jar:na]
        at org.apache.hadoop.hbase.client.HTable.&lt;init&gt;(HTable.java:174) [metrics-cli.jar:na]
        at org.gbif.metrics.cube.HBaseCubes$1.createHTableInterface(HBaseCubes.java:196) [metrics-cli.jar:na]
        at org.apache.hadoop.hbase.client.HTablePool.createHTable(HTablePool.java:274) [metrics-cli.jar:na]
        at org.apache.hadoop.hbase.client.HTablePool.findOrCreateTable(HTablePool.java:204) [metrics-cli.jar:na]
        at org.apache.hadoop.hbase.client.HTablePool.getTable(HTablePool.java:183) [metrics-cli.jar:na]
        at org.apache.hadoop.hbase.client.HTablePool.getTable(HTablePool.java:222) [metrics-cli.jar:na]
        at com.urbanairship.datacube.dbharnesses.WithHTable.run(WithHTable.java:34) [metrics-cli.jar:na]
        at com.urbanairship.datacube.dbharnesses.WithHTable.get(WithHTable.java:68) [metrics-cli.jar:na]
        at com.urbanairship.datacube.dbharnesses.HBaseDbHarness.readCombineCas(HBaseDbHarness.java:225) [metrics-cli.jar:na]
        at com.urbanairship.datacube.dbharnesses.HBaseDbHarness.flushBatch(HBaseDbHarness.java:281) [metrics-cli.jar:na]
        at com.urbanairship.datacube.dbharnesses.HBaseDbHarness.access$200(HBaseDbHarness.java:32) [metrics-cli.jar:na]
        at com.urbanairship.datacube.dbharnesses.HBaseDbHarness$FlushWorkerRunnable.call(HBaseDbHarness.java:189) [metrics-cli.jar:na]
        at java.util.concurrent.FutureTask.run(FutureTask.java:262) [na:1.7.0_75]
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) [na:1.7.0_75]
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) [na:1.7.0_75]
        at java.lang.Thread.run(Thread.java:745) [na:1.7.0_75]

That appears to be the default hbase.tmp.dir it's trying to resolve, but that directory is only used in non-distributed mode or on region servers (i.e. not on clients like this one). So I don't think it's critical, but it's definitely not right.
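For context: the `[metrics-cli.jar:na]` markers in the trace suggest a shaded (uber) jar, and a well-known cause of `java.io.IOException: No FileSystem for scheme: file` in shaded Hadoop clients is that the `META-INF/services/org.apache.hadoop.fs.FileSystem` service files from the individual Hadoop jars overwrite each other during shading, so the `file://` implementation never gets registered. This is a sketch of a possible fix, assuming the jar is built with the Maven shade plugin (not verified against this build, and not applied here since the issue was closed WontFix):

```xml
<!-- Hypothetical pom.xml fragment: ServicesResourceTransformer merges
     the META-INF/services entries from all shaded dependencies instead
     of letting the last jar win, so LocalFileSystem stays registered. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <configuration>
    <transformers>
      <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
    </transformers>
  </configuration>
</plugin>
```

Alternatively, setting `fs.file.impl` to `org.apache.hadoop.fs.LocalFileSystem` in the client's Hadoop configuration forces the registration without rebuilding the jar; either way the warning is cosmetic on a pure client, as noted above.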