1. Download the Flume 1.5.2 source code and change the Solr version to 5.1.0
2. Compile and install
3. Copy the Solr 4.10.1 related jars to the lib dir to solve this error:
CloudSolrServer' (current frame, stack[2]) is not assignable to 'org/apache/solr/client/solrj/SolrServer
4. Alter some fieldType definitions in Solr's schema.xml to deal with these errors:
change distanceUnits="kilometers" =====> units="degrees"
SolrException: Must specify units="degrees" on field types with class SpatialRecursivePrefixTreeFieldType
SolrException: Must specify units="degrees" on field types with class BBoxField
But there are still some conflicts, such as:
SolrException: Could not find collection (Due to using solr-solrj-4.10.1.jar to access solr 5.1.0 cloud)
org/kitesdk/morphline/solr/SolrLocator', 'org/apache/solr/client/solrj/impl/CloudSolrServer
The kitesdk jars still do not support Solr 5.1.0, so the only way forward is to modify kitesdk:
replace kite-morphlines-solr-core-0.12.0.jar with a new jar built to support Solr 5.1.0.
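The schema.xml change from step 4 looks roughly like this; the fieldType names and the other attribute values are illustrative, only the units attribute matters for the two SolrExceptions above:

```xml
<!-- Before (Solr 5 style):  distanceUnits="kilometers"      -->
<!-- After (what the 4.10.1 client expects): units="degrees" -->
<fieldType name="location_rpt"
           class="solr.SpatialRecursivePrefixTreeFieldType"
           geo="true" maxDistErr="0.001" units="degrees"/>
<fieldType name="bbox" class="solr.BBoxField"
           geo="true" units="degrees" numberType="_bbox_coord"/>
```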
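Steps 1–2 above can be sketched as follows. The `<solr.version>` property name is an assumption; locate the actual entry that pins the Solr version in the Flume source's pom files before editing. The sed edit is demonstrated on a scratch snippet so it can be verified in isolation:

```shell
# Sketch for steps 1-2: bump the Solr version before rebuilding Flume.
# The <solr.version> property name is hypothetical; check the real pom.xml.
# Demonstrated on a scratch copy:
printf '<properties><solr.version>4.10.1</solr.version></properties>\n' > /tmp/pom-snippet.xml
sed -i 's|<solr.version>4.10.1</solr.version>|<solr.version>5.1.0</solr.version>|' /tmp/pom-snippet.xml
cat /tmp/pom-snippet.xml
# Then, from the Flume source root:
#   mvn clean install -DskipTests
```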
-----------------------------------------------
Alter the SolrCloud configuration files in ZooKeeper
1. Remove the old config from ZooKeeper with zkCli.sh:
#zkCli.sh
#ls /
#rmr /configs/fifo_task
2. Use Solr's zkcli.sh to upload the configuration files to ZooKeeper:
#./scripts/cloud-scripts/zkcli.sh -cmd upconfig -zkhost 192.168.0.135:2181 -collection task -confname fifo_task -solrhome solr -confdir solr/configsets/fifo_configs/conf
3. Unload the cores in the Solr web UI and then load them again.
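As an alternative to step 3, the SolrCloud Collections API RELOAD action reloads a collection from the command line. The host below reuses the machine from the upconfig example; port 8983 is the Solr default and an assumption, so adjust it to your node:

```shell
# Build the Collections API RELOAD request (host/port assumed; collection
# name "task" matches the upconfig command above).
SOLR_NODE="192.168.0.135:8983"
COLLECTION="task"
URL="http://${SOLR_NODE}/solr/admin/collections?action=RELOAD&name=${COLLECTION}"
echo "$URL"
# Run it with: curl "$URL"
```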