小编典典

How to reindex in ElasticSearch via the Java API

elasticsearch

Just like the title says…

I read this article (https://www.elastic.co/blog/changing-mapping-with-zero-downtime) and the concept is great, but I am having a hard time finding a decent reference on how to do it via the Java API.

I found this plugin: https://github.com/karussell/elasticsearch-reindex, but it seems like overkill for what I am trying to do.


2020-06-22

1 Answer

小编典典

After some research at my local Starbucks, here is what I came up with:

Let's assume we already have an index ("old_index") and it contains data… Now let's move that data into a new index we create ("new_index"), one where some field may now have a different type (STRING vs INT, say), or where you have decided certain fields should no longer be analyzed or stored, and so on.
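
The new index has to exist, with its new mapping, before any data is moved. A minimal sketch of creating "new_index" through the same transport client might look like this; note that "my_type", "some_field" and "other_field" are purely illustrative placeholders, not names from the original post:

// Assumption: "my_type", "some_field" and "other_field" are placeholders;
// substitute the document type and mapping you actually want for "new_index".
client.admin().indices().prepareCreate("new_index")
    .addMapping("my_type",
        "{ \"my_type\": { \"properties\": {" +
        "    \"some_field\":  { \"type\": \"string\", \"index\": \"not_analyzed\" }," +
        "    \"other_field\": { \"type\": \"integer\" }" +
        "  } } }")
    .execute().actionGet();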

The basic idea here is to retrieve all the data from the index that already exists ("old_index") and ingest it into the new index ("new_index"). However, there are just a couple of things you need to do:

Step 1. You need to perform a scroll search: https://www.elastic.co/guide/zh/elasticsearch/reference/current/search-request-scroll.html

Compared to a regular search, everything it does is aimed at retrieving the results more efficiently: there is no scoring, etc. Here is what the documentation has to say about it: "Scrolling is not intended for real time user requests, but rather for processing large amounts of data, e.g. in order to reindex the contents of one index into a new index with a different configuration."

Here is the link on how to do it with the Java API: https://www.elastic.co/guide/en/elasticsearch/client/java-api/current/scrolling.html

Step 2. When inserting, you have to use bulk ingest. Again, this is done for performance reasons. Here is the link to the Bulk Ingest Java API: https://www.elastic.co/guide/en/elasticsearch/client/java-api/current/bulk.html#_using_bulk_processor

Now on to how to actually do it…

Step 1. Set up a scroll search to "load" the data from the old index:

// Note: with SearchType.SCAN the initial response contains no documents, only
// the scroll id; the actual hits arrive with the scroll requests in step 3.
SearchResponse scrollResp = client.prepareSearch("old_index") // Specify the index
    .setSearchType(SearchType.SCAN)
    .setScroll(new TimeValue(60000))          // Keep the scroll context alive for 60 seconds
    .setQuery(QueryBuilders.matchAllQuery())  // Match all documents
    .setSize(100)                             // 100 hits per shard will be returned for each scroll
    .execute().actionGet();

Step 2. Set up the bulk processor:

int BULK_ACTIONS_THRESHOLD = 1000;
int BULK_CONCURRENT_REQUESTS = 1;
BulkProcessor bulkProcessor = BulkProcessor.builder(client, new BulkProcessor.Listener() {
    @Override
    public void beforeBulk(long executionId, BulkRequest request) {
        logger.info("Bulk Going to execute new bulk composed of {} actions", request.numberOfActions());
    }

    @Override
    public void afterBulk(long executionId, BulkRequest request, BulkResponse response) {
        logger.info("Executed bulk composed of {} actions", request.numberOfActions());
    }

    @Override
    public void afterBulk(long executionId, BulkRequest request, Throwable failure) {
        logger.warn("Error executing bulk", failure);
    }
})
    .setBulkActions(BULK_ACTIONS_THRESHOLD)
    .setConcurrentRequests(BULK_CONCURRENT_REQUESTS)
    .setFlushInterval(TimeValue.timeValueMillis(5))
    .build();

Step 3. Using the scroll searcher created in step 1, read from the old index until no records are left, and insert each batch into the new index:

//Scroll until no hits are returned
while (true) {
    scrollResp = client.prepareSearchScroll(scrollResp.getScrollId()).setScroll(new TimeValue(600000)).execute().actionGet();
    //Break condition: No hits are returned
    if (scrollResp.getHits().getHits().length == 0) {
        logger.info("Closing the bulk processor");
        bulkProcessor.close();
        break; 
    }
    // Get the results from the scan search and add them to the bulk ingest
    for (SearchHit hit : scrollResp.getHits()) {
        IndexRequest request = new IndexRequest("new_index", hit.type(), hit.id());
        request.source(hit.getSource()); // reuse the original document source as-is
        bulkProcessor.add(request);
    }
}
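
A small, hedged variation: instead of calling close() inside the loop, you can simply break out and then block until every in-flight bulk request has finished, which guarantees that step 4 only runs once all documents have really been indexed:

// awaitClose() flushes any remaining actions and blocks until pending bulk
// requests complete or the timeout elapses; it throws InterruptedException,
// so declare or handle it (TimeUnit is java.util.concurrent.TimeUnit).
bulkProcessor.awaitClose(10, TimeUnit.MINUTES);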

Step 4. Now it is time to assign the existing alias (the one pointing at the old index) to the new index, then remove the alias reference from the old index, and finally delete the old index itself. Assign the alias to the new index:

client.admin().indices().prepareAliases().addAlias("new_index", "alias_name").get();

Remove the alias from the old index, then delete the old index:

client.admin().indices().prepareAliases().removeAlias("old_index", "alias_name").execute().actionGet();
client.admin().indices().prepareDelete("old_index").execute().actionGet();
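
A hedged alternative for the swap itself: both alias actions can be chained into a single prepareAliases() call, which the aliases API applies as one atomic update, so there is never a moment where "alias_name" points at neither index:

// Move the alias from the old index to the new index in one atomic request,
// then drop the old index.
client.admin().indices().prepareAliases()
    .addAlias("new_index", "alias_name")
    .removeAlias("old_index", "alias_name")
    .execute().actionGet();
client.admin().indices().prepareDelete("old_index").execute().actionGet();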
2020-06-22