ECE Practice Exam — Part 2
In Part 2 of the Elastic Certified Engineer practice exam, you will be tested on the following exam objectives:

* Perform index, create, read, update, and delete operations on the documents of an index
* Use the Reindex API and Update By Query API to reindex and/or update documents
* Define and use an ingest pipeline that satisfies a given set of requirements, including the use of Painless to modify documents
* Diagnose shard issues and repair a cluster's health
* Write and execute a search query for terms and/or phrases in one or more fields of an index
* Write and execute a search query that is a Boolean combination of multiple queries and filters
* Highlight the search terms in the response of a query
* Sort the results of a query by a given set of requirements
* Implement pagination in the results of a search query
* Apply fuzzy matching to a query
* Define and use a search template
* Write and execute a query that searches multiple clusters
* Write and execute metric and bucket aggregations
* Write and execute aggregations that contain sub-aggregations
* Write and execute pipeline aggregations
* Back up and restore a cluster and/or specific indices
* Configure a cluster for cross-cluster search
Challenge: Diagnose and Repair the "c1" Cluster

Start Elasticsearch

Using the Secure Shell (SSH), log in to the c1-data-1 node as cloud_user via the public IP address.

Become the elastic user:

    sudo su - elastic

Start Elasticsearch as a daemon:

    /home/elastic/elasticsearch/bin/elasticsearch -d -p pid

Replicate the logs index

Use the Kibana console tool on the c1 cluster to execute the following:

    PUT logs/_settings
    {
      "number_of_replicas": 1
    }

Reduce the shakespeare index's replication

Use the Kibana console tool on the c1 cluster to execute the following:

    PUT shakespeare/_settings
    {
      "number_of_replicas": 1
    }

Remove allocation filtering for the bank index

Use the Kibana console tool on the c1 cluster to execute the following:

    PUT bank/_settings
    {
      "index.routing.allocation.require._name": null
    }
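Before applying the fixes above, it helps to confirm what is actually wrong with the cluster. The read-only requests below are not part of the graded solution; they are a minimal diagnostic sketch you can run in the Kibana console on c1 to see the cluster status, which shards are unassigned, and why.

```
# Overall cluster status (green, yellow, or red).
GET _cluster/health

# Per-shard view; look for UNASSIGNED shards and the indices they belong to.
GET _cat/shards?v

# With no request body, this explains an unassigned shard, if any exist.
GET _cluster/allocation/explain
```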
Challenge: Transfer the "bank" Index to the "c2" Cluster

Configure the c2 cluster to remote reindex from the c1 cluster

Using the Secure Shell (SSH), log in to the c2 cluster nodes as cloud_user via the public IP address.

Become the elastic user:

    sudo su - elastic

Add the following line to /home/elastic/elasticsearch/config/elasticsearch.yml:

    reindex.remote.whitelist: "10.0.1.101:9200, 10.0.1.102:9200, 10.0.1.103:9200, 10.0.1.104:9200"

Stop Elasticsearch:

    pkill -F /home/elastic/elasticsearch/pid

Start Elasticsearch as a background daemon and record the PID to a file:

    /home/elastic/elasticsearch/bin/elasticsearch -d -p pid

Create the bank index on the c2 cluster

Use the Kibana console tool on the c2 cluster to execute the following:

    PUT bank
    {
      "settings": {
        "number_of_shards": 1,
        "number_of_replicas": 0
      }
    }

Reindex the bank index on the c2 cluster

Use the Kibana console tool on the c2 cluster to execute the following:

    POST _reindex
    {
      "source": {
        "remote": {
          "host": "http://10.0.1.101:9200",
          "username": "elastic",
          "password": "la_elastic_409"
        },
        "index": "bank"
      },
      "dest": {
        "index": "bank"
      }
    }

Delete the bank index on the c1 cluster

Use the Kibana console tool on the c1 cluster to execute the following:

    DELETE bank
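Before deleting the source index on c1, it is worth confirming that every document made it across. This is an optional check rather than a graded step: run the count on both clusters and make sure the numbers match.

```
# On the c2 cluster (destination): total documents reindexed into bank.
GET bank/_count

# On the c1 cluster (source): run the same request before DELETE bank and compare.
GET bank/_count
```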
Challenge: Back Up the "bank" Index on the "c2" Cluster

Configure the nodes

Using the Secure Shell (SSH), log in to the c2-master-1 node as cloud_user via the public IP address.

Become the elastic user:

    sudo su - elastic

Create the repo directory:

    mkdir /home/elastic/snapshots

Add the following line to /home/elastic/elasticsearch/config/elasticsearch.yml:

    path.repo: "/home/elastic/snapshots"

Stop Elasticsearch:

    pkill -F /home/elastic/elasticsearch/pid

Start Elasticsearch as a background daemon and record the PID to a file:

    /home/elastic/elasticsearch/bin/elasticsearch -d -p pid

Create the local_repo repository

Use the Kibana console tool on the c2 cluster to execute the following:

    PUT _snapshot/local_repo
    {
      "type": "fs",
      "settings": {
        "location": "/home/elastic/snapshots"
      }
    }

Back up the bank index

Use the Kibana console tool on the c2 cluster to execute the following:

    PUT _snapshot/local_repo/bank_1?wait_for_completion=true
    {
      "indices": "bank",
      "include_global_state": true
    }
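To confirm the snapshot finished cleanly, you can inspect it with the snapshot API. Restoring is not required by this lab, but a hedged restore sketch is included for reference; the bank_restored name is arbitrary and chosen only to avoid colliding with the live index.

```
# The snapshot should report "state": "SUCCESS" and list the bank index.
GET _snapshot/local_repo/bank_1

# Optional restore sketch (not part of the lab): restore the snapshot
# under a different index name so it does not clash with the live bank index.
POST _snapshot/local_repo/bank_1/_restore
{
  "indices": "bank",
  "rename_pattern": "bank",
  "rename_replacement": "bank_restored"
}
```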
Challenge: Configure Cross-Cluster Search

Use the Kibana console tool on the c1 cluster to execute the following:

    PUT _cluster/settings
    {
      "persistent": {
        "cluster": {
          "remote": {
            "c2": {
              "seeds": [
                "10.0.1.105:9300"
              ]
            }
          }
        }
      }
    }
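To verify that c1 can actually reach the c2 remote, two quick read-only checks (not graded steps) are useful: the remote info API, and a minimal cross-cluster search against the bank index that was moved to c2 earlier.

```
# The c2 entry should report "connected": true.
GET _remote/info

# Smoke-test cross-cluster search from c1 against the remote bank index.
GET c2:bank/_search
{
  "size": 0
}
```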
Challenge: Create, Update, and Delete Documents

Delete the bank documents

Use the Kibana console tool on the c2 cluster to execute the following:

    DELETE bank/_doc/5
    DELETE bank/_doc/27
    DELETE bank/_doc/819

Update the bank document

Use the Kibana console tool on the c2 cluster to execute the following:

    POST bank/_update/67
    {
      "doc": {
        "lastname": "Alonso"
      }
    }

Create the bank document

Use the Kibana console tool on the c2 cluster to execute the following:

    PUT bank/_doc/1000
    {
      "account_number": 1000,
      "balance": 35550,
      "firstname": "Stosh",
      "lastname": "Pearson",
      "age": 45,
      "gender": "M",
      "address": "125 Bear Creek Pkwy",
      "employer": "Linux Academy",
      "email": "[email protected]",
      "city": "Keller",
      "state": "TX"
    }
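A quick, optional way to confirm the document changes above is to fetch the affected documents by ID on the c2 cluster:

```
# A deleted document should return "found": false.
GET bank/_doc/5

# The updated document should now show "lastname": "Alonso".
GET bank/_doc/67

# The newly created document should return the fields supplied above.
GET bank/_doc/1000
```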
Update the shakespeare mapping

Use the Kibana console tool on the c1 cluster to execute the following:

    PUT shakespeare/_mappings
    {
      "properties": {
        "line_id": {
          "type": "integer"
        },
        "line_number": {
          "type": "text",
          "fields": {
            "keyword": {
              "type": "keyword",
              "ignore_above": 256
            }
          }
        },
        "play_name": {
          "type": "keyword"
        },
        "speaker": {
          "type": "keyword"
        },
        "speech_number": {
          "type": "integer"
        },
        "text_entry": {
          "type": "text",
          "fields": {
            "english": {
              "type": "text",
              "analyzer": "english"
            },
            "keyword": {
              "type": "keyword",
              "ignore_above": 256
            }
          }
        },
        "type": {
          "type": "text",
          "fields": {
            "keyword": {
              "type": "keyword",
              "ignore_above": 256
            }
          }
        }
      }
    }

Delete and update the shakespeare documents

Use the Kibana console tool on the c1 cluster to execute the following:

    POST shakespeare/_update_by_query
    {
      "script": {
        "lang": "painless",
        "source": """
          if (ctx._source.line_number == "") {
            ctx.op = "delete"
          }
        """
      }
    }
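The _update_by_query response includes a "deleted" count, which is the simplest way to see how many empty-line_number documents were removed. If you want an explicit before-and-after check, capture the total document count around the operation; a minimal sketch:

```
# Run before and after the update-by-query; the difference in "count"
# should match the "deleted" value reported by _update_by_query.
GET shakespeare/_count
```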
Create the ingest pipeline

Use the Kibana console tool on the c1 cluster to execute the following:

    PUT _ingest/pipeline/fix_logs
    {
      "processors": [
        {
          "remove": {
            "field": "@message"
          }
        },
        {
          "split": {
            "field": "spaces",
            "separator": "\\s+"
          }
        },
        {
          "script": {
            "lang": "painless",
            "source": "ctx.relatedContent_count = ctx.relatedContent.length"
          }
        },
        {
          "uppercase": {
            "field": "extension"
          }
        }
      ]
    }
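Before reindexing the whole logs index through this pipeline, you can exercise it with the simulate API. The document below is a made-up sample whose field values are only illustrative; it simply contains the fields the pipeline touches (@message, spaces, relatedContent, and extension).

```
POST _ingest/pipeline/fix_logs/_simulate
{
  "docs": [
    {
      "_source": {
        "@message": "raw message that should be removed",
        "spaces": "one  two   three",
        "relatedContent": [
          { "url": "https://example.com/a" },
          { "url": "https://example.com/b" }
        ],
        "extension": "html"
      }
    }
  ]
}
```

In the simulated result, @message should be gone, spaces should become a three-element array, relatedContent_count should be 2, and extension should be uppercased to HTML.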
Create the logs_new index

Use the Kibana console tool on the c1 cluster to execute the following:

    PUT logs_new
    {
      "settings": {
        "number_of_shards": 2,
        "number_of_replicas": 1
      }
    }

Reindex the logs documents

Use the Kibana console tool on the c1 cluster to execute the following:

    POST _reindex
    {
      "source": {
        "index": "logs"
      },
      "dest": {
        "index": "logs_new",
        "pipeline": "fix_logs"
      }
    }
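Reindexing the logs index can take a little while. These optional requests (not part of the graded steps) let you watch the reindex task and compare document counts once it finishes.

```
# Any reindex tasks still running on the cluster.
GET _tasks?detailed=true&actions=*reindex

# Once the reindex completes, the two counts should match.
GET logs/_count
GET logs_new/_count
```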
Challenge: Search Documents

Search the bank index

Use the Kibana console tool on the c1 cluster to execute the following:

    GET c2:bank/_search
    {
      "from": 0,
      "size": 50,
      "sort": [
        {
          "age": {
            "order": "asc"
          }
        },
        {
          "balance": {
            "order": "desc"
          }
        },
        {
          "lastname.keyword": {
            "order": "asc"
          }
        }
      ],
      "query": {
        "bool": {
          "must": [
            {
              "term": {
                "gender.keyword": {
                  "value": "F"
                }
              }
            },
            {
              "range": {
                "balance": {
                  "gt": 10000
                }
              }
            }
          ],
          "must_not": [
            {
              "terms": {
                "state.keyword": ["PA", "VA", "IL"]
              }
            }
          ],
          "filter": {
            "range": {
              "age": {
                "gte": 18,
                "lte": 35
              }
            }
          }
        }
      }
    }

Search the shakespeare index

Use the Kibana console tool on the c1 cluster to execute the following:

    GET shakespeare/_search
    {
      "from": 0,
      "size": 20,
      "highlight": {
        "pre_tags": "<b>",
        "post_tags": "</b>",
        "fields": {
          "text_entry.english": {}
        }
      },
      "query": {
        "bool": {
          "should": [
            {
              "match": {
                "text_entry.english": "life"
              }
            },
            {
              "match": {
                "text_entry.english": "love"
              }
            },
            {
              "match": {
                "text_entry.english": "death"
              }
            }
          ],
          "minimum_should_match": 2
        }
      }
    }

Search the logs index

Use the Kibana console tool on the c1 cluster to execute the following:

    GET logs/_search
    {
      "highlight": {
        "fields": {
          "relatedContent.twitter:description": {},
          "relatedContent.twitter:title": {}
        }
      },
      "query": {
        "bool": {
          "must": [
            {
              "match": {
                "relatedContent.twitter:description": {
                  "query": "never",
                  "fuzziness": 2
                }
              }
            },
            {
              "match_phrase": {
                "relatedContent.twitter:title": "Golden State"
              }
            }
          ]
        }
      }
    }
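If one of these queries returns unexpected results, the validate API is a handy, read-only way to see how Elasticsearch rewrites it. This is an optional debugging aid, shown here with the shakespeare query; the same pattern works for the other searches.

```
GET shakespeare/_validate/query?explain=true
{
  "query": {
    "bool": {
      "should": [
        { "match": { "text_entry.english": "life" } },
        { "match": { "text_entry.english": "love" } },
        { "match": { "text_entry.english": "death" } }
      ],
      "minimum_should_match": 2
    }
  }
}
```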
Challenge: Aggregate Documents

Aggregate on the bank index

Use the Kibana console tool on the c1 cluster to execute the following:

    GET c2:bank/_search
    {
      "size": 0,
      "aggs": {
        "state": {
          "terms": {
            "field": "state.keyword",
            "size": 5,
            "order": {
              "avg_balance": "desc"
            }
          },
          "aggs": {
            "avg_balance": {
              "avg": {
                "field": "balance"
              }
            }
          }
        }
      },
      "query": {
        "range": {
          "age": {
            "gte": 30
          }
        }
      }
    }

Aggregate on the shakespeare index

Use the Kibana console tool on the c1 cluster to execute the following:

    GET shakespeare/_search
    {
      "size": 0,
      "aggs": {
        "plays": {
          "terms": {
            "field": "play_name",
            "size": 10
          },
          "aggs": {
            "speakers": {
              "cardinality": {
                "field": "speaker"
              }
            }
          }
        },
        "most_parts": {
          "max_bucket": {
            "buckets_path": "plays>speakers"
          }
        }
      }
    }

Aggregate on the logs index

Use the Kibana console tool on the c1 cluster to execute the following:

    GET logs/_search
    {
      "size": 0,
      "aggs": {
        "hour": {
          "date_histogram": {
            "field": "@timestamp",
            "calendar_interval": "hour"
          },
          "aggs": {
            "clients": {
              "cardinality": {
                "field": "clientip.keyword"
              }
            },
            "cumulative_clients": {
              "cumulative_sum": {
                "buckets_path": "clients"
              }
            },
            "clients_per_minute": {
              "derivative": {
                "buckets_path": "cumulative_clients",
                "unit": "1m"
              }
            }
          }
        },
        "peak": {
          "max_bucket": {
            "buckets_path": "hour>clients"
          }
        }
      },
      "query": {
        "range": {
          "@timestamp": {
            "gte": "2015-05-19",
            "lt": "2015-05-20",
            "format": "yyyy-MM-dd"
          }
        }
      }
    }
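Terms, cardinality, and date_histogram aggregations only work against appropriately mapped fields (keyword, date, and so on). If an aggregation errors out or returns empty buckets, an optional first check is to confirm the field mappings the requests above rely on:

```
# play_name and speaker should be keyword fields.
GET shakespeare/_mapping/field/play_name,speaker

# @timestamp should be a date field and clientip should have a keyword subfield.
GET logs/_mapping/field/@timestamp,clientip
```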
Challenge: Create the Search Template

Use the Kibana console tool on the c2 cluster to execute the following:

    POST _scripts/accounts_search
    {
      "script": {
        "lang": "mustache",
        "source": {
          "from": "{{from}}{{^from}}0{{/from}}",
          "size": "{{size}}{{^size}}25{{/size}}",
          "query": {
            "bool": {
              "must": [
                {
                  "wildcard": {
                    "firstname.keyword": "{{first_name}}{{^first_name}}*{{/first_name}}"
                  }
                },
                {
                  "wildcard": {
                    "lastname.keyword": "{{last_name}}{{^last_name}}*{{/last_name}}"
                  }
                }
              ]
            }
          }
        }
      }
    }
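Once the template is stored, you can check how it renders and then run it against the bank index. The parameter values below are only examples; any of from, size, first_name, and last_name can be omitted and the template falls back to its defaults.

```
# Render the template without executing it, to inspect the generated query.
POST _render/template
{
  "id": "accounts_search",
  "params": {
    "last_name": "P*"
  }
}

# Execute the stored template against the bank index on c2.
GET bank/_search/template
{
  "id": "accounts_search",
  "params": {
    "last_name": "P*",
    "size": 10
  }
}
```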