Author: 菜鸟 | Source: Internet | 2023-07-20 19:25
I have installed the ik Chinese analyzer, following the tutorial from the ik project: https://github.com/medcl/elas...
I'm using Postman to access Elasticsearch.
First request, a GET to http://localhost:9200/
The result returned:
{
    "name": "DiKMetP",
    "cluster_name": "elasticsearch",
    "cluster_uuid": "KdmRqu-_R9usCI-qOCs5iA",
    "version": {
        "number": "6.1.2",
        "build_hash": "5b1fea5",
        "build_date": "2018-01-10T02:35:59.208Z",
        "build_snapshot": false,
        "lucene_version": "7.1.0",
        "minimum_wire_compatibility_version": "5.6.0",
        "minimum_index_compatibility_version": "5.0.0"
    },
    "tagline": "You Know, for Search"
}
So the server is reachable and working.
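If you want to check this response in a script rather than by eye, a minimal Python sketch (the JSON literal below is copied verbatim from the response above):

```python
import json

# Response body returned by GET http://localhost:9200/ (copied from above)
body = '''{
    "name": "DiKMetP",
    "cluster_name": "elasticsearch",
    "cluster_uuid": "KdmRqu-_R9usCI-qOCs5iA",
    "version": {
        "number": "6.1.2",
        "build_hash": "5b1fea5",
        "build_date": "2018-01-10T02:35:59.208Z",
        "build_snapshot": false,
        "lucene_version": "7.1.0",
        "minimum_wire_compatibility_version": "5.6.0",
        "minimum_index_compatibility_version": "5.0.0"
    },
    "tagline": "You Know, for Search"
}'''

info = json.loads(body)
# Confirm the cluster version matches what the plugin expects
print(info["version"]["number"])  # → 6.1.2
```

Note that the ik plugin version must match the Elasticsearch version exactly, so checking `version.number` this way is a useful sanity check.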
Second request, creating an index named "index" with a PUT to http://localhost:9200/index
Then a GET to http://localhost:9200/index returns:
{
    "index": {
        "aliases": {},
        "mappings": {},
        "settings": {
            "index": {
                "creation_date": "1516362624711",
                "number_of_shards": "5",
                "number_of_replicas": "1",
                "uuid": "znuaHlkhSueGMPA2UPf1Rw",
                "version": {
                    "created": "6010299"
                },
                "provided_name": "index"
            }
        }
    }
}
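The same PUT can be issued outside Postman with Python's standard library; a sketch (the index name "index" is taken from the request above, and the actual network call is left commented out):

```python
from urllib.request import Request, urlopen

# Equivalent of the Postman request: PUT http://localhost:9200/index
req = Request("http://localhost:9200/index", method="PUT")

# urlopen(req)  # uncomment with a local Elasticsearch running on port 9200
print(req.get_method(), req.full_url)
```

With no request body, Elasticsearch creates the index with default settings, which is why the response shows 5 shards and 1 replica (the 6.x defaults).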
Third request: Elasticsearch can tokenize Chinese out of the box, so I tried a POST without the ik analyzer: http://localhost:9200/_analyze?analyzer=standard&pretty=true&text=我是中国人
The result returned:
{
    "error": {
        "root_cause": [
            {
                "type": "parse_exception",
                "reason": "request body or source parameter is required"
            }
        ],
        "type": "parse_exception",
        "reason": "request body or source parameter is required"
    },
    "status": 400
}
Could someone point out where I went wrong? How should the Postman request look without the ik analyzer, and what is the syntax with the ik analyzer? Much appreciated. Some online sources say Elasticsearch depends on Maven, but I haven't installed Maven. Do I really need to?
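For what it's worth, the error message itself hints at the cause: in Elasticsearch 6.x the _analyze API no longer accepts `analyzer` and `text` as URL query parameters; they must be sent as a JSON request body (in Postman: Body → raw → JSON, with a `Content-Type: application/json` header). A sketch of both variants, with the network call left commented out; the analyzer names `ik_max_word` and `ik_smart` are the ones the ik plugin registers:

```python
import json
from urllib.request import Request, urlopen

# In ES 6.x, _analyze parameters go in the request body, not the query
# string -- hence "request body or source parameter is required" above.
body = json.dumps({"analyzer": "standard", "text": "我是中国人"}).encode("utf-8")

req = Request(
    "http://localhost:9200/_analyze",
    data=body,
    headers={"Content-Type": "application/json"},
    method="POST",
)

# With the ik plugin installed, only the analyzer name changes:
ik_body = json.dumps({"analyzer": "ik_max_word", "text": "我是中国人"}).encode("utf-8")

# tokens = json.load(urlopen(req))  # uncomment with a local cluster running
print(req.get_method(), req.full_url)
```

The same JSON body pasted into Postman's raw body field should work identically. (Maven is only needed if you build the ik plugin from source, not to run Elasticsearch itself.)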