I am using the Elasticsearch Java client. I do a search with query_string and get a response, but the score is always 1.0.
My code is:
String query = "{\"query\": {\"query_string\": {\"query\": \"weblog data4\"}}}";
SearchRequestBuilder builder = client
.prepareSearch("flume-2016-08-10")
.setQuery(query)
.addHighlightedField("*")
.setHighlighterRequireFieldMatch(false)
.setFrom(0).setSize(60).setExplain(true);
SearchResponse response = builder.execute().actionGet();
System.out.println(response.toString());
System.out.println(response.getHits().getAt(0).getSource());
System.out.println(response.getHits().getAt(0).getHighlightFields());
client.close();
The result shows a score of 1.0 for every hit. However, when I do the same search in elasticsearch-head, I get a response with the correct scores.
So, how do I get the correct score in Java?
This solved my problem:
change:
String query = "{\"query\": {\"query_string\": {\"query\": \"weblog data4\"}}}";
to:
String query = "{\"query_string\": {\"query\": \"weblog data4\"}}";
more details:
answer by jpountz
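To put the fix in context, here is a minimal sketch of the corrected call (index name, query terms, and paging reused from the question above; highlighting and explain omitted):
// Pass only the query body; setQuery() already places it under the
// top-level "query" element of the search request.
String query = "{\"query_string\": {\"query\": \"weblog data4\"}}";
SearchRequestBuilder builder = client
    .prepareSearch("flume-2016-08-10")
    .setQuery(query)
    .setFrom(0).setSize(60);
SearchResponse response = builder.execute().actionGet();
System.out.println(response.getHits().getAt(0).getScore());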
Related
I am writing a REST application (javax.ws.rs) that takes search requests from clients and submits them to the Elasticsearch high-level API. I want the clients (mostly browser-based JavaScript) to be able to compose their searches using the Elasticsearch REST API instructions.
The REST end point is defined like this:
@Path("list")
@POST
@Consumes(MediaType.APPLICATION_JSON)
@Produces(MediaType.APPLICATION_JSON)
public Response list(Map<String, Object> req) {
...
The following code implements a security-layer function and then passes the query on to a SearchRequest object pretty much unchanged, so I don't want to build queries using QueryBuilders here.
I have tried the instructions in this article, but it doesn't work. I think the createParser method has changed since that example was written. If someone could review this and suggest a solution, that would be much appreciated.
UPDATE: Using ES 7.2, I have come up with the following code. There have been many changes in the API, not all of which I understand, but here is what seems like it should work.
XContentBuilder xcb = XContentFactory.contentBuilder(Requests.CONTENT_TYPE);
xcb.map(req);
String json = Strings.toString(xcb);
XContentParser parser = JsonXContent.jsonXContent.createParser(
NamedXContentRegistry.EMPTY, LoggingDeprecationHandler.INSTANCE, json);
SearchSourceBuilder ssb = new SearchSourceBuilder();
ssb.parseXContent(parser);
SearchRequest sr = new SearchRequest(Log.INDEX);
sr.source(ssb);
SearchResponse resp = client.search(sr, RequestOptions.DEFAULT);
I get an IOException from the call to parseXContent. Looking with the debugger, the json string has unprintable characters in it. Any suggestions?
I found a code pattern that works. It seems a bit convoluted but logical, and there is no documentation anywhere that would lead you to it; it was pieced together from fragments posted on message boards here and there.
try {
// convert the Map into a JSON string to parse. Alternatively
// you could just take the string directly from the HTTP request
// but the Map form makes it easy to manipulate.
// Using jsonBuilder() explicitly guarantees JSON output (Requests.CONTENT_TYPE
// appears to be the binary SMILE format, which would explain the unprintable
// characters seen earlier).
XContentBuilder xcb = XContentFactory.jsonBuilder();
xcb.map(req);
String json = Strings.toString(xcb);
// Create an XContentParser and borrow a NamedXContentRegistry from
// the SearchModule class. Without that the parser has no way of
// knowing the query syntax.
SearchModule sm = new SearchModule(Settings.EMPTY, false, Collections.emptyList());
XContentParser parser = XContentFactory.xContent(XContentType.JSON)
.createParser(new NamedXContentRegistry(sm.getNamedXContents()),
LoggingDeprecationHandler.INSTANCE,
json);
// Finally we can create our SearchSourceBuilder and feed it the
// parser to ingest the request. This can throw an IllegalArgumentException
// if something isn't right with the JSON that we started with.
SearchSourceBuilder ssb = new SearchSourceBuilder();
ssb.parseXContent(parser);
// Now create a search request and use it
SearchRequest sr = new SearchRequest(Log.INDEX);
sr.source(ssb);
SearchResponse resp = client.search(sr, RequestOptions.DEFAULT);
} catch (IOException e) {
// xcb.map(), createParser() and client.search() can all throw IOException;
// handle or rethrow as appropriate for your application
}
I have tested this with a number of different JSON queries from the client and they all seem to work the way the direct REST API would. Here is an example:
{
  "from": 0,
  "size": 1000,
  "query": {
    "match_all": { "boost": 1 }
  },
  "sort": [
    { "timestamp": { "order": "asc" } }
  ]
}
Hopefully this post will save someone else from the painful search I went through. I would appreciate any comments from anyone who could suggest a better way of doing this.
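A more compact alternative is to skip the XContentParser plumbing entirely and hand the raw JSON to QueryBuilders.wrapperQuery (sketched below with the same variables as above):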
XContentBuilder xcb = XContentFactory.jsonBuilder();
xcb.map(req);
String json = Strings.toString(xcb);
SearchSourceBuilder ssb = new SearchSourceBuilder();
ssb.query(QueryBuilders.wrapperQuery(json));
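Note that wrapperQuery only covers the query clause, so request-level settings from the original map such as from, size, and sort would still need to be applied to the SearchSourceBuilder separately.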
I am trying to read data from Reddit using Java. I am using JRAW.
Here is my code:
public class Main {
public static void main(String args[]) {
System.out.println('a');
String username = "dummyName";
UserAgent userAgent = new UserAgent("crawl", "com.example.crawl", "v0.1", username);
Credentials credentials = Credentials.script(username, <password>,<clientID>, <client-secret>);
NetworkAdapter adapter = new OkHttpNetworkAdapter(userAgent);
RedditClient reddit = OAuthHelper.automatic(adapter, credentials);
Account me = reddit.me().about();
System.out.println(me.getName());
SubmissionReference submission = reddit.submission("https://www.reddit.com/r/diabetes/comments/9rlkdm/shady_insurance_work_around_to_pay_for_my_dexcom/");
RootCommentNode rcn = submission.comments();
System.out.println(rcn.getDepth());
System.out.println();
// Submission submission1 = submission.inspect();
// System.out.println(submission1.getSelfText());
// System.out.println(submission1.getUrl());
// System.out.println(submission1.getTitle());
// System.out.println(submission1.getAuthor());
// System.out.println(submission1.getCreated());
System.out.println("-----------------------------------------------------------------");
}
}
I am making two requests as of now: the first one is reddit.me().about(); and the second is reddit.submission("https://www.reddit.com/r/diabetes/comments/9rlkdm/shady_insurance_work_around_to_pay_for_my_dexcom/");
The output is:
a
[1 ->] GET https://oauth.reddit.com/api/v1/me?raw_json=1
[<- 1] 200 application/json: '{"is_employee": false, "seen_layout_switch": true, "has_visited_new_profile": false, "pref_no_profanity": true, "has_external_account": false, "pref_geopopular": "GL(...)
dummyName
[2 ->] GET https://oauth.reddit.com/comments/https%3A%2F%2Fwww.reddit.com%2Fr%2Fdiabetes%2Fcomments%2F9rlkdm%2Fshady_insurance_work_around_to_pay_for_my_dexcom%2F?sort=confidence&sr_detail=false&(...)
[<- 2] 400 application/json: '{"message": "Bad Request", "error": 400}'
Exception in thread "main" net.dean.jraw.ApiException: API returned error: 400 (Bad Request), relevant parameters: []
at net.dean.jraw.models.internal.ObjectBasedApiExceptionStub.create(ObjectBasedApiExceptionStub.java:57)
at net.dean.jraw.models.internal.ObjectBasedApiExceptionStub.create(ObjectBasedApiExceptionStub.java:33)
at net.dean.jraw.RedditClient.request(RedditClient.kt:186)
at net.dean.jraw.RedditClient.request(RedditClient.kt:219)
at net.dean.jraw.RedditClient.request(RedditClient.kt:255)
at net.dean.jraw.references.SubmissionReference.comments(SubmissionReference.kt:50)
at net.dean.jraw.references.SubmissionReference.comments(SubmissionReference.kt:28)
at Main.main(Main.java:36)
Caused by: net.dean.jraw.http.NetworkException: HTTP request created unsuccessful response: GET https://oauth.reddit.com/comments/https%3A%2F%2Fwww.reddit.com%2Fr%2Fdiabetes%2Fcomments%2F9rlkdm%2Fshady_insurance_work_around_to_pay_for_my_dexcom%2F?sort=confidence&sr_detail=false&raw_json=1 -> 400
... 6 more
As can be seen, my first request returns my username, but the second request gets a 400 Bad Request error.
To check whether my client ID and client secret were working correctly, I made the same request using the Python PRAW library.
import praw
from praw.models import MoreComments
reddit = praw.Reddit(client_id=<same-as-in-java>, client_secret=<same-as-in-java>,
password=<same-as-in-java>, user_agent='crawl',
username="dummyName")
submission = reddit.submission(
url='https://www.reddit.com/r/redditdev/comments/1x70wl/how_to_get_all_replies_to_a_comment/')
print(submission.selftext)
print(submission.url)
print(submission.title)
print(submission.author)
print(submission.created_utc)
print('-----------------------------------------------------------------')
This gives the desired result without any errors, so the client credentials must be working.
The only doubt I have is about the user agent creation in Java: UserAgent userAgent = new UserAgent("crawl", "com.example.crawl", "v0.1", username);.
I followed the following link.
What exactly do the target platform, the unique ID, and the version mean? I tried to keep the same format as in the link and used the same username as everywhere else. In Python, on the other hand, the user_agent was just the string crawl.
Please tell me if I am missing anything and what could be the issue.
Thank you.
P.S. I want to do this in Java, not Python.
Since your first query works, the credentials are correct. In JRAW, don't pass the whole URL; pass only the id to the submission function.
Change this
SubmissionReference submission = reddit.submission("https://www.reddit.com/r/diabetes/comments/9rlkdm/shady_insurance_work_around_to_pay_for_my_dexcom/");
to this
SubmissionReference submission = reddit.submission("9rlkdm");
where the id is the short random string after /comments/ in the URL.
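If you only have the full URL, a small helper like the following (hypothetical, not part of JRAW) can pull the id out of the path:
// Hypothetical helper: extracts the submission id from a full Reddit URL,
// e.g. ".../comments/9rlkdm/shady_insurance_..." -> "9rlkdm"
static String extractSubmissionId(String url) {
    java.util.regex.Matcher m =
        java.util.regex.Pattern.compile("/comments/([A-Za-z0-9]+)").matcher(url);
    if (!m.find()) {
        throw new IllegalArgumentException("No submission id in: " + url);
    }
    return m.group(1);
}
// usage:
SubmissionReference submission = reddit.submission(extractSubmissionId(url));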
Hope this helps.
According to Facebook Docs
If your app is making enough calls to be considered for rate limiting by our system, we return an X-App-Usage HTTP header. [...] When any of these metrics exceed 100 the app will be rate limited.
I am using Facebook4J to connect my application to the Facebook API, but I could not find any documentation about how to get the X-App-Usage HTTP header after a Facebook call, in order to avoid being rate limited. I want to use this header to know dynamically whether I need to increase or decrease the time between API calls.
So, my question is: using Facebook4J, is it possible to check whether Facebook returned the X-App-Usage HTTP header and to read it? How?
There is a getResponseHeader method on the response of BatchRequests in Facebook4J; see the Facebook4J code examples.
You could try getResponseHeader("X-App-Usage"):
// Executing "me" and "me/friends?limit=50" endpoints
BatchRequests<BatchRequest> batch = new BatchRequests<BatchRequest>();
batch.add(new BatchRequest(RequestMethod.GET, "me"));
batch.add(new BatchRequest(RequestMethod.GET, "me/friends?limit=50"));
List<BatchResponse> results = facebook.executeBatch(batch);
BatchResponse result1 = results.get(0);
BatchResponse result2 = results.get(1);
// You can get http status code or headers
int statusCode1 = result1.getStatusCode();
String contentType = result1.getResponseHeader("Content-Type");
// You can get body content via as****() method
String jsonString = result1.asString();
JSONObject jsonObject = result1.asJSONObject();
ResponseList<JSONObject> responseList = result2.asResponseList();
// You can map json to java object using DataObjectFactory#create****()
User user = DataObjectFactory.createUser(jsonString);
Friend friend1 = DataObjectFactory.createFriend(responseList.get(0).toString());
Friend friend2 = DataObjectFactory.createFriend(responseList.get(1).toString());
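Building on that example, here is a hedged sketch of reading the rate-limit header from one of the batch responses (per the docs quoted above, the header is only sent when the app is being considered for rate limiting, so it may be absent):
// X-App-Usage, when present, contains a small JSON object with percentage
// counters; back off as any of them approaches 100.
String appUsage = result1.getResponseHeader("X-App-Usage");
if (appUsage != null) {
    System.out.println("X-App-Usage: " + appUsage);
}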
I am having trouble making this URL query work in Java; it does not return any results, but from the browser it returns all the results. Here is the URL that does return results:
_search?pretty&q=*357*+AND+account_id:574fe92c9179a809fd76f0b8+AND+invalid:false
And here is my code (does not return any results):
FilterBuilder[] filtersArray = new FilterBuilder[2];
filtersArray[0] = FilterBuilders.termFilter("account_id", "574fe92c9179a809fd76f0b8");
filtersArray[1] = FilterBuilders.termFilter("invalid", false);
QueryBuilder query = QueryBuilders.filteredQuery(QueryBuilders.simpleQueryStringQuery("*357*"), FilterBuilders.andFilter(filtersArray));
SearchResponse response = esClient.prepareSearch(SecurityManager.getNamespace())
.addSort("created_time", SortOrder.DESC)
.setTypes(dataType)
.setQuery(query)
.addFields("_id")
.setFrom(page * size)
.setSize(size)
.setExplain(false)
.execute()
.actionGet();
Can someone tell me the best way to translate this URL query into a Java query?
First off, the URL query you should use is this one
?q=*357*+AND+account_id:574fe92c9179a809fd76f0b8+AND+invalid:false
otherwise you'll have no constraint on account_id and invalid
Then, the exact translation of this new URL query in Java is
QueryBuilder query = QueryBuilders.queryStringQuery("*357* AND account_id:574fe92c9179a809fd76f0b8 AND invalid:false");
SearchResponse response = esClient.prepareSearch(SecurityManager.getNamespace())
.addSort("created_time", SortOrder.DESC)
.setTypes(dataType)
.setQuery(query)
.addFields("_id")
.setFrom(page * size)
.setSize(size)
.setExplain(false)
.execute()
.actionGet();
Notes:
queryStringQuery and not simpleQueryStringQuery
no filters as they are all in the query string already
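If you prefer to keep the structured filters from your original code, an equivalent sketch against the same 1.x-era API (only swapping simpleQueryStringQuery for queryStringQuery, per the note above) would be:
QueryBuilder query = QueryBuilders.filteredQuery(
    QueryBuilders.queryStringQuery("*357*"),
    FilterBuilders.andFilter(
        FilterBuilders.termFilter("account_id", "574fe92c9179a809fd76f0b8"),
        FilterBuilders.termFilter("invalid", false)));
One thing to check with this variant: term filters are not analyzed, so whether they match depends on how account_id and invalid are mapped.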
I started to use Elasticsearch in my test project and can't figure out how to create a search across all fields. For instance, we have some words as a search query and I want to find all matching indexed objects in Elasticsearch, using the Java API.
My objects have: id, name, address, etc.
I searched for this kind of info and wrote this:
Node node = nodeBuilder().node();
Client client = node.client();
RegexpFilterBuilder qFilter = FilterBuilders.regexpFilter("_all", (".*" + query + ".*").replace(" ", ".*"));
SearchResponse response = client.prepareSearch(index)
.setTypes(type)
.setSearchType(SearchType.DFS_QUERY_THEN_FETCH)
.setPostFilter(qFilter)
.setFrom(0).setSize(100).setExplain(true)
.execute()
.actionGet();
SearchHit[] results = response.getHits().getHits();
System.out.println("Current results: " + results.length);
I also tried to use one field:
SearchResponse response = client.prepareSearch(index)
.setTypes(type)
.setSearchType(SearchType.QUERY_AND_FETCH)
.setQuery(termQuery(field, value))
.setFrom(0).setSize(100).setExplain(true)
.execute()
.actionGet();
I always get 0 results.
Can you show me how to do this the right way in Java?
OK, I spent more time on the documentation and found a solution; hope it helps someone else!
You just need to use QueryBuilders.multiMatchQuery: the first argument is the word you are searching for, and the remaining strings are the fields to search in.
SearchResponse response = client.prepareSearch(index)
.setTypes(type)
.setSearchType(SearchType.QUERY_AND_FETCH)
.setQuery(QueryBuilders.multiMatchQuery(value,
"name", "address1", "city", "postalCode",
"countryCode", "airportCode", "locationDescription",
"shortDescription"
))
.setFrom(0).setSize(100).setExplain(true)
.execute()
.actionGet();
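If you would rather not list every field, a hedged alternative with the same-era API is a query_string query against the special _all field (assuming _all has not been disabled in your mapping):
SearchResponse response = client.prepareSearch(index)
    .setTypes(type)
    // _all concatenates every field, so this searches the whole document
    .setQuery(QueryBuilders.queryStringQuery(value).defaultField("_all"))
    .setFrom(0).setSize(100)
    .execute()
    .actionGet();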