on 2018 Jul 31, 5:20 PM
We need to export search results from the PLP page in Excel format, so the count of products in the file should match the count displayed on the page. Currently the search results contain data for only 100 products (the number of products displayed on a page). Any suggestions or ideas on this?
Hi, I was reading the Solr documentation and came across Solr's export handler. Any time a user clicks the 'Export' button, you can invoke Solr's export handler. See the link below.
https://lucene.apache.org/solr/guide/6_6/exporting-result-sets.html
If the user has selected facets (size, brand, etc.), just pass the appropriate filter queries along to the Solr export handler.
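As a hedged sketch of what such a request might look like: Solr's `/export` endpoint takes the usual `q`, `fq`, `fl`, and `sort` parameters (and requires a sort on docValues fields). The core name `products` and the field/facet names below are illustrative, not taken from the thread.

```java
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

// Hypothetical helper that builds a Solr /export request URL.
// Core, field, and facet names are assumptions for illustration.
public class SolrExportUrlBuilder {

    static String buildExportUrl(String solrBase, String core, String query,
                                 String[] filterQueries, String fields, String sort) {
        StringBuilder url = new StringBuilder(solrBase)
                .append('/').append(core).append("/export")
                .append("?q=").append(encode(query))
                .append("&fl=").append(encode(fields))
                // /export requires an explicit sort on docValues fields
                .append("&sort=").append(encode(sort));
        for (String fq : filterQueries) {
            // one fq per selected facet (size, brand, ...)
            url.append("&fq=").append(encode(fq));
        }
        return url.toString();
    }

    private static String encode(String s) {
        return URLEncoder.encode(s, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        System.out.println(buildExportUrl("http://localhost:8983/solr", "products",
                "category:bags", new String[] { "brand:acme", "size:L" },
                "code,name,price", "code asc"));
    }
}
```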
Assuming that if I reach a PLP page by clicking on a category called 'Bags' and you do the same thing, we both download the same file. Essentially, what I am trying to establish is that the file contents depend only on the category, not on the user.
The best way I can think of is to generate an event whenever the user clicks the 'Download' button. So, create a controller called PLPCrazyFileDownloadController with a method called downloadFileForCategory, and pass the category code into it. See the code below.
import org.springframework.stereotype.Controller;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;

@Controller
@RequestMapping("/downloadexcel")
public class PLPCrazyFileDownloadController
{
    @RequestMapping(value = "/{category}", method = RequestMethod.POST)
    public String downloadFileForCategory(@PathVariable final String category)
    {
        // Hit Solr with the same query as the category page, but with a
        // page size large enough to cover the full result set (e.g. 10000).
        return null; // write the generated Excel file to the response instead
    }
}
Assume you are on the PLP page for the category 'watches'. Hit Solr with category = 'watches', either without the pageSize attribute or with pageSize=<some big number>, and Solr will give you all the results for that category. You will then have to convert those results to your ProductData, write that data to Excel, and send the Excel file to the user.
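The "convert and write" step above could be sketched as follows. In real Hybris code you would map the Solr documents to ProductData and use Apache POI to produce a true .xls/.xlsx file; the sketch below substitutes plain CSV rows for brevity, and the field names are assumptions.

```java
import java.util.List;
import java.util.Map;

// Hypothetical sketch: flatten Solr result documents into export rows.
// Field names (code, name, price) are illustrative; a real export would
// go through ProductData and Apache POI rather than CSV.
public class ProductExportWriter {

    static String toCsv(List<Map<String, Object>> solrDocs) {
        StringBuilder sb = new StringBuilder("code,name,price\n");
        for (Map<String, Object> doc : solrDocs) {
            sb.append(doc.get("code")).append(',')
              .append(doc.get("name")).append(',')
              .append(doc.get("price")).append('\n');
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        String csv = toCsv(List.of(
                Map.of("code", "W001", "name", "Classic Watch", "price", "149.99"),
                Map.of("code", "W002", "name", "Sport Watch", "price", "89.50")));
        System.out.print(csv);
    }
}
```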
Now, had I been you, my question to the requirements team would be: how many categories do we have in total? If there are only 200-300 categories, then before sending the 'watches' category file to the user, I would save this file in the hybris/data/category-excels/ folder with the name watches.xls. Every time a second, third, or fourth user wants this file, I would not hit Solr but serve them the same file.
Assuming your full Solr index runs once a day, I would delete all these files once the full Solr index is successful; the first request the next morning would then generate the files again.
So essentially, if you have 1000 customers hitting the download button in a day, you generate the file only once after your full Solr index runs, and the next 999 customers get the same file.
This is the same as invalidating a cache after 24 hours: you invalidate the files after your Solr index is successful.
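The file-cache idea above can be sketched like this. The cache directory and the generator function are illustrative stand-ins, not part of any Hybris API.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.function.Function;

// Hypothetical sketch of the per-category file cache described above:
// serve a previously generated export if one exists, otherwise generate
// it once and save it; wipe everything after a successful full index.
public class CategoryExportCache {

    private final Path cacheDir;

    CategoryExportCache(Path cacheDir) {
        this.cacheDir = cacheDir;
    }

    byte[] getOrGenerate(String category, Function<String, byte[]> generator)
            throws IOException {
        Path file = cacheDir.resolve(category + ".xls");
        if (Files.exists(file)) {          // cache hit: no Solr round-trip
            return Files.readAllBytes(file);
        }
        byte[] data = generator.apply(category);
        Files.createDirectories(cacheDir);
        Files.write(file, data);           // cache for the next 999 users
        return data;
    }

    void invalidateAll() throws IOException {
        // call this after the full Solr index finishes successfully
        if (Files.exists(cacheDir)) {
            try (var stream = Files.list(cacheDir)) {
                for (Path p : stream.toList()) {
                    Files.delete(p);
                }
            }
        }
    }
}
```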
Let me know how you go and please like my answers if they are helpful. So others can also benefit.
Thanks, I am going to try out this suggestion. One more thing to add: the customer has a search box in which they can search against 5 different parameters, and further filtering can be done with the left-navigation facets (size, brand, etc.). Hence saving the file is not useful in our case. One doubt: is the pageSize attribute of PageableData what limits the number of records pulled from Solr?
Yup, pageSize and offset. If pageSize is 36 and offset is 0, you get results 1 to 36; if pageSize is 36 and offset is 10, you get results 11 to 46.
If pageSize is 100 and offset is 0, you get the first 100 results.
So pageSize controls the number of results, while offset controls where the count starts. I would suggest you read the Solr documentation and debug CategoryPageController and SearchPageController.
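The pageSize/offset semantics described above can be illustrated with a tiny slicing example (the 72-result data set is made up for the demo):

```java
import java.util.List;
import java.util.stream.IntStream;

// Small illustration of pageSize/offset semantics: offset says where to
// start, pageSize says how many results to return.
public class PaginationDemo {

    static List<Integer> page(List<Integer> all, int offset, int pageSize) {
        int from = Math.min(offset, all.size());
        int to = Math.min(offset + pageSize, all.size());
        return all.subList(from, to);
    }

    public static void main(String[] args) {
        List<Integer> results = IntStream.rangeClosed(1, 72).boxed().toList();
        System.out.println(page(results, 0, 36));  // results 1..36
        System.out.println(page(results, 10, 36)); // results 11..46
    }
}
```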
good luck.
It is not advisable because if, say, you have 2000 products for a category, you have to fetch all of them from Solr. The official Solr documentation says standard result fetching works best for small result sets.
And fetching 2000 results from Solr on every request also slows down your page loads.
If you have to do it on every page load, I personally think it is a crazy idea. For example, say you have 500 active users on your website, 10% of whom hit the PLP by clicking a category and another 10% by searching. That means the PLP is hit 100 times, and each time you fetch a lot of products, because your requirement is to fetch the entire data set.
So, the next question is: do you have to do it for every page load, or just once or twice a day?
Hi, the idea is to introduce a new download feature on the PLP page. When the user clicks the download button, the download must happen in the background (without affecting any other functionality) for the entire search result (max 18k on our site). The results loaded on the page will still follow the defined pagination. Please let me know any suggestions.
Sorry, I didn't get your question.
For example, say the total result set returned from Solr is 72 and your first page shows only 24 results; the second page shows results 25 to 48, and the third shows 49 to 72.
Do you wish to export the first 24 and then the next 24 only on some user action, or do you wish to export all 72 in one go?
The strategy would depend on how often you wish to export these products, and on whether it is acceptable for the export to contain stale data (like price).
Please contact me at saurabh24grover@gmail.com if you need further help.