Indexing

former_member304001
Active Contributor

Hi

I have created an index for a folder in KM, and I am able to search all the documents in it.

Now my question: whenever a new document is uploaded to that folder, do I need to reindex manually, or is there a setting so that documents are indexed automatically whenever new documents are uploaded to the folder?

Please help.

Regards

Krishna.

Accepted Solutions (1)

Former Member

Hi Krishna,

When you create an index, the "Crawler Parameters" field lets you add a schedule so that files are indexed daily, monthly, or at whatever interval you want.

Indexing is a heavy task for the server, so I run it every night at 4 a.m.

You can select your index and click the "Reindex" button if you want your files to be indexed before the next automatic run. You can also use the "incremental update" option if you don't want to rebuild your index from scratch.
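
To picture the difference, here is a small plain-Java sketch. It is not the KM crawler code: the folder path, the indexDocument stub, and the stored last-crawl time are made up for the example. A full reindex touches every document, while an incremental update only picks up documents modified since the last run.

```java
// Plain-Java sketch only -- this is NOT the SAP KM API. It just contrasts a full
// reindex (every document is indexed again) with an incremental update (only
// documents changed since the last recorded crawl time are indexed).
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.time.Instant;
import java.util.stream.Stream;

public class IncrementalIndexSketch {

    // Hypothetical stand-in for "send this document to the index".
    static void indexDocument(Path doc) {
        System.out.println("Indexing: " + doc);
    }

    // Full reindex: walk the folder and index every document.
    static void fullReindex(Path folder) throws IOException {
        try (Stream<Path> files = Files.walk(folder)) {
            files.filter(Files::isRegularFile).forEach(IncrementalIndexSketch::indexDocument);
        }
    }

    // Incremental update: index only documents modified after the last crawl.
    static void incrementalUpdate(Path folder, Instant lastCrawl) throws IOException {
        try (Stream<Path> files = Files.walk(folder)) {
            files.filter(Files::isRegularFile)
                 .filter(p -> {
                     try {
                         return Files.getLastModifiedTime(p).toInstant().isAfter(lastCrawl);
                     } catch (IOException e) {
                         return false; // skip unreadable entries
                     }
                 })
                 .forEach(IncrementalIndexSketch::indexDocument);
        }
    }

    public static void main(String[] args) throws IOException {
        Path folder = Paths.get(args.length > 0 ? args[0] : ".");
        // Pretend the last scheduled crawl ran an hour ago.
        incrementalUpdate(folder, Instant.now().minusSeconds(3600));
    }
}
```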

Have a look at this help page: http://help.sap.com/saphelp_nw04/helpdata/en/30/6f76534f90744daeecc45015263c59/frameset.htm

Hope it helps.

Best regards,

Julien.

Answers (2)

sascha_tubbesing
Employee

Hi,

It depends on the type of repository and on where you are uploading.

In many cases no manual action is needed, and it is not even necessary to schedule a crawler.

If you have a repository manager that sends events (see the corresponding option in its configuration object) and you are working with the CM UI (uploading, editing, deleting), everything is done automatically.
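
As a rough illustration of that event-driven case, here is a plain-Java sketch using a local folder watcher. It is not the CM repository framework; the indexDocument stub is invented for the example.

```java
// Plain-Java sketch only -- not the CM repository framework. It illustrates the
// event-driven case: when the repository reports create/modify events, each new or
// changed document can be indexed immediately, without any scheduled crawler.
import java.io.IOException;
import java.nio.file.FileSystems;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardWatchEventKinds;
import java.nio.file.WatchEvent;
import java.nio.file.WatchKey;
import java.nio.file.WatchService;

public class EventDrivenIndexSketch {

    // Hypothetical stand-in for "send this document to the index".
    static void indexDocument(Path doc) {
        System.out.println("Indexing: " + doc);
    }

    public static void main(String[] args) throws IOException, InterruptedException {
        Path folder = Paths.get(args.length > 0 ? args[0] : ".");
        WatchService watcher = FileSystems.getDefault().newWatchService();
        folder.register(watcher,
                        StandardWatchEventKinds.ENTRY_CREATE,
                        StandardWatchEventKinds.ENTRY_MODIFY);

        // React to events as they arrive instead of crawling on a schedule.
        while (true) {
            WatchKey key = watcher.take();
            for (WatchEvent<?> event : key.pollEvents()) {
                if (event.kind() == StandardWatchEventKinds.OVERFLOW) {
                    continue; // events were lost; a real crawler would resync here
                }
                Path changed = folder.resolve((Path) event.context());
                indexDocument(changed);
            }
            key.reset();
        }
    }
}
```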

If you have an FS repository and you are working directly on the file system, you will need to schedule a crawler.

If you have a Web Repository, you will also need to schedule a crawler.
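
For the FS and Web repository cases, the crawl simply runs on a fixed schedule. Here is a minimal plain-Java sketch; it is not the KM scheduler, and the crawlAndIndex stub and the 24-hour interval are placeholders for what you configure in the crawler parameters.

```java
// Plain-Java sketch only -- not the KM scheduler. For FS and Web repositories the
// index is kept up to date by a crawl that runs on a fixed schedule, configured in
// the crawler parameters; here a daily interval stands in for that schedule.
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class ScheduledCrawlSketch {

    // Hypothetical stand-in for "crawl the repository and update the index".
    static void crawlAndIndex() {
        System.out.println("Crawling repository and updating index...");
    }

    public static void main(String[] args) {
        ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
        // Run immediately, then once every 24 hours (e.g. the nightly crawl).
        scheduler.scheduleAtFixedRate(ScheduledCrawlSketch::crawlAndIndex,
                                      0, 24, TimeUnit.HOURS);
    }
}
```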

Regards,

Sascha

Former Member

Hi Krishna,

There is no need to reindex every time new documents are added to the folder. That is exactly why the concept of crawlers was introduced.

The crawler service allows crawlers to collect resources located in internal or external repositories, for example, for indexing purposes. A crawler returns the resources and the hierarchical or net-like structures of the respective repositories.

Services and applications that need repositories to be crawled (for example, the index management service) request a crawler from the crawler service.

You can create a new set of crawler parameters based on how frequently new documents are added. Each time the scheduled run occurs, the newly added documents get indexed.
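
To picture what a crawler run does, here is a plain-Java sketch that walks a local folder tree. It is not the SAP crawler service; the hierarchy map is only an illustration of how resources and their structure are handed over for indexing.

```java
// Plain-Java sketch only -- not the SAP crawler service. It shows the idea that a
// crawler walks a repository and returns the resources together with their
// hierarchical structure, which a service such as index management can then consume.
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class CrawlerHierarchySketch {

    // Collect every folder under the root together with the entries it contains.
    static Map<Path, List<Path>> crawl(Path root) throws IOException {
        Map<Path, List<Path>> hierarchy = new LinkedHashMap<>();
        try (Stream<Path> entries = Files.walk(root)) {
            for (Path p : entries.collect(Collectors.toList())) {
                if (Files.isDirectory(p)) {
                    try (Stream<Path> children = Files.list(p)) {
                        hierarchy.put(p, children.collect(Collectors.toList()));
                    }
                }
            }
        }
        return hierarchy;
    }

    public static void main(String[] args) throws IOException {
        Path root = Paths.get(args.length > 0 ? args[0] : ".");
        crawl(root).forEach((folder, children) ->
                System.out.println(folder + " -> " + children.size() + " entries"));
    }
}
```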

Regards,

Ganesh N