|TheWord Performance on multi-core CPU's
|Page 1 of 1|
|Author:||Bro_Marko [ Thu Oct 31, 2019 2:39 pm ]|
|Post subject:||TheWord Performance on multi-core CPU's|
yes, i know theWord is pretty fast, and i have been using it happily for more than 10 years,
and i do not intend to switch to anything else - because it is the best, even better than LOGOS.
and every couple of years, when i buy a new PC, i notice an improvement in search speeds
(searching over my gazillion modules, mostly large self-created ones).
but my question: even while it searches, it uses only about 5% of the CPU.
is there NO WAY to max out that CPU usage much more?
because i sometimes still have to wait a full minute
for it to complete a search through all my 1000 split-up modules.
i assume it could be MUCH faster if it were OPTIMIZED to be more multi-threaded (AMD style),
and thus make fuller use of a single CPU core, or better still, of all the cores at once.
so, while i do not expect an improvement soon at the current pace of updates,
it would still be great if SOMEDAY a multi-core option could be enabled
to let it fly through the modules even faster than ever before!
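theWord's internals are not public, so purely to illustrate the idea of a multi-threaded module search: here is a minimal Python sketch (not theWord's actual code - the flat directory layout, the `.txt` extension, and the `search_module` helper are all hypothetical) of fanning a plain-text search out over many module files with a thread pool:

```python
# Hypothetical sketch: fan a search out over many module files in parallel.
# theWord's real code is not public; the file layout and format are made up.
from concurrent.futures import ThreadPoolExecutor
from pathlib import Path

def search_module(path: Path, needle: str) -> list[tuple[str, int]]:
    """Return (module name, line number) hits for a case-insensitive needle."""
    hits = []
    with open(path, encoding="utf-8", errors="ignore") as f:
        for lineno, line in enumerate(f, start=1):
            if needle.lower() in line.lower():
                hits.append((path.name, lineno))
    return hits

def parallel_search(module_dir: str, needle: str, workers: int = 8):
    """Search every hypothetical .txt module in module_dir concurrently."""
    paths = sorted(Path(module_dir).glob("*.txt"))
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = pool.map(lambda p: search_module(p, needle), paths)
    return [hit for per_module in results for hit in per_module]
```

since each module is an independent file, the work splits cleanly per file, which is exactly why this kind of search parallelizes so naturally.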
thanks Costas & all !
|Author:||Bro_Marko [ Wed Nov 06, 2019 11:26 am ]|
|Post subject:||Re: TheWord Performance on multi-core CPU's|
after all my testing, theWord clearly still seems to be
a 32-bit, single-core (and mostly even single-threaded) application.
nevertheless, it is remarkably fast at that,
and here are my final tips to make it even faster:
aside from the usual recommendations for very fast searching -
a fast SSD, the fastest CPU you can afford, and fast RAM -
the real clincher for going from fast to SUPER fast (at least 2x faster on any PC)
seems to be the following tip, before you set out on any heavy searching:
1. on start-up, perform one "long search" over ALL MODULES, which seemingly loads them into memory (while you do something else in another program).
2. amazingly, every search from then on will be at least 2x faster than this initial one - a huge improvement.
3. and of course, creating specialized collections, where you set your own search boundaries, also greatly helps, both to make the whole experience much faster and to narrow down your search results.
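the speed-up in steps 1-2 is consistent with the operating system's file cache: once the module files have been read end-to-end, later reads come from RAM instead of disk. as a sketch of that idea (the directory layout and the default glob pattern are assumptions - theWord itself does this implicitly during the first long search):

```python
# Hypothetical sketch: pre-warm the OS file cache by reading every module
# file once, which is what the initial "long search over ALL MODULES"
# trick seems to do as a side effect.
from pathlib import Path

def warm_cache(module_dir: str, pattern: str = "*") -> int:
    """Read every matching file end-to-end; return total bytes touched.
    The read itself is what populates the OS file cache."""
    total = 0
    for path in Path(module_dir).glob(pattern):
        if path.is_file():
            total += len(path.read_bytes())
    return total
```

note that the bytes read are thrown away on purpose; the point is only the side effect of the reads.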
and generally speaking, i also found that theWord search struggles much more with LONG DOCUMENTS all pasted under one page/title -
say 100 pages under 1 title - taking seemingly disproportionately longer there than for 100 short documents of just 1 page each.
it is therefore a good idea to split up your long text documents into single pages, or even single paragraphs if you can,
using for example the module creation tools from e-sword (e-sword ToolTip Maker, where you set your paragraph/page marks to make the auto-split).
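the splitting step itself is simple enough to sketch (the `[PAGE]` marker string and the function are made up for illustration - e-sword ToolTip Maker does this interactively through its own paragraph/page marks, not via a script):

```python
# Hypothetical sketch: auto-split one long text into per-page chunks on an
# assumed marker string, so each chunk can become its own module page/title.
def split_pages(long_text: str, marker: str = "[PAGE]") -> list[str]:
    """Split a long document into page-sized chunks on a marker."""
    pages = [p.strip() for p in long_text.split(marker)]
    return [p for p in pages if p]  # drop empty chunks
```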
and splitting this way (by page, or even by paragraph as i often do) also mostly compensates
for the missing NEARx proximity-search command in the book search window -
a command that is available in the much more advanced bible search window,
with its many more search options.
but hopefully ONE DAY Costas will bring this same power-searching ability,
with the NEARx command, over to the book search as well.
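for anyone unfamiliar with it, a NEARx-style check just asks whether two words occur within x words of each other. theWord's actual NEARx syntax and semantics in the bible search window may differ; this minimal sketch (function name and all details invented) only illustrates the idea, and shows why per-paragraph splitting approximates it - a paragraph-sized chunk bounds how far apart two matched words can be:

```python
# Hypothetical sketch of a NEARx-style proximity check: are word_a and
# word_b within x words of each other anywhere in the text? theWord's
# real NEARx command may behave differently; this only shows the concept.
import re

def near(text: str, word_a: str, word_b: str, x: int) -> bool:
    words = [w.lower() for w in re.findall(r"\w+", text)]
    pos_a = [i for i, w in enumerate(words) if w == word_a.lower()]
    pos_b = [i for i, w in enumerate(words) if w == word_b.lower()]
    return any(abs(i - j) <= x for i in pos_a for j in pos_b)
```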
in any case, these are all the search tips that help squeeze
at least 2x greater search performance out of theWord, on whatever PC you already have.
so happy searching
to all !