Looking at c:\ and *.*, the "Analyzing which of the found files are duplicates" phase never completes, even after 12+ hours.

frankc2 wrote
Looking at c:\ and *.*, the "Analyzing which of the found files are duplicates" phase never completes, even after 12+ hours.

Leonardo wrote
Interesting. I checked and found that I have ~98.750 files on my c:\ disk. I have NO IDEA how to get a fair estimate of how long this "Analyze" phase is going to take!

Running XP-Pro-SP3:

Just for test purposes I scanned my E:\ disk, containing 12.400 files. The duplicate file scan found 2.211 duplicate files.

The whole process took ~3,5 minutes.
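For what it's worth, those two measurements allow a rough back-of-envelope estimate of the C:\ scan time. A minimal sketch, assuming analyze time scales roughly linearly with file count (which it may not if the analysis compares files pairwise):

```python
# Rough estimate of duplicate-scan time for C:\, extrapolated from the
# measured E:\ run above. Assumes linear scaling, which is optimistic
# if the analysis does pairwise comparisons.
files_small, minutes_small = 12_400, 3.5   # measured: E:\ scan
files_big = 98_750                         # reported file count on C:\

linear_estimate = minutes_small * files_big / files_small
print(f"~{linear_estimate:.0f} minutes if analyze time scales linearly")
```

Even allowing for worse-than-linear scaling, the estimate lands in the range of hours, not days, so a scan that never finishes after 12+ hours points to a bug rather than to normal scaling.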

frankc2 wrote
Looking at c:\ and *.*, the "Analyzing which of the found files are duplicates" phase never completes, even after 12+ hours.

Does it work if you try to scan a sub-directory? Also, you are aware that simply scanning your system and removing all the duplicate files is a bad idea and will probably kill your computer?

2 months later

frankc2 wrote:

Looking at c:\ and *.*, the "Analyzing which of the found files are duplicates" phase never completes, even after 12+ hours.


I have the exact same problem, running Windows XP SP3, and I'm well aware of how dangerous it is to blindly delete duplicate files. I'm specifically looking for NON-system duplicates of docs, mp3s, etc., which I tend to accumulate a lot of and need to get rid of (I'm running out of HD space with 2 HDs!)

I'm trying it one more time on the whole HD (an external 500 GB HD), and if it still never finishes I'll try examining a few directories instead of the entire HD.

BTW it doesn't take too long to ID all files on this HD

PS: The duplicate file finder in JV16 2008 worked fine on this PC and this HD and didn't take anywhere near 12 hrs to correctly spot all duplicate files

It indeed looks like something is wrong in build 970: 'File Tools' -> File Finder -> ....and others....

EDITED:

jv16PT 2009 b606 File Finder (*.*) (all options unchecked) found 13682 files on my E:\ disk; runtime exactly 4 minutes (09-10).

jv16PT 2009 b606 Duplicate File Finder (*.*) (all options unchecked) found 4267 duplicate files on my E:\ disk; runtime >1 hour (11-12).


jv16PT 2010 b970 File Finder (*.*) (unchecked option "'skip deep directory structures...'") never completed on my E:\ disk. Jv16PT.exe monopolizes (loops?) one CPU core until I cancel jv16PT with the Windows Task Manager after >2 hours.

jv16PT 2010 b970 File Finder (*.*) (checked only option "'skip deep directory structures...'") found only 1712 files on my E:\ disk; runtime ~1 minute (09-09).


Attached are 3 screen shots, two of jv16PT-2009-build 606 and one of jv16PT-2010-build 970.


I can try build 970 in DEBUG mode, if a Debug log.html is of any use in this case?

Is there any development comment on this item?

Please see: http://www.macecraft.com/phpBB3/viewtopic.php?f=26&t=4099#p25642

Leonardo wrote

I can try build 970 in DEBUG mode, if a Debug log.html is of any use in this case?

It would most likely come in handy.

jv16 wrote
Leonardo wrote

I can try build 970 in DEBUG mode, if a Debug log.html is of any use in this case?

It would most likely come in handy.

Test: Find all files on my e:\ disk with the b970 File Finder. There are ~12.740 files on my e:\ disk according to jv16PT-2009-b606!


1. First test: File Tools -> File Finder -> e:\, find files (*.*), ALL options unchecked except 'Skip deep directory structures', NO grouping

This test does not end after finding the ~12.700 files; jv16PT.exe monopolizes one CPU fully (loop?). I could only end 'File Finder' using the 'Close' button.


2. Second test: File Tools -> File Finder -> e:\, find files (*.*), ALL options unchecked except 'Skip deep directory structures', Directory grouping

This test finished quickly, but found only a 'few' files.


Attached are the two .zip files from run 1 and run 2, each containing a screen shot and Debug Log.html.

Unchecking the option 'Skip deep directory structures', as well as running 'Duplicate File Finder', makes things worse. Could be a different test set.

I can reproduce the problem, I'm working on it.

This issue is now fixed, thank you guys for reporting!


The Duplicate File Finder had a bug that caused the feature to get stuck in an eternal loop when it ran across a file that it couldn't open (due to protection by Windows, for example). The problem mainly occurred when the user ran the Duplicate File Finder on a very large set of data, such as C:\, because the more files it analyzed, the more likely it was to attempt to analyze a file it couldn't open for reading.
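The failure mode described here is easy to picture: a scanner that retries an unopenable file forever will hang, while one that skips it finishes. This is not jv16PT's actual code, just a minimal Python sketch of the fixed behavior (group files by content hash, and skip any file that can't be opened for reading):

```python
import hashlib
import os


def find_duplicates(root):
    """Group the files under `root` by content hash.

    A file that cannot be opened (e.g. locked or protected by Windows)
    is skipped instead of retried, avoiding the eternal-loop bug
    described above.
    """
    by_hash = {}
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            try:
                with open(path, "rb") as f:
                    digest = hashlib.md5(f.read()).hexdigest()
            except OSError:
                continue  # unreadable file: skip it, never loop on it
            by_hash.setdefault(digest, []).append(path)
    # Keep only hashes that occur more than once, i.e. actual duplicates.
    return {h: paths for h, paths in by_hash.items() if len(paths) > 1}
```

(For large files, hashing by size first and then by content in chunks would be faster; the sketch keeps only the part relevant to the bug.)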


This fix will be included in the next released build.

jv16 wrote

........The Duplicate File Finder had a bug that caused the feature to get stuck in an eternal loop .....................

This fix will be included in the next released build.

'File Finder' has problems too: the application does not end with the option 'NO Grouping', and it ends quickly but finds only a few files with the option 'Directory Grouping'.

Are these problems related to the 'Duplicate File Finder' problem? And if so, are they also solved in this fix?

Leonardo wrote
jv16 wrote

........The Duplicate File Finder had a bug that caused the feature to get stuck in an eternal loop .....................

This fix will be included in the next released build.

'File Finder' has problems too: the application does not end with the option 'NO Grouping', and it ends quickly but finds only a few files with the option 'Directory Grouping'.

Are these problems related to the 'Duplicate File Finder' problem? And if so, are they also solved in this fix?

I think this problem relates more to the generic problem of the PowerTools slowing down / freezing when attempting to add too much data to a list. We are working on this problem as well.
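The slowdown pattern mentioned here (a UI freezing while items are appended one by one to a list control) is commonly mitigated by batching the inserts, so the control redraws once per chunk instead of once per item. A hedged sketch, with `add_rows` standing in for whatever bulk-insert call the real list control offers (not jv16PT's actual API):

```python
def add_in_batches(add_rows, items, batch_size=500):
    """Feed `items` to a list control via `add_rows` (a callable that
    accepts a list of rows) in fixed-size chunks, so the expensive
    redraw happens once per chunk instead of once per item."""
    batch = []
    for item in items:
        batch.append(item)
        if len(batch) >= batch_size:
            add_rows(batch)
            batch = []
    if batch:  # flush the final partial chunk
        add_rows(batch)
```

With 13682 found files and a batch size of 500, this would mean ~28 redraws instead of 13682, which is typically the difference between a responsive list and a frozen one.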


I could reproduce problem number 1, and it's now fixed.


I could not reproduce problem number 2, but I did make some changes; let's see if that fixed it.


These fixes are included in the next released build.