
Releases: pkolaczk/fclones

v0.17.1

01 Nov 19:31

What's Changed

  • Improve warning message when FIEMAP not supported by @pkolaczk in #88
  • Always restore metadata after reflink by @pkolaczk in #89
  • Make initializing libc::flock more portable by @pkolaczk in #87

Full Changelog: v0.17.0...v0.17.1

v0.17.0

23 Oct 12:52

This release introduces a new command fclones dedupe
contributed by Thomas Otto (@th1000s).

Dedupe does not remove duplicate files; instead, it uses the copy-on-write
capability available on some file systems, such as Btrfs or XFS,
to deduplicate file data transparently.
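The effect of copy-on-write cloning can be illustrated with plain coreutils. This is a sketch of the underlying mechanism, not fclones itself; the file names are made up, and `cp --reflink` issues the same kind of clone request that transparent deduplication relies on:

```shell
# Sketch of copy-on-write cloning using coreutils (not fclones itself).
# On a file system with reflink support (Btrfs, XFS), --reflink=auto makes
# clone.txt share data blocks with original.txt; on ext4 it silently falls
# back to a regular copy, so this example runs anywhere.
echo "hello" > original.txt
cp --reflink=auto original.txt clone.txt
cat clone.txt   # prints "hello"
```

A typical invocation presumably feeds a report from `group` into the new command, in the same stdin pipeline used by `remove` and `link`, e.g. `fclones group . | fclones dedupe`.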


Full Changelog: v0.16.1...v0.17.0

Match duplicate files only between different paths

26 Sep 14:16

This release adds a new flag -I, --isolate. When this flag is given, all duplicates found within a directory tree passed as a single argument are counted as one. Hence, in the following example, only duplicates that exist in both dir1 and dir2 are reported:

$ echo "foo" > test/dir1/foo1
$ echo "foo" > test/dir1/foo2
$ fclones group --isolate test/dir1 test/dir2

# Report by fclones 0.16.0
# Timestamp: 2021-09-26 16:11:21.905 +0200
# Command: fclones group --isolate test/dir1 test/dir2
# Found 0 file groups
# 0 B (0 B) in 0 redundant files can be removed

$ echo "foo" > test/dir2/foo3

# Report by fclones 0.16.0
# Timestamp: 2021-09-26 16:12:03.224 +0200
# Command: fclones group --isolate test/dir1 test/dir2
# Found 1 file groups
# 8 B (8 B) in 2 redundant files can be removed
6109f093b3fd5eb1060989c990d1226f, 4 B (4 B) * 3:
    /home/pkolaczk/Projekty/fclones/test/dir1/foo1
    /home/pkolaczk/Projekty/fclones/test/dir1/foo2
    /home/pkolaczk/Projekty/fclones/test/dir2/foo3

Additionally, the output has been made deterministic when option --hardlinks is set (patch by @th1000s).

Move duplicates to a different directory

13 Sep 07:34

New feature: fclones move <target_dir> moves duplicates to a given directory.

The directory structure is preserved, so files with the same names don't conflict with each other, and the move is easy to undo manually.
If files are moved within the same mount point, only file metadata is modified, which is very fast.
If files are moved to a different mount point, the data is copied first and the source files are removed afterwards.
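The same-mount-point versus cross-mount-point behaviour mirrors what a plain `mv` does; a minimal sketch (the file names here are made up for illustration):

```shell
# Within one mount point, moving a file is a rename: only metadata changes.
# Across mount points, the data would be copied and the source removed
# afterwards -- the same fallback that fclones move (and mv) use.
mkdir -p src dst
echo "dup" > src/a.txt
mv src/a.txt dst/a.txt
cat dst/a.txt            # prints "dup"; src/a.txt no longer exists
```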

Improved initialization speed

05 Sep 19:15

Now fclones doesn't scan sysfs on startup on Linux systems.
That saves an additional 0.5–0.8 s (on my computer).

Removing duplicates based on JSON report

28 Aug 17:06

JSON-formatted reports are now accepted as input to fclones remove and fclones link.
The report format is detected automatically.

This new feature allows easier programmatic processing of the list of duplicate files with standard JSON processing tools such as jq.
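For example, duplicate paths can be pulled out of a report with a one-liner. The report shape below is illustrative only, not the exact fclones JSON schema; python3 is used so the sketch runs without jq installed, with the equivalent jq filter shown in a comment:

```shell
# Hypothetical report shape -- illustrative only, not the exact fclones schema.
cat > report.json <<'EOF'
{"groups": [{"files": ["/tmp/a", "/tmp/b"]}]}
EOF
# With jq, every duplicate path could be listed like this:
#   jq -r '.groups[].files[]' report.json
# The same extraction in python3, so the sketch runs without jq:
python3 -c 'import json
for g in json.load(open("report.json"))["groups"]:
    for f in g["files"]:
        print(f)'
```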

Fix --stdin option

20 Jun 18:12

This is a bugfix release to fix #60.

Fix deduplication on Windows

07 Jun 16:29

This is a bugfix release.
It fixes incorrect handling of line terminators (CRLF) on Windows
and additionally improves the accuracy of error reporting.

Increase default size of thread pools used for SSD

06 Jun 12:22

This is a minor release that introduces better defaults for SSD.
The SSD benchmarks in README.md have been updated and extended with more programs.

Automatic file deduplication

05 Jun 19:38

This release introduces subcommands. The old functionality of duplicate file search has been moved to the group subcommand, and two new subcommands have been introduced: link and remove. Both commands work on a list of files previously generated with group, passed on the standard input. fclones remove deletes redundant files, while fclones link replaces redundant files with links.

Additionally, a set of options was added to control the selection of files that should be removed. It is possible to select files by their creation time, modification time, last access time, or nesting level, or to match them by path or name patterns.
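What fclones link achieves for each duplicate group can be illustrated with coreutils. This is a sketch of the effect, not fclones itself, and the file names are made up:

```shell
# a.txt and b.txt start out as two identical copies occupying space twice.
echo "same" > a.txt
cp a.txt b.txt
# Replacing the redundant copy with a hard link reclaims its space:
ln -f a.txt b.txt
# Both names now refer to the same inode:
[ "$(stat -c %i a.txt)" = "$(stat -c %i b.txt)" ] && echo "linked"   # prints "linked"
```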