
Questions tagged [compression]

Shrinking (compressing) and restoring (decompressing) data.

0 votes
2 answers
135 views

I have a folder with around seventy subfolders, each containing a few tarballs which are nightly backups of a few directories (the largest being /home) from an old Raspberry Pi. Each is a full backup; ...
kj7rrv
  • 261
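Since each nightly tarball is a full backup, many of them will have byte-identical contents. A minimal sketch (invented paths) for spotting exact duplicates before deciding how to deduplicate — note it hashes the decompressed stream, because gzip stores a timestamp in its header, so hashing the .tar.gz files themselves can miss duplicates:

```shell
#!/bin/sh
# Sketch: find nightly tarballs whose *contents* are identical by
# hashing the decompressed tar stream. Paths are invented so the demo
# is self-contained.
set -e
mkdir -p demo/src demo/night1 demo/night2
echo "same data" > demo/src/file
tar -czf demo/night1/home.tar.gz -C demo/src file
tar -czf demo/night2/home.tar.gz -C demo/src file
dups=$(for t in demo/night1/home.tar.gz demo/night2/home.tar.gz; do
    printf '%s %s\n' "$(gzip -dc "$t" | cksum | cut -d' ' -f1)" "$t"
done | sort | cut -d' ' -f1 | uniq -d | wc -l)
echo "checksums shared by more than one tarball: $dups"
```

Duplicate full backups found this way can be replaced by hard links or dropped outright; only differing tarballs need smarter deduplication.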
2 votes
0 answers
129 views

My backup strategy currently primarily consists of daily backups of all of my machines with Borg Backup, stored on different storage devices in different locations, following the 3-2-1 strategy. These ...
PhrozenByte
0 votes
1 answer
79 views

I have page_1.pnm, …, page_6.pnm, which represent 6 pages of a scanned document, all in gray PNM produced by scanimage and manually postprocessed with GIMP. The command convert $(for i in 1 2 3 4 5 6; ...
AlMa1r
  • 1
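For a lossless PNM-to-PDF conversion with ImageMagick, `-compress Zip` stores the pages with Flate compression instead of lossy JPEG. A guarded sketch (the tiny generated PNM and output name are invented for the demo; with the question's real scans the command would be roughly `convert page_*.pnm -compress Zip scan.pdf`):

```shell
#!/bin/sh
# Sketch, assuming ImageMagick is available: -compress Zip selects
# lossless Flate encoding inside the PDF. The 2x2 PGM below exists only
# to make the demo self-contained.
printf 'P2\n2 2\n255\n0 64 128 255\n' > page_1.pnm
if command -v convert >/dev/null 2>&1 \
        && convert page_1.pnm -compress Zip scan.pdf 2>/dev/null; then
    result=done
else
    result=skipped   # ImageMagick absent or PDF writing blocked by policy
fi
echo "$result"
```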
0 votes
1 answer
107 views

Assume an image opened in GIMP in Debian 12. From this image, you would like to create a single-page PDF file with maximum lossless compression. How? As of 2024-12-19, https://docs.gimp.org/en/gimp-...
AlMa1r
  • 1
0 votes
1 answer
147 views

I'm trying to create an archive e.g. archive.tar.gz inside the current working directory e.g. /builds/project/ without saving the archive.tar.gz inside archive.tar.gz. To prevent this I'm trying to ...
rosaLux161
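Two common ways to keep the archive from swallowing itself, sketched under invented directory names: exclude the output file by name, or write the archive outside the tree being archived.

```shell
#!/bin/sh
# Sketch: keep archive.tar.gz out of archive.tar.gz when archiving the
# current directory. Directory names are invented for the demo.
set -e
mkdir -p builds-demo/project && cd builds-demo/project
echo data > app.txt
# Option 1: exclude the output file by name (GNU tar).
tar --exclude=archive.tar.gz -czf archive.tar.gz .
# Option 2: write the archive outside the tree being archived.
tar -czf ../archive2.tar.gz .
members=$(tar -tzf archive.tar.gz)
echo "$members"
```

Listing the members confirms the archive contains `app.txt` but not a copy of itself.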
1 vote
1 answer
596 views

I am brand new to zstd/pzstd, trying out its features, compression, benchmarking it, and so on. (I run Linux Mint 22 Cinnamon.) This computer has 32 GB RAM. The basic command appears to be working, ...
Vlastimil Burián
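zstd ships a built-in benchmark mode that sidesteps hand-rolled timing. A guarded sketch (sample data is generated for the demo; zstd may not be installed everywhere): `-b3 -e6` benchmarks levels 3 through 6, `-T0` uses every core, and pzstd takes its thread count via `-p`.

```shell
#!/bin/sh
# Sketch of zstd's benchmark mode, guarded since zstd is often not
# preinstalled. -b3 -e6 benchmarks levels 3..6; -T0 means "all cores".
if command -v zstd >/dev/null 2>&1; then
    seq 1 100000 > sample.bin    # synthetic sample data
    zstd -b3 -e6 -T0 sample.bin
    bench=done
else
    bench=skipped
fi
echo "$bench"
```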
1 vote
1 answer
106 views

I have a Z6 HDD pool with 6 x 18T drives and a SLOG; LZ4 compression is enabled by default. Now I need to store a large number of small files and I'm worried about fragmentation. The files: 70K files ...
7E10FC9A
  • 121
0 votes
1 answer
113 views

I have to restore a single file from a fairly large (~ 1 TB) afio archive I created by using the following script (debug messages omitted): #!/bin/bash # SRCDIR=/bak BAKDEV=/dev/disk/by-partlabel/...
Neppomuk
  • 364
0 votes
2 answers
92 views

Folks, I need a hand unpacking a big batch of .rar files; I don't want to do this one by one. I have several directories inside subdirectories of 'dir' with many .rar's in each. dir subdir_a ...
Jehan Alvani
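The usual pattern here is `find` plus `-execdir`, which runs the extractor inside each file's own directory. With unrar installed the real command would be roughly `find dir -name '*.rar' -execdir unrar x -o+ {} \;`. Since unrar is rarely preinstalled, the runnable sketch below exercises the same pattern with tar archives:

```shell
#!/bin/sh
# Sketch of the recursive find-and-extract pattern, demonstrated with
# tar in place of unrar. -execdir extracts each archive next to itself.
set -e
mkdir -p dir/subdir_a dir/subdir_b
echo a > a.txt
tar -cf dir/subdir_a/one.tar a.txt
tar -cf dir/subdir_b/two.tar a.txt
find dir -name '*.tar' -execdir tar -xf {} \;
ls dir/subdir_a/a.txt dir/subdir_b/a.txt
```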
0 votes
1 answer
192 views

Basics: I am running Apache/2.4.62 on my Raspberry Pi 4B (ARM64/AArch64) with Debian GNU/Linux 12 (bookworm). Some HW/OS info (neofetch output omitted): ...
Vlastimil Burián
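The excerpt is cut short, but for serving compressed HTTP responses from Apache the usual starting point is mod_deflate. A sketch only, not this poster's configuration (on Debian, enable the module with `a2enmod deflate`):

```apache
# Sketch: compress common text-based content types on the fly.
<IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE text/html text/plain text/css \
        application/javascript application/json image/svg+xml
</IfModule>
```

Already-compressed media (JPEG, PNG, video) is deliberately left off the list, since recompressing it wastes CPU for no gain.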
1 vote
2 answers
1k views

I'm not currently using compression with my btrfs-formatted disk, but am wondering how much space I'd save if I did enable it. Short of actually enabling compression on the disk and comparing the ...
Psychonaut
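A rough estimate needs no btrfs at all: compare the raw size of a sample of the data with its size after piping through a compressor. (Once compression is actually enabled, the `compsize` tool reports real on-disk savings for btrfs.) A sketch with synthetic demo data, using gzip as a stand-in for btrfs's zlib mode:

```shell
#!/bin/sh
# Sketch: estimate compression savings by compressing a sample stream.
# gzip -6 roughly approximates btrfs's zlib; zstd/lzo will differ.
set -e
mkdir -p sample
yes "compressible line of text" | head -n 5000 > sample/big.txt
raw=$(tar -cf - sample | wc -c)
packed=$(tar -cf - sample | gzip -6 | wc -c)
echo "raw=$raw bytes, gzip-6=$packed bytes"
```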
0 votes
1 answer
199 views

I created some pigz (parallel gzip) - home page - compressed archives of my SSD disk drives. (compiled version 2.8) I called one of them 4TB-SATA-disk--Windows10--2024-Jan-21.img.gz which says the ...
Vlastimil Burián
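For a quick look at what a .gz (pigz output included) claims to contain, `gzip -l` reads the stored uncompressed size from the trailer. One caveat worth knowing: that field is only 32 bits, so for disk images over 4 GiB it wraps around, and the reliable count is to stream through the decompressor. A sketch with a small synthetic image:

```shell
#!/bin/sh
# Sketch: gzip -l reports the stored (32-bit, wrap-prone) uncompressed
# size; streaming through gzip -dc | wc -c gives the true byte count.
set -e
head -c 100000 /dev/zero > disk.img
gzip -f disk.img                 # produces disk.img.gz
gzip -l disk.img.gz
real=$(gzip -dc disk.img.gz | wc -c)
echo "real uncompressed size: $real bytes"
```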
0 votes
3 answers
164 views

I need to extract a specific folder from .tar.bz2 (34G). The issue is that it takes 1 hour. I guess that this is due to the compression. I guess that w/o compression extraction of a specific folder ...
pmor
  • 757
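Single-threaded bzip2 decompression is indeed the usual bottleneck; a parallel decompressor piped into tar helps, e.g. `lbzip2 -dc big.tar.bz2 | tar -x wanted/dir` (pbzip2 works the same way). Note that tar still has to scan the entire stream to locate the folder, so decompression speed is the only lever. A runnable sketch with plain bzip2 and invented paths:

```shell
#!/bin/sh
# Sketch: decompress explicitly and let tar extract only one path.
# Swap bzip2 for lbzip2/pbzip2 to parallelize; guarded in case bzip2
# is not installed.
if command -v bzip2 >/dev/null 2>&1; then
    mkdir -p tree/wanted tree/other
    echo keep > tree/wanted/f
    echo skip > tree/other/g
    tar -cjf big.tar.bz2 tree
    rm -r tree
    bzip2 -dc big.tar.bz2 | tar -x tree/wanted
    got=$(cat tree/wanted/f)
else
    got=keep   # bzip2 unavailable; demo skipped
fi
echo "$got"
```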
0 votes
1 answer
235 views

I am working on an embedded Linux system (5.10.24), and I am using jffs2 as the rootfs. Now I changed the kernel configuration of jffs2 as follows, # CONFIG_JFFS2_FS_WBUF_VERIFY is not set # ...
wangt13
  • 651
1 vote
1 answer
664 views

My system is running Ubuntu 22.04.3 LTS. I plug a USB drive into that system. That USB drive (/dev/sdb) contains an Ubuntu installation, mostly in an ext4 partition (/dev/sdb3). That installation ...
Ray Woodcock
2 votes
0 answers
57 views

I have a large number of .tif's coming out of ScanTailor. Is there a way that I might OCR those .tif's with tesseract, holding the OCR data separate from the images; then compress the images, and ...
Diagon
  • 740
0 votes
1 answer
637 views

I am looking to backup a large amount (~400GB) of different types of files (50% images and videos, 30% audio, 20% text). I hesitate to give an amount by which I'd like the size to be reduced, since I ...
Lukas
  • 77
6 votes
1 answer
424 views

I need to download and decompress a file as quickly as possible in a very latency sensitive environment with limited resources (A VM with 1 cpu, 2 cores, 128MB RAM) Naturally, I tried to pipe the ...
Richard
  • 113
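Piping is the right instinct: streaming the download straight into the decompressor overlaps transfer and decompression and never buffers the whole file, which matters with 128 MB of RAM. The real command would be along the lines of `curl -s "$URL" | gzip -dc | tar -x` (zstd, where available, decompresses considerably faster than gzip on one core). A self-contained sketch that substitutes `cat` for the network fetch:

```shell
#!/bin/sh
# Sketch: streamed decompress-and-extract. In real use, replace
# "cat remote.tar.gz" with the curl download.
set -e
mkdir -p payload && echo hello > payload/msg
tar -czf remote.tar.gz payload && rm -r payload
cat remote.tar.gz | gzip -dc | tar -x
cat payload/msg
```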
2 votes
1 answer
225 views

I have several rotated log files on my server, but when I compress them I get a .1.gz extension, and I just want .gz. These files are automatically rotated by the application: server.log.2023-03-16 ...
sam
  • 21
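The `.1` is logrotate's own rotation counter (avoidable with its `dateext` option); since the application already rotates by date, the dated files can simply be gzipped directly, which inserts no extra suffix. A sketch with filenames mirroring the question:

```shell
#!/bin/sh
# Sketch: gzip already-rotated, date-stamped logs in place. Each
# server.log.YYYY-MM-DD becomes server.log.YYYY-MM-DD.gz; the active
# server.log is untouched because the glob does not match it.
set -e
echo x > server.log
echo y > server.log.2023-03-16
for f in server.log.20*; do
    gzip -f "$f"
done
ls server.log server.log.2023-03-16.gz
```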
10 votes
1 answer
945 views

On my FreeBSD 13.2 system, the zless utility cannot view text files compressed with gzip or compress, warning that they may be binary files and then showing garbage if I say I want to see the contents ...
Kusalananda
  • 356k
12 votes
4 answers
2k views
+100

I am looking for a way to compress swap on disk. I am not looking for wider discussion of alternative solutions. See discussion at the end. I have tried: Using compressed zfs zvol for swap is NOT ...
linuxlover69
0 votes
2 answers
111 views

Instead of using non-POSIX tools such as unxz, what POSIX utility can I use to decompress files with the .xz extension? Neither xz nor unxz is a POSIX command, so if I want to run only POSIX commands, ...
just_another_human
1 vote
1 answer
3k views

I have a gz archive, but for some reason tar says the format is incorrect, even though I can double-click it in the macOS Finder and extract it normally, and the file command shows the same format just like ...
phuclv
  • 2,442
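When tar rejects a .gz that other tools accept, separating the two layers shows which one is actually failing: `gzip -t` tests the compressed stream, and piping `gzip -dc` into tar bypasses tar's format autodetection. (A common cause: the file is a bare gzip of a single file rather than a tar, which Finder happily unpacks but tar cannot.) A sketch with a synthetic archive:

```shell
#!/bin/sh
# Sketch: diagnose the gzip layer and the tar layer independently.
set -e
echo content > doc.txt
tar -czf good.tar.gz doc.txt && rm doc.txt
gzip -t good.tar.gz              # exit 0 means the gzip layer is intact
gzip -dc good.tar.gz | tar -x    # explicit two-step extraction
cat doc.txt
```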
0 votes
1 answer
244 views

I have tons of .Z compressed files scattered across various directories and need to see the size of the file within each. I don't plan on uncompressing all the .Z files. Is there a way to see the content ...
Steve237
  • 103
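Streaming through the decompressor and counting bytes answers this without writing anything to disk: for .Z files that is `zcat file.Z | wc -c` (or `uncompress -c file.Z | wc -c`). The legacy compress(1) tool is rarely installed, so the runnable sketch below uses a .gz stand-in; GNU gzip's decompressor handles both formats:

```shell
#!/bin/sh
# Sketch: measure the uncompressed size of an archive by streaming it
# through the decompressor; no temporary decompressed copy is kept.
set -e
head -c 4096 /dev/zero > data.bin
gzip -f data.bin                 # stand-in for a .Z file
size=$(gzip -dc data.bin.gz | wc -c)
echo "uncompressed size: $size bytes"
```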
0 votes
1 answer
378 views

I have a .tar.gz as input and want to extract the first 128 MiB of it and output as a .tar.gz in a single command. I tried: sudo tar xzOf input.tar.gz | sudo dd of=output bs=1M count=128 iflag=...
JohnnyFromBF
  • 3,606
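The trick is to truncate the *uncompressed* stream with `head -c` and recompress, all in one pipeline; with the question's sizes that is `gzip -dc input.tar.gz | head -c 134217728 | gzip > output.tar.gz`. Be aware the result is a truncated tar: any member cut at the boundary is incomplete, and tar will warn when extracting it. A sketch scaled down to 1 KiB:

```shell
#!/bin/sh
# Sketch: take the first N bytes of the decompressed tar stream and
# regzip them. Demo uses a 1 KiB cut instead of 128 MiB.
set -e
mkdir -p stuff && head -c 4096 /dev/zero > stuff/blob
tar -czf input.tar.gz stuff
gzip -dc input.tar.gz | head -c 1024 | gzip > output.tar.gz
bytes=$(gzip -dc output.tar.gz | wc -c)
echo "uncompressed prefix: $bytes bytes"
```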