Questions tagged [large-files]

2 votes
1 answer
312 views

I have a very large SQL dump file (30 GB) that I need to edit (do some find/replace) before loading it back into the database. Besides its large size, the file also contains very long lines. Except ...
asked by fernan
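A hedged streaming sketch (filenames and the search/replace strings are placeholders): perl rewrites the dump as a stream, holding only one line in memory at a time, so the 30 GB size is not a problem, though each individual long line still has to fit in RAM:

    perl -pe 's/old_string/new_string/g' dump.sql > dump.edited.sql
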
16 votes
5 answers
13k views

I have a 55 GB log. I tried: cat logfile.log | tail, but this approach takes a lot of time. Is there any way to read huge files faster, or any other approach?
asked by Yi Qiang Ji
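A hedged sketch: calling tail directly on the file lets it seek to the end instead of reading all 55 GB through the pipe, and less opens the file without loading it whole:

    tail -n 100 logfile.log     # prints only the last 100 lines
    less -n +G logfile.log      # jump to the end; -n skips line-number counting
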
2 votes
4 answers
6k views

I'd like to format a 12 TB HDD (not SSD) with EXT4, in order to store large video files (each file being at least 1 GiB in size). I am working with an x86-64 (a.k.a. x64 or amd64) processor. There's ...
asked by ChennyStar
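A hedged sketch (the device name is a placeholder): mke2fs ships usage profiles such as largefile4, which allocates roughly one inode per 4 MiB and so suits a volume holding only multi-GiB video files; dropping the reserved-blocks percentage also frees space on a non-root data disk:

    mkfs.ext4 -T largefile4 -m 0 /dev/sdX1
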
1 vote
1 answer
647 views

We have to move a very large data set (petabytes in total) from an HPC cluster to a storage server. We have a high-capacity communication link between the devices. However, the bottleneck seems to be a fast ...
asked by Ikram Ullah
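A hedged sketch assuming the data is spread over many subdirectories and that rsync and GNU parallel are available on both ends (hostnames and paths are placeholders): several transfer streams in parallel usually fill a fast link better than a single stream:

    ls /data/src | parallel -j 8 rsync -a /data/src/{}/ storage:/data/dst/{}/
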
6 votes
4 answers
1k views

How do I delete this large directory? stat session/ File: ‘session/’ Size: 321540096 Blocks: 628040 IO Block: 4096 directory Device: 903h/2307d Inode: 11149319 Links: 2 Access: ...
asked by Bojan Hrnkas
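A hedged sketch: both variants avoid building the entire file list in memory before deleting; the rsync trick syncs an empty directory over the bloated one (empty/ is a scratch directory created just for this):

    find session -delete
    # or
    mkdir empty && rsync -a --delete empty/ session/ && rmdir session empty
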
9 votes
2 answers
3k views

I need to copy one very large file (3TB) on the same machine from one external drive to another. This might take (because of low bandwidth) many days. So I want to be prepared when I have to interrupt ...
asked by halloleo
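A hedged sketch (paths are placeholders): rsync can resume a single large file after an interruption; --partial keeps the unfinished copy and --append-verify continues from where it stopped, re-checksumming the part already written:

    rsync -avh --partial --progress --append-verify /mnt/driveA/big.img /mnt/driveB/
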
3 votes
1 answer
9k views

I am currently using rsync to copy a 73GB file from a Samsung portable SSD T7 to an HPC cluster. rsync -avh path/to/dataset [email protected]:/path/to/dest The following applies: My local machine (...
asked by b0neval
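A hedged variation on the command in the question (user@cluster and the paths are placeholders): keeping partial transfers and showing progress makes a 73 GB copy restartable if the SSH session drops:

    rsync -avh --partial --progress path/to/dataset user@cluster:/path/to/dest
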
2 votes
5 answers
2k views

In a Unix command-line context I would like to compare two truly huge files (around 1 TB each), preferably with a progress indicator. I have tried diff and cmp, and they both crashed the system (macOS ...
asked by halloleo
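A hedged sketch: cmp streams both files and stops at the first differing byte, and piping one side through pv adds a progress bar (pv is a separate tool, installable e.g. with Homebrew on macOS):

    pv file1.bin | cmp - file2.bin
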
3 votes
0 answers
2k views

I have a relatively large file, a minified JSON file of about 4 GB. The file isn't huge... but a lot of programs choke on it because it's a single-line file. I have noticed less works OK on it ...
asked by David542
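A hedged sketch (the filename is a placeholder): breaking the single line into shorter ones makes the file usable with line-oriented tools; fold does this by byte count without parsing the JSON, while jq pretty-prints it properly but needs the whole document in memory:

    fold -w 200 big.json | less
    # or, if several GB of RAM are acceptable:
    jq . big.json | less
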
1 vote
5 answers
590 views

I have a bash command pipeline that produces a ton of logging text output. But mostly each line repeats the previous one except for the timestamp and some minor flags; the main output data changes only ...
asked by scythargon
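A hedged sketch (the pipeline name and the timestamp regex are placeholders and need adjusting to the real log format): stripping the timestamp before uniq collapses the near-identical repeats and counts them:

    ./pipeline.sh | sed -E 's/^[0-9T:.-]+ //' | uniq -c
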
2 votes
3 answers
2k views

I have a big text file containing a million lines. I would like to find identical lines that match my specific text and leave the first occurrence intact. Any ideas? So the algorithm should roughly be ...
asked by nmzik
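A hedged awk sketch (PATTERN and the filenames are placeholders): lines that do not match are printed unconditionally, and matching lines are printed only the first time they appear:

    awk '!/PATTERN/ || !seen[$0]++' bigfile.txt > deduped.txt
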
2 votes
1 answer
2k views

I'm having trouble with a large text file (30 GB). I would like to create smaller files from it (5 GB, let's say), but sadly I don't have any more storage (only ~10 GB is available). This line: split -b 5g ...
asked by Andras Karpati
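A hedged sketch of the usual workaround when there is no room for a full extra copy (GNU coreutils assumed; this modifies the original file, so a failure partway through loses data): peel 5 GB pieces off the end and shrink the original in place:

    tail -c 5G bigfile > part.006      # the pieces come out in reverse order
    truncate -s -5G bigfile
    # repeat with part.005, part.004, ... until bigfile itself is small enough
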
0 votes
1 answer
622 views

I am using the Unison file-syncing software and I am aware of the config option that ignores files matching a specific regex or name. But is there a way to block syncing of large files, e.g. larger than 10 MB? ...
asked by mtk
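A hedged profile sketch: Unison's maxsizethreshold preference (value in KB) is the closest built-in option; as documented it flags files above the threshold as conflicts instead of transferring them, so the exact behaviour is worth confirming against the manual for the installed version:

    # in the relevant ~/.unison/*.prf profile
    maxsizethreshold = 10240    # roughly 10 MB
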
1 vote
0 answers
123 views

I have a large, potentially zipped log file, and I can identify which line number the text I'm interested in is on using: find . -name "*" -exec zgrep -C 1 -n -i -H TextToFind {} \; But in a second ...
asked by Steven Roman
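A hedged sketch for the second step (LINE and the filename are placeholders): once zgrep -n has reported the line number, zcat plus sed prints a window around it without decompressing to disk:

    LINE=1234
    zcat file.log.gz | sed -n "$((LINE-1)),$((LINE+1))p;$((LINE+1))q"
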
1 vote
0 answers
208 views

I have a music folder of nearly 10 GB of completely offline music. I want to keep it on my phone, so I can take it on the go. The simplest answer is a file transfer between my phone and PC, however ...
asked by Cosmo
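A hedged sketch assuming an Android phone with USB debugging enabled (paths are placeholders): adb push copies the whole folder in one command and tends to be more reliable than dragging 10 GB over MTP:

    adb push ~/Music /sdcard/Music
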
