Efficient way to move similarly named log files into multiple directories

The issue:

I have a directory on a CIFS share with 10000+ server log files named in the format CCLLLTTTFFFFNNN YYYY-MM-DD at a minimum, where:

  • Server name consisting of:
    • CC = ISO country name
    • LLL = Location (IATA code of closest city)
    • TTT = WIN or UNX (and why it's a CIFS share)
    • FFFF = Function of the server
    • NNN = number
  • a space
  • a date
  • sometimes some more text containing words with spaces
  • an extension (always)

Someone who's no longer with the company set this up: all servers globally dump their daily logs there, and it takes forever to load the list of files! Everyone who needs logs grumbles about it, but no one ever does anything, so I started doing something about it, just for me.

The idea:

Instead of one long list of files, why not have a list of directories (at least two orders of magnitude shorter) named after the servers, and cron a script daily that moves each file into its server's directory? ¹
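
Something like the following is what I'm picturing for that daily script. It's only a rough sketch: it assumes the share is mounted at /mnt/logshare (as in my ls line further down) and that the server name is everything up to the first space in the file name:

    #!/bin/bash
    # Sort the flat log dump into one directory per server.
    # Assumption: server name = everything before the first space in the file name.
    shopt -s nullglob                      # expand to nothing if no file matches

    for f in /mnt/logshare/*\ *; do        # only entries with a space in the name
        name=$(basename "$f")
        server=${name%% *}                 # strip everything from the first space on
        mkdir -p "/mnt/logshare/$server"   # create the directory for new servers
        mv -n "$f" "/mnt/logshare/$server/"
    done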

What do I have?

  • bash
  • Access to gcc
  • Write access to the CIFS share (obviously)
  • Manjaro, an Arch derivative
  • OpenOffice

What have I done so far?

  • ls /mnt/logshare/*UNXSAP* > ~/Documents/logs/logshare.txt
  • Import logshare.txt into OpenOffice Calc
    • create a directory with the server name
    • Generate a ton of mv commands using Calc and formulas
  • copy-paste that into a shell-script
  • execute the shell script (a sketch doing the same thing directly in bash, without the spreadsheet round-trip, follows this list)
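
For reference, that whole spreadsheet round-trip boils down to something like the loop below, which only prints the mkdir/mv commands into a script for review (the moves.sh name is just an example; same assumptions as the sketch above):

    # Generate the move commands instead of running them, then review and run.
    for f in /mnt/logshare/*UNXSAP*\ *; do
        name=$(basename "$f")
        printf 'mkdir -p %q && mv -n %q %q/\n' \
            "/mnt/logshare/${name%% *}" "$f" "/mnt/logshare/${name%% *}"
    done > ~/Documents/logs/moves.sh
    # then: bash ~/Documents/logs/moves.sh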

But:

  • I've become a victim of my own success
  • The security and application groups have seen my directories crop up and want me not to be such an egoist and to do this for everyone.
  • No real devs or scripters are available.
  • I've been thinking about this for a week and wouldn't even know where to start. awk? find? Start writing C-code again? (Haven't done that in 20+ years. Unfortunately, I've become what I always dreaded: a suit...) ;-(
  • Whenever a new server gets added, a directory should be created automatically
  • The script should run daily (presumably just a cron entry; see the sketch after this list)
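
For the daily run I'm assuming a plain cron entry would be enough, something like this (the script path is just a placeholder):

    # crontab entry: run the log-sorting script every night at 02:00
    0 2 * * * /usr/local/bin/sort-logshare.sh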

Has anyone out there already solved this for their own server/data file collection and has such a bash script (or C source?) handy that I can modify? And if not: helpful hints, please.

Note 1: Yes, the intelligent thing to do would be for the servers to dump their logs into a directory named after the server name, but that means a roll-out, a CAB, and other headaches like mobilising all the worldwide server admins...
