Mildly pretty display of nmap results over time.
Because it's useful to know when kit was turned on or off and thus keep DNS accurate.
nmap is run daily (from the crontab) against the netblocks file and produces a greppable
output file. This is post-processed by some nasty Ruby (née Perl) to update a Redis table that shows
which IPs are up or down.
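The gist of that post-processing step, sketched in Ruby (this is not the real script; the function name and the idea of returning a plain hash are mine, and the real thing writes into Redis rather than returning):

```ruby
# Sketch only: turn nmap's greppable (-oG) output, e.g.
#   Host: 10.0.0.1 (gw.example.com)  Status: Up
# into an ip => status table. The real post-processor pushes
# this into Redis instead of returning a hash.
def parse_gnmap(text)
  text.each_line.with_object({}) do |line, table|
    next unless line =~ /^Host: (\S+) .*Status: (\w+)/
    table[Regexp.last_match(1)] = Regexp.last_match(2).downcase
  end
end
```

The greppable format is one host per line, which is exactly why it's the easy one to chew on from cron.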
The posh version of that Redis table is generated by more nasty Ruby, which queries the DNS.
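The DNS-querying pass amounts to a reverse lookup per IP; a minimal sketch using Ruby's stdlib Resolv (function name and row shape invented here; the resolver is injectable purely so it can be stubbed):

```ruby
require 'resolv'

# Sketch of the "posh" pass: decorate each up/down IP with its PTR name.
# A failed lookup is itself useful data, so it's kept, not dropped.
def decorate_with_names(statuses, resolver: Resolv::DNS.new)
  statuses.map do |ip, state|
    name = begin
             resolver.getname(ip).to_s
           rescue Resolv::ResolvError
             '(no PTR)'
           end
    [ip, state, name]
  end
end
```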
Finally, the Sinatra job makes a stab at a website for same.
You will need to run this on a box that can nmap your target IP ranges. Or arrange for the nmap results to be copied to the ../data directory.
You should also make sure that running a nmap ping-scan on your target netblocks won't make the networking kit or people catch fire. And indeed that you've the relevant permission to even think about doing that.
You will also need access to the relevant zonefiles. We track ours in git, which is easy.
Sadly this works best when you have rbenv installed. It'll probably work well enough without if you've full control of the box and don't have to worry about Ruby versioning. You lucky sod.
If you like nginx and unicorn, you're in luck, since there are pre-packaged vhosts and initscripts for those things. If you like other types of scaffolding, then you'll not need me to tell you how to work them.
Other than that, it ought to contain itself within the tree you've just checked out.
Remember to fill in the IP ranges that nmap should scan in the config file.
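The exact shape of the config file is the repo's own; purely as a made-up illustration of the sort of thing to fill in (key names and ranges here are invented, not the real ones):

```yaml
# hypothetical example only -- check the real config file for its key names
netblocks:
  - 192.0.2.0/24       # RFC 5737 documentation range; use your own
  - 198.51.100.0/24
```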
i) Ruby versions of the Dreadful Perl
ii) Which live in the ../cron directory
iii) Along with a file you can bung in the crontab, which will run everything in the
right order (so it's easy to install) and emit data to the ../data dir.
iv) The zonefile grovelling comes back out into a separate file.
v) Which emits its data to a Redis DB every mumble minutes in the same cron file above.
vi) Then the hard work of updating the DNS lookups is asynchronous.
vii) There's likely not much point in making the basic IP->name lookups happen
any more regularly.
viii) Since we have the other data - DNS lookup failures and external addresses
pointed to by local DNS - we should display them on separate pages.
ix) It's almost worth having a proper DB for this. Almost, but not quite.
x) Don't try to automate WHOIS lookups; it does not work. Patches welcome, mind.
xi) Actually, you could re-run the nshow-equivalent when you do the zonefile grovelling
because all it's really doing is updating the DNS names.
xii) Instead of using Redis as a dumb k/v store, work out how to store all the gubbins in one Redis hash and have Redis do the sort-by-ip.
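For idea (xii), one way to get Redis doing the sort-by-ip is a sorted set scored by the IP as an integer; only the score function is shown here, and the key names in the comments are invented, not the project's:

```ruby
require 'ipaddr'

# Score an IP for a Redis sorted set so ZRANGE returns numeric IP order,
# which a plain string sort gets wrong ("10.0.0.10" sorts before "10.0.0.2").
def ip_score(ip)
  IPAddr.new(ip).to_i
end

# Indicatively (hypothetical key names):
#   redis.zadd('hosts:by_ip', ip_score(ip), ip)
#   redis.hset("host:#{ip}", 'status', status)
```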
(Yes, this is a stab at portability so it can go on the githubs.)
Patches and/or better ideas welcome, mind.
[Agricultural code for agricultural people]